# unicorn/prefer-code-point

Category: Pedantic
## What it does
Prefers `String.prototype.codePointAt()` over `String.prototype.charCodeAt()`, and `String.fromCodePoint()` over `String.fromCharCode()`.
## Why is this bad?
Unicode is better supported in `String.prototype.codePointAt()` and `String.fromCodePoint()`. The older `charCodeAt()` and `fromCharCode()` operate on UTF-16 code units, so characters outside the Basic Multilingual Plane (such as most emoji) are split into surrogate pairs, whereas the code point methods treat them as single values.

See MDN's [`String.fromCodePoint()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/fromCodePoint) documentation for the difference between `String.fromCodePoint()` and `String.fromCharCode()`.
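To make the difference concrete, here is a small illustration (a sketch using only standard JavaScript string methods; the values shown follow from the UTF-16 encoding of U+1F984):

```javascript
const unicorn = "🦄"; // U+1F984, outside the Basic Multilingual Plane

// charCodeAt() sees UTF-16 code units: the emoji is split into a
// surrogate pair, neither half of which is a character on its own.
unicorn.charCodeAt(0); // 55358 (0xD83E, high surrogate)
unicorn.charCodeAt(1); // 56708 (0xDD84, low surrogate)

// codePointAt() returns the full code point.
unicorn.codePointAt(0); // 129412 (0x1F984)

// fromCharCode() truncates values above 0xFFFF to 16 bits, yielding
// the wrong character; fromCodePoint() round-trips correctly.
String.fromCharCode(0x1f984); // "\uF984" (not the unicorn)
String.fromCodePoint(0x1f984); // "🦄"
```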
## Examples
Examples of incorrect code for this rule:
```javascript
"🦄".charCodeAt(0);
String.fromCharCode(0x1f984);
```

Examples of correct code for this rule:
```javascript
"🦄".codePointAt(0);
String.fromCodePoint(0x1f984);
```

## How to use
To enable this rule in the CLI or using the config file, you can use:
```bash
oxlint --deny unicorn/prefer-code-point
```

```json
{
  "rules": {
    "unicorn/prefer-code-point": "error"
  }
}
```