The JavaScript String.fromCharCode()
method creates a string from a sequence of UTF-16 code units.
String.fromCharCode(num1[, ...[, numN]])
num1, ..., numN
- a sequence of numbers that are UTF-16 code units. Each must be in the range 0 to 65535 (0xFFFF); numbers greater than 0xFFFF are truncated.
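"Truncated" here means the argument is converted to an unsigned 16-bit integer, i.e. reduced modulo 65536 (0x10000), so only the low 16 bits survive. A minimal sketch of that equivalence:

let n = 0x12014;
console.log(String.fromCharCode(n) === String.fromCharCode(n % 0x10000)); // true
console.log((n % 0x10000).toString(16)); // "2014"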
let a = String.fromCharCode(65, 66, 67);
console.log(a); // "ABC"

a = String.fromCharCode(0x2014);
console.log(a); // em dash character (U+2014)

a = String.fromCharCode(0x12014);
console.log(a); // 0x12014 is truncated to 0x2014, so also an em dash

a = String.fromCharCode(8212);
console.log(a); // 8212 === 0x2014, so also an em dash
This method accepts any number of code unit values and concatenates them into a single string:
console.log(String.fromCharCode(0x61, 0x62, 0x63, 0x64, 0x65)); // "abcde"
console.log(String.fromCharCode(97, 98, 99, 100, 101)); // "abcde"
For characters in the range U+0000 to U+FFFF, length, charAt(), charCodeAt(), and fromCharCode() all behave exactly as expected, because each of these characters occupies a single 16-bit code unit.
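A minimal sketch of that one-to-one correspondence: for such characters, charCodeAt() and fromCharCode() round-trip cleanly.

let text = "abc";
console.log(text.length); // 3
console.log(text.charCodeAt(1)); // 98
console.log(String.fromCharCode(text.charCodeAt(1))); // "b"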
The following example uses a smiley face emoji character. The "smiling face with smiling eyes" emoji is U+1F60A, which is above 0xFFFF and therefore requires two code units (a surrogate pair).
// 0x1F60A is above 0xFFFF, so fromCharCode() truncates it and does not produce the emoji
let message = String.fromCharCode(0x1F60A);
console.log(message.length); // 1
console.log(message.charAt(0)); // single character U+F60A, not the emoji
console.log(message.charCodeAt(0)); // 62986 (0xF60A)

// passing the surrogate pair (55357, 56842) does produce the emoji
message = String.fromCharCode(97, 98, 55357, 56842, 100, 101);
console.log(message); // "ab😊de"
console.log(message.length); // 6
console.log(message.charAt(0)); // "a"
console.log(message.charAt(1)); // "b"
console.log(message.charAt(2)); // high surrogate, not printable on its own
console.log(message.charAt(3)); // low surrogate, not printable on its own
console.log(message.charAt(4)); // "d"
console.log(message.charCodeAt(0)); // 97
console.log(message.charCodeAt(1)); // 98
console.log(message.charCodeAt(2)); // 55357
console.log(message.charCodeAt(3)); // 56842
console.log(message.charCodeAt(4)); // 100
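Where do 55357 and 56842 come from? They are the UTF-16 surrogate pair encoding of U+1F60A. A small sketch of the standard encoding formula (the helper name here is illustrative, not part of any library):

// split a supplementary code point (> 0xFFFF) into its UTF-16 surrogate pair
function toSurrogatePair(codePoint) {
  let offset = codePoint - 0x10000;
  let high = 0xD800 + (offset >> 10);  // top 10 bits of the offset
  let low = 0xDC00 + (offset & 0x3FF); // bottom 10 bits of the offset
  return [high, low];
}

console.log(toSurrogatePair(0x1F60A)); // [55357, 56842]
console.log(String.fromCharCode(...toSurrogatePair(0x1F60A))); // "😊"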