Why is “👍”.length === 2?

JavaScript uses UTF-16 (source) to represent strings.
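As a quick sanity check (in Node or a browser console), the behaviour from the title is easy to reproduce; the emoji 👍 is U+1F44D, which UTF-16 stores as the two units 0xD83D and 0xDC4D:

console.log("👍".length)              // 2 – two UTF-16 code units
console.log("a".length)               // 1 – "a" fits in a single code unit
console.log("👍" === "\uD83D\uDC4D")  // true – the emoji is stored as those two units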

UTF-16 can encode 1,112,064 possible characters. Each character is identified by a code point(*). UTF-16 stores a string as a sequence of code units of two bytes (16 bits) each, so a single code unit can distinguish only 65,536 different values.

This means some characters have to be represented with two code units (a surrogate pair).
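You can inspect the surrogate pair of 👍 (code point U+1F44D) directly: charCodeAt reads a single code unit, while codePointAt reassembles the full code point.

console.log("👍".charCodeAt(0).toString(16))   // "d83d" – high surrogate (first code unit)
console.log("👍".charCodeAt(1).toString(16))   // "dc4d" – low surrogate (second code unit)
console.log("👍".codePointAt(0).toString(16))  // "1f44d" – the full code point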

A string's length property returns the number of code units in the string, not the number of characters.

MDN explains this quite well on the page about String.length:

This property returns the number of code units in the string. UTF-16, the string format used by JavaScript, uses a single 16-bit code unit to represent the most common characters, but needs to use two code units for less commonly-used characters, so it’s possible for the value returned by length to not match the actual number of characters in the string.
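If you want to count code points rather than code units, string iteration (spread, Array.from, for...of) walks the string code point by code point. This is just one common workaround, not something the length property does for you:

console.log("a👍b".length)              // 4 – code units
console.log([..."a👍b"].length)         // 3 – code points (spread iterates by code point)
console.log(Array.from("a👍b").length)  // 3 – same result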

(*): Characters in the supplementary range (U+010000 – U+10FFFF) use 4 bytes (32 bits) in UTF-16, i.e. two code units, but this doesn’t change the answer: some characters require more than 2 bytes to be represented, so they need more than one code unit.
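For completeness, this is how the two code units of a supplementary character are computed from its code point (a small sketch of the standard UTF-16 surrogate-pair algorithm, using 👍 / U+1F44D as the example):

const cp = 0x1F44D                     // 👍
const offset = cp - 0x10000            // 0x0F44D
const high = 0xD800 + (offset >> 10)   // 0xD83D – high surrogate
const low = 0xDC00 + (offset & 0x3FF)  // 0xDC4D – low surrogate
console.log(String.fromCharCode(high, low))         // "👍"
console.log(String.fromCharCode(high, low).length)  // 2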

Be careful with how you test this, though. 0x03FFFF does need more than 16 bits (18, to be exact), so the character U+3FFFF takes two code units in UTF-16. The snippet below still logs 1 because String.fromCharCode works on code units, not code points: any argument above 0xFFFF is truncated to 16 bits, so String.fromCharCode(0x03FFFF) produces the single code unit 0xFFFF.

console.log(String.fromCharCode(0x03FFFF).length)  // 1 – the argument was truncated to 0xFFFF
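For comparison, String.fromCodePoint does treat its argument as a full code point, so the result really occupies two code units:

console.log(String.fromCodePoint(0x03FFFF).length)  // 2 – encoded as a surrogate pair
console.log(String.fromCodePoint(0x1F44D))          // "👍"
console.log(String.fromCodePoint(0x1F44D).length)   // 2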
