“Least wrong” sounds very silly. It's as if programmers are discovering that there's a difference between bytes, Unicode code points, and grapheme clusters, are unsure how their favorite programming language represents strings, and then decide there should be some behavior that doesn't follow from the documentation.
The “length of an emoji” depends on the data type used to represent it. It's that simple and that correct.
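A quick sketch in Python makes the point (the family emoji here is one illustrative example; the third-party `grapheme` package mentioned in the comments is one way to count grapheme clusters, not the only one):

```python
# One family emoji: a single grapheme cluster built from four code
# points joined by zero-width joiners (ZWJ, U+200D).
s = "\U0001F469\u200D\U0001F469\u200D\U0001F467\u200D\U0001F466"  # 👩‍👩‍👧‍👦

print(len(s))                           # 7  -> Unicode code points
print(len(s.encode("utf-8")))           # 25 -> UTF-8 bytes
print(len(s.encode("utf-16-le")) // 2)  # 11 -> UTF-16 code units

# Counting grapheme clusters (what a user sees as "one character")
# needs a segmentation library, e.g. the third-party `grapheme`
# package, where grapheme.length(s) == 1.
```

Four different “lengths” for the same emoji, and every one of them follows directly from the representation being measured.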