How to use five-digit Unicode characters in JavaScript
09-06-2021
Question
In JavaScript I can do this:
foo = "\u2669" // 1/4 note
But I can't do this:
foo = "\u1D15D" // whole note (five hex digits)
It will be interpreted as "\u1D15" followed by "D"
Are there any workarounds for this?
UPDATE 2012-07-09: The proposal for ECMAScript Harmony now includes support for all Unicode characters.
Solution
In the MDN documentation for String.fromCharCode, they note that JavaScript will only naturally handle code points up to 0xFFFF. However, they also provide a fixed implementation of fromCharCode that may do what you want (reproduced below):
function fixedFromCharCode(codePt) {
    if (codePt > 0xFFFF) {
        // Encode as a UTF-16 surrogate pair: the high surrogate carries the
        // top 10 bits and the low surrogate the bottom 10 bits of (codePt - 0x10000).
        codePt -= 0x10000;
        return String.fromCharCode(0xD800 + (codePt >> 10), 0xDC00 + (codePt & 0x3FF));
    } else {
        return String.fromCharCode(codePt);
    }
}
foo = fixedFromCharCode(0x1D15D);
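As a quick sanity check (the function is repeated here so the snippet runs on its own), the result for U+1D15D is exactly the hand-written surrogate pair, and on engines that have it, it also matches the modern built-in String.fromCodePoint:

```javascript
// MDN's fixed fromCharCode: splits a supplementary code point into a
// UTF-16 surrogate pair before handing it to String.fromCharCode.
function fixedFromCharCode(codePt) {
    if (codePt > 0xFFFF) {
        codePt -= 0x10000;
        return String.fromCharCode(0xD800 + (codePt >> 10), 0xDC00 + (codePt & 0x3FF));
    }
    return String.fromCharCode(codePt);
}

// U+1D15D (musical whole note) becomes the pair U+D834 U+DD5D.
console.log(fixedFromCharCode(0x1D15D) === "\uD834\uDD5D");                // true
console.log(fixedFromCharCode(0x1D15D) === String.fromCodePoint(0x1D15D)); // true
console.log(fixedFromCharCode(0x41));                                      // "A" — BMP code points pass through unchanged
```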
Other tips
Try putting the Unicode code point between curly braces: '\u{1D15D}'.
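For example (a small sketch, assuming an ES2015+ engine, where '\u{...}' escapes accept a full code point directly):

```javascript
const whole = '\u{1D15D}'; // MUSICAL SYMBOL WHOLE NOTE
console.log(whole === '\uD834\uDD5D'); // true — same surrogate pair underneath
console.log(whole.length);             // 2 — still stored as two UTF-16 code units
```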
Nowadays, you can simply use String.fromCodePoint(), as documented on MDN. For instance:
> String.fromCodePoint(0x1f0a1)
"🂡"
I did a little checking, and it appears there is no whole note near 0x2669 (see the table of Unicode chars). Using 0x01D15D does give me an unknown Unicode character, though that could just be because I don't have a music font installed. JavaScript will parse as many bytes as it can, and 0x1D15D is 2.5 bytes; padding it with a 0 makes it 3 bytes and parsable.
Also, this was quite handy: unicode.org/charts/PDF/U1D100.pdf
You can use this:
function fromOutsideBMP(cp) {
    // Encode a code point outside the BMP (e.g. 0x1D120) as a surrogate pair.
    var x = cp - 0x10000;
    var top10 = parseInt("11111111110000000000", 2); // mask for the high 10 bits
    var end10 = parseInt("1111111111", 2);           // mask for the low 10 bits
    var part1 = (x & top10) / 1024 + 0xD800; // high surrogate
    var part2 = (x & end10) + 0xDC00;        // low surrogate
    return String.fromCharCode(part1) + String.fromCharCode(part2);
}
Example:
> fromOutsideBMP(0x01d122)
"𝄢"
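For what it's worth, a quick comparison against the built-in String.fromCodePoint (assuming an engine that supports it; the function is repeated so the snippet is self-contained) shows the manual encoding agrees:

```javascript
function fromOutsideBMP(cp) {
    var x = cp - 0x10000;
    var top10 = parseInt("11111111110000000000", 2); // mask for the high 10 bits
    var end10 = parseInt("1111111111", 2);           // mask for the low 10 bits
    var part1 = (x & top10) / 1024 + 0xD800; // high surrogate
    var part2 = (x & end10) + 0xDC00;        // low surrogate
    return String.fromCharCode(part1) + String.fromCharCode(part2);
}

// 0x1D122 is MUSICAL SYMBOL F CLEF ("𝄢").
console.log(fromOutsideBMP(0x1D122) === String.fromCodePoint(0x1D122)); // true
console.log(fromOutsideBMP(0x1D122) === "\uD834\uDD22");                // true
```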