How do I convert a 16-bit UCS-2 integer value into a char?
04-12-2019
Question
I am parsing values from a binary file. One of the values is a 16-bit number representing the UCS-2 encoding of a Unicode character. I'm converting it to a character like this:
char c = (char)myInteger;
Is this safe?
Solution
Yes. In languages where `char` is a 16-bit UTF-16 code unit (such as Java and C#), the cast is safe: UCS-2 covers exactly the Basic Multilingual Plane, so every UCS-2 value maps directly to a single `char` with the same numeric value. The only thing to get right is byte ordering: make sure you assemble the two bytes from the file in the file's endianness (little- or big-endian) before casting, or the resulting character will be wrong.
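A minimal Java sketch of the idea, assuming a little-endian file layout (the byte values and the `ucs2ToChar` helper name are illustrative, not from the question):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Ucs2Demo {
    // A 16-bit UCS-2 code unit maps directly onto Java's char,
    // which is an unsigned 16-bit UTF-16 code unit.
    static char ucs2ToChar(int value) {
        return (char) value;
    }

    public static void main(String[] args) {
        // Hypothetical example: two bytes as read from a little-endian binary file.
        byte[] fileBytes = { (byte) 0x41, (byte) 0x00 }; // 0x0041 = 'A'

        // Assemble the 16-bit value using the file's byte order before casting.
        ByteBuffer buf = ByteBuffer.wrap(fileBytes).order(ByteOrder.LITTLE_ENDIAN);
        int myInteger = buf.getShort() & 0xFFFF; // mask to keep the value unsigned

        char c = ucs2ToChar(myInteger);
        System.out.println(c); // prints: A
    }
}
```

If the same two bytes were read in the wrong order, the value would be 0x4100 instead of 0x0041, which is why the byte-ordering caveat matters.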
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow