You should not use wide types at all in your case.
Assuming you can get a char * from your vector<char>, you can stick to bytes by using the following code:
char * utf16_buffer = &my_vector_of_chars[0];
char * buffer_end = utf16_buffer + my_vector_of_chars.size();
std::string utf8_str = boost::locale::conv::between(utf16_buffer, buffer_end, "UTF-8", "UTF-16");
between operates on 8-bit characters and allows you to avoid conversion to 16-bit characters altogether.
It is necessary to use the between overload that takes a pointer to the end of the buffer: by default, between stops at the first '\0' character in the string, and with UTF-16 input that happens almost immediately, since every ASCII character encodes with a zero byte.