I'm having problems converting a wstring to a jstring on Unix: the size of wchar_t on Linux is 4 bytes (not 2 bytes as on Windows), so I cannot simply cast a wchar_t to a jchar.
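For reference, a minimal snippet showing the size mismatch I mean (the 4-byte result assumes glibc on Linux):

#include <iostream>

int main()
{
    // wchar_t is 4 bytes (UTF-32) on Linux/glibc but 2 bytes on Windows;
    // jchar is always a 2-byte UTF-16 code unit, so a plain per-character
    // cast would drop or mangle anything outside the BMP.
    std::cout << "sizeof(wchar_t)  = " << sizeof(wchar_t)  << '\n';
    std::cout << "sizeof(char16_t) = " << sizeof(char16_t) << '\n'; // same size as jchar
}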

Can anyone please help me with that?

Thanks, Reza


Solution

You have to use something like iconv(), because C++ wide strings have an opaque (read: implementation-defined) encoding, while Java expects UTF-16. Try this:

#include <iconv.h>
#include <cerrno>
#include <iostream>
#include <string>
#include <vector>

std::u16string convert(const std::wstring & s)
{
  // "WCHAR_T" is glibc's pseudo-encoding for the platform's wchar_t.
  // Note: "UTF-16BE" yields big-endian code units; on a little-endian machine
  // use "UTF-16LE" so the resulting char16_t values are in native order.
  iconv_t cd = iconv_open("UTF-16BE", "WCHAR_T");

  if (cd == iconv_t(-1))
  {
    std::cout << "Error while initializing iconv: " << errno << std::endl;
    return std::u16string();
  }

  // Each character might use up to two code units; +1 keeps the buffer non-empty.
  std::vector<char16_t> obuf(s.length() * 2 + 1);

  std::size_t n = obuf.size() * sizeof(char16_t);                 // output space, in bytes
  const std::size_t norig = n;
  std::size_t m = s.length() * sizeof(std::wstring::value_type);  // input size, in bytes

  char * outbuf = reinterpret_cast<char*>(obuf.data());
  const char * inbuf = reinterpret_cast<const char*>(s.data());

  const std::size_t ir = iconv(cd, const_cast<char**>(&inbuf), &m, &outbuf, &n);

  if (ir == std::size_t(-1))
  {
    std::cout << "Error while converting with iconv(): " << errno
              << ", " << m << " input bytes left, "
              << norig - n << " bytes written." << std::endl;
    iconv_close(cd);
    return std::u16string();
  }

  iconv_close(cd);

  return std::u16string(obuf.data(), (norig - n) / sizeof(char16_t));
}

If you don't have char16_t and std::u16string, you can use uint16_t as the basic character type and std::basic_string<uint16_t> or std::vector<uint16_t> as the resulting container.
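Once you have the UTF-16 code units, handing them to JNI is straightforward, since jchar is a 16-bit UTF-16 code unit. A minimal sketch, assuming the char16_t values are in native byte order (see the endianness note in the code above) and reusing the convert() function; to_jstring is just an illustrative helper name:

#include <jni.h>
#include <string>

std::u16string convert(const std::wstring & s); // the function above

jstring to_jstring(JNIEnv * env, const std::wstring & ws)
{
    // NewString expects a pointer to UTF-16 code units plus a length in code units.
    const std::u16string u16 = convert(ws);
    return env->NewString(reinterpret_cast<const jchar*>(u16.data()),
                          static_cast<jsize>(u16.size()));
}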
