Question

I cannot compile any MATLAB MEX code due to the following error:

In file included from /Applications/MATLAB_R2013a.app/extern/include/mex.h:58:
In file included from /Applications/MATLAB_R2013a.app/extern/include/matrix.h:294:
/Applications/MATLAB_R2013a.app/extern/include/tmwtypes.h:819:9: error: unknown type name 'char16_t'
typedef char16_t CHAR16_T;

The only thing that has changed on my machine as far as I can remember is that Xcode was updated to version 5.1 (5B130a).

Any fix for the time being to compile MEX code in MATLAB?

[Running on OS 10.9.2 with Apple LLVM version 5.1 (clang-503.0.38) (based on LLVM 3.4svn)]

Solution

By default (i.e., without C++11 enabled), the upgraded Clang doesn't provide the char16_t type, which MATLAB's headers require.

Quick fix

This works for C or C++ code but needs to be done on each mex command line.

>> mex -Dchar16_t=uint16_t ...
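
For example, with a trivial gateway saved as hello.c (a hypothetical file name, used here only to illustrate the command), the whole quick fix looks like this:

/* hello.c -- minimal MEX gateway used only to check that compilation works. */
/* The file name and message are illustrative, not taken from the answers.   */
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    mexPrintf("MEX compilation works.\n");
}

compiled from the MATLAB prompt with the extra define:

>> mex -Dchar16_t=uint16_t hello.c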

Other solutions below put this definition into the mex configuration or enable C++11.

Permanent solution

Options:

  • Add -std=c++11 to CXXFLAGS in your mex configuration file and compile .cpp files instead of .c. The config file is mexopts.sh (pre-R2014a) or the .xml file indicated by mex -setup (R2014a and later). This is what worked for the OP, but the next option works too. Be sure to edit the active/installed configuration, not the system-wide reference copy; if you can't tell which one is in use, try the next solution instead.
  • Use a #define or typedef to create char16_t before including mex.h (see "other workaround" below).
  • A future version of MATLAB will presumably ship with this fixed, in which case re-running mex -setup will regenerate a working configuration. As of R2014a, this doesn't do the trick yet.
  • As a last resort, you can hack MATLAB's tmwtypes.h as Dennis suggests, but I strongly recommend against modifying the MATLAB installation.

Note: If you are using C and cannot or don't want to switch to C++, follow the solution in this other answer, or see the alternative workaround below.


The other workaround

If for some reason you are not able to enable the C++11 standard, you can use the preprocessor to define char16_t. Either put #define char16_t uint16_t before #include "mex.h", or pass the definition on the compiler command line:

-Dchar16_t=uint16_t

Alternatively, use a typedef, again before including mex.h:

typedef uint16_t char16_t;

If these don't work, try changing uint16_t to UINT16_T (the typedef MATLAB's tmwtypes.h uses). Some have also reported that simply including uchar.h brings in the type, but not every system provides that header.
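
Putting this together, a minimal C source using the typedef variant might look like the sketch below (the file name and the <stdint.h> include are my additions; swap uint16_t for UINT16_T if that is what works on your setup):

/* char16_workaround.c -- illustrative sketch of the typedef workaround for C, */
/* assuming uint16_t is available from <stdint.h>.                             */
#include <stdint.h>

typedef uint16_t char16_t;   /* must appear before mex.h is included */
#include "mex.h"

void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
{
    mexPrintf("char16_t typedef workaround in effect.\n");
}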

OTHER TIPS

I experienced the same error, also directly after upgrading to Xcode 5.1.

The relevant lines (818-824) in tmwtypes.h, which cause the error, are:

#if defined(__STDC_UTF_16__) || (defined(_HAS_CHAR16_T_LANGUAGE_SUPPORT) && _HAS_CHAR16_T_LANGUAGE_SUPPORT)
typedef char16_t CHAR16_T;
#elif defined(_MSC_VER)
typedef wchar_t CHAR16_T;
#else
typedef UINT16_T CHAR16_T;
#endif
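
One way to see why the first branch is the one that fires (this check is my addition, not part of the original answer) is to dump Clang's predefined macros:

clang -dM -E -x c /dev/null | grep __STDC_UTF_16__

If __STDC_UTF_16__ shows up even though char16_t is not available as a type (as in C, or in C++ without -std=c++11), tmwtypes.h takes the first branch and the typedef at line 819 fails.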

A solution is to simply change the line

typedef char16_t CHAR16_T;

into

typedef UINT16_T CHAR16_T;

I must admit that I don't know whether this affects any function or behaviour of MEX files, but at least I'm able to compile my C files again using mex.

Please see other answers if this method doesn't work.

I upgraded my gcc/g++ compilers to version 4.8 using Homebrew (gcc-4.8 and g++-4.8).

After that I changed the following line in the mexopts.sh file:

CXXFLAGS="-fno-common -fexceptions -arch $ARCHS -isysroot $MW_SDKROOT -mmacosx-version-min=$MACOSX_DEPLOYMENT_TARGET -std=c++11"

In my mexopts.sh, this is line 150. I only added the -std=c++11 flag, which I guess is what chappjc meant.

EDIT: This is covered in the update by chappjc!

I'll just add my own experience (C++ only). The

#define char16_t uint16_t 

was causing problems in other parts of the MEX file: further down the include chain, char16_t gets properly defined, which clashes with the macro. By tracking the chain of includes, I found that the proper char16_t type is set in a file named __config:

typedef __char16_t char16_t;

which is also the first file included from <algorithm>. So the hack consists of including <algorithm> before mex.h:

#include <algorithm>
#include "mex.h"

and the proper definitions are in place, still in a cross-platform way and without changing anything in the build configuration.

Including uchar.h before mex.h works fine. Also, the answer above (adding -std=c++11) only works for C++, not C.

#include <uchar.h>
#include "mex.h"

As of Xcode 5.1.1, char16_t is defined in __config, which is included from <typeinfo>.

You can add

#include <typeinfo> 

before

#include "mex.h"

to have char16_t defined.

This post might help: http://www.seaandsailor.com/matlab-xcode6.html

It was easier than I thought. Just replace every 10.x with your OS X version and add -Dchar16_t=UINT16_T to CLIBS in the mexopts.sh file.

It worked on OS X 10.9 Mavericks with Xcode 6 installed.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow