Question

At some point in our code, we want to know the system data type of some values. So we do a check like this on an incoming NSNumber:

if (strcmp([numberObject objCType], @encode(NSInteger)) == 0)
{ /* tag as integer */ }

Doing this on the 64-bit simulator with BOOLs put into an NSNumber gives strange results.

NSNumber *foo = [NSNumber numberWithBool:YES];
if (strcmp([foo objCType], @encode(BOOL)) == 0)
{ /* this should work, but it does not on 64-bit */ }

As a fallback, we could use something like:

if (strcmp([foo objCType], [[NSNumber numberWithBool:YES] objCType]) == 0)
{ /* this works, but looks like too much work for the processor */ }

Compiling for the 32-bit simulator, it works perfectly fine. (BOOL looks like a 'char' type in both cases, but the comparison does not give the expected result on 64-bit.)

So, does anyone have any idea why @encode(BOOL) doesn't match [foo objCType] when the number is initialized with numberWithBool:?


Solution

So I did a little research and executed this:

NSNumber *foo = [NSNumber numberWithBool: YES];
NSLog(@"encode BOOL: %s", @encode(BOOL));
NSLog(@"encode boolean: %s", @encode(Boolean));
NSLog(@"encode bool: %s", @encode(bool));
NSLog(@"encode char: %s", @encode(char));
NSLog(@"object: %s", [foo objCType]);

and got this result in the 64-bit simulator:

encode BOOL: B
encode boolean: C
encode bool: B
encode char: c
object: c

and this on the 32-bit simulator:

encode BOOL: c
encode boolean: C
encode bool: B
encode char: c
object: c

So the problem is that on 32-bit, @encode(BOOL) returns 'c', but on 64-bit it returns 'B', while objCType on a number created with numberWithBool: gives you 'c' on both 32-bit and 64-bit.
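Given the encodings above, one portable workaround is to accept either encoding when tagging booleans. A minimal C sketch (the helper name `is_bool_encoding` is my own, not Foundation API), assuming BOOL is encoded as "c" on 32-bit Apple platforms and "B" on 64-bit:

```c
#include <string.h>

/* Hypothetical helper: returns 1 if an Objective-C type-encoding
 * string could represent a BOOL stored in an NSNumber.
 * On 32-bit Apple platforms BOOL is a signed char, encoded as "c";
 * on 64-bit it is a C99 bool, encoded as "B". Since numberWithBool:
 * reports "c" on both architectures, a robust check accepts either
 * encoding. Caveat: "c" also matches an NSNumber created from a
 * plain char, so the encoding alone cannot tell those two apart. */
static int is_bool_encoding(const char *enc)
{
    return strcmp(enc, "c") == 0 || strcmp(enc, "B") == 0;
}
```

In Objective-C this would be used as `if (is_bool_encoding([foo objCType])) { /* tag as boolean */ }`. If an exact answer is needed, note that on Apple platforms numberWithBool: returns the shared CFBoolean singletons, so `CFGetTypeID((__bridge CFTypeRef)foo) == CFBooleanGetTypeID()` can identify a boolean NSNumber unambiguously.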

Licensed under: CC-BY-SA with attribution