Question

Please look at the code below and the result in console.

NSString *strRatio = @"0.03" ;
float f = [strRatio floatValue] ;
NSLog(@"%f, %@", f, f == 0.03 ? @"equal" : @"not equal") ;

result:

0.030000, not equal

I also have a screenshot: when I add a breakpoint at NSLog(@"%f, %@", f, f == 0.03 ? @"equal" : @"not equal");, the debugger shows a different value of f: 0.0299999993...

Can anyone explain it?

  1. Why is the result of f == 0.03 false?
  2. Why is the value of f printed as 0.030000, but shown as 0.0299999993 in the debugger?


Edit:

I expect the value of f to be exactly 0.03 after converting from @"0.03"; how can I achieve that?

It seems that float can't represent 0.03 exactly. Even if I assign 0.03 to a float variable directly, I still get 0.0299999993 as the result.
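For example, here is a minimal check (just assigning the literals, no string parsing) that shows the stored values:

float f = 0.03f;   // nearest float to 0.03
double d = 0.03;   // nearest double to 0.03
NSLog(@"%.10f", f);   // 0.0299999993
NSLog(@"%.10f", d);   // 0.0300000000
NSLog(@"%@", f == d ? @"equal" : @"not equal");   // not equal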


Solution

The value of f is not 0.03; it is exactly what the debugger shows: 0.0299999993... The value 0.03 cannot be represented exactly in binary floating point, and the comparison f == 0.03 is false because f holds the nearest float to 0.03 while the literal 0.03 is a double; when f is promoted to double for the comparison, the two values are not equal.

It shows as 0.030000 in the log because, by default, %f prints 6 decimal places, so the value 0.0299999993 is rounded to 0.030000.

Change the log to use %.10f and you will see the real value.
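For example, printing the same f from the question with more precision (a minimal sketch of that change):

NSString *strRatio = @"0.03";
float f = [strRatio floatValue];
NSLog(@"%.10f", f);   // 0.0299999993 -- the value %f was rounding to 0.030000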

OTHER TIPS

Try NSDecimalNumber instead of [string floatValue];

NSDecimalNumber *number1 = [NSDecimalNumber decimalNumberWithString:@"0.03"];

NSLog(@"number1: %@", number1); //0.03