According to the documentation, [NSDecimalNumber decimalNumberWithString:] should use the locale's decimal separator:

Whether the NSDecimalSeparator is a period (as is used, for example, in the United States) or a comma (as is used, for example, in France) depends on the default locale.

But when I try it, this code:

NSLog(@"%@", [NSDecimalNumber decimalNumberWithString:@"100,1"]);
NSLog(@"%@", [NSDecimalNumber decimalNumberWithString:@"100,1" locale:NSLocale.currentLocale]);

Gives...

100
100.1

...as output on both iOS 5 and iOS 6. I've tried with Swedish and French regional settings, as both of these countries use a comma (,) as the decimal separator.

Shouldn't the output be the same?

(I know I can use [NSDecimalNumber decimalNumberWithString:locale:] to force the behavior, so this question is not about finding an alternative, just whether this is a bug or I'm doing something wrong.)

Solution

NSDecimalNumber is simply a storage class for number-type data. It runs a parser (NSNumberFormatter) on the string you pass it to create its number. The reason your second log statement works "better" is that the first one parses with the default number format locale (it looks like en_US, but I can't verify this; see the edit below for more information), and "100,1" isn't a valid number in that locale, so the "non-number" part gets stripped off. By specifying a locale that uses "," as the decimal separator, it captures the full number properly.
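
To make that concrete, here's a minimal sketch (fr_FR is just an example of a locale whose decimal separator is a comma; the point is the contrast between the two calls):

NSLocale *french = [[NSLocale alloc] initWithLocaleIdentifier:@"fr_FR"];

// No locale: the parser stops at the comma, so only "100" survives.
NSDecimalNumber *truncated = [NSDecimalNumber decimalNumberWithString:@"100,1"];

// Comma-separator locale: the full value is parsed.
NSDecimalNumber *full = [NSDecimalNumber decimalNumberWithString:@"100,1" locale:french];

NSLog(@"%@ / %@", truncated, full); // 100 / 100.1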

When you NSLog() an NSDecimalNumber, it simply calls -description, which has no locale context and can print, more or less, whatever it wants.
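
That said, if all you want is a locale-aware string straight from the number, NSDecimalNumber also implements -descriptionWithLocale:. A minimal sketch (note this only swaps the decimal separator; it doesn't do grouping the way NSNumberFormatter can):

NSDecimalNumber *number = [NSDecimalNumber decimalNumberWithString:@"100.1"];

// -description ignores locale; -descriptionWithLocale: uses the one you pass.
NSLog(@"%@", [number descriptionWithLocale:[NSLocale currentLocale]]);
// Should print "100,1" under a French or Swedish locale, "100.1" under a US one.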

If you want to print properly formatted numbers, use NSNumberFormatter, like so:

NSDecimalNumber *number = [NSDecimalNumber decimalNumberWithString:@"100.1"];
NSLog(@"%@", number); // -description: always "100.1", regardless of locale

NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
[formatter setNumberStyle:NSNumberFormatterDecimalStyle];

NSLocale *locale = [[NSLocale alloc] initWithLocaleIdentifier:@"fr_FR"];
[formatter setLocale:locale];

NSLog(@"%@", [formatter stringFromNumber:number]); // "100,1" in fr_FR

Or, briefly

NSDecimalNumber *number = [NSDecimalNumber decimalNumberWithString:@"100.1"];
NSLog(@"%@", [NSNumberFormatter localizedStringFromNumber:number numberStyle:NSNumberFormatterDecimalStyle]);

if you just want to use the current locale.

In summary:

  1. NSDecimalNumber is just storage. Logging it does not reflect anything about the locale.
  2. To get NSDecimalNumber to store a number properly, the locale you pass when parsing needs to match the locale of the input (+[NSLocale currentLocale] is a good choice for user input); see the sketch after this list.
  3. To display numbers formatted correctly for a given locale, use NSNumberFormatter.
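
Putting 2 and 3 together, here's a sketch of the full round trip; userInput is just a hypothetical stand-in for something typed in a comma-separator locale:

NSString *userInput = @"100,1"; // hypothetical user input

// Parse with the locale the input was typed in.
NSDecimalNumber *value = [NSDecimalNumber decimalNumberWithString:userInput
                                                            locale:[NSLocale currentLocale]];

// Format it back out for display; NSNumberFormatter defaults to the current locale.
NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
[formatter setNumberStyle:NSNumberFormatterDecimalStyle];
NSLog(@"%@", [formatter stringFromNumber:value]); // e.g. "100,1" in fr_FR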

Edit:

Ok, I've done some more research on this.

In GNUstep, it looks like it ends up using the value of NSDecimalSeparator in NSUserDefaults (from a quick browse of their code).

Doing some experimentation, I've found that none of the following affect the default parsing behavior, as far as I can tell (a sketch of the kind of check I ran is below the list):

  1. NSDecimalSeparator in NSUserDefaults.
  2. AppleLocale in NSUserDefaults.
  3. NSLocaleCode in NSUserDefaults.
  4. The value set for CFBundleDevelopmentRegion.
  5. The Environment's LANG/LC_ALL/etc... values.
  6. +[NSLocale systemLocale].

And obviously it is not +[NSLocale currentLocale], as this question stems from the fact that the current locale has no effect.
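
For reference, a sketch of the kind of check behind point 1 (not the exact test code, just the idea):

// Force a decimal separator via NSUserDefaults, then re-run the locale-less parse.
[[NSUserDefaults standardUserDefaults] setObject:@","
                                          forKey:@"NSDecimalSeparator"];
NSLog(@"%@", [NSDecimalNumber decimalNumberWithString:@"100,1"]); // still logs 100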
