Question

I want to use the new iOS 7 speech synthesis API, and my application is localized in French & English.

For this to work, two things have to be localized:

  • speech text: I put it in the usual Localizable.strings file and retrieve it in code using the NSLocalizedString macro.

  • speech language: an AVSpeechSynthesisVoice has to be chosen for the corresponding language.

The class instantiation method is [AVSpeechSynthesisVoice voiceWithLanguage:(NSString *)lang]. I'm currently using [NSLocale currentLocale].localeIdentifier as the parameter for this method.
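
Here is a minimal sketch of my current setup (MY_SPEECH_KEY is just a placeholder key):

    // speech text, resolved from Localizable.strings
    NSString *text = NSLocalizedString(@"MY_SPEECH_KEY", nil);
    AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:text];
    // speech language, taken from the device locale; this is the part that goes wrong
    utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:[NSLocale currentLocale].localeIdentifier];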

Problem: if the user's device language is Portuguese, [NSLocale currentLocale] selects Portuguese pronunciation, while the text resolved by NSLocalizedString is English.

How can I know which locale is currently used by NSLocalizedString?

The solution

OK, I finally managed to make sense of Apple's APIs:

  • [NSLocale currentLocale]: DOESN'T return the current language picked by the user in Settings > General > International, but the region code selected by the user on the same screen.

  • [NSLocale preferredLanguages]: this list DOES give the device language; it's the first string in the list.

  • [[NSBundle mainBundle] preferredLocalizations] returns the localization resolved by the application. I guess this is what NSLocalizedString uses. It only has one object in my case, but I wonder in which cases it can have more than one.

  • [AVSpeechSynthesisVoice currentLanguageCode] returns the language code of the default system voice, as determined by the OS.

  • the [AVSpeechSynthesisVoice voiceWithLanguage:] class instantiation method needs a complete language code, with language AND region (e.g. passing @"en" to it will return a nil object; it needs @"en-US", @"en-GB"...).
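
To see these values side by side, here's a quick logging snippet; the commented values are what I'd expect on a Portuguese device running this English/French app, purely illustrative:

    NSLog(@"currentLocale:          %@", [NSLocale currentLocale].localeIdentifier);                  // e.g. pt_PT (region!)
    NSLog(@"preferredLanguages:     %@", [NSLocale preferredLanguages].firstObject);                  // e.g. pt
    NSLog(@"preferredLocalizations: %@", [[NSBundle mainBundle] preferredLocalizations].firstObject); // e.g. en
    NSLog(@"currentLanguageCode:    %@", [AVSpeechSynthesisVoice currentLanguageCode]);               // e.g. pt-PT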

So this is what my final code looks like:

    // default system voice language, and the language our app bundle resolved to
    NSString *voiceLangCode = [AVSpeechSynthesisVoice currentLanguageCode];
    NSString *defaultAppLang = [[[NSBundle mainBundle] preferredLocalizations] firstObject];

    // nil voice will use the default system voice
    AVSpeechSynthesisVoice *voice = nil;

    // is the default voice language compatible with our application language?
    if ([voiceLangCode rangeOfString:defaultAppLang].location == NSNotFound) {
        // if not, select a voice matching the application language
        NSString *pickedVoiceLang = nil;
        if ([defaultAppLang isEqualToString:@"en"]) {
            pickedVoiceLang = @"en-US";
        } else {
            pickedVoiceLang = @"fr-FR";
        }
        voice = [AVSpeechSynthesisVoice voiceWithLanguage:pickedVoiceLang];
    }

    AVSpeechUtterance *mySpeech = [[AVSpeechUtterance alloc] initWithString:NSLocalizedString(@"MY_SPEECH_LOCALIZED_KEY", nil)];
    mySpeech.voice = voice;

This way, a user from New Zealand, Australia, Great Britain, or Canada will get the voice that corresponds most closely to their usual settings.
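
For completeness: nothing is actually spoken until the utterance is handed to an AVSpeechSynthesizer. A minimal sketch, assuming a synthesizer property on the owning class (keep a strong reference, or speech can stop when the synthesizer is deallocated):

    // keep the synthesizer in a property so it isn't deallocated mid-speech
    self.synthesizer = [[AVSpeechSynthesizer alloc] init];
    [self.synthesizer speakUtterance:mySpeech];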

Other tips

Vinzzz's answer was a great start -- I've generalised it to work with any language:

NSString *language = [[[NSBundle mainBundle] preferredLocalizations] objectAtIndex:0];
NSString *voiceLangCode = [AVSpeechSynthesisVoice currentLanguageCode];
if (![voiceLangCode hasPrefix:language]) {
    // the default voice can't speak the language the text is localized to;
    // switch to a compatible voice:
    NSArray *speechVoices = [AVSpeechSynthesisVoice speechVoices];
    for (AVSpeechSynthesisVoice *speechVoice in speechVoices) {
        if ([speechVoice.language hasPrefix:language]) {
            self.voice = speechVoice;
            break;
        }
    }
}