Question

With Glass you can launch an app via the 'OK, Glass' menu, and it seems to pick the nearest match unless a command is miles off, and you can obviously see the list of commands.
Is there any way, from within the app or from a voice prompt (after the initial app trigger), to show a similar list and have the nearest match returned?

Random (non-real world) example, an app that shows you a colour, "OK Glass, show the colour red"

'show the colour' could be your voice trigger, and seems to be matched by Glass on a 'nearest neighbor' basis; however, 'red' is just read in as free text and could easily be misheard as 'dread' or 'head', or even 'read', as there is no way of differentiating 'read' from 'red'.

Is there a way to pass a list of pre-approved options (red, green, blue, orange*, etc.) to this stage, or to another voice prompt within the app, so the user can see the list and get more accurate results when there is a finite set of expected responses (like the main 'OK, Glass' screen)?

*ok well nothing rhymes with orange, we're probably safe there


Solution

The Google GDK doesn't support this feature yet. However, the necessary functionality is already available in the system libraries on the device, and you can use it until the GDK supports this natively. What you have to do:

  1. Pull the GlassVoice.apk from your Glass: adb pull /system/app/GlassVoice.apk

  2. Use dex2jar to convert this apk into a jar file.

  3. Add the jar file to your build path
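Taken together, the three steps above look roughly like this on the command line (the `d2j-dex2jar.sh` script name comes from the dex2jar distribution, and the `libs/` location assumes a standard Android project layout; adjust both to your setup):

```shell
# 1. Pull the system voice app from the device (Glass must have debugging enabled)
adb pull /system/app/GlassVoice.apk

# 2. Convert the Dalvik bytecode into a plain jar with dex2jar
d2j-dex2jar.sh GlassVoice.apk -o GlassVoice.jar

# 3. Put the jar on your project's build path
mkdir -p libs
cp GlassVoice.jar libs/
```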

Now you can use this library like this:

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;

// These classes come from the GlassVoice.apk jar you just added to the build
// path; the package names below may differ between Glass releases, so verify
// them against the contents of your jar.
import com.google.glass.logging.FormattingLogger;
import com.google.glass.logging.FormattingLoggers;
import com.google.glass.voice.VoiceCommand;
import com.google.glass.voice.VoiceConfig;
import com.google.glass.voice.VoiceInputHelper;
import com.google.glass.voice.VoiceListener;

public class VoiceActivity extends Activity {

    private VoiceInputHelper mVoiceInputHelper;
    private VoiceConfig mVoiceConfig;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.voice_activity);

        // Only these phrases will be offered to the recognizer for matching.
        String[] items = {"red", "green", "blue", "orange"};
        mVoiceConfig = new VoiceConfig("MyVoiceConfig", items);
        mVoiceInputHelper = new VoiceInputHelper(this, new MyVoiceListener(mVoiceConfig),
                VoiceInputHelper.newUserActivityObserver(this));
    }

    @Override
    protected void onResume() {
        super.onResume();
        mVoiceInputHelper.addVoiceServiceListener();
    }

    @Override
    protected void onPause() {
        super.onPause();
        mVoiceInputHelper.removeVoiceServiceListener();
    }

    public class MyVoiceListener implements VoiceListener {
        protected final VoiceConfig voiceConfig;

        public MyVoiceListener(VoiceConfig voiceConfig) {
            this.voiceConfig = voiceConfig;
        }

        @Override
        public void onVoiceServiceConnected() {
            mVoiceInputHelper.setVoiceConfig(mVoiceConfig, false);
        }

        @Override
        public void onVoiceServiceDisconnected() {

        }

        @Override
        public VoiceConfig onVoiceCommand(VoiceCommand vc) {
            // getLiteral() returns the phrase that was matched against the config.
            String recognizedStr = vc.getLiteral();
            Log.i("VoiceActivity", "Recognized text: " + recognizedStr);

            return voiceConfig;
        }

        @Override
        public FormattingLogger getLogger() {
            return FormattingLoggers.getContextLogger();
        }

        @Override
        public boolean isRunning() {
            return true;
        }

        @Override
        public boolean onResampledAudioData(byte[] arg0, int arg1, int arg2) {
            return false;
        }

        @Override
        public boolean onVoiceAmplitudeChanged(double arg0) {
            return false;
        }

        @Override
        public void onVoiceConfigChanged(VoiceConfig arg0, boolean arg1) {

        }
    }

}
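Even with a restricted config, it can be worth normalising whatever `getLiteral()` hands back against your approved list before acting on it. A minimal plain-Java sketch, with no Glass dependencies (`ColorMatcher` and `matchColor` are illustrative names, not part of any library):

```java
public class ColorMatcher {

    /**
     * Returns the approved item matching the spoken literal
     * (case-insensitive, whitespace-trimmed), or null if nothing matches.
     */
    public static String matchColor(String literal, String[] approved) {
        if (literal == null) return null;
        String normalised = literal.trim().toLowerCase();
        for (String item : approved) {
            if (item.equals(normalised)) return item;
        }
        return null;
    }

    public static void main(String[] args) {
        String[] items = {"red", "green", "blue", "orange"};
        System.out.println(matchColor(" Red ", items)); // red
        System.out.println(matchColor("dread", items)); // null
    }
}
```

You would call something like this from `onVoiceCommand` and ignore (or re-prompt on) a null result instead of trusting the raw text.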

OTHER TIPS

You can take advantage of the disambiguation step that occurs when multiple Activities or Services support the same voice trigger: simply have multiple Activities or Services in your application support "show me the color" as the voice trigger, and label each of them with one of the color options.

Your manifest would look something like:

<application
        android:allowBackup="true"
        android:label="@string/app_name"
        android:icon="@drawable/icon_50"
        >

    <activity
            android:name="com.mycompany.RedActivity"
            android:label="@string/red"
            android:icon="@drawable/icon_red"
            >
        <intent-filter>
            <action android:name="com.google.android.glass.action.VOICE_TRIGGER"/>
        </intent-filter>
        <meta-data
                android:name="com.google.android.glass.VoiceTrigger"
                android:resource="@xml/activity_start"
                />
    </activity>

    <activity
            android:name="com.mycompany.BlueActivity"
            android:label="@string/blue"
            android:icon="@drawable/icon_blue"
            >
        <intent-filter>
            <action android:name="com.google.android.glass.action.VOICE_TRIGGER"/>
        </intent-filter>
        <meta-data
                android:name="com.google.android.glass.VoiceTrigger"
                android:resource="@xml/activity_start"
                />
    </activity>
    <!-- ... -->
</application>

Those Activities or Services would only be used as a "trampoline" to launch the main logic of your app with the color selection.
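Such a trampoline can be as small as this sketch (`MainActivity` and the `"color"` extra key are illustrative names for your app's real entry point, not anything prescribed by the GDK):

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;

// Launched by the "show me the color" voice trigger labelled "red";
// it immediately forwards the selection to the real Activity and exits,
// leaving no UI of its own behind.
public class RedActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = new Intent(this, MainActivity.class);
        intent.putExtra("color", "red"); // illustrative extra key
        startActivity(intent);
        finish();
    }
}
```

One such Activity per color keeps the manifest verbose but the logic trivial.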

If you haven't already, you should take a look at the contextual voice menus that were added to the GDK just a few weeks ago. I ran into exactly this problem the day before they were released; looking the next day and finding them helped me a lot! :)
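For reference, the contextual voice menu flow in the GDK looks roughly like this (based on the `com.google.android.glass.view.WindowUtils` API; `R.menu.colors` is an assumed menu resource defining one item per approved color, and `R.layout.color_activity` is an assumed layout):

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.Menu;
import android.view.MenuItem;
import com.google.android.glass.view.WindowUtils;

public class ColorActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Ask Glass to listen for voice commands while this Activity is live.
        getWindow().requestFeature(WindowUtils.FEATURE_VOICE_COMMANDS);
        setContentView(R.layout.color_activity);
    }

    @Override
    public boolean onCreatePanelMenu(int featureId, Menu menu) {
        if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
            // The menu items define the finite set of spoken options.
            getMenuInflater().inflate(R.menu.colors, menu);
            return true;
        }
        return super.onCreatePanelMenu(featureId, menu);
    }

    @Override
    public boolean onMenuItemSelected(int featureId, MenuItem item) {
        if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
            // item.getItemId() tells you which color was spoken.
            return true;
        }
        return super.onMenuItemSelected(featureId, item);
    }
}
```

Unlike the GlassVoice.apk approach, this uses only public GDK APIs, so it is the safer long-term option.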

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow