Question

I'm trying to learn how to use the accelerometer by creating (what I thought would be) a simple app to mimic a crude maraca.

The objective is that when the phone is flicked downwards quickly, it emits a maraca sound at the end of that flick, and likewise a different sound is emitted at the end of an upward flick.

The strategy for implementing this is to detect when the acceleration passes a certain threshold. When this happens, shakeIsHappening is set to true, and the data from the z axis is fed into an array. The first value in the z array is compared with the most recent one to determine whether the phone has been moved upwards or downwards; the result is stored in a boolean called zup.

Once the acceleration goes below zero, we assume the flick movement has ended and emit a sound, chosen depending on whether the movement was up or down (zup).

Here is the code:

public class MainActivity extends Activity implements SensorEventListener {

    private float mAccelNoGrav;
    private float mAccelWithGrav;
    private float mLastAccelWithGrav;

    ArrayList<Float> z = new ArrayList<Float>();

    public static float finalZ;

    public static boolean shakeIsHappening;
    public static int beatnumber = 0;
    public static float highZ;
    public static float lowZ;
    public static boolean flick;
    public static boolean pull;
    public static SensorManager sensorManager;
    public static Sensor accelerometer;


    private SoundPool soundpool;
    private HashMap<Integer, Integer> soundsMap;
    private TextView results;       // referenced in onCreate but missing from the original listing
    private TextView clickresults;

    private boolean zup;

    private boolean shakeHasHappened;
    public static int shakesound1 = 1;
    public static int shakesound2 = 2;


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        results = (TextView) findViewById(R.id.results);
        clickresults = (TextView) findViewById(R.id.clickresults);

        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager
                .getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

        mAccelNoGrav = 0.00f;
        mAccelWithGrav = SensorManager.GRAVITY_EARTH;
        mLastAccelWithGrav = SensorManager.GRAVITY_EARTH;

        soundpool = new SoundPool(4, AudioManager.STREAM_MUSIC, 100);
        soundsMap = new HashMap<Integer, Integer>();
        soundsMap.put(shakesound1, soundpool.load(this, R.raw.shake1, 1));
        soundsMap.put(shakesound2, soundpool.load(this, R.raw.shake1, 1));

    }

    public void playSound(int sound, float fSpeed) {
        AudioManager mgr = (AudioManager)getSystemService(Context.AUDIO_SERVICE);
        float streamVolumeCurrent = mgr.getStreamVolume(AudioManager.STREAM_MUSIC);
        float streamVolumeMax = mgr.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
        float volume = streamVolumeCurrent / streamVolumeMax;  
        soundpool.play(soundsMap.get(sound), volume, volume, 1, 0, fSpeed);
       }


    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {

    }

    @Override
    public void onSensorChanged(SensorEvent event) {

        float x = event.values[0];
        float y = event.values[1];
        z.add(event.values[2] - SensorManager.GRAVITY_EARTH);
        // was z.indexOf(z.size()-1), which returns an index, not a value
        float lastZ = z.get(z.size() - 1);
        mLastAccelWithGrav = mAccelWithGrav;
        mAccelWithGrav = (float) Math.sqrt(x * x + y * y + lastZ * lastZ);
        float delta = mAccelWithGrav - mLastAccelWithGrav;
        mAccelNoGrav = mAccelNoGrav * 0.9f + delta; // high-pass ("low-cut") filter: removes gravity

        // A strong acceleration means a shake has started; reset the z samples.
        if (mAccelNoGrav > 3) {
            shakeIsHappening = true;
            z.clear();
        }
        // Acceleration dropping below zero is treated as the end of the flick.
        if (mAccelNoGrav < 0 && shakeIsHappening) {
            shakeIsHappening = false;
            shakeHasHappened = true;
        }

        // Compare first and latest z samples to decide up vs down.
        if (shakeIsHappening && z.size() != 0) {
            if (z.get(z.size() - 1) > z.get(0)) {
                zup = true;
            } else if (z.get(0) > z.get(z.size() - 1)) {
                zup = false;
            }
        }
        if (shakeHasHappened) {
            Log.d("click", "up is " + zup + " low Z: " + z.get(0) + " high Z: " + z.get(z.size() - 1));
            shakeHasHappened = false;
            if (zup) {
                playSound(shakesound1, 1.0f);
            } else {
                playSound(shakesound2, 1.0f);
            }
            z.clear();
        }
    }
}
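Aside: the mAccelNoGrav update above is a simple high-pass filter. Isolated into plain Java (the class name is illustrative, not from the original code), its behaviour is easier to see: a constant input of roughly 1 g stays at zero, while a sudden jump passes through.

```java
// Illustrative sketch of the gravity-removing high-pass filter used in
// onSensorChanged: it keeps quick changes in acceleration magnitude and
// decays slowly back to zero, suppressing the constant gravity component.
class HighPassFilter {
    private float filtered = 0f;
    private float last = 9.80665f; // start at ~1 g (SensorManager.GRAVITY_EARTH)

    float update(float magnitudeWithGravity) {
        float delta = magnitudeWithGravity - last;
        last = magnitudeWithGravity;
        filtered = filtered * 0.9f + delta; // same update as mAccelNoGrav
        return filtered;
    }
}
```

Feeding it a steady ~9.8 leaves the output near zero; a jump to 15 produces a value well above the shake threshold of 3.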

Some of the problems I'm having are:

  1. I think ShakeHasHappened kicks in when deceleration starts, when acceleration goes below zero. Perhaps this should be when deceleration stops, when acceleration has gone negative and is now moving back towards zero. Does that sound sensible?

  2. The way of detecting whether the motion is up or down isn't working. Is this because the z-axis reading includes the acceleration of the movement itself, so it doesn't give me an accurate position of the phone?

  3. I'm getting lots of double clicks, and I can't quite work out why this is. Sometimes it doesn't click at all.

If anyone wants to have a play around with this code and see if they can find a way of making it more accurate and more efficient, please go ahead and share your findings. And if anyone can spot why it's not working the way I want it to, again please share your thoughts.

To link sounds to this code, drop your wav files into your res/raw folder and reference them as R.raw.shake1 (no extension).

Thanks

EDIT: I've done a bit of research and have stumbled across something called Dynamic Time Warping. I don't know anything about this yet, but will start to look in to it. Does anyone know if DTW could be a different method of achieving a working maraca simulator app?
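For reference, the core of DTW is a small dynamic program. A minimal, untuned sketch (the class and method names are illustrative) that returns the alignment cost between two 1-D sequences, such as a recorded z-axis trace and a stored "flick" template; a low cost would suggest a matching gesture:

```java
// Minimal dynamic time warping (DTW) distance between two 1-D sequences.
// cost[i][j] holds the cheapest alignment of a[0..i) with b[0..j).
class Dtw {
    static double distance(double[] a, double[] b) {
        int n = a.length, m = b.length;
        double[][] cost = new double[n + 1][m + 1];
        for (double[] row : cost) java.util.Arrays.fill(row, Double.POSITIVE_INFINITY);
        cost[0][0] = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                double d = Math.abs(a[i - 1] - b[j - 1]); // local mismatch
                cost[i][j] = d + Math.min(cost[i - 1][j - 1],
                             Math.min(cost[i - 1][j], cost[i][j - 1]));
            }
        }
        return cost[n][m];
    }
}
```

Identical sequences score 0; the further apart the shapes, the higher the cost.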

Was it helpful?

Solution

I can give you some pointers on this:

First of all, I noticed that you're using the same resource for both outcomes:

soundsMap.put(shakesound1, soundpool.load(this, R.raw.shake1, 1));
soundsMap.put(shakesound2, soundpool.load(this, R.raw.shake1, 1));

The resource in case of shakesound2 should be R.raw.shake2.

Second, the following only deals with one of the motions:

if (mAccelNoGrav > 3)

This should be changed to:

if (mAccelNoGrav > 3 || mAccelNoGrav < -3)

Currently, you are not intercepting downward motion.

Third, an acceleration value of 3 is rather low. If you want to filter out normal arm movement, this value should be around 6 or 7 (and -6 or -7 for the downward case).

Fourth, you do not need to store z values to check whether the motion was up or down. You can check whether:

mAccelNoGrav > 6 ===> implies the motion was upwards

mAccelNoGrav < -6 ===> implies the motion was downwards

You can use this information to set zup accordingly.
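As a sketch of that suggestion, the direction check can be reduced to the sign of the filtered value once it crosses the threshold (6 here, per the values above; the class and method names are illustrative, not from the original code):

```java
// Illustrative helper: classify a flick from the gravity-filtered
// acceleration value alone, instead of storing z samples.
class ShakeClassifier {
    static final float THRESHOLD = 6.0f; // suggested trigger level

    // +1 = upward flick, -1 = downward flick, 0 = below threshold.
    static int classify(float accelNoGrav) {
        if (accelNoGrav > THRESHOLD) return 1;
        if (accelNoGrav < -THRESHOLD) return -1;
        return 0;
    }
}
```

In onSensorChanged you would then set zup = (ShakeClassifier.classify(mAccelNoGrav) == 1) whenever the result is non-zero.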

Fifth: I can only guess that you are using if (mAccelNoGrav < 0) to play the sound when the motion ends. In that case, this check should be changed to:

if (mAccelNoGrav < epsilon && mAccelNoGrav > -epsilon)

where epsilon is a small positive value such as 1, giving a dead band of (-1, 1).

Sixth, you should include a lockout period in your application. This would be the period after all conditions have been met and a sound is about to be played. For the next, say 1000 ms, don't process the sensor values. Let the motion stabilize. You'll need this to avoid getting multiple playbacks.
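A minimal lockout could look like the following (plain Java so it can be tested in isolation; the class name and the millisecond-based API are illustrative):

```java
// Illustrative debounce: after a sound is triggered, further triggers are
// refused until lockoutMs milliseconds have passed, letting the motion settle.
class ShakeLockout {
    private final long lockoutMs;
    private long lastTriggerTime;

    ShakeLockout(long lockoutMs) {
        this.lockoutMs = lockoutMs;
        this.lastTriggerTime = -lockoutMs; // allow the very first trigger
    }

    // Returns true if a trigger is allowed now, and records the trigger time.
    boolean tryTrigger(long nowMs) {
        if (nowMs - lastTriggerTime < lockoutMs) {
            return false; // still inside the lockout window
        }
        lastTriggerTime = nowMs;
        return true;
    }
}
```

In onSensorChanged you would call tryTrigger (e.g. with System.currentTimeMillis()) just before playSound, and skip playback when it returns false.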

Note: Please include comments in your code. At the very least, place comments on every block of code to convey what you are trying to accomplish with it.

Other tips

I tried to implement this myself a while ago, and ended up using this solution, based on the same concept as in your question:

http://jarofgreen.co.uk/2013/02/android-shake-detection-library/

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow