Question

I am a bit puzzled by the sensor reading rates in Android. The code below reports delays of ~53 ms (ZTE Blade, with the sensor event rate set to SENSOR_DELAY_FASTEST).

public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                TimeNew = event.timestamp;                    // nanoseconds
                delay = (TimeNew - TimeOld) / 1000000L;       // ns -> ms
                TimeOld = TimeNew;
                Log.d("Test", delay + " ms");
                break;
        }
    }
}

The log:

DEBUG/Test(23024): 52 ms
DEBUG/Test(23024): 53 ms
DEBUG/Test(23024): 54 ms
DEBUG/Test(23024): 56 ms
DEBUG/Test(23024): 52 ms
DEBUG/Test(23024): 52 ms
DEBUG/Test(23024): 55 ms
DEBUG/Test(23024): 52 ms

If we want to average, say, 100 samples and then save the result, the time between every 100th sample will vary significantly. That is presumably because sensor events do not arrive at regular intervals.

But am I missing something? Is there a way to get (more) regular measurements (e.g. every 100 ms)? Or should I average over a specific period of time instead of a fixed number of samples?
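Averaging over a period of time instead of a sample count could look something like the sketch below. This is plain Java (the class name and method names are my own invention) so the logic can be tested in isolation; on Android you would feed it `event.timestamp` and `event.values[0]` from `onSensorChanged`:

```java
// Sketch: average samples over a fixed time window rather than a fixed count.
// Hypothetical helper class; feed it (timestamp, value) pairs from onSensorChanged.
public class WindowedAverager {
    private final long windowNs;       // averaging window length in nanoseconds
    private double sum = 0;
    private int count = 0;
    private long windowStartNs = -1;   // -1 means "no window open yet"

    public WindowedAverager(long windowNs) {
        this.windowNs = windowNs;
    }

    /** Feed one sample; returns the window average when the window closes, else null. */
    public Double add(long timestampNs, double value) {
        if (windowStartNs < 0) {
            windowStartNs = timestampNs;   // open the first window at the first sample
        }
        sum += value;
        count++;
        if (timestampNs - windowStartNs >= windowNs) {
            double avg = sum / count;      // close the window and emit its average
            sum = 0;
            count = 0;
            windowStartNs = timestampNs;   // start the next window
            return avg;
        }
        return null;                        // window still open, nothing to report
    }
}
```

With a 100 ms window and events arriving every ~52 ms, each window would close after two or three samples, but the saved averages would be spaced roughly 100 ms apart regardless of the device's actual event rate.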

Also, 50 ms seems rather long. Could that be a hardware limitation of the device? Will this number vary across devices?

Any advice is appreciated.


Solution

I would average over a period of time rather than over a number of samples. With the count-based approach, I would expect devices with different sensor capabilities to produce substantially different results. If you want more regular sampling, I would decouple the measurement from the event callback and simply poll the latest value at whatever frequency you wish.
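The "disconnect from the event and poll" idea could be sketched as below. This is an illustrative plain-Java class (names are my own); on Android, `onSensorChanged` would only store the newest reading via `update()`, while a scheduler reads it on its own fixed clock:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.DoubleConsumer;

// Sketch: cache the latest sensor value and sample it at a fixed rate,
// so the read interval no longer depends on when sensor events arrive.
public class FixedRatePoller {
    private volatile double latest = Double.NaN;   // NaN until the first update
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    /** Call this from onSensorChanged with event.values[0]. */
    public void update(double value) {
        latest = value;
    }

    public double getLatest() {
        return latest;
    }

    /** Deliver the cached value to `sink` every `periodMs` milliseconds. */
    public void start(long periodMs, DoubleConsumer sink) {
        scheduler.scheduleAtFixedRate(
                () -> sink.accept(latest), 0, periodMs, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```

The trade-off is that polling can read the same value twice (or skip intermediate ones) if the poll period does not match the hardware event rate, but the spacing of your saved readings becomes regular and device-independent.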

License: CC-BY-SA with attribution
Not affiliated with StackOverflow