Question

I want to create a project that reads the user's gesture (accelerometer-based) and recognises it. I searched a lot, but everything I found was too old. I don't have problems with classifying or recognising; I will use the $1 recognizer or an HMM. I just want to know how to read the user's gesture using the accelerometer.

Is the accelerometer data (x, y, z values) enough, or should I use other data with it, such as attitude data (roll, pitch, yaw), gyroscope data, or magnetometer data? I don't really understand any of them, so an explanation of what these sensors do would be useful.

Thanks in advance!

Solution

Finally I did it. I used the userAcceleration data, which is the acceleration the user imparts to the device, excluding gravity. I found that a lot of people use the raw acceleration data and do a lot of math to remove gravity from it; Core Motion already does that for you in userAcceleration.

And I used the $1 recognizer, which is a 2D recognizer (i.e. points like (5, 10), no Z).
Here's a link for the $1 recognizer; there's a C++ version of it in the downloads section.

Here are the steps of my code...

  1. Read the userAcceleration data at a frequency of 50 Hz.
  2. Apply a low-pass filter to it.
  3. Take a point into consideration only if the absolute value of its x or y component is greater than 0.05, to reduce noise.
    (Note: The next steps depend on your code and on the recognizer you use.)
  4. Save the x and y values into an array of points.
  5. Create a 2D path from this array.
  6. Send this path to the recognizer, either to train it or to recognize the gesture.

Here's my code...

@implementation MainViewController {
    double previousLowPassFilteredAccelerationX;
    double previousLowPassFilteredAccelerationY;
    double previousLowPassFilteredAccelerationZ;

    CGPoint position;
    int numOfTrainedGestures;
    GeometricRecognizer recognizer;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.

    previousLowPassFilteredAccelerationX = previousLowPassFilteredAccelerationY = previousLowPassFilteredAccelerationZ = 0.0;

    recognizer = GeometricRecognizer();

    // Note: I let the user train his own gestures, so I start with 0 gestures each time
    numOfTrainedGestures = 0;
}

#define kLowPassFilteringFactor 0.1
#define MOVEMENT_HZ 50
#define NOISE_REDUCTION 0.05

- (IBAction)StartAccelerometer
{
    // SharedMotionManager is a custom singleton accessor; Apple recommends
    // using a single CMMotionManager instance per app.
    CMMotionManager *motionManager = [CMMotionManager SharedMotionManager];
    if ([motionManager isDeviceMotionAvailable])
    {
        [motionManager setDeviceMotionUpdateInterval:1.0/MOVEMENT_HZ];
        [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                       withHandler: ^(CMDeviceMotion *motion, NSError *error)
         {
             CMAcceleration lowpassFilterAcceleration, userAcceleration = motion.userAcceleration;

             lowpassFilterAcceleration.x = (userAcceleration.x * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationX * (1.0 - kLowPassFilteringFactor));
             lowpassFilterAcceleration.y = (userAcceleration.y * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationY * (1.0 - kLowPassFilteringFactor));
             lowpassFilterAcceleration.z = (userAcceleration.z * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationZ * (1.0 - kLowPassFilteringFactor));

             // Use the absolute value so motion in the negative direction is kept too
             if (fabs(lowpassFilterAcceleration.x) > NOISE_REDUCTION || fabs(lowpassFilterAcceleration.y) > NOISE_REDUCTION)
                 [self.points addObject:[NSString stringWithFormat:@"%.2f,%.2f", lowpassFilterAcceleration.x, lowpassFilterAcceleration.y]];

             previousLowPassFilteredAccelerationX = lowpassFilterAcceleration.x;
             previousLowPassFilteredAccelerationY = lowpassFilterAcceleration.y;
             previousLowPassFilteredAccelerationZ = lowpassFilterAcceleration.z;


             // Just showing the latest filtered values to the user
             self.XLabel.text = [NSString stringWithFormat:@"X : %.2f", lowpassFilterAcceleration.x];
             self.YLabel.text = [NSString stringWithFormat:@"Y : %.2f", lowpassFilterAcceleration.y];
             self.ZLabel.text = [NSString stringWithFormat:@"Z : %.2f", lowpassFilterAcceleration.z];
         }];
    }
    else NSLog(@"DeviceMotion is not available");
}


- (IBAction)StopAccelerometer
{
    [[CMMotionManager SharedMotionManager] stopDeviceMotionUpdates];

    // View all the points to the user
    self.pointsTextView.text = [NSString stringWithFormat:@"%lu\n\n%@", (unsigned long)self.points.count, [self.points componentsJoinedByString:@"\n"]];

    // There must be at least 2 trained gestures, because recognition simply returns the closest template by distance
    if (numOfTrainedGestures > 1) {
        Path2D path = [self createPathFromPoints]; // A method to create a 2D path from pointsArray
        if (path.size()) {
            RecognitionResult recognitionResult = recognizer.recognize(path);
            self.recognitionLabel.text = [NSString stringWithFormat:@"%s Detected with Prob %.2f !", recognitionResult.name.c_str(),
                                          recognitionResult.score];
        } else self.recognitionLabel.text = @"Not enough points for gesture !";
    }
    else self.recognitionLabel.text = @"Not enough templates !";

    [self releaseAllVariables];
}
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow