There is a good YouTube video from a Google I/O session called "The Sensitive Side of Android" that explains how you could do something similar. Basically, you use the light sensor and register a sudden drop in its reading. This is quite simple, but keep in mind that changing light sources while the app is running can affect the results.
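A minimal sketch of the drop-detection logic: on Android you would feed it lux values from `Sensor.TYPE_LIGHT` (via a `SensorEventListener`, reading `SensorEvent.values[0]`). The class name, smoothing factor, and thresholds below are my own assumptions, not anything from the talk, and would need tuning against real sensor data.

```java
// Detects a "cover" gesture from a stream of ambient-light readings.
// All thresholds/ratios here are assumptions; tune against real hardware.
public class LightDropDetector {
    private float baseline = -1f;                  // running estimate of ambient lux
    private static final float SMOOTHING = 0.9f;   // low-pass factor for the baseline
    private static final float DROP_RATIO = 0.3f;  // "drop" = reading below 30% of baseline
    private static final float MIN_BASELINE = 10f; // ignore rooms that are already dark

    // Feed each lux reading (e.g. SensorEvent.values[0] for Sensor.TYPE_LIGHT);
    // returns true when a sudden drop relative to the ambient baseline is seen.
    public boolean onReading(float lux) {
        if (baseline < 0f) {                       // first sample seeds the baseline
            baseline = lux;
            return false;
        }
        boolean drop = baseline > MIN_BASELINE && lux < baseline * DROP_RATIO;
        if (!drop) {                               // only track ambient light while uncovered
            baseline = SMOOTHING * baseline + (1f - SMOOTHING) * lux;
        }
        return drop;
    }
}
```

The rolling baseline is one way to handle the changing-light-source problem mentioned above: slow ambient changes get absorbed into the baseline, while only a sharp drop counts as a gesture.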
That said, I think they use hand detection on the front camera to pull off all those gestures, and that would be quite complex. I also haven't seen any API that makes hand detection simple.