I created an app that uses Core Image (CIDetector, CIDetectorTypeFace) for real-time face detection (tracking). Something similar to: http://www.youtube.com/watch?v=D_XiwtoorLk I wonder how I can filter the face position and size so that the result is smooth. I guess I have to use a low-pass or high-pass filter, but I'm not sure how to do it. Does anyone have an idea or recommendation? I saw an app doing something like that, but I don't remember which one. Thanks for your help.

EDIT: When I say filtering, this is what I mean: Left: What I receive from Core Image when a face is detected. Right: The filtered signal derived from the left.

[image: noisy raw face position signal (left) vs. smoothed signal (right)]


Solution

You'll want to look more into object tracking in general. This post suggests using Kalman filters, which may be your best bet.
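As a minimal sketch of that idea: a 1-D Kalman filter with a constant-position model, applied independently to each component of the detected face rectangle. The class names (ScalarKalmanFilter, FaceRectSmoother) and the noise constants are illustrative assumptions you would tune for your own footage; nothing here comes from the Core Image API itself.

```swift
import CoreGraphics

/// A minimal 1-D Kalman filter (constant-position model) for smoothing a
/// single scalar, e.g. the x origin, y origin, width, or height of a face rect.
final class ScalarKalmanFilter {
    private var estimate: CGFloat?            // current filtered value
    private var errorCovariance: CGFloat = 1  // uncertainty of the estimate
    private let processNoise: CGFloat         // how fast the true value may drift
    private let measurementNoise: CGFloat     // how noisy the detector's output is

    init(processNoise: CGFloat = 0.01, measurementNoise: CGFloat = 4.0) {
        self.processNoise = processNoise
        self.measurementNoise = measurementNoise
    }

    func update(with measurement: CGFloat) -> CGFloat {
        guard let previous = estimate else {
            estimate = measurement            // first sample: accept it as-is
            return measurement
        }
        // Predict: value assumed constant, only the uncertainty grows.
        var p = errorCovariance + processNoise
        // Update: blend prediction and new measurement via the Kalman gain.
        let gain = p / (p + measurementNoise)
        let newEstimate = previous + gain * (measurement - previous)
        p = (1 - gain) * p
        estimate = newEstimate
        errorCovariance = p
        return newEstimate
    }
}

/// Smooths a face bounding box by filtering each component independently.
final class FaceRectSmoother {
    private let x = ScalarKalmanFilter()
    private let y = ScalarKalmanFilter()
    private let w = ScalarKalmanFilter()
    private let h = ScalarKalmanFilter()

    func smooth(_ rect: CGRect) -> CGRect {
        CGRect(x: x.update(with: rect.origin.x),
               y: y.update(with: rect.origin.y),
               width: w.update(with: rect.size.width),
               height: h.update(with: rect.size.height))
    }
}
```

Each frame, pass the CIFaceFeature's bounds through `smooth(_:)` and draw the returned rect instead of the raw one. If a Kalman filter feels like overkill, a simple exponential low-pass (filtered = α·measurement + (1 − α)·filtered) also damps the jitter, at the cost of more lag.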
