Question

I created an app that uses Core Image (CIDetector, CIDetectorTypeFace) for real-time face detection (tracking), similar to this: http://www.youtube.com/watch?v=D_XiwtoorLk I'm wondering how I can filter the face position and size to get something smooth. I guess I have to use a low-pass or high-pass filter, but I'm not sure how to do it. Does anyone have an idea or recommendation? I've seen an app doing something like this, but I don't remember which one. Thanks for the help.
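For reference, a detection setup along these lines is a rough sketch of what produces the jittery rectangles; the option choices and the per-frame `CIImage` are assumptions, not code from the question:

```swift
import CoreImage
import CoreGraphics

// Build the detector once; accuracy and tracking are standard CIDetector options.
let faceDetector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh,
                                        CIDetectorTracking: true])

// Called once per camera frame; `frame` is a CIImage built from the pixel buffer.
func rawFaceBounds(in frame: CIImage) -> CGRect? {
    let features = faceDetector?.features(in: frame) ?? []
    // CIFaceFeature.bounds is the jittery rectangle that needs smoothing.
    return (features.first as? CIFaceFeature)?.bounds
}
```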

EDIT: When I say filtering, this is what I mean. Left: the raw values I receive from Core Image when a face is detected. Right: the filtered version of that signal.

[Figure: raw face-position signal (left) vs. smoothed signal (right)]


Solution

You'll want to look more into object tracking in general. This post suggests using Kalman filters, which may be your best bet.
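A minimal sketch of that idea, assuming Swift and a constant-position (random-walk) state model: one scalar Kalman filter per rect component. The noise constants and the `FaceRectSmoother` wrapper are illustrative assumptions, not part of the linked post.

```swift
import CoreGraphics

/// A minimal scalar Kalman filter (random-walk model) for one coordinate.
/// processNoise (Q) and measurementNoise (R) are tuning knobs: larger R or
/// smaller Q means heavier smoothing at the cost of more lag.
struct ScalarKalman {
    var estimate: CGFloat = 0
    var errorCovariance: CGFloat = 1
    let processNoise: CGFloat = 0.01   // Q
    let measurementNoise: CGFloat = 4  // R

    mutating func update(with measurement: CGFloat) -> CGFloat {
        // Predict: the state is assumed constant, so only the uncertainty grows.
        errorCovariance += processNoise
        // Update: blend prediction and measurement using the Kalman gain.
        let gain = errorCovariance / (errorCovariance + measurementNoise)
        estimate += gain * (measurement - estimate)
        errorCovariance *= (1 - gain)
        return estimate
    }
}

/// Smooths a face rect by filtering each component independently.
struct FaceRectSmoother {
    private var x = ScalarKalman(), y = ScalarKalman()
    private var w = ScalarKalman(), h = ScalarKalman()

    mutating func smooth(_ rect: CGRect) -> CGRect {
        return CGRect(x: x.update(with: rect.origin.x),
                      y: y.update(with: rect.origin.y),
                      width: w.update(with: rect.width),
                      height: h.update(with: rect.height))
    }
}
```

Feed each detected bounds through `smooth(_:)` once per frame; if detection drops out for a frame, skip the update so the last estimate simply holds.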

License: CC BY-SA, with attribution
Not affiliated with Stack Overflow