So I have a QTMovieView displaying a QTMovie. Via my interface the user can add and remove CoreImage filters that are applied to the movie in real time. That works fine.

Now I want to extract images from the movie (say, when the user hits a button). This also works, using the frameImageAtTime method. But what I get is an unfiltered image from the original movie. My guess is that the CI filters are applied to the QTMovieView's display layer and not to the movie itself.

Is there a workaround?


Solution

I found a working solution: use frameImageAtTime to retrieve a CIImage, apply the same filters to it as are applied to the QTMovie, and convert the resulting filtered image into an NSImage (via NSBitmapImageRep).
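A minimal sketch of those three steps, assuming the filter chain is held in an `NSArray *filters` ivar and the movie in a `QTMovie *movie` ivar (both names are assumptions, not from the original post):

```objective-c
#import <QTKit/QTKit.h>
#import <QuartzCore/QuartzCore.h>

- (NSImage *)filteredFrameAtTime:(QTTime)time
{
    // Ask QTKit for a CIImage instead of the default NSImage,
    // so the frame can be run through the same CIFilter chain.
    NSDictionary *attrs = [NSDictionary dictionaryWithObject:QTMovieFrameImageTypeCIImage
                                                      forKey:QTMovieFrameImageType];
    CIImage *frame = (CIImage *)[movie frameImageAtTime:time
                                         withAttributes:attrs
                                                  error:NULL];

    // Re-apply the filters that the QTMovieView applies to its display layer.
    for (CIFilter *filter in filters) {
        [filter setValue:frame forKey:kCIInputImageKey];
        frame = [filter valueForKey:kCIOutputImageKey];
    }

    // Render the filtered CIImage into an NSImage via NSBitmapImageRep.
    NSBitmapImageRep *rep = [[[NSBitmapImageRep alloc] initWithCIImage:frame] autorelease];
    NSImage *image = [[[NSImage alloc] initWithSize:[rep size]] autorelease];
    [image addRepresentation:rep];
    return image;
}
```

Note that this re-runs the filter chain on the extracted frame rather than reading back the view's rendered output, so the filters' parameter values must match whatever the QTMovieView is currently using.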

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow