Question

OK, I've been looking around for a while, and I'm currently pretty stumped. I'd appreciate any help I can find on this.

I have an application that opens multiple cameras over multiple filter graphs. Each of them is identical, so I'm just going to illustrate one of them. The high-level idea is that a PAL video stream is read in through a video grabber card, then processed by our application. The app builds up DirectShow video filter graphs. Each graph includes a video source, a Sample Grabber filter and a Null Renderer filter. The grabbed samples are then used to display various stills from the videos and to be drawn on and scaled with OpenCV. So, the current filter graph looks something like this:

[Video Source] --> [/* Some kind of codec filter */] --> [Sample Grabber] --> [Null Renderer]

I say 'some kind' because I allow DirectShow to render it itself, using

hr = pCaptureGraphs[i]->RenderStream(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video, pSourceFilters[i], pGrabberFilters[i], pNullRendFilters[i]);
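For context, here is a minimal sketch of how one such graph might be assembled around that `RenderStream` call. The variable and function names (`BuildCaptureGraph`, `pSourceFilter`, etc.) are illustrative, not from the original project, and error handling is reduced to early returns:

```cpp
#include <dshow.h>
#include <atlbase.h>   // CComPtr

// Hedged sketch: build one capture graph with a Sample Grabber and a
// Null Renderer, then let DirectShow pick and insert a decoder itself.
HRESULT BuildCaptureGraph(IBaseFilter *pSourceFilter)
{
    CComPtr<IGraphBuilder> pGraph;
    HRESULT hr = pGraph.CoCreateInstance(CLSID_FilterGraph);
    if (FAILED(hr)) return hr;

    CComPtr<ICaptureGraphBuilder2> pBuilder;
    hr = pBuilder.CoCreateInstance(CLSID_CaptureGraphBuilder2);
    if (FAILED(hr)) return hr;
    pBuilder->SetFiltergraph(pGraph);

    CComPtr<IBaseFilter> pGrabberFilter, pNullRend;
    hr = pGrabberFilter.CoCreateInstance(CLSID_SampleGrabber);
    if (FAILED(hr)) return hr;
    hr = pNullRend.CoCreateInstance(CLSID_NullRenderer);
    if (FAILED(hr)) return hr;

    pGraph->AddFilter(pSourceFilter, L"Source");
    pGraph->AddFilter(pGrabberFilter, L"Sample Grabber");
    pGraph->AddFilter(pNullRend, L"Null Renderer");

    // RenderStream connects Source -> (auto-inserted decoder) ->
    // Sample Grabber -> Null Renderer, which is exactly where the
    // unknown "some kind of codec filter" comes from.
    return pBuilder->RenderStream(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                  pSourceFilter, pGrabberFilter, pNullRend);
}
```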

However, I have a problem with interlacing coming from the PAL videos. Using FFDShow and the GraphEdit app, I built up a graph similar to the one above, but with an FFDShow filter before the Sample Grabber, set to perform cubic de-interlacing on the video feed. Or, even better, I could use it to decode the video stream coming from the video grabber and de-interlace it as well.

My problem is this: I have no idea how to create an FFDShow filter in code. Is there a way to add the filter in code, and if so, what do I include or link to the project to do so? Alternatively, is there a way to configure DirectShow to use the FFDShow filter that does the de-interlacing when you call RenderStream?

Any help on this topic would be appreciated.

UPDATE 1:

OK, so I found out that the PC I was developing on, a Windows 8.1 machine, does not run the FFDShow filter for some reason, but the target machine, a Windows 7 machine, runs the exact same code just fine. Seems there's some kind of incompatibility there. :/

Now on to the next phase: de-interlacing. I've managed to isolate the Sample Grabber filter and the filter preceding it (in this case, an AVI Decompressor), and I've disconnected them from one another, then connected the FFDShow raw video filter between them. The other FFDShow filter messes up the graph somehow, so that no output is read from the Sample Grabber, which is why I'm going with the former filter. Now the question becomes: how do I activate de-interlacing of the video feed? I've built the same graph in GraphEdit and the interlacing disappears, so I know the filter is capable of doing it. But how do I enable it in code?
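The re-wiring step described above (disconnect decoder from grabber, splice the raw video filter between them) might look like the following sketch. `GetPin` is a hypothetical pin-enumeration helper, and the CLSID is the one commonly registered for "ffdshow raw video filter"; verify both against your own setup:

```cpp
#include <dshow.h>
#include <atlbase.h>

// CLSID commonly listed for the ffdshow raw video filter -- an assumption
// here; confirm it in your registry (HKCR\CLSID) before relying on it.
class __declspec(uuid("{0B390488-D80F-4A68-8408-48DC199F0E97}")) FfdshowRawVideo;

// Hypothetical helper: find the first pin of the given direction,
// e.g. by looping with IBaseFilter::EnumPins and IPin::QueryDirection.
CComPtr<IPin> GetPin(IBaseFilter *pFilter, PIN_DIRECTION dir);

HRESULT SpliceFfdshowRaw(IGraphBuilder *pGraph,
                         IPin *pDecoderOut,   // decoder's connected output pin
                         IPin *pGrabberIn)    // grabber's connected input pin
{
    // Break the existing decoder -> grabber connection (both sides).
    pGraph->Disconnect(pDecoderOut);
    pGraph->Disconnect(pGrabberIn);

    CComPtr<IBaseFilter> pFfdshow;
    HRESULT hr = pFfdshow.CoCreateInstance(__uuidof(FfdshowRawVideo));
    if (FAILED(hr)) return hr;
    pGraph->AddFilter(pFfdshow, L"ffdshow raw video filter");

    CComPtr<IPin> pFfdIn  = GetPin(pFfdshow, PINDIR_INPUT);
    CComPtr<IPin> pFfdOut = GetPin(pFfdshow, PINDIR_OUTPUT);

    // Reconnect through ffdshow: decoder -> ffdshow -> grabber.
    hr = pGraph->Connect(pDecoderOut, pFfdIn);
    if (FAILED(hr)) return hr;
    return pGraph->Connect(pFfdOut, pGrabberIn);
}
```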

UPDATE 2 / SOLUTION

I managed to get it working by running the filter graph, then selecting the filter's tray icon and setting the deinterlacing in the menu by right-clicking on the icon. That worked perfectly. Thanks to Roman R for his help.


Solution

You need to create an instance of FFDshow Video Decoder in code, e.g.:

class __declspec(uuid("{04FE9017-F873-410E-871E-AB91661A4EF7}")) FfdshowVideoDecoder;
CComPtr<IBaseFilter> pBaseFilter;
HRESULT nResult = pBaseFilter.CoCreateInstance(__uuidof(FfdshowVideoDecoder));

Then use IGraphBuilder::AddFilter to add it to the filter graph.

Then query the input/output IPins from this instance and connect them to the other filters appropriately, similar to how you'd do it in GraphEdit interactively. You can possibly have this done by RenderStream too, if you pass this IBaseFilter as the intermediate filter.
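Putting the answer's pieces together, a sketch of the RenderStream variant could look like this. It assumes `pGraph`, `pBuilder`, `pSource`, `pGrabberFilter` and `pNullRend` already exist as in the question; the second RenderStream call to hook up the Null Renderer is my assumption about one way to finish the chain:

```cpp
// ffdshow Video Decoder CLSID, as given in the answer above.
class __declspec(uuid("{04FE9017-F873-410E-871E-AB91661A4EF7}")) FfdshowVideoDecoder;

CComPtr<IBaseFilter> pFfdshow;
HRESULT hr = pFfdshow.CoCreateInstance(__uuidof(FfdshowVideoDecoder));
if (SUCCEEDED(hr))
{
    pGraph->AddFilter(pFfdshow, L"ffdshow Video Decoder");

    // Source -> ffdshow (as the explicit intermediate) -> Sample Grabber...
    hr = pBuilder->RenderStream(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                pSource, pFfdshow, pGrabberFilter);

    // ...then Sample Grabber -> Null Renderer in a second call.
    if (SUCCEEDED(hr))
        hr = pBuilder->RenderStream(NULL, NULL, pGrabberFilter, NULL, pNullRend);
}
```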

Licensed under: CC-BY-SA with attribution