Question

Can anyone clarify the following questions? A decent source-code snippet or example would be wonderful; the only relevant example I've found so far is the WavSource sample in the SDK.

  1. How is the raw bitmap data presented to the encoder? Is it (A)RGB 32BPP or similar?

  2. Judging from the WavSource sample, something calls RequestSample() on the input stream, which returns data, a presentation time, and a duration. So I'd be returning a suitably formatted buffer with a presentation time and a duration of 1 s?

Thanks.


Solution

Take a look at this link on how to write a Custom Media Source: http://msdn.microsoft.com/en-us/library/windows/desktop/ms700134(v=vs.85).aspx

Basically, the raw bitmap data is carried in the sample's buffer. For a 32 BPP format you can simply allocate a buffer of 4 * width * height bytes and copy the raw pixel data into it.

You can do this:

  1. MFCreateMemoryBuffer to create the buffer
  2. MFCopyImage to copy your bitmap into the buffer
  3. MFCreateSample to create a sample and attach the buffer to it
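The steps above can be sketched as the following helper (a hedged sketch, not production code: the function name `CreateVideoSample`, the top-down stride assumption, and the parameter layout are my own; error handling is minimal). Sample times and durations are in 100-nanosecond units, so a 1-second frame duration is 10,000,000.

```cpp
#include <mfapi.h>
#include <mfobjects.h>
#pragma comment(lib, "mfplat.lib")

// Hypothetical helper: wraps one 32 BPP frame in an IMFSample.
// rtStart and rtDuration are in 100-ns units (1 s == 10'000'000).
HRESULT CreateVideoSample(const BYTE* pPixels, UINT32 width, UINT32 height,
                          LONGLONG rtStart, LONGLONG rtDuration,
                          IMFSample** ppSample)
{
    const LONG  stride  = 4 * width;        // 4 bytes per pixel at 32 BPP
    const DWORD cbFrame = stride * height;  // total frame size in bytes

    IMFMediaBuffer* pBuffer = NULL;
    IMFSample*      pSample = NULL;
    BYTE*           pDest   = NULL;

    // 1. Allocate a system-memory buffer large enough for the frame.
    HRESULT hr = MFCreateMemoryBuffer(cbFrame, &pBuffer);

    // 2. Copy the bitmap into the buffer (MFCopyImage copies row by row,
    //    so source and destination strides may differ).
    if (SUCCEEDED(hr)) hr = pBuffer->Lock(&pDest, NULL, NULL);
    if (SUCCEEDED(hr))
    {
        hr = MFCopyImage(pDest, stride, pPixels, stride, stride, height);
        pBuffer->Unlock();
    }
    if (SUCCEEDED(hr)) hr = pBuffer->SetCurrentLength(cbFrame);

    // 3. Create the sample, attach the buffer, and stamp time/duration.
    if (SUCCEEDED(hr)) hr = MFCreateSample(&pSample);
    if (SUCCEEDED(hr)) hr = pSample->AddBuffer(pBuffer);
    if (SUCCEEDED(hr)) hr = pSample->SetSampleTime(rtStart);
    if (SUCCEEDED(hr)) hr = pSample->SetSampleDuration(rtDuration);

    if (pBuffer) pBuffer->Release();
    if (SUCCEEDED(hr)) *ppSample = pSample;
    else if (pSample)  pSample->Release();
    return hr;
}
```

A media source would build one such sample per frame in its RequestSample() handler, advancing rtStart by rtDuration each time.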

For encoding video samples, take a look at: http://msdn.microsoft.com/en-us/library/windows/desktop/ee663604(v=vs.85).aspx

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow