Question

I am trying to grab the frames captured by the camera, encode them and finally send them using RTP/RTSP.

To do the capturing I am using the CameraSource class of stagefright. The preview on the screen (the Surface passed from Java) works fine. But when I try to read the frames out, every buffer I get is only 20 bytes long.

What am I doing wrong?

Size videoSize;
videoSize.width = 352;
videoSize.height = 288;
sp<CameraSource> myCamera = CameraSource::CreateFromCamera(
        NULL, NULL, 1 /*front camera*/, videoSize, 25 /*fps*/,
        mySurface, true /*storeMetaDataInVideoBuffers*/);
myCamera->start();

//the following is from a reader thread.
status_t err = OK;
MediaBuffer* pBuffer;
while ((err = myCamera->read(&pBuffer)) == OK)
{
    // if not getting a valid buffer from source, then exit
    if (pBuffer == NULL)
    {
        return;
    }
    else
    {
        // MediaBuffer::size() returns size_t, so use %zu rather than %d
        LOGD("The size of the returned buffer is: %zu", pBuffer->size());
    }
    pBuffer->release();
    pBuffer = NULL;
}

Solution

You are doing everything correctly, but Samsung decided not to support the route you tried to implement. The only way to use CameraSource on Galaxy S2 (and many other Samsung devices) is to connect it directly to the hardware encoder.
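On such devices the metadata buffers produced by CameraSource are meant to be consumed by the platform's OMX hardware encoder, not read directly. A minimal sketch of that route using the stagefright OMXCodec API of that era follows; the bit rate and I-frame interval are illustrative values, not Samsung-specific settings, and this only builds inside the Android platform tree:

```cpp
// Sketch: feed CameraSource into the hardware AVC encoder via OMXCodec,
// so read() returns encoded access units instead of 20-byte metadata.
sp<MetaData> encMeta = new MetaData;
encMeta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
encMeta->setInt32(kKeyWidth, 352);
encMeta->setInt32(kKeyHeight, 288);
encMeta->setInt32(kKeyFrameRate, 25);
encMeta->setInt32(kKeyBitRate, 500000);      // 500 kbps, illustrative
encMeta->setInt32(kKeyIFramesInterval, 2);   // one I-frame every 2 s, illustrative

// Propagate the camera's color format to the encoder.
int32_t colorFormat;
CHECK(myCamera->getFormat()->findInt32(kKeyColorFormat, &colorFormat));
encMeta->setInt32(kKeyColorFormat, colorFormat);

OMXClient client;
CHECK_EQ(client.connect(), (status_t)OK);

// kStoreMetaDataInVideoBuffers tells the codec to expect metadata buffers
// from the camera instead of raw YUV frames.
sp<MediaSource> encoder = OMXCodec::Create(
        client.interface(), encMeta, true /*createEncoder*/, myCamera,
        NULL /*matchComponentName*/, OMXCodec::kStoreMetaDataInVideoBuffers);
encoder->start();

// The reader loop from the question now yields H.264 NAL data
// suitable for RTP packetization.
MediaBuffer* pBuffer;
while (encoder->read(&pBuffer) == OK) {
    // ... packetize pBuffer->data() + pBuffer->range_offset(),
    //     length pBuffer->range_length(), and send over RTP ...
    pBuffer->release();
}
```

With this arrangement the 20-byte metadata buffers never surface in application code; the encoder resolves them to the actual camera frames internally.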

Other tips

The 20-byte buffers are actually metadata used to communicate between the camera and the hardware video encoder. Unless you know how to interpret this metadata, it is difficult to extract the actual video frame from it.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow