Question

I am about to grab the video output of my Raspberry Pi to feed it into an Adalight-style ambient lighting system. XBMC's player for the Pi, omxplayer, uses the OpenMAX API for decoding and other functions.

Looking into the code gives the following:
m_omx_tunnel_sched.Initialize(&m_omx_sched, m_omx_sched.GetOutputPort(), &m_omx_render, m_omx_render.GetInputPort());

As far as I understand, this sets up a pipeline between the video scheduler and the renderer: [S]-->[R].

Now my idea is to write a grabber component and plug it directly into the pipeline: [S]-->[G]-->[R]. The grabber would extract the pixels from the framebuffer and pass them to a daemon that drives the LEDs.
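
In raw IL terms, what I picture is roughly this (just a sketch: the grabber component and its ports 200/201 are hypothetical, all three handles are assumed to come from OMX_GetHandle(), and 11 and 90 are the documented Broadcom ports for the video_scheduler output and video_render input):

#include <IL/OMX_Core.h>

// Re-plumb the pipeline: scheduler --> grabber --> renderer instead of
// scheduler --> renderer. Ports 11 (video_scheduler out) and 90
// (video_render in) are from the Broadcom docs; the grabber's ports
// 200/201 are made up.
OMX_ERRORTYPE plumb_grabber(OMX_HANDLETYPE hSched, OMX_HANDLETYPE hGrabber,
                            OMX_HANDLETYPE hRender)
{
    OMX_ERRORTYPE err = OMX_SetupTunnel(hSched, 11, hGrabber, 200);
    if (err != OMX_ErrorNone)
        return err;
    return OMX_SetupTunnel(hGrabber, 201, hRender, 90);
}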

Now I am about to dig into the OpenMAX API, which looks pretty arcane. Where should I start? Is this a feasible approach?

Best Regards

Solution

If you want the decoded data, then simply don't send it to the renderer: instead of rendering, take the data and do whatever you want with it. The decoded data should be taken from the output port of the video_decode OpenMAX IL component. You will probably also need a specific pixel layout, so set the component's output port to the format you need and the conversion will be done by the GPU (YUV and RGB565 are available).
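
For example, something along these lines (a minimal sketch, assuming hDecoder was obtained with OMX_GetHandle() for OMX.broadcom.video_decode, whose output port is 131; whether the port accepts RGB565 directly may depend on the firmware, and the surrounding state transitions are omitted):

#include <string.h>
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>

// Ask the decoder to emit RGB565 on its output port so the GPU does the
// colour conversion before the frames ever reach the ARM side.
OMX_ERRORTYPE set_rgb565_output(OMX_HANDLETYPE hDecoder)
{
    OMX_PARAM_PORTDEFINITIONTYPE def;
    memset(&def, 0, sizeof(def));
    def.nSize = sizeof(def);
    def.nVersion.nVersion = OMX_VERSION;
    def.nPortIndex = 131;                     // video_decode output port

    OMX_ERRORTYPE err = OMX_GetParameter(hDecoder, OMX_IndexParamPortDefinition, &def);
    if (err != OMX_ErrorNone)
        return err;

    def.format.video.eColorFormat = OMX_COLOR_Format16bitRGB565;
    return OMX_SetParameter(hDecoder, OMX_IndexParamPortDefinition, &def);
}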

OTHER TIPS

First, I think you should attach a buffer to the output port of the upstream component (the scheduler or decoder in your case), do everything you want with that frame on the CPU, and then send the frame through a buffer attached to the input port of the renderer, roughly like the sketch below. It's not going to be a trivial task, since there is little documentation about OpenMAX on the Raspberry Pi.
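
Something like the following (only a sketch: it skips component setup, state transitions and the FillBufferDone/EmptyBufferDone event handling the IL spec requires; analyse_pixels is a hypothetical hook for the LED daemon, and the port buffers are assumed to be allocated already):

#include <string.h>
#include <IL/OMX_Core.h>

// Hypothetical hook: hand the frame to whatever feeds the LED daemon.
void analyse_pixels(OMX_U8 *pixels, OMX_U32 len);

// Non-tunneled frame path: pull one decoded frame out of the scheduler,
// look at it on the CPU, then push it into the renderer by hand.
// hSched/hRender are assumed to be in OMX_StateExecuting.
void pump_one_frame(OMX_HANDLETYPE hSched, OMX_HANDLETYPE hRender,
                    OMX_BUFFERHEADERTYPE *outBuf, OMX_BUFFERHEADERTYPE *inBuf)
{
    // Ask the scheduler to fill our buffer with the next frame; the pixels
    // are valid once the FillBufferDone callback has fired.
    OMX_FillThisBuffer(hSched, outBuf);

    /* ... wait for FillBufferDone here ... */

    analyse_pixels(outBuf->pBuffer, outBuf->nFilledLen);

    // Forward the same frame to the renderer's input port.
    memcpy(inBuf->pBuffer, outBuf->pBuffer, outBuf->nFilledLen);
    inBuf->nFilledLen = outBuf->nFilledLen;
    OMX_EmptyThisBuffer(hRender, inBuf);
}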

Best place to start: https://jan.newmarch.name/RPi/

Best place to have at hand: http://home.nouwen.name/RaspberryPi/documentation/ilcomponents/index.html

Next best place: source code scattered across the internet.

Good luck.

Licensed under: CC-BY-SA with attribution