Question

I would like to transfer real-time RGB frame data over the web as a video stream into a browser (Chrome/Firefox).

So far I have the RGB -> YUV -> YV12 -> VP8 conversions completed. I can play the VP8 video locally and it looks good.
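For reference, the conversion step looks roughly like this (a simplified sketch, not my exact code: BT.601 integer coefficients, packed RGB24 input, even frame dimensions, and chroma taken from the top-left pixel of each 2x2 block rather than averaged):

    #include <stdint.h>

    /* RGB24 -> planar 4:2:0. For YV12 the V plane is stored before the U
     * plane; the caller passes the plane pointers in that order. */
    static void rgb24_to_yv12(const uint8_t *rgb, int w, int h,
                              uint8_t *y, uint8_t *v, uint8_t *u)
    {
        for (int j = 0; j < h; j++) {
            for (int i = 0; i < w; i++) {
                const uint8_t *p = rgb + (j * w + i) * 3;
                int r = p[0], g = p[1], b = p[2];

                y[j * w + i] = (uint8_t)(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);

                if ((i & 1) == 0 && (j & 1) == 0) {  /* one chroma sample per 2x2 block */
                    int c = (j / 2) * (w / 2) + (i / 2);
                    u[c] = (uint8_t)(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    v[c] = (uint8_t)(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
        }
    }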

Now I need to be able to feed this data to WebRTC so that another "client" can see it.

Is this doable?

One option: wrap the VP8 in a WebM container but not write the actual file length, since I don't know the length for a real-time stream, then create a <video> tag in the browser and start feeding it the data. (I did this "fake streaming" with Theora before; it was a bit slow and ran over TCP, but it worked, with a delay.)
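As far as I can tell, the container side of this is legitimate: EBML, which WebM is built on, reserves an "unknown size" encoding for exactly this case, so the Segment can be open-ended and the player reads until the stream stops. A sketch of writing that value, assuming an 8-byte length field:

    #include <stdint.h>
    #include <stdio.h>

    /* In EBML, a size whose value bits are all 1s means "unknown".
     * For an 8-byte vint that is 0x01 FF FF FF FF FF FF FF; written as
     * the WebM Segment size it marks the stream as unbounded/live. */
    static void write_unknown_segment_size(FILE *out)
    {
        static const uint8_t unknown[8] =
            { 0x01, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF };
        fwrite(unknown, 1, sizeof unknown, out);
    }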

The other option: feed the data to WebRTC directly. I don't even know if the APIs allow something like this; I might need to hack it a bit.

Recap: Process A creates an NxM RGB video and writes it to a buffer. Process B reads that buffer via IPC and encodes it to VP8, then sends the VP8 to the browser, either in WebM or over WebRTC.
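For concreteness, here is roughly what Process B's encode loop looks like, sketched with libvpx (simplified; error handling trimmed, and send_packet stands in for whatever transport follows):

    #include <stddef.h>
    #include <vpx/vpx_encoder.h>
    #include <vpx/vp8cx.h>

    static vpx_codec_ctx_t codec;

    /* One-time VP8 encoder setup for NxM frames at a fixed frame rate. */
    static int init_encoder(int width, int height, int fps)
    {
        vpx_codec_enc_cfg_t cfg;
        if (vpx_codec_enc_config_default(vpx_codec_vp8_cx(), &cfg, 0))
            return -1;
        cfg.g_w = width;
        cfg.g_h = height;
        cfg.g_timebase.num = 1;
        cfg.g_timebase.den = fps;
        return vpx_codec_enc_init(&codec, vpx_codec_vp8_cx(), &cfg, 0) ? -1 : 0;
    }

    /* Encode one raw YV12/I420 frame; each compressed VP8 packet goes to
     * send_packet (a placeholder for the WebM muxer or socket). */
    static int encode_frame(vpx_image_t *img, vpx_codec_pts_t pts,
                            void (*send_packet)(const void *buf, size_t len))
    {
        if (vpx_codec_encode(&codec, img, pts, 1, 0, VPX_DL_REALTIME))
            return -1;

        vpx_codec_iter_t iter = NULL;
        const vpx_codec_cx_pkt_t *pkt;
        while ((pkt = vpx_codec_get_cx_data(&codec, &iter)) != NULL)
            if (pkt->kind == VPX_CODEC_CX_FRAME_PKT)
                send_packet(pkt->data.frame.buf, pkt->data.frame.sz);
        return 0;
    }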

Solution

As far as I've been able to find while researching a related situation, this is not possible yet. In WebRTC, video can only come from capture devices configured on the system (webcam, microphone, etc.). I believe it is allowed in a proposed spec draft that I read a while back, but we aren't there yet. I know they are working on allowing access from the Web Audio API, but I'm uncertain of the current status.

Licensed under: CC-BY-SA with attribution