Question

I realize that the officially supported streaming protocol for the iPhone is HTTP Live Streaming. This is great, but many appliances implement the RTSP protocol to stream video. I have looked around for quite some time for RTSP libraries in Objective-C and have not found any. Does anyone know of such libraries?

If not, does anyone know of demo/code examples from people who have tried to get this to work? Since Apple supports H.264 decoding in hardware, I'm assuming it is possible to go low level, implement the streaming myself, then construct the video packets and pass them along as if they had arrived via HTTP Live Streaming. Any advice on how this might be done is appreciated.

Solution

Check out live555. It will handle all the RTSP handshaking and deliver the data (in your case, H.264) to your application for further processing/decoding. It is a C/C++ library, and hence can run on iOS.
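To make the flow concrete, here is a minimal C++ sketch of the live555 client bootstrap. The URL, the "MyApp" name, the continueAfterDESCRIBE handler, and the main() wrapper are illustrative placeholders, not anything from this answer; live555's own testRTSPClient demo shows the full continuation through SETUP, PLAY, and frame delivery.

    #include <liveMedia.hh>
    #include <BasicUsageEnvironment.hh>

    static char volatile eventLoopWatchVariable = 0;

    // Called by live555 once the DESCRIBE response (the SDP) has arrived.
    static void continueAfterDESCRIBE(RTSPClient* client, int resultCode, char* resultString) {
        if (resultCode == 0) {
            // resultString holds the SDP; from here you would create a MediaSession,
            // set up its subsessions, and attach a MediaSink that receives the H.264 frames.
        }
        delete[] resultString;
    }

    int main() {
        TaskScheduler* scheduler = BasicTaskScheduler::createNew();
        UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

        // Placeholder URL and application name.
        RTSPClient* client = RTSPClient::createNew(*env, "rtsp://camera.example/stream",
                                                   1 /* verbosity */, "MyApp");
        client->sendDescribeCommand(continueAfterDESCRIBE);

        // Blocks until eventLoopWatchVariable is set to a non-zero value.
        env->taskScheduler().doEventLoop(&eventLoopWatchVariable);
        return 0;
    }

In an iOS app this bootstrap, and everything else that touches live555, would live on a background thread rather than in main(), which is what the integration options below are about.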

Your options for integration with a Cocoa app are:

1) Run live555 on its own thread using the event-loop mechanism provided as part of the library (note that all operations directly related to live555 must then be run on this thread, as live555 itself is not designed to be thread-safe); see the sketch below.

2) Provide a Cocoa implementation of "TaskScheduler", in which you use the Cocoa frameworks for asynchronous network callbacks, timers, etc.

The above points will make more sense to you after reviewing the live555 documentation.
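As a rough illustration of option 1 (again C++; the Live555Thread wrapper and its member names are my own invention, not part of live555), the sketch below runs doEventLoop() on a dedicated std::thread and shuts it down with an event trigger plus the loop's watch variable, the mechanisms live555 provides for signalling its thread from outside:

    #include <thread>
    #include <BasicUsageEnvironment.hh>

    // Owns the live555 event loop on its own thread (option 1). All live555 calls
    // other than triggerEvent() must happen on that thread.
    class Live555Thread {
    public:
        Live555Thread() {
            scheduler_ = BasicTaskScheduler::createNew();
            env_ = BasicUsageEnvironment::createNew(*scheduler_);
            stopTrigger_ = scheduler_->createEventTrigger(&Live555Thread::onStop);
            thread_ = std::thread([this] {
                // Runs until stopFlag_ becomes non-zero.
                env_->taskScheduler().doEventLoop(&stopFlag_);
            });
        }

        ~Live555Thread() {
            // triggerEvent() is safe to call from any thread; onStop() then runs on
            // the live555 thread and flips the watch variable, ending the loop.
            scheduler_->triggerEvent(stopTrigger_, this);
            thread_.join();
            env_->reclaim();
            delete scheduler_;
        }

    private:
        static void onStop(void* clientData) {
            static_cast<Live555Thread*>(clientData)->stopFlag_ = 1;
        }

        TaskScheduler* scheduler_ = nullptr;
        UsageEnvironment* env_ = nullptr;
        EventTriggerId stopTrigger_ = 0;
        std::thread thread_;
        char volatile stopFlag_ = 0;
    };

RTSP client setup and teardown would be marshalled onto that thread in the same way, for example by registering further event triggers whose handlers create or shut down the RTSPClient.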

Licensed under: CC-BY-SA with attribution