Question

I'm developing a multimedia application that streams video and audio over RTSP using the liveMedia library. I read raw video frames from the camera, encode them with libx264, store them in a tbb::concurrent_queue, and run an RTSP server. I used DynamicRTSPServer.cpp and live555MediaServer.cpp as examples when creating my RTSP module, and I have a problem: one thread (the one executing BasicTaskScheduler::doEventLoop) uses too much processor time, 80-90% of one core (I have an Intel Dual-Core T3100). My video stream also has a long delay, and VLC can't play it (the audio stream plays normally), reporting these errors:

    main warning: picture is too late to be displayed (missing 3780 ms)
    avcodec error: more than 5 seconds of late video -> dropping frame (computer too slow ?)

When I disable the audio subsession, CPU usage is still high, but VLC plays the video stream normally.

I read the server's log file and found many failed attempts to read new data from an empty queue. Here is a fragment of the log (note the timestamps of the debug messages: each line is a separate call to FramedSource::doGetNextFrame):

[10:49:7:621]: videoframe readed. size:  0
[10:49:7:622]: videoframe readed. size:  0
[10:49:7:622]: videoframe readed. size:  0
[10:49:7:622]: videoframe readed. size:  0
[10:49:7:622]: videoframe readed. size:  0
[10:49:7:622]: videoframe readed. size:  0
[10:49:7:623]: videoframe readed. size:  0
[10:49:7:623]: videoframe readed. size:  0
[10:49:7:623]: videoframe readed. size:  0
[10:49:7:623]: videoframe readed. size:  0
[10:49:7:623]: videoframe readed. size:  0
[10:49:7:624]: videoframe readed. size:  0

My video stream has a low framerate (the camera can do at most 8 fps), so when doGetNextFrame is called, my buffer often doesn't contain an encoded frame yet. The call stack for these attempts looks like this:

ConcurrentQueueBuffer::doGetNextFrame()
FramedSource::getNextFrame(unsigned char* to, unsigned maxSize,
                                afterGettingFunc* afterGettingFunc,
                                void* afterGettingClientData,
                                onCloseFunc* onCloseFunc,
                                void* onCloseClientData)
StreamParser::ensureValidBytes1(unsigned numBytesNeeded)
StreamParser::ensureValidBytes(unsigned numBytesNeeded) 
StreamParser::test4Bytes()
H264VideoStreamParser::parse()
MPEGVideoStreamFramer::continueReadProcessing()
MPEGVideoStreamFramer::continueReadProcessing(void* clientData,
                         unsigned char* /*ptr*/, unsigned /*size*/,
                         struct timeval /*presentationTime*/)
StreamParser::afterGettingBytes1(unsigned numBytesRead, struct timeval presentationTime)
StreamParser::afterGettingBytes(void* clientData,
                                     unsigned numBytesRead,
                                     unsigned /*numTruncatedBytes*/,
                                     struct timeval presentationTime,
                                     unsigned /*durationInMicroseconds*/)
FramedSource::afterGetting(FramedSource* source)
AlarmHandler::handleTimeout()
...

I tried changing my buffer's logic to block the thread until the encoder produces a new frame, but then my audio stream also had a very long delay. I also added a small delay at server startup so that more data accumulates in the buffer, but liveMedia reads data from the buffer faster than the encoder can encode it :(

What could be the reason for this problem? And how does liveMedia determine how often to attempt reads from a FramedSource?

My H264BufferMediaSubsession inherits from liveMedia's H264VideoFileServerMediaSubsession and overrides only the createNewStreamSource() virtual method, where I create the FramedSource that reads data from my buffer.

No correct solution

Licensed under: CC-BY-SA with attribution