Question

In Flash, I have an application where multiple people are sharing live camera feeds via NetStreams to Flash Media Server/Adobe Media Server at different bit rates and different quality settings.

I need the mobile users to receive the live feeds at a different quality setting than the rest.

I found NetStream.receiveVideoFPS() in the Adobe API documentation, but after much sweat and many more tears, it turns out it's not supported with H.264... (go figure - http://forums.adobe.com/message/3841837#3841837#3841837 )

So is there any other way of doing this that anyone knows of? Or will I need to build something custom for this? Any ideas? Workarounds?

Currently: Flash Application Publisher > Share Camera via NetStream > publish to FMS/AMS
Goal: Flash Application Recipient > Subscribe to published NetStream at different fps and/or resolution


Solution

I see two ways to do this.

The first way is suitable if any given stream will only ever be watched by PC users or by mobile users, but not both at once. In that case, check the input video resolution, and if it's not suitable, send a message with the desired video width/height via the NetStream send() method. The publisher receives this command and changes the parameters of its Camera object.
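A minimal sketch of the publisher side in ActionScript 3 (the command name "setQuality" is made up for this sketch, and how the command reaches the publisher - NetStream.send() from a server-side stream, or a NetConnection.call() relayed by FMS/AMS - depends on your setup):

```actionscript
// Publisher side: handle a resize command and adjust the Camera.
var camera:Camera = Camera.getCamera();
var pubStream:NetStream = new NetStream(connection);

var handler:Object = {};
handler.setQuality = function(w:int, h:int, fps:int):void {
    camera.setMode(w, h, fps);   // switch to the requested capture resolution/framerate
};
pubStream.client = handler;      // incoming data messages are dispatched here
pubStream.attachCamera(camera);
pubStream.publish("stream");
```

Note that every subscriber of this stream will see the quality change, which is exactly why this approach only works when a stream has one class of viewer at a time.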

The second one is more difficult, but it works in every situation. You transcode and restream the stream on the server. Everyone publishes in PC quality, and when a mobile user wants to watch a stream, he sends a restream request to the media server, which receives it and creates a new stream with the same name plus a postfix (for example "stream" -> "stream_MOBILE"). The transcoded video is pushed into this stream (you can use the FFmpeg transcoder). Once that's done, your mobile device subscribes to the "_MOBILE" stream.
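On the mobile client the only change is the stream name. A sketch in ActionScript 3, assuming the "_MOBILE" naming convention and a restream endpoint like the one described in this answer (host, port, and dimensions are placeholders):

```actionscript
// Mobile client: ask the server to start the restream, then play the variant.
var streamName:String = "stream";

// Fire the restream request (an HTTP GET to the server-side module).
var loader:URLLoader = new URLLoader();
loader.load(new URLRequest(
    "http://example.com:8086/restream?id=" + streamName + "&w=352&h=288"));

// Subscribe to the transcoded variant instead of the original stream.
var ns:NetStream = new NetStream(connection);
var video:Video = new Video();
video.attachNetStream(ns);
ns.play(streamName + "_MOBILE");
```

In practice you would wait until the "_MOBILE" stream actually exists (e.g. retry on NetStream.Play.StreamNotFound) before attaching it, since FFmpeg takes a moment to start publishing.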

I faced a similar problem a month ago. By the way, I'm using Wowza Media Server, not FMS. I know there is a native transcoder tool in Wowza, but there was no money to buy it, so I was forced to use FFmpeg (it's totally free). In my situation, the solution was very easy: I wrote a little server-side module that listens for HTTP GET requests like:

http://[adr]:[port]/restream?id=ID&w=XX&h=YY

where ID is the stream's name, and XX/YY are the new video dimensions.

When such a request is received, a script is triggered whose only job is to start FFmpeg, with a command like this:

ffmpeg -i rtmp://[adr]/[appName]/[streamName] -vcodec libx264 -s 352x288 -acodec copy -f flv rtmp://[adr]/[appName]/[newStreamName]

This command tells FFmpeg to transcode the input stream (-i) into FLV output (-f flv) at the resolution given by -s, re-encoding video with the codec named by -vcodec and passing the audio through unchanged (-acodec copy), then push the result to the output RTMP URL.

As you can see, it's very easy. FFmpeg has lots of other parameters too, such as bitrate, framerate, and image quality.
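For example, a variant of the command above that also pins the video bitrate and framerate could look like this (the 300k/15fps values are illustrative, not from the original answer):

# -b:v sets the video bitrate, -r the output framerate.
ffmpeg -i rtmp://[adr]/[appName]/[streamName] \
       -vcodec libx264 -s 352x288 -b:v 300k -r 15 \
       -acodec copy \
       -f flv rtmp://[adr]/[appName]/[newStreamName]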

License: CC-BY-SA with attribution
Not affiliated with Stack Overflow