Question

I'm recording video from a webcam with Flash and saving it to an Adobe (Flash) Media Server.

What can contribute to a choppy video full of missing frames, and what do I need to adjust to fix it?

The server is an Amazon Web Services Medium (M1) EC2 instance: a 2 GHz processor with 3.75 GB of RAM. Looking at the admin console for AMS, the server never gets maxed out in terms of RAM or CPU percentage.

Bandwidth never exceeds 4 Mbps.

The Flash recorder captures at 320x240 at 15 fps.

I used setQuality(0,100) on the camera. I can still make out individual "pixels" when viewing my recording, but it isn't bad.
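
For context, the capture setup looks roughly like this (a minimal sketch; the variable names are mine, not from my actual code):

    import flash.media.Camera;
    import flash.media.Video;

    var cam:Camera = Camera.getCamera();
    if (cam != null) {
        // Request 320x240 at 15 fps; the camera picks the closest mode it supports.
        cam.setMode(320, 240, 15);
        // setQuality(bandwidth, quality): 0 = no bandwidth cap, 100 = best per-frame picture quality.
        cam.setQuality(0, 100);

        // Local preview of what gets published to the media server.
        var video:Video = new Video(320, 240);
        video.attachCamera(cam);
        addChild(video);
    }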


Solution

The server has nothing to do with this. If the computer running the Flash file can't keep up, you get dropped frames. You have roughly 1000/stage.frameRate ms to run every calculation for each frame. If your application is running at 30 fps, that is about 33 ms per frame. You need to make sure everything that must happen on each frame can run in that amount of time, which is obviously difficult or impossible to guarantee across the wide range of hardware out there.
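
One way to see whether your per-frame work fits in that budget is to time each frame with getTimer() in an ENTER_FRAME handler. A rough sketch, assuming it runs in a frame script or document class (the 1.5x threshold and the trace output are just illustrative):

    import flash.events.Event;
    import flash.utils.getTimer;

    var lastTick:int = getTimer();

    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        var now:int = getTimer();
        var elapsed:int = now - lastTick;
        lastTick = now;

        // At stage.frameRate = 30 the budget is roughly 33 ms; flag frames that run well past it.
        var budget:Number = 1000 / stage.frameRate;
        if (elapsed > budget * 1.5) {
            trace("Frame overran its budget: " + elapsed + " ms (budget ~" + budget.toFixed(1) + " ms)");
        }
    });

If that trace fires often on your target machines, the choppiness is coming from the client, not the server.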

Additionally, 15 fps is itself too low. The low-end threshold for what the human eye perceives as motion is around 24 fps, so at 15 fps you will notice choppiness. Ideally, you want to record at 30 fps, which is about where the human eye stops being able to distinguish individual frames.
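
To request 30 fps, pass it to Camera.setMode() and then check Camera.currentFPS to see what the webcam is actually delivering; whether the request is honored depends on the hardware and lighting. A minimal sketch:

    import flash.media.Camera;

    var cam:Camera = Camera.getCamera();
    if (cam != null) {
        // Ask for 30 fps; the camera may fall back to a lower rate it supports.
        cam.setMode(320, 240, 30);
        // currentFPS reports the rate the camera is actually achieving;
        // check it once capture is running (after attaching to a Video or NetStream).
        trace("Capturing at " + cam.currentFPS + " fps");
    }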

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow