Question

There is a question in my operating systems book about scheduling a system.

The question is: A real-time system needs to handle two voice calls that each run every 5 msec and consume 1 msec of CPU time per burst, plus one video at 25 frames/sec, with each frame requiring 20 msec of CPU time. Is the system schedulable?

The solution manual has this answer: Each voice call runs 200 times/second and uses up 1 msec per burst, so each voice call needs 200 msec per second, or 400 msec for the two of them. The video runs 25 times a second and uses up 20 msec each time, for a total of 500 msec per second. Together they consume 900 msec per second of CPU time, so the system is schedulable.

The book does not explain how to arrive at this conclusion, nor does it give an algorithm. So I was hoping someone could explain how this answer is worked out?

Thank you.


Solution

The periods involved are all much shorter than 1 second, so one can check whether all of the tasks together can complete their work within 1 second of CPU time.

  • Voice runs every 5 ms: 1/0.005 = 200 times/second.

  • Video runs at 25 frames/second = 25 times/second.

  • Voice needs 1 ms each time it runs: 200 × 1 ms = 200 ms/second.

  • Video needs 20 ms per frame: 25 × 20 ms = 500 ms/second.

  • 2 voice tasks + 1 video task = 2 × 200 ms + 500 ms = 900 ms/second, which is less than the 1000 ms of CPU time available per second, so the system is schedulable.
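The arithmetic above is just a CPU-utilization check: each task contributes (CPU time per run) / (period), and the total must not exceed 1 on a single CPU. A minimal sketch (the task tuples are my own illustrative encoding, not from the book):

```python
# Utilization check: each task contributes cost/period; a single CPU can
# only be schedulable if the total utilization is <= 1.0.
tasks = [
    ("voice call 1", 0.005, 0.001),  # runs every 5 ms, needs 1 ms per burst
    ("voice call 2", 0.005, 0.001),
    ("video",        0.040, 0.020),  # 25 frames/s -> 40 ms period, 20 ms/frame
]

total = sum(cost / period for _, period, cost in tasks)
print(f"total utilization = {total:.2f}")  # 0.90, i.e. 900 ms of work per second
```

A total of 0.90 means the workload demands 900 ms of CPU time per second of wall-clock time, leaving 100 ms spare, which matches the manual's answer.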

How one wants an RTOS to schedule such tasks depends, among other things, on how much jitter you can tolerate for each task. For example, the two voice tasks could share an equal priority that is higher than the video task's, allowing the voice tasks to run whenever they need to, in FIFO order (meaning one voice task might need to wait at most 1 ms to be scheduled), while the video task gets the remaining CPU time.
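One can also verify the priority scheme above by brute force: simulate a preemptive fixed-priority scheduler in 1 ms ticks and check that no job misses its deadline (taken here as the task's next release). This is a toy sketch of my own, not an RTOS implementation:

```python
# Preemptive fixed-priority simulation in 1 ms ticks.
# Tasks: (name, period_ms, cost_ms, priority); lower number = higher priority.
# Voice gets priority over video, as suggested above.
tasks = [
    ("voice1", 5, 1, 0),
    ("voice2", 5, 1, 0),
    ("video", 40, 20, 1),  # 25 frames/s -> one frame every 40 ms
]

def simulate(tasks, horizon=1000):
    prio_of = {name: prio for name, _, _, prio in tasks}
    remaining = {}  # job (name, release_time) -> ms of work left
    deadline = {}   # job -> absolute deadline (next release of its task)
    missed = []
    for t in range(horizon):
        # Release a new job at each task's period boundary.
        for name, period, cost, _ in tasks:
            if t % period == 0:
                job = (name, t)
                remaining[job] = cost
                deadline[job] = t + period
        # Run the highest-priority ready job for this 1 ms tick
        # (ties broken by release time, i.e. FIFO within a priority).
        ready = [j for j, rem in remaining.items() if rem > 0]
        if ready:
            job = min(ready, key=lambda j: (prio_of[j[0]], j[1]))
            remaining[job] -= 1
        # Any job still unfinished at its deadline has missed it.
        for job in list(remaining):
            if remaining[job] > 0 and t + 1 >= deadline[job]:
                missed.append(job)
                del remaining[job]
    return missed

print(simulate(tasks))  # [] -> no deadline misses over a full second
```

With voice prioritized, every video frame still finishes well inside its 40 ms window, since the voice tasks only steal 2 ms out of every 5.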

Licensed under: CC-BY-SA with attribution