Question

I have a cross-platform C++ application in which I download a huge file. While downloading, I would like to show the current download speed and the estimated time remaining. Unfortunately, the measured download speed jumps all over the place (from 1.5 MB/s down to 100 KB/s), which in turn makes the estimated time not very useful.

Rather than reinvent the wheel, I am curious whether there are any best practices for damping the changes in download speed, or for doing these calculations in a user-friendly manner.

Solution

You can simply average the speed over a longer period of time, such as 30 seconds or more. The algorithm is very simple: record the downloaded size at regular intervals, for example once per second, in an array. You can then compute the average speed over the last 30 seconds by subtracting the size recorded 30 seconds ago from the current size and dividing by 30 seconds.
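A minimal sketch of this sliding-window average (the class and method names here are illustrative, not from the post):

```cpp
#include <cstdint>
#include <deque>
#include <utility>

// Keeps roughly one sample per second of (elapsed seconds, total bytes)
// and reports the average speed over the last `window_seconds`.
class SlidingWindowSpeed {
public:
    explicit SlidingWindowSpeed(double window_seconds = 30.0)
        : window_(window_seconds) {}

    // Call about once per second with the total bytes downloaded so far.
    void addSample(double seconds, std::uint64_t total_bytes) {
        samples_.emplace_back(seconds, total_bytes);
        // Drop samples that have fallen out of the window.
        while (samples_.size() > 1 &&
               seconds - samples_.front().first > window_) {
            samples_.pop_front();
        }
    }

    // Average bytes/second over the window (0 until two samples exist).
    double bytesPerSecond() const {
        if (samples_.size() < 2) return 0.0;
        double dt = samples_.back().first - samples_.front().first;
        std::uint64_t db = samples_.back().second - samples_.front().second;
        return dt > 0.0 ? static_cast<double>(db) / dt : 0.0;
    }

private:
    double window_;
    std::deque<std::pair<double, std::uint64_t>> samples_;
};
```

The window length is the damping knob: a longer window gives a smoother number at the cost of reacting more slowly to genuine speed changes.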

Another popular technique is to give more weight to recent measurements. For example, measurements over the last 30 seconds get weight 1, those over the previous 30 seconds get a smaller weight (say 1/2), then 1/4, and so on (you should probably stop at 1/8 or so). This way, old speeds still play some role but are quickly forgotten, while some damping still takes place.
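This weighting scheme could be sketched as follows (a hypothetical helper, assuming you already have the average speed of each 30-second bucket, newest first):

```cpp
#include <cstddef>
#include <vector>

// Weighted mean of per-bucket speeds: the newest bucket gets weight 1,
// each older bucket half the previous weight (1, 1/2, 1/4, 1/8), and
// anything older than four buckets is ignored.
double weightedSpeed(const std::vector<double>& bucketSpeeds) {
    double sum = 0.0, weightSum = 0.0, w = 1.0;
    for (std::size_t i = 0; i < bucketSpeeds.size() && i < 4; ++i) {
        sum += w * bucketSpeeds[i];
        weightSum += w;
        w *= 0.5;  // each older bucket counts half as much
    }
    return weightSum > 0.0 ? sum / weightSum : 0.0;
}
```

Halving the weight per bucket is essentially a coarse exponential moving average; you can tune the decay factor and the cutoff to trade smoothness against responsiveness.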

You should also read about the trailing twelve months (TTM) metric; it should give you some idea of how such trailing averages work.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow