Question

I've run into a nasty problem with my recorder. Some people still use it with analog tuners, and analog tuners tend to output 'snow' when no signal is present.

The problem is that when noise is fed into the encoder, it goes completely crazy: first it consumes all the CPU, then it ultimately freezes. Since the main point of the recorder is to stay up and running no matter what, I have to figure out how to handle this so the encoder isn't exposed to data it can't handle.

So the idea is to create an 'entropy detector': a simple, small routine that scans the frame buffer data and calculates an entropy index, i.e. how random the picture data actually is.

The result of the routine would be a number: 0 for a completely black picture, 1 for a completely random picture (snow, that is).

The routine itself should be forward-scanning only, with a few local variables that fit nicely into registers.

I could use the zlib or 7z API for such a task, but I would really like to cook up something of my own.

Any ideas?


Solution

PNG works this way (approximately): for each pixel, replace its value with that value minus the value of the pixel to its left. Do this from right to left, so each pixel's original left neighbor is still intact when you subtract it.
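A minimal sketch of that differencing step in C, assuming one scanline as a flat byte buffer (the function name and layout are mine, not from the question):

```c
#include <stddef.h>
#include <stdint.h>

/* PNG-style "Sub" filter, applied in place on one scanline:
 * each byte becomes the difference from the byte to its left.
 * Walking right to left means the original left neighbor is
 * still unmodified when it is subtracted. */
static void sub_filter_line(uint8_t *line, size_t len)
{
    for (size_t i = len; i-- > 1; )
        line[i] = (uint8_t)(line[i] - line[i - 1]);
    /* line[0] has no left neighbor and stays as-is */
}
```

On a flat (e.g. all-black) picture this turns almost every byte into 0, which is exactly what drives the entropy toward 0 in the next step.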

Then you can calculate the entropy (bits per byte) by building a table of how often each value now appears, converting those absolute counts into relative frequencies p, and summing -p * log2(p) over all entries with a nonzero count.

Oh, and you have to do this for each color channel (r, g, b) separately.

For the result, take the average of the bits per byte over the three channels and divide it by 8 (assuming 8 bits per color, the maximum possible entropy), so the index lands in the 0 to 1 range.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow