Question

I've started differentiating two images by counting the number of different pixels using a simple algorithm:

private int returnCountOfDifferentPixels(String pic1, String pic2) {
    Bitmap i1 = loadBitmap(pic1);
    Bitmap i2 = loadBitmap(pic2);
    int count = 0;

    for (int y = 0; y < i1.getHeight(); ++y) {
        for (int x = 0; x < i1.getWidth(); ++x) {
            if (i1.getPixel(x, y) != i2.getPixel(x, y)) {
                count++;
            }
        }
    }

    return count;
}

However, this approach seems inefficient in its initial form, as even very similar photos differ in a very high number of pixels. I was thinking of a way to determine whether two pixels are really THAT different. Bitmap.getPixel(x, y) on Android returns the pixel's color as a packed int, which can be unpacked with the Color class.

How can I implement a proper comparison between two pixel colors to help with my motion detection?

Solution

You are right: because of noise and other factors, there is usually a lot of raw pixel change in a video stream. Here are some options you might want to consider:

  1. Blurring the image first, ideally with a Gaussian filter or with a simple box filter. This just means that you take the (weighted) average over the neighboring pixels and the pixel itself. This should already reduce the sensor noise quite a bit (see the sketch after this list).

  2. Only adding the difference to count if it's larger than some threshold. This has the effect of only considering pixels that have really changed a lot. This is very easy to implement and might already solve your problem on its own.
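For option 1, here is a minimal sketch of a 3x3 box blur on an Android Bitmap. The method name boxBlur and the fixed 3x3 kernel are just illustrative choices; the idea is simply to average each pixel with its neighbors before comparing frames:

private Bitmap boxBlur(Bitmap src) {
    int w = src.getWidth();
    int h = src.getHeight();
    Bitmap out = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);

    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int rSum = 0, gSum = 0, bSum = 0;
            // Average the 3x3 neighborhood, clamping at the image border.
            for (int dy = -1; dy <= 1; dy++) {
                for (int dx = -1; dx <= 1; dx++) {
                    int nx = Math.min(Math.max(x + dx, 0), w - 1);
                    int ny = Math.min(Math.max(y + dy, 0), h - 1);
                    int p = src.getPixel(nx, ny);
                    rSum += Color.red(p);
                    gSum += Color.green(p);
                    bSum += Color.blue(p);
                }
            }
            out.setPixel(x, y, Color.rgb(rSum / 9, gSum / 9, bSum / 9));
        }
    }
    return out;
}

For real-time use you would want to read the pixels out once with getPixels(...) rather than calling getPixel in a loop, but this shows the idea.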

Thinking about it, try these two options first. If they don't work out, I can give you some more options.

EDIT: I just saw that you're not actually summing up differences but just counting different pixels. This is fine if you combine it with Option 2. Option 1 still works, but it might be overkill.

Also, to find out the difference between two colors, use the methods of the Color class:

int p1 = i1.getPixel(x, y);
int p2 = i2.getPixel(x, y);
int totalDiff = (Color.red(p1) - Color.red(p2))
        + (Color.green(p1) - Color.green(p2))
        + (Color.blue(p1) - Color.blue(p2));

Now you can come up with a threshold that totalDiff must exceed before it contributes to count.

Of course, you can play around with these numbers in various ways. The above code, for example, only captures changes in overall pixel intensity (brightness). If you also want to pick up changes in hue and saturation, you would have to compute totalDiff like this:

int totalDiff = Math.abs(Color.red(p1) - Color.red(p2))
        + Math.abs(Color.green(p1) - Color.green(p2))
        + Math.abs(Color.blue(p1) - Color.blue(p2));
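Putting Option 2 together with the per-channel difference, the counting loop might look roughly like this; the threshold of 30 is just an assumed starting value to tune, not a canonical number:

int threshold = 30; // tune this: higher means less sensitive to noise
int count = 0;
for (int y = 0; y < i1.getHeight(); ++y) {
    for (int x = 0; x < i1.getWidth(); ++x) {
        int p1 = i1.getPixel(x, y);
        int p2 = i2.getPixel(x, y);
        int totalDiff = Math.abs(Color.red(p1) - Color.red(p2))
                + Math.abs(Color.green(p1) - Color.green(p2))
                + Math.abs(Color.blue(p1) - Color.blue(p2));
        if (totalDiff > threshold) {
            count++; // only count pixels that changed noticeably
        }
    }
}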

Also, have a look at the other methods of Color, for example RGBToHSV(...).
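For instance, Color.colorToHSV(...) converts a packed color int to hue/saturation/value, so you could compare only the brightness channel; a rough sketch, where the 0.1f threshold is an arbitrary example:

float[] hsv1 = new float[3];
float[] hsv2 = new float[3];
Color.colorToHSV(p1, hsv1); // hsv = {hue 0..360, saturation 0..1, value 0..1}
Color.colorToHSV(p2, hsv2);
// Compare only the value (brightness) channel.
boolean changed = Math.abs(hsv1[2] - hsv2[2]) > 0.1f;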

OTHER TIPS

I know that this is essentially very similar to another answer here, but I think that restating it in a different form might prove useful to those seeking a solution. This approach assumes you have more than two images over time. If you literally only have two, it will not work as described, but an equivalent method will.

Keep a running history for every pixel, updated on each frame. For example, for each pixel: history[x, y] = (history[x, y] * (w - 1) + get_pixel(x, y)) / w

where w might be 20. The higher w is, the larger the spike when motion occurs, but the longer motion has to be absent before the history resets.

Then to determine if something has changed you can do this for each pixel:

changed_delta = abs(history[x, y] - get_pixel(x, y))

total_delta += changed_delta

You will find that this stabilizes most of the noise, and when motion happens you get a large difference. You are essentially accumulating many frames and detecting motion by comparing that accumulation against the newest frame.
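A minimal sketch of that idea in Java, assuming one bitmap per incoming frame; the field names, w = 20 and the use of brightness as the per-pixel value are illustrative choices:

private static final int W = 20;   // history weight: higher adapts more slowly
private float[][] history;         // running per-pixel brightness history

// Call once per incoming frame; returns the total motion delta for that frame.
private double updateHistoryAndMeasure(Bitmap frame) {
    int width = frame.getWidth();
    int height = frame.getHeight();
    if (history == null) {
        history = new float[width][height];
    }
    double totalDelta = 0;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int p = frame.getPixel(x, y);
            // Use brightness (mean of R, G, B) as the pixel value.
            float value = (Color.red(p) + Color.green(p) + Color.blue(p)) / 3f;
            totalDelta += Math.abs(history[x][y] - value);
            // history = (history * (w - 1) + value) / w
            history[x][y] = (history[x][y] * (W - 1) + value) / W;
        }
    }
    return totalDelta;
}

Note that the very first frame produces a large delta because the history starts at zero; you can either seed the history with the first frame or discard the first result.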

Also, for detecting positions of motion consider breaking the image into smaller pieces and doing them individually. Then you can find objects and track them across the screen by treating a single image as a grid of separate images.
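As a rough sketch of the grid idea, here is one way to score motion per cell between two frames of the same size; the 8x8 grid suggested below and the method name are arbitrary assumptions:

// Split the frame into a cells x cells grid and sum per-pixel change in each cell.
private double[][] motionGrid(Bitmap prev, Bitmap curr, int cells) {
    int width = curr.getWidth();
    int height = curr.getHeight();
    double[][] cellDeltas = new double[cells][cells];

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int p1 = prev.getPixel(x, y);
            int p2 = curr.getPixel(x, y);
            int diff = Math.abs(Color.red(p1) - Color.red(p2))
                    + Math.abs(Color.green(p1) - Color.green(p2))
                    + Math.abs(Color.blue(p1) - Color.blue(p2));
            // Add this pixel's change to the grid cell it falls in.
            cellDeltas[x * cells / width][y * cells / height] += diff;
        }
    }
    return cellDeltas; // cells with the largest totals are where the motion is
}

Called as motionGrid(previousFrame, currentFrame, 8), the cells with the largest totals mark the regions that changed the most, which you can then track from frame to frame.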

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow