Question

I recently wrote an extremely basic edge detection algorithm that works on an array of chars. The program detects the edges of blobs of a single particular value in the array: for each element, it looks at the neighbours to the left, right, above, and below, and checks whether any of them differs from the value it is currently looking at. The goal was not to produce a mathematical line but rather a set of ordered points representing a discretized closed-loop edge.
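A minimal sketch of that edge test, assuming a row-major width × height char array and a hypothetical target value (the function name and signature are illustrative, not the asker's actual code):

```cpp
#include <utility>
#include <vector>

// A cell holding the target value is an edge point if any of its
// 4-neighbours (left, right, up, down) holds a different value.
std::vector<std::pair<int, int>> find_edges(const std::vector<char>& grid,
                                            int width, int height,
                                            char target) {
    std::vector<std::pair<int, int>> edges;
    const int dx[4] = {-1, 1, 0, 0};
    const int dy[4] = {0, 0, -1, 1};
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (grid[y * width + x] != target) continue;
            for (int i = 0; i < 4; ++i) {
                int nx = x + dx[i], ny = y + dy[i];
                // Treat out-of-bounds neighbours as "different", so blob
                // cells on the array border also count as edge points.
                if (nx < 0 || nx >= width || ny < 0 || ny >= height ||
                    grid[ny * width + nx] != target) {
                    edges.emplace_back(x, y);
                    break;
                }
            }
        }
    }
    return edges;
}
```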

The algorithm works perfectly fine, except that my data contains a bit of noise and so randomly produces edges where there should be none. This in turn wreaked havoc on some of my other programs down the line.

There are two types of noise in the data. The first type is fairly sparse and somewhat random. The second type is a semi-continuous straight line along the x = y axis. I know the source of the first type: it's a feature of the data and there is nothing I can do about it. As for the second type, I know it's my program's fault for causing it, though I haven't a clue what exactly is doing so.

My question is: How should I go about removing the noise completely?

I know that the correct data consists of points that are always adjacent to one another, compact and ordered (with no gaps), forming one closed loop or multiple loops. The first type of noise, being sparse and random, could be handled easily: check whether any point next to a suspected noise point is also counted as an edge. If not, the point is almost certainly noise and should be removed, as in the sketch below.
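A minimal sketch of that neighbour check, assuming the edge points are stored as a set of (x, y) coordinates (the set representation is an assumption for illustration):

```cpp
#include <set>
#include <utility>

// Keep an edge point only if at least one of its 8 neighbours is also an
// edge point; points with no neighbours are treated as sparse noise.
std::set<std::pair<int, int>> drop_isolated(
        const std::set<std::pair<int, int>>& edges) {
    std::set<std::pair<int, int>> kept;
    for (const auto& p : edges) {
        bool has_neighbour = false;
        for (int dy = -1; dy <= 1 && !has_neighbour; ++dy) {
            for (int dx = -1; dx <= 1; ++dx) {
                if (dx == 0 && dy == 0) continue;  // skip the point itself
                if (edges.count({p.first + dx, p.second + dy})) {
                    has_neighbour = true;
                    break;
                }
            }
        }
        if (has_neighbour) kept.insert(p);
    }
    return kept;
}
```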

However, the second type of noise, the semi-continuous line about x = y, poses more of a problem. The line is continuous for random lengths (at its longest it ran halfway across my entire array unbroken). It can even intersect the actual edge.

Any ideas on how to do this?

Solution

In image processing this is normally handled with a median filter.
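A minimal 3x3 median filter sketch, assuming the same row-major char grid as in the question; replacing each cell with the median of its neighbourhood suppresses isolated noise pixels while preserving edges:

```cpp
#include <algorithm>
#include <vector>

std::vector<char> median3x3(const std::vector<char>& grid,
                            int width, int height) {
    std::vector<char> out(grid);  // border cells left unchanged
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            char window[9];
            int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    window[n++] = grid[(y + dy) * width + (x + dx)];
            // Partially sort so that index 4 holds the median of 9 samples.
            std::nth_element(window, window + 4, window + 9);
            out[y * width + x] = window[4];
        }
    }
    return out;
}
```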

You also often do a dilate (make lines thicker) then an erode (make lines thinner) to close up any gaps in the lines.
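A sketch of that dilate-then-erode pass (a morphological closing), assuming a binary grid where a single `on` value marks edge pixels; the structuring element and border handling here are illustrative choices:

```cpp
#include <vector>

// One 3x3 morphological pass. Dilate: a cell turns on if any cell in its
// 3x3 window is on. Erode: a cell stays on only if all nine are on.
// Border cells are simply copied through in this sketch.
static std::vector<char> morph(const std::vector<char>& grid, int width,
                               int height, char on, bool dilate) {
    std::vector<char> out(grid);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            int hits = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    if (grid[(y + dy) * width + (x + dx)] == on) ++hits;
            out[y * width + x] = (dilate ? hits > 0 : hits == 9) ? on : 0;
        }
    }
    return out;
}

// Dilation first closes small gaps; the erosion thins the lines back down.
std::vector<char> close_gaps(const std::vector<char>& grid, int width,
                             int height, char on) {
    return morph(morph(grid, width, height, on, true),
                 width, height, on, false);
}
```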

OTHER TIPS

Noise tends to concentrate at higher frequencies, so run a low-pass filter over the image before you do edge detection. I've seen this principle used to do sub-pixel edge detection.
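A simple 3x3 box blur as one possible low-pass filter, assuming the grid holds small numeric intensity values (for a two-valued blob map a median filter is likely the better fit, as noted above):

```cpp
#include <vector>

// Averaging each cell with its 3x3 neighbourhood attenuates
// high-frequency noise before the edge pass runs.
std::vector<char> box_blur3x3(const std::vector<char>& grid,
                              int width, int height) {
    std::vector<char> out(grid);  // border cells left unchanged
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            int sum = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    sum += grid[(y + dy) * width + (x + dx)];
            out[y * width + x] = static_cast<char>(sum / 9);
        }
    }
    return out;
}
```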

This is the sort of thing that I'd throw into unit tests. Get some minimal datasets that exhibit the problem (small enough to be encoded directly in the test file), run the tests, and with such a small dataset just step through and see what's going on.
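A minimal test along those lines, reusing the hypothetical `find_edges` sketch from above on a hand-encoded 5x5 grid (the expected result is specific to that sketch):

```cpp
#include <cassert>
#include <utility>
#include <vector>

int main() {
    const int W = 5, H = 5;
    std::vector<char> grid(W * H, 0);
    grid[2 * W + 2] = 1;  // a single target cell in the centre

    auto edges = find_edges(grid, W, H, 1);

    // A lone cell has no same-valued neighbours, so it is all edge.
    assert(edges.size() == 1);
    assert(edges[0] == std::make_pair(2, 2));
    return 0;
}
```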

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow