Question

I'm working on a particle filter for an autonomous robot right now, and am having trouble producing the expected distance measurements by which to filter the particles. I have an image that I'm using as a map. Each pixel represents a certain scaled area in the environment. Space the robot can occupy is white, walls are black, and areas that are exterior to the environment are grey.

If you are unfamiliar with what a particle filter is: my Python code will create a predetermined number of random guesses as to where the robot might be (x, y, theta) in the white space. It will then measure the distance to the nearest wall with ultrasonic sensors at several angles. The script will compare these measurements with the measurements that would have been expected at each angle for each guessed location/orientation. Guesses that most closely match the actual measurements will survive, while guesses that are less likely to be right will be eliminated.
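
To make the compare-and-keep step concrete, here is a rough sketch of what I have in mind (measurement_weight, resample, and sigma are just placeholder names, and I'm assuming independent Gaussian noise on each sensor reading):

import math
import random

def measurement_weight(expected, actual, sigma=10.0):
    # Likelihood of the actual ultrasonic readings given the readings this
    # guess would expect, assuming independent Gaussian noise per sensor.
    w = 1.0
    for e, a in zip(expected, actual):
        w *= math.exp(-((e - a) ** 2) / (2 * sigma ** 2))
    return w

def resample(particles, weights):
    # Keep likely guesses and drop unlikely ones by drawing a new set of the
    # same size, with probability proportional to each guess's weight.
    return random.choices(particles, weights=weights, k=len(particles))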

My problem is finding the nearest wall at a given angle. Say the sensor is measuring at 60°. For each guess, I need to adjust the angle to account for the guessed robot orientation, and then measure the distance to the wall at that angle. It's easy enough to find the nearest wall in the x direction:

from PIL import Image
from matplotlib.pyplot import imshow

mapp = Image.open("Map.png")   # map image: white = free space, black = walls, grey = exterior
pixels = mapp.load()
width = mapp.size[0]
height = mapp.size[1]
imshow(mapp)

pixelWidth = 5   # real-world length represented by one pixel

for x in range(int(self.x), width):               # scan to the right of the guessed x position
    if mapp.getpixel((x, 100)) == (0, 0, 0, 255): # identify the first black (wall) pixel on row 100
        distance = (x - self.x) * pixelWidth      # self.x is this guess's x coordinate, in pixels
        break

The problem is that I can't tell the script to search one pixel at a time along a 60°, or 23°, or whatever-angle line. Right now the best thing I can think of is to go in the x direction first, find a black pixel, and then use the tangent of the angle to determine how many pixels I need to move up or down, but there are obvious problems with this, mostly having to do with corners, and I can't imagine how many if statements it would take to work around them. Is there another solution?

Solution

Okay, I think I found a good approximation of what I'm trying to do, though I'd still like to hear if anyone else has a better solution. Before each pixel move, I compare the slope I've actually traveled so far against the tangent of the target angle, and use that to decide whether to move one pixel in the x direction or one pixel in the y direction.

import math

for a in angles:
    # Absolute ray angle = this guess's orientation + the sensor's angle.
    angle = (self.orientation + a) % 360
    x = int(self.x)          # guessed position, in pixel coordinates
    y = int(self.y)
    x1 = x                   # remember the starting pixel
    y1 = y
    xtoy_ratio = math.tan(angle * math.pi / 180)
    # Pick the step direction on each axis from the quadrant of the angle.
    if angle <= 90:
        xadd = 1
        yadd = 1
    elif angle <= 180:
        xadd = -1
        yadd = 1
    elif angle <= 270:
        xadd = -1
        yadd = -1
    else:
        xadd = 1
        yadd = -1
    # Step one pixel at a time until a black (wall) pixel is hit,
    # guarding against walking off the edge of the map.
    while (0 <= x < width and 0 <= y < height
           and mapp.getpixel((x, y)) != (0, 0, 0, 255)):
        # Move along whichever axis is lagging behind the target slope;
        # comparing |dy| with |dx|*|tan| also avoids dividing by zero
        # on the very first step.
        if abs(y - y1) < abs((x - x1) * xtoy_ratio):
            y += yadd
        else:
            x += xadd
    distance = math.sqrt((y - y1) ** 2 + (x - x1) ** 2) * pixel_width
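
For reference, here is a minimal sketch of how I'd wrap that loop so each guess can produce its expected readings; expected_distance is just a hypothetical name, and it leans on the same mapp, width, height, and pixel_width from above:

import math

def expected_distance(px, py, angle_deg):
    # Cast a ray from pixel (px, py) at angle_deg (degrees, already including
    # the guessed orientation) and return the distance to the first wall pixel.
    angle = angle_deg % 360
    x, y = int(px), int(py)
    x1, y1 = x, y
    xtoy_ratio = math.tan(angle * math.pi / 180)
    xadd = 1 if angle <= 90 or angle > 270 else -1
    yadd = 1 if angle <= 180 else -1
    while (0 <= x < width and 0 <= y < height
           and mapp.getpixel((x, y)) != (0, 0, 0, 255)):
        if abs(y - y1) < abs((x - x1) * xtoy_ratio):
            y += yadd
        else:
            x += xadd
    return math.sqrt((y - y1) ** 2 + (x - x1) ** 2) * pixel_width

# Expected readings for one guess, one per sensor angle (example offsets):
sensor_angles = [0, 60, 120, 240, 300]
expected = [expected_distance(self.x, self.y, self.orientation + a)
            for a in sensor_angles]

That expected list is what gets compared against the actual ultrasonic readings when deciding which guesses survive.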

The accuracy of this method of course depends a great deal on the actual length represented by each pixel. As long as pixel_width is small, accuracy will be pretty good, but if each pixel covers a large area, the path can drift quite far from the true ray before it corrects itself.

As I said, I welcome other answers.

Thanks

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow