Question

Recently I have been working on a robotics project in Java. The robot is instructed to drive around the room logging data about the area around it. The robot has 16 sonar sensors positioned all the way around it. The Java code I have written is irrelevant to my question; all it does is drive the robot around and write data out to a CSV file.

The sensors are positioned as follows:

[image: sonar layout]

The file looks like this: [link to sample file output]

Each row in the CSV file contains the following information: xpos, ypos, yaw(in radians), 16 sonar readouts, timestamp
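For reference, a minimal sketch of reading one row with Python's csv module, assuming exactly the field order above:

import csv

with open('data.csv', newline='') as fp:
    for row in csv.reader(fp):
        x, y, yaw = float(row[0]), float(row[1]), float(row[2])  # position (m) and yaw (rad)
        sonar = [float(v) for v in row[3:19]]                    # the 16 sonar readings
        timestamp = row[19]                                      # timestamp, left as a string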

When the robot starts it is in the top right corner facing right. The robot sees the world as a 20x20 grid (-10 to 10).

I am using Python to process this information and am trying to map the environment around the robot. Since the CSV file tells me the robot's x,y location in the world and its yaw, I want to use the Python Imaging Library to plot the robot's position on an image and then plot a red dot around the robot at the distance reported by each sonar sensor.

Currently I have this code to generate the robot's path:

from PIL import Image, ImageDraw

img = Image.new('RGB', (2000, 2000), (255, 255, 255))
idraw = ImageDraw.Draw(img)

with open('data.csv') as fp:
    for line in fp:
        tempLine = line.split(',')
        # shift world coords (-10..10 m) into 0..20 and scale to the 2000px canvas
        x = (float(tempLine[0]) + 10.0) * 100
        y = (float(tempLine[1]) + 10.0) * 100
        # y is flipped because image coordinates grow downwards
        idraw.rectangle([(x, 2000 - y), (x + 10, 2000 - (y + 10))], fill=(0, 0, 0))

Giving the following output: [image of the generated path]

I'm at a loss trying to get the sonar points to map. My approach was to work out the point at each sensor's distance and then rotate those coordinates around the robot's yaw.

This has not worked at all and has given me some rather strange results: [image of the incorrect output]

Any help will be appreciated! Thanks!


Solution

Assumptions

- The area is 20 m by 20 m, with the origin at the centre of the area, x positive towards the right and y positive upwards.
- x,y gives the position of the 'centre' of the robot in metres (this makes the calculations regarding the sensors more straightforward).
- Each sensor is 5 cm from the 'centre' of the robot.
- The angle, A, of sensor S is as given in the image below.

[image: sensor angles]

Yaw 0 is when the robot is facing as shown below.

[image: yaw-zero orientation]

Yaw is given in the anti-clockwise direction, in radians.

NOTE: The following calculations are based on real-world coordinates; adjustments to screen coordinates for drawing on screen would still need to be made.
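As an illustration only, assuming the questioner's 2000 by 2000 pixel canvas with 100 pixels per metre, that adjustment could be sketched like this:

def world_to_screen(wx, wy, size=2000, scale=100):
    # shift the -10..10 m world range to 0..20 m, scale to pixels,
    # and flip y because screen coordinates grow downwards
    sx = (wx + 10.0) * scale
    sy = size - (wy + 10.0) * scale
    return sx, sy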

With the robot at (x,y) and yaw T:

Calculation of the coordinates of the object sensed by sensor S, where the angle of S is A in degrees (A*PI/180 in radians) and the reading from S is r metres:


(x + (r + 0.05)*cos(A*PI/180 + T), y + (r + 0.05)*sin(A*PI/180 + T))
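A minimal Python sketch of this calculation (the 0.05 m offset is the sensor distance assumed above; the sensor angle must be taken from the sonar layout image):

import math

SENSOR_OFFSET = 0.05  # metres from the robot 'centre' to each sensor (assumption above)

def sensed_point(x, y, yaw, sensor_angle_deg, reading):
    # x, y: robot centre in metres; yaw: radians, anti-clockwise
    # sensor_angle_deg: angle A of the sensor, in degrees, from the layout image
    # reading: sonar reading r in metres
    a = math.radians(sensor_angle_deg) + yaw
    d = reading + SENSOR_OFFSET
    return x + d * math.cos(a), y + d * math.sin(a)

Looping this over the 16 readings in each CSV row gives one set of sensed points per timestep.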

Using these calculations gives the following scatter graph (which I produced in Excel - sorry but Python is not my language).

If this scatter graph is close to what you expect, then the above calculation is along the right lines. Robot path in blue, readings in red:

[image: scatter graph of robot path (blue) and sonar readings (red)]
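For completeness, a rough Python equivalent of that plot using matplotlib could look like the sketch below, reusing the hypothetical sensed_point helper from above; SENSOR_ANGLES is a placeholder for the 16 sensor angles (in degrees) from the sonar layout image.

import matplotlib.pyplot as plt

SENSOR_ANGLES = []  # placeholder: fill in the 16 sensor angles (degrees) from the layout image

path_x, path_y, obj_x, obj_y = [], [], [], []
with open('data.csv') as fp:
    for line in fp:
        fields = line.split(',')
        x, y, yaw = float(fields[0]), float(fields[1]), float(fields[2])
        path_x.append(x)
        path_y.append(y)
        for angle, reading in zip(SENSOR_ANGLES, fields[3:19]):
            px, py = sensed_point(x, y, yaw, angle, float(reading))
            obj_x.append(px)
            obj_y.append(py)

plt.scatter(path_x, path_y, c='blue', s=4, label='robot path')
plt.scatter(obj_x, obj_y, c='red', s=4, label='sonar readings')
plt.axis('equal')
plt.legend()
plt.show()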

License: CC BY-SA with attribution