Question

I am developing an app that maps a room by steering a Sphero robot ball along its edges.

For this I need three datasets continuously streamed from the Sphero to the (Android) app:

  • Collision detection to detect corners that go "inside" the room
  • Locator data to know where a detected corner is
  • Sideways movement of the Sphero, combined with driving slightly tilted towards the edge, to detect corners that go "outside" the room.

My implementation of collision detection and locator data streaming works without problems; however, I still have a few questions about additionally streaming movement data.

The first big question was how to do this, since locator data and movement data are streamed the same way, via asynchronous data streaming. This question was already answered when I was in e-mail contact with Sphero support:

By combining several of the data streaming masks declared in SetDataStreamingCommand with bitwise OR (|), it is possible to configure the Sphero to stream multiple datasets simultaneously.
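A minimal sketch of what such a combined SetDataStreamingCommand call could look like. It assumes the classic Orbotix Android SDK; the mask constant names and the exact sendCommand signature are written from memory and may differ in your SDK version, so treat this as an illustration rather than finished code:

import orbotix.robot.base.Robot;
import orbotix.robot.base.SetDataStreamingCommand;

public class StreamingSetup {

    // Stream locator data and velocity X in one asynchronous stream.
    public static void startStreaming(Robot robot) {
        // Combine the masks with bitwise OR so both datasets are streamed together.
        // Constant names follow the old Orbotix SDK and may differ in other versions.
        final long mask = SetDataStreamingCommand.DATA_STREAMING_MASK_LOCATOR_ALL
                | SetDataStreamingCommand.DATA_STREAMING_MASK_VELOCITY_X;

        final int divisor = 50;      // 400 Hz / 50 = 8 samples per second
        final int packetFrames = 1;  // one sensor frame per async packet
        final int packetCount = 0;   // 0 = stream until stopped

        SetDataStreamingCommand.sendCommand(robot, divisor, packetFrames, mask, packetCount);
    }
}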

However, I still have open questions:

  1. I am planning to use the velocity-X mask for the movement data. Will this data be Sphero-relative (movement at a 90° angle to the driving direction) or world-relative (like the locator velocity data)?

  2. Where will the velocity-X data be stored in the event data that gets passed to the listener?

Thanks in advance for your answers.


Solution

The velocity-X value is part of the locator, so it is the velocity relative to the locator grid, i.e. world-relative rather than Sphero-relative.

When you get the data back in the AsyncDataListener, you can find the values in the same LocatorData object you used for the locator information:

LocatorData#getVelocityX():float

Like so:

data.getVelocityX();

This time, however, the values will actually contain information instead of being zeroes, because you had previously asked for the velocity in the SetDataStreamingCommand.
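To make that concrete, here is a minimal listener sketch that pulls the values out of the async packets. It assumes the classic Orbotix Android SDK; the package paths and the DeviceSensorsAsyncData / DeviceSensorsData / getPositionX() / getPositionY() names are from memory and may differ in your SDK version, so verify them against the SDK you are using:

import java.util.List;

import orbotix.robot.base.DeviceAsyncData;
import orbotix.robot.base.DeviceMessenger;
import orbotix.robot.base.DeviceSensorsAsyncData;
import orbotix.robot.base.Robot;
import orbotix.robot.sensor.DeviceSensorsData;
import orbotix.robot.sensor.LocatorData;

public class LocatorStreamListener {

    private final DeviceMessenger.AsyncDataListener listener =
            new DeviceMessenger.AsyncDataListener() {
                @Override
                public void onDataReceived(DeviceAsyncData asyncData) {
                    if (!(asyncData instanceof DeviceSensorsAsyncData)) {
                        return;
                    }
                    // A packet can carry several sensor frames, depending on packetFrames.
                    List<DeviceSensorsData> frames =
                            ((DeviceSensorsAsyncData) asyncData).getAsyncData();
                    for (DeviceSensorsData frame : frames) {
                        LocatorData data = frame.getLocatorData();
                        if (data != null) {
                            float x = data.getPositionX();   // position on the locator grid
                            float y = data.getPositionY();
                            float vx = data.getVelocityX();  // velocity relative to the locator grid
                            // ... use x, y and vx, e.g. to record where a corner was detected
                        }
                    }
                }
            };

    public void register(Robot robot) {
        DeviceMessenger.getInstance().addAsyncDataListener(robot, listener);
    }
}

The listener is registered per robot via DeviceMessenger, so register it before (or right after) sending the SetDataStreamingCommand so you do not miss the first packets.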

Licensed under: CC-BY-SA with attribution