Player/Stage: User Manual¶
About Player/Stage¶


Warning
This document is for users only. If you want to build, configure, or add features to your Donnie, please refer to the Donnie Assistive Robot: Developer Manual.
Donnie is an assistive technology project whose objective is to use robotics to facilitate the teaching of programming to visually impaired students. It is divided into two main parts:
- The construction and fine-tuning of the titular mobile robot, Donnie;
- The project’s software stack, including an intuitive parser/interpreter and a robot simulation environment;
The project is in its second version, developed in the Laboratório de Sistemas Autônomos (LSA) of the Pontifical Catholic University of Rio Grande do Sul (PUCRS), Brazil.
How to Use Player/Stage¶
6th Edition
Using Player 3.0.2 and Stage 4.1.1 (development versions)
Kevin Nickels, Jennifer Owen, Alexandre Amory
2 July 2017
A user manual for the Player/Stage robot simulator.
This document is intended as a guide for anyone learning Player/Stage for the first time. It explains the process of setting up a new simulation environment and how to then make your simulation do something, using a case study along the way. Whilst it is aimed at Player/Stage users, those just wishing to use Player on their robot may also find sections of this document useful (particularly the parts about coding with Player).
If you have any questions about using Player/Stage there is a guide to getting help from the Player community at http://playerstage.sourceforge.net/wiki/Getting_help
This edition of the manual uses the development versions of Player and Stage. They can be found at https://github.com/playerproject/player.git and at https://github.com/rtv/Stage.git . The build process for Player and Stage is described in http://playerstage.sourceforge.net/doc/Player-3.0.2/player/install.html and http://rtv.github.io/Stage/install.html
There are only minor changes from v4.1.0 to this version of the manual. Versions older than that contain significant changes and are now outdated. Other versions of this manual are available online.
TABLE OF CONTENTS¶
- Introduction
- The Basics
- Building a World
- Writing a Configuration (.cfg) File
- Getting Your Simulation to Run Your Code
- Controllers (C++)
- Controllers (C)
- Controllers (Py-libplayercpp)
- Controllers (Py-libplayerc)
- Building a New Interface (C/C++)
- Building a New Driver (C/C++)
Change Log¶
- 3 July 2017 forked by LSA (Laboratório de Sistemas Autônomos) to include updated instructions for the build process and for creating new drivers and interfaces.
- 15 Sept 2015 forked off development version of manual
- 7 Aug 2015 released v4.1.0 covering stable versions
- 30 June 2015 updating markdown for readthedocs.org
- 18 May 2015 began migration from LaTeX to MARKDOWN on GitHub
- August 2013 updated manual to Stage 4.1.1
- 1st August 2013 Source code made available online
- 16th April 2010 updated manual to Stage 3.2.2
- 10th July 2009 original manual covering Stage versions 2.1.1 and 3.1.0
Player/Stage is a robot simulation tool. It comprises one program, Player, which is a Hardware Abstraction Layer: it talks to the bits of hardware on the robot (like a claw or a camera) and lets you control them with your code, meaning you don’t need to worry about how the various parts of the robot work. Stage is a plugin to Player which listens to what Player is telling it to do and turns these instructions into a simulation of your robot. It also simulates sensor data and sends this to Player, which in turn makes the sensor data available to your code.
A simulation then, is composed of three parts:
- Your code. This talks to Player.
- Player. This takes your code and sends instructions to a robot. From the robot it gets sensor data and sends it to your code.
- Stage. Stage interfaces with Player in the same way as a robot’s hardware would. It receives instructions from Player and moves a simulated robot in a simulated world, it gets sensor data from the robot in the simulation and sends this to Player.
Together Player and Stage are called Player/Stage, and they make a simulation of your robots.
These instructions will be focusing on how to use Player/Stage to make a simulation, but hopefully this will still be a useful resource for anyone just using Player (which is the same thing but on a real robot, without any simulation software).
1.1 - A Note on Installing Player/Stage¶
The install script described below has been tested with Ubuntu 12.04, 14.04, and 16.04. Each new version might introduce slightly different package names, so it’s up to you to keep updating this script in the future. The script is located at code/install.sh.
For OSX users you might find the following install instructions useful: http://alanwinfield.blogspot.com/2009/07/installing-playerstage-on-os-x-with.html
Even after it’s installed, you may need to do some per-user setup on your system. For example, on our system, the following lines (adapted as needed to your particular system) need to be added to each user’s $HOME/.bashrc file (or to the system-wide one). Edit PLAYERDIR according to the directory selected for installation. If this parameter has not been changed, the typical directory is /usr/local.
export PLAYERDIR=/usr/local
export LD_LIBRARY_PATH=/usr/lib:/usr/local/lib/:${PLAYERDIR}/lib/x86_64-linux-gnu/:${LD_LIBRARY_PATH}
export PATH=${PLAYERDIR}/bin:${PATH}
export PKG_CONFIG_PATH=/usr/lib/pkgconfig:/usr/lib/x86_64-linux-gnu/pkgconfig/:/usr/share/pkgconfig/:/usr/local/lib/pkgconfig/:${PKG_CONFIG_PATH}
export CMAKE_MODULE_PATH=${PLAYERDIR}/share/cmake/Modules/:${CMAKE_MODULE_PATH}
Did anything go wrong during the installation? It is usually straightforward to solve these problems. Typically, the error message names the library or piece of code with the error. Google this library name together with your Linux distribution version to find out how to install the library. Install it and try again. If the error comes after installation, during the execution of Player, then you probably forgot to add the environment variables to your $HOME/.bashrc file, or your system has different path names. Adjust them and try again. You might also run

ldd /usr/local/bin/player

to find out which dynamic library was not found. Once you have the missing library’s name, look for it in the system. If it is found, add its path to the LD_LIBRARY_PATH environment variable in the $HOME/.bashrc file. If it is not found, Google it and install the library on your system. Then try to execute Player again.
1.2 - A Note about TRY IT OUT sections¶
There will be sections scattered throughout this tutorial labeled TRY IT OUT that explain how to run examples. You’ll need to download the example code, which contains the files. In these sections, you’ll be given commands to type in a terminal window (or bash shell). They’ll be shown prefixed with a caret > and typeset in a monospace font. For example,

> ls

means to go to a terminal window and type the command given (ls), without the > character, then hit return.
In many cases, you’ll need to run the first command (player configfile.cfg) in the background, since it doesn’t quit until player is done. To do this, you add an ampersand (&) after the command. So if you type player configfile.cfg in a terminal, it will appear to freeze. If you instead type player configfile.cfg & into the terminal, it will start up player and return. Player will however print information and debugging messages to your terminal, which can be a bit confusing at times.
When you use the example files, you should examine them to see how they work; just running the examples will not teach you (as) much!
1.3 - TRY IT OUT (Preparation)¶
- If you haven’t already, download the sample code from http://github.com/NickelsLab/Player-Stage-Manual/archive/v4.1.0.zip
- Next, you’ll need to extract the sample code. To do this, open a terminal and cd to the directory where you put the file master.zip, then extract using zip. Yes, there are GUI-based ways to do this too; I won’t cover them here.
- I’ll assume that you want to put this directory in your home directory. If not, just replace the commands given with the appropriate directory.
- I’ll likewise assume that you downloaded the code into $HOME/Downloads. (Again, your specific path may differ.)
- (From here on out, I’ll just say that your extracted source code is in <source_code> for brevity and generality.)
At this point, you should see one directory for each section of this manual, which contain the code examples for the respective chapters, and one, bitmaps, that has pictures used in several different examples.
1.4 - TRY IT OUT¶
First we will run a world and configuration file that comes bundled with Stage. In a terminal window, navigate to the Stage/worlds folder; by default (in Linux at least) this is /usr/local/share/stage/worlds. Type the following commands to run the “simple world” that comes with Player/Stage:
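Assuming the default Linux install location above, the commands would be along these lines (the trailing & keeps the terminal usable while player runs, as discussed in the TRY IT OUT note):

```
> cd /usr/local/share/stage/worlds
> player simple.cfg &
```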
Assuming Player/Stage is installed properly, you should now have a window open which looks like the figure below. Congratulations, you can now build Player/Stage simulations!

The simple.cfg world after being run
2.1 - Important File Types¶
In Player/Stage there are 3 kinds of file that you need to understand to get going with Player/Stage:
- a .world file
- a .cfg (configuration) file
- a .inc (include) file
The .world file tells Player/Stage what things are available to put in the world. In this file you describe your robot, any items which populate the world, and the layout of the world. The .inc file follows the same syntax and format as a .world file, but it can be included. So if there is an object in your world that you might want to use in other worlds, such as a model of a robot, putting the robot description in a .inc file makes it easier to copy over; it also means that if you ever want to change your robot description then you only need to do it in one place, and your multiple simulations are changed too.
The .cfg file is what Player reads to get all the information about the robot that you are going to use. This file tells Player which drivers it needs to use in order to interact with the robot. If you’re using a real robot these drivers are built in to Player (or you can download or write your own drivers, but I’m not going to talk about how to do this here). Alternatively, if you want to make a simulation, the driver is always Stage (this is how Player uses Stage in the same way it uses a robot: it thinks that it is a hardware driver and communicates with it as such). The .cfg file tells Player how to talk to the driver, and how to interpret any data from the driver so that it can be presented to your code. Items described in the .world file should be described in the .cfg file if you want your code to be able to interact with that item (such as a robot); if you don’t need your code to interact with the item then this isn’t necessary. The .cfg file does all this specification using interfaces and drivers, which will be discussed in the following section.
2.2 - Interfaces, Drivers and Devices¶
- Drivers are pieces of code that talk directly to hardware. These are built in to Player so it is not important to know how to write these as you begin to learn Player/Stage. The drivers are specific to a piece of hardware so, say, a laser driver will be different to a camera driver, and also different to a driver for a different brand of laser. This is the same as the way that drivers for graphics cards differ for each make and model of card. Drivers produce and read information which conforms to an interface. The driver design is described in Chapter 11.
- Interfaces are a set way for a driver to send and receive information from Player. Like drivers, interfaces are also built in to Player and there is a big list of them in the Player manual. They specify the syntax and semantics of how drivers and Player interact. The interface design is described in Chapter 10.
- A device is a driver that is bound to an interface so that Player can talk to it directly. This means that if you are working on a real robot you can interact with a real device (laser, gripper, camera, etc.) on the real robot; in a simulated robot you can interact with their simulations.
The official documentation actually describes these three things quite well with an example. (Actually, the official documentation still refers to the deprecated laser interface, but I’ve updated all the references in this manual to use the new ranger interface.)
Consider the ranger interface. This interface defines a format in which a planar range-sensor can return range readings (basically a list of ranges, with some meta-data). The ranger interface is just that: an interface. You can’t do anything with it.
Now consider the sicklms200 driver. This driver controls a SICK LMS200, which is particular planar range sensor that is popular in mobile robot applications. The sicklms200 driver knows how to communicate with the SICK LMS200 over a serial line and retrieve range data from it. But you don’t want to access the range data in some SICK-specific format. So the driver also knows how to translate the retrieved data to make it conform to the format defined by the ranger interface.
The sicklms200 driver can be bound to the ranger interface … to create a device, which might have the following address:
localhost:6665:ranger:0
The fields in this address correspond to the entries in the player_devaddr_t structure: host, robot, interface, and index. The host and robot fields (localhost and 6665) indicate where the device is located. The interface field indicates which interface the device supports, and thus how it can be used. Because you might have more than one ranger, the index field allows you to pick among the devices that support the given interface and are located on the given host:robot. Other lasers on the same host:robot would be assigned different indexes.
The last paragraph there gets a bit technical, but don’t worry. Player talks to parts of the robot using ports (the default port is 6665); if you’re using Stage then Player and Stage communicate through these ports (even if they’re running on the same computer). All this line does is tell Player which port to listen to and what kind of data to expect. In the example it’s laser data which is being transmitted on port 6665 of the computer that Player is running on (localhost). You could just as easily connect to another computer by using its IP address instead of “localhost”. The specifics of writing a device address in this way will be described in Chapter 4.
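To make the address format concrete, here is a small illustrative Python sketch (the helper name is made up; this is not part of the Player API) that splits such an address into its four fields:

```python
def parse_device_address(addr):
    """Split a Player device address 'host:robot:interface:index' into fields."""
    host, robot, interface, index = addr.split(":")
    return {"host": host, "robot": int(robot),
            "interface": interface, "index": int(index)}

print(parse_device_address("localhost:6665:ranger:0"))
# {'host': 'localhost', 'robot': 6665, 'interface': 'ranger', 'index': 0}
```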
3.1 - Building an Empty World¶
When we tell Player to build a world we only give it the .cfg file as an input. This .cfg file needs to tell us where to find our .world file, which is where all the items in the simulation are described. To explain how to build a Stage world containing nothing but walls we will use an example.
To start building an empty world we need a .cfg file. First create a document called empty.cfg (i.e. open it in your favorite text editor; gedit is a good starter program if you don’t have a favorite) and copy the following code into it:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "empty.world"
)
The configuration file syntax is described in Chapter
4, but basically what is happening here is that your
configuration file is telling Player that there is a driver called
stage
in the stageplugin
library, and this will give Player data
which conforms to the simulation
interface. To build the simulation
Player needs to look in the worldfile called empty.world
which is
stored in the same folder as this .cfg. If it was stored elsewhere you
would have to include a filepath, for example ./worlds/empty.world
.
Lines that begin with the hash symbol (#) are comments. When you build a
simulation, any simulation, in Stage the above chunk of code should
always be the first thing the configuration file says. Obviously the
name of the worldfile should be changed depending on what you called it
though.
Now a basic configuration file has been written, it is time to tell Player/Stage what to put into this simulation. This is done in the .world file.
3.1.1 - Models¶
A worldfile is basically just a list of models that describes all the stuff in the simulation. This includes the basic environment, robots and other objects. The basic type of model is called “model”, and you define a model using the following syntax:
define model_name model
(
# parameters
)
This tells Player/Stage that you are defining
a model
which you
have called model_name
, and all the stuff in the round brackets are
parameters of the model. To begin to understand Player/Stage model
parameters, let’s look at the map.inc
file that comes with Stage,
this contains the floorplan
model, which is used to describe the
basic environment of the simulation (i.e. walls the robots can bump
into):
define floorplan model
(
# sombre, sensible, artistic
color "gray30"
# most maps will need a bounding box
boundary 1
gui_nose 0
gui_grid 0
gui_move 0
gui_outline 0
gripper_return 0
fiducial_return 0
ranger_return 1
)
We can see from the first line that they are defining a model called floorplan.

- color: Tells Player/Stage what colour to render this model; in this case it is going to be a shade of grey.
- boundary: Whether or not there is a bounding box around the model. This is an example of a binary parameter: if the number next to it is 0 then it is false, and if it is 1 or over then it is true. So here we DO have a bounding box around our “map” model, so the robot can’t wander out of our map.
- gui_nose: this tells Player/Stage that it should indicate which way the model is facing. Figure 3.2 shows the difference between a map with a nose and one without.
- gui_grid: this will superimpose a grid over the model. Figure 3.3 shows a map with a grid.
- gui_move: this indicates whether it should be possible to drag and drop the model. Here it is 0, so you cannot move the map model once Player/Stage has been run. In 1.4 - Try It Out, when the Player/Stage example simple.cfg was run it was possible to drag and drop the robot because its gui_move variable was set to 1.
- gui_outline: indicates whether or not the model should be outlined. This makes no difference to a map, but it can be useful when making models of items within the world.
- fiducial_return: any parameter of the form some_sensor_return describes how that kind of sensor should react to the model. “Fiducial” is a kind of robot sensor which will be described later in Section 3.2 - Fiducial. Setting fiducial_return to 0 means that the map cannot be detected by a fiducial sensor.
- ranger_return: Setting ranger_return to a negative number indicates that a model cannot be seen by ranger sensors. Setting ranger_return to a number between 0 and 1 (inclusive) controls the intensity of the return seen by a ranger sensor. (Note: this means that ranger_return 0 will still allow a ranger sensor to see the object; the range will get set, it will just set the intensity of that return to zero.) See Section 5.3.2 - Interaction with Proxies - Ranger for more details.
- gripper_return: Like fiducial_return, gripper_return tells Player/Stage that your model can be detected by the relevant sensor, i.e. it can be gripped by a gripper. Here gripper_return is set to 0, so the map cannot be gripped by a gripper.
Figure 3.2: The first picture shows an empty map without a nose. The second picture shows the same map with a nose to indicate orientation; this is the horizontal line from the centre of the map to the right, showing that the map is actually facing to the right.

Figure 3.3: An empty map with gui_grid enabled. With gui_grid disabled this would just be an empty white square.
To make use of the map.inc
file we put the following code into our
world file:
include "map.inc"
This inserts the map.inc
file into our world file where the include
line is. This assumes that your worldfile and map.inc
file are in
the same folder, if they are not then you’ll need to include the
filepath in the quotes. Once this is done we can modify our definition
of the map model to be used in the simulation. For example:
floorplan
(
bitmap "bitmaps/helloworld.png"
size [12 5 1]
)
What this means is that we are using the model “floorplan”, and making some extra definitions; both “bitmap” and “size” are parameters of a Player/Stage model. Here we are telling Player/Stage that we defined a bunch of parameters for a type of model called “floorplan” (contained in map.inc) and now we’re using this “floorplan” model definition and adding a few extra parameters.
- bitmap: this is the filepath to a bitmap, which can be of type bmp, jpeg, gif or png. Black areas in the bitmap tell the model what shape to be; non-black areas are not rendered. This is illustrated in Figure 3.4. In the map.inc file we told the map that its “color” would be grey. This parameter does not affect how the bitmaps are read: Player/Stage will always look for black in the bitmap, the color parameter just alters what colour the map is rendered in the simulation.
- size: This is the size in metres of the simulation. All sizes you give in the world file are in metres, and they represent the actual size of things. If you have a 3m x 4m robot testing arena that is 2m high and you want to simulate it then the size is [3 4 2]. The first number is the size in the x dimension, the second is the y dimension and the third is the z dimension.
Figure 3.4: The first image is our “helloworld.png” bitmap; the second image is what Player/Stage interprets that bitmap as. The coloured areas are walls; the robot can move everywhere else.
A full list of model parameters and their descriptions can be found in the official Stage manual. Most of the useful parameters have already been described here; however, there are a few other types of model which are relevant to building simulations of robots, and these will be described later in Section 3.2 - Building a Robot.
3.1.2 - Describing the Player/Stage Window¶
The worldfile also can be used to describe the simulation window that Player/Stage creates. Player/Stage will automatically make a window for the simulation if you don’t put any window details in the worldfile, however, it is often useful to put this information in anyway. This prevents a large simulation from being too big for the window, or to increase or decrease the size of the simulation.
Like a model, a window is an inbuilt, high-level entity with lots of parameters. Unlike models though, there can be only one window in a simulation and only a few of its parameters are really needed. The simulation window is described with the following syntax:
window
(
# parameters...
)
The two most important parameters for the window are size and scale.

- size: This is the size the simulation window will be in pixels. You need to define both the width and height of the window using the following syntax: size [width height].
- scale: This is how many metres of the simulated environment each pixel shows. The bigger this number is, the smaller the simulation becomes. The optimum value for the scale is window_size/floorplan_size, and it should be rounded downwards so the simulation is a little smaller than the window it’s in; some degree of trial and error is needed to get this right.
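As a quick sketch of that rule of thumb (illustrative Python, not part of Stage; the function name is made up), using a 700-pixel window and a 15 m floorplan like the worldfile later in this chapter:

```python
import math

def suggest_scale(window_px, floorplan_m):
    """Rule of thumb from the text: window size / floorplan size, rounded down."""
    return math.floor(window_px / floorplan_m)

print(suggest_scale(700, 15))  # 46
```

Trial and error may still shrink it further; the example worldfile in Section 3.1.3 settles on a scale of 41.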
A full list of window parameters can be found in the Stage manual under “WorldGUI”.
3.1.3 - Making a Basic Worldfile¶
We have already discussed the basics of worldfile building: models and the window. There are just a few more parameters to describe which don’t belong in either a model or a window description, these are optional though, and the defaults are pretty sensible.
- interval_sim: This is how many simulated milliseconds there are between each update of the simulation window; the default is 100 milliseconds.
- interval_real: This is how many real milliseconds there are between each update of the simulation window. Balancing this parameter and the interval_sim parameter controls the speed of the simulation. Again, the default value is 100 milliseconds; both these interval parameter defaults are fairly sensible, so it’s not always necessary to redefine them.
The Stage manual contains a list of the high-level worldfile parameters.
Finally, we are able to write a worldfile!
include "map.inc"
# configure the GUI window
window
(
size [700.000 700.000]
scale 41
)
# load an environment bitmap
floorplan
(
bitmap "bitmaps/cave.png"
size [15 15 0.5]
)
If we save the above code as empty.world (correcting any filepaths if necessary) we can run its corresponding empty.cfg file (see Section 3.1 - Empty World) to get the simulation shown in Figure 3.5.
Figure 3.5: Our Empty World
3.2 - Building a Robot¶
In Player/Stage a robot is just a slightly advanced kind of model; all the parameters described in Models can still be applied.
3.2.1 - Sensors and Devices¶
There are six built-in kinds of model that help with building a robot,
they are used to define the sensors and actuators that the robot has.
These are associated with a set of model parameters which define by
which sensors the model can be detected (these are the _return
s
mentioned earlier). Each of these built in models acts as an interface
(see Section 2.2 - Interfaces, Drivers, and
Devices) between the
simulation and Player. If your robot has one of these kinds of sensor on
it, then you need to use the relevant model to describe the sensor,
otherwise Stage and Player won’t be able to pass the data between each
other. It is possible to write your own interfaces, but the stuff
already included in Player/Stage should be sufficient for most people’s
needs. A full list of interfaces that Player supports can be found in
the Player
manual
although only the following are supported by the current distribution of
Stage (version 4.1.X). Unless otherwise stated, these models use the
Player interface that shares its name:
3.2.1.1 - camera¶
The camera model adds a camera to the robot model and allows your code to interact with the simulated camera. The camera parameters are as follows:
- resolution [x y]: the resolution, in pixels, of the camera’s image.
- range [min max]: the minimum and maximum range that the camera can detect.
- fov [x y]: the field of view of the camera in DEGREES.
- pantilt [pan tilt]: the angle, in degrees, where the camera is looking. Pan is the left-right positioning. So, for instance, pantilt [20 10] points the camera 20 degrees left and 10 degrees down.
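For example, a camera might be defined in a worldfile like this (a sketch; the model name and parameter values are made-up illustrations, not recommendations):

```
define mycamera camera
(
  # illustrative values only
  resolution [320 240]
  range [0.2 8.0]
  fov [60 40]
  pantilt [0 0]
)
```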
3.2.1.2 - blobfinder¶
The blobfinder simulates colour detection software that can be run on the image from the robot’s camera. It is not necessary to include a model of the camera in your description of the robot if you want to use a blobfinder, the blobfinder will work on its own.
In previous versions of Stage, there was a blob_return
parameter to
determine if a blobfinder could detect an object. In Stage 4.1.1, this
does not seem to be the case. However, you can simply set an object to
be a color not listed in the colors[]
list to make it invisible to
blobfinders.
The parameters for the blobfinder are described in the Stage manual, but the most useful ones are here:
- colors_count <int>: the number of different colours the blobfinder can detect.
- colors [ ]: the names of the colours it can detect. This is given to the blobfinder definition in the form ["black" "blue" "cyan"]. These colour names are from the built-in X11 colour database rgb.txt. This is built in to Linux; the file rgb.txt can normally be found at /usr/share/X11/rgb.txt assuming it’s properly installed, or see Wikipedia for details.
- image [x y]: the size of the image from the camera, in pixels.
- range <float>: the maximum range that the camera can detect, in metres.
- fov <float>: the field of view of the blobfinder in DEGREES. Unlike the camera fov, the blobfinder fov respects the unit_angle call as described in http://playerstage.sourceforge.net/wiki/Writing_configuration_files#Units. By default, the blobfinder fov is in DEGREES.
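A blobfinder definition might therefore look like this (a sketch; the model name and values are illustrative):

```
define myeyes blobfinder
(
  # illustrative values only
  colors_count 2
  colors [ "orange" "DarkBlue" ]
  image [160 120]
  range 5.00
  fov 60
)
```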
3.2.1.3 - fiducial¶
A fiducial is a fixed point in an image, so the fiducial
finder
simulates image processing software that locates fixed points in an
image. The fiducialfinder is able to locate objects in the simulation
whose fiducial_return
parameter is set to true. Stage also allows
you to specify different types of fiducial using the fiducial_key
parameter of a model. This means that you can make the robots able to
tell the difference between different fiducials by what key they
transmit. The fiducial finder and the concept of fiducial_key
s is
properly explained in the Stage manual. The fiducial sensors parameters
are:
- range_min: The minimum range at which a fiducial can be detected, in metres.
- range_max: The maximum range at which a fiducial can be detected, in metres.
- range_max_id: The maximum range at which a fiducial’s key can be accurately identified. If a fiducial is closer than range_max but further away than range_max_id then it detects that there is a fiducial but can’t identify it.
- fov: The field of view of the fiducial finder in DEGREES.
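Putting those parameters together, a fiducial finder might be defined like this (a sketch; the model name and values are illustrative):

```
define myfiducialfinder fiducial
(
  # illustrative values only
  range_min 0.0
  range_max 8.0
  range_max_id 5.0
  fov 180
)
```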
3.2.1.4 - ranger sensor¶
The ranger
sensor
simulates any kind of obstacle detection device (e.g. sonars, lasers, or
infrared sensors). These can locate models whose ranger_return
is
non-negative. Using a ranger model you can define any number of ranger
sensors and apply them all to a single device. The parameters for the
sensor
model and their inputs are described in the Stage manual, but
basically:
- size [x y]: how big the sensors are.
- range [min max]: defines the minimum and maximum distances that can be sensed.
- fov deg: defines the field of view of the sensors in DEGREES.
- samples: this is only defined for a laser; it specifies the number of ranger readings the sensor takes. The laser model behaves like a large number of ranger sensors all with the same x and y coordinates relative to the robot’s centre; each of these rangers has a slightly different yaw. The rangers are spaced so that there are samples rangers distributed evenly to give the laser’s field of view. So if the field of view is 180 degrees and there are 180 samples, the rangers are 1 degree apart.
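The spacing arithmetic above can be sketched in a couple of lines of Python (illustrative only; Stage does this internally, and the function name is made up):

```python
def sample_spacing_deg(fov_deg, samples):
    """Angular gap between adjacent ranger samples spread evenly over the fov."""
    return fov_deg / samples

print(sample_spacing_deg(180, 180))  # 1.0
```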
3.2.1.5 - ranger device¶
A ranger
device is
comprised of ranger sensors. A laser is a special case of ranger sensor
which allows only one sensor, and has a very large field of view. For a
ranger device, you just provide a list of sensors which comprise this
device, typically resetting the pose for each. How to write the
[x y yaw]
data is explained in Yaw Angles.
sensor_name (pose [x1 y1 z1 yaw1])
sensor_name (pose [x2 y2 z2 yaw2])
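Putting the sensor and device definitions together, a pair of sonars might be sketched like this (the names, poses, and values are illustrative, not part of any bundled worldfile):

```
define mysonar sensor
(
  # illustrative values only
  size [0.01 0.05 0.01]
  range [0.3 2.0]
  fov 10
)

define mysonars ranger
(
  mysonar (pose [0.75 0.2 0 0])
  mysonar (pose [0.75 -0.2 0 0])
)
```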
3.2.1.6 - gripper¶
The gripper model is a simulation of the gripper you get on a Pioneer robot. The Pioneer gripper looks like a big block on the front of the robot with two big sliders that close around an object. If you put a gripper on your robot model it means that your robot is able to pick up objects and move them around within the simulation. The online Stage manual says that grippers are deprecated in Stage 3.X.X; however, this is not actually the case, and grippers are very useful if you want your robot to be able to manipulate and move items. The parameters you can use to customise the gripper model are:
- size [x y z]: The x and y dimensions of the gripper.
- pose [x y z yaw]: Where the gripper is placed on the robot, relative to the robot’s geometric centre. The pose parameter is described properly in Section 3.2.1 - Robot Sensors and Devices.
3.2.1.7 - position¶
The position model simulates the robot’s odometry; this is when the robot keeps track of where it is by recording how many times its wheels spin and the angle it turns. This robot model is the most important of all because it allows the robot model to be embodied in the world, meaning it can collide with anything which has its obstacle_return parameter set to true. The position model uses the position2d interface, which is essential for Player because it tells Player where the robot actually is in the world. The most useful parameters of the position model are:
drive
: Tells the odometry how the robot is driven. This is usually “diff”, which means the robot is controlled by changing the speeds of the left and right wheels independently. Other possible values are “car”, which means the robot uses a velocity and a steering angle, or “omni”, which means it can control how it moves along the x and y axes of the simulation.

localization
: Tells the model how it should record the odometry: “odom” if the robot calculates it as it moves along, or “gps” for the robot to have perfect knowledge about where it is in the simulation.

odom_error [x y angle]
: The amount of error that the robot will make in the odometry recordings.
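As a sketch, these parameters slot into a position model like this (the model name and the odom_error values here are illustrative, not from the text):

define example_robot position
(
  # differential drive: left and right wheel speeds set independently
  drive "diff"
  # record odometry as the robot moves ("gps" would give perfect knowledge)
  localization "odom"
  # illustrative odometry error in x, y and angle
  odom_error [0.05 0.05 0.1]
)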
3.2.2 - An Example Robot¶
To demonstrate how to build a model of a robot in Player/Stage we will build our own example. First we will describe the physical properties of the robot, such as size and shape. Then we will add sensors onto it so that it can interact with its environment.
3.2.2.1 - The Robot’s Body¶
Let’s say we want to model a rubbish collecting robot called “Bigbob”.
The first thing we need to do is describe its basic shape; to do this you need to know your robot’s dimensions in metres. Figure 3.6 shows the basic shape of Bigbob drawn onto some Cartesian coordinates; the coordinates of the corners of the robot have been recorded. We can then
build this model using the block
model parameter. In this example
we’re using blocks with the position model type but we could equally use
it with other model types.
Figure 3.6: The basic shape we want to make Bigbob; the units on the axes are in metres.
define bigbob position
(
block
(
points 6
point[0] [0.75 0]
point[1] [1 0.25]
point[2] [1 0.75]
point[3] [0.75 1]
point[4] [0 1]
point[5] [0 0]
z [0 1]
)
)
bigbob
(
name "bob1"
pose [ 0 0 0 0]
color "gray"
)
In the first line of this code we state that we are defining a
position
model called bigbob
. Next block
declares that this
position
model contains a block.
The following lines go on to describe the shape of the block;
points 6
says that the block has 6 corners and
point[number] [x y]
gives the coordinates of each corner of the
polygon in turn. Finally, the z [height_from height_to]
states how
tall the robot should be, the first parameter being a lower coordinate
in the z plane, and the second parameter being the upper coordinate in
the z plane. In this example we are saying that the block describing
Bigbob’s body is on the ground (i.e. its lower z coordinate is at 0)
and it is 1 metre tall. If I wanted it to be from 50cm off the ground to
1m then I could use z [0.5 1]
.
TRY IT OUT (Position Model)¶
In this example, you can see the basic shape in an empty environment.
3.2.2.2 - Adding Teeth¶
Now, in the same way as we built the body, we can add on some teeth for Bigbob to collect rubbish between. Figure 3.7 shows Bigbob with teeth plotted onto a Cartesian grid:
Figure 3.7: The new shape of Bigbob.
define bigbob position
(
size [1.25 1 1]
# the shape of Bigbob
block
(
points 6
point[5] [0 0]
point[4] [0 1]
point[3] [0.75 1]
point[2] [1 0.75]
point[1] [1 0.25]
point[0] [0.75 0]
z [0 1]
)
block
(
points 4
point[3] [1 0.75]
point[2] [1.25 0.75]
point[1] [1.25 0.625]
point[0] [1 0.625]
z [0 0.5]
)
block
(
points 4
point[3] [1 0.375]
point[2] [1.25 0.375]
point[1] [1.25 0.25]
point[0] [1 0.25]
z [0 0.5]
)
)
To declare the size of the robot you use the size [x y z]
parameter,
this will cause the polygon described to be scaled to fit into a box
which is x
by y
in size and z
metres tall. The default size
is 0.4 x 0.4 x 1 m, so because the addition of rubbish-collecting teeth
made Bigbob longer, the size parameter was needed to stop Player/Stage
from making the robot smaller than it should be. In this way we could
have specified the polygon coordinates to be 4 times the distance apart
and then declared its size to be 1.25 x 1 x 1
metres, and we would
have got a robot the size we wanted. For a robot as large as Bigbob this
is not really important, but it could be useful when building models of
very small robots. It should be noted that it doesn’t actually matter where in the Cartesian coordinate system you place the polygon; instead of starting at (0, 0) it could just as easily have started at (-1000, 12345). With the block parameter we just describe the shape of the robot, not its size or location in the map.
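The scaling behaviour described above can be sketched in Python (an illustrative simplification of what Stage does internally, shown for the x axis only; this is not Stage code):

```python
# Illustrative sketch of how Stage fits a block into the `size` box:
# coordinates are normalised by the block's own extent, so points drawn
# 4x further apart still produce a robot of the declared size.
def fit_to_size(points, size_x):
    """Scale and shift x coordinates so the block's extent matches size_x."""
    xs = [p[0] for p in points]
    extent = max(xs) - min(xs)
    scale = size_x / extent
    return [(p[0] - min(xs)) * scale for p in points]

# Bigbob's x,y corner coordinates, then the same shape drawn 4x larger
small = [(0.75, 0), (1, 0.25), (1, 0.75), (0.75, 1), (0, 1), (0, 0)]
large = [(x * 4, y * 4) for (x, y) in small]
print(fit_to_size(small, 1.25) == fit_to_size(large, 1.25))  # True
```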
TRY IT OUT (BigBob with Teeth)¶
This example shows the more accurate rendering of Bigbob.
3.2.2.3 - Yaw Angles¶
You may have noticed that in Figures 3.6 and 3.7 Bigbob is facing to the
right of the grid. When you place any item in a Player/Stage simulation
they are, by default, facing to the right hand side of the simulation.
Figure 3.3 shows that the grids use a typical Cartesian coordinate
system, and so if you want to alter the direction an object in the
simulation is pointing (its “yaw”) any angles you give use the x-axis as
a reference, just like vectors in a Cartesian coordinate system (see
Figure 3.8) and so the default yaw is 0 degrees. This is also why in
Section 3.1 - Empty World the
gui_nose
shows the map is facing to the right. Figure 3.9 shows a
few examples of robots with different yaws.
Figure 3.8: A Cartesian grid showing how angles are described.
Figure 3.9: Starting from the top right robot and working anti-clockwise, the yaws of these robots are 0, 90, -45 and 200.
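The yaws in Figure 3.9 would be set with the last field of each robot’s pose parameter; for example (the positions here are illustrative):

bigbob( name "bob1" pose [ 2  2 0   0] )
bigbob( name "bob2" pose [-2  2 0  90] )
bigbob( name "bob3" pose [-2 -2 0 -45] )
bigbob( name "bob4" pose [ 2 -2 0 200] )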
By default, Player/Stage assumes the robot’s centre of rotation is at
its geometric centre based on what values are given to the robot’s
size
parameter. Bigbob’s size
is 1.25 x 1 x 1
so
Player/Stage will place its centre at (0.625, 0.5, 0.5)
, which means
that Bigbob’s wheels would be closer to its teeth. Instead let’s say that Bigbob’s centre of rotation is in the middle of its main body (shown in Figure 3.6), which puts the centre of rotation at (0.5, 0.5, 0.5). To change this in the robot model you use the origin [x-offset y-offset z-offset] command:
define bigbob position
(
# actual size
size [1.25 1 1]
# centre of rotation offset
origin [0.125 0 0]
# the shape of Bigbob
block
...
...
...
)
TRY IT OUT (Different Origin)¶
Click on the robot, and it should highlight. You can drag bigbob around with the left (primary) mouse button. Click and hold down the right (secondary) mouse button, and move the mouse to rotate bigbob about the centre of the body, not the centre of the entire block.
3.2.2.4 - Drive¶
Finally we will specify the drive of Bigbob; this is a parameter of the position model and has been described earlier.
define bigbob position
(
# actual size
size [1.25 1 1]
# centre of rotation offset
origin [0.125 0 0]
# the shape of Bigbob
block
...
...
...
# positional things
drive "diff"
)
3.2.2.5 - The Robot’s Sensors¶
Now that Bigbob’s body has been built let’s move on to the sensors. We will put sonar and blobfinding sensors onto Bigbob so that it can detect walls and see coloured blobs it can interpret as rubbish to collect. We will also put a laser between Bigbob’s teeth so that it can detect when an item passes in between them.
Bigbob’s Sonar¶
We will start with the sonars. The first thing to do is to define a model for the sonar sensor that is going to be used on Bigbob:
define bigbobs_sonars sensor
(
# parameters...
)
define bigbobs_ranger ranger
(
# parameters...
)
Here we tell Player/Stage that we will define a type of sensor called bigbobs_sonars. Next, we’ll tell Player/Stage to use these sensors in a ranging device. Let’s put four sonars on Bigbob, one on the front of each tooth, and one on the front left and the front right corners of its body.
When building Bigbob’s body we were able to use any location on a
coordinate grid that we wanted and could declare our shape polygons to
be any distance apart we wanted so long as we resized the model with
size
. In contrast, sensors - all sensors not just rangers - must be
positioned according to the robot’s origin and actual size. To work
out the distances in metres it helps to do a drawing of where the
sensors will go on the robot and their distances from the robot’s
origin. When we worked out the shape of Bigbob’s body we used its actual
size, so we can use the same drawings again to work out the distances of
the sensors from the origin as shown in Figure 3.10.
![]() |
Figure 3.10: The position of Bigbob’s sonars (in red) relative to its origin. The origin is marked with a cross, some of the distances from the origin to the sensors have been marked. The remaining distances can be done by inspection. |
First, we’ll define a single ranger (in this case sonar) sensor. To define the size, range and field of view of the sonars we just consult the sonar device’s datasheet.
define bigbobs_sonar sensor
(
# define the size of each transducer [xsize ysize zsize] in meters
size [0.01 0.05 0.01 ]
# define the range bounds [min max]
range [0.3 2.0]
# define the angular field of view in degrees
fov 10
# define the color that ranges are drawn in the gui
color_rgba [ 0 1 0 1 ]
)
Then, define how the sensors are placed into the ranger device. The process of working out where the sensors go relative to the origin of the robot is the most complicated part of describing the sensor.
define bigbobs_sonars ranger
(
# one line for each sonar [xpos ypos zpos heading]
bigbobs_sonar( pose [ 0.75 0.1875 0 0]) # fr left tooth
bigbobs_sonar( pose [ 0.75 -0.1875 0 0]) # fr right tooth
bigbobs_sonar( pose [ 0.25 0.5 0 30]) # left corner
bigbobs_sonar( pose [ 0.25 -0.5 0 -30]) # right corner
)
TRY IT OUT (driving a robot)¶
This file includes everything described up till now.
This will start player in the background, then start a “remote control” (also in the background). You may need to move the playerv window out of the way to see the Stage window.
See the playerv documentation for details on playerv. For now, the “remote control” just makes the ranger sensor cones appear.
Bigbob’s Blobfinder¶
Now that Bigbob’s sonars are done we will attach a blobfinder:
define bigbobs_eyes blobfinder
(
# parameters
)
Bigbob is a rubbish-collector, so here we should tell it what colour of rubbish to look for. Let’s say that the intended application of Bigbob is in an orange juice factory, and it picks up any stray oranges or juice cartons that fall on the floor. Oranges are orange, and juice cartons are (let’s say) dark blue, so Bigbob’s blobfinder will look for these two colours:
define bigbobs_eyes blobfinder
(
# number of colours to look for
colors_count 2
# which colours to look for
colors ["orange" "DarkBlue"]
)
Then we define the properties of the camera, again these come from a datasheet:
define bigbobs_eyes blobfinder
(
# number of colours to look for
colors_count 2
# which colours to look for
colors ["orange" "DarkBlue"]
# camera parameters
image [160 120] #resolution
range 5.00 # m
fov 60 # degrees
)
TRY IT OUT (blobfinder)¶
Similar to the previous example, playerv
just makes the camera show
up in the PlayerViewer window.
Bigbob’s Laser¶
The last sensor that needs adding to Bigbob is the laser, which will be used to detect whenever a piece of rubbish has been collected; the laser’s location on the robot is shown in Figure 3.11. Following the same principles as for our previous sensor models, we can create a description of this laser:
define bigbobs_laser sensor
(
size [0.025 0.025 0.025]
range [0 0.25] # max = dist between teeth in m
fov 20 # does not need to be big
color_rgba [ 1 0 0 0.5]
samples 180 # number of ranges measured
)
define bigbobs_lasers ranger
(
bigbobs_laser( pose [ 0.625 0.125 -0.975 270 ])
)
With this laser we’ve set its maximum range to be the distance between
teeth, and the field of view is arbitrarily set to 20 degrees. We have
calculated the laser’s pose
in exactly the same way as the sonars
pose
, by measuring the distance from the laser’s centre to the
robot’s origin (which we set with the origin
parameter earlier). The
z coordinate of the pose parameter when describing parts of the robot
is relative to the very top of the robot. In this case the robot is 1
metre tall so we put the laser at -0.975 so that it is on the ground.
The laser’s yaw is set to 270 degrees so that it points across
Bigbob’s teeth. We also set the size of the laser to be 2.5cm cube so
that it doesn’t obstruct the gap between Bigbob’s teeth.
Figure 3.11: The position of Bigbob’s laser (in red) and its distance, in metres, relative to its origin (marked with a cross).
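The z coordinate arithmetic above can be checked with a short sketch (illustrative only; not Stage code):

```python
# Illustrative check: pose z is measured from the top of the robot, so
# placing the 2.5 cm laser on the ground of a 1 m tall robot needs
# an offset of -(robot height - laser height).
robot_height = 1.0     # Bigbob is 1 m tall
laser_height = 0.025   # the laser model is a 2.5 cm cube
z_offset = -(robot_height - laser_height)
print(z_offset)  # -0.975, the value used in bigbobs_lasers above
```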
Now that we have a robot body and sensor models all we need to do is put
them together and place them in the world. To add the sensors to the
body we need to go back to the bigbob position
model:
define bigbob position
(
# actual size
size [1.25 1 1]
# centre of rotation offset
origin [0.125 0 0]
# the shape of Bigbob
block
...
...
...
# positional things
drive "diff"
# sensors attached to bigbob
bigbobs_sonars()
bigbobs_eyes()
bigbobs_laser()
)
The extra line bigbobs_sonars()
adds the sonar model called
bigbobs_sonars()
onto the bigbob
model, likewise for
bigbobs_eyes()
and bigbobs_laser()
.
At this point it’s worthwhile to copy this into a .inc file, so that the model could be used again in other simulations or worlds. This file can also be found in the example code in /Ch5.3/bigbob.inc
To put our Bigbob model into our empty world (see Section 3.1.3 -
Making a Basic Worldfile) we need to
add the robot to our worldfile empty.world
:
include "map.inc"
include "bigbob.inc"
# size of the whole simulation
size [15 15]
# configure the GUI window
window
(
size [ 700.000 700.000 ]
scale 35
)
# load an environment bitmap
floorplan
(
bitmap "bitmaps/cave.png"
size [15 15 0.5]
)
bigbob
(
name "bob1"
pose [-5 -6 0 45]
color "green"
)
Here we’ve put all the stuff that describes Bigbob into a .inc file
bigbob.inc
, and when we include this, all the code from the .inc
file is inserted into the .world file. The section here is where we put
a version of the bigbob model into our world:
bigbob
(
name "bob1"
pose [-5 -6 0 45]
color "green"
)
bigbob is a model description; by not including any define stuff in the top line, we are making an instantiation of that model, with the name bob1. Using an object-oriented programming analogy, bigbob is our class, and bob1 is our object of class bigbob. The pose [x y yaw] parameter works in the same way as spose [x y yaw] does. The only differences are that the coordinates use the centre of the simulation as a reference point, and pose lets us specify the initial position and heading of the entire bob1 model, not just one sensor within that model.
Finally we specify what colour bob1 should be; by default this is red. The pose and color parameters could have been specified in our bigbob model, but leaving them out lets us vary the colour and position of each robot of type bigbob. This means we can declare multiple robots which are the same size and shape and have the same sensors, but are rendered by Player/Stage in different colours and are initialised at different points in the map.
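For instance, two robots of type bigbob could share the model definition but differ in pose and colour (bob2’s values here are illustrative):

bigbob
(
  name "bob1"
  pose [-5 -6 0 45]
  color "green"
)

bigbob
(
  name "bob2"
  pose [5 6 0 225]
  color "red"
)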
When we run the new bigbob6.world
with Player/Stage we see our
Bigbob robot is occupying the world, as shown in Figure 3.12.
Figure 3.12: Our bob1 robot placed in the simple world, showing the range and field of view of all of the ranger sensors.
TRY IT OUT (Bigbob in environment)¶
This should show you Figure 3.12.
You may wish to zoom in on the teeth to see the tooth laser.
3.2.3 - Building Other Stuff¶
We established in Section 3.2.2 - An Example Robot that Bigbob works in an orange juice factory collecting oranges and juice cartons. Now we need to build models to represent the oranges and juice cartons so that Bigbob can interact with things.
Oranges¶
We’ll start by building a model of an orange:
define orange model
(
# parameters...
)
The first thing to define is the shape of the orange. The block
parameter is one way of doing this, which we can use to build a blocky
approximation of a circle. An alternative to this is to use bitmap
which we previously saw being used to create a map. What the bitmap
command actually does is take in a picture, and turn it into a series of
blocks which are connected together to make a model the same shape as
the picture, as illustrated in Figure 3.13 for an alien bitmap.
Figure 3.13: The left image is the original picture; the right image is its Stage interpretation.
In our code, we don’t want an alien, we want a simple circular shape (see Figure 3.14), so we’ll point to a circular bitmap.
Figure 3.14: The orange model rendered in the same Stage window as Bigbob.
define orange model
(
bitmap "bitmaps/circle.png"
size [0.15 0.15 0.15]
color "orange"
)
In this bit of code we describe a model called orange
which uses a
bitmap to define its shape and represents an object which is 15cm x
15cm x 15cm and is coloured orange. Figure 3.14 shows our orange
model next to Bigbob.
Juice Cartons¶
Building a juice carton model is similarly quite easy:
define carton model
(
# a carton is rectangular
# so make a square shape and use size[]
block
(
points 4
point[0] [1 0]
point[1] [1 1]
point[2] [0 1]
point[3] [0 0]
z [0 1]
)
# average litre carton size is ~ 20cm x 10cm x 5cm ish
size [0.1 0.2 0.2]
color "DarkBlue"
)
We can use the block command since juice cartons are boxy; with boxy things it’s slightly easier to describe the shape with block than drawing a bitmap and using that. In the above code I used block to describe a metre cube (since that’s something that can be done pretty easily without needing to draw a carton on a grid) and then resized it to the size I wanted using size.
Putting objects into the world¶
Now that we have described basic orange
and carton
models it’s
time to put some oranges and cartons into the simulation. This is done
in the same way as our example robot was put into the world:
orange
(
name "orange1"
pose [-2 -5 0 0]
)
carton
(
name "carton1"
pose [-3 -5 0 0]
)
We created models of oranges and cartons, and now we are declaring that
there will be an instance of these models (called orange1
and
carton1
respectively) at the given positions. Unlike with the robot,
we declared the color
of the models in the description so we don’t
need to do that here. If we did have different colours for each orange
or carton then it would mess up the blobfinding on Bigbob because the
robot is only searching for orange and dark blue. At this point it would
be useful if we could have more than just one orange or carton in the
world (Bigbob would not be very busy if there wasn’t much to pick up); it turns out that this is also pretty easy:
orange(name "orange1" pose [-1 -5 0 0])
orange(name "orange2" pose [-2 -5 0 0])
orange(name "orange3" pose [-3 -5 0 0])
orange(name "orange4" pose [-4 -5 0 0])
carton(name "carton1" pose [-2 -4 0 0])
carton(name "carton2" pose [-2 -3 0 0])
carton(name "carton3" pose [-2 -2 0 0])
carton(name "carton4" pose [-2 -1 0 0])
Up until now we have been describing models with each parameter on a new line; this is just a way of making it more readable for the programmer – especially if there are a lot of parameters. If there are only a few parameters, or you want to be able to comment it out easily, it can all be put onto one line. Here we declare that there will be four orange models in the simulation, with the names orange1 to orange4; we also need to specify different poses for the models so they aren’t all on top of each other. Properties that the orange models have in common (such as shape, colour or size) should all be in the model definition.
TRY IT OUT (full worldfile)¶
This should show you Figure 3.15.
The full worldfile is at <source_code>/Ch3/bigbob7.world
, this
includes the orange and carton models as well as the code for putting
them in the simulation. Figure 3.15 shows the populated Player/Stage
simulation.
Figure 3.15: The Bigbob robot placed in the simulation along with junk for it to pick up.
As mentioned earlier, Player is a hardware abstraction layer which connects your code to the robot’s hardware. It does this by acting as a server/client type program where your code and the robot’s sensors are clients to a Player server, which then passes the data and instructions around to where they need to go. This will be properly explained in Chapter 5 - Controllers; it all sounds more complicated than it is because Player/Stage takes care of all the difficult stuff. The configuration file is needed in order to tell the Player server which drivers to use and which interfaces the drivers will be using.
For each model in the simulation or device on the robot that you want to interact with, you will need to specify a driver. This is far easier than writing worldfile information, and follows the same general syntax. The driver specification is in the form:
driver
(
name "driver_name"
provides [device_address]
# other parameters...
)
The ‘name’ and ‘provides’ parameters are mandatory information; without them Player won’t know which driver to use (given by ‘name’) and what kind of information is coming from the driver (‘provides’). The ‘name’ parameter is not arbitrary: it must be the name of one of Player’s inbuilt drivers, which have been written for Player to interact with a robot device. (It is also possible to build your own drivers for a hardware device, but this document won’t go into how to do this because it’s not relevant to Player/Stage.) A list of supported driver names is in the Player manual, although when using Stage the only one that is needed is "stage".
The ‘provides’ parameter is a little more complicated than ‘name’. It is here that you tell Player what interface to use in order to interpret information given out by the driver (often this is sensor information from a robot); any information that a driver ‘provides’ can be used by your code. For a Stage simulated robot the "stage" driver can provide the interfaces to the sensors discussed in Section 3.2.1 - Sensors and Devices. Each interface shares the same name as the sensor model, so for example a ranger model would use the ranger interface to interact with Player, and so on (the only exception to this being the position model, which uses the position2d interface). The Player manual contains a list of all the different interfaces that can be used; the most useful ones have already been mentioned in Section 3.2.1 - Sensors and Devices, although there are others too numerous to list here.
The input to the ‘provides’ parameter is a “device address”, which specifies on which TCP port an interface to a robot device can be found; Section 4.1 - Device Addresses has more information about device addresses. This uses the key:host:robot:interface:index form, with multiple addresses separated by white space.
provides ["key:host:robot:interface:index"
"key:host:robot:interface:index"
"key:host:robot:interface:index"
...]
After the two mandatory parameters, the next most useful driver parameter is model. This is only used if "stage" is the driver; it tells Player which particular model in the worldfile is providing the interfaces for this particular driver. A different driver is needed for each model that you want to use. Models that aren’t required to do anything (such as a map, or, in the example of Section 3.2.3 - Building Other Stuff, the oranges and cartons) don’t need to have a driver written for them.
The remaining driver parameters are ‘requires’ and ‘plugin’. The ‘requires’ parameter is used for drivers that need input information, such as ‘vfh’; it tells the driver where to find this information and which interface it uses. The ‘requires’ parameter uses the same key:host:robot:interface:index syntax as the ‘provides’ parameter. Finally, the plugin parameter is used to tell Player where to find all the information about the driver being used.
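As a hedged sketch of how ‘provides’ and ‘requires’ fit together (the exact interfaces the vfh driver expects are an assumption here; check the Player manual’s vfh page before relying on this):

driver
(
  name "vfh"
  provides ["position2d:1"]   # the vfh-controlled position interface
  requires ["position2d:0"    # odometry, e.g. from the stage driver
            "laser:0"]        # range data used for obstacle avoidance
)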
Earlier we made a .cfg file in order to create a simulation of an empty (or at least unmoving) world; the .cfg file read as follows:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "empty.world"
)
This has to be done at the beginning of the configuration file because
it tells Player that there is a driver called "stage"
that we’re
going to use and the code for dealing with this driver can be found in
the stageplugin
plugin. This needs to be specified for Stage because
Stage is an add-on for Player, for drivers that are built into Player by
default the plugin
doesn’t need to be specified.
4.1 - Device Addresses¶
A device address is used to tell Player where the driver you are making
will present (or receive) information and which interface to use in
order to read this information. This is a string in the form
key:host:robot:interface:index
where each field is separated by a
colon.
key
: The Player manual states that: “The purpose of the key field is to allow a driver that supports multiple interfaces of the same type to map those interfaces onto different devices”. This is a driver-level thing and has a lot to do with the name of the driver that you are using; generally for "stage" the key doesn’t need to be used. If you’re using Player without Stage then there is a useful section about device address keys in the Player manual.

host
: This is the address of the host computer where the device is located. With a robot it could be the IP address of the robot. The default host is “localhost”, which means the computer on which Player is running.

robot
: This is the TCP port through which Player should expect to receive data from the interface. Usually a single robot and all its necessary interfaces are assigned to one port. The default port used is 6665; if there were two robots in the simulation the ports could be 6665 and 6666, although there’s no rule saying which number ports you can or can’t use.

interface
: The interface to use in order to interact with the data. There is no default value for this option because it is a mandatory field.

index
: If a robot has multiple devices of the same type, for instance 2 cameras to give the robot depth perception, each device uses the same interface but gives slightly different information. The index field allows you to give a slightly different address to each device. So two cameras could be camera:0 and camera:1. This is very different from the key field because having a “driver that supports multiple interfaces of the same type” is NOT the same as having multiple devices that use the same interface. Again there is no default index, as this is a mandatory field in the device address, but you should use 0 as the index if there is only one of that kind of device.
If you want to use a default value, that field can just be left out of the device address. So we could use the default host and robot port and specify (for example) a ranger interface just by doing "ranger:0". However, if you want to specify fields at the beginning of the device address but not in the middle, then the separating colons should remain. For example, if we had a host at "127.0.0.1" with a ranger interface then we would specify the address as "127.0.0.1::ranger:0"; the robot field is empty but the colons around it are still there. You may notice that the key field here was left off, as before.
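The way the colon-separated fields compose, including the empty-middle-field case described above, can be illustrated with a small helper (this function is hypothetical, for illustration only; it is not part of Player’s API):

```python
def device_address(interface, index=0, key="", host="", robot=""):
    """Build a key:host:robot:interface:index device address string.
    Hypothetical helper for illustration only: empty leading fields are
    dropped, but empty fields in the middle keep their colons."""
    fields = [key, host, robot, str(interface), str(index)]
    # Drop empty fields from the front of the address only.
    while fields and fields[0] == "":
        fields.pop(0)
    return ":".join(fields)

print(device_address("ranger"))                    # ranger:0
print(device_address("ranger", host="127.0.0.1"))  # 127.0.0.1::ranger:0
```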
4.2 - Putting the Configuration File Together¶
We have examined the commands necessary to build a driver for a model in the worldfile, now it is just a case of putting them all together. To demonstrate this process we will build a configuration file for the worldfile developed in Section 3.1.3 - Making a Basic Worldfile. In this world we want our code to be able to interact with the robot, so in our configuration file we need to specify a driver for this robot.
driver
(
# parameters...
)
The inbuilt driver that Player/Stage uses for simulations is called
"stage"
so the driver name is "stage"
.
driver
(
name "stage"
)
The Bigbob robot uses position
, blobfinder
and ranger
sensors. These correspond to the position2d
, blobfinder
and
ranger
interfaces respectively.
All range-finding sensors (i.e. sonar, laser, and IR sensors) are represented by the ranger interface. In Stage 4.1.1 there is only legacy support for separate laser or IR interfaces. All new development should use rangers.
We want our code to be able to read from these sensors, so we need to declare interfaces for them and tell Player where to find each device’s data; for this we use the configuration file’s ‘provides’ parameter. This requires that we construct device addresses for each sensor; to remind ourselves, this is in the key:host:robot:interface:index format. We aren’t using any fancy drivers, so we don’t need to specify a key. We are running our robot in a simulation on the same computer as our Player server, so the host name is ‘localhost’, which is the default, so we also don’t need to specify a host. The robot field is a TCP port to receive robot information over; picking which port to use is pretty arbitrary, but what usually happens is that the first robot uses the default port 6665 and subsequent robots use 6666, 6667, 6668 and so on. There is only one robot in our simulation, so we will use port 6665 for all our sensor information from this robot. We only have one sensor of each type, so our devices don’t need separate indices. If we did have several sensors of the same type (say, two cameras), we would put the first device at index 0 and subsequent devices using the same interface at index 1, then 2, then 3 and so on.
There are lots of ranger sensors in our model, but when we created the robot’s sensors in Section 3.2.1 - Robot Sensors and Devices we put them all into two ranger models (one for all the sonars and one for the one laser). So as far as the configuration file is concerned there are only two ranging devices, because all the separate sonar sensors are lumped together into one device. We don’t need to declare each sonar device on an index of its own.
Finally we use interfaces appropriate to the sensors the robot has, so
in our example these are the position2d
, blobfinder
interfaces
and for our sonar and laser devices we will use the ranger
interface.
Putting all this together under the provides
parameter gives us:
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1" ]
)
The device addresses can be on the same line as each other or separate lines, just so long as they’re separated by some form of white space.
The last thing to do on our driver is the model "model_name"
parameter which needs to be specified because we are using Player/Stage.
This tells the simulation software that anything you do with this driver
will affect the model "model_name"
in the simulation. In the
simulation we built we named our robot model “bob1”, so our final driver
for the robot will be:
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1"]
model "bob1"
)
If our simulation had multiple Bigbob robots in it, the configuration file drivers would be very similar to one another. If we created a second robot in our worldfile and called it “bob2” then the driver would be:
driver
(
  name "stage"
  provides ["6666:position2d:0"
            "6666:ranger:0"
            "6666:blobfinder:0"
            "6666:ranger:1"]
  model "bob2"
)
Notice that the port number and model name are the only differences because the robots have all the same sensors.
A driver of this kind can be built for any model that is in the worldfile, not just the robots. For instance, a map driver can be made which uses the map interface and will allow you to get size, origin and occupancy data about the map. The only requirement is that if you want to do something to a model with your code then you need to build a driver for it in the configuration file. Finally, when we put the bit which declares the stage driver (this part is compulsory for any simulation configuration file) together with our drivers for the robot, we end up with our final configuration file:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "worldfile_name.world"
)
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1"]
model "bob1"
)
4.3 - TRY IT OUT (driving a robot)¶
To drive the robots around, you select Devices/Position2d/Subscribe, then select Devices/Position2d/Command in a playerv window, then drag the red bulls-eye around.
To learn how to write code for Player or Player/Stage it helps to understand the basic structure of how Player works. Player uses a server/client structure in order to pass data and instructions between your code and the robot’s hardware. Player is a server, and each hardware device on the robot (remember, a device is a piece of hardware that uses a driver which conforms to an interface; see Section 2.3 - Interfaces, Drivers, and Devices) is subscribed as a client to the server via a thing called a proxy. The .cfg file associated with your robot (or your simulation) takes care of telling the Player server which devices are attached to it, so when we run the command player some_cfg.cfg this starts up the Player server and connects all the necessary hardware devices to the server.
Figure 5.1 shows a basic block diagram of the structure of Player when implemented on a robot. In Player/Stage the same command will start the Player server and load up the worldfile in a simulation window, this runs on your computer and allows your code to interact with the simulation rather than hardware. Figure 5.2 shows a basic block diagram of the Player/Stage structure. Your code must also subscribe to the Player server so that it can access these proxies and hence control the robot. Player has functions and classes which will do all this for you, but you still need to actually call these functions with your code and know how to use them.
Figure 5.1: The server/client control structure of Player when used on a robot. There may be several proxies connected to the server at any time.
Figure 5.2: The server/client control structure of Player/Stage when used as a simulator. There may be several proxies connected to the server at any time.
5.1 Types of controllers¶
Player is compatible with controllers written in C, C++ or Python. The official documentation for each can be found at:
- libplayerc - http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__player__clientlib__libplayerc.html
- libplayerc++ - http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__player__clientlib__cplusplus.html
- python bindings for libplayerc - http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__player__clientlib__libplayerc__py.html
- python bindings for libplayerc++ - not separately documented, see the libplayerc++ documentation.
There are also such things as “stage controllers”, such as those distributed in the Stage source code under examples/ctrl, but in this manual we’ll only describe Player controllers. Earlier versions of simple.world had a line ctrl wander that automatically started the simulated robot working with a stage controller. If you happen to encounter this simple.world file, just comment out that line to use the examples given here. Player controllers can control a real or a simulated robot.
We will start in Chapter 6 by using C++ since it’s pretty general. Then, we’ll cover C controllers in Chapter 7, and Python controllers in Chapter 8 and Chapter 9.
The process of writing Player code is mostly the same for each different language, though. The libplayerc and libplayerc++ proxy functions have different names in each language, but work in more or less the same way, so even if you don’t plan on using C++ or Stage this section will still contain helpful information.
5.2 Example Controllers¶
Some example controllers in various languages can be found in the Player
source code under examples/
. These and more are documented at
http://playerstage.sourceforge.net/wiki/PlayerClientLibraries.
Some matlab and python examples based on this manual are given at http://turobotics.blogspot.com/2013/08/client-controllers-for-player-302-and.html.
5.3 Wrap your code!¶
Before beginning a project, it is highly recommended that for any program other than a basic example you wrap your Player commands in your own functions and classes, so that all of your code’s interactions with Player are kept together in the same file. This isn’t a requirement of Player, it’s just good practice. For example, if you upgrade Player, or if for some reason your robot breaks and a certain function no longer works, you only have to change part of one file instead of searching through all your code for places where Player functions have been used.
In this chapter, C++ is used to demonstrate how to write external controllers. Since Player interacts with controlling code over network sockets, it’s pretty easy to control robots (physical or simulated) with other languages as well. Player officially supports C++, C, and Python (see http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__clientlibs.html). There are also third-party libraries with support for clients ranging from Smalltalk to Java to MATLAB.
First, I’ll present several examples of how to interface with different sensors and actuators, then I’ll present two case studies showing these interfaces in action.
6.1 - Getting Started¶
In order to compile your C++ program you use the following command (in Linux):
g++ -o example0 `pkg-config --cflags playerc++` example0.cc `pkg-config --libs playerc++`
That will compile the C++ source file example0.cc into an executable called example0.
An even easier and more general way is to make a Makefile that explains how to compile your code for you. The details of Makefiles are beyond the scope of this manual, but a minimal example is given in the tutorial files that came with this manual. If you have this Makefile in the same directory as your code, you can just type make file and if the make program finds file.cc it will just “do the right thing”.
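The tutorial’s own Makefile is not reproduced here, but as a minimal sketch (assuming GNU make and pkg-config; not the tutorial’s actual file), the following is enough for make file to build file from file.cc:

```make
# Minimal sketch, not the tutorial's actual Makefile. GNU make already
# has a built-in rule to build an executable `n` from `n.cc`; we only
# add the compiler flags and libraries the Player C++ client needs.
CXXFLAGS += $(shell pkg-config --cflags playerc++)
LDLIBS   += $(shell pkg-config --libs playerc++)
```

With this saved as Makefile next to your code, typing make example0 uses GNU make’s built-in %.cc rule together with these flags, equivalent to the g++ command above.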
TRY IT OUT (Minimal C++ Controller)¶
This is a minimal controller, written in C++, almost identical to the one distributed with player. It goes forward and does very simple collision avoidance based on the sonars.
Read through the code before executing.
6.2 - Connecting to the Server and Proxies With Your Code¶
The first thing to do within your code is to include the Player header file. Assuming Player/Stage is installed correctly on your machine then this can be done with the line
#include <libplayerc++/playerc++.h>
Next we need to establish a Player Client, which will interact with the Player server for you. To do this we use the line:
PlayerClient client_name(hostname, port);
What this line does is declare a new PlayerClient object called client_name which connects to the Player server at the given address. The hostname and port are as discussed in Section 4.1 - Device Address. If your code is running on the same computer (or robot) as the Player server you wish to connect to, then the hostname is “localhost”; otherwise it will be the IP address of the computer or robot. The port is an optional parameter usually only needed for simulations; it will be the same as the port you gave in the .cfg file. This is only useful if your simulation has more than one robot in it and you need your code to connect to both robots. So if you gave your first robot port 6665 and the second one 6666 (like in the example of Section 4.2 - Putting the Configuration File Together) then you would need two PlayerClients, one connected to each robot, and you would do this with the following code:
PlayerClient robot1("localhost", 6665);
PlayerClient robot2("localhost", 6666);
If you are only using one robot and in your .cfg file you said that it would operate on port 6665 then the port parameter to the PlayerClient class is not needed.
Once we have established a PlayerClient we should connect our code to the device proxies so that we can exchange information with them. Which proxies you can connect your code to is dependent on what you have put in your configuration file. For instance if your configuration file says your robot is connected to a laser but not a camera you can connect to the laser device but not the camera, even if the robot (or robot simulation) has a camera on it.
Proxies take the name of the interface which the drivers use to talk to Player. Let’s take part of the Bigbob example configuration file from Section 4.2 - Putting the Configuration File Together:
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1" ]
)
Here we’ve told the Player server that our “robot” has devices which use the position2d, ranger, and blobfinder interfaces. In our code then, we should connect to the position2d, ranger, and blobfinder proxies like so:
Position2dProxy positionProxy_name(&client_name,index);
RangerProxy sonarProxy_name(&client_name,index);
BlobfinderProxy blobProxy_name(&client_name,index);
RangerProxy laserProxy_name(&client_name,index);
A full list of which proxies Player supports can be found in the Player manual. They all follow the convention of being named after the interface they use.
In the above case Proxy_name is the name you want to give to the proxy object, client_name is the name you gave the PlayerClient object earlier, and index is the index that the device was given in your configuration file (probably 0).
6.2.1 - Setting Up Connections: an Example.¶
For an example of how to connect to the Player server and device proxies we will use the example configuration file developed in Section 4.2 - Putting the Configuration File Together. For convenience this is reproduced below:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "worldfile_name.world"
)
driver
(
name "stage"
provides ["6665:position2d:0"
"6665:ranger:0"
"6665:blobfinder:0"
"6665:ranger:1"]
model "bob1"
)
To set up a PlayerClient and then connect to proxies on that server we can use principles discussed in this section to develop the following code:
#include <stdio.h>
#include <libplayerc++/playerc++.h>
int main(int argc, char *argv[])
{
using namespace PlayerCc; /*need to do this line in c++ only*/
PlayerClient robot("localhost");
Position2dProxy p2dProxy(&robot,0);
RangerProxy sonarProxy(&robot,0);
BlobfinderProxy blobProxy(&robot,0);
RangerProxy laserProxy(&robot,1);
//some control code
return 0;
}
6.3 - Interacting with Proxies¶
As you may expect, each proxy is specialised towards controlling the device it connects to. This means that each proxy will have different commands depending on what it controls.
In Player version 3.0.2 there are 39 different proxies which you can choose to use, many of which are not applicable to Player/Stage. This manual will not attempt to explain them all; a full list of available proxies and their functions is in the Player manual, although the returns, parameters and purpose of each proxy function are not always explained.
The following few proxies are probably the most useful to anyone using Player or Player/Stage.
6.3.1 - Position2dProxy¶
The Position2dProxy is the number one most useful proxy there is. It controls the robot’s motors and keeps track of the robot’s odometry (where the robot thinks it is based on how far its wheels have moved).
6.3.1.1 - SetSpeed( )¶
The SetSpeed command is used to tell the robot’s motors how fast to turn. There are two different SetSpeed commands that can be called, one is for robots that can move in any direction and the other is for robots with differential or car-like drives.
SetSpeed(double XSpeed, double YSpeed, double YawSpeed)
SetSpeed(double XSpeed, double YawSpeed)
SetCarlike(double XSpeed, double DriveAngle)
Figure 6.3: A robot on a cartesian grid. This shows what directions the X and Y speeds will cause the robot to move in. A positive yaw speed will turn the robot in the direction of the + arrow, a negative yaw speed is the direction of the - arrow.
Figure 6.3 shows which direction the x, y and yaw speeds are in relation to the robot. The x speed is the rate at which the robot moves forward and the y speed is the robot’s speed sideways; both are given in metres per second. The y speed will only be useful if the robot you want to simulate or control is a ball, since robots with wheels cannot move sideways. The yaw speed controls how fast the robot is turning and is given in radians per second. Player has an inbuilt global function called dtor() which converts a number in degrees into a number in radians, which can be useful when setting the yaw speed.
If you want to simulate or control a robot with a differential drive system then you’ll need to convert left and right wheel speeds into a forward speed and a turning speed before sending them to the proxy. For car-like drives there is SetCarlike, which again takes the forward speed in m/s and the drive angle in radians.
6.3.1.2 - GetSpeed ( )¶
The GetSpeed commands are essentially the reverse of the SetSpeed command. Instead of setting a speed they return the current speed relative to the robot (so x is the forward speed, yaw is the turning speed and so on).
- GetXSpeed(): forward speed (metres/sec).
- GetYSpeed(): sideways (perpendicular) speed (metres/sec).
- GetYawSpeed(): turning speed (radians/sec).
6.3.1.3 - Get_Pos ( )¶
This function interacts with the robot’s odometry. It allows you to monitor where the robot thinks it is. Coordinate values are given relative to its starting point, and yaws are relative to its starting yaw.
- GetXPos(): returns current x coordinate relative to its x starting position.
- GetYPos(): returns current y coordinate relative to its y starting position.
- GetYaw(): returns current yaw relative to its starting yaw.
TRY IT OUT (GetSetPositions)¶
In Section 3.2.1 - The Position Model, we specified whether Player would record odometry by measuring how much the robot’s wheels have turned, or whether the robot would have perfect knowledge of its current coordinates (by default the robot does not record odometry at all). If you set the robot to record odometry using its wheels then the positions returned by these get commands will become increasingly inaccurate as the simulation goes on. If you want to log your robot’s position as it moves around, these functions along with perfect odometry can be used.
6.3.1.4 - SetMotorEnable( )¶
This function takes a boolean input, telling Player whether to enable the motors or not. If the motors are disabled then the robot will not move no matter what commands are given to it; if the motors are enabled then the motors will always work, which is not so desirable if the robot is on a desk or something and is likely to get damaged. Hence enabling the motors is optional. If you are using Player/Stage, the motors are always enabled and this command doesn’t need to be run. However, if your code is ever likely to be moved onto a real robot and the motors are not explicitly enabled in your code, then you may end up spending a long time trying to work out why your robot is not working.
6.3.2 - RangerProxy¶
A RangerProxy interfaces with any ranger sensor.
A laser is represented by a ranger device with one ranger sensor, whose samples attribute is greater than one. To minimize confusion with the deprecated laser interface, I’ll refer to these as single-sensor devices.
A set of sonars or IR sensors is represented by a ranger device with multiple ranger sensors whose samples attributes are not set (or set to 1). To minimize confusion with the deprecated sonar and IR interfaces, I’ll refer to these as multiple-sensor devices.
Angles are given with reference to the laser’s centre front (see Figure 6.4).
A RangerProxy has a number of useful methods:
- GetRangeCount(): The number of range measurements that the sensor suite measures. In the case of a single-sensor device, this is given by the samples attribute. In the case of a multiple-sensor device, this is given by the number of sensors.
- rangerProxy_name[ranger_number]: The range returned by the ranger_number'th scan point. For a single-sensor device, scan points are numbered from the minimum angle at index 0, to the maximum angle at index GetRangeCount()-1. For a multiple-sensor device, the ranger_number is given by the order in which you included the sensor in the world file.
- GetRange(ranger_number): Same as rangerProxy_name[ranger_number].
- GetMinAngle(): gives the minimum angle covered by a ranger sensor. Only makes sense for a single-sensor device. (One tricky thing: you need to be sure to call RequestConfigure() once before accessing the min or max angles; they are initialized to zero!)
- GetMaxAngle(): gives the maximum angle covered by a ranger sensor. Only makes sense for a single-sensor device.
- GetAngularRes(): gives the angular resolution (Θ in Figure 6.4).
Figure 6.4: How laser angles are referenced. In this diagram the laser is pointing to the right along the dotted line; the angle θ is the angle of a laser scan point, in this example θ is negative.
Figure 6.5: A laser scanner. The minimum angle is the angle of the rightmost laser scan, the maximum angle is the leftmost laser scan. θ is the scan resolution of the laser; it is the angle between each laser scan, given in radians.
This example shows how ranger sensors can be read. Read through the code before executing.
6.3.3 - BlobfinderProxy¶
The blobfinder module analyses a camera image for areas of a desired colour and returns an array of the structure playerc_blobfinder_blob_t (http://playerstage.sourceforge.net/doc/Player-3.0.2/player/structplayer__blobfinder__blob.html), which is the structure used to store blob data. First we will cover how to get this data from the blobfinder proxy, then we will discuss the data stored in the structure.
- GetCount(): Returns the number of blobs seen.
- blobProxy_name[blob_number]: This returns the blob structure data for the blob with the index blob_number. Blobs are sorted by index in the order that they appear in the image from left to right.
- GetBlob(blob_number): same as blobProxy_name[blob_number]
Once we receive the blob structure from the proxy we can extract the data we need. The playerc_blobfinder_blob_t structure, documented in the Player manual, contains the following fields (see Figure 6.6 for illustration):
- color: The colour of the blob it detected. This is given as a hexadecimal value.
- area: The area of the blob’s bounding box. (In Stage 4.1.1, there is a bug with respect to the area. It is computed as an int, but returned as an unsigned int. In order to use it, you must explicitly cast it to an int ((int)area). See http://sourceforge.net/p/playerstage/bugs/362/ and/or https://github.com/rtv/Stage/issues/41 for the details.)
- x: The horizontal coordinate of the geometric centre of the blob’s bounding box.
- y: The vertical coordinate of the geometric centre of the blob’s bounding box.
- left: The horizontal coordinate of the left hand side of the blob’s bounding box.
- right: The horizontal coordinate of the right hand side of the blob’s bounding box.
- top: The vertical coordinate of the top side of the blob’s bounding box.
- bottom: The vertical coordinate of the bottom side of the blob’s bounding box.
Figure 6.6: What the fields in playerc_blobfinder_blob_t mean. The blob on the left has a geometric centre at (x,y); the blob on the right has a bounding box with the top left corner at (left, top) pixels, and a lower right coordinate at (right, bottom) pixels. Coordinates are given with reference to the top left corner of the image.
TRY IT OUT (blobfinder)¶
This example shows how to extract info from a blobfinder. Read through the code before executing.
6.3.4 - GripperProxy¶
The GripperProxy allows you to control the gripper. Once the gripper is holding an item, the simulated robot will carry it around wherever it goes. Without a gripper you can only jostle an item in the simulation and you would have to manually tell the simulation what to do with an item. The GripperProxy can also tell you if an item is between the gripper teeth because the gripper model has inbuilt beams which can detect if they are broken.
- GetBeams(): This command will tell you if there is an item inside the gripper. If it is a value above 0 then there is an item to grab.
- GetState(): This will tell you whether the gripper is opened or closed. If the command returns 1 then the gripper is open, 2 if the gripper is closed, and 3 if the gripper is moving.
- Open(): Tells the gripper to open. This will cause any items that were being carried to be dropped.
- Close(): Tells the gripper to close. This will cause it to pick up anything between its teeth.
TRY IT OUT (gripper)¶
This example shows a robot approaching a box, gripping it, and dragging it backwards. Read through the code before executing.
6.3.5 - SimulationProxy¶
The simulation proxy allows your code to interact with and change aspects of the simulation, such as an item’s pose or its colour.
6.3.5.1 - Get/Set Pose¶
The item’s pose is a special case of the Get/SetProperty functions: because it is so likely that someone would want to move an item in the world, a special function was created to do it.
SetPose2d(char *item_name, double x, double y, double yaw)
In this case item_name is as with Get/SetProperty, but we can directly specify the item’s new coordinates and yaw (coordinates and yaws are given with reference to the map’s origin).
GetPose2d(char *item_name, double &x, double &y, double &yaw)
This is like SetPose2d only this time it writes the coordinates and yaw to the given addresses in memory.
TRY IT OUT (GetSetPose)¶
This example shows how to Get and Set pose of objects. Read through the code before executing.
6.3.5.2 - Get/Set Property¶
In Stage 4.1.1 the Get/SetProperty simulation proxy functions are only implemented for the property “color”. None of the other properties are supported. Previous versions of Stage (before 3.2.2) had some code but it wasn’t fully implemented, and it’s been removed since.
If you desperately need this functionality you can use an earlier release of Stage, and the first edition of this manual describes how to get and set a model’s property in those distributions.
In this edition of the manual I will describe the only functioning Get/SetProperty, which is “color”.
To retrieve or change a property of an item in the simulation we use the following functions:
GetProperty(char *item_name, char *property, void *value, size_t value_len)
SetProperty(char *item_name, char *property, void *value, size_t value_len)
- item_name: this is the name that you gave to the object in the worldfile; it could be any model that you have described in the worldfile. For example, in Section 3.2.2 - An Example Robot we declared a Bigbob type robot which we called “bob1”, so the item_name for that object is “bob1”. Similarly in Section 3.2.3 - Building Other Stuff we built some models of oranges and called them “orange1” to “orange4”, so the item name for one of these would be “orange1”. Anything that is a model in your worldfile can be altered by this function; you just need to have named it, and no drivers need to be declared in the configuration file for this to work either. We didn’t write controllers for the oranges but we could still alter their properties this way.
- property: Currently, "_mp_color" is the only supported property of a model that you can change.
- value: a pointer to the value you want to fill with the property or assign to the property (see below).
- value_len: the size of the value you gave in bytes. This can easily be found with the C or C++ sizeof() operator.
The value parameter is dependent on which property you want to set.
- "color": This requires an array of four float values, scaled between 0 and 1. The first index of the array is the red component of the colour, the second is the green, the third is blue and the fourth is alpha (how light or dark the colour is, usually 1). For example, if we want a nice shade of green, which has RGB components 171/224/110, we scale these between 0 and 1 by dividing by 255 to get 0.67/0.88/0.43. We can now put this into a float array with the line float green[]={0.67, 0.88, 0.43, 1};. This array can then be passed into our SetProperty function like so: SetProperty("model_name", "color", (void*)green, sizeof(float)*4 );
TRY IT OUT (GetSetProperty)¶
This example shows how to reset the color of an object. Read through the code before executing.
6.4 - General Useful Commands¶
6.4.1 - Read( )¶
To make the proxies update with new sensor data we need to tell the Player server to update. We can do this using the PlayerClient object which we used to connect to the server: all we have to do is run the command playerClient_name.Read() every time the data needs updating (where playerClient_name is the name you gave the PlayerClient object). Until this command is run, the proxies and any sensor information from them will be empty. The devices on a typical robot are asynchronous, and the devices in a Player/Stage simulation are also asynchronous, so running the Read() command won’t always update everything at the same time, and it may take several calls before some large data structures (such as a camera image) get updated.
6.4.2 - GetGeom( )¶
Most of the proxies have a function called GetGeom or GetGeometry or RequestGeometry, or words to that effect. What these functions do is tell the proxy to retrieve information about the device, usually its size and pose (relative to the robot). The proxies don’t know this by default since this information is specific to the robot or the Player/Stage robot model. If your code needs to know this kind of information about a device then the proxy must run this command first.
6.5 - Using Proxies: Case Study 1: using C++ for a Trash-Zapping Robot¶
To demonstrate how to write code to control a Player device or Player/Stage simulation we will use the example robot “Bigbob” developed in Section 3.2.2 - An Example Robot and Section 4.2 - Putting the Configuration File Together, which collects oranges and juice cartons from a factory floor. In previous sections we have developed the Stage model for this robot and its environment and the configuration file to control it. Now we can begin to put everything together to create a working simulation of this robot.
6.5.1 - The Control Architecture¶
To zap rubbish we have three basic behaviours:
- Wander: to search for rubbish.
- Move to item: for when an item is spotted and the robot wants to zap it.
- Collect item: for dealing with zapping items.
The robot will also avoid obstacles but once this is done it will switch back to its previous behaviour. The control will follow the state transitions shown in Figure 6.7.
Figure 6.7: The state transitions that the Bigbob rubbish zapping robot will follow.
6.5.2 - Beginning the Code¶
In Section 6.2 - Connecting to Server we discussed how to connect to the Player server and proxies attached to the server, and developed the following code:
#include <stdio.h>
#include <libplayerc++/playerc++.h>
int main(int argc, char *argv[])
{
/*need to do this line in c++ only*/
using namespace PlayerCc;
PlayerClient robot("localhost");
Position2dProxy p2dProxy(&robot,0);
RangerProxy sonarProxy(&robot,0);
BlobfinderProxy blobProxy(&robot,0);
RangerProxy laserProxy(&robot,1);
//some control code
return 0;
}
Using our knowledge of the proxies discussed in Section 6.3 -Interacting with Proxies we can build controlling code on top of this basic code. Firstly, it is good practice to enable the motors and request the geometry for all the proxies. This means that the robot will move and that if we need to know about the sensing devices the proxies will have that information available.
//enable motors
p2dProxy.SetMotorEnable(1);
//request geometries
p2dProxy.RequestGeom();
sonarProxy.RequestGeom();
laserProxy.RequestGeom();
laserProxy.RequestConfigure();
//blobfinder doesn't have geometry
Once things are initialised we can enter the main control loop. At this point we should tell the robot to read in data from its devices to the proxies.
while(true)
{
robot.Read();
//control code
}
6.5.3 - Wander¶
First we will initialise a couple of variables which will be the forward speed and the turning speed of the robot. We’ll put this with the proxy initialisations.
Position2dProxy p2dProxy(&robot,0);
RangerProxy sonarProxy(&robot,0);
BlobfinderProxy blobProxy(&robot,0);
RangerProxy laserProxy(&robot,1);
double forwardSpeed, turnSpeed;
Let’s say that Bigbob’s maximum speed is 1 metre/second and it can turn 90 degrees a second. We will write a small subfunction to randomly assign forward and turning speeds between 0 and the maximum speeds.
void Wander(double *forwardSpeed, double *turnSpeed)
{
int maxSpeed = 1;
int maxTurn = 90;
double fspeed, tspeed;
//fspeed is between 0 and 10
fspeed = rand()%11;
//(fspeed/10) is between 0 and 1
fspeed = (fspeed/10)*maxSpeed;
tspeed = rand()%(2*maxTurn);
tspeed = tspeed-maxTurn;
//tspeed is between -maxTurn and +maxTurn
*forwardSpeed = fspeed;
*turnSpeed = tspeed;
}
In the control loop we include a call to this function and then set the resulting speeds to the motors.
while(true)
{
// read from the proxies
robot.Read();
//wander
Wander(&forwardSpeed, &turnSpeed);
//set motors
p2dProxy.SetSpeed(forwardSpeed, dtor(turnSpeed));
}
The dtor()
function is a Player function that turns a number in
degrees into a number in radians. Our calculations have been done in
degrees but SetSpeed
requires radians, so this function is used to
convert between the two. At present the motors are being updated every
time this control loop executes, and this leads to some erratic
behaviour from the robot. Using the sleep()
command we will tell the
control loop to wait one second between each execution. sleep()
is a
standard C function and is included in the unistd.h
header.
At this point we should also seed the random number generator with the
current time so that the wander behaviour isn’t exactly the same each
time. For the sleep command we will need to include unistd.h
and to
seed the random number generator with the current system time we will
need to include time.h
.
#include <stdio.h>
#include <unistd.h>
#include <time.h>
#include <libplayerc++/playerc++.h>
void Wander(double *forwardSpeed, double *turnSpeed)
{
//wander code...
}
int main(int argc, char *argv[])
{
/*need to do this line in c++ only*/
using namespace PlayerCc;
//connect to proxies
double forwardSpeed, turnSpeed;
srand(time(NULL));
//enable motors
//request geometries
while(true)
{
// read from the proxies
robot.Read();
//wander
Wander(&forwardSpeed, &turnSpeed);
//set motors
p2dProxy.SetSpeed(forwardSpeed, dtor(turnSpeed));
sleep(1);
}
}
6.5.4 - Obstacle Avoidance¶
Now we need to write a subfunction that checks the sonars for any obstacles and amends the motor speeds accordingly.
void AvoidObstacles(double *forwardSpeed, double *turnSpeed,
RangerProxy &sp)
{
//will avoid obstacles closer than 40cm
double avoidDistance = 0.4;
//will turn away at 60 degrees/sec
int avoidTurnSpeed = 60;
//left corner is sonar no. 2
//right corner is sonar no. 3
if(sp[2] < avoidDistance)
{
*forwardSpeed = 0;
//turn right
*turnSpeed = (-1)*avoidTurnSpeed;
return;
}
else if(sp[3] < avoidDistance)
{
*forwardSpeed = 0;
//turn left
*turnSpeed = avoidTurnSpeed;
return;
}
else if( (sp[0] < avoidDistance) && \
(sp[1] < avoidDistance))
{
//back off a little bit
*forwardSpeed = -0.2;
*turnSpeed = avoidTurnSpeed;
return;
}
return; //do nothing
}
This is a very basic obstacle avoidance subfunction which will update the motor speeds only if there is an obstacle to avoid. If we call this function just before sending data to the motors then it will overwrite any other behaviours, so that the obstacle will be avoided. Once the obstacle is no longer in the way the robot will continue as it was before; this allows us to transition from any behaviour into obstacle avoidance and then back again, as per the requirement of our control structure. All we need to do now is call this function in our control loop:
while(true)
{
// read from the proxies
robot.Read();
//wander
Wander(&forwardSpeed, &turnSpeed);
//avoid obstacles
AvoidObstacles(&forwardSpeed, &turnSpeed, sonarProxy);
//set motors
p2dProxy.SetSpeed(forwardSpeed, dtor(turnSpeed));
sleep(1);
}
6.5.5 - Move To Item¶
For this state we want the robot to move towards a blob that it has spotted. There may be several blobs in its view at once, so we’ll tell the robot to move to the largest one because it’s probably the closest to the robot. The following subfunction finds the largest blob and turns the robot so that the blob’s centre is near the centre of the image. The robot will then move towards the blob.
void MoveToItem(double *forwardSpeed, double *turnSpeed,
BlobfinderProxy &bfp)
{
int i, centre;
//how many blobs are there?
int noBlobs = bfp.GetCount();
playerc_blobfinder_blob_t blob;
int turningSpeed = 10;
/*number of pixels away from the image centre a blob
can be, to be in front of the robot. This is
essentially the margin of error.*/
int margin = 10;
//find the largest blob
int biggestBlobArea = 0;
int biggestBlob = 0;
for(i=0; i<noBlobs; i++)
{
//get blob from proxy
playerc_blobfinder_blob_t currBlob = bfp[i];
if( abs((int)currBlob.area) > biggestBlobArea)
{
biggestBlob = i;
biggestBlobArea = currBlob.area;
}
}
blob = bfp[biggestBlob];
// find centre of image
centre = bfp.GetWidth()/2;
//adjust turn to centre the blob in image
/*if the blob's centre is within some margin of the image
centre then move forwards, otherwise turn so that it is
centred. */
//blob to the left of centre
if(blob.x < centre-margin)
{
*forwardSpeed = 0;
//turn left
*turnSpeed = turningSpeed;
}
//blob to the right of centre
else if(blob.x > centre+margin)
{
*forwardSpeed = 0;
//turn right
*turnSpeed = -turningSpeed;
}
//otherwise go straight ahead
else
{
*forwardSpeed = 0.5;
*turnSpeed = 0;
}
return;
}
We want the robot to transition to this state whenever an item is seen, so we put a conditional statement in our control loop like so:
if(blobProxy.GetCount() == 0)
{
//wander
Wander(&forwardSpeed, &turnSpeed);
}
else
{
//move towards the item
MoveToItem(&forwardSpeed, &turnSpeed, blobProxy);
}
6.5.6 - Collect Item¶
This behaviour will be the most difficult to code because Stage doesn’t support pushable objects (the required physics is far too complex); what happens instead is that the robot runs over the object and just jostles it a bit. As a work-around we have to find out which item is between Bigbob’s teeth, get its “name”, and then change that item’s pose (for which we need the name) so that it is no longer in the simulation. In essence, instead of having our robot eat rubbish and store it within its body, we are making the laser zap the rubbish out of existence.
We can find the name of an item between Bigbob’s teeth by cross-referencing the robot’s pose with the poses of the items in the world, to find out which item is nearest the robot’s laser. The first step is to create a list of all the items in the world, with their names and poses, at initialisation. Since we know the names of the items are “orange1” to “orange4” and “carton1” to “carton4”, we can find their poses with a simple call to a simulation proxy. We’ll have to connect to the simulation proxy with our code first, using the line SimulationProxy simProxy(&robot,0); then we can access this information and put it into a struct.
typedef struct
{
char name[16];
double x;
double y;
} item_t;
We can populate the structure with information using the following code:
item_t itemList[8];
void RefreshItemList(item_t *itemList, SimulationProxy &simProxy)
{
int i;
//get the poses of the oranges
for(i=0;i<4;i++)
{
char orangeStr[] = "orange%d";
sprintf(itemList[i].name, orangeStr, i+1);
double dummy; //dummy variable, don't need yaws.
simProxy.GetPose2d(itemList[i].name, \
itemList[i].x, itemList[i].y, dummy);
}
//get the poses of the cartons
for(i=4;i<8;i++)
{
char cartonStr[] = "carton%d";
sprintf(itemList[i].name, cartonStr, i-3);
double dummy; //dummy variable, don't need yaws.
simProxy.GetPose2d(itemList[i].name, \
itemList[i].x, itemList[i].y, dummy);
}
return;
}
Here we are making a string of the item names, for example orange1, and storing it in the item’s name field. We then use this string as an input to the GetPose2d function so that we can also get the item’s location in the simulation.
Next we can begin the “Collect Item” behaviour, which will be triggered by something breaking the laser beam. When this happens we will check the area around Bigbob’s teeth, as indicated by Figure 6.8. We know the distance from the centre of this search circle to Bigbob’s origin (0.625m) and the radius of the search circle (0.375m), so we can get the robot’s exact pose with the following code.
double x, y, yaw;
simProxy.GetPose2d("bob1", x, y, yaw);
Cross-referencing the robot’s position with the item positions is a matter of trigonometry, so it isn’t particularly relevant to a manual on Player/Stage. We won’t reproduce the code here, but the full and final code developed for the Bigbob rubbish-zapping robot can be found on github. The method we used is to find the Euclidean distance of each item to the circle centre; the item at the smallest distance is the one we want to destroy. We made a subfunction called FindItem that returns the index of the item to be destroyed. (We could also equip Bigbob with a gripper, call gripper.close(), and haul the trash somewhere else to drop it off. See GripperProxy for more details, and bigbob11 for an example.)
Figure 6.8: Where to look for items which may have passed through Bigbob’s laser.
Now that we can find the item to destroy, it’s fairly simple to trigger our subfunction when the laser beam is broken, so that we can find and destroy the item.
if(laserProxy[90] < 0.25)
{
int destroyThis;
/*first param is the list of items in the world
second is length of this list
third parameter is the simulation proxy with
the pose information in it*/
destroyThis = FindItem(itemList, 8, simProxy);
//move it out of the simulation
simProxy.SetPose2d(itemList[destroyThis].name, -10, -10, 0);
RefreshItemList(itemList, simProxy);
}
The laser has 180 samples, so sample number 90 is the one perpendicular to Bigbob’s teeth. This point returns a maximum range of 0.25, so if its range falls below this then something has passed through the laser beam. We then find the item closest to the robot’s teeth and move that item to coordinate (-10, -10), so it is no longer visible or accessible.
Finally we have a working simulation of a rubbish-zapping robot! The code comprises the source <source_code>/Ch6.5/bigbob.cc, the simulation world <source_code>/Ch6.5/bigbob.world, and the configuration file <source_code>/Ch6.5/bigbob.cfg.
This example shows the final code for the trash-zapping robot. Read through the code before executing.
6.6 - Case Study 2: Simulating Multiple Robots¶
Our case study so far only shows how to simulate a single robot in a Player/Stage environment. It’s highly likely that a simulation will want more than one robot. In this situation you will need to build a model of every robot you need in the worldfile, and then add its associated driver in the configuration file. Let’s take a look at the worldfile for the case study; we’ll add a model of a second Bigbob robot called “bob2”:
bigbob
(
name "bob1"
pose [-5 -6 45]
color "green"
)
bigbob
(
name "bob2"
pose [5 6 225]
color "yellow"
)
6.6.1 - Each robot on its own port¶
If there are multiple robots in the simulation, the standard practice is to put each robot on its own port (see Section 4.1 - Device Address). To implement this in the configuration file we need to tell Player which port to find our second robot on:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6666:position2d:0" "6666:ranger:0"
"6666:blobfinder:0" "6666:ranger:1"]
model "bob2" )
If you plan on simulating a large number of robots then it is probably worth writing a script to generate the world and configuration files.
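If you do go the generator route, even plain C can do it. Here is a minimal sketch (not part of the tutorial sources) that builds the stage driver block for robot number n, following the port scheme above (bob1 on 6665, bob2 on 6666, and so on):

```c
#include <stdio.h>

/* Sketch of a .cfg generator: writes the stage driver block for robot
   number n into buf, giving each robot its own port (6664 + n, so
   bob1 -> 6665, bob2 -> 6666). */
void DriverBlock(int n, char *buf, size_t len)
{
    int port = 6664 + n;
    snprintf(buf, len,
             "driver( name \"stage\"\n"
             "        provides [\"%d:position2d:0\" \"%d:ranger:0\"\n"
             "        \"%d:blobfinder:0\" \"%d:ranger:1\"]\n"
             "        model \"bob%d\" )\n",
             port, port, port, port, n);
}
```

Calling this in a loop and writing the results to a file (plus a similar generator for the bigbob model blocks, varying the pose) produces the configuration and worldfile entries for any number of robots.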
When Player/Stage is started, the Player server automatically connects to all the ports used in your simulation and you control the robots separately with different PlayerClient objects in your code. For instance:
//first robot
PlayerClient robot1("localhost", 6665);
Position2dProxy p2dprox1(&robot1,0);
RangerProxy sprox1(&robot1,0);
//second robot
PlayerClient robot2("localhost", 6666);
Position2dProxy p2dprox2(&robot2,0);
RangerProxy sprox2(&robot2,0);
Each PlayerClient represents a robot; this is why the PlayerClient is a constructor parameter when you connect to a proxy. Each robot has a proxy for each of its devices, and no robots share a proxy, so it is important that your code connects to every proxy of every robot in order to read all the sensor information.
How you handle the extra PlayerClients and proxies depends on the scale of the simulation and your own coding preferences. If there are more than two or so robots in the simulation, it’s a good idea to make a robot class which deals with connecting to the server and proxies, and processes all the information internally to control the robot. You can then create an instance of this class for each simulated robot (the robot’s port number would need to be a constructor parameter, otherwise they’ll all connect to the same port and consequently the same robot) and all the simulated robots will run the same code.
This example shows the final code for two trash-zapping robots. Read through the code before executing.
6.6.2 - Each interface on its own index¶
An alternative to using a port for each robot is to use the same port but a different index.
For example, the Bigbob robot uses interfaces and indexes: position2d:0, ranger:0, blobfinder:0 and ranger:1. If we configured two Bigbob robots to use the same port but a different index our configuration file would be like this:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6665:position2d:1" "6665:ranger:2"
"6665:blobfinder:1" "6665:ranger:3"]
model "bob2" )
In our code we could then establish the proxies using only one PlayerClient:
PlayerClient robot("localhost", 6665);
//first robot
Position2dProxy p2dprox1(&robot,0);
RangerProxy sprox1(&robot,0);
//second robot
Position2dProxy p2dprox2(&robot,1);
RangerProxy sprox2(&robot,2);
//shared Simulation proxy...
SimulationProxy sim(&robot,0);
This example shows the final code for the trash-zapping robot. Read through the code before executing.
In Chapter 6 only C++ was used as an example. Since Player interacts with controlling code over network sockets, it’s pretty easy to control robots (physical or simulated) with other languages as well. Player officially supports C++, C, and Python (see http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__clientlibs.html). There are also third-party libraries with support for clients ranging from Smalltalk to Java to MATLAB.
In this chapter, I will review the same examples given in Chapter 6 for C++, highlighting the differences in the new language. Then, I will provide a new version of the trash-zapping robot for each.
7.1 - Getting Started¶
If you are coding in C use the following command to compile:
gcc -o simple `pkg-config --cflags playerc` simple.c `pkg-config --libs playerc`
An even easier and more general way is to make a Makefile that explains how to compile your code for you. The details of Makefiles are beyond the scope of this manual, but an example is given in the tutorial files that came with this manual. If you have this Makefile in the same directory as your code, you can just type make file and the make program will search for file.c and “do the right thing”.
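For reference, a minimal sketch of such a Makefile (our own, not the one shipped with the tutorial files) might look like this, assuming pkg-config knows about playerc as in the gcc command above:

```makefile
# Minimal sketch: "make simple" builds the target "simple" from
# "simple.c" using the playerc compile and link flags.
CFLAGS = `pkg-config --cflags playerc`
LDLIBS = `pkg-config --libs playerc`

%: %.c
	gcc -o $@ $(CFLAGS) $< $(LDLIBS)
```

The pattern rule means any target name is matched against a .c file of the same name, which is the “do the right thing” behaviour described above.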
TRY IT OUT (Minimal C Controller)¶
This is a minimal controller, written in C, almost identical to the one distributed with player. It goes forward and does very simple collision avoidance based on the sonars.
Read through the code before executing.
7.2 - Connecting to the Server and Proxies With Your Code¶
The first thing to do within your code is to include the Player header file. Assuming Player/Stage is installed correctly on your machine then this can be done with the line
#include <libplayerc/playerc.h>
Next we need to establish a Player Client, which will interact with the Player server for you. To do this we use the lines:
playerc_client_t *client;
client = playerc_client_create(NULL, "localhost", 6665);
What this does is declare a new playerc_client object called client, which connects to the Player server at the given address. The hostname and port are as discussed in Section 4.1 - Device Address. If your code is running on the same computer (or robot) as the Player server you wish to connect to, then the hostname is “localhost”; otherwise it will be the IP address of the computer or robot. The port is an optional parameter usually only needed for simulations; it will be the same as the port you gave in the .cfg file. This is only useful if your simulation has more than one robot in it and you need your code to connect to both robots. So if you gave your first robot port 6665 and the second one 6666 (as in the example of Section 4.2 - Putting the Configuration File Together), then you would need two player clients, one connected to each robot, and you would do this with the following code:
client1 = playerc_client_create(NULL, "localhost", 6665);
client2 = playerc_client_create(NULL, "localhost", 6666);
If you are only using one robot and in your .cfg file you said that it would operate on port 6665 then the port parameter to the player client constructor is not needed.
Once we have established a player client we should connect our code to the device proxies so that we can exchange information with them. Which proxies you can connect your code to is dependent on what you have put in your configuration file. For instance if your configuration file says your robot is connected to a laser but not a camera you can connect to the laser device but not the camera, even if the robot (or robot simulation) has a camera on it.
Proxies take the name of the interface which the drivers use to talk to Player. Let’s take part of the Bigbob example configuration file from Section 4.2 - Putting the Configuration File Together:
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1" ]
)
Here we’ve told the Player server that our “robot” has devices which use the position2d, ranger, and blobfinder interfaces. In our code then, we should connect to the position2d, ranger, and blobfinder proxies like so:
position2d_name = playerc_position2d_create(client_name, index);
playerc_position2d_subscribe(position2d_name, PLAYER_OPEN_MODE);
ranger_name = playerc_ranger_create(client_name,index);
playerc_ranger_subscribe(ranger_name,PLAYER_OPEN_MODE);
blobfinder_name = playerc_blobfinder_create(client_name,index);
playerc_blobfinder_subscribe(blobfinder_name,PLAYER_OPEN_MODE);
A full list of which proxies PlayerC supports can be found in the Player manual; they all follow the convention of being named after the interface they use. In the above case proxy_name is the name you want to give to the proxy object, client_name is the name you gave the playerc_client object earlier, and index is the index that the device was given in your configuration file (probably 0).
7.2.1 - Setting Up Connections: an Example.¶
For an example of how to connect to the Player server and device proxies we will use the example configuration file developed in Section 4.2 - Putting the Configuration File Together. For convenience this is reproduced below:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "worldfile_name.world"
)
driver
(
name "stage"
provides ["6665:position2d:0"
"6665:ranger:0"
"6665:blobfinder:0"
"6665:ranger:1"]
model "bob1"
)
To set up a player client and then connect to proxies on that server we can use principles discussed in this section to develop the following code:
#include <stdio.h>
#include <libplayerc/playerc.h>
int main(int argc, char *argv[])
{
playerc_client_t *robot;
playerc_position2d_t *p2dProxy;
playerc_ranger_t *sonarProxy;
playerc_blobfinder_t *blobProxy;
playerc_ranger_t *laserProxy;
/* Create a client and connect it to the server. */
robot = playerc_client_create(NULL, "localhost", 6665);
if (0 != playerc_client_connect(robot)) return -1;
/* Create and subscribe to a position2d device. */
p2dProxy = playerc_position2d_create(robot, 0);
if (playerc_position2d_subscribe(p2dProxy, PLAYER_OPEN_MODE)) return -1;
/* Create and subscribe to a ranger (sonar) device. */
sonarProxy = playerc_ranger_create(robot, 0);
if (playerc_ranger_subscribe(sonarProxy, PLAYER_OPEN_MODE)) return -1;
/* Create and subscribe to a blobfinder device. */
blobProxy = playerc_blobfinder_create(robot, 0);
if (playerc_blobfinder_subscribe(blobProxy, PLAYER_OPEN_MODE)) return -1;
/* Create and subscribe to a ranger (laser) device. */
laserProxy = playerc_ranger_create(robot, 1);
if (playerc_ranger_subscribe(laserProxy, PLAYER_OPEN_MODE)) return -1;
/*some control code */
return 0;
}
7.3 - Interacting with Proxies¶
As you may expect, each proxy is specialised towards controlling the device it connects to. This means that each proxy has different commands depending on what it controls. In Player version 3.0.2 there are 39 different proxies which you can choose to use, many of which are not applicable to Player/Stage. This manual will not attempt to explain them all; a full list of available proxies and their functions is in the Player manual, although the returns, parameters and purpose of the proxy functions are not always explained.
The following few proxies are probably the most useful to anyone using Player or Player/Stage.
7.3.1 - position2dproxy¶
The position2dproxy is the single most useful proxy there is. It controls the robot’s motors and keeps track of the robot’s odometry (where the robot thinks it is, based on how far its wheels have moved).
7.3.1.1 - SetSpeed ( )¶
The SetSpeed command is used to tell the robot’s motors how fast to turn. There are three different SetSpeed commands that can be called: one is for robots that can move in any direction (omnidirectional), one is for robots with differential drive (i.e. one drive wheel on each side), and the last is for car-like drives.
playerc_position2d_set_cmd_vel (playerc_position2d_t *device, double XSpeed, double YSpeed, double YawSpeed, int state)
playerc_position2d_set_cmd_vel_head (playerc_position2d_t *device, double XSpeed, double YSpeed, double YawHeading, int state)
playerc_position2d_set_cmd_car (playerc_position2d_t *device, double XSpeed, double SteerAngle)
Figure 7.3: A robot on a cartesian grid. This shows what directions the X and Y speeds will cause the robot to move in. A positive yaw speed will turn the robot in the direction of the + arrow, a negative yaw speed in the direction of the - arrow.
Figure 7.3 shows which direction the x, y and yaw speeds are in relation to the robot. The x speed is the rate at which the robot moves forward and the y speed is the robot’s speed sideways; both are given in metres per second. The y speed will only be useful if the robot you want to simulate or control is omnidirectional, since robots with standard wheels cannot move sideways. The yaw speed controls how fast the robot is turning and is given in radians per second. Player has an inbuilt global function called dtor() which converts a number in degrees into a number in radians, which can be useful when setting the yaw speed.
If you want to simulate or control a robot with a differential drive system then you’ll need to convert left and right wheel speeds into a forward speed and a turning speed before sending them to the proxy. For car-like drives there is playerc_position2d_set_cmd_car, which again takes the forward speed in m/s and the drive angle in radians.
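The differential drive conversion mentioned above is short. As a sketch (our own helper, not a Player function; "track" here means the distance between the two drive wheels in metres):

```c
/* Convert left/right wheel speeds (m/s) of a differential-drive robot
   into the forward speed (m/s) and yaw rate (rad/s) that
   playerc_position2d_set_cmd_vel expects. */
void DiffDriveToCmdVel(double leftSpeed, double rightSpeed, double track,
                       double *forwardSpeed, double *yawSpeed)
{
    /* Forward speed is the average of the two wheel speeds. */
    *forwardSpeed = (leftSpeed + rightSpeed) / 2.0;
    /* Yaw rate is the speed difference divided by the wheel separation. */
    *yawSpeed = (rightSpeed - leftSpeed) / track;
}
```

With equal wheel speeds the robot drives straight (zero yaw rate); with opposite speeds it turns on the spot (zero forward speed).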
7.3.1.2 - GetSpeed ( )¶
With playerc, there are no explicit GetSpeed commands - you just read from the relevant fields in the playerc_position2d_t structure:
- device->vx: forward speed (metres/sec).
- device->vy: sideways (perpendicular) speed (metres/sec).
- device->va: turning speed (radians/sec).
7.3.1.3 - Get_Pos ( )¶
Again, in playerc these fields are read directly from the playerc_position2d_t structure. This allows you to monitor where the robot thinks it is. Coordinate values are given relative to its starting point, and yaws are relative to its starting yaw:
- device->px: the current x coordinate relative to the x starting position.
- device->py: the current y coordinate relative to the y starting position.
- device->pa: the current yaw relative to the starting yaw.
TRY IT OUT (GetSetPositions)¶
In Section 3.2.1 - The Position Model, we specified whether the robot would record odometry by measuring how much its wheels have turned, or whether it would have perfect knowledge of its current coordinates (by default the robot does not record odometry at all). If you set the robot to record odometry using its wheels then the positions returned by these get commands will become increasingly inaccurate as the simulation goes on. If you want to log your robot’s position as it moves around, these functions can be used along with the perfect odometry setting.
7.3.1.4 - Motor Enable¶
The playerc_position2d_enable(device, enable) function takes a boolean input, telling Player whether to enable the motors or not. If the motors are disabled then the robot will not move no matter what commands are given to it; if the motors are enabled then the motors will always work, which is not so desirable if the robot is on a desk or something and is likely to get damaged. Hence enabling the motors is optional. If you are using Player/Stage, the motors are always enabled and this command doesn’t need to be run. However, if your code is ever likely to be moved onto a real robot and the motors are not explicitly enabled in your code, then you may end up spending a long time trying to work out why your robot is not working.
7.3.2 - rangerproxy¶
A RangerProxy interfaces with any ranger sensor.
A laser is represented by a ranger device with one ranger sensor whose samples attribute is greater than one. To minimize confusion with the deprecated laser interface, I’ll refer to these as single-sensor devices. A set of sonars or IR sensors is represented by a ranger device with multiple ranger sensors whose samples attributes are not set (or are set to 1). To minimize confusion with the deprecated sonar and IR interfaces, I’ll refer to these as multiple-sensor devices.
Angles are given with reference to the laser’s centre front (see Figure 7.4).
- playerc_ranger_get_geom(): retrieves the configuration of the ranger sensor and fills in the playerc_ranger_t structure. The most useful elements of this structure are:
- ranges_count: the number of range measurements that the sensor suite measures. For a single-sensor device, this is given by the samples attribute. For a multiple-sensor device, this is given by the number of sensors.
- ranges[ranger_number]: the range returned by the ranger_number-th scan point. For a single-sensor device, scan points are numbered from the minimum angle at index 0 to the maximum angle at index ranges_count - 1. For a multiple-sensor device, the ranger_number is given by the order in which you included the sensors.
- min_angle: the minimum angle covered by a ranger sensor. Only makes sense for a single-sensor device.
- max_angle: the maximum angle covered by a ranger sensor. Only makes sense for a single-sensor device.
- bearings[ranger_number]: scan bearings in the XY plane (see Figure 7.4).
Figure 7.4: How laser angles are referenced. In this diagram the laser is pointing to the right along the dotted line; the angle θ is the angle of a laser scan point, in this example θ is negative.
Figure 7.5: A laser scanner. The minimum angle is the angle of the rightmost laser scan, the maximum angle is the leftmost laser scan. θ is the scan resolution of the laser: the angle between each laser scan, given in radians.
This example shows how ranger sensors can be read. Read through the code before executing.
7.3.3 - BlobfinderProxy¶
The blobfinder module analyses a camera image for areas of a desired colour and returns a playerc_blobfinder_t structure. This structure contains very few elements:
- width: width of the image (pixels)
- height: height of the image (pixels)
- blobs_count: the number of blobs seen
- blobs: a pointer to an array of playerc_blobfinder_blob_t structures.
So for a blobfinder proxy blobfinder, the blobs are given by blobfinder->blobs[blob_number], which returns the blob structure data for the blob with the index blob_number. Blobs are sorted by index in the order that they appear in the image, from left to right. This can also be achieved with the BlobfinderProxy function GetBlob(blob_number).
Once we have the blob structure from the proxy we can extract the data we need. The playerc_blobfinder_blob_t structure, documented in the Player manual, contains the following fields (see Figure 7.6 for illustration):
- color: the colour of the blob it detected, given as a hexadecimal value.
- area: the area of the blob’s bounding box. (In Stage 4.1.1 there is a bug with respect to the area: it is computed as an int, but returned as an unsigned int. In order to use it, you must explicitly cast it to an int ((int)area). See http://sourceforge.net/p/playerstage/bugs/362/ and/or https://github.com/rtv/Stage/issues/41 for the details.)
- x: the horizontal coordinate of the geometric centre of the blob’s bounding box
- y: the vertical coordinate of the geometric centre of the blob’s bounding box
- left: the horizontal coordinate of the left hand side of the blob’s bounding box
- right: the horizontal coordinate of the right hand side of the blob’s bounding box
- top: the vertical coordinate of the top side of the blob’s bounding box
- bottom: the vertical coordinate of the bottom side of the blob’s bounding box
Figure 7.6: What the fields in playerc_blobfinder_blob_t mean. The blob on the left has a geometric centre at (x,y); the blob on the right has a bounding box with the top left corner at (left, top) pixels and a lower right coordinate at (right, bottom) pixels. Coordinates are given with reference to the top left corner of the image.
TRY IT OUT (blobfinder)¶
This example shows how to extract info from a blobfinder. Read through the code before executing.
7.3.4 - GripperProxy¶
The GripperProxy allows you to control the gripper. Once the gripper is holding an item, the simulated robot will carry it around wherever it goes. Without a gripper you can only jostle an item in the simulation, and you would have to manually tell the simulation what to do with it. The GripperProxy can also tell you if an item is between the gripper teeth, because the gripper model has inbuilt beams which can detect if they are broken.
- playerc_gripper_open_cmd (playerc_gripper_t *device): tells the gripper to open. This will cause any items that were being carried to be dropped.
- playerc_gripper_close_cmd (playerc_gripper_t *device): tells the gripper to close. This will cause it to pick up anything between its teeth.
- playerc_gripper_stop_cmd (playerc_gripper_t *device): tells the gripper to stop. If it is opening, it may not complete opening; if it is closing, it may not complete closing.
- playerc_gripper_store_cmd (playerc_gripper_t *device): tells the gripper to store. Any object between the gripper’s teeth is stored in a hypothetical storage, if there is still capacity (it disappears from the world).
- playerc_gripper_retrieve_cmd (playerc_gripper_t *device): tells the gripper to retrieve. If there is a stored object in the storage, it re-appears in the gripper.
- playerc_gripper_printout (playerc_gripper_t *device, const char *prefix): prints a human-readable version of the gripper state.
- playerc_gripper_get_geom (playerc_gripper_t *device): gets the gripper geometry. This is placed in the playerc_gripper_t structure, which contains the following information:
- num_beams: the number of breakbeams the gripper has.
- capacity: the capacity of the gripper’s store - if 0, the gripper cannot store.
- state: the gripper’s state: one of PLAYER_GRIPPER_STATE_OPEN, PLAYER_GRIPPER_STATE_CLOSED, PLAYER_GRIPPER_STATE_MOVING or PLAYER_GRIPPER_STATE_ERROR.
- beams: the position of the object in the gripper. This tells you if there is an item inside the gripper: if the value is above 0 then there is an item to grab.
- stored: the number of currently-stored objects.
TRY IT OUT (gripper)¶
This example shows a robot approaching a box, gripping it, and dragging it backwards. Read through the code before executing.
7.3.5 - SimulationProxy¶
The simulation proxy allows your code to interact with and change aspects of the simulation, such as an item’s pose or its colour.
7.3.5.1 - Get/Set Pose¶
The item’s pose is a special case of the Get/SetProperty function: moving an item in the world is so common a need that a dedicated function was created for it.
playerc_simulation_set_pose2d (playerc_simulation_t *device, char *item_name,
double gx, double gy, double ga)
In this case item_name is as with get/set_property, but we can directly specify the item’s new coordinates and yaw (coordinates and yaws are given with reference to the map’s origin).
playerc_simulation_get_pose2d (playerc_simulation_t *device, char *identifier,
double *x, double *y, double *a)
This is like set_pose2d, only this time it reads the item’s coordinates and yaw and writes them to the given addresses in memory.
TRY IT OUT (GetSetPose)¶
7.3.5.2 - Get/Set Property¶
In version 4.1.1 of Stage the get/set_property simulation proxy functions are only implemented for the property “color”. None of the other properties are supported. Previous versions of Stage (before 3.2.2) had some code for them, but it wasn’t fully implemented and has since been removed.
If you desperately need this functionality you can use an earlier release of Stage; the first edition of this manual describes how to get and set a model’s properties in those distributions.
In this edition of the manual I will describe the only functioning get/set property, which is “color”.
To retrieve or change a property of an item in the simulation we use the following functions:
playerc_simulation_get_property (playerc_simulation_t *device,
char *item_name, char *property, void *value, size_t value_len)
playerc_simulation_set_property (playerc_simulation_t *device,
char *item_name, char *property, void *value, size_t value_len)
- item_name: the name that you gave to the object in the worldfile. It could be any model that you have described in the worldfile. For example, in the example worldfile we declared a Bigbob-type robot which we called “bob1”, so the item_name for that object is “bob1”. Similarly, in Section 3.2.3 - Building Other Stuff we built some models of oranges and called them “orange1” to “orange4”, so the item name for one of these would be “orange1”. Anything that is a model in your worldfile can be altered by this function; you just need to have named it, and no drivers need to be declared in the configuration file for this to work. We didn’t write controllers for the oranges but we can still alter their properties this way.
- property: currently, "_mp_color" is the only supported property of a model that you can change.
- value: a pointer to the value you want to fill with the property or assign to the property (see below).
- value_len: the size of the value you gave in bytes. This can easily be found with the C or C++ sizeof() operator.
The value parameter depends on which property you want to set.
"color"
: This requires an array of four float values, scaled between 0 and 1. The first index of the array is the red component of the colour, the second is green, the third is blue and the fourth is alpha (the colour's opacity, usually 1). For example, if we want a nice shade of green, which has RGB components 171/224/110, we scale these between 0 and 1 by dividing by 255 to get 0.67/0.88/0.43. We can now put this into a float array with the line float green[] = {0.67, 0.88, 0.43, 1};. This array can then be passed into our set_property call like so: playerc_simulation_set_property(sp, (char*)"puck1", (char*)"color", green, 4*sizeof(float));
TRY IT OUT (GetSetProperty)¶
This example shows how to reset the color of an object. Read through the code before executing.
7.4 - General Useful Commands¶
7.4.1 - read( )¶
To make the proxies update with new sensor data we need to tell the
player server to update, we can do this using the playerc_client object
which we used to connect to the server. All we have to do is run the
command playerc_client_read(playerc_client_name)
every time the data
needs updating (where playerc_client_name is the name you gave the
player client object). Until this command is run, the proxies and any
sensor information from them will be empty. The devices on a typical
robot are asynchronous and the devices in a Player/Stage simulation are
also asynchronous, so running the read()
command won’t always update
everything at the same time, so it may take several calls before some large data structures (such as a camera image) get updated.
7.4.2 - getgeom()¶
Most of the proxies have a function called get_geom
or
get_geometry
or request_geometry
, or words to that effect. What
these functions do is tell the proxy to retrieve information about the
device, usually its size and pose (relative to the robot). The proxies
don’t know this by default since this information is specific to the
robot or the Player/Stage robot model. If your code needs to know this
kind of information about a device then the proxy must run this command
first.
7.5 - Using Proxies: Case Study: using C for a Trash-Zapping Robot¶
To demonstrate how to write code to control a Player device or Player/Stage simulation we will use the example robot "Bigbob" developed in Section 3.2.2 - An Example Robot and Section 4.2 - Putting the Configuration File Together, which collects oranges and juice cartons from a factory floor. In previous sections we have developed the Stage model for this robot and its environment and the configuration file to control it. Now we can begin to put everything together to create a working simulation of this robot.
7.5.1 - The Control Architecture¶
To zap rubbish we have three basic behaviours:
- Wandering: to search for rubbish.
- Moving towards item: for when an item is spotted and the robot wants to zap it
- Collecting item: for dealing with zapping items.
The robot will also avoid obstacles but once this is done it will switch back to its previous behaviour. The control will follow the state transitions shown in Figure 7.7.
Figure 7.7: The state transitions that the Bigbob rubbish-zapping robot will follow.
7.5.2 - Beginning the Code¶
In Section 7.2 - Connecting to Server we discussed how to connect to the Player server and proxies attached to the server, and developed the following code:
#include <stdio.h>
#include <libplayerc/playerc.h>
int main(int argc, char *argv[])
{
playerc_client_t *robot;
playerc_position2d_t *p2dProxy;
playerc_ranger_t *sonarProxy;
playerc_blobfinder_t *blobProxy;
playerc_ranger_t *laserProxy;
playerc_simulation_t *simProxy;
/* Create a client and connect it to the server. */
robot = playerc_client_create(NULL, "localhost", 6665);
if (0 != playerc_client_connect(robot)) return -1;
/* Create and subscribe to devices. */
p2dProxy = playerc_position2d_create(robot, 0);
if (playerc_position2d_subscribe(p2dProxy, PLAYER_OPEN_MODE)) return -1;
sonarProxy = playerc_ranger_create(robot, 0);
if (playerc_ranger_subscribe(sonarProxy, PLAYER_OPEN_MODE)) return -1;
blobProxy = playerc_blobfinder_create(robot, 0);
if (playerc_blobfinder_subscribe(blobProxy, PLAYER_OPEN_MODE)) return -1;
laserProxy = playerc_ranger_create(robot, 1);
if (playerc_ranger_subscribe(laserProxy, PLAYER_OPEN_MODE)) return -1;
simProxy = playerc_simulation_create(robot, 0);
if (playerc_simulation_subscribe(simProxy, PLAYER_OPEN_MODE)) return -1;
/* some control code */
return 0;
}
Using our knowledge of the proxies discussed in Section 7.3 -Interacting with Proxies we can build controlling code on top of this basic code. Firstly, it is good practice to enable the motors and request the geometry for all the proxies. This means that the robot will move and that if we need to know about the sensing devices the proxies will have that information available.
//enable motors
playerc_position2d_enable(p2dProxy,1);
//request geometries
playerc_position2d_get_geom(p2dProxy);
playerc_ranger_get_geom(sonarProxy);
playerc_ranger_get_geom(laserProxy);
//blobfinder doesn't have geometry
Once things are initialised we can enter the main control loop. At this point we should tell the robot to read in data from its devices to the proxies.
while(true)
{
/* read from the proxies */
playerc_client_read(robot);
/*control code*/
}
7.5.3 - Wander¶
First we will initialise a couple of variables which will be the forward speed and the turning speed of the robot.
double forwardSpeed, turnSpeed;
Let’s say that Bigbob’s maximum speed is 1 metre/second and it can turn 90 degrees a second. We will write a small subfunction to randomly assign forward and turning speeds between 0 and the maximum speeds.
void Wander(double *forwardSpeed, double *turnSpeed)
{
int maxSpeed = 1;
int maxTurn = 90;
double fspeed, tspeed;
//fspeed is between 0 and 10
fspeed = rand()%11;
//(fspeed/10) is between 0 and 1
fspeed = (fspeed/10)*maxSpeed;
tspeed = rand()%(2*maxTurn);
tspeed = tspeed-maxTurn;
//tspeed is between -maxTurn and +maxTurn
*forwardSpeed = fspeed;
*turnSpeed = tspeed;
}
In the control loop we include a call to this function and then set the resulting speeds to the motors.
while(true)
{
// read from the proxies
playerc_client_read(robot);
//wander
Wander(&forwardSpeed, &turnSpeed);
//set motors
playerc_position2d_set_cmd_vel(p2dProxy, forwardSpeed, 0.0, DTOR(turnSpeed), 1);
}
The DTOR() macro is provided by Player; it converts a number in degrees into a number in radians. Our calculations have been done in degrees but playerc_position2d_set_cmd_vel expects radians, so this macro is used to convert between the two. At present the motors are being updated every
time this control loop executes, and this leads to some erratic
behaviour from the robot. Using the sleep()
command we will tell the
control loop to wait one second between each execution. sleep()
is a
standard C function and is included in the unistd.h
header.
At this point we should also seed the random number generator with the
current time so that the wander behaviour isn’t exactly the same each
time. For the sleep command we will need to include unistd.h
and to
seed the random number generator with the current system time we will
need to include time.h
.
#include <stdio.h>
#include <stdbool.h>  /* for the 'true' used in while(true) */
#include <unistd.h>
#include <time.h>
#include <libplayerc/playerc.h>
void Wander(double *forwardSpeed, double *turnSpeed)
{
//wander code...
}
int main(int argc, char *argv[])
{
//connect to proxies
double forwardSpeed, turnSpeed;
srand(time(NULL));
//enable motors
//request geometries
while(true)
{
// read from the proxies
playerc_client_read(robot);
//wander
Wander(&forwardSpeed, &turnSpeed);
//set motors
playerc_position2d_set_cmd_vel(p2dProxy, forwardSpeed, 0.0, DTOR(turnSpeed), 1);
sleep(1);
}
}
7.5.4 - Obstacle Avoidance¶
Now we need to write a subfunction that checks the sonars for any obstacles and amends the motor speeds accordingly.
void AvoidObstacles(double *forwardSpeed, double *turnSpeed,
playerc_ranger_t *sproxy)
{
double *sp = sproxy->ranges; // pointer to range array
//will avoid obstacles closer than 40cm
double avoidDistance = 0.4;
//will turn away at 60 degrees/sec
int avoidTurnSpeed = 60;
//left corner is sonar no. 2
//right corner is sonar no. 3
if(sp[2] < avoidDistance)
{
*forwardSpeed = 0;
//turn right
*turnSpeed = (-1)*avoidTurnSpeed;
return;
}
else if(sp[3] < avoidDistance)
{
*forwardSpeed = 0;
//turn left
*turnSpeed = avoidTurnSpeed;
return;
}
else if( (sp[0] < avoidDistance) && \
(sp[1] < avoidDistance))
{
//back off a little bit
*forwardSpeed = -0.2;
*turnSpeed = avoidTurnSpeed;
return;
}
return; //do nothing
}
This very basic obstacle-avoidance subfunction updates the motor speeds only if there is an obstacle to avoid. If we call this function just before sending data to the motors then it will overwrite any other behaviours, so that the obstacle will be avoided. Once the obstacle is no longer in the way the robot will continue as it was; this allows us to transition from any behaviour into obstacle avoidance and then back again, as required by our control structure. All we need to do now is call this function in our control loop:
while(true)
{
// read from the proxies
playerc_client_read(robot);
//wander
Wander(&forwardSpeed, &turnSpeed);
//avoid obstacles
AvoidObstacles(&forwardSpeed, &turnSpeed, sonarProxy);
//set motors
playerc_position2d_set_cmd_vel(p2dProxy, forwardSpeed, 0.0, DTOR(turnSpeed), 1);
sleep(1);
}
7.5.5 - Move To Item¶
For this state we want the robot to move towards a blob that it has spotted. There may be several blobs in its view at once, so we’ll tell the robot to move to the largest one because it’s probably the closest to the robot. The following subfunction finds the largest blob and turns the robot so that the blob’s centre is near the centre of the image. The robot will then move towards the blob.
void MoveToItem(double *forwardSpeed, double *turnSpeed,
playerc_blobfinder_t *bfp)
{
int i, centre;
//how many blobs are there?
int noBlobs = bfp->blobs_count;
playerc_blobfinder_blob_t blob;
int turningSpeed = 5; // in deg/s
/*number of pixels away from the image centre a blob
can be, to be in front of the robot. This is
essentially the margin of error.*/
int margin = 10;
//find the largest blob
int biggestBlobArea = 0;
int biggestBlob = 0;
for(i=0; i<noBlobs; i++)
{
//get blob from proxy
playerc_blobfinder_blob_t currBlob = bfp->blobs[i];
if( abs((int)currBlob.area) > biggestBlobArea)
{
biggestBlob = i;
biggestBlobArea = abs((int)currBlob.area);
}
}
blob = bfp->blobs[biggestBlob];
// find centre of image
centre = bfp->width/2;
//adjust turn to centre the blob in image
/*if the blob's centre is within some margin of the image
centre then move forwards, otherwise turn so that it is
centred. */
//blob to the left of centre
if(blob.x < centre-margin)
{
*forwardSpeed = 0;
//turn left
*turnSpeed = turningSpeed;
}
//blob to the right of centre
else if(blob.x > centre+margin)
{
*forwardSpeed = 0;
//turn right
*turnSpeed = -turningSpeed;
}
//otherwise go straight ahead
else
{
*forwardSpeed = 0.5;
*turnSpeed = 0;
}
return;
}
We want the robot to transition to this state whenever an item is seen, so we put a conditional statement in our control loop like so:
if(blobProxy->blobs_count == 0)
{
//wander
Wander(&forwardSpeed, &turnSpeed);
}
else
{
//move towards the item
MoveToItem(&forwardSpeed, &turnSpeed, blobProxy);
}
7.5.6 - Collect Item¶
This behaviour will be the most difficult to code because Stage doesn't support pushable objects (the required physics is far too complex); what happens instead is that the robot runs over the object and just jostles it a bit. As a work-around we will have to find out which item is between Bigbob's teeth, so that we can get its "name" and then change that item's pose (for which we need the item's name) so that it is no longer in the simulation. In essence, instead of having our robot eat rubbish and store it within its body, we are making the laser zap the rubbish out of existence.
We can find the name of an item between Bigbob’s teeth by cross
referencing the robot’s pose with the poses of the items in the world to
find out which item is nearest the robot’s laser. The first step is to
create a list of all the items in the world, their names and their poses
at initialisation. Since we know the names of the items are “orange1” to
“orange4” and “carton1” to “carton4”, we can find their poses with a
simple call to a simulation proxy. We’ll have to connect to the
simulation proxy with our code first using the line
simProxy = playerc_simulation_create(robot, 0);
, then we can access
this information and put it into a struct.
typedef struct
{
char name[16];
double x;
double y;
} item_t;
We can populate the structure with information using the following code:
item_t itemList[8];
void RefreshItemList(item_t *itemList, playerc_simulation_t *simProxy)
{
int i;
//get the poses of the oranges
for(i=0;i<4;i++)
{
char orangeStr[] = "orange%d";
sprintf(itemList[i].name, orangeStr, i+1);
double dummy; //dummy variable, don't need yaws.
playerc_simulation_get_pose2d(simProxy,itemList[i].name, \
&(itemList[i].x), &(itemList[i].y), &dummy);
}
//get the poses of the cartons
for(i=4;i<8;i++)
{
char cartonStr[] = "carton%d";
sprintf(itemList[i].name, cartonStr, i-3);
double dummy; //dummy variable, don't need yaws.
playerc_simulation_get_pose2d(simProxy,itemList[i].name, \
&(itemList[i].x), &(itemList[i].y), &dummy);
}
return;
}
Here we are making a string of the item names, for example orange1 and
storing that in the item’s name. We then use this string as an input
into the get_pose2d
function so that we can also get the item’s
location in the simulation.
Next we can begin the “Collect Item” behaviour, which will be triggered by something breaking the laser beam. When this happens we will check the area around Bigbob’s teeth, as indicated by Figure 7.8. We know the distance from the centre of this search circle to Bigbob’s origin (0.625m) and the radius of the search circle (0.375m), we can get the robot’s exact pose with the following code.
double x, y, yaw;
playerc_simulation_get_pose2d(sim,(char*)"bob1", &x, &y, &yaw);
Cross referencing the robot’s position with the item positions is a
matter of trigonometry, so isn’t particularly relevant to a manual on
Player/Stage. We won’t reproduce the code here, but the full and final
code developed for the Bigbob rubbish zapping robot can be found at
github. The
method we used is to find the Euclidian distance of the items to the
circle centre, and the smallest distance is the item we want to destroy.
We made a subfunction called FindItem
that returns the index of the
item to be destroyed. (We could also equip BigBob with a gripper, and
call gripper.close()
, and haul the trash somewhere else to drop it
off. See GripperProxy for more details, and
bigbob11
for an example.)
Figure 7.8: Where to look for items which may have passed through Bigbob's laser.
Now that we can find the item to destroy it’s fairly simple to trigger our subfunction when the laser is broken so we can find and destroy an item.
if(laserProxy->ranges[90] < 0.25)
{
int destroyThis;
/*first param is the list of items in the world
second is length of this list
third parameter is the simulation proxy with
the pose information in it*/
destroyThis = FindItem(itemList, 8, simProxy);
//move it out of the simulation
playerc_simulation_set_pose2d(simProxy,itemList[destroyThis].name, -10, -10, 0);
RefreshItemList(itemList, simProxy);
}
The laser has 180 samples, so sample number 90 is the one which is perpendicular to Bigbob's teeth. This sample reads a maximum of 0.25 m, so if its range were to fall below this then something has passed through the laser beam. We then find the item closest to the robot's teeth and move that item to coordinate (-10, -10) so it is no longer visible or accessible.
Finally we have a working simulation of a rubbish zapping robot! The
code comprises the source <source_code>/Ch7.5/bigbob.cc
, the
simulation world <source_code>/Ch7.5/bigbob.world
, and configuration
file <source_code>/Ch7.5/bigbob.cfg
.
This example shows the final code for the trash-zapping robot. Read through the code before executing.
7.6 - Simulating Multiple Robots¶
Our robot simulation case study only shows how to simulate a single robot in a Player/Stage environment. It's highly likely that a simulation will want more than one robot. In this situation you will need to build a model of every robot you need in the worldfile, and then its associated driver in the configuration file. Let's take a look at our worldfile for the case study; we'll add a model of a new Bigbob robot called "bob2":
bigbob
(
name "bob1"
pose [-5 -6 45]
color "green"
)
bigbob
(
name "bob2"
pose [5 6 225]
color "yellow"
)
7.6.1 - Each robot on its own port¶
If there are multiple robots in the simulation, the standard practice is to put each robot on its own port (see Section 4.1 - Device Address). To implement this in the configuration file we need to tell Player which port to find our second robot on:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6666:position2d:0" "6666:ranger:0"
"6666:blobfinder:0" "6666:ranger:1"]
model "bob2" )
If you plan on simulating a large number of robots then it is probably worth writing a script to generate the world and configuration files.
When Player/Stage is started, the Player server automatically connects to all the used ports in your simulation and you control the robots separately with different player client objects in your code. For instance:
/* First Robot */
playerc_client_t *robot;
playerc_position2d_t *p2dProxy;
playerc_ranger_t *sonarProxy;
playerc_blobfinder_t *blobProxy;
playerc_ranger_t *laserProxy;
playerc_simulation_t *simProxy;
/* Create a client and connect it to the server. */
robot = playerc_client_create(NULL, "localhost", 6665);
if (0 != playerc_client_connect(robot)) return -1;
/* Create and subscribe to devices. */
p2dProxy = playerc_position2d_create(robot, 0);
if (playerc_position2d_subscribe(p2dProxy, PLAYER_OPEN_MODE)) return -1;
sonarProxy = playerc_ranger_create(robot, 0);
if (playerc_ranger_subscribe(sonarProxy, PLAYER_OPEN_MODE)) return -1;
blobProxy = playerc_blobfinder_create(robot, 0);
if (playerc_blobfinder_subscribe(blobProxy, PLAYER_OPEN_MODE)) return -1;
laserProxy = playerc_ranger_create(robot, 1);
if (playerc_ranger_subscribe(laserProxy, PLAYER_OPEN_MODE)) return -1;
simProxy = playerc_simulation_create(robot, 0);
if (playerc_simulation_subscribe(simProxy, PLAYER_OPEN_MODE)) return -1;
/* Second Robot */
playerc_client_t *robot2;
playerc_position2d_t *p2dProxy2;
playerc_ranger_t *sonarProxy2;
playerc_blobfinder_t *blobProxy2;
playerc_ranger_t *laserProxy2;
/* Create a client and connect it to the server. */
robot2 = playerc_client_create(NULL, "localhost", 6666);
if (0 != playerc_client_connect(robot2)) return -1;
/* Create and subscribe to devices. */
p2dProxy2 = playerc_position2d_create(robot2, 0);
if (playerc_position2d_subscribe(p2dProxy2, PLAYER_OPEN_MODE)) return -1;
sonarProxy2 = playerc_ranger_create(robot2, 0);
if (playerc_ranger_subscribe(sonarProxy2, PLAYER_OPEN_MODE)) return -1;
blobProxy2 = playerc_blobfinder_create(robot2, 0);
if (playerc_blobfinder_subscribe(blobProxy2, PLAYER_OPEN_MODE)) return -1;
laserProxy2 = playerc_ranger_create(robot2, 1);
if (playerc_ranger_subscribe(laserProxy2, PLAYER_OPEN_MODE)) return -1;
Each player client represents a robot; this is why the player client is a constructor parameter when you connect to a proxy. Each robot has a proxy for each of its devices and no robots share a proxy, so it is important that your code connects to every proxy of every robot in order to read all the sensor information.
How you handle the extra player client and proxies depends on the scale of the simulation and your own coding preferences. If there are more than two or so robots in the simulation, it is a good idea to make a robot class which deals with connecting to the proxies and the server and processes all the information internally to control the robot. You can then create an instance of this class for each simulated robot (the robot's port number would need to be a parameter, otherwise they'll all connect to the same port and consequently the same robot) and all the simulated robots will run the same code.
This example shows the final code for two trash-zapping robots. Read through the code before executing.
7.6.2 - Each interface on its own index¶
An alternative to using a port for each robot is to use the same port but a different index.
For example, the Bigbob robot uses the interfaces and indexes position2d:0, ranger:0, blobfinder:0 and ranger:1. If we configured two Bigbob robots to use the same port but different indexes our configuration file would look like this:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6665:position2d:1" "6665:ranger:2"
"6665:blobfinder:1" "6665:ranger:3"]
model "bob2" )
In our code we could then establish the proxies using only one player client:
/* Create a client and connect it to the server. */
playerc_client_t *robot;
robot = playerc_client_create(NULL, "localhost", 6665);
/* Create and subscribe to devices. */
/* first robot */
p2dProxy = playerc_position2d_create(robot, 0);
if (playerc_position2d_subscribe(p2dProxy, PLAYER_OPEN_MODE)) return -1;
sonarProxy = playerc_ranger_create(robot, 0);
if (playerc_ranger_subscribe(sonarProxy, PLAYER_OPEN_MODE)) return -1;
blobProxy = playerc_blobfinder_create(robot, 0);
if (playerc_blobfinder_subscribe(blobProxy, PLAYER_OPEN_MODE)) return -1;
laserProxy = playerc_ranger_create(robot, 1);
if (playerc_ranger_subscribe(laserProxy, PLAYER_OPEN_MODE)) return -1;
/* second robot */
p2dProxy2 = playerc_position2d_create(robot, 1);
if (playerc_position2d_subscribe(p2dProxy2, PLAYER_OPEN_MODE)) return -1;
sonarProxy2 = playerc_ranger_create(robot, 2);
if (playerc_ranger_subscribe(sonarProxy2, PLAYER_OPEN_MODE)) return -1;
blobProxy2 = playerc_blobfinder_create(robot, 1);
if (playerc_blobfinder_subscribe(blobProxy2, PLAYER_OPEN_MODE)) return -1;
laserProxy2 = playerc_ranger_create(robot, 3);
if (playerc_ranger_subscribe(laserProxy2, PLAYER_OPEN_MODE)) return -1;
/* shared simulation proxy */
simProxy = playerc_simulation_create(robot, 0);
if (playerc_simulation_subscribe(simProxy, PLAYER_OPEN_MODE)) return -1;
In Chapter 6 only C++ was used as an example. Since Player interacts with controlling code over network sockets, it's pretty easy to control robots (physical or simulated) with other languages as well. Player officially supports C++, C, and Python (see http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__clientlibs.html). There are also third-party libraries with support for clients ranging from Smalltalk to Java to MATLAB.
8.1 - Coding in Python with playercpp.py
¶
8.1.1 - Setting up playercpp.py
interface¶
The playercpp.py bindings (Python bindings to Player's C++ client library) are NOT made by default in Player. You'll need to configure and compile Player locally to build them - how to do this is well beyond the scope of this manual, but an overall procedure can be found here.
To see if the bindings are available, and to locate where they are, type
locate playercpp.py
and observe the path with site-packages in its name.
TRY IT OUT¶
8.2 Connecting to the Server and Proxies With Your Code¶
The first thing to do within your code is to include the Player interface file. Assuming Player/Stage is installed correctly on your machine then this can be done with the line
from playercpp import *
Next we need to establish a Player Client, which will interact with the Player server for you. To do this we use the line:
robot = PlayerClient("localhost");
What this does is declare a new object which is a PlayerClient called
robot
which connects to the Player server at the given address. The
hostname and port are as discussed in Section 4.1 - Device Address. If your code is running on the same computer (or robot) as the Player server you wish to connect to then the hostname is "localhost"; otherwise it will be the IP address of the computer or robot. The port is an optional parameter usually only needed for simulations; it will be the same as the port you gave in the .cfg file. This is mainly useful if your simulation has more than one robot in it and you need your code to connect to each of them. So if you gave your first robot port 6665 and the second one 6666 (as in the example of Section 4.2 - Putting the Configuration File Together) then you would need two PlayerClients, one connected to each robot, and you would do this with the following code:
robot1 = PlayerClient("localhost",6665);
robot2 = PlayerClient("localhost",6666);
If you are only using one robot and in your .cfg file you said that it would operate on port 6665 then the port parameter to the PlayerClient class is not needed.
Once we have established a PlayerClient we should connect our code to the device proxies so that we can exchange information with them. Which proxies you can connect your code to is dependent on what you have put in your configuration file. For instance if your configuration file says your robot is connected to a laser but not a camera you can connect to the laser device but not the camera, even if the robot (or robot simulation) has a camera on it.
Proxies take the name of the interface which the drivers use to talk to Player. Let’s take part of the Bigbob example configuration file from Section 4.2 - Putting the Configuration File Together):
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1" ]
)
Here we’ve told the Player server that our “robot” has devices which use the position2d, ranger, and blobfinder interfaces. In our code then, we should connect to the position2d, ranger, and blobfinder proxies like so:
positionProxy_name = Position2dProxy (client_name,index)
sonarProxy_name = RangerProxy (client_name,index)
blobProxy_name = BlobfinderProxy (client_name,index)
laserProxy_name = RangerProxy (client_name,index)
A full list of which proxies Player supports can be found in the Player manual. They all follow the convention of being named after the interface they use.
In the above case Proxy_name
is the name you want to give to the
proxy object, client_name
is the name you gave the PlayerClient
object earlier and index
is the index that the device was given in
your configuration file (probably 0).
8.2.1 - Setting Up Connections: an Example¶
For an example of how to connect to the Player sever and device proxies we will use the example configuration file developed in Section 4.2 -Putting the Configuration File Together. For convenience this is reproduced below:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "worldfile_name.world"
)
driver
(
name "stage"
provides ["6665:position2d:0"
"6665:ranger:0"
"6665:blobfinder:0"
"6665:ranger:1"]
model "bob1"
)
To set up a PlayerClient and then connect to proxies on that server we can use principles discussed in this section to develop the following code:
from playercpp import *
robot = PlayerClient("localhost");
p2dProxy = Position2dProxy(robot,0);
sonarProxy = RangerProxy(robot,0);
blobProxy = BlobfinderProxy(robot,0);
laserProxy = RangerProxy(robot,1);
# some control code
8.3 Interacting with Proxies¶
As you may expect, each proxy is specialised towards controlling the device it connects to. This means that each proxy will have different commands depending on what it controls.
In Player version 3.0.2 there are 39 different proxies which you can choose to use, many of which are not applicable to Player/Stage. This manual will not attempt to explain them all; a full list of available proxies and their functions is in the Player manual, although the returns, parameters and purpose of each proxy function are not always explained.
The following few proxies are probably the most useful to anyone using Player or Player/Stage.
8.3.1 Position2dProxy¶
The Position2dProxy is the number one most useful proxy there is. It controls the robot’s motors and keeps track of the robot’s odometry (where the robot thinks it is based on how far its wheels have moved).
8.3.1.1 - SetSpeed ( )¶
The SetSpeed command is used to tell the robot's motors how fast to turn. There are three different SetSpeed commands that can be called: one is for robots that can move in any direction (omnidirectional), one is for robots with differential drive (i.e. one drive wheel on each side), and the last is for car-like drives.
SetSpeed(XSpeed, YSpeed, YawSpeed)
SetSpeed(XSpeed, YawSpeed)
SetCarlike(XSpeed, DriveAngle)
Figure 8.3: A robot on a cartesian grid. This shows what directions the X and Y speeds will cause the robot to move in. A positive yaw speed will turn the robot in the direction of the + arrow, a negative yaw speed in the direction of the - arrow.
Figure 8.3 shows which direction the x, y and yaw speeds are in relation
to the robot. The x speed is the rate at which the robot moves forward
and the y speed is the robot’s speed sideways, both are to be given in
metres per second. The y speed will only be useful if the robot you want
to simulate or control is a ball, since robots with wheels cannot move
sideways. The yaw speed controls how fast the robot is turning and is given in radians per second. The Python math module provides math.radians(), which converts a number in degrees into a number in radians; this can be useful when setting the yaw speed.
If you want to simulate or control a robot with a differential drive
system then you’ll need to convert left and right wheel speeds into a
forward speed and a turning speed before sending it to the proxy. For
car-like drives there is SetCarlike, which takes the forward speed in m/s and the drive angle in radians.
8.3.1.2 - Get_Speed ( )¶
The GetSpeed commands are essentially the reverse of the SetSpeed command. Instead of setting a speed they return the current speed relative to the robot (so x is the forward speed, yaw is the turning speed and so on).
GetXSpeed()
: forward speed (metres/sec).
GetYSpeed()
: sideways (perpendicular) speed (metres/sec).
GetYawSpeed()
: turning speed (radians/sec).
8.3.1.3 - Get_Pos ( )¶
This function interacts with the robot’s odometry. It allows you to monitor where the robot thinks it is. Coordinate values are given relative to its starting point, and yaws are relative to its starting yaw.
GetXPos()
: returns the current x coordinate relative to its x starting position.
GetYPos()
: returns the current y coordinate relative to its y starting position.
GetYaw()
: returns the current yaw relative to its starting yaw.
TRY IT OUT (GetSetPositions)¶
In Section 3.2.1 - The Position Model, we specified whether Player would record odometry by measuring how much the robot's wheels have turned, or whether the robot would have perfect knowledge of its current coordinates (by default the robot does not record odometry at all). If you set the robot to record odometry using its wheels then the positions returned by these get commands will become increasingly inaccurate as the simulation goes on. If you want to log your robot's position as it moves around, these functions can be used along with perfect odometry.
8.3.1.4 - SetMotorEnable( )¶
This function takes a boolean input, telling Player whether to enable the motors or not. If the motors are disabled then the robot will not move no matter what commands are given to it; if the motors are enabled then the motors will always work, which is not so desirable if the robot is on a desk or something and is likely to get damaged. Hence enabling the motors is optional. If you are using Player/Stage, the motors will always be enabled and this command doesn't need to be run. However, if your code is ever likely to be moved onto a real robot and the motors are not explicitly enabled in your code, then you may end up spending a long time trying to work out why your robot is not working.
8.3.2 RangerProxy¶
A RangerProxy interfaces with any ranger sensor.
A laser is represented by a ranger device with one ranger sensor, whose `samples` attribute is greater than one. To minimize confusion with the deprecated laser interface, I'll refer to these as single-sensor devices.
A set of sonars or IR sensors is represented by a ranger device with multiple ranger sensors whose `samples` attributes are not set (or set to 1). To minimize confusion with the deprecated sonar and IR interfaces, I'll refer to these as multiple-sensor devices.
Angles are given with reference to the laser’s centre front (see Figure 8.4).
GetRangeCount()
: The number of ranger measurements that the sensor suite measures. In the case of a single-sensor device, this is given by the `samples` attribute. In the case of a multiple-sensor device, this is given by the number of sensors.

rangerProxy_name[ranger_number]
: The range returned by the `ranger_number`th scan point. For a single-sensor device, scan points are numbered from the minimum angle at index 0, to the maximum angle at index `GetRangeCount()-1`. For a multiple-sensor device, the `ranger_number` is given by the order in which you included the sensor in the world file. **BUG ALERT** `RangerProxy` does not support indexing in the current distributed version of player's `playercpp.py`. Use `GetRange()` below.

GetRange(ranger_number)
: Same as `rangerProxy_name[ranger_number]`.

GetMinAngle()
: gives the minimum angle covered by a ranger sensor. Only makes sense for a single-sensor device. (One tricky thing - you need to be sure to call `RequestConfigure()` once before accessing the min or max angles; they are initialized to zero!)

GetMaxAngle()
: gives the maximum angle covered by a ranger sensor. Only makes sense for a single-sensor device.

GetAngularRes()
: gives the angular resolution (θ in Figure 8.5)
Figure 8.4: How laser angles are referenced. In this diagram the laser is pointing to the right along the dotted line; the angle θ is the angle of a laser scan point, in this example θ is negative.
Figure 8.5: A laser scanner. The minimum angle is the angle of the rightmost laser scan, the maximum angle is the leftmost laser scan. θ is the scan resolution of the laser: it is the angle between each laser scan, given in radians.
This example shows how ranger sensors can be read. Read through the code before executing.
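A minimal sketch of reading a ranger in that style, assuming the playercpp bindings from this chapter (the helpers `closest_reading` and `scan` are our own names; `GetRange()` and `GetRangeCount()` are the proxy methods listed above):

```python
# Hypothetical sketch: find the closest obstacle seen by a ranger.

def closest_reading(ranges):
    """Return (index, range) of the smallest reading in a list."""
    idx = min(range(len(ranges)), key=lambda i: ranges[i])
    return idx, ranges[idx]

def scan(ranger):
    """Pull all readings out of a RangerProxy via GetRange(),
    avoiding the indexing bug mentioned above."""
    return [ranger.GetRange(i) for i in range(ranger.GetRangeCount())]
```

After a `robot.Read()`, `idx, dist = closest_reading(scan(sp))` gives the scan-point index and distance of the nearest obstacle.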
8.3.3 BlobfinderProxy¶
The blobfinder module analyses a camera image for areas of a desired colour and returns an array of the structure [playerc_blobfinder_blob_t](http://playerstage.sourceforge.net/doc/Player-3.0.2/player/structplayer__blobfinder__blob.html), which is the structure used to store blob data. First we will cover how to get this data from the blobfinder proxy, then we will discuss the data stored in the structure.
- GetCount(): Returns the number of blobs seen.
- blobProxy_name[blob_number]: This returns the blob structure data for the blob with the index blob_number. Blobs are sorted by index in the order that they appear in the image from left to right.
- GetBlob(blob_number): same as blobProxy_name[blob_number]
Once we receive the blob structure from the proxy we can extract the data we need. The playerc_blobfinder_blob_t structure, documented in the Player manual, contains the following fields (see Figure 8.6 for illustration):

**BUG ALERT** Unfortunately, the C to Python interface doesn't do a good job of accessing data buried in C structures from Python, so you can't get at the properties of the blob.
color
: The colour of the blob it detected. This is given as a hexadecimal value.

area
: The area of the blob's bounding box. (In Stage 4.1.1, there is a bug with respect to the area: it is computed as an int, but returned as an unsigned int. In order to use it, you must explicitly cast it to an int ((int)area). See http://sourceforge.net/p/playerstage/bugs/362/ and/or https://github.com/rtv/Stage/issues/41 for the details.)

x
: The horizontal coordinate of the geometric centre of the blob's bounding box

y
: The vertical coordinate of the geometric centre of the blob's bounding box

left
: The horizontal coordinate of the left hand side of the blob's bounding box

right
: The horizontal coordinate of the right hand side of the blob's bounding box

top
: The vertical coordinate of the top side of the blob's bounding box

bottom
: The vertical coordinate of the bottom side of the blob's bounding box
Figure 8.6: What the fields in playerc_blobfinder_blob_t mean. The blob on the left has a geometric centre at (x,y); the blob on the right has a bounding box with the top left corner at (left, top) pixels, and a lower right coordinate at (right, bottom) pixels. Coordinates are given with reference to the top left corner of the image.
TRY IT OUT (blobfinder)¶
8.3.4 - GripperProxy¶
The GripperProxy allows you to control the gripper. Once the gripper is holding an item, the simulated robot will carry it around wherever it goes. Without a gripper you can only jostle an item in the simulation and you would have to manually tell the simulation what to do with an item. The GripperProxy can also tell you if an item is between the gripper teeth because the gripper model has inbuilt beams which can detect if they are broken.
GetBeams()
: This command will tell you if there is an item inside the gripper. If it is a value above 0 then there is an item to grab.

GetState()
: This will tell you whether the gripper is opened or closed. If the command returns a 1 then the gripper is open, if it returns 2 then the gripper is closed, and 3 if the gripper is moving.

Open()
: Tells the gripper to open. This will cause any items that were being carried to be dropped.

Close()
: Tells the gripper to close. This will cause it to pick up anything between its teeth.
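The calls above can be combined into a simple grab routine. A sketch, assuming the playercpp GripperProxy methods described in this section (the helper names are ours; the state values follow the GetState() description: 1 open, 2 closed, 3 moving):

```python
# Hypothetical sketch built on GripperProxy's GetBeams/GetState/Close.
GRIPPER_OPEN, GRIPPER_CLOSED, GRIPPER_MOVING = 1, 2, 3

def item_in_jaws(gripper):
    """True when the breakbeams report something between the teeth."""
    return gripper.GetBeams() > 0

def grab_if_possible(gripper):
    """Close on an item if one is present; report whether we tried."""
    if gripper.GetState() == GRIPPER_OPEN and item_in_jaws(gripper):
        gripper.Close()
        return True
    return False
```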
TRY IT OUT (gripper)¶
8.3.5 - SimulationProxy¶
The simulation proxy allows your code to interact with and change aspects of the simulation, such as an item’s pose or its colour.
8.3.5.1 - Get/Set Pose¶
The item's pose is a special case of the Get/SetProperty function: because it is so likely that someone will want to move an item in the world, a special function was created to do it.
SetPose2d(item_name, x, y, yaw)
In this case item_name
is as with Get/SetProperty, but we can
directly specify its new coordinates and yaw (coordinates and yaws are
given with reference to the map’s origin).
**BUG ALERT** Unfortunately, the C++ to Python interface doesn't allow passing pointers from Python, so the GetPose2d call doesn't work. If you download and compile the latest version of Player from GitHub, it has this fix applied.

GetPose2d(item_name, double &x, double &y, double &yaw)

This is like SetPose2d, only this time it writes the coordinates and yaw to the given addresses in memory.
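A sketch of moving several items at once with SetPose2d (the helper and the item names are ours; `sim` is a SimulationProxy created as in the earlier examples):

```python
# Hypothetical sketch: rearrange named worldfile models via SetPose2d().

def scatter_items(sim, poses):
    """poses maps an item name to an (x, y, yaw) tuple, in map
    coordinates as described above."""
    for name, (x, y, yaw) in poses.items():
        sim.SetPose2d(name, x, y, yaw)
```

For example, `scatter_items(sim, {"orange1": (2.0, 1.0, 0.0)})` teleports one of the oranges from Section 3.2.3.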
TRY IT OUT (GetSetPose)¶
8.3.5.2 - Get/Set Property¶
In Stage 4.1.1 the Get/SetProperty simulation proxy functions are only implemented for the property “color”. None of the other properties are supported. Previous versions of Stage (before 3.2.2) had some code but it wasn’t fully implemented, and it’s been removed since.
If you desperately need this functionality you can use an earlier release of Stage, and the first edition of this manual describes how to get and set a model’s property in those distributions.
In this edition of the manual I will describe the only functioning Get/SetProperty, which is “color”.
To change a property of an item in the simulation we use the following function:
GetProperty(item_name, property, *value, value_len)
SetProperty(item_name, property, *value, value_len)
**BUG ALERT** Unfortunately, the C to Python interface doesn't do a good job of accessing data behind a pointer, so you can't get at the data within `*value`.
item_name
: this is the name that you gave to the object in the worldfile; it could be any model that you have described in the worldfile. For example, in Section 3.2.2 - An Example Robot we declared a Bigbob type robot which we called "bob1", so the item_name for that object is "bob1". Similarly, in Section 3.2.3 - Building Other Stuff we built some models of oranges and called them "orange1" to "orange4", so the item name for one of these would be "orange1". Anything that is a model in your worldfile can be altered by this function; you just need to have named it, and no drivers need to be declared in the configuration file for this to work. We didn't write controllers for the oranges but we could still alter their properties this way.

property
: Currently, "_mp_color" is the only supported property of a model that you can change.

value
: a pointer to the value you want to fill with the property or assign to the property (see below).

value_len
: the size of the value you gave, in bytes.
The value parameter is dependent on which property you want to set.

"color"
: This requires an array of four float values, scaled between 0 and 1. The first index of the array is the red component of the colour, the second is the green, the third is blue and the fourth is alpha (how light or dark the colour is, usually 1). For example, if we want a nice shade of green, which has RGB components 171/224/110, we scale these between 0 and 1 by dividing by 255 to get 0.67/0.88/0.43. We can now put this into a float array with the line float green[]={0.67, 0.88, 0.43, 1};. This array can then be passed into our SetProperty function like so: SetProperty("model_name", "color", (void*)green, sizeof(float)*4 );
TRY IT OUT (GetSetProperty)¶
This example shows how to reset the color of an object. Read through the code before executing.
**NONFUNCTIONAL**
8.4 General Useful Commands¶
8.4.1 - Read()¶
To make the proxies update with new sensor data we need to tell the
player server to update, we can do this using the PlayerClient object
which we used to connect to the server. All we have to do is run the
command playerClient_name.Read()
every time the data needs updating
(where playerClient_name is the name you gave the PlayerClient object).
Until this command is run, the proxies and any sensor information from
them will be empty.
The devices on a typical robot are asynchronous, and the devices in a Player/Stage simulation are also asynchronous, so running the Read() command won't always update everything at the same time. It may therefore take several calls before some large data structures (such as a camera image) get updated.
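One common workaround (our own helper, not part of the Player API) is simply to read several times before relying on slow-updating data:

```python
# Hypothetical sketch: let slow devices catch up by reading repeatedly.

def settle(client, reads=5):
    """Call the client's Read() several times so that large, slowly
    updating structures (e.g. camera images) have a chance to arrive."""
    for _ in range(reads):
        client.Read()
```

Calling `settle(robot)` once after subscribing, before the main control loop, is usually enough.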
8.4.2 - GetGeom( )¶
Most of the proxies have a function called GetGeom
or
GetGeometry
or RequestGeometry
, or words to that effect. What
these functions do is tell the proxy to retrieve information about the
device, usually its size and pose (relative to the robot). The proxies
don’t know this by default since this information is specific to the
robot or the Player/Stage robot model. If your code needs to know this
kind of information about a device then the proxy must run this command
first.
8.5 Case Study 1: Using Python for a Trash-Zapping Robot¶
This case study is not yet ported to python, due to the inability to access data within the blobdata structure.
8.6 Case Study 2: Simulating Multiple Robots¶
Our robot simulation case study only shows how to simulate a single robot in a Player/Stage environment. It’s highly likely that a simulation might want more than one robot in it. In this situation you will need to build a model of every robot you need in the worldfile, and then its associated driver in the configuration file. Let’s take a look at our worldfile for the case study, we’ll add a new model of a new Bigbob robot called “bob2”:
bigbob
(
name "bob1"
pose [-5 -6 45]
color "green"
)
bigbob
(
name "bob2"
pose [5 6 225]
color "yellow"
)
8.6.1 - Each robot on its own port¶
If there are multiple robots in the simulation, the standard practice is to put each robot on its own port (see Section 4.1 - Device Address). To implement this in the configuration file we need to tell Player which port to find our second robot on:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6666:position2d:0" "6666:ranger:0"
"6666:blobfinder:0" "6666:ranger:1"]
model "bob2" )
If you plan on simulating a large number of robots then it is probably worth writing a script to generate the world and configuration files.
When Player/Stage is started, the Player server automatically connects to all the ports used in your simulation and you control the robots separately with different PlayerClient objects in your code. For instance:
# first robot
robot = PlayerClient("localhost",6665);
sp = RangerProxy(robot,0);
lp = RangerProxy(robot,1);
pp = Position2dProxy(robot,0);
# second robot
robot2 = PlayerClient("localhost",6666);
sp2 = RangerProxy(robot2,0);
lp2 = RangerProxy(robot2,1);
pp2 = Position2dProxy(robot2,0);
Each PlayerClient represents a robot; this is why the PlayerClient is a constructor parameter when you connect to a proxy. Each robot has a proxy for each of its devices, and no robots share a proxy, so it is important that your code connects to every proxy of every robot in order to read the sensor information.
How you handle the extra PlayerClients and proxies depends on the scale of the simulation and your own coding preferences. If there are more than two or so robots in the simulation, it is a good idea to make a robot class which deals with connecting to the server and proxies, and processes all the information internally to control the robot. You can then create an instance of this class for each simulated robot (the robot's port number would need to be a constructor parameter, otherwise they will all connect to the same port and consequently the same robot) and all the simulated robots will run the same code.
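A minimal sketch of such a class, assuming the playercpp bindings used in this chapter (the class name, attribute names and the deferred import are our own choices):

```python
# Hypothetical sketch: one object per simulated robot, port as parameter.

class SimRobot:
    def __init__(self, port, host="localhost"):
        self.host = host
        self.port = port
        self.robot = None          # not connected yet

    def connect(self):
        """Create the client and one proxy per device on our port."""
        from playercpp import (PlayerClient, RangerProxy,
                               Position2dProxy)   # assumed module name
        self.robot = PlayerClient(self.host, self.port)
        self.sonar = RangerProxy(self.robot, 0)
        self.laser = RangerProxy(self.robot, 1)
        self.pos = Position2dProxy(self.robot, 0)

    def update(self):
        self.robot.Read()
```

With this, `bots = [SimRobot(6665), SimRobot(6666)]` followed by `connect()` on each lets every robot run the same control code.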
This example shows the final code for two trash-zapping robots. Read through the code before executing.
8.6.2 - Each interface on its own index¶
An alternative to using a port for each robot is to use the same port but a different index.
For example, the Bigbob robot uses interfaces and indexes: position2d:0, ranger:0, blobfinder:0 and ranger:1. If we configured two Bigbob robots to use the same port but a different index our configuration file would be like this:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6665:position2d:1" "6665:ranger:2"
"6665:blobfinder:1" "6665:ranger:3"]
model "bob2" )
In our code we could then establish the proxies using only one PlayerClient:
robot = PlayerClient("localhost",6665);
sp = RangerProxy(robot,0);
lp = RangerProxy(robot,1);
pp = Position2dProxy(robot,0);
sp2 = RangerProxy(robot,2);
lp2 = RangerProxy(robot,3);
pp2 = Position2dProxy(robot,1);
This example shows the final code for the trash-zapping robot. Read through the code before executing.
In Chapter 6 only C++ was used as an example. Since Player interacts with controlling code over network sockets, it's pretty easy to control robots (physical or simulated) with other languages as well. Player officially supports C++, C, and Python (see http://playerstage.sourceforge.net/doc/Player-3.0.2/player/group__clientlibs.html). There are also third-party libraries with support for clients ranging from Smalltalk to Java to MATLAB.
9.1 - Coding in Python with playerc.py¶
9.1.1 - Setting up the playerc.py interface¶
The C bindings are made by default when building Player. To check whether the bindings are available, and to locate where they are, type

locate playerc.py

and observe the path with site-packages in its name.
TRY IT OUT¶
9.2 Connecting to the Server and Proxies With Your Code¶
The first thing to do within your code is to import the Player interface module. Assuming Player/Stage is installed correctly on your machine, this can be done with the line
from playerc import *
Next we need to establish a Player Client, which will interact with the Player server for you. To do this we use the line:
robot = playerc_client(None, 'localhost',6665)
What this does is declare a new object which is a playerc_client called
robot
which connects to the Player server at the given address. The
hostname and port is like that discussed in Section 4.1 - Device
Address. If your code is running on the same
computer (or robot) as the Player server you wish to connect to then the
hostname is “localhost” otherwise it will be the IP address of the
computer or robot. The port will be the same as the port you gave in the
.cfg file. So if you gave your first robot port 6665 and the second one
6666 (like in the example of Section 4.2 - Putting the Configuration
File
Together)
then you would need two player clients, one connected to each robot, and you would do this with the following code:
robot1 = playerc_client(None, 'localhost',6665)
robot2 = playerc_client(None, 'localhost',6666)
Unlike in C, you still need to specify the port parameter even if you are only using one robot and the default port in your .cfg file.
Once we have established a player client we should connect our code to the device proxies so that we can exchange information with them. Which proxies you can connect your code to is dependent on what you have put in your configuration file. For instance if your configuration file says your robot is connected to a laser but not a camera you can connect to the laser device but not the camera, even if the robot (or robot simulation) has a camera on it.
Proxies take the name of the interface which the drivers use to talk to Player. Let’s take part of the Bigbob example configuration file from Section 4.2 - Putting the Configuration File Together):
driver
(
name "stage"
provides ["position2d:0"
"ranger:0"
"blobfinder:0"
"ranger:1" ]
)
Here we’ve told the Player server that our “robot” has devices which use the position2d, ranger, and blobfinder interfaces. In our code then, we should connect to the position2d, ranger, and blobfinder proxies like so:
position2d_name = playerc_position2d(client_name, index)
position2d_name.subscribe(PLAYERC_OPEN_MODE)
sonar_name = playerc_ranger(client_name, index)
sonar_name.subscribe(PLAYERC_OPEN_MODE)
blobfinder_name = playerc_blobfinder(client_name, index)
blobfinder_name.subscribe(PLAYERC_OPEN_MODE)
laser_name = playerc_ranger(client_name, index)
laser_name.subscribe(PLAYERC_OPEN_MODE)
A full list of which proxies PlayerC supports can be found in the Player manual. They all follow the convention of being named after the interface they use.
In the above case, names such as position2d_name are the names you want to give to the proxy objects, client_name is the name you gave the playerc_client object earlier, and index is the index that the device was given in your configuration file (probably 0).
9.2.1 - Setting Up Connections: an Example¶
For an example of how to connect to the Player sever and device proxies we will use the example configuration file developed in Section 4.2 -Putting the Configuration File Together. For convenience this is reproduced below:
driver
(
name "stage"
plugin "stageplugin"
provides ["simulation:0" ]
# load the named file into the simulator
worldfile "worldfile_name.world"
)
driver
(
name "stage"
provides ["6665:position2d:0"
"6665:ranger:0"
"6665:blobfinder:0"
"6665:ranger:1"]
model "bob1"
)
To set up a player client and then connect to proxies on that server we can use principles discussed in this section to develop the following code:
from playerc import *

# Create a client object and connect it to the server
robot = playerc_client(None, 'localhost', 6665)
if robot.connect():
    raise Exception(playerc_error_str())

# Create and subscribe proxies for position2d, the rangers and the blobfinder
p = playerc_position2d(robot, 0)
if p.subscribe(PLAYERC_OPEN_MODE):
    raise Exception(playerc_error_str())

s = playerc_ranger(robot, 0)
if s.subscribe(PLAYERC_OPEN_MODE):
    raise Exception(playerc_error_str())

bf = playerc_blobfinder(robot, 0)
if bf.subscribe(PLAYERC_OPEN_MODE):
    raise Exception(playerc_error_str())

l = playerc_ranger(robot, 1)
if l.subscribe(PLAYERC_OPEN_MODE):
    raise Exception(playerc_error_str())

# some control code goes here
9.3 Interacting with Proxies¶
As you may expect, each proxy is specialised towards controlling the device it connects to. This means that each proxy will have different commands depending on what it controls.
In Player version 3.0.2 there are 39 different proxies which you can choose to use, many of which are not applicable to Player/Stage. This manual will not attempt to explain them all; a full list of available proxies and their functions is in the Player manual, although the returns, parameters and purpose of the proxy functions are not always explained.
The following few proxies are probably the most useful to anyone using Player or Player/Stage.
9.3.1 position2dproxy¶
The position2dproxy is the most useful proxy of all. It controls the robot's motors and keeps track of the robot's odometry (where the robot thinks it is, based on how far its wheels have moved).
9.3.1.1 - SetSpeed ( )¶
The SetSpeed command is used to tell the robot's motors how fast to turn. There are three different SetSpeed commands that can be called: one is for robots that can move in any direction (omnidirectional), one is for robots with differential drive (i.e. one drive wheel on each side), and the last is for car-like drives.
set_cmd_vel (XSpeed, YSpeed, YawSpeed, int state)
set_cmd_vel_head (XSpeed, YSpeed, YawHeading, state)
set_cmd_car (XSpeed, SteerAngle)
Figure 9.3: A robot on a cartesian grid. This shows what directions the X and Y speeds will cause the robot to move in. A positive yaw speed will turn the robot in the direction of the + arrow, a negative yaw speed in the direction of the - arrow.
Figure 9.3 shows which direction the x, y and yaw speeds are in relation to the robot. The x speed is the rate at which the robot moves forward and the y speed is the robot's speed sideways; both are given in metres per second. The y speed will only be useful if the robot you want to simulate or control is a ball, since robots with wheels cannot move sideways. The yaw speed controls how fast the robot is turning and is given in radians per second. Python's math module provides math.radians(), which converts a number in degrees into radians and can be useful when setting the yaw speed.
If you want to simulate or control a robot with a differential drive system then you'll need to convert left and right wheel speeds into a forward speed and a turning speed before sending them to the proxy. For car-like drives there is set_cmd_car, which again takes the forward speed in m/s and the steering angle in radians.
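The wheel-speed conversion can be sketched with standard differential-drive kinematics (the helper name and the track_width parameter, the distance between the wheels in metres, are our own):

```python
# Standard differential-drive kinematics: two wheel speeds in,
# one (forward, turn) pair out, as expected by set_cmd_vel().

def diff_drive_to_vel(v_left, v_right, track_width):
    """Return (forward m/s, yaw rad/s) for a differential drive."""
    forward = (v_left + v_right) / 2.0
    turn = (v_right - v_left) / track_width
    return forward, turn
```

For example, `v, w = diff_drive_to_vel(0.2, 0.3, 0.4)` followed by `p.set_cmd_vel(v, 0.0, w, 1)` drives forward while turning gently left.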
9.3.1.2 - GetSpeed ( )¶
With playerc, there are no explicit GetSpeed commands - you just read from the relevant fields in the playerc_position2d_t structure.
position2dProxy.vx
: forward speed (metres/sec).

position2dProxy.vy
: sideways (perpendicular) speed (metres/sec).

position2dProxy.va
: turning speed (radians/sec).
9.3.1.3 - Get_Pos ( )¶
Again, in playerc these fields are read directly from the
playerc_position2d_t
structure.
This allows you to monitor where the robot thinks it is. Coordinate values are given relative to its starting point, and yaws are relative to its starting yaw.
position2dProxy.px
: gives current x coordinate relative to its x starting position.

position2dProxy.py
: gives current y coordinate relative to its y starting position.

position2dProxy.pa
: gives current yaw relative to its starting yaw.
TRY IT OUT (GetSetPositions)¶
In Section 3.2.1 - The Position Model, we specified whether Player would record odometry by measuring how much the robot's wheels have turned, or whether the robot would have perfect knowledge of its current coordinates (by default the robot does not record odometry at all). If you set the robot to record odometry using its wheels, then the positions returned by these get commands will become increasingly inaccurate as the simulation goes on. If you want to log your robot's position as it moves around, these functions can be used together with perfect odometry.
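For example, a rough path-length estimate can be accumulated from successive (px, py) samples, one taken after each robot.read() (the helper is our own, not part of playerc):

```python
# Hypothetical sketch: estimate distance travelled from odometry samples.
from math import hypot

def path_length(samples):
    """samples is a list of (px, py) pairs; returns total distance
    along the sampled path in metres."""
    return sum(hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(samples, samples[1:]))
```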
9.3.1.4 - Motor Enable( )¶
This function takes a boolean input, telling Player whether to enable the motors or not. If the motors are disabled then the robot will not move, no matter what commands are given to it. If the motors are enabled then they will always work, which is not so desirable if the robot is on a desk or something and is likely to get damaged. Hence enabling the motors is optional. If you are using Player/Stage, the motors are always enabled and this command doesn't need to be run. However, if your code is ever likely to be moved onto a real robot and the motors are not explicitly enabled in your code, then you may end up spending a long time trying to work out why your robot is not working.
9.3.2 rangerproxy¶
A RangerProxy interfaces with any ranger sensor.
A laser is represented by a ranger device with one ranger sensor, whose `samples` attribute is greater than one. To minimize confusion with the deprecated laser interface, I'll refer to these as single-sensor devices.
A set of sonars or IR sensors is represented by a ranger device with multiple ranger sensors whose `samples` attributes are not set (or set to 1). To minimize confusion with the deprecated sonar and IR interfaces, I'll refer to these as multiple-sensor devices.
Angles are given with reference to the laser’s centre front (see Figure 9.4).
get_geom()
: Retrieves the configuration of the ranger sensor and fills in the playerc_ranger_t structure. The most useful elements of this structure are:

ranges_count
: The number of ranger measurements that the sensor suite measures. In the case of a single-sensor device, this is given by the `samples` attribute. In the case of a multiple-sensor device, this is given by the number of sensors.

ranges[ranger_number]
: The range returned by the `ranger_number`th scan point. For a single-sensor device, scan points are numbered from the minimum angle at index 0, to the maximum angle at index ranges_count-1. For a multiple-sensor device, the `ranger_number` is given by the order in which you included the sensor.

min_angle
: gives the minimum angle covered by a ranger sensor. Only makes sense for a single-sensor device. **Doesn't seem to work in the Python interface.**

max_angle
: gives the maximum angle covered by a ranger sensor. Only makes sense for a single-sensor device. **Doesn't seem to work in the Python interface.**

bearings[ranger_number]
: scan bearings in the XY plane (see Figure 9.4)
Figure 9.4: How laser angles are referenced. In this diagram the laser is pointing to the right along the dotted line; the angle θ is the angle of a laser scan point, in this example θ is negative.
Figure 9.5: A laser scanner. The minimum angle is the angle of the rightmost laser scan, the maximum angle is the leftmost laser scan. θ is the scan resolution of the laser: it is the angle between each laser scan, given in radians.
This example shows how ranger sensors can be read. Read through the code before executing.
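A small sketch pairing the ranges and bearings fields described above (the helper name is our own):

```python
# Hypothetical sketch: which way is the nearest obstacle?

def closest_bearing(ranges, bearings):
    """Given parallel lists from a playerc ranger, return
    (bearing rad, range m) of the nearest scan point."""
    i = min(range(len(ranges)), key=lambda k: ranges[k])
    return bearings[i], ranges[i]
```

After a read, `closest_bearing(list(s.ranges), list(s.bearings))` tells a wander-and-avoid controller which way to turn away from.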
9.3.3 BlobfinderProxy¶
The blobfinder module analyses a camera image for areas of a desired colour and returns an array of the structure [playerc_blobfinder_blob_t](http://playerstage.sourceforge.net/doc/Player-3.0.2/player/structplayer__blobfinder__blob.html), which is the structure used to store blob data. First we will cover how to get this data from the blobfinder proxy, then we will discuss the data stored in the structure.
- GetCount(): Returns the number of blobs seen.
- blobProxy_name[blob_number]: This returns the blob structure data for the blob with the index blob_number. Blobs are sorted by index in the order that they appear in the image from left to right.
- GetBlob(blob_number): same as blobProxy_name[blob_number]
Once we receive the blob structure from the proxy we can extract the data we need. The playerc_blobfinder_blob_t structure, documented in the Player manual, contains the following fields (see Figure 9.6 for illustration):

**BUG ALERT** Unfortunately, the C to Python interface doesn't do a good job of accessing data buried in C structures from Python, so you can't get at the properties of the blob.
color
: The colour of the blob it detected. This is given as a hexadecimal value.

area
: The area of the blob's bounding box. (In Stage 4.1.1, there is a bug with respect to the area: it is computed as an int, but returned as an unsigned int. In order to use it, you must explicitly cast it to an int ((int)area). See http://sourceforge.net/p/playerstage/bugs/362/ and/or https://github.com/rtv/Stage/issues/41 for the details.)

x
: The horizontal coordinate of the geometric centre of the blob's bounding box

y
: The vertical coordinate of the geometric centre of the blob's bounding box

left
: The horizontal coordinate of the left hand side of the blob's bounding box

right
: The horizontal coordinate of the right hand side of the blob's bounding box

top
: The vertical coordinate of the top side of the blob's bounding box

bottom
: The vertical coordinate of the bottom side of the blob's bounding box
Figure 9.6: What the fields in playerc_blobfinder_blob_t mean. The blob on the left has a geometric centre at (x,y); the blob on the right has a bounding box with the top left corner at (left, top) pixels, and a lower right coordinate at (right, bottom) pixels. Coordinates are given with reference to the top left corner of the image.
TRY IT OUT (blobfinder)¶
9.3.4 - GripperProxy¶
The GripperProxy allows you to control the gripper. Once the gripper is holding an item, the simulated robot will carry it around wherever it goes. Without a gripper you can only jostle an item in the simulation and you would have to manually tell the simulation what to do with an item. The GripperProxy can also tell you if an item is between the gripper teeth because the gripper model has inbuilt beams which can detect if they are broken.
open_cmd(playerc_gripper_t *device)
: Tells the gripper to open. This will cause any items that were being carried to be dropped.

close_cmd(playerc_gripper_t *device)
: Tells the gripper to close. This will cause it to pick up anything between its teeth.

stop_cmd(playerc_gripper_t *device)
: Tells the gripper to stop. If it is opening, it may not complete opening; if it is closing, it may not complete closing.

printout(playerc_gripper_t *device, const char *prefix)
: Prints a human-readable version of the gripper state.

get_geom(playerc_gripper_t *device)
: Gets the gripper geometry. This is placed in the playerc_gripper_t structure, which contains the following information:

num_beams
: The number of breakbeams the gripper has.

state
: The gripper's state: may be one of PLAYER_GRIPPER_STATE_OPEN, PLAYER_GRIPPER_STATE_CLOSED, PLAYER_GRIPPER_STATE_MOVING or PLAYER_GRIPPER_STATE_ERROR.

beams
: The position of the object in the gripper. This will tell you if there is an item inside the gripper: if the value is above 0 then there is an item to grab.
TRY IT OUT (gripper)¶
This example shows a robot approaching a box, gripping it, and dragging it backwards. Read through the code before executing.
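The approach-grip-retreat idea can be sketched in terms of the calls above (the helper is our own; pos and gripper are playerc position2d and gripper proxies, and the set_cmd_vel state argument is assumed to be 1 for enabled):

```python
# Hypothetical sketch of the grip-then-drag-backwards step.

def grip_and_retreat(pos, gripper, back_speed=-0.1):
    """Close the gripper on whatever is between its teeth,
    then drive straight backwards to drag it away."""
    gripper.close_cmd()
    pos.set_cmd_vel(back_speed, 0.0, 0.0, 1)
```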
9.3.5 - SimulationProxy¶
The simulation proxy allows your code to interact with and change aspects of the simulation, such as an item’s pose or its colour.
9.3.5.1 - Get/Set Pose¶
The item's pose is a special case of the Get/SetProperty function: because it is so likely that someone will want to move an item in the world, a special function was created to do it.
set_pose2d(item_name, x, y, yaw)
In this case item_name
is as with Get/SetProperty, but we can
directly specify its new coordinates and yaw (coordinates and yaws are
given with reference to the map’s origin).
rtn, x, y, a = get_pose(item_name)
This is like set_pose2d, only this time it returns the coordinates and yaw as return values (rtn is a status code).
TRY IT OUT (GetSetPose)¶
This example shows how to Get and Set pose of objects. Read through the code before executing.
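A sketch combining the two calls, using the names exactly as given in the text above (the helper nudge is our own; sim is the simulation proxy):

```python
# Hypothetical sketch: shift a named model while keeping its yaw.

def nudge(sim, name, dx=0.0, dy=0.0):
    """Read an item's pose, shift it by (dx, dy), write it back."""
    rtn, x, y, yaw = sim.get_pose(name)
    sim.set_pose2d(name, x + dx, y + dy, yaw)
    return x + dx, y + dy, yaw
```

For example, `nudge(sim, "orange1", dx=1.0)` slides one of the oranges a metre along x.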
9.3.5.2 - Get/Set Property¶
In Stage 4.1.1 the Get/SetProperty simulation proxy functions are implemented only for the property "color"; none of the other properties are supported. Previous versions of Stage (before 3.2.2) had some code for them, but it was never fully implemented and has since been removed.
If you desperately need this functionality you can use an earlier release of Stage, and the first edition of this manual describes how to get and set a model’s property in those distributions.
In this edition of the manual I will describe the only functioning Get/SetProperty, which is “color”.
To change a property of an item in the simulation we use the following function:
GetProperty(item_name, property, *value, value_len)
SetProperty(item_name, property, *value, value_len)
**BUG ALERT** Unfortunately, the C-to-Python interface does not do a good job of accessing data behind a pointer, so you cannot get at the data within *value.
item_name
: This is the name that you gave to the object in the worldfile; it could be any model that you have described in the worldfile. For example, in Section 3.2.2 - An Example Robot we declared a Bigbob-type robot which we called "bob1", so the item_name for that object is "bob1". Similarly, in Section 3.2.3 - Building Other Stuff we built some models of oranges and called them "orange1" to "orange4", so the item name for one of these would be "orange1". Anything that is a model in your worldfile can be altered by this function; you just need to have named it, and no drivers need to be declared in the configuration file for this to work. We didn't write controllers for the oranges, but we could still alter their properties this way.

property
: Currently, "_mp_color" is the only supported property of a model that you can change.

value
: A pointer to the value you want to fill with the property, or assign to the property (see below).

value_len
: The size of the value you gave, in bytes.
The value parameter is dependent on which property you want to set.

"color"
: This requires an array of four float values, scaled between 0 and 1. The first index of the array is the red component of the colour, the second is green, the third is blue and the fourth is alpha (the opacity, usually 1). For example, if we want a nice shade of green, which has RGB components 171/224/110, we scale these between 0 and 1 by dividing by 255 to get 0.67/0.88/0.43. We can now put this into a float array with the line float green[]={0.67, 0.88, 0.43, 1};. This array can then be passed into our SetProperty function like so:

SetProperty("model_name", "color", (void*)green, sizeof(float)*4 );
TRY IT OUT (GetSetProperty)¶
This example shows how to reset the color of an object. Read through the code before executing.
**NONFUNCTIONAL**
9.4 - General Useful Commands¶
9.4.1 - read()¶
To make the proxies update with new sensor data we need to tell the
Player server to update. We can do this using the player client object
which we used to connect to the server: simply run the command
playerClient_name.read() every time the data needs updating
(where playerClient_name is the name you gave the player client
object). Until this command is run, the proxies and any sensor
information from them will be empty.
The devices on a typical robot, like the devices in a Player/Stage
simulation, are asynchronous, so running the read() command won't
always update everything at the same time; it may take several calls
before some large data structures (such as a camera image) get
updated.
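A sketch of the resulting update pattern (the FakeClient class is a stand-in of our own so the snippet runs without a Player server; with real playerc code the client would be your playerc_client object):

```python
def update(client, times=1):
    """Pump the client: until read() is run, proxy data stays empty."""
    for _ in range(times):
        client.read()

class FakeClient:
    """Stand-in for playerc_client, used only to illustrate the call pattern."""
    def __init__(self):
        self.reads = 0
    def read(self):
        self.reads += 1

c = FakeClient()
update(c, times=3)   # large structures (e.g. camera images) may need several reads
print(c.reads)       # 3
```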
9.4.2 - GetGeom( )¶
Most of the proxies have a function called get_geom
or
get_geometry
or request_geometry
, or words to that effect. What
these functions do is tell the proxy to retrieve information about the
device, usually its size and pose (relative to the robot). The proxies
don’t know this by default since this information is specific to the
robot or the Player/Stage robot model. If your code needs to know this
kind of information about a device then the proxy must run this command
first.
9.5 Case Study 1: Using Python for a Trash-Zapping Robot¶
This case study is not yet ported to python, due to the inability to access data within the blobdata structure.
9.6 Case Study 2: Simulating Multiple Robots¶
Our robot simulation case study only shows how to simulate a single robot in a Player/Stage environment, but it is highly likely that a simulation will need more than one robot. In this situation you will need to build a model of every robot in the worldfile, and then declare its associated driver in the configuration file. Let's take a look at our worldfile for the case study; we'll add a model of a new Bigbob robot called "bob2":
bigbob
(
name "bob1"
pose [-5 -6 45]
color "green"
)
bigbob
(
name "bob2"
pose [5 6 225]
color "yellow"
)
If there are multiple robots in the simulation, the standard practice is to put each robot on its own port (see Device Address). To implement this in the configuration file we need to tell Player which port to find our second robot on:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6666:position2d:0" "6666:ranger:0"
"6666:blobfinder:0" "6666:ranger:1"]
model "bob2" )
If you plan on simulating a large number of robots then it is probably worth writing a script to generate the world and configuration files.
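Such a generator script might be sketched like this (our own helper names; the entry formats mirror the worldfile and configuration file examples above, with one port per robot starting at 6665):

```python
def worldfile_entry(name, pose, color):
    """Return a bigbob worldfile block for one robot."""
    return ('bigbob\n(\n  name "%s"\n  pose [%g %g %g]\n  color "%s"\n)\n'
            % ((name,) + tuple(pose) + (color,)))

def cfg_entry(name, port):
    """Return a stage driver block binding one robot to its own port."""
    return ('driver( name "stage"\n'
            '        provides ["%d:position2d:0" "%d:ranger:0"\n'
            '                  "%d:blobfinder:0" "%d:ranger:1"]\n'
            '        model "%s" )\n' % (port, port, port, port, name))

robots = [("bob1", (-5, -6, 45), "green"), ("bob2", (5, 6, 225), "yellow")]
world = "".join(worldfile_entry(n, p, c) for n, p, c in robots)
cfg = "".join(cfg_entry(n, 6665 + i) for i, (n, _, _) in enumerate(robots))
print(world)
print(cfg)
```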
When Player/Stage is started, the Player server automatically connects to all the ports used in your simulation and you control the robots separately with different player client objects in your code. For instance:
# first robot
robot = playerc_client(None, 'localhost', 6665)
if robot.connect():
raise Exception(playerc_error_str())
sp = playerc_ranger(robot,0)
if sp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
lp = playerc_ranger(robot,1)
if lp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
pp = playerc_position2d(robot,0)
if pp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
# second robot
robot2 = playerc_client(None, 'localhost', 6666)
if robot2.connect():
raise Exception(playerc_error_str())
sp2 = playerc_ranger(robot2,0)
if sp2.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
lp2 = playerc_ranger(robot2,1)
if lp2.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
pp2 = playerc_position2d(robot2,0)
if pp2.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
Each player client represents a robot; this is why the player client is a constructor parameter when you create a proxy. Each robot has a proxy for each of its devices and no robots share a proxy, so it is important that your code connects to every proxy of every robot in order to read all the sensor information.
How you handle the extra player clients and proxies depends on the scale of the simulation and your own coding preferences. If there are more than about two robots in the simulation, it is a good idea to write a robot class which deals with connecting to the proxies and the server and processes all the information internally to control the robot. Then you can create an instance of this class for each simulated robot (the robot's port number needs to be a parameter, otherwise they will all connect to the same port and consequently the same robot) and all the simulated robots will run the same code.
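A sketch of such a class (BigbobController is our own name; the playerc calls inside connect() mirror the snippets above and are imported lazily, so defining the class needs no Player installation):

```python
class BigbobController:
    """One instance per simulated robot; the port distinguishes them."""
    def __init__(self, port, host='localhost'):
        self.host = host
        self.port = port  # must differ per robot, matching the configuration file

    def connect(self):
        # lazy import: only needed when actually talking to a Player server
        from playerc import (playerc_client, playerc_position2d,
                             playerc_error_str, PLAYERC_OPEN_MODE)
        self.client = playerc_client(None, self.host, self.port)
        if self.client.connect():
            raise Exception(playerc_error_str())
        self.pp = playerc_position2d(self.client, 0)
        if self.pp.subscribe(PLAYERC_OPEN_MODE):
            raise Exception(playerc_error_str())

bob1 = BigbobController(6665)
bob2 = BigbobController(6666)  # same code, different port
```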
An alternative to using a port for each robot is to use the same port but a different index.
For example, the Bigbob robot uses the interfaces and indexes position2d:0, ranger:0, blobfinder:0 and ranger:1. If we configured two Bigbob robots to use the same port but different indexes, our configuration file would be like this:
driver( name "stage"
provides ["6665:position2d:0" "6665:ranger:0"
"6665:blobfinder:0" "6665:ranger:1"]
model "bob1" )
driver( name "stage"
provides ["6665:position2d:1" "6665:ranger:2"
"6665:blobfinder:1" "6665:ranger:3"]
model "bob2" )
In our code we could then establish the proxies using only one player client:
# first robot
robot = playerc_client(None, 'localhost', 6665)
if robot.connect():
raise Exception(playerc_error_str())
sp = playerc_ranger(robot,0)
if sp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
lp = playerc_ranger(robot,1)
if lp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
pp = playerc_position2d(robot,0)
if pp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
# second robot
sp2 = playerc_ranger(robot,2)
if sp2.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
lp2 = playerc_ranger(robot,3)
if lp2.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
pp2 = playerc_position2d(robot,1)
if pp2.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
# shared simulation proxy
simp = playerc_simulation(robot,0)
if simp.subscribe(PLAYERC_OPEN_MODE):
raise Exception(playerc_error_str())
The main advantage of configuring the robot swarm this way is that it allows us to only have one simulation proxy which is used by all robots. This is good since there is only ever one simulation window that you can interact with and so multiple simulation proxies are unnecessary and confusing.
The previous sections, 1 to 9, focus on the construction of a new scenario (world and models) and the client-side controller software. However, if you have to build a new interface or device driver for Player, you will find only out-of-date and incomplete instructions. The motivation of this section is thus to reduce the effort required to build new resources for Player.
10.1 - Definition of a Player Interface¶
An interface is defined in two different parts of Player documentation
- Definition 1. All Player communication occurs through interfaces, which specify the syntax and semantics for a set of messages.
- Definition 2. A specification of how to interact with a certain class of robotic sensor, actuator, or algorithm. The interface defines the syntax and semantics of all messages that can be exchanged with entities in the same class.
Consider the ranger interface. This interface defines a format in which a generic (vendor-independent) range sensor can return range readings. It defines the data structure of the messages exchanged between a client-side controller and a server-side device driver. This is how Player abstracts the specifics of a hardware module, such as a ranger device. For example, the SICK LMS200 device has its own protocol to interface with its hardware; however, when integrated with Player, the Player driver for the SICK LMS200 uses the generic, vendor-independent ranger interface. Thus, the client-side controller software is not aware of the laser ranger brand.
So, when client-side controller software creates the sonarProxy device based on the RangerProxy, we are defining that this device obeys the ranger interface, sending and receiving messages according to the data types (structs) defined in this interface. For instance, when the sonarProxy.GetRange(i) method is executed, the read message has the format player_ranger_data_range_t, defined in the file libplayerinterface/interfaces/062_ranger.def.
10.2 - When to Create a New Interface¶
It is unlikely that you have to create a new interface if you are using a conventional robot because Player already has a vast number of interfaces for the most used resources in robots. If you do have to define a new interface, please follow the recommended guidelines to maintain future software compatibility and documentation style.
10.3 - How an Interface is Created¶
There is a very brief description of how to add new interfaces in
libplayerinterface/interfaces/ADDING_INTERFACES. It gives basic
information about interfaces, but it lacks useful details which are
included in this document. The ADDING_INTERFACES file says:
There is also an example interface located at
examples/plugins/exampleinterface which might be useful for some
users, but I think it is quite complicated for beginners.
The ranger interface is also quite complicated, so it is not an
adequate example to start with. On the other hand, the speech interface
(libplayerinterface/interfaces/012_speech.def) is one of the
simplest Player interfaces. It is a generic interface to
any speech synthesizer. It is defined as:
The speech interface has only one message and one data structure
(player_speech_cmd_t
) carrying the sentence to be spoken by the
robot.
Another interesting interface is the bumper, defined at
libplayerinterface/interfaces/014_bumper.def
:
The bumper interface has three messages, each with its respective data type.
Note that there are different message parameters in this interface.
According to the Player manual, a message is defined as
message { TYPE, SUBTYPE, SUBTYPE_CODE, DATA_TYPE };
The TYPE field can be DATA, REQ or CMD. DATA is used, for example, to read data out
of a sensor: the robot's pose, device status, etc. REQ is used
to query a device in a request/reply communication format. CMD is
mostly used for an actuator, to set its state. The SUBTYPE and
SUBTYPE_CODE are used only to differentiate messages of the same
TYPE. Finally, the DATA_TYPE is the struct used to carry the actual
data of the message.
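To make the format concrete, a command message for the speech interface mentioned above could be declared roughly as follows (an illustrative sketch: the SAY subtype name and code are assumptions of ours; player_speech_cmd_t is the data structure named earlier):

message { CMD, SAY, 1, player_speech_cmd_t };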
10.4 - Creating an Interface¶
As an example, we are going to create the so-called playsound interface. This interface will send the filename of an audio file so that the robot can play that file. We assume that the client side knows the audio files on the robot's computer; no actual audio file is transferred in the message, just the audio filename.
Looking at the Player
interfaces
list, there is the
audio
interface, but it is not adequate for our purposes since it is much more
complicated than our specification. For this reason, we are going to
create the playsound
interface.
10.4.1 - Creating the Message Type¶
The playsound interface is defined below, and it is located at
<source_code>/Ch10.1
.
It has a single message and data type, which is used to send the audio
filename to be played on the robot. This file must be saved as
libplayerinterface/interfaces/067_playsound.def. The number 067 itself
does not matter, but it should be the next number after the last
interface in the directory libplayerinterface/interfaces/.
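The actual contents of 067_playsound.def ship with the book's source code. As a rough sketch of the shape such a file takes (the struct layout here is an illustrative guess of ours, while player_playsound_cmd_t and the VALUES command subtype are the names used by the driver in Chapter 11):

message { CMD, VALUES, 1, player_playsound_cmd_t };

typedef struct player_playsound_cmd
{
  /** length of the filename string */
  uint32_t filename_count;
  /** name of the audio file stored on the robot's computer */
  char *filename;
} player_playsound_cmd_t;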
Now we have to edit the libplayerinterface/CMakeLists.txt
file to
compile our new interface. The part of the CMakefile that defines the
interface files must change from
to
The next step is to create the new proxies that use the playsound interface; we are going to call them PlaySound proxies. Two versions are created: one for C (PlayerC) and one for C++ (PlayerC++).
10.4.2 - Creating the PlayerC Proxy¶
The <source_code>/Ch10.1/dev_playsound.c file must be copied to
client_libs/libplayerc/dev_playsound.c. The functions defined in
this proxy are listed below. Except for the last function, they are
mandatory for all proxies, with very similar prototypes across different
proxies.
// Create a new sound proxy
playerc_playsound_t *playerc_playsound_create(playerc_client_t *client, int index);
// Destroy a sound proxy
void playerc_playsound_destroy(playerc_playsound_t *device);
// Subscribe to the sound device
int playerc_playsound_subscribe(playerc_playsound_t *device, int access);
// Un-subscribe from the sound device
int playerc_playsound_unsubscribe(playerc_playsound_t *device);
// Process incoming data
void playerc_playsound_putmsg(playerc_playsound_t *device, player_msghdr_t *header,
player_playsound_cmd_t *data, size_t len);
// set the file to be played into the audio device
int playerc_playsound_play(playerc_playsound_t *device, char *filename);
The next step is to edit the client_libs/libplayerc/playerc.h
file
to define the playsound class. At the end of the file insert the
following code:
Edit the client_libs/libplayerc/CMakeLists.txt
file from
to
including the new proxy file for compilation.
10.4.3 - Creating the PlayerCpp Proxy¶
The <source_code>/Ch10.1/playsoundproxy.cc file must be copied to
client_libs/libplayerc++/playsoundproxy.cc. This file encapsulates
the functions defined in client_libs/libplayerc/dev_playsound.c as
a C++ class called PlaySoundProxy.

The next step is to edit the client_libs/libplayerc++/playerc++.h file
to define the playsound class. At the end of the file insert the
following code:

/** The @p PlaySoundProxy class is used to play an audio file
    located in the robot's computer. */
class PLAYERCC_EXPORT PlaySoundProxy : public ClientProxy
{
  private:
    void Subscribe(uint32_t aIndex);
    void Unsubscribe();
    /// the interface data structure
    playerc_playsound_t *mDevice;
  public:
    /// constructor
    PlaySoundProxy(PlayerClient *aPc, uint32_t aIndex=0);
    /// destructor
    ~PlaySoundProxy();
    /// the main method of the proxy, used to send the audio filename to be played
    void play(char *filename);
};

At the very end of the same file there is a block of << operator declarations, such as

PLAYERCC_EXPORT std::ostream& operator << (std::ostream& os, const PlayerCc::RFIDProxy& c);
PLAYERCC_EXPORT std::ostream& operator << (std::ostream& os, const PlayerCc::WSNProxy& c);

Just insert another declaration after them, like this one:

PLAYERCC_EXPORT std::ostream& operator << (std::ostream& os, const PlayerCc::PlaySoundProxy& c);

Edit the client_libs/libplayerc++/CMakeLists.txt
file from
to
including the new proxy file for compilation.
10.5 - Compiling the Interface¶
Go to the Player source code dir where you edited the files above. If there is no build dir, create it, then proceed with the normal Player compilation procedure.
When you change an interface definition, Player is recompiled from scratch.
10.6 - TRY IT OUT¶
An interface and proxy by themselves do not do anything except abstract the hardware; at least one driver is required to use the new interface. So the actual test of the new interface is postponed to the next chapter, where we are going to build a device driver using the playsound interface.
For now, we will create a controller using the new proxies, just to test the compilation process. It won’t produce any noticeable result. We are going to use the following files for this test.
Player CFG:
Client Controller:
Go to a terminal and launch Player:
and finally, on another terminal, compile and run the controller software.
The controller only connects to Player, prints a message, and disconnects right after. The Player terminal will show
$ player simple.cfg
Player v.3.1.1-dev
* Part of the Player/Stage/Gazebo Project [http://playerstage.sourceforge.net].
* Copyright (C) 2000 - 2013 Brian Gerkey, Richard Vaughan, Andrew Howard,
* Nate Koenig, and contributors. Released under the GNU General Public License.
* Player comes with ABSOLUTELY NO WARRANTY. This is free software, and you
* are welcome to redistribute it under certain conditions; see COPYING
* for details.
Listening on ports: 6665
accepted TCP client 0 on port 6665, fd 15
closing TCP connection to client 0 on port 6665
and on the client terminal it will show
$ ./client
playerc warning : warning : [Player v.3.1.1-dev] connected on [localhost:6665] with sock 3
bye bye Player !
11.1 - Definition of a Player Device Driver¶
A device driver, or just driver, is defined in the Player documentation as
A piece of software (usually written in C++) that talks to a robotic sensor, actuator,
or algorithm, and translates its inputs and outputs to conform to one or more interfaces.
The driver's job is to hide the specifics of any given entity by making it appear to
be the same as any other entity in its class.
The driver is vendor specific and actually accesses the device hardware. However, it uses Player interfaces and proxies to abstract the vendor-specific details from the client controller software. The Player documentation gives the example of a laser range finder: a driver is an implementation for a specific laser model of a specific vendor, such as the SICK LMS200, but the Player programmer uses a RangerProxy to access a generic laser ranger. The specific laser running on the robot is given in the cfg file.
11.2 - The Architecture of a Player Device Driver¶
The Player driver source code is located in the server/drivers dir,
with the robot drivers under the server/drivers/mixed dir.
The drivers opaque and speech are probably the simplest ones, so
they are recommended as a starting point for understanding the driver architecture.
There are also example drivers located at examples/plugins/exampledriver.
This section explains the driver architecture using an example of a new Player driver for the audio library called LibSoX. The existing audio resources in Player are quite complicated, motivating the creation of a simpler and modern audio interface for robots.
The source code and compilation scripts for the soxplayer driver are
located at <source_code>/Ch11.1/. As can be seen in
<source_code>/Ch11.1/soxplayer.h, a Player driver is an extension of
the class ThreadedDriver. This class has virtual methods that must
be defined in the new driver, such as:

virtual int ProcessMessage(QueuePointer & resp_queue, player_msghdr * hdr, void * data);
: Deals with messages coming from the clients.

virtual int MainSetup();
: Sets up the device when a client connects to the driver.

virtual void MainQuit();
: Shuts down the device when the client disconnects.

virtual void Main();
: Main function for the device thread.
It is also mandatory to have a constructor which parses the cfg file, such as:
Soxplayer(ConfigFile* cf, int section);
The driver must also have a device address attribute, declared as
player_devaddr_t sound_addr;
The most important method is ProcessMessage, where the incoming
messages are checked and processed. PLAYER_PLAYSOUND_CMD_VALUES and
player_playsound_cmd_t must match the interface used by the
driver, in this case the playsound interface defined in Chapter
10. The MatchMessage call compares the incoming
message header (hdr) with the expected interface header. If they
match, the incoming data (data) is typecast to the interface data
type (player_playsound_cmd_t) and handled accordingly.
The Play method actually deals with the LibSoX details, such as
opening and reading the audio file, applying effects, and sending it to the
OS-level audio device driver, in this case alsa.
The SoxPlayer driver is a fairly simple driver dealing with a single
interface, but a driver can be more complex, dealing with multiple
interfaces. After finishing this tutorial, it is highly recommended to
experiment with the more complex existing drivers in the server/drivers
dir.
11.3 - Compiling the Player Device Driver¶
You can compile the driver as a stand-alone library or integrate it into the Player source code distribution. Usually, you would start with the stand-alone compilation until the driver is fully tested. Once it is stable, documented, and useful for other people, you can submit it as a patch to the Player maintainers, as explained next.
11.3.1 - Driver Dependencies¶
The soxplayer driver depends on LibSoX, so now is the time to install this library with the following command:
> sudo apt-get install -y libsox-dev
11.3.2 - Compiling the Player Device Driver as a Stand-Alone Driver¶
Initially, while you are still developing and testing your new driver,
it is better to compile it separately from Player. This section shows
you how. The environment variables CMAKE_MODULE_PATH and
PKG_CONFIG_PATH are very important in this step; please check
Section 1.1 - A Note on Installing Player/Stage to see
how to set these variables. The details of CMake and PKGConfig are beyond
the scope of this manual, but they are highly recommended subjects of study
if you plan to be a serious Player/Stage developer or Linux software
developer.
> cd <source_code>/Ch11.1/
> mkdir -p build
> cd build/
> cmake ../
> make
The driver library will be created in the current dir
(Ch11.1/build). For convenience, there is a compile.sh script
that does the same thing with a single command. Alternatively, there is
the Ch11.1/Makefile, which uses PKGConfig definitions instead of
CMake. Both scripts can be used as templates to compile other drivers.
11.3.3 - Compiling the Player Device Driver as Part of the Player Distribution¶
If you really think that your driver can be useful for other people, it is highly recommended to include it in the Player source code distribution. Here we are assuming the git version control system is used for Player.
Download a new, clean, and updated copy of the Player source code from github.com/playerproject/player and follow these steps to add the soxplayer source code and scripts. For this step, you cannot reuse an existing local copy because it might contain your local changes.
> git clone https://github.com/playerproject/player.git
> cd player/server/drivers
> mkdir sox
> cd sox
> cp <source_code>/Ch11.1/soxplayer.cc .
> cp <source_code>/Ch11.1/soxplayer.h .
> cp <source_code>/Ch11.2/soxplayer.cfg .
> cp <source_code>/Ch11.2/soxplayer-test.cc .
> cp <source_code>/Ch11.3/CMakeLists.txt .
Change the player/server/drivers/CMakeLists.txt
file from
ADD_SUBDIRECTORY (sonar)
ADD_SUBDIRECTORY (speech)
to
ADD_SUBDIRECTORY (sonar)
ADD_SUBDIRECTORY (sox)
ADD_SUBDIRECTORY (speech)
Now it is time to recompile Player with its new driver:
> cd player/
> mkdir build
> cd build
> ccmake ..
Check that the soxplayer option is set to ON.
> cmake ..
> make -j 8
If everything is OK, then create a patch file and submit it to https://github.com/playerproject/player
> cd player/
> git diff --cached > soxplayer_driver.patch
TODO: this step currently gives the following error.
CMakeFiles/playerdrivers.dir/__/drivers/sox/soxplayer.o: In function `player_driver_init':
soxplayer.cc:(.text+0x7d): multiple definition of `player_driver_init'
CMakeFiles/playerdrivers.dir/__/drivers/position/snd/snd.o:snd.cc:(.text+0x82): first defined here
collect2: error: ld returned 1 exit status
make[2]: *** [server/libplayerdrivers/libplayerdrivers.so.3.1.1-dev] Error 1
make[2]: Leaving directory `/home/osboxes/repos/player-new/build'
make[1]: *** [server/libplayerdrivers/CMakeFiles/playerdrivers.dir/all] Error 2
make[1]: Leaving directory `/home/osboxes/repos/player-new/build'
11.4 - TRY IT OUT¶
This test will fully exercise not only the soxplayer driver, but also the
playsound interface developed in Chapter 10. The
example is located at <source_code>/Ch11.2.
The CFG file defines the soxplayer driver, without any parameter:
It assumes that the driver library is located in the same directory or its path is defined in the LD_LIBRARY_PATH environment variable.
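As a sketch (the actual file ships in <source_code>/Ch11.2; the plugin name here is an assumption of ours), a minimal CFG for a plugin driver follows the same driver( ... ) syntax used throughout this manual:

driver
(
  name "soxplayer"
  plugin "soxplayer"
  provides ["playsound:0"]
)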
C++ client controller:
It creates the PlaySoundProxy and executes
sound.play(argv[1]), playing the filename passed as an argument.
Launch Player:
and finally, on another terminal, compile and run the controller software.
It will play the R2D2 sound file and the client side terminal will show the following message.
$ ./soxplayer-test R2D2.mp3
playerc warning : warning : [Player v.3.1.1-dev] connected on [localhost:6665] with sock 3
closing Player client
Papers¶
If you are using Donnie and/or its software on your research projects, please cite our papers:
@inproceedings{oliveira2017teaching,
title={Teaching Robot Programming Activities for Visually Impaired Students: A Systematic Review},
author={Oliveira, Juliana Damasio and de Borba Campos, M{\'a}rcia and de Morais Amory, Alexandre and Manssour, Isabel Harb},
booktitle={International Conference on Universal Access in Human-Computer Interaction},
pages={155--167},
year={2017},
organization={Springer}
}
@inproceedings{guilherme2017donnie,
title={Donnie Robot: Towards an Accessible And Educational Robot for Visually Impaired People},
author={Marques, Guilherme H. M. and Einloft, Daniel C. and Bergamin, Augusto C. P. and Marek, Joice A. and Maidana, Renan G. and Campos, Marcia B. and Manssour, Isabel H. and Amory, Alexandre M.},
booktitle={Latin American Robotics Symposium (LARS)},
year={2017}
}
Disclaimer¶
Donnie and its software are protected under the MIT License:
Copyright 2018, Laboratório de Sistemas Autônomos
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in the
Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
and to permit persons to whom the Software is furnished to do so, subject to the
following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.