Donnie Assistive Robot: Developer Manual

About Donnie

Note: always use an alt attribute to describe images for visually impaired readers.

Warning

This document is for developers only. If you want to use and test Donnie, please refer to the Donnie Assistive Robot: User Manual.

Donnie is an assistive technology project whose objective is to use robotics to facilitate the teaching of programming to visually impaired students. It is divided into two main parts:

  • The construction and fine-tuning of the titular mobile robot, Donnie;
  • The project’s software stack, including an intuitive parser/interpreter and a robot simulation environment;

The project is in its second version, developed in the Laboratório de Sistemas Autônomos (LSA) of the Pontifical Catholic University of Rio Grande do Sul (PUCRS), Brazil.

Getting Started

Before going into the tutorials, follow the instructions below to prepare your environment.

About Donnie

Robotics has been used to teach young students the basics of programming. However, most programming environments for kids are highly visual, based on dragging and dropping blocks. Blind students, or students with a visual disability, cannot use these teaching resources.

The Donnie project proposes an inclusive robotic programming environment that all students, with or without visual disabilities, can use.

Donnie comes with two usage options: the simulated robot and the physical robot. It is recommended to start with simulation, since it does not require building the robot. Moreover, the physical robot is functional, but still under test.

Features
  • Robot programming environment for young students with or without visual impairment;
  • Assistive programming language called GoDonnie. GoDonnie is TTS and screen reader friendly;
  • Integration with an Arduino-based robot through the Player robotics middleware;
  • Extension of the Stage simulator to generate sound cues while the robot is moving;
  • Software developed for the simulated robot is compatible with the real Donnie robot;

The simulation is recommended if you want to get to know Donnie but don’t have the resources to build your own Donnie robot.

How to Install Donnie’s Software

Operating System Requirement
Compile and Install Donnie Software on a Desktop Computer

Open a terminal, and execute the following commands:

mkdir ~/donnie; cd ~/donnie
git clone --recurse-submodules -b devel https://github.com/lsa-pucrs/donnie-assistive-robot-sw.git
cd donnie-assistive-robot-sw
chmod +x ./install.sh
export DONNIE_PATH=/opt/donnie
./install.sh

If the installation finished successfully after the last command above, you are ready to go! Note: the last command generally takes a long time to finish.
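To sanity-check the result, list the install prefix (the default DONNIE_PATH exported above); it should contain the installed files, including the GoDonnie examples used later in this manual:

ls $DONNIE_PATH
ls $DONNIE_PATH/test/GoDonnie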

Initializing the environment

With Donnie’s environment installed on your computer, open a new terminal (Ctrl + Alt + T) and run the command:

donnie_player

Wait a few seconds for the environment to boot, and then run GoDonnie. There are two modes of execution. Terminal mode: code is entered at the GoDonnie terminal and executed by pressing the ESC key.

GoDonnie -t

File mode: allows you to run GoDonnie files (extension .gd or .txt)

GoDonnie -f <filename>

Some example GoDonnie files are in the following directory:

/opt/donnie/test/GoDonnie/

Note: To execute a file that is in another directory, you must give the path of the directory where it is located. For example, the file test.gd is in the /opt/donnie/test/GoDonnie/ directory; to run it, use the GoDonnie command as follows:

  • GoDonnie -f /opt/donnie/test/GoDonnie/test.gd

Or go to the directory the file is in, before executing:

  • cd /opt/donnie/test/GoDonnie/
  • GoDonnie -f test.gd
Configuring Donnie

The installation script performs a standard installation that we believe is the most appropriate for the average user. However, advanced parameters can be set if the user has experience with the appropriate tools.

The build system is based on CMake, so experience with Linux, make, and CMake is required. All the individual parts of Donnie’s software stack are also based on CMake, and each of these packages can be customized with its own set of parameters.

Developers interested in customization might want to read the following files:

Parameters for Donnie’s Software

The following list explains Donnie’s main compilation parameters:

BUILD_DOCS           OFF              Generate Donnie's documents.
BUILD_DOXYGEN        ON               Doxygen documentation in HTML, meant only for developers.
BUILD_DOXYGEN_PDF    OFF              The same document as above, but in PDF.
BUILD_EXAMPLES       OFF              Build the examples for each part of Donnie.
BUILD_MANUAL         OFF              Build the manuals: software manual, hardware manual, user manual.
CMAKE_BUILD_TYPE     Release | Debug  Debug mode is for developers only!
DOC_LANGUAGE         en | pt-br | es  The language used to build documents and the GoDonnie interpreter. Future work!
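For example, these parameters are passed to cmake with -D flags. The invocation below is a hypothetical sketch assuming an out-of-source CMake build directory; check install.sh for the layout it actually uses:

cd ~/donnie/donnie-assistive-robot-sw
mkdir -p build && cd build
cmake -DBUILD_DOCS=ON -DBUILD_EXAMPLES=ON -DCMAKE_BUILD_TYPE=Debug ..
make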

How to Build Your Own Donnie Robot

To build your own Donnie robot, please refer to the following repositories (github login and read access required):

Install Donnie’s Software on an Embedded Computer (Raspberry Pi)

Once the electronics and 3D printing are done, the operating system and Donnie’s software must be installed on the robot’s Raspberry Pi.

Loading the Donnie Image into the Pi’s SD Card

The easiest way to set up the embedded computer is to use the pre-built image (coming soon!). Please follow these steps to burn the SD card:

wget xxxxxxxx.img
continue ...
Compile, Configure the OS, and Install the Donnie Image into the SD Card

This option is for programmers experienced with the Raspberry Pi.

Raspbian 8.0 (Jessie) is the recommended OS distribution for the robot. Log onto Donnie’s embedded computer, open a terminal, and run the following to download and execute the software installation script:

mkdir ~/donnie; cd ~/donnie
wget https://github.com/lsa-pucrs/donnie-assistive-robot-sw/raw/devel/install-rpi.sh
chmod +x ./install-rpi.sh
./install-rpi.sh

Experienced programmers can configure, on the Raspberry Pi, the same parameters presented here.

Software Description

Donnie Programming Environment

Introduction

The Donnie Programming Environment is guided by a programming language called GoDonnie. This language was created with the purpose of being easy to use and less visual, encouraging people with visual impairment to pursue a career in programming and technology. The GoDonnie language commands a robot called Donnie, which can be programmed to describe the environment around it, thus helping a person with visual impairment to better understand the place where they are.

GoDonnie Programming Language

GoDonnie is a programming language that commands a robot called Donnie. This robot works in its own environment. The GoDonnie User Manual can be found in the Donnie User Manual.

GoDonnie Interpreter

The file GoDonnie.cpp is GoDonnie’s main file.
  1. It parses GoDonnie’s command-line arguments; type GoDonnie -h for the full list
  2. If GoDonnie is run in terminal mode, it gets the terminal’s commands using readline library
  3. If GoDonnie is run in batch mode, it reads the entire GoDonnie file
  4. Regardless of the mode (terminal or batch), it calls the parser through the method ‘Compiler.parseGD’
The Compiler.cpp file implements the parser of the GoDonnie programming language:
  1. The ‘parseGD’ method creates the lexer and parser tree for the incoming string in the GoDonnie programming language format
  2. This file includes the headers “GoDonnieLexer.h” and “GoDonnieParser.h”, which are automatically generated by ANTLR based on the file GoDonnie.g
  3. If the parsing is successful, then it runs its commands with the method ‘run’
  4. The ‘run’ method executes each command in the parse tree. This method is a big switch used to select the current token to be executed.
  5. Most of GoDonnie’s instructions call a command in the Donnie attribute of the ExprTreeEvaluator class. This attribute is an instance of the DonnieClient class, which implements all the commands GoDonnie can execute. DonnieClient provides the interface between the parser and Player.
The GoDonnie.g file implements the rules of the GoDonnie language
  1. The initial part of this file, before the rule ‘start_rule’, is a resource to change ANTLR’s default error messages so that they are more user-friendly
  2. After the rule ‘start_rule’ comes the language definition itself. All tokens and grammar rules are defined here.

The DonnieClient.cpp file implements the interface with the Player middleware. Its methods implement all the actions that Donnie can execute via Player. The list of commands includes:

  1. moveForward, moveBackward, GetPos, Scan, GetRange, Speak, Color, Goto, among others

Exception, Historic, and DonnieMemory are auxiliary files with secondary functions.

Player Robotic Middleware

Introduction

Briefly explain Player, cite the paper, and show a ready-made example.
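As a starting point for that example, the sketch below shows the general shape of a Player C++ client built with libplayerc++. The host, port, and proxy index are assumptions and must match your .cfg file; it typically compiles with g++ client.cpp -o client `pkg-config --cflags --libs playerc++`.

#include <iostream>
#include <libplayerc++/playerc++.h>

int main()
{
    using namespace PlayerCc;

    // Connect to the Player server (6665 is Player's default port;
    // host and port must match the running server).
    PlayerClient robot("localhost", 6665);
    Position2dProxy pp(&robot, 0);   // index 0 must match the .cfg file

    for (int i = 0; i < 50; ++i)
    {
        robot.Read();                // block until fresh data arrives
        std::cout << "pose: " << pp.GetXPos() << ", "
                  << pp.GetYPos() << std::endl;
        pp.SetSpeed(0.2, 0.0);       // 0.2 m/s forward, no rotation
    }

    pp.SetSpeed(0.0, 0.0);           // stop the robot before exiting
    return 0;
}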

Software Organization

Explain what a driver, an interface, a client, etc. are.

Explain the Cfg File

Write a tutorial using Donnie’s .cfg files as examples.

Build a Cfg for Multiple Robots

Write a step-by-step tutorial.

Stage Multi Robot Simulator

Introduction

Player/Stage is a robot simulation tool. It comprises one program, Player, which is a Hardware Abstraction Layer. That means that it talks to the bits of hardware on the robot (like a claw or a camera) and lets you control them with your code, meaning you don’t need to worry about how the various parts of the robot work. Stage is a plugin to Player which listens to what Player is telling it to do and turns these instructions into a simulation of your robot. It also simulates sensor data and sends this to Player, which in turn makes the sensor data available to your code.

A simulation then, is composed of three parts:

  • Your code. This talks to Player.
  • Player. This takes your code and sends instructions to a robot. From the robot it gets sensor data and sends it to your code.
  • Stage. Stage interfaces with Player in the same way as a robot’s hardware would. It receives instructions from Player and moves a simulated robot in a simulated world, it gets sensor data from the robot in the simulation and sends this to Player.

In Player/Stage there are three kinds of files that you need to understand to get going with Player/Stage:

  • a .world file
  • a .cfg (configuration) file
  • a .inc (include) file

The .world file tells Player/Stage what things are available to put in the world. In this file you describe your robot, any items which populate the world, and the layout of the world. The .inc file follows the same syntax and format as a .world file, but it can be included. So if there is an object in your world that you might want to use in other worlds, such as a model of a robot, putting the robot description in a .inc file makes it easier to copy over; it also means that if you ever want to change your robot description, you only need to do it in one place and all the simulations that include it are changed too.

The .cfg file is what Player reads to get all the information about the robot that you are going to use. This file tells Player which drivers it needs to use in order to interact with the robot. If you’re using a real robot, these drivers are built into Player (or you can download or write your own drivers, but that is not covered here). Alternatively, if you want to make a simulation, the driver is always Stage (this is how Player uses Stage in the same way it uses a robot: it thinks that Stage is a hardware driver and communicates with it as such). The .cfg file tells Player how to talk to the driver, and how to interpret any data from the driver so that it can be presented to your code. Items described in the .world file should be described in the .cfg file if you want your code to be able to interact with that item (such as a robot); if you don’t need your code to interact with the item, this isn’t necessary. The .cfg file does all this specification using interfaces and drivers.
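For instance, a .cfg usually exposes a simulated robot with a driver block like the sketch below (the model name "donnie" and the interface list are assumptions; they must match a model defined in your worldfile). It would follow the simulation block shown in the next section:

driver
(
   name "stage"
   provides ["position2d:0" "ranger:0"]
   model "donnie"
)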

How to Create a New Environment

Building an Empty World

To start building an empty world we need a .cfg file. First create a document called empty.cfg (i.e. open in your favorite text editor) and copy the following code into it:

driver
(
   name "stage"
   plugin "stageplugin"

   provides ["simulation:0" ]

   # load the named file into the simulator
   worldfile "empty.world"
)

Basically what is happening here is that your configuration file is telling Player that there is a driver called stage in the stageplugin library, and this will give Player data which conforms to the simulation interface. To build the simulation Player needs to look in the worldfile called empty.world which is stored in the same folder as this .cfg. If it was stored elsewhere you would have to include a filepath, for example ./worlds/empty.world. Lines that begin with the hash symbol (#) are comments. When you build a simulation, any simulation, in Stage the above chunk of code should always be the first thing the configuration file says. Obviously the name of the worldfile should be changed depending on what you called it though.

Now a basic configuration file has been written, it is time to tell Player/Stage what to put into this simulation. This is done in the .world file.

Models

A worldfile is basically just a list of models that describes all the stuff in the simulation. This includes the basic environment, robots and other objects. The basic type of model is called “model”, and you define a model using the following syntax:

define model_name model
(
     # parameters
)

This tells Player/Stage that you are defining a model which you have called model_name, and all the stuff in the round brackets are parameters of the model. To begin to understand Player/Stage model parameters, let’s look at the map.inc file that comes with Stage, this contains the floorplan model, which is used to describe the basic environment of the simulation (i.e. walls the robots can bump into):

define floorplan model
(
# sombre, sensible, artistic
color "gray30"

# most maps will need a bounding box
boundary 1

gui_nose 0
gui_grid 0
gui_move 0
gui_outline 0
gripper_return 0
fiducial_return 0
ranger_return 1
)

We can see from the first line that they are defining a model called floorplan.

  • color: Tells Player/Stage what colour to render this model, in this case it is going to be a shade of grey.
  • boundary: Whether or not there is a bounding box around the model. This is an example of a binary parameter: if the number next to it is 0 then it is false; if it is 1 or over, then it’s true. So here we DO have a bounding box around our “map” model, so the robot can’t wander out of our map.
  • gui_nose: this tells Player/Stage that it should indicate which way the model is facing.
  • gui_grid: this will superimpose a grid over the model.
  • gui_move: this indicates whether it should be possible to drag and drop the model. Here it is 0, so you cannot move the map model once Player/Stage has been run.
  • gui_outline: indicates whether or not the model should be outlined. This makes no difference to a map, but it can be useful when making models of items within the world.
  • fiducial_return: any parameter of the form some_sensor_return describes how that kind of sensor should react to the model.
  • ranger_return: Setting ranger_return to a negative number indicates that the model cannot be seen by ranger sensors. Setting ranger_return to a number between 0 and 1 (inclusive) sets the intensity of the return. (Note: ranger_return 0 still allows a ranger sensor to see the object; the range will get set, it’ll just set the intensity of that return to zero.)
  • gripper_return: Like fiducial_return, gripper_return tells Player/Stage that your model can be detected by the relevant sensor, i.e. it can be gripped by a gripper. Here gripper_return is set to 0 so the map cannot be gripped by a gripper.

To make use of the map.inc file we put the following code into our world file:

include "map.inc"

This inserts the map.inc file into our world file at the place where the include line is. This assumes that your worldfile and map.inc file are in the same folder; if they are not, you’ll need to include the filepath in the quotes. Once this is done, we can modify our definition of the map model to be used in the simulation. For example:

floorplan
(
   bitmap "bitmaps/helloworld.png"
   size [12 5 1]
)

What this means is that we are using the model “floorplan”, and making some extra definitions; both “bitmap” and “size” are parameters of a Player/Stage model. Here we are telling Player/Stage that we defined a bunch of parameters for a type of model called “floorplan” (contained in map.inc) and now we’re using this “floorplan” model definition and adding a few extra parameters.

  • bitmap: this is the filepath to a bitmap, which can be of type bmp, jpeg, gif, or png. Black areas in the bitmap tell the model what shape to be; non-black areas are not rendered. This is illustrated in Figure 3.4. In the map.inc file we told the map that its “color” would be grey. This parameter does not affect how the bitmap is read: Player/Stage always looks for black in the bitmap, and the color parameter just alters what colour the map is rendered in the simulation.
  • size: This is the size, in metres, of the model (here, the whole map). All sizes you give in the world file are in metres, and they represent the actual size of things. If you have a 3m x 4m robot testing arena that is 2m high and you want to simulate it, then the size is [3 4 2]. The first number is the size in the x dimension, the second is the y dimension, and the third is the z dimension.
Describing the Player/Stage Window

The worldfile can also be used to describe the simulation window that Player/Stage creates. Player/Stage will automatically make a window for the simulation if you don’t put any window details in the worldfile; however, it is often useful to put this information in anyway, to prevent a large simulation from being too big for the window, or to increase or decrease the size of the simulation.

Like a model, a window is an inbuilt, high-level entity with lots of parameters. Unlike models though, there can be only one window in a simulation and only a few of its parameters are really needed. The simulation window is described with the following syntax:

window
(
   # parameters...
)

The two most important parameters for the window are size and scale.

  • size: This is the size the simulation window will be in pixels. You need to define both the width and height of the window using the following syntax: size [width height].
  • scale: This is how many pixels of the window are used to show each metre of the simulated environment, so the bigger this number is, the bigger the simulation appears. A good starting value is window_size/floorplan_size, rounded downwards so the simulation is a little smaller than the window it’s in; some degree of trial and error is needed to get this right. For example, with the 700-pixel window and 15 m floorplan used below, 700/15 ≈ 46, and the example settles on 41.

We have already discussed the basics of worldfile building: models and the window. Finally, we are able to write a worldfile!

include "map.inc"

# configure the GUI window
window
(
   size [700.000 700.000]
   scale 41
)

# load an environment bitmap
floorplan
(
   bitmap "bitmaps/cave.png"
   size [15 15 0.5]
)

If we save the above code as empty.world (correcting any filepaths if necessary), we can run its corresponding empty.cfg file:

cd <source_code>/worlds
player empty.cfg &

Running the empty.cfg file you should see the following simulation:

_images/simpleworld.png

To modify your simulation’s scenario, just create a drawing in black in an image editor of your preference and save it in one of the supported formats. After that, put the name of the file in the bitmap parameter inside your .world file. Save the image in the bitmaps folder; if you prefer to save the image in another folder, you’ll have to specify the path to the image in the .world file.

How to Create an Environment with Multiple Robots

If you want to create an environment with multiple robots, you can learn how to do it in the Simulating Multiple Robots page.
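As a minimal sketch of the idea (assuming a robot model named donnie is defined in an included .inc file), the worldfile instantiates the model once per robot, and the .cfg exposes each instance on its own port:

# in the .world file
donnie ( name "donnie0" pose [ -2 0 0 0 ] )
donnie ( name "donnie1" pose [ 2 0 0 90 ] )

# in the .cfg file: one driver block per robot, each on its own port
driver
(
   name "stage"
   provides ["6665:position2d:0" "6665:ranger:0"]
   model "donnie0"
)
driver
(
   name "stage"
   provides ["6666:position2d:0" "6666:ranger:0"]
   model "donnie1"
)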

Hardware Description

Building Your Donnie Robot

Introduction

This manual has all the files required to understand how to build the Donnie robot. It explains how to print Donnie’s body with a 3D printer and how to manufacture the necessary boards. It also describes the operation of the firmware and teaches you how to assemble the parts.

Required Material

Donnie requires about 500 g of PLA. We use PLA because of its low shrinkage factor in large pieces.

Production Phase

  1. The 3D printer requires the STL files, found in the stl_files folder.
  2. We use Slic3r (1.2.9) to slice the models for 3D printing, with the following settings:
  • Infill: 20%;
  • Layer height: 0.3 mm;
  • No support (parts that need support have it built into the model).

Modifying Donnie’s Body

We used SolidWorks 2014 to model the robot. All the source files are in the solidworks directory.

Visualization

You can visualize the 3D PDF files with Adobe Reader 9 or above. You just need to click “Enable 3D View” when opening the 3D PDF.

Meet Donnie !!!

Assembling the Arduino Part

Donnie’s PCB

The repository has all files related to Donnie’s hardware (PCB design, schematics, electrical diagrams, gerber files, BOM files). Donnie has two daughter boards (or ‘shields’): one for the Arduino Mega and the other for the Raspberry Pi.

The following image shows Donnie’s brain and its electronics.
Meet Donnie Brain!!!

Manufacturing the boards

Send the Gerber for manufacturing

If you just want to manufacture these boards as they are, we recommend the following steps:

  1. Send the Gerber ZIP files (arduino-shield and raspberrypi-shield) to Seeedstudio for manufacturing, following the Fusion PCB Order Submission Guidelines tutorial.
Arduino Shield
_images/ArduinoShield.jpg
Raspberry Pi Shield
_images/RaspShield.jpg
Assembly

After you receive the PCBs, follow these steps to assemble the boards:

  1. First of all, select and buy the components indicated in the BOM files (arduino-shield and raspberrypi-shield);
  2. Print the PDF schematic and the BOM file;
  3. Place and solder the components on the PCB as indicated in the BOM’s PART column.
Change the PCB Design

If you want to change the PCB design, we recommend using Eagle version XYZ.

Setting Up the Raspberry Pi

Installing the OS

  • download the OS; write down the reasons for choosing this OS distribution
  • how to burn the SD card
  • how to do the partitioning
  • how to resize the image

Setting Up the OS

  • which basic packages to install
  • how to set up the wireless network
  • main dependencies to install
  • how to set up automatic login
  • how to enable the RPi pins and protocols (I2C, GPIO, PWM, SPI, etc.)

Installing Donnie

How to install the driver and its dependencies
  • where/how to download it
  • how to configure it
  • how to install its dependencies
  • how to install software dependencies and additional required nodes
  • provide a script to install it all at once
Known limitations

Describe here any known limitation of the software, so that the next student is aware of it.

How to test it
  • basic testing to see if the procedure is working on the RPi

Hooking Up Peripherals to the Raspberry Pi

This section shows how to add the following peripherals to the RPi board

Installing the Raspicam to the Raspberry Pi

Warning

@ To be done by Renan

About the sensor
  • where to buy, how much
  • link to datasheet of the models available at LSA
How to physically connect it to the RPi
  • describe power requirements
  • bill of materials if required (ftdi, cables, etc)
  • show fritzing schematics to connect the sensors, power, other boards, etc
How to install the driver and its dependencies
  • how to install the required software dependencies and drivers
Known limitations of the sensor

Describe here any known limitation of the sensor or its drivers, so that the next student is aware of it.

How to test it
  • basic testing to see if the sensor is working on the RPi

Video Streaming Tutorials

Video Streaming with RaspberryPi Using VLC

Tip

In this tutorial, you will:
  • Learn how to configure your Raspberry Pi for video streaming
  • Know the commands needed for simple video streaming through the VLC media tool

Tip

This demonstration was tested on:
  • VLC 2.2.4 on a Windows 8.1 64-bit Computer
  • 2017/1/11 Raspbian Jessie on an RPi 2 Model B V1.1 using Pi Camera rev 1.3
  • Note: Pi Camera V2.1 was also tested successfully

This tutorial will introduce you to the Raspberry Pi Camera Module, showing how to view a video stream from your Pi setup (the server, running Raspbian) on a different computer (a client running Windows) in your home network.

_images/raspberrypi2.jpg
Configuring your RaspberryPi

Firstly, on your Pi’s terminal, update and upgrade the environment so it is up to date; this helps reduce future problems. Don’t forget to ENABLE your Raspberry Pi camera using ‘raspi-config’.

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo raspi-config

A blue BIOS-like screen will appear; go into the Enable Camera option and enable the camera.

_images/Blue1.png _images/Blue2.png

Note

Depending on your version of Raspbian, the Enable setting may not first appear on the main list. You will have to go under the settings in the blue screen to locate the enable option.

It is also advisable now to find out the IP address of your Pi. Type in the following to locate the IP, as you will need it in the VLC program on your Windows machine.

$ ifconfig
If you are using a wireless connection,
the IP you want is located in the wlan0 section under inet addr:x.x.x.x
If you are using ethernet,
it will be under eth0 in inet addr:x.x.x.x

Getting VLC

On your client PC that is running Windows, download the VLC media tool from the VLC website.

Now on your Pi’s terminal, download and install the VLC for Raspbian.

$ sudo apt-get install vlc

Note

Make sure that your Pi is up to date and now has VLC, and that your PC has VLC installed, before going to the next step.

Initiating the Stream

Once installed, you may now start the video streaming by typing the following in your Pi’s terminal.

$ raspivid -o - -t 0 -hf -w 800 -h 400 -fps 24 |cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8160}' :demux=h264
  • -o specifies the output filename; the ‘-’ denotes writing to stdout instead of a file
  • -t is the duration of the recording, 0 being infinity
  • -hf is horizontal flip
  • -w and -h set the resolution (width and height)
  • -fps is the number of frames per second
  • The rest tells VLC to read the H.264 stream from stdin and serve it over HTTP on port 8160

Once entered, the Pi camera will turn on and start recording, simultaneously sending the video over HTTP. Now is the time to go to your Windows machine and watch your streaming footage.

Note

You may want to experiment and change settings like -w, -h, and -fps.

Open the VLC program on your Windows Machine.

And under Media > Open Network Stream > Network > Please enter a network URL:

Type in the IP address that you got from ifconfig, like so:

_images/vlc.png
http://x.x.x.x:8160

As we specified the port to be 8160 in our terminal on the Pi.

Once entered, VLC will automatically start playing the stream from the Pi over your network.

Conclusion

Note

As you can see from the stream, the video quality is not groundbreaking, but it is acceptable; the latency is the biggest issue of this streaming method.

Video Demonstration

Note

The monitor on the left displays real-time video directly from the Raspberry Pi, whereas the laptop displays the VLC stream.

Raspberry Pi camera module streaming video to another computer. This video tutorial shows the overview of this written tutorial.


Video Streaming with RaspberryPi Using gStreamer

Tip

In this tutorial, you will:
  • Learn how to configure your Raspberry Pi for video streaming through the gStreamer Method
  • Know the commands needed for simple video streaming through gStreamer

Note

This demonstration uses a Linux based environment (Ubuntu) as the client side, NOT a Windows PC like the other methods.

Tip

This demonstration was tested on:
  • Google Chrome Version 56.0.2924.87 on Ubuntu 14.04 64-bit
  • 2017/1/11 Raspbian Jessie on an RPi 2 Model B V1.1 using Pi Camera rev 1.3
  • Note: Pi Camera V2.1 was also tested successfully

This tutorial will introduce you to the Raspberry Pi Camera Module, showing how to view a video stream from your Pi setup (the server, running Raspbian) on a different computer (a client running Ubuntu) in your home network.

_images/raspberrypi2.jpg
Configuring your RaspberryPi

Firstly, on your Pi’s terminal, update and upgrade the environment so it is up to date; this helps reduce future problems. Don’t forget to ENABLE your Raspberry Pi camera using ‘raspi-config’.

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo raspi-config

A blue BIOS-like screen will appear; go into the Enable Camera option and enable the camera.

_images/Blue1.png _images/Blue2.png

Note

Depending on your version of Raspbian, the Enable setting may not first appear on the main list. You will have to go under the settings in the blue screen to locate the enable option.

It is also advisable now to find out the IP address of your Pi. Type in the following to locate the IP, as you will need it in the gStreamer command on your client machine.

$ ifconfig
If you are using a wireless connection,
the IP you want is located in the wlan0 section under inet addr:x.x.x.x
If you are using ethernet,
it will be under eth0 in inet addr:x.x.x.x

Getting gStreamer

Now we will get into the main focus of this tutorial, gStreamer. gStreamer is a multimedia tool that connects a sequence of elements through a pipeline.

We will now get gStreamer for both the Pi and your Ubuntu machine:

$ sudo add-apt-repository ppa:gstreamer-developers/ppa
$ sudo apt-get update
$ sudo apt-get install gstreamer1.0*
Initiating the Video Stream

After the installation, to begin the video stream, we can type in the Pi:

$ raspivid -fps 26 -h 450 -w 600 -vf -n -t 0 -b 200000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=x.x.x.x port=5000

Note
  • You can remove -n to show a preview on your Pi; -n disables the preview
  • -b is the bitrate

Please note that the host here must be changed to YOUR host IP from the ifconfig above. That will initiate the stream from the Pi side.

On your Linux client, also install gStreamer, and then type in the terminal:

$ gst-launch-0.10 -v tcpclientsrc host=x.x.x.x port=5000 ! gdpdepay ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink sync=false

Please note that the host here must be changed to YOUR host IP from the ifconfig above. Now you will see the stream from the Pi server.

Note

As you can see, the quality and latency are groundbreaking this time, compared to the VLC and mjpgStreamer methods.

Video Demonstration

Note

The monitor on the left displays real-time video directly from the Raspberry Pi, whereas the laptop displays the gStreamer stream.

Wirelessly streaming a video from a Raspberry to a remote laptop. This video tutorial shows the overview of this written tutorial.

Video Streaming with RaspberryPi Using mjpgStreamer

Tip

In this tutorial, you will:
  • Learn how to configure your Raspberry Pi for video streaming through the mjpgStreamer Method
  • Know the commands needed for simple video streaming through mjpgStreamer
  • Acquire the dependencies needed for mjpgStreamer

Tip

This demonstration was tested on:
  • Google Chrome Version 56.0.2924.87 on a Windows 8.1 64-bit Computer
  • 2017/1/11 Raspbian Jessie on an RPi 2 Model B V1.1 using Pi Camera rev 1.3
  • Note: Pi Camera V2.1 was also tested successfully

This tutorial will introduce you to the Raspberry Pi Camera Module, showing how to view a video stream from your Pi setup (the server, running Raspbian) on a different computer (a client running Windows) in your home network.

_images/raspberrypi2.jpg
Configuring your RaspberryPi

Firstly, on your Pi’s terminal, update and upgrade the environment so it is up to date; this helps reduce future problems. Don’t forget to ENABLE your Raspberry Pi camera using ‘raspi-config’.

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo raspi-config

A blue BIOS-like screen will appear; go into the Enable Camera option and enable the camera.

_images/Blue1.png _images/Blue2.png

Note

Depending on your version of Raspbian, the Enable setting may not first appear on the main list. You will have to go under the settings in the blue screen to locate the enable option.

It is also advisable now to find out the IP address of your Pi. Type in the following to locate the IP, as you will need it in the browser on your Windows machine.

$ ifconfig
If you are using a wireless connection,
the IP you want is located in the wlan0 section under inet addr:x.x.x.x
If you are using ethernet,
it will be under eth0 in inet addr:x.x.x.x

Getting mjpgStreamer

We will now install mjpgStreamer on our Pi, the main focus of this method. To do this, go to the mjpgStreamer website, which will automatically start the download.

We will need to decompress the file; this will create the program’s directory:

$ tar -zxvf mjpg-streamer.tar.gz

Press Enter, and you should see a new directory called mjpg-streamer

Note

You can check for directories in the terminal by typing in ls

Getting mjpgStreamer’s Dependencies

Now we need mjpgStreamer’s dependencies to make it fully functional.

$ sudo apt-get install libjpeg8-dev
$ sudo apt-get install imagemagick

After this is done, go into the mjpg-streamer directory inside the already existing mjpg-streamer directory (yes, type it twice), and then type make, which will build and compile the system:

$ cd mjpg-streamer
$ cd mjpg-streamer
$ make

In order to start the capture, we must make a temporary directory to hold the image taken by raspistill, which will then be updated many times every second. So, in ~/mjpg-streamer/mjpg-streamer $, type:

$ mkdir /tmp/stream

We can now initiate the stream by typing in

$ LD_LIBRARY_PATH=./ ./mjpg_streamer -i "input_file.so -f /tmp/stream -n pic.jpg" -o "output_http.so -w ./www"

Open a new terminal window and type

$ raspistill -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 1 -t 9999999 -th 0:0:0
  • -w and -h set the resolution
  • -q is the quality
  • -o is the specified output filename
  • -tl is the time interval between snapshots (here, 1 millisecond)
  • -t is the camera’s ON time in milliseconds (9999999 ms is roughly 2.8 hours)
  • -th sets the thumbnail parameters (x:y:quality)

Now, on your client computer, open your preferred browser and type in your IP and port number, which will be 8080 by default.

x.x.x.x:8080

A web page will be displayed, showing the mjpgStreamer demo page and a congratulations message. Go to the stream section in the menu to see the live footage from your Pi.

Note

As you can see from the stream, the video quality is not groundbreaking but is acceptable, although a little worse than with the VLC method; the latency, however, is much better than with VLC.

Video Demonstration

Note

The monitor on the left displays real-time video directly from the Raspberry Pi, whereas the laptop displays the mjpgStreamer stream.

Raspberry Pi Camera Stream Web Video. This video tutorial shows the overview of this written tutorial.


The Raspberry Pi camera module can be used to take high-definition video, as well as still photographs. These tutorials introduced the Raspberry Pi Camera Module, showing how to view a video stream from your Pi setup and how to start video streaming through several tools.

Thorough Tests for the Board

  • describe here how one can test the features of the board
Possible Faults
  • describe here usual faults and how to solve them
  • describe where to buy replacement parts

Arduino

Arduino Firmware

Firmware Overview Section

To make your robot work, you’ll need to download the .ino file and upload it to the Arduino.

Before explaining how the Arduino firmware is organized, it is important to understand where the firmware fits in the project. The high-level language, GoDonnie, connects either with Stage and the simulated robot or with the physical robot. When the connection is established with the physical robot, the Raspberry Pi communicates with the language, translating the high-level commands into lower-level commands that it then sends to the Arduino. The Arduino, in turn, directly commands the sensors and actuators of the physical robot.

_images/firmware.png

The firmware is the code that mediates between the GoDonnie language and the hardware, and it runs on the Arduino. The Arduino is directly connected to the Raspberry Pi, which sends it the commands that make the motors move and the sensors operate. The Arduino then sends back to the Raspberry Pi the information obtained by the sensors. The Player server runs on the Raspberry Pi, which is connected to GoDonnie through the computer. The robot’s camera is also connected through the Raspberry Pi, which receives the images from the camera and sends them to Player, which processes them.

Detailed Firmware Section
  • Special Bytes Definition

Some bytes have a special meaning at certain points within a packet. These are given symbolic names as follows.

SYNC0   0xFA
SYNC1   0xFB
END     0xFE
ARG     0x3B
NARG    0x1B
SARG    0x2B

When integers are sent as arguments, they are always inserted into the byte stream as 2 bytes. The first byte is the low byte, the second byte is the high byte of the integer.

  • Packet Protocol

The protocol is based on command packets that are sent to the controller, and information packets that are received by the host PC. All packets have the following format.

SYNC0
SYNC1
count
count-2 bytes of data
checksum (1 byte)
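As a sketch of how the host side might frame such a packet (treating count as the number of data bytes plus two is an assumption based on the format above; the checksum routine is the Player::checksum method listed under Checksum Calculation below):

#include <cstdint>
#include <vector>

static const uint8_t SYNC0 = 0xFA;
static const uint8_t SYNC1 = 0xFB;

uint8_t checksum(const uint8_t *msg, uint8_t size); // CRC8 routine below

// Append a 16-bit integer argument: low byte first, then high byte.
void pushInt(std::vector<uint8_t> &pkt, uint16_t value)
{
    pkt.push_back(value & 0xFF);          // low byte
    pkt.push_back((value >> 8) & 0xFF);   // high byte
}

// Frame a packet: SYNC0, SYNC1, count, count-2 bytes of data, checksum.
std::vector<uint8_t> buildPacket(const std::vector<uint8_t> &data)
{
    std::vector<uint8_t> pkt;
    pkt.push_back(SYNC0);
    pkt.push_back(SYNC1);
    pkt.push_back(static_cast<uint8_t>(data.size() + 2)); // count
    pkt.insert(pkt.end(), data.begin(), data.end());
    pkt.push_back(checksum(pkt.data(), pkt.size()));      // over the full packet
    return pkt;
}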
  • Checksum Calculation

The checksum is calculated over the full packet. The checksum algorithm is given here in C code: the argument size is the number of bytes, and *msg is the vector of bytes in the packet. This checksum algorithm is based on the CRC8 formulas by Dallas/Maxim.

uint8_t Player::checksum(const uint8_t *msg, uint8_t size) {
    uint8_t crc = 0x00;
    while (size--) {
        uint8_t extract = *msg++;
        for (uint8_t tempI = 8; tempI; tempI--){
            uint8_t sum = (crc ^ extract) & 0x01;
            crc >>= 1;
            if (sum) {
                crc ^= 0x8C;
            }
            extract >>= 1;
        }
    }
    return crc;
}
  • Arduino-based Firmware
_images/code.png

The main loop in the image above (lines 8 to 15) performs the robot control. It initially reads incoming packets from the serial port (line 9), executes the commands (e.g. move commands, line 10), updates the sensor readings in the internal memory (line 11), updates the indicators (LEDs, buzzer, vibration motors) based on the commands and sensor readings (line 12), and sends the new data via the serial port to the Player driver (line 13). The last line updates the counters that control the frequency at which serial messages are sent.

The link-level format of the serial messages, presented in the figure below, has two constant header bytes, one byte for the packet length, one byte for the message type, a variable number of payload bytes, and a final checksum byte. Each functionality on the Arduino board has a corresponding message type.

_images/package.png

When the user adds a new functionality to the robot, he or she has to define a new message type and adapt both the firmware and the driver to handle the new message. The firmware and driver code have comments that give clues as to where to make the changes.
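As a hedged illustration of where such a change lands, the firmware typically dispatches on the type byte of each incoming packet; the message-type names and values below are hypothetical, not the constants used by Donnie’s actual firmware:

#include <stdint.h>

// Hypothetical message types; the real values live in the firmware source.
enum MsgType {
    MSG_MOVE   = 0x01,  // drive the motors
    MSG_SENSOR = 0x02,  // report sensor readings
    MSG_NEW    = 0x03   // a new functionality adds a type like this
};

void handlePacket(uint8_t type, const uint8_t *payload, uint8_t len)
{
    switch (type) {
        case MSG_MOVE:   /* decode payload and drive the motors */   break;
        case MSG_SENSOR: /* reply with the latest sensor readings */ break;
        case MSG_NEW:    /* handler added along with the new type */ break;
        default:         /* unknown type: ignore or report error */  break;
    }
}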

Building Your Vibrating Belt

Introduction

With the goal of improving the quality of life of people with visual impairment and helping their mobility through a better perception of the environment, a tactile belt was created. Capable of working at different intensities, the tactile belt lets the user perceive, through vibrations, the approach of an object. This manual contains everything you need to know to build your own vibrating belt. You’ll learn how to manufacture the belt, how to assemble the parts, and how the hardware and the firmware work.

Manufacturing

The belt hardware is basically composed of 12 haptic motors, an Arduino Nano, a PCB shield, and bypass connectors.

If you want to manufacture the PCB shield, you will find all the information you need (PCB design, schematic, electrical diagram, BOM file) in the Vib-belt Repository. The PCB schematic is shown in the image below.

_images/schematiceagle.png

Assembly

The belt has two Vibracall MV50 motor modules in each of its six columns. Each module has three wires connected to it (GND, 5 V VCC, and the command signal). To connect the wires to the modules, we recommend using bypass connectors. The modules are spaced 10 cm from each other in both directions. The image shows exactly how to organize the modules on the belt.

_images/Belt.png

The image below shows how to assemble the parts of the belt. If you prefer, the Fritzing file is also available.

_images/vibbelt.png

Firmware

When using physical robot

_images/beltrobot.png

When using the simulation environment

_images/beltstage.png

To make your belt work, you’ll need to upload the .ino file to your Arduino.

Building Donnie Robot Environment

Braille Cell Manual

Drawing the Parts

The production process for the panels and braille cells starts with the parts being drawn in the CorelDRAW® CAD software. The drawn pieces can be divided into three types:

  1. Braille cell fixation panel
Figure 1: Fixation panel drawn in CorelDRAW®
  2. Braille cells for the insertion of the pins that compose the braille representation of letters, numbers, and symbols
Figure 2: The base (a) and the braille cell (b), both drawn in CorelDRAW®
  3. Pins that are inserted in the braille cells
Figure 3: Pins drawn in CorelDRAW®
Cutting the Parts

The second stage of the production process is laser cutting. In our production we used a laser cutting machine, model CMA1080. The cutting of each of the three parts can be seen in the images below.

  1. Braille cell fixation panel:

    This panel was made of 3 mm thick milky white acrylic. Two grooves were made in the edges so the braille cells can slide along the fixation panel. The fixation panel can be made as large as necessary; in this case we sized it to support up to ten braille cells.

Figure 4: Braille cell fixation panel cut by the laser cutting machine
  2. Braille cells for the insertion of the pins that compose the braille representation of letters, numbers, and symbols:

    The cell was made of 5 mm thick blue EVA (Figure 5b), on a base of 3 mm thick milky white acrylic (Figure 5a).

Figure 5: The white acrylic base (a) and the blue EVA braille cell (b) Figure 6: Blue EVA cells cut by the laser cutting machine
  3. Pins that are inserted in the braille cells:

    These pins were made of 6 mm thick red acrylic. The red color of the pins and the blue color of the braille cells were chosen for their ideal contrast for image processing and character recognition.

Figure 7: Red acrylic pins cut by the laser cutting machine
Assembling the Parts

After cutting the parts with the laser cutting machine, it is possible to assemble the fixation panel and the braille cells. The EVA braille cell was glued to the white acrylic part (Figure 8). The panels were given a fitting so that the cells can slide along the fixation panel and be easily rearranged.

Figure 8: The EVA braille cell (b) was glued to the white acrylic part (a)

The result after the fixation panel and the braille cell were assembled is shown below:

Figure 9: Braille cell and fixation panel able to support up to ten cells Figure 10: Braille cell and pins used to represent letters, numbers and symbols Figure 11: Fixation panel, braille cell and pins that are used to represent letters, numbers and symbols

Final result!!

_images/figure12.jpeg

Additional Resources

Donnie Contributors

The list of contributors to this document.

Papers

If you are using Donnie and/or its software on your research projects, please cite our papers:

@inproceedings{oliveira2017teaching,
  title={Teaching Robot Programming Activities for Visually Impaired Students: A Systematic Review},
  author={Oliveira, Juliana Damasio and de Borba Campos, M{\'a}rcia and de Morais Amory, Alexandre and Manssour, Isabel Harb},
  booktitle={International Conference on Universal Access in Human-Computer Interaction},
  pages={155--167},
  year={2017},
  organization={Springer}
}
@inproceedings{guilherme2017donnie,
  title={Donnie Robot: Towards an Accessible and Educational Robot for Visually Impaired People},
  author={Guilherme H. M. Marques and Daniel C. Einloft and Augusto C. P. Bergamin and Joice A. Marek and Renan G. Maidana and Marcia B. Campos and Isabel H. Manssour and Alexandre M. Amory},
  booktitle={Latin American Robotics Symposium (LARS)},
  year={2017}
}

Disclaimer

Donnie and its software are protected under the MIT License:

Copyright 2018, Laboratório de Sistemas Autônomos

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in the
Software without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
and to permit persons to whom the Software is furnished to do so, subject to the
following conditions:
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Feedback

Don’t hesitate to ask for additional info or about the next guides; also, if you find any mistakes, please let us know. Issues and pull requests can be submitted on GitHub.