
Programming solutions

A project log for Robot for telepresence and VR experiments

This is a robot platform with stereo cameras and an omniwheel drive. It can be connected to an HMD (e.g., Google Cardboard, Oculus Rift, ...)

BTom 05/29/2016 at 14:21

We are running multiple programs on the robot, using C++ (with OpenCV), Python, and Arduino code for the various tasks in the project.

This is the connection schematic between the programs (and the hardware):

The Raspberry Pi on the robot is configured as a Wi-Fi access point, so we can connect to the robot with multiple devices.

The PC is the remote-control machine. A Python program on it sends the connected joystick's commands to the robot via sockets. In this program we can set deadzones for precise control.
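A minimal sketch of that sender, assuming a JSON-over-TCP message format; the robot address, port, and field names are illustrative assumptions, not the project's actual protocol:

```python
import json
import socket

# Hypothetical values: the robot's AP address and movement port are assumptions.
ROBOT_HOST = "192.168.4.1"
MOVE_PORT = 5005
DEADZONE = 0.15  # normalized axis units; tune per joystick


def apply_deadzone(axis, deadzone=DEADZONE):
    """Zero out small stick noise, then rescale so the output stays continuous."""
    if abs(axis) < deadzone:
        return 0.0
    sign = 1.0 if axis > 0 else -1.0
    # Rescale the remaining range back to [-1, 1] so there is no jump
    # in output speed right at the deadzone edge.
    return sign * (abs(axis) - deadzone) / (1.0 - deadzone)


def send_command(sock, x, y, rot):
    """Send one filtered movement command as a newline-terminated JSON object."""
    msg = {"x": apply_deadzone(x), "y": apply_deadzone(y), "rot": apply_deadzone(rot)}
    sock.sendall((json.dumps(msg) + "\n").encode())
```

Rescaling after the deadzone (instead of just clipping) avoids a sudden jump from zero to the deadzone-edge speed, which matters for precise control.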

We use separate sockets for the joystick movement control and for the head tracking.

On the Raspberry Pi we are running multiple programs: two socket servers and two MJPEG stream servers.

One socket server is for the movement of the robot. This Python program forwards the servo speeds to the Arduino, where a simple program maps the speed values to servo positions. Because we modified the wheel-driving servos for continuous rotation, we can control the movement speed with the servo position: the servo center position is where the wheel stops, positive values make the servos turn forward at the given speed, and negative values make them turn backward. On the Arduino we can adjust each servo's center position for small corrections.
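The mapping itself runs on the Arduino in C++; here is a Python sketch of the same logic under assumed numbers (standard 1000–2000 µs hobby-servo pulse widths and a ±100 speed range — the real trim values live in the firmware):

```python
# Typical hobby-servo pulse widths in microseconds; illustrative assumptions,
# since the actual center trim is set on the Arduino.
SERVO_MIN_US = 1000
SERVO_MAX_US = 2000
CENTER_TRIM_US = 0  # per-servo adjustment so the wheel really stops at speed 0


def speed_to_pulse(speed, trim=CENTER_TRIM_US):
    """Map a speed in [-100, 100] to a servo pulse width in microseconds.

    0 maps to the (trimmed) center, where a continuous-rotation servo stops;
    positive speeds rotate forward, negative speeds rotate backward.
    """
    speed = max(-100, min(100, speed))
    center = (SERVO_MIN_US + SERVO_MAX_US) // 2 + trim
    half_range = (SERVO_MAX_US - SERVO_MIN_US) // 2
    return center + speed * half_range // 100
```

The per-servo trim compensates for servos whose mechanical center does not exactly match the 1500 µs electrical center, which would otherwise make the robot creep at "stop".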

We made another socket program for the head movement. This program sends degrees to the second Arduino, which maps them to servo positions. In the Arduino program we can limit the servo movement to the physical limits of the mechanics.
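The limiting step can be sketched like this; the limit values below are placeholders, since the real ones are tuned to the pan/tilt hardware in the Arduino firmware:

```python
# Assumed mechanical limits in degrees (placeholders; the real values are
# set in the Arduino program to match the head mechanics).
PAN_MIN, PAN_MAX = -90, 90
TILT_MIN, TILT_MAX = -30, 45


def clamp_head(pan_deg, tilt_deg):
    """Limit the requested head angles to what the mechanics can actually do."""
    pan = max(PAN_MIN, min(PAN_MAX, pan_deg))
    tilt = max(TILT_MIN, min(TILT_MAX, tilt_deg))
    return pan, tilt
```

Clamping on the microcontroller side means a buggy or malicious client can never drive the servos into the mechanical end stops.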

For the video stream we are using an MJPEG streamer. We can connect to the stream with, for example, browsers, OpenCV applications, FFmpeg, or VLC, so it is really universal.
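As an example of that universality, OpenCV can open the stream like any other video source. This sketch assumes the common mjpg-streamer URL layout (`?action=stream` on the HTTP port); the host and port are placeholders:

```python
def stream_url(host, port):
    """Build an mjpg-streamer style URL ('?action=stream' path assumed)."""
    return "http://%s:%d/?action=stream" % (host, port)


if __name__ == "__main__":
    # OpenCV treats the MJPEG HTTP stream as a regular video capture source.
    import cv2

    cap = cv2.VideoCapture(stream_url("192.168.4.1", 8080))  # host/port assumed
    ok, frame = cap.read()
    if ok:
        cv2.imshow("left eye", frame)
        cv2.waitKey(0)
    cap.release()
```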

In the current experimental setup we are streaming 640x360 video, without re-encoding, on two separate ports. We can connect to these streams from a mobile browser and display the two pictures in real time. Using HTML5 device orientation events and WebSockets, we send the orientation data from the phone back to the Raspberry Pi. For remote control we are using a Logitech Extreme 3D Pro joystick. With this setup we can use a small Windows tablet to send the joystick data to the robot, and a simple Android or iOS phone with an HTML5-compatible browser, to demonstrate our robot at meetups.
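On the Pi side, the incoming orientation messages have to be turned into head angles. The field names below follow the HTML5 DeviceOrientation event (`alpha` = compass heading in 0–360°, `beta` = front/back tilt), but the JSON wrapping and the recentering offset are assumptions, not the project's actual wire format:

```python
import json


def orientation_to_head(message):
    """Convert a deviceorientation JSON message into (pan, tilt) degrees.

    alpha arrives as a 0-360 heading, so we recenter it around 0 for pan;
    beta (front/back tilt) is passed through as the tilt angle.
    The -180 offset is an illustrative assumption.
    """
    data = json.loads(message)
    pan = data["alpha"] - 180.0
    tilt = data["beta"]
    return pan, tilt
```

The result would then be clamped to the mechanical limits and forwarded to the head-movement socket.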

We are only using low-resolution video for the experiments, because the Android phone we use can't decode a high-resolution stream in real time. On a PC we can easily achieve 720p video in real time. The other reason for the low resolution is the low quality of the Cardboard viewer we are using: it is only good for a proof of concept, not for real usage.

In the next phase we will connect the robot to an Oculus Rift DK2. It will be a little more complicated for us to let everyone try out the robot at meetups and similar events, but we think it will be more spectacular, with higher resolution and a much more comfortable headset.
