OSCAR: Omni Service Cooperative Assistant Robot

A project aimed at developing a humanoid ballbot platform.

This project represents our first attempt at a build in the field of social robotics. A social robot is an autonomous robot that can interact with humans in various contexts while following the social behavior applicable to those contexts. Before we get into the finer nuances of social behaviors and interactions, we need a robotic base. A natural choice that comes to mind is a humanoid robot. Another aspect of robotics that fascinates us is that of dynamically stable robots. Nothing typifies this concept more than a ballbot. So for this build, we plan to challenge ourselves by trying to design, construct, and test a ball-balancing humanoid robot.

As a sideline, our project idea also fits the bill for a local (Singapore) contest, the Tech Factor Challenge, to build a robotic waiter.

A brief video introduction can be found here:
https://www.youtube.com/watch?v=BtobfMgeqv8

For this project, we considered various technical and design considerations:

1. Form Factor

In a service-oriented business like the restaurant industry, a close and personal touch is essential. It makes diners feel welcome and at ease, providing a conducive and pleasant dining experience. Many form factors for a robotic waiter could carry out the day-to-day operations of a restaurant equally well. However, in line with our earlier point, our team thinks that a humanoid robotic waiter, although harder to implement, will best approximate the interpersonal interaction between a human waiter and a diner. We also note that current technology cannot produce a facsimile of a real person with complete fidelity in outward appearance and movement, so attempting to do so would introduce a 'creepy' factor. We have therefore chosen to go part way and came up with the humanoid design (shown in the figure on the right) for OSCAR. With this design, we think the customer will experience almost the same level of interaction as with a human waiter, without constantly needing to come to terms with the humanistic characteristics of the robot, because the outward appearance clearly suggests it is not a human.

2. Locomotion, Navigation, and Collision Avoidance

When we think of waiters, we think of people in uniform, gracefully negotiating the narrow spaces between tables while skillfully balancing a number of items in their hands as they serve the diners. Traditional statically stable robots (those that remain stable without any active feedback) require large bases, and thus a big footprint, to remain stable. For this reason, they are not optimized for the small confines of the walking spaces between tables or within the kitchen. Their movement is sometimes slow and not as fluid as human locomotion. Dynamically stable robots (those that require an active feedback mechanism to remain stable) are better suited to this purpose. They have the advantage of a small base, and thus a small footprint, and are very maneuverable as a result of their dynamic stability.

Our team has opted for the less common ball-drive configuration shown on the right for OSCAR. The concept of the ball drive is deceptively simple. It consists of three omni wheels (wheels with rollers on their circumference to allow sideways motion) driven by motors mounted at an angle to each other. Synchronized movement of the three wheels drives the ball, allowing motion in any direction (thus the 'omni' in OSCAR) and even turning within its own footprint. OSCAR's attitude will be measured by the IMU. With this information, together with feedback from the rotary encoders, the computer can precisely make the adjustments necessary to maintain OSCAR's balance when stationary or moving.
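The synchronized wheel mixing described above can be sketched in a few lines. This is only an illustrative simplification, assuming three wheels spaced 120 degrees apart in a plane; on the real ball drive the wheel axes are also tilted toward the ball, which scales each speed by the cosine of that tilt. The function name and parameters are ours, not from the firmware.

```cpp
#include <array>
#include <cmath>

// Resolve a desired planar velocity (vx, vy) and spin rate omega into the
// tangential speeds of three omni wheels spaced 120 degrees apart, with r
// the lever arm for rotation. Flat-base simplification of the ball drive.
std::array<double, 3> wheelSpeeds(double vx, double vy, double omega, double r) {
    const double kTwoPiOverThree = 2.0943951023931953;  // 120 degrees in radians
    std::array<double, 3> v;
    for (int i = 0; i < 3; ++i) {
        double theta = i * kTwoPiOverThree;  // mounting angle of wheel i
        v[i] = -std::sin(theta) * vx + std::cos(theta) * vy + r * omega;
    }
    return v;
}
```

A handy sanity check: pure rotation drives all three wheels equally, while the speeds for a pure translation always sum to zero.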

We have designed a set of stabilizing struts to prevent OSCAR from toppling when dormant. As a safety measure, these struts are spring-loaded and can be deployed in an instant in the event of a fault condition that would cause OSCAR to become unbalanced.

As the interior of a restaurant is relatively fixed and well defined, a map of it can be created and stored in OSCAR. Using this map together with the laser range finder (the black range-finder window is shown at the lower part of the torso) and head-mounted stereoscopic cameras (behind the LED face display), OSCAR can perform autonomous navigation and collision avoidance without the need for any markers.

3. Power Source

The power source for OSCAR is a series of high-capacity lithium-ion batteries mounted at the top of the electronics package (see figure below) to provide a high center of gravity. Charging will be done wirelessly with a charging base, eliminating the need for any external contacts. OSCAR will be able to autonomously navigate and dock itself for charging when the batteries are running low.

4. Interactivity

Taking orders and answering diners' requests require a...


  • 3 × Holonomic Wheels 100mm diameter holonomic wheels, capable of travel in two perpendicular directions, to 'sit' on the ball.
  • 1 × Bowling Ball A bowling ball for the drive. All the dimensions of the robot will be designed in accordance with this.
  • 3 × Stepper Motor The beefiest stepper motors we could find: 125 oz.in (200 steps/rev). We will try these first before switching to geared DC motors, as steppers don't require feedback.
  • 1 × Stepper Controller At the onset, we will use the Quadstepper controller from SparkFun. These are often used in 3D printers as they have four channels. The stepping pulses need to be generated outside the board, but this shouldn't be a problem.
  • 1 × IMU The IMU will be used for balancing. Additionally, it will be used for head-locking.


  • Ping... Ping... Ultrasonic Sensors

    Poh Hou Shun • 10/04/2015 at 14:43

    Actually, an ultrasonic sensor goes click... click... Anyway, let's start. The lidar is mounted at a height of approximately 71 cm from the ground. That means any obstacle shorter than that, and there are plenty of those, will not be detected by the lidar. This would make collision avoidance difficult, to say the least.

    So our plan was to have an array of inexpensive distance sensors, we chose ultrasonic sensors, mounted close to the ground for collision avoidance. Infrared range finders would also fit the bill if not for the fact that they don't work very well in outdoor conditions or areas filled with infrared light.

    We chose the most inexpensive ultrasonic sensor we could find, the Devantech SRF02:

    It has a working range of about 15 cm to 300 cm, making it suitable for our purpose. The minimum working distance is due to the fact that there is only one transducer, which both generates the pulse and receives it. After sending a pulse, the transducer needs a certain amount of time to 'ring down'. This ring-down time corresponds to a minimum working distance. If you really want a smaller minimum working distance from these sensors, you would need to use one to send the pulse and another to receive it. Ultrasonic sensors with two transducers do not suffer from this issue.
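    The arithmetic behind that floor is simple echo ranging: the pulse travels out and back, so distance = speed of sound × time / 2, and the transducer cannot listen while still ringing. A quick back-of-envelope check (the 0.9 ms ring-down figure is our inference from the 15 cm spec, not from the datasheet):

    ```cpp
    // Echo ranging: a pulse travels to the target and back, so
    // distance = c * t / 2. The ring-down time, during which the single
    // transducer cannot listen, therefore sets a floor on measurable range.
    double minRangeCm(double ringDownSeconds, double speedOfSoundMs = 343.0) {
        return speedOfSoundMs * ringDownSeconds / 2.0 * 100.0;  // metres -> cm
    }
    ```

    Plugging in a ring-down of about 0.9 ms gives roughly 15 cm, matching the SRF02's quoted minimum.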

    The SRF02 can output distance in cm, inches, or microseconds (used with the speed of sound to calculate distance) over serial or I2C. It runs on a 5V supply. For our purposes, we will use I2C, as it allows up to 16 sensors to be chained together.

    For controlling the sensor we used an Arduino Leonardo; no special reason for this choice other than that it has dedicated SDA and SCL pins. Ideally the two lines should be pulled up to 5 V with 10 k resistors; however, the Arduino Wire library enables the internal pull-up resistors.

    The first order of business was to give each sensor a unique address. A caveat here is that the SRF02 documentation uses 8-bit addressing while the Arduino I2C library uses 7-bit addressing. To convert from 8-bit to 7-bit addressing, take the 7 highest bits of the 8-bit address.
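    Taking the 7 highest bits is just a right shift by one, which drops the trailing read/write bit. A pair of throwaway helpers (our own names, for illustration) makes the mapping in the table below easy to verify:

    ```cpp
    #include <cstdint>

    // The SRF02 documentation lists addresses in 8-bit form (R/W bit
    // included), while the Arduino Wire library expects 7-bit addresses.
    // Shifting right by one drops the R/W bit and converts 8-bit -> 7-bit.
    uint8_t to7Bit(uint8_t addr8) { return addr8 >> 1; }
    uint8_t to8Bit(uint8_t addr7) { return addr7 << 1; }
    ```

    For example, the factory default 0xE0 becomes 0x70, and 0xF6 becomes 0x7B.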

    When you power up the SRF02, it flashes its onboard LED, a long pulse followed by a series of short pulses, to indicate its address. For an explanation of address changing, you can refer to http://www.robot-electronics.co.uk/htm/srf02techI2C.htm.

    We have attached here the code for changing the address of the SRF02. As the credits show, it was modified from code written by Nicholas Zambetti and James Tichenor.

    // I2C SRF02 Devantech Ultrasonic Ranger Finder
    // by Nicholas Zambetti 
    // and James Tichenor 
    // Modified by Poh Hou Shun
    
    // Address change of Devantech Ultrasonic Rangers SRF02
    
    // Created 24 September 2015
    
    // This example code is in the public domain.
    
    #include <Wire.h>
    
    void setup() {
    
      Wire.begin();                // join i2c bus (address optional for master)
      changeAddress(0x71, 0xF6);   // change address, changeAddress(oldAddress(7 bits), newAddress (8 bits))
    }
    
    void loop() {}
    
    // The following code changes the address of a Devantech Ultrasonic Range Finder (SRF02)
    // usage: changeAddress(0x70, 0xE6);
    
    void changeAddress(byte oldAddress, byte newAddress)
    {
      
      Wire.beginTransmission(oldAddress);
      Wire.write(byte(0x00));
      Wire.write(byte(0xA0));
      Wire.endTransmission();
    
      Wire.beginTransmission(oldAddress);
      Wire.write(byte(0x00));
      Wire.write(byte(0xAA));
      Wire.endTransmission();
    
      Wire.beginTransmission(oldAddress);
      Wire.write(byte(0x00));
      Wire.write(byte(0xA5));
      Wire.endTransmission();
    
      Wire.beginTransmission(oldAddress);
      Wire.write(byte(0x00));
      Wire.write(newAddress);
      Wire.endTransmission();
      
    }
    
    /*
    
      Address 8 bit -> 7 bit map
      
      0xE0 -> 0x70
      0xE2 -> 0x71  
      0xE4 -> 0x72
      0xE6 -> 0x73
      0xE8 -> 0x74
      0xEA -> 0x75
      0xEC -> 0x76
      0xEE -> 0x77
      0xF0 -> 0x78
      0xF2 -> 0x79
      0xF4 -> 0x7A
      0xF6 -> 0x7B
      0xF8 -> 0x7C
      0xFA -> 0x7D
      0xFC -> 0x7E
      0xFE -> 0x7F
     
     */
    The next step was simply to write code that requests a measurement from a number...

  • First Foray into ROS

    Poh Hou Shun • 10/04/2015 at 10:19

    Now it was time to tackle some of the software side of things (the code running on the computer, not the firmware). The very first thing we did was load our under-powered PC (a 32-bit system, by the way) with the latest version of Ubuntu (14.04.3 LTS).

    With it 'sudo apt-get' updated, upgraded, and then dist-upgraded, we were finally ready to install the Robot Operating System (ROS). For those who are not familiar with ROS, Wikipedia calls it:

    "Robot Operating System (ROS) is a collection of software frameworks for robot software development (see also Robotics middleware), providing operating system-like functionality on a heterogeneous computer cluster."

    Meanwhile, we call it a kick-ass piece of software that ties the various components of a robot together, with a large open-source, community-supported library to boot. Anyone who is really serious about robotics should take a look at it. This is also our first time getting our feet wet in the 'ROS lake'.

    For our system we installed the most current version, 'ROS Jade Turtle' (detailed instructions here), IN FULL (don't leave out any important packages). The ROS website provides a series of excellent tutorials that really helped us... in the beginning (more on this later). In a nutshell, what we want to do in ROS is implement something called Hector SLAM (no, it is not a WWE thing). Firstly, SLAM is an abbreviation of Simultaneous Localization and Mapping. It refers to the process by which a robot (or any number of similar things) can map out an unknown area and at the same time determine its own location in that map. This is helpful in getting a robot autonomously from point A to point B.

    Now, Hector SLAM is something more advanced: it can build up a map without the need for odometry. Let us explain. Imagine the lidar sensor we featured earlier. It outputs a distance-versus-polar-angle kind of reading. When the sensor moves, the readings change, like going from scene A to scene B. The Hector SLAM algorithm is able to find the components of A in B, so it doesn't require odometry input as a reference, although it can be augmented by it.

    Now ROS rears its baffling head. It is well known to have a steep learning curve (maybe that is only for us). While there are a multitude of examples of successful implementations of the RPLIDAR with Hector SLAM, there are no step-by-step instructions on how to do it. After much toil we got it to work, and we will document the steps here. We hope it is helpful.

    First, one needs to get the RPLIDAR to talk to ROS (we are making a major assumption that you are using Ubuntu). The steps for this procedure are:

    1. Go to https://github.com/robopeak/rplidar_ros

    2. Clone the repository listed on the site into your catkin workspace's src folder (assuming you are using Jade Turtle here)

    3. Go to the catkin directory: cd catkin

    4. Source the setup.bash file (we found that we need to do this in every terminal that we open to run ROS): source /opt/ros/jade/setup.bash

    5. Compile: catkin_make

    6. After that is done, connect the RPLIDAR and run your spanking new rplidar ROS node (that is what it is called): roslaunch rplidar_ros view_rplidar.launch

    The last step opens up what is called RViz, the ROS visualization program. What you should see is something like this:

    What you will immediately notice on the left is a warning about 'No tf data'. Put simply, it is saying that some defined reference frame is missing. This will be a problem later on when trying to run Hector SLAM.

    Alternatively, you can quit RViz and run the following two commands in separate terminals (remember to source your setup.bash in both):

    roslaunch rplidar_ros rplidar.launch

    rosrun rplidar_ros rplidarNodeClient

    to see the data stream.

    The next step is to implement the Hector SLAM node itself. Here are the steps:

    1. Clone the Hector SLAM package (https://github.com/tu-darmstadt-ros-pkg/hector_slam) into the src directory.

    2. From the catkin directory, source the setup.bash...


  • Modification to a Holonomic Base

    Poh Hou Shun • 10/04/2015 at 10:00

    After weeks of trying, we still didn't manage to balance the test base on the ball. The torque from the steppers is simply too little to drive the base reliably. To add insult to injury, because the omni wheels are mounted at an angle, driving the base without the ball, while possible, introduces loads of vibration due to the spacing between the rollers. This left us unable to test any of the navigation sensors.

    With the deadline for our Tech Factor Challenge entry looming, something must be done. As a compromise, we decided to redesign the base so that the wheels are not mounted at an angle. For this configuration, the base needs to be enlarged and the stepper mounts redesigned. Here is our design:

    After extensive modification and upgrading, it looked like this:

    A few words about the upgrade. We have fitted two 12 V, 7 Ah sealed lead-acid batteries. They are connected in parallel, with power distributed by the driver interface board we fabricated earlier. For the brain, we fitted a grossly under-powered QBOX-1000, a fanless box PC (the black box with two antennae sticking out) running an Intel Atom N270 processor with 2 GB of RAM and a 64 GB SSD (I believe). Above the brain we mounted a TRENDnet TV-IP651WI IP camera. This allows us to remotely drive the platform around and not 'kill' anyone. Lastly, mounted above that is the RPLIDAR. The RPLIDAR came with M2.5-threaded nylon spacers; for compatibility with standard spacers, we re-tapped them to M3.

  • Seeing the Light with RPLIDAR

    Poh Hou Shun • 10/04/2015 at 09:35

    Good news everyone (in Professor Farnsworth's voice)! The first of our navigation sensors, the RPLIDAR, has arrived. We got ours from Seeed Studio for a little under 400 bucks. For those not familiar with the RPLIDAR, it is probably one of the cheapest lidars one can get on the market without getting into "Poor Man's Lidar (PML)" territory. It has a range of about 6 m with a refresh rate of 5.5 Hz and 2000 samples/s at 1 degree resolution. It works by sending out a modulated (in intensity, we suppose) laser beam which gets reflected off an obstacle. The return signal is then detected by an onboard camera and decoded (probably some timing-measurement goodness here) into a distance measurement.

    Included in the box is the RPLIDAR itself (fortunately), a 7-way ribbon cable, and a serial-to-USB adapter with a built-in power supply for the lidar and its motor. For testing, using the adapter is probably the best bet, as it just streams the measurements over USB. However, with the adapter one cannot selectively control the power to the motor. Using the demo program, we easily verified that the lidar was working.

    An issue we faced with our RPLIDAR was that the spinning head does not seem to be balanced. As it spins, it introduces a fair bit of vibration, though not enough to be a showstopper. Maybe in the future we will try to balance the head.

  • Some Initial Balancing Tests

    Poh Hou Shun • 10/04/2015 at 04:22

    It has been about 2 months since I last updated the build logs, and there has been a lot of development since then. I shall attempt to document it over the next few logs.

    In the previous log, we documented that we had fabricated the test base together with rudimentary drive electronics. Next, we tackled the task of getting the bowling ball for the test base. We got one from a thrift stall for a few bucks (not sure about its specifics except that it is approximately 20 cm in diameter; sorry to all the bowling fans). We attempted to smooth out the finger holes and the debossed lettering with some plaster (we gather there are fillers for this exact purpose, but unfortunately we could not find any).

    There is no other way to describe the result except that it was BAD. It was very difficult to get a complete fill of the finger holes, and even when they were supposedly completely filled, the plaster would shrink as it dried. Anyway, we were determined not to be stopped by a few finger holes, so on we went.

    With the platform inverted, we placed the bowling ball on the fabricated base to the tune of the theme from '2001: A Space Odyssey'... seriously. The omni wheels engaged the bowling ball perfectly.

    After changing some signs in the firmware, we managed to move the bowling ball in the intended direction, although we still have not sorted out moving in a straight line while rotating about its axis. A few issues we observed:

    1. Due to the size of the rollers on the omni wheels, there are gaps between the two layers of rollers. Since the bowling ball engages the wheels at an odd angle, there was significant vibration when moving the ball.

    2. Two related issues. The steppers are rated at 2 A per phase. When we tried to crank up the current on the Quadstepper, the driver would go into an erratic mode. It seems that the driver chip (A4988) was overheating and going into some sort of thermal shutdown, despite the installed heatsink. Cooling it down with an inverted can of compressed air eliminated this behavior, confirming our diagnosis. Unfortunately, due to the layout of the Quadstepper board, there was no quick fix for the overheating issue.

    The only thing we could do for the moment was to reduce the current to about 0.5 A, a level where the overheating seems to disappear. With the reduced current came reduced torque: as we moved the bowling ball, one or more steppers would sometimes stall. In retrospect, of course, the more expensive and complicated geared DC motors are better suited for the purpose.

    We popped in an AltIMU-10 IMU from Pololu and tried to balance the platform. The first thing we observed was a large latency, of about 1 second, between tilting the platform and the actuation that corrects for the tilt. With such large latency, no amount of playing around with the PID values would solve it. So back to the drawing board.
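    For reference, the balance controller we were tuning is a standard discrete PID loop: the output is a wheel-speed correction opposing the measured tilt. A minimal sketch (the gains here are placeholders for illustration, not the values we used), which also shows why latency hurts: the derivative and proportional terms act on an error that is a full second stale.

    ```cpp
    // Minimal discrete PID controller. update() takes the current tilt error
    // (radians) and timestep (seconds) and returns a corrective output.
    struct Pid {
        double kp, ki, kd;
        double integral = 0.0, prevError = 0.0;
        double update(double error, double dt) {
            integral += error * dt;                          // accumulate I term
            double derivative = (error - prevError) / dt;    // rate of change
            prevError = error;
            return kp * error + ki * integral + kd * derivative;
        }
    };
    ```

    With a 1-second delay in the loop, the correction always fights the tilt as it was, not as it is, so no choice of kp, ki, kd can stabilize the platform.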

  • Firmware for the Stepper Controller

    Poh Hou Shun • 08/17/2015 at 18:00

    We now begin coding the firmware for the stepper controller. Our conceptual leap was to realize that the code for driving the three stepper motors is the same as that for driving a three-wheeled holonomic robotic base. Basically, the intended motion is resolved into three directions 120 degrees apart. The resolved motion is then translated into the motion of the three stepper motors. There is code for handling the mixing of linear and rotational motion. Lastly, there is some code implementing the balancing and head-locking, but it is commented out for the initial tests.


    //***************************************************************************************************************
    /*
    
     Ballbot Firmware V1
     
     Copyright (c) 2014 Space Trek Systems.
     http://www.spacetreksystems.com/
     
     */
    
    //***************************************************************************************************************
    
    // header files
    
    #include "EasyTransferI2C.h"
    #include <PID_v1.h>
    //#include <TimerOne.h>
    #include <TimerThree.h>
    #include <TimerFour.h>
    #include <TimerFive.h>
    #include <Wire.h>
    
    //***************************************************************************************************************
    //communication with receiver module
    //create object
    EasyTransferI2C ET; 
    
    struct RECEIVE_DATA_STRUCTURE{
      //put your variable definitions here for the data you want to send
      //THIS MUST BE EXACTLY THE SAME ON THE OTHER ARDUINO
      double Xspeedtrans;
      double Yspeedtrans;
      double Rotationtrans;
    };
    
    //give a name to the group of data
    RECEIVE_DATA_STRUCTURE senddata;
    
    //define slave i2c address
    #define I2C_SLAVE_ADDRESS 9
    
    //***************************************************************************************************************
    
    //imu definition
    
    /*
    
     MinIMU-9-Arduino-AHRS
     Pololu MinIMU-9 + Arduino AHRS (Attitude and Heading Reference System)
     
     Copyright (c) 2011 Pololu Corporation.
     http://www.pololu.com/
     
     MinIMU-9-Arduino-AHRS is based on sf9domahrs by Doug Weibel and Jose Julio:
     http://code.google.com/p/sf9domahrs/
     
     sf9domahrs is based on ArduIMU v1.5 by Jordi Munoz and William Premerlani, Jose
     Julio and Doug Weibel:
     http://code.google.com/p/ardu-imu/
     
     MinIMU-9-Arduino-AHRS is free software: you can redistribute it and/or modify it
     under the terms of the GNU Lesser General Public License as published by the
     Free Software Foundation, either version 3 of the License, or (at your option)
     any later version.
     
     MinIMU-9-Arduino-AHRS is distributed in the hope that it will be useful, but
     WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
     FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for
     more details.
     
     You should have received a copy of the GNU Lesser General Public License along
     with MinIMU-9-Arduino-AHRS. If not, see .
     
     */
    
    // Uncomment the below line to use this axis definition: 
    // X axis pointing forward
    // Y axis pointing to the right 
    // and Z axis pointing down.
    // Positive pitch : nose up
    // Positive roll : right wing down
    // Positive yaw : clockwise
    int SENSOR_SIGN[9] = {
      1,1,1,-1,-1,-1,1,1,1}; //Correct directions x,y,z - gyro, accelerometer, magnetometer
    // Uncomment the below line to use this axis definition: 
    // X axis pointing forward
    // Y axis pointing to the left 
    // and Z axis pointing up.
    // Positive pitch : nose down
    // Positive roll : right wing down
    // Positive yaw : counterclockwise
    //int SENSOR_SIGN[9] = {1,-1,-1,-1,1,1,1,-1,-1}; //Correct directions x,y,z - gyro, accelerometer, magnetometer
    
    // tested with Arduino Uno with ATmega328 and Arduino Duemilanove with ATMega168
    
    // LSM303 accelerometer: 8 g sensitivity
    // 3.9 mg/digit; 1 g = 256
    #define GRAVITY 256  //this equivalent to 1G in the raw data coming from the accelerometer 
     
    #define ToRad(x) ((x)*0.01745329252)  // *pi/180
    #define ToDeg(x) ((x)*57.2957795131)  // *180/pi
    
    // L3G4200D gyro: 2000 dps full scale
    // 70 mdps/digit; 1 dps = 0.07
    #define Gyro_Gain_X 0.007 //X axis Gyro gain
    #define...

  • Stepper Controller Electronics

    Poh Hou Shun • 08/17/2015 at 17:47

    For controlling the steppers, we used the Quadstepper board from SparkFun. Heatsinks were placed on the driver chips to prevent overheating.

    This is controlled solely by an Arduino Mega, which generates the stepping pulses. An Arduino Mega was chosen because it has enough hardware timers for generating the stepping pulses. Another Arduino Uno serves to decode the PWM signals from an RC receiver. The decoded signals are then sent to the Arduino Mega via I2C so that the stepper motors can be remotely controlled. An interface board was fabricated to tie all the components together. After a lot of bodging, the circuit finally works.
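    The timer math behind those stepping pulses is straightforward. Assuming the 200 steps/rev motors from the component list and no microstepping (our assumption for illustration), the timer period for a given shaft speed is:

    ```cpp
    // Step-pulse interval in microseconds for a desired shaft speed.
    // With 200 full steps per revolution and no microstepping, one
    // revolution per second needs a pulse every 5000 microseconds.
    double stepIntervalMicros(double revPerSec, int stepsPerRev = 200) {
        return 1e6 / (revPerSec * stepsPerRev);
    }
    ```

    Each hardware timer is then programmed to fire at this interval and toggle the step pin of one driver channel, which is why a board with several independent timers was needed.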

  • Ball Balancing Test Base Fabrication and Assembly

    Poh Hou Shun • 08/17/2015 at 16:24

    The designs for the balancing test base were done in SolidWorks and exported to .dxf files. These were then sent out for laser cutting. Instead of generic acrylic, Plexiglas (also a type of acrylic) was used. We found that Plexiglas is more flexible and stands up to the rigors of drilling without cracking.

    With all the pieces back from the laser cutter, we began the arduous task of drilling all the holes for securing the stepper motor mounts. Holes of 3.5mm diameter were drilled and tapped for M4 screws.

    The spacers were cut to the appropriate length using a lathe. The central hole in the wheel hub of the holonomic wheels was also enlarged to 1/4" to accommodate the stepper motor shaft. The wheels were then installed onto the motor shafts.

    Finally all the parts were assembled.

  • Ball Balancing Test Base Concept

    Poh Hou Shun • 08/17/2015 at 15:12

    As we have no experience at all in building a balancing robot of any kind, it was prudent to build a test base for this purpose. Our design can be found here:

    The structure will consist of 8mm-thick laser-cut acrylic. Laser cutting is the fastest and most economical prototyping means available to us. The holonomic wheels engage the bowling ball at an angle of 45 degrees, midway between the pole and the equator of the ball.

    Plates about 260mm in diameter are held together by 10mm-diameter spacers made of acrylic rod. The driver and balancing electronics will be mounted on the plate just above the stepper motors to minimize cable length. The battery, which constitutes the bulk of the weight, will be mounted high up for stability (akin to the configuration of an inverted pendulum). We will use an inexpensive lead-acid battery in the beginning, which will probably be switched to LiPo in the future.

