ROSCOE - A Scalable, Platform Independent Robot

A new algebraic machine cognition model and a novel machine vision architecture

Driven by ROSJavaLite and powered by virtually any ad-hoc combination of SBCs or wedges, this system requires no specialized parts or custom PCB builds. More of a platform than a specific robot, it powers several different robots with widely differing propulsion systems and computational hardware collections. Several novel algorithms and machine cognition processes contribute to the ongoing evolution of this project.

The focus of this project is a very low-cost robotic platform that provides a test bed for a new model of machine cognition.

A purpose-built micro cluster serves as the 'brain', and the BigSack deep K/V store was optimized for cluster operation.

The Java version of the Robot Operating System is used as the main control bus.

For maximum power at the lowest cost, BLDC e-bike hubs were used; these can also function as jumbo servos to provide articulating joints of industrial power.

The system uses a 24V electronics bus and a 48V propulsion bus, with 5V and 12V buck converters to power the RPis and the Bosch BNO055 IMU. Unlike most robotics projects, this one emphasizes multiple compact SBC processors on a high-speed network rather than one massive, expensive, power-hungry boardset.

Using the current sensor bus, ROSCOE can:
Detect rising barometric pressure and verbally warn of storms
Roll through clouds of toxic gas to deliver live video feed
Detect unusual magnetic anomalies and verbally report
Detect seismic shocks and tremors to issue verbal warning
Detect rising temperature gradients and issue notification
Warn you of low battery and optionally attempt recharge
Warn of motor faults
Interface with dozens of advanced sensors with hot pluggable drivers including LIDAR
Warn of impending impact
Detect motion and take action if necessary
Interface to Internet or work standalone. Video and data feed from anywhere in WiFi range

Push a loaded trash bin to the corner, *before* the trash guy gets there
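The sensor-to-speech pattern behind several of these capabilities is a simple threshold check that maps a reading to a spoken warning. A hypothetical sketch, not the project's code; thresholds, class, and method names are all illustrative assumptions:

```java
// Illustrative sketch: map raw sensor readings to verbal warnings.
// Thresholds and names are assumptions, not from the project.
public class SensorAlerts {
    // Rising barometric pressure over a sampling window suggests a weather front.
    public static String checkBarometer(double previousHPa, double currentHPa) {
        if (currentHPa - previousHPa > 2.0)
            return "Warning: barometric pressure rising, possible storm";
        return null; // nothing to announce
    }

    // Warn when the pack sags below 85% of nominal voltage (arbitrary cutoff).
    public static String checkBattery(double volts, double nominalVolts) {
        if (volts < 0.85 * nominalVolts)
            return "Warning: low battery at " + volts + " volts";
        return null;
    }
}
```

A speech module such as the VoxHumana node mentioned later in the logs would then voice any non-null string.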


Length Overall: 24"

Width and Wheel Track: 16"

Height: 32" max.

Weight: Est 60-70 lbs.

Wheel diameter: 16"

Number of Wheels: 3 (2 driven, 1 passive 360-degree caster)

Top Speed: Est 35-40 MPH

Drive Type: Differential, IMU integrated

Power Bus: 48V/24V

Base OS: Linux/ROS

Microcontrollers: ODroid C1, Odroid C2, RPi 3, Arduino Mega2560 realtime

Network: Switched ethernet

Sensors: IMU, 2 cameras (front/rear), 2 forward ultrasonic sensors hi/lo

Est. Cost: ~$1200


Visible: Bosch BNO055 IMU mounted in a pill bottle sealed with gorilla tape, upper right. 3 motor drivers, left side. Middle top: custom low-voltage-dropout and power distribution board. Front right: camera and ultrasonic nodes. 2 Odroid C2s: master ROS node and computation on one; IMU, camera, and wireless AP on the other. 1 Odroid C1: real-time motor controller talking to the Mega2560 via G-code through a custom realtime C++ OO driver module. ROSJavaLite drives the entire system. This is one of 2 robots using radically different propulsion and hardware but the exact same code dropped in, except for 1 config file, which is the point of the exercise.

JPEG Image - 1.23 MB - 03/12/2018 at 21:20



Permobil C400 robot, a former surplus mobility chair rescued from hackerspace entombment. Shown with charger attached, in an older configuration. ROSJavaLite drives the entire system. This is one of 2 robots using radically different propulsion and hardware but the exact same code dropped in, except for 1 config file, which is the point of the exercise. A remote PS3 controller and a first-person viewer with a 5x7 color screen are used for remote control, as the robot contains a wireless N access point. Yes, I got new tires.

JPEG Image - 1.08 MB - 03/12/2018 at 21:18


  • 1 × 20mm Ammo case
  • 2 × Brushless DC 3 Phase e-bike hubs
  • 2 × Heavy duty L brackets for rear suspension
  • 1 × Harbor Freight 8" pneumatic caster
  • 2 × 16 inch 32 hole spoked rims

View all 26 components

  • Stereo Coplanar Area Matching Principal Component Analysis Parallel (SCAMPCAP)

    J Groff | 08/14/2019 at 21:36


    The following reflects improvements to the previously published work on 3D depth reconstruction from stereo camera images for machine vision. The core of the new algorithm relies on Principal Component Analysis (PCA) to match key epipolar coplanar regions in the 2 stereo images. Selection of these corresponding coplanar regions seems to mimic the manner in which biological systems extract key features for visual cortex processing. Several improvements were made to the previously published algorithm to increase accuracy. Efficiency gains were realized through more accurate region matching, constrained search areas, and finally, parallelization of the process. The Java threading model with cyclic barrier synchronization is the mechanism by which the parallel processing is achieved. Performance remains suitable for realtime applications and is immune to calibration errors and color balance differences. The code is operational and in place on a number of robotic test platforms in both indoor and outdoor environments.

     Improved algorithm for stereo matching:

    (The terms region, node, octree node, and octree cell are synonymous.)

     The steps in the pipeline are:

     1.) Edge detect both images using Canny edge detection. A single thread performs these operations.

     2.) Generate 2 octrees from the edge-detected images at the minimal octree node level, currently 7. The number of threads in the octree build process is proportional to the image scan height: a pool of threads sized at 1/10 the scan height of the images processes each scan line of the images in parallel.

     3.) Use Principal Component Analysis (PCA) on the 2 images to find the minimal coplanar regions in both octrees and generate the eigenvectors and eigenvalues of the principal axis of the minimal coplanar regions.

    // A typical barrier-synchronized parallel execution block;
    // this one manages the building of octrees from the stereo image data
    for(int syStart = 0; syStart < execLimit; syStart++) {
        SynchronizedFixedThreadPoolManager.getInstance(numThreads, execLimit).spin(new Runnable() {
            public void run() {
                imagesToOctrees(dataL, dataR, imageLx, imageRx, yStart.getAndIncrement(),
                                camWidth, camHeight, nodel, noder);
            }
        });
    } // for syStart

    4.) Sort the resulting collections of minimal coplanar regions by their centroid Y values.

     5.) Process the left-image coplanar regions against the right-image coplanar regions by comparing the vector cross products of the two secondary eigenvector axes. Limit candidate regions to those whose centroids are within a preset tolerance of the difference in Y value of the 2 centroids; currently, 25 pixels is used as the tolerance. If more than one candidate is found, select the one with the minimal difference in eigenvector cross product. Parallel work is assigned by using the sorted centroid Y values to partition the processing into blocks of regions whose centroids share a common Y value.
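    The cross-product comparison at the heart of step 5 can be sketched as follows. This is an illustrative reconstruction, not the project's code; the vector layout and method names are assumptions:

```java
// Sketch of the step-5 matching criterion: score each candidate right-image
// region against a left-image region by the magnitude of the cross product
// of their secondary eigenvector axes, keeping the smallest score.
public class EigenMatch {
    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]
        };
    }

    static double norm(double[] v) {
        return Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    }

    // Smaller cross-product magnitude means the two axes are closer to
    // parallel, i.e., a better orientation match. Returns the index of the
    // best candidate, or -1 if there are none.
    public static int bestCandidate(double[] leftAxis, double[][] rightAxes) {
        int best = -1;
        double bestScore = Double.MAX_VALUE;
        for (int i = 0; i < rightAxes.length; i++) {
            double score = norm(cross(leftAxis, rightAxes[i]));
            if (score < bestScore) { bestScore = score; best = i; }
        }
        return best;
    }
}
```

    For unit vectors, the cross-product magnitude is the sine of the angle between the axes, so minimizing it selects the most nearly parallel orientation.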

     6.) Assign a disparity to each minimal octree region based on the difference in linear distance of the 2 octree centroids of the points in the coplanar minimal regions matched in step 5. The regions are small enough that the centroid value is accurate enough to give a reasonable disparity value. Work is assigned to parallel threads based on the size of the collection of minimal regions from the left image, as it is the left image that drives the process of depth assignment. The thread pool is set to a subset of the collection size, and the maximum number of threads is set to the number of elements in the collection.

    7.) Calculate the mean, variance, and standard deviation of the depth of all the minimal regions, once assigned by step 6. 
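    The step-7 statistics are a straightforward two-pass computation; a minimal sketch (the project's own accumulator may differ):

```java
// Compute mean, population variance, and standard deviation of the
// assigned region depths, as in step 7.
public class DepthStats {
    // Returns {mean, variance, standard deviation}.
    public static double[] meanVarStd(double[] depths) {
        double sum = 0;
        for (double d : depths) sum += d;
        double mean = sum / depths.length;

        double var = 0;
        for (double d : depths) var += (d - mean) * (d - mean);
        var /= depths.length;

        return new double[] { mean, var, Math.sqrt(var) };
    }
}
```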

    8.) Reset the left image octree and regenerate the coplanar regions via PCA at a higher octree level, using octree...

    Read more »

  • ROSCOE's ROSJavaLite powering a 24V gear-motor surplus wheelchair

    J Groff | 03/12/2018 at 21:24

    Permobil C400 robot, former surplus mobility chair. ROS Java Lite drives the entire system. This is one of 2 robots using radically different propulsion and hardware but the same exact code dropped in except for 1 config file, which is the point of the exercise. A remote PS3 controller and first person viewer with 5x7 color screen is used for remote control as the robot contains a wireless N access point.

  • New Brain: 24V 9-node supercluster in a box

    J Groff | 12/20/2014 at 20:13

  • ROSCOE's New Brain: Supercluster in a Shoebox

    J Groff | 12/11/2014 at 19:47

    I'm building my dream machine! A purpose-built supercluster in a box, matched to the BigSack software. It's not often as a developer that you get to build a machine to match your software. I am using 8 ODroid C1 1.5 GHz quad-core Cortex-A5 SBCs with 1 GB of memory each and gigabit ethernet. Also got 8 16GB eMMC cards, about 3x the performance of SD flash. Each of these 8 workers will service one tablespace of every data store; each database is composed of 8 memory-mapped tablespaces. The master will be an ODroid XU3 octocore Cortex-A15 with 2 GB, running at 2 GHz. The master handles the redo log files, multiplexes requests to the worker nodes, and gets blocks back. The transport is UDP with a fixed block size of 4096 bytes. Each worker node is on its own UDP port.

    A cluster lives and dies by its switching fabric, so I got two 6-port 24V 1000TX industrial switches; those were the most expensive part. I spent about a grand on the computer boards, but the switches were $1300 apiece. Had to get 2 DC/DC converters from 24V to 5V @ 8A to power the worker nodes; those were $300 a pop. Still, I doubt there is any mainframe that can match it pound for kilo when it's running. When it is, I can transplant it right into ROSCOE himself. I have accepted the fact that I need a separate 24V bus just to power computing.

    The eMMC is supposed to be 300 MB/sec, so that's 2.4 gigabytes per second total aggregate IO to permanent storage.

    8 procs at 1.5 GHz is 12 GHz, plus 2 on the master is 14 GHz aggregate clock.

    8 master cores + 4 cores per node x 8 nodes is 40 total cluster processor cores.

    10 gigabytes total cluster RAM. 8 x 16 GB worker nodes plus the 64GB master is 192GB eMMC total cluster permanent storage (without extra flash; eMMC leaves the Micro SD flash slot open).

    8 gigabit ethernet channels running UDP give a total of 1 gigabyte per second burst cluster throughput with all worker nodes fully active.
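    A fixed-block UDP transport like the one described above implies some framing so the receiver can read whole datagrams of a known length. A minimal sketch: only the 4096-byte block size comes from the text; the 4-byte length header and class name are assumptions:

```java
import java.util.Arrays;

// Pack variable-length payloads into fixed 4096-byte blocks, the datagram
// size the BigSack cluster transport uses.
public class FixedBlock {
    public static final int BLOCK_SIZE = 4096;

    // First 4 bytes carry the actual payload length (big-endian) so the
    // receiver can discard trailing padding.
    public static byte[] pack(byte[] payload) {
        if (payload.length > BLOCK_SIZE - 4)
            throw new IllegalArgumentException("payload exceeds block size");
        byte[] block = new byte[BLOCK_SIZE];
        block[0] = (byte)(payload.length >>> 24);
        block[1] = (byte)(payload.length >>> 16);
        block[2] = (byte)(payload.length >>> 8);
        block[3] = (byte)(payload.length);
        System.arraycopy(payload, 0, block, 4, payload.length);
        return block;
    }

    public static byte[] unpack(byte[] block) {
        int len = ((block[0] & 0xff) << 24) | ((block[1] & 0xff) << 16)
                | ((block[2] & 0xff) << 8) | (block[3] & 0xff);
        return Arrays.copyOfRange(block, 4, 4 + len);
    }
}
```

    Each block would then be sent as a single DatagramPacket to the worker's dedicated UDP port.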

    After assembly, I will run my new Canny/Hough/Inverse DCT transform and build the AMI Algebraic Machine Intelligence model based on the exemplars of the Caltech 101. I can then test the full Caltech 101 dataset on the new model. After refinement, it's on to Caltech 256, PASCAL VOC 2007, and whatever else. If I am cracking the Caltech 256 at 90% with my shoebox robot brain, attention will follow.

  • ROSCOE's Oriented Roundel

    J Groff | 08/31/2014 at 19:07

    No, it's not a band, yet. The ARDrone has a built-in machine vision capability that recognizes certain key images, one of which is shown below. The data stream responds with the estimated distance, which I believe is derived from the ultrasonic sensor, and the orientation in degrees of rotation, with the image below being at 90 reported degrees. So ROSCOE has a 'toy' that he gets agitated and tries to find if he can't see it within 500 cm; if he can, he will sit 500 cm away and stare at it. In a way it's an extremely sophisticated 'off' switch as we move into more motion studies.

    The movement algorithm is extremely simple: move forward by issuing 50mm forward-motion Twist messages on the ROS 'cmd_vel' topic every 500ms, which is a standard ROS communication pattern. Four or five inches per second indoors is enough for this thing; it will push the furniture out of the way, and not just the end tables either. The ARDrone's published sensor data and the external ultrasonics are partially fused to avoid shocks/off-axis events, and we publish an 'ardrone/vision' Pose2D range/orientation message if a roundel is detected by the ARDrone. The 'cmd_vel' movement publisher subscribes to the 'ardrone/vision' topic, and if we see the roundel within 500 cm we don't issue a movement publication on the 'cmd_vel' topic for that 500ms interval. If we are moving and an obstacle or shock is encountered, rotate 30 degrees off axis; if the ultrasonic readings are below 200mm on the bottom sensor or 300mm on the top sensor, back up 50mm, then continue to process Twist messages from the 'cmd_vel' ROS topic. The baseline behavior here will allow us to integrate the functor-based natural transformation cognitive functions now that there is a partial data fusion pipeline of video/accelerometer/gyro/ultrasonic/machine vision/magnetometer.
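    The movement rules above reduce to a small decision function evaluated every 500ms interval. The distance constants come from the text; the enum and method names are illustrative sketches, not the project's code:

```java
// One decision per 500ms cmd_vel tick, encoding the roundel/obstacle rules.
public class RoundelBehavior {
    public enum Action { STOP, BACK_UP_50MM, ROTATE_30_DEG, FORWARD_50MM }

    public static Action decide(boolean roundelSeen, double roundelCm,
                                double bottomMm, double topMm, boolean shock) {
        // Roundel within 500 cm: sit and stare (suppress movement this tick).
        if (roundelSeen && roundelCm <= 500) return Action.STOP;
        // Ultrasonic thresholds: 200mm bottom sensor, 300mm top sensor.
        if (bottomMm < 200 || topMm < 300)   return Action.BACK_UP_50MM;
        // Obstacle or shock while moving: rotate 30 degrees off axis.
        if (shock)                           return Action.ROTATE_30_DEG;
        // Otherwise, the normal 50mm forward Twist on 'cmd_vel'.
        return Action.FORWARD_50MM;
    }
}
```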

  • New XmlRpcServer on GitHub

    J Groff | 08/30/2014 at 03:14

    ROSJava requires a special version of XmlRpcServer with bugfixes. I have added to that by replacing ThreadPool with a ThreadPoolManager singleton that uses java.util.concurrent Executors, as intended by modern standards. Thread management is much better and I can see a much faster spinup of the Master and Slave servers. There also seem to be fewer NPEs, but I also refactored ROSJava to do better checks on collection members, so nulls are accounted for in 'equals' and 'hashCode' methods. Anyway, onward and upward. The new link is to the left.
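    A minimal sketch of the singleton-over-Executors pattern described above; the actual ThreadPoolManager in the fork differs in detail, and the class and method names here are assumptions:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Singleton wrapper around a java.util.concurrent thread pool, replacing
// hand-rolled thread management as described in the log.
public class ThreadPoolManager {
    private static volatile ThreadPoolManager instance;
    private final ExecutorService pool = Executors.newCachedThreadPool();

    private ThreadPoolManager() {}

    // Double-checked locking; 'volatile' makes the publication safe.
    public static ThreadPoolManager getInstance() {
        if (instance == null) {
            synchronized (ThreadPoolManager.class) {
                if (instance == null) instance = new ThreadPoolManager();
            }
        }
        return instance;
    }

    public void spin(Runnable task) { pool.execute(task); }

    public void shutdown() { pool.shutdown(); }
}
```

    A cached pool reuses idle threads, which is one way the faster Master/Slave server spinup could be realized.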

  • Fun with Obstacles

    J Groff | 08/29/2014 at 01:39

    This is the most powerful robot I have encountered. I lost the lower ultrasonic, so it kept getting 0 readings, making it think something was right on top of it; so of course it spins in circles trying to avoid the 0-distance object, duh. I tried to grab it to shut down the controller, but even at 1/10 power it cranked me. It will obviously have to learn that one 360-degree rotation is sufficient to decide a stop. Due to the kinematics, it seems that enough power to move the machine is too much for muscular override, given the weight of the thing.

  • Robusst

    J Groff | 08/19/2014 at 18:41

    Bus active for about 36 hours with full nav, video, ultrasonics, etc. publishing and subscribing. The unit is positioned such that when someone passes within 300mm range it will announce "Excuse me, you are nn millimeters too close to my feet" and/or "Excuse me, you are nn millimeters too close to my head", so it's easy to confirm the bus is still operational just by walking by, or when the cat is becoming petrified by walking past the lower ultrasonic and causing it to speak only of its 'feet'. The subscriber listens to 'range' topics from the ardrone and robocore nodes (Parrot and floor-level ultrasonics) and does a simple check, which results in the VoxHumana speech module activating or not. Occasionally, an error in the RS232 comms chain of lower ultrasonic -> Arduino -> core process -> ROS allows a zero to creep through, so you get "Excuse me, but you are zero millimeters too close to my feet". Passive debugging tools are neato!

    I did lose the Edimax wireless USB dongle that talks to the ARDrone boards from RPi 1, perhaps from heat. It gradually degraded such that errors from the nav thread were popping up in the form of broken DatagramSockets, so I added code to reconnect and tickle the port under those conditions; but after replacing the dongle, few broken sockets. Also had to work out all the static persistent routes for bootup.

    Serendipitous security was achieved when I moved the master registration server to the second RPi. RPi 1 is not set up to source-route packets to RPi 2, so it's not possible to connect to the master from outside of RPi 1. Nobody can hijack your 'bot by connecting wirelessly and spinning up a ROS node. About the most that could be done is to issue disconnects from the wireless network or cut off video or sensor data. So we are finally good to go with a stable, open platform and can move on to the really interesting part of this project.
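    The announcement check reduces to a few lines; this sketch also filters the spurious zero readings mentioned above. The threshold comes from the text, but the class, method names, and phrase construction are illustrative assumptions:

```java
// Decide whether a range reading warrants a spoken proximity warning.
public class ProximityAnnouncer {
    static final int THRESHOLD_MM = 300;

    // Returns the phrase to hand to the speech module, or null to stay quiet.
    public static String announce(int rangeMm, boolean lowerSensor) {
        // Drop the spurious zeros that occasionally leak through the
        // ultrasonic -> Arduino -> core -> ROS serial chain.
        if (rangeMm <= 0) return null;
        if (rangeMm >= THRESHOLD_MM) return null;
        String bodyPart = lowerSensor ? "feet" : "head";
        int tooCloseMm = THRESHOLD_MM - rangeMm;
        return "Excuse me, you are " + tooCloseMm
             + " millimeters too close to my " + bodyPart;
    }
}
```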

  • 100% Pure Functionality

    J Groff | 08/17/2014 at 22:32

    Replaced the xuggler dependencies preventing full implementation on Raspbian. Modified h264j to be optimized with ROSARDrone. End-to-end testing on RPi successful. Moved the master to roscoe2, the second RPi, to distribute load. Performance is acceptable at this point. Added the VoxHumana speech module to GitHub.

  • Robot Up

    J Groff | 08/15/2014 at 23:19

    After ensuring that no more localhost bindings were taking place by setting ROS_IP / ROS_HOSTNAME, the same Connection Refused problem occurred. Like many problems, this one was a combination of my own stupidity and a bug in the system. If ROS encounters a new node registration with the same name as an existing node, it will try to replace the slave server; however, it assumes an immediate slave server restart and tries to connect to that as-yet-unstarted server. That is the bug. The stupidity was my accidentally using the same node name as I did for the publisher, causing the situation to begin with. Eventually, after numerous retries and cringe-inducing stack dumps, a connection is made to the new server and traffic flows. My initial test case has ROSCOE listening to range and status, speaking the status message, and checking range for values < 100 cm, reporting those by saying "Excuse me, but you are <nnn> centimeters too close", all of which it now does via the ROS bus on Raspberry Pi!

View all 29 project logs

  • 1
    Step 1


    To configure the RPi to connect to the ARDrone boardset, edit /etc/wpa_supplicant/wpa_supplicant.conf to contain the following:

    ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev

    The /etc/network/interfaces should appear as follows.

    auto lo eth0 wlan0
    iface lo inet loopback
    iface eth0 inet static
    allow-hotplug wlan0
    iface wlan0 inet manual
    wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf
    iface default inet dhcp
    post-up route add -host wlan0
    post-up route add -host wlan0
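    A network stanza also needs to be added to wpa_supplicant.conf for the drone's access point. The SSID is specific to each drone, so the one below is only an illustrative placeholder; ARDrone APs are typically open, hence key_mgmt=NONE:

```
network={
    ssid="ardrone2_XXXXXX"
    key_mgmt=NONE
}
```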

  • 2
    Step 2


    The order of operations for building ROSJava is:

    1) Run ant on the build.xml in rosjava_bootstrap
    2) Proceed to rosjava_messages and its build.xml, which uses RosBase.jar from above
    3) Proceed to rosjava and build using RosBase.jar and RosMsgs.jar from above

  • 3
    Step 3


    These are the third-party libs needed for ROSJava:


    The result of executing all the build.xml ant tasks:

    RosBase.jar - org.ros.* implementations (topic,field,messages,namespace)

    RosMsgs.jar - standard messages (see pic)
    RosMsgsGeom.jar - geometry messages
    Ros.jar - ROS core

View all 7 instructions




J Groff wrote 07/31/2014 at 17:47 point
In the 'Wish List' department:
LIDAR! For $398.90! Which still seems like too much! I'll get one when it's the same price as a Kinect; $129.95 is my price point of choice!


Adam Fabio wrote 07/24/2014 at 03:26 point
Thanks for entering The Hackaday Prize Jonathan! ROSCOE is quite an amazing robot-in-progress! You've definitely done a ton of work just standing up your ROS-java fork. Keep the updates coming in, and don't forget about the video!

