
LabRATory Telepresence Robot

This robot moves vicariously for a laboratory mouse while its brain is imaged by a microscope!

Recently I was hired to design and build a telepresence robot for a neuroscience lab. They have built one of the best two-photon fluorescence microscopes in the world, allowing them to image living brain tissue!

Why the bot?

During imaging, the mouse's head must be fixed. However, this lab is specifically trying to target areas of the animal's visual cortex and motor cortex while it solves motion and navigation problems.

The mouse is perched atop a giant trackball. As the mouse moves across the trackball, the robot mimics the movement and streams live video back to a display in front of the mouse, allowing the lab to correlate the mouse's motion, field of vision, and neural activity!

My client benevolently allowed me to share ALL of my work: motor sizing, PCB design, firmware, software, control methods, etc. The only caveat is that I am not allowed to take my own footage of their testing, but hopefully they will make footage public soon.

Here is a quick video showing off some of the robot's functionality. I took this at 11:00 PM at the lab, so I apologize for the quality:


I'd like to begin by posting a few of the key design features before linking in all of the design files. Then each project update will showcase a phase of the design and how I planned to meet these requirements.

If feature creep scares you, then this lifestyle ain't for you! This project started as just the robot, and later grew to incorporate the trackball electronics, the video server, a GUI to wrap it all together, and bonus software features requested the week of delivery ;)

Here were the essential requirements for the robot itself:

    • Wirelessly stream low-latency video to computers within the same sub-network: 640×480 at less than 200 ms latency
    • Save the displayed video stream as a sequence of individual images with an associated timestamp of the moment each was displayed: ~10 ms accuracy
    • Mechanically mimic the kinematics of a lab mouse under a "head-fixed" position: top linear velocity of 50 cm/s, minimum linear acceleration of 2 m/s², top rotational speed of 2 rot/s (from a stopped position)
    • Control robot movement wirelessly from the trackball
    • Minimum battery life of 1 hour

Pretty lenient all things considered! No demands for any specific hardware, no requirements for any specific software, and given they're operating under an exploratory research grant, no "do or die" style deadlines.

My client expects this project to go through several iterations (which is always a sign of sanity in a potential employer), and who wouldn't?

Consider a testing session with the mouse under the 'scope for 30 minutes. You've got one camera on the microscope, one on the bot, and a handful of others monitoring mouse behavior and orientation, all sampling at ~30 frames/s. Between the two "main" video feeds, you're looking at over 100k frames of data (or about 126 GB of uncompressed pixel data). And although I don't understand all the details, I'd be pretty amazed if that was all the data you needed to uncover all of the convolutional networks of a rodent's visual cortex...
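For scale, here's a quick back-of-envelope sketch of that frame count. The 126 GB figure quoted above implies somewhat larger frames than the bare 640×480 RGB assumption I use here (e.g. the microscope camera at higher resolution or bit depth), so treat this only as an order-of-magnitude check:

```python
# Back-of-envelope estimate of raw video data per session.
# Assumptions (mine): two "main" 640x480 feeds, 24-bit RGB,
# 30 fps, 30-minute session.
FPS = 30
SESSION_S = 30 * 60
FEEDS = 2
W, H, BYTES_PER_PX = 640, 480, 3

frames = FPS * SESSION_S * FEEDS                     # frames across both feeds
gigabytes = frames * W * H * BYTES_PER_PX / 1e9      # raw pixel data

print(f"{frames:,} frames, ~{gigabytes:.0f} GB uncompressed")
```

Even with these conservative assumptions, a single half-hour session lands in the hundred-gigabyte range.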

Given the sheer size of the data that needs to be collected and analyzed, any improvements can save a lot of people's time, and are well worth the money!

_____________________________________________________________________________

I realize I still have a lot of information to unpack here from a documentation standpoint, so I put together a quick chart to show how data flows through this system:

trackball_backup.zip

The most up to date trackball scripts.

x-zip-compressed - 49.67 kB - 06/05/2018 at 00:00


fpvbot_scripts.zip

The most up to date bot scripts.

x-zip-compressed - 7.23 kB - 06/05/2018 at 00:00


Gerbers.zip

These are the Gerber files for the board that sits on top of the Raspberry Pi. It uses an STM32F446 with the mbed environment to handle all of the real-time motion control. All of the code is compiled natively on the Pi using PlatformIO from the command line, then uploaded to the MCU using OpenOCD & arm-none-eabi-gdb... There are a couple of errors with the current design that required one "bodge" jumper cable to power up the motor driver ICs. Also, one of the PWM pins is not natively supported by mbed and requires modifying some of the mbed source code. If you are looking at building one of these, please feel free to contact me!

x-zip-compressed - 24.29 kB - 06/04/2018 at 23:39


  • Raspberry Pi Zero: Controller Loop Stability

    Brett Smith • 02/07/2019 at 12:52 • 0 comments

    Raspberry Pi Zero: Controller Timing Stability

    Servo motor control is usually implemented with an embedded MCU, since very fast, consistent control loops are typically required for real-life motion control. But programming, capturing data, and PID tuning can be tedious in an embedded environment. So in this notebook we will examine Raspberry Pi Zero loop stability while controlling a brushed DC motor. We will be testing the loop timing under different conditions:

    1) Executing normally
    2) Executing as an individual python thread
    3) Executing during video streaming
    4) Executing with multiple controller threads
    5) Executing with overclocked CPU

    Procedure

    Because this platform is ultimately being developed for a Raspberry Pi robot with a live video stream using GStreamer, I will test the loop time both while the Raspberry Pi is streaming and not streaming video. A total of ten tests were performed in a variety of combinations of the above conditions. Those conditions can be seen in the table below:

    | | Video Not Streaming | Video Streaming | Overclocked CPU: Video Not Streaming |
    | --- | --- | --- | --- |
    | Main Loop | Test 1 | Test 3 | Test 7 |
    | Threaded Loop | Test 2 | Test 4 | Test 8 |
    | 5 Threaded Controllers | Not Tested | Test 5 | Test 9 |
    | 2 Threaded Controllers | Not Tested | Test 6 | Test 10 |

    We will build a list of execution times for each testing state and analyze their distribution. This will let us characterize the frequency and stability of the control loop.
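    That capture-and-analyze step can be sketched as follows. The real notebook times the controller's serial I/O loop; the dummy workload here is just a stand-in so the sketch runs anywhere:

```python
import time
import numpy as np

def time_loop(func, iterations=1000):
    """Collect one execution time per loop iteration."""
    runtimes = []
    for _ in range(iterations):
        start = time.monotonic()
        func()
        runtimes.append(time.monotonic() - start)
    return np.array(runtimes)

# Stand-in workload for one controller update (the real one does serial I/O).
dt = time_loop(lambda: sum(range(1000)))

mean = dt.mean()
jitter_99 = np.percentile(dt, 99) - np.percentile(dt, 1)
print(f"mean {mean * 1e6:.1f} us, "
      f"1-99% jitter {jitter_99 * 1e6:.1f} us, "
      f"approx. loop rate {1 / mean:.0f} Hz")
```

    Plotting the distribution of `dt` (e.g. with seaborn's `distplot`, as imported below) then shows whether the loop timing is tight or has long-tail stalls.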

    Software

    The Raspberry Pi uses Berryconda to manage Python virtual environments, and runs a native Jupyter notebook server to allow quick code editing and easy data capture.

    Hardware

    Between the Raspberry Pi and the motor there is a microcontroller and an H-bridge. The microcontroller is set up to measure encoder counts and send them to the Raspberry Pi on request. Likewise, the microcontroller listens on the serial line and sets the motor direction and PWM duty based on commands from the Raspberry Pi.


    In [43]:
    #! /home/pi/berryconda3/envs/pidtuner/bin/python
    %matplotlib inline
    
    import matplotlib 
    import seaborn as sns
    import matplotlib.pyplot as plt
    import serial
    import time
    import RPi.GPIO as GPIO
    import atexit
    import random
    import threading
    import numpy as np
    from scipy.stats import norm
    from scipy import stats
    
    In [3]:
    ser = serial.Serial(
        port='/dev/ttyS0',
        baudrate=115200,
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        timeout=1,
        xonxoff=False,
        rtscts=False,
        dsrdtr=False,
        writeTimeout=2
    )
    
    In [4]:
    def setPwm(pwm):
        message = str.encode('R' + str(pwm) + '!')
        ser.write(message)
    
    def getRevs():
        ser.write(str.encode('?!'))
        val = ser.readline()
        return (int(val))
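    For reference, these helpers assume a tiny ASCII protocol: a command character, an optional integer payload, and a `!` terminator. A parser for the receiving side might look like the following sketch (written in Python for clarity; the actual firmware runs on the microcontroller, and `parse_command` is my name, not from the project):

```python
def parse_command(raw: bytes):
    """Parse one 'R<pwm>!' or '?!' frame into (command, value)."""
    text = raw.decode().strip()
    if not text.endswith('!'):
        raise ValueError(f"unterminated frame: {raw!r}")
    body = text[:-1]
    if body == '?':
        return ('query', None)          # host wants encoder counts
    if body.startswith('R'):
        return ('pwm', int(body[1:]))   # signed PWM duty command
    raise ValueError(f"unknown frame: {raw!r}")

print(parse_command(b'R-128!'))   # ('pwm', -128)
print(parse_command(b'?!'))       # ('query', None)
```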
    
    In [5]:
    class PID (threading.Thread):
    
        setpoint = 0
        last = 0
        error = 0
        lastValue = 0
    
        Tkp = 0
        Tki = 0
        Tkd = 0
    
        def __init__(self, kp, ki, kd, direction, outputFunc, inputFunc, threadID):
            threading.Thread.__init__(self)
            self.kp = kp
            self.ki = ki
            self.kd = kd
            self.direction = direction
            self.threadID = threadID
            self.outputFunc = outputFunc
            self.inputFunc = inputFunc
            self.output = 0
            
            #Used for thread timing
            self.count = 0
            self.runtimes = []
    
            #This is used to terminate the thread
            self.shutdown_flag = threading.Event()
    
            self.min = 0
            self.max = 0
    
        def join(self, timeout=None):
            self.shutdown_flag.set()
            threading.Thread.join(self, timeout)
    
        def setLimits(self, outputMin, outputMax):
            self.min = outputMin
            self.max = outputMax
            
        def __run__(self):
            controllerInput = self.inputFunc()
            controllerOutput = self.compute(controllerInput)
            self.outputFunc(controllerOutput)
    
        def run(self):
            while not self.shutdown_flag.is_set():
                #self.__run__()
                dt = self.timeFunc(self.__run__)
                self.runtimes.append(dt)
                self.count += 1
    
        def timeFunc(self, func):
            start = time.monotonic()
            func()
            return time.monotonic() - start...
    Read more »
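    The snippet cuts off before the compute step. A minimal PID update consistent with the fields above (kp/ki/kd gains, a setpoint, and output limits from `setLimits()`) could look like this sketch; the flat `state` dict and the `pid_step` name are mine, not the project's:

```python
def pid_step(state, value, dt):
    """One PID update. `state` holds gains, setpoint, integrator,
    last error, and output limits (field names assumed)."""
    error = state['setpoint'] - value
    state['integral'] += error * dt
    derivative = (error - state['last_error']) / dt
    state['last_error'] = error
    out = (state['kp'] * error
           + state['ki'] * state['integral']
           + state['kd'] * derivative)
    # Clamp to the limits configured via setLimits()
    return max(state['min'], min(state['max'], out))

state = dict(kp=2.0, ki=0.0, kd=0.0, setpoint=10.0,
             integral=0.0, last_error=0.0, min=-100.0, max=100.0)
print(pid_step(state, 5.0, dt=0.01))   # error = 5, kp only -> 10.0
```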

  • Motor Sizing & Wheel Selection

    Brett Smith • 06/05/2018 at 09:10 • 0 comments

    Hello All!
    In this post I am hoping to showcase some of the design work that goes into motor selection. If you are curious about this process, I highly recommend the free resources available at the California Mechatronics Center.

    This company is actually run by one of my previous university professors, and the workflow presented SERIOUSLY cuts down on a lot of mechanical design iteration. The documentation shown there could certainly help you build much more precise CNC/servo systems than I present here (although beam deflection seems to be missing...).

    After all was said and done, this process seemed to work out fairly well considering all of the parts are DIY robotics components. It met all of our requirements on the first go!

    I did, however, make one glaring mistake: 

    After assembly and testing, the motors could stop so quickly that the robot would flip over! Pretty much a catastrophe at an industrial scale, but not a big deal with a low-mass system such as this. Certainly something to take into account in the future. Rather than programming in a maximum deceleration rate, a small counterweight remedied the problem.


    So let's dive into it!

    The primary issue in selecting a motor is ensuring that it can meet both your torque and speed requirements. If you have these specifications before you start, the worst is over! If not, you will have to estimate them yourself.

    Here is the big-boy checklist. Seriously, this process separates the engineers from the hobbyists.

    The first step involves creating a motion profile for each of your motors. The key here is to find your system's top speed and the amount of time the motors are allowed to spin up. This will help you determine the motor's required angular acceleration. See below.

    Notice that we have created two motion profiles. One helps determine the acceleration required to meet our requirement for the bot's linear motion (i.e. forwards and backwards movement), and the other determines the acceleration needed to meet the requirements for the bot's turning speed (i.e. spinning about its center).

    The big design takeaway from both these graphs (for our project, anyway): the relationship between wheel size, robot diameter, and the maximum required motor velocity.

    A smaller-diameter robot really cuts down on the acceleration required for turning, and larger wheels will likewise cut down the acceleration required for linear motion.
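    Those two relationships can be checked numerically. This sketch uses the requirements from the table at the top of the page; the wheel and bot dimensions are placeholder values I chose for illustration, not the final design numbers:

```python
import math

# Requirements from the project table
V_MAX = 0.50          # m/s, top linear velocity
A_MAX = 2.0           # m/s^2, minimum linear acceleration
SPIN_MAX = 2.0        # rot/s, top rotational speed of the bot

# Assumed dimensions (placeholders, not the final design values)
wheel_r = 0.030       # m, wheel radius
bot_r = 0.075         # m, distance from bot center to each wheel

# Linear motion: wheel angular speed and acceleration
w_linear = V_MAX / wheel_r                        # rad/s
a_linear = A_MAX / wheel_r                        # rad/s^2

# Turning about the center: each wheel traces a circle of radius bot_r
v_wheel_turn = SPIN_MAX * 2 * math.pi * bot_r     # m/s at the wheel
w_turn = v_wheel_turn / wheel_r                   # rad/s

def rpm(w):
    return w * 60 / (2 * math.pi)

print(f"linear: {rpm(w_linear):.0f} rpm, turning: {rpm(w_turn):.0f} rpm")
```

    With these placeholder dimensions the turning requirement dominates the motor speed budget, which is exactly why a smaller bot diameter (smaller `bot_r`) helps so much.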


    Now that we have our heads around that, let's take a look at our kinematics and calculate our peak torque:

    Luckily for me, this system is a simple one:

    • There are no major safety hazards if one of the motors fails.
    • There are no mechanics (belts, ballscrews) to complicate our calculations: our motors go to a gearbox and then to the wheels. Such mechanisms would add extra inertial loads and frictional forces to account for.
    • Our system is stable even if the motors are not running.
    • No shear force on the motor shafts outside the weight of the bot.

    So we don't really have to consider much in the way of frictional, gravitational, or thrusting forces.

    Pretty easy living! All we really have to determine is the inertia the bot imposes on the motors, which can be treated as a tangential load on the edge of each wheel (see the equation for Jrobot).

    The main thing to notice is that increasing wheel diameter is not without its drawbacks. While this does decrease the maximum required speed of the motor, it adds quite a bit more inertia (notice that the inertia exerted by the bot increases with the square of the wheel radius).
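    That square-law penalty is easy to verify. Here is a sketch with placeholder mass and gearing values (none of these numbers come from the actual bot, and the helper name is mine):

```python
# Reflected inertia of the bot mass as seen at each motor shaft.
# Mass treated as a tangential load at the wheel edge: J = m * r^2,
# split across the driven wheels and reduced by the gear ratio squared.
def motor_side_inertia(mass_kg, wheel_r_m, n_wheels=2, gear_ratio=30):
    j_robot = mass_kg * wheel_r_m ** 2           # kg*m^2 at the wheels
    return j_robot / n_wheels / gear_ratio ** 2  # reflected to the motor

# Placeholder numbers: a 1.5 kg bot, comparing two wheel radii
small = motor_side_inertia(1.5, 0.025)
large = motor_side_inertia(1.5, 0.050)
print(f"doubling wheel radius multiplies inertia by {large / small:.0f}x")
```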

    So all that is left is to accurately estimate all of our equation parameters. I use MATLAB, but this could easily be Excel. This will let you tweak each of your design parameters (in our case, bot diameter and wheel diameter). Let's plot all of our motor parameters as they vary with our bot dimensions.
    Here is the output of our MATLAB script:

    One parameter that is commonly overlooked is the ratio between motor inertia and load...

    Read more »
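    The log truncates here, but the check it alludes to is a standard servo-sizing step: compare the load inertia reflected through the gearbox against the motor's rotor inertia. A common rule of thumb keeps this ratio below roughly 10:1 for responsive tuning. All numbers below are placeholders, not values from the project:

```python
def inertia_ratio(j_load, gear_ratio, j_motor):
    """Load-to-rotor inertia ratio as seen at the motor shaft."""
    return (j_load / gear_ratio ** 2) / j_motor

# Placeholder values: load inertia at the wheel side (kg*m^2),
# a 30:1 gearbox, and a small DC motor's rotor inertia (kg*m^2)
ratio = inertia_ratio(j_load=1.0e-3, gear_ratio=30, j_motor=2.0e-7)
print(f"inertia ratio: {ratio:.1f}:1")
```

    Note how strongly the gearbox helps: reflected load inertia falls with the square of the gear ratio, which is often the cheapest way to fix a badly mismatched system.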
