A silly desktop animatronic using an rPi, OpenCV and ROS2
The first step to get anywhere is to get ROS2 and the camera working. Ideally that would have been a matter of some apt-gets, except there were a few complications.
This means I don't get an easy "apt install ros-iron" and call it a day. Either I get the driver working on the older distro, wait until May 2024 for a new LTS Ubuntu + ROS2, or get ROS2 working on the newer one via alternative means. Since the driver is meant for a significantly newer Linux kernel and I know nothing about drivers, the former is pretty much off the table. I gotta find a different way to get ROS2.
There are two ways to do this: either I build it myself, OR I use a Docker container.
I would prefer not to wrestle with Docker just yet, even though it is the easiest way to get things running. If I can run something without any overhead I'd rather use that, so I figured building ROS2 would be worth a try. I can always migrate to a newer distro with Docker later and see if it has any impact.
Let me pre-empt by saying I came to regret this decision and urge anyone to just use 24.04 LTS + ROS2 Jazzy (if available by the time of completion) or use Docker to run it on Raspberry Pi OS. The following is more for the curious/brave. It took weeks to get things compiled and working on a Zero 2 (memory constraints). NO guarantee the following even works!!!
This assumes a freshly flashed OS, just booted into for the first time and being accessed via SSH.
First things first: Ubuntu doesn't start with swap space, which isn't a good thing on a tiny Pi like the Zero 2. Gotta get that fixed, preferably before the kernel starts to throw a fit and the Pi freezes. First we allocate about 2 GB to a swapfile, give it permissions and then make it a swap.
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
We now have a 2 GB swap file ready and active. To make it permanent, we open up the fstab file:
sudo nano /etc/fstab
And add the entry:
/swapfile swap swap defaults 0 0
So that it will always be active whenever the Pi is started.
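With the fstab entry in place, it's worth a quick sanity check that the swap actually shows up; these two read-only commands work on any recent Ubuntu:

```shell
# Confirm the swapfile is active and see total memory + swap
swapon --show
free -h
```

If the `free -h` Swap row still reads 0B after a reboot, the fstab entry didn't take.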
After that we go through the apt update and apt upgrade routine (you should know how it goes).
Time to get the first set of prerequisites needed. Run the following command:
sudo apt install -y git colcon python3-rosdep2 vcstool wget python3-flake8-docstrings python3-pip python3-pytest-cov python3-flake8-blind-except python3-flake8-builtins python3-flake8-class-newline python3-flake8-comprehensions python3-flake8-deprecated python3-flake8-import-order python3-flake8-quotes python3-pytest-repeat python3-pytest-rerunfailures python3-vcstools tmux
This should hopefully get everything that is needed. If some parts don't install, note them down for later checking. Some Python parts may refuse to install via the apt approach. Often you can force-install them anyway via "pip install PACKAGE --break-system-packages". The flag tells pip to ignore the system package manager and do it anyway.
Next, create a workspace and pull the ROS2 Iron sources into it:
mkdir -p ~/ros2_iron/src
cd ~/ros2_iron
vcs import --input https://raw.githubusercontent.com/ros2/ros2/iron/ros2.repos src
sudo rm /etc/ros/rosdep/sources.list.d/20-default.list
sudo apt upgrade
sudo rosdep init
rosdep update
rosdep install --from-paths src --ignore-src -y --skip-keys "fastcdr rti-connext-dds-6.0.1 urdfdom_headers"
With this, everything should be in place to build. But don't do it yet!!
Now it is of vital importance to start disabling...
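Whatever ends up on the disable list, the build step itself also needs taming on 512 MB of RAM. A sketch of the kind of invocation that keeps memory use down on small boards (flags from colcon itself; untested here, so treat it as a starting point):

```shell
cd ~/ros2_iron
# One package at a time (--executor sequential) and one compile job at a
# time (MAKEFLAGS=-j1): painfully slow, but it fits in RAM plus swap
MAKEFLAGS="-j1" colcon build --symlink-install --executor sequential
```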
First things first. Trying to figure out some things to look at and experiment with.
The idea isn't the most complex. No actual mobility is needed; it just needs to be able to emote, for which the ability to move its head should be enough, along with some OLED eyes. I came up with an idea involving 6 servos: 3 for the head movement, 1 for the neck (to tilt forward/backward) and two for the ears.
The overall skeletal structure would be akin to your typical Egyptian cat statue posture. The body holds the logic, with ribbon cables going to the various components. The outer shell is mounted with screws onto the skeletal structure. The initial aim is for a low-poly look, because I think it would make my life a little easier and not look ghastly for a first try.
The only thing I am not certain on is how to deal with the camera. I am dead set on using a wide-angle camera. They give a very wide field of view (about 120 degrees), which makes them ill-suited for photos but great for detection. Mounting it in the head would be the obvious way, but the wide view may be enough for a torso mount. Something to try out first.
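To put that 120 degrees in numbers, simple trigonometry gives the horizontal span the lens covers at a given distance (the function name and the exact 120° figure are my own assumptions for illustration):

```python
import math

def coverage_width(distance_m: float, fov_deg: float = 120.0) -> float:
    """Horizontal span (in metres) visible at a given distance for a lens
    with the given horizontal field of view."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

# At 1 m from a torso mount, a 120-degree lens sees a ~3.5 m wide slice of the room
print(round(coverage_width(1.0), 2))
```

Which is why a torso mount might work: even up close, most of the desk ends up in frame.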
This will take a bit more than your average controller to pull off. So a single-board computer is the obvious choice and on that the Raspberry family stands unrivaled. Plus they apparently aren't made out of unobtanium anymore.
Gonna try to get the job done with a Zero 2 first. It is the smallest, and the lack of USB and Ethernet is of no concern for this project. The only limitation is memory: at 512 MB it doesn't have that much room, but by going headless it should be possible to keep memory use low enough.
To handle some of the more specific I/O, such as controlling the servos, an RP2040 is to be paired up as a co-controller. An advantage is that the Raspberry itself can program the RP2040 by bitbanging an SWD interface.
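A rough sketch of what that flashing step looks like, going off the Raspberry Pi Pico getting-started guide (assumes the Raspberry Pi fork of OpenOCD, the SWD pins wired per that guide, and a firmware.elf built elsewhere):

```shell
# Bitbang SWD over the Pi's GPIO and flash the RP2040, no USB cable involved
openocd -f interface/raspberrypi-swd.cfg -f target/rp2040.cfg \
        -c "program firmware.elf verify reset exit"
```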
The main form of input will be a Camera. Ideally the Camera Module 3 Wide is used as its wide field of view is great for detection tasks.
Other inputs will be figured out as I go along.
A project of this scope is kind of annoying to do in a single program: so many components that need individual testing, calibration and constant iteration. Since the project is already planned to run on a single-board computer with a regular operating system, it makes sense to use its ability to handle task scheduling and divide the project over multiple individual programs that together make up the animatronic's software.
To let the processes operate in tandem, a method of inter-process communication is needed that lets them pass data between each other. E.g. the vision program tells the behavior agent that a face was spotted, the behavior agent then tells the animation program to track it, etc. A simple Subscriber/Publisher or Request/Reply message system should do the trick.
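Stripped of any transport, that pattern is just named topics with callbacks. A toy in-process sketch of the chain described above (the `MessageBus` class and topic names are made up for illustration; ROS2 and ZeroMQ do the same thing across processes):

```python
from collections import defaultdict
from typing import Any, Callable

class MessageBus:
    """Minimal publish/subscribe hub: subscribers register a callback per topic."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: Any) -> None:
        for callback in self._subscribers[topic]:
            callback(message)

# The chain from the text: vision spots a face -> behavior agent reacts -> animation tracks
bus = MessageBus()
log = []
bus.subscribe("face_detected", lambda msg: (log.append("behavior saw a face"),
                                            bus.publish("track_target", msg)))
bus.subscribe("track_target", lambda msg: log.append(f"animation tracking at {msg}"))
bus.publish("face_detected", (120, 80))
print(log)
```

The real systems add the hard parts this toy skips: discovery, serialization and delivery across process boundaries.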
Ideally I would be able to use the Robot Operating System 2 (ROS2), which is specifically intended for this kind of task. It gives any C++/Python project run through it access to a Data Distribution Service (DDS) that lets the processes discover each other and pass messages to one another. It is however not the easiest to get working, as it is maintained specifically for the Ubuntu operating system; for others you gotta build it yourself.
Alternatively, if ROS2 proves difficult to get operational (you never know with embedded Linux), ZeroMQ can act as a fallback. Much like ROS2 it lets processes pass messages to each other, but instead of a DDS it uses more conventional transports: TCP between devices, or a local Unix domain socket between processes on the same machine (the socket shows up as a file, but the traffic goes through the kernel, not the flash storage). It is lightweight, though it only covers the transport; ROS2 brings a lot more tooling on top, so it stays the fallback option.
Should be enough to get a start.