OpenCat is the open-source Arduino and Raspberry Pi-based quadruped robotic pet framework developed by Petoi, the maker of futuristic programmable robotic pets.
The goal is to foster collaboration in quadruped (four-legged) robotic research, education, and the engineering of agile and affordable quadruped robot pets, bring STEM concepts to the masses, and inspire newcomers (kids and adults alike) to join the robotic AI revolution and create more applications.
OpenCat has been deployed on Petoi's palm-sized, lifelike bionic robot cat Nybble and high-performance robot dog Bittle. We have established a production line and can ship these affordable robotic kits and accessories worldwide.
This project provides an open-source base platform for creating programmable gaits and locomotion, deploying inverse-kinematics quadruped robots, and bringing simulations to the real world via the C/C++/Python programming languages.
Open the Anaconda Prompt (Windows) or the Terminal (Linux / macOS) and enter the following commands to create and activate a virtual environment (the environment name is "venv", and can also be customized to other names):
conda create --name venv
conda activate venv
Follow the guide to install the Ailia SDK, which will be used for pose detection. The guide mainly covers two steps: downloading and installing the Ailia SDK, and installing the related Python library files.
When downloading the Ailia SDK, you need to enter a valid email address to receive the download link and license file. The free license is valid for one month. The SDK is about 2 GB, so the download takes a while.
Move the downloaded license file (AILIA.lic) to the directory where bootstrap.py is located ({ailia sdk directory}/python), and then follow the steps to continue installing the Ailia SDK.
cd {ailia sdk directory}/python
python3 bootstrap.py
pip3 install .
Download the requirements.txt from this page (https://github.com/axinc-ai/ailia-models) to the directory where bootstrap.py is located before running the following command:
pip install -r requirements.txt
The installation may take about 30 minutes.
Use the USB uploader or Bluetooth module to connect the robot and power it on. The computer needs to be connected to a camera device.
Run the following command to start the OpenCat Imitation program:
cd {your file directory}/OpenCat-Imitation/
# set python import path
export PYTHONPATH=$PWD:$PWD/serialMaster
python opencat_imitation/imitation.py -v0
Alternatively, you can execute run.sh within the OpenCat-Imitation/ directory to run these steps in one go.
2. Run on the Jetson Nano 2GB Developer kit
You may want to run the demo on a Jetson Nano to experiment with some GPU features.
The developer kit uses a microSD card as a boot device and for main storage. It’s important to have a card that’s fast and large enough for your projects; the minimum requirement is a 32GB UHS-1 card. Many projects with Jetson Nano 2GB Developer Kit will utilize swap space on the MicroSD Card due to only 2GB physical memory. For this reason, we recommend 64GB or larger microSD cards. High-endurance microSD cards are also recommended.
1. Please refer to the user guide to flash the system image (JetPack 4.6) onto the microSD card and complete the system initialization.
2. Use a network cable to connect the Jetson Nano development board to a router or other computer hosts so that it can access the Internet.
3. Clone or download the code from GitHub and install the Ailia SDK according to the guide. The procedure is the same as step 3 of "Run on the host computer" above, except that there is no need to execute:
pip install -r requirements.txt
4. Install the relevant python library files using the following command:
sudo apt install python3-pip
sudo apt install python3-matplotlib
sudo apt install python3-scipy
pip3 install cython
pip3 install numpy
sudo apt install nvidia-jetpack
pip3 install dataclasses
pip3 install pyserial
Petoi, the maker of futuristic robotic pets, today launches the Bittle robot dog STEM kit, a lower-cost version of its palm-sized robot dog that can help teach tech enthusiasts, young and old, about STEM, robotics, and coding.
Small but agile, Bittle can be programmed to walk, do tricks, and roll around just like a real dog. Differing slightly from Bittle V1, the STEM kit contains plastic-gear servos instead of metal-gear ones, making it lighter and more flexible in its movements.
We just released BiBoard, an ESP32-based robot dog controller equipped with a high-performance processor, larger memory and storage, wireless connectivity, and audio support.
Check out the following demo videos of BiBoard running on Bittle to see its performance:
Arduino developers, robotics coders, and engineers will love this board at a reasonable price.
Learn how to dress up Bittle and program it to play Halloween comics
Story
Hi there,
It's been two years since our launch of the Bittle robot. Last weekend, I made a horror comic clip with Bittle to celebrate Halloween. You may check out the final cut and read the following tutorial to make your Petoi robots play like a pro.
Idea
The idea came from the many cute animal videos of dogs and cats dressed in costumes.
I have both the Bittle dog and Nybble cat robots handy. Since their leg structures are very similar to those of real animals, I thought it would be funny to dress them up. However, the robots are only about 15 cm in body length, so I had to make many adjustments to fit them into the tiniest costumes made for real pets.
Hardware
I printed a pumpkin cat head before, but the original Thingiverse design file seems to have been deleted. You may find some variants or design your own.
I ODMed some ultrasonic sensors with RGB LEDs; each has three WS2812 RGB LEDs per column. They can be programmed as blinking eyes using the Adafruit NeoPixel library.
I inserted the ultrasonic sensor into the pumpkin head. I also glued a small plastic block between the pumpkin and Bittle's head to raise the pumpkin above the tall collar.
Now the main character is ready to go.
Storyline
I need to add some dramatic storylines to make the simple movements less boring. I have some anatomy models for reference when designing bionic robots. They happen to fit the Halloween theme perfectly.
They also look pretty creepy. The assassin (Bittle) should get shocked when facing his mighty victim. I glued a small magnet to the assassin's hand to make the dagger detachable. The magnet's strength is tuned by a thin layer of hot glue so that the dagger will drop with a moderate shock.
I will also utilize the built-in behavior "check-around" and "backflip" to make the assassin observe the surroundings and jump backward when shocked.
Software
Over the past years, I've optimized the OpenCat software to make it user-friendly. I only need to uncomment the macro definition to activate the LEDs on the ultrasonic sensor.
I need to disable the distance measuring function to stop the robot from automatic reactions.
The basic Arduino code defines the instinctive motions and reactions of the robot. It's well encapsulated, so users don't need to worry about the tedious hardware details of a complex robot. A set of top-level string commands controls it through the serial port.
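As a minimal sketch of sending such a top-level string command from Python: the token "ksit" and the newline terminator follow the OpenCat convention of short ASCII commands, but the port name and baud rate are assumptions and will vary per setup.

```python
# Hedged sketch: send a top-level string command to an OpenCat robot over
# the serial port. "ksit" and the newline terminator are assumed from the
# OpenCat convention of short ASCII commands; "/dev/ttyUSB0" and 115200
# are placeholders for your own port and baud rate.

def encode_command(token: str) -> bytes:
    """Encode a command token as the ASCII bytes sent over the wire."""
    return (token + "\n").encode("ascii")

if __name__ == "__main__":
    import serial  # pyserial: pip install pyserial
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
        port.write(encode_command("ksit"))  # ask the robot to sit
        print(port.readline())              # echo/acknowledgement, if any
```

Keeping the encoding in its own small function makes the command layer easy to reuse from a larger script or to unit-test without a robot attached.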
I used the Skill Composer to create and preview a new jump behavior. Below are two demos showing how the Skill Composer works.
I used a Mac to create a Python script and align all the events in order. You can read it like a regular play script.
The script is executed in the terminal with python3 hlw.py. It sends queries to all the existing serial ports in parallel, decides whether a serial port is connected to a Petoi robot, and then sends the tasks over that port. The serial connection can be wired or wireless. For the video, I used the Bluetooth dongle to eliminate the wires.
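hlw.py itself is not shown here, but a "play script" in that style might look like the following sketch: a list of (seconds_to_wait, command) events executed in order. The command names and timings are invented for illustration, and send() is a stand-in for writing to whichever serial port was detected as a Petoi robot.

```python
# Hedged sketch of a play script: each event waits a delay, then fires a
# command. Command tokens and timings are invented for illustration.
import time

SCRIPT = [
    (0.0, "ksit"),  # sit and wait for the scene to start
    (2.0, "kck"),   # look around for the victim
    (3.0, "kjmp"),  # jump back in shock
    (1.0, "d"),     # rest
]

def cumulative_times(script):
    """Return the absolute time at which each command fires."""
    t, out = 0.0, []
    for delay, cmd in script:
        t += delay
        out.append((t, cmd))
    return out

def play(script, send):
    """Run the script in real time, sending each command in order."""
    for delay, cmd in script:
        time.sleep(delay)
        send(cmd)

if __name__ == "__main__":
    play(SCRIPT, send=print)  # replace print with a real serial writer
```

Reading the event list top to bottom reads like a stage script, which is the point: the choreography lives in data, not in control flow.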
After the workflow is well-tuned, I can make Bittle repeatedly play the sequences and shoot videos from different perspectives with a single camera. Bittle is the most patient actor and only complained about a low battery once!
I shot for about four hours to collect all the media resources. It took about 20 tries to get the dagger to drop in the best direction. The post-editing took even more time: searching for properly licensed BGM, clipping the appropriate time window, aligning the soundtrack with the motion, and making other tedious adjustments to achieve the best result.
Result
The final video (at the beginning of this post) turned out cinematic. My friends loved it, and I even got likes from a few professional film directors. They want to use Nybble or Bittle as the hero in their movies.
Learn how to use a voice control module for Raspberry Pi with Petoi Bittle — a palm-sized, open-source, programmable robot dog for STEM and fun. - by Friende Peng, intern engineer at Petoi in 2021 summer.
Petoi's Bittle is a palm-sized, open-source, programmable robot dog for STEM and fun. Bittle can connect with Raspberry Pi and can be easily extended. The goal was to develop a real-time voice control module for Bittle and command Bittle to do some actions.
Demo Video
Voice Control Petoi with USB Adapter & MacBook Pro
Voice Control Petoi with Raspberry Pi Model 3 A+
Abstraction
The conclusion is that I use VAD (Voice Activity Detection) + DTW + Vosk.
Use Python to record
I used PyAudio at the beginning, but it is an old library, so I switched to sounddevice and soundfile instead.
Command/Key Words Recognition
From a functional point of view, the methods to do this can be divided into:
Speech to Text, then look up the commands in the text. One advantage is that it can be combined with NLP applications, but full speech-to-text is overkill for recognizing a handful of commands.
Use acoustic features to do analysis and detect commands.
DTW (Dynamic Time Warping) (Used)
This belongs to the second category and is similar to template matching. DTW calculates the cost of aligning one piece of audio with a template audio, and we pick the template with the lowest cost. This method needs no training and still works when you add new commands. The downside is that the computation is time-consuming, but at least the command recordings are short, and we can eliminate the silence and extract MFCC (Mel-frequency cepstral coefficient) features.
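The matching idea above can be sketched in a few lines of pure Python. A real implementation would compare sequences of MFCC frames (vectors); 1-D sequences are used here only to keep the dynamic-programming recurrence visible.

```python
# Minimal DTW sketch: cost[i][j] is the cheapest alignment of the first i
# samples of `a` with the first j samples of `b`, built from the classic
# match / insertion / deletion recurrence.

def dtw(a, b):
    """Dynamic Time Warping cost between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j - 1],  # match
                                 cost[i - 1][j],      # insertion
                                 cost[i][j - 1])      # deletion
    return cost[n][m]

def classify(query, templates):
    """Pick the command whose template has the lowest DTW cost."""
    return min(templates, key=lambda name: dtw(query, templates[name]))
```

For example, classify([1, 2, 3, 3], {"up": [1, 2, 3], "down": [3, 2, 1]}) returns "up", because DTW absorbs the repeated 3 at zero cost.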
Vosk (Used)
Offline recognition; provides lightweight tflite models for low-resource devices.
It requires 16-bit, 16 kHz, mono-channel audio. A new version supports Chinese.
I tested it using both unstripped and silence-stripped audio with both the large and small models, but it did not do well with Chinese. For example:
起立 ("stand up") -> 嘶力/成立
向前跑 ("run forward") -> 睡前跑
向前走 ("walk forward") -> 当前走
So I tested it again using English:
Hey Bittle
Stand up
Walk forward
Run forward
I have used 16 recordings so far. An empty result is returned when it encounters OOV (out-of-vocabulary) words; "Bittle" would be recognized as "be to". After silence elimination, some results changed from wrong to correct, and some changed from correct to wrong (possibly because the silence between words was also reduced).
Of the 16 English tests, 9 were correct; of the 16 Chinese tests, 3 were correct.
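The silence elimination mentioned above can be approximated with a simple energy-based trim. This sketch is an assumption-laden stand-in for a real VAD (e.g. WebRTC's): the frame size and threshold are arbitrary illustrative values.

```python
# Hedged sketch of energy-based silence elimination: split the signal into
# fixed-size frames and keep only frames whose mean energy exceeds a
# threshold. Frame size (160 samples = 10 ms at 16 kHz) and threshold are
# illustrative; a purpose-built VAD is more robust.

def frame_energy(frame):
    """Mean squared amplitude of one frame."""
    return sum(s * s for s in frame) / len(frame)

def strip_silence(samples, frame_size=160, threshold=0.01):
    """Return the samples with low-energy frames removed."""
    voiced = []
    for start in range(0, len(samples), frame_size):
        frame = samples[start:start + frame_size]
        if frame and frame_energy(frame) >= threshold:
            voiced.extend(frame)
    return voiced
```

Trimming like this shortens the sequences fed to DTW and removes the leading/trailing silence that otherwise dominates the alignment cost.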
Petoi's popular open source quadruped robotic pet framework OpenCat has been upgraded to 2.0 with major updates to the Arduino codes.
We redesigned the code structure to make the workflow easier to understand and modify:
We improved the smoothness of motion and the speed of balance adaptation, and fixed many bugs in the original workflow; for example, the robot no longer skips commands randomly. WriteInstinct.ino and OpenCat.ino were combined into a single OpenCat.ino, with the #define MAIN_SKETCH macro used to switch between the two modes.
Python API Updates and Demos
We have more detailed documentation of the OpenCat Python API and demos to help users connect the robot with other programs, such as voice, vision, gaits trained by deep learning, and ROS. The API also allows the robot to perform an unlimited sequence of tasks defined by a preset scheduler.
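One way such a preset scheduler could work is sketched below; the command tokens are invented for illustration, and in practice each task would be sent to the robot over the serial connection rather than printed.

```python
# Hedged sketch of a preset scheduler producing an endless task stream, in
# the spirit of the "unlimited sequence of tasks" above. Tokens are
# invented; each would normally be written to the robot's serial port.
import itertools

def preset_scheduler(tasks):
    """Cycle through a preset task list forever."""
    return itertools.cycle(tasks)

def take(scheduler, n):
    """Pull the next n tasks (useful for previews and tests)."""
    return list(itertools.islice(scheduler, n))

if __name__ == "__main__":
    sched = preset_scheduler(["ksit", "kwkF", "kbalance"])
    print(take(sched, 5))
```

Because the scheduler is a lazy iterator, the consuming loop can run indefinitely without ever materializing the full task list.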
Please offer a cat body like the dog's; the dog is cool and sleek... The cat, made of wood, seems cheap and lacks a polished look. I would love a sleek-looking cat model to print.
We plan to release a new version of Nybble with PVC parts in 2023. Please stay tuned.