SamuRoid: 22-DOF Embodied AI & ROS Humanoid

Open-source bipedal robot powered by Raspberry Pi 4B. Features IK-based gait, 30kgf.cm bus servos, and DeepSeek LLM integration for AI research.

Overview

SamuRoid is a multi-modal, 22-DOF bionic humanoid based on the Raspberry Pi 4B. It integrates AI vision, voice interaction, and LLM reasoning, providing a powerful platform for research and high-end robotics hacking.

Mechanical & Actuators

  • Structure: Aluminum alloy with 22 degrees of freedom (head 2, shoulder 2, arm 4, hand 2, leg 10, foot 2).
  • Servos: XRS300 high-voltage serial bus servos (30kgf.cm @ 12V), enabling 30+ preset actions such as kicking, dancing, and gesturing.

Electronic System

  • Brain: Raspberry Pi 4B (4GB) + PWR.ROSBOT.X driver board.
  • Connectivity: WiFi, BLE 5.0, Gigabit Ethernet.
  • Sensing: Robot-Eye 4.0 (1080P), MPU6050 6-axis IMU, and a high-precision USB mic.
  • Expansion: Fully broken-out GPIOs compatible with 40+ modular sensors.

Software & AI Architecture

  • Control: ROS Melodic on Ubuntu 18.04, with an IK-based gait and an inverted-pendulum algorithm for dynamic balance.
  • Vision: OpenCV-powered facial recognition, color tracking, and automatic ball targeting.

Project Details: Samuroid Technical Deep-Dive

1. Project Vision

SamuRoid is an advanced 22-DOF bipedal humanoid platform designed for the intersection of Embodied AI and real-time bipedal locomotion. By bridging Large Language Models (LLMs) like DeepSeek with the ROS (Robot Operating System) ecosystem, SamuRoid transforms high-level semantic intent into precise physical movements.

2. Mechanical Design & Kinematics

The chassis is constructed from high-strength aluminum alloy with an electrostatic spray finish, ensuring structural rigidity for dynamic balancing.

  • Degrees of Freedom (22 DOF) Breakdown:
    • Head: 2 DOF (Pan/Tilt for vision tracking)
    • Shoulders/Arms: 2 DOF (Shoulder) + 4 DOF (Arms) + 2 DOF (Hands)
    • Lower Body: 10 DOF (Legs) + 2 DOF (Feet)
  • Actuation: Powered by XRS300 High-Voltage Serial Bus Servos.
    • Stall Torque: ≥30kgf.cm @ 12V.
    • Feedback: Real-time position, temperature, and load monitoring via the serial protocol.
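Bus-servo commands and feedback travel as framed packets over the serial link. As a rough illustration, here is a minimal Python sketch of building a position-command frame; the header bytes, command ID, length field, and checksum scheme are assumptions modeled on common half-duplex bus-servo protocols, not the actual XRS300 format (consult its datasheet for the real frame layout):

```python
import struct

def build_servo_packet(servo_id: int, cmd: int, params: bytes) -> bytes:
    """Build a command frame for a bus servo.

    Hypothetical frame layout: header, id, length, cmd, params, checksum.
    The checksum is the one's complement of the byte sum, a common choice
    in serial-servo protocols.
    """
    body = bytes([servo_id, len(params) + 2, cmd]) + params
    checksum = (~sum(body)) & 0xFF          # one's-complement checksum
    return b"\x55\x55" + body + bytes([checksum])

def position_command(servo_id: int, position: int, time_ms: int) -> bytes:
    """Move a servo to `position` (0-1000 counts) over `time_ms` milliseconds."""
    params = struct.pack("<HH", position, time_ms)  # little-endian u16 pair
    return build_servo_packet(servo_id, 0x01, params)

# Example: send servo 3 to mid-range over one second
pkt = position_command(3, 500, 1000)
```

In a real deployment the driver board would write `pkt` to the half-duplex UART and then read back a reply frame carrying the position, temperature, and load fields mentioned above.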

3. Electronic Architecture & Sensing

The system adopts a master-slave control architecture to balance high-level AI reasoning and low-level motor control.

  • Compute Brain: Raspberry Pi 4B (4GB RAM) running Ubuntu 18.04.
  • Motion Controller: PWR.ROSBOT.X dedicated driver board. It handles DC-DC power management (SY8120ABC), audio amplification, and acts as a hardware abstraction layer for the 22 servos.
  • Sensing Suite:
    • IMU: MPU6050 6-axis gyroscope for attitude estimation and gait stabilization.
    • Vision: 1080P 120° Wide-angle USB camera (Robot-Eye 4.0).
    • Audio: High-precision AEC (Acoustic Echo Cancellation) Microphone + 1W Speaker for voice interaction.
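The MPU6050 supplies both gyroscope rates (low drift short-term) and accelerometer tilt (stable long-term but noisy), which are typically fused for attitude estimation. A minimal sketch of one common fusion approach, a complementary filter; the filter choice and the gain `alpha` are illustrative assumptions, not necessarily the project's actual estimator:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One step of a complementary filter fusing gyro and accelerometer.

    pitch: previous pitch estimate (rad); gyro_rate: angular rate (rad/s);
    accel_x/accel_z: accelerometer readings (g); dt: timestep (s).
    alpha weights the integrated gyro against the accelerometer tilt:
    the gyro dominates over short horizons, the accelerometer corrects drift.
    """
    accel_pitch = math.atan2(accel_x, accel_z)   # tilt from the gravity vector
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Example: with a stationary robot tilted 45 deg, the estimate converges
# toward the accelerometer-derived angle over repeated updates.
pitch = 0.0
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_x=1.0, accel_z=1.0, dt=0.01)
```

The fused pitch (and an analogous roll estimate) would then feed the gait stabilizer described in the next section.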

4. Software Stack & Locomotion Algorithms

SamuRoid is fully integrated with ROS Melodic. The codebase is open-source and supports both C++ and Python development.

  • Locomotion Engine: Implementation of Inverse Kinematics (IK) combined with the Linear Inverted Pendulum Model (LIPM). This ensures the Center of Mass (CoM) remains stable during dynamic gait transitions.
  • Vision Pipeline: OpenCV-based modules for face recognition, color tracking, and QR code localization.
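The LIPM mentioned above has a closed-form solution that gait planners use to predict where the CoM will be at foot-switch time. A small sketch of that prediction; the CoM height `Z_C` is a guessed value for a robot of this size, not a published spec:

```python
import math

G = 9.81      # gravity (m/s^2)
Z_C = 0.22    # assumed constant CoM height (m) -- illustrative, not measured

def lipm_state(x0, v0, p, t, z_c=Z_C):
    """Closed-form LIPM CoM position and velocity at time t.

    x0, v0: initial CoM position (m) and velocity (m/s) in the sagittal plane;
    p: stance-foot (ZMP) position. Solves x'' = (g / z_c) * (x - p), the
    linear inverted pendulum dynamics.
    """
    tc = math.sqrt(z_c / G)                    # pendulum time constant
    c, s = math.cosh(t / tc), math.sinh(t / tc)
    x = (x0 - p) * c + tc * v0 * s + p
    v = (x0 - p) / tc * s + v0 * c
    return x, v
```

A useful property for step planning is that the "orbital energy" v^2 - (g/z_c)(x - p)^2 is conserved along each single-support phase, which lets the planner pick the next foothold p so the CoM neither stalls nor runs away.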

5. Embodied AI: Integrating LLMs

The defining feature of SamuRoid is its multimodal AI integration.

By connecting to DeepSeek and Doubao LLM APIs, the robot performs semantic parsing of natural language. Instead of hard-coded commands, the robot can understand intent:

  • Input: "I am tired, show me some fun."
  • Process: LLM interprets "tired" -> selects "Dance" action group -> triggers ROS Action Server.
  • Feedback: Real-time status report via the integrated voice system.
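The middle step can be sketched as a simple intent-to-action lookup. The keywords and action-group names below are hypothetical placeholders (the real mapping lives in the project's configuration), and the selected group would be dispatched through a ROS Action Server rather than returned as a string:

```python
# Hypothetical mapping from LLM-extracted intent keywords to preset
# action groups -- illustrative names, not the project's actual table.
ACTION_GROUPS = {
    "tired": "dance",
    "greet": "wave",
    "ball": "kick",
}

def route_intent(llm_reply: str) -> str:
    """Pick an action group from the LLM's semantic parse of a request.

    Assumes the LLM has been prompted to answer with a single intent
    keyword; unrecognized intents fall back to a safe idle behavior.
    """
    keyword = llm_reply.strip().lower()
    return ACTION_GROUPS.get(keyword, "stand_by")
```

Constraining the LLM to emit one keyword keeps the parsing trivial and makes the fallback path explicit, so a hallucinated reply degrades to standing by instead of an arbitrary motion.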


6. Technical Specifications Summary

  • Dimensions: 190.98 × 141.6 × 389.81 mm
  • Weight: 2.3 kg
  • Battery: 12V 3000mAh Li-po (60A discharge protection)
  • Communication: Dual-band WiFi (2.4G/5G), Bluetooth 5.0, PS2 Wireless Controller.

  • 1 × Main controller: Raspberry Pi 4B main control board
  • 1 × Driver board: PWR.ROSBOT.X
  • 1 × Camera: Robot-Eye 4.0 (1080P, 120° wide-angle, 2-megapixel)
  • 1 × Display: 0.96-inch OLED
  • 1 × IMU: 9-axis sensor (3-axis gyroscope + 3-axis accelerometer + 3-axis magnetometer)

(7 components in total)

  • Project Idea: SamuRoid – A Humanoid Robot Platform for Embodied AI and ROS Development

    alisa.wu · 2 days ago · 0 comments

    We recently started working on a humanoid robotics project called SamuRoid. The main goal is to explore how an affordable humanoid robot platform can be used for embodied AI experiments, robotics education, and ROS development.

    Most humanoid robots today are either research-grade systems that cost tens of thousands of dollars, or simple toy robots that are difficult to extend. With SamuRoid, we are trying to build something in between: a capable but accessible humanoid robot platform for developers, students, and robotics enthusiasts.

    The robot is built around a Raspberry Pi 4B running Ubuntu 18.04 and ROS Melodic. Using the ROS framework allows us to integrate different modules such as motion control, machine vision, and AI interaction in a standardized way.

    Mechanically, the robot uses a 22-DOF humanoid structure including head, arms, legs, and feet joints. High-torque serial bus servos (≥30kgf.cm) are used to drive the joints, enabling complex humanoid movements such as walking, waving, dancing, and kicking a ball.

    Currently we are testing several locomotion algorithms based on inverse kinematics and inverted pendulum control. A built-in MPU6050 IMU helps the robot maintain stability during walking.

    For perception, SamuRoid integrates a 1080P wide-angle camera combined with OpenCV-based computer vision algorithms. We are experimenting with several AI vision capabilities including:

    - face recognition  
    - color recognition  
    - QR code detection  
    - object tracking  
    - autonomous ball tracking and kicking
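    The last item, autonomous ball tracking, can be closed into a control loop with a simple proportional controller on the head servos. The gain and sign conventions below are illustrative assumptions (they depend on how the servos are mounted), and the ball centroid would come from an OpenCV color mask rather than being passed in directly:

```python
FRAME_W, FRAME_H = 1920, 1080  # 1080P camera resolution
KP = 0.05                      # proportional gain (deg/pixel) -- assumed value

def track_ball(ball_x, ball_y, pan, tilt):
    """One proportional-control step keeping the ball centered in frame.

    ball_x/ball_y: ball centroid in pixels (e.g. from an OpenCV color
    mask + contour detection); pan/tilt: current head servo angles in
    degrees. Returns updated servo angles; the correction signs depend
    on servo mounting and would be flipped if the head moves the wrong way.
    """
    err_x = ball_x - FRAME_W / 2   # pixels right of frame center
    err_y = ball_y - FRAME_H / 2   # pixels below frame center
    pan -= KP * err_x              # turn toward the ball horizontally
    tilt -= KP * err_y             # and vertically
    return pan, tilt
```

    Once the ball sits near the frame center, the head angles give a bearing that the gait planner can walk along before triggering the kick action group.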

    Another interesting direction we are exploring is multimodal AI interaction. By connecting the robot to large language model APIs such as DeepSeek and Doubao, the robot can understand voice commands and perform actions through natural language interaction.

    The robot also includes a voice input system with a microphone and speaker for real-time audio interaction.

    Because the platform runs ROS and supports Python and C++ development, it is also suitable for robotics education, AI experimentation, and developer research projects.

    We are still refining the motion control and expanding the AI interaction capabilities.

    If anyone in the Hackaday community is interested in humanoid robotics, embodied AI, or ROS-based robot platforms, we would love to hear your ideas and suggestions.

    More technical information and development resources about the SamuRoid robot platform can be found here:

    https://www.xiaorgeek.net/products/samuroid-ai-humanoid-robot-with-raspberry-pi-integrated-multimodal-ai-model-large-language-models-vision-interactive-voice-based-ros-xiaorgeek

