• Dev Update #4

    TripleL Robotics03/25/2024 at 01:05 0 comments

    After a brief delay, we resume the development of ddbot, now focusing mainly on software. To show where the software utilities are needed, we include a sketch of the system structure here. The numbered arrows and blocks in red mark the parts that the software must cover. These numbers serve as a reference for tracking the development status of each software component and are used below as [x] to match specific functionality.

    Progress: We have developed and tested the software for several functionalities, including wireless video streaming from the UVC camera [1,5], UART communication between the Orangepi and the ESP32 [6,7], a terminal UI-based control panel [4], and a simple motor control [8].

    Video streaming enables wireless, real-time visualization of the ddBot camera view on a webpage, facilitating remote FPV control. UART communication handles the information exchange between the high- and low-level devices: the Orangepi runs compute-intensive AI tasks such as image classification via deep learning, and sends the results to the ESP32, which controls low-level hardware such as the motors. The terminal UI-based control panel offers users an easy way to drive the robot from a terminal program with keyboard input. Additionally, we implemented full-voltage motor control through the H-bridge; it works, but the resulting rapid movement caused a collision and burned out the motor drive board. Demos of these functions are shown below.

    Wireless camera video streaming

    UART communication between Pi and ESP32

    Terminal UI-based control panel: use the local computer keyboard to send commands to the robot via SSH

    Simple but unfortunate motor control
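
    To illustrate the UART link demonstrated above, here is a minimal sketch of the Orangepi side, assuming pyserial and a simple newline-terminated text protocol; the device path, baud rate, and message format are placeholders rather than ddbot's actual configuration.

    ```python
    # Minimal sketch of the Orangepi side of the UART link (assumes pyserial).
    # Device path, baud rate, and message format are placeholders.
    import serial

    def main():
        # Open the UART wired to the ESP32 (hypothetical device path).
        link = serial.Serial("/dev/ttyS5", baudrate=115200, timeout=1.0)
        try:
            # Send a newline-terminated text command, e.g. a motor request.
            link.write(b"MOTOR 0.5 0.5\n")
            # Read one status line back from the ESP32, if any.
            reply = link.readline().decode(errors="replace").strip()
            print("ESP32 replied:", reply or "<no reply>")
        finally:
            link.close()

    if __name__ == "__main__":
        main()
    ```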


    Plan: We need to reconfigure the motor control software to use PWM, which will limit the robot's speed and add safety measures. We will also post tutorials on the current progress and integrate all software components into a first complete example.
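
    As a rough sketch of the planned PWM-based speed limiting on the ESP32 (CircuitPython), something along these lines could work; the pin assignments and the duty-cycle cap are placeholders, not ddbot's actual wiring or tuning.

    ```python
    # Sketch of PWM-limited motor drive on the ESP32 (CircuitPython).
    # Pin assignments and the speed cap are placeholders.
    import board
    import pwmio

    MAX_DUTY = int(0.6 * 65535)  # cap the duty cycle at 60% to limit speed

    # One PWM output per H-bridge input (forward direction only in this sketch).
    left_pwm = pwmio.PWMOut(board.IO25, frequency=1000, duty_cycle=0)
    right_pwm = pwmio.PWMOut(board.IO26, frequency=1000, duty_cycle=0)

    def set_speed(pwm, fraction):
        """Set a motor speed as a 0.0-1.0 fraction, clamped to the safety cap."""
        fraction = max(0.0, min(1.0, fraction))
        pwm.duty_cycle = min(int(fraction * 65535), MAX_DUTY)

    # Drive both wheels forward at half speed.
    set_speed(left_pwm, 0.5)
    set_speed(right_pwm, 0.5)
    ```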

  • Dev Update #3

    TripleL Robotics03/24/2024 at 23:43 0 comments

    We have basically finished building the hardware for the whole "ddbot" robot, including the third and fourth layers of its chassis.

    Progress: On the third layer, we add an OrangePi 3 LTS as the high-level central controller, responsible for managing all peripherals (including sensors) and coordinating with other modules such as the ESP32 on the second layer. It runs a modified version of the Ubuntu Linux operating system. We also attach an 8MP USB video camera (UVC) module on this layer at the very front of the robot body; this camera is required for all vision-based tasks. On the fourth layer, we add a Time-of-Flight (ToF)-based 2D LiDAR at the top for scanning distances to the nearby environment, which is useful for projects such as SLAM and collision avoidance. The robot is shown in the image below.
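
    Once the UVC camera is connected, a quick way to verify it is to grab a single frame with OpenCV; a minimal sketch is below, assuming the camera enumerates as video device 0. This is just a sanity check, not the project's streaming code.

    ```python
    # Minimal sketch: grab one frame from the UVC camera with OpenCV.
    # Assumes the camera enumerates as index 0 (/dev/video0); adjust as needed.
    import cv2

    cap = cv2.VideoCapture(0)
    if not cap.isOpened():
        raise RuntimeError("Could not open the UVC camera")

    ok, frame = cap.read()
    cap.release()

    if ok:
        # frame is a NumPy array (height x width x 3, BGR channel order).
        print("Captured frame with shape:", frame.shape)
        cv2.imwrite("snapshot.jpg", frame)
    else:
        print("Failed to read a frame")
    ```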


    Plan: Over the following weeks, we will shift our focus to software development, from verifying all hardware modules to adding several AI-based projects and their tutorials. For the programming language, we choose Python, as it is widely used in the AI and deep learning community and is easy to learn. Specifically, we will use Python within the Linux OS on the OrangePi and its microcontroller variant, CircuitPython, on the ESP32. Stay tuned ...

  • Dev Update #2

    TripleL Robotics03/24/2024 at 23:41 0 comments

    We continue the development of our "ddbot" project.

    Progress: we have built up the second layer of chassis for the robot, as shown below. 

    We selected a dual-channel DC motor drive module to drive the two motors simultaneously and to provide useful voltage outputs, such as 3.3 V and 5 V, for other modules. We chose the ESP32 as the microcontroller, which provides low-level interfaces such as I2C, SPI, and PWM. It can control devices through these interfaces, e.g., PWM for adjusting motor speed and I2C for reading IMU data. A 6-axis IMU sensor is included to measure the robot's attitude. For easy setup and flexibility, we use a breadboard for wire connections for now.
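
    As a first check of the I2C wiring to the IMU, a short CircuitPython sketch like the one below can scan the bus and report the detected device addresses; the SCL/SDA pin names are placeholders, and the actual IMU driver will depend on the specific chip used.

    ```python
    # Sketch: scan the I2C bus on the ESP32 (CircuitPython) to confirm the IMU
    # is wired correctly. The SCL/SDA pin names are placeholders.
    import board
    import busio

    i2c = busio.I2C(scl=board.IO22, sda=board.IO21)

    # CircuitPython requires locking the bus before scanning it.
    while not i2c.try_lock():
        pass
    try:
        addresses = i2c.scan()
        print("I2C devices found:", [hex(addr) for addr in addresses])
    finally:
        i2c.unlock()
    ```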

    Plan: next, we will set up the third layer of the chassis, adding a camera module for vision-based tasks and a Pi (i.e., a single-board computer) for high-level central control. If time permits, we will add a fourth layer with a 2D LiDAR as well. Stay tuned...

  • Dev Update #1

    TripleL Robotics03/24/2024 at 23:26 0 comments

    This is the first log of this robot project, tentatively named "ddbot", a ground robot with differential-drive kinematics. This robot is going to be our first product and will serve as a testbed for subsequent development.

    Progress: we have designed and built the first layer of the chassis for the robot, as shown below.

    first chassis layer

    We 3D print the base to support the battery, motors, and connectors. For now we use an 1800 mAh LiPo battery; we will see whether it lasts long enough for our applications. DC gearbox motors drive the wheels. Two main wheels and one universal caster wheel (at the front, below the base) are used for differential drive.

    universal wheel
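
    For reference, the differential-drive kinematics behind this wheel layout are simple to write down; the sketch below converts a body-frame command (forward speed and turn rate) into the two wheel speeds, using placeholder values for the wheel radius and track width rather than measured ones.

    ```python
    # Sketch of differential-drive kinematics: convert a body-frame command
    # (forward speed v, turn rate w) into left/right wheel angular speeds.
    # Wheel radius and track width are placeholder values, not measured on ddbot.
    WHEEL_RADIUS = 0.03  # meters
    TRACK_WIDTH = 0.12   # distance between the two main wheels, meters

    def wheel_speeds(v, w):
        """Return (left, right) wheel speeds in rad/s for forward speed v [m/s]
        and turn rate w [rad/s]."""
        v_left = v - w * TRACK_WIDTH / 2.0
        v_right = v + w * TRACK_WIDTH / 2.0
        return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

    # Example: 0.2 m/s forward while turning left at 1 rad/s.
    print(wheel_speeds(0.2, 1.0))
    ```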

    Plan: next, we try to set up the second layer of the chassis, adding the motor control board and an MCU for low-level control. Stay tuned...