
Self Driving House Robot

Dead-Reckoning Robot with Sensor Fusion and AI Planning

This project is a modular robotics platform that explores autonomous navigation using affordable sensors and powerful edge processing. The core idea is to build a robot capable of self-localization and navigation through dead reckoning, refined with real-time sensor fusion of IMU, GPS, wheel encoders, ultrasonic distance sensors, and eventually LIDAR. The Jetson Nano serves as the high-level planner and fusion brain, while an Arduino handles low-level motor control and basic sensor polling. The system is open-ended and modular, which makes it well suited to experimentation in real-world navigation, sensor calibration, and robotics control theory.

If you’ve ever wanted to watch a robot go from clueless drifting to precision mapping—this is the journey.

After 14 years in aerospace, robotics, and drone development, I’ve worked across everything from mechanical design to software simulation and CI. But I often found myself focused on subsystems - writing test libraries, integrating sensors - without the full picture of how it all works together.

One of the most interesting projects I contributed to was an autonomous vehicle program, where perception, planning, and controls had to work in sync. Still, the scale made it tough to trace the full pipeline from sensor input to real-world action.

So now, with some time and curiosity on my side, I’m building a low-cost robot that navigates my house - mapping, avoiding obstacles, and recognizing objects along the way. It’s a hands-on way to explore how real robotic systems work, end to end. As the Top Gear crew would say: How hard can it be?

This project is both a personal deep dive and a learning tool. I’ll be using:

  • A 4WD mecanum-wheeled chassis (for maneuverability and control complexity)
  • A Jetson Nano (for high-level tasks like planning, perception, and SLAM)
  • An Arduino (for low-level motor control and IO)
  • A suite of sensors, including a camera, LIDAR, and a 9DOF IMU

The goal is to gradually build a fully autonomous indoor robot. Starting with sensor calibration and localization, I’ll move on to dead-reckoned driving, then introduce mapping (first with ultrasound, then LIDAR), and eventually add advanced sensor fusion and object detection. The final stretch will include a simple GUI for placing waypoints and visualizing maps, and hopefully some camera-based perception for detecting people and objects along the way.

And the thing that makes this attempt stand out from the ones before: I’m able to lean on ChatGPT as my trusty co-pilot, helping with quick debugging, script scaffolding, and advice on modules and architectures.

This is a project about exploration, learning, and building something meaningful - one subsystem at a time.

THE CODE FOR THE PROJECT CAN BE FOUND IN A PUBLIC REPO ON MY GITHUB: https://github.com/arteml8/robot_project

  • 1 × NVIDIA Jetson Nano
  • 2 × L298N DC Motor Driver
  • 1 × SparkFun 9DOF Sensor Stick (old)
  • 5 × LM393 Optocoupler with Encoders
  • 2 × HC-SR04 Ultrasonic Sensor

View all 8 components

  • Robot Chassis, Encoder, and Motor Control Integration Update (06/06/2025)

    Artem Lepilov - 06/06/2025

    Over the past couple of weeks, I think I’ve made solid progress on getting the robot’s drivetrain and feedback control system up and running. Here's what I've tackled so far:

    🔩 Hardware Assembly

    • Assembled the robot chassis, mounting the motor gearboxes onto the frame and installing mecanum wheels on the output shafts.

    • Custom-aligned and mounted the encoder sensors. Unfortunately, my robot chassis didn't come with motor encoders, so I had to retrofit them manually, along with the break-beam encoder sensors. That meant putting standoffs in place and drilling holes in the chassis so the sensors could observe the encoder slits.

    • Installed two L298N motor controllers, wired for four independent motor channels.

    • Connected encoder outputs, motor drivers, and Bluetooth serial to the Arduino Mega. Also wired up power and verified each motor independently.

    🧠 Embedded Code Architecture

    • Built out modular Arduino libraries:

      • MotorController: for handling PWM and direction logic.

      • EncoderReader: to poll break-beam encoder ticks at high frequency.

    • Implemented update() and getTicks() functions to reliably track motor movement over time.
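
    For reference, here's a minimal sketch of the polling-based EncoderReader idea. The class and method names match my library, but the pin handling and sampling guard shown here are illustrative assumptions rather than the exact implementation:

    ```cpp
    // Polled break-beam encoder reader: count every state transition of the
    // sensor as the slotted disc spins. Values like the sampling cap are
    // placeholders to tune.
    class EncoderReader {
    public:
      explicit EncoderReader(uint8_t pin) : _pin(pin) {}

      void begin() {
        pinMode(_pin, INPUT_PULLUP);
        _lastState = digitalRead(_pin);
      }

      // Call from loop() as often as possible; the micros() guard simply
      // rate-limits the sampling.
      void update() {
        unsigned long now = micros();
        if (now - _lastSampleUs < 100) return;  // cap at roughly 10 kHz
        _lastSampleUs = now;
        int state = digitalRead(_pin);
        if (state != _lastState) {  // each slit edge counts as one tick
          _ticks++;
          _lastState = state;
        }
      }

      unsigned long getTicks() const { return _ticks; }

    private:
      uint8_t _pin;
      int _lastState = LOW;
      unsigned long _ticks = 0;
      unsigned long _lastSampleUs = 0;
    };
    ```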

    🐞 Debugging Encoder Feedback

    • Initially observed:

      • One encoder not registering ticks (turned out to be a misaligned sensor with a wire stuck in its beam).

      • Large inconsistency between wheel tick counts — differences of 20–30% between motors for the same commanded distance.

      • Sensors missing transitions or counting unevenly.

    ✅ Fixes and Improvements

    • Tried switching to interrupt-driven encoder reading and adding external pull-up resistors to improve signal stability (which didn't help much, as the interrupts weren't registering for some reason).

    • Switched from interrupt-based encoder reading (which had issues) to high-frequency polling using micros() — much more consistent.

    • Added a PID speed control loop per motor to dynamically adjust motor PWM based on tick feedback — this brought the error down significantly:

      • Now seeing consistency within ~5–10% across all four wheels.

    • Tuned PID and tested multiple speeds — most motors now start and stop accurately, with only slight lag on one wheel.
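
    For the curious, the speed loop is a textbook per-motor PID on tick rate. A rough sketch is below - the gains are placeholders, not the values I tuned to:

    ```cpp
    // Per-motor PID on wheel speed: compare target vs. measured tick rate
    // and nudge the PWM duty cycle accordingly. Gains here are placeholders.
    struct SpeedPid {
      float kp = 1.2f, ki = 0.8f, kd = 0.05f;
      float integral = 0.0f, lastError = 0.0f;

      // dt in seconds; returns a PWM duty value in 0..255.
      int update(float targetTicksPerSec, float measuredTicksPerSec, float dt) {
        float error = targetTicksPerSec - measuredTicksPerSec;
        integral += error * dt;
        float derivative = (error - lastError) / dt;
        lastError = error;
        float out = kp * error + ki * integral + kd * derivative;
        return constrain((int)out, 0, 255);
      }
    };
    ```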

    🤖 Motion Control System

    • Started integrating a MotionController layer to command coordinated motion (forward, rotate, strafe) - see the wheel-mixing sketch after this list.

    • Early tests show all wheels move correctly, but:

      • The robot does not stop as expected - likely due to an issue in how movement completion is checked against the target distance.

      • Planning to debug that next.
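
    Here's the wheel-mixing math the MotionController is built around - standard mecanum kinematics. The per-wheel signs are an assumption to verify against the actual roller orientation and motor wiring:

    ```cpp
    // Mecanum mixing: vx = forward, vy = strafe left, wz = rotate CCW, each
    // in -1..1. Signs assume the standard 45-degree roller layout.
    void mixMecanum(float vx, float vy, float wz, float out[4]) {
      out[0] = vx - vy - wz;  // front-left
      out[1] = vx + vy + wz;  // front-right
      out[2] = vx + vy - wz;  // rear-left
      out[3] = vx - vy + wz;  // rear-right

      // Scale down uniformly if any wheel command exceeds 1.0.
      float maxMag = 1.0f;
      for (int i = 0; i < 4; i++) {
        float m = fabs(out[i]);
        if (m > maxMag) maxMag = m;
      }
      for (int i = 0; i < 4; i++) out[i] /= maxMag;
    }
    ```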

    🧪 Known Challenges / To-Do

    • Motors struggle at very low speeds - they buzz but don’t turn unless nudged. Planning to implement a "kickstart burst" technique to overcome stiction (sketched after this list).

    • Still need to:

      • Finalize the motion control logic (with distance tracking).

      • Set up communication over Ethernet or Bluetooth from the Jetson Nano to the Arduino.

      • Let the Jetson handle high-level motion planning (target heading/speed) while Arduino handles low-level control.
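
    The kickstart burst would look something like this - the threshold, pulse length, and the pwmPinFor() helper are all hypothetical placeholders:

    ```cpp
    // Planned "kickstart burst": when starting from rest with a PWM too low
    // to break stiction, apply a brief full-power pulse, then settle to the
    // requested duty cycle. Both constants are guesses to tune.
    const int KICK_THRESHOLD = 60;     // assumed stall threshold (0..255)
    const unsigned long KICK_MS = 30;  // assumed burst length

    // pwmPinFor() is a hypothetical helper mapping a motor channel to its PWM pin.
    void setMotorPwm(int channel, int pwm, bool wasStopped) {
      if (wasStopped && pwm > 0 && pwm < KICK_THRESHOLD) {
        analogWrite(pwmPinFor(channel), 255);  // brief full-power burst
        delay(KICK_MS);
      }
      analogWrite(pwmPinFor(channel), pwm);
    }
    ```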

    [Photo gallery: encoders mounted (top and side views), encoder mount trial fit, chassis encoder base, sizing up an encoder for mounting, the robot powered up, the Arduino installed and wired, and the sensors and motor drivers connected to power.]


  • Weekly Update: GNSS + IMU Integration Progress (05/31/2025)

    Artem Lepilov - 05/31/2025

    🛰️ This past week’s focus was on integrating and validating the GNSS and IMU sensor stack for the robot's localization subsystem. My expectations were high going in, not knowing much about GPS accuracy, and the outcome wasn't as precise as I was hoping. With all the trial and error, it ended up being a light week in terms of progress. Here's where things stand:

    ✅ What Worked

    • Got GNSS Module Online: Wired up and initialized the GNSS module to the Jetson. Parsed live NMEA data streams and verified the core fields (latitude, longitude, time, velocity, etc.).

    • Live Output Evaluation: Logged and evaluated raw GNSS output. Noticed quirks in the stationary state - including up to ~0.5-1.0 m/s of apparent "drift" in velocity, confirming inherent noise when stationary. This is pretty minimal when flying an aircraft or driving a car at speed, but for my application it is a bit too much. In hindsight, this should have been expected given the limitations of GNSS in general.

    • IMU Integration Framework: Designed and tested a modular sensor interface. Created reusable libraries for the IMU and GNSS modules using a clean Python class structure to keep the robot control code maintainable and portable.

    • Initial Fusion Planning: Began setting up the structure for fusing GNSS and IMU data for dead reckoning down the line.

    ⚠️ Challenges / Future

    • GNSS Noise: Drift in velocity readings when stationary is within expectations for the module's grade, but highlights the need for smoothing/filtering or sensor fusion (see the sketch after this list).

    • Data Rate Syncing: The GNSS and IMU operate on different data rates and noise profiles, which will need thoughtful handling in future sensor fusion logic.

    • Placeholder Fusion: Full GNSS+IMU data fusion isn’t implemented yet - this update was about creating a stable base to build from.
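
    As a stopgap before real fusion, a simple gate-and-smooth pass can tame the stationary noise. My GNSS handling lives in Python on the Jetson, so treat this sketch - and its thresholds - purely as an illustration of the idea:

    ```cpp
    // Gate speeds below a stationary threshold to zero and low-pass the rest.
    // Both constants are assumptions to tune against logged data.
    float filterSpeed(float rawSpeed, float prevFiltered) {
      const float STATIONARY_GATE = 0.5f;  // m/s; below this, assume we're still
      const float ALPHA = 0.2f;            // smoothing factor; higher = snappier
      if (rawSpeed < STATIONARY_GATE) return 0.0f;
      return ALPHA * rawSpeed + (1.0f - ALPHA) * prevFiltered;
    }
    ```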

    Next up: extending this sensor base by wiring up encoder feedback and motor control for basic motion tests, which will help validate the IMU/GNSS stability in motion vs. at rest.

  • Project Log – Phase 1 Summary: Dead-Reckoning with IMU

    Artem Lepilov - 05/23/2025 at 18:13


    🎯 Objective

    To build familiarity with inertial sensors and sensor data processing, I started by implementing a dead-reckoning position estimation system using a 9DOF IMU. This is the foundation for eventual navigation and SLAM on my robotic platform.

    📦 Hardware Setup

    • Jetson Nano – handling I²C communication and data processing
    • IMU Components:
      • ADXL345 – 3-axis accelerometer
      • ITG3200 – 3-axis gyroscope
      • HMC5883L – 3-axis magnetometer (These are all mounted on an older SparkFun 9DOF stick)

    🔍 Methodology

    • Acquired raw sensor data over I²C using Python.
    • Applied static calibration during initial idle periods, plus zero-velocity updates (ZUPT) while stationary.
    • Performed dead-reckoning integration to estimate displacement:
      • Complementary Filter (tag: self_derived_imu_with_complement) - see the sketch after this list
      • Madgwick Filter (tag: imu_with_madgwick_filter)
        • Gradient descent-based fusion for orientation
        • Lightweight and efficient for embedded use
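
    The complementary-filter update itself is a one-liner: trust the gyro over short timescales and let the accelerometer pull the orientation back over long ones. The repo versions are Python; this sketch shows the same math for a single axis, with an assumed (but typical) blend factor:

    ```cpp
    // One-axis complementary filter step: integrate the gyro rate, then blend
    // in the accelerometer-derived angle to bound long-term drift.
    float updatePitch(float pitchDeg, float gyroRateDps, float accelPitchDeg, float dt) {
      const float alpha = 0.98f;  // assumed blend factor; tune to sensor noise
      return alpha * (pitchDeg + gyroRateDps * dt) + (1.0f - alpha) * accelPitchDeg;
    }
    ```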

    All code is available in my repo:

    • imu/imu.py: Complementary filter approach
    • imu/imu_madgwick.py: Madgwick filter implementation

    📉 Results

    • Stationary estimation is fairly stable, thanks to zero-velocity updates.
    • Motion tracking drifts quickly due to:
      • Sensor noise
      • Lack of absolute positioning reference
      • Noisy acceleration integration

    Despite this, I achieved my initial goal of retrieving and integrating raw IMU data and becoming familiar with the strengths and weaknesses of inertial tracking alone.

    🔧 Next Steps

    • Odometry: I’ve now ordered LM393 IR encoders to mount on each motor gearbox. This will allow me to:
      • Track wheel rotations
      • Add reliable short-term displacement feedback
      • Begin fusing encoder + IMU data for better dead-reckoning
    • Ultrasonic Range Sensors:
      • Already available and will be used for:
        • Basic wall-detection
        • Distance-to-obstacle estimation
        • Ground-truth alignment when GPS is unavailable
    • GPS/GNSS Integration:
      • I bought a u-blox NEO-6M GPS unit to start using GPS for timing and initialization
    • Motion Planning + Control:
      • Jetson Nano will act as the planner, Arduino as the low-level controller.
      • Goal: Dead-reckoned navigation where the Jetson sets the trajectory and the Arduino executes it.

    🧠 Reflections

    This first phase was valuable for:

    • Building foundational knowledge in sensor calibration and filtering
    • Learning how quickly sensor noise accumulates without external references
    • Understanding in practical terms what’s needed for robust odometry

    While current IMU-only results are not accurate for long-term localization, they’ve set a solid baseline. The upcoming integration of wheel encoders and other sensors will help close the loop on short-term navigation and provide feedback for improving inertial estimation.

    🚀 Subscribe + Follow Along

    If you'd like to follow along or contribute:

    • All code is publicly available in my GitHub repo
    • Tags are clearly marked by IMU filter type

