Autonomous UV Robot with SLAM

This affordable autonomous robot provides localization and mapping and safely navigates through its environment.

As we face a worldwide healthcare crisis caused by COVID-19, there is a large need for disinfection. Currently, such units are either stationary or moved by humans. We improved on that concept and built an autonomous, mobile unit that can move through lobbies and hallways and into rooms for disinfection.

Three lamps mounted on the robot emit intense UV light at a 240 nm wavelength, as per current medical device standards.

Our key innovation is making this mobile with state-of-the-art LIDAR-based SLAM technology, which allows operation without exposing workers to either the harmful UV light or infected patient rooms. The mobility allows disinfection of corners and narrow passages. We use a mobile app to communicate with the robot, and the user can see the robot's real-time status.

The base unit can also be used for other autonomous applications, such as surveillance, delivery, etc.

UV BOT WORKING VIDEO

UV BOT APP WORKING

Wio Terminal

We are using the Wio Terminal, which has many built-in sensors. The Wio Terminal is a SAMD51-based microcontroller with wireless connectivity provided by a Realtek RTL8720DN, and it is compatible with Arduino and MicroPython. It runs at 120 MHz (boost up to 200 MHz) with 4 MB of external flash and 192 KB of RAM. It supports both Bluetooth and Wi-Fi, providing the backbone for IoT projects. The Wio Terminal is equipped with a 2.4" LCD screen, an onboard IMU (LIS3DHTR), a microphone, a buzzer, a microSD card slot, a light sensor, and an infrared emitter (IR 940 nm): a lot of really cool features in one device. In our case, it acts as a Wi-Fi access point, so we can connect the Raspberry Pi and our virtual machine to a single access point. The built-in display shows the time and other messages.
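
A minimal sketch of driving the built-in display, assuming the TFT_eSPI library bundled with the Seeed Wio Terminal board package; the text and layout are placeholders, not the project's actual UI:

#include "TFT_eSPI.h"   // bundled with the Wio Terminal board support package

TFT_eSPI tft;

void setup() {
  tft.begin();
  tft.setRotation(3);                 // landscape orientation
  tft.fillScreen(TFT_BLACK);
  tft.setTextColor(TFT_WHITE);
  tft.setTextSize(2);
  tft.drawString("UV Bot ready", 20, 40);   // placeholder status message
}

void loop() {
  // time and status updates would be redrawn here
}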

We use the IMU, the built-in temperature sensor, and the buzzer of the Wio Terminal to detect collisions and overheating; alerts are raised through the buzzer as well as our mobile app.
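
A rough sketch of the collision alert, assuming the Seeed LIS3DHTR library for the internal IMU and the onboard buzzer; the 0.5 g threshold is an assumption that would need tuning on the real robot:

#include "LIS3DHTR.h"

LIS3DHTR<TwoWire> lis;   // Wio Terminal's internal accelerometer

void setup() {
  Serial.begin(115200);
  lis.begin(Wire1);                               // the IMU sits on the Wire1 bus
  lis.setOutputDataRate(LIS3DHTR_DATARATE_25HZ);
  lis.setFullScaleRange(LIS3DHTR_RANGE_2G);
  pinMode(WIO_BUZZER, OUTPUT);
}

void loop() {
  float ax = lis.getAccelerationX();
  float ay = lis.getAccelerationY();
  float az = lis.getAccelerationZ();
  float mag = sqrt(ax * ax + ay * ay + az * az);  // total acceleration in g

  if (fabs(mag - 1.0) > 0.5) {     // far from 1 g at rest: treat as a collision
    analogWrite(WIO_BUZZER, 128);  // sound the buzzer
    Serial.println("Collision detected!");
    delay(500);
    analogWrite(WIO_BUZZER, 0);
  }
  delay(50);
}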

We are adding a UV light sensor to detect whether the UV lamps are actually ON or OFF and feed that status back.

The Wio Terminal is also connected to a relay to control the UV lamps. It acts as an access point as well as a web server, so we can control the relay remotely from the mobile app.
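
Before the web server is involved, the relay can be checked on its own. A minimal sketch, assuming the relay input is wired to pin 5, the same output5 pin used in the server code further below:

// Stand-alone relay check for the UV lamps
const int RELAY_PIN = 5;   // matches output5 in the access-point/server sketch

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);    // lamps off at boot
}

void loop() {
  digitalWrite(RELAY_PIN, HIGH);   // lamps on
  delay(5000);
  digitalWrite(RELAY_PIN, LOW);    // lamps off
  delay(5000);
}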

UV sensor


It uses a UV photodiode that detects light in the 240-370 nm range. The photodiode's output is very small, at the nanoampere level, so an op-amp amplifies the signal to a more manageable voltage level.

Use this basic code to test it

// Input this code to test the UV sensor
void setup() {
  Serial.begin(115200);
  pinMode(A8, INPUT);
}

void loop() {
  int uvsensor = analogRead(A8);
  Serial.print("UV intensity: ");
  Serial.println(uvsensor);
  delay(50);
}

DHT sensor

Inside the DHT sensor there is a thermistor plus a very basic chip that does the analog-to-digital conversion and outputs a digital signal carrying the temperature and humidity. That digital signal is fairly easy to read with the Wio Terminal. Here we use it as a safety feature: it measures the internal temperature of the robot, and if that gets really high, the complete system shuts down.
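
A sketch of that safety cut-off, assuming a DHT22 on digital pin 2, the common Arduino DHT library API, a 60 °C limit, and the relay on pin 5; the pin numbers and threshold are assumptions rather than the project's exact values:

#include "DHT.h"

#define DHTPIN 2        // assumed data pin for the DHT sensor
#define DHTTYPE DHT22   // assumed sensor type

DHT dht(DHTPIN, DHTTYPE);
const int RELAY_PIN = 5;                  // UV lamp relay, as in the server sketch
const float MAX_INTERNAL_TEMP_C = 60.0;   // assumed shutdown threshold

void setup() {
  Serial.begin(115200);
  pinMode(RELAY_PIN, OUTPUT);
  dht.begin();
}

void loop() {
  float t = dht.readTemperature();   // degrees Celsius
  if (!isnan(t)) {
    Serial.print("Internal temperature: ");
    Serial.println(t);
    if (t > MAX_INTERNAL_TEMP_C) {
      digitalWrite(RELAY_PIN, LOW);  // cut the UV lamps
      Serial.println("Overheating! UV lamps switched off.");
    }
  }
  delay(2000);                       // DHT22 needs about 2 s between reads
}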

UV light

We are using a 20 W, 2 ft Sanyo (Japan) UVC lamp with a full assembly, which provides area coverage of about 70 sq ft, sufficient for our application. You can refer to these articles to find out more about the disinfection properties of UVC light.

PIR sensor

PIR sensors let you sense motion; they are almost always used to detect whether a human has moved into or out of the sensor's range. They are small, inexpensive, low-power, easy to use, and don't wear out. But they are a little inaccurate, so in our case we will use a contact-less infrared thermopile sensor together with OpenCV to get more reliable human-detection data. We are working on the thermopile sensor and will update this tutorial soon.
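
Until the thermopile setup is ready, a simple PIR interlock can pause the lamps whenever motion is detected. A minimal sketch, with the PIR output assumed on pin 3 and the lamp relay on pin 5 as in the server code:

const int PIR_PIN = 3;     // assumed PIR output pin
const int RELAY_PIN = 5;   // UV lamp relay, as in the server sketch

void setup() {
  Serial.begin(115200);
  pinMode(PIR_PIN, INPUT);
  pinMode(RELAY_PIN, OUTPUT);
  pinMode(WIO_BUZZER, OUTPUT);
}

void loop() {
  if (digitalRead(PIR_PIN) == HIGH) {   // motion detected
    digitalWrite(RELAY_PIN, LOW);       // switch the UV lamps off
    analogWrite(WIO_BUZZER, 128);       // audible warning
    Serial.println("Motion detected, UV paused");
  } else {
    analogWrite(WIO_BUZZER, 0);
  }
  delay(100);
}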

We run the access point on the Realtek board, while the sensors are read from the Wio Terminal's main SAMD51 side.

You have to burn the bootloader to get access to the Realtek board in the Wio Terminal; follow this link to do it.

This is the code we used.

#include <WiFi.h>

// Current time
unsigned long currentTime = millis();
// Previous time
unsigned long previousTime = 0;
// Define timeout time in milliseconds (example: 2000ms = 2s)
const long timeoutTime = 2000;

String header;

// Auxiliary variables to store the current output state
String output5State = "off";
String output4State = "off";
const int output5 = 5;
const int output4 = 4;

char ssid[] = "UV bot";      // Set the AP's SSID
char pass[] = "12345678";    // Set the AP's password
char channel[] = "1";        // Set the AP's channel
int status = WL_IDLE_STATUS;
...

Files

  • iges - 10.01 MB - 05/26/2021 at 09:56
  • charging_circuit_ino.ino - charging circuit code - ino - 1.53 kB - 05/26/2021 at 09:56
  • charging_circuit_y9FkZhTv5O.fzz - charging circuit Fritzing file - fzz - 7.87 kB - 05/26/2021 at 09:56

  • 1 × Raspberry Pi  A series of small single-board computers developed in the United Kingdom by the Raspberry Pi Foundation in association with Broadcom.
  • 1 × Arduino UNO  Microcontroller board
  • 1 × Arduino Mega 2560  Microcontroller board
  • 1 × YDLIDAR X4  LIDAR is a method for determining ranges by targeting an object with a laser and measuring the time for the reflected light to return to the receiver.
  • 1 × UV lights  UV type C lamps for disinfection


  • Stepper motor NEMA 34

    Alex M Sunny • 06/14/2021 at 17:00 • 0 comments

    To increase the payload capacity, we plan to use four NEMA 34 steppers with four absolute rotary encoders, which will increase the overall payload and give better actuation.

  • Jetson Nano

    Alex M Sunny • 06/14/2021 at 16:54 • 0 comments

    We used a Raspberry Pi for this project, but the Raspberry Pi is not efficient and has limited processing power. To increase the speed of the system, we redesigned the internals and changed the processor to a Jetson Nano.

  • Thermal Camera update

    Alex M Sunny • 06/12/2021 at 07:01 • 0 comments

    We are planning to add an AMG8833 thermal camera to the microprocessor for more accurate human detection.
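
    As a rough illustration of the idea (not the final implementation), here is a sketch that reads the AMG8833's 8x8 thermal frame with the Adafruit AMG88xx library and flags human-skin temperatures; the 30-37 °C window and the pixel-count heuristic are assumptions:

    #include <Adafruit_AMG88xx.h>

    Adafruit_AMG88xx amg;
    float pixels[AMG88xx_PIXEL_ARRAY_SIZE];   // 8x8 = 64 temperature readings

    void setup() {
      Serial.begin(115200);
      if (!amg.begin()) {
        Serial.println("AMG8833 not found, check wiring");
        while (true) delay(10);
      }
    }

    void loop() {
      amg.readPixels(pixels);
      int warmPixels = 0;
      for (int i = 0; i < AMG88xx_PIXEL_ARRAY_SIZE; i++) {
        if (pixels[i] > 30.0 && pixels[i] < 37.0) warmPixels++;   // skin-temperature range
      }
      if (warmPixels > 5) {   // crude "person in view" heuristic
        Serial.println("Possible human detected, pause UV");
      }
      delay(200);
    }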

  • Mobile_unit_for_hospitals

    Alex M Sunny • 06/10/2021 at 17:01 • 0 comments

    A mobile COVID-19 disinfection unit for hospitals. We designed this machine for disinfecting elevators, hallways, and rooms.


  • 1
    ROS

    ROS (Robot Operating System) is a framework that facilitates the use of a wide variety of "packages" to control a robot. Despite the name, ROS is not an operating system; it acts as middleware. It provides low-level device control, implementations of commonly used functionality, message passing between processes, and package management, with packages ranging all the way from motion control to path planning, mapping, localization, SLAM, perception, and more. ROS provides a relatively simple interface and, of course, the ability to create custom packages.
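
    To illustrate the message-passing model, here is the standard minimal roscpp publisher ("talker") node; it assumes a catkin package that depends on roscpp and std_msgs, and the topic name and message text are placeholders:

    #include "ros/ros.h"
    #include "std_msgs/String.h"

    int main(int argc, char **argv) {
      ros::init(argc, argv, "talker");              // register the node with the master
      ros::NodeHandle nh;
      ros::Publisher pub = nh.advertise<std_msgs::String>("chatter", 1000);
      ros::Rate rate(10);                           // publish at 10 Hz

      while (ros::ok()) {
        std_msgs::String msg;
        msg.data = "UV bot status: OK";
        pub.publish(msg);                           // any subscriber on "chatter" receives this
        ros::spinOnce();
        rate.sleep();
      }
      return 0;
    }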

    We are using a Raspberry Pi 3 for this project. A Raspberry Pi 4 or a Jetson Nano would be a great alternative.

    Get the disc image

    We downloaded the Ubuntu 16.04 Xenial image with pre-installed ROS from Ubiquity Robotics. The linked page has all the steps for installing it.

    • Download the image from the top of the page.
    • Flash it to an SD card (at least 8 GB; Class 10 preferable).
    • Connect to the WiFi network that starts with ubiquityrobot. Password is robotseverywhere.
    • Go to Terminal, and connect to your Pi using ssh ubuntu@10.42.0.1. Password is ubuntu.
    • Run roscore to make sure that things are working properly. If you get a warning/errors, try stopping ROS and starting it again with killall -9 roscore.
  • 2
    Working with ROS

    There are three ways you can control and code your robot.

    The easiest way is to connect a monitor and keyboard to the Raspberry Pi and work with the code directly.

    Something else we want is to be able to access the ROS communication messages from our laptop.

    Spin up a Linux machine with ROS Kinetic Kame, either a virtual machine or a real one. We used VMware Fusion with Ubuntu 16.04. We will refer to that machine as the Observer machine; the robot is the Master.

    You can follow this tutorial to set up the robot as the master and the laptop as the observer machine.

    Notes

    • RViz won't work over VNC. We tried and failed. You need to use VirtualBox or VMware to work with ROS remotely.
  • 3
    Connecting to WiFi

    Follow this website to learn how to connect your robot to your Wi-Fi.

    • On the robot machine, run pifi add YOURNETWORKNAME YOURNETWORKPASSWORD.
    • Restart the Pi with sudo reboot. Now the Raspberry Pi will connect to your Wi-Fi network on startup. To connect to it, connect your computer to the same network, and ssh ubuntu@ubiquityrobot.local with the password ubuntu.

    Woo! Now both machines have internet and can communicate over SSH.



Discussions

vellamkunnelakhiljose wrote 06/10/2021 at 17:57

Nice work 👌


Alex M Sunny wrote 06/12/2021 at 06:49

thank you

