Artemis is an eyeglass-mounted device that can be configured to locate a specific type of object or a person. When the target is found, Artemis tracks it with a laser.
## How It Works
An eyeglass-mounted camera streams images to an NVIDIA Jetson AGX Xavier. An SSD300 model performs object localization within these images. When the object of interest has been found, a laser diode is switched on.
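An SSD-style detector returns boxes, class labels, and confidence scores for each frame; the system then needs to pick out the best detection of the target class. The sketch below shows one way this filtering could look. The dict layout follows torchvision's detection API, and the label value and threshold are illustrative assumptions, not the project's actual code.

```python
def best_detection(output, target_label, min_score=0.5):
    """Return (box, score) for the highest-scoring detection of
    target_label, or None if nothing passes the threshold."""
    best = None
    for box, label, score in zip(output["boxes"], output["labels"], output["scores"]):
        if label == target_label and score >= min_score and (best is None or score > best[1]):
            best = (box, score)
    return best

# Example with fabricated detections (87 is the COCO id for scissors):
fake = {
    "boxes": [[10, 20, 110, 220], [300, 40, 380, 120]],
    "labels": [87, 1],
    "scores": [0.92, 0.88],
}
print(best_detection(fake, target_label=87))  # → ([10, 20, 110, 220], 0.92)
```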
A servo is also mounted on the eyeglasses to provide X-axis control of the laser. A second servo is mounted on top of the first, at a 90-degree angle, to provide Y-axis control. The laser is mounted on the second servo.
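The two stacked servos effectively form a pan/tilt mount. Given pixel coordinates for the laser dot and the target, steering could be as simple as a proportional step toward the target, as in this sketch; the gain value and the 0–180 degree clamp are assumptions for illustration, not values from the project.

```python
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def nudge_servos(pan, tilt, laser_xy, target_xy, gain=0.05):
    """Step the pan/tilt angles a small amount toward placing the
    laser on the target. Pixel error is scaled by an assumed gain
    and angles are clamped to a typical 0-180 degree servo range."""
    dx = target_xy[0] - laser_xy[0]
    dy = target_xy[1] - laser_xy[1]
    pan = clamp(pan + gain * dx, 0, 180)
    tilt = clamp(tilt + gain * dy, 0, 180)
    return pan, tilt

print(nudge_servos(90, 90, laser_xy=(300, 240), target_xy=(340, 200)))  # → (92.0, 88.0)
```

Repeating this small step every frame lets the laser converge on the target without needing to model the camera-to-servo geometry exactly.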
Images are thresholded in OpenCV to determine the location of the laser pointer. With the locations of both the object and the laser known, the servos can be adjusted to place the laser over the object of interest.
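The idea behind locating the dot is to keep only pixels that are saturated in red and take the centroid of what remains. Here is a minimal NumPy sketch of that; the threshold values are assumptions, and the project itself does the equivalent with OpenCV calls such as `cv2.inRange`.

```python
import numpy as np

def laser_centroid(frame_bgr, thresh=200):
    """Find a bright red laser dot: keep pixels whose red channel
    exceeds `thresh` while blue and green stay low, then average
    the surviving pixel coordinates."""
    b, g, r = frame_bgr[..., 0], frame_bgr[..., 1], frame_bgr[..., 2]
    mask = (r > thresh) & (b < 100) & (g < 100)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # laser not visible in this frame
    return int(xs.mean()), int(ys.mean())

# Synthetic 480x640 BGR frame with a red dot centered at (320, 240):
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[238:243, 318:323, 2] = 255
print(laser_centroid(frame))  # → (320, 240)
```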
Any model that provides object localization could be substituted for SSD300, and could be trained to detect anything of interest.
An Adafruit Itsy Bitsy M4 Express microcontroller dev board was used to simplify control of the servos. The Arduino sketch is available here.
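The Jetson has to communicate the desired servo angles to the microcontroller somehow; a common approach is a simple line-based serial protocol. The framing below (`X###Y###`), the port name, and the baud rate are purely hypothetical, shown for illustration only; the real protocol is whatever the project's Arduino sketch expects.

```python
def servo_command(pan, tilt):
    """Frame two servo angles as a line like 'X092Y088\n'.
    This framing is an assumption for illustration; see the
    project's Arduino sketch for the actual protocol."""
    return f"X{int(pan):03d}Y{int(tilt):03d}\n"

# Sending it would look roughly like this (requires pyserial, not run here):
# import serial
# with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
#     port.write(servo_command(92, 88).encode("ascii"))

print(repr(servo_command(92, 88)))  # → 'X092Y088\n'
```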
See it in action. In this video, Artemis has been directed to find a pair of scissors: YouTube
To run the software, you'll need Python 3 with the following modules installed:

- numpy
- cv2
- RPi.GPIO
- torch
- skimage
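Note that the import names above don't all match their PyPI package names. Assuming the usual packages (`cv2` ships as `opencv-python`, `skimage` as `scikit-image`), an install might look like:

```shell
pip3 install numpy opencv-python RPi.GPIO torch scikit-image
```

On a Jetson, `torch` is typically installed from NVIDIA's prebuilt wheels rather than PyPI.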
Then clone the repo:
```shell
git clone https://github.com/nickbild/artemis.git
```
Switch to the `artemis` directory, then run:
The Fritzing file can be downloaded here.
## Bill of Materials
- NVIDIA Jetson AGX Xavier
- Adafruit Itsy Bitsy M4 Express
- USB Webcam
- 2 x micro servos (e.g. Tower Pro SG92R)
- Laser diode
- NPN transistor
- Half breadboard
- Glasses / sunglasses
- Miscellaneous copper wire
## About the Author