Artemis is an eyeglass-mounted device that can be configured to locate a specific type of object or person. When the target is found, Artemis tracks it with a laser.

How It Works

An eyeglass-mounted camera streams images to a Jetson AGX Xavier. An SSD300 model is used for object localization within these images. When the object of interest has been found, a laser diode is turned on.
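The detection step can be sketched in Python. This is a hypothetical post-processing function, not the project's actual code: it assumes the detector's output has already been unpacked into `(label, score, box)` tuples, which may differ from the real SSD300 model's output format.

```python
# Hypothetical post-processing of object detections.
# Assumes detections arrive as (label, score, (x1, y1, x2, y2)) tuples.

def find_target(detections, target_label, min_score=0.5):
    """Return the center (x, y) of the best-scoring box for the target
    label, or None if the target was not detected."""
    best = None
    for label, score, box in detections:
        if label == target_label and score >= min_score:
            if best is None or score > best[0]:
                best = (score, box)
    if best is None:
        return None
    x1, y1, x2, y2 = best[1]
    return ((x1 + x2) / 2, (y1 + y2) / 2)

detections = [
    ("person", 0.91, (10, 20, 110, 220)),
    ("scissors", 0.78, (200, 150, 240, 190)),
    ("scissors", 0.40, (50, 50, 60, 60)),   # below threshold, ignored
]
print(find_target(detections, "scissors"))  # (220.0, 170.0)
```

Returning `None` when nothing is found lets the caller keep the laser diode off until a confident detection appears.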

A servo is also mounted on the eyeglasses for X-axis control of the laser. A second servo is mounted on top of the first, at a 90-degree angle, to provide Y-axis control. The laser is mounted on the second servo.

Images are thresholded in OpenCV to determine the location of the laser pointer. With the locations of both the object and the laser determined, the servos can be adjusted to place the laser dot on the object of interest.
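The two halves of that step can be illustrated in pure Python. This is a minimal sketch, not the project's code: the real pipeline thresholds OpenCV frames, whereas here a grayscale frame is a plain list of rows, the laser dot is assumed to be the brightest region, and the gain and sign conventions in the correction are illustrative.

```python
# Minimal sketch of the tracking step: threshold the frame to find the
# laser dot, then compute a proportional servo correction toward the
# detected object.

def laser_centroid(frame, threshold=240):
    """Average (x, y) position of pixels at or above threshold, or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def servo_step(laser, target, gain=0.1):
    """Proportional correction (dx, dy) nudging the laser toward the
    target. Gain and sign conventions are illustrative only."""
    return (gain * (target[0] - laser[0]), gain * (target[1] - laser[1]))

frame = [[0] * 8 for _ in range(6)]   # 8x6 synthetic grayscale frame
frame[2][3] = 255
frame[2][4] = 255
laser = laser_centroid(frame)         # (3.5, 2.0)
print(servo_step(laser, (6.0, 4.0)))  # positive dx, dy: move right and down
```

Repeating this on every frame closes the loop: each correction shrinks the distance between the laser dot and the object's bounding-box center.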

SSD300 Model

To save time in the prototyping phase, a model pre-trained with the COCO dataset was used (PyTorch Hub). It is able to recognize and localize 80 different object types.
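In practice, the requested target has to be mapped onto the model's class list. The sketch below uses a small hypothetical subset of labels; the actual COCO label list and indices used by the PyTorch Hub model may differ.

```python
# Illustrative lookup of a target class in the detector's label set.
# COCO_LABELS here is a hypothetical subset, not the model's real list.

COCO_LABELS = ["person", "bicycle", "car", "dog", "scissors", "toothbrush"]

def target_index(name):
    """Class index for the requested target, or raise if unsupported."""
    try:
        return COCO_LABELS.index(name)
    except ValueError:
        raise ValueError(f"{name!r} is not a class this model can detect")

print(target_index("scissors"))  # 4
```

Failing loudly on an unknown label keeps a typo in the target name from silently leaving the device searching forever.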

Any arbitrary model that provides object localization could be inserted in place of this model, and could be trained to detect anything of interest.

Servo Control

An Adafruit Itsy Bitsy M4 Express microcontroller dev board was used to simplify control of the servos. The Arduino sketch is available here.
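The host side of that arrangement can be sketched as follows. The command format (`X:<angle>\n` / `Y:<angle>\n`), the serial port, and the baud rate are hypothetical, not the actual protocol of the project's Arduino sketch.

```python
# Hedged sketch of building servo commands on the Jetson side.
# The "AXIS:ANGLE" line protocol is hypothetical.

def servo_command(axis, angle):
    """Build one command line, clamping to a hobby servo's 0-180 range."""
    if axis not in ("X", "Y"):
        raise ValueError("axis must be 'X' or 'Y'")
    angle = max(0, min(180, int(round(angle))))
    return f"{axis}:{angle}\n"

# On the Jetson, these lines would be written to the Itsy Bitsy over
# serial, e.g. with pyserial (port name is an assumption):
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   port.write(servo_command("X", 97.4).encode())

print(servo_command("X", 97.4))  # -> "X:97\n"
print(servo_command("Y", -15))   # clamped -> "Y:0\n"
```

Clamping on the host keeps a misbehaving control loop from commanding angles the servos cannot reach.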


See it in action: in this video, Artemis has been directed to find a pair of scissors (YouTube).

[Image: The processing (core)]

[Image: The device (core)]

[Image: The glasses (glasses)]


To run the software, you'll need Python3 with the following modules installed:


Then clone the repo:

git clone

Switch to the artemis directory, then run:



Fritzing Diagram

The Fritzing file can be downloaded here.

Bill of Materials

  • NVIDIA Jetson AGX Xavier
  • Adafruit Itsy Bitsy M4 Express
  • USB Webcam
  • 2 x micro servos (e.g. Tower Pro SG92R)
  • Laser diode
  • NPN transistor
  • Half breadboard
  • Glasses / sunglasses
  • Miscellaneous copper wire

About the Author

Nick A. Bild, MS