Proposed Interface

A project log for Owi arm controlled by a LEAP Motion

The idea is to create a non-conventional interface to control a robotic arm

Giovanni Leal 08/24/2018 at 22:39

The Leap Motion input interface enables interaction between the user and the robotic arm by capturing the motions and gestures made by the user's hands. The developed software receives the motion and gesture information in real time and processes it according to the commands implemented in the function below (Fig. 3). These commands depend on the hand being used (right or left, which the Leap Motion detects on its own) as well as on the position of the hand along the X, Y, and Z axes, with (0, 0, 0) defined as the center of the interaction space. Hand gestures made by the user are also taken into account. When any of these triggers is detected, the corresponding command is sent to the Arduino board, which sends electrical signals through the Adafruit Motor Shield to move the robotic arm according to the user's motions and/or gestures. All of these factors determine the command: a first letter (R for right or L for left) followed by a number corresponding to a selected ASCII character.
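As a rough illustration of that encoding, a host-side function might map a hand's handedness and position to such a command string. This is only a sketch of the idea: the function name, the axis threshold, and the action codes below are assumptions for illustration, not the project's actual values.

```python
# Hypothetical sketch of the command encoding described above:
# the first letter identifies the hand (R or L) and a number
# selects the action, sent to the Arduino as an ASCII string.
def encode_command(hand, x, y, z, threshold=50.0):
    """Map a hand's position (relative to the (0, 0, 0) center)
    to a command string, or None if the hand is near the center.

    The deadzone threshold and action codes are illustrative
    assumptions, not the project's real mapping.
    """
    prefix = "R" if hand == "right" else "L"
    # Pick the axis with the largest displacement from the center.
    axis, value = max(zip("XYZ", (x, y, z)), key=lambda p: abs(p[1]))
    if abs(value) < threshold:
        return None  # hand near the center: no command issued
    # Illustrative action codes: one per axis direction.
    codes = {("X", True): 1, ("X", False): 2,
             ("Y", True): 3, ("Y", False): 4,
             ("Z", True): 5, ("Z", False): 6}
    return prefix + str(codes[(axis, value > 0)])
```

For example, a right hand pushed far along +X would produce a command like "R1", which the Arduino sketch would parse to pick a motor and direction on the Motor Shield.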

This formula corresponds to a two-hand configuration setup.
