08/25/2018 at 15:14 •
Well... we will give you the data and you'll draw your own conclusions.
First, the methodology:
- The project was developed with an agile software development framework (DAD).
- It went through three incremental iterations, where the issues found in each one were addressed by adding new functionality in the next.
- There are two configurations (natural and mirror).
- The three iterations consisted of one arm, two arms, and two arms controlled remotely.
- The measurements and indicators were defined for two kinds of observations: the user's perception of comfort, and performance in completing the tasks.
So we started on the first iteration:
- One robotic arm
- Interaction with one hand
- Mirror and natural configuration
- One Leap Motion sensor
- Two tasks
As you can see in the first one, we also marked the space where the sensor picks up the hand. The results are the following:
The learning curve shows that the system provides an easy-to-use interface.
And the feedback was pretty good!
Overall everyone was very pleased. Keep in mind that the tasks were designed to take no more than 10 minutes, because you get tired from holding your arms straight in the air. For example, BMW now equips some of its cars with this kind of gesture control, but only for short commands.
So as always feel free to ask us anything!
08/24/2018 at 22:39 •
The Leap Motion input interface enables the interaction
between the user and the robotic arms by capturing the
motions and gestures made by the user’s hands. The
developed software receives the information of the motions
and gestures in real-time and then processes it according to
the commands presented in the function below (Fig. 3).
These commands depend on the hand being used (right or
left, which the Leap Motion can detect on its own) as well as
on the position of the hand over the X, Y, and Z axis,
defining (0, 0, 0) as the center of the space. Additionally,
hand gestures made by the user are taken into account. In the
case that any of these triggers is detected, the corresponding
command is sent to the Arduino board, which is responsible
for sending electrical signals through the Adafruit MotorShield
to move a robotic arm according to the user's motions
and/or gestures. All of these factors determine the first letter
of the command (R for right or L for Left) and a number
corresponding to a selected ASCII character.
This formula corresponds to a two-hand configuration setup.
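To make that encoding concrete, here is a rough Python sketch of how such a mapping could look. The threshold, direction letters, and ASCII codes below are our own illustrative picks, not the project's actual values:

```python
# Hypothetical sketch of the command encoding described above: the hand
# label ("R" or "L") plus a number taken from an ASCII character chosen
# from the palm's position relative to the (0, 0, 0) center of the space.
# All thresholds and codes here are illustrative, not the real ones.

THRESHOLD = 40  # mm dead zone around the center (assumed value)

def encode_command(is_right_hand, x, y, z):
    """Map a palm position to a command string like 'R82'."""
    prefix = "R" if is_right_hand else "L"
    # Pick the axis whose displacement from center is largest.
    axis, value = max((("x", x), ("y", y), ("z", z)), key=lambda p: abs(p[1]))
    if abs(value) < THRESHOLD:
        code = ord("S")  # inside the dead zone: stop / neutral
    else:
        # One ASCII character per direction on each axis (assumed choices).
        codes = {("x", True): ord("R"), ("x", False): ord("L"),
                 ("y", True): ord("U"), ("y", False): ord("D"),
                 ("z", True): ord("F"), ("z", False): ord("B")}
        code = codes[(axis, value > 0)]
    return prefix + str(code)
```

A string like this would then be sent over serial to the Arduino, which decodes the first letter to pick the arm and the number to pick the motion.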
08/24/2018 at 15:17 •
Normally when people see the project they always get interested in the arm (even though the Leap is much more interesting), so let's talk about it.
The arm is not a great robotic arm... in fact it's a pretty crappy robotic arm, because it's a toy!!! Having said that, it is one of the best toys there is. Really, if you have a child and it's age appropriate, go ahead and buy it for them.
We chose the arm because it's cheap and the components are easy to get to. Also, it was a blast to assemble.
In the instructions we didn't talk about ditching the controller, because once you assemble it, it will be obvious.
The shortcomings of the OWI are pretty narrow but important... it's a DC-motor-based arm, so there is no positioning system. We encourage you to find a servo-based one, implement it, and tell us about it. That would take real advantage of the Leap and would make a great project.
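If you do go servo-based, the nice part is that the Leap's absolute palm position can be mapped directly to an absolute joint angle, which the DC motors can't do. A minimal sketch of that idea (all ranges here are assumptions, not measured values):

```python
# Illustrative sketch of what a servo-based arm would enable: mapping the
# Leap's palm height directly to an absolute servo angle. The sensor range
# and angle limits below are assumed values, not this project's numbers.

def palm_to_servo_angle(palm_y_mm, y_min=80.0, y_max=400.0,
                        angle_min=0.0, angle_max=180.0):
    """Linearly map palm height (mm above the sensor) to a servo angle,
    clamping to the sensor's usable range."""
    y = max(y_min, min(y_max, palm_y_mm))
    t = (y - y_min) / (y_max - y_min)
    return angle_min + t * (angle_max - angle_min)
```

With something like this, holding your hand still holds the arm still at a known pose, instead of nudging a DC motor on and off.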
Here in Hackaday you can also find how to make your own: https://hackaday.io/project/181-mearm-your-robot
08/24/2018 at 14:40 •
The study of the ways in which humans interact with
robots is a multidisciplinary field with multiple contributions
from electronics, robotics, human-computer interaction,
ergonomics and even social sciences. The robotics industry is
mainly focused on the development of conventional
technologies that improve efficiency and reduce the amount of
repetitive work. To achieve this, enterprises must train their
technical staff to accompany the robot when performing tasks,
during configuration and technical programming for proper
operation. Taking the latter into account, the development and
creation of unconventional interfaces for interaction between
humans and robots is critical, because they allow for natural
control over a robot, generating wide acceptance and broad
use across a wide range of possible tasks. This
paper presents the challenges in the design, implementation
and testing of a hand-based interface to control two robotic
arms, and the benefits of this technology, which sits at the
intersection of robotics and human interaction.
12/20/2017 at 19:56 •