
Transcript

An event log for the Training Robots by Touch Hack Chat

We're talking about training robots with Kent Gilson, inventor of the Dexter Robotic Arm

Lutetium 08/03/2018 at 20:21 • 0 Comments

Haddington Dynamics joined the room. 2:56 PM

Okay! Let's get started. Welcome to Hack Chat, @Haddington Dynamics, it's great to have you here!

Haddington Dynamics3:02 PM
Great to be here

It would be great if people can put their questions at https://hackaday.io/event/160038-training-robots-by-touch-hack-chat so we can keep the chat going smoothly. While we wait for questions to roll in, can you tell us a bit about your work?

kent gilson joined the room. 3:03 PM

Haddington Dynamics3:03 PM
Kent will answer that directly - he is signing in now.

kent gilson3:05 PM
Hello and thanks for the invite to talk about Dexter. I have been doing tech for about 35 years (yes, I'm old), starting with early 8080 processors and eventually working with FPGAs and supercomputers.

kent gilson3:06 PM
I have always loved robots, and after I sold my supercomputer company I decided to build an FPGA-controlled robot.

Cool! Our first question is from @Lutetium: "What makes a robot 'trainable'? What software do you use for this?"

Haddington Dynamics3:10 PM
Kent is answering this, but in short: we wrote our own software for this.

Haddington Dynamics3:10 PM
He is very slow at typing.

Orlando Hoilett3:11 PM
Is it rude if I ask how long the software took to write? It's totally in jest. ;) Lol.

Haddington Dynamics3:12 PM
He codes fast creating sentences :)

Boian Mitov3:12 PM
I would expect the software uses some type of AI Classifiers. What kind?

Orlando Hoilett3:12 PM
Lol

Orlando Hoilett3:12 PM
Nice! You need to be efficient at the right things.

@Orlando Hoilett let's keep questions to https://hackaday.io/event/160038-training-robots-by-touch-hack-chat so that the chat stays easy to follow :)

Haddington Dynamics3:12 PM
No AI classifiers.

kent gilson3:15 PM
I had an idea for creating a very high-resolution optical encoder that could change the way robots are controlled, by precisely measuring each axis directly at the point of rotation. This needs to be done at the arc-second scale (1,296,000 points per revolution). In order to do this, you need to calculate very fast (that's why we need an FPGA). Because we can measure very precisely, we can measure the difference between where we told the stepper motors to go and where the axis actually is. This is an emergent torque sensor. So, we can grab the robot and it feels us and moves out of the way.
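The idea Kent describes can be sketched in a few lines. This is a hypothetical illustration only: the function names and the stiffness constant are assumptions, not Dexter's actual code or calibration values.

```javascript
// Hypothetical sketch of the "emergent torque sensor" idea described above.
const COUNTS_PER_REV = 1296000; // one arc-second of resolution per count

// Convert an encoder count delta to degrees of joint rotation.
function countsToDegrees(counts) {
  return counts * (360 / COUNTS_PER_REV);
}

// The stepper drive gives a commanded position; the optical encoder gives
// the measured position. Their difference is the joint's deflection, which
// is roughly proportional to the external torque applied to the arm.
function estimateTorque(commandedCounts, measuredCounts, stiffnessNmPerDeg) {
  const deflectionDeg = countsToDegrees(measuredCounts - commandedCounts);
  return deflectionDeg * stiffnessNmPerDeg;
}
```

Grabbing the arm shows up as a growing deflection; a controller can then move the commanded position toward the measured one, which is what makes the robot "move out of the way."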

Our next question is from @Kevin Cheng "Have you ever tried to use any ROS stack? Why or why not?"

kent gilson3:17 PM
So, training is just pushing the robot around and recording all of the paths, then playing them back.

kent gilson3:17 PM
We have used ROS and we plan on supporting it even more in the near future.

kent gilson3:19 PM
We also have our own JavaScript development environment, DDE (Dexter Development Environment).

kent gilson3:19 PM
And, we have a TCP/IP socket that exposes the API on the robot so that any language can be used

kent gilson3:20 PM
One of our Kickstarter backers built a Unity3D asset stack that interfaces with the socket API

Interesting! Our next question is from @Josh Starnes: "How does a robot 'learn'? I mean, does it record data to a table and eventually use that data somehow?"

kent gilson3:26 PM
As we move the robot around and capture paths and points, we store this in a JSON file that is re-playable. We can also send this movement data over to a second robot connected through the internet and control the remote robot. This is where we plan to use AI and machine learning, by using the human movement data to train an AI.
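The record-and-replay flow Kent describes can be sketched like this. The field names are illustrative, not Dexter's actual JSON schema.

```javascript
// Hypothetical "training by touch" sketch: while a human guides the arm,
// sampled poses are appended to a path; replaying steps through the poses.
function makeRecorder() {
  const path = [];
  return {
    // Called at each sample tick while the human moves the arm.
    sample(x, y, z, roll, pitch, yaw) {
      path.push({ x, y, z, roll, pitch, yaw });
    },
    // Serialize the captured path so it can be saved to disk or streamed
    // to a remote robot over the internet.
    toJSON() {
      return JSON.stringify(path);
    },
  };
}

// Replay a serialized path through any robot's move callback, which is
// what lets the same file drive a different robot.
function replay(json, moveTo) {
  for (const pose of JSON.parse(json)) {
    moveTo(pose);
  }
}
```

Because the file is just XYZ and pose data, the `moveTo` callback can belong to a second robot on the other side of the internet, or to a different type of robot entirely.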

Rebecca E. Skinner3:27 PM
Is the JSON file or program specific to a given physical environment, or general to a class of actions?

kent gilson3:28 PM
This remote robot training is very exciting to us. Humans will be able to interact remotely with each other through their robots

kent gilson3:29 PM
The JSON file is XYZ and pose data, so you could use it to program another type of robot.

Rebecca E. Skinner3:29 PM
So the concept of agency will be expanded to include the physical instantiation of one's personal robot. That is an interesting development.

Our next question is from @Frank Buss: "Do you think an AI needs to control a robot and interact with the physical world to develop self-consciousness and sentience? One with torque feedback sounds perfect for this."

kent gilson3:30 PM
Yes, we talk about being able to project your presence all over the globe.

Rebecca E. Skinner3:31 PM
You mean, through having multiple robots in various geographical locations?

kent gilson3:33 PM
Rebecca: yes. And once there are millions of robots around the world, then anyone can go anywhere and help out.

Frank Buss3:33 PM
telepresence robots can be a security problem: https://io9.gizmodo.com/watch-as-a-hacker-frees-this-telepresence-robot-from-it-1663636319

Frank Buss3:35 PM
And imagine an AI hacks them; the Singularity will happen soon after.

kent gilson3:35 PM
Frank: I think AI will need to experience similar things as humans in order to have context that we will recognize as sentience

Haddington Dynamics3:36 PM
@Frank Buss you are absolutely right. That is where we are focused on a ground truth system. These robots will have a PUF (physically unclonable function) and a keystore.

Frank Buss3:36 PM
but it could be in an emulated world

anfractuosity3:36 PM
@Haddington Dynamics - i'm curious if silicon PUFs can be forged with a FIB

kent gilson3:37 PM
True, but then we have the brain-in-a-vat computational problem. The real world is much easier to simulate.

Haddington Dynamics3:37 PM
What is an FIB?

anfractuosity3:37 PM
focused ion beam

Rebecca E. Skinner3:38 PM
Sentience as humans have it is a rather high-level challenge: goals, dreams, moral principles, pragmatic rules, self-preservation. You could certainly imagine using this system as proposed in a fantastic but much more limited way: for instance, a distributed army of robots that just clean the streets and apply bioremediation chemicals in Superfund sites, but that don't have scripts that involve being "a real person".

kent gilson3:38 PM
There are multiple ways to create a PUF. I assume FIB could be one of those.

@kent gilson could you tell us a bit about future plans?

anfractuosity3:38 PM
I mean to clone a PUF using a FIB.

kent gilson3:39 PM
Then the original PUF is not "provably" unique.

Frank Buss3:40 PM
And someone controls the robots; it can in theory be made unhackable with public/private keys, but what if the controlling computer is hacked?

anfractuosity3:40 PM
http://users.sec.t-labs.tu-berlin.de/~nedos/host2013.pdf

anfractuosity3:40 PM
heh been done apparently already

kent gilson3:40 PM
We are connecting the PUF and the keystore directly to the FPGA, not accessible by the processor and OS.

Rebecca E. Skinner3:43 PM
Could you explain a bit more about FIB and its role in physically unclonable function ?

Frank Buss3:43 PM
@Rebecca E. Skinner we don't know whether a highly complex AI/robot system might develop sentience on its own; it might emerge in any sufficiently sophisticated AI system.

anfractuosity3:44 PM
@kent gilson so the results from the PUF are used for what out of interest?

Rebecca E. Skinner3:44 PM
Understood. You could also get something incredibly useful in the real world without reaching for any sort of general AI sentience.

Frank Buss3:45 PM
or we could produce a paperclip AI which gets super intelligent and consumes the whole universe producing paperclips

kent gilson3:47 PM
The book Life 3.0 has this dystopian examination.

Frank Buss3:48 PM
right, just finished reading it :-)

Boian Mitov3:48 PM
Well... it all boils down to the definition of life, and what is a better life... Who is to say that the clips don't deserve to take over the world... :-D

Frank Buss3:49 PM
I don't think it is very dystopian, but realistic. Better than the all-optimistic and happy views of Kurzweil.

kent gilson3:49 PM
We like the idea of humans interacting with other humans through haptic interfaces. This may give us some time to further explore the implications of AIs using our robot infrastructure to do nefarious things.

Alright. I know @kent gilson wanted to show us a live feed of training the robot, so let's wrap up the questions and head over to the Zoom feed! https://zoom.us/j/811893179

kent gilson3:52 PM
Thanks everyone. We will start the Zoom demo in 5 minutes so everyone who wants to join has time.

Boian Mitov3:52 PM
It's not going to install one of them remote controlled robots that are out to kill us... will it? (The install you posted ;-) )

kent gilson3:52 PM
lol

Boian Mitov3:52 PM
Sorry, had to put a shirt on.

Boian Mitov3:53 PM
I am in California :-D

