Haddington Dynamics joined the room. 2:56 PM
Okay! Let's get started. Welcome to Hack Chat, @Haddington Dynamics, it's great to have you here!
Great to be here
It would be great if people can put their questions at https://hackaday.io/event/160038-training-robots-by-touch-hack-chat so we can keep the chat going smoothly. While we wait for questions to roll in, can you tell us a bit about your work?
Kent will answer that directly - he is signing in now.
Hello and thanks for the invite to talk about Dexter. I have been doing tech for about 35 years (yeah, I'm old), starting with early 8080 processors and eventually working with FPGAs and supercomputers
I have always loved robots, and after I sold my supercomputer company I decided to build an FPGA-controlled robot
Cool! Our first question is from @Lutetium: "What makes a robot 'trainable'? What software do you use for this?"
Kent is answering this, but we wrote our own software to do it.
He is very slow at typing.
Is it rude if I ask how long the software took to write? It's totally in jest. ;) Lol.
He codes fast; creating sentences, not so much :)
I would expect the software uses some type of AI classifier. What kind?
Nice! You need to be efficient at the right things.
@Orlando Hoilett let's keep questions to https://hackaday.io/event/160038-training-robots-by-touch-hack-chat so that the chat stays easy to follow :)
No AI classifiers.
I had an idea for creating a very high-resolution optical encoder that could change the way robots are controlled, by precisely measuring each axis directly at the point of rotation. This needs to be done at the arc-second scale (1,296,000 points per revolution). To do that, you need to calculate very fast (that's why we need an FPGA). Because we can measure so precisely, we can measure the difference between where we told the stepper motors to go and where the axis actually is. That is an emergent torque sensor. So we can grab the robot, and it feels us and moves out of the way.
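To make that concrete, here is a rough sketch (in Python, not the actual FPGA logic) of how the gap between the commanded position and the measured encoder position can act as a torque signal that the controller yields to. The function and constant names are hypothetical placeholders, not Dexter's real API.

```python
ARCSEC_PER_REV = 360 * 3600  # 1,296,000 arc-second "points" per revolution

def arcseconds(counts, counts_per_rev):
    """Convert raw encoder counts to arc-seconds of joint rotation."""
    return counts * ARCSEC_PER_REV / counts_per_rev

def compliance_step(commanded, measured, gain=0.1, deadband=20.0):
    """Treat position error (arc-seconds) as a torque proxy: if the joint
    is deflected past a small deadband, move the commanded position toward
    the measured one, so the arm yields in the direction of the push."""
    error = measured - commanded      # deflection caused by external force
    if abs(error) > deadband:         # someone is pushing the arm
        commanded += gain * error     # give way in that direction
    return commanded
```

On the real robot this loop would run in the FPGA at a much higher rate; Python here is only for readability.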
Our next question is from @Kevin Cheng "Have you ever tried to use any ROS stack? Why or why not?"
So, training is just pushing the robot around and recording all of the paths, then playing them back.
We have used ROS, and we plan on supporting it even more in the near future
And we have a TCP/IP socket that exposes the API on the robot, so that any language can be used
One of our Kickstarter backers built a Unity3D asset stack that interfaces with the socket API
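For anyone curious what talking to a socket-exposed robot API looks like, here is a minimal Python sketch. The hostname, port, and command string are invented for illustration; check the Dexter documentation for the real wire format.

```python
import socket

# Connect to the robot's TCP/IP API (address and port are assumptions).
with socket.create_connection(("dexter.local", 50000), timeout=5) as sock:
    sock.sendall(b"move_to 0 45 90 0 0;")  # made-up command syntax
    reply = sock.recv(4096)                # robot's acknowledgement
    print(reply.decode())
```

Because it is just a socket, the same exchange works from C, Unity/C#, the browser, or anything else that can open a TCP connection.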
Interesting! Our next question is from @Josh Starnes "How does a robot "learn" I mean does it record data to a table and eventually use that data somehow?"
As we move the robot around and capture paths and points, we store this in a JSON file that is replayable. We can also send this movement data over to a second robot connected through the internet and control the remote robot. This is where we plan to use AI and machine learning: using the human movement data to train an AI
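A minimal sketch of that record-and-replay loop, assuming a simple JSON layout of timestamped joint poses (the actual Dexter file format may differ):

```python
import json
import time

def record(poses, path="training_run.json"):
    """Save a list of {'t': seconds, 'joints': [...]} samples to JSON."""
    with open(path, "w") as f:
        json.dump(poses, f)

def replay(path, send):
    """Replay a recorded run, calling send(joints) at each sample's time.
    send() could drive the local robot or a remote one over the network."""
    with open(path) as f:
        poses = json.load(f)
    start = time.monotonic()
    for pose in poses:
        delay = pose["t"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)  # wait until this sample's timestamp
        send(pose["joints"])

# Example: replay to a stand-in sender that just prints the joint angles.
# replay("training_run.json", send=print)
```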
Is the JSON file or program specific to a given physical environment, or general to a class of actions?
This remote robot training is very exciting to us. Humans will be able to interact remotely with...