
Training Robots by Touch Hack Chat

We're talking about training robots with Kent Gilson, inventor of the Dexter Robotic Arm

Friday, August 3, 2018, 12:00 pm PDT

Kent Gilson will be hosting the Hack Chat on Friday, August 3rd, 2018 at noon PDT. 

Time zones got you down? Here's a handy time converter!


Kent Gilson is a prolific inventor, serial entrepreneur and pioneer in reconfigurable computing (RC). 

After programming his first computer game at age 12, Kent has launched eight entrepreneurial ventures, won multiple awards for his innovations, and created products and applications used in numerous industries across the globe. His reconfigurable recording studio, Digital Wings, won Best of Show at the National Association of Music Merchants and received the Xilinx Best Consumer Reconfigurable Computing Product Award. 

Kent is also the creator of Viva, an object-oriented programming language and operating environment that for the first time harnessed the power of field-programmable gate arrays (FPGAs) for general-purpose supercomputing.

In this chat Kent will be answering questions about:
 - Building trainable robots.
 - Developing robotics haptics.
 - Training robots to manufacture.
 - Heterogeneous direct digital manufacturing.

Learn more about Dexter in the video below!

  • Transcript

    Lutetium 08/03/2018 at 20:21

    Haddington Dynamics joined the room. 2:56 PM

    Okay! Let's get started. Welcome to Hack Chat, @Haddington Dynamics, it's great to have you here!

    Haddington Dynamics 3:02 PM
    Great to be here

    It would be great if people can put their questions at https://hackaday.io/event/160038-training-robots-by-touch-hack-chat so we can keep the chat going smoothly. While we wait for questions to roll in, can you tell us a bit about your work?

    kent gilson joined the room. 3:03 PM

    Haddington Dynamics 3:03 PM
    Kent will answer that directly - he is signing in now.

    kent gilson 3:05 PM
    Hello and thanks for the invite to talk about Dexter. I have been doing tech for about 35 years (yeah, I'm old), starting with early 8080 processors and eventually working with FPGAs and supercomputers.

    kent gilson 3:06 PM
    I have always loved robots, and after I sold my supercomputer company I decided to build an FPGA-controlled robot.

    Cool! Our first question is from @Lutetium: "What makes a robot 'trainable'? What software do you use for this?"

    Haddington Dynamics 3:10 PM
    Kent is answering this, but we wrote our own software for this.

    Haddington Dynamics 3:10 PM
    He is very slow at typing.

    Orlando Hoilett 3:11 PM
    Is it rude if I ask how long the software took to write? It's totally in jest. ;) Lol.

    Haddington Dynamics 3:12 PM
    He codes fast creating sentences :)

    Boian Mitov 3:12 PM
    I would expect the software uses some type of AI Classifiers. What kind?

    Orlando Hoilett 3:12 PM
    Lol

    Orlando Hoilett 3:12 PM
    Nice! You need to be efficient at the right things.

    @Orlando Hoilett let's keep questions to https://hackaday.io/event/160038-training-robots-by-touch-hack-chat so that the chat stays easy to follow :)

    Haddington Dynamics 3:12 PM
    No AI classifiers.

    kent gilson 3:15 PM
    I had an idea for creating a very high-resolution optical encoder that could change the way robots are controlled, by precisely measuring each axis directly at the point of rotation. This needs to be done at the arc-second scale (1,296,000 points per revolution). In order to do this, you need to calculate very fast (that's why we need an FPGA). Because we can measure very precisely, we can measure the difference between where we told the stepper motors to go and where the axis actually is. This is an emergent torque sensor. So, we can grab the robot and it feels us and moves out of the way.
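A rough sketch of that emergent-torque idea, for readers who want to picture it: compare where the stepper was commanded to be with where the encoder says the axis actually is, and treat the error as an applied-torque estimate. Everything below (function names, stiffness constant, threshold) is a hypothetical JavaScript illustration, not Dexter's actual firmware or API.

```javascript
// Emergent torque sensing: position error between commanded and measured
// joint angle is used as a torque estimate. Constants are made up.
const COUNTS_PER_REV = 1296000;              // arc-second resolution, per the chat
const DEG_PER_COUNT = 360 / COUNTS_PER_REV;  // one count ~ 0.00028 degrees
const NM_PER_DEG = 2.0;                      // hypothetical joint stiffness (N*m per degree of error)
const YIELD_THRESHOLD_NM = 0.5;              // above this, assume a human is pushing the arm

// Placeholder I/O - on the real robot these would talk to the FPGA.
function readEncoderCounts(joint) { return 0; }
function readCommandedCounts(joint) { return 0; }
function commandCounts(joint, counts) { /* send a new target to the motor */ }

// One control tick for a single joint: estimate torque, and if someone is
// pushing the arm, move the commanded position toward the measured one so
// the joint "feels us and moves out of the way".
function complianceTick(joint) {
  const measured = readEncoderCounts(joint);
  const commanded = readCommandedCounts(joint);
  const errorDeg = (commanded - measured) * DEG_PER_COUNT;
  const torqueEstimate = errorDeg * NM_PER_DEG;

  if (Math.abs(torqueEstimate) > YIELD_THRESHOLD_NM) {
    commandCounts(joint, measured);          // yield to the external force
  }
  return torqueEstimate;
}
```

The design point Kent describes is that no separate force sensor is needed: at arc-second resolution, the position error itself becomes a usable torque signal.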

    Our next question is from @Kevin Cheng: "Have you ever tried to use any ROS stack? Why or why not?"

    kent gilson 3:17 PM
    So, training is just pushing the robot around and recording all of the paths, then playing them back.

    kent gilson 3:17 PM
    We have used ROS, and we plan on supporting it even more in the near future.

    kent gilson 3:19 PM
    We also have our own JavaScript development environment, DDE (Dexter Development Environment).

    kent gilson 3:19 PM
    And we have a TCP/IP socket that exposes the API on the robot, so that any language can be used.
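Since the robot exposes its API over a plain TCP/IP socket, any language with a socket library can drive it. Here is a minimal Node.js sketch of that idea; the IP address, port, and command string are placeholders, as the chat does not spell out the actual wire format.

```javascript
// Minimal sketch of driving a robot over a raw TCP socket from Node.js.
// Host, port, and command text are hypothetical, not Dexter's real protocol.
const net = require('net');

const ROBOT_HOST = '192.168.1.142';   // hypothetical robot IP
const ROBOT_PORT = 50000;             // hypothetical port

const socket = net.createConnection(ROBOT_PORT, ROBOT_HOST, () => {
  // Hypothetical command; substitute whatever the robot's API actually expects.
  socket.write('move_all_joints 0 45 90 0 0\n');
});

socket.on('data', (reply) => {
  console.log('robot replied:', reply.toString());
  socket.end();
});

socket.on('error', (err) => console.error('socket error:', err.message));
```

The same pattern works from Python, C, Unity, or anything else that can open a socket, which is how the Unity3D asset mentioned below can interface with the robot.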

    kent gilson 3:20 PM
    One of our Kickstarter backers built a Unity3D asset stack that interfaces with the socket API

    Interesting! Our next question is from @Josh Starnes: "How does a robot 'learn'? I mean, does it record data to a table and eventually use that data somehow?"

    kent gilson 3:26 PM
    As we move the robot around and capture paths and points, we store this in a JSON file that is re-playable. We can also send this movement data over to a second robot connected through the internet and control the remote robot. This is where we plan to use AI and machine learning, by using the human movement data to train an AI.
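A minimal sketch of the record-and-replay workflow described above: push the arm around, sample poses, save them as a re-playable JSON file, and feed them back to the same robot or a remote one. The sampling function, joint count, file name, and pose format are all assumptions for illustration; Dexter's real path files may look quite different.

```javascript
// Record joint poses while a human moves the arm, save as JSON, then replay.
const fs = require('fs');

function readJointAngles() {            // placeholder for the real encoder read
  return [0, 0, 0, 0, 0];               // five joint angles, in degrees
}

// Record: sample the arm at a fixed rate while it is being pushed by hand.
function recordPath(samples, intervalMs) {
  const path = [];
  return new Promise((resolve) => {
    const timer = setInterval(() => {
      path.push({ t: Date.now(), joints: readJointAngles() });
      if (path.length >= samples) {
        clearInterval(timer);
        fs.writeFileSync('trained_path.json', JSON.stringify(path, null, 2));
        resolve(path);
      }
    }, intervalMs);
  });
}

// Replay: read the JSON back and hand each pose to a sender function, which
// could command the local robot or forward the data to a remote one.
function replayPath(sendPose) {
  const path = JSON.parse(fs.readFileSync('trained_path.json', 'utf8'));
  for (const point of path) {
    sendPose(point.joints);
  }
}
```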

    Rebecca E. Skinner 3:27 PM
    Is the JSON file or program specific to a given physical environment, or general to a class of actions?

    kent gilson 3:28 PM
    This remote robot training is very exciting to us. Humans will be able to interact remotely with...

    Read more »



Discussions

Jonathan Díaz wrote 08/03/2018 at 19:32 point

Hi Kent. First of all, great work on the Dextra hand!

We have been working on our own and are currently stuck on the sensor question; we even made our own in this project: https://hackaday.io/project/159728-pulse-sensor-to-actuate-a-robotic-hand. We would love your opinion on the matter, or a recommendation for an off-the-shelf component better than the one we have made.

Frank Buss wrote 08/03/2018 at 19:28 point

Do you think an AI needs to control a robot and interact with the physical world to become self-conscious and sentient? One with torque feedback sounds perfect for this.

Josh Starnes wrote 08/03/2018 at 19:22 point

What language do you use, what is your favorite and where did you learn?

Josh Starnes wrote 08/03/2018 at 19:22 point

How does a robot "learn"? I mean, does it record data to a table and eventually use that data somehow?

Kevin Cheng wrote 08/03/2018 at 19:14 point

Have you ever tried to use any ROS stack? Why or why not?

Lutetium wrote 08/03/2018 at 19:06 point

What makes a robot "trainable"? What software do you use for this?
