
Cognitive Developmental Robotics Platform

Trainable personal robots that behave and learn naturally, inspired by neuroscience.

A platform for modular, customizable personal robots that learn and make decisions on their own.

Behavioral Robotics explores building personal robots capable of lifelike responses and self-directed learning. The open platforms leverage modularity and customization to enable a diverse range of robotic organisms. Configurable brains and sensors allow the bots to learn and adapt in home environments, and by sharing their discoveries, the collective intelligence of the swarm compounds. The goal is responsive, autonomous robots that behave less like machines and more like organisms in nature - reacting intelligently to their surroundings. Behavioral Robotics aims to push robotics closer to that biomimetic future through accessible, extensible maker platforms.

  • Internal representation using HTM

    Holotype Robotics · 11/26/2023 at 22:40

    To be able to replicate even a basic brain, it's important to define the scope of what brains do.
    At its core, a brain learns models of its senses and environment and makes predictions from them in order to take reasoned actions. This ability to represent reality is key for context-aware decisions.

    All robots represent their environment internally in some fashion. Whether implicitly programmed or learned, these internal models of the external world, built from sensory inputs, are what let a robot respond intelligently to the current situation.

    In mammals, the cortex is the part of the brain that does this.
    As the seat of perception, thought, and memory, the cortex contains hierarchical networks that learn models of the senses across time and space, enabling an understanding of reality.

    The cortex can be thought of as a common subspace for all of the "sensors". The cortical hierarchy integrates different sensory inputs into a common framework of knowledge about time and space, and this convergence enables inference and prediction.

    The cortex's operation can be boiled down to the algorithms described by HTM theory. HTM provides a computational theory capturing core mechanisms of the laminar neocortex, such as spatial pooling, temporal memory, and sensor fusion.

    This is how all representations of the environment will be stored for our robot. By implementing an HTM framework, our robot can build and leverage a learned model of reality just like animal brains, enabling meaningful responses.
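
    For illustration only, here's a rough sketch of the kind of encoder that turns a raw sensor reading into a sparse distributed representation (SDR) before it reaches an HTM region. The vector size, number of active bits, and value range are arbitrary assumptions for the example, not the project's actual parameters.

        import numpy as np

        def encode_scalar(value, min_val=0.0, max_val=100.0, size=400, active_bits=21):
            # Encode a scalar sensor reading as a sparse binary vector (SDR-style).
            # A contiguous block of active bits is placed at a position proportional
            # to the value, so similar readings share many active bits.
            value = np.clip(value, min_val, max_val)
            span = size - active_bits
            start = int(round((value - min_val) / (max_val - min_val) * span))
            sdr = np.zeros(size, dtype=np.uint8)
            sdr[start:start + active_bits] = 1
            return sdr

        # Two similar distance readings overlap heavily; a distant one barely overlaps.
        a = encode_scalar(42.0)
        b = encode_scalar(43.5)
        c = encode_scalar(90.0)
        print(np.sum(a & b), np.sum(a & c))   # high overlap vs. no overlap

    Overlapping representations for similar inputs are what allow the higher layers to generalize, so encoder design matters as much as the learning algorithms themselves.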

    HTM in Robots

    HTM has compelling advantages over conventional programming:

    1. It learns invariant representations automatically from real-time data, rather than relying on manually defined feature filters.

    2. The hierarchical structure supports high-level inferences and knowledge transfer built on sequences of low-level observations.

    3. Sensor fusion happens intrinsically, by matching patterns across data streams in time and space.

    4. Making predictions, rather than relying on static observations, enables smarter real-time decisions (see the sequence-memory sketch below).

    5. The neocortex-inspired approach handles novel inputs better than task-specific systems.

    By more closely replicating cortical structure, HTM-based robots should display more adaptive, animal-like responses. This project explores that potent concept through biologically inspired robotics!
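
    As a tiny illustration of point 4 above, here is a toy first-order sequence memory (far simpler than HTM's temporal memory). It learns which sensor pattern tends to follow which and reports "surprise" when its prediction fails - a signal a robot could use to slow down or explore rather than blindly continuing. The class name and the four-bit patterns are made up for the example.

        import numpy as np

        class TinySequenceMemory:
            # Toy first-order predictor: remembers which pattern tends to follow which.
            def __init__(self):
                self.transitions = {}   # previous pattern -> expected next pattern
                self.prev = None

            def step(self, pattern):
                key = pattern.tobytes()
                predicted = self.transitions.get(self.prev)
                surprise = predicted is None or predicted != key
                if self.prev is not None:
                    self.transitions[self.prev] = key   # learn the observed transition
                self.prev = key
                return surprise

        memory = TinySequenceMemory()
        a = np.array([1, 0, 1, 0], dtype=np.uint8)
        b = np.array([0, 1, 0, 1], dtype=np.uint8)
        for pattern in [a, b, a, b, a, b]:
            print(memory.step(pattern))   # surprise fades as the sequence is learned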

    HTM Overview

    The HTM cortical learning algorithms contain three core functions:

    Spatial pooling - Pattern recognition on the input data that identifies common features regardless of their location, like visual edge detectors.

    Temporal memory - This memory system models time, discovering causes and effects by sequencing spatial patterns. It is analogous to grid cells activating at specific moments.

    Sensor fusion - HTM integrates different sensory inputs spatially and temporally, like hearing a bark and seeing a dog at the same time.

    By stacking regions and columns using these principles, HTM forms a hierarchical network that can capture increasingly sophisticated features and behaviors over broader timescales.
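
    To make the spatial pooling step concrete, here is a heavily simplified sketch of the core mechanism: overlap scoring, k-winners-take-all activation, and Hebbian-style permanence updates. It's a toy illustration of the idea, not Numenta's reference implementation, and the column count, sparsity, and learning rates are arbitrary assumptions.

        import numpy as np

        class TinySpatialPooler:
            # Toy spatial pooler: maps binary input vectors to a sparse set of columns.
            def __init__(self, input_size, num_columns=128, sparsity=0.05,
                         perm_inc=0.05, perm_dec=0.02, perm_threshold=0.5, seed=0):
                rng = np.random.default_rng(seed)
                # Each column keeps a permanence value for every input bit.
                self.permanences = rng.uniform(0.3, 0.7, size=(num_columns, input_size))
                self.num_active = max(1, int(num_columns * sparsity))
                self.perm_inc, self.perm_dec = perm_inc, perm_dec
                self.perm_threshold = perm_threshold

            def compute(self, input_sdr, learn=True):
                connected = self.permanences >= self.perm_threshold
                # Overlap: how many active input bits each column is connected to.
                overlaps = connected.astype(int) @ input_sdr
                # k-winners-take-all: columns with the highest overlap become active.
                active = np.argsort(overlaps)[-self.num_active:]
                if learn:
                    on = input_sdr.astype(bool)
                    # Strengthen synapses to active inputs, weaken synapses to the rest.
                    self.permanences[np.ix_(active, on)] += self.perm_inc
                    self.permanences[np.ix_(active, ~on)] -= self.perm_dec
                    np.clip(self.permanences, 0.0, 1.0, out=self.permanences)
                return np.sort(active)

        # Repeatedly presenting the same pattern stabilizes the columns representing it.
        sp = TinySpatialPooler(input_size=400)
        pattern = np.zeros(400, dtype=np.uint8)
        pattern[50:71] = 1
        for _ in range(20):
            columns = sp.compute(pattern)
        print(columns)

    The resulting active columns would then feed a temporal memory layer, which learns which column patterns tend to follow which in time.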

    For more information on Hierarchical Temporal Memory, watch this very useful playlist by Numenta.

  • Project Scope

    Holotype Robotics · 11/26/2023 at 22:37

    Before moving forward with this project, I feel it's important to explain its scope, since some aspects might not be immediately clear, especially if you're not familiar with the details. Let's make sure we're on the same page.

    Simply put, this project aims to make robots learn and adapt naturally, much like animals do. Instead of being explicitly programmed, they'll learn from real-world experiences. Picture robots navigating homes and offices, understanding the environment through touch, vision, and balance, just like infants do.

    What's unique? These robots won't follow preset instructions; they'll intelligently respond to unfolding scenarios. Think of them assessing situations and making decisions on the fly, blending internal states with external stimuli.

    And here's the kicker: the knowledge they gain won't stay confined to a single robot. Through a shared cloud knowledge base, robots will pass on their learned skills, accelerating collective learning.

    This means that anyone can train a robot, and someone else across the globe can use those learned skills for their robot.

    We're not replicating humanoid robots; the goal is versatile automatons that think smartly and act practically. This overview isn't rigid; it's just what we're aiming for - an exciting journey into a future of adaptable, useful robots.

    Key Scope Elements:

    Learn from Physical Interactions - Robots operate in home and office spaces, constructing knowledge in an unstructured fashion through movement and object interactions, much like infants. Touch, vision, balance - varied senses feed expanded neural models.

    Respond Intelligently - Rather than just pre-mapped behaviors, the robots assess scenarios as they unfold to determine appropriate activities. Goal-directed yet reactive actions couple internal state with external stimuli.

    Transfer Knowledge - Learned skills can transfer across robots through a cloud knowledge base, allowing capabilities to build rapidly by sharing experiences rather than isolating them (a rough sketch of this idea follows this list).

    Critical Associations - Correlating occurrences in time and space provides environmental insight: which movements cause loud noises, when doors tend to open, which objects appear together. Causal inference distills experience into useful knowledge.
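
    As a rough sketch of what the knowledge-transfer element could look like, the snippet below packages a robot's learned state (here, a permanence matrix like the one in the spatial pooling sketch) and uploads it to a shared store. The endpoint URL and payload format are hypothetical placeholders, not a real service.

        import json
        import urllib.request
        import numpy as np

        def export_learned_state(permanences, robot_id):
            # Package learned synapse permanences so another robot can import them.
            return {
                "robot_id": robot_id,
                "shape": list(permanences.shape),
                "permanences": permanences.round(3).flatten().tolist(),
            }

        def share_skill(state, url="https://example.com/skills"):  # placeholder URL
            # POST the learned state to the shared knowledge base.
            data = json.dumps(state).encode("utf-8")
            req = urllib.request.Request(url, data=data,
                                         headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return resp.status

        def import_learned_state(state):
            # Rebuild the permanence matrix on a different robot.
            return np.array(state["permanences"]).reshape(state["shape"])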

  • Early attempts at robot behavior.

    Holotype Robotics · 11/17/2023 at 06:17

      Early on, I experimented with emergent behaviors using a simple evolutionary neural network algorithm. While it sounds complex, the core idea was straightforward - simulate natural evolution to automatically develop behaviors.

      The neural networks had inputs connected to sensors, hidden layers of "neurons" with adjustable weights, and outputs tied to actions. The network structure was randomly initialized.

      These networks controlled basic organisms in a software environment. The organisms competed and the least fit would "die off" each round.
      A key step was randomly mutating the neural networks by tweaking the number of neurons, changing connection weights, or altering the network structure.

      Over many generations of selection pressure, I observed the organisms develop surprisingly effective and complex behaviors entirely on their own, without any direct programming.

      This demonstrated how the basic principles of evolution and neuroscience can produce robust intelligence through iteration on simple rules. The emergent results exceeded hand-coded logic. While rudimentary, it provided a promising starting point for developing lifelike robot behaviors.
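
      The original program isn't reproduced here, but a minimal sketch of the kind of evolutionary loop described above might look like the following. Only weight mutation is shown (the real experiment also mutated neuron counts and network structure), and the population size, mutation rate, and fitness function are stand-in assumptions.

          import numpy as np

          rng = np.random.default_rng(42)

          def make_network(n_inputs=4, n_hidden=6, n_outputs=2):
              # Randomly initialized weights: sensors -> hidden layer -> actions.
              return [rng.normal(0, 1, (n_hidden, n_inputs)),
                      rng.normal(0, 1, (n_outputs, n_hidden))]

          def act(net, sensors):
              # Forward pass: sensor readings in, action values out.
              hidden = np.tanh(net[0] @ sensors)
              return np.tanh(net[1] @ hidden)

          def mutate(net, rate=0.1):
              # Copy a network, randomly perturbing a fraction of its weights.
              return [w + rate * rng.normal(0, 1, w.shape) * (rng.random(w.shape) < 0.3)
                      for w in net]

          def fitness(net):
              # Placeholder task standing in for the original simulation:
              # reward networks whose first action tracks the first sensor reading.
              sensors = rng.random(4)
              action = act(net, sensors)
              return -abs(action[0] - sensors[0])

          population = [make_network() for _ in range(50)]
          for generation in range(100):
              ranked = sorted(population, key=fitness, reverse=True)
              survivors = ranked[: len(ranked) // 2]   # the least fit "die off"
              offspring = [mutate(survivors[rng.integers(len(survivors))])
                           for _ in survivors]
              population = survivors + offspring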

      You can run the program here.


  • Origin Story

    Holotype Robotics · 11/15/2023 at 06:43

    I remember when I first got a robot dog as a kid. I was so excited to have my own robotic pet! I imagined how cool it would be when I took it out of the box - surely it would start looking around, wagging its tail, and responding to me.

    But then when I actually opened it up and turned it on, I was pretty disappointed. It only had a few canned responses, moved in simple pre-programmed ways, and basically ignored everything happening around it. All it could do was walk forward, bark a couple times, and turn left or right if it hit something. I wanted it to actually react to me!

    I tried "playing" with it but it just wasn't that fun. It didn't learn, didn't get more capable, and didn't even seem to notice I was there most of the time. Take it out of the environment it was built for and it was useless. Just a bunch of programmed responses that never changed.

    I had dreams of robots that could perceive the world and learn - like a real dog. But the toy companies were still stuck on robots that blindly followed scripts. I realized if I wanted something more lifelike, I'd have to build it myself one day. That experience sparked a passion that's been with me ever since!

    This passion drove me to create Holotype Robotics. Holotype Robotics creates modular robotic kits, and this will serve as an umbrella project for the different components, kits, and software that I create. I have already made a project for the Dendrite-S3 Robot Controller, and will soon make a post about the upcoming N20-based servo that I designed, so be sure to follow this project for more updates!
