In addition to giving this robot build autonomous behaviour (e.g. SLAM navigation), another objective is to explore various innovative ways of controlling its walking and steering behaviour, and its interaction with the user. The walking and steering motion of the robot can already be controlled via the GUI, the keyboard or an Xbox One game controller.
I am currently working on improving the existing user-controlled functions, as well as implementing some interesting new ones:
- Fine-tuning the input for controlling the walking and steering motion of the robot.
- Exploring interesting ways that a robot 'tail' can interact with the user.
- Adding a 3D sensor head to the robot, such as an Intel RealSense depth camera, and visualising the environment.
- Updating the graphical user interface, building on previous work: a Qt-based GUI written in C++ and integrated with both the USB motor controller and the ROS ecosystem.
More details and progress on this project can be found on my blog: