The Idea: Implement a RC Car controlled with EMG / EEG signals
So, the two main ideas we had going into the hackathon were either to create a drawing application or the car. The drawing application would have basically been a piece of software similar to Microsoft Paint, but simpler and controlled with the provided tools.
The general idea we had for the car was to first get the car working as a regular RC car, something we knew we could do during the event, and then figure out how to apply the data we would get from just a single Spiker to control the car. Initially, we thought we weren’t going to get anything else but a Spiker. Thankfully, we got more than expected!
Both projects would have been feasible given the team’s skillset and schedule. In the end, though, we opted for the car because the team’s skillset was excellent for BOTH hardware and software!
The Team: Meet GetRekt!
The following are short descriptions of each member’s main contributions to the project. Keep in mind, we all contributed to various aspects of the project, not just those described here. Certain responsibilities, such as connecting the hardware of the car itself, were handled by the team as a whole.
Christian Ward - Developed the Python class for interfacing with and getting data from the Emotiv Insight. This included testing and determining how best to interpret the data (power averages) from the Emotiv.
James Kollmer - Built and configured the Spikers. This included writing a large portion of the bare-metal C code that runs on an Arduino UNO development board to acquire the EMG samples. Was also the “test dummy” in the video.
Robert Irwin - Wrote the bare-metal, embedded software to control the car’s Arduino UNO. With James, was also responsible for acquiring EMG samples with the UNO paired with the two Spikers.
Andrew Powell - Created the C / Python classes for wirelessly communicating between the host computer and the car’s UNO, and between the host computer and the Spikers’ UNO. Wrote the Python application for running the system from a host computer.
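For the curious: the XBee transceivers behave like a transparent serial cable, so the host-to-UNO link boils down to writing bytes to a serial port. The sketch below is only an illustration of that idea; the command names, byte values, and port name are hypothetical, not the project’s actual protocol.

```python
# Hypothetical single-byte command protocol between the host and the
# car's UNO. The XBee pair is a transparent serial link, so the host
# just writes one byte and the UNO-side firmware reads it back out.
COMMANDS = {
    "forward": b"F",
    "reverse": b"B",
    "left": b"L",
    "right": b"R",
    "stop": b"S",
}

def encode(command):
    """Translate a command name into the byte sent over the XBee link."""
    try:
        return COMMANDS[command]
    except KeyError:
        raise ValueError(f"unknown command: {command!r}")

def decode(byte):
    """Inverse mapping, as the UNO-side firmware would perform it."""
    for name, value in COMMANDS.items():
        if value == byte:
            return name
    raise ValueError(f"unknown byte: {byte!r}")

# With pyserial, the host side would then be roughly:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 9600)  # port name is an example
#   port.write(encode("forward"))
```

Keeping commands to single bytes means the firmware never has to buffer or frame packets, which is handy when you have a day and a half to debug everything.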
The Implementation: Making it work!
The block diagram quickly describes all the major components of the project! Why these components? Well, a HUGE limitation on our project was the time we had to complete it. Since we really only had a day and a half to finish the entire project, we mostly relied on tools we were familiar with and/or could get for free! We already had the XBee transceivers for wireless communication between the car’s UNO and the host computer. We already had the UNOs, one for the car and the other for the Spikers. Obviously we had computers, one of which was chosen to act as the host computer. The Parallax parts for the car were lent to us by a professor from our college (thanks, Dr. Helferty!). The only tools we were totally inexperienced with were the Spikers and the Emotiv.
Arguably the most challenging aspect of the project was interpreting the data from the Emotiv Insight. Sure, it took some effort to put the other components together, but we made sure to choose something we knew we could do quickly (i.e. in a little over a day). The EMG signals, on the other hand, were incredibly easy to work with, especially compared to the EEG signals! Initially, we toyed with the idea of using specific body movements, such as eye movements or tapping our feet, as controlling mechanisms. However, we later realized the results were just too inconsistent.
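To give a sense of why the EMG side was the easy part: raw EMG samples spike sharply when a muscle contracts, so a simple RMS window plus a threshold is enough to treat each Spiker channel like a button. The mapping and threshold values below are a sketch under assumed numbers, not the exact scheme we used:

```python
def rms(samples):
    """Root-mean-square of a window of raw EMG samples (centered at 0)."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def drive_command(left_window, right_window, threshold=50.0):
    """Map two EMG windows (one per Spiker channel) to a car command:
    both muscles flexed -> forward, one flexed -> turn, relaxed -> stop.
    The threshold is a made-up example value, tuned per person in practice."""
    left = rms(left_window) > threshold
    right = rms(right_window) > threshold
    if left and right:
        return "forward"
    if left:
        return "left"
    if right:
        return "right"
    return "stop"

relaxed = [5, -3, 4, -6]        # small fluctuations around zero
flexed = [120, -90, 110, -100]  # large bursts when the muscle contracts
command = drive_command(relaxed, flexed)
```

Because a contraction raises the RMS by an order of magnitude over the resting baseline, even a crudely chosen threshold separates the two states reliably.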
In spite of this setback, we managed to implement the Emotiv as an “on and off” sort of switch! Not as grandiose as our original plans, but it gets the job done!
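A minimal sketch of that kind of switch, assuming the headset delivers per-band power readings that can be rolling-averaged and thresholded. The class name, window size, and threshold here are made up for illustration; they are not the actual Emotiv API or our exact code:

```python
from collections import deque

class PowerAverager:
    """Keep a rolling average of band-power readings and flag when the
    average crosses a threshold, i.e. an 'on and off' sort of switch."""

    def __init__(self, window=2, threshold=0.5):
        self.samples = deque(maxlen=window)  # most recent per-reading means
        self.threshold = threshold

    def update(self, band_powers):
        """band_powers: per-band power values for one headset reading.
        Returns True while the rolling average is above the threshold."""
        self.samples.append(sum(band_powers) / len(band_powers))
        return self.average() > self.threshold

    def average(self):
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

# Example: quiet readings keep the switch off, a strong one flips it on.
switch = PowerAverager(window=2, threshold=0.5)
for reading in ([0.1, 0.2, 0.1], [0.2, 0.1, 0.3]):
    switch.update(reading)
on = switch.update([0.9, 1.0, 0.8])
```

Averaging over a short window was the point of the “power averages” approach: a single noisy reading can’t flip the switch, which is exactly the consistency the raw per-movement signals lacked.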
Well, that’s it! If this project interested you, check out the demonstration video and the other pictures! The project also has a repository with all the sources! Keep in mind, though, all our time went into making the thing work!
You’ve been warned! :D