12/16/2015 at 05:18 •
Been a long time since my last log.
A lot has happened since.
I lightened the load and switched to an all-plastic setup, using the bottom half of an old multimeter case to mount the wheels and a Band-Aid box from Costco to hold the Arduino, motor driver shield, new HC-06 Bluetooth module, and batteries.
I don't currently have a phone that works with the Bluetooth module, so I was testing things connected via Bluetooth from my MacBook.
Here is a video from today's test of what I've dubbed Band-Aid Bot. I will eventually mount everything straight instead of crookedly taped on, but I wanted to try it out real quick. Also, once I get a working Android phone again I can run my Android program and do cool things.
Here's the video of me goofing around controlling it. It runs great being lighter; I'm glad I switched from the metal frame to all plastic:
08/20/2014 at 04:44 •
My HTC phone is an option if I can't find my LG Android phone. I will also have to see if I can use the battery from my wife's Android phone.
08/20/2014 at 04:38 •
Trying to locate my Android phone. I somehow misplaced it the other day. I was able to program my Arduino some more and added the Bluetooth code; I will add that code to this project page later. I need to test my code, but first I need to find my phone.
07/29/2014 at 04:31 •
Scripting with s.o.a.r.:
1 go forward
2 go backward
3 turn left
4 turn right
5 say a word
6 listen for a response
7 take a picture
Some of these commands would be executed on the Android phone and some would be handled by the robot and its microcontroller.
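As a rough sketch (the names and the runsOnPhone flag are my own placeholders, not a final design), the numbered command set above could be modeled as a Java enum that also records where each command executes:

```java
// Sketch of the s.o.a.r. command set; names and the runsOnPhone flag
// are placeholders, not final API.
enum SoarCommand {
    GO_FORWARD(1, false),
    GO_BACKWARD(2, false),
    TURN_LEFT(3, false),
    TURN_RIGHT(4, false),
    SAY_WORD(5, true),      // speech synthesis runs on the Android phone
    LISTEN(6, true),        // speech recognition runs on the Android phone
    TAKE_PICTURE(7, true);  // camera is on the phone

    final int number;
    final boolean runsOnPhone; // false = handled by the robot's microcontroller

    SoarCommand(int number, boolean runsOnPhone) {
        this.number = number;
        this.runsOnPhone = runsOnPhone;
    }

    // Look up a command by its number in the list above.
    static SoarCommand fromNumber(int n) {
        for (SoarCommand c : values()) {
            if (c.number == n) return c;
        }
        throw new IllegalArgumentException("unknown command " + n);
    }
}
```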
By allowing responses to be part of an offline script, the robot can be programmed to respond to things autonomously.
Command: "spin in circle"
Response audio "hello": say "friendly"; anything else: say "huh?"
Command: "spin in circle"
Response audio null (nothing heard): say "not friendly"
These types of behaviors are meant to be stored on the robot. They can also be stored in a Google spreadsheet or other server-based storage and retrieved by others for sharing. In this way a library of behaviors could be created.
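The "spin in circle" behavior above amounts to a table from heard audio to a spoken reply. A minimal sketch of that table, assuming my own placeholder names (this isn't the actual app code):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the behavior table described above. Keys are the recognized
// audio (null/"" means nothing was heard), values are the robot's spoken
// response. All names here are placeholders.
class BehaviorTable {
    private final Map<String, String> responses = new HashMap<>();

    // Teach the robot a response for a given heard phrase.
    void learn(String heard, String reply) {
        responses.put(heard == null ? "" : heard, reply);
    }

    // Pick a response for what the speech recognizer returned;
    // unknown phrases fall back to "huh?".
    String respond(String heard) {
        return responses.getOrDefault(heard == null ? "" : heard, "huh?");
    }
}
```

A table like this could be serialized on the robot or synced from the shared spreadsheet, so a downloaded behavior library would just be rows of (trigger, reply) pairs.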
The Android phone will need a command interpreter. Commands are received from the Google script/spreadsheet (at first probably by polling the Google spreadsheet). This architecture could be changed to interact in a more direct manner in the future, but a Google spreadsheet and Google script polled by the Android phone is easy and will allow everything to be easily shared.
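A minimal sketch of that polling loop, with the actual HTTP fetch against the spreadsheet abstracted behind an interface (CommandSource and the other names are hypothetical; the real fetch would hit a published Google Apps Script endpoint):

```java
// Minimal polling sketch. CommandSource stands in for the HTTP call to a
// published Google Apps Script / spreadsheet endpoint; everything here is
// a placeholder, not the real app code.
interface CommandSource {
    String fetchLatest(); // returns the latest raw command row, or null
}

class Poller {
    private final CommandSource source;
    private String lastSeen;

    Poller(CommandSource source) { this.source = source; }

    // One poll cycle: return the command only if it changed since last poll,
    // so re-reading the same spreadsheet row doesn't re-run the command.
    String poll() {
        String row = source.fetchLatest();
        if (row == null || row.equals(lastSeen)) return null;
        lastSeen = row;
        return row;
    }
}
```

On Android the poll would run on a timer or background thread rather than in a tight loop.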
The command interpreter:
Commands should come in tagged as either "immediate" or "program". That way the command interpreter doesn't have to look up and merge commands; it can simply overwrite a stored command if one of the same name already exists. A linked list could be a good structure to start with for interpreting the commands, because it allows chaining commands together. Some pseudocode for the Android command interpreter:
For programming commands:
// Sample command previously received:
String command = "program,spin in circle,turn right,turn right,turn right";

// Load our existing linked list of commands:
ObjectInputStream oi = new ObjectInputStream(new FileInputStream("save.ser"));
Object listIn = oi.readObject();
oi.close();

// Recreate the linked list from its serialized form:
LinkedList ll = (LinkedList) listIn;

// To do: split the command string, search the existing list, and if a
// command of the same name exists overwrite it; otherwise append the new
// commands. The trick is that some entries may themselves be linked lists,
// and those need to be overwritten as a whole.

// Serialize the linked list again:
ObjectOutputStream oo = new ObjectOutputStream(new FileOutputStream("save.ser"));
oo.writeObject(ll);
oo.close();

// This is all pseudocode, not complete or working. It will spawn many
// questions, I'm sure, and I haven't written the actual interpreter that
// sends commands to the Arduino or to Android routines; this is basically
// just the idea of serializing and deserializing a linked list of commands.
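The overwrite-if-same-name behavior could also be handled with a LinkedHashMap keyed by command name, which keeps insertion order like a linked list but makes overwriting automatic. A sketch under my own placeholder naming, not the actual interpreter:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Alternative sketch: store each named program as name -> list of steps.
// A LinkedHashMap keeps insertion order (like the linked-list idea) but a
// put() with the same name overwrites the old program automatically.
class ProgramStore {
    final Map<String, List<String>> programs = new LinkedHashMap<>();

    // Parse e.g. "program,spin in circle,turn right,turn right,turn right"
    void addProgram(String command) {
        String[] parts = command.split(",");
        if (!parts[0].trim().equals("program")) return;
        String name = parts[1].trim();
        List<String> steps = new ArrayList<>();
        for (int i = 2; i < parts.length; i++) {
            steps.add(parts[i].trim());
        }
        programs.put(name, steps); // same name overwrites the old steps
    }
}
```

The whole map could still be serialized to "save.ser" exactly as in the pseudocode above, since LinkedHashMap is Serializable.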
07/25/2014 at 12:19 •
Here are some more details on how the robot will interact with Google spreadsheets/forms/scripts. First, on the Android app there will be something like a loop or an event handler that receives commands from, and sends data/sensory input back to, the Google spreadsheet, form, or app (any of these are possibilities; to start out I will probably use a spreadsheet to receive data). For starters, an acknowledge signal will be passed back up to Google, along with images, sounds (maybe an audio stream), and, if I ever put other sensors like transducers on the robot, that data as well. The robot can receive a set of commands such as: go left, right, forward, back, up, down, take a picture (or I might have a mode where it streams pictures or video; we will see). I also plan on trying to let it listen with the speech recognition API and send the results of what it heard back up to the cloud brain. More to come, including code examples, as soon as I can get a break from a giant project I have at work...
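The receive-and-acknowledge round trip described above could be sketched as a single function that takes a raw row from the spreadsheet and returns the acknowledge string to post back (the command names come from the list above; the class and method names are my own placeholders):

```java
// Sketch of the receive/acknowledge round trip. parseAndAck takes a raw
// command row from the spreadsheet and returns the string that would be
// sent back up to Google; this is a placeholder design, not real app code.
class CloudLoop {
    static String parseAndAck(String row) {
        String cmd = row.trim().toLowerCase();
        switch (cmd) {
            case "go left":
            case "go right":
            case "go forward":
            case "go back":
            case "go up":
            case "go down":
            case "take a picture":
                return "ack:" + cmd;  // recognized; would also dispatch it
            default:
                return "nak:" + cmd;  // unknown command, reject it
        }
    }
}
```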