Seeing Space Table

A table to see what mechatronic systems are thinking.

Anyone familiar with Bret Victor's Seeing Spaces idea should recognize this immediately.

Sometimes, it's just too hard to figure out what's going on from putting the rover up on blocks and playing with the sensors. Or maybe it's hard to control the sensors. It'd be a lot easier if you had logged all the sensor data from every single test run you had ever done, and could just look at the data, or search it.

Welcome to the future. Walk into your local hackerspace with an early version of your line-following rover. Log in to their debug table and create your project. Draw a line on the table (it'll wash off), set up the bot, and let it run itself off the table. There's your sensor data, already lined up with the motor data. Find a curious pattern in the sensors? Search for other instances of that pattern. Did the robot do something odd you've seen before? Search for particular movement patterns.

There are three main parts to the system: the sensors, the table, and the server.

The current goal is to support 10 sensors, 5 motors, and a video feed with automatic detection of what moved. Hopefully that's enough for most mechatronic systems. All this data will feed into the table. Raw sensor data will be transmitted over a simple radio link, while the video feed will most likely use a wired connection.

A Raspberry Pi controls the receiver radios, timestamps the sensor data, and sends it off to the server over WiFi. Timestamping is important because not all data will arrive at the same time, a sensor might drop out for an arbitrary period, and the server has no other way of knowing when anything was actually captured.
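The Pi-side forwarding step described above can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the frame format and the uplink callback are hypothetical stand-ins.

```python
import time

def stamp(frame):
    """Attach an arrival timestamp, since the radio frames carry no clock."""
    return {**frame, "received_at": time.time()}

def forward(frames, send):
    """Timestamp each raw radio frame and hand it to the WiFi uplink."""
    for frame in frames:
        send(stamp(frame))

# Usage sketch, with a plain list standing in for the uplink:
sent = []
forward([{"sensor": "ir_left", "reading": 512}], sent.append)
```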

The server will act as the indexer, and as the primary data store.

Immediately below is a system diagram of how the various pieces communicate.

  • 1 × Mini Computer (Raspberry Pi or BeagleBone)
  • 18 × NRF24L01+ Wireless Transceiver
  • 1 × Table
  • 1 × Server (should be able to store hours of video and several GB of data per project)

  • Further Software Progress

    minifig404 • 03/23/2015 at 01:51 • 0 comments

    As much as I like designing things, at some point, I need something that works.

    In that vein, I have abandoned my API design, and started work on spinning up a 5-machine Riak cluster.

    The goal there is to get a cluster together, and see how much traffic it can take. Then, I start defining bucket types, and run the load test again. Of particular interest will be how quickly I can get a document to show up in read requests once I write it.

    An external indexer will be required. So, I'm also going to try to get Mesos/Marathon/Spark running. It will be interesting to see how the Mesos ecosystem (which prefers Hadoop) interacts with Riak.

    On the client side, I haven't progressed far since I got a BeagleBone wired up to one of the radios. I should really get going on that.

  • Software Progress

    minifig404 • 11/25/2014 at 05:20 • 0 comments

    Software design is not trivial. For this particular project, I decided to start by designing the API and data structures. I'm not sure how exactly to document my UI design, so hopefully I can get back to that later.

    I've added a mindmap for the required data, started writing up an API Blueprint, and added both to GitHub. Very little is final yet, but it feels like I might actually get somewhere before the end of next week.

    There are three things the API needs to be able to handle immediately:

    • Moderate- to high-volume data logs (thousands of writes per second)
    • A concept of a "session" to group logs together and provide metadata around them
    • A way to organize those sessions
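To make the three objects concrete, here are hypothetical shapes for them. The field names are illustrative only and do not come from the actual API Blueprint.

```python
# A project organizes sessions; a session groups one test run's logs.
project = {"id": "p1", "name": "line-follower", "owner": "alice"}

session = {
    "id": "s1",
    "project_id": "p1",          # ties the session back to its project
    "started_at": "2014-11-25T05:20:00Z",
    "notes": "first run on the table",
}

log_entry = {                    # one timestamped sensor sample
    "session_id": "s1",
    "sensor": "ir_left",
    "t": 12.034,                 # seconds since session start
    "reading": 512,
}
```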

    Logging is mostly a matter of structuring your data correctly and using an existing database server. Riak, CouchDB, PostgreSQL, and HBase all provide something to the discussion, and for the immediate future, which one I choose will not matter very much, as long as I shard the logs.

    Sessions are also "just another object to juggle."

    To organize sessions, I've provided a way for users to define projects. This allows the user to tell the system which project they're working on, and then either browse sessions or create a new one. Once I get to searching logs, this will also allow some extra context around the sessions that match the search.

  • Electronics Acquisition

    minifig404 • 10/16/2014 at 02:53 • 0 comments

    So far, I've acquired radios, a BeagleBone, and some prototyping supplies. Software is not progressing very quickly.

    I think a BeagleBone is a much better choice for this project than a Raspberry Pi: there's a fair amount of computing going on and not much rendering required -- specifically, running OpenCV at more than 30 fps.

    The project BOM recommends a radio per sensor. I'm keeping that there for the moment, but for the sake of putting things together quickly, I'm starting with an Arduino, a special intercept shield, and just one pair of radios. If this works well, I might decide to change the BOM to account for the shift in design.

    Hopefully, I'll at least be able to assemble all the components this week.

    In previous weeks, I've also gotten help building a table. It isn't complete yet, but it should work for my purposes for now.

  • Bret Victor and the UI

    minifig404 • 08/21/2014 at 01:19 • 0 comments

    Bret Victor had a few requirements for the UI beyond data collection. Most of this discussion is premature, but it could have an impact on the data collection aspect of the design, so I'll talk about it now.

    There are three things Bret wanted to be able to see:

    • Seeing Inside (Automatic Data Collection, Displaying Data, Taken for Granted)
    • Seeing Across Time (Controlling Time, Automatic Notebook)
    • Seeing Across Possibilities (Automatic Experimentation)

    Seeing inside is the most basic layer of functionality. Every single bit of data about the robot should be collected without any effort from the user ("Taken for Granted"), and displayed in a usable format automatically.

    Once that is in place, we can display all of the data we have for this session in one graph, rewind to a previous point and see what the world was then, and compare the data from this session to previous sessions ("Automatic Notebook").

    What I Can Do For Free

    My main UI for viewing a session's data handles this by default: all the data is displayed as a line graph, which takes care of both displaying the data and seeing across time.

    I interpret Controlling Time as being able to focus on a particular point in the past, and see highlighted data at that point. In that context, Controlling Time does not add any new requirements to data collection; it's just a slider on the UI.

    Similarly, Automatic Notebooks are a trivial part of having multiple sessions in the first place. This just requires that the user is able to walk up, select a project, and start recording.

    Taken for Granted

    The first hard part is making the sensor data collection trivial. I'm assuming that the best way to do this is to instrument the sensors; provide a variety of small devices that can connect to most analog or digital sensors on one side, and any microcontroller on the other, and use those to do the data logging.

    One alternative is to provide a set of sensors that already work. The number of ways to measure some tiny scrap of our world is simply staggering, and each of those ways has a multitude of representations for the same signal. Plus, adding a radio to each sensor adds up to a lot of power quickly. The main advantage of providing instrumented sensors out of the box is that I no longer have to trust the user to instrument their own sensors, which could be an engineering challenge in itself.

    I could also provide a variety of microcontrollers and microcomputers (e.g., a RasPi) that work out of the box. This would drag me into custom PCB manufacture, and force me to choose which development environments I wanted to impose on my users. On the flip side, I could simply log every pin on the controller and be done with all of my logging needs. Additionally, there would be only one radio connection to worry about, not five. As my main concern is easy development and debugging, I really do not want to force my users into a particular development environment; it just feels wrong.

    Automatic Experimentation

    What Bret wants to be able to do is take a variable or constant, and automatically find the best value for it to do a particular thing. So, if we wanted a light-seeking robot, what sensitivity should we use on the light sensor? If we're making a remote-controlled rover, how big should the deadzone on the motors be?

    There are two things the table needs to be able to do in order to implement this: identify constants in the code, and recompile and reflash the code. Both require combining the debug environment with the development environment.

    Up until this section, there's always been a robot between the development environment (presumably a laptop), and the debug environment (the table). You couldn't do anything with the table if you didn't have a mechanically working robot. This means the debug environment had no idea what the code was.

    Therefore, automatic experimentation is inherently low on the priority list. I can't force people to work with Arduino if they need a RasPi,...


  • Current Status

    minifig404 • 08/20/2014 at 05:09 • 0 comments

    The questions regarding the radios and database choice are still open. Regarding the radios, I can just try both and see which one works better. It is a little harder to do that with the database question...

    Next step is to order a couple of radios, and start architecting the server-side portion. After that, I can design UI flows, and put together all the pieces. Once I have basic sensor logging and searching done, I can add the video feed, and video processing to identify movement.

    For now, rovers are going to be the main focus. I'll add in tweaking for other robot types later.

  • Choice of Database

    minifig404 • 08/20/2014 at 03:10 • 0 comments

    Ultimately, the question comes down to "How much structure does the sensor log need?"

    The list of users and projects should be very structured, and their indexing needs are not advanced. However, no database is good at storing video of the robots operating, nor does the indexing associated with the video necessarily belong in a DB. Maybe some hashes could be put into the DB, but SQL is probably a poor fit for the sort of video processing this project will eventually do.

    Which brings us back to sensor samples. If each sample can be reduced to a series of identical-structure points, and indexing doesn't need any special processing, then SQL could be useful. On the other hand, if I need to aggregate all the sensor data myself in order to come up with a useful visual search, is there an efficient way in which SQL can store the results of that computation? If not, can Map/Reduce indexes fill the gap?
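The kind of aggregation in question can be sketched in plain Python (this is not Riak's actual map/reduce API, just the shape of the computation): bucket samples by sensor and one-second window, then keep min/max per bucket, the sort of precomputed aggregate a visual search could scan instead of the raw log.

```python
from collections import defaultdict

def map_phase(samples):
    # Emit ((sensor, one-second window), reading) pairs.
    for s in samples:
        yield (s["sensor"], int(s["t"])), s["reading"]

def reduce_phase(pairs):
    # Collapse each bucket to its (min, max) envelope.
    buckets = defaultdict(list)
    for key, reading in pairs:
        buckets[key].append(reading)
    return {k: (min(v), max(v)) for k, v in buckets.items()}

index = reduce_phase(map_phase([
    {"sensor": "ir_left", "t": 0.2, "reading": 500},
    {"sensor": "ir_left", "t": 0.7, "reading": 520},
    {"sensor": "ir_left", "t": 1.1, "reading": 300},
]))
```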

    Probably the best bet is to start with pure SQL, and abstract everything behind microservices, and let the microservices deal with the DB. Then, if I need to migrate data later, I can deploy new microservices, and use them to recreate the new equivalent of the old data.

  • Radio Choice

    minifig404 • 08/20/2014 at 01:32 • 0 comments

    The challenge for the radio communication is two-fold: 

    1) I need to send up to 15k samples (12-bit ints) every second.

    2) I need 15 devices sending samples simultaneously, without me worrying about protocol or collisions -- radio is not my expertise.
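Challenge (1) is a back-of-the-envelope calculation; the raw payload is modest next to the air data rates the NRF24L01+ datasheet lists (250 kbps, 1 Mbps, or 2 Mbps):

```python
samples_per_sec = 15_000
bits_per_sample = 12
payload_bps = samples_per_sec * bits_per_sample   # 180 kbps of raw payload

air_rate_bps = 1_000_000                          # mid NRF24L01+ air rate
headroom = air_rate_bps / payload_bps             # >5x before framing,
                                                  # addressing, and retries
```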

    The two options are the NRF24L01 and an RF4432 module.

    The latter has several hundred MHz of adjustable frequency, but I can't actually use the entire range on a single receiver, nor can a single SDR module cover it. If the frequency range were more limited, I could maybe use an SDR hooked up to a RasPi or a BeagleBone to act as 15 receivers via frequency analysis -- an FFT, or something else that can decompose a signal into its frequency components.

    With an NRF24L01, the frequency range is significantly more limited, and a more crowded slice of spectrum is used, but each module put into receive mode supports up to 6 transmitters, each on a different pipe. That support requires that only one of the 6 transmitters send at any given time; since collisions are pretty much guaranteed in this project, this may not be a feasible solution.
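The 6-pipe limit also pins down the receiver count for the target of 15 streams (10 sensors + 5 motors), which is presumably where the BOM's 18 radios come from:

```python
import math

streams = 10 + 5            # 10 sensors + 5 motors, each with a transmitter
pipes_per_receiver = 6      # NRF24L01+ receive-mode pipe limit

receivers = math.ceil(streams / pipes_per_receiver)   # receive-side modules
radios_total = streams + receivers                    # matches the BOM's 18
```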

    Note that both of these problems vanish if I can have the controller chip, not the sensors, do the radio operation. Essentially, this is centralizing radio communication, and eliminating Challenge #2. However, that requires limiting which microcontrollers people can use -- a limit on free will I'm less comfortable with. Limiting sensor choice seems like it would be more acceptable across projects.


Finnlet wrote 08/21/2014 at 00:04 point
In what format will the sensor data be exported to the server? Maybe Bret Victor describes it, but my memory of the video is fuzzy, and I'm not in a place where I can watch it again.


minifig404 wrote 08/21/2014 at 00:10 point
Bret never really goes into the details of how the system works, just the user experience he'd like to see. How to describe the sensor data is one particular detail he's a little vague on.

The technical representation for each sensor would probably be a (name, date, [reading]) tuple. The user-facing view would probably be a line graph of reading vs. time for a particular session.
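That tuple could be rendered as a small record type; these field names are illustrative only, not from the project.

```python
from dataclasses import dataclass, field

@dataclass
class SensorLog:
    name: str                                      # which sensor
    date: str                                      # session timestamp, e.g. ISO 8601
    readings: list = field(default_factory=list)   # raw samples in arrival order

log = SensorLog("ir_left", "2014-08-21T00:10:00Z", [512, 508, 515])
```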


PointyOintment wrote 08/19/2014 at 18:08 point
For the "searching by gesture" thing, perhaps a perceptual hash algorithm could help.


minifig404 wrote 08/20/2014 at 00:28 point
That's a cool idea, now that I've googled it. Once I have some data to play with, I'll have to try it out. I'm not sure how a 2D algorithm will react to 1D data (e.g., an unexpected motor power spike), but I can try a few things to see if they work.

