-
Back at it with the Teensycorder
11/21/2022 at 06:39 • 6 comments
Three years is perhaps a bit of a long while for an update, but here we go.
I've had precious little time over the past three years for my open hardware projects -- being a tenure track professor and a father to a little one already takes up most of my day, and when the pandemic hit it just took away every bit of time I had. But, things are starting to get slightly back to normal, and I've been eager to finish off some projects including this latest iteration of the open source science tricorder project.
The world has of course also changed in the last three years -- in the electronics world, it's almost impossible to get ahold of some parts, including the Raspberry Pi Zero W's that Iteration 8 was originally designed with. I have thought for a number of years of simply redesigning Iteration 8 with a number of changes:
- Microcontroller: Use a microcontroller instead of a Raspberry Pi Zero W, as the Pi is very power hungry, and very hard to come by.
- Screen: Instead of making the device a backpack for a phone, make it its own stand-alone device like all the previous iterations.
- Sensor connectors: I really like the idea of connecting all of the sensors using connectors. I'd like to do this but to the extreme -- connectors for everything, to make it easy to add to/modify.
I'm also intending on embracing the 80/20 rule: you can often get a project 80% of where you'd like it to be in about 20% of the time it would take to get it to 100%. In my past open source science tricorders I've spent a lot of time keeping the board compact and saving every millimeter. Here I'd like to try and use off-the-shelf parts as much as possible, even if it makes the device a little bigger, in favor of finishing the project quickly.
I have drivers for the previous sensors from Iteration 8 currently working:
- The 8x8 pixel magnetic imaging tile
- The Radiation Watch Type 5 high-energy particle sensor with my external backpack to increase its sensitivity
- The SCD30 CO2 sensor with a wide detection range (up to 10,000 ppm)
- The SPS30 air particulate matter sensor
- The TFMini Plus distance sensor
- MLX90640 low-resolution thermal camera
The switch from a FLIR Lepton to an MLX90640 was a bit difficult because of the lower resolution, and one of the main benefits of the Pi is that it can easily connect to the Lepton through USB. I've tinkered with Lepton drivers for the Teensy without luck, and can always hook up the Lepton through one of the SPI headers if I'm able to figure it out in the future.
Just like Iteration8, the sensor boards have small adapters that I put on the back to adapt their standard 0.1 inch headers to the small board-to-board cable connectors.
The motherboard for the Teensycorder is essentially a large collection of headers and connectors. Currently, there are specialized headers for the radiation sensor, the magnetic tile (SPI), the TFMini Plus (UART), and two unused headers for the Hamamatsu spectrometer and an SPI camera module. There are also plenty of standard QWIIC connectors for I2C devices, so that sensors with that interface can be added easily.
For this prototype I embraced the idea of headers -- chiefly because it's extremely hard to find nearly all of the parts on the device. I was very lucky to find two Teensy 4.1's in the current shortage, and didn't want to waste one by soldering it to a board that might not work. The story is much the same for the power circuitry -- while the 3.3V buck power supply is relatively easy to find, it's very hard to get ahold of the 5V boost + lipoly charger, so I decided to make my mistakes cheaply and put everything on headers so that I can reuse the parts between prototypes.
With the drivers for most of the major components and sensors seemingly working, my next steps are brainstorming ideas for a lightweight but attractive graphical user interface for the sensors. I'd also like to try and put as many of the sensors in a power-saving mode as possible, as the current draw can be quite large with all of them enabled.
That's all for now -- thanks for reading!
-
It works! First data
09/22/2019 at 06:45 • 1 comment
A quick update, with what is essentially the first data from Iteration 8!
To say this project has been progressing slower than I'd hoped is an understatement. Raising a little one and being on the tenure track takes up nearly every waking hour of each day, but I'm starting to be able to squeak in an hour after the little one goes to bed every now and again. And with that -- substantial progress, finally!
Thermal Images from the FLIR Lepton
Above is a picture from the FLIR lepton, plugged into the GroupGets PureThermal2 USB breakout, interfaced to the Raspberry Pi Zero W. I made use of disasterarchy's very helpful thermalZero example code to acquire frames of raw temperature data, and modified the example to pipe these out to stdout as a JSON string that any calling program can read. I put together a very quick server using node.js on the backend, with some of the plotly node.js widgets to display data, and everything seems to be mostly working.
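The stdout-piping scheme makes the consumer side quite simple -- roughly like the sketch below, which reads newline-delimited JSON frames from the capture process. The field names here are hypothetical stand-ins for illustration; the actual thermalZero output format differs:

```python
import json

def parse_frame_stream(lines):
    """Yield (width, height, temps) tuples from newline-delimited JSON frames.

    Assumes each line looks like {"w": 80, "h": 60, "temps": [...]} -- these
    field names are made up for this sketch, not the real output format.
    """
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines between frames
        frame = json.loads(line)
        yield frame["w"], frame["h"], frame["temps"]

# In the real pipeline the lines would come from the capture process, e.g.:
#   proc = subprocess.Popen(["./thermalzero"], stdout=subprocess.PIPE, text=True)
#   for w, h, temps in parse_frame_stream(proc.stdout): ...
```

Any calling program (the node.js server here) just does the equivalent of this loop on the child process's stdout.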
One of the issues with interfacing high-bandwidth sensors with a phone has always been how to get the data in. While Android phones sometimes allow a physical connection for piping data, this is fairly challenging to do with an iPhone. Bluetooth initially seems like a viable alternative, but its bandwidth is also generally rather limited. And even if either of those alternatives were viable, they would still require writing native Android or iPhone applications to read and display the sensor data -- a skill that I unfortunately don't have the time to learn. The (proposed) solution with Iteration 8 has been to have the device take the form of a small web server mounted to the back of an arbitrary phone, so that the user has only to point their web browser to the device address, and they'll be able to interact with the sensing device using a standard web interface. I am primarily a researcher and engineer, but I have had the opportunity to acquire somewhere between mediocre and barely-passable web programming skills over the past few years, so I think I have enough skill to make something that looks reasonable (or is at least functional) under this sort of sensing-device-as-a-web-server programming/interaction paradigm, without having to invest a huge amount of time.
The above screenshot is a capture from my phone (an iPhone 8), using a regular browser (safari), displaying a thermal image from the FLIR camera using the plotly heatmap widget. The framerate being captured by the device is about 10fps, but the web client is only updating about once every 2 seconds -- so much slower than the data is being captured. I think this is a limitation in my knowledge about sending data streams -- right now I'm sending data in a non-streaming, regular get-request fashion, and sending it as JSON, all of which are likely slowing things down a great deal. So this part is currently working as a proof-of-concept, but at a much slower framerate than is ideal. Even still, that's a much better problem to have than having no data at all.
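As a quick sanity check on the JSON overhead, here's a back-of-envelope payload comparison, assuming an 80x60 frame of 16-bit raw values (the Lepton 2.x resolution):

```python
import json
import struct

# A Lepton 2.x frame is 80x60 pixels; assume 16-bit raw values per pixel.
W, H = 80, 60
frame = [3000 + (i % 100) for i in range(W * H)]  # fake raw data for sizing

json_bytes = len(json.dumps(frame).encode("utf-8"))        # text encoding
binary_bytes = len(struct.pack(f"<{W * H}H", *frame))      # packed uint16

print(json_bytes, binary_bytes)  # JSON is several times larger than packed uint16
```

So a packed binary stream would cut the payload severalfold before even considering a better transport than repeated GET requests.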
Magnetic Imager Tile
By far one of the sensors I'm most excited about is my magnetic imaging tile, which is essentially a low spatial resolution (8x8 pixels), high temporal resolution magnetic field camera. Though I've only run it at about 2000 frames per second, the DRV5053VA magnetometers used as pixels appear to have a theoretical maximum of about 20 kHz, and I'm eager to try and run it at such high speeds and see what I can see. While 2000 fps is enough to see the magnetic fields of transformers oscillating or fan motors spinning, I'm curious if 20 kHz would allow one to image the coil of a speaker rendering audio frequencies, which would be very interesting (and cool) if possible.
One of the issues with the magnetic imager tile is that while I had written an Arduino driver for it, I had not yet written a Pi driver -- and I'm much more of a low-level person, so writing a user-space driver for an SPI device with extra GPIO clocking was something I had been putting off.
The tile has been available from Sparkfun for some time, so I decided to do a quick Google a few weeks ago to see if someone had written Raspberry Pi drivers for it. Thankfully, someone had put in much of the work already, and posted their progress to Github. It was me. Towards the end of 2017. Shortly before our baby was born. It's somewhat embarrassing, and reminds me of the rate I was able to get work done pre-sleep-deprivation when my brain was well rested each day. I was able to port the rest of the Arduino driver quickly, and now we (finally!) have a Raspberry Pi driver for the magnetic tile v3. The use of the Python GPIO library limits the framerate to about 100-200fps, but that's more than enough for testing, and should be able to be improved substantially by porting the code to the C GPIO library. The Pi also has the advantage of being able to store extremely long captures, whereas the memory in a microcontroller (like on the Chipkit Max32 board I was using to interface with the tile) tends to max out at a few thousand frames, limiting the use cases.
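For the curious, the capture loop in the driver boils down to something like this sketch -- written with the GPIO calls passed in as callables so the logic runs without hardware (the pin handling and waveform details are assumptions for illustration, not the actual driver code):

```python
def capture_frame(pulse_clock, read_adc, reset, num_pixels=64):
    """Clock through all pixels once and return raw ADC readings.

    pulse_clock(), read_adc() and reset() wrap whatever GPIO/SPI library is
    in use (e.g. the Python GPIO library on the Pi); the tile advances one
    pixel per clock pulse, and reset() returns its counter to pixel 0.
    """
    reset()                       # back to the first pixel
    frame = []
    for _ in range(num_pixels):
        frame.append(read_adc())  # sample the currently selected pixel
        pulse_clock()             # advance the counter to the next pixel
    return frame
```

Since each frame is 64 single-pin reads plus 64 clock pulses, the per-call overhead of the Python GPIO library dominates -- which is why a C port should raise the framerate substantially.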
My cobbled-together web serving of the magnetic imaging tile (above) still lacks many of the basic features of the Processing example, and also updates rather slowly, but it's still plenty good for a proof-of-concept. The above image is of a magnetic screwdriver bit (the red pixels) standing on edge on the tile, which makes a quick, essentially static image for testing.
High-Energy Particle Detector (i.e. Radiation Sensor)
I also put together a quick visualization for the radiation sensor (a Radiation Watch 5 modified with my digital potentiometer backpack), displaying the average counts-per-minute (top), and a histogram of the pulse widths from the detector (bottom). I think that the pulse widths correlate with the energy level of the particles striking the detector -- I previously observed different distributions on the histogram for Cd109, Ba133, and background radiation -- and once observed a large number of counts in the highest bin around the same time there was a lot of solar activity. Unfortunately I don't have the facilities to really characterize this, and I think at best it's a very rough correlation -- ideally one needs to design a more sophisticated detector intended for characterizing energy level.
Above is the result of about 45 minutes of data collection, representing normal background radiation.
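The processing behind those two plots (average counts-per-minute, and a pulse-width histogram) is simple enough to sketch in a few lines; the bin edges here are made up for illustration, not taken from the actual code:

```python
def pulse_width_histogram(pulse_widths_us, bin_edges_us):
    """Bin detector pulse widths (microseconds) into a histogram.

    bin_edges_us is a sorted list of edges; a width w lands in bin i when
    bin_edges_us[i] <= w < bin_edges_us[i+1].
    """
    counts = [0] * (len(bin_edges_us) - 1)
    for w in pulse_widths_us:
        for i in range(len(counts)):
            if bin_edges_us[i] <= w < bin_edges_us[i + 1]:
                counts[i] += 1
                break
    return counts

def counts_per_minute(num_events, elapsed_seconds):
    """Average CPM over the whole capture window."""
    return num_events * 60.0 / elapsed_seconds
```

If the pulse width really does track particle energy, the histogram is effectively a very crude spectrum -- which matches the different shapes seen for Cd109 vs Ba133 vs background.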
Carbon Dioxide (CO2)
Above is the trace for the CO2 sensor, a Sensirion SCD30 sensor that uses infrared light rather than heaters (like most low-cost gas sensors) as a detection mechanism. Above shows CO2 over about 45 minutes, where I entered the room briefly to turn on the unit to collect data (samples 0-200), left for about 30 minutes (200-1200), and then came back in (1200+). I had no idea that my presence affected the CO2 in the room so much, but I've seen news articles recently mentioning that small, enclosed, poorly-ventilated meeting rooms can increase average indoor CO2 levels as well -- so perhaps this shouldn't be a surprise. Below is a longer capture:
The samples from 1200 onward represent me (and our cat) being in the office for about an hour, with levels starting to reach close to 700ppm, according to the sensor.
Air Particles
Above are measurements from the 4 channels of the Sensirion SPS30 air particle detector, which optically senses particulate matter in the air in four common bin measurements: PM1.0, PM2.5, PM4.0, and PM10.0. I was originally using a Plantower A003 air particle sensor together with a custom board that I put together that includes additional atmospheric sensors, but to make Iteration 8 more easily replicable, I've tried to restrict myself to parts that are easily purchased or hand-soldered, and the (somewhat challenging to find) Plantower A003 sensor combined with some very tiny surface mount soldering for the BME280 and CCS811 would make replication much more difficult for most folks. I have a BME680 breakout on its way, which should hopefully fill in the gap for the functionality of the previous design, but be a lot easier for most folks to source.
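One detail worth noting for anyone replicating this: the SPS30's PM mass concentrations are cumulative (PM2.5 includes everything PM1.0 does, and so on up to PM10), so when most of the particle mass is below 1 um, all four traces sit nearly on top of each other. Differencing adjacent bins recovers the per-size-range contributions:

```python
def pm_increments(pm1, pm25, pm4, pm10):
    """Split cumulative PM mass concentrations (ug/m^3) into size ranges.

    The SPS30 reports cumulative values: pm10 >= pm4 >= pm25 >= pm1, since
    each larger bin includes all the smaller particles as well.
    """
    return {
        "<=1.0um": pm1,
        "1.0-2.5um": pm25 - pm1,
        "2.5-4.0um": pm4 - pm25,
        "4.0-10um": pm10 - pm4,
    }
```

With typical indoor readings the three difference terms come out tiny, which is one plausible reason the four plotted channels look so similar.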
One thing I will mention is that it seems to be challenging for me to get the SPS30 to show different readings. With the Plantower sensor, I could put a soldering iron beside it and start to solder, and notice a quick difference. I've repeated that test three times today with the SPS30 and only noticed an increase once, so it's not clear to me what the issue or difference may be. The traces for the different readings also seem to be very close to each other, and it's not clear to me whether that's an issue, or just an artifact of the overall readings being very low.
Next steps
It's been wonderful to finally see some data from the device, and have an end-to-end system working going from sensor to device to phone to visualization (minor bugs/issues notwithstanding). Aside from adding a few more sensors, for which I think the workflow should generally be reasonably straightforward given the work with the above sensors, I'd like to start working on mounting the device to the back of the phone. I designed Iteration 8 to fit on the back of an iPhone, but ultimately figuring out a mounted enclosure of some kind with mount points for all the sensors is non-trivial, and will determine exactly how much space is left over for additional sensors.
Slowly but surely, progress is being made. Thanks for reading!
-
A New Motherboard Design, now with more Pi
06/19/2019 at 07:58 • 0 comments
A very quick update, with a new motherboard design.
The concept with Iteration 8 is to have a science-tricorder-like device that fits on the back of a smartphone. The central challenges have been figuring out how to interface high-bandwidth sensors (like the thermal camera and high-speed magnetic field camera tile) to phones that generally aren't able to easily accept external input -- and more specifically, to do this with an iPhone like the one I carry around (it takes beautiful pictures, and has about 5,000 pictures of my very loved 17-month-old on it right now). The compounding difficulties are that being on the tenure track with a little one, I have essentially no time, and my first design for an Iteration 8 motherboard was completely off-base, being quite large, and having a number of design errors and oversights (as prototypes do). It incorporated an Arduino Pro Mini for easily interfacing with the low-bandwidth sensors, and a Raspberry Pi Zero for interfacing with the high-bandwidth sensors (and communicating with the phone over WiFi) -- but it turned out pairing these two together, with the Raspberry Pi programming the Arduino, was very cumbersome, and the Pi Zero appears far too underpowered to run a responsive X-windows GUI, let alone develop the software for the Arduino on it. The SD card on the Pi was also corrupted by the unsafe powerdowns that come with a development environment often enough to be prohibitive to development. But I think the general idea of connecting the two together is solid, so I decided to take another pass.
I decided I was overthinking the motherboard design -- trying to have all the features on it from the first pass, rather than having something simple that just worked, and progressively iterating new features with it. Enter the new motherboard design above -- it includes:
- A Raspberry Pi Zero W (for the FLIR thermal camera, and magnetic imaging tile)
- An Arduino Pro Mini, for the rest of the sensors
- A pre-built lithium polymer charger and 5V booster (for the Pi/USB voltage). The 3.3V is supplied from the Pi, rather than from a secondary converter.
- A single USB port for the thermal camera, rather than the full 4-port hub I included in the last design.
- An assortment of JST connectors for connecting sensors, including several QWIIC I2C connectors.
And that's pretty much it! Intelligent power saving? Not in this version, with the exception of a single manual switch on the USB connector to enable/disable the thermal camera (which can consume a lot of current). Other than that, the whole thing basically consumes plenty of current all the time to keep its ~2000 mAh battery warm. But a lot of my mental effort had gone into figuring out how to add enough power saving features to keep this design powered for approximately 24 hours straight, which is rather challenging given the power requirements of the Pi Zero, so I decided to just continue on without power saving features in this iteration, and focus on getting something working end-to-end.
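For what it's worth, the battery math that makes a 24-hour target so hard is a one-liner:

```python
def average_current_budget_ma(battery_mah, target_hours):
    """Average current draw (mA) that drains the battery in target_hours."""
    return battery_mah / target_hours

# A ~2000 mAh pack lasting a full day allows only ~83 mA average draw --
# well under what a Pi Zero W with WiFi up, plus sensors, typically pulls.
budget = average_current_budget_ma(2000, 24)
print(round(budget, 1))
```

(This ignores converter efficiency and usable-capacity derating, which only make the budget tighter.)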
To keep the height profile small, the Pi Zero is soldered directly onto the motherboard. The USB is connected using the same clever mechanism of the ZeroStem, which uses two castellated throughholes to solder directly onto the USB data test points on the bottom of the Pi Zero. Simple, clever, and appears to work well so far!
I've connected a number of sensors to the design so far, including the above two atmospheric sensors from Sensirion. The SPS30 Particulate Matter Sensor (left) detects the distribution of particulate matter in the air, and the SCD30 CO2 Sensor optically measures atmospheric CO2 levels while also measuring temperature and humidity. I had included a VOC sensor, but I have questions about the accuracy of the measurements, so I'll set that aside for the moment. Both sensors connect using I2C, which makes connection very easy.
The Radiation Watch Type 5 sensor is also included, with my external comparator backpack for being able to tune the detection threshold from its factory-calibrated value of ~60 keV down much closer to the noise floor. Using this method, I'm able to observe substantially more detections from the Ba133 radioisotope source used in the Open Source Computed Tomography Scanner, which has a large spike in emissions around 31 keV.
Above is the external comparator backpack for the Radiation Watch Type 5, with the JST connector and cable to the motherboard.
My high-speed magnetic tile imager can be driven quite slowly for low framerates, but the really interesting stuff happens when you start observing fast-moving magnetic fields at 1000+ frames per second. That requires a fairly high-speed analog-to-digital converter, and (more importantly) more RAM than most microcontrollers have if you want to record more than a second or two of video, so this sensor is connected to the Pi.
Though I haven't yet worked to get this working with the Pi Zero, I added a small JST connector to the back of this board that connects directly to the motherboard through a short cable. It exposes the SPI port and several GPIO pins from the Pi that should (hopefully) be plenty fast to drive this sensor. I prefer accessing low-level sensors from low-level devices (like microcontrollers), so I haven't yet put this driver together to test this configuration -- hopefully soon!
That's all for this quick update -- thanks for reading!
-
An atmospheric breakout, and radiation sensor backpack
09/01/2018 at 20:09 • 1 comment
A quick update, with an atmospheric sensor board, and a quick-connect version of the radiation watch backpack.
One of the things my colleagues usually joke about is that I often say we need to "make our mistakes cheaply". I say this to the students all the time, and it's a philosophy that one of my own mentors in grad school always used to impart upon us. I thought I was pretty good at making my mistakes cheaply before parenthood, but now with a wonderful 7 month old, and almost no time for anything, I've been reevaluating my approaches to getting things done quickly with the constraint of having almost no time, to try and figure out new ways of still being productive on a vastly reduced time budget.
Enter the idea of small, independently-iterable modules, and tiny JST connectors with standard pinouts on everything. Most of my hardware designs try to strike a balance between being monolithic and modular, but I really think flexibility and modularity will be the way to productively move forward. The idea here is building sensor modules that are small enough (from a design perspective) to be quick to design and test, with low-profile connectors that allow them to be placed in a small device, and easily replaced as they are iterated or improved. If the shape of the ultimate device changes, it's likely not too big an issue, since the board-to-cable connectors allow much more design flexibility than the board-to-board connectors I've typically used in the past.
The above is an atmospheric sensing board based on the Sparkfun Environmental Combo Breakout that includes a CCS811 VOC sensor, a BME280 Temperature/Pressure/Humidity sensor, and the new and very wonderful Plantower A003 air particle sensor. One of the really exciting features of the A003 is that it gives a histogram of air particle sizes, which (to a scientist like myself) is much more exciting than just a single number representing the PM1.0 or PM2.5:
Particles > 0.3um / 0.1L air: 39
Particles > 0.5um / 0.1L air: 6
Particles > 1.0um / 0.1L air: 0
Particles > 2.5um / 0.1L air: 0
Particles > 5.0um / 0.1L air: 0
Particles > 10um / 0.1L air: 0
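Since those counts are cumulative from each size threshold up, differencing adjacent thresholds turns the readout into a true per-bin size histogram; a quick sketch:

```python
def particle_size_histogram(cumulative_counts):
    """Convert Plantower '> size' counts (per 0.1 L) into per-bin counts.

    cumulative_counts maps a size threshold (um) to the number of particles
    larger than it, so each bin is the difference between adjacent
    thresholds; the last threshold keeps its count as an open-ended bin.
    """
    sizes = sorted(cumulative_counts)
    bins = {}
    for lo, hi in zip(sizes, sizes[1:]):
        bins[(lo, hi)] = cumulative_counts[lo] - cumulative_counts[hi]
    bins[(sizes[-1], None)] = cumulative_counts[sizes[-1]]
    return bins
```

Applied to the readout above, nearly all the counted particles fall in the 0.3-0.5 um bin.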
This particular board includes a cutout between the CCS811 and BME280 sensors, which helps reduce some of the difficulties with temperature measurement in this scenario -- the CCS811 is reported to warm up a fair amount, and this can throw off the readings of the BME280 if it's quite close.
Radiation Watch Type 5 Backpack
The Radiation Watch Type 5 is one of my favorite sensors, being very sensitive, compact, and having very modest requirements to get it up and running quickly. Usually when I use this sensor I add on an external comparator that allows one to tune the threshold of the output signal, and essentially increase the sensitivity of the sensor from the stock threshold of ~60 keV to something closer to the noise threshold of the circuit. For the first iteration of the OpenCT scanner I used a 10-turn pot to tune this threshold. For the Arducorder Mini, I used a static resistor, but found that it was perhaps sometimes tuned a little /too/ close to the noise floor, and on certain days would trigger almost constantly -- perhaps a function of temperature. I've since iterated different versions of the detector circuit, including some intended to do spectroscopy through peak characterization (unsuccessfully). Though I'd love to get that working one day, here I settled on the simple version of the backpack, but with the pot/static resistor swapped out for a digital potentiometer, so that the threshold can be dynamically retuned at will. The not-too-mechanically-stable 0.1" header from the Arducorder Mini is also exchanged here with a low-profile 6-pin JST connector, for easy use and mounting.
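As a rough sketch of how a digital pot sets a comparator threshold -- assuming it's wired as a simple divider off the supply rail and has 256 wiper steps, which may not match the actual backpack circuit:

```python
def wiper_voltage(vcc, code, steps=256):
    """Comparator threshold from a digital pot wired as a divider off vcc.

    Assumes a simple potentiometer-to-ground divider and a 256-step pot;
    the wiper voltage scales linearly with the programmed code.
    """
    if not 0 <= code < steps:
        raise ValueError("wiper code out of range")
    return vcc * code / (steps - 1)
```

In this arrangement, lowering the detection threshold toward the noise floor is just a matter of writing a smaller code to the pot over I2C/SPI, rather than turning a trimmer with a screwdriver.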
Slowly but surely, progress is being made. Thanks for reading!
-
A Third, High-Speed Magnetic Imager Tile
02/12/2018 at 20:22 • 24 comments
A quick update with a new high-speed version of the magnetic imaging tile!
I've been very interested in visualizing things that are very difficult for us to normally see for a long while -- it absolutely fascinates me that there's so much around us that we just can't easily perceive, and I often wonder how much more we'd be able to understand about our world, and how much more interesting bits of science we'd be able to get done, if only we could easily see them as naturally as we see things like colour.
One of the most memorable examples of this, for me, is a little more than a decade ago when I'd literally just assembled my open source science tricorder mark 1, turned it on for the first time, and started wandering around the room to see what interesting things it might detect. A moment later I'd stumbled upon a power adapter plugged into the wall with a transformer, and the oscillating field inside being detected by the magnetometer. That magnetometer and visualization was updating at only a few cycles per second, much slower than the 60Hz oscillations inside the transformer, but it was enough for me to watch that aliased oscillating magnetic field vector make large swings back and forth, and make me wonder what it would look like if only we could see it hundreds of times faster, and with an image instead of just a single vector. Magnetic field visualization also seemed much like low-hanging fruit -- we've been very good at producing very low cost, accurate magnetic field sensors for a long time, so this seemed possible in the near term.
I can't say things happened as quickly as I'd hoped, or even that I was able to form an idea for what a magnetic imager might look like that evening. But years later I ended up becoming fascinated with magnetic resonance imaging (MRI), and tried to dream up an idea of what an inexpensive desktop MRI might look like. One of the difficulties with MRI is that it requires very homogeneous magnetic fields to be placed over a sample, in order to spatially resolve different parts of the sample (and render an image). The normal way this is done is with a large, beautifully homogeneous, incredibly expensive superconducting magnet, but I had wanted to figure out a way to do it with a terrible, not super homogeneous, cheap electromagnet by reading all the field variations, and calibrating them out -- and I had decided to try to do this by creating a first magnetic imaging tile built out of a lot of 3-axis magnetometers in a 10mm grid. That low-field MRI project stalled a bit (it turns out being a tenure-track professor takes a lot of your time), and along the way I was able to create two much simpler (but incredibly slow, incredibly low resolution) desktop computed tomography (CT) scanners, but the idea of creating a magnetic imager for its own sake (apart from the idea of building a low-field MRI) still seemed very worthwhile. I put together an array of Hall Effect sensors that (unfortunately) lose the 3-axis vectors of many magnetometers, but have very fast updates, and built the Magnetic Imaging Tile V2. That tile was very cool -- for the first time I had an imager with a reasonable spatial resolution (~4mm) and a large number of pixels (12x12, though I had only ever populated a smaller ~8x8 subsection). But unfortunately my design choice of using an I2C I/O multiplexer for addressing each "pixel" (magnetometer) in the tile meant that it was limited to a framerate of about 10-30Hz.
This meant I could see very neat static fields live, but placing a transformer near the tile only showed it rapidly oscillating and unable to render a good image, because the tile was about a hundred times slower than it needed to be to sample such a fast field.
The schematic for the new V3 tile is shown above (click here for a PDF).
One of the critical issues I had to resolve was how to make the tile addressing much faster, by replacing the I2C I/O multiplexer with something else, while keeping the external pin count for whatever microcontroller would be interfacing with the tile still low. After a bit of brainstorming (and by this, I mean mostly doing other things for a long time until the answer suddenly came to me one day), it occurred to me that since we essentially need to move through each pixel only once per frame, and that the CD74HC4067M analog multiplexers have 4-bit binary addressing (with an active-low enable line each), the I2C I/O multiplexer could be replaced with a 6-bit binary counter, with the lower 4 bits addressing a specific channel on each multiplexer, and the next 2 bits selecting which multiplexer is enabled (through a 2-to-4 decoder). Some 1970s-era 7400 series logic later (a 74HC590A 8-bit counter, and a 74LVC1G139 decoder), I had an addressing mechanism put together that still uses only 2 lines (a clock, and a reset back to 0 -- the first pixel), but can likely be clocked at some ridiculous speed -- in the megahertz, and far faster than any ADC that I have readily available (or, possibly, the response time of the magnetic field sensors themselves).
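In software terms, that addressing scheme is nothing more than bit-slicing the counter value -- a sketch, using the bit layout described above:

```python
def decode_pixel_address(counter):
    """Split the 6-bit counter value into (enabled mux, mux channel).

    The low 4 bits drive the CD74HC4067M channel-select lines; the next
    2 bits go through the 2-to-4 decoder to pick which of the four muxes
    is enabled (active-low in hardware).
    """
    channel = counter & 0x0F       # bits 0-3: channel within a mux
    mux = (counter >> 4) & 0x03    # bits 4-5: which mux the decoder enables
    return mux, channel
```

Counting from 0 to 63 therefore sweeps all 16 channels of mux 0, then mux 1, and so on -- exactly what the 74HC590A does in hardware with each clock pulse.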
A quick note about the addressing, which is interesting. Though each analog MUX has 16 channels, for routing considerations it's not easily possible to build a system that iteratively addresses the pixels in straight lines, one after another. Instead, each analog MUX is responsible for a 4x4 array of magnetometer pixels, and still not in any easily recognizable pattern -- just a regular pattern that was (painstakingly) found to fit and be tile-able with a 2-layer board. It's a little challenging to route this, and while it looks complicated, a little bit of software in the form of a look-up table ensures that it's still possible to read each pixel's data and place it in the right spot in a framebuffer at a very high speed.
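The look-up table approach is simple to sketch; the identity table below is just a placeholder, since the real scan-order-to-position mapping is specific to this board's routing:

```python
# Hypothetical 64-entry table mapping the counter's scan order to (row, col)
# positions in the 8x8 frame; the real table is board-specific and would be
# derived from the layout. Identity mapping used here as a stand-in.
SCAN_TO_XY = [(i // 8, i % 8) for i in range(64)]

def place_samples(samples, lut=SCAN_TO_XY):
    """Scatter raw ADC samples into an 8x8 framebuffer via the look-up table."""
    frame = [[0] * 8 for _ in range(8)]
    for sample, (row, col) in zip(samples, lut):
        frame[row][col] = sample
    return frame
```

The remap is just one indexed store per pixel, so the scrambled physical routing costs essentially nothing at readout time.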
I also decided to go back to Eagle for this board, since they'd just advertised adding a push-and-shove router, and I wanted to test it out. I had used Eagle for more than a decade, but switched to KiCad both for its push-and-shove router and when Eagle moved to a subscription-based license model (in spite of having purchased two previous versions of Eagle, I will never purchase subscription-based software). Still, the free version brought me back, and it took me only 4 hours to route, versus two weeks of evenings and weekends with the KiCad router -- a difference mostly due to a combination of my being very new to the KiCad router at the time, and the usability issues that the KiCad folks are progressively ironing out with each iteration.
The tile itself is intended for Iteration 8 -- a sensor package that I'd like to place on the back of a phone, since it's way more powerful at visualization than any stand-alone system that I would be able to realistically design from scratch (in spite of how enjoyable it is to design such devices, like the Arducorder Mini). I've been tinkering away on this, very slowly, for about a year -- and the V2 tile, in addition to being unable to reach high framerates, is also a little too large to easily fit on the back of a phone while allowing room for other sensors to fit as well. So I decided to reduce the size from 12x12 to 8x8, since this made the tile physically much smaller (53 x 35 mm), and also much cheaper/easier to make.
The tile design itself has only six distinct parts (1x binary counter, 1x decoder, 1x ADC, 4x analog MUX, 10x 4.7uF 0603 capacitors, and 64x magnetometers). I actually used the internal ADC on the Chipkit MAX32 to achieve an even faster framerate (~2000 frames per second) than the onboard 14-bit 100KSps AD7940 provides, but I included the AD7940 so that the tile can be used with a host that doesn't have an ADC (like a Raspberry Pi). That ADC is actually very expensive (about $10 in quantity), so replacing it with a faster, less expensive 12-bit ADC might be a solid idea for a second revision.
While the fastest I've been able to achieve is 2000 Hz on the Chipkit MAX32, it's entirely possible that much higher framerates would be achievable. I think the limiting factor is currently the ~20 kHz bandwidth of the DRV5053VA magnetometers, meaning that 20 kHz framerates might be possible with an ADC that has a greater than ~1.5 MSPS sampling rate. I'm not sure whether the fields one is likely to encounter will move this quickly, and a quick Google seems to show most magnetic imagers today having updates ranging from 1-1000Hz -- but it would still be very interesting to see. Even though the DRV5053VA is the most sensitive analog magnetometer I was able to find at this price point, my intuition is that most fields one is likely to encounter that are oscillating that fast may also be very low intensity, so higher sensitivity might be required. I would still love to do this with an array of very sensitive 3-axis magnetometers, if I were able to get the density and speed up while keeping the cost low (which I haven't been able to find yet). But having 3-axis vector information for each pixel would allow for some incredible visualizations, similar to Ted Yapo's very cool manually translated single-magnetometer scanner and the great visualizations he's put together.
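The ADC requirement follows directly from the pixel count and target framerate; the ~1.5 MSPS figure above presumably allows some headroom for mux switching and settling:

```python
def required_adc_sps(pixels, target_fps):
    """Minimum ADC sample rate to sweep every pixel once per frame."""
    return pixels * target_fps

# 8x8 tile at the DRV5053VA's ~20 kHz bandwidth:
print(required_adc_sps(64, 20_000))  # 1,280,000 samples/s, i.e. ~1.3 MSPS
```

By the same arithmetic, the 100 KSps AD7940 tops out at roughly 1,500 frames per second for a 64-pixel tile, consistent with the ~2000 fps achieved using the MAX32's faster internal ADC.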
Here's a picture of the tile assembled. It took only about an hour to hand-place the parts with tweezers after using an inexpensive stencil from OSH Stencils. I really enjoy the aesthetics of the tile -- seeing a large array of the sensors on the board. Theoretically the same method could be used for sensors of a variety of different modalities, if they're available in the same package footprint.
And the bottom of the board, with the analog multiplexers and the connector. I had intended on using a locking pattern for the 0.1" header (since I'm still not sure what connector it will have to interface with whatever form Iteration 8 takes), but the locking pattern looked to be not quite offset enough (or the header I used was a little small), so I added a few dabs of solder. If another revision is made, this would be another small but positive change.
The source files for this version 3.0 of the magnetic imaging tile are currently available on the GitHub repository.
Thanks for reading!
Edit: SparkX has begun doing a small run of these to gauge interest. The product page is available here, with a few from the initial batch in stock, and more on the way in about a week. Please share pictures or videos of interesting applications, or of things you discover while scanning!
-
A motherboard design, and figuring out where everything goes
08/28/2017 at 06:35 • 3 comments
A very quick post with a very long picture -- but what I hope is a significant milestone after a year of collecting parts and breakouts. After a few weeks of sketches and stacking boards atop each other, I think a design for placing the major sensing components on the back of a phone has emerged.
The central ideas behind this iteration, Iteration 8, are to
- sit on the back of a phone, using the phone's display and communications capabilities, and
- to have a very low barrier to assembly, ideally requiring a shopping list of boards from major long-term vendors, and an afternoon with only modest (mostly through-hole) soldering skills.
I think I've largely been able to keep to these design goals, and put together a candidate motherboard design (currently being fabricated) to serve as a test platform for this idea.
Though I didn't realize it until after I'd begun designing the motherboard, it is fundamentally just an Arduino Pro Mini (a low-power, off-the-shelf microcontroller for interfacing with sensors) connected to a Raspberry Pi Zero W (a high-power board capable of running a webserver for the phone to interact with and display data). The Pi communicates with the Arduino Pro Mini for sensor data, while also itself interfacing with the high-bandwidth sensors: the FLIR Lepton thermal camera, the standard Raspberry Pi camera, the Hamamatsu microspectrometer, and the Magnetic Imaging Tile. The rest of the board is largely universal 0.1" headers, designed with enough room that it should be easy to connect whatever breakout boards the user would like to the system, with a recommended set supplied. This makes the system very simple, expandable, low-cost, much quicker to put together, and ideally gives it a fair amount of longevity.
The expense of this simplicity is two-fold. The first is size: the backpack will add at least 15mm of thickness to the phone, plus some area on the side of the phone for the Pi itself and an air particle sensor that will protrude towards the front (screen) side of the phone. The second is the power budget: the Pi consumes a lot of power, and my hope is to keep it in a low-power state with the Arduino collecting and buffering data, then regularly wake it either to collect data from the Arduino (then sleep again), or wake it for longer periods for user interaction or to use the larger, higher-bandwidth sensors.
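A rough sketch of the duty-cycling arithmetic behind that plan (all current figures below are illustrative assumptions, not measurements of the Pi or Arduino):

```python
def average_current_ma(active_s: float, active_ma: float,
                       sleep_ma: float, period_s: float) -> float:
    """Average current of a duty-cycled load over one wake/sleep period."""
    sleep_s = period_s - active_s
    return (active_s * active_ma + sleep_s * sleep_ma) / period_s

# e.g. waking the Pi for 10 s out of every 60 s, assuming ~150 mA awake
# and ~5 mA in a low-power state (illustrative numbers only):
avg = average_current_ma(10, 150, 5, 60)
print(round(avg, 1))  # ~29 mA average, instead of 150 mA continuous
```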
Hopefully in a few weeks when the boards arrive I'll be able to finally test these ideas, get a sense of whether this is a viable design, and begin putting together the prototype.
Below is a quick sketch, to scale, with the major components depicted, as well as how they "stack up" in several layers on the back of an example phone:
Thanks for reading (this admittedly very short post)!
-
A Magnetic Imager Tile
07/29/2017 at 21:52 • 10 comments
A quick update with a new sensor I've been working on, a magnetic imager tile (something like a "magnetic camera"). It's definitely very cool to see magnetic fields live!
Magnetic Imager Tile
I've always been interested in visualizing things that are challenging to visualize, particularly those that are pervasively around us. Magnetic fields certainly qualify for this -- I think it's absolutely fascinating that they're everywhere, but that we (generally) don't make more than point measurements of the fields, and only very rarely are images taken.
The most intuitive way to make an imager for something is to get a whole bunch of single sensors for that something, and place them in an array. Alternatively, you can take a single sensor, and physically translate it through space, as in Ted Yapo's magnetic field scanner ( https://hackaday.io/project/11865-3d-magnetic-field-scanner ). A few years ago I worked to build an imager by putting together an 8x8 array of the popular HMC5883L magnetometers, spaced about 1cm apart ( https://hackaday.io/project/5030-low-field-mri/log/15914-concept ). This has plenty of positives -- each sensor is a 3-axis magnetometer, and the whole array could be read using a simple I2C interface. Some of the difficulties are that such a large board with very tight-pitched components is a bit challenging to assemble -- I was only able to successfully assemble a 4x4 version, with the 8x8 (and it's 64 magnetometers) unfortunately only working as an object d'art. One of the other challenges with the HMC5883 array was the packing density -- the number of external components meant the maximum density I could achieve were pixels (magnetometers) spaced 10mm (1cm) apart.
It's been a few years since I had a go at this, and so I decided to put together another attempt:
- Simpler sensors: Large-pitch analog hall-effect sensors instead of I2C sensors.
- Higher density: a 4mm pixel pitch using SOT-23 sensors requiring no external components
- Addressable array: Analog addressing through a large array of analog multiplexers on the back of the board
- Tileable: Able to create larger arrays by putting multiple boards adjacent to each other
- Easy to solder: Only large pitched components, so it would be quick and easy to solder in a toaster reflow oven (for the array side) and with a hand iron (for the analog multiplexer side)
- 12x12: The size of the array (12x12) makes it big enough to see interesting things, and small enough to (I hope) fit on the back of the eventual Iteration 8.
I learned from my earlier attempt with the HMC5883L array that this would be a bit of a routing nightmare, and so I decided to try switching from EagleCAD to the open source KiCAD, to make use of its push-and-shove router. It took a bit of getting used to -- KiCAD still has significant usability issues, in my opinion -- but with some work the board artwork came together, with exactly enough room for everything. The Hall Effect sensors and power traces are on the top side of the board, with the array of analog multiplexers that route the analog signals from the sensors placed on the bottom. The bottom also contains an I2C I/O expander (U110) so that the array can be addressed using only 2 I2C pins instead of over a dozen address lines, as well as an external 14-bit SPI ADC (U111). The raw analog signal is also broken out on the connector (J1), so that the board can be connected to an external ADC (like the internal ADC on an Arduino) very easily.
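The addressing scheme can be sketched like this -- the actual pixel-to-mux assignment depends on the board routing in the KiCAD files, so the mapping below is hypothetical:

```python
def pixel_to_mux(row: int, col: int, cols: int = 12, channels: int = 16):
    """Map a pixel to a (mux index, mux channel) pair for 16:1 analog
    muxes. This mapping is hypothetical -- the real assignment follows
    the board routing."""
    idx = row * cols + col
    return idx // channels, idx % channels

# Scanning the full 12x12 array visits 144 sensors spread across 9 muxes:
print(pixel_to_mux(0, 0))    # (0, 0)
print(pixel_to_mux(11, 11))  # (8, 15)
```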
The board itself had very small 6 mil traces, which led to some manufacturing issues (a few bridged traces). I was able to cut the bridges and add wire-wrap fixes, and populated about half the array for testing:
The bottom of the array, with the analog multiplexers:
Frame Rate
Ideally, I would like an imager with a high (~1000fps) framerate, to be able to visualize interesting high-speed fields, like the fields oscillating in a transformer. While the 14-bit SPI ADC on this board can achieve about 100KSps (or about 700fps maximum), the I2C I/O expander is much slower, and would likely need to be replaced to achieve high framerates. With the Arduino Uno sample code streaming to the Processing visualization (in the youtube video), it's able to sustain about 10fps, and the Raspberry Pi python example achieves about 30fps after increasing the I2C bus from 400KHz to 800KHz. This could likely be increased a bit by streamlining the code.
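The two bottlenecks can be compared with some quick arithmetic (the per-pixel I2C cost below is a rough assumption, since the exact transaction length depends on the driver):

```python
def adc_limited_fps(adc_sps: float, pixels: int) -> float:
    """Frame-rate ceiling when one ADC conversion is needed per pixel."""
    return adc_sps / pixels

def i2c_limited_fps(bus_hz: float, pixels: int, bits_per_pixel: float) -> float:
    """Frame-rate ceiling when the I2C write to the I/O expander
    dominates; bits_per_pixel is an assumed per-pixel bus cost
    (address + register + data bytes, with start/stop/ack overhead)."""
    return bus_hz / (pixels * bits_per_pixel)

print(round(adc_limited_fps(100_000, 144)))       # ~694 fps: the ~700fps ceiling
print(round(i2c_limited_fps(400_000, 144, 100)))  # tens of fps: the real bottleneck
```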
In terms of slow/static fields, it seems pretty easy to find the speakers/vibration motors in cell phones with the current iteration. The neatest picture it's rendered so far is of someone's open AirPod charger -- lots of magnetic fields in different directions to hold the earbud headphones in place while they're charging, I suspect!
Schematics, Design files, Example Code
Available on Github: https://github.com/opensensinglab/magneticimager
Bill of Materials, Approximate Cost
- After looking through the datasheets for nearly every inexpensive hall effect sensor I could find, I settled on the DRV5053VA, which seems to be a good balance of sensitivity and cost. Its range is only +/-9 milliTesla (on the order of fridge-magnet strength), whereas most other sensors are +/-100mT, which would make them much less sensitive to the smaller everyday fields that are more interesting. For larger fields (like those serious neodymium magnets in the video) you can always hold them further from the imager, but if the sensor isn't sensitive enough in the first place, there's no easy way to detect the smaller fields. The DRV5053VA only outputs from 0-2V (1V center = 0mT), but it has a wide input voltage range, and seems very tolerant of mishandling (it's made for automotive applications). Octopart shows that the DRV5053VA gets down to about $0.34 in QTY>1000, or about $49 just in hall sensors.
- Analog MUX ( CD74HC4067M ) is $0.60 in QTY100, so about $6 per board.
- I2C I/O Expander ( MCP23017T-E/SO ) is about $1 in any quantity (but, replace this for faster operation)
- The onboard 14-bit 100KSPS ADC (AD7940BRJZ) is expensive, at about $10 in QTY100.
This puts the BOM at around $70 for the 12x12 imager in modest quantities. Making a single board costs about $200.
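The BOM arithmetic, using the part prices above (the mux count per board is inferred from the ~$6 figure; passives and the PCB itself are not included):

```python
def bom_cost(sensors=144, sensor_cost=0.34, muxes=10, mux_cost=0.60,
             expander_cost=1.00, adc_cost=10.00) -> float:
    """Approximate 12x12 imager BOM in modest quantities, from the
    per-part prices listed above."""
    return (sensors * sensor_cost + muxes * mux_cost
            + expander_cost + adc_cost)

print(round(bom_cost(), 2))  # ~$66 -- close to the ~$70 figure, with passives extra
```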
Thanks for reading -- and apologies to be slow on updates, Iteration 8 is going much less quickly than I expected. This is my first year as a tenure-track professor, and it's largely an all-day every-day job (which I do gladly, it's very rewarding -- it just means there's less time for other projects, which move a little slower than I'd like).
-
A first attempt at figuring out the MAX30105 Air Particle Sensor
03/27/2017 at 03:57 • 6 comments
One of the areas of sensing that I don't have a lot of experience with is atmospheric sensing. While I've become familiar with sensors for temperature, pressure, and humidity, I am largely inexperienced with the array of sensors available for sensing various gasses or air quality metrics -- in large part because they've appeared too large or too power hungry for a handheld device, and/or have generally had issues with accuracy. But clearly measuring air quality is an important social topic, and very useful for science education, and so it'd be great to be able to do this reliably in a small handheld instrument in one's pocket.
I was excited last autumn to see Maxim release the MAX30105 Air Particle Sensor, an extremely small (~6x3x2mm) surface mount sensor listed as being able to detect air particles (they give the example of smoke detection on the product website). I was eager to see how this might work for detecting air particle measures like ambient dust level, or particle counts (e.g. PM2.5, a measure of how many particles are in the air that have a diameter of less than 2.5 microns), so I thought I'd run a few experiments.
A quick first pass
In order to see how well the MAX30105 compares with traditional air quality sensors, I ordered a few common air quality sensors (the DSM501A, above, and one of the popular Sharp line of sensors), and quickly cobbled together a MAX30105 breakout board to get a sense of what the data coming off the sensor looked like by eye. After about half an hour of recording in open air, I could easily notice changes in particle density from the DSM501A when (for example) opening up a window, but was not easily able to see these changes reflected in the MAX30105 data.
The MAX30105 has 3 LED channels (red, IR, and green). Above are histograms of the counts coming off each channel. Before seeing this, I hypothesized that particle detection might appear as follows:
- Hypothesis 1: A dust grain (or other particle) drifts by the sensor, partially reflecting some amount of light back towards the sensor, which shows as a (significant?) increase in the ADC count for one particular sample. Sampling over long periods of time then plotting a histogram, one would then expect to see a bimodal distribution -- a central bulge from returns where there was no reflection, then a smaller bulge of higher-intensity reflections, proportional to the dust density/properties of the air. (This is essentially how I understand the DSM501A functions -- using comparators to measure the number of counts over a certain threshold).
- Hypothesis 2: The air generally reflects some very small proportion of light that one shines at it, proportional to the particle/dust density in the air. The general intensity of the reflection will correlate with the dust density in the air. (This is essentially how I understand the Sharp ambient dust sensor works).
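To make Hypothesis 1 concrete, here's a small simulation of what a bimodal "particle transit" signal would look like in a histogram (all parameters below are invented for illustration, not MAX30105 values):

```python
import random
import statistics

def simulate_channel(n=10_000, particle_rate=0.01, seed=42):
    """Hypothesis 1 toy model: a baseline optical return, plus an
    occasional extra reflection when a particle drifts past
    (all numbers invented for illustration)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        s = rng.gauss(5000, 50)        # baseline reflection + noise
        if rng.random() < particle_rate:
            s += rng.gauss(400, 100)   # extra light from a particle transit
        samples.append(s)
    return samples

def tail_fraction(samples, n_sigma=4):
    """Fraction of samples far above the median -- a crude check for
    the second 'bulge' that Hypothesis 1 predicts."""
    med = statistics.median(samples)
    sd = statistics.pstdev(samples)
    return sum(1 for s in samples if s > med + n_sigma * sd) / len(samples)

print(tail_fraction(simulate_channel()))                   # roughly the particle rate
print(tail_fraction(simulate_channel(particle_rate=0.0)))  # essentially zero
```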
The above distributions look very close to unimodal Gaussian distributions -- so no extra overlapping distributions to support Hypothesis 1. The means also didn't seem to reflect Hypothesis 2, but I had very little data, and given the width of the distributions, any number of other issues could be at play -- noise from ambient light (though there is supposed to be some ambient rejection), improper mechanical placement, and many other issues.
The MAX30105 datasheet is very light on details about the particle sensing application, so I e-mailed technical support with my data to see if any additional help or application notes were available. They were only able to say that the MAX30105 requires "very smart algorithms" to function, and that they were happy to sell those algorithms through a third-party distributor. It seems unusual to me to sell an air particle sensor without describing how it can be used for particle sensing, but hopefully with some work one can characterize what air particle sensing tasks it's useful for, and how well it performs at those tasks.
A less quick second pass
To help control for as many variables as possible, I constructed a mount for the sensors to (a) reduce the effect of ambient light on the sensor readings (if any), (b) contain all three sensors being tested, and (c) help ensure that all three sensors were exposed to similar atmospheric conditions (without knowing enough about fluid dynamics to control for things like air flow). I ended up with this plumbing pipe from Home Depot, which contains a mount with the sensors in the middle, and two elbows on either end to dramatically reduce the ambient light inside the tube. An Arduino Uno using the open drivers for these sensors controls everything.
The mount (placed inside the tube) has the Sharp sensor mounted at the front, followed by the DSM501A sensor, and the Sparkfun breakout for the MAX30105 sensor last.
The tube also contains a fan on the bottom elbow that (very slowly) moves air through the tube, as well as the cables from all the sensors.
Data collection
I collected data from a small handful of locations:
- Data Set 1: Home, beside an open window while the neighbours were barbequing. I also collected a large data set over one day and one night, but there was not a great deal of variation in this latter data, and it was larger than Octave could easily load, so I did not include it.
- Data Set 2: Several hours of recording while placed at the local makerspace, Xerocraft, during open hours. The sensor was placed near the woodshop while it was stuffed full of woodworking enthusiasts generating a great deal of dust.
- Data Set 3: As above, but placed inside the woodshop (when it was less busy, and had a bit of room to spare).
- Data Set 4: Outside the back porch of Xerocraft. (Note: Someone had briefly plugged in a chop saw to the same outlet as the dust sensor, so there may be some electrical noise in this data).
The above dataset is available here in csv format (3 columns from the MAX30105, 2 columns from the Sharp, and 4 columns from the DSM501A; the Sharp and DSM501A have both raw and interpreted values -- see code below).
For the sampling protocol:
- 10,000 samples were recorded from each of the red, IR, and green channels of the MAX30105 at a requested rate of 1000 samples per second. Because these were written to an SD card (which is a little slow on the Arduino), the actual sampling rate was likely about 10 times slower.
- Then, 10 samples from the Sharp sensor were taken, with only the final one used. (I noticed that the first few readings of this sensor were often much higher than subsequent readings -- which may signify an error in this sensor -- so I sampled for a while, then used the final reading, to see if this data made sense).
- Then, 10 seconds of sampling from the DSM501A, listening to output port 1 (O1), then another 10 seconds listening to port 2 (O2).
Ideally the sampling would have been simultaneous between all sensors, but for speed I was making use of the open drivers -- so here the readings between sensors may be delayed by as much as a minute or more. To reduce this effect, I throw out most of the 10,000 MAX30105 samples, comparing only the first 1,000 samples to the most recent Sharp and DSM501A data (hopefully delayed by only 10-60 seconds).
Here are the relevant configuration bits for the MAX30105 driver that this test used:
```cpp
// The following are relevant configuration snippets for the MAX30105 from the test code

// MAX30105 Configuration
byte ledBrightness = 0x7F;  //Options: 0=Off to 255=50mA
byte sampleAverage = 1;     //Options: 1, 2, 4, 8, 16, 32
byte ledMode = 3;           //Options: 1 = Red only, 2 = Red + IR, 3 = Red + IR + Green
int sampleRate = 1000;      //Options: 50, 100, 200, 400, 800, 1000, 1600, 3200
int pulseWidth = 411;       //Options: 69, 118, 215, 411
int adcRange = 4096;        //Options: 2048, 4096, 8192, 16384

// Sharp, DSM501A drivers
...

// Setup()
...

// Main Loop
unsigned long loops = 0;

void loop() {
  samplesTaken = 0;
  Serial.println(loops);
  loops += 1;

  particleSensor.setup(ledBrightness, sampleAverage, ledMode, sampleRate, pulseWidth, adcRange); //Configure sensor with these settings

  while (samplesTaken < 10000) {
    particleSensor.check(); //Check the sensor, read up to 3 samples
    while (particleSensor.available()) {
      samplesTaken++;
      if (ENABLE_SERIAL_OUT == 1) {
        Serial.print(particleSensor.getFIFORed());   Serial.print(",");
        Serial.print(particleSensor.getFIFOIR());    Serial.print(",");
        Serial.print(particleSensor.getFIFOGreen()); Serial.print(",");
        Serial.print(voMeasured);                    Serial.print(","); // Sharp
        Serial.print(dustDensity, 3);                Serial.print(","); // Sharp
        Serial.print(lowpulseoccupancy);             Serial.print(","); // DSM501A O1
        Serial.print(concentration);                 Serial.print(","); // DSM501A O1
        Serial.print(lowpulseoccupancy2);            Serial.print(","); // DSM501A O2
        Serial.print(concentration2);                                   // DSM501A O2
        Serial.println();
      }
      if (ENABLE_SD_OUT == 1) {
        dataFile.print(particleSensor.getFIFORed());   dataFile.print(",");
        dataFile.print(particleSensor.getFIFOIR());    dataFile.print(",");
        dataFile.print(particleSensor.getFIFOGreen()); dataFile.print(",");
        dataFile.print(voMeasured);                    dataFile.print(",");
        dataFile.print(dustDensity, 3);                dataFile.print(",");
        dataFile.print(lowpulseoccupancy);             dataFile.print(",");
        dataFile.print(concentration);                 dataFile.print(",");
        dataFile.print(lowpulseoccupancy2);            dataFile.print(",");
        dataFile.print(concentration2);
        dataFile.println();
      }
      particleSensor.nextSample(); //We're finished with this sample so move to next sample
    }
  }
  //particleSensor.setup(0, sampleAverage, ledMode, sampleRate, pulseWidth, adcRange); //Configure sensor with these settings

  // Poll sharp sensor
  delay(1000);
  for (int i = 0; i < 10; i++) {
    readSharpSensor();
    delay(1000);
  }

  // Poll DSM501A sensor
  readDM501A_fine();
  readDM501A_coarse();

  // Flush SD card
  dataFile.flush();
}
```
Data Analysis
I chose three methods for the analysis, based on the above hypotheses for how the measurements might reflect air quality. They're all based on determining the correlation between various measures from the MAX30105 and the Sharp and DSM501A sensors using Spearman's Rho. A correlation of 0 means that the below measures from the MAX30105 are returning completely different information than the measures from the Sharp and DSM501A sensors, and a correlation of 1 means that they're returning the same information (up to a monotonic transform). I chose this measure because, being rank-based, it should help control for issues like the relative sensitivity between each of the sensors, and it tolerates monotonic non-linear relationships (e.g. exponential) between them. Hopefully we'll also be able to verify the shape of the relationship by plotting the data.
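For reference, Spearman's Rho is just Pearson's correlation computed on the ranks of the data; a minimal stdlib implementation looks like this (the analysis for this post was actually done in Octave):

```python
def _ranks(xs):
    """Ranks (1-based), with ties given the average of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Pearson correlation of the rank-transformed data."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Rank-based, so any monotonic relationship scores 1.0:
print(spearman_rho([1, 2, 3, 4], [1, 4, 9, 16]))   # 1.0
print(spearman_rho([1, 2, 3, 4], [16, 9, 4, 1]))   # -1.0
```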
Method 1: Correlations (Mean values of 1000 Red/IR/Green samples)
With this first method, I take the mean value of 1000 samples of the red, IR, and green channels from the MAX30105 (nominally about 1 second of data), and determine the correlation with the most recent output from the Sharp and DSM501A sensors.
Interpretation: Unfortunately in this case it doesn't look like the mean MAX30105 data is strongly correlated with the Sharp or DSM501A sensors. The small correlations on individual datasets wildly change magnitude and direction, further reinforcing this. This makes Hypothesis 2 -- that the air quality data is reflected in the mean of the distribution -- unlikely.
It's also interesting to note that there isn't much of a correlation between the Sharp and DSM501A sensors (or between the two DSM501A channels, for that matter) -- further reinforcing the idea that these are measuring two different things (e.g. overall air dust level vs particle size counts).
Correlations:
Data Set 1: Beside an open window at home while the neighbours were barbequing
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | 0.92 | 0.87 | 0.42 | -0.29 | -0.07 |
| MAX IR | | 1 | 0.84 | 0.49 | -0.16 | 0.05 |
| MAX GREEN | | | 1 | 0.42 | -0.24 | -0.30 |
| Sharp | | | | 1 | 0.13 | 0.21 |
| DSM501A O1 | | | | | 1 | 0.10 |
| DSM501A O2 | | | | | | 1 |

Data Set 2: Several hours of heavy woodshop use at the local makerspace (from an adjoining room)
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | 0.18 | 0.91 | 0.32 | -0.43 | -0.39 |
| MAX IR | | 1 | 0.02 | -0.04 | 0.09 | 0.12 |
| MAX GREEN | | | 1 | 0.39 | -0.58 | -0.51 |
| Sharp | | | | 1 | -0.43 | -0.34 |
| DSM501A O1 | | | | | 1 | 0.42 |
| DSM501A O2 | | | | | | 1 |

Data Set 3: As above, but directly inside the shop
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | 0.64 | 0.29 | 0.33 | 0.17 | 0.52 |
| MAX IR | | 1 | 0.14 | 0.29 | 0.17 | 0.43 |
| MAX GREEN | | | 1 | 0.55 | -0.51 | -0.09 |
| Sharp | | | | 1 | -0.03 | -0.11 |
| DSM501A O1 | | | | | 1 | 0.57 |
| DSM501A O2 | | | | | | 1 |

Data Set 4: Outdoors on the back porch of the Makerspace
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | 0.50 | 0.90 | -0.77 | -0.17 | -0.33 |
| MAX IR | | 1 | 0.21 | -0.52 | 0.59 | -0.36 |
| MAX GREEN | | | 1 | -0.59 | -0.48 | -0.45 |
| Sharp | | | | 1 | -0.07 | 0.39 |
| DSM501A O1 | | | | | 1 | 0.57 |
| DSM501A O2 | | | | | | 1 |

Example Data:
Above is a plot from data set 2 (near the wood shop), comparing the red channel from the MAX30105 with one of the DSM501A channels. Here the data is really all over the place -- it'd be very hard to draw a straight line that captures the relationship between these two sets of data, so the correlation is low.
Method 2: Correlations (Standard Deviation of 1000 Red/IR/Green samples)
My second method is largely a guess -- if the mean value doesn't appear to reflect the air particle level, perhaps the amount of variance in the distribution (i.e. the standard deviation) on short timescales (i.e. 1 second at 1000 samples/second) contains some of this information. I can come up with a few physical reasons why this might be the case, but perhaps also just as many for why it likely wouldn't be (so this is definitely a post hoc test). Let's see what happens.
Interpretation: DSM501A: We actually do see a moderate-to-strong (0.52 to 0.75) correlation between the red channel of the MAX30105 and the DSM501A Output Channel 2 across all four datasets. There are also smaller correlations with the green channel of the MAX30105. There does not appear to be a correlation with the IR channel -- which is interesting, as it's my understanding that the Sharp and DSM501A sensors use an IR LED/photodiode for their detection (though perhaps of a significantly different wavelength).
Sharp: There does not appear to be a correlation between the standard deviation of the MAX30105 measurements and the value of the measurement from the Sharp sensor.
Data Set 1: Beside an open window at home while the neighbours were barbequing
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | -0.17 | 0.71 | 0.27 | 0.28 | 0.58 |
| MAX IR | | 1 | 0.08 | -0.12 | -0.03 | -0.06 |
| MAX GREEN | | | 1 | 0.26 | 0.23 | 0.65 |
| Sharp | | | | 1 | 0.13 | 0.21 |
| DSM501A O1 | | | | | 1 | 0.10 |
| DSM501A O2 | | | | | | 1 |

Data Set 2: Several hours of heavy woodshop use at the local makerspace (from an adjoining room)
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | 0.04 | 0.82 | -0.40 | 0.77 | 0.69 |
| MAX IR | | 1 | 0.14 | -0.08 | -0.01 | -0.04 |
| MAX GREEN | | | 1 | -0.30 | 0.58 | 0.57 |
| Sharp | | | | 1 | -0.43 | -0.34 |
| DSM501A O1 | | | | | 1 | 0.43 |
| DSM501A O2 | | | | | | 1 |

Data Set 3: As above, but directly inside the shop
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | 0.60 | 0.72 | 0.06 | 0.56 | 0.75 |
| MAX IR | | 1 | 0.77 | 0.35 | 0.08 | 0.43 |
| MAX GREEN | | | 1 | 0.29 | 0.11 | 0.38 |
| Sharp | | | | 1 | -0.03 | -0.11 |
| DSM501A O1 | | | | | 1 | 0.57 |
| DSM501A O2 | | | | | | 1 |

Data Set 4: Outdoors on the back porch of the Makerspace
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | -0.40 | 0.95 | 0.05 | 0.47 | 0.52 |
| MAX IR | | 1 | -0.26 | -0.52 | -0.78 | -0.79 |
| MAX GREEN | | | 1 | -0.19 | 0.42 | 0.43 |
| Sharp | | | | 1 | 0.07 | 0.39 |
| DSM501A O1 | | | | | 1 | 0.57 |
| DSM501A O2 | | | | | | 1 |

That looks promising -- let's combine these four datasets into one, and have a look at the data to see if the relationship between the MAX30105 RED channel and the DSM501A O2 looks roughly linear:
Note, the zeros from the DSM501A reflect 10 second sampling periods where no particles were detected. These are automatically removed from the data set before running the correlation, and are likely an artefact of the very short (10 second) sampling period used for the DSM501A. With simultaneous sampling of the MAX30105 and DSM501A, these sampling periods could be increased quite a bit without having to worry about the sensors sampling different conditions.
Aside from the zeros (outliers) that floor the distribution, the remainder of the distribution does give the impression that there is a linear relationship between the MAX30105 RED channel and the DSM501A O2. The correlation on this combined dataset (0.71, below) also seems to support this.
Concatenated Data Set: Most of the above data sets combined
| Spearman's Rho | MAX RED | MAX IR | MAX GREEN | Sharp | DSM501A O1 | DSM501A O2 |
|---|---|---|---|---|---|---|
| MAX RED | 1 | -0.12 | 0.82 | -0.11 | 0.63 | 0.71 |
| MAX IR | | 1 | 0.02 | 0.10 | -0.21 | -0.14 |
| MAX GREEN | | | 1 | -0.10 | 0.44 | 0.58 |
| Sharp | | | | 1 | -0.32 | 0.02 |
| DSM501A O1 | | | | | 1 | 0.38 |
| DSM501A O2 | | | | | | 1 |

Method 3: Only looking at the tail mass
In spite of the above, I still feel like hypothesis 1 -- that particles moving in front of the detector should create a bimodal distribution, much like the detection methodology of the DSM501A sensor -- is a likely source of signal, and that we should be able to detect these cases by masking out the main bulk of the distribution, and looking only at the outliers. Histograms like the one below, where a small number of samples appear to the right (higher reflectance) of the main distribution, only make me think this might be where the majority of the signal is hiding, and that the standard deviation is just leaking some of this information through.
Masking out the main bulge (+/-5 bins from the center), and summing the remaining mass, this is the plot we get over ~80 sampling periods (below, using dataset 2, a long collection near the woodshop). If this truly reflects air particle level, it would be suggesting that there was a gradual dip around time point 30, and a steady increase up until point 80. This would make some physical sense, since (anecdotally) the woodshop probably became dustier as more folks came in to work on their projects.
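A minimal sketch of that masking step (the ±5-bin mask is from the description above; the simple fixed-width binning is my own simplification):

```python
from collections import Counter

def tail_mass(samples, bin_width=10, exclude_bins=5):
    """Histogram the samples, find the modal (central) bin, then sum
    the counts falling more than exclude_bins away from it."""
    bins = Counter(int(s // bin_width) for s in samples)
    center = bins.most_common(1)[0][0]
    return sum(count for b, count in bins.items()
               if abs(b - center) > exclude_bins)

# 93 baseline samples plus 7 strong 'reflections' well outside the bulge:
print(tail_mass([100.0] * 93 + [500.0] * 7))  # 7
```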
Unfortunately this isn't clearly reflected in the distribution from the conventional air quality sensors (the Sharp and DSM501A) in the plot below, or in the correlations between this measure and the Sharp/DSM501A values. To give a sense of this data, the DSM has a lot of high-frequency changes:
Where the Sharp generally reports a value that oscillates around a mean value:
And, to verify that the mean value of the Sharp sensor changes depending on the environment, let's have a look at the means for the different data sets:
| Data | Mean ADC Value | Mean Dust Density (mg/m^3)* |
|---|---|---|
| Data Set 1 (indoors beside window at home) | 244.7 | 0.104 |
| Data Set 2 (near busy woodshop) | 287.2 | 0.140 |
| Data Set 3 (inside busy woodshop) | 287.4 | 0.142 |
| Data Set 4 (outside in dusty parking lot) | 264.2 | 0.124 |

(* the open source driver uses the characterization from http://www.howmuchsnow.com/arduino/airquality/ to derive the dust density)
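For reference, that conversion is a linear fit of sensor output voltage to dust density. Reconstructing it (assuming a 10-bit ADC with a 5V reference; the clamp of negative values to zero is my addition) reproduces the table above to within a few thousandths:

```python
def sharp_dust_density(adc_count, vref=5.0, adc_max=1024.0):
    """Sharp sensor ADC reading -> approximate dust density (mg/m^3),
    using the howmuchsnow.com linear fit: density = 0.17*V - 0.1.
    Clamping negative densities to zero is an assumption."""
    voltage = adc_count * vref / adc_max
    return max(0.17 * voltage - 0.1, 0.0)

print(round(sharp_dust_density(244.7), 3))  # ~0.103 (Data Set 1)
print(round(sharp_dust_density(287.2), 3))  # ~0.138 (Data Set 2)
```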
Next Steps
This is a promising first step -- it looks like the MAX30105 may deliver measurements similar to the particle measurements from one of the outputs of the DSM501A (which I believe is sensitive to PM2.5 levels, though the datasheet is a little unclear about this).
There is a great post on Make by Tim Dye, who characterized several inexpensive sensors with a professional air particle reference instrument, and showed that some inexpensive sensors tend to have a strong correlation of 0.7-0.8 with the reference instrument he used, under different particle conditions. If we consider this characterization of the MAX30105 a promising first step, then ideally a similar characterization using a proper air particle reference instrument (rather than these inexpensive sensors) can yield a fuller characterization of the MAX30105's capabilities under a variety of different particulate scenarios, and help enable inexpensive and millimeter-scale air quality instrumentation.
Thanks for reading.
-
From the Arducorder towards Iteration 8
11/24/2016 at 07:21 • 2 comments
It's been two years since I developed the Arducorder Mini, and I have found myself brainstorming about what a next model would look like over the past few months. It feels like it's time to work on the next iteration of my open source, handheld, pocket-sized scientific instruments -- Iteration 8.
Arducorder Mini: What went right
The Arducorder Mini was a substantial undertaking, and turned out exceptionally well -- it's personally my favorite open sensing project, and I very much enjoyed the development process, and getting to see the final product. I'd like to briefly describe what went well with the project, and what could use improvement:
- Diversity: The Arducorder contains nearly a dozen very different sensors.
- Capability: Spectroscopy, radiation sensing, and thermal imaging have all been sensors on my wishlist for handheld sensing devices for quite some time. Here, these sensing modalities finally began to be incorporated. Other sensors, like the barometric pressure sensor, have so high a resolution that you can often measure someone's height simply using the difference in air pressure between their head and feet!
- Connectivity: Ability to share many of the sensor readings wirelessly through Plotly.
- Interface: A simple, visually attractive interface that is very usable for core tasks.
- Reuse: In the spirit of open source, many of the aspects of the Arducorder were individually reused for other projects. Most notably, the Arducorder serves as a reference design for the Hamamatsu microspectrometer, and the folks at GroupGets helped use this to bootstrap and enable a community of makers and engineers to order and use small quantities of these beautiful microspectrometers.
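As an aside on the pressure-sensor trick above: it works because atmospheric pressure falls off roughly exponentially with altitude, so the vertical separation between two readings can be recovered with the isothermal barometric formula, dh = (R*T)/(M*g) * ln(p_lower/p_upper). A rough sketch -- the constants are standard values, and the example pressures are illustrative (a person's height corresponds to only ~20 Pa, which is why a high-resolution barometer is needed):

```python
import math

# Standard physical constants
R = 8.314462618   # J/(mol*K), universal gas constant
M = 0.0289644     # kg/mol, molar mass of dry air
G = 9.80665       # m/s^2, standard gravity

def height_difference(p_lower, p_upper, temp_k=293.15):
    """Estimate vertical separation in meters from two pressure readings (Pa),
    using the isothermal barometric formula."""
    scale_height = R * temp_k / (M * G)   # ~8.5 km near room temperature
    return scale_height * math.log(p_lower / p_upper)

# Illustrative readings at the feet and head of a person of average height:
print(height_difference(101325.0, 101304.0))  # roughly 1.8 m
```

In practice a single reading is far too noisy for this; averaging many samples from a sensor with ~1 Pa relative resolution is what makes the head-to-feet measurement feasible.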
Arducorder Mini: What could have used improvement
Many aspects of the project worked very well. As with all experiments, there was room for improvement:
- Usefulness: The Arducorder Mini has the most diverse array of sensors of any portable electronic device that I'm aware of. One of the most common questions I get asked is, "What can I use it for?". Individually, there are many applications for each of the sensors apart from science education -- for example, thermal imaging can be used to find heat leaks in your home, or for various tasks in industrial settings. Radiation sensing is something that likely isn't a part of everyday life for most folks, but the Arducorder Mini's radiation energy histogram showed an unusually high concentration of very high energy particles while sitting on my desk one afternoon when there happened to be a solar flare -- likely my first handheld solar flare measurement! Does this mean we can create crowd-sourced cosmic ray observatories with some large number of handheld instruments such as these? Similarly, the visible light spectrometer is an extremely powerful instrument, but needs much more work on specific applications, and industrial design supporting specific kinds of common measurements -- for example, allowing absorptive measurements through small sample vials. The list goes on. All together, the device is quite capable, but identifying and then developing specific use scenarios will help increase its usefulness.
- Industrial Design: It is extremely challenging to meet the mechanical and industrial design requirements for a dozen different sensors. Some of the readings (for example, from the atmospheric temperature sensor) are not accurate, because the unit heats up quite a bit from the battery and processor. Gaskets are missing on atmospheric sensors. As above, sample container adapters and/or other mechanisms for absorptive measurement need to be incorporated to increase the utility of the sensors in common use cases.
- Ruggedness: The design is pretty solid, but not solid enough that I feel comfortable carrying it around every day in my pocket without fear that I'll break it after a week or two of constant use. When I ship it to folks to use in demonstrations, it sometimes comes back with some of the sensor boards disconnected (from rattling during shipping). Ideally the next iteration of the design would be something that could be carried around in one's pocket every day without having to worry.
- Build time: The Arducorders take forever to build. It takes me two weeks of evenings and weekends to build one, and I designed the thing.
- Replicability: While every effort was made to make the Arducorder replicable and as easy-to-build as possible, it is a project with nearly 200 very small surface mount components spread across 7 boards, and I have to accept that this is simply beyond the capability of all but the most motivated and experienced makers/citizen scientists. The next iteration has to make serious improvements in lowering the barrier to construction.
- Maintenance: I am only one person, and this is a project with an extremely large hardware and code base. In spite of rewriting the Plotly Arduino library, Plotly changed the streaming interface a few months later, and I simply didn't have time to update the library, so this feature stopped working. I also broke the firmware for the thermal camera partway through the build -- when it mysteriously quit working, I made several copies of the board before discovering it was a software issue (though I didn't have time to find and fix the bug). I had hoped to bootstrap a community of makers that would continue to develop the hardware and code, but due to the barriers to making one's own unit, this never happened. In a very real way the project can't be successful unless I not only design a capable device, but make it easy enough to replicate that it bootstraps a community that's able to further develop and maintain that device. This is something that I'm still learning to do, though I think I get better at it each time.
- Time: The Arducorder took a very large number of person hours to develop. As a postdoc, most of my evenings and weekends for six months were spent designing and building the Arducorder. Now that I'm a professor in my first year on the tenure track, nearly every evening and weekend is spent working on papers, doing my teaching, and applying for grants to support my research at a time when public scientific funding is extremely low. All this means that I have to make design choices that don't take tremendous amounts of time to develop, and have a high chance of working out. Hopefully this will implicitly help reduce the build time, and increase replicability.
A first roadmap for the next iteration
I've assembled a first-pass at a roadmap for building the next iteration, that hopefully embodies many of the design principles outlined above. The significant differences in design principles from the Arducorder are:
- Breakout Boards: To make the design as easily replicable as possible, try to build the device using existing breakout boards from major suppliers (e.g. SparkFun, Adafruit) whenever possible. If a part or sensor is too new, try to convince a supplier to carry a breakout board for it. This will increase the size, but also greatly decrease the barrier to construction.
- Interface with a smartphone: Smartphones have beautifully usable interfaces, and are very good at displaying information, interacting with the user, and communicating with the world -- exactly the bits that are often most time consuming to develop. Here, try to make a backpack that fits on the back of a smartphone, with the two ideally communicating wirelessly (e.g. over WiFi). This should simplify making a robust enclosure, and the whole thing can just be attached to the back of a smartphone and carried with you wherever you go.
- Target the mechanical design around use-case scenarios: Figure out common use-case scenarios for each sensor, and design the mechanical considerations of the device around these. Does the spectrometer need an integrated sample container holder to be able to be broadly useful? Or a method of easily performing absorbance measurements? Do the atmospheric sensors need to be mounted in a tube that periodically samples air using a microfan to be most useful and accurate?
It looks like an exciting (and hopefully tractable!) project, and it'll be great to go through the design process to see how things take form as time progresses.
Thanks for reading!