
Human Trials Begin

A project log for A Halo For Lucy

It's not what you think.

Bud Bennett • 04/02/2019 at 00:07 • 0 Comments

The PCBs and parts have arrived. The near-term objective is to create a working prototype halo that will prove the concept using the Adafruit Feather M0 Express along with a simple motherboard for the sensors.

I removed the VL53L0X sensor ICs from the AliExpress boards and installed them on five daughterboards. This was more difficult than expected. The VL53L0X sensors seem to be sensitive to heat. I melted the first sensor that I removed from the AliExpress board while attempting to solder it to the daughterboard. The most satisfactory method was to heat the AliExpress board from the bottom, with a hot-air gun, until the sensor came loose. Then I turned the sensor over and added solder bumps to the pads with a soldering iron set to 300°C. I spread flux over the pads on the PCB and created solder bumps on those pads as well. I placed the sensor over the PCB pads and heated the PCB from below until the solder bumps melted and the sensor moved to align with the PCB pads. This method prevented the sensor from being damaged by the hot-air gun. The final result looks pretty good (compared to the huge Adafruit sensor board).

I soldered the five daughterboards onto the prototype motherboard and the code worked! All five sensors were configured and report distances. (I was absolutely amazed that all five sensors were functional.)
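For anyone wanting to reproduce the multi-sensor setup: every VL53L0X powers up at the same I2C address (0x29), so the usual trick is to hold all of their XSHUT pins low and then enable them one at a time, assigning each a new address. The sketch below is only a minimal illustration of that pattern using the Adafruit_VL53L0X Arduino library; the XSHUT pin numbers and addresses are placeholders, not the actual wiring or firmware of this prototype.

```cpp
#include <Wire.h>
#include "Adafruit_VL53L0X.h"

// Placeholder XSHUT pins and I2C addresses -- adjust to match the real wiring.
const uint8_t NUM_SENSORS = 5;
const uint8_t xshutPins[NUM_SENSORS] = {5, 6, 9, 10, 11};
const uint8_t addresses[NUM_SENSORS] = {0x30, 0x31, 0x32, 0x33, 0x34};

Adafruit_VL53L0X sensors[NUM_SENSORS];

void setup() {
  Serial.begin(115200);
  Wire.begin();

  // Hold every sensor in reset so they all start at the default address (0x29).
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    pinMode(xshutPins[i], OUTPUT);
    digitalWrite(xshutPins[i], LOW);
  }
  delay(10);

  // Release the sensors one at a time and give each a unique address.
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    digitalWrite(xshutPins[i], HIGH);
    delay(10);
    if (!sensors[i].begin(addresses[i])) {
      Serial.print("Sensor ");
      Serial.print(i);
      Serial.println(" failed to start");
    }
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  for (uint8_t i = 0; i < NUM_SENSORS; i++) {
    sensors[i].rangingTest(&measure, false);  // 'false' = no debug printout
    if (measure.RangeStatus != 4) {           // status 4 means "out of range"
      Serial.print(measure.RangeMilliMeter);
      Serial.print(" ");
    } else {
      Serial.print("--- ");
    }
  }
  Serial.println();
  delay(100);
}
```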

The next step is to attach the piezo speakers and begin to evaluate the performance using humans (i.e., me) as guinea pigs.

Early Human Trial Feedback:

[2019-04-05] 
Humans are different than dogs. I hooked up an 18650 Li-Ion battery and a switch to power the prototype. I also added a pair of $1.00 earphones so I could hear the audio feedback. The earphone wires are ultra-thin solid core, so I had to glob hot-melt glue over the solder joints to provide a minimum amount of strain relief.
The code needed some extra features to make it usable. The PWM output was generating spurious frequencies when set below about 800 Hz, which was confusing, so I increased the PWM frequency. I also added a low-pass filter to slow the rate of change of the PWM frequency, since noise in the distance measurement was causing the tone to warble. A few other minor code changes made it simpler and easier to adjust global parameters such as the maximum detection distance. It seems to be performing acceptably now.
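To make the filtering idea concrete, here is a rough sketch of a first-order (exponential) low-pass filter on the distance reading, mapped to a tone frequency, with the tuning values collected as globals. The constants, the speaker pin, and the linear distance-to-frequency mapping are illustrative assumptions, not the exact code running on the halo.

```cpp
// Illustrative global parameters -- easy to tweak in one place.
const float    MAX_DISTANCE_MM = 700.0;   // ignore anything farther than this
const float    MIN_TONE_HZ     = 800.0;   // lowest tone (keeps PWM out of the spurious range)
const float    MAX_TONE_HZ     = 3000.0;  // highest tone, used for the closest obstacles
const float    FILTER_ALPHA    = 0.2;     // 0..1, smaller = heavier smoothing
const uint8_t  SPEAKER_PIN     = 12;      // placeholder piezo/earphone pin

float filteredDistance = MAX_DISTANCE_MM;

void updateTone(float rawDistanceMm) {
  // First-order low-pass filter: slows the rate of change of the tone
  // so noise in the distance reading doesn't warble the audio.
  filteredDistance += FILTER_ALPHA * (rawDistanceMm - filteredDistance);

  if (filteredDistance >= MAX_DISTANCE_MM) {
    noTone(SPEAKER_PIN);   // nothing close enough to report
    return;
  }

  // Closer obstacle -> higher pitch, scaled linearly between the limits.
  float fraction = 1.0 - (filteredDistance / MAX_DISTANCE_MM);
  float freq = MIN_TONE_HZ + fraction * (MAX_TONE_HZ - MIN_TONE_HZ);
  tone(SPEAKER_PIN, (unsigned int)freq);
}
```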
 
When I first tried to use the prototype, it was immediately apparent that the two sensors on either side of the middle/front sensor weren't making any discernible improvement. The audio mix did not indicate that the nearest object was slightly left or right of the midpoint. I changed the code to use just three sensors (middle and two outer), with better results. When you depend on these sensors you tend to sweep the area ahead slightly back and forth, so having complete coverage is not a big advantage. When I stopped scanning the two extra sensors, the update rate improved, and that made a big difference in the speed of the feedback.
I quickly found out that the vertical field of view of the sensor was too small to be very effective for tall humans. I kept bumping into things because: a) the response time was too slow with five sensors, b) the maximum sensor distance was set to 500 mm, which is too short for humans, and c) the sensor was too high off the ground. Things improved with only three sensors set to a max distance of 700 mm. I'm considering modifying the code to increase the maximum detection distance of the middle sensor and give the outer sensors shorter detection distances. This would help when trying to navigate the opening to a room while the two side sensors are both reporting obstructions.
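One way that idea could look in code is a per-sensor threshold table instead of a single global maximum distance. The fragment below is just a sketch with made-up values: the middle sensor is allowed to look farther ahead, while the outer sensors only report nearby obstructions, so a doorway's jambs don't mask the open path in the middle.

```cpp
// Per-sensor maximum detection distances (mm) -- values are illustrative.
// Index 0 = left outer, 1 = middle, 2 = right outer.
const uint16_t maxDistanceMm[3] = {450, 900, 450};

// Returns the reported range, or 0 if the obstacle is beyond that
// sensor's individual threshold (i.e. "nothing to report").
uint16_t clippedRange(uint8_t sensorIndex, uint16_t rangeMm) {
  if (rangeMm > maxDistanceMm[sensorIndex]) {
    return 0;
  }
  return rangeMm;
}
```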
