The design consists of 2 identical boards, each with 100 transducers.
Each board also has a controller, memory, and circuitry to calculate and generate the phase signal for each transducer; a Raspberry Pi controls the two boards. The transducers I sourced are 10mm in diameter and rated to 40V. To drive each transducer, I used a MOSFET driver configured as a full H-bridge, essentially doubling the power the university group had used, figuring that I'd be able to move the bead that much faster (which actually turned out not to be the case).

To generate the transducer signals, I decided to use an FPGA. This was partly because I needed a lot of I/O pins (more than 100 on each board), and also because I wanted to be able to calculate and change the individual signal phases at a rate of 40kHz, something that seemed to be a stretch on a microcontroller. I had hoped to find an FPGA in a TQFP package with enough I/O and gates to run all 100 transducers on each board, but no such part existed. It was simpler to put 2 FPGAs on each board and have each run 50 transducers. I also added an EEPROM for each FPGA to store images.
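The post doesn't give the phase calculation itself, but the standard way to focus an array like this is to delay each transducer by its extra path length to the focal point. A minimal sketch of that, assuming a 10x10 grid at a hypothetical 10.5mm pitch (the actual board layout may differ):

```python
import math

SPEED_OF_SOUND = 343.0               # m/s at room temperature
FREQ = 40_000.0                      # transducer drive frequency, Hz
WAVELENGTH = SPEED_OF_SOUND / FREQ   # ~8.6 mm

def focus_phases(transducer_xy, focal_point):
    """Phase offset (radians) for each transducer so that all the
    waves arrive in phase at the focal point above the array."""
    fx, fy, fz = focal_point
    phases = []
    for tx, ty in transducer_xy:
        d = math.sqrt((tx - fx) ** 2 + (ty - fy) ** 2 + fz ** 2)
        # Path length expressed as a phase, wrapped to one period
        phases.append((2 * math.pi * d / WAVELENGTH) % (2 * math.pi))
    return phases

# 10x10 grid at an assumed 10.5 mm pitch, focused 10 cm above the centre
PITCH = 0.0105
grid = [(ix * PITCH, iy * PITCH) for ix in range(10) for iy in range(10)]
phases = focus_phases(grid, (4.5 * PITCH, 4.5 * PITCH, 0.10))
```

Recomputing this table at the 40kHz update rate mentioned above is what makes the FPGA (rather than a microcontroller) attractive: 200 square roots and divisions per update, 40,000 times a second.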
On one of the boards, I put a Raspberry Pi W as the main controller. The Pi sends SPI commands to all the FPGAs simultaneously. To keep the FPGAs synchronized, one of them generates a 40kHz synchronization pulse that all the others listen to. Because I was a little paranoid about all the EMI I might be generating, I decided to use RS485 differential signalling to carry the SPI and sync signals from one board to the other.
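The actual SPI command format isn't spelled out here, but one plausible encoding is to quantize each phase into a fraction of the 40kHz carrier period and send one byte per transducer. A sketch, assuming a hypothetical 8-bit phase resolution:

```python
import math

def pack_phase_frame(phases, resolution=256):
    """Quantize each phase (radians) into an 8-bit fraction of the
    carrier period, one byte per transducer, ready to clock out
    over SPI to the FPGAs."""
    frame = bytearray()
    for p in phases:
        step = int((p % (2 * math.pi)) / (2 * math.pi) * resolution) % resolution
        frame.append(step)
    return bytes(frame)

# 0 rad -> step 0, pi/2 -> step 64, pi -> step 128
frame = pack_phase_frame([0.0, math.pi / 2, math.pi])
```

On the Pi itself, a frame like this could be sent with the spidev library's `xfer2()` call; the real command set (addressing, EEPROM playback triggers, and so on) would sit on top of this.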
To illuminate the foam ball, I use four 3W RGB LEDs, one at each corner of the ultrasonic array. Each color channel is driven by a dedicated LED driver, and the drivers can be PWMed to change the relative brightness of each color.
Lastly, there are four DC/DC converters: one steps the 24V input down to drive the transducers, two separate 3V and 1.2V converters power the FPGAs and associated logic, and a 5V converter powers the Pi.