11/17/2020 at 16:22 •
Finished the new controller board for the laser scanner; data can now be streamed via a ring buffer to the prism scanner. No more micro-controllers from now on, only FPGAs :-). Thanks to Claire Wolf and the Migen and Litex teams. Code can be found here.
11/06/2020 at 10:41 •
On Tuesday 3 November 2020, I gave a talk on coloring textiles with lasers. Lasers allow you to locally tune the diffusion of a colorant into the textile by applying heat. My aim is to reduce waste and create a more sustainable world with this technology, and to promote my prism scanner :-). You can watch the video here.
10/19/2020 at 11:28 •
I created an open-hardware project and got part of my inspiration while working for the Dutch state (TNO). The core idea is that a laser bundle is moved by rotating a prism. The Dutch state got a patent for a plurality of laser bundles, but not for a single laser bundle. For the printed-circuit-board application, it founded LDI Systems in 2015. This failed and wasted multiple millions in tax money. I only spent 10K dollars on a working system and paid taxes. I thought they would leave it there, but TNO requested another subsidy from the Dutch government (NWO) to pump 45K euro into a new company, AM Systems BV, for the 3D-printing application. I have contacted NWO in this regard, as I don't see how the original issues can be solved. I also wonder how much "own" money is brought in (the company is largely owned by the Dutch state). A company is not subject to tender law, but the government is; as such, the company might be used to circumvent it. I am also not aware of any employees in this company, and I know they contacted an optical consultancy. My work is free within the constraints of the typical licenses, MIT and GPL. Still, I think it is strange that the state keeps sponsoring this project. Hackaday gave me 3K, while the Dutch state gives a chosen business developer 45K to explore their failed project. That's not really fair, and in that sense the state is sponsoring "closed hardware".
Often the government throws away the results if a project is unsuccessful; if it were open source, one could at least learn something from it. Anyhow, I filed an official complaint and have talked to NWO.
The links are no longer online; you can still find them in the Google cache. NWO stated it updated its website.
08/06/2020 at 13:40 •
Although I do not plan to participate in the Hackaday Prize 2020, it is inspiring to look for sustainable applications of laser technology.
In 2019, the United Nations set up a campaign for sustainable fashion. Most of the clothes are manufactured in Asia. This process requires a lot of water and produces toxic waste.
Luckily, Dr. Laura Morgan looked for ways to dye textiles with lasers; see her extensive PhD thesis.
She showed that lasers can be used to dye textiles. Results can be seen in the image below.
In her laser experiments, Dr. Morgan used a carbon-dioxide laser at a dose of 2 joules per square centimeter. Results improved with multiple passes, e.g. 15 passes.
She also shows other applications: fading linen, increasing the absorption of wool, and applying 3D texture via heat distortion.
I think textile coloring could be a nice angle for the Hexastorm. If I have the time, I will do some experiments with a 450 nm laser at 10 W. This is much safer than a carbon-dioxide laser, whose infrared light is invisible. The process still requires a washing step, but the chemicals involved are a lot less dangerous than the chemicals used for PCBs. The application I see is adding patterns to existing products; I think the process is too slow for large-scale production due to insufficient laser power.
Which brings me to an update of my current progress.
At the moment, I am mostly busy with creating the FPGA electronics for the scanner.
I have made a new board, as there was a mistake in the previous one: it turned out I couldn't use the SPI programming port for sending laser data to the FPGA.
I also wrote new software; so far it has only been tested virtually.
06/11/2020 at 15:38 •
Managed to make the LED blink on my custom FPGA board. It was quite a challenge and I am glad I didn't go for a BGA package.
I placed the ICE40 chip on the board with drag soldering. The other components can be hand soldered, except for the oscillator, which requires hot air (I used an AOYUE 852) and paste. In the next iteration, I plan to use mainly hot air, as it is much faster. I used a TS100 soldering iron, Loctite GC10 solder paste, flux, and some desoldering wick to get the solder on the TQ144 component right. The biggest headache was that I didn't realize I had to send a wake-up packet to the flash memory. I tried the Python script ice zero prog to flash the memory, but it didn't work; it turns out the wake-up packet is implemented in icezprog but not in the old Python script (OMG!).
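For reference, the wake-up step looks roughly like this. This is a minimal sketch, assuming a flash chip that uses the standard JEDEC 0xAB "Release from Deep Power-Down" opcode; the function names are illustrative and the spidev calls at the bottom are commented out, as this is not the actual icezprog code.

```python
# Minimal sketch: wake an SPI flash from deep power-down before programming.
# Assumes standard JEDEC opcodes; the spidev usage below is illustrative.

RELEASE_POWER_DOWN = 0xAB  # standard "Release from Deep Power-Down" opcode
READ_JEDEC_ID = 0x9F       # standard "Read JEDEC ID" opcode

def wakeup_transfer():
    """Bytes to clock out to wake the flash: just the 0xAB opcode."""
    return [RELEASE_POWER_DOWN]

def jedec_id_transfer():
    """Opcode plus three dummy bytes to clock in the 3-byte JEDEC ID."""
    return [READ_JEDEC_ID, 0x00, 0x00, 0x00]

# On real hardware (e.g. with py-spidev on a Raspberry Pi):
# import spidev
# spi = spidev.SpiDev(); spi.open(0, 0); spi.max_speed_hz = 1_000_000
# spi.xfer2(wakeup_transfer())            # wake the flash first...
# ident = spi.xfer2(jedec_id_transfer())  # ...otherwise the ID reads back empty
```

If the wake-up is skipped, a powered-down flash simply ignores the programming commands, which matches the silent failure I saw with the old script.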
I also found out the board has too many caps and resistors; the IceZero uses fewer, see its schematic.
05/17/2020 at 13:37 •
The current proof-of-concept module uses the programmable real-time unit of the Beaglebone. This is a device similar to a micro-controller and runs at 200 MHz.
In my next iteration, I plan to use an FPGA. FPGAs can be faster and offer tighter control. The current design has a vendor lock-in to the AM335x ARM core from Texas Instruments.
I just finished routing the hat for the Raspberry Pi, which uses the ICE40HX4K. This chip comes in a TQ144 package, so it is easy to solder. The chip is also quite cheap, around 5 euros. I was able to reuse a lot of the design of the BeagleWire and IceCore. Information for the laser controller is streamed to the chip via SPI. Maybe at a later stage I will use the SDIO connector, which is faster.
Schematic is shown below, full design available here.
04/13/2020 at 14:15 •
I built a rotor balancer for my prism scanner with an Arduino Nano 33 BLE and an infrared LED sensor. I am able to reduce the unbalance of the prism by a factor of 10 at a rotor frequency of 90 hertz. Note that for this motor the rotor frequency does not equal the pulse frequency.
The prism seems to have multiple-plane unbalance, so I was not able to balance it perfectly.
A single measurement is shown below.
Multiple measurements are taken. For each measurement, the amplitude of the accelerometer signal and the phase difference between the IR sensor and the accelerometer signal are determined. As the phase seems dependent on frequency, the prism seems to have multiple-plane unbalance. What also seems to play a role is that the electric motor works in reverse: the spinning disk probably induces a current in the motor, which is turned off during the measurement. In the future, I will try to subtract the measurements of the unbalanced prism from those of a balanced one; maybe their difference will be more intuitive.
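The amplitude-and-phase step can be sketched as follows, with synthetic signals standing in for the Arduino data (all numbers are illustrative): the accelerometer signal is reduced to a single DFT bin at the rotor frequency, and its phase is taken relative to the IR reference.

```python
import numpy as np

np.random.seed(0)
fs = 800.0     # sampling frequency in Hz (illustrative)
f_rot = 90.0   # rotor frequency in Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic data: the accelerometer sees the unbalance at the rotor frequency,
# the IR sensor provides a once-per-revolution reference (modeled as a sine).
accel = 0.5 * np.sin(2 * np.pi * f_rot * t + 0.8) + 0.05 * np.random.randn(t.size)
ir_ref = np.sin(2 * np.pi * f_rot * t)

def tone(signal, freq, fs):
    """Complex amplitude of `signal` at `freq` (single-bin DFT)."""
    n = signal.size
    return 2.0 / n * np.sum(signal * np.exp(-2j * np.pi * freq * np.arange(n) / fs))

a = tone(accel, f_rot, fs)
r = tone(ir_ref, f_rot, fs)
amplitude = abs(a)                 # unbalance amplitude at the rotor frequency
phase = np.angle(a) - np.angle(r)  # phase of the unbalance w.r.t. the IR pulse
```

Repeating this per measurement gives the amplitude/phase pairs discussed above; a frequency-dependent phase then hints at multiple-plane unbalance.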
The code, measurements and a brief discussion of the results are available on Github.
01/10/2020 at 14:54 •
One of the challenges still open is how to balance the prism. Earlier, I discussed the mechanics in a blog named prism balancing. In this blog, I want to give a practical example of how to determine the mass and position from measurements.
Vibrations caused by a spinning rotor are measured with an accelerometer. As sensor, I used the MMA8452Q but would recommend the MMA8451Q or LIS3DH, as they have better specs. I used a Raspberry Pi 3B to measure the signal and generated the signal for the polygon motor via hardware PWM using pigpiod.
I pulsed the rotor at 20 Hertz and recorded the vibrations for 1 second at 800 Hertz sampling frequency.
Sampling of the signal must be equidistant, otherwise the discrete Fourier transform cannot be calculated. As mirror motor, I used the Panasonic AN4000A.
The discrete Fourier transform of a balanced mirror, shipped with the motor, is shown below
There is a peak around 100 Hertz. The discrete Fourier transform of one of the prisms I use is shown below.
An enormous peak close to 100 Hertz can be seen. This amplitude can be compared to the amplitude of a known balance weight to determine the required mass, as the centripetal force is linearly proportional to the mass. The procedure is only this simple for single-plane balancing; in two-plane balancing it is more complicated.
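The mass estimate then reduces to a ratio against a trial weight. A minimal sketch with purely illustrative numbers:

```python
def correction_mass(trial_mass_g, trial_amplitude, unbalance_amplitude):
    """Single-plane balancing: the centripetal force, and hence the measured
    vibration amplitude at the rotor frequency, scales linearly with mass,
    so the required mass follows from a ratio against a known trial weight."""
    return trial_mass_g * unbalance_amplitude / trial_amplitude

# Illustrative numbers: a 0.1 g trial weight produced an amplitude of 0.02;
# the unbalanced prism shows an amplitude of 0.08 at the rotor frequency.
mass = correction_mass(0.1, 0.02, 0.08)  # -> 0.4 g
```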
To determine the position of the mass, the angular position of the rotor must be measured, for example with a photo tachometer. Earlier, I discussed how a camera can be used to measure the position of the rotor, but a camera is quite expensive and would require some image analysis.
The DT2234C+ photo tachometer only costs 18 dollars. It was also discussed widely on Hackaday.
As a result, I bought one, attached reflective tape to one of the corners of the prism, and did a couple of experiments. The photo tachometer measured a speed of 5750 RPM, i.e. 96 Hertz.
This corresponds to my measurements as I can see a peak near 100 Hertz.
This also explains why I have so many problems with this motor and why I recommend the Sharp 160.
The Sharp 160 is able to spin at 2000 RPM. Also, the relation between the pulse frequency sent to the motor and the final RPM is direct: pulsed at 20 Hertz, the Sharp would spin at 1200 RPM and not 5750 RPM.
The position of the peak in the cross-correlation of the accelerometer and photo-tachometer signals gives an estimate of the phase difference.
If I know the phase-difference with respect to the marking used by the photo-tachometer, I can use that to calculate the position of the balance weight.
The code used for the measurements can be found on Github. I also did a successful experiment with an artificial signal to calculate the cross correlation.
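The artificial-signal experiment can be sketched roughly as follows; the signals and numbers are synthetic and purely illustrative, not my actual measurements. The lag of the cross-correlation peak gives the time shift between the two signals, which converts directly into a phase.

```python
import numpy as np

fs = 800.0     # sampling frequency in Hz (illustrative)
f_rot = 100.0  # rotor frequency in Hz
t = np.arange(0, 1.0, 1.0 / fs)
true_phase = np.pi / 2  # phase offset of the synthetic accelerometer signal

tacho = np.sin(2 * np.pi * f_rot * t)                # photo-tachometer reference
accel = np.sin(2 * np.pi * f_rot * t + true_phase)   # accelerometer signal

# Cross-correlate; the lag of the peak is the time shift in samples.
xcorr = np.correlate(accel, tacho, mode="full")
lag = np.argmax(xcorr) - (t.size - 1)
phase = -2 * np.pi * f_rot * lag / fs                # time shift -> phase
phase = (phase + np.pi) % (2 * np.pi) - np.pi        # wrap to [-pi, pi)
```

Note that the lag is quantized to whole samples, so the phase resolution is limited by the ratio of sampling frequency to rotor frequency (here 8 samples per revolution).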
I still have to extract the signal from the photo tachometer and turn the photo tachometer on digitally instead of with a button. Also, I have to order balance weights. Ben Wishoshavich pointed out I could use armature balancing putty.
01/03/2020 at 12:26 •
Over the last weeks, I have been doing experiments to see if I can replace the programmable real-time unit on the Beaglebone with an FPGA. On the Beaglebone, there are two programmable real-time units (PRUs), which are very similar to micro-controllers, running at 200 MHz with 8K bytes of memory per core and 12K bytes shared between them. At the moment, I use one PRU and one 8K-byte memory, which acts as a ring buffer.
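The ring-buffer idea can be sketched as follows. This is a minimal Python sketch of the concept, not the actual PRU firmware: the host pushes scan-line bytes in one end while the PRU side drains them to the laser from the other.

```python
class RingBuffer:
    """Minimal byte ring buffer, a stand-in for the 8K-byte PRU buffer."""

    def __init__(self, size=8192):
        self.buf = bytearray(size)
        self.size = size
        self.head = 0   # next write position (host side)
        self.tail = 0   # next read position (PRU side)
        self.count = 0  # bytes currently stored

    def push(self, data: bytes) -> int:
        """Write as many bytes as fit; returns the number accepted."""
        written = 0
        for b in data:
            if self.count == self.size:
                break  # buffer full: the host must wait for the consumer
            self.buf[self.head] = b
            self.head = (self.head + 1) % self.size
            self.count += 1
            written += 1
        return written

    def pop(self, n: int) -> bytes:
        """Read up to n bytes (what would be fed to the laser)."""
        n = min(n, self.count)
        out = bytearray(n)
        for i in range(n):
            out[i] = self.buf[self.tail]
            self.tail = (self.tail + 1) % self.size
        self.count -= n
        return bytes(out)
```

The same producer/consumer structure carries over to the FPGA version, where the internal SRAM plays the role of the 8K buffer.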
There are some challenges with the current design. I am locked into the AM335x processor, which runs at 1 GHz and is single core; the latest Raspberry Pi 4 runs at 1.5 GHz and is quad core.
If I am able to build an extension for the Raspberry Pi, it would not be hard to use the scanner with other boards, e.g. the Edge TPU or the NVIDIA Jetson Nano.
When I started this project, I actually tried to do this with the Spartan 6 FPGA, using the Xula-LX25.
In the end, I managed to get something working by writing the laser data to SDRAM and then reading it back. This was all written in MyHDL; see Github.
The MyHDL is converted to Verilog, which can then be converted to a bitstream by Xilinx ISE.
In my latest experiments, I used the IceZero with the ICE40HX4K and Migen. The oscillator on the IceZero runs at 100 MHz, and the ICE40HX4K has 81920 bits of memory. In practice, more memory is available, as the icestorm toolchain is able to program the ICE40HX4K as an ICE40HX8K.
The ICE40HX8K is a lot less powerful than the Spartan 6: it has 1/6 of the SRAM and 1/3 of the LUTs. The main advantage for me is the open-source icestorm toolchain, which runs on Linux.
My current setup works as follows: data is streamed to the FPGA via SPI and stored in the FPGA's internal SRAM. The SRAM acts as a buffer before the data is placed on the substrate with the laser. My code can be found here; look especially at the spimemmap example.
I did some experiments on the Raspberry Pi 3 using Python and spidev, and I am able to get data rates of up to 25 megabits per second.
In my current laser head, the scan line is 8 mm long and a pixel is 10 micrometers, so there are roughly 800 pixels per line, i.e. 100 bytes per line. At 20,000 RPM with 4 facets, that gives 20,000 × 4 / 60 = 1333 lines per second. This implies 133 kB/s, or roughly 1 megabit per second, much less than 25 megabits per second.
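The back-of-the-envelope calculation, spelled out in Python (one bit per pixel, as the laser is simply on or off):

```python
# Back-of-the-envelope check of the required data rate (numbers from the text).
line_length_mm = 8
pixel_um = 10
pixels_per_line = line_length_mm * 1000 // pixel_um  # 800 pixels per scan line
bytes_per_line = pixels_per_line // 8                # 1 bit/pixel -> 100 bytes

rpm, facets = 20_000, 4
lines_per_second = rpm * facets / 60                 # ~1333 lines per second

bytes_per_second = lines_per_second * bytes_per_line # ~133 kB/s
megabit_per_second = bytes_per_second * 8 / 1e6      # ~1.07 Mbit/s
```

So the required bandwidth sits comfortably below the 25 Mbit/s the SPI link delivers.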
Another problem, I am still working on is balancing the prism. I hope to report some progress on that soon.
What might be of interest is that MIT recently released a press statement showing that they are working with lasers and ultrasound. The technology is very similar to that in my previous blog post; the only difference is that they use a laser to detect the vibrations.
Another interesting article was one by the blog The Drive. Apparently, there is a Dutch startup which uses lasers to clean railway tracks.
11/16/2019 at 22:03 •
Won an honorable mention for best communication: 3000 dollars and a lot of branding. It was great to meet you all! Thanks for the award! You can also find this room in the Supplyframe video; the door is opened at 0:44, see video.