Most people have their smartphone nearby almost all the time, so in the ideal case the "evaluation unit with screen" adds no extra cost, since the smartphone takes over this role. Older smartphones or simple keypad phones often have a weaker IR blocking filter (this differs between devices and manufacturers; the iPhone, for example, has a very strong IR blocking filter), so even if they cannot run the optimization algorithms, their video preview could already show the veins more clearly under plain IR illumination. In that case a very low-cost solution would be possible (only IR LEDs are required). Another way to build a mobile variant is to connect the modified webcam to a smartphone with a USB On-The-Go (OTG) adapter. Not all smartphones support this, but it is a way to bypass the built-in camera: you get an IR-sensitive camera with built-in illumination and can additionally optimize the video on the phone in software. We chose the second version.
For the hardware you simply need to print the STL files according to your 3D printer's software. The case is designed to fit the Raspberry Pi 3 with the three encoders attached, but you can adjust it to your needs. The encoders attach to GND and to GPIOs 20/21, 18/23 and 24/25. The IR LEDs we used have their peak at 940 or 950 nm and require a 12 Ohm resistor if you connect three of the LEDs in series. Connect three of these series strings, each with its own resistor, in parallel and you get a 3x3 LED array, which will fit into the casing designed for the reflectors.
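As a quick plausibility check for the resistor value, here is a small back-of-the-envelope calculation. The supply voltage, LED forward voltage and drive current below are assumptions for illustration, not measured values from our build:

# Assumed values for illustration only: 5 V supply, ~1.5 V forward
# voltage per 940/950 nm LED, ~40 mA drive current per string.
supply_v = 5.0
led_forward_v = 1.5
leds_per_string = 3
string_current_a = 0.040

resistor_ohm = (supply_v - leds_per_string * led_forward_v) / string_current_a
print(resistor_ohm)  # 12.5 -> 12 Ohm is the nearest standard value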
If you want to stream the calculated image to your smartphone, TV or tablet, you either need to integrate the Raspberry Pi into your local Wi-Fi network – or just start a new one. We don't want the user to have to deal with editing Wi-Fi settings in a terminal session.
The veins are illuminated with IR light (950nm) and the back scattering is captured by the Raspberry Camera (the one without the IR-filter). You can use old analogue film tape as a filter to block visible light and let only pass IR- light. The camera picture is processed in several stages to get an improved distribution of light and dark parts of the image (histogram equalization). The reason to use near IR illumination lies in the optical properties of human skin and in the absorbance spectrum of hemoglobin.
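To illustrate the enhancement step, here is a minimal OpenCV sketch that applies histogram equalization to a grayscale IR frame. The file names and the additional CLAHE (adaptive equalization) variant with its parameters are assumptions for illustration, not the exact processing chain of the Venenfinder:

import cv2

# Load a single IR frame as grayscale (placeholder file name)
frame = cv2.imread('ir_frame.png', cv2.IMREAD_GRAYSCALE)

# Global histogram equalization: spreads the intensity values over the full range
equalized = cv2.equalizeHist(frame)

# Adaptive variant (CLAHE): equalizes in local tiles, often better for uneven IR lighting
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(frame)

cv2.imwrite('ir_frame_equalized.png', equalized)
cv2.imwrite('ir_frame_clahe.png', enhanced)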
The device was developed by us (code, illumination, 3D files as well as numerous tests of prototypes and real-world tests in a hospital). In this tutorial we refer to the following blogs, which helped us develop this mobile version of the "Venenfinder":
We cite some steps from Adrian’s blog on how to install OpenCV on the Raspberry Pi from scratch: https://www.pyimagesearch.com
We just decided to turn the Pi into a hotspot. Here we followed Phil Martin’s blog on how to use the Raspberry Pi as a Wi-Fi access point: https://frillip.com/using-your-raspberry-pi-3-as-a-wifi-access-point-with-hostapd/
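For reference, a minimal hostapd.conf of the kind described in that post could look like the sketch below. Interface name, SSID, channel and passphrase are placeholders, not the values from our image, and you still need a static IP on the Wi-Fi interface plus a DHCP server (e.g. dnsmasq) as described in the linked post:

interface=wlan0
driver=nl80211
ssid=Venenfinder
hw_mode=g
channel=6
auth_algs=1
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeThisPassphrase
rsn_pairwise=CCMP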
Since you need a way to change the settings of the image enhancement, we decided to use rotary encoders. These are basically just two switches, and the sequence in which they close and open tells you the direction the knob was turned. We soldered three rotary encoders to a little board and created our own Raspberry Pi HAT. For the code we used: http://www.bobrathbone.com/raspberrypi/Raspberry%20Rotary%20Encoders.pdf
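If you just want to see the decoding principle, here is a minimal Python sketch for one encoder on GPIOs 20/21. It is not Bob Rathbone's class, and the internal pull-ups and the debounce time are assumptions of this sketch:

import RPi.GPIO as GPIO

PIN_A, PIN_B = 20, 21  # one of the three encoders from the wiring above

GPIO.setmode(GPIO.BCM)
# The encoder switches connect the pins to GND, so enable the internal pull-ups
GPIO.setup(PIN_A, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(PIN_B, GPIO.IN, pull_up_down=GPIO.PUD_UP)

position = 0

def on_edge(channel):
    """On every edge of pin A, the level of pin B gives the turning direction."""
    global position
    if GPIO.input(PIN_A) == GPIO.input(PIN_B):
        position += 1   # one direction (may be inverted depending on wiring)
    else:
        position -= 1   # the other direction
    print("position:", position)

GPIO.add_event_detect(PIN_A, GPIO.BOTH, callback=on_edge, bouncetime=2)

try:
    input("Turn the knob, press Enter to quit\n")
finally:
    GPIO.cleanup()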
We used some code from Igor Maculan – he programmed a simple Python Motion JPEG (MJPEG) server using a webcam, and we changed it to use the Pi camera and added the encoders and an on-screen display of the parameters. Original code:
https://gist.github.com/n3wtron/4624820
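To give an idea of how such a stream works, here is a stripped-down MJPEG server sketch for the Pi camera (Python 3, picamera). Port, resolution and frame rate are arbitrary example values, and the image enhancement and encoder handling of the real program are left out:

from http.server import BaseHTTPRequestHandler, HTTPServer
import io
import picamera

class MJPEGHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser keeps this response open and replaces the image frame by frame
        self.send_response(200)
        self.send_header('Content-Type', 'multipart/x-mixed-replace; boundary=frame')
        self.end_headers()
        with picamera.PiCamera(resolution=(640, 480), framerate=10) as camera:
            stream = io.BytesIO()
            for _ in camera.capture_continuous(stream, format='jpeg', use_video_port=True):
                self.wfile.write(b'--frame\r\n')
                self.wfile.write(b'Content-Type: image/jpeg\r\n')
                self.wfile.write(('Content-Length: %d\r\n\r\n' % stream.tell()).encode())
                self.wfile.write(stream.getvalue())
                self.wfile.write(b'\r\n')
                stream.seek(0)
                stream.truncate()

HTTPServer(('0.0.0.0', 8080), MJPEGHandler).serve_forever()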
To rebuild this, you can find the 3D files and the Python program on my blog: https://zerozeroonezeroonezeroonezero.wordpress.com/
And there is a tutorial linked here that shows step by step how to turn the Raspberry Pi into a hotspot.