
Plant LED biofeedhack system

multispectral LED computer vision plant biofeedhack system

The goal of my project is to ‘bio-hack’ a plant in order to optimize its overall growth. The project combines a digital camera, light-emitting diodes (LEDs), and software image analysis tools to perform non-contact plant growth analysis. The system measures growth parameters such as leaf size, plant height, and leaf reflectance. These growth characteristics are then used to regulate the spectral and temporal output of the LED grow lights, providing real-time feedback that optimizes plant development with minimum energy.
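As a rough illustration of the feedback idea, here is a minimal Python sketch of the measure-then-adjust loop. Every name in it (capture_image, measure_leaf_area_px, set_led_duty) and the simple proportional rule are hypothetical placeholders rather than project code; the camera, computer vision, and LED driver stubs would be filled in by the real hardware and software.

# Hypothetical sketch of the biofeedback loop: measure growth with the
# camera, then nudge the LED output toward a growth target.
import time

def capture_image():
    """Grab a frame from the Raspberry Pi NoIR camera (stub)."""
    raise NotImplementedError

def measure_leaf_area_px(image):
    """Return the leaf 'blob' area in pixels for one image (stub)."""
    raise NotImplementedError

def set_led_duty(channel, duty):
    """Set the PWM duty cycle (0.0-1.0) of one LED channel (stub)."""
    raise NotImplementedError

def feedback_loop(target_px_per_hour, interval_s=3600, duty=0.5):
    """Proportional rule: raise the red-channel duty if growth lags the
    target, lower it (saving energy) if growth exceeds the target."""
    last_area = measure_leaf_area_px(capture_image())
    while True:
        time.sleep(interval_s)
        area = measure_leaf_area_px(capture_image())
        growth = (area - last_area) * 3600.0 / interval_s   # pixels per hour
        duty = min(1.0, max(0.1, duty + 0.01 * (target_px_per_hour - growth)))
        set_led_duty("red", duty)
        last_area = area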

I believe this project directly relates to the climate change, overpopulation, and food shortage challenge of the ‘Design your concept’ contest. Combining LED and computer vision technology creates the opportunity to develop plant lighting algorithms that simultaneously improve plant yield and greenhouse energy efficiency.



The figure shown on the project board is my first attempt to demonstrate the multispectral computer vision plant biofeedhack system concept. The picture was taken with a modified LED grow light and a Raspberry Pi NoIR camera. It shows different images of a five-week-old basil plant leaf illuminated by different LED wavelengths. Top left is a computer-vision-enhanced image; top right is a white-light image. The images below are recorded with ultraviolet (uv1, uv2), blue (blu), green (grn), red, and infrared (ir) LED light. The remaining pictures are processed composite images that highlight color contrast. Computer vision software (SimpleCV) extracts the leaf image (colored dark green) and measures the number of pixels in its area, along with its length, width, and perimeter (bottom text). The picture shows that a measurement system can be built to characterize plant growth and multispectral leaf reflectance. These measurements can then be used to tailor the LED light to different growth conditions.
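For anyone who wants to reproduce the measurement step, a minimal sketch follows. The figure itself was produced with SimpleCV, so this is not the exact processing used; the sketch uses OpenCV (which the Mar 2019 update points to), assumes a single leaf brighter than its background, and the file name and Otsu threshold are illustrative choices only.

# Minimal leaf 'blob' measurement sketch (OpenCV), illustrative only.
import cv2

img = cv2.imread("leaf_white_light.png")              # illustrative file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Otsu's threshold separates the bright leaf from the darker background.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# [-2] keeps this working across the OpenCV 3 and 4 return conventions.
contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
leaf = max(contours, key=cv2.contourArea)              # largest blob = the leaf

area_px = cv2.contourArea(leaf)                        # number of pixels in the area
perimeter_px = cv2.arcLength(leaf, True)               # blob perimeter
x, y, w, h = cv2.boundingRect(leaf)                    # length and width via bounding box

print(f"area={area_px:.0f}px length={h}px width={w}px perimeter={perimeter_px:.0f}px")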

Update Mar 2019.

Several related computer vision projects using OpenCV are posted at:

https://publiclab.org/profile/MaggPi

schematic.png

schematic

Portable Network Graphics (PNG) - 125.43 kB - 03/30/2017 at 04:34


  • Timelapse computer vision measurement

    mn2718281828 • 04/28/2017 at 05:51 • 0 comments

    The attached video shows computer vision measurements of a basil plant over a 30-day period. The goal was to understand computer vision performance for the LED plant biofeedhack system. The images show the ability of computer vision to measure small features, less than 1 mm across, without contact. The next step is to regulate the LED grow lights with data collected from the computer vision measurements. Thanks for following, and special thanks to those who 'liked' the project!


    Computer vision software (SimpleCV) extracts basil plant ‘blob’ features from the soil background and counts the number of pixels in each plant ‘blob’. Picture info: unprocessed image (right); the computer-vision-processed image (left) marks counted pixels in green. The text below displays the number of pixels in the area, length, width, and perimeter for each image. Images are scaled/calibrated against a ruler: 1 pixel measures 0.6 mm × 0.6 mm.
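To tie those pixel counts to physical units, here is a small conversion sketch based on the 0.6 mm/pixel ruler calibration quoted above; the example blob areas are placeholders, not measurements taken from the timelapse.

# Convert per-frame blob areas (pixels) into mm^2 and a day-to-day growth
# trend, using the 0.6 mm/pixel ruler calibration. Placeholder data only.
MM_PER_PX = 0.6
daily_area_px = [1200, 1340, 1515]        # one blob area per day (placeholder)

daily_area_mm2 = [a * MM_PER_PX ** 2 for a in daily_area_px]
growth_mm2_per_day = [b - a for a, b in zip(daily_area_mm2, daily_area_mm2[1:])]
print(daily_area_mm2)                     # ≈ [432.0, 482.4, 545.4] mm^2
print(growth_mm2_per_day)                 # ≈ [50.4, 63.0] mm^2 per day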



Discussions

Jasson Adder wrote 06/13/2022 at 11:29

The investment community has begun to respond, if not in practice then at least in words, to Ross's declaration that 2019 is "the year the world woke up to climate change." Concerns regarding environmental, social, and governance factors (also known as ESG) are increasingly widely recognised as having an effect on the decisions made in the capital market. ESG considerations are now routinely incorporated into investment analyses conducted by fund managers and private wealth managers, investment banks, pension plans, and individual investors. Additionally, businesses regularly publicise their ESG credentials. By the year 2020, "68 percent of UK savers wanted their investments to evaluate the impact on people and planet alongside the financial performance," as stated by Ross.

More than just a Glance at the Whole

The book "Investing to Save the Planet" is geared toward private investors as its target demographic. It does an excellent job of accomplishing its aim, which is to present a condensed, clear introduction to environmentally responsible investing while omitting no essential material.

The chapters go quickly and combine an incredible amount of information with the author's personal anecdotes and experiences. This is not merely something that will make you feel wonderful. It is likely that readers who are interested in this book value ESG for deeply personal reasons, and the personal narratives that are covered in this book put the quantitative facts in a context that is rich and complicated.

At the end of each chapter is a practical piece of advice labeled "What Should I Do?" To demonstrate that Ross has a deep understanding of the needs of her audience, she presents different lists of proposed actions for low-, medium-, and high-risk investors. If you want to get more information:

https://pureclimatestocks.com/book-review-investing-to-save-the-planet-alice-ross/


greenaum wrote 07/10/2021 at 03:32

Hi. You know that, for example, using red and green light together doesn't really make yellow light? It's an illusion caused by the human vision system. We have receptors, nominally red, that are sensitive to red but also a fair way into yellow. The nominally green receptor responds to orange through to cyan. Etc.


So true spectral yellow of 580nm excites the red and green cones. Your brain is wired to know that, so something stimulating R + G is assumed to be yellow. But it could also just be a red and green light, separately, shining on the same area, reflecting into the eye. Seeing red and green simultaneously, the brain assumes it's seeing yellow, but no 580nm is there, just 550nm and 650nm. The brain can't comprehend two separate colours in one area. So it assumes.

Of course, that's how your monitor works. But at school it was implied the colours "mix" somehow or somewhere, perhaps on the illuminated object. But we're also taught that light waves of different colours don't interfere. The final step, it's an illusion in the human head-meat, I was certainly never told and I don't think most people are.

If plants care about colour (and they do), then you have no yellow to offer them. The chemical reactions in their leaves aren't fooled by imitation yellow! (Or magenta, or any other visible colour besides the 3 you have supplied.)

So my point is, perhaps you need a wide spectrum of LEDs. You can get them in all sorts of in-between wavelengths; it's surprising how many variants there are. Even plain red and green come in a few varieties, and often "superbright" LEDs are just of a wavelength we respond better to.

I'm not certain whether you need these other wavelengths, just pointing out something that, certainly, I wasn't aware of for quite a long time into my adult life. 

Oh, of course this R+G+B illusion thing also applies to most cameras! They use the same system, since they were built for humans. This might not be a problem, since, like us, they can see yellow even though they really can't see yellow! So you can still take photos at whichever wavelength; you just need to be sure you're applying the correct one.

[hmm, wouldn't it be interesting to make a camera whose RGB filters were very strict and narrow? so they'd only detect a narrow band around 650nm red, etc. Yellow or cyan would show up dark, not pass the receptor. In a way it would demonstrate human sight and how things actually look! In a way...]
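A quick numerical way to see the point above: with toy Gaussian filter curves (invented for this illustration, not real camera or cone data), you can tune a 550 nm + 650 nm mixture so that broad filters cannot tell it apart from true 580 nm yellow, while narrow filters separate the two immediately.

# Toy metamerism demo (NumPy). All filter shapes and widths are made up.
import numpy as np

wl = np.arange(400.0, 701.0)                  # wavelength grid in nm

def line(center, width=5.0):
    """Narrow Gaussian emission line, standing in for an LED."""
    return np.exp(-((wl - center) / width) ** 2)

def rg_response(spectrum, filter_width):
    """Integrate a spectrum against toy 'red' (650 nm) and 'green' (550 nm) filters."""
    return np.array([np.sum(spectrum * np.exp(-((wl - c) / filter_width) ** 2))
                     for c in (650.0, 550.0)])

yellow = line(580)                            # true spectral yellow
red, green = line(650), line(550)             # two separate LED lines

# Pick red/green intensities whose broad-filter response matches yellow's.
broad = 60.0
A = np.column_stack([rg_response(red, broad), rg_response(green, broad)])
r_i, g_i = np.linalg.solve(A, rg_response(yellow, broad))
mix = r_i * red + g_i * green

print("broad filters :", rg_response(yellow, broad), rg_response(mix, broad))  # ~equal: metamers
print("narrow filters:", rg_response(yellow, 10.0), rg_response(mix, 10.0))    # clearly different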

