
IoT Farms - The Future of Agriculture

A cloud-based platform enabling data-driven farming to increase yield and automate agriculture

To help monitor crops, soil quality, weather, fertiliser use, water content and market conditions, I plan to deploy a mesh network of nodes at the edge (the farmland) that would connect to the Helium network through a central monitoring station placed on the farm. The monitoring station would receive soil-quality data, including mineral and water content, from the mesh of nodes, deployed atop poles at strategic locations on the farm, which would communicate with each other and with the station via BLE. The monitoring station would also carry a weather station to predict disturbances in advance.
Once the data from each node is received, it is processed extensively using machine-learning techniques in the cloud and stored in DynamoDB. An AWS IoT Events handler sends notifications about anomalies directly to the farmer and also automates irrigation. The farmer can control irrigation plans and monitor crop conditions remotely through the web application.
The nodes use the PSoC 6 dev kit.

THE PROBLEM
Agriculture today is highly dependent on weather conditions and on the farmer's experience in making decisions. Often that level of knowledge or insight is insufficient, because it is not possible to monitor crops in real time. This leads to decreased yield, wasted fertiliser and water, vulnerability to weather, weed and pest attacks, and a lack of knowledge of market conditions; the list is endless. This is a huge problem: at the current rate of population growth, many countries will soon run into food and water shortages. A decreased yield also reduces purchasing power in the rural sector, which affects the economic growth of the nation. In a country like India, where more than 50% of the workforce comprises farmers, it is essential to empower them so that they can contribute to economic growth and meet surging demand. Further, to control weeds and pests there is indiscriminate and unmindful use of chemical fertilisers, which not only deteriorates soil quality but also poses risks to public health. There is no way for consumers to know whether the food they buy is free of these chemicals. No single solution to all these problems exists.

THE SOLUTION
To help monitor crops, soil quality, weather, fertiliser use, water content and market conditions, I plan to deploy a mesh network of nodes at the edge (the farmland) that would connect to the AWS IoT cloud through a central monitoring station placed on the farm. The monitoring station would receive soil-quality data, including mineral and water content, from the mesh of nodes, deployed atop poles at strategic locations on the farm, which would communicate with each other and with the station via BLE. The monitoring station would also carry a weather station to predict disturbances in advance.
Once the data from each node is received, it is processed extensively using machine-learning techniques on AWS IoT Greengrass; only the nodes where abnormalities are found are noted, along with the type of error present, and published to the AWS cloud in a coded format when the need arises. Note that minor issues such as a water or mineral deficit are resolved immediately by opening valves at the node where the error is present. In this way, irrigation is completely autonomous.
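The edge-side triage described above can be sketched as a simple per-node range check. This is only an illustration: the threshold values and the coded record format are my assumptions, not the project's actual configuration.

```python
# Minimal sketch of the edge-side triage: only out-of-range readings
# produce a coded record for the cloud. Thresholds and the record
# format are illustrative assumptions, not the project's real values.

LIMITS = {
    "moisture_pct": (30.0, 70.0),  # volumetric water content, %
    "tds_ppm": (200.0, 800.0),     # total dissolved solids, ppm
}

def check_node(node_id, reading):
    """Return coded anomaly records for one node's reading dict."""
    anomalies = []
    for key, (lo, hi) in LIMITS.items():
        value = reading.get(key)
        if value is None:
            continue
        if value < lo:
            anomalies.append({"node": node_id, "error": key + "_deficit", "value": value})
        elif value > hi:
            anomalies.append({"node": node_id, "error": key + "_excess", "value": value})
    return anomalies
```

A healthy node returns an empty list and therefore generates no cloud traffic, which mirrors the idea of publishing only when the need arises.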
The central station is essentially a quadcopter sitting atop a charging pad. Once every two or three days, it flies over the farm and captures images of crops only in those regions where mineral stress or unnatural growth is expected; after analysis, it flies over to the farmer carrying the data collected over the past few days and transfers it wirelessly over BLE or WiFi. The imagery and the analysed crop-health parameters can be viewed by the farmer on a smartphone or desktop through an app based on AWS IoT Things Graph and IoT Core.
This data can then be analysed to make important decisions such as harvesting or tweaking the irrigation plan. The data is also automatically uploaded to AWS DynamoDB for sharing with health-conscious consumers or for agricultural and meteorological research.
On the software side is a web service written in Node-RED that fetches data published to the AWS cloud from the central station and sends notifications to the farmer, along with geotagged abnormal behaviour. The farmer can override autonomous regulation by the central station, for example by changing the irrigation plan or the imagery-acquisition schedule.
The web service would also fetch agri-market data from online feeds and inform the farmer about selling prospects. The farmer would have the option to publish collected soil-quality data to the web service, which consumers may subscribe to in order to ensure that what reaches their table is safe to consume.


  • 1 × Q250 Generic Drone Frame
  • 4 × BLHeli OneShot ESC
  • 4 × Generic BLDC with 6" propellers
  • 2 × Capacitive soil moisture sensor
  • 2 × TDS sensor

View all 13 components

  • 1
    The Implementation

    The nodes would be based on the nRF52832 module, sitting atop wooden poles on the farm. Mounted on each pole would be a solar panel supplying a capacitive soil-moisture sensor and a TDS soil mineral-content sensor (based on availability). Also connected would be a solenoid valve N1/2, which would regulate irrigation. These nodes would be placed around 50 m apart.
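A capacitive soil-moisture probe outputs a raw ADC count rather than a percentage, so each node needs a small calibration step. The sketch below shows the usual linear mapping between two calibration readings; the two constants are assumptions recorded once with the probe in dry air and once submerged in water, not measured values from this build.

```python
# Sketch of converting a capacitive soil-moisture sensor's raw ADC
# count into an approximate moisture percentage. Both calibration
# constants are assumptions: record them once with the probe in dry
# air and once submerged in water.

ADC_DRY = 3300  # raw reading in dry air (assumed)
ADC_WET = 1400  # raw reading submerged in water (assumed)

def moisture_percent(raw):
    """Linearly map a raw ADC count to 0-100 % moisture, clamped."""
    span = ADC_DRY - ADC_WET
    pct = (ADC_DRY - raw) * 100.0 / span
    return max(0.0, min(100.0, pct))
```

The linear map is a common first approximation; real probes drift with temperature and soil type, so field recalibration would still be needed.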
    The central station would sit atop a wooden pole with a large solar panel charging an 8000 mAh LiPo battery. A standard 250 mm quadcopter frame with 6" propellers, powered by generic BLDC motors, would be used. The LiPo battery would supply a power-distribution board on the frame, which in turn would supply four ESCs and a Cypress PSoC 6 Prototyping Kit acting as the main controller. An Arduino Uno-based hat would be the flight controller, configured through the open-source Mission Planner software. The PSoC board would receive imagery data from an IR camera and a normal 8 MP camera via its USB host.
    The deep-learning models for crop and weed classification have already been trained and verified successfully (check my GitHub). The imagery would be processed and inferences drawn with the help of an Intel Neural Compute Stick 2, which would also connect to the USB host. The drone and the image-classification setup have already been tested successfully (but independently). For storing imagery temporarily, a micro SD card would be used, interfaced directly with the PSoC 6 Prototyping Kit.
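Whatever accelerator runs the network, the last step of classification is turning the model's raw class scores into a decision. A minimal sketch of that step follows; the class labels and the confidence threshold are hypothetical placeholders, not the project's trained model.

```python
# Sketch of the decision step after the crop/weed classifier runs
# (e.g. on the Neural Compute Stick 2). The labels and the confidence
# threshold are illustrative assumptions.
import math

LABELS = ["healthy_crop", "weed", "nutrient_stress"]  # hypothetical classes
CONFIDENCE_THRESHOLD = 0.6

def classify(logits):
    """Softmax the raw scores and return (label, confidence);
    below the threshold the result is flagged as "uncertain"."""
    peak = max(logits)
    exps = [math.exp(x - peak) for x in logits]  # shift for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    label = LABELS[best] if probs[best] >= CONFIDENCE_THRESHOLD else "uncertain"
    return label, probs[best]
```

Flagging low-confidence outputs as "uncertain" rather than guessing is one way to avoid spurious weed alerts to the farmer.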
    A web service based on Node-RED would fetch data uploaded to AWS IoT Core and notify the farmer of any anomalies detected on the farm. It would also have a panel for the farmer to monitor, override and tweak the current regulation procedures. In addition, it would give notifications about market trends and provide a way for the farmer to upload collected farm data for use by interested consumers or agricultural researchers. Work has not yet started on the web service, but I have done this before and will only need to tweak existing code.
    Now that the build is outlined, let me explain the implementation.
    The nodes would form a mesh network and transfer data sequentially to the central station at intervals of one hour. Even if the depth of the mesh is large, the latency introduced is negligible for this purpose. The central station would continually analyse this data and observe trends, and the installed DL models would try to determine whether a problem exists. If there is a moisture deficit at a node, that node is alerted and opens its valve for irrigation; as soon as the moisture content returns to normal, the valve is closed. The nodes sleep for the rest of the time.
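The open-on-deficit, close-on-normal behaviour is essentially a hysteresis controller: two separate thresholds keep the valve from chattering around a single set point. A minimal sketch, where both thresholds are assumed values:

```python
# Sketch of the valve logic described above: open on a moisture
# deficit, close once moisture is back to normal. Two thresholds
# (hysteresis) stop the valve oscillating around one set point.
# Both values are assumptions, not the project's tuning.

LOW = 30.0     # % moisture below which irrigation starts (assumed)
NORMAL = 45.0  # % moisture at which irrigation stops (assumed)

def next_valve_state(valve_open, moisture_pct):
    """Return the valve state after one hourly moisture reading."""
    if not valve_open and moisture_pct < LOW:
        return True       # deficit detected: start irrigating
    if valve_open and moisture_pct >= NORMAL:
        return False      # back to normal: stop irrigating
    return valve_open     # otherwise hold the current state
```

With readings arriving hourly, a node at 35 % moisture keeps whatever state it already has, so a cycle that has started runs until the soil genuinely recovers.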
    If instead a mineral excess or deficit is detected, the farmer is informed and the relevant values are published to the cloud. Observing these trends, the farmer may command the station to capture an aerial view of the area at the end of the day and take decisions after that. A sensor hat mounted on the dev kit would log important weather data and warn the farmer in advance of any disturbance. The web service also subscribes to online databases for weather reports in surrounding regions to notify the farmer directly. The central station would send suggestions to the farmer after analysing the data from the nodes throughout the day.
    Every two or three days, or whenever the farmer desires, the central station hovers over stressed regions, or regions the farmer geotags through the web service, and captures imagery, which is then classified using the DL models to inform the farmer about expected yield, weed attacks, crop health, growth rate and other suggestions such as harvesting time or changes to the irrigation or fertilising plan. After analysis, the station flies over to the farmer and dumps all collected data onto the farmer's network via WiFi.
    The central station would be based on AWS IoT Greengrass and would communicate with other stations on adjoining farms. If connectivity is lost, these stations would collaborate to ensure that data is still received. Weather data would also be shared among these stations. The data uploaded to the cloud is stored in DynamoDB and analysed using IoT Core and IoT Analytics. The farmer may visualise the collected data at any time with the help of AWS IoT Things Graph.
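For the DynamoDB side, one natural layout for per-node readings is a partition key on the node id and a sort key on the timestamp, so each node's history can be queried as a time series. The sketch below only composes such a record; the table schema and attribute names are my assumptions, and the actual write would go through boto3 (e.g. `Table("FarmReadings").put_item(Item=item)`, with floats converted to `decimal.Decimal` as boto3 requires).

```python
# Sketch of one time-series record a station might store in DynamoDB.
# Table layout and attribute names are assumptions, not the project's
# schema. A real write via boto3 would additionally need floats
# converted to decimal.Decimal before calling put_item.

def build_item(node_id, timestamp_iso, moisture_pct, tds_ppm):
    """Compose a reading keyed for time-series queries per node."""
    return {
        "node_id": node_id,       # partition key (assumed)
        "ts": timestamp_iso,      # sort key, ISO 8601 (assumed)
        "moisture_pct": moisture_pct,
        "tds_ppm": tds_ppm,
    }
```

Keying on (node_id, ts) lets a query like "all readings from node-07 this week" run without a table scan, which matters once readings arrive hourly from every node.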
    The autonomous drone was built as part of another project, as was the image-classification setup (part of an academic project). The only things left are to build the nodes, add the PSoC 6 Prototyping Kit and configure the web service.
    For proof of concept, I will collect data from only two nodes (as I only have three solar panels) and simulate error conditions (as I don't have access to a real farm). For the image analysis, I will use plants in my garden. Hopefully, if the lockdown is lifted, I can do real tests on a farm.



Discussions

Abdullah wrote 7 days ago

This is impressive. I absolutely love these kinds of projects!

Keep going :)  

Abdul


Mayke wrote 06/05/2020 at 13:09

I believe that your problem is overly generic in its formulation. To believe that a set of sensors will 'solve' most of the problems posed by traditional agriculture is a bit naïve. Real-time data does not necessarily help a farmer or a company, as it brings a new set of problems, the most apparent being how to understand, pre-process and analyse the datasets. If I were you, I would break it into smaller problems that are easier to define and solve; for example, in a region with increasing drought problems, irrigation sensors will feature more heavily than others. Another issue is the reach, quality and duration of your sensors in collecting data. A capacitive soil sensor will not give you any indication of mineral content in the soil; it simply measures resistance. Another massive problem is sensor calibration; if you are measuring things like irradiance, reflectance, or even more complicated things like fluorescence, the calibration process is painful and labour-intensive.

Let's talk for one second about your quadcopter, but focusing just on the payload; that is, I don't even want to get into the huge challenge of deploying a drone remotely, as that is the realm of serious engineering. So, let's talk about the camera; you mention two, in fact: one NIR camera and one RGB. I assume that both consist of the same sensors and lenses; I also assume that you do some post-processing with the images, such as mosaicing and image registration. For these two types of processes, you need much more than just images; you need accurate and relevant metadata: not only precise GPS coordinates, but also the camera angle, altitude, timestamp at millisecond level and, OF COURSE, calibration data. The easiest way of getting calibration data is by deploying a Lambertian panel that you photograph before and after your mission, so you have a meaningful way of processing your images. Otherwise, all these pictures are useless. You can also collect that data the hard, very hard, way. It means you need to develop an up/down-welling sensor, preferably with spectrometers, though it can also be built using photodiodes that measure individual bands. Such a sensor would have to be mounted on a sort of gimbal on top of your drone; otherwise, all the angles, turns, twists, etc. will destroy your data. Another option is to collect such data using a base station but, then again, that data will differ from what your camera is collecting and will need to be cleaned using complex maths. Let's talk about image sensors. You need to use expensive, monochrome sensors. Why expensive? Well, you want to be the proud owner of a product sheet with all the specifications, such as QE. Not all sensors are sensitive to light in the same way, and if your camera is good at capturing NIR light, it might not be that good at capturing blue light, so you will have to account for that in your post-processing. Why monochrome?
A Bayer-pattern filter will mess up your data collection simply because it is a layer printed on top of your sensor, and it changes the number of photons that arrive in each colour. You can compare the brightness of an image collected with a real monochrome sensor and one of the bands of your RGB camera, and you will see that there is a big difference. Of course, you can always compensate for that but, again, you want your data to be as accurate as possible.

Then let's talk about ML and the processing of images to obtain indices and extract from your images things like types of crops, weeds, or, even more complicated, nutrient deficiencies. Everyone relishes the holy NDVI as an everything-solver. It is not. For training models to be accurate, e.g. supervised classification, you need good data. Data that you will need to collect by hand in the field. The data you need are spectral profiles of the things you'll look for in your images later. So you need a spectral profile of a specific weed, spectral profiles of the crops you want to discern, etc. To collect spectral data you need a spectrometer calibrated for reflectance, and a tonne of field days. And hey! I am just talking about how things would look. What about looking for plants with a specific problem that you want to find later? A disease, a nutrient deficiency, water stress? How do they differ from other things? A fever can be a symptom of tonnes of illnesses, so if you are looking for fever, how do you know what is causing it?
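For readers unfamiliar with the index being critiqued: NDVI is simply the normalized difference of the near-infrared and red reflectances. The formula below is the standard definition; as the comment stresses, it is only meaningful if the input reflectances are properly calibrated.

```python
# NDVI (Normalized Difference Vegetation Index), the index referenced
# above: (NIR - Red) / (NIR + Red), ranging from -1 to 1, with dense
# healthy vegetation typically well above 0. Inputs are reflectances
# in [0, 1]; without radiometric calibration the number is meaningless.

def ndvi(nir, red):
    if nir + red == 0:
        return 0.0  # guard against division by zero on dark pixels
    return (nir - red) / (nir + red)
```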

If you speak to a farmer, or an industry specialist, they will be extremely sceptical of all-out solutions; sometimes they will even frown at the sight of 'problems' they don't experience, or that simply are not high on their to-do list. For example, by the time a nutrient deficiency is visible in your imagery, there might not be a solution anymore, so the answer is to prevent these things from happening in the first place. If your farmer is looking for nutrient deficiencies during the season, his crop is already gone.

Another issue for field sensors is that they interfere with traditional agricultural machinery. Farmers need to spray and fertilise their fields several times in a season. If your sensors are bulky or tall, they will send you packing, because they don't want to have more issues.

I recall a conversation with the CEO of a large UK-based palm-oil-producing company with plantations in Indonesia. I told him we could tag each individual plant with QR codes and the field workers would use their phones to log info about each plant: fruit weight, diseases, pruning, etc. He laughed and told me that his company was barely coping with all the data they already had, and more data would mean more confusion for his field staff. I left the meeting fuming, but he was dead right.

Please do not let me discourage you. I hope the comments above are food for thought. I bet that you have solved most of these issues already. Best of luck with your project!


Moinak Ghosh wrote 06/17/2020 at 10:11

@Mayke Thank you so much for your insights. I am relatively new to this field, and after you pointed out the loopholes in the solution I did a lot of background research and discussed some of the issues with my professors.

I also took help from a startup founded by a senior of mine, based in India. You were absolutely right about your concerns. I cannot thank you enough. It is people like you who uplift the spirit of open source. I will keep updating the logs as I move on with the project.


seigneurcanard wrote 06/05/2020 at 07:54

I was not familiar with the Helium network; seems cool. Wouldn't LoRaWAN and https://www.thethingsnetwork.org/ have better range and be better suited to the large area of a farm?


Keshav Bagri wrote 06/05/2020 at 12:56

I get your concern, but please also take into account the cost of deploying LoRa nodes, which is well beyond what poor farmers can afford. LoRaWAN is obviously better for long-distance applications, but this setup eliminates the need for LoRa.


