Machines find a place in agricultural fields: Scientists are training computers to detect stress in soybeans


Iowa State University scientists are working towards a future in which farmers can use unmanned aircraft to identify, and even predict, disease and stress in their crops. Their vision depends on machine learning, an automated process that could help farmers respond to plant stress more efficiently.

Arti Singh, an adjunct assistant professor of agronomy, is leading a multi-disciplinary research team that recently received a three-year, $499,845 grant from the U.S. Department of Agriculture’s National Institute of Food and Agriculture to develop machine learning technology that could automate the ability of farmers to diagnose a range of major stresses in soybeans. The technology under development would use cameras mounted on unmanned aerial vehicles, or UAVs, to gather bird’s-eye views of soybean fields. A computer application would automatically analyse the photographs and alert the farmer to trouble spots.


“At its most basic, machine learning is simply training a machine to do something we do,” Singh said. “If you want to teach a child what a car is, you show them cars. That’s what we’re doing to train computer algorithms, showing them numerous images of various soybean stresses to identify, classify, quantify and predict stresses in the field.”

The research team has assembled an enormous data set of soybean images, some healthy and some undergoing stress and disease, which they then labelled. A computer program goes through the labelled images and builds algorithms that can recognise stress in new images. Singh said the machine learning program will be capable of recognizing a range of common soybean stresses, including fungal, bacterial and viral diseases, as well as nutrient deficiency and herbicide injury.
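The workflow described above, labelling images, letting a program learn from them, and then scoring new images, is standard supervised classification. The article does not say which model or framework the team uses, so the following is only an illustrative sketch: it stands in random arrays for soybean photos, hypothetical healthy/stressed labels for the team's annotations, and a scikit-learn classifier for their algorithms.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical data set: 200 tiny 8x8 grayscale "images", flattened to
# 64-element vectors. Label 0 = healthy, 1 = stressed. Real inputs would
# be UAV photographs with labels assigned by the researchers.
healthy = rng.normal(loc=0.3, scale=0.1, size=(100, 64))
stressed = rng.normal(loc=0.7, scale=0.1, size=(100, 64))
X = np.vstack([healthy, stressed])
y = np.array([0] * 100 + [1] * 100)

# Train on the labelled images, as in the article: the program "goes
# through the labelled pictures" and fits a model that can score new ones.
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score a new, unlabelled image; a prediction of 1 would flag the spot.
new_image = rng.normal(loc=0.7, scale=0.1, size=(1, 64))
prediction = model.predict(new_image)[0]
```

In practice, teams working on this kind of imagery typically use convolutional neural networks rather than flattened pixel vectors, but the train-on-labels, predict-on-new-images loop is the same.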

The use of hyperspectral imaging, or cameras that capture wavelength ranges beyond those visible to the human eye, may enable the technology to predict the presence of stresses before symptoms even appear, giving farmers additional time to address the problem, she said.

Singh’s fascination with machine learning began in 2014 when she attended a seminar on the topic hosted by the ISU Plant Sciences Institute. She immediately thought the technology held promise for plant breeding and plant pathology, but a survey of the academic literature showed that the majority of work in the area came from engineering disciplines, not plant sciences. She realised more collaboration would be essential to advance this area in agriculture.

“We have to include plant scientists as well,” she said. “Otherwise, we’ll have engineers working on plant science problems. The collaboration among disciplines is what makes it possible.”

She helped to assemble an interdisciplinary team that created an app allowing smartphone users to take photos of soybean plants to determine whether the plants suffer from iron deficiency. Now, the research team aims to scale up their work from the original app, which requires manually taken photographs to diagnose a single stress, to algorithms capable of taking images from UAVs and identifying a range of stresses.


The future of the technology rests on the ability of scientists and engineers to assemble the right kind of data set and then develop the ability to analyse that data. By the end of the grant, Singh said, the team intends to have completed a framework of best practices for data collection using UAVs. That includes determining optimal image resolutions, as well as optimal heights and speeds for the UAVs. The researchers hope to seamlessly integrate data collection, curation and analysis, leading to its application in farm fields to detect and mitigate plant stresses in a timely manner. Singh said the team will make all their findings publicly accessible at the conclusion of the project.

The approach has the potential for application in many other crops as well, Singh said.
