UAVs: Where dreams meet dust
By Amy Petherick
After using a UAV to collect aerial images, Abuleil’s online tool allows users to label them according to their interests; in this study, the label of interest was red clover ground cover. Photo by Ammar Abuleil, University of Guelph.
The dream of using Unmanned Aerial Vehicles (UAVs) for precision agriculture took off faster than many developers could realistically keep up with, but researchers at the University of Guelph are hoping to close some critical technical gaps.
The UAV equipment now commercially available is highly sophisticated, featuring a wide range of image sensors capable of collecting a vast amount of information. So much data, in fact, that it becomes very difficult to make use of it all, which is why Ammar Abuleil, a Master of Engineering student, is trying to teach these flying machines to produce something more than a pretty picture.
Under the direction of Dr. Graham Taylor, an expert in managing large data sets, and Dr. Medhat Moussa, who specializes in robotics, Abuleil has been creating an algorithm which filters the information collected by a UAV into a map that’s based on user-defined criteria. This would allow a farmer to upload the pictures taken of a field to the Internet and receive a colour-coded map back, indicating areas of weed infestation, flooding, canopy closure or any other label the farmer wanted to program into the model. For the purposes of developing the tool, Abuleil has been working on an assessment of red clover stands in wheat fields.
“What we’re trying to do is use remote sensing platforms and machine learning to try and make sense of what’s happening in the field without actually having to take samples,” Abuleil explains.
Abuleil worked with seven farms in the Guelph area, but only ended up collecting data from two because there were kinks to work out of the UAV’s system. In one particularly patchy 19-acre field, researchers asked the farmer to identify areas on a map of the field where the red clover stand achieved 100 per cent, 67 per cent, 33 per cent and 0 per cent ground cover. After that, the UAV flew over the field to collect visual data and the researchers collected 100 samples, each 50-by-50 cm, to verify the accuracy of the final image produced. Abuleil says that, using the classifications provided by the farmer as a scale and a very simple algorithm, the final image produced was 70 per cent accurate. With just a little more tweaking, Abuleil thinks he can improve those results to an accuracy rating of 80 per cent.
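To make the accuracy figure concrete, here is a minimal sketch of how a ground-truthing check like this might be scored: each ground sample's measured cover is snapped to the nearest farmer-defined class and compared against the class the map predicted. The four cover classes come from the article; the sample values and the nearest-class scoring rule are illustrative assumptions, not the study's actual data or method.

```python
# Illustrative ground-truthing score. The four classes (0, 33, 67, 100 per
# cent cover) are from the article; everything else here is assumed.

COVER_CLASSES = [0, 33, 67, 100]  # per cent ground cover, as labelled by the farmer

def nearest_class(measured: float) -> int:
    """Snap a measured per cent cover to the closest farmer-defined class."""
    return min(COVER_CLASSES, key=lambda c: abs(c - measured))

def accuracy(predicted: list[int], measured: list[float]) -> float:
    """Fraction of quadrats where the map's class matches the ground sample."""
    hits = sum(p == nearest_class(m) for p, m in zip(predicted, measured))
    return hits / len(predicted)

# Ten made-up 50-by-50 cm quadrats (the study used 100)
pred = [100, 67, 67, 33, 0, 100, 33, 0, 67, 100]
meas = [95.0, 70.0, 40.0, 30.0, 5.0, 98.0, 60.0, 2.0, 66.0, 80.0]
print(round(accuracy(pred, meas), 2))  # → 0.7
```

On this toy data the map agrees with the ground samples in 7 of 10 quadrats, the same kind of 70-per-cent figure the study reports at field scale.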
Although they focused on the red clover in the field, Abuleil says he has designed his algorithm to respond to any input reference so if the farmer wanted to assess the oilseed radish stand in that same field, it could do that too. “Because it’s a machine-learning algorithm, this program was not written specifically for this application, it can be applied to any application,” he explains. “So if a farmer, for example, circles ‘good moisture content’ and ‘bad moisture content’, then the algorithm will apply what the user circled and what the user labelled on the entire image.” The only thing that would limit what the machine could learn would simply be the quality of input data it collected. This is where the quality of the machine being used, and especially the quality of the sensors it can house, comes into play.
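The circle-and-label idea Abuleil describes can be sketched as a simple supervised classifier: the farmer's circled examples become training points, and every cell of the image inherits the label of the example it most resembles. The nearest-neighbour rule and the single "greenness" feature below are assumptions for illustration; the article does not specify the algorithm's features or internals.

```python
# Hedged sketch of user-labelled mapping: a farmer circles a few example
# regions ("good moisture content" / "bad moisture content"), and a
# nearest-neighbour rule spreads those labels across every grid cell of the
# image. The greenness feature and all values are illustrative.

def classify_cell(cell_feature: float, examples: list[tuple[float, str]]) -> str:
    """Give the cell the label of the closest circled example."""
    return min(examples, key=lambda ex: abs(ex[0] - cell_feature))[1]

def label_map(grid: list[list[float]],
              examples: list[tuple[float, str]]) -> list[list[str]]:
    """Apply the user's labels across the entire image grid."""
    return [[classify_cell(f, examples) for f in row] for row in grid]

# Farmer-circled examples: (mean greenness, label)
examples = [(0.8, "good moisture"), (0.2, "bad moisture")]

# A tiny 2-by-3 image grid of mean-greenness features
grid = [[0.9, 0.7, 0.3],
        [0.6, 0.25, 0.1]]
print(label_map(grid, examples))
```

Because nothing in the classifier is specific to moisture or to red clover, swapping the labels, as in the oilseed radish example, requires no change to the code; this is the application-independence Abuleil describes.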
The UAV model Abuleil has been using to conduct his research is a Precision Hawk Lancaster Platform, which was selected and purchased for the university upon the advice of Loblaw Chair of Sustainable Food Production, Ralph Martin. Martin says he selected the model particularly because of the sensing options it offered. “From a research perspective, we wanted to have options to use sensors with as much capacity as possible,” he explains.
Many of the other models he considered had fairly similar visual sensors, offering red-green and blue-green near-infrared sensors much like this UAV’s, and some also matched its thermal sensors and LIDAR, which measures elevation. “But the reason we decided to go with Precision Hawk is that they also have a hyper-spectral sensor with a range from about 400 nanometres to 1000 nanometres,” he said. Since the platforms and sensors don’t mix and match, at least not at the time of purchase, it was a clear advantage to go with the only company that had been able to miniaturize this sensor. According to Martin, this is where the real advances will be in making UAVs a viable technology for precision agriculture.
“With the basic visual sensors there are a few things you can do,” he says. “You’ll get a pretty quick estimation of whether or not there are things like deer damage; or in the spring you might want to get an idea of how wet it is, but that has limited use.”
Testing these new sensors to ensure they deliver what they promise, however, is an important part of developing the technology, work that Abuleil and his supervisors are contributing to. “Groundtruthing,” as they call it, demands significant hours of labour. But Martin is sure the work being conducted will be worth the wait.
“We have to keep testing until we’re confident that what we see from the sky is really what we can measure on the ground,” Martin emphasizes. “It takes a little more time than some people would like it to take, but we don’t want to oversell the potential of UAV technology because we feel that we still have a lot of research to do.”
Martin got involved with UAV research in the first place because he saw a clear fit with his mandate to engage in research activities most likely to increase the future sustainability of the agricultural industry. If given the time and resources to properly develop these new tools, he believes farmers could be well positioned to get just the right amount of crop inputs exactly where they’re needed, attaining economic and environmental benefit all around. Nicole Rabe, land resource specialist with the Ontario Ministry of Agriculture, Food and Rural Affairs, sees similar potential, if only problems in using the technology could be eliminated. For example, she says, development really needs to reach the point where most agronomists can receive real-time data from the UAVs. For most, uploading massive data files into a van and waiting a day or two for a map is still far less efficient than walking fields.
“The volume issue is going to go away very quickly with folks like Ammar and his supervisor Graham Taylor working on industry software developments around real-time processing of UAV imagery after the photos are acquired,” she says. “That’s probably going to go away before you and I know it.”
Rabe strongly believes the real power of UAV technology is the mapping element. There is a real difference between eye-balling everything and estimating from ground references as opposed to holding a map in hand that clearly identifies the exact boundaries of a trouble spot.
“We can have a continuous map of the entire crop, you can calculate acres, you can quantify product, based on the decisions we made on that image,” she says. “It’s another map tool that allows us to quantify what we’re doing, so maybe we don’t have to put fungicide on the whole field or maybe we only have to spread a micronutrient across part of the field because the UAV image brought the scout to the region of the field that needed the farmer’s attention.”
The environmental and economic benefit of using the imagery marries perfectly with precision ag philosophies. So although it may still take time to realize the full potential of UAV imagery as another precision agriculture tool, it does remain a goal worth working toward.