AI is game changing for ag, and the intelligence is inside

Artificial intelligence in agriculture is at the technology farmgate with machines that think like farmers.
WESTERN PRODUCER — Artificial intelligence is on the cusp of causing a sea change in agriculture that promises to quickly challenge conventional crop-production and farm management techniques.

Many components required to build autonomous, smart agricultural equipment for vegetable and grain production in North America are already proven technologies.

Sensors including cameras, lidar and radar, as well as components that enable the electrification of machines, such as hydraulic pumps, batteries and control systems, are also far more available now than they were a few years ago.

The rapid increase in component options available to OEMs, short-line and start-up equipment manufacturers suggests competitive pricing pressure will be an early feature of the autonomous farm equipment market.

Most agricultural systems available today that can perform tasks semi-autonomously, from autosteer to spot-spraying systems such as Weed-It or John Deere’s See and Spray Select, are based on straightforward, if complex, computer programming.

For instance, the spot sprayers mentioned above use cameras to identify plants in a field and then send commands to individual nozzles so that only the areas where plants were found are sprayed.
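
In rough terms, the control loop behind such a system is simple. The following Python sketch shows the idea, assuming a hypothetical detector that returns plant positions with confidence scores; the function and parameter names are our own illustration, not any manufacturer’s software.

```python
# Hypothetical sketch of the camera-to-nozzle logic described above.
# Names and the confidence threshold are illustrative, not a vendor's API.

def control_nozzles(frame, nozzle_zones, detector, threshold=0.5):
    """Open only the nozzles whose boom zone contains a detected plant."""
    detections = detector(frame)  # list of (x, y, confidence) plant hits
    commands = []
    for nozzle_id, (x_min, x_max) in enumerate(nozzle_zones):
        hit = any(
            x_min <= x < x_max and conf >= threshold
            for x, _, conf in detections
        )
        commands.append((nozzle_id, "OPEN" if hit else "CLOSE"))
    return commands
```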

These systems cannot differentiate between a weed and a crop plant, but this will change quickly because the AI and machine learning strategies that drove the digital revolution in other sectors are being adapted for agriculture.

For instance, Bilberry is a tech startup founded in 2015 by Guillaume Jourdain, Hugo Serrat and Jules Beguerie, who developed an AI-based system to drive green-on-green spot spraying with the help of technology developed for autonomous automobiles.

Green-on-green spot spraying means finding and spraying individual weeds within a growing crop.

“At that time (2015) it was really the beginning of artificial intelligence embedded in vehicles, in a very broad way. Before, it was very difficult to solve all the technological challenges that exist for spot spraying. But with AI and the rise of embedded systems, really we were at the right time to work on this and try to finish the solution,” Jourdain said.

The Bilberry system has a camera every three metres on the boom. Each camera has a dedicated processor that sends the information to a central computer in the sprayer’s cab.

“From there we send the information to the nozzles to open them and close them in real-time; so individual nozzle control. Obviously, we are also linked to the GPS so we also have a section control that’s working,” Jourdain said.

“Normal speed for us is about 20 km-h. It can be a bit faster, but 20 k is where we are very comfortable.”
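
That speed matters because it sets the computing deadline. A back-of-envelope calculation, assuming (hypothetically) that the cameras see about two metres ahead of the nozzles, shows how little time the system has to detect a weed and open a valve:

```python
# Back-of-envelope latency budget for real-time nozzle control.
# The two-metre camera-to-nozzle look-ahead is an assumed figure.

speed_kmh = 20.0
look_ahead_m = 2.0          # assumed distance the camera sees ahead

speed_ms = speed_kmh / 3.6  # about 5.6 m/s ground speed
budget_s = look_ahead_m / speed_ms

print(f"Detection plus actuation must finish within {budget_s * 1000:.0f} ms")
# Roughly 360 ms: image capture, per-camera inference, the cab computer's
# decision and the valve actuation all have to fit in this window.
```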

Training machine learning algorithms is a long and tedious process.

Bilberry started by driving fields with sprayers and four-wheel drive vehicles equipped with cameras.

The images were then labelled by manually identifying the plants in each photo, after which the AI training process could begin.

“AI training means basically showing the labelled images to the algorithm several hundred or thousands of times so that it can start learning what the weeds are, what the crop is, and then in a new situation it will be able to say, ‘OK, that’s a weed or that’s a crop’,” Jourdain said.
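
In code, the training loop Jourdain describes looks roughly like the following PyTorch sketch. The model, data loader and class labels are placeholders for illustration, not Bilberry’s actual pipeline.

```python
# Minimal supervised-training sketch of the process described above:
# labelled images are shown to the network over and over until it
# learns to separate weed from crop. Model and loader are placeholders.
import torch
import torch.nn as nn

def train(model, loader, epochs=10):
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):            # many passes over the same images
        for images, labels in loader:      # labels: 0 = crop, 1 = weed
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()                # nudge weights toward the labels
            opt.step()
    return model
```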

He said Bilberry’s spot spray system is effective at taking broadleaf weeds out of cereal crops, with a 90 percent average hit rate on weeds, while using a fraction of the chemical required for blanket applications.

The company continues to train its algorithms to improve their ability to differentiate different kinds of weeds in many crops, including canola, but the system is already commercially available in parts of Australia.

Bilberry is working with multiple sprayer manufacturers, including Europe’s Agrifac and SwarmFarm Robotics, an Australian start-up that sells small autonomous robots that can be used for multiple applications.

Bosch and BASF’s new joint venture, called Bosch BASF Smart Farming, will offer its AI-based green-on-green smart spraying technology in Brazil by the end of the year, and plans to expand the service to North America.

An American startup called FarmWise builds an autonomous weeding robot for the vegetable industry that detects every plant in a field, both weed and crop, and then onboard computers send instructions to the robotic weeding arms.

FarmWise spent years developing in-house AI algorithms purpose-built for detecting crops and weeds.

“We rely on deep learning algorithms and a lot of data that we accumulated over the years to get to a very accurate decision-making process, in terms of what type of plant this is, where it’s located, and then a few other parameters that help us do a very good job at the service that we deliver,” said Sebastien Boyer, chief executive officer of FarmWise.

A Seattle-based startup has put AI in a 9,000-pound autonomous robot that uses a 72-horsepower Cummins diesel engine to power weed-blasting lasers.

Paul Mikesell, founder of Carbon Robotics, said the machine was built to manage weeds in vegetable row crops.

“There are eight lasers across. They are arranged in a fashion that’s parallel to the furrows. So if you imagine a row with our vehicle in it that’s driving forward, those lasers are arranged linearly pointing back to front. Then through some optics the targeting bounces that beam down at the bottom of the robot to target the weeds,” Mikesell said.

Mikesell’s background is in computer vision and deep learning, which he applied to help the robot differentiate weeds from crops.

“It’s a learning algorithm, so it’s a neural net that has many different layers to it. It runs on GPUs (graphics processing units), which originated for graphics processing and have now been used for other things, like cryptocurrency mining. We use them a lot in deep learning because they’re very fast at vector operations, things that can run in parallel, much like a human brain does,” Mikesell said.
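
To make that concrete, here is a minimal PyTorch example of the kind of parallel vector operation Mikesell mentions; a single line operates on millions of values at once, on a GPU when one is available.

```python
# Tiny illustration of why GPUs suit deep learning: one call operates
# on millions of values in parallel instead of looping one at a time.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.rand(10_000_000, device=device)
b = torch.rand(10_000_000, device=device)
c = a * b + 1.0   # one fused, massively parallel vector operation
print(c.shape, device)
```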

He said the learning procedure involves providing the algorithm with many sample images carrying labels that say what is in each image.

“By label I mean pixel locations that have a label associated with them. So this would be an onion, for example, and there’s an outline of an onion; or this is a weed that we want to shoot, and we’ll have the center of the meristematic growth plate of the weed that we’re shooting with the laser,” Mikesell said.
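
The two kinds of labels Mikesell describes, an outline for a crop and an aim point for a weed, might be represented in data structures like the following Python sketch. The field names are our own illustration, not Carbon Robotics’ format.

```python
# Illustrative label structures: a polygon outline for crops, and a
# single aim point (the meristem) for weeds the laser will target.
from dataclasses import dataclass

@dataclass
class CropLabel:
    species: str                      # e.g. "onion"
    outline: list[tuple[int, int]]    # polygon of (x, y) pixel coordinates

@dataclass
class WeedLabel:
    species: str                      # e.g. "purslane"
    meristem: tuple[int, int]         # (x, y) pixel the laser should hit

labels = [
    CropLabel("onion", [(104, 220), (131, 198), (150, 231), (118, 252)]),
    WeedLabel("purslane", (402, 77)),
]
```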

Once the neural net is given enough samples it will learn to differentiate weeds and crops.

“Now it can make inferences about things that it hasn’t seen before, and it can say, ‘oh that’s this kind of plant, I’ve seen that before, it’s not an exact copy but I know that’s an onion. Or I’ve seen that before, it’s not an exact copy, but I know that’s a purslane, which is a type of weed, or lambs quarters, which is a type of weed.’ So it learns, and as we feed it more and more information it gets better and better.”

The processing and predictions are made in real-time by the on-board computer, which does not need broadband connectivity.

However, during the training process, the team must gather example imagery and upload it to computers that run the deep learning processes.

Before turning the laser-blasting robots loose on a field, Carbon Robotics scouts weeds to fine-tune the AI for that specific field.

“Sometimes we can deploy the exact same ones (neural nets) that we’ve had before. Sometimes there are some smaller tweaks, what’s generally referred to as fine-tuning,” Mikesell said.

“The procedure usually takes 24 to 48 hours from initial arrival (at the field) to getting a good neural net, good predictions for us. That’s assuming it’s a new field.”
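
Fine-tuning of this kind typically means starting from the already-trained network and retraining only a small part of it on imagery from the new field. A common version of the technique, sketched below in PyTorch under the assumption of a ResNet-style model with an `fc` head, freezes the learned feature layers and replaces the classification head; this is a generic illustration, not Carbon Robotics’ procedure.

```python
# Generic fine-tuning sketch: keep the pretrained feature layers,
# swap the final classifier, then train briefly on new-field imagery.
# Assumes a torchvision ResNet-style model exposing a .fc layer.
import torch.nn as nn

def fine_tune_for_field(pretrained_model, num_classes):
    for param in pretrained_model.parameters():
        param.requires_grad = False          # freeze learned features
    # replace the classification head so it can adapt to the new field
    pretrained_model.fc = nn.Linear(pretrained_model.fc.in_features,
                                    num_classes)
    return pretrained_model                  # then train on new images
```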

The examples described above are just a precursor to how AI will help autonomous agricultural equipment make real-time decisions.

There are also companies involved with environmental monitoring that offer AI-based products to help producers make decisions.

For instance, a California company has developed an insect-monitoring system that uses AI to classify insects by the sound they make.

The sounds insects make when in flight vary considerably between species, but it is difficult to use microphones in field sensors because of environmental noise.

So FarmSense developed a sensor that projects a curtain of light across the opening of a trap; when a bug flies through the light, it causes a very specific disruption pattern.

“We have a kind of microphone but it actually records sound bits with light, not the actual sound. We call it pseudo acoustic,” said Eamonn Keogh of FarmSense.

The sensor tracks the movement of an insect’s wing beats with a laser and a phototransistor array, and converts the disruption to a sound file, which FarmSense then processes with algorithms capable of identifying the bug associated with the sound.
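
The core of the pseudo-acoustic idea can be sketched in a few lines of Python: treat the phototransistor signal as audio and pull out its dominant wing-beat frequency with a Fourier transform. The frequency ranges in the toy classifier below are invented placeholders, not FarmSense data.

```python
# Sketch of the pseudo-acoustic idea: treat the light-disruption signal
# as audio and estimate the dominant wing-beat frequency with an FFT.
import numpy as np

def wingbeat_frequency(signal, sample_rate):
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[spectrum.argmax()]     # strongest periodic component, Hz

# A real classifier uses far more than the peak frequency; this is the
# simplest possible stand-in, with made-up frequency bands.
def crude_classify(freq_hz):
    if 150 <= freq_hz < 350:
        return "possible large, slow-winged insect"
    if 350 <= freq_hz < 1000:
        return "possible small, fast-winged insect"
    return "unknown"
```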

Keogh said the algorithms can extract more information than just how fast the wings are beating to differentiate specific bugs, but they had to be trained to do this.

He said that over the last few years the company built a large archive of insect sound data for the machine-learning algorithms to work from.

“For the last several years we’ve been taking insect larvae, putting them in a cage with our sensor, and letting them hatch. We watched them from birth to death, 24 hours a day, under different temperature and lighting conditions, air pressure, hungry versus not hungry and so forth,” Keogh said.

Beyond autonomous farming and environmental monitoring systems, AI-based tools that bring together data sets and then run compounding analytics on them will become essential for farmers.

Management decisions on Canadian farms will increasingly be made by software, said Greg Killian of EPAM, an international software engineering services company that works across many industries.

Killian said agriculture is under pressure from robust competition throughout the global food supply chain, and the efficiency gains advanced software can provide are at an early stage of the same cycle other sectors have already experienced.

He said retail, finance, transportation and life sciences are all much further along the digital revolution path than agriculture.

“They’ve had to adapt to similar pressures, pricing pressures, competitive pressures and things like that, which have forced them to become effectively software companies. If you look at Walmart or Target or the large banks, software has become a huge part of what they’re doing,” Killian said.

Many data sets, from demographic trends to global logistics, feed machine-learning algorithms that play a pivotal role in how these companies are managed.

Large grain-trading companies already work with powerful algorithms that crunch massive data sets from around the globe, including weather, logistics, and supply and demand metrics, to help them decide what market positions to take.

There are already companies offering programs that use historical production data to let growers play with a few parameters, including low and high rainfall projections and crop prices, to find optimal field-specific fertility rates and crop variety recommendations.
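
A toy version of such a scenario tool fits in a dozen lines of Python. The yield-response function and every number below are invented for illustration only, not any vendor’s agronomic model.

```python
# Invented scenario sweep: rank nitrogen rates against rainfall and
# price assumptions using a made-up yield-response curve.

def expected_profit(n_rate, rainfall_mm, crop_price, n_cost=1.2):
    yield_t = 2.0 + 0.015 * n_rate * (rainfall_mm / 300) - 0.00004 * n_rate**2
    return yield_t * crop_price - n_rate * n_cost

scenarios = [(250, 280), (350, 280), (250, 340), (350, 340)]  # (mm, $/t)
for rain, price in scenarios:
    best = max(range(0, 161, 10), key=lambda n: expected_profit(n, rain, price))
    print(f"rain={rain} mm, price=${price}/t -> best N rate ~{best} kg/ha")
```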

Farm management software will become increasingly sophisticated and powerful as more data sets are identified with relevant relationships to agronomic, marketing and purchasing decisions.

AI will play a big role in identifying these relationships.

On the research side, universities and government organizations around the world are examining how to use AI to make efficiency gains in agriculture.

In Canada, the Disease Risk Tool (DiRT1), built by Agriculture Canada in 2016, is being updated with the help of AI to include crops beyond canola.

DiRT1 combines information from satellites and user inputs into a web application that can be used to investigate the accuracy of crop-disease forecast models.

The prototype integrates geospatial data from Environment Canada on temperature and precipitation, and the web application allows users to provide information on seeding density and disease history in the field or the region.

Information from the annual crop inventory and soil moisture data are combined with estimates of the crop’s growth stage, derived using a method developed by A.U.G. Signals.

A.U.G. Signals has worked for more than three decades in signal, image and data processing and works with organizations in multiple sectors.

Zeeshan Ashraf, senior manager of strategy and business development at A.U.G., said the company has collaborated with the Canadian government since 2014 to use remote-sensing technology to create a tool for crop phenology estimation.

A.U.G. also developed an early-stage crop classification tool to help government agencies get an early glimpse of what farmers are growing in regions throughout the country.

“A.U.G.’s technologies are based on both the optical data from the optical sensors, the satellite sensors, and also the radar-based data,” Ashraf said.

Clouds and fog can prevent satellite cameras from collecting data at crucial points during the growing season, but these conditions do not block radar signals.

The RADARSAT Constellation Mission is a three-spacecraft fleet of Earth observation satellites operated by the Canadian Space Agency that began operating in 2019.

“This is a hundred percent Canadian technology. The farmers and the end users have day and night coverage of their crops on their fields for any purposes that they may need it for,” Ashraf said.

The new tool A.U.G. is building with Ag Canada, DiRT2, uses the company’s artificial intelligence and data analytics technology.

“Data from all these different sensors are aggregated, and then really the online-based tool will produce predictions and the models for the end-user where they would be notified about their crop condition, whether it is the right time to have the application of, for example pesticides,” Ashraf said.
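
A highly simplified illustration of that kind of aggregation follows. The inputs, weights and threshold in this Python sketch are our own assumptions, not the actual Agriculture Canada or A.U.G. model.

```python
# Toy disease-risk aggregation: combine weather, agronomic and field
# history layers into one score. All weights here are illustrative.

def disease_risk(temp_c, rain_mm_7d, seeding_density, had_disease_before):
    score = 0.0
    score += 0.3 if 18 <= temp_c <= 25 else 0.0     # pathogen-friendly temps
    score += min(rain_mm_7d / 50, 1.0) * 0.3        # recent moisture
    score += min(seeding_density / 100, 1.0) * 0.2  # dense canopy, humidity
    score += 0.2 if had_disease_before else 0.0     # field history
    return score                                    # 0 (low) to 1 (high)

if disease_risk(21, 42, 90, True) > 0.6:
    print("Notify grower: conditions favour disease; review spray timing")
```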

This is just the tip of the iceberg in how AI may be used to provide a hands-free prediction model to farmers, to help them make production and marketing decisions.

The EPAM Cloud Pipeline enables scientists and companies to build and run customized scripts and workflows required to support modelling and simulation, as well as machine learning.

The company has been involved with big data management for 14 years and has large platforms under its management, from drug research to video games including Fortnite, and it processes more than 100 petabytes of data and more than a billion messages and events every day.

Killian said the rate of change in agriculture resulting from high-level software strategies will be much faster than it was in industries further along the digital revolution path.

“There is more innovation to draw upon,” Killian said.

“Now, almost ubiquitously, genetic science, whether it’s for life sciences or for agriculture, is being done in high-performance computing, which uses graphics processors that came out of gaming,” Killian said.

For instance, a typical laptop processor today has around eight cores, whereas graphics cards now contain thousands of cores and provide processing power that would have been unthinkable a short time ago.

Killian said AI will continue to change agriculture through life science research, including genetic sequencing of crops and diseases, and through phenotyping, which produces massive amounts of visual data that needs to be analyzed.

Computer vision essentially means the digitization and interpretation of visual data, and Killian said it doesn’t matter whether the digital information came from a video game’s digital environment or from a camera on a sprayer, because algorithms do not care where the data originated.

“Similar to gaming there’s a lot of high-speed processing which is being done in real-time. So whether it’s an actual tractor moving in a field taking in visual information and then adapting an output to it, or if it’s Fortnite, for example, and it’s having to present a lot of visual data very quickly back to a player,” Killian said.
