Scientists at the Department of Energy’s Oak Ridge National Laboratory have demonstrated an autonomous robotic field monitoring, sampling and data-gathering system that could accelerate understanding of interactions among plants, soil and the environment.
The challenge of enhancing biomass productivity and carbon storage in bioenergy plantations requires a better understanding of how carbon cycles between soils, plants and the atmosphere. Soil carbon can be difficult to measure precisely. Current estimates place the amount of carbon locked away in soils worldwide at trillions of tons, or more than three times the amount in the atmosphere.
To speed data-gathering for better predictions of plant responses and soil carbon, ORNL biologists, ecologists, robotics engineers and computational scientists collaborated to build and demonstrate a first-of-a-kind prototype autonomous system that can monitor vegetation and the environment, sample soils and transfer real-time data back to the lab.
The team developed a custom robotic platform, called the Sensors, Machine vision, Automation and Robotics for Transforming Plants Field Series (SMART Plant F-Series), and successfully demonstrated it at ORNL’s SMART field site where researchers are growing poplar plants as feedstock candidates for making biofuels. The system was designed with unique navigation features, such as GPS waypoints and a laser sensing system, to help the robot navigate around plants without disturbing them.
“We’re getting better at characterizing aboveground phenotypes—the structure and chemistry of plants that you see above the soil. We have over 250,000 different aboveground phenotypes collected of poplar alone through our DOE investments in bioenergy research,” said project lead Udaya Kalluri, senior staff scientist in ORNL’s Biosciences Division. “But we have only a handful of belowground phenotypes. It’s not easy to characterize what’s going on with root-microbe interactions, and carbon, water and nutrient transformations in the soil beneath plants.”
Improving lab-to-field connections with robotic systems
“One of our goals with this project is to make connections between the laboratory and the field seamless—to order, for instance, automated sampling at a specific coordinate where we want to learn more and resolve asymmetries between above- and belowground data,” Kalluri added. “Our self-guided system is then automatically deployed to those coordinates to gather samples that can inform researchers about plant productivity and soil characteristics.”
The field sampling system can work in parallel with the Advanced Plant Phenotyping Laboratory, or APPL, at ORNL—a robotic greenhouse that moves plants through a unique system of imaging stations to measure plant properties. APPL is already providing insights such as confirming a link between a plant gene and greater biomass, analyzing the link between microbes and heat tolerance in plants, and showing how soil fungus colonizing roots can influence plant growth. APPL is adding an underground imaging station in 2025 that will further expand its capabilities to image and analyze plant roots and soils.
“By developing and demonstrating SMART field-based automated data collection and custom robotic action, we are complementing what we’re doing with APPL and moving closer to our ultimate goal of enabling capabilities that improve lab-to-field performance predictions and accelerating the pace of both fundamental and applied research,” Kalluri said.
The team’s larger vision for the project is an interconnected ecosystem leveraging a fleet of robots and sensors to automatically gather and transmit data from geographically distributed observation sites such as bioenergy plantations or natural ecosystems. Data would inform real-time models and could be sent to an edge-, cloud- or high-performance computing resource to inform smart field management, Kalluri said.
Kalluri built upon her previous automation collaborations with ORNL’s Manufacturing Science Division to develop the robotics-based solution for field applications. Advanced manufacturing researchers Chris Masuo, Peter Wang and Andrzej Nycz worked together to adapt a commercial robot platform equipped with a navigation system that utilizes real-time kinematic positioning, which incorporates surveying principles to correct common global positioning errors.
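The differential principle behind real-time kinematic positioning can be illustrated with a short sketch. This is a simplified illustration only, with hypothetical function names and coordinates: production RTK operates on satellite carrier-phase measurements rather than finished position fixes, but the core idea is the same—a base station at a precisely surveyed location observes its own GPS error, and the rover subtracts that shared error from its raw fix.

```python
# Simplified sketch of the shared-error-cancellation idea behind real-time
# kinematic (RTK) positioning. A base station at a surveyed location measures
# the error in its own GPS fix and broadcasts that correction to the rover.
# All names and coordinates here are hypothetical.

def rtk_correct(rover_fix, base_fix, base_surveyed):
    """Apply the base station's observed error to the rover's raw fix.

    Each argument is a (latitude, longitude) tuple in degrees.
    """
    err_lat = base_fix[0] - base_surveyed[0]   # base's GPS error, latitude
    err_lon = base_fix[1] - base_surveyed[1]   # base's GPS error, longitude
    return (rover_fix[0] - err_lat, rover_fix[1] - err_lon)

# Example: the base reads 0.00002 degrees north of its true position, so the
# rover's raw fix is shifted south by the same amount.
corrected = rtk_correct(
    rover_fix=(35.93105, -84.31002),
    base_fix=(35.93002, -84.31001),
    base_surveyed=(35.93000, -84.31000),
)
```

Because atmospheric and satellite-clock errors are nearly identical for two receivers a short distance apart, subtracting the base station's observed error removes most of the rover's error as well, which is what pushes RTK accuracy from meters down to centimeters.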
Connecting bioscience with manufacturing expertise
At DOE’s Manufacturing Demonstration Facility, or MDF, at ORNL, researchers customized the remotely controllable robot by adding sensors, onboard software and an electromechanical system with 3D-printed components for collecting soil samples. The project marks the development of a more compact mobile platform for soil sampling, Masuo said.
To prove the concept, researchers demonstrated the robot’s accurate navigation through remotely transmitted GPS waypoints. A laser sensing platform called light detection and ranging, or LiDAR, helps the robot dodge obstacles. ORNL engineers outfitted the robot with an external frame equipped with augers, tools shaped like large metal drills. Microcontrollers manage lowering the augers so their rotation churns soil upward into a bucket. The robot can take up to four samples at once before returning independently to its home position, said Masuo, who developed the software as well as some hardware and electrical adaptations to the robot.
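The behavior described above—driving through a list of GPS waypoints, holding position when the LiDAR reports a nearby obstacle, and triggering the auger cycle at sampling sites—can be sketched as a simple mission loop. All names, thresholds and the simulated "robot" below are hypothetical, not the team's actual software:

```python
# Hypothetical sketch of a waypoint mission: visit each GPS waypoint in order,
# pause whenever the LiDAR reports an obstacle inside a safety radius, and run
# the auger sampling cycle at designated sampling waypoints.
import math

SAFETY_RADIUS_M = 1.0   # assumed obstacle stand-off distance
ARRIVAL_TOL_M = 0.2     # assumed waypoint-arrival tolerance

def distance_m(a, b):
    # Small-area flat-earth approximation (adequate within one field site).
    lat_m = (a[0] - b[0]) * 111_320
    lon_m = (a[1] - b[1]) * 111_320 * math.cos(math.radians(a[0]))
    return math.hypot(lat_m, lon_m)

def run_mission(waypoints, get_position, lidar_min_range, drive_toward, take_sample):
    """Visit each (lat, lon, sample_here) waypoint in order."""
    for lat, lon, sample_here in waypoints:
        while distance_m(get_position(), (lat, lon)) > ARRIVAL_TOL_M:
            if lidar_min_range() < SAFETY_RADIUS_M:
                continue  # obstacle too close: hold position and re-check
            drive_toward((lat, lon))
        if sample_here:
            take_sample()  # lower auger, churn soil into bucket, retract

# Simulated demo: the "robot" halves its distance to the target each step.
pos = [35.00000, -84.00000]
samples = []

def demo_drive(target):
    pos[0] += (target[0] - pos[0]) * 0.5
    pos[1] += (target[1] - pos[1]) * 0.5

run_mission(
    waypoints=[(35.00010, -84.00000, True)],
    get_position=lambda: tuple(pos),
    lidar_min_range=lambda: 5.0,      # no obstacles in this demo
    drive_toward=demo_drive,
    take_sample=lambda: samples.append(tuple(pos)),
)
```

On real hardware, `get_position` would come from the RTK-corrected GPS, `lidar_min_range` from the LiDAR driver, and `take_sample` would command the microcontrollers that lower and spin the augers.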
The technology supports efforts to study interactions between plants and the environment. Using robots for sampling could enable scientists and farmers to gather more data while focusing on analysis, Masuo said.
The project lays the foundation for future field research integrating automation with real-time computing for SMART monitoring or ecosystem management. Development of newer, more durable, multifunctional, self-powered sensors will further enable continuously gathering above- and belowground data on system performance and responses, supporting higher-precision process models and artificial intelligence foundation models in the future.
Additional sensors could be added to the robot to expand monitoring capabilities and deepen understanding of ecosystem health, as well as conditions that enhance productivity and resilience. For example, the LiDAR and visual cameras that are part of the unit could be used to check plant density and stem diameter to better understand plant growth, Masuo said. LiDAR essentially creates a 3D map, which can be combined with video footage afterward to understand surrounding environmental conditions. Productivity of the plants has also been monitored using pole-mounted LiDAR.
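One plausible way a LiDAR point cloud could yield a stem-diameter estimate, as Masuo describes, is to take a thin horizontal slice of returns at a fixed measurement height and measure the spread of points in that slice. The sketch below is a hypothetical illustration, not the team's method; a production pipeline would typically fit a circle or cylinder to the slice instead:

```python
# Hypothetical sketch: estimate stem diameter from a LiDAR point cloud by
# slicing the cloud at a fixed height and taking the largest pairwise spread
# of the returns in that slice.
import math

def stem_diameter(points, slice_z, slice_height=0.05):
    """Estimate stem diameter (same units as the points) from a point cloud.

    points: iterable of (x, y, z) returns; slice_z: measurement height.
    """
    ring = [(x, y) for x, y, z in points if abs(z - slice_z) <= slice_height / 2]
    if len(ring) < 2:
        return None
    # Diameter approximated as the largest pairwise distance in the slice.
    return max(
        math.hypot(ax - bx, ay - by)
        for i, (ax, ay) in enumerate(ring)
        for bx, by in ring[i + 1:]
    )

# Synthetic stem: 16 returns on a 2 cm-radius circle at z = 1.3 m,
# so the expected diameter is about 0.04 m.
pts = [(0.02 * math.cos(t), 0.02 * math.sin(t), 1.3)
       for t in [k * math.pi / 8 for k in range(16)]]
d = stem_diameter(pts, slice_z=1.3)
```

Repeating this per detected stem across the mapped field would give the density and diameter statistics mentioned above as indicators of plant growth.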
The robotic platform as well as sensors in the SMART field system could be enhanced, for instance, to facilitate research into early indicators of disease and responses to environmental stress or disturbances such as wildfire, as well as how plants are absorbing and storing atmospheric CO2.
This is the third collaboration between Kalluri and engineers from ORNL’s Manufacturing Science Division involving robotics applications. Previously, they worked together on two projects to develop automated monitoring and sampling systems for greenhouse research settings at ORNL, as the lab accelerates the transformation of plants into better bioenergy crops for biobased fuels, chemicals and materials, enabling a thriving bioeconomy.
“Through INTERSECT, I grow to understand needs I wouldn’t see because they are outside my area of study,” Nycz said. “When researchers from different fields are exposed to those needs and what is possible—that’s where the new ideas are being formed.”
“It’s been impressive how well this cross-cut team has worked together to reach the demonstration point in under a year,” Kalluri said. “We’ve tied together some very cool expertise that can make a difference in enhancing our understanding of the processes at play in both bioenergy plant productivity and carbon storage.”
Citation:
Autonomous robotics, sensors and advanced computing can now help with harvesting plant data (2024, December 18), retrieved 19 December 2024 from