Meet Vinobot and Vinoculer, a duo that can visualize how plants adapt to their surroundings.
By Haniya Rae
In a cornfield in Missouri, two robots, one stacked on top of the other, file down the narrow rows. As they move, they collect information about the plants using various sensors—enough to create a 4-D graphic model on a computer. By building these models, scientists can show how plants react and adapt to their surrounding conditions. Someday, more robots like these might toil in cities and forests as well, helping humans determine how a plant species is responding to climate change.
“We wanted these robots to investigate different species of plants,” says Gui DeSouza, an associate professor of electrical engineering and computer science at the University of Missouri’s Vision-Guided and Intelligent Robotics Laboratory. “One plant may respond better to flood conditions, another to extreme heat. We’re essentially trying to correlate the plant’s phenotype, or the plant’s observable behavior during an environmental change, to its shape and physiology.”
DeSouza’s research as an engineer centers on deformable objects, such as plant leaves, and on devising ways to measure them. Leaves, he says, constantly move and sway, making their surface area and structure difficult to calculate. But with two robots collecting images, DeSouza says it’s possible to illustrate how plants adjust to their environment over time and to build visualizations of complex plant behavior that have traditionally been difficult to capture.
Vinobot, one of the two robots, carries sensors at three levels: near the ground, a few feet up at the midlevel of the plant, and at the top of the plant. At each level, Vinobot gathers information about light, humidity, and temperature. The top level also includes a quantum sensor that measures photosynthetically active radiation, or the amount of light the plants are able to use during photosynthesis.
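The three-level readings described above can be represented as a simple per-stop record. This is a minimal sketch for illustration only; the field names, units, and averaging step are assumptions, not the lab's actual data schema:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record for one Vinobot sensor level; field names and
# units are illustrative assumptions, not the lab's actual format.
@dataclass
class LevelReading:
    height_m: float                   # sensor height above ground
    light_lux: float                  # ambient light
    humidity_pct: float               # relative humidity
    temp_c: float                     # air temperature
    par_umol: Optional[float] = None  # photosynthetically active radiation
                                      # (quantum sensor, top level only)

def summarize(readings: list) -> dict:
    """Average temperature and humidity across the sensor levels,
    and pass through the top-level PAR reading if present."""
    n = len(readings)
    return {
        "mean_temp_c": sum(r.temp_c for r in readings) / n,
        "mean_humidity_pct": sum(r.humidity_pct for r in readings) / n,
        "par_umol": next((r.par_umol for r in readings
                          if r.par_umol is not None), None),
    }

# One sampling stop: near-ground, mid-canopy, and top-of-plant readings.
stop = [
    LevelReading(0.1, 12000, 68.0, 24.5),
    LevelReading(0.9, 30000, 61.0, 26.0),
    LevelReading(1.8, 52000, 55.0, 27.5, par_umol=1450.0),
]
print(summarize(stop))
```

Keeping each level as its own record preserves the vertical profile, so analyses can also compare conditions at the base of the plant against its canopy rather than only using averages.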
The other robot, Vinoculer, is a tower equipped with an arm carrying two fixed cameras, one of them infrared for detecting temperature changes over time. The arm rotates around the plants to create both a 3-D image of each plant and a 4-D model showing how heat is distributed across the plant at different times. Vinoculer can also scan the surrounding area, 30 to 60 feet in any direction, to identify plants of interest, such as ones that are struggling.
“We can measure leaf angles and the distance between leaves,” DeSouza says. “We can also measure seeding and the size of the corn—which suggests how that family of corn is responding.” Once the imagery is taken, it’s compiled and sent back to the lab, the Bradford Research Center, for further analysis. This might entail comparing older and newer 3-D models of the same plant to see how its growth is progressing.
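The comparison step described above can be sketched very simply: given two 3-D point clouds of the same plant, an older scan and a newer one, growth can be estimated from changes in overall extent. The point format and metric names here are assumptions for illustration, not the lab's actual pipeline:

```python
# Minimal sketch: compare two (x, y, z) point clouds of the same plant
# to estimate growth between scans. Assumes z is the vertical axis.

def extent(cloud, axis):
    """Span of the cloud along one axis (max minus min coordinate)."""
    vals = [p[axis] for p in cloud]
    return max(vals) - min(vals)

def growth_report(old_cloud, new_cloud):
    """Change in plant height (z extent) and canopy spread (x extent)."""
    return {
        "height_gain_m": extent(new_cloud, 2) - extent(old_cloud, 2),
        "spread_gain_m": extent(new_cloud, 0) - extent(old_cloud, 0),
    }

# Toy scans, in meters; a real scan would contain thousands of points.
old_scan = [(0.0, 0.0, 0.0), (0.2, 0.1, 0.6), (0.1, 0.2, 0.9)]
new_scan = [(0.0, 0.0, 0.0), (0.3, 0.1, 0.8), (0.2, 0.3, 1.2)]
print(growth_report(old_scan, new_scan))
```

A real pipeline would first align the two scans to the same coordinate frame and could then track finer quantities, such as the leaf angles and leaf spacing DeSouza mentions, but the idea is the same: difference two dated 3-D models of one plant.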
“What we’re really studying is how climate change is changing migration patterns and temperatures,” DeSouza says. A California orange orchard has already been in contact about using the robots to study orange trees, and he expects other farmers will want to build their own robots to gather plant information. The robots are cheaper to assemble than unmanned aerial vehicles, DeSouza says, and because they stay on the ground they don’t require Federal Aviation Administration approval, making them attractive options for studying vegetation. “Anywhere that a deformable object changes, any type of plant, Vinobot and Vinoculer can use our algorithms to model the object,” he says, “and show how different variables affect it.”