Verdant runs on the NVIDIA Jetson edge AI platform and relies on TAO Toolkit’s transfer learning to boost model production by 5x.

Startup Sprouts Multitasking Farm Tool for Organics

Scott Martin | NVIDIA

It all started with two software engineers and a tomato farmer on a West Coast road trip.

Visiting farms to survey their needs, the three hatched a plan at an apple orchard: build a highly adaptable 3D vision AI system for automating field tasks.

Verdant, based in the San Francisco Bay Area, is developing AI that promises versatile farm assistance in the form of a tractor implement for weeding, fertilizing and spraying.

Founders Lawrence Ibarria, Gabe Sibley and Curtis Garner — two engineers from Cruise Automation and a tomato farming manager — are harnessing the NVIDIA Jetson edge AI platform and NVIDIA Metropolis SDKs such as TAO Toolkit and DeepStream for this ambitious slice of farm automation.

The startup, founded in 2018, is commercially deployed on carrot farms and in trials at apple, garlic, broccoli and lettuce farms in California’s Central Valley and Imperial Valley, as well as in Oregon.

Verdant plans to help with organic farming by lowering production costs for farmers while increasing yields and providing labor support. It employs the tractor operators, who are trained to manage its AI-driven implements. The company’s robots-as-a-service model, or RaaS, enables farmers to see metrics on yield improvements and reductions in chemical costs, and to pay by the acre for results.

“We wanted to do something meaningful to help the environment,” said Ibarria, Verdant’s chief operating officer. “And it’s not only reducing costs for farmers, it’s also increasing their yield.”

The company recently landed more than $46 million in series A funding.

Verdant also recently hired Frank Dellaert as its chief technology officer. Dellaert is recognized for using graphical models to solve large-scale mapping and 4D reconstruction challenges. A faculty member at Georgia Institute of Technology, he has led work at Skydio, Facebook Reality Labs and Google AI while on leave from the research university.

“One of the things that was impressed upon me when joining Verdant was how they measure performance in real time,” remarked Dellaert. “It’s a promise to the grower, but it’s also a promise to the environment. It shows whether we do indeed save from all the chemicals being put into the field.”

Verdant is a member of NVIDIA Inception, a free program that provides startups with technical training, go-to-market support, and AI platform guidance.

Companies worldwide — Monarch Tractor, Bilberry, Greeneye, FarmWise, John Deere and many others — are building the next generation of sustainable farming with NVIDIA Jetson AI.

 

Working With Bolthouse Farms

Verdant is working with Bolthouse Farms, based in Bakersfield, Calif., to help its carrot-growing business transition to regenerative agriculture practices. The aim is to farm more sustainably, including by reducing herbicide use.

Verdant is starting with weeding and expanding next into precision fertilizer applications for Bolthouse.

The computation and automation from Verdant have enabled Bolthouse Farms to understand how to achieve its sustainable farming goals, according to the farm’s management team.

 

Riding With Jetson AGX Orin

Verdant is putting the Jetson AGX Orin system-on-module inside tractor cabs. The company says that Orin’s powerful computing, together with the availability of ruggedized enclosures from vendors, makes it the only choice for farming applications. Verdant is also collaborating with Jetson ecosystem partners, including RidgeRun, Leopard Imaging and others.

The module enables Verdant to create 3D visualizations showing plant treatments for the tractor operator. The company uses two stereo cameras both for in-field inference and to gather data for training models on NVIDIA DGX systems running NVIDIA A100 Tensor Core GPUs back at its headquarters. DGX performance allows Verdant to train on larger datasets, improving model accuracy at inference time.
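To make the capture-and-infer loop concrete, here is a minimal sketch of how a pair of stereo cameras might feed a per-frame detector on the tractor. It is an illustration only, not Verdant’s software: the camera indices, the detect() helper and the idea of logging frames for later training are assumptions.

```python
# Minimal sketch of an on-tractor capture-and-infer loop (illustration only).
# Camera indices and the detect() helper are assumed; a real deployment would
# use the Jetson camera stack and an optimized inference engine.
import cv2

LEFT_CAM, RIGHT_CAM = 0, 1  # hypothetical device indices for one stereo pair


def detect(frame):
    """Placeholder crop/weed detector; returns a list of (x, y, w, h) boxes."""
    return []  # a trained model would run here


def run():
    left, right = cv2.VideoCapture(LEFT_CAM), cv2.VideoCapture(RIGHT_CAM)
    try:
        while True:
            ok_l, frame_l = left.read()
            ok_r, frame_r = right.read()
            if not (ok_l and ok_r):
                break
            boxes = detect(frame_l)  # the right view could feed stereo depth
            for x, y, w, h in boxes:
                cv2.rectangle(frame_l, (x, y), (x + w, y + h), (0, 255, 0), 2)
            # Frames and detections could also be saved here as candidate
            # training data for later refinement on DGX systems at base.
    finally:
        left.release()
        right.release()


if __name__ == "__main__":
    run()
```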

“We display a model of the tractor and a 3D view of every single carrot and every single weed and the actions we are doing, so it helps customers see what the robot’s seeing and doing,” said Ibarria, noting this can all run on a single AGX Orin module, delivering inference at 29 frames per second in real time.

 

DeepStream-Powered Apple Vision 

Verdant relies on NVIDIA DeepStream as the framework for running its core machine learning, helping power its detection and segmentation. It also uses custom CUDA kernels for the tracking and positioning portions of its work.
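For readers unfamiliar with DeepStream, the sketch below shows the general shape of a single-stream pipeline built from standard DeepStream GStreamer elements. It is a generic example, not Verdant’s pipeline; the input file and the nvinfer config path are placeholders.

```python
# Generic single-stream DeepStream-style pipeline (not Verdant's pipeline).
# The video file and the nvinfer config path are placeholders.
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "filesrc location=field_row.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! "
    "m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! "
    "nvinfer config-file-path=crop_weed_detector_config.txt ! "
    "nvvideoconvert ! nvdsosd ! fakesink"
)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
# Block until end-of-stream or an error, then shut the pipeline down.
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```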

Verdant’s founder and CEO, Sibley, whose postdoctoral research was in simultaneous localization and mapping, has brought this expertise to agriculture. It comes in handy for presenting a logical representation of the farm, said Ibarria. “We can see things, and know when and where we’ve seen them,” he said.

This is important for apples, he said. They can be challenging to treat, as apples and branches often overlap, making it difficult to find the best path to spray them. The 3D visualizations made possible by AGX Orin allow a better understanding of the occlusion and the right path for spraying.

“With apples, when you see a blossom, you can’t just spray it when you see it, you need to wait 48 hours,” said Ibarria. “We do that by building a map, relocalizing ourselves saying, ‘That’s the blossom, I saw it two days ago, and so it’s time to spray.’”
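In practice, that kind of rule can be pictured as a time-gated lookup against a geo-referenced plant map. The sketch below is a deliberate simplification, not Verdant’s SLAM or mapping stack; the coordinates, IDs and matching radius are assumptions for illustration.

```python
# Illustrative sketch of a time-gated spray decision against a plant map
# (a simplification; not Verdant's SLAM/mapping implementation).
import math
import time

WAIT_SECONDS = 48 * 3600   # spray no sooner than 48 hours after first sighting
MATCH_RADIUS_M = 0.05      # assumed radius for re-associating a mapped blossom

plant_map = {}             # blossom_id -> (x_m, y_m, first_seen_unix)


def observe(blossom_id, x, y, seen_at=None):
    """Record the first sighting of a blossom at field coordinates (x, y)."""
    if blossom_id not in plant_map:
        plant_map[blossom_id] = (x, y, seen_at or time.time())


def ready_to_spray(x, y, now=None):
    """Return IDs of mapped blossoms near (x, y) first seen >= 48 hours ago."""
    now = now or time.time()
    ready = []
    for blossom_id, (bx, by, first_seen) in plant_map.items():
        close = math.hypot(x - bx, y - by) <= MATCH_RADIUS_M
        if close and now - first_seen >= WAIT_SECONDS:
            ready.append(blossom_id)
    return ready


# Example: a blossom seen two days ago is now eligible for spraying.
observe("blossom_17", 12.40, 3.85, seen_at=time.time() - 49 * 3600)
print(ready_to_spray(12.41, 3.85))   # -> ['blossom_17']
```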

 

NVIDIA TAO for 5x Model Production

Verdant relies on NVIDIA TAO Toolkit for its model-building pipeline. The transfer learning capability in TAO Toolkit enables it to take off-the-shelf models and quickly refine them with images taken in the field. For example, this has made it possible to change from detecting carrots to detecting onions in just a day. Previously, it took roughly five days to build models from scratch that achieved an acceptable accuracy level.
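TAO Toolkit itself is driven by experiment spec files and a launcher CLI, which aren’t reproduced here. As a rough, framework-agnostic illustration of the underlying transfer-learning idea (start from a pretrained detector, swap its head for the new crop classes and fine-tune briefly on fresh field imagery), here is a PyTorch-style sketch; the model choice, class count and data loader are assumptions rather than what TAO actually ships.

```python
# Rough transfer-learning illustration in PyTorch (not TAO Toolkit itself):
# start from a pretrained detector, swap the classification head for the new
# crop classes, and fine-tune briefly on freshly collected field images.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # assumed: background, onion, weed

# Pretrained backbone and detection head (trained on COCO).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box predictor so it outputs the new classes.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()


def fine_tune(field_loader, epochs=5):
    """Fine-tune on a hypothetical DataLoader yielding (images, targets)."""
    for _ in range(epochs):
        for images, targets in field_loader:
            losses = model(images, targets)   # dict of detection losses
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```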

“One of our goals here is to leverage technologies like TAO and transfer learning to very quickly begin to operate in new circumstances,” said Dellaert.

While cutting model-building time by 5x, the company has also been able to hit 95% precision with its vision systems using these methods.
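For context, precision measures what share of a detector’s predictions are correct. The counts below are invented purely to show what a 95% figure means.

```python
# Precision = true positives / (true positives + false positives).
# These counts are made up solely to illustrate the 95% figure.
true_positives = 950     # detections that really were the target plant
false_positives = 50     # detections that were not
precision = true_positives / (true_positives + false_positives)
print(f"precision = {precision:.0%}")   # -> precision = 95%
```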

“Transfer learning is a big weapon in our armory,” he said.

 
