June 20, 2024


Sapiens Digital

How to Build an AI Robot With Nvidia’s $399 Jetson Xavier NX


The tiny Nvidia Jetson Xavier NX looks a bit like a Raspberry Pi, but it’s far more powerful. The latest member of Nvidia’s expanding family of AI computing modules, the Jetson Xavier NX is about the size of a credit card and is designed to be mounted on a robot to serve as its brains. 

Don’t let its diminutive size fool you, though. The $399 device, now available as a kit for AI developers, has a powerful graphics processor built on Nvidia’s Volta GPU microarchitecture, with hundreds of processing cores that let it accept multiple streams of data and process them with separate neural networks simultaneously.

It’s the ideal platform for testing a complex robot, Nvidia says. With the Linux-powered Jetson Xavier NX development kit serving as its brains, the bot could have multiple cameras to detect the expressions of nearby humans, as well as microphones to listen to their commands and respond appropriately. 

Nvidia Jetson Xavier NX development kit

In total, the graphics processor in the Jetson Xavier NX can simultaneously decode four streams of 4K video at 30 frames per second (fps), or a whopping 32 simultaneous streams of full HD (1080p) video at 30fps. The system has 384 CUDA cores and 48 Tensor cores available to run the neural networks that process these streams. 
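Those two decode quotas can be sanity-checked with back-of-the-envelope arithmetic. The sketch below simply multiplies streams by resolution and frame rate; a real decoder's limits also depend on codec and bitrate, so this is only a rough comparison, not a spec:

```python
# Back-of-the-envelope pixel throughput for the two decode configurations
# quoted for the Jetson Xavier NX. Real decoder limits also depend on
# codec and bitrate, so treat these as rough figures only.

def pixels_per_second(streams, width, height, fps):
    """Total decoded pixels per second across all streams."""
    return streams * width * height * fps

four_k = pixels_per_second(4, 3840, 2160, 30)    # 4 x 4K @ 30 fps
full_hd = pixels_per_second(32, 1920, 1080, 30)  # 32 x 1080p @ 30 fps

print(f"4 x 4K @ 30 fps:     {four_k / 1e6:.0f} Mpixels/s")
print(f"32 x 1080p @ 30 fps: {full_hd / 1e6:.0f} Mpixels/s")
```

Notably, the two figures aren't equal (the 1080p configuration moves roughly twice as many pixels), a reminder that per-stream decode cost isn't purely a function of pixel count.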

The result is that a delivery robot powered by an Xavier NX can avoid collisions, plan its path down the sidewalk, identify objects, and respond to queries from humans—all at the same time. 

Cloud-Based Neural Nets for Earth-Based Robots

None of these capabilities are terribly revolutionary, of course. Delivery robots have been roaming the streets of large cities for a few years now, while others are making pizzas and patrolling grocery store aisles. But these sorts of bots typically belong to well-funded companies with lots of MIT-educated experts on hand to perfect their AI networks and troubleshoot them when they inevitably get tripped up by software bugs. 

With the Jetson Xavier NX and a little help from cloud-based computing, many more people can try their hands at building robots and other AI applications. The new module, like others in the Nvidia Jetson family, dispenses with the traditional monolithic model of AI engineering, in which teams of experts spend months or years building and fine-tuning multiple neural networks and then deploy them all at once. It’s the equivalent of needing to update your iPhone’s operating system every time there’s a new version of the Spotify app. 

Nvidia Jetson Xavier NX development kit

Instead, the Jetson Xavier NX lets developers create and refine multiple different neural networks to perform different tasks and deploy them whenever they’re ready. It’s a paradigm shift in the world of AI development, Nvidia says. Developers can create their applications either in the cloud, on a desktop PC, or on any Jetson developer kit and then “containerize” them for installation on a robot or wherever else they’re needed. 

More than 3,000 customers use Nvidia’s existing Jetson AI production modules, mostly the larger Jetson TX2, based on the company’s earlier Pascal GPU architecture. The company expects that many of them will migrate to the Jetson Xavier NX for future AI projects. 

Testing a Virtual Jetson Xavier NX Robot

Nvidia sent us the developer kit version of the Jetson Xavier NX to test out, along with a pre-written collection of neural networks that simulates what developers might create to power a robot. The development kit includes the Xavier NX itself as well as memory, USB ports, a microSD card slot, and other familiar accoutrements that turn the AI compute module into a full-fledged computer running Linux. 

Lacking the time or the expertise to actually build a physical robot with sensors and a drivetrain, I instead used Nvidia-provided sample video footage that approximates what a customer service robot might encounter as it makes its rounds. These robots need to identify humans, understand what a customer is asking, and provide useful answers, all of which require many cameras and sensors as well as the AI skills to analyze their inputs. The skills include gaze detection (to see when someone is looking at it), speech recognition, and natural language processing to understand and answer questions. 

The sample footage mimics a robot performing four such skills simultaneously. Once the Jetson Xavier NX is connected to an external monitor and running the demo, here’s what the virtual robot sees:

Screenshot of Nvidia Jetson Xavier NX sensor processing

The top-left quadrant detects people from four simultaneous camera feeds, identifying the number of people in each stream. The bottom-left quadrant is a neural network that can guess someone’s pose, so that it knows if a person is pointing at a specific product on a store shelf or motioning for the robot to follow it, for instance. The bottom-right quadrant figures out where people are looking—whenever someone looks at the robot, the boxes around the person’s eyes turn green, prompting the bot to ask if he or she needs assistance. 
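The gaze-triggered greeting in that bottom-right quadrant boils down to simple decision logic once a gaze-estimation network has done its work. The toy sketch below illustrates the idea; the `Detection` class, its angle fields, and the tolerance threshold are hypothetical stand-ins for whatever a real gaze model actually outputs, not part of Nvidia's demo:

```python
# Toy sketch of the gaze-trigger logic: when a detected person's gaze
# falls on the robot, queue up a greeting. The Detection fields and the
# 10-degree tolerance are hypothetical stand-ins for a real
# gaze-estimation network's output.

from dataclasses import dataclass

@dataclass
class Detection:
    person_id: int
    gaze_yaw: float    # degrees left/right of the camera axis
    gaze_pitch: float  # degrees above/below the camera axis

def is_looking_at_robot(d: Detection, tolerance_deg: float = 10.0) -> bool:
    """Gaze counts as 'on the robot' if both angles are near zero."""
    return abs(d.gaze_yaw) <= tolerance_deg and abs(d.gaze_pitch) <= tolerance_deg

def prompts_for_frame(detections):
    """Return greeting prompts for everyone currently looking at the robot."""
    return [f"Person {d.person_id}: may I help you?"
            for d in detections if is_looking_at_robot(d)]

frame = [Detection(1, 3.2, -1.5), Detection(2, 41.0, 5.0)]
print(prompts_for_frame(frame))  # only person 1 is looking at the robot
```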

Finally, the top-right quadrant simulates speech detection and natural language processing using the Google-developed BERT protocol. The demonstration has some preloaded topics that Nvidia provided, including NFL trivia and directions to the keynote address at the company’s annual developer conference. Ask a question using a USB microphone plugged into the Jetson Xavier NX dev kit, and the model will attempt to find the answer from within the pre-populated topic. 
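To get a feel for extractive question answering over a preloaded topic, here is a deliberately crude stand-in for the BERT model in Nvidia's demo (which runs a real transformer): it scores each sentence of the topic text by keyword overlap with the question and returns the best match with a rough confidence. The topic text and stopword list are my own illustrative choices:

```python
# Crude keyword-overlap question answering over a preloaded topic text,
# a toy stand-in for the BERT model in Nvidia's demo. It picks the
# sentence sharing the most question words and reports the overlap
# fraction as a rough "confidence".

import re

STOPWORDS = {"what", "does", "is", "the", "a", "of"}

def norm(word):
    """Very naive stemming: strip a trailing 's' so 'delivers' ~ 'deliver'."""
    return word[:-1] if word.endswith("s") else word

def answer(question, topic_text):
    q_words = {norm(w) for w in re.findall(r"\w+", question.lower())
               if w not in STOPWORDS}
    sentences = re.split(r"(?<=[.!?])\s+", topic_text.strip())
    best, best_score = "", 0.0
    for s in sentences:
        s_words = {norm(w) for w in re.findall(r"\w+", s.lower())}
        score = len(q_words & s_words) / len(q_words) if q_words else 0.0
        if score > best_score:
            best, best_score = s, score
    return best, best_score

topic = ("PCMag.com is a leading authority on technology. "
         "PCMag delivers labs-based, independent reviews of the latest "
         "products and services.")
ans, conf = answer("What does PCMag deliver?", topic)
print(ans, f"(confidence {conf:.0%})")
```

A production system replaces the keyword matching with a transformer that reads the question and topic jointly, but the interface (question in, answer span plus confidence out) is the same shape.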

Tweaking Neural Networks in Real Time

To make things more interesting and simulate real-time updates to one neural network while the others are still running, I ditched Nvidia’s BERT sample and created a new topic by copying the boilerplate text found at the bottom of every page on PCMag.com, including the one you’re reading now. I was able to create and deploy the new topic while the Jetson Xavier NX was busy crunching away at the other three quadrants, detecting humans, gazes, and gestures. Talk about multitasking!

The update turned out to be a success. When I asked “What does PCMag deliver?” the virtual robot responded with, “Labs-based, independent reviews of the latest products and services.” It was only 46 percent confident in its response—perhaps because I left out the “.com” part—but it needn’t have worried. 

Nvidia AI demo with PCMag topic

This simple demo is child’s play compared to sophisticated commercial robots like Temi or finely optimized commercial voice assistants like Siri or Alexa. But similar demos can be downloaded from the internet, installed on the Jetson Xavier NX, and customized, allowing anyone with basic programming skills and a few hundred spare dollars to dabble in the world of intelligent robots.
