NVIDIA Announces Omniverse Microservices to Supercharge Physical AI

NVIDIA announced NVIDIA Omniverse Cloud Sensor RTX, a set of microservices that enable physically accurate sensor simulation to accelerate the development of fully autonomous machines of every kind.

Sensors, which comprise a growing, multibillion-dollar industry, provide autonomous vehicles, humanoids, industrial manipulators, mobile robots and smart spaces with the data needed to comprehend the physical world and make informed decisions. With NVIDIA Omniverse Cloud Sensor RTX, developers can test sensor perception and associated AI software at scale in physically accurate, realistic virtual environments before real-world deployment — enhancing safety while saving time and costs.

“Developing safe and reliable autonomous machines powered by generative physical AI requires training and testing in physically based virtual worlds,” said Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA. “NVIDIA Omniverse Cloud Sensor RTX microservices will enable developers to easily build large-scale digital twins of factories, cities and even Earth — helping accelerate the next wave of AI.”

Supercharging Simulation at Scale
Built on the OpenUSD framework and powered by NVIDIA RTX™ ray-tracing and neural-rendering technologies, Omniverse Cloud Sensor RTX accelerates the creation of simulated environments by combining real-world data from videos, cameras, radar and lidar with synthetic data.
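
For readers unfamiliar with OpenUSD, the sketch below shows in rough terms how a simulated environment can be authored as a USD stage using the open-source pxr Python API. It is purely illustrative: the file name sim_environment.usda and the scene contents are placeholders, and this is not the Omniverse Cloud Sensor RTX interface itself, which NVIDIA does not detail here.

```python
from pxr import Usd, UsdGeom, UsdLux, Gf

# Author a minimal simulated environment as an OpenUSD stage.
stage = Usd.Stage.CreateNew("sim_environment.usda")  # placeholder file name
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)
UsdGeom.SetStageMetersPerUnit(stage, 1.0)

world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# A large, flat cube standing in for a roadway or factory floor.
ground = UsdGeom.Cube.Define(stage, "/World/Ground")
ground.AddScaleOp().Set(Gf.Vec3f(50.0, 50.0, 0.05))

# A distant light so a ray tracer has illumination to integrate.
sun = UsdLux.DistantLight.Define(stage, "/World/Sun")
sun.CreateIntensityAttr(3000.0)

stage.GetRootLayer().Save()
```

A stage authored this way can then be rendered or extended by RTX-based renderers; the specifics of how the microservices consume such scenes are not covered in the announcement.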

Even for scenarios with limited real-world data, the microservices can simulate and test a broad range of conditions, such as whether a robotic arm is operating correctly, an airport luggage carousel is functional, a tree branch is blocking a roadway, a factory conveyor belt is in motion, or a robot or person is nearby.
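
As a rough illustration of how scenario variations might be generated when real-world captures are scarce, the sketch below layers a hypothetical base scene (road_scene.usda) with a randomized obstacle and writes out several USD variants. The asset paths, prim names and value ranges are assumptions for demonstration, not part of the NVIDIA product.

```python
import random
from pxr import Usd, UsdGeom, Gf

BASE_SCENE = "road_scene.usda"       # hypothetical base environment
BRANCH_ASSET = "assets/branch.usd"   # hypothetical obstacle asset

for i in range(10):
    # Compose a fresh stage that layers over the shared base scene.
    stage = Usd.Stage.CreateInMemory()
    stage.GetRootLayer().subLayerPaths.append(BASE_SCENE)

    # Place an obstacle and randomize its pose, emulating the
    # "tree branch blocking a roadway" case without a real capture.
    branch = UsdGeom.Xform.Define(stage, "/World/Obstacles/Branch")
    branch.GetPrim().GetReferences().AddReference(BRANCH_ASSET)
    branch.AddTranslateOp().Set(
        Gf.Vec3d(random.uniform(-1.5, 1.5), random.uniform(10.0, 40.0), 0.0))
    branch.AddRotateZOp().Set(random.uniform(0.0, 360.0))

    # Flatten the composition and write out one scenario variant.
    stage.Export(f"scenario_{i:03d}.usda")
```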

Research Wins Drive Real-World Deployment
The Omniverse Cloud Sensor RTX announcement comes at the same time as NVIDIA’s first-place win at the Computer Vision and Pattern Recognition conference’s Autonomous Grand Challenge for End-to-End Driving at Scale.

NVIDIA researchers’ winning workflow can be replicated in high-fidelity simulated environments with Omniverse Cloud Sensor RTX — giving autonomous vehicle (AV) simulation developers the ability to test self-driving scenarios in physically accurate environments before deploying AVs in the real world.

Ecosystem Access and Availability
Foretellix and MathWorks are among the first software developers receiving access to Omniverse Cloud Sensor RTX for AV development.

Omniverse Cloud Sensor RTX will also enable sensor manufacturers to validate and integrate digital twins of their sensors in virtual environments, reducing the time needed for physical prototyping.
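
As a minimal sketch of what one piece of a sensor digital twin could look like in OpenUSD, the example below encodes datasheet-style camera intrinsics on a UsdGeom.Camera prim. The numeric values and prim paths are hypothetical, and real sensor models (radar, lidar, rolling-shutter behavior, noise) would require far richer descriptions than this.

```python
from pxr import Usd, UsdGeom, Gf

# Encode a camera sensor's datasheet values on a USD camera prim.
stage = Usd.Stage.CreateNew("camera_twin.usda")  # placeholder file name
cam = UsdGeom.Camera.Define(stage, "/Sensors/FrontCamera")

cam.CreateFocalLengthAttr(8.0)          # focal length in mm (example value)
cam.CreateHorizontalApertureAttr(11.3)  # sensor width in mm (example value)
cam.CreateVerticalApertureAttr(7.1)     # sensor height in mm (example value)
cam.CreateClippingRangeAttr(Gf.Vec2f(0.1, 300.0))  # near/far clip in scene units

# Mount pose relative to the vehicle or robot origin.
cam.AddTranslateOp().Set(Gf.Vec3d(0.0, 1.2, 1.5))

stage.GetRootLayer().Save()
```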

SOURCE: NVIDIA 
