Friday, November 22, 2024

NVIDIA Announces Omniverse Microservices to Supercharge Physical AI

NVIDIA Omniverse Cloud Sensor RTX Generates Synthetic Data to Speed AI Development of Autonomous Vehicles, Robotic Arms, Mobile Robots, Humanoids and Smart Spaces

NVIDIA announced NVIDIA Omniverse Cloud Sensor RTX™, a set of microservices that enable physically accurate sensor simulation to accelerate the development of fully autonomous machines of every kind.

Sensors, which comprise a growing, multibillion-dollar industry, provide autonomous vehicles, humanoids, industrial manipulators, mobile robots and smart spaces with the data needed to comprehend the physical world and make informed decisions. With NVIDIA Omniverse Cloud Sensor RTX, developers can test sensor perception and associated AI software at scale in physically accurate, realistic virtual environments before real-world deployment — enhancing safety while saving time and costs.


“Developing safe and reliable autonomous machines powered by generative physical AI requires training and testing in physically based virtual worlds,” said Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA. “NVIDIA Omniverse Cloud Sensor RTX microservices will enable developers to easily build large-scale digital twins of factories, cities and even Earth — helping accelerate the next wave of AI.”

Supercharging Simulation at Scale

Built on the OpenUSD framework and powered by NVIDIA RTX™ ray-tracing and neural-rendering technologies, Omniverse Cloud Sensor RTX accelerates the creation of simulated environments by combining real-world data from videos, cameras, radar and lidar with synthetic data.
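Omniverse Cloud Sensor RTX is delivered as hosted microservices, and NVIDIA has not published their API in this announcement. Purely as an illustrative sketch of the OpenUSD layer the paragraph refers to, the snippet below uses the open-source pxr Python bindings to compose a minimal stage containing a camera prim of the kind a sensor-simulation pipeline would render from; the file name, prim paths and sensor placement are hypothetical.

```python
# Illustrative only: a minimal OpenUSD (pxr) scene that a sensor-simulation
# pipeline could render from. The Sensor RTX microservice API itself is not
# public in this announcement; paths and values here are hypothetical.
from pxr import Usd, UsdGeom

# Create a new USD stage (the OpenUSD container for the simulated scene).
stage = Usd.Stage.CreateNew("sensor_scene.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

# Root transform for the virtual environment.
world = UsdGeom.Xform.Define(stage, "/World")

# A camera prim standing in for one simulated sensor (e.g. a front-facing RGB camera).
camera = UsdGeom.Camera.Define(stage, "/World/FrontCamera")
camera.CreateFocalLengthAttr(24.0)             # focal length in mm (hypothetical)
camera.AddTranslateOp().Set((0.0, -5.0, 1.5))  # place the sensor 5 m back, 1.5 m up

stage.GetRootLayer().Save()
```

Radar, lidar and additional cameras would be expressed as further prims on the same stage, which is what would let the ray-traced renderer and the synthetic-data generation share a single scene description.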

Even for scenarios with limited real-world data, the microservices can simulate a broad range of conditions, such as whether a robotic arm is operating correctly, an airport luggage carousel is functional, a tree branch is blocking a roadway, a factory conveyor belt is in motion, or a robot or person is nearby.

Research Wins Drive Real-World Deployment

The Omniverse Cloud Sensor RTX announcement coincides with NVIDIA's first-place win at the Computer Vision and Pattern Recognition (CVPR) conference's Autonomous Grand Challenge for End-to-End Driving at Scale.

NVIDIA researchers’ winning workflow can be replicated in high-fidelity simulated environments with Omniverse Cloud Sensor RTX — giving autonomous vehicle (AV) simulation developers the ability to test self-driving scenarios in physically accurate environments before deploying AVs in the real world.

Source: NVIDIA
