Stereolabs, a pioneer in vision-based sensing technology, is making its ZED SDK and ZED X stereo cameras accessible in NVIDIA Isaac Sim, a powerful simulation application for developing, testing and managing AI-based robots. Additionally, these cameras are now compatible with NVIDIA Isaac ROS.
Within the NVIDIA Isaac Sim platform, developers can now work with virtual versions of Stereolabs’ ZED X stereo cameras and get full access to the ZED SDK, Stereolabs’ stereovision software stack for building space-aware applications.
There are many advantages to using the new integration to virtually design and test autonomous robots:
- Hardware: Users now have access to an exact 3D model of the ZED X camera to test and validate the number of cameras required, camera positioning, resolution, frame rate and field of view.
- Software: Developers have access to the complete ZED SDK library, including depth sensing, localization and object detection, and can therefore simulate the end-to-end software stack (see the sketch after this list). They can take full advantage of simulation tools like NVIDIA Isaac Sim and gain significantly in reproducibility, scalability, complex scenario testing and data collection, which can lead to major improvements in overall software performance.
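Because the simulated ZED X is exposed through the same ZED SDK interface as a physical camera, application code does not need to change. The minimal Python sketch below illustrates the idea; it assumes the pyzed bindings are installed, and the stream address and port are hypothetical placeholders for however the simulated camera is exposed in a given setup, so treat it as a rough sketch rather than the official integration workflow.

```python
import pyzed.sl as sl

# Configure the SDK. In a simulated setup, the camera feed is assumed to be
# reachable as a network stream (address/port below are placeholders).
init_params = sl.InitParameters()
init_params.depth_mode = sl.DEPTH_MODE.ULTRA
init_params.coordinate_units = sl.UNIT.METER
init_params.set_from_stream("127.0.0.1", 30000)  # hypothetical stream endpoint

zed = sl.Camera()
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the (simulated) ZED camera")

# Enable positional tracking so the robot's pose can be queried.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())

runtime = sl.RuntimeParameters()
depth = sl.Mat()
pose = sl.Pose()

# Grab one frame, then retrieve the depth map and the tracked camera pose.
if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
    zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
    print("Depth at image center (m):",
          depth.get_value(depth.get_width() // 2, depth.get_height() // 2))

zed.close()
```

The same loop can then be pointed at real hardware by removing the stream configuration and letting the SDK open a physically connected ZED X, which is what makes simulation-first development practical here.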
Simulation is fast becoming a standard part of robot development. As demand for autonomous robots grows amid labor shortages and supply chain challenges, simulation platforms like NVIDIA Isaac Sim let developers design, develop and test hardware stacks faster and more safely, saving significant time and resources by iterating frequently without the time-consuming setup a physical test can require.
“Simulation is the future of robotics engineering,” said Cecile Schmollgruber, CEO of Stereolabs. “By using Stereolabs cameras integrated with NVIDIA Isaac Sim, developers and engineers will be able to experiment with stereovision in a virtual environment for the first time and build space-aware applications before ever attaching a real camera to a real robot. This capability will empower developers and engineers to design, develop and test autonomous robots in much faster cycles to save massively on resources and obtain the best results in the real world.”
SOURCE: PRWeb