Sunday, December 22, 2024

Groundlight Launches Open-Source ROS Package, Revolutionizing Embodied AI for Robotics


Groundlight, a pioneer in visual AI solutions, announced the release of its open-source ROS package, accelerating the development of embodied AI in robotics. This innovative tool enables ROS2 developers to effortlessly incorporate advanced computer vision capabilities into their projects. By merging machine learning with real-time human oversight, Groundlight’s ROS package makes robots more perceptive and adaptable to real-world environments.

The classic computer vision (CV) workflow has long been a bottleneck in developing robust robotic systems. The conventional method is time-consuming and labor-intensive: gather a comprehensive dataset, meticulously label each image, train a model, evaluate its performance, and then iteratively refine the dataset and model to handle edge cases. This can take months for each use case. Even after all that effort, when robots encounter situations outside their training set, they can act unpredictably, even dangerously, and fixing that requires developers to redo much of the model development process.

Groundlight’s open-source ROS package revolutionizes this approach by offering fast, customized edge models that run locally, tailored to each robot’s specific needs. Backed by automatic cloud training and 24/7 human oversight, these models let robots simply pause and await human guidance when faced with unfamiliar situations, enabling real-time adaptation to unexpected scenarios. Human-verified responses typically arrive in under a minute and are instantly trained back into the model and pushed down to the edge, improving safety and reliability while dramatically speeding up the development process.


“Our ROS package gives reliable eyes to embodied AI systems,” said Leo Dirac, CTO of Groundlight. “Modern LLMs are too slow and expensive for direct robotic control, and often fail at simple visual tasks. We combine fast edge models with human oversight, enabling robots to see and understand their environment efficiently and reliably.”

The Groundlight ROS package allows developers to ask binary questions about images in natural language. Queries are first processed by the current ML model, with high-confidence answers provided immediately. Low-confidence cases are escalated to human reviewers for real-time responses. This human-in-the-loop approach ensures reliability while continuously improving the underlying ML model without manual retraining cycles.
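The escalation logic described above can be sketched in plain Python. This is an illustrative model of the human-in-the-loop pattern, not Groundlight's actual API: the names (`answer_query`, `model_predict`, `ask_human`) and the 0.9 threshold are assumptions made for the sketch.

```python
from dataclasses import dataclass

# Assumed threshold for this sketch; a real system would make it configurable.
CONFIDENCE_THRESHOLD = 0.9

@dataclass
class Answer:
    label: str        # "YES" or "NO" (binary queries)
    confidence: float
    source: str       # "model" or "human"

def model_predict(query: str, image) -> Answer:
    # Stand-in for the fast edge model's inference on the image.
    return Answer(label="NO", confidence=0.55, source="model")

def ask_human(query: str, image) -> Answer:
    # Stand-in for escalating the image and question to a human reviewer.
    return Answer(label="YES", confidence=1.0, source="human")

def answer_query(query: str, image) -> Answer:
    """Route a binary natural-language query: trust the edge model when it
    is confident, otherwise escalate to a human reviewer (whose verified
    answer would also be fed back to retrain the model)."""
    answer = model_predict(query, image)
    if answer.confidence >= CONFIDENCE_THRESHOLD:
        return answer
    return ask_human(query, image)

result = answer_query("Is the conveyor belt jammed?", image=None)
print(result.source)  # model confidence 0.55 < 0.9, so this escalates: "human"
```

The key design point is that the caller always gets an answer from the same interface; whether it came from the edge model or a human reviewer is an implementation detail.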

Robotics pioneer Sarah Osentoski, Ph.D., commented, “Groundlight’s ROS package is a game changer for teams building robotic systems for unstructured environments. It makes human fallback simple, and automatically incorporates exception handling into ML models, improving efficiency over time.”

This release marks a significant milestone in robotics and computer vision. By combining the speed of machine learning with the reliability of human oversight, Groundlight enables developers to easily create more intelligent, adaptive robotic systems. Whether for industrial automation, research, or innovative applications, this node paves the way for the next generation of visually aware robots.

Groundlight is a leading innovator in visual AI solutions, dedicated to making computer vision more accessible and reliable for robotics and automation applications. By combining cutting-edge machine learning with human intelligence, Groundlight empowers developers to create smarter, more adaptable systems that thrive in real-world environments.

SOURCE: Businesswire
