Saturday, September 28, 2024

SambaNova Unveils Sambaverse, Empowering Developers to Compare the Speed and Accuracy of Open Source LLMs


SambaNova Systems unveils Sambaverse, a unique playground and API where developers can test hundreds of open source large language models (LLMs) from a single endpoint and directly compare their responses for any given application.

With Sambaverse, developers can compare multiple open source LLMs, drawn from a curated list of some of the most popular models on Hugging Face, using a single query. Upon entering a prompt, Sambaverse queries each model simultaneously and displays the responses in real time, enabling developers to compare and contrast the speed and accuracy of each.
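As a rough illustration of this fan-out-and-compare workflow, the sketch below sends one prompt to several models and records each response together with its latency. The endpoint URL, header fields, payload keys, and model names are hypothetical placeholders, not the documented Sambaverse API.

```python
# Illustrative sketch of comparing several open source LLMs on one prompt.
# All endpoint, header, and field names below are placeholders (assumptions),
# not SambaNova's documented API.
import time
import requests

ENDPOINT = "https://example.sambaverse.invalid/api/predict"  # hypothetical URL
API_KEY = "YOUR_API_KEY"                                      # hypothetical credential
MODELS = [
    "Meta/llama-2-7b-chat",            # placeholder model identifiers
    "mistralai/Mistral-7B-Instruct",
    "google/gemma-7b-it",
]

def query_model(model_name: str, prompt: str) -> dict:
    """Send one prompt to one model and record wall-clock latency."""
    start = time.time()
    resp = requests.post(
        ENDPOINT,
        headers={"key": API_KEY, "modelName": model_name},  # placeholder header names
        json={"inputs": [prompt]},                          # placeholder payload shape
        timeout=60,
    )
    resp.raise_for_status()
    return {
        "model": model_name,
        "latency_s": round(time.time() - start, 2),
        "output": resp.json(),
    }

prompt = "Summarize the benefits of retrieval-augmented generation in two sentences."
for result in (query_model(m, prompt) for m in MODELS):
    # Print each model's latency and response side by side for comparison.
    print(f"{result['model']:<40} {result['latency_s']:>6}s")
    print(result["output"], "\n")
```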

“New base models like Llama2 from Meta, Mistral from MistralAI, and Gemma from Google are being developed by the open source community to create powerful expert models. However, the challenge is that developers don’t know which of these expert models, or which combination of expert models, is best for their use case. They can’t rely on the quantitative benchmarks that exist today because they don’t adequately represent a developer’s use cases, so actual use-case evaluation is the best way to choose which model is right for you,” said Kunle Olukotun, Co-founder and Chief Technologist at SambaNova Systems.


“While there are inference services to test a small number of these open source expert models, the future is multimodal. The diversity of models and the ability to evaluate them quickly, connect them to create workflows, and access them concurrently is how one achieves real-time inference. Until now, there wasn’t a solution on the market that offered this breadth of experts and the ability to concurrently evaluate these models in real time. Sambaverse solves a very visceral problem for the developer community,” Olukotun continued.

Sambaverse’s ability to compare multiple models in a single interface is made possible by its unique Composition of Experts (COE) model architecture, which runs best on SambaNova’s SN40L system, with its 3-tier memory and dataflow architecture. This approach, available within Samba-1, gives enterprises and governments the performance, scalability, privacy and access control they require.
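To make the Composition of Experts idea more concrete, here is a toy routing sketch: a lightweight router inspects a prompt and hands it to one of several specialist models behind a single interface. The expert names and keyword-based routing rule are illustrative placeholders, not SambaNova’s CoE implementation.

```python
# Toy sketch of the Composition of Experts (CoE) routing concept: one entry
# point, several specialist "expert" models, and a router that picks which
# expert answers a given prompt. Experts and routing logic are placeholders.
from typing import Callable, Dict

EXPERTS: Dict[str, Callable[[str], str]] = {
    "code":    lambda p: f"[code expert answers] {p}",
    "finance": lambda p: f"[finance expert answers] {p}",
    "general": lambda p: f"[general expert answers] {p}",
}

def route(prompt: str) -> str:
    """Choose an expert with simple keyword heuristics (placeholder logic)."""
    lowered = prompt.lower()
    if any(k in lowered for k in ("python", "function", "bug")):
        return "code"
    if any(k in lowered for k in ("revenue", "earnings", "invoice")):
        return "finance"
    return "general"

def answer(prompt: str) -> str:
    """Single entry point: route the prompt, then call the chosen expert."""
    return EXPERTS[route(prompt)](prompt)

print(answer("Write a Python function that reverses a string."))
```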

SOURCE: BusinessWire
