Thursday, December 19, 2024

TensorOpera and Qualcomm Technologies Join Forces to Empower AI Developers with Cutting-Edge Generative AI Solutions


TensorOpera, Inc., the company providing “Your Generative AI Platform at Scale,” has announced a technology collaboration with Qualcomm Technologies, Inc. to deliver solutions that enable artificial intelligence (AI) developers to build, deploy, and scale generative AI applications. By pairing the TensorOpera® AI Platform with Qualcomm® Cloud AI 100 inference solutions from Qualcomm Technologies, the collaboration will let developers harness Qualcomm Technologies’ advanced AI technologies directly on the TensorOpera AI Platform.

The rapid growth of powerful open-source foundation models, along with the availability of faster and more affordable AI hardware, has encouraged many enterprises—from startups to large companies—to develop their own generative AI applications, providing greater privacy, control, and ownership. However, many encounter challenges with complex generative AI software stacks, infrastructure management, and high computational costs for scaling and bringing their applications to production.

To help address these challenges, developers can look to the TensorOpera AI Platform, a comprehensive stack designed to simplify the complexities of generative AI development. With the Cloud AI 100’s ability to distribute intelligence from the cloud to the client edge, along with its industry-leading energy efficiency, portability, and flexibility, the TensorOpera AI Platform will be able to deliver exceptional performance per dollar and cost efficiency, making it an attractive choice for developers and enterprises.


AI developers now have the opportunity to access Cloud AI 100 instances on the TensorOpera AI Platform, designed to support popular generative AI models, including Llama3 by Meta and Stable Diffusion by Stability AI. They can choose from various usage models, including API access, on-demand (pay-as-you-go), or dedicated deployments, while leveraging capabilities such as autoscaling, comprehensive endpoint monitoring, optimized job scheduling, and AI agent creation.
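For developers weighing the API-access usage model described above, the call pattern for a hosted model typically amounts to a single authenticated HTTP request to an inference endpoint. The sketch below is illustrative only: the base URL, route, model identifier, and environment-variable name are assumptions for the sake of the example, not documented TensorOpera API details, so the platform’s own documentation should be consulted for the real endpoint and payload format.

    # Minimal sketch of calling a hosted Llama3 endpoint over HTTP.
    # The base URL, route, model identifier, and env-var name below are
    # illustrative assumptions, not documented TensorOpera API details.
    import os
    import requests

    API_BASE = "https://api.example-inference-host.com/v1"   # hypothetical base URL
    API_KEY = os.environ["INFERENCE_API_KEY"]                 # hypothetical key variable

    def generate(prompt: str) -> str:
        """Send a single prompt to a hosted model and return the generated text."""
        response = requests.post(
            f"{API_BASE}/chat/completions",                   # assumed OpenAI-style route
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={
                "model": "meta-llama/Llama-3-8b-instruct",    # hypothetical model identifier
                "messages": [{"role": "user", "content": prompt}],
                "max_tokens": 256,
            },
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(generate("Summarize the benefits of edge-cloud AI inference."))

The same request could be pointed at an on-demand or dedicated deployment by swapping the base URL; the pay-as-you-go and dedicated options mentioned above differ in billing and capacity, not in the basic request shape assumed here.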

Salman Avestimehr, co-founder and CEO of TensorOpera, expressed enthusiasm about the technology collaboration: “We are thrilled to work with Qualcomm Technologies. It expands the compute options for AI developers on our AI Platform. Our work together also aligns with our shared long-term vision of integrated edge-cloud platforms, which we believe will drive widespread adoption of generative AI. In line with this vision, TensorOpera will soon launch its new foundation model optimized for smartphones and edge devices. Integrated into the TensorOpera AI Platform, this model enables the development of powerful AI agents directly on mobile devices—a field where Qualcomm has significantly invested by delivering high-performance, efficient compute chips for smartphones.”

“With the explosion of new generative AI models, developers around the world are hungry for easy, effective access to high-performance AI inference for deployment,” said Rashid Attar, Vice President, Cloud Computing, Qualcomm Technologies, Inc. “By combining TensorOpera’s AI Platform with Qualcomm Technologies’ Cloud AI 100, developers now have immediate access to deploy the most popular GenAI/Large Language Models – Llama3, Mistral, SDXL – at the push of a button. We are excited to collaborate with TensorOpera to deliver a high-performance inference platform that offers exceptional value and convenience to developers.”

Source: BusinessWire
