Tuesday, November 5, 2024

Cerebras Enables Faster Training of the Industry's Largest AI Models


Collaboration with Dell Technologies Expands Cerebras’ AI Solutions and ML Expertise to Top Global Organizations

Cerebras Systems, a pioneer in accelerating generative artificial intelligence (AI), announced a collaboration with Dell Technologies to deliver groundbreaking AI compute infrastructure for generative AI. The collaboration combines best-of-breed technology from both companies to create a solution designed for large-scale AI deployments.

The Cerebras and Dell solution includes AI systems and supercomputers, white-glove LLM training, and ML expert services. The collaboration also introduces a new memory storage solution for Cerebras AI supercomputers, powered by Dell Technologies servers and AMD EPYC™ CPUs, enabling enterprises to train models orders of magnitude larger than the current state of the art. Leveraging the performance of AMD EPYC 9354P CPUs and Dell PowerEdge R6615 servers, Cerebras accelerates its unique approach to memory-intensive compute, extending its 82 TB streaming-memory supercomputer clusters nearly without limit to train models of any size.
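Memory-intensive compute of this kind generally means keeping model weights in a large external memory tier and streaming them to the accelerator one layer at a time, so trainable model size is bounded by cluster memory rather than by on-chip memory. The press release includes no code, so the following is only a minimal NumPy sketch of that layer-streaming idea under that assumption; the ExternalWeightStore class and its methods are invented for illustration and are not Cerebras or Dell software.

```python
import numpy as np

class ExternalWeightStore:
    """Hypothetical stand-in for a large external memory pool (e.g. the
    Dell/AMD-based storage tier described above). It holds every layer's
    weights; only one layer at a time is 'streamed' to the accelerator."""

    def __init__(self, layer_sizes, rng):
        # Pre-materialize each layer's weights in "external" memory.
        self.weights = [rng.standard_normal((n_in, n_out)).astype(np.float32)
                        for n_in, n_out in layer_sizes]

    def stream_layer(self, idx):
        # In a real system this would be a network/storage transfer;
        # here it simply returns the requested layer's weights.
        return self.weights[idx]

def forward(store, x, num_layers):
    """Run a forward pass while holding only one layer at a time."""
    for i in range(num_layers):
        w = store.stream_layer(i)    # fetch just this layer's weights
        x = np.maximum(x @ w, 0.0)   # simple ReLU MLP layer
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sizes = [(512, 512)] * 8                 # eight layers; could be many more
    store = ExternalWeightStore(sizes, rng)
    batch = rng.standard_normal((4, 512)).astype(np.float32)
    out = forward(store, batch, len(sizes))
    print(out.shape)                         # (4, 512)
```

Because the compute side never needs to hold the full set of weights, growing the model only requires growing the external memory tier, which is the role the Dell and AMD hardware plays in the described solution.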


“Our new collaboration with Dell is a turning point for Cerebras,” says Andrew Feldman, co-founder and CEO, Cerebras Systems. “This opens up our global sales distribution channels in a meaningful way, while providing customers with the additional AI hardware, software and expertise needed to enable full-scale enterprise deployments.”

Cerebras offers the world's fastest AI acceleration technology, enabling 880 times the memory capacity of GPUs, 97 percent less code to build large language models, push-button model scaling, and superior data preprocessing. Cerebras technology enables effortless compute and memory scaling, data parallelism for less debugging, and a simple structure that lowers total cost of ownership. Cerebras also offers a deep bench of machine learning experts who have contributed state-of-the-art open-source language models and datasets and published numerous research papers that advance generative AI.
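As a rough illustration of the claim that scaling can lean on data parallelism rather than on model- or pipeline-parallel partitioning code, the generic sketch below shards only the batch across workers and averages their gradients. It is a hypothetical NumPy example with arbitrary numbers, not Cerebras's API or the "97 percent less code" workflow itself.

```python
import numpy as np

def local_gradient(w, x_shard, y_shard):
    """Gradient of mean squared error on one worker's shard of the batch."""
    pred = x_shard @ w
    return 2.0 * x_shard.T @ (pred - y_shard) / len(x_shard)

def data_parallel_step(w, x, y, num_workers, lr=0.1):
    """Pure data parallelism: shard the batch, compute local gradients,
    average them, and apply one update. No model or pipeline partitioning."""
    x_shards = np.array_split(x, num_workers)
    y_shards = np.array_split(y, num_workers)
    grads = [local_gradient(w, xs, ys) for xs, ys in zip(x_shards, y_shards)]
    return w - lr * np.mean(grads, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -3.0, 0.5])
    x = rng.standard_normal((256, 3))
    y = x @ true_w
    w = np.zeros(3)
    for _ in range(200):
        w = data_parallel_step(w, x, y, num_workers=8)
    print(np.round(w, 3))   # approaches [ 2.  -3.   0.5]
```

The point of the pattern is that adding workers changes only how the batch is split, not how the model is written, which is the kind of simplification the "push-button model scaling" claim refers to.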

Source: Businesswire
