Tuesday, November 5, 2024

DDN Storage Solutions Deliver 700% Gains in AI and Machine Learning for Image Segmentation and Natural Language Processing

From Cancer Detection, Robotaxis, and Chatbots to Sentiment Analysis, DDN Leads the Way in Safe and Power-Efficient AI Adoption

DDN, the global leader in artificial intelligence (AI) and multi-cloud data management solutions, announced impressive performance results for its AI storage platform in the inaugural AI storage benchmarks released this week by the MLCommons Association. The MLPerf Storage v0.5 benchmark results confirm DDN storage solutions as the gold standard for AI and machine learning applications.

DDN’s entries cover the Image Segmentation and Natural Language Processing categories of the MLPerf Storage Benchmark. Using powerful single- and multi-node GPU configurations, DDN’s A3I® AI400X2 storage appliance scales effortlessly to deliver faster, more reliable data access while maximizing GPU utilization and delivering the highest efficiency for demanding AI workloads.

In an individual compute node evaluation, a single DDN AI400X2 NVMe appliance equipped with DDN’s EXAScaler® 6.2 parallel filesystem fully served 40 AI accelerators, delivering a remarkable throughput of 16.2 GB/s.1 In a multi-node configuration, the same DDN AI400X2 NVMe appliance nearly quadrupled its output, serving 160 accelerators across ten GPU compute nodes at a throughput of 61.6 GB/s.2 These results demonstrate 700% better efficiency on a per-storage-node basis compared with the competing on-premises solution submissions.
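
As a rough cross-check of the figures above, the short Python sketch below re-derives the per-accelerator throughput and the node-to-node scaling implied by the quoted numbers. The throughput values and accelerator counts are taken from this release; the "scaling efficiency" ratio is an illustrative back-of-the-envelope measure, not a metric defined by MLPerf.

# Illustrative arithmetic only, using the figures quoted in this release.
single_node_gbps, single_node_accels = 16.2, 40    # one AI400X2, one compute node
multi_node_gbps, multi_node_accels = 61.6, 160     # same appliance, ten compute nodes

# Throughput delivered per accelerator in each configuration.
per_accel_single = single_node_gbps / single_node_accels   # ~0.405 GB/s
per_accel_multi = multi_node_gbps / multi_node_accels      # ~0.385 GB/s

# Fraction of per-accelerator throughput retained when the accelerator
# count grows 4x (an illustrative ratio, not an MLPerf-defined metric).
scaling_efficiency = per_accel_multi / per_accel_single    # ~0.95

print(f"Per-accelerator, single node: {per_accel_single:.3f} GB/s")
print(f"Per-accelerator, multi node:  {per_accel_multi:.3f} GB/s")
print(f"Per-accelerator throughput retained: {scaling_efficiency:.0%}")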

“DDN’s cutting-edge data storage solutions fuel and accelerate GPUs in data centers and in the cloud, helping organizations develop better cancer detection methodologies, putting safe and reliable robotaxis on our roads and highways, and bringing to market more effective chatbots and virtual assistants to make our lives easier,” said Dr. James Coomer, SVP of Products at DDN. “We’re proud to lead the way in safe and power-efficient AI adoption, setting new standards for innovation and performance in the industry.”

The ability to power AI workloads, machine learning, and large language models at the highest levels of efficiency and scale, while minimizing power usage and data center footprint, is critical. With thousands of systems deployed on premises and in the cloud, DDN’s AI infrastructure storage systems are the solution of choice to power GPUs for the most demanding and innovative organizations in the world.

SOURCE: PRNewswire
