Friday, November 22, 2024

AIC & Unigen Launch Ultra-Efficient AI Inference Server

AIC, a global leader in the design and manufacture of industrial-strength servers, has partnered with Unigen Corporation to launch the EB202-CP-UG, an ultra-efficient artificial intelligence (AI) inference server delivering over 400 trillion operations per second (TOPS). The server is built around the robust EB202-CP, a 2U Genoa-based storage server featuring a removable storage cage. By integrating eight Unigen Biscotti E1.S AI modules in place of standard E1.S SSDs, AIC offers a specialized AI configuration, the EB202-CP-UG: an air-cooled AI inference server whose exceptional performance-per-watt ratio translates into long-term cost savings.
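As a rough sanity check on the headline figure: each Biscotti module carries twin Hailo-8 accelerators (detailed further below), and Hailo publicly rates the Hailo-8 at roughly 26 TOPS, a figure assumed here rather than taken from the announcement. The minimal Python sketch below simply multiplies those numbers out.

```python
# Back-of-the-envelope estimate of aggregate inference throughput for the EB202-CP-UG.
# Assumption: ~26 TOPS per Hailo-8 accelerator (Hailo's published rating,
# not a figure stated in this announcement).

MODULES = 8                  # Unigen Biscotti E1.S AI modules
ACCELERATORS_PER_MODULE = 2  # twin Hailo-8 chips per module
TOPS_PER_ACCELERATOR = 26    # assumed per-chip rating

total_tops = MODULES * ACCELERATORS_PER_MODULE * TOPS_PER_ACCELERATOR
print(f"Estimated aggregate throughput: {total_tops} TOPS")
# -> 416 TOPS, consistent with the "over 400 TOPS" claim
```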

“We are excited to partner with AIC to introduce innovative AI solutions,” said Paul W. Heng, Founder and CEO of Unigen. “Their commitment to excellence in every product, especially their storage servers, made it clear that our AI technology would integrate seamlessly.”

Michael Liang, President and CEO of AIC, stated, “By collaborating with Unigen to carve out a technological niche in AI, we have successfully demonstrated an efficient, powerful, air-cooled server that aligns perfectly with our customers’ stringent requirements.”

The EB202-CP-UG is built on the Capella motherboard platform, which accommodates the AMD EPYC (Genoa) CPU. It features a daughter-card/baseboard designed specifically to route EDSFF signals from the E1.S modules, combined with 128GB of high-speed DDR5 memory and dual power-efficient modular power supplies. Driven by eight Unigen Biscotti E1.S AI modules, each equipped with twin Hailo-8 AI inference accelerators, the server achieves 21,500 frames per second (FPS) on the Resnet_V1_50 benchmark. Leveraging the AVX-512 support introduced with the AMD Genoa server CPU, the EB202-CP-UG can decode one hundred 720P video streams at 25 frames per second each while running AI analytics on every frame, with significant processing headroom to spare.
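To put the video-analytics claim in perspective, the sketch below compares the frame rate generated by one hundred 25 FPS streams against the server's reported Resnet_V1_50 throughput. It assumes the per-frame analytics workload is ResNet-class, which the announcement does not specify.

```python
# Headroom check for the video-analytics claim, using figures from the
# announcement: one hundred 720P streams at 25 FPS each, analyzed frame by
# frame, against ~21,500 FPS of inference throughput.
# Assumption: the per-frame analytics workload is comparable to Resnet_V1_50.

STREAMS = 100
FPS_PER_STREAM = 25
RESNET50_FPS = 21_500  # reported Resnet_V1_50 throughput

frames_per_second = STREAMS * FPS_PER_STREAM    # 2,500 frames/s to analyze
utilization = frames_per_second / RESNET50_FPS  # fraction of capacity used

print(f"Frames analyzed per second: {frames_per_second}")
print(f"Approximate inference utilization: {utilization:.0%}")
# -> roughly 12% of capacity, leaving substantial processing headroom
```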

When paired with an Ubuntu Linux OS and VMS/AI software from Network Optix, this AI inference server offers unparalleled safety, security, and peace of mind for the most discerning IT and security professionals.

The EB202-CP-UG AI Inference Server is available for purchase now. A demo of the EB202-CP-UG will be shown at AIC's booth (#915) at the Future of Memory and Storage (FMS) event from August 6th to 8th. Please contact AIC, and we'll connect you with your local AIC System Integrator to begin your AI journey.

Source: PRNewswire
