Monday, December 23, 2024

AIC & Unigen Launch Ultra-Efficient AI Inference Server


AIC, a global leader in the design and manufacturing of industrial-strength servers, in partnership with Unigen Corporation, has launched the EB202-CP-UG, an ultra-efficient Artificial Intelligence (AI) inference server delivering over 400 trillion operations per second (TOPS). The server is designed around the robust EB202-CP, a 2U Genoa-based storage server featuring a removable storage cage. By integrating eight Unigen Biscotti E1.S AI modules in place of standard E1.S SSDs, AIC offers a specialized AI configuration, the EB202-CP-UG: an air-cooled AI inference server with an exceptional performance-per-watt ratio that translates into long-term cost savings.

“We are excited to partner with AIC to introduce innovative AI solutions,” said Paul W. Heng, Founder and CEO of Unigen. “Their commitment to excellence in every product, especially their storage servers, made it clear that our AI technology would integrate seamlessly.”

Michael Liang, President and CEO of AIC, stated, “By collaborating with Unigen to carve out a technological niche in AI, we have successfully demonstrated an efficient, powerful, air-cooled server that aligns perfectly with our customers’ stringent requirements.”


The EB202-CP-UG is built on the Capella motherboard platform, which accommodates the AMD EPYC (Genoa) CPU. It features a daughter-card/baseboard designed specifically to route EDSFF signals from the E1.S modules, combined with 128GB of high-speed DDR5 memory and dual power-efficient modular power supplies. Driven by eight Unigen Biscotti E1.S AI modules, each equipped with twin Hailo-8 AI inference accelerators, the server achieves an impressive 21,500 frames per second (FPS) on ResNet_V1_50. Leveraging the AVX-512 instructions in the AMD EPYC (Genoa) CPU, the EB202-CP-UG can decode one hundred 720P video streams at 25 frames per second each while running AI analytics on every frame, with significant processing headroom to spare.
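
For readers who want to sanity-check those figures, the short Python sketch below reproduces the arithmetic behind the "over 400 TOPS" and "processing headroom" claims. The stream and FPS numbers come directly from the announcement; the ~26 TOPS per-chip rating for the Hailo-8 is the accelerator's published specification and is an assumption not stated in the release itself.

    # Back-of-envelope check of the throughput figures quoted above.
    # Assumes 16 Hailo-8 accelerators in total (8 modules x 2 per module) and
    # ~26 TOPS per Hailo-8 (vendor spec, not stated in the announcement).

    MODULES = 8                  # Unigen Biscotti E1.S AI modules
    ACCELERATORS_PER_MODULE = 2  # twin Hailo-8 per module
    TOPS_PER_HAILO8 = 26         # assumed per-chip rating

    aggregate_tops = MODULES * ACCELERATORS_PER_MODULE * TOPS_PER_HAILO8
    print(f"Aggregate compute: ~{aggregate_tops} TOPS")  # ~416 TOPS, i.e. "over 400 TOPS"

    streams = 100
    fps_per_stream = 25
    frames_needing_inference = streams * fps_per_stream  # 2,500 frames/s of decoded video
    resnet50_capacity_fps = 21_500                       # quoted ResNet_V1_50 throughput

    headroom = resnet50_capacity_fps / frames_needing_inference
    print(f"Inference headroom: ~{headroom:.1f}x")       # roughly 8-9x spare capacity
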

When paired with an Ubuntu Linux OS and VMS/AI software from Network Optix, this AI inference server offers unparalleled safety, security, and peace of mind for the most discerning IT and security professionals.

The EB202-CP-UG AI Inference Server is available for purchase now. Check out a demo of the EB202-CP-UG at AIC's booth (#915) at the Future of Memory and Storage (FMS) event from August 6th to 8th. Please contact AIC, and we'll connect you with your local AIC System Integrator to begin your AI journey.

Source: PRNewswire
