Sunday, June 1, 2025

EnCharge AI Unveils EN100, AI Chip for On-Device Computing


Advanced analog in-memory computing technology delivers 200+ TOPS of AI compute power in a highly optimized package

EnCharge AI announced the EnCharge EN100, the industry’s first AI accelerator built on precise and scalable analog in-memory computing. Designed to bring advanced AI capabilities to laptops, workstations, and edge devices, EN100 leverages transformational efficiency to deliver 200+ TOPS of total compute power within the power constraints of edge and client platforms such as laptops.
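For readers unfamiliar with the approach, the following is a minimal conceptual sketch of what analog in-memory computing does: weights stay resident in the compute array, multiply-accumulate happens in the analog (charge) domain, and results are read back out through an ADC. The noise level, bit widths, and array size below are illustrative assumptions for the sketch only and do not describe EN100's actual design.

```python
# Conceptual sketch only: a toy behavioral model of analog in-memory
# multiply-accumulate (MAC). Noise level, bit widths, and array size are
# illustrative assumptions, not EnCharge EN100 specifications.
import numpy as np

rng = np.random.default_rng(0)

def analog_imc_matvec(weights, activations, adc_bits=8, noise_std=0.002):
    """Approximate y = W @ x as an in-memory analog operation.

    Weights stay resident in the compute array; the accumulation is modeled
    as the ideal sum plus analog noise, and the readout is modeled as
    uniform ADC quantization.
    """
    ideal = weights @ activations                      # what a digital MAC would return
    analog = ideal + rng.normal(0.0, noise_std, ideal.shape) * np.abs(ideal).max()
    # ADC readout: quantize the accumulated "charge" to adc_bits levels
    scale = np.abs(analog).max() or 1.0
    levels = 2 ** (adc_bits - 1)
    return np.round(analog / scale * levels) / levels * scale

W = rng.standard_normal((64, 128)) * 0.1   # weights stored in the array
x = rng.standard_normal(128)               # input activations
y_analog = analog_imc_matvec(W, x)
y_digital = W @ x
print("relative error:", np.linalg.norm(y_analog - y_digital) / np.linalg.norm(y_digital))
```

The efficiency argument for this class of architecture is that weights are not shuttled between memory and compute units and accumulation happens in the analog domain; the engineering challenge, which the article's "precise and scalable" framing alludes to, is keeping that analog result accurate enough for inference.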

“EN100 represents a fundamental shift in AI computing architecture, rooted in hardware and software innovations that have been de-risked through fundamental research spanning multiple generations of silicon development,” said Naveen Verma, CEO at EnCharge AI. “These innovations are now being made available as products for the industry to use, as scalable, programmable AI inference solutions that break through the energy efficiency limits of today’s digital solutions. This means advanced, secure, and personalized AI can run locally, without relying on cloud infrastructure. We hope this will radically expand what you can do with AI.”

Until now, the models driving the next wave of the AI economy, multimodal and reasoning systems, have required massive data center processing power. The cost, latency, and security drawbacks of that cloud dependency put countless AI applications out of reach.

EN100 shatters these limitations. By fundamentally reshaping where AI inference happens, it lets developers deploy sophisticated, secure, personalized applications locally.

This breakthrough enables organizations to rapidly integrate advanced capabilities into existing products, democratizing powerful AI technologies and bringing high-performance inference directly to end users.


EN100, the first chip in the EnCharge EN series, features an optimized architecture that processes AI tasks efficiently while minimizing energy consumption. Available in two form factors, M.2 for laptops and PCIe for workstations, EN100 is engineered to transform on-device capabilities:

  • M.2 for Laptops: Delivering over 200 TOPS of AI compute in an 8.25 W power envelope, the EN100 M.2 enables sophisticated AI applications on laptops without compromising battery life or portability (a quick efficiency calculation follows this list).
  • PCIe for Workstations: With four NPUs reaching approximately 1 PetaOPS, the EN100 PCIe card delivers GPU-level compute capacity at a fraction of the cost and power consumption, making it ideal for professional AI applications that use complex models and large datasets.
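A back-of-envelope calculation from the figures quoted above puts the M.2 card at roughly 24 TOPS per watt and each PCIe NPU at roughly 250 TOPS; these ratios are inferences from the published numbers, not official specifications.

```python
# Back-of-envelope figures derived only from the numbers quoted above;
# the per-NPU split and resulting ratios are inferences, not official specs.
m2_tops = 200           # "over 200 TOPS" for the M.2 card (lower bound)
m2_power_w = 8.25       # stated M.2 power envelope

print(f"M.2 efficiency: ~{m2_tops / m2_power_w:.0f} TOPS/W (200 TOPS / 8.25 W)")

pcie_npus = 4
pcie_total_tops = 1000  # "approximately 1 PetaOPS" = ~1,000 TOPS
print(f"PCIe per-NPU compute: ~{pcie_total_tops / pcie_npus:.0f} TOPS per NPU")
```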

EnCharge AI’s comprehensive software suite delivers full platform support across the evolving model landscape with maximum efficiency. This purpose-built ecosystem combines specialized optimization tools, high-performance compilation, and extensive development resources—all supporting popular frameworks like PyTorch and TensorFlow.
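EnCharge's own compiler and runtime APIs are not documented in this announcement, so the sketch below shows only the framework-side half of a typical accelerator deployment flow: exporting a PyTorch model to a static ONNX graph as the hand-off point for an offline vendor compiler. The final compilation command is a hypothetical placeholder, not a real EnCharge CLI.

```python
# Framework-side half of a typical accelerator deployment flow.
# torch.onnx.export is standard PyTorch; the vendor compilation step at the
# end is a hypothetical placeholder -- EnCharge's actual toolchain and CLI
# names are not documented in this article.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

    def forward(self, x):
        return self.net(x)

model = TinyClassifier().eval()
example = torch.randn(1, 128)

# Export a static graph that an offline compiler can optimize for the NPU.
torch.onnx.export(model, example, "tiny_classifier.onnx",
                  input_names=["input"], output_names=["logits"],
                  opset_version=17)

# Hypothetical next step (placeholder, not a real CLI):
#   encharge-compile tiny_classifier.onnx --target en100-m2 -o tiny_classifier.enpkg
```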

Compared to competing solutions, EN100 demonstrates up to ~20x better performance per watt across various AI workloads. With up to 128GB of high-density LPDDR memory and bandwidth reaching 272 GB/s, EN100 efficiently handles sophisticated AI tasks, such as generative language models and real-time computer vision, that typically require specialized data center hardware. The programmability of EN100 ensures optimized performance of AI models today and the ability to adapt for the AI models of tomorrow.
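One way to interpret the 272 GB/s figure for generative language models: autoregressive decoding is typically memory-bandwidth bound, so an upper bound on token rate is memory bandwidth divided by the bytes of weights read per token. The model sizes and 8-bit weights below are illustrative assumptions, not claims about EN100 workloads, and real throughput will be lower once KV-cache traffic and compute are included.

```python
# Rough upper bound on decode throughput for memory-bandwidth-bound LLM
# inference, using the article's 272 GB/s figure. Model sizes and 8-bit
# weights are illustrative assumptions, not claims about EN100 workloads.
bandwidth_gb_s = 272

for params_billion in (3, 7, 13):
    weight_gb = params_billion * 1.0   # ~1 byte per parameter at 8-bit
    tokens_per_s = bandwidth_gb_s / weight_gb
    print(f"{params_billion}B params @ 8-bit: <= ~{tokens_per_s:.0f} tokens/s "
          f"(each token reads ~{weight_gb:.0f} GB of weights)")
```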

“The real magic of EN100 is that it makes transformative efficiency for AI inference easily accessible to our partners, which can be used to help them achieve their ambitious AI roadmaps,” said Ram Rangarajan, Senior Vice President of Product and Strategy at EnCharge AI. “For client platforms, EN100 can bring sophisticated AI capabilities on device, enabling a new generation of intelligent applications that are not only faster and more responsive but also more secure and personalized.”

Early adoption partners have already begun working closely with EnCharge to map out how EN100 will deliver transformative AI experiences, such as always-on multimodal AI agents and enhanced gaming applications that render realistic environments in real time.

Source: Businesswire
