Thursday, November 27, 2025

Lexar Introduces Industry’s First “AI Storage Core” for Next-Generation Edge AI Devices

AI computing is moving rapidly from the cloud to the edge. Devices such as AI-powered PCs, smart vehicles, and robots now face growing storage challenges: traditional storage solutions struggle to keep up with real-time multimodal data, random I/O workloads, and strict reliability requirements. Migrating systems and data across devices adds a further challenge.

Recognizing this shift, Lexar, a global leader in high-performance memory solutions, has unveiled the industry’s first “AI Storage Core.” The new storage module offers up to 4 TB of capacity, high-speed performance, and a hot-swappable design purpose-built for AI-enabled endpoints.

Core Innovations for the AI Era

High Performance for AI Acceleration

The Lexar AI Storage Core delivers sequential read/write speeds that significantly outperform traditional memory cards, enabling fast handling of AI-scale data. Lexar has optimized small-block (512 B) I/O and enabled host-system collaboration through SLC Boost and Read Cache layers, enhancements that accelerate the loading of large language models (LLMs), generative image workflows, and other real-time AI workloads.
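To illustrate the kind of small-block access pattern described above, the following minimal sketch measures random 512 B read throughput on a POSIX system. The test file path, file size, and read count are arbitrary assumptions for illustration; this is not Lexar tooling, and a rigorous benchmark would bypass the page cache (for example with a dedicated tool such as fio).

```python
import os
import random
import time

BLOCK_SIZE = 512            # small-block size highlighted in the announcement
NUM_READS = 20_000          # arbitrary sample size for this illustration
TEST_FILE = "testfile.bin"  # hypothetical test file placed on the storage module

def make_test_file(path: str, size_mb: int = 64) -> None:
    """Create a test file of the given size if it does not already exist."""
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(os.urandom(size_mb * 1024 * 1024))

def random_read_iops(path: str) -> float:
    """Measure random 512 B read IOPS on the file (page cache is not bypassed)."""
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for _ in range(NUM_READS):
            offset = random.randrange(0, size - BLOCK_SIZE)
            os.pread(fd, BLOCK_SIZE, offset)  # read one small block at a random offset
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return NUM_READS / elapsed

if __name__ == "__main__":
    make_test_file(TEST_FILE)
    print(f"Random {BLOCK_SIZE} B read rate: {random_read_iops(TEST_FILE):,.0f} IOPS")
```

Pointing the script at a file stored on the module versus one on conventional removable media gives a rough, cache-influenced sense of the random small-block behavior the announcement emphasizes.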

High Reliability in Harsh Environments

Manufactured with integrated packaging technology from Longsys, the module is dustproof, waterproof, shock-resistant, and radiation-resistant. Planned models will support an extended operating temperature range of –40 °C to 85 °C, addressing the needs of autonomous driving, outdoor robotics, and other mission-critical use cases.

High Flexibility for Cross-Device AI Collaboration

The hot-swappable design allows users to insert or remove the module while the system is running. A co-engineered thermal solution ensures stable performance under sustained workloads. With PCIe-boot support, users can launch the operating system (such as Windows), applications, and data directly from the module, enabling seamless OS portability and smooth cross-device collaboration.
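As a rough illustration of how a host system might observe the hot-swap behavior described above, the sketch below polls Linux sysfs for block devices appearing or disappearing. The paths and polling interval are assumptions for illustration only; a production system would typically subscribe to udev events rather than poll, and this is not Lexar-provided software.

```python
import os
import time

SYS_BLOCK = "/sys/block"   # Linux sysfs directory listing attached block devices

def current_devices() -> set[str]:
    """Return the set of block device names currently visible to the kernel."""
    return set(os.listdir(SYS_BLOCK))

def watch_for_hotplug(poll_seconds: float = 1.0) -> None:
    """Poll sysfs and report block devices that appear or disappear."""
    known = current_devices()
    print(f"Watching {SYS_BLOCK} (initial devices: {sorted(known)})")
    while True:
        time.sleep(poll_seconds)
        now = current_devices()
        for dev in sorted(now - known):
            print(f"Device attached: /dev/{dev}")
        for dev in sorted(known - now):
            print(f"Device removed:  /dev/{dev}")
        known = now

if __name__ == "__main__":
    watch_for_hotplug()
```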


Built for Key AI Application Scenarios

AI PCs – High capacity and rapid performance accelerate model loading, LLM-driven workflows, and generative tasks. Hot-swap capability supports mobile workstation portability.

AI Gaming – High IOPS and quick random-read performance help reduce load times and stutter, facilitating high-frame-rate rendering and responsive real-time AI gaming interactions.

AI Cameras – Sustained performance supports continuous 4K/8K video capture plus real-time AI processing (e.g., subject tracking, scene optimization). The shock-resistant design suits outdoor and professional imaging environments.

AI Driving – The module can ingest multi-sensor data streams from cameras, radar, and LiDAR. Future wide-temperature, shock-resistant versions aim to ensure stable performance under demanding automotive conditions.

AI Robotics – Compact packaging fits space-constrained robotic platforms. With wide-temperature and anti-shock capabilities, the module supports factory, logistics, and outdoor deployments. As robots evolve toward rapid learning and adaptation, the AI Storage Core allows a platform’s intelligence, identity, and security profile to be upgraded simply by swapping modules.

Enabling Intelligent Storage for the AI-Driven Future

As AI transforms industry after industry, intelligent storage is becoming essential for real-time data processing and on-device intelligence. The Lexar AI Storage Core reflects Lexar’s deep understanding of storage demands in the AI era and its commitment to next-generation innovation. By delivering efficient, reliable, and flexible storage tailored for AI endpoints, Lexar aims to accelerate the adoption of AI technologies across the consumer, industrial, and automotive sectors, establishing a new standard for intelligent storage in the years ahead.
