Saturday, December 28, 2024

STMicroelectronics Enhances Edge AI with NPU-Driven STM32 Microcontrollers

STMicroelectronics, a global semiconductor leader serving customers across the spectrum of electronics applications, is bringing embedded artificial intelligence (AI) within practical reach with a new microcontroller series that integrates, for the first time, accelerated machine-learning (ML) capabilities. This enables cost-sensitive, power-conscious consumer and industrial products to offer high-performance features built on computer vision, audio processing, sound analysis, and other algorithms that until now were beyond the reach of small embedded systems.

The STM32N6 microcontroller (MCU) series is ST’s most powerful to date and the first to embed ST’s proprietary neural processing unit (NPU), the Neural-ART Accelerator, delivering 600 times more machine-learning performance than today’s high-end STM32 MCUs. The STM32N6 has been available to selected customers since October 2023 and is now available in high volumes.
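For developers, this shift typically surfaces as a small C integration layer wrapped around a model that has been converted offline into an optimized on-device runtime. The sketch below illustrates that flow; the headers and ai_network_* names are modeled on the style of ST’s X-CUBE-AI generated C API and should be read as illustrative placeholders (exact names, buffer macros, and signatures depend on the tool version and on the model, assumed here to be called “network”), not as verbatim STM32N6 integration code.

```c
/*
 * Minimal sketch of on-device inference on an STM32-class MCU.
 * The headers and ai_network_* calls below follow the style of the
 * generated C API produced by ST's AI code-generation tooling; they
 * are assumptions for illustration, not a verbatim integration guide.
 */
#include <stdint.h>

#include "network.h"       /* generated model API (assumed)                    */
#include "network_data.h"  /* generated weights/activation size macros (assumed) */

static ai_handle network = AI_HANDLE_NULL;

/* Activation (scratch) memory; size comes from the generated header. */
AI_ALIGNED(32)
static uint8_t activations[AI_NETWORK_DATA_ACTIVATIONS_SIZE];

/* Input/output buffers sized from the generated model descriptor. */
AI_ALIGNED(32)
static int8_t in_data[AI_NETWORK_IN_1_SIZE_BYTES];
AI_ALIGNED(32)
static int8_t out_data[AI_NETWORK_OUT_1_SIZE_BYTES];

/* Create and initialize the network instance (weights stay in flash). */
int edge_ai_init(void)
{
    const ai_handle acts[] = { activations };
    ai_error err = ai_network_create_and_init(&network, acts, NULL);
    return (err.type == AI_ERROR_NONE) ? 0 : -1;
}

/* Run one inference pass over the prepared input buffer. */
int edge_ai_run(void)
{
    ai_buffer *inputs  = ai_network_inputs_get(network, NULL);
    ai_buffer *outputs = ai_network_outputs_get(network, NULL);

    inputs[0].data  = AI_HANDLE_PTR(in_data);   /* e.g. a quantized camera frame */
    outputs[0].data = AI_HANDLE_PTR(out_data);  /* e.g. class scores             */

    /* On an NPU-equipped part such as the STM32N6, the generated runtime
       would dispatch the heavy layers to the accelerator. */
    ai_i32 n_batches = ai_network_run(network, inputs, outputs);
    return (n_batches == 1) ? 0 : -1;
}
```

In practice the application code stays this small because the model itself is compiled ahead of time; the MCU firmware only allocates buffers, feeds sensor or camera data in, and reads the results out.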

“We are on the verge of a significant transformation at the tiny edge. This transformation involves the increasing augmentation or replacement of our customers’ workloads by AI models. Currently, these models are used for tasks such as segmentation, classification, and recognition. In the future, they will be applied to new applications yet to be developed,” said Remi El-Ouazzane, President, Microcontrollers, Digital ICs and RF Products Group (MDRF) at STMicroelectronics. “The STM32N6 is the first STM32 product to feature our Neural-ART Accelerator NPU. It will utilize a new release of our unique AI software ecosystem package. This marks the beginning of a long journey of AI hardware-accelerated STM32, which will enable innovations in applications and products in ways not possible with any other embedded processing solution.”

“It is a common misconception that AI is purely a big datacenter, power-hungry application,” says Tom Hackenberg, Principal Analyst, Memory and Computing Group at Yole Group. “This is no longer true. Today’s IoT edge applications are hungry for the kind of analytics that AI can provide. The STM32N6 is a great example of the new trend melding energy-efficient microcontroller workloads with the power of AI analytics to provide computer vision and mass-sensor-driven performance capable of great savings in the total cost of ownership in modern equipment.”

Comments on STM32N6 from early customers

LG is a multinational corporation known for its electronics, chemicals, and telecommunications products, including smartphones, home appliances, and televisions.

“The STM32N6 delivers remarkable AI performance and provides excellent flexibility in a small silicon package ideal for embedded systems and wearable devices. The inference speed, powered by the groundbreaking Neural-ART Accelerator, has exceeded our expectations and ST’s developer-friendly software tools let us seamlessly integrate our AI models into the MCU.”

Yehan Ahn, Task Leader, R&D, LG Electronics CTO Division

Lenovo Research is the innovation and research arm of Lenovo, focused on developing cutting-edge technologies and solutions in areas such as artificial intelligence, big data, cloud computing, and smart devices.

SOURCE: GlobeNewswire
