
MediaTek Leverages Meta’s Llama 2 to Enhance On-Device Generative AI in Edge Devices


MediaTek, a leading global fabless semiconductor company powering more than two billion connected edge devices every year, announced it is working closely with Meta to bring Llama 2, Meta’s next-generation open-source Large Language Model (LLM), to the edge. Utilizing Meta’s LLM alongside MediaTek’s latest APUs and NeuroPilot AI Platform, MediaTek aims to build a complete edge computing ecosystem designed to accelerate AI application development on smartphones, IoT devices, vehicles, smart home products, and other edge devices.

Presently, most Generative AI processing is performed in the cloud; however, MediaTek’s use of Llama 2 models will enable generative AI applications to run directly on-device as well. Doing so offers several advantages to developers and users, including seamless performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operating costs.
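
As a rough illustration of what on-device inference looks like in practice, the sketch below loads a locally stored, quantized Llama 2 model and generates text with no cloud round trip. This is not MediaTek’s NeuroPilot stack; it assumes the open-source llama-cpp-python bindings, and the model file path is purely illustrative.

```python
# Minimal sketch of on-device text generation with a quantized Llama 2 model.
# Assumes llama-cpp-python is installed and a GGUF model file is stored locally
# (the path below is a placeholder, not a MediaTek or Meta artifact).
from llama_cpp import Llama

# Load the model entirely from local storage; no network access is required.
llm = Llama(model_path="models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Run inference on the device itself, so latency, privacy, and availability
# depend only on local hardware rather than a cloud service.
output = llm(
    "Summarize the benefits of on-device generative AI in one sentence.",
    max_tokens=64,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```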

To truly take advantage of on-device Generative AI technology, edge device makers will need to adopt high-performance, low-power AI processors and faster, more reliable connectivity to enhance computing capabilities. Every MediaTek-powered 5G smartphone SoC shipped is equipped with APUs designed to perform a wide variety of Generative AI features, such as AI Noise Reduction, AI Super Resolution, AI MEMC, and more.


Additionally, MediaTek’s next-generation flagship chipset, to be introduced later this year, will feature a software stack optimized to run Llama 2, as well as an upgraded APU with Transformer backbone acceleration, a reduced memory footprint, and lower DRAM bandwidth usage, further enhancing LLM and AIGC performance. These advancements will expedite the development of use cases for on-device Generative AI.
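
To see why a smaller memory footprint and lower DRAM bandwidth matter at the edge, a back-of-the-envelope calculation (not a MediaTek or Meta figure) shows how the weight storage of a roughly 7-billion-parameter Llama 2 class model scales with the numeric precision used:

```python
# Rough, illustrative estimate of LLM weight storage at different precisions.
# Figures are approximate and assume a 7B-parameter model; they are not
# vendor specifications.
PARAMS = 7_000_000_000  # approximate parameter count of Llama 2 7B

for bits in (16, 8, 4):
    gigabytes = PARAMS * bits / 8 / 1e9
    print(f"{bits:>2}-bit weights: ~{gigabytes:.1f} GB of DRAM")
```

Since generating each token requires streaming most of the weights from memory, lower-precision weights reduce both the DRAM a device must reserve for the model and the bandwidth consumed per token, which is typically the bottleneck for LLM inference on edge hardware.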

“The increasing popularity of Generative AI is a significant trend in digital transformation, and our vision is to provide the exciting community of Llama 2 developers and users with the tools needed to fully innovate in the AI space,” said JC Hsu, Corporate Senior Vice President and General Manager of Wireless Communications Business Unit at MediaTek. “Through our partnership with Meta, we can deliver hardware and software with far more capability in the edge than ever before.”

MediaTek expects Llama 2-based AI applications to become available for smartphones powered by the next-generation flagship SoC, scheduled to hit the market by the end of the year.

SOURCE: PRNewswire
