Wednesday, July 23, 2025

Embedded LLM Launches Monetisation Platform for AMD AI GPUs


Embedded LLM has officially launched TokenVisor, a global monetisation and management platform designed to accelerate the commercialisation of GPU infrastructure, particularly within the AMD AI ecosystem. First unveiled alongside AMD at the Advancing AI 2025 conference in Santa Clara, TokenVisor addresses a critical gap in the AI industry: helping organisations turn costly GPU investments into measurable returns.

With AI factories on the rise, enterprises often struggle with the complexities of billing, usage tracking, and resource management. TokenVisor addresses these challenges with a turnkey layer that provides token-based pricing, real-time usage monitoring, automated billing, multi-tenant access, governance policies, and a developer portal with self-service APIs and an LLM testing playground.

“TokenVisor brings powerful new capabilities to the AMD GPU neocloud ecosystem, helping providers efficiently manage and monetise LLM workloads,” said Mahesh Balasubramanian, Senior Director of Product Marketing, Data Center GPU Business, AMD. Major industry players are already seeing its impact.
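To make the token-based pricing model concrete, the sketch below shows how per-tenant token metering and billing can work in principle. It is a minimal, illustrative Python example only: the class names, price figures, tenant and model identifiers are assumptions for the sake of the sketch and do not reflect TokenVisor's actual API or pricing.

```python
# Illustrative sketch of token-metered, multi-tenant billing (hypothetical,
# not TokenVisor's API). Usage is recorded per tenant and per model, then
# priced per million input/output tokens for a billing period.
from dataclasses import dataclass, field
from collections import defaultdict


@dataclass
class TokenPrice:
    # Hypothetical per-million-token prices for one hosted model.
    input_per_million: float
    output_per_million: float


@dataclass
class UsageMeter:
    # Maps model name -> price card, and (tenant, model) -> token counters.
    prices: dict
    usage: dict = field(default_factory=lambda: defaultdict(lambda: {"in": 0, "out": 0}))

    def record(self, tenant: str, model: str, input_tokens: int, output_tokens: int) -> None:
        # Called once per completed LLM request to accumulate usage.
        key = (tenant, model)
        self.usage[key]["in"] += input_tokens
        self.usage[key]["out"] += output_tokens

    def invoice(self, tenant: str) -> float:
        # Sum charges across all models this tenant used in the period.
        total = 0.0
        for (t, model), counts in self.usage.items():
            if t != tenant:
                continue
            price = self.prices[model]
            total += counts["in"] / 1e6 * price.input_per_million
            total += counts["out"] / 1e6 * price.output_per_million
        return round(total, 4)


# Example: meter two requests for one tenant, then compute the charge.
meter = UsageMeter(prices={"llama-3-70b": TokenPrice(0.60, 0.80)})
meter.record("acme-corp", "llama-3-70b", input_tokens=12_000, output_tokens=4_000)
meter.record("acme-corp", "llama-3-70b", input_tokens=8_000, output_tokens=2_500)
print(meter.invoice("acme-corp"))  # 0.0172 (currency units) for the period
```

In practice, a platform like TokenVisor layers this kind of metering with automated invoicing, governance policies, and per-tenant access controls; the sketch only captures the core idea of turning raw token counts into a chargeback figure.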


“TokenVisor flips the economics of AI infrastructure,” said Kumar Mitra, general manager and managing director of Lenovo in Greater Asia Pacific. “By pairing Lenovo ThinkSystem servers with AMD Instinct GPUs and TokenVisor’s turnkey monetisation layer, our customers are launching revenue-generating LLM services at unprecedented speed and scale, providing the financial guardrails and chargeback capabilities that CIOs and CFOs require to confidently greenlight AI investments at scale. It’s the key to unlocking the full economic potential of the AI factory.”

The platform is described as a hypervisor for the AI Token era, born from the collaborative spirit of the AMD neocloud community and aimed at making decentralised GPU computing commercially viable. Early adopters praise its ability to take the guesswork out of commercialisation, letting providers benchmark, allocate resources, and launch AI services within days instead of months, underscoring its immediate ROI and strong demand across the global AI market.

Read More: Embedded LLM Launches First-of-its-Kind Monetisation Platform for AMD AI GPUs
