Sunday, April 13, 2025

Cloudflare debuts first remote MCP for AI agents

Cloudflare’s developer platform and global network provide the best environment for developing and deploying AI agents, removing the cost and complexity barriers that have made them impractical to build.

Cloudflare, Inc., the connectivity cloud company, introduced a suite of new tools designed to dramatically simplify and speed up AI agent development. The announcement includes the launch of the industry’s first remote Model Context Protocol (MCP) server, free-tier access to Durable Objects, and the general availability of Durable Workflows. With these new capabilities, developers can build and deploy powerful AI agents in a matter of minutes—easily, cost-effectively, and at scale.

AI agents—intelligent systems capable of acting autonomously, making decisions, and adapting to evolving environments—are poised to redefine how organizations harness artificial intelligence. While the potential for increased productivity is immense, many companies face challenges in developing agentic systems that drive tangible value. Building effective AI agents requires three key elements: access to AI reasoning models, execution workflows, and robust APIs for integrating services. The need for a scalable and efficient development platform has never been greater.

“Cloudflare is the best environment for developing and scaling AI agents. Period. The most innovative companies out there understand that agents are the next big step in applying AI, and they choose Cloudflare because we have everything they need to build quickly and at scale on our Workers platform,” said Matthew Prince, co-founder and CEO of Cloudflare. “Cloudflare has zeroed in on this moment: First, we built the most interconnected network on the planet. Then we built a developer platform that leverages that network to run code from 95% of the people online within 50 milliseconds. And we’re continuing to accelerate to give developers the best tools to build agentic AI.”

Key Innovations Announced:

1. Remote MCP Server: Empowering Agents to Take Autonomous Action

Cloudflare is launching the first remote server implementation of the Model Context Protocol (MCP), a fast-growing open-source standard that enables AI agents to interact directly with external services. Until now, MCP servers have been limited to local environments, which constrained the protocol’s reach and accessibility.

Now, developers can easily deploy MCP servers on Cloudflare’s global network, enabling secure, remote connectivity to services such as email or calendar APIs—without the need for locally hosted infrastructure. These remote MCP servers can retain user context, allowing agents to deliver seamless, personalized experiences. Integration with partners like Auth0, Stytch, and WorkOS simplifies authentication and authorization, making it easier for users to securely delegate tasks to their AI agents.
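
Because MCP is an open protocol, the server side largely comes down to registering tools an agent is allowed to call. The sketch below illustrates this with the open-source MCP TypeScript SDK; the tool name and calendar helper are illustrative, and the transport wiring that would expose the server remotely on Cloudflare’s network is not shown.

```typescript
// A minimal sketch of an MCP server exposing one tool, using the open-source
// MCP TypeScript SDK (@modelcontextprotocol/sdk). The tool name, schema, and
// calendar helper are illustrative; the transport wiring that makes this
// server remotely reachable on Cloudflare is omitted.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Placeholder for a real calendar API call, kept local so the sketch is self-contained.
async function fetchCalendarEvents(date: string): Promise<string[]> {
  return [`Team standup at 09:00 on ${date}`];
}

const server = new McpServer({ name: "calendar-agent", version: "1.0.0" });

// Register a tool an AI agent can invoke; arguments are validated with zod.
server.tool(
  "list_events",
  { date: z.string().describe("ISO date, e.g. 2025-04-13") },
  async ({ date }) => {
    const events = await fetchCalendarEvents(date);
    return { content: [{ type: "text", text: JSON.stringify(events) }] };
  }
);
```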

2. Free-Tier Durable Objects: Persistent Context for Smarter Agents

Cloudflare is now offering Durable Objects—a core building block for context-aware applications—on its free tier, expanding access to developers at all levels. Durable Objects blend compute and storage into a single, serverless construct that remembers user data across interactions.

Ideal for building responsive and adaptive AI agents, Durable Objects allow systems to remember preferences, track ongoing conversations, and evolve behavior over time. Running on Cloudflare’s expansive edge network, they can handle millions of concurrent interactions and deliver low-latency responses close to end users.
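
As a rough illustration of that pattern, the sketch below persists a per-user conversation history across requests. It assumes the standard Cloudflare Workers Durable Object API; the class name is illustrative, and the wrangler configuration that binds it to a Worker is omitted.

```typescript
// A minimal sketch of a Durable Object that remembers conversation history
// across interactions. Class and binding names are illustrative.
export class ConversationMemory {
  constructor(private state: DurableObjectState) {}

  async fetch(request: Request): Promise<Response> {
    // Read the message forwarded by the Worker that routed this request here.
    const { message } = (await request.json()) as { message: string };

    // Durable Object storage persists between requests to the same object,
    // so the agent can recall earlier turns of the conversation.
    const history = (await this.state.storage.get<string[]>("history")) ?? [];
    history.push(message);
    await this.state.storage.put("history", history);

    return Response.json({ turns: history.length, history });
  }
}
```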

3. Durable Workflows: Now Generally Available

Cloudflare’s Workflows feature—now generally available—enables developers to create persistent, multi-step applications that can span from minutes to weeks. These workflows automatically manage retries and persist state across long durations, which is crucial for AI use cases like travel booking, complex scheduling, or transaction coordination.

By handling these processes within a serverless infrastructure, Workflows eliminate the need for traditional backend orchestration and reduce the complexity of maintaining long-running logic.
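
A rough sketch of such a workflow is shown below, assuming the cloudflare:workers Workflows API; the step names, payload shape, and booking logic are illustrative.

```typescript
// A minimal sketch of a long-running booking workflow. Each step's result is
// persisted and retried on failure; sleeps can span hours or days without
// holding compute. Names and payloads here are illustrative.
import { WorkflowEntrypoint, WorkflowEvent, WorkflowStep } from "cloudflare:workers";

type Env = {}; // bindings would be declared here in a real project
type BookingParams = { tripId: string };

export class BookingWorkflow extends WorkflowEntrypoint<Env, BookingParams> {
  async run(event: WorkflowEvent<BookingParams>, step: WorkflowStep) {
    const quote = await step.do("fetch quote", async () => {
      // Placeholder pricing call; the result is durably stored by the engine.
      return { tripId: event.payload.tripId, price: 499 };
    });

    // Pause until a confirmation window has passed; no compute is consumed.
    await step.sleep("wait for confirmation window", "1 day");

    await step.do("confirm booking", async () => {
      // Placeholder for the booking provider's confirmation API.
      console.log(`Confirmed ${quote.tripId} at $${quote.price}`);
    });
  }
}
```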

4. Pay-As-You-Go AI Inference for Maximum Efficiency

Unlike model training, AI inference workloads are unpredictable and vary with user behavior. Traditional cloud providers often require businesses to provision maximum capacity, even during idle periods—leading to inefficiencies and inflated costs.

Cloudflare’s serverless architecture solves this by dynamically scaling AI workloads in real time. Developers only pay for the resources they actually use, enabling significantly more cost-effective AI deployments that flex with demand.
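
In practice, that model looks like a Worker with an AI binding that is billed only when it runs. The sketch below assumes a Workers AI binding named AI and an example model ID, both of which would be configured in wrangler for a real deployment.

```typescript
// A minimal sketch of pay-per-request inference from a Worker. The binding
// name (AI) and model ID are illustrative and would be set in wrangler config.
export interface Env {
  AI: Ai; // Workers AI binding type from @cloudflare/workers-types
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { prompt } = (await request.json()) as { prompt: string };

    // No pre-provisioned capacity: the request is billed only for what it uses.
    const result = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", { prompt });

    return Response.json(result);
  },
};
```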

“Cloudflare offers a developer-friendly ecosystem for building AI agents, including a free-tier offering for Durable Objects and serverless options for AI inference,” explains Kate Holterhoff, Senior Analyst at RedMonk. “These low-cost and easy-to-use options could enable more companies to adopt and experiment with agentic AI.”

Bringing AI Closer to the Edge

Cloudflare continues to break down barriers to enterprise AI adoption by making inference fast, accessible, and affordable. With GPUs deployed in more than 190 cities worldwide, Cloudflare ensures AI workloads run with ultra-low latency, placing computation as close to end users as possible.
