
Cloudflare’s Dynamic Workers Promise 100x Faster AI Agent Execution

Web infrastructure giant Cloudflare is transforming how enterprises deploy AI agents with the open beta release of Dynamic Workers, a new lightweight, isolate-based sandboxing system that starts in milliseconds, uses only a few megabytes of memory, and can run on the same machine, even the same thread, as the request that created it.

Compared with traditional Linux containers, Dynamic Workers start roughly 100x faster and are 10x to 100x more memory-efficient.

The Container Bottleneck Problem

Cloudflare’s argument is straightforward: for consumer-scale AI agents, containers are too slow and too expensive. A container is fine when a workload persists, but it is a bad fit when an agent needs to run one small computation, return a result, and disappear.

The company has spent months pushing what it calls Code Mode: the idea that large language models often perform better when given an API and asked to write code against it, rather than being forced into one tool call after another. Cloudflare says converting an MCP server into a TypeScript API can cut token usage by 81 percent.

The Evolution of Sandboxing

Modern sandboxing has evolved through three main models. Isolates, introduced by Google in 2011, let a single program spin up many small, tightly separated compartments. Containers, popularized by Docker in 2013, solved portability but are relatively heavy for short-lived tasks. MicroVMs, popularized by AWS Firecracker in 2018, offer stronger isolation with more overhead than isolates.

Dynamic Workers is Cloudflare’s answer to the container bottleneck for AI workloads. Because Dynamic Workers are built on isolates, they can be created on demand, run one snippet of code, and then be thrown away immediately afterward. In many cases, they run on the same machine and even the same thread as the Worker that created them.

Code Mode: From Tool Orchestration to Generated Logic

Instead of giving an agent a long list of tools and asking it to call them one by one, Code Mode gives it a programming surface and lets it write a short TypeScript function that performs the logic itself. This means the model can chain calls together, filter data, manipulate files, and return only the final result, rather than filling the context window with every intermediate step.

The company points to its own Cloudflare MCP server as proof of concept. Rather than exposing the full Cloudflare API as hundreds of individual tools, the server exposes the entire API through two tools, search and execute, in under 1,000 tokens, because the model writes code against a typed API.

Security Considerations

Cloudflare does not pretend security is easy. The company acknowledges that hardening an isolate-based sandbox is trickier than relying on hardware virtual machines. However, Cloudflare points to nearly a decade of experience making isolate-based multi-tenancy safe for the public web.

The Bigger Picture

Cloudflare is trying to turn sandboxing itself into a strategic layer in the AI stack. The company says many container-based sandbox providers limit concurrent sandboxes, while Dynamic Workers inherit the same platform characteristics that already let Workers scale to millions of requests per second.
