Web infrastructure giant Cloudflare has released Dynamic Workers into open beta—a new lightweight, isolate-based sandboxing system that starts in milliseconds, uses only a few megabytes of memory, and can run on the same machine as the request that created it. Cloudflare claims Dynamic Workers is roughly 100x faster to start and 10x to 100x more memory efficient than traditional Linux containers.
For enterprises deploying AI agents at scale, this could be a game-changer.
The Problem: Containers Weren’t Built for AI Agents
Modern sandboxing has evolved through three main models, each trying to build a better digital box: smaller, faster, and more specialized than the one before.
Isolates (Google, 2011): Introduced to let a single process run many small, strongly isolated compartments efficiently. Fast and lightweight, but limited to JavaScript, TypeScript, and WebAssembly.
Containers (Docker, 2013): Solved the portability problem by packaging code, libraries, and settings into a predictable unit that runs consistently everywhere. Revolutionary for cloud infrastructure—but too heavy for short-lived AI agent tasks. Containers generally take hundreds of milliseconds to boot and hundreds of megabytes to run.
MicroVMs (AWS Firecracker, 2018): Designed to offer stronger machine-like isolation than containers without the full bulk of traditional VMs. A popular choice for running untrusted code, but still slower and heavier than isolates.
Cloudflare is now arguing that for a growing class of web-scale, short-lived AI agent workloads, none of these models is optimal.
Dynamic Workers: Isolates Redefined for the AI Era
Cloudflare spent months developing what it calls “Code Mode”—the idea that LLMs often perform better when they are given an API and asked to write code against it, rather than being forced through rigid tool-call sequences.
Dynamic Workers serves as the secure execution layer that makes this approach practical. When an AI agent generates small pieces of code on the fly to retrieve data, transform files, call services, or automate workflows, Dynamic Workers provides the runtime.
The company says converting an MCP (Model Context Protocol) server into a TypeScript API can cut token usage by 81%—a dramatic reduction in both cost and latency.
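To make the Code Mode idea concrete, here is a small TypeScript sketch. The API surface (`searchOrders`, `refund`) and the generated snippet are hypothetical illustrations, not part of Cloudflare's product: the point is that instead of one tool-call round trip per step, the model writes a short program against a typed API and the runtime executes it once.

```typescript
// Sketch of "Code Mode": the model writes code against a typed API
// rather than emitting a sequence of rigid tool calls.
// All names here (Order, searchOrders, refund) are hypothetical.

interface Order { id: string; total: number; flagged: boolean }

// A mock API surface of the kind an MCP server might be converted into.
const api = {
  searchOrders(minTotal: number): Order[] {
    return [
      { id: "o1", total: 250, flagged: true },
      { id: "o2", total: 40, flagged: false },
      { id: "o3", total: 900, flagged: true },
    ].filter(o => o.total >= minTotal);
  },
  refund(orderId: string): string {
    return `refunded ${orderId}`;
  },
};

// Code the agent might generate: search, filter, and act in one round trip,
// with no per-step tool-call payloads flowing through the model's context.
function agentGeneratedTask(a: typeof api): string[] {
  return a.searchOrders(100)
    .filter(o => o.flagged)
    .map(o => a.refund(o.id));
}

console.log(agentGeneratedTask(api)); // [ 'refunded o1', 'refunded o3' ]
```

In a tool-call loop, each intermediate result (the order list, each refund confirmation) would be serialized back through the model's context window; here only the final result needs to reach the model, which is where the claimed token savings come from.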
The Key Insight: Matching Runtime to Task Duration
The key insight behind Dynamic Workers is that AI agents frequently generate code that only needs to run for milliseconds. A container that takes 500ms to start and uses 200MB of memory is absurdly over-provisioned for a task that needs 5ms of execution time and 2MB of memory.
Dynamic Workers solves this mismatch by dynamically loading code in the same process as the request that triggered it—no cold starts, no memory overhead, no container orchestration complexity.
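A rough sketch of the in-process loading idea, under the assumption of a simple cache keyed by a worker id. This is illustrative only and is not Cloudflare's actual API or isolation model; a real isolate additionally strips the loaded code of any ambient authority over the host process.

```typescript
// Minimal sketch of in-process dynamic code loading: compile an
// agent-generated source string once, reuse it on later requests.
// NOT Cloudflare's API; a real system compiles into a sandboxed isolate.

type Handler = (input: number) => number;

const workerCache = new Map<string, Handler>();

// First request for an id pays the (tiny) compile cost; subsequent
// requests reuse the compiled handler with zero start-up time.
function loadWorker(id: string, source: string): Handler {
  let handler = workerCache.get(id);
  if (!handler) {
    handler = new Function("input", source) as unknown as Handler;
    workerCache.set(id, handler);
  }
  return handler;
}

// An agent-generated snippet: milliseconds of work, no container needed.
const generatedSource = "return input * 2;";
const double = loadWorker("double-v1", generatedSource);
console.log(double(21)); // 42
```

Because the code loads into the already-running process, there is no image pull, no kernel namespace setup, and no scheduler placement decision—the overheads that dominate container start-up for short tasks.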
The Bigger Picture: Sandboxing as a Strategic AI Layer
For enterprise technical decision-makers, Cloudflare’s real pitch is strategic: turn sandboxing itself into a competitive advantage in the AI stack.
If agents increasingly generate code on the fly, then the economics and safety of the runtime matter almost as much as the capabilities of the underlying model. Containers and microVMs remain useful, but they’re too heavy for a future where millions of users may each have multiple agents writing and executing code constantly.
Dynamic Workers is priced at $0.002 per unique Worker loaded per day, in addition to standard CPU and invocation charges. For high-volume agentic workloads, this could represent a dramatic cost reduction compared to container-based alternatives.
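A quick sanity check of that pricing. The $0.002 figure is from the announcement; the workload volume below is an assumed example, not a Cloudflare number.

```typescript
// Back-of-envelope cost at the quoted $0.002 per unique Worker loaded
// per day, before standard CPU and invocation charges.
// The workload size is an illustrative assumption.

const USD_PER_UNIQUE_WORKER_PER_DAY = 0.002;

function dailyLoadingCost(uniqueWorkersPerDay: number): number {
  return uniqueWorkersPerDay * USD_PER_UNIQUE_WORKER_PER_DAY;
}

// e.g. 10,000 agents each generating one fresh Worker per day:
console.log(dailyLoadingCost(10_000)); // 20 (USD per day)
```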
What This Means for AI Developers
Cloudflare is positioning Dynamic Workers as the infrastructure layer that makes Code Mode practical at scale. The company is opening its platform to the growing MCP ecosystem, allowing developers to deploy MCP servers backed by Dynamic Workers with minimal configuration.
The broader implication is that the definition of “AI infrastructure” is expanding. It’s no longer just about GPUs and model weights—it’s about the execution layer that lets agents actually accomplish tasks. Cloudflare is betting that this execution layer will be as strategically important as the models themselves.
For developers building AI agents today, Dynamic Workers offers a compelling new option: faster execution, lower costs, and the security guarantees that come from running on Cloudflare’s globally distributed network.