edge · serverless · cloudflare · architecture

Edge runtimes: fast everywhere, limited somewhere

February 25, 2026

The pitch

Run your code in 300 data centers worldwide. Every request hits the nearest edge node. Latency drops from 200ms to 20ms. Cloudflare Workers, Vercel Edge Functions, Deno Deploy: they all promise the same thing. And for the right workload, they deliver.

Where edge shines

Authentication checks, A/B test routing, geolocation-based content, request transformation, and lightweight API responses. These are the sweet spot: small functions that run fast, don't need heavy dependencies, and benefit from being close to the user. A middleware that checks a JWT and redirects unauthenticated users is perfect for the edge.
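As a sketch of that middleware, using only the Web APIs (Request, Response, atob) that Workers-style runtimes expose. The /login path is hypothetical, and for brevity the sketch checks only the token's expiry claim; a real handler must also verify the signature (and handle base64url encoding), which is omitted here.

```typescript
// Illustrative edge auth middleware. Checks the JWT's exp claim only;
// signature verification (required in production) is deliberately omitted.
function isTokenFresh(token: string): boolean {
  try {
    // Real JWTs use base64url; plain atob is enough for this sketch.
    const payload = JSON.parse(atob(token.split(".")[1]));
    return typeof payload.exp === "number" && payload.exp * 1000 > Date.now();
  } catch {
    return false; // missing or malformed token
  }
}

function handleRequest(request: Request): Response {
  const auth = request.headers.get("authorization") ?? "";
  const token = auth.replace(/^Bearer /, "");
  if (!isTokenFresh(token)) {
    // Redirect unauthenticated users; /login is a hypothetical path.
    return Response.redirect(new URL("/login", request.url).toString(), 302);
  }
  return new Response("ok"); // in practice, fall through to the real handler
}
```

Because this touches no Node built-ins, the same function runs unchanged on Workers, Vercel Edge, or Deno Deploy.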

Where it gets painful

The edge runtime is not Node.js. You get a subset of Web APIs. No fs, no native modules, no child_process. Many npm packages assume Node.js and break silently or loudly. ORMs that rely on native database drivers won't work. If your function needs to talk to a database, that database is still in one region, and your edge function is round-tripping across the world to reach it.

Cold starts are better than in traditional serverless but still exist. For Cloudflare Workers, which run V8 isolates rather than full containers, they're effectively negligible. For heavier runtimes with more initialization logic, you'll notice them.

The architecture that works

Use edge for the things that are genuinely latency-sensitive and stateless. Keep your data-heavy logic in a regional serverless function or a long-running server close to your database. The pattern is edge for routing and personalization, regional for data access.
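A sketch of that split. The origin URL and paths are hypothetical, request-body forwarding is omitted for brevity, and the cf-ipcountry header is Cloudflare-specific (other platforms expose geo data differently):

```typescript
// Hypothetical regional origin sitting next to the database.
const REGIONAL_ORIGIN = "https://api-us-east.example.com";

async function route(request: Request): Promise<Response> {
  const url = new URL(request.url);

  if (url.pathname.startsWith("/api/")) {
    // Data-heavy path: forward to the regional deployment. The edge adds
    // little here because the database round-trip dominates the latency.
    return fetch(REGIONAL_ORIGIN + url.pathname + url.search, {
      method: request.method,
      headers: request.headers,
    });
  }

  // Latency-sensitive, stateless path: answer at the edge.
  const country = request.headers.get("cf-ipcountry") ?? "unknown";
  return new Response(JSON.stringify({ country }), {
    headers: { "content-type": "application/json" },
  });
}
```

The useful property is that the boundary is a URL prefix: moving a path between edge and regional is a routing change, not a rewrite.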

The honest take

Edge is an optimization, not an architecture. Start with a regional deployment. Measure your latency. Move specific paths to the edge when the numbers justify the constraints. Most apps don't need every endpoint at the edge, and pretending otherwise leads to fighting the runtime instead of building features.
