Gerson

Why Fluid Compute Replaced Edge Functions

Vercel Edge Functions are no longer recommended. Fluid Compute runs in the same regions, at the same price, and removes the Web API restrictions that forced awkward workarounds. Here's what actually changed and how to migrate.

Vercel quietly sunset the recommendation to use Edge Functions over the course of 2025. If you started a Next.js project in the last year and reached for export const runtime = 'edge' out of habit, you are probably now paying a compatibility tax you do not need to pay. Fluid Compute — Vercel's default serverless runtime — now runs in the same regions, costs the same, and lets you use the full Node.js standard library.

This article explains what actually changed, why Edge is no longer the right default, and how to migrate existing Edge code without changing user-facing behavior.

The Old Trade-off

The historical argument for Edge Functions was geography and cold starts. Edge ran closer to users and started faster than a cold Node.js Lambda. The cost was a restricted runtime: no node:fs, no Node crypto module, no Node streams, and every npm package had to be Web-API-compatible or bundled with polyfills.

In practice this shifted pain around rather than removing it. You got snappy TTFB, but you also spent afternoons figuring out why jsonwebtoken did not work at the edge or why your Prisma client suddenly needed a Data Proxy.

What Fluid Compute Is

Fluid Compute is a serverless runtime that:

  • Runs in the same global regions as Edge Functions
  • Supports full Node.js 24 (and Bun, Python, Rust)
  • Reuses function instances across concurrent requests, so cold starts are much rarer
  • Supports graceful shutdown (SIGTERM handlers) and request cancellation via AbortSignal
  • Defaults to a 300-second execution timeout on all plans

Under the hood, Middleware and Routing Middleware now run on Fluid Compute too. The Edge runtime is still available for compatibility, but new code should default to Fluid Compute — which is also what you get with zero config on App Router routes.
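The cancellation and shutdown bullets above are easy to exercise directly. Here is a sketch of a streaming route handler that stops producing work when the client disconnects (via req.signal) and registers a SIGTERM hook for cleanup on instance shutdown; the route path and the cleanup body are placeholders, not prescribed API.

app/api/stream/route.ts

```typescript
// Runs on Fluid Compute with zero config — no runtime export needed.
export async function GET(req: Request) {
  const encoder = new TextEncoder();

  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for (let i = 0; i < 3; i++) {
        // Stop producing as soon as the client disconnects.
        if (req.signal.aborted) break;
        controller.enqueue(encoder.encode(`chunk ${i}\n`));
        await new Promise((resolve) => setTimeout(resolve, 10));
      }
      controller.close();
    },
  });

  return new Response(stream, { headers: { 'content-type': 'text/plain' } });
}

// Graceful shutdown: close pools / flush buffers before the instance exits.
process.on('SIGTERM', () => {
  // e.g. close your database pool here — placeholder for real cleanup
});
```

Neither of these hooks was meaningful on the old Edge runtime, where isolates were torn down per request.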

The Concurrency Win

Traditional serverless gives each request its own isolated instance. Fluid Compute reuses a warm instance to serve many concurrent requests, the way a long-running Node server would. For any workload where cold starts hurt (think: chat streaming, API proxies, auth checks), this is the win.

app/api/hello/route.ts

let callCount = 0;

export async function GET() {
  callCount++;
  return Response.json({
    calls: callCount,
    pid: process.pid,
    uptime: process.uptime(),
  });
}

Hit this a few times on a Fluid Compute deployment and you will see callCount and uptime growing across requests — the instance is being reused. On the old Edge runtime, every request was a fresh isolate and module-level state was effectively reset.

Migrating an Edge Function

The mechanical change is usually one line: delete export const runtime = 'edge'. That is enough for most routes. The harder parts are code paths that were written around Edge constraints.

Before: app/api/pdf/route.ts (edge)

export const runtime = 'edge';

// Had to use a Web-compatible PDF library
import { PDFDocument } from 'pdf-lib';

export async function POST(req: Request) {
  const buf = await req.arrayBuffer();
  const doc = await PDFDocument.load(buf);
  // ... Web API crypto, no node:fs, no Buffer
}

After: app/api/pdf/route.ts (fluid)

import { readFile } from 'node:fs/promises';
import { createHash } from 'node:crypto';
import { PDFDocument } from 'pdf-lib';

export async function POST(req: Request) {
  const buf = Buffer.from(await req.arrayBuffer());
  const doc = await PDFDocument.load(buf);
  const template = await readFile('./templates/cover.pdf');
  const hash = createHash('sha256').update(buf).digest('hex');
  // ... full Node API available
}

Watch out: Module-level state persists across requests in Fluid Compute. Do not cache per-user data in globals — that is a data-leak bug waiting to happen. Keep globals for genuinely shared state (connection pools, config, rate limiters) and put user-scoped data in the request handler.
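A minimal illustration of that rule, where the module-level Map is a stand-in for genuinely shared infrastructure (a pool, a config object) and the x-user-id header is made up for the example:

```typescript
// Shared, request-independent state is safe in module scope on a
// reused instance; anything user-scoped must live inside the handler.
const sharedCache = new Map<string, string>(); // OK: shared infrastructure

// let currentUserId: string; // NOT OK: leaks between concurrent users

export async function GET(req: Request) {
  // User-scoped data is a handler-local — never hoisted to module scope.
  const userId = req.headers.get('x-user-id') ?? 'anonymous';

  // Keyed reads/writes against shared state avoid cross-user leaks.
  const greeting = sharedCache.get(userId) ?? `hello, ${userId}`;
  sharedCache.set(userId, greeting);

  return Response.json({ userId, greeting });
}
```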

Middleware Is Fluid Compute Now

Middleware historically ran on the Edge runtime and suffered the same restrictions. In 2025 it moved to Fluid Compute under the hood, which means middleware can now use Node.js APIs — database clients, full crypto, file system reads — without special config. Same location, same latency; the sandbox just got bigger.

If you were forwarding requests to a separate API route purely because middleware could not do the work itself, that hop is now optional.
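As a sketch of the kind of work middleware can now do in place: HMAC-verifying a signed session cookie with node:crypto, no forwarding hop. The cookie format, the SESSION_SECRET variable, and the signSession helper are illustrative assumptions, not a Vercel or Next.js API; a real middleware would take a NextRequest and return NextResponse.next() to continue.

middleware.ts

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

const SECRET = process.env.SESSION_SECRET ?? 'dev-secret';

// Illustrative cookie format: "<value>.<hex hmac>".
export function signSession(value: string): string {
  const sig = createHmac('sha256', SECRET).update(value).digest('hex');
  return `${value}.${sig}`;
}

export function verifySession(cookie: string | null): boolean {
  if (!cookie) return false;
  const [value, sig] = cookie.split('.');
  if (!value || !sig) return false;
  const expected = createHmac('sha256', SECRET).update(value).digest('hex');
  // Constant-time comparison to avoid timing side channels.
  const a = Buffer.from(sig);
  const b = Buffer.from(expected);
  return a.length === b.length && timingSafeEqual(a, b);
}

export function middleware(req: Request): Response | undefined {
  if (!verifySession(req.headers.get('cookie'))) {
    return new Response('Unauthorized', { status: 401 });
  }
  // Returning nothing lets the request continue to the route.
}
```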

When Edge Still Makes Sense

Fluid Compute is the right default, but there are a couple of cases where Edge is still a fit:

  • Pure transformation at extremely high RPS where startup cost genuinely dominates and you do not need any Node APIs.
  • Existing Edge code that works and is not worth changing — there is no deprecation firefight; Edge keeps working.

For anything else — especially anything that touches a database, a file, or a Node-only package — Fluid Compute is the runtime you want.
