Edge Computing 2026: Running Code at the CDN — Cloudflare Workers, Fastly Compute, and Lambda@Edge




Edge computing has matured from a niche optimization technique into a mainstream architectural pattern. In 2026, major applications run significant portions of their logic at the edge — not in a single data center, but across 300+ PoPs (Points of Presence) worldwide. This guide covers what’s actually worth running at the edge, which platform to choose, and how to structure edge-first architectures.


What Is Edge Computing (The Real Version)

Edge computing isn’t just CDN caching. It’s executing application logic in the same infrastructure that serves your static assets — within milliseconds of your users, everywhere.

Traditional request flow:

User (Seoul) → CDN → Origin server (us-east-1) → Database → Response
Latency: 80-200ms just for the network round-trip

Edge request flow:

User (Seoul) → Edge function (Seoul PoP) → Response (or edge cache)
Latency: 5-20ms

The catch: edge functions have constraints. They run in V8 isolates (not full Node.js), have limited execution time, and can’t maintain persistent connections to traditional databases. Understanding these constraints is key to knowing when edge is the right choice.


The Edge Platform Landscape 2026

Cloudflare Workers

The market leader with the largest PoP network (300+ locations) and most mature ecosystem.

Key features:

  • V8 isolates (no container cold starts)
  • 0ms cold start time
  • 128MB memory, 10ms CPU time (free) / 30ms (paid)
  • KV store, Durable Objects, R2, D1 (SQLite at edge), Queues
  • Workers AI: Run LLM inference at edge

Pricing (2026):

  • Free: 100k requests/day, 10ms CPU
  • Workers Paid ($5/mo): 10M requests, 30ms CPU, unlimited KV reads
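For orientation, the smallest possible Worker is just an object with a fetch handler. A minimal sketch (in a real project this object is the module's default export and is deployed with wrangler):

```javascript
// Minimal Cloudflare Worker: an object exposing an async fetch handler.
// In a real project you'd write `export default worker;` and deploy
// with `wrangler deploy`.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    return new Response(`Hello from the edge, path: ${url.pathname}`, {
      status: 200,
      headers: { 'Content-Type': 'text/plain' },
    });
  },
};
```

Everything here is Web-standard (`Request`, `Response`, `URL`), which is why the same handler shape also runs on Vercel and Deno Deploy.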

Fastly Compute

Focuses on Rust/WASM rather than JavaScript. Ahead-of-time-compiled WASM gives near-instant cold starts, and the platform is a better fit for compute-intensive tasks.

Key features:

  • WASM-first (Rust SDK, AssemblyScript, Go)
  • 50ms execution limit
  • KV store (Config Store), Object Store

Best for: Teams with Rust expertise, compute-heavy edge tasks

Lambda@Edge / CloudFront Functions

AWS’s edge compute, tightly integrated with CloudFront CDN.

  • CloudFront Functions: JavaScript, <1ms execution, viewer-facing only
  • Lambda@Edge: Node.js/Python, up to 30s, but ~1s cold starts

Best for: AWS-native teams who already use CloudFront
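For comparison, a CloudFront Function is a single `handler` operating on a plain event object in a restricted, roughly ES5-level runtime (no async/await, no fetch). A sketch of a viewer-response function that adds an HSTS header:

```javascript
// CloudFront Function (viewer-response event): mutate the event's
// response object and return it. Header keys are lowercase and each
// value is wrapped in a { value: ... } object.
function handler(event) {
  var response = event.response;
  response.headers['strict-transport-security'] = {
    value: 'max-age=63072000; includeSubDomains',
  };
  return response;
}
```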

Vercel Edge Functions / Deno Deploy

Developer-experience-focused platforms. Vercel's Edge Functions are built on Web-standard APIs (fetch, Request, Response) and integrate seamlessly with Next.js; Deno Deploy pairs the same standards with TypeScript-first tooling.


What to Run at the Edge (and What Not To)

✅ Perfect for Edge

1. Authentication & Authorization

// Cloudflare Workers: JWT validation at the edge
// auth-middleware.js
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    
    // Skip auth for public routes
    if (url.pathname.startsWith('/public')) {
      return fetch(request);
    }
    
    const authHeader = request.headers.get('Authorization');
    if (!authHeader?.startsWith('Bearer ')) {
      return new Response('Unauthorized', { status: 401 });
    }
    
    const token = authHeader.slice(7);
    
    try {
      // Verify JWT using Web Crypto API (available at edge)
      const payload = await verifyJWT(token, env.JWT_SECRET);
      
      // Add user info to headers for origin server
      const modifiedRequest = new Request(request, {
        headers: {
          ...Object.fromEntries(request.headers),
          'X-User-Id': payload.sub,
          'X-User-Role': payload.role,
        },
      });
      
      return fetch(modifiedRequest);
    } catch {
      return new Response('Invalid token', { status: 401 });
    }
  },
};

async function verifyJWT(token, secret) {
  const [header, payload, signature] = token.split('.');
  if (!header || !payload || !signature) throw new Error('Malformed token');
  
  const key = await crypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(secret),
    { name: 'HMAC', hash: 'SHA-256' },
    false,
    ['verify']
  );
  
  const valid = await crypto.subtle.verify(
    'HMAC',
    key,
    base64UrlDecode(signature),
    new TextEncoder().encode(`${header}.${payload}`)
  );
  
  if (!valid) throw new Error('Invalid signature');
  
  const claims = JSON.parse(atob(payload.replace(/-/g, '+').replace(/_/g, '/')));
  if (claims.exp && claims.exp < Date.now() / 1000) throw new Error('Token expired');
  return claims;
}

function base64UrlDecode(input) {
  const binary = atob(input.replace(/-/g, '+').replace(/_/g, '/'));
  return Uint8Array.from(binary, (c) => c.charCodeAt(0));
}

2. A/B Testing & Feature Flags

// Edge-side A/B testing without origin involvement
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    
    if (url.pathname === '/') {
      // Check for existing assignment
      const cookie = request.headers.get('Cookie') || '';
      const existingVariant = parseCookie(cookie, 'ab_variant');
      
      const variant = existingVariant || (Math.random() < 0.5 ? 'A' : 'B');
      
      // Route to different origins based on variant
      const targetUrl = variant === 'A' 
        ? `https://origin-a.example.com${url.pathname}`
        : `https://origin-b.example.com${url.pathname}`;
      
      const response = await fetch(targetUrl, request);
      const newResponse = new Response(response.body, response);
      
      if (!existingVariant) {
        newResponse.headers.append(
          'Set-Cookie',
          `ab_variant=${variant}; Path=/; Max-Age=86400; SameSite=Lax`
        );
      }
      
      // Track in Cloudflare Analytics Engine
      env.ANALYTICS.writeDataPoint({
        indexes: [variant],
        doubles: [1],
        blobs: [url.pathname],
      });
      
      return newResponse;
    }
    
    return fetch(request);
  },
};

function parseCookie(cookieHeader, name) {
  const match = cookieHeader.match(new RegExp(`(?:^|;\\s*)${name}=([^;]*)`));
  return match ? match[1] : null;
}
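One caveat: `Math.random()` re-rolls the variant for clients that drop cookies. Hashing a stable identifier (user id, device id) gives a sticky assignment instead. A sketch with a hypothetical `assignVariant` helper:

```javascript
// Deterministic A/B assignment: hash a stable identifier (FNV-1a, 32-bit)
// and bucket by parity, so the same user always lands in the same variant.
function assignVariant(identifier) {
  let hash = 0x811c9dc5; // FNV-1a offset basis
  for (let i = 0; i < identifier.length; i++) {
    hash ^= identifier.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193); // FNV prime
  }
  return (hash >>> 0) % 2 === 0 ? 'A' : 'B';
}
```

This also makes experiments reproducible: the assignment can be recomputed server-side for analytics without ever reading the cookie.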

3. Geolocation-Based Routing

export default {
  async fetch(request) {
    const country = request.cf?.country || 'US';
    const region = request.cf?.region;
    
    // GDPR: European users get EU data residency endpoint
    const euCountries = new Set([
      'DE', 'FR', 'IT', 'ES', 'NL', 'BE', 'PL', 'SE', 'NO', 'DK',
      'FI', 'AT', 'CH', 'PT', 'IE', 'CZ', 'HU', 'RO', 'BG', 'HR',
    ]);
    
    const originBase = euCountries.has(country)
      ? 'https://api.eu-west.example.com'
      : 'https://api.us-east.example.com';
    
    const url = new URL(request.url);
    const targetUrl = `${originBase}${url.pathname}${url.search}`;
    
    const response = await fetch(targetUrl, request);
    // Object spread doesn't copy a Response's status or headers;
    // rebuild via the Response constructor, which accepts the original as init
    const tagged = new Response(response.body, response);
    tagged.headers.set('X-Served-Region', euCountries.has(country) ? 'EU' : 'US');
    return tagged;
  },
};

4. Request/Response Transformation

// Dynamic image optimization at the edge
export default {
  async fetch(request) {
    const url = new URL(request.url);
    
    if (!url.pathname.match(/\.(jpg|jpeg|png|webp)$/i)) {
      return fetch(request);
    }
    
    const accept = request.headers.get('Accept') || '';
    const supportsWebP = accept.includes('image/webp');
    const supportsAVIF = accept.includes('image/avif');
    
    // Use Cloudflare's image resizing
    const imageRequest = new Request(request.url, {
      cf: {
        image: {
          width: parseInt(url.searchParams.get('w') || '1200'),
          quality: parseInt(url.searchParams.get('q') || '80'),
          format: supportsAVIF ? 'avif' : supportsWebP ? 'webp' : 'jpeg',
        },
      },
    });
    
    return fetch(imageRequest);
  },
};
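The format negotiation above is easy to isolate into a pure function that can be unit-tested off the edge (a small sketch; `pickImageFormat` is a hypothetical helper):

```javascript
// Choose the best image format a client advertises via its Accept header,
// preferring AVIF over WebP, with JPEG as the universal fallback.
function pickImageFormat(acceptHeader) {
  const accept = acceptHeader || '';
  if (accept.includes('image/avif')) return 'avif';
  if (accept.includes('image/webp')) return 'webp';
  return 'jpeg';
}
```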

❌ Not Suitable for Edge

  • Complex business logic that requires database reads (latency kills the benefit)
  • Long-running tasks (>50ms CPU)
  • Large memory requirements (>128MB)
  • Native modules (no Node.js native bindings)
  • Stateful connections (WebSockets need special handling)

Cloudflare Durable Objects: State at the Edge

Durable Objects solve the biggest edge limitation: state. Each Durable Object is a single-threaded JavaScript instance with strongly consistent, transactional storage, and every request for a given object, from anywhere in the world, is routed to that one instance.

// Real-time collaborative document — state lives at the edge
export class DocumentRoom {
  constructor(state, env) {
    this.state = state;
    this.sessions = new Map(); // WebSocket connections
  }

  async fetch(request) {
    if (request.headers.get('Upgrade') === 'websocket') {
      return this.handleWebSocket(request);
    }
    
    if (request.method === 'GET') {
      const content = (await this.state.storage.get('content')) || '';
      return new Response(JSON.stringify({ content }), {
        headers: { 'Content-Type': 'application/json' },
      });
    }
    
    return new Response('Method not allowed', { status: 405 });
  }

  async handleWebSocket(request) {
    const [client, server] = Object.values(new WebSocketPair());
    
    server.accept();
    const sessionId = crypto.randomUUID();
    this.sessions.set(sessionId, server);
    
    server.addEventListener('message', async (event) => {
      const { type, content } = JSON.parse(event.data);
      
      if (type === 'update') {
        await this.state.storage.put('content', content);
        
        // Broadcast to all connected sessions
        for (const [id, session] of this.sessions) {
          if (id !== sessionId) {
            try {
              session.send(JSON.stringify({ type: 'update', content, from: sessionId }));
            } catch {
              this.sessions.delete(id);
            }
          }
        }
      }
    });
    
    server.addEventListener('close', () => {
      this.sessions.delete(sessionId);
    });
    
    return new Response(null, { status: 101, webSocket: client });
  }
}
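The class above still needs a Worker in front of it to route each document's traffic to its one instance. A sketch, assuming a `ROOMS` Durable Object binding configured in wrangler.toml:

```javascript
// Front-door Worker: map /doc/<id> to one DocumentRoom instance worldwide.
// ROOMS is an assumed binding name ([[durable_objects.bindings]] with
// class_name = "DocumentRoom" in wrangler.toml).
const router = {
  async fetch(request, env) {
    const url = new URL(request.url);
    const match = url.pathname.match(/^\/doc\/([\w-]+)/);
    if (!match) return new Response('Not found', { status: 404 });
    
    // idFromName returns the same id for the same name everywhere,
    // so all clients of one document reach the same instance.
    const id = env.ROOMS.idFromName(match[1]);
    return env.ROOMS.get(id).fetch(request);
  },
};
```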

Edge + Origin Architecture Patterns

Pattern 1: Edge Cache with Stale-While-Revalidate

export default {
  async fetch(request, env, ctx) {
    const cacheKey = new Request(request.url, request);
    const cache = caches.default;
    
    let response = await cache.match(cacheKey);
    
    if (response) {
      // Check if we should revalidate in background
      const age = parseInt(response.headers.get('Age') || '0');
      const maxAge = 60; // Serve cached version for 60s
      const staleWhileRevalidate = 300; // Revalidate in background for 5min
      
      if (age > maxAge && age < maxAge + staleWhileRevalidate) {
        // Serve stale, revalidate in background
        ctx.waitUntil(env.REVALIDATION_QUEUE.send({ url: request.url }));
      }
      
      return response;
    }
    
    // Cache miss: fetch from origin
    response = await fetch(request);
    
    if (response.ok) {
      const responseToCache = new Response(response.body, response);
      responseToCache.headers.set('Cache-Control', 'public, max-age=60, stale-while-revalidate=300');
      
      // ctx.waitUntil (module syntax) replaces event.waitUntil from
      // the older service-worker syntax
      ctx.waitUntil(cache.put(cacheKey, responseToCache.clone()));
      return responseToCache;
    }
    
    return response;
  },
};
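The freshness decision in that handler is worth factoring into a pure function so it can be tested without a cache (hypothetical helper, same thresholds):

```javascript
// Classify a cached response by its Age header relative to max-age and
// a stale-while-revalidate window: fresh, serve-stale-and-revalidate,
// or fully expired.
function freshness(ageSeconds, maxAge, staleWhileRevalidate) {
  if (ageSeconds <= maxAge) return 'fresh';
  if (ageSeconds < maxAge + staleWhileRevalidate) return 'stale-revalidate';
  return 'expired';
}
```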

Pattern 2: Edge BFF (Backend for Frontend)

// Aggregate multiple API calls at the edge
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    
    if (url.pathname === '/api/dashboard') {
      const authHeader = request.headers.get('Authorization');
      const baseHeaders = { Authorization: authHeader };
      
      // Parallel fetch from multiple microservices
      const [user, orders, notifications] = await Promise.all([
        fetch(`https://user-service.internal/me`, { headers: baseHeaders })
          .then(r => r.json()),
        fetch(`https://order-service.internal/recent`, { headers: baseHeaders })
          .then(r => r.json()),
        fetch(`https://notification-service.internal/unread`, { headers: baseHeaders })
          .then(r => r.json()),
      ]);
      
      return new Response(JSON.stringify({
        user,
        recentOrders: orders.slice(0, 5),
        unreadCount: notifications.count,
      }), {
        headers: { 'Content-Type': 'application/json' },
      });
    }
    
    return fetch(request);
  },
};
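One design note: `Promise.all` fails the whole dashboard if any one service is down. `Promise.allSettled` with per-service fallbacks degrades gracefully instead. A sketch (`aggregateSettled` and the fallback values are hypothetical):

```javascript
// Aggregate several upstream calls but tolerate individual failures:
// each rejected result is replaced by its declared fallback instead of
// rejecting the whole aggregation.
async function aggregateSettled(tasks) {
  const settled = await Promise.allSettled(tasks.map((t) => t.promise));
  const result = {};
  settled.forEach((outcome, i) => {
    result[tasks[i].name] =
      outcome.status === 'fulfilled' ? outcome.value : tasks[i].fallback;
  });
  return result;
}
```

In the dashboard handler above, each service fetch would become a task with a sensible fallback (empty orders list, zero unread count), so one slow microservice no longer takes down the whole response.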

Performance Benchmarks 2026

Real-world latency measurements from a global SaaS application:

Architecture              P50      P99      Cost/1M req
Single-region origin      145ms    380ms    $0.80
Multi-region + routing    52ms     190ms    $2.40
Edge-first (Workers)      18ms     65ms     $0.50
Edge + Durable Objects    22ms     80ms     $0.75

Key insight: Edge is both faster AND cheaper for the right workloads, because you’re running in CDN infrastructure already deployed globally.


When to Go Edge-First

Greenfield application? Design edge-first:

  • Auth, routing, personalization → Edge functions
  • Business logic → Origin (traditional server/Lambda)
  • Data → Cloudflare D1, R2, or connect to your DB from origin
  • AI inference → Cloudflare Workers AI (Llama, Whisper at edge)

Existing application? Add edge incrementally:

  1. Start with auth/JWT validation at edge (biggest latency win)
  2. Add A/B testing and feature flags
  3. Add response caching with smart invalidation
  4. Gradually move more logic edge-ward

Conclusion

Edge computing in 2026 is no longer experimental; it's production-proven at massive scale, and Cloudflare's network alone fronts a substantial share of all web traffic. The question isn't “should I use edge?” but “which parts of my application benefit from edge execution?”

The pattern that works: edge for latency-sensitive, stateless logic; origin for complex, stateful business logic. Use Durable Objects when you need state with edge latency.

The future is clear: as edge runtimes gain more capabilities (better SQLite support, AI inference, more memory), the “move to origin for state” rule will weaken. By 2027, most application logic will run within 20ms of every user on the planet.

Are you running production workloads on edge functions? What patterns have worked for you?

If this post helped you, a like (and an ad click) would be appreciated :)