Edge deployment enables you to run your Nuxt application on CDN edge networks worldwide, providing the lowest possible latency for your users.

What is Edge Deployment?

Edge computing runs your application code in data centers distributed globally, close to your users. This provides:

  • Low latency: serve requests from the nearest location to your users
  • Global scale: automatic worldwide distribution
  • High availability: built-in redundancy and failover
  • Cost efficiency: pay only for actual usage
Nuxt’s server engine, Nitro, makes edge deployment possible with minimal configuration. Nitro can deploy to more than 15 different edge and serverless platforms.

Cloudflare Workers

Deploy to Cloudflare’s global edge network with 250+ data centers worldwide.
1. Configure the preset

Set a Cloudflare preset in your nuxt.config.ts:
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'cloudflare-pages',
    // or 'cloudflare' for Cloudflare Workers
  },
})
2. Build for Cloudflare

npm run build
Or use the environment variable:
NITRO_PRESET=cloudflare-pages npm run build
3. Deploy

Using Wrangler CLI:
npx wrangler pages deploy .output/public
Or connect your Git repository in the Cloudflare Pages dashboard for automatic deployments.

Cloudflare Configuration

nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'cloudflare-pages',
    cloudflare: {
      pages: {
        routes: {
          exclude: ['/static/*'],
        },
      },
    },
  },
})

Vercel Edge Functions

Deploy to Vercel’s Edge Network for serverless computing at the edge.
1. Install Vercel CLI

npm i -g vercel
2. Configure for Vercel

Vercel auto-detects Nuxt. Optionally, configure edge functions:
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'vercel-edge',
  },
})
3. Deploy

vercel deploy --prod
Or connect your Git repository for automatic deployments.

Vercel Route Rules

Use route rules to control how individual routes are rendered and cached:
nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    '/api/**': { isr: false }, // Disable ISR: render on every request
    '/blog/**': { isr: true }, // Cache blog pages with ISR
  },
})

Netlify Edge Functions

Deploy to Netlify’s globally distributed edge network powered by Deno.
1. Configure the preset

nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'netlify-edge',
  },
})
2. Create netlify.toml

netlify.toml
[build]
  command = "npm run build"
  publish = ".output/public"

[[edge_functions]]
  function = "server"
  path = "/*"
3. Deploy

netlify deploy --prod
Or connect your Git repository in the Netlify dashboard.

Deno Deploy

Run your Nuxt app on Deno’s fast, globally distributed edge runtime.
1. Configure for Deno

nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'deno-deploy',
  },
})
2. Build

npm run build
3. Deploy with deployctl

deployctl deploy --project=my-nuxt-app .output/server/index.ts

Serverless Platforms

AWS Lambda

Deploy to AWS Lambda for serverless Node.js execution.
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'aws-lambda',
  },
})
# Build
NITRO_PRESET=aws-lambda npm run build

# The output will be in .output/server/
# Upload to AWS Lambda

Azure Functions

Deploy to Microsoft Azure Functions.
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'azure-functions',
  },
})
Build and deploy:
NITRO_PRESET=azure-functions npm run build
func azure functionapp publish <APP_NAME>

Google Cloud Functions

Deploy to Google Cloud Platform’s serverless functions.
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'gcp-functions',
  },
})

Hybrid Rendering at the Edge

Combine different rendering strategies for optimal performance:
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    preset: 'cloudflare-pages',
  },
  
  routeRules: {
    // Homepage: Pre-rendered at build time
    '/': { prerender: true },
    
    // Blog: ISR with 1 hour revalidation
    '/blog/**': { isr: 3600 },
    
    // API: Always fresh from edge
    '/api/**': { cors: true, cache: false },
    
    // Product pages: SWR with 10 min cache
    '/products/**': { swr: 600 },
    
    // Static assets: Cache for 1 year
    '/_nuxt/**': { headers: { 'cache-control': 'max-age=31536000' } },
  },
})

Rendering Strategy Options

prerender
boolean
Pre-render at build time, serve as static file
isr
number | boolean
Incremental Static Regeneration - regenerate in background after TTL
swr
number | boolean
Stale-While-Revalidate - serve cached, update in background
ssr
boolean
Server-side render on every request (edge runtime)

Edge Runtime Limitations

Be aware of edge runtime constraints:

Limited Node.js APIs
Edge runtimes don’t support all Node.js APIs. Use web standards:
  • Use fetch() instead of Node’s http
  • Use crypto.subtle instead of Node’s crypto
  • Avoid file system operations

CPU Time Limits
Each platform caps execution time:
  • Cloudflare Workers: 50ms CPU time (paid plans get more)
  • Vercel Edge: 30 seconds max
  • Netlify Edge: 50ms CPU time
Design for quick responses and use async operations.

Memory Limits
Edge functions typically have a 128MB memory limit. Optimize:
  • Minimize dependencies
  • Use code splitting
  • Avoid large data processing

Cold Starts
First request may be slower. Minimize impact:
  • Keep bundles small
  • Use tree-shaking
  • Lazy load when possible

Environment Variables

Configure environment variables for edge platforms:
# Using Wrangler
wrangler secret put NUXT_API_SECRET
Or in wrangler.toml:
[vars]
NUXT_PUBLIC_API_BASE = "https://api.example.com"
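Variables prefixed with NUXT_ can be read through Nuxt’s runtime config. A minimal sketch (the key names mirror the examples above; the route path is illustrative):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  runtimeConfig: {
    apiSecret: '', // overridden by NUXT_API_SECRET at runtime (server-only)
    public: {
      apiBase: '', // overridden by NUXT_PUBLIC_API_BASE (client and server)
    },
  },
})
```

```typescript
// server/api/proxy.get.ts -- illustrative route
export default defineEventHandler((event) => {
  const config = useRuntimeConfig(event)
  // config.apiSecret never leaves the server
  return $fetch(`${config.public.apiBase}/data`, {
    headers: { authorization: `Bearer ${config.apiSecret}` },
  })
})
```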

Database Connections

For edge deployments, use connection poolers or edge-compatible databases:

Edge-Compatible Databases

  • Cloudflare D1: SQLite on Cloudflare’s edge
  • Turso: edge-replicated SQLite database
  • PlanetScale: serverless MySQL with HTTP API
  • Upstash Redis: Redis with HTTP API for edge

Example: Cloudflare D1

server/api/users.get.ts
export default defineEventHandler(async (event) => {
  const db = event.context.cloudflare.env.DB
  const { results } = await db.prepare('SELECT * FROM users').all()
  return results
})
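The DB binding used above must be declared in your Cloudflare configuration. A minimal sketch in wrangler.toml (the database name and id are placeholders for your own values):

```toml
# wrangler.toml
[[d1_databases]]
binding = "DB"                 # exposed as env.DB in the worker
database_name = "my-database"  # placeholder
database_id = "<your-database-id>"
```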

Example: Upstash Redis

server/api/cache.ts
import { Redis } from '@upstash/redis'

export default defineEventHandler(async (event) => {
  // Upstash is accessed over plain HTTP, so it works in edge runtimes
  const redis = new Redis({
    url: process.env.UPSTASH_REDIS_URL,
    token: process.env.UPSTASH_REDIS_TOKEN,
  })

  const cached = await redis.get('key')
  if (cached) return cached

  // fetchData() is a placeholder for your own data-loading logic
  const data = await fetchData()
  await redis.set('key', data, { ex: 3600 })
  return data
})

Performance Monitoring

Monitor your edge deployments:
For example, built-in analytics in the Cloudflare dashboard include:
  • Request volume
  • Response times
  • Error rates
  • Geographic distribution

Best Practices

Keep edge bundles small for faster cold starts:
nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    minify: true,
    sourceMap: false,
  },
})
Choose packages that work in edge runtimes:
  • Avoid Node.js-specific dependencies
  • Use web standard APIs
  • Check package compatibility
Cache API responses at the edge with route rules:
nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    '/api/**': { 
      cache: { 
        maxAge: 60,
        staleMaxAge: 300,
      },
    },
  },
})
Test your edge build locally before deploying:
# Test with Miniflare for Cloudflare Workers
npm install -D miniflare
npx miniflare .output/server/index.mjs

Troubleshooting

A module works locally but fails at the edge
The module likely uses Node.js APIs. Solutions:
  • Find an edge-compatible alternative
  • Use dynamic imports with fallbacks
  • Move functionality to API routes with Node.js runtime

Requests time out
Edge functions have time limits:
  • Optimize slow operations
  • Use background tasks where available
  • Consider moving to Node.js serverless for longer tasks

Environment variables are undefined
Ensure environment variables are properly configured:
  • Use process.env or useRuntimeConfig()
  • Set variables in platform dashboard
  • Prefix with NUXT_ for runtime config

Next Steps

  • Node.js Deployment: deploy to traditional Node.js servers
  • Static Hosting: pre-render and deploy to static hosts
  • Deployment Overview: explore all deployment options
  • Prerendering: learn about prerendering strategies