
How My Viral URL Lengthener Burned Through 1M Vercel Edge Requests

Written by Namit Jain·August 15, 2025·5 min read

A true story about a silly side project that went viral, torched my free tier, and taught me more about edge traffic than any tutorial ever could.


How It Got Viral

I built a playful URL lengthener: paste a tidy link, get back a hilariously long one filled with faux folders and optional emojis. I shared a couple of examples on X and Reddit. People loved the joke. A subreddit picked it up. Friends forwarded it on WhatsApp. Before I grasped what was happening, the link had bounced around the internet and brought in waves of curious visitors.

That was fun… until I opened the Vercel dashboard and watched Edge Requests climb like a rocket.
I immediately assumed analytics were the culprit and commented out the client-side API calls. But the graph kept climbing and climbing.


What Burned My Vercel Free Tier

The tool itself was simple:

  1. Takes a tidy link
  2. Base64-encodes the destination into ?data=...
  3. Pads the path with a bunch of fake folders (sometimes emojis)
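The generation step can be sketched in a few lines. Everything here is illustrative, not the tool's real code: the folder names, the base URL, and the `data` parameter shape are assumptions.

```javascript
// Sketch of the generation step: Base64-encode the destination, then
// pad the path with nonsense segments. All names are illustrative.
const FAKE_FOLDERS = ["enterprise", "blockchain", "definitely-not-a-virus", "final-v2"];

function lengthenUrl(destination, base = "https://example.com/tools/url-lengthener") {
  // base64url keeps the payload safe inside a query string.
  const data = Buffer.from(destination, "utf8").toString("base64url");
  const padding = FAKE_FOLDERS.join("/");
  return `${base}/${padding}/go?data=${data}`;
}
```

The redirect handler only ever needs the data parameter back; everything before it is decoration.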

There were two modes:

  • Direct 302 → server decodes data and issues a 302 redirect to the destination.
  • Interstitial → a playful client-rendered page stretches out the link for 2–3 seconds before redirecting.

In my head, the flow was clean:

  • Visit generator page → 1 request.
  • Generate link → client-only (no network).
  • Click generated link → 1 redirect or 1 interstitial page, then redirect.
  • API to return total count of URLs lengthened → 1 call.
  • API to log user in DB → 1 call.

That’s 3–5 requests per person. Easy.

I thought the only issues were:

  • “Total count” endpoint hitting the DB on every page load
  • Logging endpoint firing each time someone generated a link
  • Interstitial delay adding some fluff

But I was convinced analytics were the main culprit.
I commented them out. Edge Requests didn’t normalize.


What Is an Edge Request (And How Vercel Counts It)

  • Any HTTP(S) request served through the Vercel Edge Network counts:
    • HTML, JS chunks, CSS, fonts, favicons, redirects, and even errors.
  • CDN cache hits still count. Cached ≠ free.
  • Only traffic served by Vercel counts (external requests don’t).
  • One page view can generate many Edge Requests if it pulls multiple assets.

This mattered because my interstitial wasn’t a tiny HTML stub; it was a full React page with:

  • Global layout
  • Fonts
  • Shared UI

What Was Actually Happening

  • The interstitial page wasn’t one request. It was ~10.

From my Next.js App Router setup, a single interstitial visit triggered:

  1. 1 HTML document for /tools/url-lengthener/[...slug]
  2. 1 global CSS file
  3. 4–6 JS chunks (runtime, main app, layout, page, shared components)
  4. 2 font files (next/font/google Inter + Calistoga)
  5. 1 favicon request

Total: ~9–11 Edge Requests (sometimes 12–13).


  • Each generated link was unique and very long.
    • Unique path segments + query junk → zero cache reuse.
  • Redirect route had a runtime mismatch.
    • Used atob in Node, causing occasional 500s and retries.
  • Math still exploded even without analytics:

If 60% hit interstitial and 40% go direct:

  • Interstitial: ~10 requests/visit
  • Direct: ~1–2 requests/visit
  • Weighted average: ~6–7 requests per click

150k clicks → ~900k–1M+ Edge Requests
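The arithmetic checks out as a quick script. The shares and per-visit request counts are the estimates from the list above, not measured values:

```javascript
// Back-of-the-envelope check of the weighted average above.
const interstitialShare = 0.6;  // 60% of clicks hit the interstitial
const directShare = 0.4;        // 40% go straight to the 302
const interstitialReqs = 10;    // ~10 Edge Requests per interstitial visit
const directReqs = 1.5;         // midpoint of the 1–2 direct-path range

const perClick = interstitialShare * interstitialReqs + directShare * directReqs;
const total = perClick * 150_000;

console.log(perClick.toFixed(1)); // ≈ 6.6 requests per click
console.log(Math.round(total));   // ≈ 990,000 Edge Requests
```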


Extra noise: link unfurlers and crawlers added some load, but the real burn came from:

  • Interstitial fan-out
  • Unique uncacheable URLs

What Burned Through My Vercel Free Tier

  • Interstitial page: 1 click → ~10 requests.
  • URL design: countless unique paths & params → no cache reuse.
  • Redirect handler: runtime-unsafe decoding → retries.
  • Virality: multiplied everything.

The Real Cost Path

  1. Viral spread = lots of clicks on many unique URLs.
  2. Large % hit the interstitial → multiplied requests ×10.
  3. Zero cache reuse → every click a fresh hit.
  4. Runtime mismatch → errors & retries.
  5. Result: 1M+ Edge Requests.

What Could Have Prevented This

1. Default to a short, cacheable redirect

  • Use short paths (/tools/url-lengthener/r?data=...)
  • Add Cache-Control: public, s-maxage=3600 (or longer with 301 when safe)
  • Move “fun” into fragment #... so it’s visible but not network-bound
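A sketch of what that short, cacheable redirect could look like. In a real App Router project this would live in a route file and export GET; it's shown as a plain function here, and the path, header values, and Node-runtime choice (it uses Buffer) are all assumptions:

```javascript
// Sketch of a short, cacheable redirect handler. Uses only the
// web-standard Request/Response API plus Node's Buffer.
function redirectHandler(request) {
  const data = new URL(request.url).searchParams.get("data");
  if (!data) return new Response("Missing data", { status: 400 });

  const destination = Buffer.from(data, "base64url").toString("utf8");
  return new Response(null, {
    status: 302,
    headers: {
      Location: destination,
      // Let the CDN serve repeat clicks on the same link from cache.
      "Cache-Control": "public, s-maxage=3600",
    },
  });
}
```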

2. Turn off interstitial for public links

  • Keep it as a demo-only toggle
  • Or render minimal HTML with meta refresh (no fonts/JS)
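The meta-refresh option can be one self-contained HTML document: one Edge Request instead of ~10. A sketch, with the markup and delay as assumptions (and in real code, HTML-escape the destination before interpolating it):

```javascript
// Build a tiny interstitial page: no fonts, no JS chunks, no layout,
// just a meta refresh. Escape `destination` before using this for real.
function buildInterstitialHtml(destination, delaySeconds = 2) {
  return `<!doctype html>
<meta charset="utf-8">
<meta http-equiv="refresh" content="${delaySeconds};url=${destination}">
<title>Stretching your link…</title>
<p>Redirecting in ${delaySeconds} seconds…</p>`;
}

// Served as:
// new Response(buildInterstitialHtml(dest), {
//   headers: { "Content-Type": "text/html; charset=utf-8" } })
```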

3. Cap URL size and avoid non-ASCII in paths

  • Enforce ≤ 2KB total bytes
  • Reduce segment count
  • Avoid emojis in path (use them in-page or fragment)
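The byte cap is a one-liner, but it matters that you count encoded bytes rather than characters, since an emoji is one character but up to four UTF-8 bytes. A sketch using the 2 KB limit from above:

```javascript
// Guard against oversized links. Count encoded bytes, not characters:
// "🔥" is one character but four UTF-8 bytes.
const MAX_URL_BYTES = 2048; // the ≤ 2 KB cap from above

function withinByteBudget(url) {
  return new TextEncoder().encode(url).length <= MAX_URL_BYTES;
}
```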

4. Fix runtime consistency in redirect

  • Match decoding to runtime (Node/Edge)
  • Declare explicitly to avoid retries
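One way to sketch runtime-consistent decoding (the branching is an assumption; the simpler fix in Next.js is to pin the route with export const runtime = "nodejs"). Buffer exists only in Node, while atob exists on the Edge runtime but expects standard Base64 rather than base64url:

```javascript
// Decode ?data= in a way that works on both runtimes, avoiding the
// 500s-and-retries failure mode from a runtime mismatch.
function decodeData(data) {
  if (typeof Buffer !== "undefined") {
    // Node runtime: Buffer understands base64url directly.
    return Buffer.from(data, "base64url").toString("utf8");
  }
  // Edge runtime: map base64url back to standard Base64 for atob,
  // then decode the resulting bytes as UTF-8.
  const b64 = data.replace(/-/g, "+").replace(/_/g, "/");
  const bytes = Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));
  return new TextDecoder().decode(bytes);
}
```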

5. Smarter analytics (if used again)

  • Cache count results at the edge
  • Batch/dedupe logs client-side
  • Consider sampling during spikes
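Batching and sampling can be combined in a small client-side sketch. The sample rate, flush interval, and endpoint are all assumptions:

```javascript
// Client-side log batching with sampling: one network call per flush
// instead of one per event, and during spikes only a sample is kept.
const SAMPLE_RATE = 0.1; // keep ~10% of events under load

let queue = [];

function logEvent(event, rate = SAMPLE_RATE) {
  if (Math.random() >= rate) return; // drop the rest
  queue.push(event);
}

function flush(send) {
  if (queue.length === 0) return; // never send empty batches
  const batch = queue;
  queue = [];
  send(batch); // e.g. fetch("/api/log", { method: "POST", body: JSON.stringify(batch) })
}

// e.g. setInterval(() => flush(sendBatch), 5000)
```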

The Lesson My Viral Tool Taught Me

  • Small multipliers become huge at scale.
  • Unique URLs kill caching, and caching is cost control.
  • Runtime details matter; a mismatch can trigger retries.
  • Analytics weren’t the villain; page design and URL strategy were.

Try the URL Lengthener

🔗 Launch the tool