ADR-007: Edge Computing for Badge Serving

Status: PROPOSED
Date: 2025-08-25
Author: Architecture Review Team

Context

Badge serving has a critical performance requirement: <200ms at p95 globally. Badges are embedded in READMEs worldwide and slow badges break developer trust immediately.

Current plan: serve badges from the application server with CDN caching. However, this still requires origin requests for cache misses and does not handle dynamic badge generation at the edge.

Decision

IMPLEMENT EDGE COMPUTING for badge generation using Cloudflare Workers.

Architecture:

// Cloudflare Worker at edge
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { owner, repo, type } = parseRequest(request)

    // 1. Check the edge cache first
    const cached = await env.CACHE.get(`badge:${owner}:${repo}:${type}`)
    if (cached) {
      return new Response(cached, {
        headers: {
          'Content-Type': 'image/svg+xml',
          'Cache-Control': 'public, max-age=3600',
          'X-Cache': 'HIT-EDGE'
        }
      })
    }

    // 2. Cache miss: render the badge at the edge from synced analysis data
    const analysis = await env.KV.get(`analysis:${owner}:${repo}`)
    if (analysis) {
      const svg = generateBadgeSVG(JSON.parse(analysis), type)

      // Cache the rendered SVG at the edge for subsequent requests
      await env.CACHE.put(`badge:${owner}:${repo}:${type}`, svg, {
        expirationTtl: 3600
      })

      return new Response(svg, {
        headers: {
          'Content-Type': 'image/svg+xml',
          'Cache-Control': 'public, max-age=3600',
          'X-Cache': 'MISS-EDGE-GENERATED'
        }
      })
    }

    // 3. Last resort: proxy the request to the origin server
    return fetch(`${env.ORIGIN}/badge/${owner}/${repo}/${type}.svg`)
  }
}
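
The generateBadgeSVG helper referenced above is not shown in this ADR; a minimal sketch of what it could look like follows. The AnalysisData shape, labels, and color thresholds are assumptions for illustration, not the production schema:

```typescript
// Hypothetical shape of the synced analysis record; the real schema
// lives in the origin database.
interface AnalysisData {
  score: number
}

// Render a minimal flat badge as an SVG string. Kept deliberately small:
// pure string templating stays well inside the worker's 50ms CPU budget.
function generateBadgeSVG(analysis: AnalysisData, type: string): string {
  const value = String(analysis.score)
  // Green / yellow / red thresholds, chosen for illustration only.
  const color =
    analysis.score >= 80 ? '#4c1' : analysis.score >= 50 ? '#dfb317' : '#e05d44'
  return [
    '<svg xmlns="http://www.w3.org/2000/svg" width="120" height="20">',
    '<rect width="70" height="20" fill="#555"/>',
    `<rect x="70" width="50" height="20" fill="${color}"/>`,
    `<text x="35" y="14" fill="#fff" font-family="Verdana" font-size="11" text-anchor="middle">${type}</text>`,
    `<text x="95" y="14" fill="#fff" font-family="Verdana" font-size="11" text-anchor="middle">${value}</text>`,
    '</svg>'
  ].join('')
}
```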

Consequences

Positive:

  • <50ms global latency for cached badges
  • <150ms for cache misses with edge generation
  • Cached badges keep serving even when the origin is down
  • Reduced origin load by 95%+
  • Global scalability without managing regional infrastructure

Negative:

  • Vendor lock-in to Cloudflare
  • Additional complexity in deployment
  • Need to sync analysis data to edge KV
  • Limited compute time at edge (50ms CPU time limit)
  • Additional cost (~$5/million requests)
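
The KV sync cost above can be contained with a push-on-write pattern: whenever the origin database stores a new analysis, push the record to edge KV. A minimal sketch follows; the Analysis shape is an assumption, and the writer function is injected (in practice it would wrap Cloudflare's KV REST API or a `wrangler` upload) so the sync logic itself stays provider-agnostic:

```typescript
// Signature for any function that can write a key to edge KV.
type KVWriter = (key: string, value: string, ttlSeconds: number) => Promise<void>

// Hypothetical analysis record pushed from the origin database.
interface Analysis {
  owner: string
  repo: string
  score: number
  updatedAt: string
}

// Push one analysis record to edge KV under the key the worker reads.
async function syncAnalysisToEdge(analysis: Analysis, put: KVWriter): Promise<string> {
  const key = `analysis:${analysis.owner}:${analysis.repo}`
  // Store the full JSON payload; the worker parses it when rendering badges.
  // 24h TTL bounds staleness if a sync is ever missed.
  await put(key, JSON.stringify(analysis), 86400)
  return key
}
```

Calling this from the same transaction (or outbox consumer) that persists the analysis keeps edge data at most one write behind the database.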

Alternatives Considered

  1. Traditional CDN Only

    • Pros: Simple, standard approach
    • Cons: Still requires origin for misses, no compute at edge
  2. Multi-Region Deployment

    • Pros: Full control, no vendor lock-in
    • Cons: Expensive, complex operations, still higher latency
  3. Static Pre-Generation

    • Pros: Ultimate performance
    • Cons: Loses real-time updates, massive storage requirements
  4. Edge Computing (RECOMMENDED)

    • Pros: Best performance, scalability, reliability
    • Cons: Vendor lock-in, edge compute limitations

Risk Assessment

Risks:

  • Vendor Lock-in: Tied to Cloudflare Workers
  • Edge Limitations: 50ms CPU time, 128MB memory limits
  • Data Sync Complexity: Keeping edge KV in sync with database
  • Cost at Scale: Could reach $500/month at 100M requests

Mitigation:

  • Abstract edge logic for portability
  • Optimize SVG generation for edge constraints
  • Implement efficient KV sync strategy
  • Monitor costs and optimize caching
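
The "abstract edge logic for portability" mitigation could look like the following sketch: badge logic depends only on a small cache interface, so leaving Cloudflare means writing one adapter rather than rewriting the worker. The interface and in-memory adapter here are illustrative assumptions:

```typescript
// Provider-agnostic cache interface; the badge logic depends only on this.
interface EdgeCache {
  get(key: string): Promise<string | null>
  put(key: string, value: string, ttlSeconds: number): Promise<void>
}

// In-memory adapter, useful for local tests; a Cloudflare adapter would
// wrap env.CACHE behind the same two methods.
class MemoryCache implements EdgeCache {
  private store = new Map<string, string>()
  async get(key: string): Promise<string | null> {
    return this.store.get(key) ?? null
  }
  async put(key: string, value: string, _ttlSeconds: number): Promise<void> {
    this.store.set(key, value)
  }
}

// Badge lookup written against the interface only: serve from cache,
// otherwise render once and cache the result.
async function getBadge(
  cache: EdgeCache,
  owner: string,
  repo: string,
  type: string,
  render: () => string
): Promise<string> {
  const key = `badge:${owner}:${repo}:${type}`
  const hit = await cache.get(key)
  if (hit) return hit
  const svg = render()
  await cache.put(key, svg, 3600)
  return svg
}
```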

Migration Strategy

Phase 1: Edge Cache (Week 1)

  • Deploy Cloudflare Workers
  • Implement edge caching
  • Monitor performance

Phase 2: Edge Generation (Week 2)

  • Sync analysis data to KV
  • Implement badge generation at edge
  • A/B test performance

Phase 3: Full Edge Computing (Week 3)

  • Move all badge logic to edge
  • Implement fallback strategies
  • Optimize for global performance

Performance Targets

Metrics:
  p50_latency: <30ms
  p95_latency: <100ms
  p99_latency: <200ms
  cache_hit_rate: >95%
  edge_generation_rate: >90%
  origin_fallback_rate: <5%

Conclusion

Edge computing for badge serving is essential to meet the <200ms global performance requirement. While it adds deployment complexity and some vendor lock-in, it provides the performance and reliability that build developer trust. The investment in edge infrastructure pays dividends in user experience and operational efficiency.