Build a Distributed Rate Limiter with Redis, Express and TypeScript: Complete Implementation Guide

I recently worked on a project where our API started getting hammered by unexpected traffic spikes. We needed a way to protect our services without slowing down legitimate users. That’s when I decided to build a robust rate limiter using Redis, Express, and TypeScript. If you’ve ever faced similar challenges, you’ll find this guide practical and insightful.

Have you considered what happens when your API suddenly gets flooded with requests? Rate limiting acts as a traffic cop for your application. It ensures fair usage and prevents system overload. In distributed environments, this becomes trickier because multiple servers must coordinate their limits.

Let me show you how to set things up. First, install the necessary packages: run `npm install express redis ioredis` and `npm install -D @types/express typescript ts-node`. This gives you the core tools. Now, create a basic project structure with folders for algorithms, storage, middleware, and services.
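
The folder layout described above can be created in one step (the exact folder names under `src/` are my assumption; adapt them to your own conventions):

```shell
# Create the source tree for algorithms, storage, middleware, and services
mkdir -p src/algorithms src/storage src/middleware src/services
```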

Here’s a simple TypeScript configuration to get started:

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "strict": true,
    "outDir": "./dist"
  }
}

Why use Redis? It allows multiple servers to share rate limit data. Without it, each server would track limits independently, leading to inconsistencies. Redis provides fast, distributed storage that scales well.

Let’s implement the Token Bucket algorithm. Imagine a bucket that holds tokens. Each request consumes a token, and tokens refill over time. This method smooths out bursts and allows for flexible rate control.

class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(private capacity: number, private refillRate: number) {
    this.tokens = capacity; // start with a full bucket
    this.lastRefill = Date.now();
  }

  // Take one token if available; returns false when the limit is hit.
  consume(): boolean {
    this.refill();
    if (this.tokens >= 1) {
      this.tokens--;
      return true;
    }
    return false;
  }

  // Top the bucket up based on elapsed time, capped at capacity.
  private refill(): void {
    const now = Date.now();
    const timePassed = (now - this.lastRefill) / 1000; // seconds
    this.tokens = Math.min(this.capacity, this.tokens + timePassed * this.refillRate);
    this.lastRefill = now;
  }
}

But how do you make this work across servers? Store the bucket state in Redis using a Lua script for atomic operations. This prevents race conditions when multiple processes update the same key.
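
Here is one way that could look, as a sketch: the refill-and-consume step runs inside a single Lua script, so Redis applies it atomically. The hash field names and the `consumeDistributed` signature are my own choices, not a library API, and the `eval` call follows the ioredis-style signature. The pure `refillTokens` helper mirrors the script's math so the logic can be unit-tested without a server:

```typescript
// Pure refill math, mirrored from the Lua script below so it can be
// unit-tested without a Redis server.
function refillTokens(
  tokens: number,
  capacity: number,
  refillRate: number,
  elapsedMs: number
): number {
  return Math.min(capacity, tokens + (elapsedMs / 1000) * refillRate);
}

// Refill + consume as one atomic step.
// KEYS[1] = bucket key; ARGV = capacity, refill rate (tokens/sec), now (ms).
const tokenBucketScript = `
local capacity = tonumber(ARGV[1])
local refillRate = tonumber(ARGV[2])
local now = tonumber(ARGV[3])

local state = redis.call('HMGET', KEYS[1], 'tokens', 'lastRefill')
local tokens = tonumber(state[1]) or capacity
local lastRefill = tonumber(state[2]) or now

local elapsed = math.max(0, now - lastRefill) / 1000
tokens = math.min(capacity, tokens + elapsed * refillRate)

local allowed = 0
if tokens >= 1 then
  tokens = tokens - 1
  allowed = 1
end

redis.call('HSET', KEYS[1], 'tokens', tokens, 'lastRefill', now)
-- Expire idle buckets once they would be full again anyway
redis.call('PEXPIRE', KEYS[1], math.ceil(capacity / refillRate) * 1000)
return allowed
`;

// Minimal client shape; ioredis exposes an eval call like this.
interface EvalClient {
  eval(script: string, numKeys: number, ...args: (string | number)[]): Promise<unknown>;
}

async function consumeDistributed(
  client: EvalClient,
  key: string,
  capacity: number,
  refillRate: number
): Promise<boolean> {
  const allowed = await client.eval(tokenBucketScript, 1, key, capacity, refillRate, Date.now());
  return allowed === 1;
}
```

The expiry is a housekeeping choice: once a bucket has been idle long enough to refill completely, its stored state is indistinguishable from a fresh one, so letting the key expire keeps Redis memory bounded.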

Next, the Sliding Window Log algorithm tracks exact request timestamps within a time window. It’s precise but can use more memory. Here’s a basic idea:

// Assumes an ioredis client named `redis` in scope. Note: this
// read-then-write sequence is not atomic; in production, wrap it in a
// Lua script (as with the token bucket above) to avoid race conditions.
async function checkSlidingWindow(key: string, windowMs: number, maxRequests: number): Promise<boolean> {
  const now = Date.now();
  const timestamps = await redis.lrange(key, 0, -1);
  const validTimestamps = timestamps.filter(ts => now - parseInt(ts, 10) < windowMs);

  if (validTimestamps.length >= maxRequests) {
    return false;
  }

  await redis.lpush(key, now.toString());
  await redis.ltrim(key, 0, maxRequests - 1); // keep only the newest entries
  await redis.pexpire(key, windowMs);
  return true;
}

Ever wondered which algorithm fits your needs? Token Bucket is great for burst handling, while Sliding Window offers accuracy. Fixed Window Counter is simpler but can allow double the limit at window edges.
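
The Fixed Window Counter comes up in that comparison but hasn't been shown, so here is a minimal in-memory sketch (single-process only; the class name and shape are mine — the distributed version replaces the Map with Redis `INCR` plus `PEXPIRE` on a per-window key):

```typescript
// Fixed Window Counter: count requests per aligned time window.
// Simple, but a burst at a window boundary can pass up to 2x the limit.
class FixedWindowCounter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private windowMs: number, private maxRequests: number) {}

  // `now` is injectable for testing; defaults to the current time.
  consume(key: string, now: number = Date.now()): boolean {
    // Align the window to multiples of windowMs.
    const windowStart = Math.floor(now / this.windowMs) * this.windowMs;
    const entry = this.counts.get(key);

    if (!entry || entry.windowStart !== windowStart) {
      // New window: reset the count for this key.
      this.counts.set(key, { windowStart, count: 1 });
      return true;
    }
    if (entry.count >= this.maxRequests) {
      return false;
    }
    entry.count++;
    return true;
  }
}
```

With a limit of 2 per second, the third request inside a window is rejected, and the counter resets as soon as the next window begins.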

Now, integrate this into Express with middleware. Middleware checks each request and applies limits before passing control to your route handlers.

import { Request, Response, NextFunction } from 'express';

// Minimal shape the middleware expects from any of the limiters above.
interface RateLimiter {
  consume(key: string): Promise<{ allowed: boolean; remaining: number }>;
}

function rateLimitMiddleware(limiter: RateLimiter) {
  return async (req: Request, res: Response, next: NextFunction) => {
    const key = req.ip ?? 'unknown'; // or a custom key based on user ID
    const result = await limiter.consume(key);

    if (!result.allowed) {
      res.status(429).json({ error: 'Too many requests' });
      return;
    }

    res.set('X-RateLimit-Remaining', result.remaining.toString());
    next();
  };
}

What about edge cases? Handle scenarios like failed requests or exemptions for certain endpoints. Always include graceful degradation: if Redis fails, fall back to a local limit or temporarily allow all requests.
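
That fail-open behavior can be expressed as a small wrapper around any limiter. This is a sketch (the `failOpen` name and result shape are mine, not a library API): if the consume call throws because Redis is unreachable, the request is allowed through instead of being blocked.

```typescript
// Same shape the middleware expects.
interface RateLimitResult { allowed: boolean; remaining: number; }
interface RateLimiter { consume(key: string): Promise<RateLimitResult>; }

// Wrap a limiter so that store errors allow the request rather than
// blocking legitimate traffic while Redis is down.
function failOpen(limiter: RateLimiter, fallbackRemaining = 0): RateLimiter {
  return {
    async consume(key: string): Promise<RateLimitResult> {
      try {
        return await limiter.consume(key);
      } catch {
        // Redis unreachable: fail open, and log/alert on this path elsewhere.
        return { allowed: true, remaining: fallbackRemaining };
      }
    },
  };
}
```

Whether to fail open or fail closed is a product decision: failing open favors availability, failing closed favors protection, and the right answer depends on what the limiter is guarding.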

Testing is crucial. Use tools like Jest to simulate high traffic and verify limits. Monitor performance with metrics on request counts and denial rates. In production, deploy with proper Redis clustering and health checks.
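
As a sketch of what such a test looks like, this exercises the in-memory TokenBucket with a burst twice its capacity (the class is repeated here so the snippet is self-contained; plain assertions are used, but the same checks translate directly into Jest `expect` calls):

```typescript
// Copy of the TokenBucket class from earlier, so this snippet runs standalone.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(private capacity: number, private refillRate: number) {
    this.tokens = capacity;
    this.lastRefill = Date.now();
  }

  consume(): boolean {
    this.refill();
    if (this.tokens >= 1) {
      this.tokens--;
      return true;
    }
    return false;
  }

  private refill(): void {
    const now = Date.now();
    const timePassed = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + timePassed * this.refillRate);
    this.lastRefill = now;
  }
}

// Fire a burst larger than the capacity and count what gets through.
const bucket = new TokenBucket(5, 1); // 5 tokens, refilling 1 per second
const results = Array.from({ length: 10 }, () => bucket.consume());
const allowed = results.filter(Boolean).length;
// In a tight loop, only the initial capacity's worth should pass.
```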

I’ve found that starting with a simple Fixed Window Counter and evolving based on metrics works well. Remember, the goal is to protect your API while maintaining a good user experience.

If this helps you build a safer application, please like and share this article. Your comments and experiences would be valuable—drop them below to continue the conversation!
