
Rethinking Data Persistence with Event Sourcing and CQRS in Node.js

Discover how Event Sourcing and CQRS with EventStoreDB transform data modeling in Node.js and TypeScript for auditability and scalability.


I’ve been thinking about how we build software that truly remembers. Most applications I’ve worked on treat data like a snapshot—a single, current truth. But what if we could keep every change, every decision, every moment that led to that current state? That’s what brought me to explore Event Sourcing and CQRS with EventStoreDB. Let me show you how these patterns can change how you think about data persistence in Node.js and TypeScript.

Think about an order in an e-commerce system. In a traditional setup, you’d have an orders table with columns like status, total, and customer_id. When someone updates their order, you overwrite the old values. The history disappears. But what if you needed to know why an order total changed three days ago? Or prove when an item was added? That’s where Event Sourcing comes in.

Instead of storing just the current state, Event Sourcing stores every change as an immutable event. The current state becomes a calculation—a sum of all events that happened to that entity. It’s like having a detailed ledger instead of just a balance sheet. Every transaction is recorded, and you can reconstruct the balance at any point in time.
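To make the ledger analogy concrete, here's a minimal, self-contained sketch of state-as-a-fold. The event and state shapes here are illustrative, not the domain types we'll build later; the point is that "current state" is just a reduction over history:

```typescript
// Minimal sketch: current state as a left fold over an event history.
// Event and state shapes are illustrative, not the article's final types.
type OrderEvent =
  | { type: 'OrderCreated'; total: number }
  | { type: 'ItemAdded'; amount: number }
  | { type: 'OrderConfirmed' };

interface OrderState {
  status: 'none' | 'draft' | 'confirmed';
  total: number;
}

const initial: OrderState = { status: 'none', total: 0 };

// Each event moves the state forward; replaying the full history
// reconstructs the current state, like totaling a ledger.
function evolve(state: OrderState, event: OrderEvent): OrderState {
  switch (event.type) {
    case 'OrderCreated':
      return { status: 'draft', total: event.total };
    case 'ItemAdded':
      return { ...state, total: state.total + event.amount };
    case 'OrderConfirmed':
      return { ...state, status: 'confirmed' };
  }
}

const history: OrderEvent[] = [
  { type: 'OrderCreated', total: 20 },
  { type: 'ItemAdded', amount: 15 },
  { type: 'OrderConfirmed' },
];

const current = history.reduce(evolve, initial);
// current is { status: 'confirmed', total: 35 }
```

Truncate the history at any index and the same fold gives you the state as it was at that moment.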

CQRS, or Command Query Responsibility Segregation, naturally complements this approach. It separates the operations that change state (commands) from those that read state (queries). Why would you want this separation? Because read and write patterns are often different. Your write model can focus on business rules and validation, while your read model can be optimized for fast queries and specific views.
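The separation can be sketched in a few lines. Everything here is an in-memory stand-in (the maps, names, and the synchronous projection update are all simplifications); the shape of the split is what matters:

```typescript
// Sketch of the command/query split with in-memory stand-ins.
// The write side validates and records changes; the read side serves
// a denormalized view. All names here are illustrative.
interface ConfirmOrder { kind: 'ConfirmOrder'; orderId: string }

const writeModel = new Map<string, { status: string }>(); // enforces rules
const readModel = new Map<string, { status: string; label: string }>(); // query-optimized view

// Command handler: the only path that mutates state.
function handleConfirmOrder(cmd: ConfirmOrder): void {
  const order = writeModel.get(cmd.orderId);
  if (!order || order.status !== 'draft') {
    throw new Error('Only draft orders can be confirmed');
  }
  order.status = 'confirmed';
  // In a real system a projection would update this asynchronously.
  readModel.set(cmd.orderId, {
    status: 'confirmed',
    label: `Order ${cmd.orderId} (confirmed)`
  });
}

// Query handler: reads the view, never touches the write model.
function getOrderLabel(orderId: string): string | undefined {
  return readModel.get(orderId)?.label;
}

writeModel.set('order-1', { status: 'draft' });
handleConfirmOrder({ kind: 'ConfirmOrder', orderId: 'order-1' });
const label = getOrderLabel('order-1');
```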

Let me start with a practical example. We’ll build an order management system. First, we need to define what our domain events look like. These events represent things that have happened in our system—facts that cannot be changed.

// src/domain/events/order-events.ts
export interface DomainEvent {
  eventId: string;
  eventType: string;
  aggregateId: string;
  version: number;
  timestamp: Date;
  data: unknown; // each concrete event narrows this to its own payload
}

export interface OrderCreatedEvent extends DomainEvent {
  eventType: 'OrderCreated';
  data: {
    orderId: string;
    customerId: string;
    items: Array<{
      productId: string;
      productName: string;
      quantity: number;
      unitPrice: number;
    }>;
    totalAmount: number;
  };
}

export interface OrderConfirmedEvent extends DomainEvent {
  eventType: 'OrderConfirmed';
  data: {
    orderId: string;
    confirmedAt: Date;
  };
}

Notice how each event is a complete description of something that occurred. OrderCreated tells us exactly what was ordered, by whom, and for how much. These events are stored in EventStoreDB, which is purpose-built for this pattern. It’s not just another database—it’s an event store that understands streams of events.

Setting up EventStoreDB is straightforward with Docker:

# docker-compose.yml
version: '3.8'
services:
  eventstore:
    image: eventstore/eventstore:23.10.0-bookworm-slim
    environment:
      - EVENTSTORE_CLUSTER_SIZE=1
      - EVENTSTORE_INSECURE=true   # dev only: disables TLS and authentication
    ports:
      - "2113:2113"   # gRPC and HTTP; the only port the Node.js client needs

Have you ever wondered how to ensure business rules are always enforced? That’s where aggregates come in. An aggregate is a cluster of domain objects that can be treated as a single unit. In our order system, the Order aggregate ensures that business rules are maintained.

// src/domain/aggregates/order.ts
import { randomUUID as uuid } from 'crypto';
import { DomainEvent, OrderCreatedEvent, OrderConfirmedEvent } from '../events/order-events';

export interface OrderItem {
  productId: string;
  productName: string;
  quantity: number;
  unitPrice: number;
}

export class Order {
  id!: string;
  version = 0;
  private status!: 'draft' | 'confirmed' | 'shipped' | 'cancelled';
  private items: OrderItem[] = [];
  private changes: DomainEvent[] = [];

  static create(orderId: string, customerId: string, items: OrderItem[]): Order {
    const order = new Order();
    const event: OrderCreatedEvent = {
      eventId: uuid(),
      eventType: 'OrderCreated',
      aggregateId: orderId,
      version: 1,
      timestamp: new Date(),
      data: {
        orderId,
        customerId,
        items,
        // Derive the total rather than trusting the caller to supply it
        totalAmount: items.reduce((sum, i) => sum + i.quantity * i.unitPrice, 0)
      }
    };

    order.applyChange(event);
    return order;
  }

  confirm(): void {
    if (this.status !== 'draft') {
      throw new Error('Only draft orders can be confirmed');
    }

    const event: OrderConfirmedEvent = {
      eventId: uuid(),
      eventType: 'OrderConfirmed',
      aggregateId: this.id,
      version: this.version + 1,
      timestamp: new Date(),
      data: { orderId: this.id, confirmedAt: new Date() }
    };

    this.applyChange(event);
  }

  getUncommittedChanges(): DomainEvent[] {
    return this.changes;
  }

  markChangesAsCommitted(): void {
    this.changes = [];
  }

  // New events go through here so they are both applied and tracked
  private applyChange(event: DomainEvent): void {
    this.changes.push(event);
    this.apply(event);
  }

  // Public so the repository can replay persisted events
  // without recording them as new changes
  apply(event: DomainEvent): void {
    switch (event.eventType) {
      case 'OrderCreated': {
        const e = event as OrderCreatedEvent;
        this.id = e.data.orderId;
        this.status = 'draft';
        this.items = e.data.items;
        break;
      }
      case 'OrderConfirmed':
        this.status = 'confirmed';
        break;
    }
    this.version = event.version;
  }
}

The apply method is crucial. It’s how we rebuild the current state from events. When we load an aggregate from the event store, we replay all its events through this method to get to the current state. The changes array holds new events that haven’t been persisted yet.

Now, how do we actually save these events? That’s where the repository pattern comes in. It acts as a bridge between our domain and the event store.

// src/infrastructure/repositories/order-repository.ts
import { EventStoreDBClient, jsonEvent, NO_STREAM } from '@eventstore/db-client';
import { Order } from '../../domain/aggregates/order';
import { DomainEvent } from '../../domain/events/order-events';

export class OrderRepository {
  constructor(private eventStore: EventStoreDBClient) {}

  async save(order: Order): Promise<void> {
    const changes = order.getUncommittedChanges();
    if (changes.length === 0) return;
    const streamName = `order-${order.id}`;

    const events = changes.map(change =>
      jsonEvent({
        id: change.eventId,
        type: change.eventType,
        data: change.data as any,
        metadata: {
          timestamp: change.timestamp.toISOString(),
          aggregateType: 'Order'
        }
      })
    );

    // EventStoreDB revisions are 0-based, so a brand-new aggregate
    // expects no stream at all
    const loadedVersion = order.version - changes.length;
    await this.eventStore.appendToStream(streamName, events, {
      expectedRevision: loadedVersion === 0 ? NO_STREAM : BigInt(loadedVersion - 1)
    });

    order.markChangesAsCommitted();
  }

  async findById(orderId: string): Promise<Order> {
    const streamName = `order-${orderId}`;
    const order = new Order();

    for await (const { event } of this.eventStore.readStream(streamName)) {
      if (!event) continue;
      order.apply({
        eventId: event.id,
        eventType: event.type,
        aggregateId: orderId,
        version: Number(event.revision) + 1,
        timestamp: new Date((event.metadata as any)?.timestamp ?? Date.now()),
        data: event.data
      } as DomainEvent);
    }

    return order;
  }
}

Notice the expectedRevision parameter? That’s optimistic concurrency control. It ensures we don’t have conflicting updates. If someone else has added events to the stream since we loaded it, the save will fail, and we can retry or notify the user.
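A retry loop around the save is the usual response. Here's a sketch of that control flow, shown synchronously for brevity (a real save is async, and the real EventStoreDB client throws its own wrong-expected-version error, which you would catch instead of the stand-in class below):

```typescript
// Sketch: retry when the optimistic-concurrency check fails.
// A stand-in for the client's wrong-expected-version error.
class ConcurrencyConflictError extends Error {}

function saveWithRetry(attemptSave: () => void, maxAttempts = 3): number {
  for (let attempt = 1; ; attempt++) {
    try {
      // In a real handler this would reload the aggregate, reapply the
      // command against fresh state, and append again
      attemptSave();
      return attempt;
    } catch (err) {
      if (!(err instanceof ConcurrencyConflictError) || attempt >= maxAttempts) {
        throw err; // not a conflict, or out of retries: surface to the caller
      }
    }
  }
}

// Demo: two simulated conflicts, then success on the third attempt
let calls = 0;
const attempts = saveWithRetry(() => {
  calls++;
  if (calls < 3) throw new ConcurrencyConflictError('stream moved underneath us');
});
// attempts === 3
```

The key design point is that each retry reapplies the command to freshly loaded state, so the business rules are re-checked against whatever the other writer did.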

But what about reading data? That’s where CQRS shines. Our write model is optimized for maintaining business rules, but our read model can be optimized for queries. We can create projections that listen to events and update specialized read models.

// src/read-models/order-summary-projection.ts
// `Database` stands in for your SQL client of choice; the
// query(text, params) calls below follow node-postgres conventions
export class OrderSummaryProjection {
  constructor(private db: Database) {}

  async onOrderCreated(event: OrderCreatedEvent): Promise<void> {
    await this.db.query(`
      INSERT INTO order_summaries 
      (order_id, customer_id, total_amount, status, created_at)
      VALUES ($1, $2, $3, $4, $5)
    `, [
      event.data.orderId,
      event.data.customerId,
      event.data.totalAmount,
      'draft',
      event.timestamp
    ]);
  }

  async onOrderConfirmed(event: OrderConfirmedEvent): Promise<void> {
    await this.db.query(`
      UPDATE order_summaries 
      SET status = 'confirmed', confirmed_at = $2
      WHERE order_id = $1
    `, [event.data.orderId, event.data.confirmedAt]);
  }
}

This projection maintains a simple table that’s perfect for listing orders or showing order status. It’s eventually consistent—it might be a few milliseconds behind the write model, but for most queries, that’s perfectly acceptable. The benefit? Our read queries become simple SQL queries instead of complex event replay operations.
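The wiring that feeds a projection can be as simple as a dispatcher keyed by event type. Here's a self-contained sketch; in production the events would arrive via an EventStoreDB subscription and the handlers would await database writes, both of which are simplified to synchronous in-memory operations here:

```typescript
// Sketch: route incoming events to projection handlers by type.
interface ProjectedEvent { type: string; data: Record<string, any> }
type ProjectionHandler = (event: ProjectedEvent) => void;

class ProjectionDispatcher {
  private handlers = new Map<string, ProjectionHandler>();

  on(eventType: string, handler: ProjectionHandler): this {
    this.handlers.set(eventType, handler);
    return this;
  }

  dispatch(event: ProjectedEvent): void {
    // Unknown event types are ignored: a projection only cares
    // about the events it projects
    this.handlers.get(event.type)?.(event);
  }
}

// In-memory stand-in for the order_summaries table
const summaries = new Map<string, { status: string }>();

const dispatcher = new ProjectionDispatcher()
  .on('OrderCreated', e => { summaries.set(e.data.orderId, { status: 'draft' }); })
  .on('OrderConfirmed', e => { summaries.get(e.data.orderId)!.status = 'confirmed'; });

dispatcher.dispatch({ type: 'OrderCreated', data: { orderId: 'o-1' } });
dispatcher.dispatch({ type: 'OrderConfirmed', data: { orderId: 'o-1' } });
// summaries.get('o-1') is { status: 'confirmed' }
```

Because the dispatcher ignores events it has no handler for, you can add a new projection later and replay the full history through it without touching existing ones.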

What happens when you need to change your event structure? That’s where event versioning comes in. You can’t change events that are already stored, but you can write upcasters that transform old events into new formats.

// src/domain/events/upcasters.ts
// Note: DomainEvent.version is the aggregate's position in the stream,
// not a schema revision, so we detect old events by payload shape
// (a schemaVersion field in metadata works just as well)
export function upcastOrderCreatedEvent(raw: any): OrderCreatedEvent {
  // Schema version 1 used different field names
  if (raw.data.id !== undefined) {
    return {
      ...raw,
      data: {
        orderId: raw.data.id,
        customerId: raw.data.customer_id,
        items: raw.data.order_items,
        totalAmount: raw.data.total
      }
    };
  }

  return raw;
}

Testing event-sourced systems requires a different approach. You’re not just testing state—you’re testing behavior. You want to ensure that given certain commands, the correct events are produced.

// tests/order.test.ts
import { Order } from '../src/domain/aggregates/order';
import { OrderCreatedEvent } from '../src/domain/events/order-events';

describe('Order', () => {
  it('should create an order with items', () => {
    const order = Order.create(
      'order-123',
      'customer-456',
      [{ productId: 'prod-1', productName: 'Widget', quantity: 2, unitPrice: 10 }]
    );

    const changes = order.getUncommittedChanges();
    expect(changes).toHaveLength(1);
    expect(changes[0].eventType).toBe('OrderCreated');
    expect((changes[0] as OrderCreatedEvent).data.totalAmount).toBe(20);
  });

  it('should not confirm an order twice', () => {
    const order = Order.create('order-123', 'customer-456', []);
    order.confirm();

    expect(() => order.confirm()).toThrow('Only draft orders can be confirmed');
  });
});

The beauty of this approach is in the audit trail. Every change is recorded. You can see exactly what happened, when, and in what order. You can rebuild the state at any point in time. You can analyze patterns in how orders are created and modified.
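That temporal rebuild is itself just a filtered fold. A minimal sketch (event and state shapes are illustrative):

```typescript
// Sketch: answer "what did this look like on that date?" by replaying
// only the events recorded up to a cutoff.
interface TimelineEvent { at: Date; type: 'Created' | 'Confirmed' }
interface TimelineState { status: 'none' | 'draft' | 'confirmed' }

function stateAt(events: TimelineEvent[], cutoff: Date): TimelineState {
  return events
    .filter(e => e.at <= cutoff) // ignore everything after the cutoff
    .reduce<TimelineState>(
      (_state, e) => (e.type === 'Created' ? { status: 'draft' } : { status: 'confirmed' }),
      { status: 'none' }
    );
}

const timeline: TimelineEvent[] = [
  { at: new Date('2024-01-01'), type: 'Created' },
  { at: new Date('2024-01-05'), type: 'Confirmed' },
];

const before = stateAt(timeline, new Date('2024-01-03')); // { status: 'draft' }
const after = stateAt(timeline, new Date('2024-01-06'));  // { status: 'confirmed' }
```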

But it’s not without challenges. You need to think carefully about event design. Events should represent business facts, not database operations. You need to handle eventual consistency in your read models. You need to consider how to archive or snapshot very long event streams.
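For very long streams, the usual mitigation is snapshotting: persist the folded state every N events and replay only the tail. Here's a rough, self-contained sketch of the idea (the counter domain and the in-memory snapshot are illustrative; in practice the snapshot would be persisted alongside the stream):

```typescript
// Sketch: resume from a snapshot instead of replaying from event zero.
interface CounterEvent { amount: number }
interface Snapshot { version: number; total: number } // state as of `version` events

function replayFrom(
  snapshot: Snapshot | null,
  allEvents: CounterEvent[]
): { version: number; total: number } {
  const start = snapshot?.version ?? 0;
  let total = snapshot?.total ?? 0;
  // Only the events after the snapshot need replaying
  for (const e of allEvents.slice(start)) total += e.amount;
  return { version: allEvents.length, total };
}

const events: CounterEvent[] = [{ amount: 5 }, { amount: 7 }, { amount: 3 }];
const snap: Snapshot = { version: 2, total: 12 }; // state after the first two events

const resumed = replayFrom(snap, events);     // replays only { amount: 3 }
const fromScratch = replayFrom(null, events); // replays everything
// both yield { version: 3, total: 15 }
```

Snapshots are a pure optimization: the events remain the source of truth, and a snapshot can always be discarded and rebuilt.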

I find that the mental shift is the biggest challenge. Instead of thinking “how do I update this record,” you think “what event represents this business action?” Instead of “what’s the current state,” you think “what sequence of events led to this state?”

The combination of Event Sourcing and CQRS gives you a system that’s inherently scalable, auditable, and flexible. Your write model can focus on consistency and business rules. Your read models can be optimized for specific queries. You can add new read models without touching the write model. You can replay events to create new projections or fix bugs in existing ones.

What would your application look like if every change was permanently recorded? How would it change how you handle customer disputes or compliance requirements? How could you use the event history to gain business insights?

I’ve found that once you start thinking in events, it’s hard to go back. There’s a clarity to it—a direct connection between business processes and technical implementation. The code tells the story of what the business does, not just how data is stored.

Give it a try. Start with a bounded context where audit trails matter or where you need temporal queries. You might be surprised at how naturally it fits certain business domains. And when you do, share your experiences. What challenges did you face? What benefits did you see? Your journey might help others see the potential in this approach.

If this exploration of Event Sourcing and CQRS resonated with you, or if you have different experiences with these patterns, I’d love to hear about it. Share your thoughts in the comments, and if you found this useful, pass it along to someone who might be wrestling with similar architectural decisions.

