Building Systems That Remember: A Practical Guide to Event Sourcing with CQRS

Learn how to build auditable, resilient applications using Event Sourcing and CQRS with NestJS, EventStoreDB, and Docker.

I’ve been thinking about how we build software that truly remembers. Most applications store only the current state, like a snapshot that overwrites the past. But what if we could keep the entire story? This question led me down a path of rebuilding how data is handled, focusing on patterns that treat every change as a meaningful event. Let’s explore this together.

Think about your bank account. The app shows a balance, but that number is the result of every deposit, withdrawal, and fee. Traditional systems save only the final number. Event sourcing saves the story—each transaction as a separate, permanent fact. This creates a complete, auditable history. Why would we want that? For one, debugging becomes clearer. You can replay events to see exactly how a state was reached.
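Before introducing any framework, the core idea fits in a few lines of plain TypeScript. This is a minimal, framework-free sketch (the event names are illustrative): current state is nothing more than a fold over the event history.

```typescript
// Illustrative sketch: state is derived by replaying events, never stored directly.
type BankEvent =
  | { type: 'Deposited'; amount: number }
  | { type: 'Withdrawn'; amount: number };

// Replaying the same history always reproduces the same balance,
// which is what makes event-sourced systems auditable and debuggable.
function replay(events: BankEvent[]): number {
  return events.reduce(
    (balance, e) => (e.type === 'Deposited' ? balance + e.amount : balance - e.amount),
    0
  );
}

const history: BankEvent[] = [
  { type: 'Deposited', amount: 100 },
  { type: 'Withdrawn', amount: 30 },
  { type: 'Deposited', amount: 25 },
];
console.log(replay(history)); // 95
```

Deleting or editing past events is forbidden; a correction is itself a new event appended to the history.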

This approach often pairs with another idea: separating commands from queries. We call this CQRS. It means the code that changes data is separate from the code that reads it. They can use different models and even different databases. This separation allows each side to be optimized for its specific job. The write side focuses on correctness and business rules. The read side focuses on speed and convenience for queries.
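A toy in-memory illustration of that split (the class names here are hypothetical, not part of the stack we set up below): the write side validates and appends facts, while a projector maintains a denormalized view that the read side queries with a plain lookup.

```typescript
// Hypothetical in-memory CQRS sketch: one model validates writes,
// a separate projection serves reads.
type DepositEvent = { accountId: string; amount: number };

class WriteSide {
  private log: DepositEvent[] = [];
  constructor(private projector: (e: DepositEvent) => void) {}

  // Business rules live on the write side.
  deposit(accountId: string, amount: number) {
    if (amount <= 0) throw new Error('Deposit must be positive');
    const event = { accountId, amount };
    this.log.push(event);  // append the fact
    this.projector(event); // notify the read side
  }
}

class ReadSide {
  private balances = new Map<string, number>();

  // The projection is pre-calculated, so queries are a plain lookup.
  project = (e: DepositEvent) => {
    this.balances.set(e.accountId, (this.balances.get(e.accountId) ?? 0) + e.amount);
  };

  getBalance(accountId: string): number {
    return this.balances.get(accountId) ?? 0;
  }
}

const read = new ReadSide();
const write = new WriteSide(read.project);
write.deposit('acc-1', 100);
write.deposit('acc-1', 50);
```

The rest of this guide builds exactly this shape, with EventStoreDB as the log and Postgres as the projection.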

Let’s set up a practical environment. You’ll need Node.js and Docker. We’ll use EventStoreDB, a database built for storing sequences of events, and NestJS, a framework for building efficient server-side applications. First, we create a new project.

nest new event-sourcing-app
cd event-sourcing-app
npm install @eventstore/db-client @nestjs/cqrs

Next, we run our databases with Docker. Create a docker-compose.yml file.

services:
  eventstore:
    image: eventstore/eventstore:latest
    ports:
      - "2113:2113"
    environment:
      - EVENTSTORE_INSECURE=true
  postgres:
    image: postgres:15
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: read_db
      POSTGRES_PASSWORD: postgres

Run docker-compose up -d. Now, EventStoreDB is ready on port 2113 for storing our event streams, and Postgres is available for queryable read models.

How do we connect our app to EventStoreDB? We create a dedicated module. This sets up a single connection that our whole application can use.

// eventstore.module.ts
import { Module, Global } from '@nestjs/common';
import { EventStoreDBClient } from '@eventstore/db-client';

@Global()
@Module({
  providers: [
    {
      provide: 'EVENTSTORE_CLIENT',
      useFactory: () => EventStoreDBClient.connectionString('esdb://localhost:2113?tls=false')
    }
  ],
  exports: ['EVENTSTORE_CLIENT'],
})
export class EventStoreModule {}

The core of event sourcing is the domain model. Instead of a class with mutable properties, we have an aggregate that applies events. Let’s model a simple bank account. The account is created with an initial deposit. Its state is rebuilt by replaying all the events that happened to it.

// account.aggregate.ts
import { IEvent } from '@nestjs/cqrs';
import { AccountOpenedEvent } from './account-opened.event';
import { MoneyDepositedEvent } from './money-deposited.event';

export class Account {
  public id: string;
  public balance: number = 0;
  private changes: IEvent[] = [];

  constructor(id: string, events?: IEvent[]) {
    this.id = id;
    if (events) {
      this.loadFromHistory(events);
    }
  }

  static open(id: string, initialDeposit: number): Account {
    const account = new Account(id);
    account.applyChange(new AccountOpenedEvent(id, initialDeposit));
    return account;
  }

  deposit(amount: number) {
    if (amount <= 0) throw new Error('Deposit must be positive');
    this.applyChange(new MoneyDepositedEvent(this.id, amount));
  }

  private applyChange(event: IEvent) {
    this.apply(event); // Update internal state
    this.changes.push(event); // Track new changes
  }

  private apply(event: IEvent) {
    if (event instanceof AccountOpenedEvent) {
      this.balance = event.initialDeposit;
    }
    if (event instanceof MoneyDepositedEvent) {
      this.balance += event.amount;
    }
  }

  public getUncommittedChanges(): IEvent[] {
    return [...this.changes];
  }

  public markChangesAsCommitted() {
    this.changes = [];
  }
}

Notice the apply method? It’s where the event modifies the aggregate’s state. The events themselves are simple, immutable data objects.

// account-opened.event.ts
import { IEvent } from '@nestjs/cqrs';

export class AccountOpenedEvent implements IEvent {
  constructor(
    public readonly accountId: string,
    public readonly initialDeposit: number,
    public readonly timestamp: Date = new Date()
  ) {}
}

// money-deposited.event.ts
export class MoneyDepositedEvent implements IEvent {
  constructor(public readonly accountId: string, public readonly amount: number) {}
}

We need a place to save and load these aggregates. This is the repository’s job. It fetches all past events for an aggregate, rebuilds it, saves new events, and publishes them.

// account.repository.ts
import { Inject, Injectable } from '@nestjs/common';
import { EventBus, IEvent } from '@nestjs/cqrs';
import {
  EventStoreDBClient,
  jsonEvent,
  StreamNotFoundError,
} from '@eventstore/db-client';
import { Account } from './account.aggregate';
import { AccountOpenedEvent } from './account-opened.event';
import { MoneyDepositedEvent } from './money-deposited.event';

@Injectable()
export class AccountRepository {
  // Map stored event type names back to their classes for deserialization
  private readonly eventClasses: Record<string, any> = {
    AccountOpenedEvent,
    MoneyDepositedEvent,
  };

  constructor(
    @Inject('EVENTSTORE_CLIENT') private client: EventStoreDBClient,
    private eventBus: EventBus
  ) {}

  async findById(accountId: string): Promise<Account> {
    const streamName = `account-${accountId}`;
    const events: IEvent[] = [];

    try {
      const readStream = this.client.readStream(streamName);
      for await (const { event } of readStream) {
        if (event) {
          // Convert the stored JSON back into an event class instance
          const eventClass = this.eventClasses[event.type];
          if (eventClass) {
            events.push(Object.assign(Object.create(eventClass.prototype), event.data));
          }
        }
      }
    } catch (error) {
      // A missing stream just means the account has no history yet
      if (!(error instanceof StreamNotFoundError)) throw error;
    }

    return new Account(accountId, events);
  }

  async save(account: Account): Promise<void> {
    const streamName = `account-${account.id}`;
    const changes = account.getUncommittedChanges();

    const eventsToStore = changes.map(change =>
      jsonEvent({
        type: change.constructor.name,
        data: { ...change },
      })
    );

    await this.client.appendToStream(streamName, eventsToStore);
    account.markChangesAsCommitted();

    // Publish events so read-model handlers can react
    changes.forEach(event => this.eventBus.publish(event));
  }
}

Commands trigger changes. They are handled by command handlers. A command is a request to do something. It’s validated, then executed on the aggregate.

// open-account.command.ts
export class OpenAccountCommand {
  constructor(
    public readonly accountId: string,
    public readonly initialDeposit: number
  ) {}
}

// open-account.handler.ts
import { CommandHandler, ICommandHandler } from '@nestjs/cqrs';
import { Account } from './account.aggregate';
import { AccountRepository } from './account.repository';
import { OpenAccountCommand } from './open-account.command';
@CommandHandler(OpenAccountCommand)
export class OpenAccountHandler implements ICommandHandler<OpenAccountCommand> {
  constructor(private repository: AccountRepository) {}

  async execute(command: OpenAccountCommand) {
    const { accountId, initialDeposit } = command;
    const account = Account.open(accountId, initialDeposit);
    await this.repository.save(account);
  }
}

So we’ve handled the write side. But how does the user see their current balance? This is the read side. When an event like AccountOpenedEvent is published, a handler updates a separate, optimized table in Postgres.

// account-opened.handler.ts
import { Inject } from '@nestjs/common';
import { EventsHandler, IEventHandler } from '@nestjs/cqrs';
import { DataSource } from 'typeorm';
import { AccountOpenedEvent } from './account-opened.event';

@EventsHandler(AccountOpenedEvent)
export class AccountOpenedHandler implements IEventHandler<AccountOpenedEvent> {
  constructor(@Inject('READ_DB') private readonly dataSource: DataSource) {}

  async handle(event: AccountOpenedEvent) {
    const queryRunner = this.dataSource.createQueryRunner();
    await queryRunner.connect();

    try {
      await queryRunner.query(
        `INSERT INTO account_balance (account_id, balance) VALUES ($1, $2)`,
        [event.accountId, event.initialDeposit]
      );
    } finally {
      // Always release the connection, even if the insert fails
      await queryRunner.release();
    }
  }
}

A query handler then fetches data from this simple table. It doesn’t replay history; it reads a pre-calculated state.

// get-balance.query.ts
export class GetAccountBalanceQuery {
  constructor(public readonly accountId: string) {}
}

// get-balance.handler.ts
import { Inject } from '@nestjs/common';
import { QueryHandler, IQueryHandler } from '@nestjs/cqrs';
import { DataSource } from 'typeorm';
import { GetAccountBalanceQuery } from './get-balance.query';

@QueryHandler(GetAccountBalanceQuery)
export class GetAccountBalanceHandler implements IQueryHandler<GetAccountBalanceQuery> {
  constructor(@Inject('READ_DB') private readonly dataSource: DataSource) {}

  async execute(query: GetAccountBalanceQuery) {
    const result = await this.dataSource.query(
      `SELECT balance FROM account_balance WHERE account_id = $1`,
      [query.accountId]
    );
    // An unknown account simply reads as a zero balance
    return result[0]?.balance ?? 0;
  }
}

What happens when an account has thousands of events? Replaying them all is slow. This is where snapshots help. Periodically, we save the aggregate’s current state as a snapshot event. To load the account, we find the latest snapshot and only replay events that happened after it.

// account.snapshot.ts
export class AccountSnapshot {
  constructor(
    public readonly accountId: string,
    public readonly balance: number,
    public readonly version: number,
    public readonly timestamp: Date
  ) {}
}

// In the repository, findById becomes snapshot-aware:
async findById(accountId: string): Promise<Account> {
  const streamName = `account-${accountId}`;

  // Try to read the latest snapshot first (getLatestSnapshot would read a
  // separate snapshot stream; implementation omitted here)
  const snapshot = await this.getLatestSnapshot(accountId);
  const fromRevision = snapshot ? BigInt(snapshot.version) + 1n : 0n;

  // Only load events recorded after the snapshot
  const events = await this.loadEventsFromVersion(streamName, fromRevision);

  // Rebuild from the snapshot plus the newer events; the aggregate's
  // apply method needs a case that restores state from AccountSnapshot
  return snapshot
    ? new Account(accountId, [snapshot, ...events])
    : new Account(accountId, events);
}
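The loading path above assumes snapshots are being written somewhere. A common policy is to persist one every N events. Here is a framework-free sketch of that policy (the threshold, store, and function names are illustrative, not part of the EventStoreDB API):

```typescript
// Illustrative snapshot policy: persist a snapshot every SNAPSHOT_EVERY events.
const SNAPSHOT_EVERY = 100;

interface Snapshot { accountId: string; balance: number; version: number }

class SnapshotStore {
  private latest = new Map<string, Snapshot>();
  save(s: Snapshot) { this.latest.set(s.accountId, s); }
  load(accountId: string): Snapshot | undefined { return this.latest.get(accountId); }
}

// Called after appending new events; `version` is the zero-based
// revision of the last event written to the stream.
function maybeSnapshot(
  store: SnapshotStore,
  accountId: string,
  balance: number,
  version: number
) {
  // Snapshot when the stream length crosses a multiple of the threshold
  if ((version + 1) % SNAPSHOT_EVERY === 0) {
    store.save({ accountId, balance, version });
  }
}

const store = new SnapshotStore();
maybeSnapshot(store, 'acc-1', 500, 99); // 100th event, so a snapshot is taken
```

Crucially, a snapshot is only an optimization: the event stream remains the source of truth, and snapshots can always be deleted and rebuilt.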

Testing this setup is different. You test by asserting which events are produced after a command. You can also replay events to verify state reconstruction.

// account.aggregate.spec.ts
it('should deposit money correctly', () => {
  const account = new Account('acc-1', [
    new AccountOpenedEvent('acc-1', 100)
  ]);
  
  account.deposit(50);
  const changes = account.getUncommittedChanges();
  
  expect(changes).toHaveLength(1);
  expect(changes[0]).toBeInstanceOf(MoneyDepositedEvent);
  expect((changes[0] as MoneyDepositedEvent).amount).toBe(50);
  expect(account.balance).toBe(150);
});

This approach is powerful but adds complexity. It’s excellent for systems where audit trails, temporal queries, or complex business logic are critical. It might be overkill for simple CRUD applications. The key is to understand the trade-offs.

We’ve walked through the core concepts: storing changes as events, separating reads from writes, and rebuilding state from history. This method changes how we think about data, from a static record to a dynamic story. It provides clarity and resilience, especially as systems grow.

I hope this guide helps you see the potential in your own projects. What problem are you solving where a complete history would be invaluable? If you found this walk-through useful, please share it with your network. I’d love to hear about your experiences or questions in the comments below. Let’s keep building systems that remember.

