
Building Reliable, Auditable Systems with Event Sourcing in Node.js

Learn how to build traceable, resilient applications using event sourcing, Node.js, and EventStoreDB with real-world banking examples.


I’ve been thinking a lot about how we build systems that need to be reliable, auditable, and adaptable. In my work, I’ve seen too many applications where the “why” behind a data change is lost forever, buried in overwritten database rows. This led me to explore a different way of thinking about application state, one that treats every change as a permanent, meaningful fact. This approach is called event sourcing.

Why does this matter now? Modern applications demand more than just storing the current state. They need to explain it, analyze trends, and recover from errors. Event sourcing provides this by design. It’s not just a technical pattern; it’s a shift in how we model business processes.

Let’s build a practical system together. We’ll use Node.js and EventStoreDB to create a banking application. This example makes the concepts tangible. Instead of just updating an account balance, we’ll record every deposit and withdrawal as an individual event. The current balance becomes a result of replaying those events.

First, we need our foundation. Here’s a Docker setup to run EventStoreDB locally.

# docker-compose.yml
services:
  eventstore:
    image: eventstore/eventstore:latest
    ports:
      - "2113:2113"
    environment:
      - EVENTSTORE_INSECURE=true
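
If you save this as docker-compose.yml, docker compose up -d should bring the store up. In this insecure, single-node development setup, both the admin UI and the gRPC endpoint our client will connect to are served on port 2113.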

With the store running, let’s define what our events look like. In event sourcing, events are the source of truth. They are immutable records of something that happened. Think of them as the entries in a ledger.

// A domain event for opening an account
interface AccountOpenedEvent {
  type: 'AccountOpened';
  accountId: string;
  ownerId: string;
  initialBalance: number;
  timestamp: Date;
}

Notice how this event describes a business fact: “An account was opened for this owner with this starting balance.” It’s not a command or a request; it’s a statement of something that has already occurred and been accepted by the system.
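
Later sections refer to deposit and withdrawal events as well, so let's define them now. The exact fields below are my own choice for this walkthrough rather than a fixed schema; the optional transferId will come in handy when we look at money transfers.

// Further domain events used by the projection and the transfer saga later on
interface MoneyDepositedEvent {
  type: 'MoneyDeposited';
  accountId: string;
  amount: number;
  transferId?: string; // set when the deposit is part of a transfer
  timestamp: Date;
}

interface MoneyWithdrawnEvent {
  type: 'MoneyWithdrawn';
  accountId: string;
  amount: number;
  transferId?: string; // set when the withdrawal is part of a transfer
  timestamp: Date;
}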

Now, how do we get from commands to events? This is the job of an aggregate. An aggregate is a cluster of domain objects that can be treated as a single unit. It protects business rules. In our case, the BankAccount aggregate ensures you can’t withdraw more money than you have.

class BankAccount {
  private balance: number = 0;
  private changes: any[] = []; // New events, not yet saved to the store

  constructor(private readonly id: string) {}

  openAccount(ownerId: string, initialDeposit: number) {
    // Business rule: an account needs a positive opening balance
    if (initialDeposit <= 0) {
      throw new Error('Initial deposit must be positive.');
    }
    // Record the event
    this.changes.push({
      type: 'AccountOpened',
      accountId: this.id,
      ownerId,
      initialBalance: initialDeposit,
      timestamp: new Date()
    });
    // Update the internal state
    this.balance = initialDeposit;
  }

  getUncommittedChanges() {
    return this.changes;
  }
}

The aggregate produces events. We then save these events to EventStoreDB. The code to save is straightforward. We append to a stream, which is like a dedicated log for a specific entity, such as one bank account.

import { EventStoreDBClient, jsonEvent } from '@eventstore/db-client';

// tls=false matches the EVENTSTORE_INSECURE=true setting in docker-compose.yml
const client = EventStoreDBClient.connectionString('esdb://localhost:2113?tls=false');

async function saveEvents(streamName: string, events: any[]) {
  // The client expects events wrapped in its JSON event envelope
  await client.appendToStream(
    streamName,
    events.map((e) => jsonEvent({ type: e.type, data: e }))
  );
}
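
Putting the two pieces together might look like the sketch below. The stream naming convention, account-<id>, is just one common choice, not something EventStoreDB requires.

// Hypothetical wiring: handle a command, then persist the resulting events
const account = new BankAccount('acc-123');
account.openAccount('owner-42', 100);

await saveEvents('account-acc-123', account.getUncommittedChanges());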

But what about reading data? If the current balance is calculated by replaying all events, won’t that be slow for an account with thousands of transactions? This is a key question. The answer is projections.

A projection is a process that listens for new events and builds a separate, optimized view of the data. This view is called a read model. It’s designed for fast queries.

// A simple projection that maintains a current account balance
const accountBalances = new Map<string, number>();

async function handleAccountEvent(event: any) {
  if (event.type === 'AccountOpened') {
    accountBalances.set(event.accountId, event.initialBalance);
  }
  if (event.type === 'MoneyDeposited') {
    const current = accountBalances.get(event.accountId) || 0;
    accountBalances.set(event.accountId, current + event.amount);
  }
  if (event.type === 'MoneyWithdrawn') {
    const current = accountBalances.get(event.accountId) || 0;
    accountBalances.set(event.accountId, current - event.amount);
  }
}

Now, to get an account’s balance, we just look it up in the accountBalances map. It’s instant. This separation between the write model (event streams) and read models (projections) is a powerful concept. It allows us to optimize each for its specific job.
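
How do events reach the projection? One option, sketched below, is a catch-up subscription to the $all stream that filters out EventStoreDB's internal system events and passes everything else to our handler. Checkpointing and error handling are left out of this sketch.

import { START, excludeSystemEvents } from '@eventstore/db-client';

async function runBalanceProjection() {
  // Subscribe to every event in the store, skipping internal system events
  const subscription = client.subscribeToAll({
    fromPosition: START,
    filter: excludeSystemEvents(),
  });

  for await (const resolvedEvent of subscription) {
    if (!resolvedEvent.event) continue;
    // The JSON payload we appended earlier is available on event.data
    await handleAccountEvent({
      type: resolvedEvent.event.type,
      ...(resolvedEvent.event.data as any),
    });
  }
}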

How do we handle changes to our event structure over time? Imagine we release version 1.0 of our app with the AccountOpened event. Later, we need to add a branchId field to know where the account was opened. This is schema evolution. We handle it with upcasting.
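
In code, version 2 of the event simply gains the new field. I'll show it as a separate interface here purely for illustration.

// Version 2 of the event adds the branch where the account was opened
interface AccountOpenedEventV2 {
  type: 'AccountOpened';
  accountId: string;
  ownerId: string;
  initialBalance: number;
  branchId: string; // new in version 2
  timestamp: Date;
}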

Upcasting is the process of transforming an old event format into a new one when it’s loaded.

function upcastEvent(oldEvent: any) {
  // If it's an old AccountOpened event without a branchId
  if (oldEvent.type === 'AccountOpened' && !oldEvent.branchId) {
    return {
      ...oldEvent,
      branchId: 'default-branch' // Assign a default value
    };
  }
  return oldEvent;
}

This way, the business logic in our aggregate always works with the latest event format, even when reading old data from the stream. The original raw event data in the store never changes.
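
To make the loading side concrete, here is a rough sketch of rehydrating an account from its stream. It assumes the aggregate exposes an apply method that updates internal state from a single event; that method isn't shown in the class above, so treat it as an assumption of this sketch.

// Rebuild an aggregate by replaying (and upcasting) its stored events.
// Assumes BankAccount has an apply(event) method that mutates internal state.
async function loadAccountFromStream(accountId: string): Promise<BankAccount> {
  const account = new BankAccount(accountId);

  for await (const resolvedEvent of client.readStream(`account-${accountId}`)) {
    if (!resolvedEvent.event) continue;
    const stored = { type: resolvedEvent.event.type, ...(resolvedEvent.event.data as any) };
    account.apply(upcastEvent(stored)); // always hand the latest shape to the domain logic
  }

  return account;
}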

Another performance consideration is the snapshot. For an active account with 10,000 transactions, replaying all events to get the current state is inefficient. A snapshot is a saved state at a specific point in time.

// Periodically, we save a snapshot of the aggregate's current state
async function saveSnapshot(accountId: string, state: any, version: number) {
  await client.appendToStream(
    `snapshot-${accountId}`,
    [jsonEvent({ type: 'AccountSnapshot', data: { state, version } })]
  );
}

// To load, we find the latest snapshot and replay only the events after it.
// getLatestSnapshot and loadEventsFromVersion are application helpers, not shown here.
async function loadAccount(accountId: string) {
  const snapshot = await getLatestSnapshot(accountId);
  const eventsAfterSnapshot = await loadEventsFromVersion(accountId, snapshot.version + 1);

  const account = BankAccount.fromSnapshot(snapshot.state);
  eventsAfterSnapshot.forEach(event => account.apply(event));
  return account;
}

This drastically reduces load time. Snapshots are an optimization, not a requirement. The system is still fully functional from the event stream alone.
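
When to snapshot is a policy decision. A common, simple policy is "every N events"; a sketch of that idea is below, with the threshold and the place you hook it in entirely up to you.

const SNAPSHOT_EVERY = 100; // illustrative threshold

// Hypothetical hook, called after new events have been appended successfully
async function maybeSnapshot(accountId: string, state: any, version: number) {
  if (version % SNAPSHOT_EVERY === 0) {
    await saveSnapshot(accountId, state, version);
  }
}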

What happens when a business process spans multiple aggregates? For example, transferring money from Account A to Account B. We can’t just withdraw from one and deposit to another in two separate steps; if the deposit fails after the withdrawal, money disappears. This is where sagas or process managers come in.

A saga coordinates the process. It listens for events and issues new commands.

// Saga for a money transfer
class MoneyTransferSaga {
  // Transfer details, remembered so later event handlers can correlate and continue
  private transferId!: string;
  private fromAccountId!: string;
  private toAccountId!: string;
  private amount!: number;

  async start(transferId: string, fromAccountId: string, toAccountId: string, amount: number) {
    this.transferId = transferId;
    this.fromAccountId = fromAccountId;
    this.toAccountId = toAccountId;
    this.amount = amount;
    // 1. Send a command to withdraw from Account A
    await commandBus.send(new WithdrawMoneyCommand(fromAccountId, amount, transferId));
  }

  // React to events
  async onMoneyWithdrawn(event: any) {
    if (event.transferId === this.transferId) {
      // 2. If withdrawal succeeded, command deposit to Account B
      await commandBus.send(new DepositMoneyCommand(this.toAccountId, this.amount, this.transferId));
    }
  }

  async onMoneyDeposited(event: any) {
    if (event.transferId === this.transferId) {
      // 3. If deposit succeeded, mark transfer as complete
      await commandBus.send(new CompleteTransferCommand(this.transferId));
    }
  }
}

The saga ensures the entire process either completes successfully or compensates for any failures, maintaining system consistency.
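
The failure path is where a saga earns its keep. A compensating handler added to MoneyTransferSaga might look like the sketch below; the MoneyDepositFailed event and the FailTransferCommand are hypothetical names I'm introducing to show the shape of compensation, not part of any library.

// Inside MoneyTransferSaga: compensate when the deposit to Account B fails
async onMoneyDepositFailed(event: any) {
  if (event.transferId === this.transferId) {
    // Put the withdrawn money back into Account A, then record the transfer as failed
    await commandBus.send(new DepositMoneyCommand(this.fromAccountId, this.amount, this.transferId));
    await commandBus.send(new FailTransferCommand(this.transferId));
  }
}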

Testing an event-sourced system is different. You can test an aggregate by giving it a command and asserting on the events it produces, without needing a database.

test('withdrawing money produces event', () => {
  const account = BankAccount.fromHistory([anAccountOpenedEvent]);
  
  account.withdraw(50);
  
  const changes = account.getUncommittedChanges();
  expect(changes).toContainEqual(
    expect.objectContaining({ type: 'MoneyWithdrawn', amount: 50 })
  );
});

This style of testing is very focused on business rules. You’re asserting that certain facts are recorded when specific commands are executed under the right conditions.
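
The flip side is just as easy to test: when a business rule blocks the command, we assert that it throws and that no event was recorded. This again assumes a withdraw method that enforces the balance rule.

test('withdrawing more than the balance is rejected', () => {
  // anAccountOpenedEvent is assumed to open the account with a balance of 100
  const account = BankAccount.fromHistory([anAccountOpenedEvent]);

  expect(() => account.withdraw(500)).toThrow();

  // No MoneyWithdrawn event should have been produced
  expect(account.getUncommittedChanges()).toEqual([]);
});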

Building with event sourcing requires a shift in mindset. You stop asking “what is the state?” and start asking “what happened?” This perspective creates systems that are inherently more traceable and flexible. The event log becomes a valuable asset for new features, like generating customer statements or detecting fraud patterns.

It does add complexity. You now have to think about projections, eventual consistency in your read models, and event versioning. But for systems where auditability, temporal analysis, and resilience are critical, this complexity is a worthwhile investment.

I encourage you to start small. Model a single bounded context in your application using events. Experience the clarity it brings to your business logic. Feel the power of being able to rebuild state from an immutable history.

What part of your current system would benefit most from having a complete, unchangeable history? Share your thoughts in the comments below. If you found this walkthrough helpful, please like and share it with other developers who are building the next generation of robust applications. Let’s discuss how we can model our software to better reflect the real-world processes it supports.





