Event Sourcing in Practice: A .NET Engineer’s Field Notes

A personal exploration of Event Sourcing in .NET, from first principles to production-ready implementations with MartenDB


Fernando

  ·  10 min read

Understanding the Challenge #

Working on enterprise banking systems exposed me to the fundamental limitations of traditional state-based persistence. During a critical production incident involving incorrect account balances, I realized that having only the current state made it nearly impossible to trace the sequence of operations that led to the discrepancy.

This experience highlighted the need for a more comprehensive approach to data persistence—one that captures not just the current state, but the complete history of how that state was achieved. This understanding became the foundation for exploring Event Sourcing as an architectural pattern.

But the real ‘a-ha’ moment wasn’t just about audits. It was realizing that if the event log is the source of truth, the current state is just one of many possible projections. This insight opens up incredible freedom for refactoring, debugging, and re-shaping read models without high-risk database migrations.

What Event Sourcing Really Means #

Instead of capturing just the end result, Event Sourcing tells the complete story. Think of it as the difference between a photograph and a film reel. While traditional databases show us snapshots in time, event sourcing preserves the entire narrative.

Consider this simple banking scenario. Traditional approach stores:

balance = $1,250

Event Sourcing tells the story:

AccountOpened: $1,000
MoneyDeposited: $500
MoneyWithdrawn: $445
MoneyDeposited: $150
InterestEarned: $47.50
FeesCharged: $2.50

The balance is the same, but now we have the why behind every change. This is where the real power lies.
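To make the "film reel" idea concrete, here is a minimal plain-C# sketch (the record names and amounts are illustrative, not the full event types used later in this post) showing that current state is nothing more than a left fold over the history:

```csharp
using System;
using System.Linq;

// Toy event list; names and amounts are illustrative only.
var history = new AccountEvent[]
{
    new AccountOpened(1000m),
    new MoneyDeposited(500m),
    new MoneyWithdrawn(300m),
};

// Current state is just a left fold over the stream.
decimal balance = history.Aggregate(0m, (bal, e) => e switch
{
    AccountOpened o  => bal + o.InitialDeposit,
    MoneyDeposited d => bal + d.Amount,
    MoneyWithdrawn w => bal - w.Amount,
    _ => bal
});

Console.WriteLine(balance); // prints 1200

abstract record AccountEvent;
record AccountOpened(decimal InitialDeposit) : AccountEvent;
record MoneyDeposited(decimal Amount) : AccountEvent;
record MoneyWithdrawn(decimal Amount) : AccountEvent;
```

Replay the fold up to any point in the list and you get the balance as it was at that moment — that is the whole trick behind temporal queries later on.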

A Note on Mindset: Avoid “State Obsession” #

This highlights a critical mindset shift: we record immutable business facts (MoneyDeposited), not state changes. A common mistake, the “State Obsession” anti-pattern, is creating events like BalanceUpdated. This loses the why.

BalanceUpdated { NewBalance: 1250 }

tells you what, but

MoneyDeposited { Amount: 500 }

tells you the intent. Always model the business operation.

The Architecture #

Over years of building event-sourced systems in .NET, I’ve discovered that successful implementations follow certain patterns. Let me walk you through what I’ve learned, starting with the foundation.

Events as First-Class Citizens #

Events are the heart of the system. In .NET, I’ve found record types to be perfect for this:

```csharp
public abstract record BankAccountEvent
{
    public DateTime Timestamp { get; init; } = DateTime.UtcNow;
    public Guid EventId { get; init; } = Guid.NewGuid();
}

public record AccountOpened(
    string AccountNumber,
    string CustomerName,
    decimal InitialDeposit) : BankAccountEvent;

public record MoneyDeposited(
    decimal Amount,
    string Source,
    string Reference) : BankAccountEvent;

public record MoneyWithdrawn(
    decimal Amount,
    string Destination,
    string Reference) : BankAccountEvent;

public record AccountClosed(
    string Reason,
    decimal FinalBalance) : BankAccountEvent;
```

The Aggregate: A Pure State Machine #

The biggest lesson I learned was to stop using the AggregateBase pattern. The Marten team strongly advises against it, as it mixes decision-making and state mutation. A cleaner approach is the “Decider Pattern”, where the aggregate becomes a pure, immutable-style state object. Its only job is to transition its state by applying an event. All business logic lives elsewhere (like in a command handler).

```csharp
public class BankAccount
{
    public Guid Id { get; private set; }
    public string AccountNumber { get; private set; } = null!;
    public decimal Balance { get; private set; }
    public bool IsClosed { get; private set; }

    // Marten requires a public parameterless constructor
    public BankAccount() {}

    // --- State Transition Methods ---
    // These methods have NO business logic. They just apply facts.

    public void Apply(AccountOpened opened)
    {
        // Id is assigned by Marten from the stream ID
        AccountNumber = opened.AccountNumber;
        Balance = opened.InitialDeposit;
    }

    public void Apply(MoneyDeposited deposited)
    {
        Balance += deposited.Amount;
    }

    public void Apply(MoneyWithdrawn withdrawn)
    {
        Balance -= withdrawn.Amount;
    }

    public void Apply(AccountClosed closed)
    {
        IsClosed = true;
    }
}
```

Discovering MartenDB: A Game Changer #

Early in my event sourcing journey, I spent countless hours building custom event stores. Then I discovered MartenDB, and it was like finding a Swiss Army knife when you’ve been using a butter knife.

MartenDB leverages PostgreSQL’s JSON capabilities to provide a powerful, production-ready event store with minimal setup. Here’s how I typically configure it:

```csharp
// In Program.cs
builder.Services.AddMarten(options =>
{
    options.Connection(builder.Configuration.GetConnectionString("PostgreSql")!);

    // Configure projections for read models
    options.Projections.Add<BankAccountProjection>(ProjectionLifecycle.Async);
    options.Projections.Add<AccountSummaryProjection>(ProjectionLifecycle.Inline);

    // Schema configuration
    options.AutoCreateSchemaObjects = AutoCreate.All;
    options.DatabaseSchemaName = "event_store";

}).AddAsyncDaemon(DaemonMode.HotCold);
```

Handling Commands Safely (No More Race Conditions) #

My old LoadAsync/SaveAsync repository harbored a classic, extremely dangerous race condition: if two requests loaded the same account, the last one to save would overwrite the other’s changes.

The correct, production-safe pattern is to treat the entire “load, decide, save” flow as one atomic operation. Marten’s FetchForWriting API is built for this, using optimistic concurrency.

```csharp
public class BankAccountService // Or a CQRS Command Handler
{
    private readonly IDocumentStore _store;

    public BankAccountService(IDocumentStore store)
    {
        _store = store;
    }

    public async Task DepositAsync(Guid accountId, decimal amount, string source, string reference)
    {
        await using var session = _store.LightweightSession();

        // 1. Atomically fetch the aggregate AND its version
        var stream = await session.Events.FetchForWriting<BankAccount>(accountId);

        // 2. Make business decisions (the "Decider" logic)
        if (stream.Aggregate is null)
            throw new InvalidOperationException("Account not found");
        if (stream.Aggregate.IsClosed)
            throw new InvalidOperationException("Account is closed");
        if (amount <= 0)
            throw new InvalidOperationException("Deposit must be positive");

        // 3. Create the event
        var evt = new MoneyDeposited(amount, source, reference);

        // 4. Append the event
        stream.AppendOne(evt);

        // 5. SaveChanges atomically checks the stream version
        await session.SaveChangesAsync();
    }
}
```

Projections: Building Read Models That Scale #

One of the most powerful aspects of event sourcing is the ability to create multiple views of the same data. This naturally leads to CQRS patterns where write and read models are completely separated. MartenDB makes this elegant:

```csharp
public class BankAccountProjection : SingleStreamProjection<BankAccountSummary>
{
    public static BankAccountSummary Create(AccountOpened opened)
    {
        return new BankAccountSummary
        {
            // Id is assigned by Marten from the stream ID
            AccountNumber = opened.AccountNumber,
            CustomerName = opened.CustomerName,
            Balance = opened.InitialDeposit,
            Status = AccountStatus.Active,
            OpenedDate = opened.Timestamp,
            TransactionCount = 1
        };
    }

    public void Apply(MoneyDeposited deposited, BankAccountSummary summary)
    {
        summary.Balance += deposited.Amount;
        summary.LastTransactionDate = deposited.Timestamp;
        summary.TransactionCount++;
    }

    public void Apply(MoneyWithdrawn withdrawn, BankAccountSummary summary)
    {
        summary.Balance -= withdrawn.Amount;
        summary.LastTransactionDate = withdrawn.Timestamp;
        summary.TransactionCount++;
    }

    public void Apply(AccountClosed closed, BankAccountSummary summary)
    {
        summary.Status = AccountStatus.Closed;
        summary.ClosedDate = closed.Timestamp;
    }
}

public class BankAccountSummary
{
    public Guid Id { get; set; }
    public string AccountNumber { get; set; } = null!;
    public string CustomerName { get; set; } = null!;
    public decimal Balance { get; set; }
    public AccountStatus Status { get; set; }
    public DateTime OpenedDate { get; set; }
    public DateTime? ClosedDate { get; set; }
    public DateTime? LastTransactionDate { get; set; }
    public int TransactionCount { get; set; }
}

public enum AccountStatus
{
    Active,
    Closed,
    Frozen
}
```

The Critical Trade-off: Inline vs. Async #

In my Marten setup, I configured projections with ProjectionLifecycle.Async and ProjectionLifecycle.Inline. This choice is one of the most critical design decisions:

  • Inline: Runs in the same transaction as the event append (strong consistency, slower writes).
  • Async: Processed in the background by Marten’s async daemon (fast writes, eventual consistency).

Field Note: Default to Async unless strong consistency is a non-negotiable business requirement. Using Inline for complex, multi-stream projections can cause database contention under high load.

The Benefits I’ve Experienced #

Complete Auditability #

Every production incident becomes a learning opportunity. When that 2 AM call comes in, I can trace exactly what happened, when, and why. The events tell the complete story.
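When I say the events tell the complete story, I mean the incident investigation really is just a query over the stream. A plain-C# sketch of the idea (the timestamps and facts here are invented for illustration; in practice they come from the event store):

```csharp
using System;
using System.Linq;

// Invented timestamped facts standing in for a real stream's events.
var events = new (DateTime At, string Fact)[]
{
    (new DateTime(2024, 1, 5), "AccountOpened: $1,000"),
    (new DateTime(2024, 1, 9), "MoneyDeposited: $500"),
    (new DateTime(2024, 1, 12), "MoneyWithdrawn: $200"),
};

// Reconstruct exactly what happened inside the incident window, in order.
var trail = events
    .Where(e => e.At >= new DateTime(2024, 1, 8) && e.At <= new DateTime(2024, 1, 12))
    .OrderBy(e => e.At)
    .Select(e => $"{e.At:yyyy-MM-dd} {e.Fact}")
    .ToList();

foreach (var line in trail)
    Console.WriteLine(line);
// 2024-01-09 MoneyDeposited: $500
// 2024-01-12 MoneyWithdrawn: $200
```

With a state-based system, that 2 AM question has no answer; with an event stream, it is a filter and a sort.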

Temporal Queries #

Need to know the account balance on a specific date? My first attempt was to fetch all events and replay them in C# (a performance anti-pattern). A better solution is:

```csharp
public async Task<BankAccount> GetAccountAtTime(Guid accountId, DateTime pointInTime)
{
    await using var session = _store.QuerySession();

    // Marten aggregates the stream *inside the database*
    var account = await session.Events.AggregateStreamAsync<BankAccount>(
        accountId,
        timestamp: pointInTime
    );

    return account ?? new BankAccount();
}
```

Scalable Read Models #

Different parts of the system need different views of the data. With projections, I can optimize each read model for its specific use case without compromising the write model.
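The idea in miniature, as plain C# (hypothetical event records; in a real system each view would be a Marten projection like the ones above): the same stream feeds differently-shaped views, and neither view constrains the other.

```csharp
using System;
using System.Linq;

// One stream, two read models: each view folds the same events its own way.
var events = new AccountEvent[]
{
    new MoneyDeposited(500m),
    new MoneyWithdrawn(200m),
    new MoneyDeposited(150m),
};

// View 1: net balance change, for the account detail screen
decimal netChange = events.Sum(e => e switch
{
    MoneyDeposited d => d.Amount,
    MoneyWithdrawn w => -w.Amount,
    _ => 0m
});

// View 2: activity stats, for an operations dashboard
int deposits = events.OfType<MoneyDeposited>().Count();
int withdrawals = events.OfType<MoneyWithdrawn>().Count();

Console.WriteLine($"net {netChange}, {deposits} deposits, {withdrawals} withdrawal(s)");
// prints: net 450, 2 deposits, 1 withdrawal(s)

abstract record AccountEvent;
record MoneyDeposited(decimal Amount) : AccountEvent;
record MoneyWithdrawn(decimal Amount) : AccountEvent;
```

Adding a third view later means adding a third fold — the events already written never have to change.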

Industries That Benefit from Event Sourcing #

Beyond financial services, event sourcing provides significant value across various domains:

Healthcare Systems - Patient care requires comprehensive audit trails for regulatory compliance and medical liability protection, capturing every diagnosis, treatment, medication change, and procedure.

E-commerce and Retail - Order lifecycle management, inventory tracking, and customer behavior analysis benefit from complete transaction history and the ability to reconstruct order states at any point.

Supply Chain and Logistics - Tracking goods movement, quality control, and compliance requirements across complex supply networks with full traceability and chain of custody.

Insurance Claims Processing - Managing claims lifecycles with complete audit trails for fraud detection, regulatory reporting, and dispute resolution.

Manufacturing and Quality Control - Production line monitoring, defect tracking, and regulatory compliance in manufacturing environments with full batch traceability.

Government and Public Services - Citizen services, permit processing, and public record management requiring transparent audit trails and accountability.

Research and Clinical Trials - Scientific research requiring complete data provenance, reproducible results, and regulatory compliance for FDA submissions.

These industries share common needs: regulatory compliance, complex business processes, complete audit trails, temporal analysis capabilities, and high-stakes decision making requiring full historical context.

Challenges and Hard-Learned Lessons #

Schema Evolution #

Events are immutable, but business evolves. Use Marten’s Upcasting to transform old events on the fly:

```csharp
public record MoneyDepositedV2(
    decimal Amount,
    string Source,
    string Reference,
    string Currency) : BankAccountEvent;

builder.Services.AddMarten(options =>
{
    options.Events.Upcast<MoneyDeposited, MoneyDepositedV2>(
        oldEvent => new MoneyDepositedV2(
            oldEvent.Amount,
            oldEvent.Source,
            oldEvent.Reference,
            "USD" // old deposits predate multi-currency support
        )
    );
});
```

Eventual Consistency #

Async projections can lag. Real-world solutions:

```csharp
// 1. Wait for projections in integration tests
await host.WaitForNonStaleProjectionDataAsync(TimeSpan.FromSeconds(5));

// 2. Production monitoring
builder.Services.AddHealthChecks()
    .AddMartenAsyncDaemonHealthCheck(maxEventLag: 100);
```

Performance: Snapshots for Long-Lived Streams #

Large streams can be slow to replay. Marten supports snapshots:

```csharp
builder.Services.AddMarten(options =>
{
    options.Projections.Snapshot<BankAccount>(ProjectionLifecycle.Async, 100);
});
```

The Continuous Learning Journey #

Event sourcing has fundamentally changed how I approach system design. It’s taught me to think in terms of behavior and causality rather than just state. Every project brings new insights, new patterns, and occasionally, new mistakes to learn from.

The key insight is that event sourcing isn’t just about technical benefits—it’s about aligning your code with how businesses actually work. Businesses are driven by events: customers place orders, payments are processed, inventory is updated. Event sourcing makes your software speak the same language as your domain experts.

What’s Next in My Event Sourcing Journey #

As I continue exploring this space, I’m excited about:

  • Event-driven microservices with proper saga orchestration
  • Machine learning on historical event data
  • Real-time analytics pipelines built on event streams
  • Blockchain integration for immutable audit trails

Event sourcing has become more than just an architectural pattern for me—it’s a lens through which I view software design. Every system tells a story, and event sourcing ensures we never lose the plot.
