My Journey into Event Sourcing: Lessons from the Trenches
A personal exploration of Event Sourcing in .NET - from first principles to production-ready implementations with MartenDB
10 min read
Understanding the Challenge #
Working on enterprise banking systems exposed me to the fundamental limitations of traditional state-based persistence. During a critical production incident involving incorrect account balances, I realized that having only the current state made it nearly impossible to trace the sequence of operations that led to the discrepancy.
This experience highlighted the need for a more comprehensive approach to data persistence—one that captures not just the current state, but the complete history of how that state was achieved. This understanding became the foundation for exploring Event Sourcing as an architectural pattern.
What Event Sourcing Really Means #
Instead of capturing just the end result, Event Sourcing tells the complete story. Think of it as the difference between a photograph and a film reel. While traditional databases show us snapshots in time, event sourcing preserves the entire narrative.
Consider a simple banking scenario. The traditional approach stores:
balance = $1,495
Event Sourcing tells the story:
AccountOpened: $1,000
MoneyDeposited: $500
MoneyWithdrawn: $200
MoneyDeposited: $150
InterestEarned: $47.50
FeesCharged: $2.50
The balance is the same, but now we have the why behind every change. This is where the real power lies.
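To make this concrete, here is a minimal, framework-free sketch that folds the history above back into the current balance. The LedgerEvent type and the signed amounts are purely illustrative, not part of the real model that follows.

using System;
using System.Linq;

// The event history from the scenario above, reduced to signed amounts
var history = new LedgerEvent[]
{
    new(+1000.00m, "AccountOpened"),
    new(+500.00m,  "MoneyDeposited"),
    new(-200.00m,  "MoneyWithdrawn"),
    new(+150.00m,  "MoneyDeposited"),
    new(+47.50m,   "InterestEarned"),
    new(-2.50m,    "FeesCharged")
};

// Replaying (folding) the history reproduces the current state: 1495.00
var balance = history.Aggregate(0m, (total, evt) => total + evt.Amount);
Console.WriteLine($"Balance: {balance}");

public record LedgerEvent(decimal Amount, string Description);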
The Architecture That Emerged #
Over years of building event-sourced systems in .NET, I’ve discovered that successful implementations follow certain patterns. Let me walk you through what I’ve learned, starting with the foundation.
Events as First-Class Citizens #
Events are the heart of the system. In .NET, I’ve found record types to be perfect for this:
public abstract record BankAccountEvent
{
    public DateTime Timestamp { get; init; } = DateTime.UtcNow;
    public Guid EventId { get; init; } = Guid.NewGuid();
}

public record AccountOpened(
    string AccountNumber,
    string CustomerName,
    decimal InitialDeposit) : BankAccountEvent;

public record MoneyDeposited(
    decimal Amount,
    string Source,
    string Reference) : BankAccountEvent;

public record MoneyWithdrawn(
    decimal Amount,
    string Destination,
    string Reference) : BankAccountEvent;

public record AccountClosed(
    string Reason,
    decimal FinalBalance) : BankAccountEvent;
The Aggregate Pattern #
Aggregates become event producers rather than state containers:
public class BankAccount
{
    private readonly List<BankAccountEvent> _uncommittedEvents = new();

    public Guid Id { get; private set; }
    public string AccountNumber { get; private set; }
    public decimal Balance { get; private set; }
    public bool IsClosed { get; private set; }

    // For event sourcing frameworks
    public IEnumerable<BankAccountEvent> UncommittedEvents => _uncommittedEvents;

    public static BankAccount Open(string accountNumber, string customerName, decimal initialDeposit)
    {
        if (initialDeposit < 0)
            throw new InvalidOperationException("Initial deposit cannot be negative");

        // The identity is generated once here, not during event replay,
        // so rebuilding the aggregate from its events stays deterministic
        var account = new BankAccount { Id = Guid.NewGuid() };
        var evt = new AccountOpened(accountNumber, customerName, initialDeposit);
        account.Apply(evt);
        account._uncommittedEvents.Add(evt);
        return account;
    }

    public void Deposit(decimal amount, string source, string reference)
    {
        if (amount <= 0)
            throw new InvalidOperationException("Deposit amount must be positive");
        if (IsClosed)
            throw new InvalidOperationException("Cannot deposit to closed account");

        var evt = new MoneyDeposited(amount, source, reference);
        Apply(evt);
        _uncommittedEvents.Add(evt);
    }

    public void Withdraw(decimal amount, string destination, string reference)
    {
        if (amount <= 0)
            throw new InvalidOperationException("Withdrawal amount must be positive");
        if (IsClosed)
            throw new InvalidOperationException("Cannot withdraw from closed account");
        if (Balance < amount)
            throw new InvalidOperationException("Insufficient funds");

        var evt = new MoneyWithdrawn(amount, destination, reference);
        Apply(evt);
        _uncommittedEvents.Add(evt);
    }

    // Event replay - this is how we rebuild state
    public void Apply(BankAccountEvent evt)
    {
        switch (evt)
        {
            case AccountOpened opened:
                AccountNumber = opened.AccountNumber;
                Balance = opened.InitialDeposit;
                break;

            case MoneyDeposited deposited:
                Balance += deposited.Amount;
                break;

            case MoneyWithdrawn withdrawn:
                Balance -= withdrawn.Amount;
                break;

            case AccountClosed:
                IsClosed = true;
                break;
        }
    }

    public void MarkEventsAsCommitted() => _uncommittedEvents.Clear();
}
Discovering MartenDB: A Game Changer #
Early in my event sourcing journey, I spent countless hours building custom event stores. Then I discovered MartenDB, and it was like finding a Swiss Army knife when you’ve been using a butter knife.
MartenDB leverages PostgreSQL’s JSON capabilities to provide a powerful, production-ready event store with minimal setup. Here’s how I typically configure it:
// In Program.cs
builder.Services.AddMarten(options =>
{
    options.Connection(builder.Configuration.GetConnectionString("PostgreSql"));

    // Configure projections for read models
    options.Projections.Add<BankAccountProjection>(ProjectionLifecycle.Async);
    options.Projections.Add<AccountSummaryProjection>(ProjectionLifecycle.Inline);

    // Schema configuration
    options.AutoCreateSchemaObjects = AutoCreate.All;
    options.DatabaseSchemaName = "event_store";

}).AddAsyncDaemon(DaemonMode.HotCold);
Working with Event Streams #
The beauty of MartenDB lies in its simplicity. Here’s how I handle the core operations:
public class BankAccountRepository
{
    private readonly IDocumentStore _store;

    public BankAccountRepository(IDocumentStore store)
    {
        _store = store;
    }

    public async Task<BankAccount> LoadAsync(Guid accountId)
    {
        await using var session = _store.QuerySession();

        // MartenDB rebuilds the aggregate from events
        var account = await session.Events.AggregateStreamAsync<BankAccount>(accountId);

        return account ?? throw new InvalidOperationException($"Account {accountId} not found");
    }

    public async Task SaveAsync(BankAccount account)
    {
        await using var session = _store.LightweightSession();

        var uncommittedEvents = account.UncommittedEvents.ToList();
        if (!uncommittedEvents.Any()) return;

        var streamState = await session.Events.FetchStreamStateAsync(account.Id);
        if (streamState != null)
        {
            // Stream already exists - append the new events
            session.Events.Append(account.Id, uncommittedEvents.ToArray());
        }
        else
        {
            // First time saving this aggregate - start a new stream
            session.Events.StartStream(account.Id, uncommittedEvents.ToArray());
        }

        await session.SaveChangesAsync();
        account.MarkEventsAsCommitted();
    }

    public async Task<IReadOnlyList<IEvent>> GetEventsAsync(Guid accountId)
    {
        await using var session = _store.QuerySession();
        return await session.Events.FetchStreamAsync(accountId);
    }
}
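With the repository in place, a typical command handler is a load, mutate, save round trip. This is a sketch; ProcessDepositAsync and its parameters are illustrative rather than part of the repository above.

public async Task ProcessDepositAsync(
    BankAccountRepository repository, Guid accountId,
    decimal amount, string source, string reference)
{
    // Load rebuilds the aggregate from its event stream
    var account = await repository.LoadAsync(accountId);

    // The aggregate enforces its invariants and records a MoneyDeposited event
    account.Deposit(amount, source, reference);

    // Save appends only the uncommitted events to the stream
    await repository.SaveAsync(account);
}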
Projections: Building Read Models That Scale #
One of the most powerful aspects of event sourcing is the ability to create multiple views of the same data. MartenDB makes this elegant:
public class BankAccountProjection : SingleStreamProjection<BankAccountSummary>
{
    public static BankAccountSummary Create(AccountOpened opened)
    {
        // Marten assigns the stream identity to the summary's Id
        return new BankAccountSummary
        {
            AccountNumber = opened.AccountNumber,
            CustomerName = opened.CustomerName,
            Balance = opened.InitialDeposit,
            Status = AccountStatus.Active,
            OpenedDate = opened.Timestamp,
            TransactionCount = 1
        };
    }

    public void Apply(MoneyDeposited deposited, BankAccountSummary summary)
    {
        summary.Balance += deposited.Amount;
        summary.LastTransactionDate = deposited.Timestamp;
        summary.TransactionCount++;
    }

    public void Apply(MoneyWithdrawn withdrawn, BankAccountSummary summary)
    {
        summary.Balance -= withdrawn.Amount;
        summary.LastTransactionDate = withdrawn.Timestamp;
        summary.TransactionCount++;
    }

    public void Apply(AccountClosed closed, BankAccountSummary summary)
    {
        summary.Status = AccountStatus.Closed;
        summary.ClosedDate = closed.Timestamp;
    }
}

public class BankAccountSummary
{
    public Guid Id { get; set; }
    public string AccountNumber { get; set; }
    public string CustomerName { get; set; }
    public decimal Balance { get; set; }
    public AccountStatus Status { get; set; }
    public DateTime OpenedDate { get; set; }
    public DateTime? ClosedDate { get; set; }
    public DateTime? LastTransactionDate { get; set; }
    public int TransactionCount { get; set; }
}

public enum AccountStatus
{
    Active,
    Closed,
    Frozen
}
The Benefits I’ve Experienced #
After implementing event sourcing across multiple projects, certain benefits have proven themselves time and again:
Complete Auditability #
Every production incident becomes a learning opportunity. When that 2 AM call comes in, I can trace exactly what happened, when, and why. The events tell the complete story.
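As a quick illustration, dumping an account's raw stream during an investigation takes only a few lines with Marten. The PrintAuditTrailAsync helper below is a sketch of mine, not part of the repository shown earlier; it relies on the version, timestamp, and payload metadata Marten attaches to every stored event.

public async Task PrintAuditTrailAsync(IDocumentStore store, Guid accountId)
{
    await using var session = store.QuerySession();

    var events = await session.Events.FetchStreamAsync(accountId);

    foreach (var evt in events)
    {
        // Each IEvent carries the stream version, timestamp, and deserialized payload
        Console.WriteLine($"{evt.Timestamp:u}  v{evt.Version}  {evt.Data.GetType().Name}: {evt.Data}");
    }
}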
Temporal Queries #
Need to know the account balance on a specific date? Just replay events up to that point. This has saved me countless hours during audits and investigations.
public async Task<decimal> GetBalanceAtTime(Guid accountId, DateTime pointInTime)
{
    await using var session = _store.QuerySession();
    var events = await session.Events.FetchStreamAsync(accountId);

    var account = new BankAccount();
    foreach (var evt in events.Where(e => e.Timestamp <= pointInTime))
    {
        account.Apply((BankAccountEvent)evt.Data);
    }

    return account.Balance;
}
Scalable Read Models #
Different parts of the system need different views of the data. With projections, I can optimize each read model for its specific use case without compromising the write model.
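As a sketch of what that looks like, here is a hypothetical second read model over the same events, aimed at a back-office dashboard rather than per-account lookups. It assumes Marten's MultiStreamProjection, which groups events from many streams into one document per key; the type and property names are illustrative. It would be registered alongside the others, e.g. options.Projections.Add<DailyActivityProjection>(ProjectionLifecycle.Async).

public class DailyActivityProjection : MultiStreamProjection<DailyActivity, string>
{
    public DailyActivityProjection()
    {
        // Group events from every account stream by calendar day
        Identity<MoneyDeposited>(e => e.Timestamp.ToString("yyyy-MM-dd"));
        Identity<MoneyWithdrawn>(e => e.Timestamp.ToString("yyyy-MM-dd"));
    }

    public void Apply(MoneyDeposited deposited, DailyActivity day)
    {
        day.TotalDeposited += deposited.Amount;
        day.TransactionCount++;
    }

    public void Apply(MoneyWithdrawn withdrawn, DailyActivity day)
    {
        day.TotalWithdrawn += withdrawn.Amount;
        day.TransactionCount++;
    }
}

public class DailyActivity
{
    public string Id { get; set; } = string.Empty;   // "yyyy-MM-dd"
    public decimal TotalDeposited { get; set; }
    public decimal TotalWithdrawn { get; set; }
    public int TransactionCount { get; set; }
}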
Industries That Benefit from Event Sourcing #
Beyond financial services, event sourcing provides significant value across various domains:
Healthcare Systems - Patient care requires comprehensive audit trails for regulatory compliance and medical liability protection, capturing every diagnosis, treatment, medication change, and procedure.
E-commerce and Retail - Order lifecycle management, inventory tracking, and customer behavior analysis benefit from complete transaction history and the ability to reconstruct order states at any point.
Supply Chain and Logistics - Tracking goods movement, quality control, and compliance requirements across complex supply networks with full traceability and chain of custody.
Insurance Claims Processing - Managing claims lifecycles with complete audit trails for fraud detection, regulatory reporting, and dispute resolution.
Manufacturing and Quality Control - Production line monitoring, defect tracking, and regulatory compliance in manufacturing environments with full batch traceability.
Government and Public Services - Citizen services, permit processing, and public record management requiring transparent audit trails and accountability.
Research and Clinical Trials - Scientific research requiring complete data provenance, reproducible results, and regulatory compliance for FDA submissions.
These industries share common needs: regulatory compliance, complex business processes, complete audit trails, temporal analysis capabilities, and high-stakes decision making requiring full historical context.
Challenges and Hard-Learned Lessons #
Event sourcing isn’t a silver bullet. Here are the challenges I’ve encountered and how I’ve addressed them:
Schema Evolution #
Events are immutable, but business requirements change. I’ve learned to version events and handle migrations gracefully:
public record MoneyDepositedV2(
    decimal Amount,
    string Source,
    string Reference,
    string Currency = "USD", // New field with default
    string Description = "") : BankAccountEvent;
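Alongside the versioned record, I keep a small upcasting step that translates old events into the new shape when they are read, so downstream code only ever deals with V2. This is a plain C# sketch; Marten also has event versioning support where this kind of transformation can live.

public static class MoneyDepositedUpcaster
{
    // Old events stay untouched in the store; only the in-memory view is upgraded
    public static MoneyDepositedV2 Upcast(MoneyDeposited old) =>
        new(old.Amount,
            old.Source,
            old.Reference,
            Currency: "USD",            // the only currency that existed before V2
            Description: string.Empty)
        {
            // Preserve the original metadata instead of generating fresh values
            Timestamp = old.Timestamp,
            EventId = old.EventId
        };
}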
Eventual Consistency #
Async projections mean your read models might lag behind. I’ve learned to design UIs that gracefully handle this:
public async Task<BankAccountSummary> GetAccountSummaryAsync(Guid accountId)
{
    await using var session = _store.QuerySession();

    // Try read model first (fast)
    var summary = await session.LoadAsync<BankAccountSummary>(accountId);

    if (summary == null)
    {
        // Fallback to live aggregation (slower but always current)
        var account = await session.Events.AggregateStreamAsync<BankAccount>(accountId);
        // Convert to summary...
    }

    return summary;
}
Performance Considerations #
Large event streams can become slow to replay. I use snapshots for performance:
public class BankAccountSnapshot
{
    public Guid AccountId { get; set; }
    public long Version { get; set; }
    public decimal Balance { get; set; }
    public bool IsClosed { get; set; }
    public DateTime SnapshotDate { get; set; }
}
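The loading path then starts from the latest snapshot and replays only the events recorded after it. This is a sketch under a couple of assumptions: the snapshot is stored as an ordinary Marten document keyed by the account id, and ApplySnapshot is a hypothetical helper on the aggregate that restores Balance and IsClosed.

public async Task<BankAccount> LoadWithSnapshotAsync(Guid accountId)
{
    await using var session = _store.QuerySession();

    // Assumes the snapshot document is keyed by the account id
    var snapshot = await session.LoadAsync<BankAccountSnapshot>(accountId);

    var account = new BankAccount();
    var fromVersion = 0L;

    if (snapshot != null)
    {
        account.ApplySnapshot(snapshot);   // hypothetical helper that restores Balance/IsClosed
        fromVersion = snapshot.Version;
    }

    // Replay only the events appended after the snapshot was taken
    var events = await session.Events.FetchStreamAsync(accountId);
    foreach (var evt in events.Where(e => e.Version > fromVersion))
    {
        account.Apply((BankAccountEvent)evt.Data);
    }

    return account;
}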
The Continuous Learning Journey #
Event sourcing has fundamentally changed how I approach system design. It’s taught me to think in terms of behavior and causality rather than just state. Every project brings new insights, new patterns, and occasionally, new mistakes to learn from.
The key insight is that event sourcing isn’t just about technical benefits—it’s about aligning your code with how businesses actually work. Businesses are driven by events: customers place orders, payments are processed, inventory is updated. Event sourcing makes your software speak the same language as your domain experts.
What’s Next in My Event Sourcing Journey #
As I continue exploring this space, I’m excited about:
- Event-driven microservices with proper saga orchestration
- Machine learning on historical event data
- Real-time analytics pipelines built on event streams
- Blockchain integration for immutable audit trails
Event sourcing has become more than just an architectural pattern for me—it’s a lens through which I view software design. Every system tells a story, and event sourcing ensures we never lose the plot.
This note is part of my ongoing exploration of distributed systems architecture. For related thoughts, see my notes on CQRS patterns and Domain-Driven Design.