Modelling a bank account and its transactions

I have an event-sourced bank account model which knows about its transactions. It releases funds and receives them, and its balance is the sum of all transactions.

In the state, I maintain a list of the transactions that belong to the account. The obvious flaw is that the list of transactions will keep growing.

The transaction list is done primarily for idempotency (so a transaction can only be added once), but also to have a consistent way of asking for all transactions belonging to that account.

The only solution I can see for the idempotency is to only track the last n transactions in the state. Is there a better way to do it?

What’s the best way to maintain a list of related entities, like accounts to transactions, on the command side? Is it ok to use read models?

Hi Mike,
For models based on double-entry bookkeeping, the best plan is Closing the Books. Basically, follow the accounting practice of closing accounting periods.
If the expected period ID is part of the idempotency check, then only the transactions for the current period need to be checked.
The period can be set to any appropriate length. I’ve seen as short as a day, and as long as a year commonly used.
This both reduces the search space and sets streams up for archiving.
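
To make that concrete, here is a minimal sketch (hypothetical event and state types, not from any particular framework) of an idempotency check that only has to consider the current period’s transactions:

    using System;
    using System.Collections.Generic;

    public sealed record TransactionRecorded(Guid TransactionId, string PeriodId, decimal Amount);
    public sealed record PeriodClosed(string PeriodId, string NextPeriodId, decimal ClosingBalance);

    public sealed class AccountState {
        // Only the current period's transaction ids are kept, so the check stays small.
        private readonly HashSet<Guid> _currentPeriodTransactionIds = new();
        public string CurrentPeriodId { get; private set; } = "";

        public bool HasSeen(Guid transactionId) =>
            _currentPeriodTransactionIds.Contains(transactionId);

        public void Apply(TransactionRecorded e) {
            if (e.PeriodId == CurrentPeriodId)
                _currentPeriodTransactionIds.Add(e.TransactionId);
        }

        public void Apply(PeriodClosed e) {
            // Closing the books: the previous period's ids are no longer needed for idempotency.
            _currentPeriodTransactionIds.Clear();
            CurrentPeriodId = e.NextPeriodId;
        }
    }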

  • Chris

Btw @oskar.dudycz is working on an article about closing the books.

In reality, the bank example is a good one. If you log in to your internet bank, you’ll immediately see it. They have a clear concept of a period when they produce a bank statement for a period. The period is one month. For each month you get a statement with opening and closing balance. That’s when they reconcile, close the period, and open a new one. One period only has transactions for one month. There’s a nuance about booked and reserved transactions too, as banks always account for the booked balance and available balance. Reserved transactions float to the next period, as they aren’t yet booked.
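
Roughly speaking (a simplification, not any particular bank’s exact rules):

    booked balance    = opening balance + sum of booked transactions in the period
    available balance = booked balance − sum of reserved (not yet booked) amounts

which is why the reservations float into the next period once they finally book.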


I couldn’t find anything on the specifics of closing the books with ES, so I’m looking forward to that article.

Presumably the stream is capped with an event that points to the new stream, and the first event of the new stream refers back to the parent?

Would all of those streams share some sort of correlation ID in their first event so that we know all of those streams are the same account?

How do related entities refer to this new stream? Would they just listen for the BooksOpened event and update themselves accordingly?

Mike, I am not sure what “ES” means in your reply, is it EventStore or Event Sourcing?

None of your questions are really technical, and all the answers would only be applicable in a particular context.

Let’s consider two scenarios:

  1. Each period for something (say, bank account) is an aggregate. Then, I’d not expect “related entities” to change anything as the period would be “related” to the account itself, but not the other way around. Technically, it could be that the period identity would include the account id and the period, by convention, so it would be quite easy to figure out the stream name for a given period. But then you keep the period as-is until it goes to archive, and start a new stream for the new period.

  2. When the period is closed, everything before that point goes to archive. The entity stays the same, so there’s no need for a new stream. Say the aggregate is the account itself. When the period is closed, we collect the necessary state into a period-close event, append it to the account stream, and then truncate the stream before that event. The truncation could be a reaction to the reconciliation event, and it could be combined with copying the to-be-truncated portion of the stream to archive storage. (See the sketch after this list.)
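
As a rough illustration only (the stream-naming convention, the event names, and the IEventStore interface below are all assumptions, not a prescribed API), the two options could look like this:

    using System;
    using System.Threading.Tasks;

    // Hypothetical store abstraction, just to show the shape of the two options.
    public interface IEventStore {
        Task AppendAsync(string streamName, params object[] events);
        Task TruncateBeforeAsync(string streamName, long revision); // e.g. via stream metadata
    }

    public sealed record PeriodClosed(string AccountId, string PeriodId, decimal ClosingBalance);
    public sealed record PeriodOpened(string AccountId, string PeriodId, decimal OpeningBalance);

    public static class ClosingTheBooks {
        // Option 1: one stream per account period, named by convention.
        public static string PeriodStream(string accountId, string periodId) =>
            $"account-{accountId}-{periodId}"; // e.g. account-12345-2021-08

        public static async Task CloseIntoNewStream(IEventStore store, string accountId,
            string periodId, string nextPeriodId, decimal balance) {
            await store.AppendAsync(PeriodStream(accountId, periodId),
                new PeriodClosed(accountId, periodId, balance));
            await store.AppendAsync(PeriodStream(accountId, nextPeriodId),
                new PeriodOpened(accountId, nextPeriodId, balance));
        }

        // Option 2: keep one stream per account, append the close/open events,
        // then truncate everything before the close event (optionally archiving it first).
        public static async Task CloseAndTruncate(IEventStore store, string accountId,
            string periodId, string nextPeriodId, decimal balance, long closeEventRevision) {
            var stream = $"account-{accountId}";
            await store.AppendAsync(stream,
                new PeriodClosed(accountId, periodId, balance),
                new PeriodOpened(accountId, nextPeriodId, balance));
            await store.TruncateBeforeAsync(stream, closeEventRevision);
        }
    }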

One important aspect of “Closing the Books” is that it implies leaving the old streams in place and starting a new set with opening events.
I wrote the pattern up for Greg’s Versioning in an Event Sourced System book on Leanpub.

Dear @alexey.zimarev, @chris.condron,

I find the discussion and the topic of event sourcing in connection with bank accounts very exciting and have often thought about it.
I had already started a thread on this in the EM Slack Channel. (https://eventmodeling.slack.com/archives/CKLVCU941/p1624969217169800)
What I find missing in this thread, however, is the double-entry bookkeeping approach.
So far you are only talking about ONE account. In theory, however, a transaction is always posted to at least two accounts (a two-legged or multi-legged transaction; see also M. Fowler’s accounting patterns), and the legs of a transaction always sum to a zero balance.
So even if you look at an account to which something is credited (single remittance), there is always a counter account within the bank from which the money is debited (nostro & vostro accounts).

If you now take both bank accounts as aggregates, how do you ensure transactional safety so that both bookings are reliably written to both streams? That is not really possible. As I understand it, the only source of truth in double-entry bookkeeping is the journal. A transaction results in a journal entry which, once it has been successfully added to the journal, generates the resulting account entries. These account entries are then posted to the respective (general) ledger accounts, which could actually exist as pure read models.
I think it is important to differentiate between bank accounts and accounting accounts because they actually live in different domains (bank account -> payments domain, accounting accounts -> accounting domain)
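
To illustrate the idea of the journal as the source of truth (hypothetical types, just a sketch): a journal entry carries all its legs, the legs must sum to zero, and per-account balances can then be projected from the posted entries as read models.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // One leg of a journal entry: positive = debit, negative = credit (a simplification).
    public sealed record JournalLine(string AccountId, decimal Amount);

    public sealed record JournalEntryPosted(Guid EntryId, DateTimeOffset Date, IReadOnlyList<JournalLine> Lines);

    public static class Journal {
        public static JournalEntryPosted Post(Guid entryId, DateTimeOffset date, IReadOnlyList<JournalLine> lines) {
            if (lines.Count < 2)
                throw new InvalidOperationException("A transaction has at least two legs.");
            if (lines.Sum(l => l.Amount) != 0m)
                throw new InvalidOperationException("Debits and credits must balance to zero.");
            return new JournalEntryPosted(entryId, date, lines);
        }
    }

    // The per-account ledger view is just a projection over posted journal entries.
    public sealed class AccountBalanceProjection {
        private readonly Dictionary<string, decimal> _balances = new();

        public void When(JournalEntryPosted e) {
            foreach (var line in e.Lines)
                _balances[line.AccountId] = _balances.GetValueOrDefault(line.AccountId) + line.Amount;
        }

        public decimal BalanceOf(string accountId) => _balances.GetValueOrDefault(accountId);
    }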

So what is the actual state of a bank account in payments, and what is it for a ledger account (or sub-ledger account) in accounting?
I believe for the accounting account it is all account entries in a limited period, as you mentioned already. But in this case I would not see the account as an aggregate, but rather the journal or the main and sub-ledgers.
For the payment account, there are more technical things, such as account holder, credit line, reporting period, authorizations, etc. All other account-relevant things, such as the current balance, available balance, account transactions or account statements, could exist as pure read models.

So accounting always seems to be the best example to describe event sourcing from scratch, but thinking about implementing it raises questions for me again and again.

Would appreciate hearing your thoughts on this.
best regards

(Side note: @chris.condron & I recorded a conversation on that very topic, with the different alternatives and the pros/cons of each. That will be out shortly here: https://www.youtube.com/c/EventStoreLtd/playlists)

One thing to note in both the accounting and bank account transactions: the true invariant is at the whole-system level, and it is no money created, no money destroyed.

But in this case I would not see the account as an aggregate, but rather the journal or the main and sub-ledgers.

If using a ledger, the ledger is the source of truth, the account balance is a projected state from all the entries.
If using multiple account entities you’ll need to use a write-ahead log pattern, either with
1/ the two entities and a watchdog process to monitor the transaction (so no explicit transaction entity), or
2/ the two entities and a specific entity for the transaction itself.

write ahead means:
Reserve (on entity 1)
Apply (on entity 2)
Complete Reservation (on entity 1 )
Complete Application (on entity 2 )
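
A minimal sketch of that sequence (the command names and ICommandSender interface are made up for illustration; in practice each step would be triggered by the previous step’s event, with a watchdog or an explicit transaction entity to time out and compensate stuck transfers):

    using System.Threading.Tasks;

    public interface ICommandSender { Task Send(object command); }

    public sealed record ReserveFunds(string TransactionId, string FromAccount, decimal Amount);
    public sealed record CreditFunds(string TransactionId, string ToAccount, decimal Amount);
    public sealed record CompleteReservation(string TransactionId, string FromAccount);
    public sealed record CompleteCredit(string TransactionId, string ToAccount);

    public static class TransferProcess {
        public static async Task Run(ICommandSender sender, string txId, string from, string to, decimal amount) {
            await sender.Send(new ReserveFunds(txId, from, amount));   // 1. Reserve (entity 1)
            await sender.Send(new CreditFunds(txId, to, amount));      // 2. Apply (entity 2)
            await sender.Send(new CompleteReservation(txId, from));    // 3. Complete reservation (entity 1)
            await sender.Send(new CompleteCredit(txId, to));           // 4. Complete application (entity 2)
        }
    }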

Not sure if it’s helpful, but in https://github.com/gklijs/bkes-demo I added a UUID to the command, so the handler can easily check whether the same transaction has already been handled. So even if it’s sent twice from the front-end and/or there is some message duplication, it should work.

added a UUID to the command… So even if it’s sent twice from the front-end and/or there is some message duplication, it should work.

yep for dedup this is good.
In the case of a ‘transaction’ though, you should have a transaction id as well that is part of the data of any commands / events,
something like { From: A, To: B, Units: 100, TransactionId: xyz }
this makes the transaction explicit in any implementation you choose
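
For instance (just an illustration of the shape, not tied to any framework), both the command and the resulting event would carry the id, and the handler can deduplicate on it before appending:

    public sealed record TransferFunds(string TransactionId, string From, string To, decimal Units);
    public sealed record FundsTransferred(string TransactionId, string From, string To, decimal Units);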

Indeed, I also had that. But in this case I think it’s a non-scalable one, using increasing integers.

What I miss in the discussion here in this thread, however, is the double-entry bookkeeping approach.
So far you are only talking about ONE account. In theory, however, a transaction is always posted to at least 2 accounts (two-legged transaction, multi-legged transaction - see also M. Fowler’s accounting pattern) and the total of a transaction always results in a zero balance.

This is because in an accounting system, the general ledger entry is the aggregate. The credits and debits contained within must total up to 0.

I think it is important to differentiate between bank accounts and accounting accounts because they actually live in different domains

Yes, this is exactly right. One way to model this is to have one or more documents (invoice, withdrawal, payment, inter-warehouse transfer) attached to a single general ledger entry. In fact, if you look at other accounting systems, they will typically insert this ‘document’ into the database. Then when you close the day out, these documents get posted to the general ledger, after which you cannot modify the document.

The Invoice, Withdrawal, etc., are simply transformed into ledger entries. Of course, what we can do in event sourcing is place all the debits, credits, as well as the documents that generated them into the same stream.
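
Sketching that idea (the event names and class below are illustrative, not Transacto’s actual API): a single general-ledger-entry stream can hold the originating document plus its debits and credits, and posting is only allowed when they balance, after which the entry is immutable.

    using System;

    public sealed record DocumentAttached(Guid EntryId, string DocumentType, string DocumentId); // e.g. invoice, withdrawal
    public sealed record Debited(Guid EntryId, string AccountNumber, decimal Amount);
    public sealed record Credited(Guid EntryId, string AccountNumber, decimal Amount);
    public sealed record EntryPosted(Guid EntryId, DateTimeOffset PostedOn);

    public sealed class GeneralLedgerEntrySketch {
        private decimal _balance;   // running total: debits minus credits
        private bool _posted;

        public void Debit(decimal amount)  { EnsureNotPosted(); _balance += amount; }
        public void Credit(decimal amount) { EnsureNotPosted(); _balance -= amount; }

        public void Post() {
            EnsureNotPosted();
            if (_balance != 0m) throw new InvalidOperationException("Entry does not balance.");
            _posted = true;
        }

        private void EnsureNotPosted() {
            if (_posted) throw new InvalidOperationException("Entry is already posted and cannot be modified.");
        }
    }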

So accounting always seems to be the best example to describe event sourcing from scratch, but thinking about implementing it raises questions for me again and again.

I agree, which is why I’ve put this sample together: https://github.com/thefringeninja/Transacto/


Thanks for your reply. One question: do you mean the ledger is the aggregate, or indeed the ledger entry?

I see both

namespace Transacto.Domain {
    public class GeneralLedger : AggregateRoot { … }
}

namespace Transacto.Domain {
    public class GeneralLedgerEntry : AggregateRoot { … }
}

I need to look at that. Unfortunately I’m not a .NET guy

@tomask.de I am not sure why people keep referring to bank transfers as an example of cross-aggregate transactions. There is not a single bank in the world which makes payments between two accounts as a single transaction. When you find such a bank, please let me know. It will either be awesome or totally horrific.

@alexey.zimarev
I guess that depends what account and which domain you’re talking about. If you mean a bank account in a payments domain, I would agree.
A bank, like every other business, must use double-entry bookkeeping, otherwise regulatory requirements may not be met and audits may not be passed.

So as I mentioned: even a single remittance or the interest calculated when closing a period ultimately leads to a two- or multi-legged transaction in the core banking system, where at least one (accounting) account is credited and a counter (accounting) account is debited (no money created, no money destroyed). These accounts are part of the chart of accounts in a general ledger or a specific sub-ledger. So the ledger within a limited period represents the aggregate root, the single source of truth, doesn’t it?

Now let’s imagine a bank with 8 million customers. Each customer with a bank account in the payments domain gets a corresponding accounting account in the ledger within the accounting domain. Closing a period then means 8 million closing entries and 8 million opening entries in the next period’s ledger.

Please let me know if I am barking up the wrong tree. Maybe I am thinking completely wrong.

Like @Joao_Braganca said, every entry in a ledger is an aggregate on its own. It is the transaction.

What you described concerning the period open and close is great, as here we clearly see why people would not ask “how to keep a stream like a bank account from ever-growing in size”: we understand that the scope is one period.

Still, this is an atomic operation, which is triggered by a single transaction on a single ledger entry, which involves both accounts. Then, both accounts get their own record in the ledger as a reaction.

I still have a hard time understanding why both accounts or both periods must be wrapped in a single transaction. As soon as the entry is created on its own, it just needs to be reflected elsewhere.

Now I imagine a bank with 80 million customers, ten times larger than the one you suggested. When closing the period they will do 240 million transactions. So, what problem are we looking at? I, as a regular customer, normally do from 1 to 20 bank operations per day, as do most of my bank’s customers, which gives us a number of around 10 billion transactions per day, since a single bank operation is never done as a single transaction. On a country-wide salary day, the number grows at least tenfold, to maybe 100 billion transactions in a day.

Why would handling three additional transactions per customer at the end of the period suddenly become a problem? I also might be missing something, but I don’t really see an issue here.

In the paper world just the ledger would be the aggregate. However that is impractical for software as you’d have to read the entire stream or resort to snapshot trickery to maintain consistency. It’s not ideal from a maintenance or ops perspective either.

So, we split them up. In the sample, the general ledger is responsible for closing each period, while entries are records of transactions.

Closing a period then means 8 million closing entries and 8 million opening entries in the next period’s ledger

No. You only need a single entry to close out the previous period. Income and Expense accounts are in a sense temporary. When you close the period, you calculate profit and loss and apply the result to a special equity account (Retained Earnings) via a single closing entry.
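
For example (simplified numbers): if the period’s income accounts total 500 and expense accounts total 300, the single closing entry could look like this, zeroing out all income and expense accounts at once:

    Debit  Income accounts       500
    Credit Expense accounts      300
    Credit Retained Earnings     200   (profit for the period; debits = credits = 500)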

Furthermore, it is really unlikely that each customer has their own account number in the chart of accounts. How/Why would you load anything with 80 million items in it? The document attached to the general ledger entry can contain customer information, and any downstream system (like your bank’s mobile app) can process that however it likes.

@alexey.zimarev @Joao_Braganca

Thanks for your input. Does an Event Model exist for the Transacto example?
I guess that would help to see a clearer picture. If not, maybe this is an idea for one of Adam’s meetups or workshops.

Unfortunately not, however this is an excellent idea! I will ask.
