Hi,
Suppose I’m using a message broker that provides an at-least-once delivery guarantee, meaning there is a chance that the same command will be delivered to, and processed by, my service twice.
The usual solution here is command deduplication, so every command is supplied with a unique id.
Now, reading the article https://developers.eventstore.com/clients/dotnet/5.0/appending.html#append-events-to-a-stream and going through multiple forums, I understand that EventStore guarantees idempotent writes only at its API level. This means that if CommandA, which produces EventA with id ‘id1’, is executed twice, I will end up with two events appended to my stream:
- EventA (‘id1’)
- EventA (‘id1’)
The reason for this is that every time a command is processed, the following workflow runs:
var aggregate = aggRepo.Load(cmd.AggregateId);
aggregate.MethodA();
aggRepo.Save(aggregate, cmd.Id);
Inside the aggregate repository, a call to EventStore is issued:
conn.AppendToStreamAsync("newstream", expectedVersion: agg.InitialVersion, ToEvents(agg.Events, cmd.Id));
The method ToEvents() creates EventStore event objects with predictable GUIDs: each EventId is derived deterministically from the command’s id.
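To make it concrete, here is a minimal sketch of how such a deterministic EventId could be produced. The helper name and the hashing scheme (SHA-1 over the command id plus the event’s index, truncated to 16 bytes) are my own assumptions, not EventStore API:

// Hypothetical sketch: derive each EventId deterministically from
// the command id plus the event's position, so a retried command
// yields identical EventIds.
using System;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

static Guid DeterministicEventId(Guid commandId, int eventIndex)
{
    using var sha1 = SHA1.Create();
    var input = Encoding.UTF8.GetBytes($"{commandId:N}:{eventIndex}");
    var hash = sha1.ComputeHash(input);
    return new Guid(hash.Take(16).ToArray()); // first 16 bytes of the hash
}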
Given the above logic, if CommandA is processed a second time, the expectedVersion is no longer the one that was current during the first processing. Consequently, the second EventA is appended successfully even though it has the same EventId.
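In other words (a hypothetical timeline, assuming the stream starts at version 0 and each variable names are mine):

// First processing of CommandA: aggregate loaded at version 0.
await conn.AppendToStreamAsync("newstream", 0, eventA); // succeeds; stream is now at version 1

// Redelivered CommandA: the aggregate is reloaded, so InitialVersion is now 1.
await conn.AppendToStreamAsync("newstream", 1, eventA); // also succeeds: as I understand it, the
                                                        // idempotency check does not catch the
                                                        // duplicate EventId at a later expectedVersion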
Could you suggest a standard way of achieving command deduplication together with optimistic concurrency in EventStore?