My service consumes commands from Kafka and uses the event sourcing pattern to persist aggregates. To improve auditability, it has been suggested that I also persist the commands in the event store; then, with the help of a correlation ID, causation ID, and message ID, I could visualize the entire conversation in my service.
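To make the ID scheme concrete, here is a minimal sketch of how correlation/causation metadata is typically propagated; the `Metadata` type and `caused_by` helper are hypothetical, not part of any library:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Metadata:
    """Per-message metadata used to reconstruct a conversation."""
    message_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    correlation_id: str = ""   # constant for the whole conversation
    causation_id: str = ""     # message_id of the direct cause

def caused_by(parent: Metadata) -> Metadata:
    """Metadata for a message produced while handling `parent`."""
    return Metadata(
        correlation_id=parent.correlation_id or parent.message_id,
        causation_id=parent.message_id,
    )

# A command arriving from Kafka starts a conversation:
cmd = Metadata()
cmd.correlation_id = cmd.message_id  # the command is its own origin

# Events produced while handling the command point back to it:
evt = caused_by(cmd)
assert evt.correlation_id == cmd.correlation_id
assert evt.causation_id == cmd.message_id
```

Grouping by `correlation_id` then ordering by `causation_id` links yields the full conversation tree for one request.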
If I store commands in a separate stream and the events in the aggregate's stream, I believe I need to either:
- Put the command-stream and aggregate-stream append operations in a single transaction.
- Split the two operations: first append the command to the commands stream, then subscribe to the commands stream(s) and actually execute the command.
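Option #2 can be sketched with an in-memory stand-in for the event store; the stream names and message shapes here are illustrative assumptions, not a real client API:

```python
from collections import defaultdict

# In-memory stand-in for an event store: stream name -> list of messages.
store: dict[str, list[dict]] = defaultdict(list)

def append(stream: str, message: dict) -> None:
    store[stream].append(message)

def handle_command(command: dict) -> None:
    """Subscriber side: execute the command and append the resulting
    events to the aggregate's own stream."""
    aggregate_stream = f"order-{command['aggregate_id']}"
    append(aggregate_stream, {"type": "OrderPlaced",
                              "aggregate_id": command["aggregate_id"]})

# Step 1: append the command to a dedicated commands stream.
append("commands", {"type": "PlaceOrder", "aggregate_id": "42"})

# Step 2: a subscription to the commands stream picks it up and executes it.
for command in store["commands"]:
    handle_command(command)

assert store["order-42"][0]["type"] == "OrderPlaced"
```

Note that in this shape the subscriber must be idempotent (or track processed command IDs), since a crash between executing a command and checkpointing the subscription would replay it.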
Option #2 seems a bit heavy, but it avoids transactions. Also, Kafka scales through partitioning, which Event Store does not offer, so competing consumers would not be an option: without partitioning, parallel consumers would break command ordering.
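One workaround, if ordering only needs to hold per aggregate, is application-level partitioning: route each command by a hash of its aggregate ID so that all commands for one aggregate reach the same consumer. A minimal sketch (the partition count and `partition_for` helper are assumptions for illustration):

```python
from zlib import crc32

NUM_CONSUMERS = 4

def partition_for(aggregate_id: str, partitions: int = NUM_CONSUMERS) -> int:
    """Route all commands for one aggregate to the same consumer,
    preserving per-aggregate ordering while allowing parallelism."""
    return crc32(aggregate_id.encode()) % partitions

# Commands for the same aggregate always land on the same partition:
assert partition_for("order-42") == partition_for("order-42")
```

This mirrors what Kafka's default partitioner does with a message key, but applied to subscriptions over multiple command streams.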
Is there anything I’m missing?