Storing commands and events transactionally in event store

My service consumes commands from Kafka, and I use the event sourcing pattern to persist the aggregates. To improve auditing in my service, it has been suggested that I also persist the commands in the event store; then, with the help of a correlation id / causation id / message id on each message, I could visualize the entire conversation in my service.
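For concreteness, this is roughly the envelope I have in mind (the field names are just my own convention, not a standard):

```typescript
// Illustrative message envelope; the field names are my own convention.
interface MessageMetadata {
  messageId: string;     // unique id of this particular message
  correlationId: string; // shared by every message in one conversation
  causationId: string;   // messageId of the message that directly caused this one
}

interface Envelope<T> {
  type: string; // e.g. "PlaceOrder" (command) or "OrderPlaced" (event)
  data: T;
  metadata: MessageMetadata;
}
```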

If I store commands in a separate stream and the events in the aggregate's stream, I believe I need to either:

  1. Put the command-stream and aggregate-stream append operations in a single transaction.
  2. Split the two operations: first append the command to the commands stream, then subscribe to the commands stream(s) and execute the commands from there (see the sketch after the next paragraph).

Option #2 seems a bit heavy, but it does not require transactions. Also, Kafka scales through partitioning, which Event Store does not offer; without partitioning, competing consumers would break command ordering, so they are not an option.
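For illustration, here is roughly how I picture option #2: a sketch assuming the Node @eventstore/db-client, where the stream names, the aggregateId field, and the decide function are placeholders, and a real processor would resume from a persisted checkpoint rather than from the start:

```typescript
import { randomUUID } from "node:crypto";
import { EventStoreDBClient, jsonEvent, START } from "@eventstore/db-client";

// Same envelope metadata as sketched above.
type MessageMetadata = { messageId: string; correlationId: string; causationId: string };

const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

// Step 1: on arrival from Kafka, only append the command -- no execution yet.
async function acceptCommand(
  type: string,
  data: { aggregateId: string },
  metadata: MessageMetadata,
): Promise<void> {
  await client.appendToStream("commands", jsonEvent({ type, data, metadata }));
}

// Placeholder for the domain logic: load the aggregate, decide, return events.
async function decide(type: string, data: any): Promise<Array<{ type: string; data: any }>> {
  return []; // elided
}

// Step 2: a catch-up subscription acts as the command executor. A real one
// would resume from a persisted checkpoint instead of always using START.
async function runCommandProcessor(): Promise<void> {
  const subscription = client.subscribeToStream("commands", { fromRevision: START });

  for await (const resolved of subscription) {
    if (!resolved.event) continue;

    const meta = resolved.event.metadata as unknown as MessageMetadata;
    const { aggregateId } = resolved.event.data as { aggregateId: string };
    const events = await decide(resolved.event.type, resolved.event.data);

    await client.appendToStream(
      `aggregate-${aggregateId}`,
      events.map((e) =>
        jsonEvent({
          type: e.type,
          data: e.data,
          metadata: {
            messageId: randomUUID(),
            correlationId: meta.correlationId, // same conversation
            causationId: meta.messageId, // the command caused these events
          },
        }),
      ),
    );
  }
}
```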

Is there anything I’m missing?

I would definitely keep the command and aggregate streams separate.
Option 2 is not heavy, but it:

  • does introduce some latency
  • (mis)uses the stream as a persistent queue, so you need to manage the checkpoint yourself
  • since it is (mis)used as a queue, you also need to handle failed commands, retries, and so on

There is a third way to improve the audit trail and reduce the burden of the points above:
  • do store the command on arrival and keep processing it as you normally would in your pipeline
    -> if the command cannot be saved for whatever reason, you can choose either to continue anyway or to return an error to the sending process (a sketch follows).
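A minimal sketch of that, again assuming the Node @eventstore/db-client; the command-audit stream name and the continueOnAuditFailure policy flag are illustrative:

```typescript
import { EventStoreDBClient, jsonEvent } from "@eventstore/db-client";

const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

// Illustrative policy switch: is the command audit best-effort or mandatory?
const continueOnAuditFailure = true;

async function onKafkaCommand(
  type: string,
  data: Record<string, unknown>,
  metadata: Record<string, unknown>,
): Promise<void> {
  // 1. Record the command as it arrived, purely for the audit trail.
  try {
    await client.appendToStream("command-audit", jsonEvent({ type, data, metadata }));
  } catch (err) {
    if (!continueOnAuditFailure) {
      throw err; // surface the failure to the sending process instead
    }
    console.warn("command not audited, continuing anyway", err);
  }

  // 2. Process the command exactly as the pipeline did before:
  //    load the aggregate, execute, append events to the aggregate stream.
  await executeCommand(type, data, metadata);
}

// Stand-in for the existing, unchanged command-handling step.
async function executeCommand(
  type: string,
  data: Record<string, unknown>,
  metadata: Record<string, unknown>,
): Promise<void> {
  // ... existing pipeline ...
}
```

The audit append sits outside the command's execution path, so a failure there only costs you completeness of the audit trail; it never blocks the aggregate stream, and you keep Kafka's ordering and partitioning for the actual processing.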