I have a client which publishes events to ES (Event Store). I have enabled KeepReconnecting and KeepRetrying, but when I restart ES the retried writes end up in the stream out of order. Is it possible to write the events in the correct order without limiting the concurrent operations to 1?
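For context, this is roughly the configuration being described, assuming the .NET ClientAPI (EventStore.ClientAPI); the endpoint URI is a placeholder:

```csharp
using System;
using EventStore.ClientAPI;

// Reconnect and retry indefinitely. With the default concurrent-operations
// limit, operations retried after a node restart can be re-queued in a
// different order than they were originally issued.
var settings = ConnectionSettings.Create()
    .KeepReconnecting()
    .KeepRetrying()
    .Build();

var connection = EventStoreConnection.Create(
    settings, new Uri("tcp://admin:changeit@localhost:1113"));
await connection.ConnectAsync();
```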
Not really, unless you’re setting the expected version with each write as a form of concurrency control. As soon as you introduce concurrency here there is no longer a “correct” order, as the scheduler is free to do what it likes.
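If you do go the expected-version route, it looks roughly like this with the .NET ClientAPI; the stream name, event type and version number are placeholders:

```csharp
using System;
using System.Text;
using EventStore.ClientAPI;
using EventStore.ClientAPI.Exceptions;

var connection = EventStoreConnection.Create(
    ConnectionSettings.Create().KeepReconnecting().KeepRetrying().Build(),
    new Uri("tcp://admin:changeit@localhost:1113"));
await connection.ConnectAsync();

var data = new EventData(
    Guid.NewGuid(), "SomethingHappened", true,
    Encoding.UTF8.GetBytes("{\"value\":42}"), null);

try
{
    // The write only succeeds if the stream is still at the expected version
    // (41 is a placeholder for the last event number you expect to see), so a
    // retried or reordered write is rejected instead of appended out of order.
    await connection.AppendToStreamAsync("my-stream", 41, data);
}
catch (WrongExpectedVersionException)
{
    // Another write got in first; re-read the stream and decide how to retry.
}
```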
If I limit my writes to a single publisher, is it possible to use LimitConcurrentOperationsTo to preserve order while maintaining read performance? Can I set it to 2 and use 1 for writes and 1 for reads, or do I need a new subscription?
The concurrent operations limit is a per-connection setting, and it doesn’t distinguish reads from writes, so a single connection limited to 2 could still have two writes in flight at once. It’s entirely feasible, though, to have two connections with different settings open from the same process.
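A minimal sketch of the two-connection approach (connection names and the endpoint URI are just placeholders): writes go through a connection capped at one in-flight operation, reads through a separate unrestricted one:

```csharp
using System;
using EventStore.ClientAPI;

// Write connection: at most one operation in flight, so retried writes
// can't overtake each other after a reconnect.
var writeConnection = EventStoreConnection.Create(
    ConnectionSettings.Create()
        .KeepReconnecting()
        .KeepRetrying()
        .LimitConcurrentOperationsTo(1)
        .Build(),
    new Uri("tcp://admin:changeit@localhost:1113"),
    "ordered-writer");

// Read connection: default limits, so reads and subscriptions keep their
// usual throughput.
var readConnection = EventStoreConnection.Create(
    ConnectionSettings.Create()
        .KeepReconnecting()
        .Build(),
    new Uri("tcp://admin:changeit@localhost:1113"),
    "reader");

await writeConnection.ConnectAsync();
await readConnection.ConnectAsync();
```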
You might also want to look into running a cluster of Event Store nodes, if you aren’t already.