Hi, I’ve read through the posts on this subject and looked at the source code, but async stuff is kind of hard to wrap my head around.
I am doing a bulk import of data from SQL to Event Store. I'm grabbing a batch of records, creating commands, running them through my aggregates to produce the events, and saving the aggregates.
The (current) problem is that I get the aggregate, add some events, and try to save, but the previous command has only just finished processing, so I'm getting a concurrency (version) error.
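For context, the processing loop is roughly shaped like this (heavily simplified, names made up; ProcessRecordAsync stands in for my get -> handle -> save pipeline, and as far as I can tell nothing forces one save to finish before the next command for the same aggregate starts):

    foreach (var record in sqlRecords)
    {
        // fire-and-forget: the next record can be picked up
        // before this one's save has appended its events
        ProcessRecordAsync(record);
    }

    private async Task ProcessRecordAsync(SqlRecord record)
    {
        var command = MapToCommand(record);                              // build the command from the SQL row
        var aggregate = await _repository.GetById(command.AggregateId);  // rehydrate, noting the current version
        aggregate.Handle(command);                                       // raises the new events
        await _repository.Save(aggregate);                               // appends using the version read above
    }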
My repository save is doing
await _eventStoreConnection.AppendToStreamAsync(streamName, expectedVersion, newEvents);
and my get is doing
currentSlice = await _eventStoreConnection.ReadStreamEventsForwardAsync(streamName, sliceStart, sliceCount, false);
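For completeness, that call sits inside the usual forward-read loop, roughly like this (paraphrased from my repository; deserialization and event application details omitted):

    var sliceStart = StreamPosition.Start;
    StreamEventsSlice currentSlice;
    do
    {
        currentSlice = await _eventStoreConnection.ReadStreamEventsForwardAsync(streamName, sliceStart, sliceCount, false);
        sliceStart = currentSlice.NextEventNumber;
        foreach (var resolvedEvent in currentSlice.Events)
            aggregate.ApplyEvent(Deserialize(resolvedEvent));   // rebuild state and track the stream version
    } while (!currentSlice.IsEndOfStream);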
I have tried making the save sync by doing
_eventStoreConnection.AppendToStreamAsync(streamName, expectedVersion, newEvents).Wait();
but this hangs.
I haven’t been able to figure out how to make the get sync.
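I assume the blocking equivalent would be something like

    currentSlice = _eventStoreConnection.ReadStreamEventsForwardAsync(streamName, sliceStart, sliceCount, false).Result;

but I'm guessing that would hang in the same way the save does.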
I’m not even sure whether making the save sync would address my issue. I suspect I could put a Thread.Sleep in there, but I’m processing some 400,000 records, which would push the process into next year.
It would be great if I could wrap a set of actions somehow and make sure they all complete before processing the next set, but that seems like it would be even more difficult than making the save sync.
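Something along these lines is what I have in mind, though I'm not sure it's the right approach (just a sketch, reusing the made-up ProcessRecordAsync from above):

    const int batchSize = 1000;
    for (var i = 0; i < sqlRecords.Count; i += batchSize)
    {
        var batch = sqlRecords.Skip(i).Take(batchSize);
        var tasks = batch.Select(record => ProcessRecordAsync(record)).ToList();
        await Task.WhenAll(tasks);   // the whole set completes before the next set starts
    }

(Though I suppose I'd still have to make sure two commands for the same aggregate never end up in the same set.)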
So I guess my questions are: a) how in the world do I make these calls synchronous for the purposes of the import, and b) is this even the right way to address my problem?
Thanks for the help,
R