Batching appends to multiple streams

Hi,

Currently the .NET client supports append operations for a single stream only. Are there any plans to support batched appends, so that one could append to multiple streams in a single request? That would improve performance a lot for many use cases.

I imagine something like this:

AppendEventsToStreamsAsync(StreamAppends[] streamAppends)
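
Fleshed out a bit, the shape I have in mind would be roughly the following (all of these types are hypothetical; nothing here exists in the current client):

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using EventStore.ClientAPI; // for EventData

    // Hypothetical only: one append request for one stream.
    public sealed class StreamAppend
    {
        public string StreamName { get; }
        public long ExpectedVersion { get; }
        public IReadOnlyList<EventData> Events { get; }

        public StreamAppend(string streamName, long expectedVersion, IReadOnlyList<EventData> events)
        {
            StreamName = streamName;
            ExpectedVersion = expectedVersion;
            Events = events;
        }
    }

    // Hypothetical only: append to many streams in a single round trip.
    Task AppendEventsToStreamsAsync(params StreamAppend[] streamAppends);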

Or would this create any undesirable side effects?

Thanks.

There intentionally isn’t a way to do this (in the server) - what if the streams were across multiple partitions?

In that case, yes, it makes sense; however, the majority of cases (I guess) are not using multiple partitions.

The real question is why are you writing to multiple streams in one operation? This is normally a modeling issue.

My scenario is data migration. I have one stream per aggregate in a DDD context. This is an “offline” operation; however, it takes us quite a while to create ~10 million streams in Event Store when a single stream is created per operation.

Running synchronously or doing many streams concurrently?

Concurrently. At the moment the best I can achieve is ~700 streams per second.

How many events/second/size of events?

One event per stream (e.g. “EntityMigrated”). The event is quite small, ~1 KB.

So if it's one event per stream, why can you only get to about 700/sec? A fully asynchronous client should easily be able to get to, say, 20k. Want to post some code?

Or check https://github.com/EventStore/EventStore/blob/dev/src/EventStore.TestClient/Commands/WriteFloodProcessor.cs
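
The gist of that pattern is keeping thousands of single-stream appends in flight at once instead of awaiting each one in turn. A minimal sketch, assuming the EventStore.ClientAPI TCP client (the endpoint, stream names, event type and counts below are made up):

    using System;
    using System.Collections.Generic;
    using System.Text;
    using System.Threading;
    using System.Threading.Tasks;
    using EventStore.ClientAPI;

    class WriteFlood
    {
        static async Task Main()
        {
            var connection = EventStoreConnection.Create(new Uri("tcp://admin:changeit@localhost:1113"));
            await connection.ConnectAsync();

            var payload = Encoding.UTF8.GetBytes("{\"migrated\":true}");
            var throttle = new SemaphoreSlim(2000);   // cap on in-flight appends
            var tasks = new List<Task>();

            for (var i = 0; i < 100_000; i++)
            {
                await throttle.WaitAsync();
                tasks.Add(AppendOne(connection, i, payload, throttle));
            }

            await Task.WhenAll(tasks);
            connection.Close();
        }

        static async Task AppendOne(IEventStoreConnection connection, int i, byte[] payload, SemaphoreSlim throttle)
        {
            try
            {
                var evt = new EventData(Guid.NewGuid(), "EntityMigrated", isJson: true, data: payload, metadata: null);
                // One small event per stream; throughput comes from having
                // thousands of these awaiting the server at the same time.
                await connection.AppendToStreamAsync($"entity-{i}", ExpectedVersion.NoStream, evt);
            }
            finally
            {
                throttle.Release();
            }
        }
    }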

Thanks.

I'll do some clean tests on a local machine to see if something in my code is wrong.

Btw, is it possible that the connection might be a bottleneck when shared between many threads?

Not likely. Try wrfl (you can set the number of connections to use).
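
If you want to rule the shared connection out in your own client code as well, the equivalent is opening a few connections up front and spreading streams across them. A rough sketch, again assuming EventStore.ClientAPI (the four-connection count and the "entity-" stream prefix are arbitrary):

    using System;
    using System.Threading.Tasks;
    using EventStore.ClientAPI;

    class MultiConnectionWriter
    {
        private readonly IEventStoreConnection[] _connections;

        private MultiConnectionWriter(IEventStoreConnection[] connections) => _connections = connections;

        public static async Task<MultiConnectionWriter> CreateAsync(Uri uri, int connectionCount)
        {
            var connections = new IEventStoreConnection[connectionCount];
            for (var i = 0; i < connectionCount; i++)
            {
                connections[i] = EventStoreConnection.Create(uri);
                await connections[i].ConnectAsync();
            }
            return new MultiConnectionWriter(connections);
        }

        // Round-robin streams over the connections so no single TCP
        // connection becomes the choke point.
        public Task AppendAsync(int streamIndex, EventData evt) =>
            _connections[streamIndex % _connections.Length]
                .AppendToStreamAsync($"entity-{streamIndex}", ExpectedVersion.NoStream, evt);
    }

Usage would be something like MultiConnectionWriter.CreateAsync(new Uri("tcp://admin:changeit@localhost:1113"), 4) followed by the same kind of concurrent append loop as above. Whether it helps depends on whether the single connection is actually the choke point; wrfl with different connection counts will answer that more directly.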