Streams with tens of millions of events

TL;DR: does Event Store handle very large streams (tens of millions of events) well?

I'm evaluating event sourcing (with Event Store) for the bulk analytical processing software I work on. One of the meat-and-potatoes items for this software is a large population of multi-dimensional tables of numbers. Some of them are created by users in the UI; others are bulk imported from other sources.

Some of the larger tables have up to 8 or 9 dimensions and tens of millions of cells. In an Event Store model, I had envisioned two events (sketched after the list):

  • TableCreated (containing table metadata including dimensions)

  • TableCellChanged (for upserts of single-cell changes)
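
For concreteness, a minimal sketch of what those two payloads might look like (TypeScript; the field names are my own illustration, not a prescribed schema):

```typescript
// Hypothetical payloads -- field names are illustrative assumptions.

interface TableCreated {
  tableId: string;
  name: string;
  // One entry per dimension; the large tables described above
  // would have 8 or 9 of these.
  dimensions: { name: string; size: number }[];
}

interface TableCellChanged {
  tableId: string;
  // One coordinate per dimension, in the order declared in TableCreated.
  coordinates: number[];
  value: number;
}
```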

This model works fine for smaller tables, but for these big tables, we’d have some awfully big streams, containing those tens of millions of TableCellChanged events.

This is tolerable from an application standpoint, since we’ll probably just build the read model once for each table, and only rebuild it if the read-model definition changes.
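
A rebuild is then just a fold of the stream into an in-memory structure. A minimal sketch, assuming the cell events above and an arbitrary string encoding of the coordinates as the key:

```typescript
// Replay TableCellChanged events into a map keyed by cell coordinates.
// The join(",") key encoding is an arbitrary illustrative choice.
function rebuildTable(
  events: { coordinates: number[]; value: number }[],
): Map<string, number> {
  const cells = new Map<string, number>();
  for (const ev of events) {
    // A later event for the same cell overwrites the earlier one (upsert).
    cells.set(ev.coordinates.join(","), ev.value);
  }
  return cells;
}
```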

My question is: can Event Store handle such large streams well? Or should I invent some kind of “chunking” event, something like TableBulkUpserts, that batches many cell changes into a single event, so the stream contains fewer but bulkier events?
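
To make the chunking idea concrete, here is a sketch of a batched event and a helper that splits a bulk import into fixed-size batches (the TableBulkUpserts shape and the batch size are my assumptions):

```typescript
// Hypothetical batched event: many cell changes in a single payload.
interface TableBulkUpserts {
  tableId: string;
  changes: { coordinates: number[]; value: number }[];
}

// Split a bulk import into batches of, say, 1,000 cells per event:
// a 10M-cell import then becomes ~10,000 events instead of 10,000,000.
function toBulkEvents(
  tableId: string,
  changes: { coordinates: number[]; value: number }[],
  batchSize = 1000,
): TableBulkUpserts[] {
  const events: TableBulkUpserts[] = [];
  for (let i = 0; i < changes.length; i += batchSize) {
    events.push({ tableId, changes: changes.slice(i, i + batchSize) });
  }
  return events;
}
```

The trade-off: far fewer events to store and replay, at the cost of losing one-event-per-cell granularity in the stream.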

ES doesn't care whether it's 1M streams with 1 event each or 1 stream with 1M events. That said, there is a limit of roughly 2 billion events per stream.