I have two projections that are similar to the following:
fromStreams('$et-Orders', '$et-Sales')
.partitionBy(function(e) {
    return e.data.StoreId; // common field linking the events
})
.when({
    $init: function(s, e) {
        return {
            count: 0 // state is unused here; a linkTo-only projection doesn't need it
        };
    },
    $any: function(s, e) {
        var streamName = "StoreHistory-" + e.data.StoreId.replace(/-/gi, "");
        linkTo(streamName, e); // create a "bucket" stream per store
    }
});
//Projection 2 can now attack the new streams concurrently
fromCategory('StoreHistory') // the category of the "StoreHistory-" bucket streams created above
.foreachStream()
.when({
    $init: function(s, e) {
        return {};
    },
    $any: function(s, e) {
        // do stuff
    }
});
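For illustration only, here is one way the second projection's per-stream handler could be filled in. This is a hypothetical sketch: the event type names (OrderPlaced, SaleCompleted) and the counter fields are made up, not taken from the actual system.

fromCategory('StoreHistory')
.foreachStream()
.when({
    $init: function() {
        // hypothetical per-store summary state
        return { orders: 0, sales: 0 };
    },
    $any: function(s, e) {
        // each bucket stream holds links, so e resolves to the original
        // event and e.eventType is the original event's type
        if (e.eventType === 'OrderPlaced') s.orders += 1;
        if (e.eventType === 'SaleCompleted') s.sales += 1;
        return s;
    }
});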
Any thoughts on this approach?
In my head, this is a fairly efficient way of attacking streams that hold millions of events which are not necessarily written to the same aggregate (products, sales, orders, etc.).
The first projection creates a new "bucket" stream per aggregate, and the second can then concurrently go through these buckets.
It does mean we end up with potentially tens of thousands of partitions (maybe more).
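In case it's useful, registering both as continuous projections with emit enabled (linkTo fails without it) can be scripted against the projections HTTP API. A minimal sketch, assuming Node 18+ (for the global fetch), a server on localhost:2113, and the default admin credentials; adjust for your deployment:

// Hypothetical helper: creates a continuous projection via the HTTP API.
async function createProjection(name, source) {
    const query = 'name=' + encodeURIComponent(name) + '&type=js&enabled=true&emit=true';
    const res = await fetch('http://localhost:2113/projections/continuous?' + query, {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            // basic auth with the default admin account
            'Authorization': 'Basic ' + Buffer.from('admin:changeit').toString('base64')
        },
        body: source // the projection JavaScript as a string
    });
    if (!res.ok) throw new Error('Failed to create ' + name + ': ' + res.status);
}

createProjection here is just a throwaway name; the important parts are type=js and emit=true, since the first projection cannot linkTo into the bucket streams unless emit is enabled.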