Hey
We are currently moving away from a strictly MSSQL-based event store and read-model persistence (thank god), towards Event Store with Elasticsearch as the primary read-model persistence.
One of the arguments that won over the rest of the team was that rebuilding the read model was extremely slow and brittle with the MSSQL solution, and the amount of mapping needed (from JSON to C# to Entity Framework to SQL) was ridiculous. That got me thinking: what if we had a category of streams holding the read models, populated by projections inside the Event Store, so that when rebuilding we only have to read backwards and ignore products already committed to the read-model store…
Am I barking up the wrong tree? What do other people do? Are there any good examples of denormalizer implementations, along with some performance metrics (time to rebuild / number of relevant events)?
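To make the idea concrete, here is roughly the kind of projection I have in mind. Event Store user-defined projections are JavaScript; I have written the sketch as TypeScript only for readability, with declare lines standing in for the functions the projection host provides. The category name, the event types and the snapshot shape are all made up for illustration.

```typescript
// Sketch of an Event Store user-defined projection that keeps a per-product snapshot
// and emits a ProductReadModelProjectionChanged event whenever the snapshot changes.
// The declares stand in for functions the projection host provides; everything below
// them is ordinary projection code.
declare function fromCategory(category: string): any;
declare function emit(streamId: string, eventType: string, body: unknown): void;

type ProductSnapshot = { productId: string | null; name: string | null; status: string | null };

// Push the current snapshot onto the single snapshot stream.
function emitChanged(state: ProductSnapshot): void {
  emit("ProductReadModelProjections", "ProductReadModelProjectionChanged", state);
}

fromCategory("product")   // assumption: product streams live in a "product" category
  .foreachStream()        // one snapshot per product stream
  .when({
    $init: (): ProductSnapshot => ({ productId: null, name: null, status: null }),

    // Hypothetical event types; substitute the real ones.
    ProductCreated: (state: ProductSnapshot, ev: any): ProductSnapshot => {
      state.productId = ev.data.productId;
      state.name = ev.data.name;
      state.status = "Created";
      emitChanged(state);
      return state;
    },
    ProductDiscontinued: (state: ProductSnapshot, ev: any): ProductSnapshot => {
      state.status = "Discontinued";
      emitChanged(state);
      return state;
    },
  });
```

Whether emitting from a partitioned (foreachStream) projection into one shared stream holds up at our event volumes is exactly the kind of thing I would like numbers on.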
Problem:
Given we have a lot of products to show on a website,
And the products change state often during their lifecycle
When I rebuild my read model from scratch (e.g. due to deployed software changes that do not affect the interpretation of the product-related events)
Then I have to wait while all denormalized versions (redundant or otherwise) of all products are persisted synchronously
Thesis:
Given the core information about a product is kept as a “snapshot” maintained by a projection
And that projection emits a “ProductReadModelProjectionChanged” event per changed product to the stream “ProductReadModelProjections”
Then I should be able to rebuild my read model with the fewest possible writes, since I can read “ProductReadModelProjections” backwards and discard “ProductReadModelProjectionChanged” events for products I have already written to the read-model store (rough sketch below).
… And this is a good thing, because it is not a horrible idea, and no one has a better solution to the problem
That last line is the one I am uncertain about.
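For what it is worth, here is a rough sketch of the rebuild loop I am picturing, just to frame the question. It uses the current TypeScript client (@eventstore/db-client) purely for illustration; the .NET client has an equivalent backwards read. upsertProduct is a hypothetical stand-in for whatever writes the document to Elasticsearch, and the snapshot shape matches the projection sketch above.

```typescript
import { EventStoreDBClient, BACKWARDS, END } from "@eventstore/db-client";

// Hypothetical shape of the snapshot carried by ProductReadModelProjectionChanged.
interface ProductSnapshot {
  productId: string;
  name: string;
  status: string;
}

// Hypothetical stand-in for the write to the read-model store (an Elasticsearch index in our case).
async function upsertProduct(snapshot: ProductSnapshot): Promise<void> {
  // e.g. index the document under snapshot.productId
}

async function rebuildReadModel(connectionString: string): Promise<void> {
  const client = EventStoreDBClient.connectionString(connectionString);
  const seen = new Set<string>(); // products already written during this rebuild
  let writes = 0;
  let skipped = 0;

  // Read the snapshot stream newest-first, so the first event we meet per product is its latest state.
  const events = client.readStream("ProductReadModelProjections", {
    direction: BACKWARDS,
    fromRevision: END,
  });

  for await (const resolved of events) {
    const event = resolved.event;
    if (!event || event.type !== "ProductReadModelProjectionChanged") continue;

    const snapshot = event.data as unknown as ProductSnapshot;
    if (seen.has(snapshot.productId)) {
      skipped++; // an older version of a product we have already written
      continue;
    }

    seen.add(snapshot.productId);
    await upsertProduct(snapshot);
    writes++;
  }

  console.log(`Rebuild done: ${writes} writes, ${skipped} older versions skipped.`);
}
```

The point of the backwards read is that the number of writes then scales with the number of distinct products rather than with the number of ProductReadModelProjectionChanged events, which is where I hope the rebuild time goes down.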
Any thoughts?