Most of these questions come from the perspective of someone trying to take the leap from an RDBMS; these are the steps my mind is taking towards understanding this new paradigm…
So in an RDBMS you have your table schemas and data, and your consumers/clients then throw SQL at that data.
In Event Store you have events continually being saved into streams.
For the most part, would you consider streams to be analogous to your tables?
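To make the analogy concrete, here is roughly what I picture for writes. This is just a sketch, assuming the official Node client (@eventstore/db-client); the stream name and event type are invented for illustration:

```js
const { EventStoreDBClient, jsonEvent } = require("@eventstore/db-client");

// Local, insecure single node -- adjust the connection string as needed.
const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

async function placeOrder() {
  // Where SQL would be "INSERT INTO orders ...", here I append an event
  // to a per-aggregate stream such as "order-123".
  await client.appendToStream(
    "order-123",
    jsonEvent({ type: "OrderPlaced", data: { orderId: "123", total: 42.5 } })
  );
}

placeOrder().catch(console.error);
```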
I see in the examples lots of great ways to then introspect your events, essentially projecting out views, or new streams as they seem to become.
Say you add a new client that is interested in a new view of the data. From what I gather, you write a small snippet of JavaScript and, voilà, a new stream is available that you can then subscribe to.
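If I have understood the docs, that snippet would look something like this (my reading of the projections JavaScript API; the event type and target stream name are invented):

```js
// Re-route every OrderPlaced event from anywhere in the store into a new
// stream that the new client can subscribe to.
fromAll().when({
  OrderPlaced: function (state, event) {
    linkTo("orders-for-new-client", event);
  },
});
```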
Conceptually, coming from SQL: you send the query, you get your data, and you can repeat this as many times as you wish.
With Event Store it appears a bit different: you create the view (stream) as a one-time operation, then subscribe to it forevermore, picking up missed data on reconnect, etc.
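In code, I imagine the subscribing side looking roughly like this; again just a sketch with the Node client, reusing the made-up stream name from the projection above:

```js
const { EventStoreDBClient, START } = require("@eventstore/db-client");

const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

async function follow() {
  const subscription = client.subscribeToStream("orders-for-new-client", {
    fromRevision: START,  // on reconnect, pass the last revision handled instead
    resolveLinkTos: true, // the projected stream holds links to the original events
  });

  for await (const resolvedEvent of subscription) {
    console.log(resolvedEvent.event?.type, resolvedEvent.event?.data);
  }
}

follow().catch(console.error);
```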
So let's say you deploy a new microservice that is interested in a particular view of your event store. As part of its bootstrapping it creates this view (new stream), then subscribes to it, for example.
When you scale that microservice up, should you be managing this bootstrapping of new streams in any way, or is it something you can treat as a no-cost, idempotent operation?
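What I have in mind is each instance doing something like the following on startup. I am assuming a createProjection-style call exists and that later instances simply get an "already exists" error they can ignore, which is exactly the part I am unsure about:

```js
// Hypothetical bootstrap step run by every instance of the microservice.
const PROJECTION_NAME = "orders-for-new-client";
const PROJECTION_QUERY = `
  fromAll().when({
    OrderPlaced: function (state, event) {
      linkTo("orders-for-new-client", event);
    },
  });
`;

async function bootstrap(client) {
  try {
    // First instance creates the projection (and hence the new stream)...
    await client.createProjection(PROJECTION_NAME, PROJECTION_QUERY);
  } catch (err) {
    // ...and I assume every later instance just sees an "already exists"
    // style error here and can carry on to subscribing.
    console.log(`Projection ${PROJECTION_NAME} already exists?`, err.message);
  }
}
```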
Then if Event Store crashes, I assume that, provided you bring it back up on the same data folder, everything comes back as it was? Subscriptions, projections, constructed views/streams, etc.
Sorry if these questions seem a little basic, but this really is new territory for me, although I have a lot of interest in what appears to be on offer.
Thanks in anticipation of your response(s).