Batch append to multiple streams in single transaction

Hi,

I am new to event sourcing and about to start evaluating EventStoreDB. I am investigating how I could use EventStoreDB for condition monitoring of equipment (for example pumps).

I have 2 streams that contain the same type of entity (pump), pump1 and pump2.

I have an analytic batch job that:
1: gets all pumps (pump1 and pump2)
2: gets timeseries data for all pumps from separate timeseries DB
3: for each pump, analyses the timeseries data and creates a result event, either "Pump_OK" or "Pump_Low_On_Grease". However, at this point the events aren't persisted.
4: Once all pumps have been analysed, I would like to write the 2 result events to the Event Store in a single transaction, one to stream pump1 and one to stream pump2 (see the sketch after this list).
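
For reference, here is roughly what I mean for step 4, sketched with the Node client @eventstore/db-client (my choice of client is just an assumption; any official client would look similar). Each appendToStream call is atomic for its own stream, but the two calls together are not one transaction:

```ts
import { EventStoreDBClient, jsonEvent } from "@eventstore/db-client";

const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

async function writeResults() {
  // Each append is atomic within its own stream...
  await client.appendToStream("pump1", [
    jsonEvent({ type: "Pump_OK", data: { pumpId: 1 } }),
  ]);
  // ...but this second append is a separate operation: if the process
  // crashes between the two calls, pump2 never receives its result event.
  await client.appendToStream("pump2", [
    jsonEvent({ type: "Pump_Low_On_Grease", data: { pumpId: 2 } }),
  ]);
}
```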

From looking at previous topics, it appears that step 4 is not possible.

Could you please tell me how I could update the 2 streams as part of a single transaction? One approach I thought of is to create a new stream for each analytic job run (instance) and append both result events there in a single transaction. A separate process could then subscribe to the job-run stream and update the individual pump streams.
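
Sketching that idea with the same client (the stream names, event shape, and jobrun naming convention are just my assumptions): the batch append to the job-run stream is a single atomic operation, since a multi-event append to one stream is atomic in EventStoreDB, and a catch-up subscription then copies each result to its pump stream:

```ts
import { EventStoreDBClient, jsonEvent, START } from "@eventstore/db-client";

const client = EventStoreDBClient.connectionString("esdb://localhost:2113?tls=false");

// One stream per analytic job run (hypothetical naming convention).
const jobRunStream = "analyticjobrun-42";

async function publishJobRunResults() {
  // A multi-event append to a single stream is atomic,
  // so both results become visible together or not at all.
  await client.appendToStream(jobRunStream, [
    jsonEvent({ type: "Pump_OK", data: { pumpId: 1 } }),
    jsonEvent({ type: "Pump_Low_On_Grease", data: { pumpId: 2 } }),
  ]);
}

async function fanOutToPumpStreams() {
  // Catch-up subscription that copies each result event to its pump stream.
  const subscription = client.subscribeToStream(jobRunStream, {
    fromRevision: START,
  });
  for await (const resolved of subscription) {
    if (!resolved.event) continue;
    const data = resolved.event.data as { pumpId: number };
    await client.appendToStream(`pump${data.pumpId}`, [
      jsonEvent({ type: resolved.event.type, data }),
    ]);
  }
}
```

One thing I am aware of: the fan-out step is at-least-once, so the copier would need to checkpoint its position or append idempotently to avoid duplicating events after a restart.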

That's how I would go about it. The job run would be represented by a stream, which could contain many pump status events.
If I need to know the status of Pump 1, I can project the events with "pumpId": 1.
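
As a sketch of that projection (assuming the Node client again, and that each result event carries a pumpId in its data; in production this would more likely be a server-side projection or a read model):

```ts
import { EventStoreDBClient, FORWARDS, START } from "@eventstore/db-client";

// Returns the type of the latest result event for the given pump,
// e.g. "Pump_OK" or "Pump_Low_On_Grease".
async function latestPumpStatus(
  client: EventStoreDBClient,
  stream: string,
  pumpId: number
): Promise<string | undefined> {
  let latest: string | undefined;
  const events = client.readStream(stream, {
    direction: FORWARDS,
    fromRevision: START,
  });
  for await (const resolved of events) {
    const data = resolved.event?.data as { pumpId?: number } | undefined;
    if (data?.pumpId === pumpId) {
      latest = resolved.event?.type;
    }
  }
  return latest;
}
```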

Thank you for the help, Steven