How to change event data for development purposes?

So I realize that you don’t wanna be able to change event data in an append-only store like this, but in terms of development and maintenance, I find this to be an issue that I hope there’s a solution for. Simply put, I’m developing a system and testing it, so events are going into the event store. Then I refactor the solution as I learn more about it, and some of the event objects might end up moving. Now, when I replay the event store, nothing will deserialize, because the qualified type names have changed since the files moved. My first instinct is to find those events and update the $type field to point to the correct type.

A couple of things about how events are going into the event store:

  • I’m serializing the event object to JSON before converting it to byte[], and this is what gets stored.
  • I have JsonSerializerSettings set to include $type for each object, to make it easier to deserialize the object when I don’t know the object type at runtime.
  • This places a FQN (fully qualified type name) into the field. I assume it’s used in a Type.GetType(string) fashion under the hood.

Is there any way to fix this?
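For context, the setup described above looks roughly like this - a minimal sketch assuming Json.NET, since the $type field and those serializer settings are Json.NET conventions (the event class and namespace here are made up):

```csharp
using System;
using Newtonsoft.Json;

public class SurveyGeneratedEvent
{
    public Guid SurveyId { get; set; }
}

public static class SerializationDemo
{
    public static void Main()
    {
        var settings = new JsonSerializerSettings
        {
            // Embeds e.g. "$type": "MyApp.Events.SurveyGeneratedEvent, MyApp"
            // into the JSON payload.
            TypeNameHandling = TypeNameHandling.All
        };

        string json = JsonConvert.SerializeObject(
            new SurveyGeneratedEvent { SurveyId = Guid.NewGuid() }, settings);

        // Deserialization resolves "$type" via a Type.GetType-style lookup,
        // which is exactly why moving or renaming the class breaks replay
        // of previously stored events.
        object roundTripped = JsonConvert.DeserializeObject(json, settings);
        Console.WriteLine(roundTripped.GetType().Name); // SurveyGeneratedEvent
    }
}
```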

My two cents: don’t fully qualify types in events. That seems like it will end up being too brittle over time. We give each message or event a unique URI and use a custom attribute to bind an implementation to that URI. We then use a common library to extract the URI from the message object’s type on send, and to look up a concrete implementation for the URI found in the event metadata on receive. This keeps your implementation details (the DTO) completely decoupled from the message itself, which stays a first-class, independent entity. Message versioning in distributed systems is always a thorny problem, and that’s amplified by event sourcing, where all your old events not only stick around forever but must be reliably consumable for all time - so it’s worth giving your versioning and serialization strategy some serious thought and keeping it as flexible as possible.
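A minimal sketch of that attribute-based binding, with hypothetical names (MessageUriAttribute, MessageTypeRegistry, and the URN format are mine, not from any particular library):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

// Binds a stable, human-assigned URI to a CLR type, so the class can move
// or be renamed without invalidating events already in the store.
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
public sealed class MessageUriAttribute : Attribute
{
    public string Uri { get; }
    public MessageUriAttribute(string uri) => Uri = uri;
}

[MessageUri("urn:myapp:events:survey-generated:v1")]
public class SurveyGeneratedEvent { }

public static class MessageTypeRegistry
{
    private static readonly Dictionary<string, Type> ByUri =
        Assembly.GetExecutingAssembly()
            .GetTypes()
            .Select(t => (Type: t, Attr: t.GetCustomAttribute<MessageUriAttribute>()))
            .Where(x => x.Attr != null)
            .ToDictionary(x => x.Attr.Uri, x => x.Type);

    // On send: stamp this URI into the event metadata instead of a FQN.
    public static string UriFor(Type messageType) =>
        messageType.GetCustomAttribute<MessageUriAttribute>()?.Uri
        ?? throw new InvalidOperationException($"No URI bound to {messageType}.");

    // On receive: resolve the concrete type from the URI in the metadata.
    public static Type TypeFor(string uri) => ByUri[uri];
}
```

In real code you would register the assemblies to scan explicitly rather than only the executing one; the point is that the URI, not the CLR type name, is the durable contract.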

Long response. It’s under way and free to read online.

Also, why not just soft delete your old streams when in development mode (or use new ones every time)?
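For reference, with the .NET TCP client (EventStore.ClientAPI) a soft delete is just a DeleteStreamAsync call with hardDelete left false - a sketch under that assumption:

```csharp
using System.Threading.Tasks;
using EventStore.ClientAPI;

public static class StreamCleanup
{
    // Soft delete: the stream can be recreated later and its events are
    // eventually scavenged. Passing hardDelete: true instead would
    // tombstone the stream permanently.
    public static Task SoftDeleteAsync(IEventStoreConnection connection, string stream) =>
        connection.DeleteStreamAsync(stream, ExpectedVersion.Any, hardDelete: false);
}
```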

I tried the soft delete approach at first, but it was causing errors to be thrown during deserialization. There may have been some other things in play at the time. I’ll have to go back, try that again, and see what happens.

This was a thought that crossed my mind, and this is exactly the kind of practical advice I’m looking for on these forums. Thanks! I guess there’s no getting away from some sort of system like this, but at least it’ll let you move things around. At the moment, I’ve moved everything into a shared messaging library contained within each project’s solution, and that library is a NuGet package that can be pulled in by other services that wanna listen to streams. Versioning is done through namespacing, which I never cared for, so this could help with that.

So I think the issue is that the soft-deleted event still gets read in through the subscription. I noticed that it comes in with the event type $metadata. Am I supposed to check for that and ignore it, or is there something better?

Can you be more specific?

I just appended an event to a new stream, then sent a DELETE request to the REST API (in Fiddler). After that, when I restart my service and my subscriptions start up, an event comes in from the stream I just deleted - its EventType is $metadata (instead of SurveyGeneratedEvent, for example), and I can see that the stream name is the stream I just deleted. At the moment I don’t discriminate, so my process tries to deserialize that event but can’t, because it’s this metadata thing.

More specifically, the stream the event comes in on is the deleted stream’s name prepended with $$ (e.g. $$SurveyGeneratedEvent-), with an event type of $metadata.

That is a metadata stream. What kind of subscription are you using?

This particular scenario is a catch-up subscription. It is subscribed to $ce-Surveys, the by-category projection stream.

Can you provide a short test case just so I understand properly?

I was able to lift some of it out of the abstraction I wrote. Gist here, but I’ll paste it below:

The 3rd method is what deals with the event.

public void CatchUpSubscription(
    string streamName,
    int lastSeenIndex,
    Action<EventInfo, int> processAction)
{
    _processAction = processAction;

    int? startIndex = lastSeenIndex;
    if (lastSeenIndex == -2)
        startIndex = null; // null checkpoint = read from the beginning

    Logger.Debug($"Subscribing to {streamName} as catch-up subscription at index {lastSeenIndex}...", this);

    var settings = new CatchUpSubscriptionSettings(10000, 100, false, true);

    var sub = _eventStore.SubscribeToStreamFrom(
        streamName,
        startIndex,
        settings,
        EventAppeared);

    _disposableCollection.Add(() => sub.Stop());
}

private void EventAppeared(EventStoreCatchUpSubscription eventStoreCatchUpSubscription, ResolvedEvent resolvedEvent)
{
    EventAppeared(resolvedEvent);
}

private void EventAppeared(ResolvedEvent resolvedEvent)
{
    if (_processAction == null)
        throw new InvalidOperationException("The process action was not set.");

    try
    {
        Logger.Debug($"EVENT READ: {resolvedEvent.Event.EventType} ({resolvedEvent.Event.EventStreamId})", this);

        // fails below because I'm not expecting $metadata
        EventInfo info;
        if (!_deserializationFactory.TryDeserialize(resolvedEvent.Event.Data, resolvedEvent.Event.Metadata, out info))
        {
            Logger.Error($"Could not deserialize event {resolvedEvent.Event.EventType}.", this);
            return;
        }

        Process(info, resolvedEvent.OriginalEventNumber);
    }
    catch (Exception ex)
    {
        Logger.Fatal($"Could not deserialize event. {ex.Message}", this, ex);
    }
}
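The thread doesn’t show the eventual fix, but the usual guard is to skip system events - anything whose event type starts with $ (like $metadata) - before attempting deserialization. A sketch of that check (the helper name is mine):

```csharp
using System;

public static class SystemEvents
{
    // EventStore system event types (e.g. "$metadata", "$streamDeleted")
    // are prefixed with '$'; domain events should never be, so this is a
    // cheap filter to apply at the top of an EventAppeared handler.
    public static bool IsSystemEvent(string eventType) =>
        eventType != null && eventType.StartsWith("$", StringComparison.Ordinal);
}
```

In the handler above, that would mean returning early when IsSystemEvent(resolvedEvent.Event.EventType) is true, before calling TryDeserialize.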