protobuf.net

Hi,
do you recommend using protobuf.net for serialization and deserialization of events in GES? I know you use it internally, so are there any caveats, or any advantage to be gained (besides the intrinsic advantages of protobuf)? Also, do people think it would be worth the added effort of coding your event classes to work with protobuf.net, or should I just use JSON?

Oh, to give some context: I'm not running a system that needs to be super performant; this is more a general good-practices question.

Thanks,

R

AFAIK both the JSON and protobuf serializers understand [DataContract] / [DataMember(Order=?)], so you can use either depending on context, at the cost of an extra dependency.

GES will support storage of messages in protobuf (I think it would save disk space as well), but you will not be able to use your own projections.

As João mentioned, we see events as just a vector of bytes, so you can use any serialization scheme you want. Projections, however, natively support only JSON; you would have to do your own deserialization in JavaScript if you wanted to project over another format.
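The "vector of bytes" point can be sketched in a few lines. This is a hypothetical Python illustration, not the Event Store client API: the store only ever holds and returns raw bytes, so the serialization scheme is entirely up to the writer and reader. The event class and codec functions here are made up for the example.

```python
import json
import pickle

# Hypothetical event type, purely for illustration.
class UserLoggedIn:
    def __init__(self, user_id):
        self.user_id = user_id

# Two interchangeable codecs; the store itself only ever sees bytes.
def json_encode(event):
    return json.dumps(event.__dict__).encode("utf-8")

def json_decode(data):
    return UserLoggedIn(**json.loads(data))

def pickle_encode(event):
    return pickle.dumps(event)

def pickle_decode(data):
    return pickle.loads(data)

# An in-memory stand-in for an event stream: just a list of byte blobs.
stream = []
stream.append(json_encode(UserLoggedIn(42)))
stream.append(pickle_encode(UserLoggedIn(43)))

# Each reader decodes with whatever scheme it knows was used.
assert json_decode(stream[0]).user_id == 42
assert pickle_decode(stream[1]).user_id == 43
```

Only the JSON-encoded event would be usable by projections out of the box; the pickled one is opaque bytes as far as the store is concerned.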

OK, well, I thought I would get some nifty benefit (not to mention be cooler) if I used protobuf, but with the added syntax, and the fact that projections work natively with JSON, it doesn't seem worth the effort; unless, of course, we start having billions of events and I need the space and speed.
Thanks for the clarification,

R

Just thinking out loud: what about using compressed (gzipped) JSON, in the hope that Event Store will support decompression for projections? I don't think it's hard to implement… Or maybe it will support some plugin mechanism so we could add our own extraction.

Probably better to compress the disk at that point.

Oren did some experiments on compressing json documents and found the result was often larger than the original. Basically, there wasn’t enough input data to make the compression worthwhile. So then he experimented with a shared dictionary that produced better results.
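The effect is easy to reproduce. A minimal Python sketch using zlib (the sample document and dictionary are made up): a tiny JSON document "compresses" to something larger than itself, while a preset dictionary of the keys shared across documents (zlib's zdict) shrinks the output considerably.

```python
import json
import zlib

doc = json.dumps({"user": "R", "action": "login"}).encode("utf-8")

# Plain deflate: for a document this small, header/checksum overhead
# makes the compressed form larger than the original.
plain = zlib.compress(doc)
assert len(plain) > len(doc)

# A preset dictionary of the structure shared by all documents lets
# the common keys be encoded as back-references instead of literals.
shared = b'{"user": "", "action": ""}'
comp = zlib.compressobj(zdict=shared)
with_dict = comp.compress(doc) + comp.flush()
assert len(with_dict) < len(plain)

# Decompression needs the same dictionary.
decomp = zlib.decompressobj(zdict=shared)
assert decomp.decompress(with_dict) + decomp.flush() == doc
```

The catch, as the posts discuss, is that the dictionary becomes shared state: every reader must have exactly the same dictionary that was used to compress, forever.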

There’s a long series of posts. Here are a couple:

http://ayende.com/blog/167169/using-shared-dictionary-during-compression

http://ayende.com/blog/167201/compression-finale

As João mentioned, it’s probably best to compress the disk.

Of course, the other thing we could look at (before compressed JSON) is BSON, with something like this: https://github.com/naruto-star/bson.v8

We have previously discussed supporting BSON internally for projections as well.