How to read most efficiently from the stream using Atom Pub


I’m fascinated by the CQRS architecture and I’m playing around with it in a small PHP framework I’m using. To store and fetch events I’m using Event Store and its HTTP API. This works fine.

Reading an event stream, I’m paginating through events as described in the wiki. This is pretty slow, since an aggregate of 100 events results in ~105 HTTP requests (~500 ms on my laptop).

I have read about the embed=body parameter you can set. But how would you fetch event streams from the HTTP API in a production environment? Are there any alternatives?
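For reference, the page-walking part of my client looks roughly like this (sketched in Python rather than my actual PHP code; the field names "entries", "links" and "relation" are just what I see in the JSON responses from my Event Store version, so check yours):

```python
import json

def parse_feed_page(page):
    """Extract event bodies and the link to the next page from one
    Atom feed page fetched with ?format=json&embed=body.

    Assumption: with embed=body each entry carries its payload in a
    "data" field, and pagination links sit in a "links" list. Reading a
    stream oldest-to-newest means following the "previous" relation,
    since the feed itself is ordered newest-first."""
    events = [json.loads(e["data"]) for e in page.get("entries", []) if "data" in e]
    next_uri = None
    for link in page.get("links", []):
        if link.get("relation") == "previous":
            next_uri = link["uri"]
    return events, next_uri
```

The slow part is simply that each page is its own round trip, so the page size dominates the request count.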

Thanks for a nice open source product.

Best regards


The most efficient call is the one you never make. Use the cache headers to tell you how long you can store a resource. Better yet, throw Squid, Varnish, or CloudFront in front. I believe the expiration is one year for every resource except the head one.

Hi Joao

I managed to improve it a lot by using format=json and embed=body, and by manually generating the /range/ links to include 100 events per page. Now it is down to ~2 requests (~23 ms on my laptop). Much better.
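The link generation itself is nothing fancy, roughly this (again sketched in Python; the exact /range/ URL shape is just what works against my Event Store build and may well differ in other versions):

```python
def range_urls(base, stream, total, page_size=100):
    """Build the /range/ page URLs by hand, page_size events per page,
    instead of following the feed's own 20-event paging links.

    Assumption: a page is addressed as
    /streams/{stream}/range/{start}/{count} -- verify against your
    Event Store version before relying on this shape."""
    urls = []
    start = 0
    while start < total:
        urls.append(
            f"{base}/streams/{stream}/range/{start}/{page_size}"
            "?format=json&embed=body"
        )
        start += page_size
    return urls
```

With 100 events per page, a 100-event aggregate needs only one or two page fetches instead of ~105.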

Is there a better way to change the default of 20 events per page so I don't have to manually build the links?

Best regards


We have talked about adding an option for pagination.

The other thing you need to look at is your use of caching (most links coming from the event store have a cache expiration of never).



Hi Greg

Thanks - I’ll try to build a small caching mechanism for the feed fetching on the PHP side, based on the cache expiration headers.
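A minimal sketch of what I have in mind (in Python rather than PHP, with only a naive max-age parse, nothing like a full HTTP cache):

```python
import re
import time

class FeedCache:
    """Tiny per-URL cache driven by the Cache-Control max-age header,
    as a sketch of the caching suggested above. Immutable feed pages
    (the ones served with a one-year max-age) stay cached; the head
    page, served with no/zero max-age, is never stored."""

    def __init__(self):
        self._store = {}  # url -> (expires_at, body)

    @staticmethod
    def max_age(cache_control):
        """Pull max-age out of a Cache-Control header; 0 if absent."""
        m = re.search(r"max-age=(\d+)", cache_control or "")
        return int(m.group(1)) if m else 0

    def get(self, url, now=None):
        now = time.time() if now is None else now
        hit = self._store.get(url)
        if hit and hit[0] > now:
            return hit[1]
        return None

    def put(self, url, body, cache_control, now=None):
        now = time.time() if now is None else now
        ttl = self.max_age(cache_control)
        if ttl > 0:
            self._store[url] = (now + ttl, body)
```

Since every page except the head is effectively immutable, even this crude cache should eliminate most of the remaining requests on replays.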

Regarding pagination - is there a way to increase the 20-events-per-page limit in the feeds, so I don't have to manually build the /range/ URLs?


We were discussing adding a query parameter for this. It is relatively easy to do.

Generally, though, for servers (e.g. web servers with a DB) the TCP API will be more performant than the Atom API.