EventStoreDB client performance issues on NodeJS (Golang and Rust clients are much faster at reading events)

In Golang and Rust reading 1 million events takes 2 minutes 40 seconds from a remote EventStoreDB, and 30 seconds from a local DB.

In NodeJS, the same read takes 10+ minutes (about 4x slower) from a remote EventStoreDB and about 1.5 minutes (3x slower) from a local DB.

Both on my (pretty old) PC with a fiber connection.

On my coworker’s laptop (M1 Pro, slow Starlink network), reading 1 million events from a local DB is only about 30% slower in NodeJS (23s vs 17.5s in Go), which the faster CPU can explain. Reading from the remote DB, however, is 6x slower than Golang (3370s vs 527s).

On our dev VM server (GCP), reading 1 million events from our dev EventStoreDB over the network takes 2.5 minutes in Golang and 10 minutes in NodeJS (4x slower).

Seeing how sensitive the NodeJS client is to network conditions, we suspect there might be a problem with how the NodeJS client is implemented (or with NodeJS itself?).
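One way to rule out NodeJS itself is to time async iteration with no network involved at all. This is a minimal sketch using a local generator in place of the client (nothing here touches EventStoreDB): if this alone is slow, the overhead is in Node's `for await` machinery; if it's fast, the bottleneck is more likely in how the client receives and batches gRPC messages.

```javascript
// Pull n synthetic events through an async iterator -- no network, no
// EventStoreDB client -- to see what `for await` alone costs in NodeJS.
async function* generate(n) {
  for (let i = 0; i < n; i++) {
    yield { event: { type: 'test-event', data: { i } } };
  }
}

async function drain(n) {
  let count = 0;
  for await (const { event } of generate(n)) {
    // Same filtering shape as the real read loop below.
    if (!event || event.type === '$metadata') continue;
    count++;
  }
  return count;
}

console.time('drain-1M');
drain(1_000_000).then((count) => {
  console.timeEnd('drain-1M');
  console.log(`consumed ${count} events`);
});
```

On our machines this kind of no-network loop finishes far faster than the client read, which is part of why we suspect the transport rather than the iteration itself.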

We ran benchmarks multiple times using different approaches, but NodeJS is consistently 3-6x slower at reading events over the network, on both fiber (4x slower) and Starlink (6x slower) connections. This applies to both the readStream and subscribeToStream methods.

Is NodeJS just that much slower? Or are we doing something wrong?

Here is the NodeJS script; it can be run with node script.js:

const { EventStoreDBClient, START, FORWARDS } = require('@eventstore/db-client');

async function readEvents(BATCH_SIZE = 1_000_000) {
  const client = EventStoreDBClient.connectionString(
    `esdb+discover://user:[email protected]:2113?tls=true`
  );
  const events = client.readStream('stream-with-a-lot-of-events', {
    fromRevision: START,
    direction: FORWARDS,
    maxCount: BATCH_SIZE,
    resolveLinkTos: true,
  });
  for await (const { event } of events) {
    if (!event || event.type === '$metadata') {
      continue;
    }
  }
}

console.time('ExtractionTime');
readEvents()
  .then(() => console.timeEnd('ExtractionTime'))
  .catch((error) => console.error(error));
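Since the slowdown shows up for both readStream and subscribeToStream, a small helper that times any async iterable makes it easier to compare the two under identical conditions, and to get a no-network baseline. This is just a sketch; measureThroughput and fakeEvents are names I made up, and the same helper would be pointed at the iterable returned by the real client calls:

```javascript
// Times consumption of any async iterable and reports events/second.
// Works with the iterable returned by readStream or subscribeToStream,
// or with a local generator for a no-network baseline.
async function measureThroughput(iterable) {
  const start = process.hrtime.bigint();
  let count = 0;
  for await (const _ of iterable) count++;
  const seconds = Number(process.hrtime.bigint() - start) / 1e9;
  return { count, seconds, perSecond: count / seconds };
}

// No-network baseline: a local async generator standing in for the client.
async function* fakeEvents(n) {
  for (let i = 0; i < n; i++) {
    yield { event: { type: 'test', data: { i } } };
  }
}

measureThroughput(fakeEvents(100_000)).then(({ count, perSecond }) =>
  console.log(`${count} events at ${Math.round(perSecond)}/s`)
);
```

Running the same helper against readStream and subscribeToStream rules out differences in our consuming code when comparing the two methods.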

Golang code:

func readEvents(client *esdb.Client) {
 BATCH_SIZE := 1_000_000
 options := esdb.ReadStreamOptions{
  From:           esdb.Start{},
  Direction:      esdb.Forwards,
  ResolveLinkTos: true,
 }
 stream, err := client.ReadStream(context.Background(), "stream-with-a-lot-of-events", options, uint64(BATCH_SIZE))
 if err != nil {
  panic(err)
 }
 defer stream.Close()

 for {
  event, err := stream.Recv()

  if errors.Is(err, io.EOF) {
   break
  }

  if err != nil {
   panic(err)
  }

  if event == nil || event.Event == nil {
   continue
  }

  var eventData esdb.RecordedEvent
  if err := json.Unmarshal(event.Event.Data, &eventData); err != nil {
   log.Printf("Error deserializing event data: %v", err)
   continue
  }

  // log.Printf("Deserialized event: %v", eventData)
 }
}

To add to this, I ran some tests querying the development PostgreSQL from my local machine in a loop, doing simple queries to compare how Node and Go handle network calls to a remote DB, and they perform exactly the same in this scenario:

  • SELECT 1 - 226 seconds for both NodeJS and Go over 10k iterations
  • SELECT id from users limit 10 - 24 seconds for both NodeJS and Go over 1k iterations
  • a bunch of other tests varying the number of iterations and the queries show similar results; Go and Node are exactly the same time-wise

So there’s no difference when querying postgres, but eventstore reading is way slower in node for some reason.

    const { Client } = require('pg'); // node-postgres

    async function benchQueries(config) {
      const X = 10_000;
      const client = new Client(config);
      await client.connect();
      for (let i = 0; i < X; i++) {
        const res = await client.query('SELECT 1');
      }
      await client.end();
    }