
Yet another nodejs benchmark

TL;DR

I created another benchmark, this time covering only web frameworks in nodejs. Unlike last time, I shared the repo so you can run it yourself.

Feel free to jump directly to the benchmark results or the conclusions.

Note: 2023-12-09: added redis benchmark

Why?

The other day I was checking the performance of some nodejs frameworks in the latest round of the TechEmpower Web Framework Benchmarks (2023-10-17).

Albeit probably the most widespread and one of the easiest to use in node, express performs quite poorly (by today’s standards). Fastify usually performs about twice as well and offers a mostly compatible, nice and easy API. Then we have other servers like uWebSockets.js that deliver great performance, although they are less user-friendly, and finally servers like just-js that sit at the top of the ranking but are not usable in production.

Not so long ago I shared a benchmark of several web servers in several programming languages.

This time I got curious about the performance of node frameworks, so I decided to run a benchmark of my own. I wanted to experiment a little bit by testing performance degradation under different scenarios (like text-only responses vs introducing a database).

Using wrk to measure load

I started with a single application that launched multiple servers and collected the benchmarks within the app itself, because it was easy to get the data as a js object. I used autocannon, since its API was very convenient and I just wanted a high-level idea of performance. However, while it was OKish for getting a general sense of the best-performing web servers, it was not good enough to measure performance in a meaningful way: the web server and the load tester were running in the same process, fighting for the same node event loop and causing a bottleneck.

So, I iterated.

I wanted a load-testing tool that performed well, was simple to use, and generated parseable output (CSV, JSON, …). I did not find such a library in node, so I chose wrk.

While wrk does not output any machine-readable format on its own, you can extend it with custom Lua scripts. A quick search turned up a script that generates a JSON summary, so I used it with some minimal tweaks.

Running wrk with the script produced a simple output:

$ wrk --timeout 2s -t 1 -c 100 -d 3s http://localhost:3006/hello
Running 3s test @ http://localhost:3000/simple
  1 threads and 100 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    26.91ms   74.47ms 699.45ms   94.40%
    Req/Sec     9.41k     1.98k   11.25k    86.67%
  28059 requests in 3.03s, 6.48MB read
  Non-2xx or 3xx responses: 28059
Requests/sec:   9259.11
Transfer/sec:      2.14MB

JSON Output:
{
  "requests": 28059,
  "duration_in_microseconds": 3030422.00,
  "bytes": 6790278,
  "requests_per_sec": 9259.11,
  "bytes_transfer_per_sec": 2240703.77,
  "connect_errors": 0,
  "read_errors": 0,
  "write_errors": 0,
  "http_errors": 28059,
  "timeouts": 0,
  "latency_p99_9": 629359,
  "latency_distribution": [
    ...
  ]
}

This was very easy to parse from the process output: a simple JSON.parse(output.split('JSON Output:')[1]) would do the trick.
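
For reference, here is roughly how I drive it from node (a sketch: the Lua script path and option defaults are placeholders, not necessarily the exact ones from the repo):

  // Run wrk with the JSON-summary Lua script and parse everything
  // printed after the "JSON Output:" marker
  const { execFileSync } = require("node:child_process")

  function runWrk(url, { threads = 1, connections = 100, duration = "10s" } = {}) {
    const stdout = execFileSync(
      "wrk",
      [
        "--timeout", "2s",
        "-t", String(threads),
        "-c", String(connections),
        "-d", duration,
        "-s", "./json-summary.lua", // the tweaked Lua script
        url,
      ],
      { encoding: "utf-8" }
    )

    // Everything before the marker is wrk's human-readable report
    return JSON.parse(stdout.split("JSON Output:")[1])
  }

  // Example: const stats = runWrk("http://localhost:3006/hello")
  //          console.log(stats.requests_per_sec, stats.latency_p99_9)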

Nodejs web frameworks

I decided to test the following nodejs servers:

  • express Probably the most widespread server in node, but not the fastest

  • fastify A faster alternative to express

  • h3 Minimal web-framework that is used by nuxt 3

  • hyper-express High-performance web server based on uWebSockets.js, with a better API

  • uwebsockets-express Aims to be a compatibility layer for express on top of uWebSockets.js, but unfortunately it achieves neither full compatibility nor the expected performance

  • uWebSockets.js The fastest production-ready web server for node. Also used internally by bun.

  • node:http I implemented a very basic HTTP server to serve as a baseline for the rest of the servers.

  • node:net Finally, I got curious about the performance of the native node:http library, so I ended up implementing a minimal HTTP/1.1 server using raw sockets (yeah, I’m that kind of person). This implementation is very rough and, as you might imagine, it only includes the minimum functionality needed for this test to run. It assumes all requests are GET and well-intentioned (see the sketch right after this list).
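
To give an idea of what that looks like, here is a rough sketch in the same spirit (not the exact code from the repo; it assumes well-formed GET requests over keep-alive connections and always answers with text/plain):

  // Minimal HTTP/1.1 responder on raw TCP sockets: read the request line,
  // look up the path, write a hard-coded response, keep the connection open
  const net = require("node:net")

  const routes = {
    "/hey": "Hey!",
    "/hell": "Hell!",
    "/hello": "Hello World!",
    "/about": "<html><body>About page</body></html>",
  }

  const server = net.createServer((socket) => {
    socket.on("data", (chunk) => {
      // "GET /hello HTTP/1.1\r\n..." -> grab the path from the request line
      const path = chunk.toString("latin1").split(" ", 2)[1]
      const body = routes[path] ?? "Not Found"
      const status = routes[path] ? "200 OK" : "404 Not Found"
      socket.write(
        `HTTP/1.1 ${status}\r\n` +
        `Content-Length: ${Buffer.byteLength(body)}\r\n` +
        "Content-Type: text/plain\r\n" +
        "Connection: keep-alive\r\n\r\n" +
        body
      )
    })
  })

  server.listen(3000)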

Finally I created the following routes on all servers:

/hey   -> "Hey!"
/hell  -> "Hell!"
/hello -> "Hello World!"
/about -> "<html><body>About page</body></html>"
/*     -> "Not Found" (status = 404)

I made 3 routes start with “he” on purpose, since routing can also influence response times, although such a small number of routes is unlikely to create any performance penalty even in the simplest of implementations.

I built different handlers that are shared across the different web servers. All handlers are invoked like this:

  app.get("/route", (req, res) => handler())

Where, in the simplest form, the text-only handler looks like this:

  function handler() {
    return "Hello World!"
  }
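
To give an idea of how the same handler gets wired into frameworks with different APIs, this is roughly what the glue code looks like for express, fastify and uWebSockets.js (a simplified sketch, not the exact benchmark code; ports are arbitrary):

  const express = require("express")
  const Fastify = require("fastify")
  const uWS = require("uWebSockets.js")

  // express: res.send() writes the response
  const expressApp = express()
  expressApp.get("/hello", (req, res) => res.send(handler()))
  expressApp.listen(3001)

  // fastify: the value returned from an async handler becomes the body
  const fastify = Fastify()
  fastify.get("/hello", async (req, reply) => handler())
  fastify.listen({ port: 3002 })

  // uWebSockets.js: note the (res, req) argument order and the explicit res.end()
  uWS.App()
    .get("/hello", (res, req) => res.end(handler()))
    .listen(3003, () => {})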

And the database handlers look more or less like this:

  async function handler() {
    const rows = await dbPool.query("SELECT 'Hello World!' as text").all()
    return rows[0].text
  }

All databases are initialized before running any tests, and the select itself does not require any database or table to be present. I just wanted to measure the raw parsing & communication with the database, so the query costs essentially nothing in terms of database engine work, disk access, etc.

You can see all handlers in this file
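
For concreteness, this is roughly what the better-sqlite3 and pg variants boil down to (a sketch of the idea; whether statements are prepared once or per request is my choice here, the actual handlers are in the linked file):

  // better-sqlite3 is synchronous: prepare the statement once, run it per request
  const Database = require("better-sqlite3")
  const sqliteDb = new Database(":memory:")
  const helloStmt = sqliteDb.prepare("SELECT 'Hello World!' AS text")

  function betterSqlite3Handler() {
    return helloStmt.get().text
  }

  // pg uses a connection pool and resolves to an object with a rows array
  const { Pool } = require("pg")
  const pgPool = new Pool() // connection settings come from the environment

  async function pgHandler() {
    const result = await pgPool.query("SELECT 'Hello World!' AS text")
    return result.rows[0].text
  }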

Methodology

  • All databases are initialized before any test starts
  • A quick warmup is done right before the real testing starts (see the sketch after this list)
  • All handlers are shared among all servers
  • Only one thread/process is used both in wrk and in the app itself
  • All databases were empty
  • All databases were running locally (thus, minimal latency)
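
In code, the warmup plus measurement boils down to a short throwaway run before the timed one (a sketch reusing the hypothetical runWrk helper from earlier; durations are illustrative):

  // Warm up the server (JIT, connections, db pools) with a short run whose
  // result is discarded, then take the real 10-second measurement
  function benchmark(url) {
    runWrk(url, { duration: "3s" }) // warmup
    return runWrk(url, { duration: "10s", connections: 100, threads: 1 })
  }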

Here you can find the configuration I used:

  Node Version:  v20.10.0
  CPU Model:     Intel(R) Core(TM) i7-8565U CPU @ 1.80GHz
  RAM:           32GB
  Path:          /hello
  Duration:      10 s
  Connections:   100

Since all servers ran without errors, I ended up removing that column from the results.

Results

See all results below: one table per handler, followed by a combined table with every result.

text:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
uWebSockets.js (text) | 20.34.0 | 🥇 15.09x | 🥇 126816 | 🥇 1891 | 🥇 10.3MB/s
hyper-express (text) | 6.14.3 | 🥈 11.86x | 🥈 99669 | 🥉 6743 | 🥈 8.1MB/s
node:net (text) | v20.10.0 | 🥉 7.98x | 🥉 67105 | 🥈 3560 | 4.6MB/s
h3 (text) | 1.9.0 | 3.96x | 33252 | 15059 | 🥉 5.1MB/s
node:http (text) | v20.10.0 | 3.79x | 31866 | 6763 | 4.7MB/s
fastify (text) | 4.24.3 | 3.38x | 28447 | 71538 | 4.8MB/s
uwebsockets-express (text) | 1.3.5 | 3.27x | 27508 | 9425 | 3.5MB/s
express (text) | 4.18.2 | 1.00x | 8405 | 353977 | 2.0MB/s

redis:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
uWebSockets.js (redis) | 20.34.0 | 🥇 10.81x | 🥇 96065 | 🥇 2360 | 🥇 7.8MB/s
hyper-express (redis) | 6.14.3 | 🥈 9.29x | 🥈 82560 | 🥉 4757 | 🥈 6.7MB/s
node:net (redis) | v20.10.0 | 🥉 7.13x | 🥉 63380 | 🥈 3244 | 4.3MB/s
node:http (redis) | v20.10.0 | 3.73x | 33152 | 11597 | 4.9MB/s
fastify (redis) | 4.24.3 | 3.59x | 31861 | 6399 | 🥉 5.4MB/s
uwebsockets-express (redis) | 1.3.5 | 3.03x | 26951 | 10234 | 3.5MB/s
h3 (redis) | 1.9.0 | 2.88x | 25589 | 9158 | 3.9MB/s
express (redis) | 4.18.2 | 1.00x | 8887 | 32229 | 2.1MB/s

better-sqlite3:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
uWebSockets.js (better-sqlite3) | 20.34.0 | 🥇 6.78x | 🥇 50375 | 13788 | 🥇 4.1MB/s
hyper-express (better-sqlite3) | 6.14.3 | 🥈 6.14x | 🥈 45631 | 🥇 8512 | 🥈 3.7MB/s
node:net (better-sqlite3) | v20.10.0 | 🥉 4.58x | 🥉 34002 | 🥉 11063 | 2.3MB/s
h3 (better-sqlite3) | 1.9.0 | 3.14x | 23333 | 🥈 8745 | 🥉 3.6MB/s
node:http (better-sqlite3) | v20.10.0 | 2.79x | 20725 | 22772 | 3.0MB/s
fastify (better-sqlite3) | 4.24.3 | 2.70x | 20071 | 24314 | 3.4MB/s
uwebsockets-express (better-sqlite3) | 1.3.5 | 2.60x | 19336 | 15608 | 2.5MB/s
express (better-sqlite3) | 4.18.2 | 1.00x | 7428 | 143581 | 1.7MB/s

sqlite3:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
uWebSockets.js (sqlite3) | 20.34.0 | 🥇 6.14x | 🥇 41018 | 🥉 8663 | 🥇 3.3MB/s
hyper-express (sqlite3) | 6.14.3 | 🥈 5.66x | 🥈 37774 | 🥇 7260 | 🥈 3.1MB/s
node:net (sqlite3) | v20.10.0 | 🥉 4.67x | 🥉 31199 | 🥈 8605 | 2.1MB/s
node:http (sqlite3) | v20.10.0 | 2.70x | 18009 | 9797 | 2.6MB/s
h3 (sqlite3) | 1.9.0 | 2.67x | 17796 | 8878 | 2.7MB/s
uwebsockets-express (sqlite3) | 1.3.5 | 2.63x | 17548 | 13931 | 2.3MB/s
fastify (sqlite3) | 4.24.3 | 2.52x | 16819 | 19068 | 🥉 2.9MB/s
express (sqlite3) | 4.18.2 | 1.00x | 6675 | 23776 | 1.5MB/s

pg:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
uWebSockets.js (pg) | 20.34.0 | 🥇 3.83x | 🥇 20246 | 🥈 11119 | 1.6MB/s
hyper-express (pg) | 6.14.3 | 🥈 3.67x | 🥈 19386 | 🥉 12599 | 1.6MB/s
node:net (pg) | v20.10.0 | 🥉 3.50x | 🥉 18464 | 🥇 8732 | 1.3MB/s
node:http (pg) | v20.10.0 | 2.52x | 13322 | 12948 | 🥈 2.0MB/s
h3 (pg) | 1.9.0 | 2.41x | 12717 | 13420 | 🥉 1.9MB/s
fastify (pg) | 4.24.3 | 2.41x | 12708 | 14619 | 🥇 2.2MB/s
uwebsockets-express (pg) | 1.3.5 | 2.35x | 12392 | 16545 | 1.6MB/s
express (pg) | 4.18.2 | 1.00x | 5280 | 47823 | 1.2MB/s

postgres:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
fastify (postgres) | 4.24.3 | 🥇 2.86x | 🥇 10789 | 🥇 20396 | 🥇 1.8MB/s
hyper-express (postgres) | 6.14.3 | 🥈 2.20x | 🥈 8313 | 🥈 24193 | 0.7MB/s
h3 (postgres) | 1.9.0 | 🥉 2.15x | 🥉 8112 | 🥉 26094 | 🥉 1.2MB/s
express (postgres) | 4.18.2 | 1.74x | 6587 | 58167 | 🥈 1.5MB/s
node:net (postgres) | v20.10.0 | 1.61x | 6094 | 27656 | 0.4MB/s
node:http (postgres) | v20.10.0 | 1.57x | 5924 | 57147 | 0.9MB/s
uWebSockets.js (postgres) | 20.34.0 | 1.52x | 5724 | 26648 | 0.5MB/s
uwebsockets-express (postgres) | 1.3.5 | 1.00x | 3775 | 65051 | 0.5MB/s

All handlers combined:

Name | Version | Speed Factor | Requests/s | Latency (us) | Throughput (MB/s)
uWebSockets.js (text) | 20.34.0 | 🥇 33.59x | 🥇 126816 | 🥇 1891 | 🥇 10.3MB/s
hyper-express (text) | 6.14.3 | 🥈 26.40x | 🥈 99669 | 6743 | 🥈 8.1MB/s
uWebSockets.js (redis) | 20.34.0 | 🥉 25.44x | 🥉 96065 | 🥈 2360 | 🥉 7.8MB/s
hyper-express (redis) | 6.14.3 | 21.87x | 82560 | 4757 | 6.7MB/s
node:net (text) | v20.10.0 | 17.77x | 67105 | 3560 | 4.6MB/s
node:net (redis) | v20.10.0 | 16.79x | 63380 | 🥉 3244 | 4.3MB/s
uWebSockets.js (better-sqlite3) | 20.34.0 | 13.34x | 50375 | 13788 | 4.1MB/s
hyper-express (better-sqlite3) | 6.14.3 | 12.09x | 45631 | 8512 | 3.7MB/s
uWebSockets.js (sqlite3) | 20.34.0 | 10.86x | 41018 | 8663 | 3.3MB/s
hyper-express (sqlite3) | 6.14.3 | 10.00x | 37774 | 7260 | 3.1MB/s
node:net (better-sqlite3) | v20.10.0 | 9.01x | 34002 | 11063 | 2.3MB/s
h3 (text) | 1.9.0 | 8.81x | 33252 | 15059 | 5.1MB/s
node:http (redis) | v20.10.0 | 8.78x | 33152 | 11597 | 4.9MB/s
node:http (text) | v20.10.0 | 8.44x | 31866 | 6763 | 4.7MB/s
fastify (redis) | 4.24.3 | 8.44x | 31861 | 6399 | 5.4MB/s
node:net (sqlite3) | v20.10.0 | 8.26x | 31199 | 8605 | 2.1MB/s
fastify (text) | 4.24.3 | 7.53x | 28447 | 71538 | 4.8MB/s
uwebsockets-express (text) | 1.3.5 | 7.29x | 27508 | 9425 | 3.5MB/s
uwebsockets-express (redis) | 1.3.5 | 7.14x | 26951 | 10234 | 3.5MB/s
h3 (redis) | 1.9.0 | 6.78x | 25589 | 9158 | 3.9MB/s
h3 (better-sqlite3) | 1.9.0 | 6.18x | 23333 | 8745 | 3.6MB/s
node:http (better-sqlite3) | v20.10.0 | 5.49x | 20725 | 22772 | 3.0MB/s
uWebSockets.js (pg) | 20.34.0 | 5.36x | 20246 | 11119 | 1.6MB/s
fastify (better-sqlite3) | 4.24.3 | 5.32x | 20071 | 24314 | 3.4MB/s
hyper-express (pg) | 6.14.3 | 5.13x | 19386 | 12599 | 1.6MB/s
uwebsockets-express (better-sqlite3) | 1.3.5 | 5.12x | 19336 | 15608 | 2.5MB/s
node:net (pg) | v20.10.0 | 4.89x | 18464 | 8732 | 1.3MB/s
node:http (sqlite3) | v20.10.0 | 4.77x | 18009 | 9797 | 2.6MB/s
h3 (sqlite3) | 1.9.0 | 4.71x | 17796 | 8878 | 2.7MB/s
uwebsockets-express (sqlite3) | 1.3.5 | 4.65x | 17548 | 13931 | 2.3MB/s
fastify (sqlite3) | 4.24.3 | 4.45x | 16819 | 19068 | 2.9MB/s
node:http (pg) | v20.10.0 | 3.53x | 13322 | 12948 | 2.0MB/s
h3 (pg) | 1.9.0 | 3.37x | 12717 | 13420 | 1.9MB/s
fastify (pg) | 4.24.3 | 3.37x | 12708 | 14619 | 2.2MB/s
uwebsockets-express (pg) | 1.3.5 | 3.28x | 12392 | 16545 | 1.6MB/s
fastify (postgres) | 4.24.3 | 2.86x | 10789 | 20396 | 1.8MB/s
express (redis) | 4.18.2 | 2.35x | 8887 | 32229 | 2.1MB/s
express (text) | 4.18.2 | 2.23x | 8405 | 353977 | 2.0MB/s
hyper-express (postgres) | 6.14.3 | 2.20x | 8313 | 24193 | 0.7MB/s
h3 (postgres) | 1.9.0 | 2.15x | 8112 | 26094 | 1.2MB/s
express (better-sqlite3) | 4.18.2 | 1.97x | 7428 | 143581 | 1.7MB/s
express (sqlite3) | 4.18.2 | 1.77x | 6675 | 23776 | 1.5MB/s
express (postgres) | 4.18.2 | 1.74x | 6587 | 58167 | 1.5MB/s
node:net (postgres) | v20.10.0 | 1.61x | 6094 | 27656 | 0.4MB/s
node:http (postgres) | v20.10.0 | 1.57x | 5924 | 57147 | 0.9MB/s
uWebSockets.js (postgres) | 20.34.0 | 1.52x | 5724 | 26648 | 0.5MB/s
express (pg) | 4.18.2 | 1.40x | 5280 | 47823 | 1.2MB/s
uwebsockets-express (postgres) | 1.3.5 | 1.00x | 3775 | 65051 | 0.5MB/s

Summary

In the text-only version, uWebSockets.js is clearly the winner. Hyper-express is a close second and, interestingly enough, node:net, my custom and minimal HTTP implementation on raw sockets, comes third. node:net is almost twice as fast as node:http, h3 and fastify. Express has the poorest performance of them all.

The moment we introduce a SQL database, whether a local sqlite or a local postgres, the difference in performance between the best and the worst shrinks. Performance on all web servers drops around 2-3x. In the text version, uWebSockets.js performs about 15x faster than express; with the better-sqlite3 handler, the fastest of the SQL databases tested, its throughput drops by a factor of about 2.5 (but it still performs almost 7x faster than express on that handler).

In the pg test, uWebSockets.js drops about 6x and is only 1.6x as fast as fastify (for comparison, in the text version uWebSockets.js is 4.5x faster than fastify).

Interesting takeaways from this benchmark:

  • redis performs better than the rest of the databases, which makes it well suited for use as a cache
  • better-sqlite3, as advertised, does perform better than sqlite3 in this benchmark (around 1.25x, or 25%, faster).
  • postgres.js, even though it is advertised as being faster than the pg library, fails to meet expectations: the apps in this benchmark perform almost twice as fast with pg as with postgres.js. I might have screwed up the implementation, but I tried to follow the simple examples each library provides.

Conclusions

If you are looking for the best-performing nodejs framework, keep in mind that performance drops across all frameworks the moment you query a database. In that scenario, the difference between the fastest and the slowest shrinks and other factors come into play.

If you are purely looking for performance, you should definitely go with uWebSockets.js. It has consistently been the best-performing library in almost all the tests (although it performed poorly in the postgres test and I don’t understand why). However, keep in mind that it might not be as straightforward to use as the others (but it is not rocket science either).

Hyper-express is an interesting choice since its API is more pleasant to use than uWebSockets.js, but it is not as mature, and it has a smaller community and poorer documentation than the other projects.

All in all, for a normal project I would likely choose fastify, since I believe it offers the right balance of performance, an easy-to-use API, community and trust. Its performance is quite close to node:http in all tests, it has been around for several years, and it definitely outperforms express in every test.

Side note: if node is not your thing, then I would recommend having a look at golang’s fiber, an excellent web framework with a very easy-to-use API and really good performance (similar to uWebSockets.js).