v0-e commented Sep 9, 2025

A tool for load testing the node/server implementation.

I've checked out and tested other tools such as ghz and goose, but a tool of our own ultimately gives us more freedom over what and how we measure, potentially allowing us to analyze more intricate scenarios.

Success rate, latency, and throughput are measured.
Currently only a summary of the results is printed to stdout, but the information it provides gives a good idea of node performance. We can eventually export the data and produce graphs from it.
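
For reference, here is a minimal sketch (in Rust, using only the standard library) of the kind of measurement core described above: concurrent workers timing each request, recording its success and reply size, and printing a summary to stdout. The names (`run_load_test`, `Sample`, the `send_request` closure) are illustrative assumptions, not the tool's actual API.

```rust
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::{Duration, Instant};

// Per-request outcome recorded by a worker (hypothetical shape).
struct Sample {
    latency: Duration,
    bytes: usize, // size of the reply, used for throughput
    ok: bool,
}

fn run_load_test<F>(workers: usize, requests_per_worker: usize, send_request: F)
where
    // `send_request` stands in for one scenario request (send-note, fetch-notes, ...);
    // it returns (success, reply size in bytes).
    F: Fn() -> (bool, usize) + Send + Sync + 'static,
{
    let send_request = Arc::new(send_request);
    let samples: Arc<Mutex<Vec<Sample>>> = Arc::new(Mutex::new(Vec::new()));
    let start = Instant::now();

    let handles: Vec<_> = (0..workers)
        .map(|_| {
            let send_request = Arc::clone(&send_request);
            let samples = Arc::clone(&samples);
            thread::spawn(move || {
                let mut local = Vec::with_capacity(requests_per_worker);
                for _ in 0..requests_per_worker {
                    let t0 = Instant::now();
                    let (ok, bytes) = (*send_request)();
                    local.push(Sample { latency: t0.elapsed(), bytes, ok });
                }
                samples.lock().unwrap().extend(local);
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    let elapsed = start.elapsed().as_secs_f64();
    let samples = samples.lock().unwrap();
    let total = samples.len() as f64;
    let ok = samples.iter().filter(|s| s.ok).count() as f64;
    let bytes: usize = samples.iter().map(|s| s.bytes).sum();
    let mean_latency_ms = samples
        .iter()
        .map(|s| s.latency.as_secs_f64() * 1e3)
        .sum::<f64>()
        / total;

    // Summary printed to stdout, mirroring what the PR describes.
    println!("success rate: {:.2}%", 100.0 * ok / total);
    println!("throughput:   {:.1} reqs/sec, {:.2} MB/sec",
             total / elapsed, bytes as f64 / 1e6 / elapsed);
    println!("mean latency: {:.2} ms", mean_latency_ms);
}
```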

Some results with everything (node Docker setup + load-test tool) running on the same machine (M3 Pro, 11 cores), issuing 100k requests using 10 concurrent workers:

  • send-note: 7.6k reqs/sec, 11.28 MB/sec throughput;
  • fetch-notes:
    • 0 replied notes per request: 11.8k reqs/sec, 0 MB/sec throughput (throughput is calculated using the size of replied notes);
    • 1 replied note per request: 10.8k reqs/sec, 16 MB/sec throughput;
    • 5 replied notes per request: 8.1k reqs/sec, 60 MB/sec throughput;
  • mixed (random mix of send-note + fetch-notes): 9.6k reqs/sec, 5 MB/sec throughput. Some note tags are shared between the two request types, so fetch-notes responses may return some notes.
  • req-rep (sending a note followed by fetching it back): average end-to-end latency of 2.08 ms (a measurement sketch follows below).

100% success rate -- requests yielded no errors.
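
As a rough illustration of the req-rep scenario referenced in the list above, the sketch below times a send immediately followed by a fetch of the same note and averages the round trip. `send_note` and `fetch_notes` are placeholders for the tool's client calls; their signatures here are assumptions.

```rust
use std::time::{Duration, Instant};

fn mean_req_rep_latency_ms<S, F>(iters: usize, send_note: S, fetch_notes: F) -> f64
where
    S: Fn(&str),          // publish a note carrying the given tag
    F: Fn(&str) -> usize, // fetch notes by tag, returning how many were replied
{
    let mut total = Duration::ZERO;
    for i in 0..iters {
        let tag = format!("req-rep-{i}");
        let t0 = Instant::now();
        send_note(&tag);
        // Retry until the freshly sent note is visible to a fetch.
        while fetch_notes(&tag) == 0 {}
        total += t0.elapsed();
    }
    total.as_secs_f64() * 1e3 / iters as f64
}
```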


v0-e commented Sep 10, 2025

Updated the fetch-notes results; the number of fetched notes per request is now configurable.
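
Purely as an illustration of the new knob, a scenario configuration could look something like the struct below; the field names are hypothetical, not the tool's actual options.

```rust
// Hypothetical fetch-notes scenario configuration.
struct FetchNotesConfig {
    total_requests: usize,    // e.g. 100_000
    concurrency: usize,       // e.g. 10 workers
    notes_per_request: usize, // how many notes each fetch should return (0, 1, 5, ...)
}

impl Default for FetchNotesConfig {
    fn default() -> Self {
        Self { total_requests: 100_000, concurrency: 10, notes_per_request: 1 }
    }
}
```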
