Node load testing #34
Open
A tool for load testing the node/server implementation.
I've checked and tested out other tools such as ghz and goose, though something of our own ultimately gives us more freedom over what and how we measure, potentially allowing us to analyze more intricate scenarios. Success rate, latency, and throughput are measured.
Currently, only a summary of the results is printed to stdout, though the info provided gives a good idea of node performance. We can eventually export the data and produce some graphs from it.
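For illustration, here is a minimal sketch of how the harness is structured, assuming a tokio-based async runtime; `send_note`, its payload, and the stats printed are placeholders for the real client API and output, not the exact code in this PR:

```rust
use std::sync::Arc;
use std::time::{Duration, Instant};

use tokio::sync::Mutex;

/// Per-request outcome recorded by every worker.
struct Sample {
    latency: Duration,
    bytes: usize,
    ok: bool,
}

/// Placeholder for the real node client call; in the actual tool this would
/// issue a request against the node and return the response size in bytes.
async fn send_note(_payload: &[u8]) -> Result<usize, ()> {
    Ok(0)
}

#[tokio::main]
async fn main() {
    let workers = 10;
    let requests_per_worker = 10_000; // 100k requests total
    let samples = Arc::new(Mutex::new(Vec::new()));

    let started = Instant::now();
    let mut handles = Vec::new();
    for _ in 0..workers {
        let samples = samples.clone();
        handles.push(tokio::spawn(async move {
            for _ in 0..requests_per_worker {
                let t0 = Instant::now();
                let result = send_note(b"example note").await;
                let latency = t0.elapsed();
                let ok = result.is_ok();
                let bytes = result.unwrap_or(0);
                // A real tool would buffer per worker to avoid lock contention.
                samples.lock().await.push(Sample { latency, bytes, ok });
            }
        }));
    }
    for handle in handles {
        handle.await.unwrap();
    }
    let elapsed = started.elapsed().as_secs_f64();

    // Aggregate and print the summary to stdout.
    let samples = samples.lock().await;
    let total = samples.len();
    let successes = samples.iter().filter(|s| s.ok).count();
    let bytes: usize = samples.iter().map(|s| s.bytes).sum();
    let mean_latency_ms = samples
        .iter()
        .map(|s| s.latency.as_secs_f64() * 1_000.0)
        .sum::<f64>()
        / total as f64;

    println!("success rate: {:.2}%", 100.0 * successes as f64 / total as f64);
    println!(
        "throughput:   {:.1} reqs/sec, {:.2} MB/sec",
        total as f64 / elapsed,
        bytes as f64 / 1_000_000.0 / elapsed
    );
    println!("mean latency: {:.2} ms", mean_latency_ms);
}
```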
Some results with everything (node Docker setup + load-test tool) running on the same machine (M3 Pro, 11 cores), issuing 100k requests using 10 concurrent workers:
- `send-note`: 7.6k reqs/sec, 11.28 MB/sec throughput.
- `mixed` (random mix of `send-note` + `fetch-notes`): 9.6k reqs/sec, 5 MB/sec throughput. Some note tags are the same on each type of request -- `fetch-notes` responses may yield some notes.
- `req-rep` (a send followed by the fetch of one note): average end-to-end latency of 2.08 ms.
- 100% success rate -- requests yielded no errors.
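To make the `req-rep` figure concrete, this is roughly what one iteration of that scenario times; the `send_note`/`fetch_note` helpers below are stand-ins for the real client calls, not the actual API:

```rust
use std::time::{Duration, Instant};

/// Stand-ins for the real client calls; both are assumptions for this sketch.
async fn send_note(_payload: &[u8]) -> Result<(), ()> { Ok(()) }
async fn fetch_note(_tag: &str) -> Result<Vec<u8>, ()> { Ok(Vec::new()) }

/// One `req-rep` iteration: a send immediately followed by fetching the same
/// note, timed end to end.
async fn req_rep_latency(payload: &[u8], tag: &str) -> Duration {
    let started = Instant::now();
    send_note(payload).await.expect("send failed");
    fetch_note(tag).await.expect("fetch failed");
    started.elapsed()
}

#[tokio::main]
async fn main() {
    let latency = req_rep_latency(b"example note", "tag-0").await;
    println!("end-to-end latency: {:.2} ms", latency.as_secs_f64() * 1_000.0);
}
```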