feat(benchmark): add summary output for benchmark comparisons #74
Conversation
Looks good, I think it can be simplified a bit but see what you think.
@@ -323,9 +378,10 @@ pub fn getSystemInfo() !platform.OsInfo {
 pub const Result = struct {
     name: []const u8,
     readings: Readings,
+    total_duration_ns: u64,
As far as I can see, the total duration is only needed for the summary, so it doesn't seem necessary to add a new field to Result just to carry it. I think it can be summed and passed to printSummary, which makes it a more local change.
Yeah, you are right. I haven't found a clean way to handle this yet; when I tried to move it, I ended up with a segfault. I need to have another look here.
Where you're defining an array of Results in run(), you could have an array total_durations: []u64 instead and append the total_duration you've calculated to that, rather than making a new Result from the data in the Result you get in the switch's .result clause. Then, rather than passing a slice of Result values to printSummary, you can just pass the total_durations, and to get the name for each one you use the index to look it up in the benchmarks given, just like you're doing for the fastest_index. Does that make sense?
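A minimal sketch of what that shape could look like, assuming hypothetical names throughout (the exact printSummary signature, the names slice, and the 0.12-era std.io test writer are illustrative, not the actual zBench API):

```zig
const std = @import("std");

// Hypothetical sketch: printSummary takes the benchmark names and a plain
// slice of total durations instead of Result values, and finds the fastest
// entry by index, mirroring the fastest_index lookup described above.
fn printSummary(
    writer: anytype,
    names: []const []const u8,
    total_durations: []const u64,
) !void {
    var fastest_index: usize = 0;
    for (total_durations, 0..) |d, i| {
        if (d < total_durations[fastest_index]) fastest_index = i;
    }
    try writer.print("fastest: {s}\n", .{names[fastest_index]});
    for (total_durations, 0..) |d, i| {
        if (i == fastest_index) continue;
        const ratio = @as(f64, @floatFromInt(d)) /
            @as(f64, @floatFromInt(total_durations[fastest_index]));
        try writer.print("{s}: {d:.2}x slower\n", .{ names[i], ratio });
    }
}

test "printSummary on made-up durations" {
    const names = [_][]const u8{ "bench_a", "bench_b" };
    const durations = [_]u64{ 2_000, 1_000 };
    var buf: [128]u8 = undefined;
    var stream = std.io.fixedBufferStream(&buf);
    try printSummary(stream.writer(), &names, &durations);
}
```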
@bens thank you for your review 👍 I will try to address these points tomorrow.
The thing that makes me uncertain about this overall change is that if two benchmarks have different numbers of iterations then they're not comparable. I've been reading up on some stats and might have some ideas later, but generally speaking, I think for something like this you'd want to establish that the distribution you're modeling the results with is a good fit, then work out the degree of confidence that one benchmark is faster than another. Just comparing runtimes of the two benchmarks is simplistic.
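For illustration only (not something this PR implements), one conventional way to quantify that kind of confidence is a Welch's t statistic over two samples of per-iteration timings; the helper names and the sample data below are made up:

```zig
const std = @import("std");

// Sample mean and (Bessel-corrected) sample variance; assumes xs.len >= 2.
fn meanAndVar(xs: []const f64) struct { mean: f64, variance: f64 } {
    var sum: f64 = 0;
    for (xs) |x| sum += x;
    const mean = sum / @as(f64, @floatFromInt(xs.len));
    var sq: f64 = 0;
    for (xs) |x| sq += (x - mean) * (x - mean);
    return .{ .mean = mean, .variance = sq / @as(f64, @floatFromInt(xs.len - 1)) };
}

// Welch's t statistic: a large |t| suggests the difference in mean runtime is
// unlikely to be noise, without assuming equal variances or sample sizes.
fn welchT(a: []const f64, b: []const f64) f64 {
    const sa = meanAndVar(a);
    const sb = meanAndVar(b);
    const se = @sqrt(sa.variance / @as(f64, @floatFromInt(a.len)) +
        sb.variance / @as(f64, @floatFromInt(b.len)));
    return (sa.mean - sb.mean) / se;
}

test "welchT on made-up timings" {
    const a = [_]f64{ 1.00, 1.10, 0.90, 1.05 }; // faster benchmark
    const b = [_]f64{ 1.50, 1.60, 1.40, 1.55 }; // slower benchmark
    try std.testing.expect(welchT(&a, &b) < 0);
}
```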
Good point, that is indeed too simplistic. When adding the feature, I wasn't really considering that.
Closing this PR, since the approach would be too simplistic.
Implement a new feature in the benchmark module to display a summary at the end of the benchmarking output. The summary includes the fastest benchmark and a comparison of each benchmark showing how many times slower it is relative to the fastest. This enhancement aims to give users a quick overview of their code's performance characteristics.
closes #64