Add FixedPointDecimal benchmark. #42
base: master
Conversation
Adds a benchmark file that produces performance comparisons across various types and operations.
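The benchmark file itself isn't shown in this excerpt, but a suite along these lines could be built with BenchmarkTools.jl. This is an illustrative sketch only — the group names, operations, and types below are assumptions, not the PR's actual contents:

```julia
# Illustrative sketch of a benchmark suite -- not the PR's actual file.
using BenchmarkTools
using FixedPointDecimals

const SUITE = BenchmarkGroup()

# Compare the same operations across several numeric types.
for T in (FixedDecimal{Int64,2}, Int64, Float64)
    g = SUITE[string(T)] = BenchmarkGroup()
    a, b = T(21), T(3)
    # Interpolating $a and $b benchmarks only the operation itself,
    # not the cost of looking up the globals.
    g["add"] = @benchmarkable $a + $b
    g["mul"] = @benchmarkable $a * $b
    g["div"] = @benchmarkable $a / $b
end
```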
I think this makes sense. We do have benchmarks in bench/ on JSON.jl too: https://github.com/JuliaIO/JSON.jl/tree/master/bench, so there's precedent.
It isn't widely used yet, but there is: https://github.com/JuliaCI/PkgBenchmark.jl
@omus that PkgBenchmark seems nice, thanks for the link. (I'm sending a couple of PRs now to clean it up for 1.0 so we can tag a version there. 😄) Do you want me to play with setting that up before merging this PR in? I think that seems reasonable.
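For reference, wiring a suite up with PkgBenchmark.jl might look roughly like this. A sketch, assuming the suite lives in benchmark/benchmarks.jl and exports a SUITE; the exact API has shifted between PkgBenchmark versions:

```julia
using PkgBenchmark

# Run the package's benchmark suite at the current state of the repo.
results = benchmarkpkg("FixedPointDecimals")

# Or compare the current state against master directly, which is the
# per-commit tracking this PR is after.
judgement = judge("FixedPointDecimals", "master")
```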
Use custom branch of `PkgBenchmark.jl` to support post-processing, which we need.
Okay! I think I've got the benchmarks working; I'll post the results.md file generated here in the next post! :) There are other things we might want to change, such as:
Okay, here are the results, generated by running:
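One way to produce a results.md like the report below is PkgBenchmark's markdown writer; this is a hedged sketch, not necessarily the exact invocation used here:

```julia
using PkgBenchmark

# Run the suite, then write the results out as a markdown report.
results = benchmarkpkg("FixedPointDecimals")
export_markdown("results.md", results)
```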
Benchmark Report for FixedPointDecimals

Job Properties

Results
Below is a table of this job's results, obtained by running the benchmarks.

Benchmark Group List
Here's a list of all the benchmark groups executed by this job:

Julia versioninfo
Codecov Report

@@           Coverage Diff           @@
##           master      #42   +/-  ##
=======================================
  Coverage   98.83%   98.83%
=======================================
  Files           1        1
  Lines         172      172
=======================================
  Hits          170      170
  Misses          2        2

Continue to review full report at Codecov.
So I just want to leave a status update here. I think this basically works. The benchmarks run (and after the merged changes in JuliaCI/PkgBenchmark.jl#75, they should correctly and precisely measure only the time for each operation, not copying the value, reading from an array, etc.). The remaining blocker to merging is that the results are extremely variable, so much so that I don't think they're useful — even when running on a single computer and comparing a single commit against itself. I've tried several things to pinpoint the source of the variance, but haven't had any luck:
Does anyone have any other ideas? Without solving this, the benchmarks don't seem very useful: sometimes the swings are as large as 100% or 200%, so I'm not sure we'd get meaningful feedback on PRs.
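For what it's worth, a few BenchmarkTools.jl knobs sometimes tame this kind of run-to-run noise. A sketch with arbitrary example values — none of these settings are guaranteed to fix it:

```julia
using BenchmarkTools

# A longer time budget and a generous sample cap reduce tuning jitter
# between runs. (Values here are illustrative, not recommendations.)
BenchmarkTools.DEFAULT_PARAMETERS.seconds = 5.0
BenchmarkTools.DEFAULT_PARAMETERS.samples = 10_000

b = @benchmarkable sum($(rand(100)))
tune!(b)
t1 = run(b)
t2 = run(b)

# Judge on minimum times with a looser tolerance: the minimum is far
# more robust to scheduler noise than the mean.
j = judge(minimum(t1), minimum(t2); time_tolerance = 0.10)
```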
Opening this PR to discuss merging benchmarks into this repo, so that we can track performance across commits/versions.
I'm not sure if there's a usual structure to follow for putting benchmarks into a Julia repo. Do other repos besides the main Julia repo use Nanosoldier?
Also, in its current form, the benchmark compares against raw Int and Float types, but for all the operations except division, those types execute the operation in a single clock tick, so it's almost not worth spending the computation to measure them. So maybe we can simplify this file to just measure FixedDecimals.

Anyway, looking forward to figuring out the best way to do this with you! :)
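If we do pare it down, the simplified file might be as small as this. A hypothetical sketch: it keeps only a raw-Int division baseline, since division is the one operation where the comparison against machine types is informative:

```julia
# Hypothetical simplified suite: FixedDecimal only, plus one Int baseline.
using BenchmarkTools
using FixedPointDecimals

const FD = FixedDecimal{Int64,2}

suite = BenchmarkGroup()
suite["FD div"]  = @benchmarkable $(FD(355)) / $(FD(113))
suite["Int div"] = @benchmarkable $(Int64(355)) ÷ $(Int64(113))
```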