matmul performance regression pipeline #647
Conversation
Force-pushed from eaa7acd to 8b532b4
@AlexAUT Is this PR for testing purposes only? IIRC, GitHub does not have MI300 runners, does it?
That was a misunderstanding from the call on Tuesday. I reverted it back to the Jenkins pipeline. I will ping here when this is tested; the MI300 runners are quite congested at the moment.
Force-pushed from 429661f to f2510ae
Most changes we discussed last week are implemented. Should we merge this as a first step? We can add tests for other kernels/datatypes in a separate pull request.
Force-pushed from f2510ae to 76ca04b
* Add regression tests to tune_gemm
* Add regression tests to pipelines
* Add missing imports
* Use warnings to signal that no performance comparison is found
* Split regression tests into a separate file
* Disable GitHub pipeline in favour of Jenkins
* Improve output and skip tests if no performance reference can be found
* Add test case for overall mean regression
* Extend parameters which can be adjusted for perf regression tests
* Switch to geo mean for overall result
* Always recompile kernels in perf regression tests unless the user specifies otherwise
* Report default values in the exported result to support changing them in the future
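The "geo mean for overall result" commit above suggests the overall verdict is a geometric mean over per-kernel speedups rather than an arithmetic mean, which avoids one fast kernel masking several slow ones. A minimal sketch of such a check (function names and the threshold are illustrative assumptions, not the actual tune_gemm code):

```python
import math

def geo_mean(values):
    """Geometric mean computed via logs to avoid overflow on long lists."""
    assert values, "need at least one measurement"
    return math.exp(sum(math.log(v) for v in values) / len(values))

def check_overall_regression(speedups, threshold=0.98):
    """speedups: per-kernel ratios current_tflops / baseline_tflops.

    Returns (passed, geo_mean). Fails when the geometric mean of the
    speedups drops below the threshold (i.e. an overall slowdown).
    """
    gm = geo_mean(speedups)
    return gm >= threshold, gm
```

For example, one kernel regressing to 0.95x while two others hold steady would still pass a 0.98 threshold, because the geometric mean stays above it.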
Adds a pipeline step to test for performance regressions in matmul.
It compares the measured performance against the last successful workflow run on the main_perf branch. The workflow runs for each pull request targeting main_perf, and again on main_perf itself after the merge; if that post-merge run succeeds, it becomes the new baseline.
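The baseline comparison described above, together with the "warn and skip when no reference exists" behaviour from the commit list, could be sketched as follows. This is a hedged illustration only: the file path, JSON layout, and tolerance are assumptions, and the real pipeline presumably fetches the reference from the last successful main_perf workflow artifacts.

```python
import json
import warnings
from pathlib import Path

def load_baseline(path):
    """Load baseline results (kernel name -> TFLOPS) from a prior run.

    Returns None and emits a warning when no performance reference is
    available, so callers can skip the comparison instead of failing.
    """
    p = Path(path)
    if not p.exists():
        warnings.warn(f"no performance reference found at {p}; skipping comparison")
        return None
    with p.open() as f:
        return json.load(f)

def find_regressions(current, baseline, tolerance=0.05):
    """Return kernels whose TFLOPS dropped more than `tolerance` vs baseline."""
    regressions = {}
    for kernel, tflops in current.items():
        ref = baseline.get(kernel)
        if ref is not None and tflops < ref * (1.0 - tolerance):
            regressions[kernel] = (tflops, ref)
    return regressions
```

Usage: run the matmul benchmarks, call `load_baseline(...)`; if it returns None, skip the test with a warning, otherwise fail the pipeline when `find_regressions(...)` is non-empty.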