
SPIKE: Secretless benchmarking reports #1401

Open
1 task
doodlesbykumbi opened this issue Apr 6, 2021 · 1 comment
Comments

@doodlesbykumbi
Contributor

Overview

Assume there is a pipeline to measure, export, and analyse metrics. This issue is about defining the potential formats of benchmarking reports. For example, for streaming latency we might generate a report providing measures of spread, distributions, etc.
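As a sketch of what such a streaming-latency report might contain, the snippet below summarises raw latency samples into measures of spread and distribution. Everything here (the field names, the nearest-rank percentile choice, the sample values) is an illustrative assumption, not an agreed Secretless report schema:

```python
import statistics

def latency_report(samples_ms):
    """Summarise raw latency samples (ms) into a small report dict.

    Hypothetical report shape: count, mean, standard deviation,
    selected percentiles, and min/max.
    """
    ordered = sorted(samples_ms)
    n = len(ordered)

    def percentile(p):
        # Nearest-rank percentile over the sorted samples.
        idx = min(n - 1, max(0, round(p / 100 * n) - 1))
        return ordered[idx]

    return {
        "count": n,
        "mean_ms": statistics.fmean(ordered),
        "stdev_ms": statistics.stdev(ordered) if n > 1 else 0.0,
        "p50_ms": percentile(50),
        "p95_ms": percentile(95),
        "p99_ms": percentile(99),
        "min_ms": ordered[0],
        "max_ms": ordered[-1],
    }

# Illustrative samples only; a real report would aggregate pipeline output.
report = latency_report([12, 15, 11, 14, 90, 13, 16, 12, 14, 13])
print(report)
```

A report like this could be rendered per experiment run and compared across runs; the percentile fields are what surface tail-latency behaviour that a mean alone would hide.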

Definition of done

  • A document exists capturing some potential benchmarking reports of interest and all that is needed to realise them.
@izgeri
Contributor

izgeri commented Apr 6, 2021

I think we can potentially put this card off until we have some initial data. If we've defined how we'll collect the data, what experiments we'll run, and how we'll aggregate the data from the experiments - then we can define the report structure once we've collected some data and have a better understanding of how to visualize it. Do you agree, or is it important to do this now?
