Test: automated performance testing suite #781

Open · krizhanovsky opened this issue Jul 30, 2017 · 2 comments

krizhanovsky (Contributor) commented Jul 30, 2017

Scope

We need to develop a performance testing suite for:

  • HTTP/2 Web cache with 10 and 100 streams
  • HTTPS Web cache
  • HTTP/2 proxy mode with 10 and 100 streams
  • HTTPS proxy mode

All the tests above must compare Tempesta FW against:

  • optimized HAproxy
  • optimized Nginx
  • optimized Envoy (no cache)
  • optimized Varnish
  • previous results for Tempesta FW (0.6.8 for now; no HTTP/2 tests so far)
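
For illustration, the HTTP/2 tests with 10 and 100 streams could be driven by nghttp2's h2load. The sketch below is only an assumed harness: the target URL, client count and request count are placeholders.

```python
# Sketch of driving the HTTP/2 benchmarks with nghttp2's h2load.
# The target URL, client and request counts are placeholders.
import subprocess

def run_h2load(url: str, streams: int, clients: int = 64,
               requests: int = 100_000) -> str:
    """Run h2load with the given number of concurrent streams (-m)
    per connection and return its text report."""
    cmd = [
        "h2load",
        "-n", str(requests),  # total number of requests
        "-c", str(clients),   # number of concurrent clients
        "-m", str(streams),   # max concurrent streams per connection
        url,
    ]
    return subprocess.run(cmd, capture_output=True, text=True,
                          check=True).stdout

if __name__ == "__main__":
    for streams in (10, 100):  # the 10- and 100-stream test cases
        print(run_h2load("https://tempesta-fw.test/", streams))
```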

The tests must measure:

These tests must run in two environments:

  • KVM
  • bare metal

The tests must run on CI periodically in a smoke (short) mode, and also as a full run which includes the comparisons against the other web servers.

The test results should be stored on a server filesystem along with the configuration and the system statistics (memory and CPU usage, at first). Benchmark results must also be stored as text files together with the command line used to run the benchmark.
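
A minimal sketch of such storage, assuming a plain per-run directory; the file names and the psutil dependency for system statistics are illustrative choices, not a settled design:

```python
# Sketch: persist a benchmark run as plain text files in a per-run directory.
import json
import pathlib
import shutil
import time

import psutil  # assumed available for memory/CPU statistics

def store_run(results_root: str, name: str, command_line: str,
              report: str, config_path: str) -> pathlib.Path:
    run_dir = pathlib.Path(results_root) / f"{name}-{int(time.time())}"
    run_dir.mkdir(parents=True)
    # The benchmark output and the exact command line used to produce it.
    (run_dir / "command.txt").write_text(command_line + "\n")
    (run_dir / "results.txt").write_text(report)
    # The configuration the server was tested with.
    shutil.copy(config_path, run_dir / "server.conf")
    # System statistics: memory and CPU usage at first.
    stats = {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "mem_used_bytes": psutil.virtual_memory().used,
    }
    (run_dir / "system_stats.json").write_text(json.dumps(stats, indent=2))
    return run_dir
```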

The CI jobs for the smoke performance tests must plot a Grafana graph to compare with previous runs and observe the trend.
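
One possible wiring, assuming the Grafana instance is backed by Prometheus with a Pushgateway: the CI job parses the requests-per-second figure out of the h2load report and pushes it as one point per run. The gateway address, job and metric names are placeholders.

```python
# Sketch: push one throughput point per smoke run to a Prometheus
# Pushgateway, assuming Grafana plots the resulting time series.
import re

from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

def parse_rps(h2load_report: str) -> float:
    """Extract the req/s figure from an h2load report, e.g. a line like
    'finished in 10.00s, 10000.00 req/s, 117.19MB/s'."""
    match = re.search(r"([\d.]+) req/s", h2load_report)
    if match is None:
        raise ValueError("no req/s figure in the report")
    return float(match.group(1))

def report_to_grafana(test_name: str, rps: float) -> None:
    registry = CollectorRegistry()
    gauge = Gauge("benchmark_requests_per_second",
                  "Smoke performance test throughput",
                  ["test"], registry=registry)
    gauge.labels(test=test_name).set(rps)
    # "pushgateway.ci.local:9091" is a placeholder address.
    push_to_gateway("pushgateway.ci.local:9091",
                    job="perf-smoke", registry=registry)
```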

Representing performance measurements

The benchmark runs must be cleaned to avoid result deviations. Different resources use 3-25 runs to obtain clean data and apply different approaches to the cleaning:

See https://bencher.dev/docs/explanation/thresholds/
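
One possible approach (an assumption, not the final policy; the projects linked below implement more elaborate thresholding): repeat each benchmark several times and drop runs outside the interquartile range before reporting the median.

```python
# Sketch: run a benchmark N times and drop outliers with a simple IQR
# filter before reporting the median. The 1.5x factor and the run count
# are conventional defaults, not a settled policy.
import statistics

def clean_runs(samples: list[float], k: float = 1.5) -> list[float]:
    q1, _, q3 = statistics.quantiles(samples, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [s for s in samples if low <= s <= high]

def benchmark_median(run_once, n_runs: int = 10) -> float:
    samples = [run_once() for _ in range(n_runs)]
    return statistics.median(clean_runs(samples))
```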

References

The following issues address problems which the test suite must reveal, but which require manual work.

  • https://github.com/nyrkio/dsi - automated performance regression testing in Python, inherited from MongoDB
  • https://github.com/bencherdev/bencher - a similar project in Rust

ykargin (Contributor) commented Jun 27, 2024

Performance Testing Plan

1. Existing Stress Tests:
    Can be used for performance testing.
    Need to write a configuration with a reasonable number of requests and parameters.

2. Grafana for Results Visualization:
    Determine how to calculate metrics for each test.
    Initially, it is sufficient to have a single metric for each test (total of 4 metrics).

3. CI (Continuous Integration):
    Set up a dedicated worker (virtual machine) for running performance tests.
    Create a separate pipeline for execution. It should operate only on the dedicated virtual machine for performance testing.

4. Reporting Script:
    Develop a script (sketched after this list) that will:
        - Report results.
        - Log installed packages.
        - Store all information locally in an archived format.

5. Grafana Charts:
    Draw a separate chart in Grafana for each of the 4 test suites.

6. Running Tests against HAproxy/nginx/Envoy:
    Add the execution of these tests against HAproxy/nginx/Envoy.
    Display the results on the charts of the corresponding test suites.
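
For item 4, a minimal sketch of the reporting script, assuming a Debian-based host (so dpkg -l lists the installed packages) and a local tar archive as the storage format; paths and file names are placeholders:

```python
# Sketch of the reporting script from item 4: report results, log the
# installed packages, and archive everything locally.
# Assumes a Debian-based host (dpkg).
import pathlib
import subprocess
import tarfile

def report_and_archive(run_dir: str) -> pathlib.Path:
    run_path = pathlib.Path(run_dir)
    # Log the installed packages so the environment can be reproduced.
    packages = subprocess.run(["dpkg", "-l"], capture_output=True,
                              text=True, check=True).stdout
    (run_path / "packages.txt").write_text(packages)
    # Report the results (a CI job would collect stdout).
    print((run_path / "results.txt").read_text())
    # Store all the information locally in an archived format.
    archive = run_path.with_suffix(".tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(run_path, arcname=run_path.name)
    return archive
```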

krizhanovsky (Contributor, Author) commented:
We agreed on the call that we'll go with adjusting our existing code from https://github.com/tempesta-tech/tempesta-test/ to build the performance regression test suite.
