This repository has been archived by the owner on Jul 23, 2021. It is now read-only.

Determine performance goals and a corresponding strategy to maintain them #214

Open
Methuselah96 opened this issue Dec 19, 2020 · 1 comment

Comments


Methuselah96 commented Dec 19, 2020

The problem

We don't have a strategy for determining whether the performance of Immutable is acceptable. On the original repo, Lee did the manual work of deciding whether performance was acceptable by pushing back on PRs that he knew to be slow.

One strategy is the "don't regress" strategy. Its advantage is that it's simple, but I don't think it's a good strategy because the initial performance baseline was somewhat arbitrary and the goals were never clearly defined.
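To make the trade-off concrete, a "don't regress" gate could look something like the following sketch, which compares a measured time against a stored baseline and fails only when the slowdown exceeds a tolerance. The names (`baselineMs`, `tolerance`) and the 10% threshold are illustrative assumptions, not anything this repo currently does:

```javascript
// Sketch of a "don't regress" gate: flag a benchmark run as a regression
// only when it is more than `tolerance` slower than the stored baseline.
function checkRegression(baselineMs, measuredMs, tolerance = 0.1) {
  const ratio = measuredMs / baselineMs;
  return {
    ratio,
    regressed: ratio > 1 + tolerance, // e.g. >10% slower than baseline fails
  };
}

// Example: 108 ms against a 100 ms baseline is within a 10% tolerance,
// 115 ms is not.
console.log(checkRegression(100, 108)); // regressed: false
console.log(checkRegression(100, 115)); // regressed: true
```

The weakness the comment above points out remains: the gate is only as meaningful as the baseline it compares against.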

The reliability of the current benchmarking code is also uncertain. The code runs in a VM context, and even when benchmarking the exact same code it's not uncommon to see run-to-run differences that exceed 10%.

We would also like to run the benchmarks automatically in CI (see #208) so that this work doesn't have to be manual and we catch problematic code before a PR gets merged.
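As a rough illustration of what that automation could look like, here is a hedged GitHub Actions sketch. The job name, script name, and flag are assumptions for illustration only; the actual CI setup is what #208 is about:

```yaml
# Hypothetical workflow: run benchmarks on every PR and fail the job
# when a benchmark regresses beyond a chosen tolerance. Script names
# and flags here are placeholders, not this repo's actual scripts.
name: benchmark
on: pull_request
jobs:
  bench:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
      - run: npm ci
      - run: npm run benchmark -- --fail-on-regression
```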

(See #210, #212, and #213 for more context.)

Brainstorming

  • Is there a way that we can simulate real-life use-cases? That way we're measuring what matters: the performance impact in real life.
@Methuselah96 Methuselah96 changed the title Determine performance goals and a corresponding strategy to enforce them Determine performance goals and a corresponding strategy to maintain them Dec 19, 2020
@billouboq

One way to do it is to take a free API response (like the Pokémon API or something similar), which gives us a real-world object; from that we can test at least the fromJS and toJS use cases.
