
How to tune the parameters of detection algorithms? #8

Open
JR-Jared opened this issue Oct 10, 2022 · 1 comment

Comments

@JR-Jared

I am using roerich on my own dataset, but the results are not very good.

[Figure: change point detection results on the user's dataset]

The docstring of the ChangePointDetection class lists the following parameters:

    Parameters
    ----------
    scaler: A scaler object used to scale the input data. The default is `SmaScalerCache`
    metric: The loss function used during the NN optimization step. One of: KL_sym, KL, JSD, PE, PE_sym, Wasserstein
    window_size: The size of the windows used when splitting the input data into train and test arrays
    periods: The number of previous data points used when constructing the autoregressive matrix
    lag_size: The distance between the train and test windows
    step: Every `step`-th data point is used when creating the input dataset
    n_epochs: The number of epochs used to train the NN
    lr: The learning rate of the optimizer
    lam: The regularization rate
    optimizer: One of the Adam, SGD, RMSprop or ASGD optimizers
    debug: Default is 0

How should the parameters of the change point algorithms, such as periods, window_size, lag_size, or step, be tuned?
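For reference, here is a minimal sketch of how these parameters could be set when constructing a detector. It assumes an `OnlineNNClassifier`-style constructor as shown in older roerich READMEs; the exact import path, class name, defaults, and return values may differ between versions, so treat the values below as illustrative starting points rather than the library's recommended configuration.

```python
import numpy as np
import roerich  # assumed top-level import; adjust to your installed version

# Illustrative signal with two change points (hypothetical data).
X = np.concatenate([
    np.random.normal(0, 1, 1000),
    np.random.normal(3, 1, 1000),
    np.random.normal(0, 2, 1000),
])

# Assumed constructor: parameter names follow the ChangePointDetection docstring above.
cpd = roerich.OnlineNNClassifier(
    scaler="default",     # input scaler; `SmaScalerCache` by default
    metric="KL_sym",      # one of KL_sym, KL, JSD, PE, PE_sym, Wasserstein
    periods=1,            # number of previous points in the autoregressive matrix
    window_size=100,      # size of the train/test windows
    lag_size=100,         # distance between the train and test windows
    step=1,               # use every `step`-th data point
    n_epochs=10,          # NN training epochs
    lr=0.01,              # optimizer learning rate
    lam=0.0001,           # regularization rate
    optimizer="Adam",     # one of Adam, SGD, RMSprop, ASGD
)

# `predict` is assumed to return a change-point score series and the detected peaks.
score, peaks = cpd.predict(X)
```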

@hushchyn-mikhail
Collaborator

@JR-Jared, hi!

The key parameters here are window_size and lag_size. The general recommendation is to choose values such that at most one change point falls within a span of 2 * (window_size + lag_size) samples.
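To make this rule concrete, a small helper like the one below (hypothetical, not part of roerich) can back out window_size and lag_size from the minimum expected spacing between change points:

```python
def suggest_window_and_lag(min_spacing, lag_ratio=0.5):
    """Suggest window_size and lag_size so that
    2 * (window_size + lag_size) <= min_spacing,
    i.e. at most one change point fits inside the detector's span.

    min_spacing: smallest expected distance (in samples) between two change points
    lag_ratio:   fraction of the budget given to lag_size (hypothetical knob)
    """
    budget = min_spacing // 2          # window_size + lag_size must not exceed this
    lag_size = max(1, int(budget * lag_ratio))
    window_size = max(1, budget - lag_size)
    return window_size, lag_size

# Example: change points are at least 400 samples apart
window_size, lag_size = suggest_window_and_lag(400)
print(window_size, lag_size)  # -> 100 100, since 2 * (100 + 100) = 400
```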
