Replies: 2 comments 18 replies
-
Do you mean the importance of the features generated by ROCKET, or the importance with respect to the original time series variables? If it's the first, it's simple (any tabular approach with good interpretability, such as xgb, can do it), but I'm not sure it is useful at all since the features come from random kernels. If you are interested in interpretability I wouldn't go with ROCKET. Instead, you can try NN architectures like XCM, in which you can create activation maps in both the time and the variable dimensions, giving a sort of heatmap of feature importance.
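As a sketch of the first option (scoring the generated features): once the series are transformed into a feature table, permutation importance with any tabular model applies. The example below is a toy illustration only — the data, the number of features, and the plain least-squares fit (standing in for xgb) are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a table of ROCKET-generated features:
# 300 samples x 6 features, where only feature 1 is informative.
X = rng.normal(size=(300, 6))
y = (X[:, 1] + 0.1 * rng.normal(size=300) > 0).astype(float)

# Fit a centered least-squares linear classifier (stand-in for a tabular model).
w, *_ = np.linalg.lstsq(X, y - y.mean(), rcond=None)

def acc(M):
    # Accuracy of the sign of the linear score against the labels.
    return float(np.mean(((M @ w) > 0) == (y > 0.5)))

baseline = acc(X)

# Permutation importance: drop in accuracy when one column is shuffled.
imps = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    imps.append(baseline - acc(Xp))

top = int(np.argmax(imps))  # index of the most important feature
```

The same loop works unchanged if `X` is replaced with an actual ROCKET/MiniRocket transform and the linear fit with xgb — but as noted above, the ranking is over random kernels, so it rarely maps back to anything interpretable in the original series.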
-
Hi @jefferyanderson and @vrodriguezf,

```python
from tsai.all import *

X, y, splits = get_UCR_data('LSST', split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize(by_sample=True)
dls = get_ts_dls(X, y, splits=splits, tfms=tfms, batch_tfms=batch_tfms)
learn = ts_learner(dls, InceptionTimePlus, metrics=accuracy, cbs=[ShowGraph()])
learn.fit_one_cycle(10, 1e-2)
learn.feature_importance()
```

If you try it, please let me know if it works well.
-
Are there any tools or recommended approaches for determining feature importance with tsai? I'm particularly interested in a method that would work with (mini)ROCKET, but I'm all ears if anyone knows of a good approach with the other algorithms. Thanks and regards.