revert epoch to steps back, change templates
kprokofi committed Nov 29, 2024
1 parent 3d8a4b4 commit de0c0cb
Showing 59 changed files with 81 additions and 222 deletions.
4 changes: 2 additions & 2 deletions src/otx/core/schedulers/warmup_schedulers.py
@@ -29,7 +29,7 @@ def __init__(
         self,
         optimizer: Optimizer,
         num_warmup_steps: int = 1000,
-        interval: Literal["step", "epoch"] = "epoch",
+        interval: Literal["step", "epoch"] = "step",
     ):
         if not num_warmup_steps > 0:
             msg = f"num_warmup_steps should be > 0, got {num_warmup_steps}"
@@ -65,7 +65,7 @@ def __init__(
         self,
         main_scheduler_callable: LRSchedulerCallable,
         num_warmup_steps: int = 0,
-        warmup_interval: Literal["step", "epoch"] = "epoch",
+        warmup_interval: Literal["step", "epoch"] = "step",
         monitor: str | None = None,
     ):
         self.main_scheduler_callable = SchedulerCallableSupportHPO.from_callable(main_scheduler_callable)
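The change above flips the default warmup interval from "epoch" to "step", so with the default num_warmup_steps=1000 the warmup now completes after 1000 optimizer steps rather than 1000 epochs. A minimal sketch of per-step linear warmup, assuming the usual linear ramp (an illustrative stand-in, not the actual OTX scheduler class):

```python
# Sketch of per-step linear warmup (hypothetical stand-in for the OTX
# LinearWarmupScheduler; the real class wraps a torch optimizer).
class LinearWarmup:
    def __init__(self, base_lr: float, num_warmup_steps: int = 1000):
        if not num_warmup_steps > 0:
            raise ValueError(f"num_warmup_steps should be > 0, got {num_warmup_steps}")
        self.base_lr = base_lr
        self.num_warmup_steps = num_warmup_steps
        self.step_count = 0

    def step(self) -> float:
        # Called once per optimizer step when interval="step".
        self.step_count += 1
        scale = min(1.0, self.step_count / self.num_warmup_steps)
        return self.base_lr * scale


sched = LinearWarmup(base_lr=0.01, num_warmup_steps=4)
lrs = [sched.step() for _ in range(6)]
# lrs ramps linearly up to base_lr over 4 steps, then stays flat
```

With interval="epoch", step() would instead be invoked once per epoch, stretching the same ramp over far more optimizer updates; the per-step default finishes warmup much earlier in practice.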
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/atss_mobilenetv2.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
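Setting num_warmup_steps: 0 in a recipe effectively disables the warmup phase: training starts at the full learning rate and the main scheduler takes over immediately. A rough sketch of that selection logic (assumed behavior, not the actual LinearWarmupSchedulerCallable code):

```python
def effective_lr(base_lr: float, step: int, num_warmup_steps: int) -> float:
    """LR during the warmup phase; past warmup (or when warmup is disabled)
    the main scheduler, e.g. ReduceLROnPlateau, operates at base_lr."""
    if num_warmup_steps <= 0:  # warmup disabled, e.g. num_warmup_steps: 0
        return base_lr
    return base_lr * min(1.0, step / num_warmup_steps)

# With the old recipe value (3 warmup steps) the first optimizer steps are
# damped; with 0 the model trains at the full LR from the very first step.
```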
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/atss_mobilenetv2_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/atss_resnext101.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/atss_resnext101_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/rtdetr_101_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 5
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/rtdetr_18_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 5
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/rtdetr_50_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 5
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/rtmdet_tiny.yaml
@@ -12,7 +12,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/rtmdet_tiny_tile.yaml
@@ -12,7 +12,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
    init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/ssd_mobilenetv2.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/ssd_mobilenetv2_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_l.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_l_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_s.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_s_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_tiny.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_tiny_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_x.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/detection/yolox_x_tile.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 3
+      num_warmup_steps: 0
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
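All of these recipes pair LinearWarmupSchedulerCallable with lightning.pytorch.cli.ReduceLROnPlateau as the main scheduler: warmup ramps the learning rate per step, after which a monitored metric drives plateau-based reductions. A simplified sketch of that hand-off (an illustrative re-implementation, not the OTX or Lightning classes):

```python
# Toy plateau scheduler: cut the LR by `factor` once the monitored metric
# fails to improve for more than `patience` checks (assumes higher is better).
class ReduceOnPlateau:
    def __init__(self, lr: float, factor: float = 0.1, patience: int = 2):
        self.lr, self.factor, self.patience = lr, factor, patience
        self.best = float("-inf")
        self.bad_checks = 0

    def step(self, metric: float) -> float:
        if metric > self.best:
            self.best, self.bad_checks = metric, 0
        else:
            self.bad_checks += 1
            if self.bad_checks > self.patience:
                self.lr *= self.factor  # patience exhausted: reduce the LR
                self.bad_checks = 0
        return self.lr


# Warmup first, then defer to the plateau scheduler (warmup_interval="step").
class WarmupThenPlateau:
    def __init__(self, base_lr: float, num_warmup_steps: int, main: ReduceOnPlateau):
        self.base_lr, self.num_warmup_steps, self.main = base_lr, num_warmup_steps, main
        self.steps = 0

    def on_train_step(self) -> float:
        self.steps += 1
        if self.steps <= self.num_warmup_steps:
            return self.base_lr * self.steps / self.num_warmup_steps
        return self.main.lr

    def on_validation(self, metric: float) -> float:
        # after warmup, the monitored metric drives the main scheduler
        return self.main.step(metric)
```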
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/instance_segmentation/maskrcnn_r50.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/instance_segmentation/maskrcnn_r50_tv.yaml
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
@@ -13,7 +13,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/instance_segmentation/maskrcnn_swint.yaml
@@ -12,7 +12,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/semantic_segmentation/litehrnet_18.yaml
@@ -16,7 +16,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/semantic_segmentation/litehrnet_s.yaml
@@ -16,7 +16,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
2 changes: 1 addition & 1 deletion src/otx/recipe/semantic_segmentation/litehrnet_x.yaml
@@ -16,7 +16,7 @@ model:
   scheduler:
     class_path: otx.core.schedulers.LinearWarmupSchedulerCallable
     init_args:
-      num_warmup_steps: 10
+      num_warmup_steps: 100
       main_scheduler_callable:
         class_path: lightning.pytorch.cli.ReduceLROnPlateau
         init_args:
26 changes: 12 additions & 14 deletions src/otx/tools/templates/classification/configuration.yaml
@@ -87,11 +87,11 @@ learning_parameters:
   num_iters:
     affects_outcome_of: TRAINING
     default_value: 200
-    description:
-      Increasing this value causes the results to be more robust but training
-      time will be longer.
+    description: Maximum number of epochs to train a model.
+      Increasing this value may result in longer training, but potentially in a more robust model.
+      Note, if the early stopping is enabled, the actual number of epochs may be less than this value.
     editable: true
-    header: Number of training iterations
+    header: Number of training epochs
     max_value: 1000
     min_value: 1
     type: INTEGER
@@ -109,9 +109,9 @@ learning_parameters:
     description:
       Increasing this value might improve training speed however it might
       cause out of memory errors. If the number of workers is set to zero, data loading
-      will happen in the main training thread.
+      will happen only in the main training thread leading to decreasing training speed.
     editable: true
-    header: Number of cpu threads to use during batch generation
+    header: Number of cpu threads for batch generation (num workers)
     max_value: 8
     min_value: 0
     type: INTEGER
@@ -191,21 +191,19 @@ learning_parameters:
     warning: This is applied exclusively when early stopping is enabled.
   early_stop_iteration_patience:
     affects_outcome_of: TRAINING
-    default_value: 0
-    description:
-      Training will stop if the model does not improve within the number of iterations of patience.
-      This ensures the model is trained enough with the number of iterations of patience before early stopping.
+    default_value: 30
+    description: This parameter ensures that the model is trained enough within the number of iterations of warmup and stable before early stopping is on.
     editable: true
-    header: Iteration patience for early stopping
-    max_value: 1000
+    header: Number of warmup iterations for early stopping
+    max_value: 10000
     min_value: 0
     type: INTEGER
     ui_rules:
       action: DISABLE_EDITING
       operator: AND
       rules: []
       type: UI_RULES
-    value: 0
+    value: 30
     visible_in_ui: true
     warning: This is applied exclusively when early stopping is enabled.
@@ -264,7 +262,7 @@ learning_parameters:
   auto_num_workers:
     affects_outcome_of: TRAINING
     default_value: false
-    description: Adapt num_workers according to current hardware status automatically.
+    description: Adapt number of workers according to current hardware status automatically.
     editable: true
     header: Enable auto adaptive num_workers
     type: BOOLEAN
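The template change above repurposes early_stop_iteration_patience as a warmup guard: early stopping may not fire until at least that many iterations (default now 30) have run, regardless of the monitored metric. A sketch of that guard, assuming a simple best-metric tracker (hypothetical logic, not the actual OTX early-stopping hook):

```python
# Hypothetical early stopper with an iteration-patience guard: the stop
# decision is suppressed until `iteration_patience` iterations have run.
class EarlyStopper:
    def __init__(self, metric_patience: int = 5, iteration_patience: int = 30):
        self.metric_patience = metric_patience
        self.iteration_patience = iteration_patience
        self.best = float("-inf")
        self.bad_checks = 0

    def should_stop(self, metric: float, iterations_done: int) -> bool:
        if metric > self.best:
            self.best, self.bad_checks = metric, 0
        else:
            self.bad_checks += 1
        # guard: never stop before iteration_patience iterations have run
        if iterations_done < self.iteration_patience:
            return False
        return self.bad_checks > self.metric_patience
```

This matches the new description: the model is guaranteed a minimum amount of training (covering warmup and stabilization) before the metric-based patience can end the run.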