Add jit.ignore to prototype optimizers (pytorch#2958)
Summary:
Pull Request resolved: pytorch#2958

`torch.compile` does not appear to raise errors if we deprecate an optimizer that is no longer used, but `torch.jit.script` does: it seems to check every decision branch and requires each one to remain compilable, even if the branch is never taken. See [simplified Bento example](https://fburl.com/anp/rbktkl08).

To make prototype optimizers easy to deprecate once they are included in production, we wrap the invoker function with `torch.jit.ignore`. This means we must keep auto-generating the `lookup_{}.py` files even after the optimizers are deprecated and their backends are removed. A hedged sketch of the behavior follows.
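
Below is a minimal, self-contained sketch (not the actual FBGEMM code; the function names are made up) of why the decorator helps: `torch.jit.script` compiles every reachable call, so marking a callee with `@torch.jit.ignore` lets the caller script even though the ignored body is never compiled.

```
import torch

@torch.jit.ignore
def prototype_invoke(x: torch.Tensor) -> torch.Tensor:
    # TorchScript skips compiling this body, so it may call ops from a
    # deprecated/removed backend without breaking torch.jit.script.
    return x * 2.0

def lookup(x: torch.Tensor, use_prototype: bool) -> torch.Tensor:
    if use_prototype:
        # Without @torch.jit.ignore on the callee, scripting this branch
        # would require the callee's body to be scriptable.
        return prototype_invoke(x)
    return x + 1.0

scripted = torch.jit.script(lookup)   # compiles; the ignored body is skipped
print(scripted(torch.ones(2), False))  # tensor([2., 2.])
```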

**Usage**
Add `"is_prototype_optimizer": True` to the optimizer's entry in `/codegen/genscript/optimizers.py`.
Example:
```
def ensemble_rowwise_adagrad_optimizer():
    return {
        "optimizer": "ensemble_rowwise_adagrad",
        "is_prototype_optimizer": True,
    }
```
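
For context, a rough sketch of how this flag could flow from `optimizers.py` into the generated `lookup_{}.py` through the Jinja template; the template name, loader path, and output path here are illustrative assumptions, not the actual FBGEMM codegen.

```
import jinja2

# Illustrative names only; the real generator script and template file differ.
env = jinja2.Environment(loader=jinja2.FileSystemLoader("codegen"))
template = env.get_template("lookup_invoker.template")  # assumed template file

optimizer = {
    "optimizer": "ensemble_rowwise_adagrad",
    "is_prototype_optimizer": True,  # flag added above
}

# When is_prototype_optimizer is True, the rendered file carries the
# @torch.jit.ignore decorator on invoke(), per the template change below.
with open(f"lookup_{optimizer['optimizer']}.py", "w") as f:
    f.write(template.render(**optimizer))
```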

Reviewed By: q10

Differential Revision: D60943180

fbshipit-source-id: d43e11dfddf248b2b3113cb42d2c70c02e002ef8
spcyppt authored and facebook-github-bot committed Aug 9, 2024
1 parent 3c559ab commit 9c0aa2a
Showing 1 changed file with 6 additions and 1 deletion.
@@ -38,7 +38,12 @@ torch.ops.load_library("//deeplearning/fbgemm/fbgemm_gpu:embedding_inplace_updat

{%- endif %}


{%- if is_prototype_optimizer %}
# Decorate prototype optimizers, which may be deprecated in the future, with jit.ignore to avoid
# possible errors from torch.jit.script.
# Note that the backends can be removed, but the lookup invoker is still needed for backward compatibility.
@torch.jit.ignore
{%- endif %}
def invoke(
common_args: CommonArgs,
optimizer_args: OptimizerArgs,