Hi,
I have a relatively straightforward situation: I need to validate input shapes when `torch.jit.load`-ing a saved model, and I can't seem to find a solution. After loading, the same model cannot validate the traced input shapes:
```python
# ... later, elsewhere, load my saved model
loaded_224 = torch.jit.load("saved_model_224.pt")

# there's nothing preventing me from sending incorrectly shaped inputs,
# i.e., traced with 224 but called with 500
inp_500 = torch.rand(1, 3, 500, 500, dtype=torch.float)
loaded_224(inp_500)
```
I'd like to prevent feeding incorrectly shaped inputs to a model after loading it.
In particular, I would like to be able to do something like:
```python
traced_input_shape = loaded_224.get_input_shape()  # desired (hypothetical) API
if inp_500.shape != traced_input_shape:
    print("Error: trying to run inference with incorrectly shaped inputs!")
    # die
```
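As far as I can tell, nothing like `get_input_shape()` exists on a loaded `ScriptModule`, so one workaround I can sketch is to bake the check into `forward` before scripting, so it is serialized along with the model and survives `torch.jit.save`/`torch.jit.load`. The `ShapeChecked` wrapper below is hypothetical, and `nn.Conv2d` is just a stand-in for the real model:

```python
import io
from typing import List

import torch
from torch import nn


class ShapeChecked(nn.Module):
    """Hypothetical wrapper: stores the expected input shape and checks it
    in forward, so the check is serialized together with the model."""

    def __init__(self, model: nn.Module, expected_shape: List[int]):
        super().__init__()
        self.model = model
        self.expected_shape = expected_shape  # inferred as List[int] by TorchScript

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        assert x.dim() == len(self.expected_shape), "wrong number of dimensions"
        for i in range(x.dim()):
            assert x.size(i) == self.expected_shape[i], "incorrect input shape"
        return self.model(x)


# stand-in for the real model; any scriptable module should work
wrapped = torch.jit.script(ShapeChecked(nn.Conv2d(3, 8, 3), [1, 3, 224, 224]))

# round-trip through save/load in memory to simulate deployment
buf = io.BytesIO()
torch.jit.save(wrapped, buf)
buf.seek(0)
loaded = torch.jit.load(buf)

loaded(torch.rand(1, 3, 224, 224, dtype=torch.float))    # ok
# loaded(torch.rand(1, 3, 500, 500, dtype=torch.float))  # raises at runtime
```

This doesn't recover the shape from an already-saved model, but for models I control at trace/script time it at least makes the mismatch fail loudly instead of silently.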
I tried using torchlayers to help with this situation by:
```python
import torchlayers as tl

tl.build(loaded_224, inp_224)
```
This failed (reasonably) with:

```
PickleError: ScriptModules cannot be deepcopied using copy.deepcopy or saved using torch.save. Mixed serialization of script and non-script modules is not supported. For purely script modules use my_script_module.save(<filename>) instead.
```
Any recommendations?