Model inference fails with torchserve #261

Answered by rwightman
cotrane asked this question in Q&A

@cotrane hmm, I haven't gone through that process with these models. That error means the feature maps aren't lined up across resolution levels, which is usually caused by SAME padding or interpolation size targets not being correct. This can happen if you don't use appropriately sized inputs (matching the model config), or if the padding is wrong (which can happen if you trace the model with the wrong size, or trace with one size and then run the model with a different input size).
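
For reference, a minimal sketch of checking the expected input size (the model name `tf_efficientnet_b0` is only illustrative; timm models expose their configured size via `default_cfg`):

```python
import timm
import torch

model = timm.create_model('tf_efficientnet_b0', pretrained=True)
model.eval()

# timm models carry their expected input size in default_cfg,
# e.g. (3, 224, 224) for tf_efficientnet_b0
_, height, width = model.default_cfg['input_size']
dummy = torch.randn(1, 3, height, width)

with torch.no_grad():
    out = model(dummy)  # feature maps line up when the size matches the config
print(out.shape)
```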

The model is definitely being exported via scripting and not tracing, yes?
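
A minimal sketch of the difference, assuming a timm version that supports the `scriptable` flag: scripting preserves the size-dependent control flow (e.g. the SAME padding math), whereas a trace bakes in the padding computed for the example input.

```python
import timm
import torch

# scriptable=True configures layers so torch.jit.script can compile them
model = timm.create_model('tf_efficientnet_b0', pretrained=True, scriptable=True)
model.eval()

scripted = torch.jit.script(model)  # safe across input sizes
# traced = torch.jit.trace(model, torch.randn(1, 3, 224, 224))  # padding baked in for 224x224

scripted.save('model.pt')  # archive this file with torch-model-archiver for torchserve
```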
