
Got RuntimeError after running import ngraph with resnet example #283

Open
minus-one opened this issue Jan 22, 2019 · 1 comment

Comments

@minus-one

I got this error after running the import_onnx_model command, following the example for ResNet-50.
Running ONNX 1.3 and Python 3.5.2.

ng_models = import_onnx_model(onnx_pb)
ONNX ai.onnx opset version 8 is not supported. Falling back to latest supported version: 7
More than one different shape in input nodes [<Constant: 'Constant_289' ([])>, <BatchNormInference: 'gpu_0/res2_0_branch2c_bn_1' ([1, 256, 56, 56])>].
More than one different data type in input nodes [<Constant: 'Constant_289' ([])>, <BatchNormInference: 'gpu_0/res2_0_branch2c_bn_1' ([1, 256, 56, 56])>].
truncated traceback:
RuntimeError: While validating node 'Add[Add_292](Broadcast_290: int64_t{1,256,56,56}, Broadcast_291: float{1,256,56,56})' of type 'Add':
Assertion 'element::Type::merge(element_type, element_type, get_input_element_type(i))' failed at /root/ng/src/ngraph/node.cpp:451:
Argument element types are inconsistent.
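
For context, a minimal sketch of the import step, following the ngraph-onnx ResNet-50 example (the model path is illustrative and assumes the ONNX model zoo ResNet-50 archive):

import onnx
from ngraph_onnx.onnx_importer.importer import import_onnx_model

# Load the ResNet-50 protobuf (illustrative path)
onnx_pb = onnx.load('resnet50/model.onnx')

# Import the ONNX graph into nGraph -- this is the call that raises the RuntimeError above
ng_models = import_onnx_model(onnx_pb)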

I also tried converting the model to opset v7, and still no luck.
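
The downgrade attempt was roughly along these lines (a sketch assuming onnx's built-in version converter on the loaded ModelProto; the exact conversion path may have differed):

from onnx import version_converter

# Downgrade the model to opset 7 before importing (attempted workaround)
onnx_pb_v7 = version_converter.convert_version(onnx_pb, 7)
ng_models = import_onnx_model(onnx_pb_v7)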

@postrational
Member

What is the source of the model you were trying to import?
