[11/29 19:37:01 fastreid.onnx_export]: Beginning ONNX file converting
/home/dourenyin/anaconda3/envs/fastreid/lib/python3.7/site-packages/torch/onnx/utils.py:243: UserWarning: 'add_node_names' can be set to True only when 'operator_export_type' is ONNX. Since 'operator_export_type' is not set to 'ONNX', 'add_node_names' argument will be ignored.
/home/dourenyin/anaconda3/envs/fastreid/lib/python3.7/site-packages/torch/onnx/utils.py:243: UserWarning: 'do_constant_folding' can be set to True only when 'operator_export_type' is ONNX. Since 'operator_export_type' is not set to 'ONNX', 'do_constant_folding' argument will be ignored.
[11/29 19:37:03 fastreid.onnx_export]: Completed convert of ONNX model
[11/29 19:37:03 fastreid.onnx_export]: Beginning ONNX model path optimization
[11/29 19:37:04 fastreid.onnx_export]: Completed ONNX model path optimization
Traceback (most recent call last):
File "tools/deploy/onnx_export.py", line 158, in
model_simp, check = simplify(onnx_model)
File "/home/dourenyin/anaconda3/envs/fastreid/lib/python3.7/site-packages/onnxsim/onnx_simplifier.py", line 204, in simplify
tensor_size_threshold,
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] Inferred shape and existing shape differ in rank: (1) vs (4)
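A minimal sketch for reproducing the simplifier failure directly on the exported file, outside of `tools/deploy/onnx_export.py` (the model path and output name below are placeholders, not values from this log):

```python
import onnx
from onnxsim import simplify

# Placeholder path: substitute the .onnx file written by tools/deploy/onnx_export.py.
model_path = "baseline.onnx"
onnx_model = onnx.load(model_path)

# Validate the graph structure first; a malformed graph can surface later
# as a shape-inference error inside onnx-simplifier.
onnx.checker.check_model(onnx_model)

# Run ONNX's own shape inference to see whether the rank mismatch (1 vs 4)
# is already reported before simplification.
try:
    inferred = onnx.shape_inference.infer_shapes(onnx_model)
except Exception as exc:  # InferenceError is raised from the C++ extension
    print("shape inference failed:", exc)
else:
    model_simp, check = simplify(inferred)
    assert check, "simplified ONNX model could not be validated"
    onnx.save(model_simp, "baseline_simplified.onnx")
```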