While optimizing a number of Mask and Faster R-CNN models (opset=12) from the ONNX Model Hub, the optimizer produced the output models, but after applying the eliminate_consecutive_idempotent_ops pass, running them fails with the following error:
[ONNXRuntimeError] : 1 : FAIL : Node (<NODE>) Op (Reshape) [ShapeInferenceError] Dimension could not be inferred: incompatible shapes
Where <NODE> is: 2953 for the Mask models (Mask R-CNN R-50-FPN-int8, Mask R-CNN R-50-FPN-qdq, Mask R-CNN R-50-FPN-fp32) and 2794 for the Faster models (Faster R-CNN R-50-FPN-int8, Faster R-CNN R-50-FPN-qdq, and Faster R-CNN R-50-FPN-fp32).
The original models run without problems. From visual inspection in Netron, I believe the concatenation operations preceding these nodes are being mishandled.
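For reference, a minimal sketch of how I reproduce the failure, assuming the onnxoptimizer and onnxruntime packages; the model file names are placeholders for the downloaded Model Hub files:

```python
import onnx
import onnxoptimizer
import onnxruntime as ort

# Load one of the affected models (path is a placeholder).
model = onnx.load("mask_rcnn_r50_fpn_fp32.onnx")

# Apply only the pass that triggers the problem.
optimized = onnxoptimizer.optimize(model, ["eliminate_consecutive_idempotent_ops"])
onnx.save(optimized, "mask_rcnn_r50_fpn_fp32_opt.onnx")

# Creating the inference session (or the first run) fails with the
# ShapeInferenceError on the Reshape node reported above.
sess = ort.InferenceSession("mask_rcnn_r50_fpn_fp32_opt.onnx")
```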