Seems like there is some logic that should address this case, but maybe it doesn't work for dynamic shapes?
```cpp
// You can't eliminate both broadcasts if the same unit-dim in both
// operands is being broadcast to a larger value. We can do some further
// simplification, but we leave that to other patterns.
for (unsigned i = 0; i < input1.getType().getRank(); i++) {
  if (input1.getType().getDimSize(i) == 1 &&
      input2.getType().getDimSize(i) == 1 && op.getType().getDimSize(i) > 1)
    return failure();
}
```
If dynamic dimensions are represented by negative integers, then maybe we could change the condition to `op.getType().getDimSize(i) != 1`?
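For illustration, a minimal sketch of that adjusted guard, assuming upstream MLIR's convention that dynamic extents are encoded as a negative sentinel (`ShapedType::kDynamic`), so `!= 1` also rejects dimensions whose output extent is dynamic:

```cpp
for (unsigned i = 0; i < input1.getType().getRank(); i++) {
  // A static unit dim in both inputs that maps to anything other than a
  // static unit dim in the output (larger *or* dynamic) blocks elimination.
  if (input1.getType().getDimSize(i) == 1 &&
      input2.getType().getDimSize(i) == 1 &&
      op.getType().getDimSize(i) != 1)
    return failure();
}
```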
pranavm-nvidia changed the title from "Incorrect broadcast elimintation when both operands to elementwise are broadcasted" to "Incorrect broadcast elimination when both operands to elementwise are broadcasted" on Feb 25, 2025.
`ones` in Tripy works by creating a constant and broadcasting it up to the correct shape. There seems to be a bug with how `tensorrt.broadcast` is optimized out in the case where both operands are broadcasted in this way.

Input MLIR:
This turns into:
This incorrectly results in an output with a volume of 1 (e.g., if both operands are broadcast from `tensor<1x1xf32>` up to `tensor<2x3xf32>`, removing both broadcasts makes the elementwise op produce a `tensor<1x1xf32>` instead of a `tensor<2x3xf32>`):
In this case, the `broadcast` instead needs to be rotated over the elementwise operation so that the output still has the correct shape.
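As a rough illustration of that rotation, here is a minimal rewrite-pattern sketch. The op class names and accessors (`ElementWiseOp`, `BroadcastOp`, `getInput1`, `getInput2`, `getBroadcastDims`, `getElementwiseOperation`) are hypothetical stand-ins for the tensorrt dialect's actual API, and only the simplest case is handled, where both operands are broadcast from identical types with identical dimension mappings:

```cpp
// Sketch only: rotate the broadcasts below the elementwise op,
//   elementwise(broadcast(a), broadcast(b)) -> broadcast(elementwise(a, b)),
// so eliminating the per-operand broadcasts can no longer shrink the result.
struct RotateBroadcastOverElementwise : OpRewritePattern<ElementWiseOp> {
  using OpRewritePattern<ElementWiseOp>::OpRewritePattern;

  LogicalResult matchAndRewrite(ElementWiseOp op,
                                PatternRewriter &rewriter) const override {
    auto lhsBcast = op.getInput1().getDefiningOp<BroadcastOp>();
    auto rhsBcast = op.getInput2().getDefiningOp<BroadcastOp>();
    if (!lhsBcast || !rhsBcast)
      return failure();
    // Only the simple case: both operands come from the same pre-broadcast
    // type and use the same broadcast dimension mapping.
    if (lhsBcast.getInput().getType() != rhsBcast.getInput().getType() ||
        lhsBcast.getBroadcastDims() != rhsBcast.getBroadcastDims())
      return failure();
    // Apply the elementwise op on the small (pre-broadcast) operands first...
    Value small = rewriter.create<ElementWiseOp>(
        op.getLoc(), lhsBcast.getInput().getType(), lhsBcast.getInput(),
        rhsBcast.getInput(), op.getElementwiseOperation());
    // ...then broadcast the small result up to the original output type,
    // preserving the shape (and volume) of the original result.
    rewriter.replaceOpWithNewOp<BroadcastOp>(op, op.getType(), small,
                                             lhsBcast.getBroadcastDims());
    return success();
  }
};
```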