I'm trying to convert an XGBRegressor to ONNX, and what I get back is a TreeEnsembleRegressor operator. I've noticed that the output shapes of the original model's prediction and the ONNX one differ. In particular:
The original model returns:
array([ 96.55492, 199.45296], dtype=float32)
The ONNX one returns:
array([[ 96.55492], [199.45296]], dtype=float32)
i.e. with an additional trailing dimension. Is this intentional, and if so, why? I've tried the same with XGBClassifier, and there the original output shape is preserved.
It is by design. It is possible to append a final node to the ONNX graph to remove the extra dimension, but the converter does not do this. It can be done with the onnx package. I can add an example if you need one.
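If editing the graph is overkill, the mismatch can also be handled at prediction time. The sketch below (a minimal illustration, not the converter's behavior) uses the shapes reported above: the ONNX runtime output has a trailing unit dimension that `numpy.squeeze` removes, recovering the original XGBRegressor shape.

```python
import numpy as np

# Shapes as reported in the issue: the original XGBRegressor output
# is 1-D with shape (N,), while the ONNX TreeEnsembleRegressor
# output is 2-D with shape (N, 1).
original = np.array([96.55492, 199.45296], dtype=np.float32)
onnx_out = np.array([[96.55492], [199.45296]], dtype=np.float32)

# Post-processing step: drop the trailing unit dimension so the
# ONNX prediction matches the original model's output shape.
restored = np.squeeze(onnx_out, axis=1)

print(restored.shape)  # (2,)
```

The same effect inside the graph would mean appending a Squeeze (or Reshape) node after the TreeEnsembleRegressor output, which is what the comment above refers to.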
I also find it kind of weird that the output variable is named "variable" instead of "label", "predictions", or "probabilities", as the equivalent TreeEnsembleClassifier implementation uses.
In general, I think it's very important to preserve the original model's input/output shape to keep it consistent.