[RT-Detr] - Add RT-Detr Onnx Config #2040
Conversation
Hi @fxmarty, @echarlaix, @JingyaHuang, @michaelbenayoun, I'm creating this PR following @qubvel's and @ArthurZucker's advice on this PR: huggingface/transformers#33877
Thanks a lot for the addition @YHallouard
class RTDetrOnnxConfig(ViTOnnxConfig):
    DEFAULT_ONNX_OPSET = 16
Is 16 the minimum required opset number? If yes, could you add a comment explaining why it is required?
Yes, I first tried opset 13 and got the following error:
UnsupportedOperatorError: Exporting the operator 'aten::grid_sampler' to ONNX opset version 13 is not supported. Support for this operator was added in version 16, try exporting with this version.
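For reference, here is a minimal standalone repro (my own sketch, not code from this PR) of why opset 16 is the floor: torch.nn.functional.grid_sample lowers to aten::grid_sampler, whose ONNX symbolic only exists from opset 16 onward, so exporting at opset 13 raises the error above.

```python
import torch
import torch.nn.functional as F


class GridSampleModule(torch.nn.Module):
    """Tiny module whose export exercises aten::grid_sampler."""

    def forward(self, feats, grid):
        return F.grid_sample(feats, grid, align_corners=False)


feats = torch.randn(1, 3, 8, 8)        # (N, C, H, W) feature map
grid = torch.rand(1, 4, 4, 2) * 2 - 1  # sampling coordinates in [-1, 1]

# opset_version=13 fails with UnsupportedOperatorError; 16 succeeds.
torch.onnx.export(
    GridSampleModule(), (feats, grid), "grid_sample.onnx", opset_version=16
)
```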
"rt-detr": supported_tasks_mapping( | ||
"object-detection", | ||
onnx="RTDetrOnnxConfig", | ||
), | ||
"sam": supported_tasks_mapping( | ||
"feature-extraction", | ||
onnx="SamOnnxConfig", |
Also, can you add a test using a tiny random model? It can be added directly here:
PYTORCH_EXPORT_MODELS_TINY = {
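Something along these lines is what the entry could look like (a sketch only; the "rt-detr" key and the checkpoint mirror what is discussed in this thread, and a tiny random repo would eventually replace the full-size one):

```python
PYTORCH_EXPORT_MODELS_TINY = {
    # ... existing entries ...
    "rt-detr": "PekingU/rtdetr_r18vd",  # ideally a tiny random checkpoint instead
}
```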
I added the smallest available model, but maybe you want to create a smaller one under hf-internal-testing?
Hi @echarlaix, thanks for the review! I added "PekingU/rtdetr_r18vd" for testing, but I saw that you used hf-internal-testing. Should I create a randomly initialized model there?
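For what it's worth, a tiny randomly initialized checkpoint could be produced roughly like this (a sketch under the assumption that the default RTDetrConfig fields can be shrunk; the size-related argument names should be checked against the transformers RTDetrConfig docs, and the repo id is hypothetical):

```python
from transformers import RTDetrConfig, RTDetrForObjectDetection

# Start from the default config and shrink the size-related fields
# (layer counts, hidden sizes, number of queries) to keep the model tiny.
config = RTDetrConfig()
# config.d_model = 32          # assumption: verify field names in RTDetrConfig
# config.decoder_layers = 1
# config.num_queries = 10

model = RTDetrForObjectDetection(config)  # randomly initialized weights
model.save_pretrained("tiny-random-rt-detr")
# model.push_to_hub("hf-internal-testing/tiny-random-rt-detr")  # hypothetical repo id
```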
What does this PR do?
Fixes # (issue)
Before submitting
Who can review?