Replies: 1 comment
-
I added a comparison using ONNX export + trtexec in my issue and also included the kwarg
-
Hello,
I created a script to compare inference runtimes with `torch`, `torch.compile`, and `torch_tensorrt.compile` for any timm model, input shape, and dtype, and some runtimes are worse with TensorRT. I opened an issue here and would be interested in your feedback, if any.
Simon
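For context, a minimal sketch of the timing pattern such a comparison script typically uses: warm up each compiled variant first (important for `torch.compile` and TensorRT, which compile lazily on the first calls), then report a median latency. The `benchmark` helper and the stand-in workload below are illustrative assumptions, not the actual script from the issue; the real script presumably passes the torch / `torch.compile` / `torch_tensorrt.compile` variants of a timm model as `fn`.

```python
import time
import statistics


def benchmark(fn, *args, warmup=10, iters=50):
    """Median wall-clock latency of fn(*args), after warmup runs.

    Warmup matters here because torch.compile and torch_tensorrt.compile
    trigger compilation on the first invocations, which would otherwise
    dominate the measurement.
    """
    for _ in range(warmup):
        fn(*args)
    samples = []
    for _ in range(iters):
        start = time.perf_counter()
        fn(*args)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)


# Trivial stand-in workload, just to show the call pattern:
latency = benchmark(lambda: sum(i * i for i in range(1000)))
print(f"median latency: {latency * 1e6:.1f} us")
```

A median over many iterations is less sensitive to scheduler noise than a mean; for GPU workloads the real script would also need to synchronize the device (e.g. `torch.cuda.synchronize()`) before reading the clock.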