Support load TensorRT V3 plugin #24211
Conversation
After the change, V3 plugins can be loaded.
@jywu-msft, @tianleiwu, may I get a review of this change as well? Thanks!
onnxruntime/core/providers/tensorrt/tensorrt_execution_provider_custom_ops.cc
/azp run Big Models, Linux CPU Minimal Build E2E CI Pipeline, Linux QNN CI Pipeline, ONNX Runtime Web CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline, Win_TRT_Minimal_CUDA_Test_CI, Windows ARM64 QNN CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows x64 QNN CI Pipeline
Azure Pipelines successfully started running 9 pipeline(s).
- `getPluginCreatorList` is deprecated; use `getAllCreators` to load plugin creators.
- Support loading both V1 and V3 creators.
/azp run Big Models, Linux CPU Minimal Build E2E CI Pipeline, Linux QNN CI Pipeline, ONNX Runtime Web CI Pipeline, onnxruntime-binary-size-checks-ci-pipeline, Win_TRT_Minimal_CUDA_Test_CI, Windows ARM64 QNN CI Pipeline, Windows GPU Doc Gen CI Pipeline, Windows x64 QNN CI Pipeline
Azure Pipelines successfully started running 8 pipeline(s).
Description
TensorRT V3 plugins cannot be loaded by the TensorRT EP. This change replaces the deprecated `getPluginCreatorList` with `getAllCreators` so that both V1 and V3 plugin creators are loaded.

Motivation and Context

Support loading TensorRT V3 plugins.
Reference: https://github.com/NVIDIA/TensorRT/blob/8c6d69ddec0b2feff12f55472dc5d55cb6861d53/python/src/infer/pyPlugin.cpp#L2971C1-L2995C6