Description

Trying to run a self-designed model on the GPU, but it fails when deserializing the engine.

C++ code near the error:
```cpp
std::println("Runtime created"); // printed

// Deserialize the engine
if (enginePath.ends_with(".engine") || enginePath.ends_with(".trt")) {
    auto engine = runtime->deserializeCudaEngine(engineData, engineSize);
    std::println("Engine deserialized"); // not printed, so the error occurred at the line above
}
```
However, if I run trtexec to load the same engine file from the same folder, it runs successfully:
```
[02/28/2025-19:02:12] [I] [TRT] Loaded engine size: 39 MiB
[02/28/2025-19:02:12] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +31, now: CPU 0, GPU 31 (MiB)
[02/28/2025-19:02:12] [I] Engine deserialized in 0.171571 sec.
```
Environment
TensorRT Version: 8.6.1.6
NVIDIA GPU: 4090D
NVIDIA Driver Version: 560.94
CUDA Version: 11.8
CUDNN Version: 8.9.0
Operating System: Windows 11 Build 10.0.22631
Python Version (if applicable):/
Tensorflow Version (if applicable):/
PyTorch Version (if applicable):/
Baremetal or Container (if so, version):/
Relevant Files
Model link:
Steps To Reproduce
Commands or scripts:
Have you tried the latest release?: No; the whole project only runs with TensorRT 8.6.1.6.
Can this model run on other frameworks? For example, run the ONNX model with ONNX Runtime (polygraphy run <model.onnx> --onnxrt): the .trt engine runs with trtexec.