
CUDA path issue on 5060 Ti 16G #1482

@petijj

Description

C:\Users\Admin\Deep-Live-Cam>python run.py --execution-provider cuda
[ERROR] global obsensor_uvc_stream_channel.cpp:158 cv::obsensor::getStreamChannelGroup Camera index out of range
[ERROR] global obsensor_uvc_stream_channel.cpp:158 cv::obsensor::getStreamChannelGroup Camera index out of range

C:\Users\Admin\Deep-Live-Cam>python run.py --execution-provider cuda
2025-08-29 20:56:37.0493834 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

EP Error D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
2025-08-29 20:56:37.1163537 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\Python310\lib\tkinter_init_.py", line 1921, in call
return self.func(*args)
File "C:\Python310\lib\site-packages\customtkinter\windows\widgets\ctk_button.py", line 554, in _clicked
self._command()
File "C:\Users\Admin\Deep-Live-Cam\modules\ui.py", line 351, in
command=lambda: webcam_preview(
File "C:\Users\Admin\Deep-Live-Cam\modules\ui.py", line 792, in webcam_preview
create_webcam_preview(camera_index)
File "C:\Users\Admin\Deep-Live-Cam\modules\ui.py", line 911, in create_webcam_preview
source_image = get_one_face(cv2.imread(modules.globals.source_path))
File "C:\Users\Admin\Deep-Live-Cam\modules\face_analyser.py", line 28, in get_one_face
face = get_face_analyser().get(frame)
File "C:\Users\Admin\Deep-Live-Cam\modules\face_analyser.py", line 22, in get_face_analyser
FACE_ANALYSER = insightface.app.FaceAnalysis(name='buffalo_l', providers=modules.globals.execution_providers)
File "C:\Python310\lib\site-packages\insightface\app\face_analysis.py", line 31, in init
model = model_zoo.get_model(onnx_file, **kwargs)
File "C:\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
model = router.get_model(providers=providers, provider_options=provider_options)
File "C:\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
session = PickableInferenceSession(self.onnx_file, **kwargs)
File "C:\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in init
super().init(model_path, **kwargs)
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 430, in init
raise fallback_error from e
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 425, in init
self._create_inference_session(self._fallback_providers, None)
File "C:\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

GPU: RTX 5060 Ti 16GB
CPU: Ryzen 7 7800X3D
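
In case it helps with debugging: a minimal check like the one below, run in the same Python 3.10 environment, should show whether onnxruntime can see the CUDA provider at all. LoadLibrary error 126 usually means a DLL that onnxruntime_providers_cuda.dll depends on (cudart, cublas, cudnn) could not be found, so the CUDA and cuDNN directories in this sketch are only example paths and have to be adjusted to whatever versions are actually installed locally.

import os

# Example install locations only -- adjust to the CUDA/cuDNN versions actually installed.
# Making these bin directories visible to the process is the usual first step when the
# CUDA provider DLL fails to load its dependencies (LoadLibrary error 126).
os.add_dll_directory(r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4\bin")
os.add_dll_directory(r"C:\Program Files\NVIDIA\CUDNN\v9.x\bin")

import onnxruntime as ort

print(ort.__version__)                 # which onnxruntime build is actually being imported
print(ort.get_available_providers())   # should include 'CUDAExecutionProvider'

If 'CUDAExecutionProvider' still doesn't show up, the installed onnxruntime-gpu build probably doesn't match the CUDA/cuDNN versions listed on the requirements page mentioned in the error message.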

I see others are having this issue as well.
If you can provide a working fix I'd be more than grateful.
Thanks even for a comment on this post.
