Description

I installed CUDA 12.8, but it did not help; the error is unchanged.
PS C:\WINDOWS\system32> cd Deep-Live-Cam
PS C:\WINDOWS\system32\Deep-Live-Cam> python run.py --execution-provider cuda
2025-08-01 19:39:36.7319116 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
EP Error D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
2025-08-01 19:39:36.8828099 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\tkinter_init_.py", line 1948, in call
return self.func(*args)
^^^^^^^^^^^^^^^^
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\customtkinter\windows\widgets\ctk_button.py", line 554, in _clicked
self._command()
File "C:\WINDOWS\system32\Deep-Live-Cam\modules\ui.py", line 351, in
command=lambda: webcam_preview(
^^^^^^^^^^^^^^^
File "C:\WINDOWS\system32\Deep-Live-Cam\modules\ui.py", line 792, in webcam_preview
create_webcam_preview(camera_index)
File "C:\WINDOWS\system32\Deep-Live-Cam\modules\ui.py", line 911, in create_webcam_preview
source_image = get_one_face(cv2.imread(modules.globals.source_path))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\WINDOWS\system32\Deep-Live-Cam\modules\face_analyser.py", line 28, in get_one_face
face = get_face_analyser().get(frame)
^^^^^^^^^^^^^^^^^^^
File "C:\WINDOWS\system32\Deep-Live-Cam\modules\face_analyser.py", line 22, in get_face_analyser
FACE_ANALYSER = insightface.app.FaceAnalysis(name='buffalo_l', providers=modules.globals.execution_providers)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\insightface\app\face_analysis.py", line 31, in init
model = model_zoo.get_model(onnx_file, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
model = router.get_model(providers=providers, provider_options=provider_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
session = PickableInferenceSession(self.onnx_file, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in init
super().init(model_path, **kwargs)
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 430, in init
raise fallback_error from e
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 425, in init
self._create_inference_session(self._fallback_providers, None)
File "C:\Users\danil_1488\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
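
For reference, a quick way to see whether onnxruntime itself can load the CUDA execution provider, independent of Deep-Live-Cam, is a short check like the sketch below. It only uses the public onnxruntime API (get_device, get_available_providers); the exact CUDA/cuDNN versions required depend on the installed onnxruntime-gpu build, as listed on the requirements page referenced in the error message.

```python
# Diagnostic sketch (not part of the original report): check whether onnxruntime
# can see the CUDA execution provider at all, independent of Deep-Live-Cam.
import os

import onnxruntime as ort

print("onnxruntime version: ", ort.__version__)
print("build device:        ", ort.get_device())               # "GPU" only for onnxruntime-gpu wheels
print("available providers: ", ort.get_available_providers())  # should include "CUDAExecutionProvider"
print("CUDA_PATH:           ", os.environ.get("CUDA_PATH"))

# If "CUDAExecutionProvider" is missing, the CPU-only onnxruntime wheel is
# installed instead of onnxruntime-gpu. If it is listed but session creation
# still fails with LoadLibrary error 126 ("module could not be found"), the
# CUDA/cuDNN runtime DLLs that onnxruntime_providers_cuda.dll depends on are
# most likely not on PATH, or their versions do not match what the installed
# onnxruntime-gpu build expects.
```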