Error when using llamafactory-cli webui #5666
Unanswered
Smallposoft asked this question in Q&A
Replies: 1 comment

update transformers
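The reply suggests updating transformers, since the helper that fails to import below only exists in newer releases. A minimal sketch of how to diagnose and apply that fix programmatically; the `--user` flag is an assumption based on the `~/.local` paths in the traceback, and nothing here comes from the thread itself:

```python
# Sketch: detect whether the installed transformers exports the helper
# that LLaMA-Factory imports, and upgrade it if not. The --user flag is
# an assumption matching the ~/.local install shown in the traceback.
import importlib
import subprocess
import sys


def has_cuda_helper() -> bool:
    """Return True if transformers.utils exports is_torch_cuda_available."""
    try:
        utils = importlib.import_module("transformers.utils")
        return hasattr(utils, "is_torch_cuda_available")
    except ImportError:
        # transformers is not installed at all.
        return False


def upgrade_transformers() -> None:
    """Upgrade the user-level transformers install via pip."""
    subprocess.check_call(
        [sys.executable, "-m", "pip", "install", "--user", "--upgrade", "transformers"]
    )
```

After upgrading, re-run `llamafactory-cli webui` in a fresh shell so the new package is picked up.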
The error output is as follows:
Traceback (most recent call last):
  File "/home/evan/.local/bin/llamafactory-cli", line 5, in <module>
    from llamafactory.cli import main
  File "/home/evan/LLaMA-Factory/src/llamafactory/__init__.py", line 43, in <module>
    from .extras.env import VERSION
  File "/home/evan/LLaMA-Factory/src/llamafactory/extras/env.py", line 26, in <module>
    from transformers.utils import is_torch_cuda_available, is_torch_npu_available
ImportError: cannot import name 'is_torch_cuda_available' from 'transformers.utils' (/home/evan/.local/lib/python3.8/site-packages/transformers/utils/__init__.py)
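The traceback shows the transformers release under `~/.local` predates `is_torch_cuda_available`, so the bare import fails. If upgrading is not immediately possible, the failure mode can be reproduced and worked around with a guarded import; this is a hedged sketch of the pattern, not LLaMA-Factory's actual code:

```python
# Sketch of a defensive import: fall back to torch directly when the
# installed transformers is too old to export is_torch_cuda_available.
# This is a workaround pattern, not a patch from the project.
try:
    from transformers.utils import is_torch_cuda_available
except ImportError:
    def is_torch_cuda_available() -> bool:
        """Fallback for old transformers: query torch directly."""
        try:
            import torch
            return torch.cuda.is_available()
        except ImportError:
            # Neither the helper nor torch is available.
            return False
```

Either way, the durable fix remains upgrading transformers so the import succeeds as written.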