
Incompatibility between transformers 4.45.0 and torch 1.9.1 #34736

Open

realjoshqsun opened this issue Nov 14, 2024 · 0 comments

System Info

Hello, I was using a depth estimation model:
pipe = pipeline(task="depth-estimation", model="depth-anything/Depth-Anything-V2-Metric-Indoor-Small-hf")
but I got this error:

Traceback (most recent call last):
  File "test.py", line 3, in <module>
    pipe = pipeline(task="depth-estimation", model="depth-anything/Depth-Anything-V2-Metric-Indoor-Small-hf")
  File "/home/q84sun/miniconda3/envs/vlnce_py3.8/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 896, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/q84sun/miniconda3/envs/vlnce_py3.8/lib/python3.8/site-packages/transformers/pipelines/base.py", line 288, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/home/q84sun/miniconda3/envs/vlnce_py3.8/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/home/q84sun/miniconda3/envs/vlnce_py3.8/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3808, in from_pretrained
    state_dict = load_state_dict(resolved_archive_file)
  File "/home/q84sun/miniconda3/envs/vlnce_py3.8/lib/python3.8/site-packages/transformers/modeling_utils.py", line 556, in load_state_dict
    return safe_load_file(checkpoint_file)
  File "/home/q84sun/miniconda3/envs/vlnce_py3.8/lib/python3.8/site-packages/safetensors/torch.py", line 315, in load_file
    result[k] = f.get_tensor(k)
AttributeError: module 'torch' has no attribute 'frombuffer'

It seems to be a compatibility issue between transformers and torch. What is the right torch version to match transformers 4.45.0?
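For context, `torch.frombuffer` was only added in PyTorch 1.10, so any older torch release (including 1.9.1) will hit this `AttributeError` when safetensors materializes tensors from a `.safetensors` checkpoint. A minimal pure-Python sketch of a version guard illustrating the boundary (the helper names `version_tuple` and `has_frombuffer` are mine, not part of any library):

```python
def version_tuple(v: str) -> tuple:
    # "1.9.1+cu111" -> (1, 9, 1); drop local build tags like "+cu111"
    base = v.split("+")[0]
    return tuple(int(part) for part in base.split("."))

def has_frombuffer(torch_version: str) -> bool:
    # torch.frombuffer first appeared in PyTorch 1.10.0, and the
    # safetensors torch loader depends on it via f.get_tensor(k)
    return version_tuple(torch_version) >= (1, 10, 0)

print(has_frombuffer("1.9.1+cu111"))  # False -- the environment in this report
print(has_frombuffer("1.13.0"))       # True
```

At runtime, `hasattr(torch, "frombuffer")` would be an equivalent capability check without parsing version strings.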

My environment:
ubuntu: 22.04
Python: 3.8.20
torch: 1.9.1+cu111
transformers: 4.45.0
nvcc: cuda_11.7

Who can help?

@amyeroberts @qubvel @Rocketknight1

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

I was using the official example scripts from https://huggingface.co/depth-anything/Depth-Anything-V2-Metric-Indoor-Small-hf

from transformers import pipeline

pipe = pipeline(task="depth-estimation", model="depth-anything/Depth-Anything-V2-Metric-Indoor-Small-hf")

When running this test script in my own environment, it raised the error above.

Expected behavior

The pipeline should load the model and run without errors.

@realjoshqsun realjoshqsun changed the title Incompatibility between the versions of the transformers 4.45.0 and torch 1.9.1 Incompatibility between transformers 4.45.0 and torch 1.9.1 Nov 14, 2024