
Invalid none check for loftq_config attribute in LoraConfig #2170

Open · 2 of 4 tasks · sirluk opened this issue Oct 22, 2024 · 5 comments
@sirluk (Contributor) commented Oct 22, 2024

System Info

  • transformers version: 4.45.2
  • Platform: Linux-5.14.0-427.37.1.el9_4.x86_64-x86_64-with-glibc2.34
  • Python version: 3.12.7
  • Huggingface_hub version: 0.26.0
  • Safetensors version: 0.4.5
  • Accelerate version: 1.0.1
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.5.0+cu124 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using distributed or parallel set-up in script?:
  • Using GPU in script?:
  • GPU type: NVIDIA A100-SXM4-80GB

Who can help?

@BenjaminBossan

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

from peft import LoraConfig

# Expected: raises ValueError because loftq_config is not provided.
# Actual: returns a LoraConfig instance without error.
LoraConfig(init_lora_weights="loftq")

According to the code in peft/tuners/lora/config.py, this should raise a ValueError:

if self.loftq_config is None:
    raise ValueError("`loftq_config` must be specified when `init_lora_weights` is 'loftq'.")

However, the loftq_config field of LoraConfig defaults to an empty dict, so it is never None and the check never fires.
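
For reference, a minimal sketch of why the check is unreachable, assuming the field is declared with field(default_factory=dict) as in the current config (LoraConfigSketch is a hypothetical stand-in for the real class):

from dataclasses import dataclass, field

@dataclass
class LoraConfigSketch:  # hypothetical stand-in for LoraConfig
    init_lora_weights: bool | str = True
    # Assumption: default_factory=dict, so the default value is {}, not None.
    loftq_config: dict = field(default_factory=dict)

    def __post_init__(self):
        if self.init_lora_weights == "loftq":
            # Unreachable with the default: {} is not None.
            if self.loftq_config is None:
                raise ValueError(
                    "`loftq_config` must be specified when `init_lora_weights` is 'loftq'."
                )

cfg = LoraConfigSketch(init_lora_weights="loftq")
print(cfg.loftq_config)  # {} -- no ValueError is raised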

Expected behavior

A ValueError should be raised when init_lora_weights="loftq" but loftq_config is not specified.
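
One way to fix this (a sketch of a possible patch, not necessarily how it was resolved upstream) is to test for emptiness rather than identity with None, since the default empty dict is falsy:

if self.init_lora_weights == "loftq" and not self.loftq_config:
    # `not` catches both None and the default empty dict.
    raise ValueError("`loftq_config` must be specified when `init_lora_weights` is 'loftq'.")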

@BenjaminBossan (Member) commented
Good catch; indeed, this check doesn't make sense as is. I'll fix it once I have a bit of time (it's not high priority). If someone else wants to take this, feel free to do so.

@sparsh2 (Contributor) commented Oct 23, 2024

@BenjaminBossan I would like to take this up if it's okay.

@sirluk (Contributor, Author) commented Oct 24, 2024

@BenjaminBossan @sparsh2 If it's fine, I would do it, since I already started working on it as part of a different pull request that should ideally have the same structure as the fix for this issue.

@BenjaminBossan (Member) commented
Thanks, please go ahead, @sirluk, since you already started working on it. @sparsh2 I hope you did not put any time into this yet.

@sparsh2 (Contributor) commented Oct 24, 2024

No issues, @sirluk @BenjaminBossan. @sirluk, please go ahead.
