
Commit de6a88c

Set self._hf_peft_config_loaded to True when LoRA is loaded using load_lora_adapter in PeftAdapterMixin class (#11155)
set self._hf_peft_config_loaded to True on successful lora load

Sets the `_hf_peft_config_loaded` flag if a LoRA is successfully loaded in `load_lora_adapter`. Fixes bug /issues/11148

Co-authored-by: Sayak Paul <[email protected]>
1 parent 7dc52ea commit de6a88c
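
For context, `_hf_peft_config_loaded` is the flag that other `PeftAdapterMixin` helpers consult before acting on adapters. A minimal sketch of that gating pattern is below; the method name exists in diffusers, but the body and error message are paraphrased for illustration and are not part of this commit:

```python
# Illustrative paraphrase of the flag-gated pattern in PeftAdapterMixin helpers.
def disable_adapters(self) -> None:
    if not self._hf_peft_config_loaded:
        # Before this commit, adapters loaded via load_lora_adapter() ended up
        # here because the flag was never flipped to True.
        raise ValueError("No adapter loaded. Please load an adapter first.")
    # ... walk the injected adapter layers and disable them ...
```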

File tree

1 file changed: +3 −0 lines changed


src/diffusers/loaders/peft.py

Lines changed: 3 additions & 0 deletions
@@ -307,6 +307,9 @@ def load_lora_adapter(self, pretrained_model_name_or_path_or_dict, prefix="trans
             try:
                 inject_adapter_in_model(lora_config, self, adapter_name=adapter_name, **peft_kwargs)
                 incompatible_keys = set_peft_model_state_dict(self, state_dict, adapter_name, **peft_kwargs)
+                # Set peft config loaded flag to True if module has been successfully injected and incompatible keys retrieved
+                if not self._hf_peft_config_loaded:
+                    self._hf_peft_config_loaded = True
             except Exception as e:
                 # In case `inject_adapter_in_model()` was unsuccessful even before injecting the `peft_config`.
                 if hasattr(self, "peft_config"):
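
A hedged usage sketch of the behavior this change enables. It assumes diffusers with peft installed; the choice of model class, the `"path/to/..."` ids, and the `adapter_name` keyword are placeholders or assumptions for illustration, not taken from the commit:

```python
# Minimal sketch, not a verbatim reproduction of the linked issue.
from diffusers import FluxTransformer2DModel  # any PeftAdapterMixin subclass

transformer = FluxTransformer2DModel.from_pretrained(
    "path/to/base-model", subfolder="transformer"  # placeholder model id
)

# load_lora_adapter() comes from PeftAdapterMixin (src/diffusers/loaders/peft.py).
transformer.load_lora_adapter("path/to/lora-weights", adapter_name="default")

# With this commit, a successful load sets the flag, so flag-gated helpers
# accept the adapter instead of reporting that none is loaded.
assert transformer._hf_peft_config_loaded
transformer.disable_adapters()
transformer.enable_adapters()
```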
