System Info
transformers version: 4.36.2
Who can help?
@BenjaminBossan
Tasks
An officially supported task in the examples folder
My own task or dataset (give details below)
Reproduction
@BenjaminBossan 1. I use LoRA to fine-tune Whisper and get model A. The settings are:
from peft import LoraConfig, get_peft_model
config = LoraConfig(r=8, lora_alpha=16, target_modules=target_modules, modules_to_save=modules_to_save, lora_dropout=0.05, bias="none")
model = get_peft_model(model, config)
2. Then I modify the source code of model A and add an additional layer. I now want to train the model with this extra layer on top of the LoRA-trained model A. I use:
from peft import PeftModel
model_lora_path = "../lora_path/" + "checkpoint-56416"
model = PeftModel.from_pretrained(model, model_lora_path, ignore_mismatched_sizes=True).cuda()
But the loaded model's LoraConfig modules_to_save cannot be changed, and I want the additional layer to be stored in adapter_model.safetensors. How can I change my code?
In short, I want to add entries to modules_to_save in the LoraConfig while reloading the trained LoRA model, so that the additional layer is stored as well.
I tried model.peft_config['default'].modules_to_save.extend(modules_to_save) to extend modules_to_save, but it doesn't work.
Expected behavior
Be able to change the reloaded LoRA model's LoraConfig settings (in particular modules_to_save).
What you'd need to do in this case is to modify the modules_to_save argument before loading the model; doing it after the model has been loaded is too late. I see two options here:
1. You can directly edit the adapter_config.json in your checkpoint directory.
2. You can load the config first using PeftConfig.from_pretrained(<checkpoint-path>) and then pass that config like so: model = PeftModel.from_pretrained(..., config=peft_config). This is the cleaner solution; see the sketch below.
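For illustration, here is a minimal sketch of option 2. It assumes the extra layer is registered under the hypothetical name extra_layer and that base_model is the modified Whisper model that already contains it; these names and the save path are placeholders, not part of the original thread.

from peft import PeftConfig, PeftModel

checkpoint_path = "../lora_path/checkpoint-56416"

# Load the adapter config and extend modules_to_save *before* the PEFT model is built.
peft_config = PeftConfig.from_pretrained(checkpoint_path)
peft_config.modules_to_save = list(peft_config.modules_to_save or []) + ["extra_layer"]

# Pass the modified config explicitly so it takes precedence over the
# adapter_config.json stored in the checkpoint.
model = PeftModel.from_pretrained(
    base_model,          # placeholder: the modified base model that already has extra_layer
    checkpoint_path,
    config=peft_config,
    is_trainable=True,   # keep the adapter and modules_to_save trainable for further training
).cuda()

# After further training, save_pretrained writes the extra layer into
# adapter_model.safetensors together with the LoRA weights.
model.save_pretrained("whisper-lora-with-extra-layer")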
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.