Hello @SWivid, on Oct 12 you said:
"Full finetune is currently supported, lora or adapter not yet." Now you say it is possible. Did something change? Will you release a guide on how to do this? Have you tested it? I'm interested in accent LoRAs within the same language.
Hi @jpgallegoar, LoRA is pluggable but not added yet.
A PR is always welcome.
Since the DiT blocks are already wrapped there, adding a few lines of code for common LoRA practice should be fine. @Jerrister has got something working with LoRA, so I think LoRA fits in here.
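For reference, here is a minimal sketch of what "common LoRA practice" could look like on the DiT attention projections. This is not the project's actual API: the projection names `to_q` / `to_k` / `to_v`, the rank, and the alpha value are assumptions and would need to be checked against the real module names in the repo.

```python
# Minimal LoRA sketch (assumed names, not the repo's API): wrap selected
# nn.Linear layers inside the DiT attention blocks with a low-rank update.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen base nn.Linear plus a trainable low-rank (B @ A) update."""

    def __init__(self, base: nn.Linear, r: int = 16, alpha: int = 32):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)  # start as a no-op
        self.scaling = alpha / r

    def forward(self, x):
        return self.base(x) + self.scaling * self.lora_b(self.lora_a(x))


def inject_lora(model: nn.Module, target_names=("to_q", "to_k", "to_v")):
    """Replace matching linear submodules with LoRA-wrapped versions.

    `target_names` are assumed names for the attention projections;
    adjust them to whatever the DiT attention module actually uses.
    """
    for _, module in list(model.named_modules()):
        for child_name, child in list(module.named_children()):
            if isinstance(child, nn.Linear) and child_name in target_names:
                setattr(module, child_name, LoRALinear(child))
    return model
```

With the blocks wrapped this way, only the `lora_a` / `lora_b` parameters stay trainable, so the existing finetune loop could be reused by passing `filter(lambda p: p.requires_grad, model.parameters())` to the optimizer.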
Checks
Question details
Has LoRA been a subject of study?
Is it currently "pluggable" in the current architecture (e.g., tied to the attention layers)?
Is this something we may expect in a future version?
Has anyone tested this?