
PEFT implementations of adapters are outdated and languishing #1931

Closed
2 of 4 tasks
bghira opened this issue Jul 16, 2024 · 10 comments

Comments

@bghira

bghira commented Jul 16, 2024

System Info

Latest PEFT, transformers, diffusers codebase.

Who can help?

@BenjaminBossan @sayakpaul

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

The LoKr, LoHa, and other LyCORIS modules are all outdated and missing substantial improvements and fixes from their upstream implementations.
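
For context, a minimal sketch of how one of the affected adapters is applied through PEFT (the toy model and target module names here are placeholders, not a real training setup):

```python
# Minimal, illustrative sketch of applying PEFT's LoKr adapter, the kind of
# module this report is about; the toy model and target names are placeholders.
import torch.nn as nn
from peft import LoKrConfig, get_peft_model

base = nn.Sequential(nn.Linear(64, 64))  # stand-in for a real diffusers/transformers block

config = LoKrConfig(
    r=8,
    alpha=8,
    target_modules=["0"],    # the single Linear above; use real module names in practice
    rank_dropout=0.0,
    module_dropout=0.0,
)
model = get_peft_model(base, config)
model.print_trainable_parameters()
```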

Expected behavior

Either the upstream implementation should be directly wrapped, or these implementations should be kept up to date by the PEFT team. It doesn't seem like it should be on KohakuBlueleaf to update support for his adapters since the code was hastily copy-pasted into PEFT before being abandoned.

@BenjaminBossan
Member

We're currently a bit dev-starved when it comes to PEFT (basically just me), which will hopefully be resolved soon. Could you give an example of what is broken or missing in LoKr and LoHa? Wrapping probably won't be an option, as the implementations differ too much and we have different requirements in PEFT.

@bghira
Author

bghira commented Jul 16, 2024

full matrix tuning, 1x1 convolutions, quantised LoHa/LoKr, weight-decomposed LoHa/LoKr, fixed rank dropout implementation, fixed maths (not multiplying against the vector, but only the scalar)
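
for the rank dropout point, the upstream idea is roughly to drop whole rank components of the low-rank delta and rescale the rest, something along these lines (just a sketch of the intent, not the actual LyCORIS or PEFT code):

```python
# Rough sketch of per-rank dropout on a low-rank update B @ A; illustrative
# only, not the upstream LyCORIS or the PEFT implementation.
import torch

def delta_with_rank_dropout(A, B, scale, p, training=True):
    # A: (r, in_features), B: (out_features, r), scale: scalar multiplier
    if training and p > 0:
        r = A.shape[0]
        keep = (torch.rand(r, device=A.device) > p).to(A.dtype)
        keep = keep / (1.0 - p)      # rescale so the expected update magnitude is unchanged
        A = A * keep.unsqueeze(1)    # zero out whole rank components
    return scale * (B @ A)
```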

@bghira
Author

bghira commented Jul 16, 2024

basically the results are entirely different between the upstream and PEFT implementations, with the upstream author recommending PEFT be avoided at this point in time :[

@BenjaminBossan
Member

BenjaminBossan commented Jul 16, 2024

Thanks for the info. I reformatted this and put what sounds like bugs to the top, as they are more important. If you have any further references, like GH issues, that would be fantastic.

  • fixed rank dropout implementation
  • fixed maths (not multiplying against the vector, but only the scalar)
  • full matrix tuning
  • 1x1 convolutions
  • quantised LoHa/LoKr
  • weight-decomposed LoHa/LoKr
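
To double-check my understanding of "weight-decomposed LoHa/LoKr": I assume this means a DoRA-style decomposition applied on top of the adapter delta, roughly like the sketch below (placeholder names, not the actual upstream or PEFT code):

```python
# Loose sketch of DoRA-style weight decomposition applied to an adapter delta;
# purely illustrative, with placeholder names, not the upstream or PEFT code.
import torch
import torch.nn.functional as F

def decomposed_linear(x, W0, delta, magnitude):
    # W0: (out, in) frozen base weight, delta: (out, in) adapter update,
    # magnitude: (out,) learned per-output-channel scale.
    W = W0 + delta
    direction = W / W.norm(p=2, dim=1, keepdim=True)   # unit-norm rows
    return F.linear(x, magnitude.unsqueeze(1) * direction)
```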

@sayakpaul
Member

Thanks for making us aware of this, @bghira!

@BenjaminBossan WDYT about also opening this to the community in case someone is interested in collaborating to start with some of these?

@BenjaminBossan
Member

Yes, absolutely @sayakpaul. Do you have a suggestion for how best to promote this?

@sayakpaul
Member

@BenjaminBossan #1935


This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

@bghira
Author

bghira commented Aug 16, 2024

no stalebot. bad


This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
