I have never tested a LoRA version of LLaVA with lmms-eval. I remember there was some discussion about this in issue #241. Since llava-v1.5 is a relatively old model, I am not sure which environment or dependencies you will need for lmms-eval.
I'm evaluating the LLaVA LoRA version (https://huggingface.co/liuhaotian/llava-v1.5-7b-lora/discussions), but the performance seems unusually low. Do you know whether LoRA checkpoints are supported in the lmms-eval pipeline?