
Llava-v1.5 and Llava-v1.5-hf performance on MMMU differs a lot! #203

Open
hxhcreate opened this issue Aug 22, 2024 · 1 comment

Comments

@hxhcreate

Llava-v1.5-7b results
(screenshot of MMMU results)

Llava-1.5-7b-hf results
(screenshot of MMMU results)

Is that right?
Could the official team provide reliable reference results for the supported models and datasets?

@sycny

sycny commented Sep 7, 2024

I have found that different versions of transformers produce different MMMU values. With the latest version (4.42.2), I get a low score with Llava-1.5-7b-hf. But when I switch to an older version (4.39.2), I get a reasonable score, close to the one reported in the paper. I don't know what causes this problem.
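A practical workaround suggested by the comment above is to pin transformers to the older release before running the evaluation (e.g. `pip install "transformers==4.39.2"`) and guard against an accidentally newer install. The sketch below is a hypothetical runtime check, not part of any official harness; the version numbers come directly from the comment, and `check_transformers_version` is an illustrative helper name.

```python
# Hypothetical guard: fail fast if the installed transformers version is newer
# than the one known to reproduce the paper's MMMU score (per the comment above).

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '4.39.2' into a comparable tuple."""
    return tuple(int(part) for part in v.split(".")[:3])

KNOWN_GOOD = "4.39.2"   # reproduces the paper's MMMU score (per this thread)
KNOWN_BAD = "4.42.2"    # reported to yield a much lower score

def check_transformers_version(installed: str) -> bool:
    """Return True if the installed version is at or below the known-good one."""
    return parse_version(installed) <= parse_version(KNOWN_GOOD)

if __name__ == "__main__":
    # In a real script you would pass transformers.__version__ here.
    for v in (KNOWN_GOOD, KNOWN_BAD):
        print(v, "ok" if check_transformers_version(v) else "may give low MMMU scores")
```

This kind of pin is a stopgap; the underlying cause (likely a change in the hf processor or chat template between 4.39 and 4.42) still needs to be tracked down.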
