
Pull requests: huggingface/transformers

🌐 [i18n-KO] Translated encoder-decoder.md to Korean
#34880 opened Nov 22, 2024 by maximizemaxwell (7 of 10 tasks)
Fix test_auto_backbone_timm_model_from_pretrained
#34877 opened Nov 22, 2024 by ydshieh
BLIP: fix generation after hub update
#34876 opened Nov 22, 2024 by zucchini-nlp
Mllama: fix base prefix
#34874 opened Nov 22, 2024 by zucchini-nlp
Enable different torch dtype in sub models
#34873 opened Nov 22, 2024 by zucchini-nlp
Fix failing GGML test
#34871 opened Nov 22, 2024 by MekkCyber
Fix support for image processors modifications in modular
#34866 opened Nov 21, 2024 by yonigozlan (1 of 5 tasks)
Rename OLMo November to OLMo2
#34864 opened Nov 21, 2024 by 2015aroras (3 of 5 tasks)
Bitnet test fix to avoid using gated model
#34863 opened Nov 21, 2024 by MekkCyber
Fix import structure for Fast Image processors
#34859 opened Nov 21, 2024 by yonigozlan
Grounding DINO Processor standardization (labels: Processing, run-slow, Vision)
#34853 opened Nov 21, 2024 by qubvel (3 of 5 tasks)
Gemma flex attention
#34851 opened Nov 21, 2024 by dame-cell (Draft)
BLIP: enable device map
#34850 opened Nov 21, 2024 by zucchini-nlp
Comments update for better reading
#34844 opened Nov 21, 2024 by JohannFaust666 (5 tasks done)
Update Mistral conversion script
#34829 opened Nov 20, 2024 by Cyrilvallez
Tiny typos in gemma2_modular.py after flex_attention introduction
#34828 opened Nov 20, 2024 by MekkCyber (1 of 5 tasks)