Pull requests: huggingface/transformers
🌐 [i18n-KO] Translated encoder-decoder.md to Korean (#34880, opened Nov 22, 2024 by maximizemaxwell) · 7 of 10 tasks
Update the Python version in the Chinese README to match the English README (#34870, opened Nov 22, 2024 by vansin)
Fix support for image processors modifications in modular (#34866, opened Nov 21, 2024 by yonigozlan) · 1 of 5 tasks
Skipping aqlm non working inference tests till fix merged (#34865, opened Nov 21, 2024 by MekkCyber)
🧹 Remove deprecated RotaryEmbedding parts in the Attention layers (#34858, opened Nov 21, 2024 by Cyrilvallez)
[CI] Skip EETQ tests while package is broken with latest transformers (#34854, opened Nov 21, 2024 by BenjaminBossan) · 1 of 5 tasks
Grounding DINO Processor standardization (#34853, opened Nov 21, 2024 by qubvel) · labels: Processing, run-slow, Vision · 3 of 5 tasks
Fix torch.onnx.export of Qwen2-VL vision encoder (#34852, opened Nov 21, 2024 by xenova) · labels: Multimodal, ONNX, run-slow, Vision · 1 of 5 tasks
Add Flex Attention for Mistral along with refactoring (#34845, opened Nov 21, 2024 by OmarManzoor)
Comments update for better reading (#34844, opened Nov 21, 2024 by JohannFaust666) · 5 tasks done
Add optimized PixtralImageProcessorFast (#34836, opened Nov 20, 2024 by mgoin) · labels: Multimodal, optimization, Processing, Vision · 4 of 5 tasks
Tiny typos in gemma2_modular.py after flex_attention introduction (#34828, opened Nov 20, 2024 by MekkCyber) · 1 of 5 tasks