Issues: philschmid/deep-learning-pytorch-huggingface
#61: ValueError: Must flatten tensors with uniform dtype but got torch.bfloat16 and torch.float32 (opened Sep 12, 2024 by daje0601)
#40: flash attention error on instruction tune llama-2 tutorial on Sagemaker notebook (opened Oct 25, 2023 by matthewchung74)
#38: Falcon-180B "forward() got an unexpected keyword argument 'position_ids'" (opened Sep 22, 2023 by aittalam)
#37: Does this work for Llama2 - Fine-tune Falcon 180B with DeepSpeed ZeRO, LoRA & Flash Attention? (opened Sep 21, 2023 by ibicdev)
#29: Is the DataCollator necessary in peft-flan-t5-int8-summarization.ipynb? (opened Aug 15, 2023 by brooksbp)