
Warning for maximum sequence length when running FSDP Llama2 example #354

Closed

amanshanbhag (Collaborator) opened this issue Jun 10, 2024 · 2 comments
In awsome-distributed-training/3.test_cases/10.FSDP, running sbatch 1.distributed-training.sbatch produces a number of warnings that look like the following:

1: Token indices sequence length is longer than the specified maximum sequence length for this model (2522 > 2048). Running this sequence through the model will result in indexing errors

How to reproduce:

No changes were made to any of the training Python scripts. The only change to the 1.distributed-training.sbatch file was switching the model from Llama2-7B to Llama2-13B; everything else was kept the same. Simply run everything as is, per the instructions in the workshop.

There is some discussion of altering max_length, or making adjustments to the tokenizer, in this issue; that could be helpful in fixing the warning. A sketch of both approaches is below.
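For reference, the warning is emitted by the Hugging Face tokenizer when the tokenized length exceeds tokenizer.model_max_length (2048 here). A minimal sketch of the two usual mitigations follows: truncate at tokenization time, or tokenize without a cap and re-chunk into fixed-size blocks as causal-LM data pipelines typically do. The model id and block size are illustrative assumptions, not taken from the FSDP scripts.

```python
from transformers import AutoTokenizer

# Hypothetical model id for illustration; the FSDP example may load the
# tokenizer differently (meta-llama checkpoints are also gated on the Hub).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-hf")

long_text = "some very long training document ..."  # longer than the context window

# Option 1: truncate at tokenization time. No sequence ever exceeds
# max_length, so the warning is never emitted (but trailing tokens are lost).
enc = tokenizer(long_text, truncation=True, max_length=2048)

# Option 2: tokenize without a cap, then split into fixed-size blocks.
# The tokenizer still logs the warning once, but it is harmless here:
# the over-long sequence is chunked before it ever reaches the model.
ids = tokenizer(long_text, truncation=False)["input_ids"]
block_size = 2048  # assumed context length; adjust to the model's actual limit
blocks = [ids[i : i + block_size] for i in range(0, len(ids), block_size)]
```

Option 2 is the one most pretraining pipelines use, since truncation silently discards data; if the warning itself is the only concern, Option 1 makes it go away entirely.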


github-actions bot commented Sep 9, 2024

This issue is stale because it has been open for 30 days with no activity.

github-actions bot added the stale label Sep 9, 2024

github-actions bot commented Nov 8, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions bot closed this as completed Nov 8, 2024