
Issue: Loss Discrepancy Between FSDP1 and FSDP2 with AdamW Optimizer #724

Open · Label: question (Further information is requested)

Teng-xu opened this issue Dec 9, 2024 · 1 comment

Teng-xu commented Dec 9, 2024

We observed a loss discrepancy between FSDP1 and FSDP2 while training with the AdamW optimizer. Are you aware of any known issues with the AdamW optimizer and FSDP2 that might contribute to this behavior?
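For context, this is roughly the shape of the FSDP2 + AdamW setup in question. A minimal sketch only: the layer sizes and hyperparameters below are placeholders rather than the actual training config, it assumes a process group and CUDA device are already set up (e.g. via torchrun), and the `fully_shard` import path varies across torch versions:

```python
import torch
from torch.distributed._composable.fsdp import fully_shard  # FSDP2 (import path varies by torch version)

# Placeholder model; the real setup uses a much larger network.
model = torch.nn.Sequential(*[torch.nn.Linear(1024, 1024) for _ in range(4)]).cuda()

# FSDP2 style: shard each block, then the root module.
for layer in model:
    fully_shard(layer)
fully_shard(model)

# The optimizer is created after sharding, so it holds the sharded (DTensor) parameters.
optim = torch.optim.AdamW(model.parameters(), lr=3e-4, weight_decay=0.1, betas=(0.9, 0.95))
```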

awgu (Contributor) commented Dec 9, 2024

We have not seen this issue; you may need to provide more details about your training setup. We previously ran long-running numeric tests comparing FSDP1 and FSDP2 and saw parity.
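A side-by-side check of this kind can be as simple as the sketch below (model size, seeds, step count, and tolerance are all illustrative, and the `fully_shard` import path depends on your torch version). Bitwise equality is not guaranteed between the two implementations, since reduction orders can differ, so the comparison uses a small tolerance:

```python
# Launch with, e.g.: torchrun --nproc_per_node=2 parity_check.py
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP  # FSDP1

try:
    from torch.distributed.fsdp import fully_shard  # FSDP2, newer torch
except ImportError:
    from torch.distributed._composable.fsdp import fully_shard  # older versions

def build_model():
    torch.manual_seed(0)  # identical initialization for both runs
    return torch.nn.Sequential(
        torch.nn.Linear(64, 64),
        torch.nn.ReLU(),
        torch.nn.Linear(64, 8),
    ).cuda()

def train(model, steps=50):
    # The optimizer is constructed after wrapping/sharding in both cases.
    optim = torch.optim.AdamW(model.parameters(), lr=1e-3)
    torch.manual_seed(42)  # identical synthetic data stream for both runs
    losses = []
    for _ in range(steps):
        x = torch.randn(16, 64, device="cuda")
        y = torch.randn(16, 8, device="cuda")
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optim.step()
        optim.zero_grad()
        losses.append(loss.item())
    return losses

dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank())

fsdp1_losses = train(FSDP(build_model(), use_orig_params=True))

fsdp2_model = build_model()
fully_shard(fsdp2_model)
fsdp2_losses = train(fsdp2_model)

for step, (l1, l2) in enumerate(zip(fsdp1_losses, fsdp2_losses)):
    # Bitwise parity is not expected; compare within a small tolerance.
    assert abs(l1 - l2) < 1e-4, f"step {step}: FSDP1 {l1} vs FSDP2 {l2}"

dist.destroy_process_group()
```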

tianyu-l added the question (Further information is requested) label on Dec 10, 2024