
Add whisper masking #146

Merged · 6 commits · Nov 9, 2024
Conversation

@zqhuang211 (Contributor) commented Nov 7, 2024

  • Added masking in whisper encoder to ensure consistency in training and inference.
  • Simplified release_config.yaml to serve as an example configuration.

Zhongqiang Huang added 2 commits November 6, 2024 15:56
@zqhuang211 zqhuang211 requested a review from liPatrick November 7, 2024 00:09
@liPatrick (Contributor) left a comment


Doesn't whisper expect us to pad to 30sec? Do we have any concern about masking the padding here? Other than that the code for the pad masking looks good

@zqhuang211 (Contributor, Author)

Doesn't whisper expect us to pad to 30sec? Do we have any concern about masking the padding here? Other than that the code for the pad masking looks good

Yes, Whisper is trained with padding to 30 seconds, so the way we’ve used it may cause some mismatch. However, it does not seem to degrade end-to-end performance in our (limited) comparative studies.
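The mismatch being discussed can be made concrete with a small sketch. Whisper pads audio to 30 seconds, computes one mel frame per 160 samples (at 16 kHz), and its encoder halves the frame count with a stride-2 convolution; a padding mask over the encoder outputs therefore covers only the frames corresponding to real audio. The function below is a hypothetical illustration of that arithmetic, not the PR's actual implementation:

```python
import numpy as np

def whisper_output_mask(audio_len_samples: int,
                        sample_rate: int = 16000,
                        hop_length: int = 160,
                        conv_stride: int = 2,
                        max_seconds: int = 30) -> np.ndarray:
    """Boolean mask over Whisper encoder output frames (True = real audio).

    One mel frame is produced per `hop_length` samples, and the encoder's
    stride-2 convolution halves the number of frames. Everything after the
    real-audio frames is 30-second padding and gets masked out.
    """
    mel_frames = audio_len_samples // hop_length            # mel frames of real audio
    out_frames = mel_frames // conv_stride                  # encoder output frames
    total_out = (max_seconds * sample_rate // hop_length) // conv_stride
    mask = np.zeros(total_out, dtype=bool)
    mask[:out_frames] = True
    return mask

# 10 s of real audio inside the 30 s padded window:
mask = whisper_output_mask(10 * 16000)
print(mask.sum(), mask.size)  # 500 real frames out of 1500 total
```

Masking attention (or downstream losses) to the first 500 frames keeps training consistent with inference-time inputs of varying length, at the cost of the distribution shift noted above, since Whisper itself was trained attending over the full padded window.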

@zqhuang211 zqhuang211 merged commit 812f58c into main Nov 9, 2024
1 check passed
@farzadab farzadab deleted the zhuang/add_whisper_masking branch December 4, 2024 00:28