silero usage of torch.stft #283

Closed
vmoens opened this issue May 10, 2022 · 5 comments
Labels
bug Something isn't working

Comments

@vmoens (Contributor) commented May 10, 2022

It seems that silero uses the torch.stft function in a way that will no longer be accepted in the next PyTorch release.

Previously, the following usage was permitted:

torch.stft(tensor, num, num, num, num, bool, bool, *other)

but now a string representing the padding mode must be included:

torch.stft(tensor, num, num, num, num, bool, str, bool, *other)
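
For illustration, in the eager-mode Python API all of these parameters can be passed by keyword, which keeps a call independent of the positional ordering (whether that helps an already-serialized TorchScript model is a separate question). A minimal sketch with placeholder values, not silero's actual settings:

import torch

# Placeholder signal and STFT settings, for illustration only
x = torch.randn(16000)          # 1 second of fake mono audio at 16 kHz
n_fft, hop_length = 512, 128
window = torch.hann_window(n_fft)

# Everything after the input is passed by keyword, so inserting pad_mode
# into the positional ordering does not affect this call; pad_mode keeps
# its default value of "reflect".
spec = torch.stft(x,
                  n_fft=n_fft,
                  hop_length=hop_length,
                  win_length=n_fft,
                  window=window,
                  center=True,
                  return_complex=True)
print(spec.shape)  # torch.Size([257, 126]) == (n_fft // 2 + 1, num_frames)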

Link to the old function signature

Link to the new function signature

Link to the broken CI log

PR presumably responsible for the BC-breaking change

Perhaps @peterbell10 can comment on this?

@snakers4, do you think this is something that can be fixed? Since the source code is not shared, it is hard for us to debug this properly.

cc @NicolasHug

@vmoens added the bug (Something isn't working) label on May 10, 2022
@snakers4 (Contributor) commented May 10, 2022

Hi,

Link to the broken CI log

Looks like this is based on a nightly PyTorch build.

Here is how we use torch.stft before converting to TorchScript:

import torch
from torch import nn


class STFT(nn.Module):
    def __init__(self):
        super().__init__()
        # ... (self.n_fft and self.hop_length are set here; details elided)

    def forward(self, x):
        # everything after the input is passed by keyword; pad_mode is not set
        return torch.stft(x,
                          n_fft=self.n_fft,
                          hop_length=self.hop_length,
                          win_length=self.n_fft,
                          return_complex=True,
                          window=torch.hann_window(window_length=self.n_fft,
                                                   dtype=x.dtype,
                                                   device=x.device)).abs()
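
For reference, here is a self-contained sketch of a module shaped like the one above that scripts and runs; the constructor parameters and their values are hypothetical, not silero's:

import torch
from torch import nn


class MinimalSTFT(nn.Module):
    # hypothetical stand-in for the STFT block above; values are placeholders
    def __init__(self, n_fft: int = 512, hop_length: int = 128):
        super().__init__()
        self.n_fft = n_fft
        self.hop_length = hop_length

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        window = torch.hann_window(self.n_fft, dtype=x.dtype, device=x.device)
        return torch.stft(x,
                          n_fft=self.n_fft,
                          hop_length=self.hop_length,
                          win_length=self.n_fft,
                          window=window,
                          return_complex=True).abs()


scripted = torch.jit.script(MinimalSTFT())   # same TorchScript conversion step
print(scripted(torch.randn(16000)).shape)    # (n_fft // 2 + 1, num_frames)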

but now a string representing the padding mode must be included
pad_mode (string, optional) – controls the padding method used when center is True. Default: "reflect"

Is it already set in stone that this breaking change will be merged as is?
Usually PyTorch does not make such changes.

The following files fail:

  • file_path = 'python_code/snakers4_silero-vad_language.py'
  • file_path = 'python_code/snakers4_silero-vad_number.py'
  • file_path = 'python_code/snakers4_silero-models_stt.py'

In any case, snakers4_silero-vad_language.py and python_code/snakers4_silero-vad_number.py can be dropped: no one uses them, and we had already decided to deprecate them.

As for python_code/snakers4_silero-models_stt.py, I can just rebuild and re-upload the latest models if the change makes it into a release.

@vmoens (Contributor, Author) commented May 10, 2022

@snakers4

Is it already set in stone that this breaking change will be merged as is?
Usually PyTorch does not make such changes.

It seems this PR will be reverted, so let's wait a couple of days before acting on anything.

Thanks for your responsiveness!

@snakers4 (Contributor) commented:

ok

@NicolasHug (Member) commented:

Our CI seems to be green now: #225 (https://app.circleci.com/pipelines/github/pytorch/hub/1725/workflows/ec0914a3-1510-4dcd-b3b3-42f1eca174cd/jobs/2166)

Thanks a lot @vmoens for investigating, and @snakers4 for your timely replies!

@snakers4 (Contributor) commented:

Hope that this minor contribution made PyTorch a bit better :)
