Hello!
While checking the PyTorch DenseNet code, I found that in the UpSample class, specifically in the forward method, line 17 is incorrect: https://github.com/ialhashim/DenseDepth/blob/master/PyTorch/model.py#L17
The line of code is:
return self.leakyreluB( self.convB( self.convA( torch.cat([up_x, concat_with], dim=1) ) ) )
But it should be:
return self.leakyreluB( self.convB( self.leakyreluA( self.convA( torch.cat([up_x, concat_with], dim=1) ) ) ))
The first leaky ReLU (leakyreluA) is never applied, but it should be. So, is this a bug, or is it intentional? And if so, why?
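For context, here is a minimal sketch of how the up-sampling block could look with the fix applied. The attribute names (convA, leakyreluA, convB, leakyreluB) come from the linked model.py; the layer parameters and channel sizes below are illustrative assumptions, not copied from the repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UpSample(nn.Module):
    # Decoder block: bilinearly upsample the input to the skip
    # connection's spatial size, concatenate, then apply two 3x3
    # convolutions, each followed by a leaky ReLU.
    def __init__(self, skip_input, output_features):
        super().__init__()
        self.convA = nn.Conv2d(skip_input, output_features,
                               kernel_size=3, stride=1, padding=1)
        self.leakyreluA = nn.LeakyReLU(0.2)
        self.convB = nn.Conv2d(output_features, output_features,
                               kernel_size=3, stride=1, padding=1)
        self.leakyreluB = nn.LeakyReLU(0.2)

    def forward(self, x, concat_with):
        up_x = F.interpolate(x, size=[concat_with.size(2), concat_with.size(3)],
                             mode='bilinear', align_corners=True)
        # Proposed fix: leakyreluA is applied between convA and convB,
        # so both convolutions are followed by an activation.
        return self.leakyreluB(self.convB(self.leakyreluA(self.convA(
            torch.cat([up_x, concat_with], dim=1)))))
```

Without leakyreluA, convA and convB compose into a single linear map (up to the final activation), which removes one nonlinearity from the decoder.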
Thank you, David