Slight difference in code vs paper #2
Hi @sairaamVenkatraman, many thanks for your question. The code includes data-augmentation functionality, but we did not use it for the experiments in the paper. Note that shift_fraction can be set to any value you want before running the code (set it to 0 for no augmentation). I have updated the default value in the code, thanks!
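For reference, a shift_fraction-style augmentation randomly translates each image by up to that fraction of its size (e.g. 2/28 ≈ 0.07 for a 2-pixel shift on 28×28 MNIST images). Below is a minimal NumPy sketch of this idea; the function names `shift_image` and `augment` are illustrative, not the repository's actual code:

```python
import numpy as np

def shift_image(img, dy, dx):
    """Translate a 2-D image by (dy, dx) pixels, zero-filling vacated pixels."""
    out = np.zeros_like(img)
    h, w = img.shape
    # Compute matching source/destination slices for the shifted region.
    src_y = slice(max(0, -dy), h - max(0, dy))
    dst_y = slice(max(0, dy), h - max(0, -dy))
    src_x = slice(max(0, -dx), w - max(0, dx))
    dst_x = slice(max(0, dx), w - max(0, -dx))
    out[dst_y, dst_x] = img[src_y, src_x]
    return out

def augment(img, shift_fraction, rng):
    """Randomly translate img by up to shift_fraction of its height/width.

    With shift_fraction == 0 the image is returned unchanged, matching the
    "set to 0 for no augmentation" behaviour described above.
    """
    if shift_fraction == 0:
        return img
    max_shift = int(round(img.shape[0] * shift_fraction))
    dy = int(rng.integers(-max_shift, max_shift + 1))
    dx = int(rng.integers(-max_shift, max_shift + 1))
    return shift_image(img, dy, dx)
```

With shift_fraction = 2/28, every training image is shifted by at most 2 pixels in each direction, which is the augmentation used in the original CapsNet setup.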
Thanks for the prompt reply. My question was actually based on your submission to the Fashion-MNIST GitHub page, specifically zalandoresearch/fashion-mnist#140. In your paper, however, you mention that no augmentation was used; hence the question.
Thanks again for your reply. That's a mistake on my end; let me confirm the exact parameters and I'll get back to you soon. I am quite sure we used the same parameters (and augmentation techniques) for both the baseline CapsNets and the DCNets, but I'll confirm whether we used data augmentation.
Thank you!
Hi @sairaamVenkatraman, thank you for your message. We have realised that there was a mistake in the ACCV paper. We used exactly the augmentation used in the baseline CapsNet model (as described in the NIPS 2017 paper) to make the results comparable, so all the models, including those for Fashion-MNIST, were trained with the same augmentation.
Thank you!
In your ACCV paper, you mention that you used no augmentation for Fashion-MNIST. However, your code and your comments on the Fashion-MNIST site state that you used random translations of up to 2 pixels and random horizontal flips. Please clarify.