
What is the meaning of 'non-conventional usage' of the backbone in Table 7 of the paper? #4

Open
knaffe opened this issue Jul 6, 2020 · 2 comments

knaffe commented Jul 6, 2020

Thank you for your great work on this repo and paper!
I noticed that ResNet-50 with non-conventional usage gives the best performance in Table 7, and I would like to know how to implement this 'non-conventional usage'.
Does it mean 'discarding the down-sampling operation between stage3 and stage4' in section 3.1 of the paper?
Thanks a lot.

kobiso (Contributor) commented Jul 6, 2020

Thanks for your interest in our paper.
For your question, yes, it means 'discarding the down-sampling operation between stage3 and stage4' in section 3.1 of the paper.
So, you are right :)

knaffe (Author) commented Jul 6, 2020

Thank you for your response!
So 'discarding the down-sampling operation between stage3 and stage4' means changing the stride of the last down-sampling in ResNet layer4 from 2 to 1? That is, last stride = 1?
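Concretely, this is the change I am assuming (a minimal sketch using torchvision's `resnet50`; setting the stride of the first bottleneck block of `layer4` to 1 so the feature map is not halved between stage3 and stage4 — please correct me if your implementation does it differently):

```python
import torchvision

# Standard ResNet-50, then remove the down-sampling between stage3 and stage4
# by forcing the first bottleneck of layer4 to use stride 1 ("last stride = 1").
backbone = torchvision.models.resnet50(pretrained=True)
backbone.layer4[0].conv2.stride = (1, 1)          # 3x3 conv of the first block
backbone.layer4[0].downsample[0].stride = (1, 1)  # 1x1 conv on the shortcut path
```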

These days I have been reimplementing your work in PyTorch, but when evaluating on CUB-200-2011 I only get 62% Recall@1, so I believe I am missing some important details.
A few questions:

  1. Batch sampling: do you shuffle all samples and take 128 of them per batch, or do you use the P-K sampling format (P classes, K samples per class)? A sketch of what I mean by P-K sampling is below this list.
  2. I find that removing the L2 norm and FC after the GD (as mentioned in another issue) gives higher performance, though it still does not reach your reported results on CUB-200. Do you know the reason for this? My current branch layout is also sketched below.
  3. Could you share some training tricks?

Looking forward to your guidance.
Thank you so much!
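For question 1, this is roughly what I mean by P-K sampling (a minimal sketch from my own reimplementation; `PKBatchSampler`, `num_batches`, and the P=32, K=4 values are my assumptions, not taken from the paper or this repo):

```python
import random
from collections import defaultdict

from torch.utils.data import Sampler


class PKBatchSampler(Sampler):
    """Yield batches containing P classes with K samples each (batch size = P * K)."""

    def __init__(self, labels, p, k, num_batches):
        self.p, self.k, self.num_batches = p, k, num_batches
        self.by_class = defaultdict(list)
        for idx, label in enumerate(labels):
            self.by_class[label].append(idx)
        self.classes = list(self.by_class)

    def __len__(self):
        return self.num_batches

    def __iter__(self):
        for _ in range(self.num_batches):
            batch = []
            for c in random.sample(self.classes, self.p):
                pool = self.by_class[c]
                # Sample with replacement when a class has fewer than K images.
                picks = (random.choices(pool, k=self.k) if len(pool) < self.k
                         else random.sample(pool, self.k))
                batch.extend(picks)
            yield batch


# Usage with a DataLoader:
# loader = DataLoader(dataset, batch_sampler=PKBatchSampler(labels, p=32, k=4, num_batches=100))
```

For question 2, this is the descriptor branch I currently have, as I read section 3.1 (again my own sketch; the `feat_dim`/`out_dim` values are placeholders I chose):

```python
import torch.nn as nn
import torch.nn.functional as F


class DescriptorBranch(nn.Module):
    """FC dimensionality reduction followed by L2 normalization after a global descriptor."""

    def __init__(self, feat_dim=2048, out_dim=512):
        super().__init__()
        self.fc = nn.Linear(feat_dim, out_dim)

    def forward(self, gd):  # gd: (batch, feat_dim) pooled global descriptor
        return F.normalize(self.fc(gd), p=2, dim=1)
```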
