
UserWarning: torch.meshgrid #45

Open
the-cat-crying opened this issue Aug 9, 2024 · 7 comments

Comments

@the-cat-crying

UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ..\aten\src\ATen\native\TensorShape.cpp:2157.)
return _VF.meshgrid(tensors, **kwargs) # type: ignore[attr-defined]

../accelerated_features/third_party/alike_wrapper.py line 76
../accelerated_features/third_party/ALIKE/soft_detect.py line 88

@guipotje
Collaborator

Hi @the-cat-crying, this is just a warning because the indexing argument will be required in a future release of PyTorch. You can fix this by passing the argument indexing="ij".
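A minimal sketch of the suggested fix, using hypothetical coordinate tensors in place of the ones in soft_detect.py:

```python
import torch

# Hypothetical 1-D coordinate tensors for illustration.
ys = torch.arange(4)
xs = torch.arange(5)

# Passing indexing="ij" keeps the current (row-major) default behavior
# and silences the UserWarning about the soon-to-be-required argument.
grid_y, grid_x = torch.meshgrid(ys, xs, indexing="ij")

print(grid_y.shape, grid_x.shape)  # torch.Size([4, 5]) torch.Size([4, 5])
```

With indexing="ij" the output is identical to the old default, so the change is warning-only and does not alter results.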

@the-cat-crying
Author

Thanks!

@the-cat-crying
Author

I also ran into two other warnings:

1 ../accelerated_features/third_party/ALIKE/soft_detect.py:152: UserWarning: floordiv is deprecated, and its behavior will change in a future version of pytorch. It currently rounds toward 0 (like the 'trunc' function NOT 'floor'). This results in incorrect rounding for negative values. To keep the current behavior, use torch.div(a, b, rounding_mode='trunc'), or for actual floor division, use torch.div(a, b, rounding_mode='floor').
keypoints_xy_nms = torch.stack([indices_kpt % w, indices_kpt // w], dim=1) # Mx2
2 ../accelerated_features/modules/training/losses.py:105: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
kpts = F.log_softmax(kpts)

It looks to me like the code in these two places triggers the same kind of deprecation issue — please correct me if I'm mistaken.
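Both warnings can be silenced by making the deprecated calls explicit. A minimal sketch with hypothetical tensors standing in for the repository's variables:

```python
import torch
import torch.nn.functional as F

w = 8
indices_kpt = torch.tensor([3, 10, 21])  # hypothetical flat keypoint indices

# Fix 1: replace `indices_kpt // w` with torch.div(..., rounding_mode='floor').
# For non-negative indices the result is identical; the explicit rounding_mode
# removes the floordiv deprecation warning.
keypoints_xy_nms = torch.stack(
    [indices_kpt % w, torch.div(indices_kpt, w, rounding_mode='floor')],
    dim=1,
)  # Mx2

# Fix 2: pass dim explicitly to log_softmax instead of relying on the
# deprecated implicit dimension choice.
kpts = torch.randn(3, 5)  # hypothetical logits
kpts = F.log_softmax(kpts, dim=1)
```

Both changes are behavior-preserving for the shapes used here; they only make the previously implicit choices explicit.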

@the-cat-crying
Author

[screenshot attached]

@XinningC

Hello, I've also encountered "Change the call to include dim=X as an argument." on the line `kpts = F.log_softmax(kpts)`, and I found that when the dim argument is not specified, the function uses dim=1 by default, which does not cause any problems. Hope this helps, and please correct me if I'm wrong.
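This can be checked directly: for a 2-D input, the implicit choice matches dim=1, so adding the explicit argument changes nothing but the warning. A small sketch with a hypothetical tensor:

```python
import warnings

import torch
import torch.nn.functional as F

x = torch.randn(2, 4)  # hypothetical 2-D input

# Call without dim (suppressing the deprecation warning for the comparison)
# and with an explicit dim=1; for 2-D tensors the results are identical.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    implicit = F.log_softmax(x)
explicit = F.log_softmax(x, dim=1)

print(torch.allclose(implicit, explicit))  # True
```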

@XinningC

@the-cat-crying

@the-cat-crying
Author

Thanks @XinningC, you are right!


3 participants