
add get_publishers_info_by_topic #80

Merged · 3 commits · Jan 11, 2024

Conversation

NerdToMars (Contributor)

related issue: #78

m-dahl (Collaborator) commented Jan 7, 2024

Sorry for the delay I was caught up in the holiday season. I think this looks ready to merge once we fix the CI (looks like a type change should do it) and rebase on top of master.

NerdToMars (Contributor, Author) commented Jan 10, 2024

> Sorry for the delay I was caught up in the holiday season. I think this looks ready to merge once we fix the CI (looks like a type change should do it) and rebase on top of master.

Thanks for helping review. I updated the PR, but I still have this error for foxy:
it seems the time points to some address with sec: 7FFFFFFF, nsec: FFFFFFFF:

```
thread 'tokio_testing' panicked at r2r/tests/tokio_testing.rs:152:16:
called `Result::unwrap()` on an `Err` value: JoinError::Panic(Id(3), ...)
from_rmw_time_t rmw_time_t { sec: 2147483647, nsec: 4294967295 }
thread 'tokio-runtime-worker' panicked at r2r/src/qos.rs:829:9:
nsec part of rmw_time_t should be less than 1 billion
```

full error can be found here: https://github.com/NerdToMars/r2r/actions/runs/7463328195/job/20307738761

I have checked the foxy rmw implementation, and it seems the same as the other distros: https://github.com/ros2/rmw/blob/7fa45cb0d86fef00488e707b9ef37914d7ff0369/rmw/include/rmw/types.h#L321

Do you have any suggestions?

m-dahl (Collaborator) commented Jan 10, 2024

Hi,

This commit looks to be from some months after the one you linked: ros2/rmw@703992c

There is a comment added about RMW_DURATION_INFINITE which seems to hint that the previous behaviour (which would be in foxy) was not standardized between rmw implementations. I would guess that the behaviour changed between foxy and galactic, and that in foxy an unspecified time was represented as you found.

We could just add conditional compilation with #[cfg(r2r__ros__distro__foxy)] that checks for the constant and turns it into a zero duration.

But maybe we also need to handle RMW_DURATION_INFINITE in some way?
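A minimal sketch of what that conditional handling could look like. This is a hypothetical standalone helper, not the actual r2r API: the `RmwTime` struct here only mirrors `rmw_time_t` for illustration, and in r2r the sentinel branch would be gated behind `#[cfg(r2r__ros__distro__foxy)]` rather than an unconditional check.

```rust
use std::time::Duration;

// Illustrative mirror of rmw_time_t (not the real binding).
#[derive(Debug, Clone, Copy)]
struct RmwTime {
    sec: u64,
    nsec: u64,
}

// The sentinel Foxy appears to use for an "unspecified" duration:
// { sec: i32::MAX, nsec: u32::MAX }, as seen in the panic above.
const FOXY_UNSPECIFIED: RmwTime = RmwTime {
    sec: i32::MAX as u64,  // 2147483647
    nsec: u32::MAX as u64, // 4294967295
};

fn from_rmw_time(t: RmwTime) -> Duration {
    // On Foxy this branch would be #[cfg(r2r__ros__distro__foxy)]-gated:
    // map the sentinel to a zero duration instead of panicking.
    if t.sec == FOXY_UNSPECIFIED.sec && t.nsec == FOXY_UNSPECIFIED.nsec {
        return Duration::ZERO;
    }
    assert!(
        t.nsec < 1_000_000_000,
        "nsec part of rmw_time_t should be less than 1 billion"
    );
    Duration::new(t.sec, t.nsec as u32)
}
```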

NerdToMars (Contributor, Author) commented Jan 11, 2024

Thanks for helping look into this. I used ros2cli to publish a dummy std_msgs/msg/String topic without explicitly specifying its QoS profile, then used ros2cli to get the verbose info of that topic. I noticed all the duration data in the QoS profile looks like:

```
sec:  2147483651294967295 ( 2,147,483,647,000,000,000 + 4,294,967,295 ) nanosecs
nsec: 2147483651294967295 ( 2,147,483,647,000,000,000 + 4,294,967,295 ) nanosecs
```

which is similar to what someone mentioned in ros2/rmw#210 (comment).
I noticed the Foxy publisher uses #define RMW_QOS_DEADLINE_DEFAULT {0, 0} as the deadline in its default QoS profile, but somehow the subscriber gets a different deadline duration (2147483651294967295). Sadly I did not find the implementation responsible for this behavior.
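The arithmetic in the parentheses can be verified: the value the subscriber reports is exactly the { sec: i32::MAX, nsec: u32::MAX } sentinel from the earlier panic, flattened into a single nanosecond count. A quick check (hypothetical helper name, just to confirm the numbers):

```rust
// Confirm that i32::MAX seconds * 1e9 + u32::MAX nanoseconds equals
// the 2147483651294967295 value observed in the verbose QoS output.
fn foxy_sentinel_as_nanos() -> u64 {
    let sec: u64 = 2_147_483_647; // i32::MAX seconds
    let nsec: u64 = 4_294_967_295; // u32::MAX nanoseconds
    sec * 1_000_000_000 + nsec // fits comfortably in u64
}
```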

m-dahl (Collaborator) commented Jan 11, 2024

Well, at least it fixed CI :) Perhaps the proper fix is to simply drop Foxy support with the next release; I am not sure anyone would care now that it's so old(?). Will merge this now, thanks for your contribution.

@m-dahl m-dahl merged commit b3e4a58 into sequenceplanner:master Jan 11, 2024
5 checks passed