Pointcloud transformation #13
Hello! I am not really sure what you are planning to do with the point cloud - some more details of the application context would be helpful. However, if I understand correctly, you want to express the point cloud (that ORB-SLAM2 produces) in your custom coordinate system. Hence, you should:
Also, it is not clear to me what you mean by
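For reference, a minimal sketch (not part of this repository) of re-expressing a PointCloud2 in another frame with tf2 in Python; the topic and frame names ("/orb_slam2_mono/map_points", "my_world") are placeholders and must match the actual setup:

```python
#!/usr/bin/env python
# Minimal sketch: re-express an incoming PointCloud2 in another frame with tf2.
# Topic and frame names are placeholders, not names used by this repository.
import rospy
import tf2_ros
from sensor_msgs.msg import PointCloud2
from tf2_sensor_msgs.tf2_sensor_msgs import do_transform_cloud

class CloudReframer(object):
    def __init__(self):
        self.buf = tf2_ros.Buffer()
        self.listener = tf2_ros.TransformListener(self.buf)
        self.pub = rospy.Publisher("cloud_in_world", PointCloud2, queue_size=1)
        self.sub = rospy.Subscriber("/orb_slam2_mono/map_points",
                                    PointCloud2, self.callback, queue_size=1)

    def callback(self, cloud):
        try:
            # Look up the transform from the cloud's native frame to the target frame.
            tf = self.buf.lookup_transform("my_world", cloud.header.frame_id,
                                           cloud.header.stamp, rospy.Duration(0.5))
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException) as exc:
            rospy.logwarn("TF lookup failed: %s", exc)
            return
        self.pub.publish(do_transform_cloud(cloud, tf))

if __name__ == "__main__":
    rospy.init_node("cloud_reframer")
    CloudReframer()
    rospy.spin()
```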
Hi @rayvburn ! Thank you for your reply. If I see correctly, you have a Polish first and last name. I am from Poland and it would be easier for me to explain it in Polish; if I am wrong, please forgive me and I will translate it into English. I have a floating (surface) drone with a camera, a LiDAR, and an inertial navigation system (GNSS antennas, an IMU sensor). I want to obtain a point cloud separately for the camera component (using ORB-SLAM) and separately for the LiDAR (using the package dedicated to that device). Both components run under ROS Melodic. I want to transform the coordinates of these points into a real-world system, e.g. WGS'84, in order to carry out further analysis later.
However, I do not fully understand how to determine the "base_link". I understand that it is the attachment point between the robot's frame and, let's say, the world, but it is not a physical point. That is why I had the idea of making the INS the base_link, but that does not work either. And no matter how I change the frames, I get the point cloud from ORB-SLAM in the camera frame (?), and not with real-world coordinates.
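As a side note on base_link: in the ROS convention (REP-105) it is simply a body-fixed reference frame of the robot that you choose yourself, and each sensor is attached to it via a fixed mounting transform. Below is a minimal sketch of broadcasting such a mounting offset with tf2; the "ins" frame name and the numeric offsets are made-up placeholders that should be replaced with values measured on the vehicle:

```python
#!/usr/bin/env python
# Minimal sketch: publish a fixed mounting offset between base_link and the INS.
# base_link can be any body-fixed reference point you pick (e.g. the rotational
# center of the drone); the offsets below are placeholders, not measured values.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped
from tf.transformations import quaternion_from_euler

if __name__ == "__main__":
    rospy.init_node("mounting_tf_publisher")
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"
    t.child_frame_id = "ins"          # placeholder frame name
    t.transform.translation.x = 0.20  # lever arm [m], placeholder values
    t.transform.translation.y = 0.00
    t.transform.translation.z = 0.35
    qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, 0.0)  # mounting rotation
    t.transform.rotation.x = qx
    t.transform.rotation.y = qy
    t.transform.rotation.z = qz
    t.transform.rotation.w = qw

    broadcaster.sendTransform(t)
    rospy.spin()
```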
@MagdaZal this is true, however I will keep my reply in English so English speakers can also benefit from this conversation. So, the main question is: is your camera monocular? My package was designed to help estimate the scale of the point cloud generated by monocular cameras, since cameras of this type do not handle point cloud scale well. Is your LiDAR able to emit multiple vertical lines, like, e.g., the VLP-16? If yes, I think it could give you a far more accurate point cloud than ORB-SLAM2 with a monocular camera. Since you have GNSS (I think we should call it that, rather than INS), you will probably be interested in
Did you try asking a question at https://answers.ros.org/questions/? I think general ROS questions will get a lot more attention on the forum than on GitHub. When it comes to the ORB-SLAM2 package from this repository, did you manage to run any
Yes, I use a monocular camera (GoPro).
So, having a monocular camera, you should also feed my modification of ORB-SLAM2 with odometry data to perform the scale correction. The resulting point cloud should be pretty accurate. Have fun :)
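The exact topic this modified ORB-SLAM2 node reads the odometry from depends on its launch configuration, so the sketch below only illustrates the shape of the data it would need: a nav_msgs/Odometry stream with metric positions (e.g. derived from the INS). The "/odom" topic name and the constant pose are placeholders:

```python
#!/usr/bin/env python
# Minimal sketch: publish nav_msgs/Odometry so a scale-correcting SLAM node can
# consume metric motion estimates. Topic name and pose values are placeholders;
# check this package's launch files for the topic it actually subscribes to.
import rospy
from nav_msgs.msg import Odometry

def publish_odometry(x, y, z, pub):
    msg = Odometry()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "odom"        # metric, world-fixed frame
    msg.child_frame_id = "base_link"
    msg.pose.pose.position.x = x        # position in meters, e.g. from the INS
    msg.pose.pose.position.y = y
    msg.pose.pose.position.z = z
    msg.pose.pose.orientation.w = 1.0   # identity orientation as a placeholder
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("odometry_bridge")
    pub = rospy.Publisher("/odom", Odometry, queue_size=10)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        publish_odometry(0.0, 0.0, 0.0, pub)  # replace with the real INS output
        rate.sleep()
```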
Hi!
I want to add an INS to tf, which will transform the point cloud from ORB-SLAM2 to real-world coordinates. My idea is to set up the transformation so that the INS is at the rotational center of the robot.
My tf tree looks like this: INS -> map -> base_link -> camera_link
My problem: despite the transformation being set, the point cloud is saved in the ORB-SLAM frame, not in the real-world coordinate system of the INS. For my future work I have to get a point cloud with world coordinates in ECEF.
Has anyone ever had this problem, or can anyone give me some advice?
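Since the goal is ECEF output, one possible post-processing step (not part of ORB-SLAM2 or this repository) is to convert points from a local ENU frame anchored at a known GNSS fix into WGS-84 ECEF. A minimal numpy sketch, with a made-up origin that should be replaced by the INS fix of the local frame:

```python
# Minimal sketch: convert points expressed in a local ENU frame anchored at a
# known GNSS origin into ECEF (WGS-84). The origin below is a made-up example.
import numpy as np

A = 6378137.0                # WGS-84 semi-major axis [m]
F = 1.0 / 298.257223563      # WGS-84 flattening
E2 = F * (2.0 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert a geodetic fix (degrees, meters) to an ECEF position."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

def enu_to_ecef(points_enu, lat0_deg, lon0_deg, h0):
    """Rotate/translate Nx3 ENU points into ECEF using the origin's geodetic fix."""
    lat0, lon0 = np.radians(lat0_deg), np.radians(lon0_deg)
    # Columns are the east, north and up unit vectors expressed in ECEF.
    r = np.array([
        [-np.sin(lon0), -np.sin(lat0) * np.cos(lon0), np.cos(lat0) * np.cos(lon0)],
        [ np.cos(lon0), -np.sin(lat0) * np.sin(lon0), np.cos(lat0) * np.sin(lon0)],
        [ 0.0,           np.cos(lat0),                np.sin(lat0)],
    ])
    origin = geodetic_to_ecef(lat0_deg, lon0_deg, h0)
    return points_enu @ r.T + origin

# Example with a made-up origin and two local points (meters, ENU).
cloud_enu = np.array([[1.0, 2.0, 0.5],
                      [3.0, -1.0, 0.2]])
print(enu_to_ecef(cloud_enu, 52.0, 21.0, 100.0))
```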