sensor fusion + ptam + IMU setup issue #21
Hi Benjamin, There are no silly questions. We're always glad when someone is using our framework, and the fastest way to get it running is to ask about what is unclear :-) To your questions:
2a) I would need to know a second axis of the camera to answer this, since I do not know how the camera is rotated in yaw about its own z axis. To determine the camera x and y axes you can run PTAM and display the GUI (where you see the 3D point cloud). This will confirm your camera z axis and also show you which axes are x and y. Then you can compute the transform to your IMU.
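As a concrete sketch of that last step (with hypothetical axis readings, not your actual mounting): once the GUI has told you how the camera axes are oriented relative to the IMU, you can stack them as columns of a rotation matrix and convert it to the quaternion for the yaml file. The direction convention (camera-to-IMU vs. IMU-to-camera) should still be checked against the framework's documentation; for the 180-degree example below the rotation is its own inverse, so the two coincide.

```python
import math

# Hypothetical reading from the PTAM GUI (NOT your actual mounting): camera x
# along IMU +x, camera y along IMU -y, camera z along IMU -z (down-looking).
# The columns of R are the camera axes expressed in IMU coordinates.
R_cam_in_imu = [[1,  0,  0],
                [0, -1,  0],
                [0,  0, -1]]

def mat2quat(R):
    """Rotation matrix -> quaternion (w, x, y, z); handles 180-degree cases."""
    t = R[0][0] + R[1][1] + R[2][2]
    if t > 0:
        s = math.sqrt(t + 1.0) * 2.0
        return (0.25 * s,
                (R[2][1] - R[1][2]) / s,
                (R[0][2] - R[2][0]) / s,
                (R[1][0] - R[0][1]) / s)
    if R[0][0] > R[1][1] and R[0][0] > R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2.0
        return ((R[2][1] - R[1][2]) / s, 0.25 * s,
                (R[0][1] + R[1][0]) / s, (R[0][2] + R[2][0]) / s)
    if R[1][1] > R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2.0
        return ((R[0][2] - R[2][0]) / s, (R[0][1] + R[1][0]) / s,
                0.25 * s, (R[1][2] + R[2][1]) / s)
    s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2.0
    return ((R[1][0] - R[0][1]) / s, (R[0][2] + R[2][0]) / s,
            (R[1][2] + R[2][1]) / s, 0.25 * s)

q = mat2quat(R_cam_in_imu)  # (0.0, 1.0, 0.0, 0.0): 180 degrees about x
```

Note that the naive `w = 0.5 * sqrt(1 + trace)` formula fails exactly at 180-degree rotations like this one (w = 0), which is why the branching version is used.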
So you see, although our framework allows a front-looking camera setup, at the moment it is not trivial to initialize: the difficulty is finding the initial rotation between the PTAM map and the gravity-aligned world frame. If you are motivated I can guide you through how this could be done, so that we could contribute it to the code once it is robust. As a first step I would suggest you get the framework running properly with the camera looking down. This reduces error sources at the beginning. I hope this helps. From: Benjaminmar8 [[email protected]] Hi, All, Apologies if my questions seem silly or have been answered in other threads. I am trying to use sensor fusion + PTAM on my quadrotor (not AscTec) with a front-facing camera. There are related questions here, #20 and #8, asked by other users. However, I still can't get it to work as there seems to be no consolidated answer. Here are my questions:
Thanks a lot for the help!
Ben, Good to hear that the setup is working'ish in the down-looking configuration. Fuzzy tracking should never occur; it means something with the initialization or with the map tracking is seriously wrong. If you verify that the PTAM map is initialized gravity-aligned'ish before initializing the filter (down-looking config, q_vw initialized to the unit quaternion), then fuzzy tracking should not occur (unless the cam-IMU init is wrong, but based on your image q_ci(w,x,y,z)=(0,-1,0,0) should be correct). If you know that you are 1.1 m above ground, you can also use "set height" to start the filter; this will automatically compute a correct scale initialization. To verify correct functioning you can use "rosrun ssf_core plot relevant", which plots
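For intuition, here is a minimal sketch of what a height-based scale initialization computes. This is my reading of the convention, not the framework's actual code: it assumes the vision position is the metric position multiplied by the scale lambda, and that the PTAM map is gravity-aligned so that the vision z reading corresponds to the known metric height.

```python
# Assumed convention: p_vision = lambda * p_metric along the gravity axis.
# Then a known metric height fixes the initial scale directly.
def init_scale_from_height(z_vision, known_height_m):
    """Initial scale lambda such that z_vision / lambda == known_height_m."""
    return z_vision / known_height_m

# e.g. PTAM reports z = 0.55 map units while the camera is 1.1 m above ground:
lam = init_scale_from_height(0.55, 1.1)  # -> 0.5
```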
Once you manage to have it working robustly with the down-looking setup, we can proceed to the front-looking one. Best From: Benjaminmar8 [[email protected]] Hi, Stephan, I have tried the down-looking camera. The result "seems" better, though I still have a few things unclear. One obvious improvement is that I always had the "fuzzy tracking" warning with the front-facing camera configuration. I know this is due to a setup issue and the filter is just doing forward integration. With the down-looking camera, I rarely get this warning except once or twice at start-up. My setup is as follows: I fix the down-looking camera at about 1.1 m height on a movable table. The IMU is fixed on the same table with a bit of translation. I have tried my best to align the axes of the camera and IMU; the picture below shows the setup: [config](https://f.cloud.github.com/assets/5643170/1952598/dff8808a-81ab-11e3-945c-f2740f9835ed.png) Since the IMU's z axis points up and the camera's z axis points down, I assume that there is a 180-degree rotation around the camera's x axis. So in the yaml file, I change q_ci to [0, -1, 0, 0]. Is this correct? Is there any way I can verify the configuration and results? When I push the table towards the IMU's +x axis, shall I expect an increase or decrease in the filter's x reading (ssf_core/pose/x)? Thanks Ben
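For what it's worth, the q_ci = (0, -1, 0, 0) value can be sanity-checked numerically: a unit quaternion with w = 0 and only an x component is a 180-degree rotation about x, which keeps x and flips y and z, matching "x axes aligned, IMU z up, camera z down". A minimal pure-Python check (the sign of the quaternion does not matter, since q and -q encode the same rotation):

```python
def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # Standard rotation matrix built from the quaternion, applied to v.
    R = [[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
         [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
         [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]]
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

q_ci = (0.0, -1.0, 0.0, 0.0)
print(quat_rotate(q_ci, [1, 0, 0]))  # x is preserved
print(quat_rotate(q_ci, [0, 0, 1]))  # z is flipped to -z
```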
Ben, to 1) this might be because the scale is initialized too far off the true value; thus I recommend using only the set-height initialization option. to 2) When you say the scale estimate keeps on changing, please verify this in the corresponding plot. b) unlikely, but as a sanity check: check if PTAM drifts that fast in z. To do so, plot the 3D map points either in RViz using the ptam_visualizer or in the native PTAM GUI in map view. The point cloud should always lie in the xy plane of the PTAM map (z=0). c) check if the extrinsics are correct: "rosrun ssf_core plot_calib" plots the cam-IMU extrinsics. Verify that they are correct. If they are not, their initialization was way off the true value. If they "seem" to be correct, augment the noise on p_ic and q_ic separately (e.g. to 0.005) and apply rotational movement (careful not to corrupt/lose the map while doing the motion!). If the states change, your initialization was wrong. Hope that helps. Btw, to let our readers know where our framework is and can be used, and if you are allowed, may I ask your affiliation? And do you have a link to the project in which you plan to use the framework? From: Benjaminmar8 [[email protected]] Stephan, I obtained some results in the down-looking camera configuration. Here is the plot
Thanks.
Hello,
Hi, All,
Apologies if my questions seem silly or have been answered in other threads. I am trying to use sensor fusion + PTAM on my quadrotor (not AscTec) with a front-facing camera. There are related questions here, #20 and #8, asked by other users. However, I still can't get it to work as there seems to be no consolidated answer. Here are my questions:
What is the coordinate frame used by the camera? I assume that the z axis points out from the camera lens, and then use the right-hand rule to deduce the x and y axes. Is this correct?
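The right-hand-rule deduction in the question above can be checked mechanically: a triad (x, y, z) is right-handed exactly when x × y = z. A tiny sketch (the example axes are a common computer-vision convention, used here as an assumption, not necessarily this camera's actual frame):

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Common vision convention: z out of the lens, x to the right, y down.
x, y, z = (1, 0, 0), (0, 1, 0), (0, 0, 1)
assert cross(x, y) == z  # right-handed: x cross y gives z
```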
For the IMU, I am using a MicroStrain type, which uses the NED frame. I edited the ROS driver code so that it outputs /imu/data in the ENU frame. Assuming the IMU's x axis points to the front (same direction as the front-facing camera) and its z axis points up out of the quadrotor's body, what is the correct transformation between camera and IMU in the yaml file?
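The NED-to-ENU remap mentioned in the question above is, in the usual convention (an assumption about the driver's axes): ENU x = NED east, ENU y = NED north, ENU z = -NED down. A minimal sketch for the vector channels, with a static-gravity sanity check; note that sign conventions for accelerometer readings vary between drivers:

```python
# Remap a 3-vector (accel, gyro, ...) from North-East-Down to East-North-Up.
def ned_to_enu(v):
    n, e, d = v
    return (e, n, -d)

# Sanity check: a static accelerometer reporting specific force reads roughly
# (0, 0, -9.81) in NED (down axis); after the remap it reads (0, 0, +9.81).
print(ned_to_enu((0.0, 0.0, -9.81)))  # (0.0, 0.0, 9.81)
```

Remapping only the vectors is not enough: the orientation quaternion (and any covariances) in the message need the corresponding frame change as well.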
Once PTAM is initialized, PTAM's camera frame seems arbitrarily oriented with respect to the world frame. How can I make my camera frame gravity-aligned, assuming my quadrotor rarely rolls/pitches a lot? Or should the world frame be gravity-aligned?
For the scale, /lambda, my understanding is that P_metric * lambda = P_sensor, where P_metric is the position from /ssf_core/pose and P_sensor is the position from /vslam/pose. Is this the correct interpretation?
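The interpretation in the question above can be written as a one-liner, assuming that convention (P_sensor = lambda * P_metric) holds in this setup:

```python
# Assumed convention: the (unitless) vision position is the metric filter
# position scaled by the estimated lambda, component-wise.
def vision_from_metric(p_metric, lam):
    return tuple(lam * c for c in p_metric)

# e.g. with lambda = 0.5, a metric position of (2.0, 4.0, 1.1) m corresponds
# to a /vslam/pose position of (1.0, 2.0, 0.55) map units.
p_vision = vision_from_metric((2.0, 4.0, 1.1), 0.5)
```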
My goal is to use this setup to navigate a quadrotor in a small indoor environment. The obvious input to use is /ssf_core/pose, as it gives the quadrotor's position in metric form. I notice that there is no frame id for /ssf_core/pose; is it specified in the world frame used by PTAM? If so, how can I use it for control if the world frame is not gravity-aligned?
Thanks a lot for the help!