
Coordinate system confusion #20

Open
crazyboyyy opened this issue Nov 29, 2013 · 13 comments
@crazyboyyy
Hi Stephan,

I'm a graduate student from China working on autonomous quadrotors. Your published work and software packages are really helpful and are guiding our research.

Recently we ran into some confusion about the coordinate system used in the ethzasl_sensor_fusion package. According to the user manual provided by AscTec (http://www.asctec.de/downloads/manuals/AscTec_SDK2.0_Manual_V1.1.pdf), an NED coordinate system is used. However, in the dataset.bag posted on the ROS wiki, the acc-z reading is about +9.8 when the replay starts, which looks like ENU to us. We have tried both, making our homemade flight control board send raw IMU readings in each coordinate system: with ENU (acc-z ~ +9.8), after initialization there is always a "fuzzy tracking" warning, and sometimes a "negative scale detected" warning. The NED version (acc-z ~ -9.8), on the other hand, seems to run without any warnings (using ethzasl_ptam as the update sensor).
Has AscTec's coordinate convention changed?

In your thesis, the body-fixed IMU coordinate frame does not look like a right-handed Cartesian frame; is that the old convention used by AscTec? Moreover, except for the world coordinate frame, the other frames all appear to be plotted left-handed. Can you give us a hint about this?
Really looking forward to your reply. Thanks a lot!

Zhang Xu
from Tianjin University, CHINA

@markusachtelik
Contributor

Hi,
We followed the ROS convention, which is ENU, and we transform accordingly on the HLP. Everything that enters or leaves the ROS network follows this convention. The only exception is some plots: http://wiki.ros.org/asctec_hl_interface/Tutorials/hlp%20position%20control#Sanity_Checks
Hope this helps!

Markus


@stephanweiss
Contributor

Hi,

Thank you for using our framework - I am happy to hear that it helps in your research.

Markus' answer should cover your first question.
Concerning my thesis: all coordinate systems are right-handed. The 2D figures of the 3D coordinate axes are a bit mind-twisting - you can also think of some of the arrows as pointing "into the paper" instead of "out of the paper". That way you end up with right-handed systems.
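
One quick way to check handedness numerically (a sketch using numpy; `is_right_handed` is just an illustrative helper, not part of the framework):

```python
import numpy as np

def is_right_handed(x, y, z):
    # A frame is right-handed iff the matrix whose columns are its
    # x, y, z axes has positive determinant (equivalently, x cross y = z).
    R = np.column_stack([x, y, z]).astype(float)
    return float(np.linalg.det(R)) > 0.0

# ENU world frame (east, north, up): right-handed.
# Flipping only one axis, e.g. z, would make it left-handed.
# NED (north, east, down), expressed in ENU coordinates as
# x=[0,1,0], y=[1,0,0], z=[0,0,-1], also passes the test:
# handedness depends only on the relative orientation of the axes.
```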

Let us know if we can help with any further issues.

Best
Stephan



@caomeihui

Dear Sir,
I'm a graduate student from China working on autonomous quadrotors. Your published work and software packages are really helpful and are guiding our research.

But we still have some difficulties with the coordinate systems. First, the vslam coordinate system going into ROS is ENU, but the direction of x/y depends on the manual initialization procedure. Is that right?

Second, we do not have the Pelican quadrotor, so we use our own IMU with an NED coordinate system. What should we do to convert our IMU coordinates for ROS?

Meihui Cao,

from Tianjin University, CHINA

@crazyboyyy
Author

[image]
here is our rqt_graph

We don't have a Pelican, so there is no High-Level Processor.

@crazyboyyy
Author

We set up our IMU in the ENU coordinate frame, and we got
"fuzzy tracking triggered: 0.800376 limit: 0.1" from time to time;
the position from /ssf_core/pose is negative in z, and x/y/z are very unstable.

rostopic echo /fcu/raw/imu

frame_id: fcu
orientation:
  x: 0.0
  y: 0.0
  z: 0.0
  w: 0.0
orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
  x: -0.007
  y: 0.01
  z: 0.0
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration:
  x: 0.588399
  y: 0.83356525
  z: 9.73800345
linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

rostopic echo /vslam/pose

we got:
pose:
  pose:
    position:
      x: -0.73646774671
      y: -0.101617856533
      z: 2.25872944874
    orientation:
      x: 0.730405635777
      y: 0.482996077233
      z: 0.45000977258
      w: 0.175252963412

roslaunch ssf_updates pose_sensor.launch
and after “init_filter”
we got something like this from time to time:
“fuzzy tracking triggered: 0.800376 limit: 0.1”

rostopic echo /ssf_core/pose

we got:
pose:
  pose:
    position:
      x: 1.2232099864
      y: 1.72093252308
      z: -6.45702323286
    orientation:
      x: -0.561567164981
      y: -0.19226044876
      z: 0.198160987415
      w: 0.780006706461

@crazyboyyy
Author

And then we set up our IMU in the NED coordinate frame, and we got no more warnings:
rostopic echo /fcu/raw/imu

frame_id: fcu
orientation:
  x: 0.0
  y: 0.0
  z: 0.0
  w: 0.0
orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity:
  x: 0.003
  y: 0.007
  z: 0.0
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration:
  x: -0.5687857
  y: 0.8041453
  z: -9.77723005

rostopic echo /vslam/pose

we got:

frame_id: usb_cam
pose:
  pose:
    position:
      x: 0.150389205428
      y: 0.14178428208
      z: 0.4946300307
    orientation:
      x: 0.746930367688
      y: 0.662400753243
      z: 0.0559993724798
      w: 0.0135771208761

roslaunch ssf_updates pose_sensor.launch
and after “init_filter”

rostopic echo /ssf_core/pose

we got:

frame_id: ''
pose:
  pose:
    position:
      x: -0.0397306194602
      y: -0.000488360865444
      z: 0.187922269638
    orientation:
      x: -0.719282159649
      y: -0.694373595154
      z: -0.00474925118833
      w: -0.0213525121906

@crazyboyyy
Author

Hi Markus,

We don't have an HLP on board, so we tried feeding our own /fcu/raw/imu directly to ssf_core together with /vslam/pose.

If /fcu/raw/imu is in ENU (positive z-acc), we get warnings and unstable position readings.

But in NED (negative z-acc), everything seems to be OK, which contradicts the dataset.bag.

Please help me out!

Thanks a lot

@simonlynen
Contributor

Hi @crazyboyyy,

OK, let's try to fix this.

As you also mentioned, the dataset is expressed in an ENU frame where gravity is measured as 9.81 m/s^2. Does the example bag file work fine for you when you use the setup described in the tutorial?

Looking at the numbers you provided, however, there are several things wrong in your setup which we should fix before even looking into the dataset.

  • To which values do you set the camera-to-IMU calibration, i.e. what is the orientation of the camera frame of reference w.r.t. the IMU frame of reference, expressed in the IMU frame of reference?

I am asking because you write that your measured orientation is q = [0.1753 0.7304 0.4830 0.4500] (Hamilton quaternion notation, (w, x, y, z)), which is a rotation around all three axes, while the corresponding IMU measurement says you are gravity aligned. I am just wondering whether you have the correct IMU-camera calibration in the parameter file.
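
This consistency check can be sketched as follows (illustrative only; `is_gravity_aligned` is a hypothetical helper, not part of ssf_core): express the world vertical in the body frame using the reported orientation and compare it with the direction of the measured specific force.

```python
import numpy as np

def rotmat(q):
    # Rotation matrix of a unit Hamilton quaternion q = (w, x, y, z).
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def is_gravity_aligned(q_world_body, accel_body, tol_deg=15.0):
    # Hypothetical sanity check: the world vertical expressed in the
    # body frame (R^T * e_z) should roughly match the measured
    # specific-force direction when the vehicle is at rest.
    predicted = rotmat(q_world_body).T @ np.array([0.0, 0.0, 1.0])
    measured = np.asarray(accel_body, dtype=float)
    measured = measured / np.linalg.norm(measured)
    angle = np.degrees(np.arccos(np.clip(predicted @ measured, -1.0, 1.0)))
    return angle < tol_deg
```

With the numbers above, the identity orientation agrees with an acc-z of ~+9.8, while the reported PTAM quaternion does not, which is what points at the camera-to-IMU calibration.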

@simonlynen
Contributor

A note on fuzzy tracking:

Given that the vision measurement might not always be expressed in a frame of reference which is gravity aligned, we estimate the rotation between the frame of reference of the vision measurements and the world frame of reference. This rotation estimate may change slowly over time, while large changes in it are usually a sign of a failure in the visual SLAM module. We therefore watch the rate of change of this estimate and trigger a warning message when it exceeds 0.1 rad per update. We then drop the EKF update and do pure IMU dead-reckoning (forward integration).

When, however, the camera-to-IMU calibration is wrong, the two frames of reference do not match initially and large updates of the relative orientation estimate are required. This triggers fuzzy tracking, so the warning is commonly a sign of a problem in the setup or initialization of the filter.
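
The rate-of-change check described above can be sketched roughly like this (an illustration of the idea, not the actual ssf_core code):

```python
import numpy as np

FUZZY_LIMIT = 0.1  # rad per update, as in the warning message

def quat_angle(q1, q2):
    # Smallest rotation angle between two unit Hamilton quaternions (w, x, y, z).
    d = abs(float(np.dot(q1, q2)))
    return 2.0 * float(np.arccos(min(d, 1.0)))

def is_fuzzy(q_prev, q_new, limit=FUZZY_LIMIT):
    # True -> drop this update and fall back to pure IMU dead-reckoning.
    return quat_angle(np.asarray(q_prev, float), np.asarray(q_new, float)) > limit
```

Here `q_prev` and `q_new` stand for successive estimates of the vision-to-world rotation; a miscalibrated camera-to-IMU rotation forces large jumps in that estimate and trips the limit.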

@simonlynen
Contributor

@crazyboyyy
Also, we use Hamilton notation for quaternions; make sure your measurements are also expressed in Hamilton notation (which is also used by ROS and Eigen).
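
A quick way to see which convention a quaternion implementation uses (a sketch): in Hamilton notation the product satisfies i*j = +k, whereas JPL notation gives i*j = -k. Note also that the geometry_msgs/Quaternion message stores its fields in (x, y, z, w) order, even when the math is written as (w, x, y, z).

```python
def quat_mul(p, q):
    # Hamilton product of quaternions given as (w, x, y, z) tuples.
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
# Hamilton convention: i * j = +k, j * i = -k, i * i = -1.
```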

@simonlynen
Contributor

Hi @caomeihui ,

"But we still have some difficulties with the coordinate systems. First, the vslam coordinate system going into ROS is ENU, but the direction of x/y depends on the manual initialization procedure. Is that right?"

Do you mean the yaw rotation? The initial yaw estimate should come from the update sensor, e.g. a SLAM system or Vicon. If you use GPS, you have to initialize it using dynamic_reconfigure.

"Second, we do not have the Pelican quadrotor, so we use our own IMU with an NED coordinate system. What should we do to convert our IMU coordinates for ROS?"

What exactly is your question: how to convert the data from your IMU for use in the Pelican flight controller, or how to use your IMU data in the EKF? If you mean the latter, you just apply a rotation matrix to it in the node that publishes the data.
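
For illustration, assuming the standard NED-to-ENU convention (swap x and y, negate z, which is a proper rotation with determinant +1), such a conversion in the publishing node might look like:

```python
import numpy as np

# NED (north, east, down) -> ENU (east, north, up): swap x/y, negate z.
R_NED_TO_ENU = np.array([[0.0, 1.0, 0.0],
                         [1.0, 0.0, 0.0],
                         [0.0, 0.0, -1.0]])

def ned_to_enu(v):
    # Apply to both linear_acceleration and angular_velocity before
    # filling in the sensor_msgs/Imu message.
    return R_NED_TO_ENU @ np.asarray(v, dtype=float)

# At rest, the specific force then reads -9.81 on the NED z axis
# and +9.81 on the ENU z axis, matching the dataset.bag values.
```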

@caomeihui

Thank you for your reply; it has been very helpful to our research. Over the past few days, under your guidance, we have gained a new understanding of the coordinate systems.

@crazyboyyy
Author

We are deeply grateful to all of you for your support! The sensor_fusion package now works perfectly with PTAM in our hand-held tests. Only one problem remains: the high level of vibration of our quadrotor seems to interfere with the fusion process, and we are now working on balancing everything.
