Issue: IMU frame doesn't match autopilot frame

Hello all,
I’m a beginner with Pixhawk and PX4, and I’m confused about the orientation of the body frame, which does not seem to match my acceleration measurements. The PX4 documentation states that the body frame is x forward, y right, z down; however, when I displayed the acceleration measurements from the uORB topic sensor_bias, the IMU frame appeared to be x backward, y right, z up.

I’m just wondering if the measurements I’m getting are normal, i.e. the IMU frame is simply different from the body frame, or if there is an issue with my configuration.

Any help with this issue will be very much appreciated.

Best Regards,

Mohamed,

sensor_bias is the offset that the estimator guesses/learns over time. It usually contains small values that correct for calibration offsets, so I don’t think sensor_bias would tell you anything about the frame.
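If you want to double-check what is actually published on that topic, you can also poll it in code. This is only a rough sketch, assuming the older sensor_bias message with accel_x/accel_y/accel_z fields; the topic definition and header paths differ between PX4 versions, so check msg/sensor_bias.msg in your tree:

#include <px4_log.h>                    // <px4_platform_common/log.h> on newer trees
#include <uORB/uORB.h>
#include <uORB/topics/sensor_bias.h>    // verify this topic exists in your PX4 version

// Copy and print the latest sensor_bias sample once.
// Field names follow the older sensor_bias.msg definition; they were
// renamed in later PX4 versions.
void print_sensor_bias()
{
	int sub = orb_subscribe(ORB_ID(sensor_bias));
	sensor_bias_s bias{};

	if (orb_copy(ORB_ID(sensor_bias), sub, &bias) == 0) {
		PX4_INFO("sensor_bias accel fields: %.3f %.3f %.3f",
			 (double)bias.accel_x, (double)bias.accel_y, (double)bias.accel_z);
	}

	orb_unsubscribe(sub);
}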

If the attitude shown in QGC is wrong and does not match the real attitude, then something is wrong; maybe the Pixhawk orientation is not set correctly. You can set it before the sensor calibration in QGC.
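If you would rather check the configured orientation from code than in QGC, here is a minimal sketch that reads the SENS_BOARD_ROT parameter; it assumes the standard parameter API, and the header path depends on your PX4 version:

#include <px4_log.h>
#include <systemlib/param/param.h>   // <parameters/param.h> on newer trees

// Read the board-rotation parameter that tells PX4 how the autopilot is
// mounted relative to the vehicle body frame.
void print_board_rotation()
{
	param_t handle = param_find("SENS_BOARD_ROT");

	if (handle == PARAM_INVALID) {
		PX4_WARN("SENS_BOARD_ROT not found");
		return;
	}

	int32_t rotation = 0;
	param_get(handle, &rotation);
	PX4_INFO("SENS_BOARD_ROT = %d (0 means no rotation)", (int)rotation);
}

From the MAVLink shell you can also just run param show SENS_BOARD_ROT to see the current value.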

Dear Julian,

Thanks for your quick reply. I got this wrong conclusion from the "landing_target_estimator" module (see code below), where this topic is used to update the landing target state. If this topic is the wrong one, could you please point me to the appropriate topic for reading acceleration measurements?

Thanks in advance


/* predict */
if (_estimator_initialized) {
	if (hrt_absolute_time() - _last_update > landing_target_estimator_TIMEOUT_US) {
		PX4_WARN("Timeout");
		_estimator_initialized = false;

	} else {
		float dt = (hrt_absolute_time() - _last_predict) / SEC2USEC;

		// predict target position with the help of accel data
		matrix::Vector3f a;

		if (_vehicleAttitude_valid && _sensorBias_valid) {
			matrix::Quaternion<float> q_att(&_vehicleAttitude.q[0]);
			_R_att = matrix::Dcm<float>(q_att);
			a(0) = _sensorBias.accel_x;
			a(1) = _sensorBias.accel_y;
			a(2) = _sensorBias.accel_z;
			a = _R_att * a;

		} else {
			a.zero();
		}

		_kalman_filter_x.predict(dt, -a(0), _params.acc_unc);
		_kalman_filter_y.predict(dt, -a(1), _params.acc_unc);

		_last_predict = hrt_absolute_time();
	}
}

To look at accel measurements, you can look at the HIGHRES_IMU message, which is sent over MAVLink to e.g. QGC. If you can’t see it via telemetry, you can try to connect directly via USB cable.

Alternatively, you can look at the topics sensor_accel or sensor_combined in the MAVLink shell (either in QGC or using mavlink_shell.py). The command is listener sensor_accel or listener sensor_combined.
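If you want to read the acceleration from inside a module instead, a rough sketch that waits for one sensor_combined sample and prints it could look like this; it assumes the accelerometer_m_s2[3] field and the include paths of older PX4 trees, so adapt it to your version:

#include <px4_log.h>
#include <px4_posix.h>                  // px4_poll(); path differs on newer trees
#include <uORB/uORB.h>
#include <uORB/topics/sensor_combined.h>

// Block until one accel sample arrives and print it in the body frame
// (FRD: x forward, y right, z down). At rest and level you should see
// roughly (0, 0, -9.81) m/s^2, since the accelerometer measures specific force.
void print_accel_sample()
{
	int sub = orb_subscribe(ORB_ID(sensor_combined));
	px4_pollfd_struct_t fds{};
	fds.fd = sub;
	fds.events = POLLIN;

	if (px4_poll(&fds, 1, 1000) > 0 && (fds.revents & POLLIN)) {
		sensor_combined_s data{};
		orb_copy(ORB_ID(sensor_combined), sub, &data);
		PX4_INFO("accel [m/s^2]: %.2f %.2f %.2f",
			 (double)data.accelerometer_m_s2[0],
			 (double)data.accelerometer_m_s2[1],
			 (double)data.accelerometer_m_s2[2]);
	}

	orb_unsubscribe(sub);
}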

Or you could arm and therefore trigger a log to be recorded on the SD card. You can then upload it to https://logs.px4.io and check out “Raw Acceleration”.