How is MAV_FRAME_LOCAL_NED interpreted, and how do I use it for precision landing?

I am currently attempting MAVLink-assisted precision landing with an ArUco marker, following https://docs.px4.io/main/en/advanced_features/precland.html#offboard-positioning. I’m programming in Python using MAVSDK. However, it’s unclear to me how to provide the LANDING_TARGET message.

This page says at the bottom that PX4 only supports MAV_FRAME_LOCAL_NED as the MAV_FRAME. So the landing target messages have to contain the position of the marker relative to the home position, in metres?
This sounds very strange to me. In that case, I would only need to measure the offset between the marker and the start position once; in fact, I wouldn’t even need a marker. Not to mention, everything would come down to GPS accuracy.

So, what am I misunderstanding here? I haven’t found any example or documentation that makes MAVSDK’s handling of coordinate systems clear to me.

Also, the article on offboard landing says that only x, y and z are used. So you can’t make the drone reach a correct yaw by providing a quaternion in the extended (v2) fields of the message?


I haven’t used precision landing before, so unfortunately I can’t help you with most of your questions. But regarding the landing target being in metres: the unit is metres, but since the fields are floats you can use decimals, so you can still express the position with very high precision.

The local co-ordinate system is NED, a North-East-Down co-ordinate system fixed to the earth at the “local home position”. This is the position that the vehicle uses as its reference, so if you read LOCAL_POSITION_NED it is relative to this position (for example, a point 5 m north of home, 2 m east of it and 10 m above it would be x = 5, y = 2, z = -10).

My “guess” is that the LANDING_TARGET is provided by whatever knows the position of the target in the local co-ordinate system of the vehicle.

So for example say you had a companion computer set up with a vision system capable of seeing the marker. The companion computer knows the current vehicle position in the local frame from LOCAL_POSITION_NED and can calculate the relative position of the marker to determine its position in the local frame. This would then be put in the LANDING_TARGET and sent to the flight stack.
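
In pymavlink terms I imagine it would look roughly like the sketch below (the connection string is a placeholder, and I’m assuming the marker offset has already been rotated from the camera frame into NED):

import time
from pymavlink import mavutil

# Placeholder link; adjust to your setup (serial, udpout, etc.)
master = mavutil.mavlink_connection('udpin:0.0.0.0:14540',
                                    source_system=1,
                                    source_component=mavutil.mavlink.MAV_COMP_ID_ONBOARD_COMPUTER)
master.wait_heartbeat()

# Current vehicle position relative to the local home position (NED, metres)
pos = master.recv_match(type='LOCAL_POSITION_NED', blocking=True)

# Assumed: marker offset from the vision system, already rotated into NED (metres)
offset_north, offset_east, offset_down = 1.0, 0.5, 8.0

# Marker position in the same local NED frame as LOCAL_POSITION_NED
target_n = pos.x + offset_north
target_e = pos.y + offset_east
target_d = pos.z + offset_down

master.mav.landing_target_send(
    int(time.time() * 1e6),                  # time_usec
    0,                                        # target_num
    mavutil.mavlink.MAV_FRAME_LOCAL_NED,      # frame
    0, 0,                                     # angle_x, angle_y (unused on this path)
    0, 0, 0,                                  # distance, size_x, size_y
    target_n, target_e, target_d,             # x, y, z in local NED (metres)
    [1, 0, 0, 0],                             # q (unused)
    mavutil.mavlink.LANDING_TARGET_TYPE_VISION_FIDUCIAL,
    1)                                        # position_valid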

@dakejahl Is that about right?

@blank-supportgis you’re not crazy, the “Landing Target Protocol” specification is not very good. The requirement for a local-frame coordinate system is stupid and unnecessary. I’ve worked around this by using the “sitl irlock hack codepath”. The only values that are used are angle_x and angle_y,

and then from your Linux side you just need to publish the landing_target message like this:

	// msg is assumed to be the vision pipeline output (e.g. a ROS PoseArray),
	// with the target pose expressed in the camera frame (x right, y down, z forward)
	float x = msg->poses[0].position.x;
	float y = msg->poses[0].position.y;
	float z = msg->poses[0].position.z;

	// Convert to unit vector
	float r = sqrtf(x*x + y*y + z*z);
	x = x / r;
	y = y / r;
	z = z / r;

	// Angular offsets expressed as tangents, following the SITL irlock plugin:
	// https://github.com/PX4/PX4-SITL_gazebo/blob/master/src/gazebo_irlock_plugin.cpp
	float tan_theta_x = x / z;
	float tan_theta_y = y / z;

	mavlink_message_t message;
	mavlink_landing_target_t landing_target = {};

	// X-axis / Y-axis angular offset of the target from the center of the image
	landing_target.angle_x = tan_theta_x;
	landing_target.angle_y = tan_theta_y;
	// position_valid stays 0 so PX4 takes the irlock (angle) codepath

	mavlink_msg_landing_target_encode(AUTOPILOT_SYS_ID, YOUR_COMPONENT_ID, &message, &landing_target);
	// ...then send `message` out over your MAVLink link to the autopilot

You’ll need to enable the landing_target_estimator module in PX4. You can test in sim using:

	make px4_sitl gazebo_iris_irlock

Hope this helps.


@dakejahl So you’re sending the angles between the drone and the marker and PX4 accepts them? That’s good to know. So the article on offboard landing is wrong in only mentioning x, y and z?
Unfortunately, I’ve got another problem: MAVSDK doesn’t support LANDING_TARGET messages (see my question here). I haven’t yet found an alternative for PX4 in Python that supports it. The only candidates are DroneKit, which is made for ArduPilot, and ROS, which is too much work for a single message.

The article is not wrong, but as you’ve identified using the LOCAL_NED frame is inconvenient for a number of reasons. I’d like to amend the Landing Target Protocol eventually to officially support raw sensor measurements from an external source, I just haven’t had the time to do the work upstream.

Take a look at the code; there are only two code paths. The first one is as the article suggests: you provide x/y/z and position_valid and use MAV_FRAME_LOCAL_NED. The other path is reached by simply publishing angle_x and angle_y and enabling the landing_target_estimator module.

Regarding MAVLink in Python, you can use pymavlink. Take a look here for an example using pymavlink.
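
As a rough example, a minimal pymavlink sketch for the angle codepath could look like the following (the connection string and angle values are placeholders; only angle_x and angle_y are read on this path):

import time
from pymavlink import mavutil

# Placeholder link to the autopilot; adjust for your setup
master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=921600,
                                    source_system=1,
                                    source_component=mavutil.mavlink.MAV_COMP_ID_ONBOARD_COMPUTER)
master.wait_heartbeat()

# Angular offsets of the target from the image centre, computed by your
# vision pipeline (see the C++ snippet above); placeholder values here
angle_x, angle_y = 0.02, -0.05

master.mav.landing_target_send(
    int(time.time() * 1e6),                  # time_usec
    0,                                        # target_num
    mavutil.mavlink.MAV_FRAME_LOCAL_NED,      # frame (not used when position_valid is 0)
    angle_x, angle_y,                         # the only fields this codepath uses
    0, 0, 0,                                  # distance, size_x, size_y
    0, 0, 0,                                  # x, y, z (unused)
    [1, 0, 0, 0],                             # q (unused)
    0,                                        # type (ignored)
    0)                                        # position_valid = 0 -> angle codepath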


I am trying to develop a precision landing system based on the recognition and positioning of visual tags from an external computer, outputting positions (X, Y, Z) relative to the camera frame.

This post has been very useful, and together with the rest of the information I have researched, I have put together the following scheme showing the different options I see for making this system work.

Considering this scheme, the three different ways I see to implement the system are:

1. The “sitl irlock hack codepath”:

Convert the relative position of the tag with respect to the drone into angle_x and angle_y as seen from the camera, and publish this in the MAVLink LANDING_TARGET message with position_valid=0. The PX4 landing_target_estimator module then calculates the relative position of the landing point and fills the landing_target_pose uORB topic with relative positions in the body frame.

I have already tested this option in SITL and it works, although I don’t think it is the most suitable one. I already have the position relative to the tag, so converting it to angles just so the landing_target_estimator module can recalculate the position does not seem like the most elegant solution, and although I have not tested this, I can see that it could introduce unwanted deviations. The conversion I mean is sketched below.
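
For reference, the conversion in this option is roughly the following (a sketch, assuming a camera frame with x right, y down, z forward, mirroring the C++ snippet above):

def position_to_angles(x, y, z):
    """Convert a tag position in the camera frame (metres) into the
    angle_x / angle_y values used by the irlock codepath.
    Follows the SITL irlock plugin convention (tangent of the angle)."""
    return x / z, y / z

# Example: tag 0.3 m along the camera x axis, 0.1 m along y, 5 m along the optical axis
angle_x, angle_y = position_to_angles(0.3, 0.1, 5.0)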

2. Directly fill the uORB topic LANDING_TARGET_POSE with positions in MAV_FRAME_BODY_FRD.

This would require the use of the FastDDS bridge. At the moment I have not been able to do it; I find there is very little documentation about it, especially if you are not using ROS 2.

Has anyone been able to do something similar and test it? If so, could someone direct me to an example of this?

3. “Fix” PX4 to accept MAV_FRAME_BODY_FRD in the MAVLink LANDING_TARGET message.

What modifications would be necessary to make this change? Is there a branch where this is already corrected? @dakejahl

Any feedback would be appreciated.

Thx

I agree with you, it’s not the most elegant solution, but it’s the one that requires the least amount of effort. To do this properly and merge it upstream, we’d need to extend the Landing Target Protocol specification and implement support for the new behaviour in both PX4 and ArduPilot; these are the “rules” of contributing to MAVLink, and this process can be quite time consuming.

You might be interested in this recent PR
