Visual precision landing

Hello Dev-Team of PX4

Does PX4 already have successfully tested functions (code & sensors) for a visual precision landing?

As I found on ArduCopter in the newest beta with IR-Lock:
http://ardupilot.org/copter/docs/precision-landing-with-irlock.html

or the approach presented by Randy with an NVIDIA TX1 running OpenKai and using a ZED stereo camera:

Thanks for any hint. I didn't find any documentation on this in PX4 while doing my research.

Best, Severin

Does anyone know if there is any development going on for visual precision landing in PX4 Pro now, or in the near future? I would appreciate a hint.

ArduCopter seems to have implemented quite a bit of code that is already working:
http://ardupilot.org/copter/docs/precision-landing-with-irlock.html

There is none.

I am achieving precision landing with offboard commands (from an Odroid XU4) by detecting AprilTags, probably the most robust tag type.
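Roughly, the detection side looks something like this (a minimal sketch only, assuming the Python apriltag package and OpenCV; the camera index and parameters are placeholders, not my actual setup):

```python
# Minimal AprilTag detection sketch for a companion computer (e.g. Odroid XU4).
# Assumes the 'apriltag' Python package and OpenCV; values are placeholders.
import cv2
import apriltag

# Detector for the common tag36h11 family (assumed here).
detector = apriltag.Detector(apriltag.DetectorOptions(families="tag36h11"))
cap = cv2.VideoCapture(0)  # downward-facing camera (placeholder index)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for det in detector.detect(gray):
        # det.center is the tag centre in pixels; together with the camera
        # intrinsics and the known tag size this gives the offset of the
        # landing pad relative to the vehicle for the offboard controller.
        print("tag", det.tag_id, "centre", det.center)
```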

Right now, I am looking at doing the detection with an OpenMV M7 camera, which can be connected to the Pixhawk.

The IR beacons are rubbish.


What makes you say that?

A way to sell you a 50-cent LED in very bad packaging for $50. Also, the LEDs are not powerful.

That said, I never tested the IR-Lock. It might be better; at least it's a beautiful PCB.

AprilTag is much more robust, although it needs more processing power than detecting IR beacons. Also, with IR beacons you don't need to illuminate the tag in low-light conditions (assuming you are using something like IR-Lock). Still, I'm using AprilTags too… :slight_smile:

I am using the OpenMV M7 camera, which allows detecting AprilTags from 4 meters (with a 15 cm tag) at 3 Hz, because the computation is done on-chip.

This is not that much. If you only need one landing platform, then a high-power IR LED can be enough.

Another solution is a small tag nested inside a bigger tag, allowing you to detect it from further away and still land precisely.

The solution I will most likely implement is to have a tag and an IR LED. The IR LED allows detecting the potential landing site from far away, and then the camera can zoom directly into the correct area to read the tag value.
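To illustrate that coarse-to-fine idea, a rough sketch assuming an OpenCV pipeline on a companion computer (thresholds, window size and camera index are placeholders, not what I actually run):

```python
# Sketch of the coarse-to-fine idea: find the bright IR LED in the full frame,
# then run the (expensive) AprilTag detection only on a window around it.
# Window size and camera index are placeholders.
import cv2
import apriltag

detector = apriltag.Detector()
cap = cv2.VideoCapture(0)  # placeholder camera index

ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Coarse step: the IR LED shows up as the brightest spot in the image.
_, _, _, (bx, by) = cv2.minMaxLoc(cv2.GaussianBlur(gray, (11, 11), 0))

# Fine step: crop a window around the LED and look for the tag only there.
half = 80  # placeholder half-size of the search window in pixels
x0, y0 = max(bx - half, 0), max(by - half, 0)
window = gray[y0:y0 + 2 * half, x0:x0 + 2 * half].copy()
for det in detector.detect(window):
    print("tag", det.tag_id, "near LED at", (bx, by))
```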

The limitation of the OpenMV M7 camera is RAM. You can take a VGA (640x480) picture, but the find_apriltags algorithm can only process a QQVGA (160x120) image.

See: https://github.com/openmv/openmv/issues/5
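For illustration, a minimal OpenMV MicroPython sketch of that setup (QQVGA because of the RAM limit; this is only a sketch, and you would pass fx/fy/cx/cy from your own lens calibration to get metric translations):

```python
# OpenMV M7 sketch: detect AprilTags at QQVGA (the RAM limit of find_apriltags).
import sensor, image, time

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)   # 160x120 -- largest frame find_apriltags handles
sensor.skip_frames(time=2000)
clock = time.clock()

while True:
    clock.tick()
    img = sensor.snapshot()
    # Pass fx/fy/cx/cy from a lens calibration to get translations in metres;
    # without them the translations are in units of the tag size.
    for tag in img.find_apriltags(families=image.TAG36H11):
        print("id", tag.id(),
              "x", tag.x_translation(),
              "y", tag.y_translation(),
              "z", tag.z_translation())
    print(clock.fps())
```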

After getting the position relative to the AprilTag, how do you pass the information to the flight controller / adjust the position of the UAV?

You could use MAVROS to achieve that.


MAVROS in ENU coordinates using ROS is the easiest.
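Something along these lines (a minimal sketch only; it uses the standard /mavros/setpoint_position/local topic, and the target values would come from your tag detection):

```python
# Sketch: push a corrected landing position to PX4 via MAVROS (ENU frame).
# The east/north/up values would come from the AprilTag detection.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node("precland_setpoint")
pub = rospy.Publisher("/mavros/setpoint_position/local", PoseStamped, queue_size=10)

def send_target(east, north, up):
    # ENU local frame: x = east, y = north, z = up.
    sp = PoseStamped()
    sp.header.stamp = rospy.Time.now()
    sp.header.frame_id = "map"
    sp.pose.position.x = east
    sp.pose.position.y = north
    sp.pose.position.z = up
    pub.publish(sp)

rate = rospy.Rate(20)  # OFFBOARD needs a continuous setpoint stream (> 2 Hz)
while not rospy.is_shutdown():
    send_target(1.0, 0.5, 2.0)  # placeholder values
    rate.sleep()
```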

MAVLink (you handle the serial communication yourself) in NED coordinates.
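And without ROS, a sketch using pymavlink and SET_POSITION_TARGET_LOCAL_NED (connection string and offsets are placeholders):

```python
# Sketch: send a local position target in NED directly over MAVLink (pymavlink).
from pymavlink import mavutil

master = mavutil.mavlink_connection("/dev/ttyUSB0", baud=921600)  # placeholder port
master.wait_heartbeat()

def send_target_ned(north, east, down):
    # Position-only type mask: ignore velocity, acceleration, yaw and yaw rate.
    master.mav.set_position_target_local_ned_send(
        0,                                    # time_boot_ms
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000110111111000,
        north, east, down,                    # NED position in metres (down is positive)
        0, 0, 0,                              # velocity (ignored)
        0, 0, 0,                              # acceleration (ignored)
        0, 0)                                 # yaw, yaw_rate (ignored)

send_target_ned(1.0, 0.5, -2.0)  # placeholder offsets, 2 m above the local origin
```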