Advice to implement Absolute Visual Localization

Hello community!

I would like to use computer vision algorithms to estimate the absolute position of a drone when GPS data is unavailable. Some exciting discussion happened here: GPS Denied Navigation for long (20km) outdoor UAV missions - #6 by ppoirier - Copter 4.3 - ArduPilot Discourse.

Currently I have set up a simulation with a VTOL that performs a mission consisting of takeoff, waypoint following (forward flight mode), and landing.

I’m seeking guidance on how to simulate various GPS failures while following waypoints, and on how to feed the resulting vision-based estimates into the PX4 control loop while adhering closely to the codebase guidelines.

Currently I have tested introducing failures as described in the docs, but I am facing the same problem outlined here: [Bug] System Failure Injection (gps) not working · Issue #22296 · PX4/PX4-Autopilot · GitHub. Are there any workarounds?

If you use Gazebo, you can stop the simulated GPS with the following command:

sensor_gps_sim stop
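Once GPS is stopped, EKF2 will only hold a position estimate if another source is being fused. As a hedged sketch, enabling external-vision fusion looks roughly like this (parameter names and values taken from recent PX4 releases; verify them against your version's parameter reference):

```shell
# In the pxh> shell, or set the same parameters via QGroundControl.
# EKF2_EV_CTRL is a bitmask: bit 0 = horizontal position,
# bit 1 = vertical position, bit 2 = velocity, bit 3 = yaw.
param set EKF2_EV_CTRL 11    # fuse horiz pos + vert pos + yaw (1+2+8)
param set EKF2_HGT_REF 3     # use vision as the height reference
param save
```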


It worked, thank you! I will now have to override the failsafe so that it performs the visual navigation.

Do you know of any method to do this programmatically, for instance inside a ROS 2 node? Also, would you recommend using Micro XRCE-DDS or MAVROS?

Best regards,

Micro XRCE-DDS will work better.
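Whichever bridge you pick, keep the frame conventions in mind: PX4 expects vision data in NED (world) and FRD (body) frames, while ROS 2 conventions are ENU/FLU. A minimal sketch of that conversion, kept free of ROS dependencies so it is easy to test (function names are mine, not from PX4 or px4_msgs):

```python
import numpy as np

# Assumed conventions: ROS 2 world frame is ENU, body frame is FLU;
# PX4's VehicleOdometry expects NED (world) and FRD (body).
# Quaternions are (w, x, y, z), Hamilton convention.

def quat_mul(q1, q2):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# Fixed frame rotations: a 180 deg turn about (1, 1, 0)/sqrt(2) maps
# ENU to NED; a 180 deg turn about x maps FLU to FRD.
S2 = np.sqrt(0.5)
Q_NED_ENU = np.array([0.0, S2, S2, 0.0])
Q_FLU_FRD = np.array([0.0, 1.0, 0.0, 0.0])

def pos_enu_to_ned(p):
    """(x_east, y_north, z_up) -> (x_north, y_east, z_down)."""
    return np.array([p[1], p[0], -p[2]])

def quat_enu_flu_to_ned_frd(q):
    """Attitude of an FLU body in ENU -> attitude of an FRD body in NED."""
    return quat_mul(Q_NED_ENU, quat_mul(q, Q_FLU_FRD))
```

In a ROS 2 node you would run these conversions on your vision pose before filling a px4_msgs VehicleOdometry message and publishing it on the /fmu/in/vehicle_visual_odometry topic; the exact field names and frame constants depend on your px4_msgs version, so check the message definition that matches your firmware.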
