Simulate ArUco markers in Gazebo with MAVSDK

I’m trying to do precision landing using a downward facing camera and ArUco markers.
I’m currently doing my simulations in SITL (according to this tutorial) by manually holding the copter over the marker and trying to mirror the movements of the simulated drone by hand. It looks about as silly as it sounds…

I’d like to use Gazebo to simulate this, e.g. by adding a simple downward facing camera to the Iris drone and a marker on the ground beneath it. However, there are two problems:

  • I have no experience in Gazebo or ROS.
  • I’m not developing in ROS at all; I’m developing in Python, using OpenCV to read the images and MAVSDK to talk to the drone (a short detection sketch follows this list).
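
A minimal sketch of that detection side, in case it helps anyone following along. The dictionary, marker size and intrinsics below are placeholders, and the function names are from the pre-4.7 OpenCV aruco API (4.7+ moved to cv2.aruco.ArucoDetector):

```python
import cv2
import numpy as np

# Placeholder intrinsics: replace with your camera's real calibration.
camera_matrix = np.array([[600.0,   0.0, 320.0],
                          [  0.0, 600.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)

MARKER_SIZE_M = 0.20  # printed marker edge length in metres (assumption)

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

def marker_offset(frame):
    """Return the first detected marker's tvec in the camera frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict, parameters=params)
    if ids is None:
        return None
    rvecs, tvecs, *_ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
    return tvecs[0][0]  # (x, y, z) offset in metres, camera frame
```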

Is there a simple description or tutorial on how to extend the Gazebo simulation with a simple downward facing camera whose images I can read from outside Gazebo, as if it were a USB camera?

Hey,
You can use the Typhoon H480, which already has a camera, and add your ArUco marker into your Gazebo world. You may need to play around a bit to make the world changes in Gazebo, though; the sketches below may help.
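
The world change can be as small as one extra <include> in a copy of the world file. This is only a sketch: model://aruco_marker is a placeholder name; point it at a marker model of your own (e.g. a flat box with the marker image as its texture):

```xml
<!-- Inside <world>...</world> of your world file.
     "aruco_marker" is a placeholder; substitute your own marker model. -->
<include>
  <uri>model://aruco_marker</uri>
  <!-- x y z roll pitch yaw: lying flat on the ground at the origin -->
  <pose>0 0 0.01 0 0 0</pose>
</include>
```

For getting at the images from plain Python: the simulated Typhoon H480 camera is streamed out of Gazebo as an RTP/H.264 stream over UDP (port 5600 by default, the same stream QGroundControl displays). If your OpenCV build includes GStreamer support, you can read it like an ordinary capture device, which is roughly the "external USB camera" behaviour asked for above. A sketch, assuming the default port:

```python
import cv2

# RTP/H.264 stream from the simulated camera (PX4 SITL default: UDP 5600).
# Requires an OpenCV build with GStreamer support.
PIPELINE = (
    "udpsrc port=5600 caps='application/x-rtp, media=(string)video, "
    "clock-rate=(int)90000, encoding-name=(string)H264' "
    "! rtph264depay ! avdec_h264 ! videoconvert ! appsink"
)

cap = cv2.VideoCapture(PIPELINE, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("sim camera", frame)   # or feed it to the ArUco detector
    if cv2.waitKey(1) == 27:          # Esc quits
        break
cap.release()
```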

Thanks, that’s an idea for the camera problem; I’ll give it a try. Unfortunately, it still leaves one problem.
The Typhoon has a gimbal camera that compensates for any swaying of the drone, but most downward facing cameras used for object recognition are rigidly mounted (mine is). When the drone pitches or rolls, the camera center moves and the detected distance to the marker changes even though the drone hasn’t moved. You can correct for that by reading the drone’s roll and pitch angles, turning them into rotation matrices and rotating the camera frame back into a level frame that points straight down at the earth (a sketch of this correction follows below). With a gimballed camera, however, this important effect cannot be simulated.
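
For reference, the correction itself is only a couple of rotation matrices. A minimal sketch, assuming an FRD body frame (x forward, y right, z down), roll and pitch in radians, and a camera looking straight down with the image top toward the nose; the camera-to-body matrix encodes that mounting and is the part you’d have to adapt:

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the x axis (roll)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0,  0],
                     [0, c, -s],
                     [0, s,  c]])

def rot_y(a):
    """Rotation matrix about the y axis (pitch)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

# Camera-to-body mapping for one assumed mounting:
# image right -> body right (+y), image down -> body backwards (-x),
# optical axis -> body down (+z). Adapt to your actual mounting.
R_CAM_TO_BODY = np.array([[0, -1, 0],
                          [1,  0, 0],
                          [0,  0, 1]])

def level_marker_offset(tvec_cam, roll, pitch):
    """Rotate a camera-frame marker offset into a level, yaw-aligned frame.

    tvec_cam: (x, y, z) marker offset in metres, e.g. from
              cv2.aruco.estimatePoseSingleMarkers.
    roll, pitch: body attitude in radians.
    """
    p_body = R_CAM_TO_BODY @ np.asarray(tvec_cam)
    # ZYX Euler convention with yaw ignored: v_level = Ry(pitch) Rx(roll) v_body
    return rot_y(pitch) @ rot_x(roll) @ p_body
```

With MAVSDK the angles come from telemetry.attitude_euler(), which reports degrees, so convert with np.radians() first. The resulting x and y are the horizontal offsets for the landing controller, and z is the tilt-independent vertical distance to the marker.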

This is a project I did in the past, and it might be useful in your case.
