My goal is to use an overhead monocular camera to provide position information to the PX4 controller, and to control a multirotor in a GPS-denied environment by sending offboard control commands to PX4.
So far I have been working on determining the position of the quadcopter from an external monocular camera located above it. I have obtained a rotation matrix and translation vector that describe the quadcopter's pose in the camera coordinate frame.
Now I would like to understand how to implement this with offboard control, where the position comes from the camera coordinate frame instead of GPS. As I understand it, several configuration parameters have to be set up for this to work. Could you help me out by suggesting which ones?
I am also assuming that I can control the quadcopter by sending MAVLink commands with MAVSDK-Python. Once I figure out how to get a quaternion from the rotation matrix, I would still need to send the commands so that the quadcopter reacts as expected. I hope I can use the following sequence to get this to work:
# set up and send the mocap position message
q = mavsdk.mocap.Quaternion(w, x, y, z)
position_body = mavsdk.mocap.PositionBody(x_m, y_m, z_m)
pose_covariance = mavsdk.mocap.Covariance([float('nan')])  # NaN in first element = covariance unknown
message = mavsdk.mocap.AttitudePositionMocap(time_usec, q, position_body, pose_covariance)
await drone.mocap.set_attitude_position_mocap(message)
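For the rotation-matrix-to-quaternion step I mentioned, I think a plain standalone conversion like this would do (a standard branching conversion to (w, x, y, z) order, assuming the input is a proper 3x3 rotation matrix; `rotation_matrix_to_quaternion` is just my own helper name):

```python
import math

def rotation_matrix_to_quaternion(R):
    """Convert a 3x3 rotation matrix (nested lists) to a (w, x, y, z) quaternion."""
    tr = R[0][0] + R[1][1] + R[2][2]
    if tr > 0:
        s = math.sqrt(tr + 1.0) * 2  # s = 4w
        w = 0.25 * s
        x = (R[2][1] - R[1][2]) / s
        y = (R[0][2] - R[2][0]) / s
        z = (R[1][0] - R[0][1]) / s
    elif R[0][0] > R[1][1] and R[0][0] > R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2  # s = 4x
        w = (R[2][1] - R[1][2]) / s
        x = 0.25 * s
        y = (R[0][1] + R[1][0]) / s
        z = (R[0][2] + R[2][0]) / s
    elif R[1][1] > R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2  # s = 4y
        w = (R[0][2] - R[2][0]) / s
        x = (R[0][1] + R[1][0]) / s
        y = 0.25 * s
        z = (R[1][2] + R[2][1]) / s
    else:
        s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2  # s = 4z
        w = (R[1][0] - R[0][1]) / s
        x = (R[0][2] + R[2][0]) / s
        y = (R[1][2] + R[2][1]) / s
        z = 0.25 * s
    return w, x, y, z
```

The branching avoids dividing by a near-zero trace, so it stays numerically stable for 180-degree rotations as well.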
Hopefully this is enough for the quadcopter to understand where it is with respect to the camera. But I wonder how often this position information has to be sent?
Then I hope to use the PX4 position controller as much as possible, so I would like to set the required position with the following commands. As I understand it, these setpoints also have to be sent at a minimum frequency of 2 Hz for the controller to remain in offboard mode?
# send the required quadcopter position setpoint
message = mavsdk.offboard.PositionNedYaw(north_m, east_m, down_m, yaw_deg)
await drone.offboard.set_position_ned(message)
Is this doable, and is this a correct approach to the problem? Any help or suggestions would be appreciated.