Motion Capture LPE Communication

Hello,
I am very new to PX4 and MAVLink, and I am trying to build a motion capture positioning system for a drone.

I created a motion capture system from scratch and I am wondering how I can communicate that information to the drone via mavlink.

The software is written in MATLAB and I can export the data in any format; I just need to know how to talk to the Pixhawk to give it the current position.

I see that I need to use the ATT_POS_MOCAP message; I just don't know exactly what it does.
https://dev.px4.io/v1.9.0/en/ros/external_position_estimation.html

Can someone direct me to some info that might help me achieve this?

Thanks!
Greg

OK, I can see that I need to use vrpn_client_ros to communicate with the FCU, but I don't know how to set up the VRPN client with data from my motion capture system.

Is there a way to format the data output from my motion capture system so it is readable by VRPN?

Can anyone who has set this up before with an OptiTrack or Vicon motion capture system weigh in?

Hi,
yeah, I have set up a working Vicon system with our drones via MAVROS and can help you in much more detail on Monday. In short, you just need a companion computer on your drone (for example a Raspberry Pi; that is where I run my ROS master and my mavros launch file), publish your current pose (a message of type geometry_msgs::PoseStamped) on the topic mavros/vision_pose/pose, and set the EKF parameters to vision-based control.
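To be concrete about the EKF settings: for a PX4/EKF2 setup around the firmware version in the link above, the parameters I mean are roughly the ones below (set them in QGroundControl, and double-check the values against the external position estimation page for your exact firmware):

```
EKF2_AID_MASK = 8    # fuse vision position (use 24 to also fuse vision yaw)
EKF2_HGT_MODE = 3    # use the vision estimate as the height reference
```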
Since you generate your own motion capture data in MATLAB, you should be able to use the MATLAB-ROS connection directly, so you don't need to set up vrpn_client_ros at all. Just keep in mind to set ROS_MASTER_URI when you work with multiple ROS machines in one network.
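Here is a rough sketch of what the MATLAB side could look like, assuming you have MATLAB's ROS support (ROS Toolbox) and the Pi is already running roscore and mavros. The master URI and getMocapPose are placeholders for your own network and mocap read-out; publish the pose in the ROS ENU convention and mavros should handle the conversion to the flight controller's frame:

```matlab
% Minimal sketch, assuming MATLAB's ROS Toolbox and a Pi running roscore + mavros.
setenv('ROS_MASTER_URI', 'http://192.168.1.10:11311');   % assumed address of the Pi
rosinit;

% mavros listens on this topic and forwards the pose to the flight controller
pub = rospublisher('/mavros/vision_pose/pose', 'geometry_msgs/PoseStamped');
msg = rosmessage(pub);

rate = rosrate(30);   % keep a steady stream; around 30-50 Hz is usually recommended
while true
    % getMocapPose() is a placeholder for however you read your own mocap system:
    % position in metres, orientation as a unit quaternion, ENU frame.
    [x, y, z, qw, qx, qy, qz] = getMocapPose();

    msg.Header.Stamp       = rostime('now');
    msg.Header.FrameId     = 'map';
    msg.Pose.Position.X    = x;
    msg.Pose.Position.Y    = y;
    msg.Pose.Position.Z    = z;
    msg.Pose.Orientation.W = qw;
    msg.Pose.Orientation.X = qx;
    msg.Pose.Orientation.Y = qy;
    msg.Pose.Orientation.Z = qz;

    send(pub, msg);
    waitfor(rate);
end
```

The important part is that the stream is continuous; if the vision pose stops arriving, the EKF will stop fusing it.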

I guess this is a lot and maybe a bit confusing - just ask and I will help you :smiley:
Greetings

This is fantastic, thank you!!

I will see if I can get this working; I might have another question along the way.

Thanks again!