Optitrack to RTF integration

I am trying to get my Intel RTF drone to fly using position information from an Optitrack system. Ultimately I would like to fly using offboard commands from my laptop in an area with no GPS. First, some system information:
PX4 version: 1.8.2
Motive version: 1.7.3
Computer OS: Ubuntu 16.04
Drone OS: Ubuntu 16.04
ROS version: kinetic

I am using the vrpn_client_ros package to stream data from the Optitrack system to my laptop. That appears to be working as I can echo the /vrpn_client_node/robot1/pose topic to the screen and it updates as I move the drone.
I tried to start mavros on my laptop using the following command
roslaunch mavros px4.launch fcu_url:="udp://:14540@192.168.8.1:14557"
I am certain of the IP address (192.168.8.1), but not about the port or the syntax. I assume this would connect to the flight controller, and if it could subscribe to the data stream from Optitrack, I would be set.
From there I need to remap the /vrpn_client_node/robot1/pose topic to /mavros/vision_pose/pose using
rosrun topic_tools relay /vrpn_client_node/robot1/pose /mavros/vision_pose/pose
However, the position shown in QGroundControl does not update.
Any ideas what I am doing wrong?

I found part of the problem: the connection string used to start mavros was incorrect. The correct command is
roslaunch mavros px4.launch fcu_url:="tcp://192.168.8.1:5760"

Now the problem is that the coordinate frame from the vrpn_client_node topic is incorrect. Following the instructions from External Position Estimation, I was supposed to flip the Up Axis in the Motive software. However, my version of Motive (1.7.3) doesn’t have that option, or at least I have been unable to find it.

Any ideas? I’m planning on transitioning to mocap_optitrack, as the External Position Estimation page suggests it already has the coordinate frame modified.

I have not seen that issue directly, but you could write a ROS node that corrects the data. If you are not using ROS elsewhere in your project, you could forward the UDP packets using a script on your companion computer. This is what I have done with my mocap system and drone. ROS is great for robotics, but it can add a bit of complexity if only packet forwarding is required.
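If you go the ROS route, a minimal sketch of such a node is below (Python/rospy). The topic names are the ones from your setup; the axis remap assumes Motive's default y-up frame and mavros's ENU convention on /mavros/vision_pose/pose, so treat the mapping as a starting point and verify it against your own ground plane before flying.

#!/usr/bin/env python
# Minimal relay node: subscribe to the VRPN pose, remap axes, republish for mavros.
# The remap below is an assumption (Motive default y-up -> ENU); verify for your rig.
import rospy
from geometry_msgs.msg import PoseStamped

def remap(msg):
    out = PoseStamped()
    out.header.stamp = msg.header.stamp
    out.header.frame_id = "map"  # mavros treats vision_pose as an ENU world-frame pose
    p, q = msg.pose.position, msg.pose.orientation
    # y-up (Motive) -> z-up (ENU): x stays, y <- -z, z <- y
    out.pose.position.x = p.x
    out.pose.position.y = -p.z
    out.pose.position.z = p.y
    # swap the quaternion vector part the same way, keep w
    out.pose.orientation.x = q.x
    out.pose.orientation.y = -q.z
    out.pose.orientation.z = q.y
    out.pose.orientation.w = q.w
    pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("mocap_frame_relay")
    pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped, queue_size=10)
    rospy.Subscriber("/vrpn_client_node/robot1/pose", PoseStamped, remap)
    rospy.spin()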
Edit: I mean that you could run a script on the companion computer that listens for the pose packets from the mocap system and rebroadcasts them as MAVLink messages to the drone.
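Something like the sketch below (pymavlink) is what I mean. The udpout address and port are only guesses based on this thread, get_mocap_pose() stands in for whatever client you use to read poses out of Motive (NatNet, VRPN, etc.), and ATT_POS_MOCAP expects an NED position plus a w-first quaternion, so the frame conversion still has to happen somewhere before sending.

# Sketch: forward mocap poses to PX4 as MAVLink ATT_POS_MOCAP messages.
# The endpoint and get_mocap_pose() are placeholders for this thread's setup.
import time
from pymavlink import mavutil

def get_mocap_pose():
    # Placeholder: replace with your NatNet/VRPN client. It should return an
    # NED position (x, y, z) in metres and an attitude quaternion (w, x, y, z).
    return (0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0)

master = mavutil.mavlink_connection('udpout:192.168.8.1:14540')
master.wait_heartbeat()  # block until the autopilot is heard on this link

while True:
    (x, y, z), q = get_mocap_pose()
    master.mav.att_pos_mocap_send(
        int(time.time() * 1e6),  # time_usec
        list(q),                 # attitude quaternion, scalar (w) first
        x, y, z)                 # NED position in metres
    time.sleep(0.02)             # stream at roughly 50 Hz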

I modified the code in the vrpn_client_ros package according to the instructions in External Position Estimation.

I’ll reproduce them here.

If x_{mav}, y_{mav} and z_{mav} are the coordinates that are sent through MAVLink as position feedback, then we obtain:

x_{mav} = x_{mocap}
y_{mav} = z_{mocap}
z_{mav} = -y_{mocap}

Regarding the orientation, keep the scalar part w of the quaternion the same and swap the vector part x, y and z in the same way. You can apply this trick to any system: if you need to obtain a NED frame, look at your MoCap output and swap the axes accordingly.
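In code, that mapping looks roughly like the snippet below (sketched in plain Python with a scalar-first quaternion, rather than the actual C++ change I made in vrpn_client_ros):

# Map a Motive (y-up) pose into the NED-style frame described above:
# position x -> x, z -> y, -y -> z; keep the quaternion's w, swap its vector part the same way.
def mocap_to_mav(pos, quat):
    x, y, z = pos         # mocap position
    w, qx, qy, qz = quat  # mocap orientation, scalar first
    pos_mav = (x, z, -y)
    quat_mav = (w, qx, qz, -qy)
    return pos_mav, quat_mav

# A point 1 m above the mocap origin ends up at z = -1 (NED z points down, so negative is up).
print(mocap_to_mav((0.0, 1.0, 0.0), (1.0, 0.0, 0.0, 0.0)))
# ((0.0, 0.0, -1.0), (1.0, 0.0, 0.0, -0.0))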

That pretty much took care of it. Now the RTF has the correct position information.