Publishing VIO Data to /fmu/in/vehicle_visual_odometry in ROS 2 (Python)

Hi everyone,

I’m working on a UAV project using PX4 v1.14.4. I’m using OpenVINS for pose estimation and subscribing to the odometry output via rclpy in ROS 2. From what I understand, the position provided by OpenVINS is in the FLU (Forward-Left-Up) coordinate frame.

When publishing this data to /fmu/in/vehicle_visual_odometry in Python, is it recommended to set pose_frame to POSE_FRAME_FRD and manually invert the Y and Z axes of the position to match FRD? Should I also invert the signs of the Y and Z components of the orientation quaternion and the covariance accordingly?

Alternatively, would it be better to align the vehicle’s heading with East at startup (so the VIO world frame coincides with ENU) and perform a full ENU-to-NED transformation instead?
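
To make the first option concrete, here is roughly what I have in mind: just a sketch with NumPy, and the assumption that the same 180° flip about X (FLU → FRD) is applied to both the world and body frames is mine, which is what reduces the quaternion change to a pair of sign flips:

```python
import numpy as np

# Sketch of the axis-flip approach. FLU -> FRD is a 180 deg rotation about X;
# I'm assuming the same flip applies to both the world and body frames.

def flu_to_frd_position(p):
    """p = [x, y, z] in FLU; FRD just negates y and z."""
    return np.array([p[0], -p[1], -p[2]])

def flu_to_frd_quaternion(q):
    """q = [w, x, y, z]. Conjugating by a pi rotation about X
    reduces to negating the y and z components."""
    w, x, y, z = q
    return np.array([w, x, -y, -z])

def flu_to_frd_covariance(c):
    """c: 3x3 covariance. C' = R C R^T with R = diag(1, -1, -1):
    the diagonal is unchanged; only the xy and xz terms flip sign."""
    R = np.diag([1.0, -1.0, -1.0])
    return R @ c @ R.T
```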

I’m aware that px4_ros_com provides frame transformation utilities in C++, but I’m not entirely confident in rewriting them in Python. For example:

Are the quaternion-to-Euler and Euler-to-quaternion conversions done in ZYX order?

Would you recommend using the existing px4_ros_com frame transformation tools directly if possible, rather than rewriting them in Python?
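
For the Euler-order question above, my fallback would be to sanity-check the conventions in Python with SciPy (my own choice of tool, not something provided by px4_ros_com):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# 'ZYX' (uppercase) in SciPy means intrinsic rotations: yaw, then pitch,
# then roll. Note SciPy quaternions are [x, y, z, w]; PX4 uses [w, x, y, z].
q_xyzw = [0.0, 0.0, 0.3826834, 0.9238795]            # 45 deg of pure yaw
yaw, pitch, roll = Rotation.from_quat(q_xyzw).as_euler('ZYX')
q_back = Rotation.from_euler('ZYX', [yaw, pitch, roll]).as_quat()
assert np.allclose(q_xyzw, q_back)
```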

Thanks in advance for your advice!


Hi, I am trying to do the same thing. Which topic from OpenVINS are you using to publish to PX4?

The topic I am currently using is /ov_msckf/odomimu.

Did you use a monocular camera or stereo camera? I am facing this issue https://github.com/rpng/open_vins/issues/511

@james88385542 were you able to solve this? Please help, I’m facing the same issue now.

I believe OpenVINS needs a stereo camera, and its performance with a monocular camera isn’t very good. If you want to go monocular, I recommend using VINS-MONO or VINS-FUSION with an RGB monocular stream.

@avianb Currently, I’m using the RealSense D455 in a stereo setup, and, as @ssghunterlineage mentioned, I would also recommend stereo for VIO.

Regarding the issue you’re facing, I haven’t encountered similar problems myself since I’m not using a monocular setup. However, I would suggest sharing your configuration file for better debugging. Also, it’s important to perform camera calibration using tools like Kalibr to obtain accurate intrinsic and extrinsic parameters before running the VIO system again.

@ssghunterlineage Due to some hardware limitations, I haven’t been able to perform outdoor tests yet. However, in indoor tests, I was able to successfully forward the data from /ov_msckf/odom to /fmu/in/vehicle_visual_odometry using a custom ROS 2 node. I can also confirm that the odometry information is visible in QGroundControl.
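
For reference, here is a stripped-down Python sketch of the kind of bridge I mean (my real node does more; the topic name, the simple FLU-to-FRD sign flips, and the QoS settings are assumptions you should verify against your own setup):

```python
#!/usr/bin/env python3
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from nav_msgs.msg import Odometry
from px4_msgs.msg import VehicleOdometry


class VioBridge(Node):
    def __init__(self):
        super().__init__('vio_bridge')
        qos = QoSProfile(reliability=ReliabilityPolicy.BEST_EFFORT,
                         history=HistoryPolicy.KEEP_LAST, depth=10)
        self.pub = self.create_publisher(
            VehicleOdometry, '/fmu/in/vehicle_visual_odometry', qos)
        self.sub = self.create_subscription(
            Odometry, '/ov_msckf/odom', self.callback, qos)

    def callback(self, odom: Odometry):
        out = VehicleOdometry()
        # PX4 expects microseconds. The uXRCE-DDS agent also performs time
        # synchronization, so check how timestamps behave on your setup.
        stamp = odom.header.stamp
        out.timestamp = stamp.sec * 1_000_000 + stamp.nanosec // 1000
        out.timestamp_sample = out.timestamp
        out.pose_frame = VehicleOdometry.POSE_FRAME_FRD
        p = odom.pose.pose.position
        out.position = [float(p.x), float(-p.y), float(-p.z)]       # FLU -> FRD
        q = odom.pose.pose.orientation
        out.q = [float(q.w), float(q.x), float(-q.y), float(-q.z)]  # [w, x, y, z]
        # NaN marks a field as invalid so EKF2 ignores it; I leave the
        # velocity out here rather than guess its frame.
        out.velocity = [float('nan')] * 3
        self.pub.publish(out)


def main():
    rclpy.init()
    rclpy.spin(VioBridge())


if __name__ == '__main__':
    main()
```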


@james88385542 thanks for replying. I also haven’t done outdoor testing yet, and I’m likewise forwarding the data from /ov_msckf/odom to /fmu/in/vehicle_visual_odometry using a custom ROS 2 bridge node. How do you confirm that the odometry is visible in QGC? Is it by looking at the MAVLink Inspector “ODOMETRY” tab? I checked that too, but the “ODOMETRY” window shows up even when I am not forwarding the VIO data. Also, have you been able to get a stable hover in Position mode with VIO fusion indoors? My quadrotor still drifts quite a lot and requires manual correction in Position mode indoors, even with stable VIO fusion. Any help is really appreciated.

Sorry about the confusion. Initially, I checked this based on the PX4 documentation. However, I later realized that the Odometry data displayed is actually the result of sensor fusion performed by EKF2, which can integrate data from the IMU, barometer, or LiDAR, even if no VIO data is being published to /fmu/in/vehicle_visual_odometry.

If you’d like to verify this, you could try switching to OFFBOARD mode with the propellers removed and sending some simple commands. Normally, if the flight controller does not receive valid positioning data (e.g., from VIO or GPS), it should not be able to switch modes successfully. If you do see “Flying OFFBOARD” in QGroundControl, it indicates that the commands were accepted. At that point, try stopping the VIO data publication; you should see the flight controller trigger a failsafe response if it’s relying on that data.
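
Another quick check that doesn’t involve OFFBOARD: subscribe to PX4’s fused output and see whether it tracks the camera when you move the drone by hand. A minimal sketch (the /fmu/out/vehicle_odometry topic name assumes the default uXRCE-DDS bridge configuration; PX4’s out topics need best-effort QoS):

```python
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from px4_msgs.msg import VehicleOdometry


class FusionCheck(Node):
    def __init__(self):
        super().__init__('fusion_check')
        qos = QoSProfile(reliability=ReliabilityPolicy.BEST_EFFORT,
                         history=HistoryPolicy.KEEP_LAST, depth=5)
        self.sub = self.create_subscription(
            VehicleOdometry, '/fmu/out/vehicle_odometry', self.callback, qos)

    def callback(self, msg: VehicleOdometry):
        # NED position as fused by EKF2; move the camera and watch it follow.
        x, y, z = msg.position
        self.get_logger().info(f'NED position: {x:.2f} {y:.2f} {z:.2f}')


def main():
    rclpy.init()
    rclpy.spin(FusionCheck())


if __name__ == '__main__':
    main()
```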

Additionally, due to safety concerns (I don’t have a sufficiently large indoor space), I’m currently only verifying the VIO and flight controller fusion by holding the drone in my hand. Based on the local position feedback, it seems reasonably acceptable for now. However, I understand that this doesn’t necessarily reflect the actual performance during flight. So at this stage, I might not be able to provide a definitive answer on that.

In general, common troubleshooting steps include re-calibrating the camera or adjusting the EKF2_EV_DELAY parameter.

Agreed!

How do I send the commands? Would this be through publishing on the topic /fmu/in/vehicle_trajectory_waypoint? Is this the correct way to send simple commands in OFFBOARD mode, say if I want it to take off and move forward 5 meters?

How do I set the correct EKF2_EV_DELAY parameter? Is there a way to measure the delay between the VIO camera and the Pixhawk IMU? Additionally, for your VIO, is the IMU data coming from the camera itself (e.g., the RealSense D435i and D455 cameras have a built-in IMU), or are you using the IMU data from the PX4 hardware? Thank you.

For my implementation, I referred to the PX4 documentation on ROS 2 Offboard control (ROS 2 Offboard Control Example | PX4 Guide (main)) to write a publishing node that sends commands to PX4. You can check out that example for reference. There are also some GitHub examples available, such as this one:
px4-offboard/px4_offboard/offboard_control.py at 3dc75b7f7010e2da9db78a7d51b28b4ba1ee7220 · Jaeyoung-Lim/px4-offboard · GitHub
However, make sure to pay attention to the versions of PX4 and px4_msgs, as compatibility issues may arise.
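
If it helps, here is a bare-bones sketch of the pattern that example follows (not a copy of it, and the field names are from px4_msgs around v1.14, so check them against your message definitions). Keep in mind that PX4 wants the OffboardControlMode heartbeat streamed for a short while before it will accept a switch to OFFBOARD, and that arming and the mode switch still happen separately (via QGC or VehicleCommand):

```python
#!/usr/bin/env python3
import rclpy
from rclpy.node import Node
from px4_msgs.msg import OffboardControlMode, TrajectorySetpoint


class OffboardDemo(Node):
    def __init__(self):
        super().__init__('offboard_demo')
        self.mode_pub = self.create_publisher(
            OffboardControlMode, '/fmu/in/offboard_control_mode', 10)
        self.sp_pub = self.create_publisher(
            TrajectorySetpoint, '/fmu/in/trajectory_setpoint', 10)
        self.timer = self.create_timer(0.1, self.tick)   # 10 Hz heartbeat

    def tick(self):
        now_us = int(self.get_clock().now().nanoseconds / 1000)

        mode = OffboardControlMode()
        mode.timestamp = now_us
        mode.position = True              # position setpoints only
        self.mode_pub.publish(mode)

        sp = TrajectorySetpoint()
        sp.timestamp = now_us
        # NED frame: 5 m north ("forward"), 2 m above the origin.
        sp.position = [5.0, 0.0, -2.0]
        sp.yaw = 0.0
        self.sp_pub.publish(sp)


def main():
    rclpy.init()
    rclpy.spin(OffboardDemo())


if __name__ == '__main__':
    main()
```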

On the PX4 VIO page (Visual Inertial Odometry (VIO) | PX4 Guide (main)), you can see the EKF2 parameter recommendations provided by PX4. In particular, the EKF2_EV_DELAY parameter (Visual Inertial Odometry (VIO) | PX4 Guide (main)) can be configured via the ground control station, and the time offset can be inspected using FlightPlot. As for the IMU, I am using the one built into the RealSense D455.


Thank you, this is super helpful. One quick question: what is your publish rate for the VIO on the topic /fmu/in/vehicle_visual_odometry? Right now I am publishing at 20 Hz, which still drifts a lot, so I am wondering whether it would drift less (and hold a stable hover) if I published VIO at 30 Hz. Although I would need to tune some VIO package parameters to be able to run it smoothly at 30 Hz on a Jetson Orin NX 16 GB.

@ssghunterlineage @james88385542
Hi. Thanks for replying to my earlier messages. That error is now resolved. The problem I am facing now is initialisation.

```
[run_subscribe_msckf-1] [init]: disparity is 20.851,4.031 (10.00 thresh)
[run_subscribe_msckf-1] [init]: failed static init: no accel jerk detected, platform moving too much
[run_subscribe_msckf-1] [TIME]: 0.0088 seconds total (113.1 hz, 21.31 ms behind)
[run_subscribe_msckf-1] [init]: disparity is 20.366,4.576 (10.00 thresh)
[run_subscribe_msckf-1] [init]: failed static init: no accel jerk detected, platform moving too much
[run_subscribe_msckf-1] [TIME]: 0.0104 seconds total (96.5 hz, 21.01 ms behind)
[run_subscribe_msckf-1] [init]: disparity is 20.499,4.085 (10.00 thresh)
[run_subscribe_msckf-1] [init]: failed static init: no accel jerk detected, platform moving too much
[run_subscribe_msckf-1] [TIME]: 0.0090 seconds total (111.6 hz, 23.49 ms behind)
[run_subscribe_msckf-1] [TIME]: 0.0077 seconds total (129.8 hz, 21.89 ms behind)
[run_subscribe_msckf-1] [init]: disparity is 21.764,10.697 (10.00 thresh)
[run_subscribe_msckf-1] [init]: failed static init: no accel jerk detected, platform moving too much
```

To initialise it, I first keep the IMU completely still, without any disturbance, and then give it a jerk, but it still doesn’t initialise. I have tried many approaches to make it work. Do you have any solutions? I am using a monocular setup.

To confirm, you are using a monocular setup with OpenVINS? Have you tried initializing the IMU by exciting the roll, pitch, and yaw axes and the linear acceleration directions, and then yawing the camera left to right so there’s enough parallax and variety in the features? If this also doesn’t work, then it’s probably time to try VINS-MONO, because it works better out of the box with a monocular setup. OpenVINS is still great for a stereo setup, though, in my experience.

I didn’t set a specific publishing rate. Instead, I publish synchronously inside the callback function that subscribes to /ov_msckf/odomimu. This part is implemented in C++.

According to the OpenVINS GitHub (failed static init · Issue #453 · rpng/open_vins · GitHub), you can adjust the init_imu_thresh parameter to suit your own hardware.
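
For example, something like this in the OpenVINS estimator config (the file name and a sensible value depend on your hardware, so treat this as a sketch):

```yaml
# estimator_config.yaml (or wherever your launch file points)
# Lower values let static initialization trigger on a smaller accel jerk;
# raise it if initialization fires too easily on a noisy IMU.
init_imu_thresh: 0.5
```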