Local position estimate diverges while using Vicon

Hi everyone,

I’m trying to use a Vicon system to send position information to a quadrotor for indoor flight testing. I have an onboard Odroid computer that receives information from the Vicon system and sends it to a Pixracer using mavros. I’ve confirmed in QGroundControl that the Vicon data is consistently being received by my Pixracer without loss of data, but I’m noticing that my x and y local position estimates consistently diverge from the values coming from the Vicon system.

Here is a link to the Flight Review upload from one test that I did: https://logs.px4.io/plot_app?log=a2655663-a519-4a10-b68a-5502681c0f84

Context: I’m using an untouched copy of stable release v1.8.2 on my drone. I’ve removed the four propellers from my quadrotor and in the test I’m holding the drone in the Vicon field and manually moving it around. From Flight Review, you can see that my x and y positions repeatedly diverge, jump to 0 and then return to reasonable values.

I also post-processed the flight log in MATLAB and produced this image, which I think really illustrates my problem:

In these graphs, I’m plotting data logged from the vehicle_local_position topic (vlp, orange) and the vehicle_vision_position topic (vvp, blue). The first graph shows the .xy_valid element of each topic, and the subsequent plots show the x, y and z position data. What really confuses me about these results is that the state estimator does not seem to be properly making use of the Vicon data to get the x and y position. Every time the local position estimate diverges, the drone still has access to Vicon data, but it seems to take several seconds before the divergence gets flagged by the xy_valid element and the estimate converges back to the Vicon value. I’ve been digging into the EKF code to try to figure out why this is happening, but I haven’t had any luck yet.

I would greatly appreciate any help with this issue!

Thanks

Sean

Hi,

In my experience, this kind of divergence looks like lost tracking (no data received by the FC for some time). Is there network latency? We had to swap a 2.4 GHz transmitter for a joystick because the network had a high ping due to 2.4 GHz interference.

What is your setup to forward the position data?

Have you checked the axis and data format?

Hi @charles-blouin,

Thank you very much for responding to my post!

I’m not too sure how to test for network latency in my setup; I’ll have to dig into that a bit further. Would that issue affect the values being logged on the vehicle_vision_position uORB topic? I’m noticing, for instance, that at around the 180-second mark of the xPos plot in the above graph, my vehicle_vision_position topic is still recording motion while my vehicle_local_position topic is diverging.

As for the setup for forwarding position data: we first have a computer running the Vicon software that streams data via UDP over wifi to the onboard Odroid. The Odroid captures this data using a ROS package that we developed and publishes it to a ROS topic. We then run MAVROS on the Odroid to pass the information into the Pixracer via a wired connection to the telemetry 2 port. Whenever I set up the system, I use some rostopic echo commands to make sure that the topics are being published properly on the Odroid. While testing, I have my Pixracer connect over wifi to a laptop running QGC, and I’m able to plot the vision position estimate data using the analyze window.

The axis format idea is interesting. I noticed yesterday that my Vicon inertial frame is not a true ENU frame (its axes are just aligned with some walls in our testing area). I’m wondering if this might be causing some problems with the setup. I remember that a little while back I had to disable yaw vision fusion in QGC for my setup because I was constantly getting yaw errors when I tried to arm the drone. Did you set up your mocap system such that its inertial frame is a true ENU frame?

Hi Sean,

The vision messages might be correct in the logs as the messages have a timestamp, but they might still be received with a delay by the autopilot. One way to debug latency is to ping the companion computer constantly and monitor the ping. With the transmitter on, I noticed peak pings of 0.5 seconds on the network. You can also use QGC while moving the quad manually to check that the vision messages are received with no delay.

I send the MAVLink message VISION_POSITION_ESTIMATE directly instead of using ROS, so I can’t help you too much there. ROS should be doing the frame conversion for you. PX4 expects NED, so it is worth checking in QGC that the frame is correct.
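
For reference, the ENU-to-NED mapping that mavros is expected to apply to position data can be sketched as follows. This is a minimal standalone check I put together, not the actual mavros code; the Vec3 struct is just for illustration:

```cpp
#include <cassert>

// Plain 3-vector so the sketch stands alone (mavros itself uses Eigen types).
struct Vec3 { double x, y, z; };

// Standard ENU -> NED conversion for a position vector: swap x and y, negate z.
// This is what the mavros frame transform amounts to for positions.
Vec3 enu_to_ned(const Vec3 &enu) {
    return Vec3{enu.y, enu.x, -enu.z};
}
```

A quick check: a point 1 m east, 2 m north, 3 m up (ENU) should come out as 2 m north, 1 m east, 3 m down (NED). If the logged vision data does not follow this convention, the frame setup is worth revisiting.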

UPDATE

So, I dug into the issue of my Vicon frame being misaligned with the Pixracer’s inertial frame, and sure enough there was roughly a 135-degree rotation between the two coordinate systems. Unfortunately, our Vicon’s coordinate frame was set up to align with the walls of our lab and not true north. I ran into some issues trying to redefine the Vicon’s frame myself, so I dug into the EKF code and made a slight modification:

Within the file ekf2_main.cpp (Firmware/src/modules/ekf2) I manually applied a rotation to the data coming in from the Vicon system before it gets passed on to the EKF:

Original code (approx line 960):

vehicle_local_position_s ev_pos = {};
orb_copy(ORB_ID(vehicle_vision_position), _ev_pos_sub, &ev_pos);

ext_vision_message ev_data;
ev_data.posNED(0) = ev_pos.x;
ev_data.posNED(1) = ev_pos.y;
ev_data.posNED(2) = ev_pos.z;
matrix::Quatf q(ev_att.q);
ev_data.quat = q;

New version:

vehicle_local_position_s ev_pos = {};
orb_copy(ORB_ID(vehicle_vision_position), _ev_pos_sub, &ev_pos);
// Rotate the Vicon data into the true NED frame.
// Measured yaw offset: 135.32 deg, so cos(135.32) = -0.7110, sin(135.32) = 0.7031
// R_px4,vicon = [c s 0; -s c 0; 0 0 1]
ext_vision_message ev_data; // ev_pos is in the Vicon frame after mavros' ENU->NED rotation
ev_data.posNED(0) = ev_pos.x * -0.7110f + ev_pos.y *  0.7031f;
ev_data.posNED(1) = ev_pos.x * -0.7031f + ev_pos.y * -0.7110f;
ev_data.posNED(2) = ev_pos.z;
matrix::Quatf q(ev_att.q);
ev_data.quat = q;
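
As a sanity check on the hard-coded coefficients, the same rotation can be reproduced in a standalone snippet. This is my own sketch, assuming the 135.32-degree offset I measured; the function name is just for illustration:

```cpp
#include <cassert>
#include <cmath>

// Measured yaw offset between the Vicon frame (after mavros' ENU->NED
// conversion) and true NED.
constexpr double kYawOffsetDeg = 135.32;

// Apply R = [c s; -s c] to an (x, y) position, with c = cos(offset) and
// s = sin(offset), mirroring the hard-coded coefficients in the firmware patch.
void vicon_to_ned(double x, double y, double &x_ned, double &y_ned) {
    const double rad = kYawOffsetDeg * M_PI / 180.0;
    const double c = std::cos(rad);  // ~ -0.7110
    const double s = std::sin(rad);  // ~  0.7031
    x_ned =  x * c + y * s;
    y_ned = -x * s + y * c;
}
```

Feeding in a unit step along the Vicon x-axis, (1, 0), should produce roughly (-0.711, -0.703) in NED, matching the signs of the hard-coded coefficients.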

With this modification I redid my testing and got slightly better results in terms of having vehicle_local_position properly track the Vicon data:

Once again, I did this test by holding the drone manually and walking it around our Vicon field, using the v1.8.2 firmware with the above modification. In the above figure, the blue vvp line shows the raw position measurements coming in from the Vicon system, the orange vlp line shows the vehicle_local_position value, and the yellow vvp rot line shows the Vicon data after applying the above rotation to align it with true north. The results are promising: from around the 30-second mark through the 75-second mark, the orange and yellow lines track each other almost perfectly in x and y, which is exactly what I want. As soon as I start moving my drone around a bit faster, though, the vehicle_local_position value never quite manages to track the rotated Vicon data again.

Here is a copy of the log from this second test flight: https://logs.px4.io/plot_app?log=66b838c5-b557-4a33-89f2-975d0427c180

At this point I’m still trying to figure out how I can improve the performance of the system. I’ve tried adjusting the EKF2_EV_DELAY parameter, and right now a value of 10 ms seems to give the best results, though I wasn’t able to replicate the suggested graph to properly measure what the delay value should be. I’m not sure if there are other parameters that I should try adjusting. I’m still working on testing for latency at the moment.