External Vision Yaw Estimate Error / Heading Estimate Unstable

We’re currently feeding vehicle_visual_odometry data from the Bitcraze Bolt into the ModalAI Flight Core V2 at ~112 Hz. We keep getting “vision yaw estimate error” and “heading estimate unstable” warnings. Despite that, in our tests the drone hovers fairly stably for the first 45 s, but it always ends up drifting uncontrollably.

I’m not sure whether this is how other mocap systems work or whether it could be the problem: the Bitcraze Bolt’s quaternion is computed from its own onboard gyroscope/accelerometer, not from an external sensor the way its position is. Does the vehicle_visual_odometry uORB topic require the quaternion to come from an external source, or should this setup work as-is?
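One thing we’ve been considering, in case the onboard-IMU quaternion is the culprit: my understanding (an assumption, please correct me if wrong) is that PX4 treats a NaN in the quaternion of an external odometry sample as “orientation unknown”, so the estimator would fuse position only and keep its own yaw. A minimal sketch of the payload we’d send, with a hypothetical helper name:

```python
import math

def make_visual_odometry(x, y, z, q=None):
    """Build a minimal odometry-style payload (hypothetical helper, not a PX4 API).

    Assumption: setting the quaternion elements to NaN marks the orientation
    as invalid, so EKF2 would fuse external position only and keep using its
    own IMU-derived yaw.
    """
    if q is None:
        q = [float("nan")] * 4  # orientation unknown -> position-only fusion (assumed convention)
    return {"x": x, "y": y, "z": z, "q": q}

# Example: position-only sample, orientation left to the onboard estimator.
sample = make_visual_odometry(1.0, 2.0, -1.5)
print(math.isnan(sample["q"][0]))  # True
```

Would sending the quaternion as NaN like this be a valid way to sidestep the yaw error, or does the external-vision fusion path require a valid orientation?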

Flight test: Flight Review - Quadrotor (px4.io)