hardik
March 16, 2018, 4:56am
I am currently using a Navio 2 with an RPi 3 to run the PX4 firmware. I followed the guide in the documentation for the cross-compiler build: https://dev.px4.io/en/setup/building_px4.html
I want to use motion capture for flying indoors, and referred to these notes:
# Using Vision or Motion Capture Systems for Position Estimation
Visual Inertial Odometry (VIO) and Motion Capture (MoCap) systems allow vehicles to navigate when a global position source is unavailable or unreliable (e.g. indoors, or when flying under a bridge, etc.).
Both VIO and MoCap determine a vehicle's *pose* (position and attitude) from "visual" information.
The main difference between them is the frame perspective:
- VIO uses *onboard sensors* to get pose data from the vehicle's perspective (see [egomotion](https://en.wikipedia.org/wiki/Visual_odometry#Egomotion)).
- MoCap uses a system of *off-board cameras* to get vehicle pose data in a 3D space (i.e. it is an external system that tells the vehicle its pose).
Pose data from either type of system can be used to update a PX4-based autopilot's local position estimate (relative to the local origin), and can optionally also be fused into the vehicle attitude estimation.
This topic explains how to configure a PX4-based system to get data from MoCap/VIO systems (either via ROS or some other MAVLink system), and more specifically how to set up MoCap systems like VICON and Optitrack, and vision-based estimation systems like [ROVIO](https://github.com/ethz-asl/rovio), [SVO](https://github.com/uzh-rpg/rpg_svo) and [PTAM](https://github.com/ethz-asl/ethzasl_ptam).
> **Note** The instructions differ depending on whether you are using the EKF2 or LPE estimator.
## PX4 MAVLink Integration
PX4 uses the following MAVLink messages for getting external position information, and maps them to [uORB topics](http://dev.px4.io/en/middleware/uorb.html):
# Flying with Motion Capture (VICON, Optitrack)
> **Warning** **WORK IN PROGRESS**.
This topic shares significant overlap with [External Position Estimation (ROS)](../ros/external_position_estimation.md).
Indoor motion capture systems like VICON and Optitrack can be used to provide position and attitude data for vehicle state estimation, or to serve as ground truth for analysis.
The motion capture data can be used to update PX4's local position estimate relative to the local origin. Heading (yaw) from the motion capture system can also be optionally integrated by the attitude estimator.
Pose (position and orientation) data from the motion capture system is sent to the autopilot over MAVLink, using the [ATT_POS_MOCAP](https://mavlink.io/en/messages/common.html#ATT_POS_MOCAP) message. See the section below on coordinate frames for data representation conventions. The [mavros](../ros/mavros_installation.md) ROS-Mavlink interface has a default plugin to send this message. They can also be sent using pure C/C++ code and direct use of the MAVLink library.
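As a rough illustration of the "pure code" route mentioned above, the sketch below builds the ATT_POS_MOCAP fields (timestamp, attitude quaternion, NED position) and sends them with pymavlink. The connection URL `udpout:127.0.0.1:14540` is a placeholder assumption for this example; adjust it for your own link, and note that the field-building helper is a hypothetical convenience function, not part of any library:

```python
# Hedged sketch: streaming MoCap pose to PX4 via the ATT_POS_MOCAP message.
# Assumes pymavlink is installed (pip install pymavlink) and that the
# autopilot listens on the example UDP endpoint below.

import time


def build_att_pos_mocap_fields(x_ned, y_ned, z_ned, q_wxyz):
    """Assemble the ATT_POS_MOCAP field tuple: (time_usec, q, x, y, z).

    The message carries position in metres (NED) and an attitude
    quaternion in w, x, y, z order.
    """
    time_usec = int(time.time() * 1e6)  # timestamp in microseconds
    return (time_usec, list(q_wxyz), x_ned, y_ned, z_ned)


def send_mocap_pose(x_ned, y_ned, z_ned, q_wxyz,
                    url="udpout:127.0.0.1:14540"):
    # pymavlink is imported lazily so the helper above stays
    # dependency-free; the URL is an example, not a PX4 default.
    from pymavlink import mavutil
    mav = mavutil.mavlink_connection(url)
    time_usec, q, x, y, z = build_att_pos_mocap_fields(
        x_ned, y_ned, z_ned, q_wxyz)
    mav.mav.att_pos_mocap_send(time_usec, q, x, y, z)
```

In practice you would call `send_mocap_pose(...)` in a loop at the MoCap system's update rate; with mavros the equivalent is handled by its mocap plugin, so hand-rolled code like this is only needed when bypassing ROS.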
## Computing Architecture
It is **highly recommended** that you send motion capture data via an **onboard** computer (e.g. Raspberry Pi, ODroid, etc.) for reliable communications. The onboard computer can be connected to the motion capture computer through WiFi, which offers a reliable, high-bandwidth connection.
Most standard telemetry links like 3DR/SiK radios are **not** suitable for high-bandwidth motion capture applications.
## Coordinate Frames
This section shows how to set up the system with the proper reference frames. There are various representations, but we will use two of them: ENU and NED.
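The conversion between the two frames mentioned above can be sketched in a few lines. ENU (x East, y North, z Up) is the convention used by ROS and most MoCap tools, while PX4 works internally in NED (x North, y East, z Down); swapping the x and y axes and negating z maps one onto the other:

```python
# Minimal sketch of the ENU <-> NED position conversion.
# ENU: x East, y North, z Up (ROS / most MoCap output).
# NED: x North, y East, z Down (PX4's internal frame).

def enu_to_ned(x_east, y_north, z_up):
    """Convert an ENU position vector to NED."""
    return (y_north, x_east, -z_up)


def ned_to_enu(x_north, y_east, z_down):
    # The transform is its own inverse: swap x/y, negate z again.
    return (y_east, x_north, -z_down)
```

Note that when mavros is used, this frame conversion is performed automatically by the plugin, so manual conversion is only needed when sending MAVLink messages directly.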
I need to set the LPE_FUSION parameters; however, the RPi build seems to include only the EKF2 version. Is there a way to compile an LPE version for the RPi? @LorenzMeier
@hardik, were you able to achieve external position estimation using motion capture on the RPi?
Thanks
Prasanth