@Al_B Thanks for the info. Have you considered using Google ARCore for your research? When flying indoors, if you are using pose data as input, your neural network does not need to be that big. In the original PPO paper, the network controlling the MuJoCo robots is two layers deep with 64 neurons each, which is not much even for a Raspberry Pi. In the reinforcement learning training I coded, the network has two layers of 32 neurons and it works well. Trajectory generation can be much more natural and aggressive with a neural network than with a linear controller.
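To give a sense of scale, here is a minimal sketch of the kind of policy network I mean: two hidden tanh layers of 64 units, as in the PPO paper. The input/output sizes (18 pose observations, 4 motor outputs) are just illustrative assumptions, and the weights are random placeholders rather than trained values.

```python
import numpy as np

# Illustrative sizes: 18-dim pose observation, 4 motor commands.
OBS_DIM, HIDDEN, ACT_DIM = 18, 64, 4

rng = np.random.default_rng(0)
params = {
    "W1": rng.normal(0, 0.1, (OBS_DIM, HIDDEN)), "b1": np.zeros(HIDDEN),
    "W2": rng.normal(0, 0.1, (HIDDEN, HIDDEN)),  "b2": np.zeros(HIDDEN),
    "W3": rng.normal(0, 0.1, (HIDDEN, ACT_DIM)), "b3": np.zeros(ACT_DIM),
}

def policy(obs, p):
    """Forward pass of the 64x64 tanh MLP; returns the mean action."""
    h = np.tanh(obs @ p["W1"] + p["b1"])
    h = np.tanh(h @ p["W2"] + p["b2"])
    return h @ p["W3"] + p["b3"]

# Total parameter count: a few thousand floats, trivial to evaluate
# at control rate on a Raspberry Pi.
n_weights = sum(v.size for v in params.values())
```

Even with these layer sizes the whole network is only a few thousand parameters, so the forward pass is far from being the bottleneck on embedded hardware.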
Is there an OpenAI Gym interface for Gazebo and the PX4 SITL software? Right now, I have to take the network trained with PyBullet and run it in a Dronecode SDK script against the simulator or the real drone. It might be possible to launch the simulator and write an OpenAI Gym wrapper around the Dronecode SDK, but I don't think the simulation can be reset through an API call from the Dronecode SDK.
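For what it's worth, here is a rough sketch of what such a Gym-style wrapper could look like. The `sdk` object and its methods (`reboot_sitl`, `get_pose`, `send_setpoint`) are hypothetical placeholders, not real Dronecode SDK calls; in particular, `reboot_sitl` would most likely have to restart the SITL process externally, precisely because the SDK itself offers no simulation-reset API.

```python
import numpy as np

class Px4SitlEnv:
    """Gym-style wrapper sketch around PX4 SITL (all SDK calls hypothetical)."""

    def __init__(self, sdk, max_steps=500):
        self.sdk = sdk
        self.max_steps = max_steps
        self.t = 0

    def reset(self):
        """Start a fresh episode; likely means restarting SITL externally."""
        self.t = 0
        self.sdk.reboot_sitl()
        return np.asarray(self.sdk.get_pose(), dtype=np.float32)

    def step(self, action):
        self.sdk.send_setpoint(action)  # e.g. an attitude or velocity setpoint
        self.t += 1
        obs = np.asarray(self.sdk.get_pose(), dtype=np.float32)
        reward = -float(np.linalg.norm(obs[:3]))  # toy reward: hover near origin
        done = self.t >= self.max_steps
        return obs, reward, done, {}
```

The awkward part remains `reset()`: without a reset call in the SDK, the wrapper has to kill and relaunch the simulator between episodes, which makes training slow.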
@JulianOes Dronecode SDK works quite well for indoor flights, although that is not its main purpose. One way the firmware could be significantly improved for flying with vision or motion capture is in its handling of tracking-data loss. Right now, when pose data from the external vision system is lost, the EKF2 estimator relies only on the IMU to estimate position, and it takes a few seconds before the filter resets. The position estimate drifts by 10-15 meters in that time, which obviously causes a crash. The PX4 firmware (including EKF2) could quickly recognize the loss of vision data and switch to a mode that does not use external vision. Unfortunately, once external vision is enabled, vision fusion is always active in EKF2, so simply switching to stabilized mode is not enough to save the drone in the event that vision data is interrupted.
I think the system would need to run two EKF2 filters at the same time: one that fuses vision and one that does not. A separate service could continuously monitor the quality of the vision data and decide when to switch to the vision-free estimate.
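The monitoring part is conceptually simple. Here is a sketch (in Python, purely to illustrate the idea, not PX4 code) of a watchdog that tracks the timestamp of the last vision pose sample and declares the stream lost after a timeout; the actual switch-over is left as a callback, since no such API exists in PX4 today.

```python
import time

class VisionWatchdog:
    """Declares the external-vision stream lost after `timeout_s` of silence."""

    def __init__(self, timeout_s=0.2, on_lost=None):
        self.timeout_s = timeout_s
        self.on_lost = on_lost    # called once when the stream goes stale
        self.last_sample = None
        self.lost = False

    def feed(self, t=None):
        """Call whenever a vision pose sample arrives."""
        self.last_sample = time.monotonic() if t is None else t
        self.lost = False

    def check(self, now=None):
        """Call periodically; fires `on_lost` once when data goes stale."""
        now = time.monotonic() if now is None else now
        stale = (self.last_sample is None
                 or (now - self.last_sample) > self.timeout_s)
        if stale and not self.lost:
            self.lost = True
            if self.on_lost:
                self.on_lost()
        return self.lost
```

A 200 ms timeout is just a guessed value here; the right threshold would depend on the vision system's update rate and how fast the IMU-only estimate diverges.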
A small issue I noticed is that there is no easy way to trim the vehicle when flying with a joystick or in offboard mode. As a result, the drone does not take off perfectly straight when it is in position-control mode.
The EKF2 sanity checks don't always reset. For example, if you start the drone while position data is not yet available, it can initialize with the wrong orientation. Even after starting the vision system and waiting a while, the 'High accel bias' error does not always clear, so you have to reboot the PX4.