I’m currently working on a scenario where precision flying is essential for mission success. To achieve this accuracy, an RTK GPS (Here +) is used.
In my mission software I’m constantly checking the GPS_RAW_INT message to verify that the necessary accuracy is being provided. But after some research I’m still uncertain how the uncertainty is actually calculated, in particular the message attribute h_acc.
My understanding is that to calculate an accuracy, the actual position would have to be known, so the difference to the position reported by the GPS could be computed.
As this obviously isn’t possible, how exactly is the uncertainty calculated? Which parameters are taken into account to come up with this number?
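For context, my check looks roughly like the sketch below (illustrative only; the threshold values are assumptions specific to my mission, not anything mandated by MAVLink). Per the MAVLink spec, h_acc is reported in millimetres and fix_type 6 means an RTK fixed solution:

```python
RTK_FIXED = 6  # GPS_FIX_TYPE_RTK_FIXED in the MAVLink GPS_FIX_TYPE enum

def accuracy_ok(fix_type: int, h_acc_mm: int, max_h_acc_m: float = 0.05) -> bool:
    """Return True when the reported horizontal accuracy meets the requirement.

    fix_type  -- GPS_RAW_INT.fix_type
    h_acc_mm  -- GPS_RAW_INT.h_acc (position uncertainty in mm, MAVLink 2 field)
    max_h_acc_m -- mission-specific threshold (assumption: 5 cm here)
    """
    if h_acc_mm == 0:
        # Treat 0 as "field not populated" rather than perfect accuracy.
        return False
    return fix_type >= RTK_FIXED and h_acc_mm / 1000.0 <= max_h_acc_m
```

In practice I feed this with messages from something like pymavlink's `recv_match(type='GPS_RAW_INT')`, but the gating logic above is the part in question.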
You can find a good answer in this article: https://insidegnss-com.exactdn.com/wp-content/uploads/2018/01/IGM_julaug14-solutions.pdf
tl;dr: the accuracy is calculated by the Kalman filter (inside the GNSS receiver) that estimates the position/velocity/clock bias/… from the raw signals; the reported value comes from the filter's own error covariance, not from a comparison with a known true position. The estimates mostly depend on “[…] the level of measurement errors induced by orbital inaccuracies, atmospheric effects, multipath, and noise”.
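To make that concrete, here is a toy 1-D Kalman measurement update (a sketch, not actual receiver firmware; real receivers run a multi-state filter). The point is that the filter carries a covariance P alongside the state, and the square root of that covariance is the kind of number that ends up in a field like h_acc:

```python
import math

# Toy 1-D Kalman measurement update (illustrative only).
P_prior = 4.0   # prior position variance (m^2)
R = 1.0         # measurement noise variance (m^2): models orbit errors,
                # atmospheric effects, multipath, and receiver noise
K = P_prior / (P_prior + R)    # Kalman gain
P_post = (1.0 - K) * P_prior   # posterior variance, shrinks with each update
h_acc_est = math.sqrt(P_post)  # 1-sigma accuracy estimate (m), analogous to h_acc
```

So the receiver never needs the true position: it tracks how uncertain its own estimate is, given how noisy it believes the measurements to be.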