You need to throw away those bad measurements. Filtering does not work well when the measurement noise is this high (it's not even noise, really; it's detection glitches, which should be treated as outliers). Try this:

```c
static int last_measurement = 0;
static int data_valid_counter = 0;
bool data_valid = false;

if (current_measurement > last_measurement + delta ||
    current_measurement < last_measurement - delta) {
    /* Jump larger than delta: treat as an outlier and restart the count. */
    data_valid = false;
    data_valid_counter = 0;
} else {
    data_valid_counter++;
    if (data_valid_counter >= 3) {
        data_valid = true;
    }
}
last_measurement = current_measurement;

if (data_valid) {
    /* Publish your data to uORB */
}
```

I have an algorithm for my sensor that is similar to this, with a bit extra. Essentially, what I am suggesting above imposes a requirement of 3 consecutive measurements within a delta of each other before allowing publishing. This ensures that you do not publish spurious measurements, and it also ensures that the data you are publishing has been stable for at least a little while.

**edit:** I don't have my code on this computer; I can advise more on Monday. But this example is meant to give you an idea of the type of checks you need to implement. Get creative with it: straight math won't solve this problem, you've got to use logic.