These are the notes from the June 19 DevSummit core maintainer / core dev workshop. The session was chaired by @LorenzMeier; the following notes capture the session topics and the action items that need to be followed up.
Overview of current state
Current State of automated testing
- Compile tests
- Unit tests
- SITL tests (verify outcomes against ground truth)
- On-HW tests (unit)
- On-HW tests (HITL)
- Linters / static analysis (Coverity)
@LorenzMeier describes the above as product soak testing
Discussion
What is the coverage of the current tests? Right now you can add a feature that is not covered by any test.
Test categories
- Product quality soak testing
- Regulatory testing
- Product contribution testing (make sure test coverage does not go down)
Unit tests
- Now easily possible with the new framework by @MaEtUgR
- Broad agreement that new features are only accepted if they come with unit tests
- Community contributions: how can we enforce unit testing?
- Proposed approach: require unit tests from the contributor; if the feature is important enough and the contributor does not follow up, the core team adds the unit tests
- Core maintainers need to do refactoring work to show other contributors how tests should be written
- First, integration test coverage should be increased so that refactoring becomes safe
- Currently developers are reluctant to add features due to low test coverage
- Could we do job sharing (one person writes the feature, another writes the tests)? We currently do not have the bandwidth for that
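A policy like the one above is easier to follow when the expected shape of a contribution-level unit test is written down somewhere. A minimal sketch of the idea, shown in Python for brevity (the helper `tilt_limit` and its bounds are hypothetical and not part of the actual framework):

```python
def tilt_limit(commanded_deg: float, max_deg: float = 35.0) -> float:
    """Hypothetical helper: clamp a commanded tilt angle to a vehicle limit."""
    return max(-max_deg, min(max_deg, commanded_deg))

def test_tilt_limit():
    # Nominal case: values inside the limit pass through unchanged.
    assert tilt_limit(20.0) == 20.0
    # Edge cases: excess demand is clamped symmetrically.
    assert tilt_limit(90.0) == 35.0
    assert tilt_limit(-90.0) == -35.0

test_tilt_limit()
```

The point for contributors is the shape, not the framework: each new function ships with at least its nominal case and its edge cases covered.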
Adversarial testing
Can you make a drone crash with an RC?
Can we capture such flight data and put it into SITL tests?
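One way to close that loop is to export the stick inputs from a captured flight log and replay them as a scripted SITL test case. A sketch of the replay side (the log format, field order, and the SITL input hook are assumptions for illustration):

```python
import csv
import io

# Hypothetical excerpt of RC inputs captured during an adversarial flight:
# timestamp_ms, roll, pitch, yaw, throttle
LOG = """\
0,0.0,0.0,0.0,0.5
20,1.0,-1.0,0.0,0.0
40,-1.0,1.0,1.0,1.0
"""

def load_rc_log(text):
    """Parse a captured RC log into (t_ms, roll, pitch, yaw, throttle) tuples."""
    return [(int(t), float(r), float(p), float(y), float(thr))
            for t, r, p, y, thr in csv.reader(io.StringIO(text))]

def replay(samples, send):
    """Feed each sample, in timestamp order, to a SITL input hook `send`."""
    for sample in sorted(samples, key=lambda s: s[0]):
        send(sample)

# Stand-in for the SITL hook: just record what would be injected.
injected = []
replay(load_rc_log(LOG), injected.append)
```

A regression test would then assert that the simulated vehicle survives the replayed inputs, turning each real-world incident into a permanent SITL test case.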
NuttX Upgrade
Example: the I2C bug issue; there is currently no instrumentation (CPU coverage) on the hardware test rack to capture it
→ points to the need for a performance metric
Next steps ideas
- Modernize architecture to use SDK
- Match it to regulatory requirements
- Enforce in CI that code coverage does not go down
- Integrate test tooling in github
- SITL test tooling can be reused for outdoor tests (including automation and result verification)
- Map of what is tested
- Define a format for test specs, test procedures, and result checkers so they can be kept under revision control
- Could be done with test case management tool integrated into github
- Need to support offsets/patterns that are relative
- Feed logs from adversarial testing into SITL testing
- Performance metric monitoring on test rack
- Jerk monitoring on SITL/outside tests
- Peripheral testing on test rack
- Full coverage of every interface
- Connector-level tests
- Internal quality audit
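The "coverage does not go down" idea above amounts to a ratchet check in CI: compare the PR's coverage report against the main-branch baseline and fail the job on regression. A minimal sketch (how the percentages are extracted from the coverage tool, and the tolerance value, are assumptions):

```python
def coverage_ratchet(baseline_pct: float, current_pct: float,
                     tolerance: float = 0.1) -> bool:
    """Return True if the PR keeps coverage at (or above) the baseline.

    `tolerance` absorbs tiny fluctuations caused by unrelated code movement.
    """
    return current_pct >= baseline_pct - tolerance

# In CI, a failing ratchet would fail the job and block the merge.
assert coverage_ratchet(72.4, 73.0)        # coverage went up: pass
assert not coverage_ratchet(72.4, 70.0)    # coverage dropped: fail
```

Raising the stored baseline whenever a PR improves coverage is what makes this a ratchet rather than a fixed threshold.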
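For the jerk-monitoring idea, the metric itself is straightforward: jerk is the time derivative of acceleration, so a finite-difference pass over logged accelerometer samples yields a per-flight statistic that both SITL and outdoor runs can be checked against. A sketch (the sample data and rate are made up; a real check would read a flight log):

```python
def max_jerk(accel, dt):
    """Max absolute jerk (m/s^3) from evenly spaced acceleration samples (m/s^2)."""
    return max(abs(a1 - a0) / dt for a0, a1 in zip(accel, accel[1:]))

# Hypothetical 250 Hz accelerometer samples (dt = 0.004 s).
samples = [0.0, 0.02, 0.05, 0.04, 0.1]
peak = max_jerk(samples, 0.004)
```

A test rack or SITL run would then compare `peak` against a per-vehicle limit and flag regressions in smoothness.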
Actions
- Requirements → Result Automation (Test case tool)
- Move existing integration testing
- Policy change on PR testing
- Formalizing HW testing setup / Projectize test rack