# Testing and CI Setup

The Asset Tracker Template features a comprehensive testing infrastructure, with tests run both on real hardware (on target) and in emulation.
Code analysis is performed through compliance checks and SonarCloud analysis.

## CI Pipeline Structure

The CI pipeline is composed of the following workflows:

- `.github/workflows/build.yml`: builds the device firmware.
- `.github/workflows/target-test.yml`: runs tests on real hardware.
- `.github/workflows/build-target-test.yml`: glues together `build.yml` and `target-test.yml` (see the sketch after this list).
- `.github/workflows/sonarcloud.yml`: builds and runs tests in emulation, and runs the SonarCloud analysis.
- `.github/workflows/compliance.yml`: runs static compliance checks.

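For orientation, here is a minimal sketch of how such a glue workflow can chain the two reusable workflows using GitHub Actions `workflow_call` semantics. The workflow name and job names are assumptions for illustration, not the actual contents of `build-target-test.yml`:

```yaml
# Minimal sketch of the glue workflow; the name and job layout are assumptions.
# build.yml and target-test.yml must declare `on: workflow_call` to be reused this way.
name: Build and Target Test

on:
  push:
    branches: [main]

jobs:
  build:
    uses: ./.github/workflows/build.yml

  target-test:
    needs: build   # run the on-target tests only after the firmware build succeeds
    uses: ./.github/workflows/target-test.yml
```
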
Additionally, an AI-assisted review is available through `.github/workflows/ai-review.yaml`. It is an AI reviewer that runs on pull requests, and developers can choose whether to run it by applying a label.

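A common way to make such a reviewer opt-in is to gate the job on a pull request label. The sketch below illustrates the idea; the label name `ai-review` and the job layout are assumptions, not the actual contents of `ai-review.yaml`:

```yaml
# Sketch of label gating; the "ai-review" label name is an assumption.
name: AI Review

on:
  pull_request:
    types: [opened, synchronize, labeled]

jobs:
  ai-review:
    # Only run when the developer has applied the opt-in label
    if: contains(github.event.pull_request.labels.*.name, 'ai-review')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # ... invoke the AI reviewer here
```
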
The CI pipeline is triggered as follows:

- On pull request: `build.yml`, `sonarcloud.yml`, `compliance.yml`.
  The initial goal was a CI run of less than ~5 minutes on pull requests; the current run takes ~4 minutes, with SonarCloud being the bottleneck.
  No target tests are run on pull requests, to avoid instabilities.

- On push to the main branch (merge): `build-target-test.yml`, `sonarcloud.yml`.
  Only "fast" target tests are run, avoiding excessively time-consuming tests (see the sketch below for how the scope can be selected).

- Nightly schedule: `build-target-test.yml`.
  The full set of target tests is run, including "slow" tests such as the full modem firmware FOTA test and the power consumption test.

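One way to implement the fast/slow split is to let the trigger type select the test scope. This sketch assumes a hypothetical boolean input `run_slow_tests` on the reusable test workflow; neither the input name nor the cron time is taken from the repository:

```yaml
# Sketch of trigger-dependent test scope; the input name and cron time are assumptions.
on:
  push:
    branches: [main]
  schedule:
    - cron: "0 2 * * *"   # nightly run

jobs:
  target-test:
    uses: ./.github/workflows/target-test.yml
    with:
      # Nightly runs get the full suite; merges to main run only the "fast" tests
      run_slow_tests: ${{ github.event_name == 'schedule' }}
```
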
### Testing

#### Hardware Testing
The project includes tests that run on real hardware devices, implemented in the `.github/workflows/target-test.yml` workflow.

Tests on target are performed on self-hosted runners.
Read more about self-hosted runners, and how to set up your own instance for your project, here: https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners

Key features include:

- Tests run on multiple target devices:
  - nRF9151 DK
  - Thingy:91 X
- Uses self-hosted runners labeled according to the connected device
- Runs in a containerized environment with hardware access (illustrated in the sketch after this list)
- Uses pytest as the test runner
- Supports Memfault integration for symbol file uploads
- Generates detailed test reports and logs
- Flexible test execution with support for specific test markers and paths

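A rough sketch of what an on-target job along these lines can look like is shown below. The runner labels, container image, and pytest arguments are assumptions for illustration; refer to the actual workflow for the real configuration:

```yaml
# Sketch of an on-target test job; labels, image, and pytest options are assumptions.
jobs:
  target-test:
    strategy:
      matrix:
        device: [nrf9151dk, thingy91x]      # hypothetical runner labels per device
    runs-on: [self-hosted, "${{ matrix.device }}"]
    container:
      image: ghcr.io/example/target-test:latest   # placeholder image
      options: --privileged                       # access to the attached hardware
    steps:
      - uses: actions/checkout@v4
      - name: Run on-target tests
        working-directory: tests/on_target
        run: |
          # Marker expression and report path are illustrative
          pytest -m "${{ matrix.device }}" -v --junit-xml=results.xml
```
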
To try out the tests locally, see `tests/on_target/README.md`.

#### Emulation Testing
Emulation tests are implemented as part of the SonarCloud workflow (`.github/workflows/sonarcloud.yml`). These tests:

- Run on the native_sim platform using Twister
- Execute integration tests in an emulated environment
- Generate code coverage reports
- Use the SonarCloud build wrapper for accurate code analysis (example below)

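The step below sketches how such a Twister run can be wrapped for SonarCloud; the test path and Twister options are assumptions, not the exact workflow contents:

```yaml
# Sketch of the emulation test step; paths and options are assumptions.
- name: Run integration tests on native_sim with coverage
  run: |
    # Wrap the build so SonarCloud's C/C++ analyzer can observe the compilations
    build-wrapper-linux-x86-64 --out-dir bw-output \
      west twister --platform native_sim --testsuite-root tests/ \
        --coverage --inline-logs
```
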
### Code Analysis

#### SonarCloud Analysis
The SonarCloud integration (`.github/workflows/sonarcloud.yml`) provides:

- Static code analysis for C/C++ files
- Code coverage reporting
- Pull request analysis and main branch analysis
- Continuous monitoring of code quality metrics
- Separate analysis configurations for the main branch and pull requests (see the example after this list)

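To illustrate the separate configurations, this sketch passes pull-request parameters or a branch name to the scanner depending on the event. The organization and project key are placeholders, and the exact parameters used by the workflow may differ:

```yaml
# Sketch of a SonarCloud scan step; organization, project key, and parameter
# selection are assumptions for illustration.
- name: SonarCloud scan
  env:
    SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
  run: |
    if [ "${{ github.event_name }}" = "pull_request" ]; then
      EXTRA_ARGS="-Dsonar.pullrequest.key=${{ github.event.number }} \
                  -Dsonar.pullrequest.branch=${{ github.head_ref }} \
                  -Dsonar.pullrequest.base=${{ github.base_ref }}"
    else
      EXTRA_ARGS="-Dsonar.branch.name=${{ github.ref_name }}"
    fi
    sonar-scanner \
      -Dsonar.organization=example-org \
      -Dsonar.projectKey=example_project \
      -Dsonar.cfamily.build-wrapper-output=bw-output \
      $EXTRA_ARGS
```
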
#### Compliance Testing
Compliance checks are implemented in `.github/workflows/compliance.yml` and include the following (see the sketch after this list):

- Codeowners validation
- Devicetree compliance
- Git commit message linting
- Identity checks
- Code style and formatting (Nits)
- Python code linting
- Kernel coding style checks (checkpatch)

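These checks commonly map to modules of Zephyr's `check_compliance.py` script. The step below is an assumption about how such an invocation can look; the script path, module names, and commit range are illustrative and may not match the workflow exactly:

```yaml
# Sketch of a compliance step; script path, module names, and commit range are assumptions.
- name: Run compliance checks
  run: |
    # Check every commit on the PR branch against the selected modules
    ../zephyr/scripts/ci/check_compliance.py \
      -m Codeowners -m Devicetree -m Gitlint -m Identity \
      -m Nits -m Checkpatch \
      --commits origin/main..HEAD
```
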
### Key Features

1. **Modularity**: Each aspect of testing is handled by a dedicated workflow
2. **Comprehensive Coverage**: Combines hardware testing, emulation, and static analysis
3. **Detailed Reporting**: Generates test reports, coverage data, and compliance checks
4. **Flexibility**: Supports different test configurations and target devices
5. **Quality Assurance**: Multiple layers of validation ensure code quality

## Test Results and Artifacts

Each workflow generates specific artifacts:

- Hardware test results and logs
- Coverage reports
- Compliance check outputs
- SonarCloud analysis reports

Artifacts are available in the GitHub Actions interface and can be used for debugging and quality assurance purposes.

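As an example of how such artifacts are typically published, the step below uses `actions/upload-artifact`; the artifact name and paths are placeholders, not the workflow's actual values:

```yaml
# Sketch of publishing test output as a workflow artifact; name and paths are placeholders.
- name: Upload test results
  if: always()   # keep logs even when the tests fail
  uses: actions/upload-artifact@v4
  with:
    name: target-test-results
    path: |
      tests/on_target/results.xml
      tests/on_target/**/*.log
```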