Pytest plugin to run Zephyr tests and collect results.
This repository is not used in production and is still under development. The code for Twister, which is used in Zephyr's CI, can be found here.
Installation from GitHub:
pip install git+https://github.com/zephyrproject-rtos/twister.git
Installation from source:
pip install .
Installation of the project in editable mode:
pip install -e .
Build a wheel package:
pip install build
python -m build --wheel
- Python >= 3.8
- pytest >= 7.0.0
Show all available options:
pytest --help
Run tests:
pytest <PATH_TO_ZEPHYR>/tests/kernel/common -vv --zephyr-base=<PATH_TO_ZEPHYR> --platform=native_posix --log-level=DEBUG
If the ZEPHYR_BASE environment variable is set, the --zephyr-base argument can be omitted.
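The fallback logic can be sketched in a few lines. This is a hypothetical helper, not the plugin's actual API; the function name `resolve_zephyr_base` is invented for illustration:

```python
import os

def resolve_zephyr_base(cli_value=None):
    """Hypothetical sketch: an explicit --zephyr-base value wins,
    otherwise fall back to the ZEPHYR_BASE environment variable."""
    value = cli_value or os.environ.get("ZEPHYR_BASE")
    if not value:
        raise SystemExit("Zephyr base not given: set ZEPHYR_BASE or pass --zephyr-base")
    return value

# Illustrative path only:
os.environ["ZEPHYR_BASE"] = "/path/to/zephyr"
print(resolve_zephyr_base())          # falls back to the environment variable
print(resolve_zephyr_base("/other"))  # explicit value overrides the environment
```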
Show which fixtures and tests would be executed, without executing anything:
pytest tests --setup-plan
List all tests without executing them:
pytest tests --collect-only
Run tests only for specific platforms:
pytest tests --platform=native_posix --platform=nrf52840dk_nrf52840
Provide a directory to search for board configuration files:
pytest tests --board-root=path_to_board_dir
Generate test plan in CSV format:
pytest tests --testplan-csv=testplan.csv --collect-only
Use custom path for test plan in JSON format:
pytest tests --testplan-json=custom_plan.json --collect-only
Use custom path for result report in JSON format:
pytest tests --results-json=custom_name.json
Run tests with given tags (the leading @ is optional and can be omitted):
pytest tests --tags=@tag1,@tag2
Examples of usage:
- not tag1:
  - --tags=~@tag1
- tag1 and tag2:
  - --tags=@tag1 --tags=@tag2
- tag1 or tag2:
  - --tags=@tag1,@tag2
- (tag1 or tag2) and tag3 and not tag4:
  - --tags=@tag1,@tag2 --tags=@tag3 --tags=~@tag4
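The semantics above (comma-separated tags OR together, repeated --tags flags AND together, and ~ negates a tag) can be sketched as a small matcher. This is an illustrative reimplementation of the rules, not the plugin's actual code:

```python
def match_tags(item_tags, tag_filters):
    """Return True if item_tags satisfies every --tags filter group.

    Each element of tag_filters is one --tags value: a comma-separated
    OR group. All groups must match (AND). A leading '~' negates a tag.
    Sketch only; not the plugin's real implementation.
    """
    for group in tag_filters:
        matched = False
        for alt in group.split(","):
            negate = alt.startswith("~")
            tag = alt.lstrip("~@")   # drop the optional ~ and @ prefixes
            hit = tag in item_tags
            if (not hit) if negate else hit:
                matched = True
                break
        if not matched:
            return False
    return True

# (tag1 or tag2) and tag3 and not tag4:
print(match_tags({"tag1", "tag3"}, ["@tag1,@tag2", "@tag3", "~@tag4"]))  # True
print(match_tags({"tag4"}, ["~@tag4"]))                                  # False
```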
Scan connected devices and create hardware map:
twister_tools --generate-hardware-map hardware_map.yaml
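For reference, a generated hardware map is a YAML list of device entries. The entry below is illustrative only; the exact keys and values depend on the connected hardware and should be treated as an assumption, not a guarantee:

```yaml
# Illustrative hardware map entry (hypothetical values):
- connected: true
  id: "0001234567"
  platform: nrf52840dk_nrf52840
  product: J-Link
  runner: nrfjprog
  serial: /dev/ttyACM0
```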
Scan connected devices and list hardware map:
twister_tools --list-hardware-map
List all platforms:
twister_tools --list-platforms
List default platforms only:
twister_tools --list-platforms --default-only
Our plugin requires the pytest-subtests plugin; however, we modify the behavior of the "subtests" fixture introduced by that plugin. The original implementation is based on the subtest concept from the unittest framework, where such items are counted and reported in a peculiar way.
Because we modify the behavior of subtests, our plugin can affect users who rely on unittest-based subtests in other projects: after adding our plugin to an existing environment, the reporting of their existing subtests can change. To mitigate such issues, we recommend running different projects in separate virtual environments.
Additional context: Twister defines two levels of "test items":
- test suites (configurations) that correspond to built test applications
- test cases that correspond to individual ztest test cases within test applications using the ztest framework.
In our plugin, we modified the reporting and counting of subtests to match how Twister does it. Test suites are "tests" in pytest nomenclature, and ztest test cases are based on subtests, but they do not follow the original unittest rules. For example, in unittest, when a subtest fails it is counted towards failing tests, but when it passes it is not counted towards passing tests. In our implementation, tests and subtests have their own counters, i.e. subtest counts do not "leak" into test counts.