Contribute
Introduction
Feel free and welcome to contribute to this project. You can start by filing issues and ideas for improvement in the GitHub tracker. Before creating a new issue you might want to check the existing issues to prevent filing a duplicate. Important issues affecting many users are marked with the known issue label.
Our favorite thoughts from The Zen of Python:
Beautiful is better than ugly.
Simple is better than complex.
Readability counts.
We respect the PEP8 Style Guide for Python Code. Here are a couple of recommendations to keep in mind when writing code:
Maximum line length is 99 for code and 72 for documentation.
Comments should be complete sentences.
The first word should be capitalized (unless it is an identifier).
When using hanging indent, the first line should be empty.
The closing brace/bracket/parenthesis on multiline constructs is under the first non-whitespace character of the last line.
When generating user messages use whole sentences with the first word capitalized and enclose any names in single quotes:
self.warn(f"File '{path}' not found.")
Commits
It is challenging to be both concise and descriptive, but that is what a well-written summary should do. Consider the commit message as something that will be pasted into release notes:
The first line should have up to 50 characters.
Complete sentence with the first word capitalized.
Should concisely describe the purpose of the patch.
Do not prefix the message with file or module names.
Other details should be separated by a blank line.
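For example, a commit message following these guidelines might look like this (a made-up illustration, not an actual commit):
Add guest listing to the status command

Extend the 'tmt status' output with information about provisioned
guests so that leftover machines can be spotted easily. Update the
test coverage and documentation accordingly.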
Why should I care?
It helps others (and yourself) find relevant commits quickly.
The summary line will be re-used later (e.g. for rpm changelog).
Some tools do not handle wrapping, so it is then hard to read.
You will make the maintainers happy to read beautiful commits :)
You can get some more context in the Stack Overflow article.
Develop
In order to experiment, play with the latest bits and develop improvements, it is best to use a virtual environment. Make sure that you have all required packages installed on your box:
make develop
Create a development virtual environment with hatch:
git clone https://github.com/teemtee/tmt
cd tmt
hatch env create dev
Enter the environment by running:
hatch -e dev shell
When interacting from within the development environment with services using internal certificates, you need to export the following environment variable:
export REQUESTS_CA_BUNDLE=/etc/pki/tls/certs/ca-bundle.crt
Install the pre-commit script to run all available checks for
your commits to the project:
pre-commit install
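Once installed, the checks run automatically for each commit. You can also run all checks manually against the whole code base, for example before submitting a larger change:
pre-commit run --all-files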
Tests
Every code change should be accompanied by tests covering the new feature or affected code area. It’s possible to write new tests or extend the existing ones.
If writing a test is not feasible for you, explain the reason in
the pull request. If possible, the maintainers will help with
creating needed test coverage. You might also want to add the
help wanted and tests needed labels to bring a bit more
attention to your pull request.
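New functional tests are usually beakerlib shell scripts accompanied by fmf metadata. Below is only a rough sketch with made-up phase content, see the existing tests in the repository for the real conventions:
#!/bin/bash
# A rough sketch of a beakerlib test script, the content is illustrative only.
. /usr/share/beakerlib/beakerlib.sh || exit 1

rlJournalStart
    rlPhaseStartSetup
        rlRun "tmp=\$(mktemp -d)" 0 "Create a temporary directory"
        rlRun "pushd $tmp"
    rlPhaseEnd

    rlPhaseStartTest
        rlRun -s "tmt --help" 0 "Show the command line help"
        rlAssertGrep "Usage:" $rlRun_LOG
    rlPhaseEnd

    rlPhaseStartCleanup
        rlRun "popd"
        rlRun "rm -rf $tmp" 0 "Remove the temporary directory"
    rlPhaseEnd
rlJournalEnd
rlJournalPrintText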
Run the default set of tests directly on your localhost:
tmt run
Run selected tests or plans in verbose mode:
tmt run --verbose plan --name basic
tmt run -v test -n smoke
You might want to set some useful environment variables when
working on tmt tests, for example TMT_FEELING_SAFE to
allow the local provision method or TMT_SHOW_TRACEBACK to
show the full details for all failures. Consider installing the
direnv command which can take care of these for you.
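For example, with direnv a minimal .envrc in the project directory could export these variables (the values shown are just an illustration, check the documentation of each variable for the accepted values):
export TMT_FEELING_SAFE=1
export TMT_SHOW_TRACEBACK=1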
Unit Tests
To run unit tests in the hatch environment using pytest and generate a coverage report:
make coverage
To see all available scripts for running tests in hatch test virtual environments:
hatch env show test
For example, to run the ‘unit’ script:
hatch run test:unit
When running tests using hatch, there are multiple virtual environments available, each using a different Python interpreter (generally the lowest and highest version supported). To run the tests in all environments, install the required Python versions. For example:
dnf install python3.9 python3.11
Note
When adding new unit tests, do not create class-based tests derived from
unittest.TestCase class. Such classes do not play well with Pytest’s
fixtures, see https://docs.pytest.org/en/7.1.x/how-to/unittest.html for
details.
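In other words, prefer plain test functions which accept fixtures as parameters. A minimal sketch, with a made-up fixture and assertion just for illustration:
import pytest


@pytest.fixture
def sample_names():
    """Provide a few sample names, made up for this illustration."""
    return ["one", "two", "three"]


def test_sample_names(sample_names):
    # A plain function can simply accept the fixture as a parameter
    assert len(sample_names) == 3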
Provision Methods
Tests which exercise multiple provision methods should use the
PROVISION_HOW environment variable to select which provision
method should be exercised during their execution. This variable
is likely to have local set as the default value in the test
script to execute directly on the test runner as the default
scenario. If a test does not support the local provision
method make sure to use the provision-only tag so that the
test in question is excluded from the regular plans.
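For illustration only (the exact usage in existing tests may differ), a test script following this convention could default to the local method and pass the choice on to the provision step:
# Fall back to the local provision method when nothing else is requested
PROVISION_HOW=${PROVISION_HOW:-local}
tmt run --all provision --how "$PROVISION_HOW"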
The following tags can be used to enable a given test under the respective provision method plan:
- provision-artemis
For tests checking the artemis plugin functionality.
- provision-beaker
For tests checking the beaker plugin functionality using the mrack plugin.
- provision-connect
For tests checking the connect plugin functionality.
- provision-container
For tests checking the container provision method using the podman plugin.
- provision-virtual
For tests checking the virtual provision method using the testcloud plugin.
- provision-ssh
Tests which are not tied to a specific provision method but should be executed for all provision methods which are using ssh to connect to guests.
- provision-only
Used to mark tests which are suitable to be run only under specific provision methods. These will be excluded from regular plans.
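To attach these tags to a test, list them under the tag key in the test metadata, for example in a hypothetical main.fmf snippet:
tag:
  - provision-only
  - provision-container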
Images
Tests which exercise the container provisioning plugin with various guest environments should use the custom-built set of container images rather than using the upstream ones directly. We build custom images to have better control over the initial environment setup, especially when it comes to essential requirements and assumptions tmt makes about the guest setup. The naming scheme also provides better information about the content of these images when compared to the very varied upstream locations.
Naming scheme
All our test images follow a simple naming pattern:
localhost/tmt/tests/BACKEND/DISTRIBUTION/RELEASE/EXTRAS:TAG
- localhost/tmt/tests
To make it clear the image was built locally, it is owned by tmt, and it is not packaging tmt but serves for testing purposes only.
- BACKEND
There are various kinds of “images”, the most well-known ones would be Docker/Podman images, their names would contain the container flag, and QCOW2 images for VMs which would be labeled with virtual.
- DISTRIBUTION
A lower-cased name of the Linux distribution hosted in the image: fedora, ubuntu, alpine, etc.
- RELEASE
A release of the DISTRIBUTION: 7 for CentOS 7, stream9 for CentOS Stream 9, or 40, rawhide and even coreos for Fedora.
- EXTRAS
Additional flags describing a “flavor” of the image: upstream images are identical to an upstream image, adding no special setup on top of the upstream; unprivileged images come with password-less sudo setup and may be used when unprivileged access is part of the test; ostree images are Fedora CoreOS images that simulate being deployed by ostree.
- TAG
Usually latest as in “the latest image for this distro, release and extra flags”.
Note
So far we do not have much use for other tags besides latest. The stable tag used for Fedora CoreOS images will probably go away in favor of latest.
For example, the following images can be found:
# Latest Alpine, with added Bash to simulate proper essential setup:
localhost/tmt/tests/container/alpine
# Various CentOS releases:
localhost/tmt/tests/container/centos/7
localhost/tmt/tests/container/centos/stream9
# Fedora rawhide, with dnf5 pre-installed:
localhost/tmt/tests/container/fedora/rawhide
# Same, but with password-less sudo set up:
localhost/tmt/tests/container/fedora/rawhide/unprivileged
To build these images, run the following:
# Build all images...
make images-tests
# ... or just a single one:
make images-tests/tmt/tests/container/fedora/rawhide:latest
Tests that need to use various container images should trigger this command before running the actual test cases:
rlRun "make -C images-tests"
To list built container images, run the following:
podman images | grep 'localhost/tmt/tests/' | sort
To remove these images from your local system, run the following:
make clean-test-images
Docs
When submitting a change affecting user experience it’s always good to include the respective documentation. You can add or update the Metadata Specification, extend the Examples or write a new chapter for the user Guide.
tmt documentation is written with reStructuredText and built with Sphinx. Various features of both reST and Sphinx are used widely in tmt documentation, from inline markup to references. Feel free to use them as well to link new or updated documentation to relevant parts, to highlight important points, or to provide helpful examples.
A couple of best practices when updating documentation:
When referring to a plugin, its options or documentation, prefer a reference to /plugins/STEP/PLUGIN rather than to the older /spec/plans/STEP/PLUGIN:
# This is good:
:ref:`/plugins/prepare/ansible`
# If the user-facing plugin name differs from the Python one,
# or if you need to capitalize the first letter:
:ref:`Beaker</plugins/provision/beaker>`
# This should be avoided:
:ref:`/spec/plans/prepare/ansible`
Design the plugin docstrings and help texts as if they are to be rendered by Sphinx, i.e. make use of ReST goodies: literals for literals - metavars, values, names of environment variables, commands, keys, etc., code-block for blocks of code or examples. It leads to better HTML docs and tmt has a nice CLI renderer as well, therefore there is no need to compromise for the sake of the CLI.
Use full sentences, i.e. capital letters at the beginning & a full stop at the end.
Use Python multiline strings rather than joining multiple strings over several lines. The latter often leads to leading and/or trailing whitespace characters that are easy to miss.
The plugin docstring provides the bulk of its CLI help and HTML documentation. It should describe what the plugin does.
Use cases and keys other than the trivial ones deserve an example or two.
Unless there’s an important difference, describe the plugin’s configuration in terms of fmf rather than CLI. It is easy to map fmf to CLI options, and fmf makes a better example for someone writing fmf files.
When referring to plugin configuration in user-facing docs, speak about “keys”: “the playbook key of the prepare/ansible plugin”. Keys are mapped 1:1 to CLI options, let’s make sure we avoid polluting docs with “fields”, “settings” and other synonyms.
A metavar should represent the semantic of the expected value, i.e. --file PATH is better than --file FILE, --playbook PATH|URL is better than --playbook PLAYBOOK.
If there is a default value, it belongs to the default= parameter of tmt.utils.field(), and the help text should not mention it because the “Default is …” sentence can be easily added automatically and rendered correctly with show_default=True.
When showing an example of plugin configuration, include also an example for the command line:
Run a single playbook on the guest:
.. code-block:: yaml

    prepare:
        how: ansible
        playbook: ansible/packages.yml

.. code-block:: shell

    prepare --how ansible --playbook ansible/packages.yml

Do not use the :caption: directive of code-block, it is understood by Sphinx only and the docutils package cannot handle it.
Examples
By default, examples provided in the specification stories are
rendered as yaml. In order to select a different syntax
highlighting schema, add # syntax: <format>, for example:
# syntax: shell
Building documentation is then quite straightforward:
make docs
Find the resulting html pages under the docs/_build/html
folder.
Visual themes
Use the TMT_DOCS_THEME variable to easily pick a custom theme.
If specified, make docs would use this theme for documentation
rendering by Sphinx. The theme must be installed manually, make
docs will not do so. The variable expects two strings, separated by
a colon (:): the theme package name and the theme name.
# Sphinx book theme, sphinx-book-theme:
TMT_DOCS_THEME="sphinx_book_theme:sphinx_book_theme" make docs
# Renku theme, renku-sphinx-theme - note that package name
# and theme name are *not* the same string:
TMT_DOCS_THEME="renku_sphinx_theme:renku" make docs
By default, docs/_static/tmt-custom.css provides additional tweaks
to the documentation theme. Use the TMT_DOCS_CUSTOM_HTML_STYLE
variable to include an additional file:
$ cat docs/_static/custom.local.css
/* Make content wider on my wider screen */
.wy-nav-content {
max-width: 1200px !important;
}
TMT_DOCS_CUSTOM_HTML_STYLE=custom.local.css make docs
Note
The custom CSS file specified by TMT_DOCS_CUSTOM_HTML_STYLE
is included before the built-in tmt-custom.css, therefore to
override the theme CSS it is recommended to add the !important flag.
tldr pages
The tldr pages are maintained in the central tldr-pages
repository. To modify existing pages or add new ones, submit your
changes directly there by following their contribution
guidelines.
Translations of existing pages into other languages are welcomed. If you’d like to help translate pages, please follow the same contribution process described above.
Note
Changes made directly to documentation in this repository will not be reflected in the tldr pages collection.
Pull Requests
When submitting a new pull request which is not completely ready
for merging but you would like to get early feedback on the
concept, use the GitHub feature to mark it as a Draft rather
than using the WIP prefix in the summary.
During the pull request review it is recommended to add new commits with your changes on top of the branch instead of amending the original commit and doing a force push. This will make it easier for the reviewers to see what has recently changed.
It’s good to keep the pull request up-to-date with the main
branch. Rebase regularly, or use the /packit build command in a
pull request comment if there were significant changes on the
default branch, otherwise newly added tests might cause unexpected
and irrelevant failures in your test jobs.
Once the pull request has been successfully reviewed and all tests
passed, please rebase on the latest main branch content and
squash the changes into a single commit. Use multiple commits to
group relevant code changes if the pull request is too large for a
single commit.
If the pull request addresses an existing issue, mention it using one of the automatically parsed formats so that it is linked to it, for example:
Fix #1234.
By default only a core set of tests is executed against a newly
created pull request and its updates to verify basic sanity of the
change. Once the pull request content is ready for thorough
testing, add the full test label and make sure that the
discuss label is not present. All future changes of the pull
request will then be tested with the full test coverage. For
documentation-only changes the full test suite is not required.
Checklist
The following checklist template is automatically added to the new pull request description to easily track progress of the implementation and prevent forgetting about essential steps to be completed before it is merged. Feel free to remove those which are irrelevant for your change.
Pull Request Checklist
* [ ] implement the feature
* [ ] write the documentation
* [ ] extend the test coverage
* [ ] update the specification
* [ ] adjust plugin docstring
* [ ] modify the json schema
* [ ] mention the version
* [ ] include a release note
The version should be mentioned in the specification and a release note should be included when a new essential feature is added or an important change is introduced so that users can easily check whether given functionality is already available in their package:
.. versionadded:: 1.23
Review
Code review is an essential part of the workflow. It ensures good quality of the code and prevents introducing regressions, but it also brings some additional benefits: By reading code written by others you can learn new stuff and get inspired for your own code. Each completed pull request review helps you, little by little, to get familiar with a larger part of the project code and empowers you to contribute more easily in the future.
For instructions on how to try a change locally on your laptop see the Develop section. Basically just enable the development environment and check out the pull request branch, or use the GitHub CLI to check out code from a fork repository:
hatch -e dev shell # enable the dev environment
git checkout the-feature # if branch is in the tmt repo
gh pr checkout 1234 # check out branch from a fork
It is also possible to directly install packages freshly built by Packit for the given pull request. See the respective Packit check for detailed installation instructions.
Note that you don’t have to always read the whole change. There are several ways to provide feedback on the pull request:
check how the documentation would be rendered in the docs/readthedocs.org pull request check, look for typos, identify wording which is confusing or not clear, point out that documentation is completely missing for some area
point out a forgotten item from the Checklist, for example suggest writing a release note for a new significant feature which should be highlighted to users
verify just the functionality, make sure it works as expected and confirm it in a short comment, provide a simple reproducer when something is broken
review only the newly added test case, verify that the test works as expected and properly verifies the functionality
Even a partial review which happens sooner is beneficial and saves time. Every single comment helps to improve and move the project forward. No question is a dumb question. All feedback counts!
Merging
Pull request merging is done by one of the maintainers who have a good overview of the whole code. The maintainer who will take care of the process will assign themselves to the pull request. Before merging it’s good to check the following:
New test coverage added if appropriate, all tests passed
Documentation has been added or updated where appropriate
Commit messages are sane, commits are reasonably squashed
At least one positive review provided by the maintainers
Merge commits are not used, rebase on the main branch instead
Pull requests which should not or cannot be merged are marked with
the blocked label. For complex topics which need more eyes to
review and discuss before merging use the discuss label.
Makefile
There are several Makefile targets defined to make the common daily tasks easy & efficient:
- make test
Execute the unit test suite.
- make smoke
Perform quick basic functionality test.
- make coverage
Run the test suite under coverage and report results.
- make docs
Build documentation.
- make packages
Build rpm and srpm packages.
- make images
Build container images.
- make tags
Create or update the Vim tags file for quick searching. You might want to use set tags=./tags; in your .vimrc to enable parent directory search for the tags file as well.
- make clean
Cleanup all temporary files.
Release
The tmt project is released monthly. If there are urgent
changes which need to be released quickly, a hotfix release may be
created to address the important problem sooner.
Regular
Follow the steps below to create a new major or minor release:
Update overview.rst with new contributors since the last release
Review the release notes in releases.rst, update as needed
Add a Release x.y.z commit, empty if needed: git commit --allow-empty -m "Release x.y.z"
Create a pull request with the commit, ensure tests pass, merge it
Move the fedora branch to point to the new release
Tag the commit with x.y.z, push tags: git push --tags
Create a new github release based on the tag above:
- Mention the most important changes in the name, do not include the version
- Use ; as a delimiter when multiple items are mentioned in the name
- Push the “Generate release notes” button to create the content
- Prepend the “See the release notes for the list of interesting changes.” line
Publish the release, check Fedora pull requests, make sure tests pass and merge
Finally, if everything went well:
Close the corresponding release milestone
Once the non-development copr build is completed, move the quay branch to point to the release commit as well to build fresh container images.
Handle manually whatever did not go well:
If the automation triggered by publishing the new github release was not successful, publish the fresh code to the pypi repository manually using make wheel && make upload.
If there was a problem with creating Fedora pull requests, you can trigger them manually using /packit propose-downstream in any open issue.
Hotfix
The following steps should be followed when an important urgent fix needs to be released before the regular schedule:
Create a new branch from the fedora branch
Use git cherry-pick to apply the selected change
Mention the hotfix release on the release page
Add a Release x.y.z commit, empty if needed: git commit --allow-empty -m "Release x.y.z"
Create a new pull request with the target branch set to fedora
Make sure that tests pass and merge the pull request
Tag the commit and publish the release in the same way as for a regular release
Create a pull request with the hotfix release notes changes