# Contributing to jobq
Thank you for your interest in contributing to this project!
We appreciate issue reports, pull requests for code and documentation, as well as any project-related communication through GitHub Discussions.
## Prerequisites
We use the following development tools for all Python code:

- `uv`
- `pre-commit`
- `ruff` (also part of the `pre-commit` hooks)
- `pytest`
The development version of Python is 3.12, so please make sure to use that version when developing. The client-side code is tested against Python 3.10 and 3.11, and the server-side code against Python 3.11, so please take this into account when using recent Python features.
To get started with development, create a fork of the GitHub repository and clone it to your local machine.
Please submit your changes as pull requests against the `main` branch.
## Working on the client-side code (decorators & CLI)
The `client/` directory contains the source code for the `jobq` CLI and Python decorators.
### Development
If you want to contribute to the client-side code, you can follow these steps:

1. Create a virtual environment and install the development dependencies.
2. Run the Pytest test suite.
3. After making your changes, verify they adhere to our Python code style by running `pre-commit`. You can also set up Git hooks through `pre-commit` to perform these checks automatically.
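As a sketch of these steps (the `.[dev]` extra and the exact commands are assumptions; consult the repository for the canonical invocations):

```shell
cd client

# 1. Create a virtual environment and install development dependencies
uv venv
uv pip install -e ".[dev]"

# 2. Run the test suite
uv run pytest

# 3. Run the code style checks, and optionally install them as Git hooks
uv run pre-commit run --all-files
uv run pre-commit install
```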
### Regenerating the API client
The `src/openapi_client` folder contains an automatically generated API client for the backend API.
If you make changes to the backend API, you can regenerate the API client using the provided helper script.
This will regenerate the API client in the `client/src/openapi_client` directory from a currently running FastAPI server using `openapi-generator-cli`.
Note that you will need to have the backend server running and accessible at `http://localhost:8000` in order to generate the client code.
The script automatically removes unnecessary files and reformats the generated code according to our code style.
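For illustration only (the repository ships its own helper script for this, so the flags below are assumptions about what it does under the hood), a bare `openapi-generator-cli` invocation against a running server generally looks like:

```shell
# Hypothetical sketch of what the helper script automates:
# fetch the schema from the running backend and generate a Python client
openapi-generator-cli generate \
  -i http://localhost:8000/openapi.json \
  -g python \
  -o client/src/openapi_client
```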
### Publishing to PyPI
The `jobq` package is published to PyPI through a GitHub Actions workflow when a new release is created.
## Working on the server-side code (API)
The server-side code under the `backend/` folder is written in Python and uses the FastAPI framework.
You can follow the same instructions as for the client-side code to set up a development environment.
### Running the server
Since the code can load Kubernetes credentials either from an in-cluster Kubernetes service account or from a kubeconfig file, you can run the server locally without having to deploy it to a Kubernetes cluster.
To run the server locally in development mode (accessible at `http://localhost:8000`), start the development server from the `backend/` folder.
FastAPI will automatically reload the server when you make changes to the code.
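One way to start the server, assuming the FastAPI CLI (`fastapi dev`) is available in the environment (the entrypoint discovery is an assumption; adjust to the actual app module):

```shell
# Start a dev server with auto-reload on http://localhost:8000
uv run fastapi dev
```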
### Testing
Tests are written with pytest.
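Assuming the `uv`-based setup described above, a typical invocation would be:

```shell
# Run the test suite from the backend/ folder
uv run pytest
```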
The end-to-end tests (under `tests/e2e`) deploy a short-lived Kubernetes cluster and run the tests against it.
You will need to have a few tools installed (such as `minikube`) so the test harness can spin up a Kubernetes cluster.
After the tests have been run, the cluster will be torn down (even if a test fails).
If you manually abort a test run (which prevents the automatic deletion), you can use `minikube profile list` to find the name of the cluster (`integration-test-<timestamp>`) and then call `minikube delete -p <profile-name>` to delete the cluster.
If you want to run the tests against an existing cluster (which greatly speeds things up), you can provide the name of the context to use through the `E2E_K8S_CONTEXT` environment variable.
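For example, assuming a kubeconfig context named `minikube`:

```shell
# Run the e2e tests against an existing cluster via the named context
E2E_K8S_CONTEXT=minikube uv run pytest tests/e2e
```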
**Warning**
Running the e2e tests against an existing cluster will clutter the active namespace, so please proceed with caution!
The test harness attempts to install Kueue and the Kuberay operator into the cluster, so you might run into conflicts if you already have them deployed.
If you want to skip the end-to-end tests (e.g., to speed up test execution), you can deselect them when invoking pytest.
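One common way to do this, assuming no dedicated pytest marker exists for the e2e tests, is to ignore their directory:

```shell
# Skip the end-to-end tests by excluding their directory
uv run pytest --ignore=tests/e2e
```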
### Publishing Docker images
Docker images are published to GitHub Container Registry through GitHub Actions.
Images are tagged according to the following patterns:

- `<version>` for release versions
- `pr-<pr_number>` for pull requests
- `main` for the `main` branch
- `<branch_name>` for other branches
The CI workflow also attaches build attestations to the images, which can be used to verify the integrity of the images.
## Updating dependencies
Dependencies should stay locked for as long as possible, ideally for a whole release. If you have to update a dependency during development, you should do the following:

- If it is a core dependency needed for the package, add it to the `dependencies` section in the `pyproject.toml`.
- In case of a development dependency, add it to the `dev` section of the `project.optional-dependencies` table instead.
- Dependencies needed for documentation generation are found in the `docs` section of `project.optional-dependencies`.
After adding the dependency in either of these sections, run the helper script `hack/lock-deps.sh` (which in turn uses `uv pip compile`) to pin all dependencies again:
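From the repository root:

```shell
# Re-pin all dependencies using the helper script (wraps `uv pip compile`)
./hack/lock-deps.sh
```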
In addition to these manual steps, we also provide `pre-commit` hooks that automatically lock the dependencies whenever `pyproject.toml` is changed.
Selective package upgrades for existing dependencies are also handled by the helper script above. If you want to update the Pydantic dependency, for example, simply run:
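The exact interface of the script is not shown here; assuming it accepts the names of packages to upgrade as arguments, the call might look like:

```shell
# Hypothetical invocation; check hack/lock-deps.sh for the actual interface
./hack/lock-deps.sh pydantic
```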
**Tip**
Since the official development version is Python 3.12, please run the above commands in a virtual environment with Python 3.12.
## Working on documentation
Improvements or additions to the project's documentation are highly appreciated.
The documentation is based on the MkDocs and Material for MkDocs (`mkdocs-material`) projects; see their homepages for in-depth guides on their features and usage.
We use the NumPy documentation style for Python docstrings.
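A minimal NumPy-style docstring, for illustration:

```python
def add(a: int, b: int) -> int:
    """Add two integers.

    Parameters
    ----------
    a : int
        The first operand.
    b : int
        The second operand.

    Returns
    -------
    int
        The sum of ``a`` and ``b``.
    """
    return a + b
```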
Documentation dependencies are defined in the root-level `pyproject.toml` file.
You can start a local documentation server with `uv run mkdocs serve` (MkDocs listens on port 8000 by default), or generate a static build under the `public/` folder using `uv run mkdocs build`.
In order to maintain documentation for multiple versions of this library, we use the mike tool to create individual documentation builds per version.
The GitHub CI pipeline automatically invokes `mike` as part of the release process with the correct version and updates the GitHub Pages branch for the project.
## Contributions under repository license
Any contributions you make need to be under the same Apache 2.0 License that covers the project.
See the GitHub Terms of Service for more details on this inbound=outbound policy:
> Whenever you add Content to a repository containing notice of a license, you license that Content under the same terms, and you agree that you have the right to license that Content under those terms. If you have a separate agreement to license that Content under different terms, such as a contributor license agreement, that agreement will supersede.
>
> Isn't this just how it works already? Yep. This is widely accepted as the norm in the open-source community; it's commonly referred to by the shorthand "inbound=outbound". We're just making it explicit.