```shell
pip install pytest-in-docker
```

Or with uv:

```shell
uv add pytest-in-docker
```

Decorate any test function and it runs inside a Docker container:
```python
from pytest_in_docker import in_container
import platform


@in_container("python:alpine")
def test_runs_on_alpine():
    info = platform.freedesktop_os_release()  # platform is available in the container.
    assert info["ID"] == "alpine"
```

Then run pytest as usual:

```shell
pytest
```

The function is serialized with cloudpickle, sent to a fresh `python:alpine` container, and the result is reported back to your terminal.
The marker API integrates with all standard pytest features: fixtures, parametrize, and reporting work as expected.
Point to a directory containing a Dockerfile and provide a tag. The image is built before the test runs:
```python
import subprocess

from pytest_in_docker import in_container


@in_container(path="./docker", tag="my-test-image:latest")
def test_custom_image():
    result = subprocess.run(["cat", "/etc/os-release"], capture_output=True, text=True)
    assert "alpine" in result.stdout.lower()
```

This works with the marker too:

```python
import pytest


@pytest.mark.in_container(path="./docker", tag="my-test-image:latest")
def test_custom_image_with_marker():
    ...
```

Combine `@pytest.mark.parametrize` with the marker to run the same test across
different containers. Use `image` as the parameter name: the plugin picks it up
automatically:
```python
import platform

import pytest


@pytest.mark.parametrize(
    ("image", "expected_id"),
    [
        ("python:alpine", "alpine"),
        ("python:slim", "debian"),
    ],
)
@pytest.mark.in_container()
def test_across_distros(image: str, expected_id: str):
    info = platform.freedesktop_os_release()
    assert info["ID"].lower() == expected_id
```

When `@pytest.mark.in_container()` is called with no arguments, it reads the `image`
parameter from `@pytest.mark.parametrize`. This lets you build a compatibility matrix
with zero boilerplate.
When you need to customise the container beyond what the other modes offer (environment variables, volumes, extra ports), pass a factory:
```python
import os
from contextlib import contextmanager
from typing import Iterator

import pytest
from testcontainers.core.container import DockerContainer


@contextmanager
def my_container(port: int) -> Iterator[DockerContainer]:
    with (
        DockerContainer("python:alpine")
        .with_command("sleep infinity")
        .with_exposed_ports(port)
        .with_env("APP_ENV", "test") as container
    ):
        # Entering the `with` block already starts the container.
        yield container


@pytest.mark.in_container(factory=my_container)
def test_env_is_set():
    assert os.environ["APP_ENV"] == "test"
```

A factory is a callable that accepts a `port: int` argument and returns a context
manager yielding an already-started `DockerContainer`. The framework passes the
communication port automatically: the factory just needs to expose it and run `sleep infinity`.
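That contract can be expressed as a `Protocol` and exercised with a stand-in, so the shape is clear without Docker. `ContainerFactory`, `dummy_factory`, and `run_with_factory` are hypothetical names for illustration; the framework's actual call site will differ:

```python
from contextlib import contextmanager
from typing import ContextManager, Iterator, Protocol


class ContainerFactory(Protocol):
    """What the framework expects: call with the communication port,
    get a context manager yielding a started container."""

    def __call__(self, port: int) -> ContextManager: ...


@contextmanager
def dummy_factory(port: int) -> Iterator[dict]:
    # Stand-in for a started DockerContainer, so this sketch runs anywhere.
    yield {"image": "python:alpine", "exposed_port": port}


def run_with_factory(factory: ContainerFactory, port: int = 51337):
    # Sketch of the framework side: enter the factory's context manager
    # with the port it chose, use the container, exit to clean up.
    with factory(port) as container:
        return container


print(run_with_factory(dummy_factory))  # {'image': 'python:alpine', 'exposed_port': 51337}
```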
Tests running inside containers default to a 30-second timeout. If pytest-timeout is installed, its `timeout` ini setting and `@pytest.mark.timeout` marker are respected automatically:
```python
import pytest


@pytest.mark.timeout(60)
@pytest.mark.in_container("python:alpine")
def test_slow_operation():
    ...
```

When a decorated test runs:
```
Host (pytest)                              Docker Container
-------------                              ----------------
1. Spin up container        ------>        python:alpine starts
2. Install deps             ------>        pip install cloudpickle rpyc pytest
3. Start RPyC server        ------>        listening on port 51337
4. Serialize test (cloudpickle)
5. Send bytes over RPyC     ------>        deserialize + execute
        <------  result (pass/fail/exception)  ------
6. Container stops
```
How serialization works: cloudpickle serializes your test function, including closures, lambdas, and locally-defined helpers, into bytes on the host. Those bytes are sent to the container over RPyC, deserialized with the standard `pickle` module, and executed natively.
This means:
- Your test code runs natively inside the container, not through `docker exec` or shell commands
- Full Python semantics: imports, exceptions, and return values all work naturally
- Closures and lambdas serialize correctly: use helper functions, captured variables, and dynamic code freely
- pytest assertion introspection still works on the host side for reporting
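A minimal round-trip sketch of that mechanism, run in a single process for illustration (requires `cloudpickle`; the real plugin additionally ships the bytes over RPyC into the container):

```python
import pickle

import cloudpickle  # third-party: what the host side uses to serialize


def make_check(expected_id: str):
    # A closure: `check` captures `expected_id`, which plain pickle
    # cannot serialize but cloudpickle can.
    def check(os_id: str) -> bool:
        return os_id == expected_id

    return check


# "Host" side: serialize the closure into bytes.
payload = cloudpickle.dumps(make_check("alpine"))

# "Container" side: the standard pickle module is enough to deserialize
# cloudpickle's output and run it natively.
restored = pickle.loads(payload)
print(restored("alpine"))  # True
print(restored("debian"))  # False
```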
