Tests using benchmark fixture from pytest-benchmark should be marked as unsafe #104

Description

@ngoldbaum

Currently this is what happens if you run a test that uses the benchmark fixture with --parallel-threads:

goldbaum at Nathans-MBP in ~/Documents/cryptography on free-threaded-support!
(tests-parallel) ± pytest --parallel-threads=4 -svx --pdb
============================= test session starts ==============================
platform darwin -- Python 3.14.0rc1, pytest-8.4.1, pluggy-1.6.0 -- /Users/goldbaum/Documents/cryptography/.nox/tests-parallel/bin/python3
cachedir: .pytest_cache
benchmark: 5.1.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
OpenSSL: OpenSSL 3.5.1 1 Jul 2025
FIPS Enabled: False
rootdir: /Users/goldbaum/Documents/cryptography
configfile: pyproject.toml
plugins: run-parallel-0.5.0, xdist-3.8.0, cov-6.2.1, benchmark-5.1.0
collected 3546 items
Collected 3487 items to run in parallel

tests/bench/test_aead.py::test_chacha20poly1305_encrypt PARALLEL FAILED  [  0%]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> traceback >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

benchmark = <pytest_benchmark.fixture.BenchmarkFixture object at 0x11d6d2f90>

    @pytest.mark.skipif(
        not _aead_supported(ChaCha20Poly1305),
        reason="Requires OpenSSL with ChaCha20Poly1305 support",
    )
    def test_chacha20poly1305_encrypt(benchmark):
        chacha = ChaCha20Poly1305(b"\x00" * 32)
>       benchmark(chacha.encrypt, b"\x00" * 12, b"hello world plaintext", b"")

tests/bench/test_aead.py:31:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <pytest_benchmark.fixture.BenchmarkFixture object at 0x11d6d2f90>
function_to_benchmark = <built-in method encrypt of cryptography.hazmat.bindings._rust.openssl.aead.ChaCha20Poly1305 object at 0x11bea0750>
args = (b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', b'hello world plaintext', b'')
kwargs = {}

    def __call__(self, function_to_benchmark, *args, **kwargs):
        if self._mode:
            self.has_error = True
>           raise FixtureAlreadyUsed(f'Fixture can only be used once. Previously it was used in {self._mode} mode.')
E           pytest_benchmark.fixture.FixtureAlreadyUsed: Fixture can only be used once. Previously it was used in benchmark(...) mode.

.nox/tests-parallel/lib/python3.14/site-packages/pytest_benchmark/fixture.py:153: FixtureAlreadyUsed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> entering PDB >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> PDB post_mortem >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> /Users/goldbaum/Documents/cryptography/.nox/tests-parallel/lib/python3.14/site-packages/pytest_benchmark/fixture.py(153)__call__()
-> raise FixtureAlreadyUsed(f'Fixture can only be used once. Previously it was used in {self._mode} mode.')
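
For reference, the failure doesn't depend on cryptography at all. Here is a hypothetical minimal reproduction (file name and contents are my own sketch, not from the repo above), assuming pytest-benchmark and pytest-run-parallel are both installed:

    # test_benchmark_repro.py -- hypothetical standalone reproduction
    def test_sum_benchmark(benchmark):
        # Under --parallel-threads=N the same test body runs in N threads,
        # but the function-scoped benchmark fixture is only set up once, so
        # (going by the traceback above) the first thread's call sets _mode
        # and every later call raises FixtureAlreadyUsed.
        benchmark(sum, range(1000))

Running it with pytest --parallel-threads=4 test_benchmark_repro.py should produce the same traceback as above.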

I tried the simplest thing, just adding "benchmark" to THREAD_UNSAFE_FIXTURES, but that didn't seem to work.
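
In the meantime, here is a per-test workaround sketch, assuming pytest-run-parallel's thread_unsafe marker applies here the same way it does for other thread-unsafe tests (the skipif from the original test is omitted for brevity):

    import pytest

    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    @pytest.mark.thread_unsafe  # run this benchmark on a single thread only
    def test_chacha20poly1305_encrypt(benchmark):
        chacha = ChaCha20Poly1305(b"\x00" * 32)
        benchmark(chacha.encrypt, b"\x00" * 12, b"hello world plaintext", b"")

That avoids the FixtureAlreadyUsed error because only one thread ever calls the fixture, but it would be nicer if pytest-run-parallel treated benchmark as thread-unsafe automatically, which is what this issue is asking for.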
