feat: add run_config concurrency controls for experiments #2459
+226 −1
Issue Link / Problem Description
Users running `@experiment().arun()` couldn't limit concurrent async tasks to honor provider rate limits (e.g., Azure OpenAI). Unlike `evaluate()`, there was no `RunConfig`/`max_workers` option, so experiment tasks always fired at full concurrency.

Changes Made

- Threaded `run_config` + `max_workers` through `ExperimentWrapper.arun()` and the `@experiment` decorator, reusing `ragas.async_utils.as_completed` with the resolved worker limit.
- Updated `docs/concepts/experimentation.md` and the RunConfig how-to.
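For context, the worker-limit pattern being reused can be sketched with a plain `asyncio` semaphore. This is a simplified illustration of the behavior, not the actual `ragas.async_utils.as_completed` implementation; `bounded_as_completed` is a hypothetical name:

```python
import asyncio


async def bounded_as_completed(coros, max_workers=-1):
    """Collect results in completion order, running at most `max_workers`
    coroutines at a time (-1 means unlimited). Illustrative stand-in for
    the behavior of ragas.async_utils.as_completed, not its real code."""
    if max_workers > 0:
        sem = asyncio.Semaphore(max_workers)

        async def bounded(coro):
            # Each coroutine must acquire a semaphore slot before running.
            async with sem:
                return await coro

        coros = [bounded(c) for c in coros]
    results = []
    for fut in asyncio.as_completed(list(coros)):
        results.append(await fut)
    return results


async def main():
    async def task(i):
        await asyncio.sleep(0.01)  # stand-in for an LLM call
        return i

    done = await bounded_as_completed([task(i) for i in range(5)], max_workers=2)
    return sorted(done)  # completion order varies, so sort for display


results = asyncio.run(main())
print(results)  # [0, 1, 2, 3, 4]
```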
How to Test
- `uv run pytest tests/unit/test_experiment.py`
- `uv run async.py` (manual script): before the fix it raised `run_config` keyword errors; after the fix it reports the expected `max concurrent` values (unlimited, `run_config=1`, override=3).
- `uv run pytest tests/unit/test_experiment.py -k run_config_max_workers` fails on the previous commit and passes now.
- `make test` (full suite): all tests pass.
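The manual script itself isn't included in this description; a self-contained sketch of the same idea — counting peak concurrency under a worker limit — could look like the following (names and numbers are illustrative, not the script from this PR):

```python
import asyncio


async def peak_concurrency(n_tasks: int, max_workers: int) -> int:
    """Run n_tasks dummy coroutines under a max_workers semaphore and
    return the peak number running at once. Illustrative stand-in for
    the manual async.py check described above."""
    running = peak = 0
    sem = asyncio.Semaphore(max_workers)

    async def task():
        nonlocal running, peak
        async with sem:
            running += 1
            peak = max(peak, running)
            await asyncio.sleep(0.01)  # simulate an LLM call
            running -= 1

    await asyncio.gather(*(task() for _ in range(n_tasks)))
    return peak


print("max concurrent:", asyncio.run(peak_concurrency(10, 3)))  # max concurrent: 3
```

With the limit honored, the reported peak never exceeds the configured `max_workers`, which is the property the manual script checks for each of the unlimited, `run_config=1`, and override cases.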
References

- `docs/concepts/experimentation.md`
- `docs/howtos/customizations/_run_config.md`

Screenshots/Examples (if applicable)
N/A – behavior verified via tests + manual script.