
fix(analyzer): honour language_model_params in BasicLangExtractRecognizer#1943

Open
lsternlicht wants to merge 2 commits into microsoft:main from lsternlicht:fix/basic-langextract-language-model-params-dropped

Conversation

@lsternlicht

Summary

Fixes #1942.

BasicLangExtractRecognizer loads langextract.model.provider.language_model_params from the yaml into self._language_model_params and forwards it to lx.extract(..., language_model_params=...). But langextract/extraction.py ignores the language_model_params argument when a pre-built ModelConfig is passed via config= — it takes the elif config: branch at line 240, and only the else: branch at line 255 reads language_model_params. Because BasicLangExtractRecognizer._get_provider_params() always returns {"config": ModelConfig(...)}, values like timeout and num_ctx were silently dropped and Ollama fell back to langextract's 120 s default regardless of what the yaml said.
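The branch order described above can be sketched in isolation. This is an illustrative simplification, not langextract's actual source; the function and tuple shapes here only approximate the dispatch in extraction.py:

```python
def resolve_model(model=None, config=None, model_id=None,
                  language_model_params=None):
    """Illustrative sketch of lx.extract()'s model-resolution order."""
    if model is not None:
        return ("model", model)
    elif config is not None:
        # BasicLangExtractRecognizer always lands here via config=ModelConfig(...);
        # language_model_params is never read on this path, so it is dropped.
        return ("config", config)
    else:
        # AzureOpenAILangExtractRecognizer lands here; only this branch
        # actually applies language_model_params.
        return ("model_id", model_id, dict(language_model_params or {}))
```

With a pre-built config, the params vanish; with a bare model_id, they are applied.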

This reliably breaks any Ollama model whose first inference takes longer than 120 s, with the confusing error:

InferenceRuntimeError: Ollama Model timed out (timeout=120, num_threads=None)

even when the user explicitly set timeout: 1800 in the yaml. The default bundled config at presidio-analyzer/presidio_analyzer/conf/langextract_config_basic.yaml is also affected — its timeout: 240 and num_ctx: 8192 have no effect at runtime.

AzureOpenAILangExtractRecognizer is not affected, because its _get_provider_params() returns {"model_id": ..., "language_model_params": ...} with no config= key, so lx.extract() takes the else: branch where language_model_params is actually applied.

Fix

Merge self._language_model_params into self.provider_kwargs before building lx_factory.ModelConfig in basic_langextract_recognizer.py. setdefault preserves precedence for explicit provider.kwargs: entries, so configs that already work by placing timeout under kwargs: (as a workaround) continue to work unchanged.

for key, value in self._language_model_params.items():
    self.provider_kwargs.setdefault(key, value)
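The merge semantics can be shown standalone (the values below are invented for illustration):

```python
provider_kwargs = {"timeout": 900}            # explicit provider.kwargs: entry
language_model_params = {"timeout": 1800, "num_ctx": 8192}

for key, value in language_model_params.items():
    provider_kwargs.setdefault(key, value)

# Explicit kwargs keep their value; only missing keys are filled in.
assert provider_kwargs == {"timeout": 900, "num_ctx": 8192}
```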

Tests

Before the fix, existing tests were green while the behaviour was broken. test_basic_langextract_recognizer.py:568-570 only asserted that language_model_params was passed to lx.extract(), but never that it reached the ModelConfig or the provider.

This PR:

  1. Strengthens test_when_analyze_called_then_params_passed_to_langextract to also assert that timeout and num_ctx land on call_kwargs["config"].provider_kwargs.
  2. Adds test_language_model_params_reach_provider_kwargs — direct regression test for the reported behaviour, without going through the full analyze() flow.
  3. Adds test_provider_kwargs_take_precedence_over_language_model_params — asserts backwards compatibility for configs that already placed values under kwargs:.

All 83 LangExtract-related tests pass locally:

cd presidio-analyzer
pytest tests/test_basic_langextract_recognizer.py \
       tests/test_azure_openai_langextract_recognizer.py \
       tests/test_langextract_helper.py \
       tests/test_lm_recognizer.py
...
83 passed in 0.59s

Test plan

  • All existing test_basic_langextract_recognizer.py tests pass (21 tests).
  • All LangExtract-related tests across the analyzer pass (83 tests).
  • New regression test asserts ModelConfig.provider_kwargs contains timeout and num_ctx from yaml's language_model_params section.
  • New backwards-compat test asserts provider.kwargs: values win over provider.language_model_params: values of the same name.
  • Verified end-to-end against an actual Ollama gemma4:31b run: with the fix, the configured timeout: 3600 is passed to requests.post(..., timeout=3600) inside langextract/providers/ollama.py, replacing the hardcoded 120 s default.

fix(analyzer): honour language_model_params in BasicLangExtractRecognizer

BasicLangExtractRecognizer loaded langextract.model.provider.language_model_params
from the yaml into self._language_model_params and forwarded it to
lx.extract(..., language_model_params=...), but langextract.extract() ignores
its language_model_params argument when a pre-built ModelConfig is passed via
config= — it takes the elif config: branch (extraction.py:240) and only the
else: branch at line 255 reads language_model_params. Because
BasicLangExtractRecognizer._get_provider_params() always returns
{"config": ModelConfig(...)}, values like timeout and num_ctx were silently
dropped and Ollama fell back to langextract's 120s default regardless of
what the yaml said.

Merge _language_model_params into provider_kwargs before building the
ModelConfig so the values actually reach OllamaLanguageModel(**provider_kwargs).
setdefault() preserves precedence for explicit provider.kwargs: entries, so
configs that already work under kwargs: continue to work unchanged.

Adds two regression tests and strengthens an existing assertion to cover
ModelConfig.provider_kwargs (not just the separate language_model_params
argument that langextract ignores in this branch).

AzureOpenAILangExtractRecognizer is unaffected because its
_get_provider_params() returns {"model_id": ..., "language_model_params": ...}
with no config= key, so lx.extract() takes the else: branch where
language_model_params is applied.

Fixes microsoft#1942
Copilot AI review requested due to automatic review settings April 10, 2026 20:24
@microsoft-github-policy-service
Contributor

@lsternlicht please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.

@microsoft-github-policy-service agree [company="{your company}"]

Options:

  • (default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
@microsoft-github-policy-service agree
  • (when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
@microsoft-github-policy-service agree company="Microsoft"
Contributor License Agreement


Contributor

Copilot AI left a comment


Pull request overview

Fixes BasicLangExtractRecognizer so values configured under langextract.model.provider.language_model_params (e.g., timeout, num_ctx) are applied at runtime by ensuring they reach ModelConfig.provider_kwargs, which is what the langextract.extract(config=...) path actually uses.

Changes:

  • Merge language_model_params into provider_kwargs (with kwargs taking precedence) before constructing lx_factory.ModelConfig.
  • Strengthen and add regression tests to assert timeout/num_ctx arrive on ModelConfig.provider_kwargs.
  • Document the behavior fix in CHANGELOG.md.

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

  • presidio-analyzer/presidio_analyzer/predefined_recognizers/third_party/basic_langextract_recognizer.py: Ensures language_model_params are surfaced via ModelConfig.provider_kwargs so providers (e.g., Ollama) receive them.
  • presidio-analyzer/tests/test_basic_langextract_recognizer.py: Adds/strengthens regression tests verifying params reach ModelConfig.provider_kwargs and precedence rules.
  • CHANGELOG.md: Records the fix under Unreleased → Analyzer → Fixed.

self.model_id = model_config.get("model_id")
self.provider = provider_config.get("name")
self.provider_kwargs = provider_config.get("kwargs", {})
self.provider_kwargs = dict(provider_config.get("kwargs", {}))

Copilot AI Apr 10, 2026


dict(provider_config.get("kwargs", {})) will raise a TypeError if the YAML contains kwargs: null (or an empty kwargs: key). Consider using dict(provider_config.get("kwargs") or {}) (or otherwise normalizing/validating the value) so missing/empty kwargs are treated as an empty mapping and the error message remains actionable for users.

Suggested change
self.provider_kwargs = dict(provider_config.get("kwargs", {}))
self.provider_kwargs = dict(provider_config.get("kwargs") or {})
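The failure mode Copilot describes, and the suggested normalization, reproduce in isolation (provider_config below is a stand-in for the parsed yaml):

```python
# A bare "kwargs:" key in yaml parses to None, so the key exists and the
# .get() default {} is never used.
provider_config = {"kwargs": None}

try:
    dict(provider_config.get("kwargs", {}))  # dict(None) -> TypeError
    raised = False
except TypeError:
    raised = True
assert raised

# "or {}" normalizes None (and any falsy value) to an empty mapping.
provider_kwargs = dict(provider_config.get("kwargs") or {})
assert provider_kwargs == {}
```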

Collaborator


Agree with Copilot.

@RonShakutai RonShakutai self-requested a review April 12, 2026 08:58
Collaborator

@RonShakutai RonShakutai left a comment


LGTM,
Great PR!!
Left minor comments.

self.model_id = model_config.get("model_id")
self.provider = provider_config.get("name")
self.provider_kwargs = provider_config.get("kwargs", {})
self.provider_kwargs = dict(provider_config.get("kwargs", {}))
Collaborator


Agree with Copilot.

self._extract_params.update(provider_config.get("extract_params", {}))
self._language_model_params.update(
provider_config.get("language_model_params", {})
)
Collaborator


.update(None) crashes if the YAML key is present but empty, so we should add the or {} normalization here too, as Copilot suggested above.
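The same present-but-empty yaml key trips dict.update as well; a minimal reproduction:

```python
provider_config = {"language_model_params": None}  # "language_model_params:" with no value

params = {}
try:
    params.update(provider_config.get("language_model_params", {}))
    crashed = False
except TypeError:  # dict.update(None) raises TypeError
    crashed = True
assert crashed

# The same "or {}" normalization fixes this call site too.
params.update(provider_config.get("language_model_params") or {})
assert params == {}
```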

@@ -574,3 +574,68 @@ def test_when_analyze_called_then_params_passed_to_langextract(self, tmp_path):
assert call_kwargs["config"].provider == "ollama"
Collaborator


Should we add a test where kwargs: null appears in the yaml, and check that loading the config doesn't crash?

@omri374
Collaborator

omri374 commented Apr 26, 2026

Hi @lsternlicht, would you be interested in merging this PR? There are a few minor changes to make.



Successfully merging this pull request may close these issues.

BasicLangExtractRecognizer silently drops provider.language_model_params (timeout, num_ctx, ...)

4 participants