[Inference Provider] Add TextCLF as an inference provider #3895

Open
textclf-api wants to merge 2 commits into huggingface:main from textclf-api:textclf

Conversation


@textclf-api textclf-api commented Mar 8, 2026

This PR adds TextCLF as a new inference provider for the @huggingface/inference library.


Note: Medium Risk
Touches the central provider registry and provider-selection surface area, so a misconfiguration could affect routing for provider-based inference; the change is additive, however, and covered by tests.

Overview
Adds TextCLF as a new Inference Provider (provider="textclf") and wires it into the provider registry so InferenceClient/async client can route conversational and text-generation requests to https://api.textclf.com/v1/chat/completions.
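The routing described above can be sketched as follows. The constant and function names here are illustrative assumptions for this PR summary, not identifiers taken from the actual change:

```python
# Hypothetical sketch of the endpoint routing described above.
# TEXTCLF_BASE_URL and build_chat_completions_url are illustrative names,
# not the PR's actual identifiers.
TEXTCLF_BASE_URL = "https://api.textclf.com"


def build_chat_completions_url(base_url: str = TEXTCLF_BASE_URL) -> str:
    """Build the chat-completions endpoint that conversational and
    text-generation requests would be routed to."""
    return f"{base_url.rstrip('/')}/v1/chat/completions"
```

With the default base URL this yields the endpoint quoted in the description, and a trailing slash on a custom base URL is normalized away.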

Introduces a new provider helper (_providers/textclf.py) with TextCLF-specific response parsing for text generation, updates provider allowlists/type unions, and extends tests to cover provider registration and URL construction.
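A provider helper of the kind described might look roughly like the sketch below. The class name, method names, and the assumption that TextCLF returns an OpenAI-style chat payload are all hypothetical; the actual code in _providers/textclf.py may differ:

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class TextCLFTextGenerationHelper:
    """Illustrative provider helper; names and payload shape are assumptions,
    not the PR's actual implementation."""

    base_url: str = "https://api.textclf.com"

    def prepare_url(self) -> str:
        # Both conversational and text-generation requests route to the
        # single chat-completions endpoint.
        return f"{self.base_url}/v1/chat/completions"

    def get_response(self, response: dict[str, Any]) -> str:
        # Assumes an OpenAI-compatible response body: extract the generated
        # text from the first choice's message.
        return response["choices"][0]["message"]["content"]
```

A helper like this is what the registry would hand to the client when provider="textclf" is selected, keeping the provider-specific URL construction and response parsing in one place.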

Written by Cursor Bugbot for commit 8d04d7f.

@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.

Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
