
feat: add openai inference provider#46

Merged
leseb merged 1 commit into opendatahub-io:main from leseb:RHAIENG-1198
Sep 19, 2025

Conversation

Collaborator

@leseb leseb commented Sep 19, 2025

What does this PR do?

New optional provider.

Relates to: RHAIENG-1198

Summary by CodeRabbit

  • New Features

    • Added support for OpenAI as a remote inference provider.
    • Automatically enabled when OPENAI_API_KEY is set; supports configurable base URL (default: https://api.openai.com/v1).
  • Documentation

    • Updated provider list to include the OpenAI remote inference option.

New optional provider.

Relates to: RHAIENG-1198
Signed-off-by: Sébastien Han <seb@redhat.com>
Contributor

coderabbitai bot commented Sep 19, 2025

Walkthrough

Adds a new inference provider entry remote::openai to distribution configs and documents it. The provider is conditionally enabled at runtime via OPENAI_API_KEY and supports configurable base_url. No other configuration sections or code paths are changed.

Changes

  • Docs update (distribution/README.md): Adds a documentation entry for inference provider remote::openai.
  • Distribution configs (distribution/build.yaml, distribution/run.yaml): Registers remote::openai in providers.inference. In run.yaml, adds a gated provider with provider_id ${env.OPENAI_API_KEY:+openai}, provider_type remote::openai, and config for api_key and base_url (default https://api.openai.com/v1).

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor Env as Environment
  participant RunCfg as run.yaml
  participant Dist as Provider Registry
  participant OpenAI as remote::openai

  Env->>RunCfg: OPENAI_API_KEY present?
  alt Key present
    RunCfg->>Dist: Register provider (id: openai, type: remote::openai)
    note right of Dist: base_url default<br/>https://api.openai.com/v1
    Dist->>OpenAI: Initialize with api_key, base_url
  else No key
    RunCfg-->>Dist: Skip registering remote::openai
  end

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

I twitch my whiskers at the breeze of change,
A new remote burrow within our range—
Flip the key, the tunnel lights align,
Open skies via base_url’s shine.
I thump, configure, hop—then fly:
Providers multiplied, oh my! 🥕✨

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title "feat: add openai inference provider" accurately and concisely describes the primary change in this PR (adding a remote OpenAI inference provider in distribution/build.yaml and distribution/run.yaml and documenting it in distribution/README.md), which matches the PR objectives and file-level changes.
  • Docstring Coverage: ✅ Passed. No functions found in the changes; docstring coverage check skipped.


Collaborator

@Elbehery Elbehery left a comment


LGTM

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (2)
distribution/run.yaml (2)

57-61: OpenAI provider gating looks good; consider optional org/project and timeouts.

Nice conditional include via ${env.OPENAI_API_KEY:+openai}. To improve operability (multi‑org setups, network slowness), consider adding optional fields if supported by remote::openai:

  • organization: ${env.OPENAI_ORG_ID:=}
  • project: ${env.OPENAI_PROJECT_ID:=}
  • connect_timeout/read_timeout (mirroring Bedrock defaults)

If the schema supports these, apply within this config block.
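If the remote::openai schema does accept them, the suggested fields would slot into the existing config block roughly like so. This is hypothetical: the field names follow the bullets above and the OPENAI_BASE_URL name is assumed; none of it is verified against the provider schema.

```yaml
config:
  api_key: ${env.OPENAI_API_KEY:=}
  base_url: ${env.OPENAI_BASE_URL:=https://api.openai.com/v1}
  organization: ${env.OPENAI_ORG_ID:=}   # optional: multi-org accounts
  project: ${env.OPENAI_PROJECT_ID:=}    # optional: project-scoped keys
```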


176-185: Optional: add a gated model entry for OpenAI to prevent provider/model mismatch.

Currently the first model binds to provider_id vllm‑inference. If users only set OPENAI_API_KEY and INFERENCE_MODEL to an OpenAI model, resolution may still point at vllm. Recommend adding a second model entry gated on OPENAI_API_KEY (and OPENAI_MODEL), so switching providers is env‑only:

Example (place next to existing models):

- metadata: {}
  model_id: ${env.OPENAI_MODEL:=gpt-4o-mini}
  provider_id: ${env.OPENAI_API_KEY:+openai}
  model_type: llm
📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between d9dc908 and a25c332.

📒 Files selected for processing (3)
  • distribution/README.md (1 hunks)
  • distribution/build.yaml (1 hunks)
  • distribution/run.yaml (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-test-push (linux/amd64)
🔇 Additional comments (2)
distribution/README.md (1)

19-19: Docs entry added correctly.

The new inference provider row for remote::openai is consistent with the config changes.

distribution/build.yaml (1)

12-12: Build spec updated; confirm provider runtime deps.

Ensure remote::openai is available in the base stack without extra wheels. If it requires an external SDK (e.g., openai), add it under additional_pip_packages or vendor it in the provider image.
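If an extra wheel does turn out to be required, a minimal sketch of the build.yaml addition, assuming the distribution supports an additional_pip_packages list as the comment suggests:

```yaml
# Hypothetical: only needed if remote::openai's SDK dependency
# is not already bundled in the base stack image.
additional_pip_packages:
- openai
```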

Collaborator Author

leseb commented Sep 19, 2025

Merging with a single approval for the sake of moving fast. CI is green ;) - Thanks!

@leseb leseb merged commit 5b8bf26 into opendatahub-io:main Sep 19, 2025
5 checks passed
@leseb leseb deleted the RHAIENG-1198 branch September 19, 2025 10:11