[Doc] Add AI Badgr framework integration documentation #30669
Conversation
Co-authored-by: miguelmanlyx <[email protected]>
Documentation preview: https://vllm--30669.org.readthedocs.build/en/30669/
Code Review
This pull request adds documentation for integrating AI Badgr as a framework. The new documentation is clear and follows the existing format for framework integrations. However, there is a significant correctness issue: the example commands use a model name that does not exist, which will cause them to fail. Suggestions are provided to replace it with a valid model name.
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com> Signed-off-by: miguelmanlyx <[email protected]>
Purpose
Add documentation for AI Badgr, an OpenAI-compatible LLM provider, in the framework integrations section. AI Badgr uses tier-based model naming (basic/normal/premium) and can work with vLLM as a backend or be accessed as a hosted service.
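The tier-based model naming described above can be sketched as a simple lookup. This is an illustrative sketch based on the tier mappings listed later in this PR description; the `resolve_model` helper and its behavior are assumptions for illustration, not part of AI Badgr's or vLLM's actual API.

```python
# Hypothetical sketch of AI Badgr's tier-based model naming, assuming the
# tier-to-model mapping listed in this PR (phi-3-mini / mistral-7b /
# llama3-8b-instruct). Names and mapping are illustrative only.
TIER_MODELS = {
    "basic": "phi-3-mini",
    "normal": "mistral-7b",
    "premium": "llama3-8b-instruct",
}

def resolve_model(tier: str) -> str:
    """Translate an AI Badgr tier name into the underlying model name."""
    try:
        return TIER_MODELS[tier]
    except KeyError:
        raise ValueError(
            f"unknown tier {tier!r}; expected one of {sorted(TIER_MODELS)}"
        )

print(resolve_model("premium"))  # llama3-8b-instruct
```

A provider-side shim like this is what lets clients request a tier name while vLLM serves the concrete model underneath.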
Test Plan
Test Result
site/deployment/frameworks/aibadgr/index.html

Changes
New file: docs/deployment/frameworks/aibadgr.md

Documents two integration scenarios:
Example usage with environment variables:
Includes tier model mappings:
phi-3-mini (basic), mistral-7b (normal), llama3-8b-instruct (premium).

Essential Elements of an Effective PR Description Checklist
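The environment-variable usage mentioned in the Changes list can be sketched as follows. The variable names (`AIBADGR_BASE_URL`, `AIBADGR_API_KEY`) and the default base URL are assumptions for illustration, not documented values; the sketch builds an OpenAI-style chat completion payload without sending it, so it works the same whether the base URL points at a hosted AI Badgr endpoint or a local vLLM server.

```python
# Sketch of configuring an OpenAI-compatible client for AI Badgr (or a local
# vLLM backend) via environment variables. Variable names and the default
# base URL are hypothetical, for illustration only.
import json
import os

base_url = os.environ.get("AIBADGR_BASE_URL", "http://localhost:8000/v1")
api_key = os.environ.get("AIBADGR_API_KEY", "EMPTY")  # vLLM accepts any key by default

def chat_request(model: str, prompt: str) -> dict:
    """Build (but do not send) an OpenAI-style chat completion request."""
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = chat_request("mistral-7b", "Hello!")
print(req["url"])
```

Because the wire format is the standard OpenAI chat completions shape, the same payload can target either deployment scenario by changing only the base URL.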