
Add expert parallelism support to mixtral#1510

Open
pstjohn wants to merge 3 commits into NVIDIA:main from
pstjohn:pstjohn/bio-327-add-expert-parallelism-support-to-modelsmixtral-model

Conversation

@pstjohn
Collaborator

@pstjohn pstjohn commented Mar 11, 2026

Adds basic expert parallelism (EP) support to the mixtral/ model.
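For context on what expert parallelism means here: in an MoE model like Mixtral, each rank in the expert-parallel group owns only a subset of the experts, and tokens are routed across ranks to the expert that serves them. The sketch below is a minimal, hypothetical illustration of the contiguous expert-to-rank assignment commonly used for EP; it is not taken from this PR, and the function name `experts_for_rank` is an assumption.

```python
def experts_for_rank(num_experts: int, ep_size: int, rank: int) -> list[int]:
    """Return the expert indices owned by one expert-parallel rank.

    Hypothetical helper (not from this PR): assigns a contiguous block
    of experts to each rank. Mixtral-8x7B has 8 experts, so with
    ep_size=4 each rank holds 2 experts.
    """
    if num_experts % ep_size != 0:
        raise ValueError("num_experts must be divisible by ep_size")
    per_rank = num_experts // ep_size
    start = rank * per_rank
    return list(range(start, start + per_rank))


# Example: 8 Mixtral experts sharded across 4 expert-parallel ranks.
# Rank 1 owns experts 2 and 3.
print(experts_for_rank(num_experts=8, ep_size=4, rank=1))  # [2, 3]
```

In a real implementation the router output is then all-to-all exchanged so each token reaches the rank that owns its selected expert, but that communication step is beyond this sketch.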

pstjohn added 2 commits March 11, 2026 14:28
Signed-off-by: Peter St. John <pstjohn@nvidia.com>
Signed-off-by: Peter St. John <pstjohn@nvidia.com>
@coderabbitai
Contributor

coderabbitai bot commented Mar 11, 2026

Important

Review skipped

Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: d6ce380c-4825-4106-8b49-89d3b6b5342a

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.


Comment @coderabbitai help to get the list of available commands and usage tips.

Signed-off-by: Peter St. John <pstjohn@nvidia.com>
