
fix: patch OpenAI client in distil module to support dispatch mode#2032

Open
divyanshupatel17 wants to merge 1 commit into 567-labs:main from divyanshupatel17:main

Conversation

@divyanshupatel17

Describe your changes

  • Fixed a bug in instructor.distil where the OpenAI client was not being patched with instructor, causing dispatch mode to fail because response_model was not recognized.
  • Added comprehensive unit tests for instructor.distil in tests/test_distil.py covering key functionality including get_signature_from_fn, format_function, and the @distil decorator in both usage modes.
  • Refactored imports in distil.py to use top-level relative imports to avoid potential circular dependencies.
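
The failure described above comes down to dispatch mode calling `create(..., response_model=...)` on a raw `openai.OpenAI` client, which the plain SDK does not accept; wrapping the client with instructor first is what makes that keyword work. A minimal sketch of the idea, using stand-in stub classes instead of the real `openai`/`instructor` objects so it runs self-contained (in the actual fix the wrapping call is roughly `instructor.from_openai(OpenAI())`):

```python
class RawOpenAI:
    """Stand-in for openai.OpenAI: the raw SDK rejects instructor-only kwargs."""
    def create(self, **kwargs):
        if "response_model" in kwargs:
            raise TypeError("unexpected keyword argument 'response_model'")
        return {"raw": True}

class PatchedClient:
    """Stand-in for the instructor-wrapped client."""
    def __init__(self, raw):
        self._raw = raw

    def create(self, *, response_model=None, **kwargs):
        # instructor strips response_model before calling the raw client,
        # then validates the completion against that model on the way out.
        result = self._raw.create(**kwargs)
        return {"validated": True, "model": response_model, **result}

def from_openai(raw):
    # Hypothetical mirror of instructor.from_openai for this sketch.
    return PatchedClient(raw)

# The one-line shape of the fix: patch the client before dispatch uses it.
client = from_openai(RawOpenAI())
resp = client.create(response_model="ThreeDigitMul", model="gpt-4o-mini")
```

Without the `from_openai(...)` wrapping step, the same `create` call raises `TypeError`, which is the dispatch-mode failure this PR addresses.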

Issue ticket number and link

(None)

Checklist before requesting a review

  • I have performed a self-review of my code
  • If it is a core feature, I have added thorough tests.
  • If it is a core feature, I have added documentation.

@jxnl
Collaborator

jxnl commented Jan 26, 2026

I wonder if the right thing to do here is to actually allow the distillation tool to support any client that is an instructor client. This was implemented when OpenAI was the only option, but we can go beyond that now.

@divyanshupatel17
Author

Thanks for the feedback, @jxnl!

Looking at `examples/distilations/three_digit_mul_dispatch.py`, I see it already passes an Instructor client. Should I update to:

  1. Accept Union[OpenAI, Instructor] and only patch raw clients?
  2. Also rename openai_client -> client for clarity?
  3. Or keep the current fix minimal (just patching) and leave broader changes for a follow-up?

Happy to go with whichever approach fits the project best.
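
Option 1 above would amount to a type check at the entry point: wrap raw clients, pass already-patched ones through. A rough sketch under that assumption (class names mirror the real `openai.OpenAI` and `instructor.Instructor` types, but these are runnable stubs, not the actual SDK):

```python
class OpenAI:
    """Stand-in for openai.OpenAI (a raw, unpatched client)."""

class Instructor:
    """Stand-in for instructor.Instructor (an already-patched client)."""
    def __init__(self, raw):
        self.raw = raw

def from_openai(raw):
    # Stand-in for instructor.from_openai in this sketch.
    return Instructor(raw)

def ensure_instructor(client):
    """Accept Union[OpenAI, Instructor]; patch only raw clients."""
    if isinstance(client, Instructor):
        return client                  # already patched: avoid double-wrapping
    if isinstance(client, OpenAI):
        return from_openai(client)     # wrap the raw SDK client
    raise TypeError(f"unsupported client type: {type(client).__name__}")

raw = OpenAI()
patched = ensure_instructor(raw)       # raw client gets wrapped
same = ensure_instructor(patched)      # Instructor client passes through
```

The pass-through branch matters because wrapping an already-patched client a second time would be at best redundant, so the check keeps the API idempotent for callers who follow the existing example and pass an Instructor client directly.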
