Clarification on Output Compatibility with Non-OpenAI LLMs #1095
Unanswered · codebrain001 asked this question in Q&A · Replies: 1 comment
-
I'd like to expand on this question, as I'm having issues with `output_json` while using Llama3.
-
Hello,
Could you clarify the compatibility of the output formats mentioned in the documentation? Specifically:
Does the requirement for an OpenAI client for these output formats mean that they cannot be used with other LLMs, such as Claude models from Anthropic?
Understanding the constraints and compatibility is crucial for integrating these models into our existing workflows that utilize different LLMs. Any insights or additional documentation references would be highly appreciated.
Thank you!
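For context on why this need not be OpenAI-specific: structured formats like `output_json` ultimately depend on the model returning text that parses as valid JSON, which any capable LLM (Claude, Llama3, etc.) can do. Below is a minimal, stdlib-only sketch of the client-side half of such a format; the sample completion string, field names, and `parse_structured_output` helper are all hypothetical illustrations, not taken from this project's API:

```python
import json

# Hypothetical raw completion from a non-OpenAI model (e.g. Llama3 served
# locally). In practice this string would come from whichever client you use.
raw_completion = '{"title": "Quarterly report", "priority": 2}'

def parse_structured_output(text: str, required: dict) -> dict:
    """Parse a model completion as JSON and check required fields and types.

    Nothing here depends on which LLM produced the text, which is the point:
    the output-format contract is enforced client-side after generation.
    """
    data = json.loads(text)  # raises ValueError on malformed JSON
    for field, expected_type in required.items():
        if field not in data:
            raise ValueError(f"missing field: {field}")
        if not isinstance(data[field], expected_type):
            raise ValueError(f"wrong type for field: {field}")
    return data

schema = {"title": str, "priority": int}
result = parse_structured_output(raw_completion, schema)
print(result["title"])  # -> Quarterly report
```

If a model frequently emits malformed JSON, the usual mitigations are a stricter prompt, a retry loop around the `ValueError`, or a server that exposes an OpenAI-compatible endpoint so existing OpenAI-client tooling can be pointed at it unchanged.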