Add json schema to compatible models #20
Conversation
for more information, see https://pre-commit.ci
…eedback into add-json-schema
Rolland-He left a comment:
Hi @wkukka1, good work! I've left one inline comment.
```diff
  self.model = self.client.beta.assistants.create(
      name="Markus LLM model",
-     model="gpt-4-turbo",
+     model="gpt-4o-mini",
```
Not sure whether this is related to the PR.
I got approval from @david-yz-liu to do this since the gpt-4-turbo model doesn't support structured responses using the response_format arg.
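For context, a rough sketch of how a json_schema response format is attached to an assistant; the client setup, schema contents, and schema name below are illustrative assumptions, not code from this PR:

```python
from openai import OpenAI

client = OpenAI()

# Illustrative schema only; in the PR the schema would come from the file
# passed via --json_schema.
schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
    "additionalProperties": False,
}

assistant = client.beta.assistants.create(
    name="Markus LLM model",
    model="gpt-4o-mini",  # gpt-4-turbo rejects a json_schema response_format
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "structured_reply",  # hypothetical schema name
            "schema": schema,
        },
    },
)
```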
Great!
```python
        Optional[Tuple[str, str]]: A tuple containing the prompt and the model's response,
        or None if the response was invalid.
        """
        schema = None
```
Let's keep the code structure consistent with the other classes (if-else, etc.). Also keep the `open` call the same, with the utf-8 encoding.
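A minimal sketch of the structure being suggested, assuming `json_schema` is an optional file path as in the other model classes:

```python
import json

if json_schema:
    # Keep the open call consistent with the other classes: explicit utf-8 encoding.
    with open(json_schema, encoding="utf-8") as f:
        schema = json.load(f)
else:
    schema = None
```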
```python
        response = self._call_openai(prompt, system_instructions)
        if json_schema:
            schema_path = Path(json_schema)
            if not schema_path.exists():
```
This is a nice check; please add this to all of the classes.
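One way to reuse the check in every class would be a shared helper along these lines; the function name and error handling are assumptions, not part of the PR:

```python
import json
from pathlib import Path
from typing import Optional


def load_json_schema(json_schema: Optional[str]) -> Optional[dict]:
    """Load a JSON schema from a file path, raising if the file is missing."""
    if not json_schema:
        return None
    schema_path = Path(json_schema)
    if not schema_path.exists():
        raise FileNotFoundError(f"JSON schema file not found: {schema_path}")
    with open(schema_path, encoding="utf-8") as f:
        return json.load(f)
```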
Updates:
- Added `--json_schema` arg to the CLI
- Updated `OpenAIModel`, `OpenAIVectorModel`, `Codellama`, `DeepSeek-R1`, and `DeepSeek-v3` to use the JSON schema if supplied
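As a rough illustration only (the PR's actual CLI wiring isn't shown here), the new flag could be registered with argparse roughly like this:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    "--json_schema",
    default=None,
    help="Path to a JSON schema file used to constrain the model's responses",
)
args = parser.parse_args()
# args.json_schema would then be forwarded to the model classes listed above.
```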