I noticed that OpenAIChatCompletionClient lets you set the temperature directly. Is there a similar way to do this in OllamaChatCompletionClient? How can I set temperature, top_p, and other parameters to control the LLM when using OllamaChatCompletionClient?
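For context, this is the pattern the question refers to: OpenAIChatCompletionClient accepts sampling parameters such as `temperature` directly in its constructor. A minimal sketch (the model name is just an example):

```python
from autogen_ext.models.openai import OpenAIChatCompletionClient

# Sampling parameters like temperature can be passed straight to the constructor.
openai_client = OpenAIChatCompletionClient(
    model="gpt-4o",      # example model name; substitute your own
    temperature=0.2,     # lower = more deterministic output
)
```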
Answered by ekzhu · Mar 25, 2025
Can you set it in the constructor?
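A minimal sketch of what that could look like, assuming your version of autogen-ext's OllamaChatCompletionClient accepts an `options` mapping that is forwarded to the Ollama API (which is where keys like `temperature` and `top_p` live); the exact keyword may differ, so check the constructor signature of your installed version:

```python
from autogen_ext.models.ollama import OllamaChatCompletionClient

# Assumed configuration: the `options` dict is passed through to Ollama,
# which recognizes sampling keys such as temperature and top_p.
ollama_client = OllamaChatCompletionClient(
    model="llama3.2",          # example local Ollama model name
    options={
        "temperature": 0.2,    # lower = more deterministic output
        "top_p": 0.9,          # nucleus sampling cutoff
    },
)
```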
Answer selected by Hakstar