Regarding this blog post: https://ollama.com/blog/structured-outputs

Does OllamaSharp support structured outputs yet?

---

This was implemented by @mili-tan in 4.0.10.

---

In case anyone can't find it, it's the fourth parameter to `Chat.SendAsync`, called `format`, and it needs to be a JSON schema object like this:

```csharp
var format = new
{
    type = "object",
    properties = new
    {
        computernames = new
        {
            type = "array",
            items = new { type = "string" }
        },
        ipaddresses = new
        {
            type = "array",
            items = new { type = "string" }
        },
        users = new
        {
            type = "array",
            items = new { type = "string" }
        }
    },
    required = new[] { "computernames", "users" }
};

await foreach (var answerToken in chat.SendAsync(message, null, null, format))
{
    Console.Write(answerToken);
}
```

Does indeed work like a dream!
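
As a follow-up, if you want to work with the structured result rather than just print it, one option is to buffer the streamed tokens and deserialize them with System.Text.Json. This is only a sketch that continues the snippet above: the `chat`, `message` and `format` variables are assumed to be the ones defined there, and the `Inventory` record is a hypothetical type I made up to match that schema.

```csharp
using System.Text;
using System.Text.Json;

var sb = new StringBuilder();

// Same call as in the snippet above; buffer the tokens instead of printing them.
await foreach (var answerToken in chat.SendAsync(message, null, null, format))
    sb.Append(answerToken);

// With a format schema supplied, the full response should be a JSON document matching it.
var inventory = JsonSerializer.Deserialize<Inventory>(
    sb.ToString(),
    new JsonSerializerOptions { PropertyNameCaseInsensitive = true });

Console.WriteLine(string.Join(", ", inventory?.Computernames ?? Array.Empty<string>()));

// Hypothetical type matching the schema above; adjust the names to your own schema.
record Inventory(string[] Computernames, string[] Ipaddresses, string[] Users);
```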

---

Hello, I am trying to figure out how to get this to work. I have read that for Ollama to validate that a structured output actually matches the supplied schema, the request needs to run with streaming set to false. ChatGPT is telling me that there is no way in the current OllamaSharp API to send a structured-output request with streaming disabled, and that I should fall back to building raw HTTP requests. This seems like an AI hallucination to me, but I've been unable to find the correct way to do it, and the answer above uses streaming.
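
For what it's worth, raw HTTP should not be necessary. One approach that looks possible is to skip the `Chat` convenience class and build a `ChatRequest` yourself, on the assumption that the request object mirrors the REST API's `stream` and `format` fields. This is only a sketch based on that assumption; the exact member names and the type of `Format` may differ between OllamaSharp versions, so please verify against the release you are using.

```csharp
using OllamaSharp;
using OllamaSharp.Models.Chat;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));

// A minimal schema, in the same anonymous-object style as the answer above.
var format = new
{
    type = "object",
    properties = new { users = new { type = "array", items = new { type = "string" } } },
    required = new[] { "users" }
};

// Assumption: ChatRequest exposes Stream and Format like the REST API does;
// check your OllamaSharp version for the exact member names.
var request = new ChatRequest
{
    Model = "llama3.1",
    Stream = false,   // ask Ollama for a single, non-streamed response
    Format = format,
    Messages = new List<Message>
    {
        new Message(ChatRole.User, "List the users mentioned in the following text: ...")
    }
};

// ChatAsync still returns an async enumerable; with Stream = false it should yield
// one complete response rather than token-by-token chunks.
await foreach (var response in ollama.ChatAsync(request))
{
    Console.WriteLine(response?.Message?.Content);
}
```

If that works in your version, the complete JSON arrives in a single `Message.Content`, which you can deserialize directly.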