Really, as per title.
I've been unable to get any response from a GenerateRequest with Format set to anything at all, not even plain "json". Yet the exact same system message, prompt, model, and Format value, sent as a Chat (via chat.SendAsync), return data exactly as expected.
Literally the only change needed to get a successful response from Ollama is swapping GenerateRequest for Chat.
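For reference, a minimal repro sketch of what I mean. The endpoint, model name, and prompt here are placeholders, and the exact method signatures are from memory so they may differ slightly from 5.4.12:

```csharp
using OllamaSharp;
using OllamaSharp.Models;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"));

// Fails: streaming a GenerateRequest with Format set never yields data.
var request = new GenerateRequest
{
    Model = "llama3.2",            // placeholder model name
    System = "Answer in JSON.",
    Prompt = "List three colors.",
    Format = "json"                // any value here kills the response
};
await foreach (var token in ollama.GenerateAsync(request))
    Console.Write(token?.Response); // nothing is ever written

// Works: the same model/system/prompt/format through Chat returns JSON.
var chat = new Chat(ollama, "Answer in JSON.");
await foreach (var token in chat.SendAsync("List three colors."))
    Console.Write(token);           // format set on the chat as the version allows
```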
Ollama version: 0.13.5
OllamaSharp: 5.4.12