AutoFunctionInvocationFilter and OnAutoFunctionInvocationAsync are not working while using llama 3.2 #11805
-
I followed the example here https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/Concepts/Filtering/AutoFunctionInvocationFiltering.cs#L22 to implement auto function invocation filtering. The only change I made was to use Ollama chat completion with llama 3.2. Execution never reaches the filter class or its OnAutoFunctionInvocationAsync method. Is auto function invocation supported for Ollama?
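For reference, a minimal sketch of the setup described above, assuming the Ollama connector's `AddOllamaChatCompletion` extension, a local Ollama endpoint at `http://localhost:11434`, and an illustrative plugin and filter (names and endpoint are mine, not from the linked sample):

```csharp
using System.ComponentModel;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// Assumed: Ollama is running locally and the llama3.2 model has been pulled.
builder.AddOllamaChatCompletion(
    modelId: "llama3.2",
    endpoint: new Uri("http://localhost:11434"));

// A plugin the model can call, plus the filter that should fire on each call.
builder.Plugins.AddFromType<TimePlugin>();
builder.Services.AddSingleton<IAutoFunctionInvocationFilter, LoggingAutoFunctionInvocationFilter>();

var kernel = builder.Build();

// Filter that should be hit on every automatic function invocation.
public sealed class LoggingAutoFunctionInvocationFilter : IAutoFunctionInvocationFilter
{
    public async Task OnAutoFunctionInvocationAsync(
        AutoFunctionInvocationContext context,
        Func<AutoFunctionInvocationContext, Task> next)
    {
        Console.WriteLine($"Invoking {context.Function.Name}");
        await next(context); // continue the invocation pipeline
    }
}

// Illustrative plugin exposing one callable function.
public sealed class TimePlugin
{
    [KernelFunction, Description("Returns the current UTC time.")]
    public string GetUtcNow() => DateTime.UtcNow.ToString("O");
}
```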
-
Ollama does support auto function calling, but it ultimately depends on which model you are using with Ollama: this is not strictly an Ollama feature but rather a model capability.
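One thing to double-check as well: auto function invocation only happens when the request explicitly opts into it via `FunctionChoiceBehavior.Auto()`. A minimal sketch, assuming the `kernel` built as in the snippet above and a model pulled into Ollama that advertises tool support:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Ollama;

// Without Auto(), the filter is never reached because no functions are invoked.
var settings = new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var result = await kernel.InvokePromptAsync(
    "What time is it in UTC?",
    new KernelArguments(settings));

Console.WriteLine(result);
```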