doc: A page in Extensions to show the recommended ways to integrate with non-OpenAI models #5118
Description
We need a documentation page under Extensions that shows the recommended ways to integrate with non-OpenAI models:
- Azure AI Studio / Azure AI Inference: use the Azure AI Chat Completion Client (added in #4723)
- GitHub Models: use `OpenAIChatCompletionClient` (see the sketch below)
- Ollama: use the Ollama client (#3817)
- AWS Bedrock: use the Semantic Kernel adapter (see the adapter sketch below)
- Gemini: use the Semantic Kernel adapter
- Claude: use the Semantic Kernel adapter
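
For the GitHub Models bullet, the page could include something like the following minimal sketch. It assumes the `autogen_ext.models.openai.OpenAIChatCompletionClient` import path, the GitHub Models OpenAI-compatible endpoint at `https://models.inference.ai.azure.com`, and a `GITHUB_TOKEN` environment variable; the model name is illustrative.

```python
# Minimal sketch: point OpenAIChatCompletionClient at the GitHub Models endpoint.
# The endpoint URL and env var name are assumptions for illustration.
import asyncio
import os

from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    client = OpenAIChatCompletionClient(
        model="gpt-4o",  # any model offered by GitHub Models
        base_url="https://models.inference.ai.azure.com",  # assumed GitHub Models endpoint
        api_key=os.environ["GITHUB_TOKEN"],  # GitHub personal access token
    )
    result = await client.create([UserMessage(content="Hello!", source="user")])
    print(result.content)


asyncio.run(main())
```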
Reference: semantic kernel connectors: https://github.com/microsoft/semantic-kernel/tree/main/python/semantic_kernel/connectors/ai
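
For the Bedrock, Gemini, and Claude bullets, the page would route through a Semantic Kernel connector. The following is only a sketch of what that could look like for Claude, assuming the adapter is exposed as `SKChatCompletionAdapter` in `autogen_ext.models.semantic_kernel` and that the Semantic Kernel `AnthropicChatCompletion` connector is used; the exact class names and constructor parameters may differ from what ships.

```python
# Illustrative sketch only: wrap a Semantic Kernel Anthropic connector with the
# (assumed) SKChatCompletionAdapter so it satisfies AutoGen's ChatCompletionClient
# protocol. Import paths and parameters are assumptions, not the final API.
import asyncio
import os

from autogen_core.models import UserMessage
from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.anthropic import (
    AnthropicChatCompletion,
    AnthropicChatPromptExecutionSettings,
)


async def main() -> None:
    # Semantic Kernel connector that talks to the Anthropic API.
    sk_client = AnthropicChatCompletion(
        ai_model_id="claude-3-5-sonnet-20241022",
        api_key=os.environ["ANTHROPIC_API_KEY"],
    )
    # Adapter makes the connector usable wherever AutoGen expects a model client.
    model_client = SKChatCompletionAdapter(
        sk_client,
        kernel=Kernel(),
        prompt_settings=AnthropicChatPromptExecutionSettings(temperature=0.2),
    )
    result = await model_client.create([UserMessage(content="Hello!", source="user")])
    print(result.content)


asyncio.run(main())
```

The same adapter pattern would apply to the Bedrock and Gemini connectors listed in the Semantic Kernel repository linked above.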
Also, update the tutorial documentation in both AgentChat and Core to point to this page for non-OpenAI models and local models.