I'm using a local model, an LLM following the OpenAI paradigm. How do I configure the model_config.yaml in the example? #6252
Unanswered
XIAOke8698 asked this question in Q&A
Replies: 1 comment
- See the example here: https://github.com/microsoft/autogen/blob/main/python/samples/agentchat_chess_game/model_config_template.yaml Though you can also use other model clients.
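For a local OpenAI-compatible server, a minimal sketch of what `model_config.yaml` might look like is below. This assumes your server exposes an OpenAI-style `/v1` endpoint; the `base_url`, port, and `model` name are placeholders for your own setup, and since local models are unknown to the client you typically must supply `model_info` yourself. Exact `model_info` fields can vary between AutoGen versions, so check the template linked above for your version.

```yaml
# Sketch of a model_config.yaml for a local OpenAI-compatible endpoint.
# The base_url, model name, and api_key below are placeholders, not real values.
provider: autogen_ext.models.openai.OpenAIChatCompletionClient
config:
  model: my-local-model            # the model name your local server expects
  base_url: http://localhost:8000/v1  # your local OpenAI-compatible endpoint
  api_key: placeholder             # many local servers accept any non-empty key
  model_info:                      # required for models the client doesn't know
    vision: false
    function_calling: true
    json_output: true
    family: unknown
```

The `model_info` block tells the client what capabilities to assume for a model it cannot look up; set `function_calling`/`json_output` according to what your local model actually supports.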