After I run `canopy start`, the server is running on http://0.0.0.0:8000.
The README says I can use the server with a chat application: "That's it! You can now start using the Canopy server with any chat application that supports a /chat.completion endpoint."
I also see in deployment-gcp.md that I can use any OpenAI-compatible chat UI to interact with the newly deployed server.
Is there any example of how to use the Canopy server with a chat UI, for example Chainlit?
I hope someone can help me with this. Thank you very much!
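For reference, "OpenAI compatible" means the server accepts standard chat-completions requests over plain HTTP, so any client can call it directly. Below is a minimal sketch of such a request, assuming the server exposes the usual /v1/chat/completions path on port 8000; the model name is a placeholder.

```python
# Minimal sketch: a raw chat-completions request to the locally running Canopy server.
# Assumes the server listens on port 8000 and exposes an OpenAI-style /v1/chat/completions path.
import requests

payload = {
    "model": "canopy-model",  # placeholder; check which model names your Canopy config expects
    "messages": [{"role": "user", "content": "What can you tell me about my indexed documents?"}],
}

resp = requests.post("http://localhost:8000/v1/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```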
Replies: 1 comment

Here you can see an example: Chainlit/cookbook#84
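The linked cookbook entry is the place to look for the full example; the snippet below is only a minimal sketch of the general idea, assuming Chainlit and the openai Python package (v1+) are installed and that the Canopy server is reachable at http://localhost:8000/v1. The file name and model name are placeholders.

```python
# app.py -- a minimal sketch, not the cookbook example itself.
# Forwards each Chainlit message to the local Canopy server's OpenAI-compatible endpoint.
import chainlit as cl
from openai import AsyncOpenAI

# Point the OpenAI client at the running Canopy server (assumed /v1 prefix).
client = AsyncOpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")


@cl.on_message
async def on_message(message: cl.Message):
    # Send the user's message to the Canopy chat-completions endpoint.
    response = await client.chat.completions.create(
        model="canopy-model",  # placeholder; adjust to whatever your Canopy config expects
        messages=[{"role": "user", "content": message.content}],
    )
    # Display the reply in the Chainlit UI.
    await cl.Message(content=response.choices[0].message.content).send()
```

With the server from `canopy start` still running, this would be launched with `chainlit run app.py` and opened in the browser on the port Chainlit reports.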