v0.9.2 OpenAI backend support
What's Changed
- Added support for OpenAI-type API chat completion. This isn't tested on a real endpoint, only a dummy one, so consider it experimental. To use it, change `BACKEND` in `llm_config.yaml` to 'openai', and set the url and endpoint accordingly. If you want to use ChatGPT, add your OpenAI API key. A config sketch is shown after this list.
- Fixed an issue where NPCs generated in generated locations weren't added to said location.
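
A minimal `llm_config.yaml` sketch for the new backend. `BACKEND`, the url, and the endpoint are mentioned above; the API key and model field names are assumptions and may differ in the actual config file:

```yaml
# Sketch of llm_config.yaml for the OpenAI backend.
# Only BACKEND, url, and endpoint are named in these notes;
# the remaining keys are illustrative assumptions.
BACKEND: 'openai'
url: 'https://api.openai.com'        # base URL of the OpenAI-compatible server
endpoint: '/v1/chat/completions'     # chat completion endpoint
openai_api_key: 'sk-...'             # hypothetical key name; needed for ChatGPT
model: 'gpt-3.5-turbo'               # hypothetical; the model the backend was tested against
```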
Expect the next release to include some improvements to generated locations. For now, I still recommend keeping the terminal window open so you can see the names of things.
Edit: Fixed and tested the OpenAI backend against gpt-3.5-turbo.
Full Changelog: v0.9.1.3...v0.9.2.1