Description of the new feature / enhancement
It should be possible to configure the model used (currently fixed as `gpt-3.5-turbo`) and the endpoint (currently fixed as OpenAI's) to arbitrary values.
Scenario when this would be used?
Sending requests to an alternative AI endpoint (e.g. a local model, internal company-hosted models, or alternative AI providers), or ensuring higher-quality conversions (e.g. by pointing requests at `gpt-4o`).
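As a rough illustration only (not this repository's actual code), an OpenAI-compatible client can already be pointed at an alternative backend just by swapping the base URL and model name. The sketch below uses the Python `openai` package and Ollama's local OpenAI-compatible endpoint purely as an example; the repository may use a different library and language:

```python
from openai import OpenAI

# Illustrative only: point an OpenAI-compatible client at a locally hosted model.
# Ollama exposes an OpenAI-compatible API at this URL by default.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="not-needed-for-local-models",
)

response = client.chat.completions.create(
    model="llama3",  # any locally served model, or e.g. "gpt-4o" against OpenAI itself
    messages=[{"role": "user", "content": "Convert this snippet..."}],
)
print(response.choices[0].message.content)
```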
Supporting information
Microsoft's documentation appears to suggest that the underlying library used for AI completions supports other providers; it just needs to be provided with an endpoint.
The currently used model is a hardcoded string in this repository.
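A minimal sketch of what the requested configurability might look like, assuming environment variables as the configuration mechanism. The variable names and the use of the Python `openai` client are illustrative assumptions, not the repository's actual implementation; the default model preserves the current behaviour:

```python
import os
from openai import OpenAI

# Hypothetical configuration: variable names are illustrative, not from the repository.
MODEL = os.environ.get("AI_MODEL", "gpt-3.5-turbo")   # default keeps today's hardcoded model
BASE_URL = os.environ.get("AI_ENDPOINT")               # unset -> default OpenAI endpoint

client = OpenAI(base_url=BASE_URL, api_key=os.environ["OPENAI_API_KEY"])

def complete(prompt: str) -> str:
    """Run a chat completion against whichever model/endpoint is configured."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```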