[Feat] ChatGPT Integration Part 2: LLM API #283

@trangiabach

Description

This is related to #279. An article for reference.

  • For each course, create an LLM configuration object that stores the prompt, model type, and other relevant configs. There will also be a general prompt applied to all LLM configuration objects to tune the LLM to the task of answering office-hours questions based on course materials (see the configuration sketch after this list).

  • Create an API endpoint to edit these configs (for prompt-engineering the LLM); an endpoint sketch follows below.

  • Create an API endpoint that, given a search query, calls the VectorDB API to retrieve relevant documents, ingests those documents into the LLM via the OpenAI API, queries the model with the search query, and returns the response (see the retrieval-augmented answering sketch below).

  • Since this API is priced based on usage, log the token usage along with the cost for each course that uses the LLM (see the usage-logging sketch below).

  • Implement rate limits on the LLM Answering API (see the rate-limiting sketch below).
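
The sketches below are illustrative TypeScript, not a committed design. First, a possible shape for the per-course LLM configuration object; the field names, the `GENERAL_PROMPT` constant, and the `buildSystemPrompt` helper are assumptions for illustration, not a settled schema.

```typescript
// Sketch of a per-course LLM configuration record (field names are illustrative).
export interface CourseLLMConfig {
  courseId: number;
  // Course-specific prompt written by course staff.
  prompt: string;
  // OpenAI model identifier, e.g. "gpt-3.5-turbo" or "gpt-4".
  modelName: string;
  temperature: number;
  // Cap on tokens generated per answer.
  maxTokens: number;
}

// General prompt applied on top of every course's configuration,
// tuning the model to answer office-hours questions from course materials.
export const GENERAL_PROMPT =
  "You are a teaching assistant. Answer the student's question using only " +
  "the provided course materials. If the materials do not contain the answer, say so.";

// Combine the shared prompt with a course's own prompt into one system message.
export function buildSystemPrompt(config: CourseLLMConfig): string {
  return `${GENERAL_PROMPT}\n\n${config.prompt}`;
}
```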
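
A minimal sketch of the config-editing endpoint, assuming an Express-style router; the route path, the in-memory `Map` standing in for real persistence, and the module path in the import are placeholders.

```typescript
import express from "express";
// Interface from the configuration sketch above (file name is assumed).
import { CourseLLMConfig } from "./llm-config";

const router = express.Router();

// Hypothetical in-memory store; the real implementation would persist configs per course.
const configs = new Map<number, CourseLLMConfig>();

// PATCH /courses/:courseId/llm-config — edit the prompt/model settings for one course.
router.patch("/courses/:courseId/llm-config", (req, res) => {
  const courseId = Number(req.params.courseId);
  const existing = configs.get(courseId);
  if (!existing) {
    return res.status(404).json({ error: "No LLM config for this course" });
  }
  // Only allow the prompt-engineering fields to be changed.
  const { prompt, modelName, temperature, maxTokens } = req.body;
  const updated: CourseLLMConfig = {
    ...existing,
    ...(prompt !== undefined && { prompt }),
    ...(modelName !== undefined && { modelName }),
    ...(temperature !== undefined && { temperature }),
    ...(maxTokens !== undefined && { maxTokens }),
  };
  configs.set(courseId, updated);
  return res.json(updated);
});

export default router;
```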
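
A sketch of the answering endpoint's core flow, assuming the VectorDB API from #279 exposes a search route that returns document text (the URL and response shape below are guesses) and using the OpenAI chat completions API to generate the answer.

```typescript
import OpenAI from "openai";
// Helpers from the configuration sketch above (file name is assumed).
import { CourseLLMConfig, buildSystemPrompt } from "./llm-config";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Hypothetical client for the VectorDB API from #279; URL and response shape are assumptions.
async function fetchRelevantDocuments(courseId: number, query: string): Promise<string[]> {
  const res = await fetch(
    `${process.env.VECTOR_DB_URL}/courses/${courseId}/search?q=${encodeURIComponent(query)}`
  );
  const body = (await res.json()) as { documents: string[] };
  return body.documents;
}

// Answer a student's question using documents retrieved for the course.
export async function answerQuestion(
  config: CourseLLMConfig,
  question: string
): Promise<{ answer: string; promptTokens: number; completionTokens: number }> {
  const documents = await fetchRelevantDocuments(config.courseId, question);

  const completion = await openai.chat.completions.create({
    model: config.modelName,
    temperature: config.temperature,
    max_tokens: config.maxTokens,
    messages: [
      { role: "system", content: buildSystemPrompt(config) },
      // Ingest the retrieved documents as context for the model.
      { role: "system", content: `Course materials:\n${documents.join("\n---\n")}` },
      { role: "user", content: question },
    ],
  });

  return {
    answer: completion.choices[0].message.content ?? "",
    promptTokens: completion.usage?.prompt_tokens ?? 0,
    completionTokens: completion.usage?.completion_tokens ?? 0,
  };
}
```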
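
A sketch of per-course usage and cost logging. The per-1K-token prices are illustrative and should come from OpenAI's current pricing, and the `console.log` stands in for whatever persistence the real service uses.

```typescript
// Illustrative per-1K-token prices; real values should come from OpenAI's published pricing.
const PRICE_PER_1K_TOKENS: Record<string, { prompt: number; completion: number }> = {
  "gpt-3.5-turbo": { prompt: 0.0015, completion: 0.002 },
  "gpt-4": { prompt: 0.03, completion: 0.06 },
};

export interface UsageLogEntry {
  courseId: number;
  modelName: string;
  promptTokens: number;
  completionTokens: number;
  costUSD: number;
  timestamp: Date;
}

// Record one LLM call's token usage and estimated cost for a course.
export function logUsage(
  courseId: number,
  modelName: string,
  promptTokens: number,
  completionTokens: number
): UsageLogEntry {
  const price = PRICE_PER_1K_TOKENS[modelName] ?? { prompt: 0, completion: 0 };
  const costUSD =
    (promptTokens / 1000) * price.prompt + (completionTokens / 1000) * price.completion;
  const entry: UsageLogEntry = {
    courseId,
    modelName,
    promptTokens,
    completionTokens,
    costUSD,
    timestamp: new Date(),
  };
  // In the real service this would be persisted, e.g. a usage table keyed by course.
  console.log(JSON.stringify(entry));
  return entry;
}
```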
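
A sketch of rate-limiting the answering route, using the express-rate-limit package as one option; the window, cap, and route path are placeholders to tune, and keying the limit per authenticated user (rather than per IP, the package default) would be a likely refinement.

```typescript
import express from "express";
import rateLimit from "express-rate-limit";

const router = express.Router();

// Limit on the LLM answering endpoint; window and cap are placeholders to tune.
const llmAnswerLimiter = rateLimit({
  windowMs: 60 * 1000, // 1-minute window
  max: 5, // at most 5 LLM answers per client per window (keyed by IP by default)
  message: { error: "Too many questions, please wait before asking again." },
});

// Attach the limiter only to the answering route so config edits are unaffected.
router.post("/courses/:courseId/llm/answer", llmAnswerLimiter, async (req, res) => {
  // ... call answerQuestion() and logUsage() from the earlier sketches here ...
  res.status(501).json({ error: "Not implemented in this sketch" });
});

export default router;
```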
