Description
🔖 Feature description
- Simplify the tool call logic and move it from https://github.com/arc53/DocsGPT/blob/main/application/agents/llm_handler.py into the LLM abstraction
- Check whether a specific LLM model supports a given feature (attachments, tools)
- Rename env variables: LLM_NAME -> LLM_PROVIDER, MODEL_NAME -> LLM_NAME, MODEL_PATH -> LLM_PATH, MODEL_TOKEN_LIMITS -> LLM_TOKEN_LIMITS
- Include native support for a fallback LLM (FALLBACK_LLM_PROVIDER and FALLBACK_LLM_NAME), triggered when the main LLM call errors (see the sketch after this list)
- While streaming a response, tool calls need to show up as they are requested and again when they complete
- Adding agent implementations (types) should become much simpler
- Go through the whole app and make sure that all names make sense in relation to LLMs, models, and so on
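A minimal sketch of what the refactored abstraction could look like. All names here (`BaseLLM`, `gen_stream`, `ToolCallEvent`, the placeholder `OpenAILLM`, the registry in `build_llm`) are hypothetical and only illustrate the capability flags, the error-triggered fallback, and tool-call events surfacing during streaming; they are not a proposal for the final API:

```python
# Rough sketch only: class names, method signatures, the registry and the
# placeholder OpenAILLM class are assumptions for illustration, not the
# final DocsGPT API.
import os
from dataclasses import dataclass
from typing import Iterator, Optional


@dataclass
class ToolCallEvent:
    """Yielded mid-stream so the frontend can show a tool call when it is
    requested and again when it completes."""
    tool_name: str
    status: str  # "requested" or "completed"
    result: Optional[str] = None


class BaseLLM:
    """One place for tool-call handling, capability checks and fallback."""

    # Capability flags agents can check instead of branching per provider.
    supports_tools: bool = False
    supports_attachments: bool = False

    def __init__(self, model: str, fallback: Optional["BaseLLM"] = None):
        self.model = model
        self.fallback = fallback

    def _raw_stream(self, messages, tools):
        """Provider-specific streaming, implemented by subclasses."""
        raise NotImplementedError

    def gen_stream(self, messages, tools=None) -> Iterator[object]:
        """Stream text chunks and ToolCallEvents; switch to the fallback on error."""
        if tools and not self.supports_tools:
            raise ValueError(f"{type(self).__name__} does not support tool calls")
        try:
            yield from self._raw_stream(messages, tools)
        except Exception:
            if self.fallback is None:
                raise
            yield from self.fallback.gen_stream(messages, tools)


class OpenAILLM(BaseLLM):
    """Placeholder provider; a real one would wrap the provider SDK and
    translate its tool-call deltas into ToolCallEvent objects."""
    supports_tools = True
    supports_attachments = True

    def _raw_stream(self, messages, tools):
        yield ToolCallEvent("example_tool", "requested")
        yield ToolCallEvent("example_tool", "completed", result="tool output")
        yield "final answer text"


def build_llm() -> BaseLLM:
    """Wire the main and fallback LLMs from the renamed env variables."""
    registry = {"openai": OpenAILLM}
    provider = os.getenv("LLM_PROVIDER", "openai")
    model = os.getenv("LLM_NAME", "gpt-4o-mini")
    fallback = None
    if os.getenv("FALLBACK_LLM_PROVIDER"):
        fallback = registry[os.getenv("FALLBACK_LLM_PROVIDER")](os.getenv("FALLBACK_LLM_NAME"))
    return registry[provider](model, fallback=fallback)
```

Keeping the fallback and the tool-call translation inside `gen_stream` is what would let agents stay provider-agnostic.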
🎤 Why is this feature needed?
To clean up the code and ensure consistent behavior.
✌️ How do you aim to achieve this?
Change the existing LLM abstraction.
Move the logic from llm_handler in agents into the LLM abstraction (a rough sketch of the resulting agent side follows).
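Once the llm_handler logic lives in the LLM abstraction, an agent could shrink to something like the following. Again a hypothetical sketch building on the `BaseLLM`/`gen_stream` shape assumed above; `SimpleAgent` is not an existing class:

```python
# Hypothetical agent after the move: it no longer parses provider-specific
# tool calls, it just forwards events coming out of the LLM stream.
from typing import Iterator, Optional


class SimpleAgent:
    """Adding a new agent type becomes deciding what to send to the LLM,
    not how to call it."""

    def __init__(self, llm, tools: Optional[list[dict]] = None):
        self.llm = llm
        self.tools = tools or []

    def gen(self, messages: list[dict]) -> Iterator[object]:
        # Capability check goes through the abstraction; fallback and
        # tool-call events are handled inside llm.gen_stream().
        tools = self.tools if self.llm.supports_tools else None
        yield from self.llm.gen_stream(messages, tools=tools)
```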
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- I checked and didn't find a similar issue
Are you willing to submit PR?
None