Merge: feat/464-update-input-support-multiple-keys → development (#493)
Conversation
Caution: Review failed — the pull request is closed.

Walkthrough

Replaces top-level message payloads with a structured LLMInput and new LLM schemas; adds an LLM discovery service and model-filtering utilities; updates flows, the stream pipeline, routes, auth, presidio, and assistant services to use the new LLMInput.
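The walkthrough above can be sketched as code. This is a minimal, hypothetical sketch of the new request shape and a model-filtering helper — the names `LLMInput` and `LLMRequest` come from the PR, but the exact fields and the `filter_models` helper are assumptions, not the actual schemas:

```python
from typing import List, Literal, Optional
from pydantic import BaseModel

class Message(BaseModel):
    # Assumed message shape; the real schema may carry more fields.
    role: Literal["system", "user", "assistant"]
    content: str

class LLMInput(BaseModel):
    # Structured input wrapper replacing the old top-level messages payload.
    messages: List[Message]

class LLMRequest(BaseModel):
    # Request body for POST /llm: messages now live under "input".
    input: LLMInput
    model: Optional[str] = None

def filter_models(available: List[str], allowed_prefixes: List[str]) -> List[str]:
    """Hypothetical model-filtering utility: keep ids matching an allowed prefix."""
    return [m for m in available if any(m.startswith(p) for p in allowed_prefixes)]

# A body like {"input": {"messages": [...]}} validates into the nested structure.
req = LLMRequest.model_validate(
    {"input": {"messages": [{"role": "user", "content": "hi"}]}}
)
print(req.input.messages[0].content)                      # hi
print(filter_models(["gpt-4o", "claude-3"], ["gpt-"]))    # ['gpt-4o']
```

Wrapping messages in `input` leaves room to add sibling request fields later without another breaking change to the payload shape.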
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor Client
    participant Frontend
    participant API as FastAPI
    participant Auth
    participant InitConfig as init_config
    participant AgentCtor as construct_agent
    participant StreamGen as stream_generator
    participant Orchestra
    Note right of Frontend: Frontend wraps messages in LLMInput
    Client->>Frontend: send messages
    Frontend->>API: POST /llm (body: input:{ messages })
    API->>Auth: get_optional_user(params: LLMRequest)
    API->>InitConfig: init_config(params, user)
    InitConfig-->>API: RunnableConfig
    API->>AgentCtor: construct_agent(system_prompt, tools, model, subagents, config)
    AgentCtor-->>Orchestra: create agent/orchestra
    API->>StreamGen: stream_generator(input: LLMInput, model, system_prompt, tools, subagents, config, service_context)
    StreamGen->>Orchestra: stream using input.messages
    Orchestra-->>StreamGen: stream chunks & final_state
    StreamGen-->>API: final_state -> thread update (last human message)
    API-->>Frontend: stream SSE + final update
    Frontend-->>Client: render assistant response
```
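The streaming leg of the diagram can be sketched as an async generator. This is an illustrative sketch only: `stream_generator`, `LLMInput`, and `final_state` appear in the PR, but the event shapes, the stand-in chunks, and the surrounding `run` driver are assumptions:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class LLMInput:
    # Assumed minimal shape: a list of {"role": ..., "content": ...} dicts.
    messages: list

async def stream_generator(inp: LLMInput, model: str, config: dict):
    """Hypothetical sketch: yield chunks from the agent, then the final state."""
    final_state = {"messages": list(inp.messages)}  # stand-in for agent output
    for word in ["assistant", "reply"]:             # stand-in for streamed chunks
        yield {"type": "chunk", "data": word}
    yield {"type": "final", "data": final_state}

async def run():
    inp = LLMInput(messages=[{"role": "user", "content": "hello"}])
    chunks, final = [], None
    async for event in stream_generator(inp, "some-model", config={}):
        if event["type"] == "chunk":
            chunks.append(event["data"])   # would be forwarded to the client as SSE
        else:
            final = event["data"]
    # Per the diagram, the thread update uses the last human message in final_state.
    last_human = [m for m in final["messages"] if m["role"] == "user"][-1]
    return chunks, last_human["content"]

chunks, last = asyncio.run(run())
print(chunks, last)  # ['assistant', 'reply'] hello
```

Keeping the final state as the generator's last event lets the route both stream chunks to the client and perform the thread update from one pass over the agent output.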
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Closes #464