use the local llm in the agent #132
base: main
Conversation
Actionable comments posted: 0
🔭 Outside diff range comments (1)

llamaindex-mcp/ollama_client.ipynb (1)

164-172: 🛠️ Refactor suggestion — Avoid hidden dependency on the global `llm` variable

`get_agent` now closes over the notebook-scoped `llm`. This couples the helper to cell-execution order and makes the function unusable if someone imports it from another module, or runs the cell before defining `llm`, resulting in a `NameError`. Prefer explicit injection:

```diff
-async def get_agent(tools: McpToolSpec):
+async def get_agent(tools: McpToolSpec, llm=Settings.llm):
     tools = await tools.to_tool_list_async()
     agent = FunctionAgent(
         name="Agent",
         description="An agent that can work with Our Database software.",
         tools=tools,
         llm=llm,
         system_prompt=SYSTEM_PROMPT,
     )
     return agent
```

Benefits:
• Keeps the function pure/portable.
• Allows callers to swap models easily (e.g., tests can pass a mock LLM).
• Removes the cell-ordering pitfall.
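The injection pattern suggested above can be sketched in isolation. This is a minimal, self-contained illustration (with `FakeLLM` and `get_agent_factory` as hypothetical stand-ins for the notebook's LLM object and `get_agent`, not the real llama_index API):

```python
# Hypothetical stand-ins to demonstrate explicit dependency injection.
class FakeLLM:
    """Stand-in for an LLM client such as the notebook's Ollama model."""
    def __init__(self, name: str):
        self.name = name

DEFAULT_LLM = FakeLLM("default-model")

def get_agent_factory(llm: FakeLLM = DEFAULT_LLM) -> dict:
    # The llm is an explicit parameter, so the function no longer depends
    # on a notebook-scoped global or on cell-execution order.
    return {"llm": llm.name}

# Callers can rely on the default...
assert get_agent_factory()["llm"] == "default-model"
# ...or inject a replacement, e.g. a mock LLM in tests.
assert get_agent_factory(FakeLLM("mock"))["llm"] == "mock"
```

The same idea carries over to the notebook: passing `llm` as a defaulted parameter keeps the helper importable and testable without any hidden globals.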
🧹 Nitpick comments (1)

llamaindex-mcp/ollama_client.ipynb (1)

165-166: Variable shadowing: `tools` is re-assigned

The `tools` parameter is immediately overwritten by the local list, which makes stack traces harder to read and prevents later reuse of the original spec. Rename the local variable to avoid shadowing:

```diff
-    tools = await tools.to_tool_list_async()
+    tool_list = await tools.to_tool_list_async()
```

A tiny change, but it improves clarity and debuggability.
Use the local llm defined earlier. Tools are being passed to the `get_agent` function. Remove the redundant call to fetch those.