from typing import Type

from dotenv import load_dotenv
from langchain_core.tools import BaseTool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from pydantic import BaseModel, Field


class NmapInput(BaseModel):
    network_segment: str = Field(description="the target's IP address segment")


class Nmap(BaseTool):
    # Class attributes need type annotations so Pydantic treats them as fields.
    name: str = "nmap"
    description: str = (
        "useful for when you need to determine the target's IP address "
        "based on a network segment"
    )
    args_schema: Type[BaseModel] = NmapInput

    def _run(self, network_segment: str) -> str:
        print(f"{network_segment = }")
        return f"Scanning {network_segment}..."


load_dotenv()
llm = ChatOpenAI(model="gpt-4o-mini")
# Pass an instance of the tool (Nmap()), not the class itself.
scan_agent = create_react_agent(llm, tools=[Nmap()])
outputs = scan_agent.invoke({"messages": [("human", "I want to find 127.0.0.1")]})
print(outputs["messages"][-1].content)
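As an aside, and assuming a reasonably recent langchain_core, the same tool can also be written with the @tool decorator, which derives the name, description, and argument schema from the function signature and docstring and sidesteps the class-attribute issues entirely. A minimal sketch, reusing llm and create_react_agent from the snippet above:

from langchain_core.tools import tool

@tool
def nmap(network_segment: str) -> str:
    """Determine the target's IP address based on a network segment."""
    print(f"{network_segment = }")
    return f"Scanning {network_segment}..."

# The decorated function is already a BaseTool instance,
# so it can be passed to the agent directly.
scan_agent = create_react_agent(llm, tools=[nmap])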
Issue with current documentation:
Official sample code link: https://github.com/langchain-ai/langgraph/blob/main/examples/multi_agent/hierarchical_agent_teams.ipynb
Idea or request for content:
The custom tool is defined as shown in the screenshot below (some lines are highlighted in red because I removed parts of the function before capturing it; please ignore that):

[screenshot of the custom tool definition]

The agent node in LangGraph is defined as shown in a second screenshot; this is the part that seems wrong.

The error is one of the following:
TypeError: Object of type 'ModelMetaclass' is not JSON serializable
or
TypeError: _run() missing 2 required positional arguments: 'self' and 'network_segment'
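A hedged guess, since the failing code is only visible in the screenshots: both messages look like what happens when a class is handed to the agent where an instance is expected. A Pydantic model class ending up in something that gets JSON-serialized produces the ModelMetaclass error, and registering the tool class instead of a tool instance means _run is called unbound, which produces the missing 'self' error. A minimal sketch of the suspected mistake and the fix, reusing llm and the Nmap tool defined in the reply above:

# Suspected mistake (assumption, not confirmed by the screenshots):
# registering the class itself, so _run is later called without self.
# broken_agent = create_react_agent(llm, tools=[Nmap])

# Fix: instantiate the tool before handing it to the agent.
working_agent = create_react_agent(llm, tools=[Nmap()])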