Add inputs and outputs to component definition tool #30

@mathislucka

Overview

We need to extend the get_component_definition tool in src/deepset_mcp/tools/haystack_service.py so that it also fetches a component's inputs and outputs. You can do that by passing the component name (the last element of the path) to service.get_component_input_output.

Format the inputs and outputs nicely, including descriptions, types, and defaults.
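A minimal sketch of what the formatting could look like. The helper name and the exact output layout are assumptions for illustration, not part of the issue; only the schema keys (properties, required, type, description, default) come from the example response below.

```python
from typing import Any


def format_io_schema(schema: dict[str, Any]) -> str:
    """Render one side (input or output) of a component's I/O schema as text.

    Hypothetical helper; the real tool may format differently.
    """
    lines: list[str] = []
    properties = schema.get("properties", {})
    required = set(schema.get("required", []))
    for name, prop in properties.items():
        parts = [f"- {name}"]
        if "type" in prop:
            parts.append(f"({prop['type']})")
        if name in required:
            parts.append("[required]")
        if "default" in prop:
            parts.append(f"default={prop['default']!r}")
        lines.append(" ".join(parts))
        if prop.get("description"):
            # Indent the description under the parameter name.
            lines.append(f"    {prop['description']}")
    return "\n".join(lines) if lines else "No parameters."
```

Because every lookup goes through .get(), a schema with missing keys degrades to "No parameters." instead of raising.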

Update the test in test/unit/tools/test_haystack_service.py to reflect the addition.
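The updated test could stub the service and assert that the definition now carries I/O information. This is a sketch under assumptions: the fake's canned response is trimmed from the example in this issue, and the real suite presumably uses pytest-asyncio rather than asyncio.run.

```python
import asyncio
from typing import Any


class FakeHaystackService:
    """Stub standing in for the real service; the method name matches the snippet below."""

    async def get_component_input_output(self, component_name: str) -> dict[str, Any]:
        # Canned response shaped like the example in this issue (trimmed).
        return {
            "name": component_name,
            "input": {
                "properties": {"messages": {"type": "array"}},
                "required": ["messages"],
            },
            "output": {"properties": {}},
        }


async def check_definition_includes_io() -> None:
    service = FakeHaystackService()
    io_schema = await service.get_component_input_output("Agent")
    assert io_schema["name"] == "Agent"
    assert "messages" in io_schema["input"]["properties"]


# In the real suite this would be a pytest-asyncio test; here we just run it.
asyncio.run(check_definition_includes_io())
```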

This is what the method on the service looks like:

async def get_component_input_output(self, component_name: str) -> dict[str, Any]:
    """Fetch the component input and output schema from the API.

    Args:
        component_name: The name of the component to fetch the input/output schema for

    Returns:
        The component input/output schema as a dictionary
    """
    resp = await self._client.request(
        endpoint=f"v1/haystack/components/input-output?domain=deepset-cloud&names={component_name}",
        method="GET",
        headers={"accept": "application/json"},
        response_type=list[dict[str, Any]],
    )

    raise_for_status(resp)

    if resp.json is None or len(resp.json) == 0:
        raise ResourceNotFoundError(f"Component '{component_name}' not found.")

    # The None case is already handled above, so indexing is safe here.
    return resp.json[0]

Any errors or missing inputs and outputs should be reflected in the response.

Example
Here is an example I/O response:

{
    'input': {
        'additionalProperties': False,
        'definitions': {
            'ChatMessage': {
                'additionalProperties': False,
                'properties': {
                    '_content': {},
                    '_meta': {'properties': {...}, 'type': 'object'},
                    '_name': {'type': [...]},
                    '_python_type': {'default': 'dataclass', 'type': 'string'},
                    '_role': {'enum': [...], 'examples': [...], 'type': 'string'},
                },
                'required': ['_role', '_content'],
                'type': 'object',
            }
        },
        'properties': {
            'messages': {
                '_annotation': 'typing.List[haystack.dataclasses.chat_message.ChatMessage]',
                'description': 'List of chat messages to process',
                'items': {'$ref': '#/definitions/ChatMessage'},
                'type': 'array',
            },
            'streaming_callback': {
                '_annotation': 'typing.Union[typing.Callable[[haystack.dataclasses.streaming_chunk.StreamingChunk], NoneType], typing.Callable[[haystack.dataclasses.streaming_chunk.StreamingChunk], typing.Awaitable[NoneType]], NoneType]',
                'anyOf': [
                    {
                        'description': "A string representing a Python function path. This string is expected to contain the fully qualified name of a callable object, with each part of the path separated by dots (e.g., 'module.submodule.function'). The function must be locally available.",
                        'type': 'string',
                    },
                    {
                        'description': "A string representing a Python function path. This string is expected to contain the fully qualified name of a callable object, with each part of the path separated by dots (e.g., 'module.submodule.function'). The function must be locally available.",
                        'type': 'string',
                    },
                    {'type': 'null'},
                ],
                'description': 'A callback that will be invoked when a response is streamed from the LLM.',
            },
        },
        'required': ['messages'],
        'type': 'object',
    },
    'name': 'Agent',
    'output': {
        'description': 'No output schema available. Failed to extract since the output decorator is not found.',
        'type': 'object',
    },
}

Parse the response defensively to avoid errors.
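Defensive parsing could look like the sketch below. The key names follow the example response above; the helper name and the returned structure are assumptions.

```python
from typing import Any


def extract_io(io_response: dict[str, Any]) -> dict[str, Any]:
    """Pull the input/output property maps out of the raw response without
    assuming any key is present or well-formed.

    Hypothetical helper: returns a small, predictable structure where
    problems are collected under 'errors' instead of raising.
    """
    result: dict[str, Any] = {"input": {}, "output": {}, "errors": []}
    for side in ("input", "output"):
        section = io_response.get(side)
        if not isinstance(section, dict):
            result["errors"].append(f"No {side} schema available.")
            continue
        props = section.get("properties")
        if not isinstance(props, dict):
            # e.g. the example's output section only carries an error description
            desc = section.get("description", f"No {side} properties found.")
            result["errors"].append(desc)
            continue
        result[side] = props
    return result
```

Collecting problems into an errors list (rather than raising) lets the tool surface missing inputs/outputs in its response, as the issue asks.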
