
Cannot get LLM to use tools #1670

@MarcSkovMadsen

Description


I would like to use Lumen AI with tools from holoviz-mcp, but I cannot get the Lumen LLM to call them. For example, it never invokes even a simple tool like:

```python
async def list_projects() -> list[str]:
    """List searchable projects: a combination of HoloViz and custom projects.

    This includes both built-in HoloViz projects (panel, hvplot, etc.) and any
    custom/internal documentation projects you have configured.

    Returns
    -------
    list[str]
        A list of project names that have documentation available. Names are
        returned in hyphenated format (e.g., "panel-material-ui",
        "my-custom-project").
    """
    print("listing projects...")
    return ["panel", "panel-material-ui", "my-custom-project"]
```
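As a sanity check, the tool function itself works fine when called directly outside Lumen, so the problem seems to be in how the tool is wired into the LLM rather than in the function. A minimal standalone reproduction (using a copy of `list_projects` without the Lumen setup):

```python
import asyncio


async def list_projects() -> list[str]:
    """List searchable projects: a combination of HoloViz and custom projects."""
    return ["panel", "panel-material-ui", "my-custom-project"]


# Awaiting the coroutine directly returns the expected list,
# so the function itself behaves correctly.
result = asyncio.run(list_projects())
print(result)  # -> ['panel', 'panel-material-ui', 'my-custom-project']
```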
Full Code

```python
"""An experimental data assistant based on Lumen AI.

Don't yet expect a smooth user experience. This is a work in progress.
"""

import lumen.ai as lmai
import panel as pn
from my_package import get_model_configuration
from my_package import get_token_provider


async def list_projects() -> list[str]:
    """List searchable projects: a combination of HoloViz and custom projects.

    This includes both built-in HoloViz projects (panel, hvplot, etc.) and any
    custom/internal documentation projects you have configured.

    Returns
    -------
    list[str]
        A list of project names that have documentation available. Names are
        returned in hyphenated format (e.g., "panel-material-ui",
        "my-custom-project").
    """
    print("listing projects...")
    return ["panel", "panel-material-ui", "my-custom-project"]


def get_tools() -> list:
    """Get the list of tools available to the data assistant."""
    return [list_projects]


def _create_data_assistant() -> lmai.ui.ExplorerUI:
    default_model_config = get_model_configuration("gpt-4.1")

    default_model_kwargs = {
        "model": default_model_config.deployment_name,
        "azure_ad_token_provider": get_token_provider(),
        "api_version": default_model_config.api_version,
        "azure_endpoint": default_model_config.azure_endpoint,
    }

    model_kwargs = {"default": default_model_kwargs}

    llm = lmai.llm.AzureOpenAI(
        api_version=default_model_config.api_version,
        endpoint=default_model_config.azure_endpoint,
        model_kwargs=model_kwargs,
    )
    return lmai.ui.ExplorerUI(llm=llm, log_level="DEBUG", tools=get_tools())


if pn.state.served:
    pn.extension("vega")
    _create_data_assistant().servable()
```
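For context on why the docstring and signature above are written so carefully: LLM tool-calling setups generally derive the schema the model sees from the function's name, parameters, and docstring. A rough, framework-agnostic sketch of that derivation (the `describe_tool` helper is hypothetical, for illustration only; it is not Lumen's actual mechanism):

```python
import inspect


def describe_tool(fn) -> dict:
    """Build a minimal tool description of the kind tool-calling frameworks
    typically derive from a plain function (illustrative only)."""
    doc = inspect.getdoc(fn) or ""
    return {
        "name": fn.__name__,
        # First docstring line usually becomes the tool description.
        "description": doc.splitlines()[0] if doc else "",
        # Parameter names become the tool's argument schema.
        "parameters": list(inspect.signature(fn).parameters),
    }


async def list_projects() -> list[str]:
    """List searchable projects: a combination of HoloViz and custom projects."""
    return ["panel", "panel-material-ui", "my-custom-project"]


print(describe_tool(list_projects))
```

If the LLM never calls the tool, one thing worth checking is whether a description like this actually reaches the model in the request payload (the DEBUG log level set on `ExplorerUI` may show this).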
Screen recording: lumen-ai-tools-not-working.mp4
