
[Bug Report]: Connection to Hugging Face fails with TypeError on macOS #1549

@bmthethwa1

Description


Version

Version 0.2.4 (0.2.4)


Model Provider

Other (Hugging Face; specified in description)

Problem Description

Model Provider: Hugging Face

When setting up the VLM provider for Hugging Face on a new installation, the connection fails with a TypeError, which prevents the application from being used.

Steps to Reproduce

  1. Install UI-TARS (e.g., version 0.2.4) on a macOS Apple Silicon (M1) machine using Homebrew.
  2. Launch the application and proceed to the VLM Settings for the first-time setup.
  3. Select a Hugging Face provider (e.g., Hugging Face for UI-TARS-1.5).
  4. Enter a valid 'Read' Access Token from the Hugging Face website.
  5. Manually enter the VLM Base URL (https://api-inference.huggingface.co) and VLM Model Name (Bytecore/UI-TARS-1.5).
  6. Click the "Check Model Availability" button (a standalone probe of the same endpoint is sketched below this list).
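
To rule out the desktop app itself, the same availability check can be reproduced outside it. Below is a minimal sketch (TypeScript, Node 18+ built-in fetch), assuming the app issues an OpenAI-compatible chat completions request built from the configured VLM Base URL and Model Name; the exact request path and the HF_TOKEN environment variable are my assumptions, not the app's actual code.

```ts
// Minimal standalone probe of the configured endpoint (Node 18+, built-in fetch).
// Assumptions: the app builds an OpenAI-compatible chat completions URL from the
// VLM Base URL, and HF_TOKEN holds the 'Read' access token entered in settings.
const baseURL = "https://api-inference.huggingface.co";
const model = "Bytecore/UI-TARS-1.5";

async function probe(): Promise<void> {
  const res = await fetch(`${baseURL}/v1/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.HF_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: "ping" }],
      max_tokens: 1,
    }),
  });

  // A non-200 status or an `error` field in the body would explain why the
  // in-app "Check Model Availability" call fails.
  console.log(res.status, await res.text());
}

probe().catch(console.error);
```

If this standalone probe already returns a non-200 status or an error body, the problem lies with the endpoint, model name, or token rather than with the app's response handling.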

Expected behavior
The model availability check should pass, allowing the setup to complete.

Actual behavior
The check immediately fails with the error shown in the Error Logs section below. This occurs for both the 1.0 and 1.5 models.

Desktop:

  • OS: macOS (Apple Silicon M1)

Error Logs

Failed to connect to model: Error invoking remote method 'checkModelAvailability': TypeError: Cannot read properties of undefined (reading '0')
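
For what it's worth, this message is the classic signature of indexing `[0]` into a response field that is missing, e.g. when the endpoint returns an error payload instead of an OpenAI-style completion. A purely hypothetical sketch (the field names are assumptions; this is not the actual checkModelAvailability code):

```ts
// Hypothetical illustration only: if the handler assumes an OpenAI-style success
// body, an error payload from the endpoint makes the first `[0]` access throw
// the exact TypeError shown above.
interface ChatCompletion {
  choices: { message: { content: string } }[]; // assumed to always exist
}

// The kind of body the endpoint may return when the model or URL is wrong:
const body = '{"error":"Model not found"}';
const parsed: ChatCompletion = JSON.parse(body);

// Throws: TypeError: Cannot read properties of undefined (reading '0')
console.log(parsed.choices[0]);
```

If that is what is happening, the raw response from the standalone probe above should show the underlying cause.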

