
# LLM Provider Template Management

CRUD operations for LLM provider template configurations.

## Create a new LLM provider template

`POST /llm-provider-templates`

### Code samples

```shell
curl -X POST http://localhost:9090/llm-provider-templates \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Add a new LLM provider template to the Gateway. A template defines token tracking and model extraction metadata for an LLM provider.

### Payload

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  }
}
```
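Each `identifier` in the spec is either a JSONPath expression (when `location` is `payload`) or a response header name (when `location` is `header`). As an illustration of what the payload identifiers mean — not the Gateway's actual implementation — here is a minimal resolver for the dotted JSONPath form used above, applied to an OpenAI-style response body with made-up token counts:

```python
import json

def resolve_payload_path(payload: dict, json_path: str):
    """Resolve a simple dotted JSONPath like '$.usage.prompt_tokens'."""
    value = payload
    for key in json_path.removeprefix("$.").split("."):
        value = value[key]
    return value

# An OpenAI-style chat completion response body (illustrative values).
response_body = json.loads("""{
  "model": "gpt-4o",
  "usage": {"prompt_tokens": 12, "completion_tokens": 34, "total_tokens": 46}
}""")

prompt_tokens = resolve_payload_path(response_body, "$.usage.prompt_tokens")
total_tokens = resolve_payload_path(response_body, "$.usage.total_tokens")
response_model = resolve_payload_path(response_body, "$.model")
```

Header-located identifiers such as `x-ratelimit-remaining-tokens` would instead be read from the upstream response headers.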

### Authentication

This operation requires HTTP Basic authentication.

Required roles: `admin`

### Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| body | body | LLMProviderTemplate | true | none |

### Example responses

201 Response

```json
{
  "status": "success",
  "message": "LLM provider template created successfully",
  "id": "openai",
  "createdAt": "2025-10-11T10:30:00Z"
}
```

### Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 201 | Created | LLM provider template created successfully | LLMProviderTemplateCreateResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 409 | Conflict | A template with the same name already exists | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
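The `-u {username}:{password}` flag in the curl sample resolves to a Basic `Authorization` header. A hedged sketch, using only the Python standard library, of how a client might prepare the same POST request; the request is constructed but deliberately not sent, and the base URL and payload shown in the usage line are placeholders:

```python
import base64
import json
import urllib.request

def build_create_request(base_url: str, username: str, password: str,
                         payload: dict) -> urllib.request.Request:
    """Prepare POST /llm-provider-templates (constructed, not sent)."""
    # Basic auth: base64 of "username:password".
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url=f"{base_url}/llm-provider-templates",
        data=json.dumps(payload).encode(),
        method="POST",
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
    )

# Placeholder host and credentials; substitute your own.
req = build_create_request("http://localhost:9090", "admin", "admin",
                           {"kind": "LlmProviderTemplate"})
```

Sending it is then a matter of `urllib.request.urlopen(req)` (or any HTTP client), with the 201/400/409/500 outcomes from the table above.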

## List all LLM provider templates

`GET /llm-provider-templates`

### Code samples

```shell
curl -X GET http://localhost:9090/llm-provider-templates \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

List the LLM provider templates registered in the Gateway, optionally filtered by display name.

### Authentication

This operation requires HTTP Basic authentication.

Required roles: `admin`

### Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| displayName | query | string | false | Filter by template display name |

### Example responses

200 Response

```json
{
  "status": "success",
  "count": 3,
  "templates": [
    {
      "id": "openai",
      "displayName": "OpenAI",
      "createdAt": "2025-10-11T10:30:00Z",
      "updatedAt": "2025-10-11T10:30:00Z"
    }
  ]
}
```

### Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | List of LLM provider templates | Inline |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

### Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » status | string | false | none | none |
| » count | integer | false | none | none |
| » templates | [LLMProviderTemplateListItem] | false | none | none |
| »» id | string | false | none | none |
| »» displayName | string | false | none | none |
| »» createdAt | string(date-time) | false | none | none |
| »» updatedAt | string(date-time) | false | none | none |

## Get an LLM provider template by ID

`GET /llm-provider-templates/{id}`

### Code samples

```shell
curl -X GET http://localhost:9090/llm-provider-templates/{id} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Get an LLM provider template by its ID.

### Authentication

This operation requires HTTP Basic authentication.

Required roles: `admin`

### Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique public identifier for the LLM provider template |

### Example responses

200 Response

```json
{
  "status": "success",
  "template": {
    "id": "openai",
    "configuration": {
      "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
      "kind": "LlmProviderTemplate",
      "metadata": {
        "name": "openai-template"
      },
      "spec": {
        "displayName": "OpenAI",
        "promptTokens": {
          "location": "payload",
          "identifier": "$.usage.prompt_tokens"
        },
        "completionTokens": {
          "location": "payload",
          "identifier": "$.usage.completion_tokens"
        },
        "totalTokens": {
          "location": "payload",
          "identifier": "$.usage.total_tokens"
        },
        "remainingTokens": {
          "location": "header",
          "identifier": "x-ratelimit-remaining-tokens"
        },
        "requestModel": {
          "location": "payload",
          "identifier": "$.model"
        },
        "responseModel": {
          "location": "payload",
          "identifier": "$.model"
        }
      }
    },
    "metadata": {
      "createdAt": "2025-10-11T10:30:00Z",
      "updatedAt": "2025-10-11T10:30:00Z"
    }
  }
}
```
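Note that `metadata` appears twice in this response: once inside `configuration` (the resource name) and once at the top level of `template` (the timestamps). A sketch of pulling the commonly needed fields out of a parsed detail response; the field selection here is an illustrative choice, not a schema defined by the API:

```python
def summarize_template(detail_response: dict) -> dict:
    """Flatten a GET /llm-provider-templates/{id} response (sketch)."""
    template = detail_response["template"]
    spec = template["configuration"]["spec"]
    return {
        "id": template["id"],
        "displayName": spec["displayName"],
        "promptTokensPath": spec["promptTokens"]["identifier"],
        # Timestamps live in the outer metadata, not the configuration's.
        "updatedAt": template["metadata"]["updatedAt"],
    }

# Trimmed-down version of the example response above.
detail = {
    "status": "success",
    "template": {
        "id": "openai",
        "configuration": {"spec": {
            "displayName": "OpenAI",
            "promptTokens": {"location": "payload",
                             "identifier": "$.usage.prompt_tokens"},
        }},
        "metadata": {"updatedAt": "2025-10-11T10:30:00Z"},
    },
}
summary = summarize_template(detail)
```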

### Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | LLM provider template details | LLMProviderTemplateDetailResponse |
| 404 | Not Found | LLM provider template not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

## Update an existing LLM provider template

`PUT /llm-provider-templates/{id}`

### Code samples

```shell
curl -X PUT http://localhost:9090/llm-provider-templates/{id} \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Update an existing LLM provider template in the Gateway.

### Payload

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  }
}
```

### Authentication

This operation requires HTTP Basic authentication.

Required roles: `admin`

### Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique public identifier of the template to update |
| body | body | LLMProviderTemplate | true | none |

### Example responses

200 Response

```json
{
  "status": "success",
  "message": "LLM provider template updated successfully",
  "id": "openai",
  "updatedAt": "2025-10-11T11:45:00Z"
}
```

### Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | LLM provider template updated successfully | LLMProviderTemplateUpdateResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM provider template not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
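Because `PUT` returns 404 for a missing template while `POST` returns 409 for an existing one, a client can combine the two operations into an upsert. A sketch of the status-code handling only — no network calls, and the retry policy is a client-side assumption, not something the API mandates:

```python
def next_upsert_step(method: str, status: int) -> str:
    """Decide the next action in a PUT-then-POST upsert (sketch)."""
    if status in (200, 201):
        return "done"
    if method == "PUT" and status == 404:
        return "try_post"     # template does not exist yet: create it
    if method == "POST" and status == 409:
        return "try_put"      # created concurrently: update instead
    if status == 400:
        return "fix_payload"  # validation failed; retrying won't help
    return "fail"             # 500 and anything unexpected
```

A driver loop would start with `PUT /llm-provider-templates/{id}` and follow the returned action until it reaches `done`, `fix_payload`, or `fail`.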

## Delete an LLM provider template

`DELETE /llm-provider-templates/{id}`

### Code samples

```shell
curl -X DELETE http://localhost:9090/llm-provider-templates/{id} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Delete an LLM provider template from the Gateway.

### Authentication

This operation requires HTTP Basic authentication.

Required roles: `admin`

### Parameters

| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique public identifier of the template to delete |

### Example responses

200 Response

```json
{
  "status": "success",
  "message": "LLM provider template deleted successfully",
  "id": "openai"
}
```

### Responses

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | LLM provider template deleted successfully | Inline |
| 404 | Not Found | LLM provider template not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

### Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » status | string | false | none | none |
| » message | string | false | none | none |
| » id | string | false | none | none |
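From the caller's perspective, deletion is naturally idempotent: 200 means the template was removed, and 404 means it was already gone. A sketch of response handling that optionally treats both as success — whether a client should do this is a policy choice, not something the API specifies:

```python
def delete_succeeded(status: int, treat_missing_as_deleted: bool = True) -> bool:
    """Interpret the DELETE /llm-provider-templates/{id} status (sketch)."""
    if status == 200:
        return True           # template deleted on this call
    if status == 404:
        return treat_missing_as_deleted  # already absent
    return False              # 500 or anything unexpected
```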