
LLM Provider Template Management

CRUD operations for LLM Provider Template configurations

Create a new LLM provider template

POST /llm-provider-templates

Code samples

```shell
curl -X POST http://localhost:9090/api/management/v0.9/llm-provider-templates \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Add a new LLM provider template to the Gateway. A template defines token tracking and model extraction metadata for an LLM provider.

Payload

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  }
}
```
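The payload above can also be generated programmatically. The sketch below (an illustration, not part of the API) builds the same document in Python and writes the `payload.json` file referenced by the curl sample; the `field` helper is a hypothetical convenience for the repeated `{location, identifier}` extraction objects.

```python
import json

def field(location, identifier):
    # Helper for the repeated {location, identifier} extraction objects.
    return {"location": location, "identifier": identifier}

payload = {
    "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
    "kind": "LlmProviderTemplate",
    "metadata": {"name": "openai-template"},
    "spec": {
        "displayName": "OpenAI",
        "promptTokens": field("payload", "$.usage.prompt_tokens"),
        "completionTokens": field("payload", "$.usage.completion_tokens"),
        "totalTokens": field("payload", "$.usage.total_tokens"),
        "remainingTokens": field("header", "x-ratelimit-remaining-tokens"),
        "requestModel": field("payload", "$.model"),
        "responseModel": field("payload", "$.model"),
    },
}

# Write the file consumed by the curl sample's `-d @payload.json` flag.
with open("payload.json", "w") as f:
    json.dump(payload, f, indent=2)
```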

Authentication

This operation requires HTTP Basic authentication.

Required roles: admin

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| body | body | LLMProviderTemplateRequest | true | none |

Example responses

201 Response

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  },
  "status": {
    "id": "openai-template",
    "createdAt": "2026-04-24T07:21:13Z",
    "updatedAt": "2026-04-24T07:21:13Z"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 201 | Created | LLM provider template created successfully | LLMProviderTemplate |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 409 | Conflict | A template with the same name already exists | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

List all LLM provider templates

GET /llm-provider-templates

Code samples

```shell
curl -X GET http://localhost:9090/api/management/v0.9/llm-provider-templates \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

List the LLM provider templates registered in the Gateway, optionally filtered by display name.

Authentication

This operation requires HTTP Basic authentication.

Required roles: admin

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| displayName | query | string | false | Filter by template display name |

Example responses

200 Response

```json
{
  "status": "success",
  "count": 3,
  "templates": [
    {
      "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
      "kind": "LlmProviderTemplate",
      "metadata": {
        "name": "openai-template"
      },
      "spec": {
        "displayName": "OpenAI",
        "promptTokens": {
          "location": "payload",
          "identifier": "$.usage.prompt_tokens"
        },
        "completionTokens": {
          "location": "payload",
          "identifier": "$.usage.completion_tokens"
        },
        "totalTokens": {
          "location": "payload",
          "identifier": "$.usage.total_tokens"
        },
        "remainingTokens": {
          "location": "header",
          "identifier": "x-ratelimit-remaining-tokens"
        },
        "requestModel": {
          "location": "payload",
          "identifier": "$.model"
        },
        "responseModel": {
          "location": "payload",
          "identifier": "$.model"
        }
      },
      "status": {
        "id": "openai-template",
        "createdAt": "2026-04-24T07:21:13Z",
        "updatedAt": "2026-04-24T07:21:13Z"
      }
    }
  ]
}
```
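The `displayName` query parameter selects templates by their `spec.displayName`. The sketch below shows a client-side equivalent of that filter over the list-response shape shown in the example; it assumes exact, case-sensitive matching, which may differ from the server's actual semantics.

```python
def filter_templates(list_response, display_name):
    """Client-side equivalent of `?displayName=...` over a list response.

    Assumes exact, case-sensitive matching on spec.displayName; the
    server-side matching rules are an assumption here.
    """
    return [
        t for t in list_response["templates"]
        if t["spec"]["displayName"] == display_name
    ]

# Trimmed list response with two registered templates.
response = {
    "status": "success",
    "count": 2,
    "templates": [
        {"metadata": {"name": "openai-template"}, "spec": {"displayName": "OpenAI"}},
        {"metadata": {"name": "mistral-template"}, "spec": {"displayName": "Mistral"}},
    ],
}

matches = filter_templates(response, "OpenAI")
```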

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | List of LLM provider templates | Inline |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| » status | string | false | none | none |
| » count | integer | false | none | none |
| » templates | [allOf] | false | none | none |

allOf

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| »» anonymous | LLMProviderTemplateRequest | false | none | none |
| »»» apiVersion | string | true | none | Template specification version |
| »»» kind | string | true | none | Template kind |
| »»» metadata | Metadata | true | none | none |
| »»»» name | string | true | none | Unique handle for the resource |
| »»»» labels | object | false | none | Labels are key-value pairs for organizing and selecting APIs. Keys must not contain spaces. |
| »»»»» additionalProperties | string | false | none | none |
| »»»» annotations | object | false | none | Annotations are arbitrary non-identifying metadata. Use domain-prefixed keys. |
| »»»»» additionalProperties | string | false | none | none |
| »»» spec | LLMProviderTemplateData | true | none | none |
| »»»» displayName | string | true | none | Human-readable LLM Template name |
| »»»» promptTokens | ExtractionIdentifier | false | none | none |
| »»»»» location | string | true | none | Where to find the token information |
| »»»»» identifier | string | true | none | JSONPath expression or header name to identify the token value |
| »»»» completionTokens | ExtractionIdentifier | false | none | none |
| »»»» totalTokens | ExtractionIdentifier | false | none | none |
| »»»» remainingTokens | ExtractionIdentifier | false | none | none |
| »»»» requestModel | ExtractionIdentifier | false | none | none |
| »»»» responseModel | ExtractionIdentifier | false | none | none |
| »»»» resourceMappings | LLMProviderTemplateResourceMappings | false | none | none |
| »»»»» resources | [LLMProviderTemplateResourceMapping] | false | none | none |
| »»»»»» resource | string | true | none | Resource path pattern for this mapping |
| »»»»»» promptTokens | ExtractionIdentifier | false | none | none |
| »»»»»» completionTokens | ExtractionIdentifier | false | none | none |
| »»»»»» totalTokens | ExtractionIdentifier | false | none | none |
| »»»»»» remainingTokens | ExtractionIdentifier | false | none | none |
| »»»»»» requestModel | ExtractionIdentifier | false | none | none |
| »»»»»» responseModel | ExtractionIdentifier | false | none | none |

and

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| »» anonymous | object | false | none | none |
| »»» status | ResourceStatus | false | read-only | Server-managed lifecycle fields. Populated on responses. |
| »»»» id | string | false | none | Unique identifier assigned by the server (equal to metadata.name) |
| »»»» state | string | false | none | Desired deployment state reported by the server |
| »»»» createdAt | string(date-time) | false | none | Timestamp when the resource was first created (UTC) |
| »»»» updatedAt | string(date-time) | false | none | Timestamp when the resource was last updated (UTC) |
| »»»» deployedAt | string(date-time) | false | none | Timestamp when the resource was last deployed (omitted when undeployed) |

Enumerated Values

| Property | Value |
|----------|-------|
| apiVersion | gateway.api-platform.wso2.com/v1alpha1 |
| kind | LlmProviderTemplate |
| location | payload |
| location | header |
| location | queryParam |
| location | pathParam |
| state | deployed |
| state | undeployed |
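Each `ExtractionIdentifier` pairs a `location` with an `identifier`: a JSONPath expression for `payload` locations, or a name for `header` locations. The sketch below illustrates how such a pair could be resolved against a mock upstream response; it is a simplified illustration (only dotted `$.a.b` paths, no full JSONPath, and only the `payload` and `header` locations), not the Gateway's actual implementation.

```python
def resolve(extraction, payload, headers):
    """Resolve an ExtractionIdentifier against a mock upstream response.

    Simplified sketch: supports dotted `$.a.b` paths for `payload`
    locations and case-insensitive names for `header` locations.
    """
    location, identifier = extraction["location"], extraction["identifier"]
    if location == "payload":
        value = payload
        for key in identifier.removeprefix("$.").split("."):
            value = value[key]
        return value
    if location == "header":
        lowered = {k.lower(): v for k, v in headers.items()}
        return lowered[identifier.lower()]
    raise ValueError(f"unsupported location: {location}")

# A trimmed OpenAI-style chat completion response and its headers.
body = {
    "model": "gpt-4o",
    "usage": {"prompt_tokens": 12, "completion_tokens": 34, "total_tokens": 46},
}
headers = {"X-RateLimit-Remaining-Tokens": "149988"}

prompt = resolve(
    {"location": "payload", "identifier": "$.usage.prompt_tokens"}, body, headers
)
remaining = resolve(
    {"location": "header", "identifier": "x-ratelimit-remaining-tokens"}, body, headers
)
```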

Get an LLM provider template by ID

GET /llm-provider-templates/{id}

Code samples

```shell
curl -X GET http://localhost:9090/api/management/v0.9/llm-provider-templates/{id} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Get an LLM provider template by its ID.

Authentication

This operation requires HTTP Basic authentication.

Required roles: admin

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique public identifier for the LLM provider template |

Example responses

200 Response

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  },
  "status": {
    "id": "openai-template",
    "createdAt": "2026-04-24T07:21:13Z",
    "updatedAt": "2026-04-24T07:21:13Z"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | LLM provider template details | LLMProviderTemplate |
| 404 | Not Found | LLM provider template not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Update an existing LLM provider template

PUT /llm-provider-templates/{id}

Code samples

```shell
curl -X PUT http://localhost:9090/api/management/v0.9/llm-provider-templates/{id} \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Update an existing LLM provider template in the Gateway.

Payload

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  }
}
```
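A common way to build this PUT body is read-modify-write: fetch the template, change the desired `spec` fields, and submit the full resource. The sketch below prepares such a body from a previously fetched template; dropping the server-managed `status` block before resubmitting is an assumption here (the API may equally ignore it), and `build_update_body` is a hypothetical helper, not part of the API.

```python
import copy

def build_update_body(current, spec_changes):
    """Prepare a PUT body from a previously fetched template.

    `status` is documented as server-managed (read-only), so it is
    dropped; whether the server ignores or rejects a submitted status
    block is an assumption in this sketch.
    """
    body = copy.deepcopy(current)
    body.pop("status", None)
    body["spec"].update(spec_changes)
    return body

# Trimmed template as returned by GET /llm-provider-templates/{id}.
fetched = {
    "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
    "kind": "LlmProviderTemplate",
    "metadata": {"name": "openai-template"},
    "spec": {"displayName": "OpenAI"},
    "status": {"id": "openai-template"},
}

update = build_update_body(fetched, {"displayName": "OpenAI (prod)"})
```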

Authentication

This operation requires HTTP Basic authentication.

Required roles: admin

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique public identifier of the template to update |
| body | body | LLMProviderTemplateRequest | true | none |

Example responses

200 Response

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProviderTemplate",
  "metadata": {
    "name": "openai-template"
  },
  "spec": {
    "displayName": "OpenAI",
    "promptTokens": {
      "location": "payload",
      "identifier": "$.usage.prompt_tokens"
    },
    "completionTokens": {
      "location": "payload",
      "identifier": "$.usage.completion_tokens"
    },
    "totalTokens": {
      "location": "payload",
      "identifier": "$.usage.total_tokens"
    },
    "remainingTokens": {
      "location": "header",
      "identifier": "x-ratelimit-remaining-tokens"
    },
    "requestModel": {
      "location": "payload",
      "identifier": "$.model"
    },
    "responseModel": {
      "location": "payload",
      "identifier": "$.model"
    }
  },
  "status": {
    "id": "openai-template",
    "createdAt": "2026-04-24T07:21:13Z",
    "updatedAt": "2026-04-24T07:21:13Z"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | LLM provider template updated successfully | LLMProviderTemplate |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM provider template not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Delete an LLM provider template

DELETE /llm-provider-templates/{id}

Code samples

```shell
curl -X DELETE http://localhost:9090/api/management/v0.9/llm-provider-templates/{id} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Delete an LLM provider template from the Gateway.

Authentication

This operation requires HTTP Basic authentication.

Required roles: admin

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique public identifier of the template to delete |

Example responses

200 Response

```json
{
  "status": "success",
  "message": "LLM provider template deleted successfully",
  "id": "openai-template"
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | LLM provider template deleted successfully | Inline |
| 404 | Not Found | LLM provider template not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| » status | string | false | none | none |
| » message | string | false | none | none |
| » id | string | false | none | none |