
LLM Proxy Management

CRUD operations for LLM Proxy configurations

Create a new LLM proxy

POST /llm-proxies

Code samples

```shell
curl -X POST http://localhost:9090/api/management/v0.9/llm-proxies \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Add a new LLM proxy to the Gateway. A proxy defines how to interact with an LLM service deployed in the Gateway, including authentication and policies.

Payload

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProxy",
  "metadata": {
    "name": "openai-proxy"
  },
  "spec": {
    "displayName": "OpenAI Proxy",
    "version": "v1.0",
    "context": "/openai-proxy",
    "provider": {
      "id": "wso2-openai-provider"
    },
    "policies": []
  }
}
```

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, developer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| body | body | LLMProxyConfigurationRequest | true | LLM proxy in YAML or JSON format |

Example responses

201 Response

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProxy",
  "metadata": {
    "name": "openai-proxy"
  },
  "spec": {
    "displayName": "OpenAI Proxy",
    "version": "v1.0",
    "context": "/openai-proxy",
    "provider": {
      "id": "wso2-openai-provider"
    },
    "policies": []
  },
  "status": {
    "id": "openai-proxy",
    "state": "deployed",
    "createdAt": "2026-04-24T07:21:13Z",
    "updatedAt": "2026-04-24T07:21:13Z",
    "deployedAt": "2026-04-24T07:21:13Z"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 201 | Created | LLM proxy created and deployed successfully | LLMProxyConfiguration |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 409 | Conflict | Proxy with the same name and version already exists | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
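As a sketch, the same create call can be driven from Python's standard library. The base URL, credentials, and payload values below are the placeholders from this page, not fixed values:

```python
import base64
import json
import urllib.request

BASE_URL = "http://localhost:9090/api/management/v0.9"  # placeholder

# Payload mirroring the example above.
payload = {
    "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
    "kind": "LlmProxy",
    "metadata": {"name": "openai-proxy"},
    "spec": {
        "displayName": "OpenAI Proxy",
        "version": "v1.0",
        "context": "/openai-proxy",
        "provider": {"id": "wso2-openai-provider"},
        "policies": [],
    },
}

def create_llm_proxy(username: str, password: str) -> urllib.request.Request:
    """Build the POST request; pass it to urllib.request.urlopen() to send."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{BASE_URL}/llm-proxies",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="POST",
    )
```

A 201 response carries the resource back with a server-populated `status` block, as shown in the example response above.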

List all LLM proxies

GET /llm-proxies

Code samples

```shell
curl -X GET http://localhost:9090/api/management/v0.9/llm-proxies \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

List LLM proxies registered in the Gateway, optionally filtered by displayName, version, context, status, or vhost.

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, developer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| displayName | query | string | false | Filter by LLM proxy displayName |
| version | query | string | false | Filter by LLM proxy version |
| context | query | string | false | Filter by LLM proxy context/path |
| status | query | string | false | Filter by deployment status |
| vhost | query | string | false | Filter by LLM proxy vhost |

Enumerated Values

| Parameter | Value |
|-----------|-------|
| status | deployed |
| status | undeployed |
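A small helper can assemble the filtered listing URL. The filter names come from the parameter table above; the base URL is a placeholder:

```python
from urllib.parse import urlencode

BASE_URL = "http://localhost:9090/api/management/v0.9"  # placeholder

def build_list_url(**filters: str) -> str:
    """Append only the query filters that are actually set."""
    allowed = {"displayName", "version", "context", "status", "vhost"}
    unknown = set(filters) - allowed
    if unknown:
        raise ValueError(f"unsupported filter(s): {sorted(unknown)}")
    query = urlencode(filters)
    return f"{BASE_URL}/llm-proxies" + (f"?{query}" if query else "")
```

For example, `build_list_url(status="deployed")` yields the listing URL with `?status=deployed` appended.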

Example responses

200 Response

```json
{
  "status": "success",
  "count": 1,
  "proxies": [
    {
      "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
      "kind": "LlmProxy",
      "metadata": {
        "name": "openai-proxy"
      },
      "spec": {
        "displayName": "OpenAI Proxy",
        "version": "v1.0",
        "context": "/openai-proxy",
        "provider": {
          "id": "wso2-openai-provider"
        },
        "policies": []
      },
      "status": {
        "id": "openai-proxy",
        "state": "deployed",
        "createdAt": "2026-04-24T07:21:13Z",
        "updatedAt": "2026-04-24T07:21:13Z",
        "deployedAt": "2026-04-24T07:21:13Z"
      }
    }
  ]
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | List of LLM proxies | Inline |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| » status | string | false | none | none |
| » count | integer | false | none | none |
| » proxies | [allOf] | false | none | none |

allOf

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| »» anonymous | LLMProxyConfigurationRequest | false | none | none |
| »»» apiVersion | string | true | none | Proxy specification version |
| »»» kind | string | true | none | Proxy kind |
| »»» metadata | Metadata | true | none | none |
| »»»» name | string | true | none | Unique handle for the resource |
| »»»» labels | object | false | none | Labels are key-value pairs for organizing and selecting APIs. Keys must not contain spaces. |
| »»»»» additionalProperties | string | false | none | none |
| »»»» annotations | object | false | none | Annotations are arbitrary non-identifying metadata. Use domain-prefixed keys. |
| »»»»» additionalProperties | string | false | none | none |
| »»» spec | LLMProxyConfigData | true | none | none |
| »»»» displayName | string | true | none | Human-readable LLM proxy name (must be URL-friendly - only letters, numbers, spaces, hyphens, underscores, and dots allowed) |
| »»»» version | string | true | none | Semantic version of the LLM proxy |
| »»»» context | string | false | none | Base path for all API routes (must start with /, no trailing slash) |
| »»»» vhost | string | false | none | Virtual host name used for routing. Supports standard domain names, subdomains, or wildcard domains. Must follow RFC-compliant hostname rules. Wildcards are only allowed in the left-most label (e.g., *.example.com). |
| »»»» provider | LLMProxyProvider | true | none | none |
| »»»»» id | string | true | none | Unique id of a deployed LLM provider |
| »»»»» auth | LLMUpstreamAuth | false | none | none |
| »»»»»» type | string | true | none | none |
| »»»»»» header | string | false | none | none |
| »»»»»» value | string | false | none | none |
| »»»» policies | [LLMPolicy] | false | none | List of policies applied to this proxy |
| »»»»» name | string | true | none | none |
| »»»»» version | string | true | none | none |
| »»»»» paths | [LLMPolicyPath] | true | none | none |
| »»»»»» path | string | true | none | none |
| »»»»»» methods | [string] | true | none | none |
| »»»»»» params | object | true | none | JSON Schema describing the parameters accepted by this policy. This itself is a JSON Schema document. |
| »»»» deploymentState | string | false | none | Desired deployment state - 'deployed' (default) or 'undeployed'. When set to 'undeployed', the LLM proxy is removed from router traffic but configuration and policies are preserved for potential redeployment. |

and

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| »» anonymous | object | false | none | none |
| »»» status | ResourceStatus | false | read-only | Server-managed lifecycle fields. Populated on responses. |
| »»»» id | string | false | none | Unique identifier assigned by the server (equal to metadata.name) |
| »»»» state | string | false | none | Desired deployment state reported by the server |
| »»»» createdAt | string(date-time) | false | none | Timestamp when the resource was first created (UTC) |
| »»»» updatedAt | string(date-time) | false | none | Timestamp when the resource was last updated (UTC) |
| »»»» deployedAt | string(date-time) | false | none | Timestamp when the resource was last deployed (omitted when undeployed) |

Enumerated Values

| Property | Value |
|----------|-------|
| apiVersion | gateway.api-platform.wso2.com/v1alpha1 |
| kind | LlmProxy |
| type | api-key |
| deploymentState | deployed |
| deploymentState | undeployed |
| state | deployed |
| state | undeployed |
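Once the listing is fetched, the nested `status` block can be used to pick out deployed proxies. A minimal sketch against a trimmed copy of the sample response above:

```python
# Trimmed copy of the listing response shape shown above.
sample = {
    "status": "success",
    "count": 1,
    "proxies": [
        {
            "metadata": {"name": "openai-proxy"},
            "spec": {"displayName": "OpenAI Proxy", "version": "v1.0"},
            "status": {"state": "deployed"},
        }
    ],
}

def deployed_proxy_names(body: dict) -> list:
    """Names of proxies whose server-reported state is 'deployed'."""
    return [
        p["metadata"]["name"]
        for p in body.get("proxies", [])
        if p.get("status", {}).get("state") == "deployed"
    ]
```

Note that the resource-level `status` object (lifecycle fields) is distinct from the top-level `"status": "success"` envelope field.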

Get LLM proxy by unique identifier

GET /llm-proxies/{id}

Code samples

```shell
curl -X GET http://localhost:9090/api/management/v0.9/llm-proxies/{id} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Get an LLM proxy by its ID.

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, developer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique identifier of the LLM proxy |

Example responses

200 Response

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProxy",
  "metadata": {
    "name": "openai-proxy"
  },
  "spec": {
    "displayName": "OpenAI Proxy",
    "version": "v1.0",
    "context": "/openai-proxy",
    "provider": {
      "id": "wso2-openai-provider"
    },
    "policies": []
  },
  "status": {
    "id": "openai-proxy",
    "state": "deployed",
    "createdAt": "2026-04-24T07:21:13Z",
    "updatedAt": "2026-04-24T07:21:13Z",
    "deployedAt": "2026-04-24T07:21:13Z"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | LLM proxy details | LLMProxyConfiguration |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Update an existing LLM proxy

PUT /llm-proxies/{id}

Code samples

```shell
curl -X PUT http://localhost:9090/api/management/v0.9/llm-proxies/{id} \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Update an existing LLM proxy in the Gateway.

Payload

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProxy",
  "metadata": {
    "name": "openai-proxy"
  },
  "spec": {
    "displayName": "OpenAI Proxy",
    "version": "v1.0",
    "context": "/openai-proxy",
    "provider": {
      "id": "wso2-openai-provider"
    },
    "policies": []
  }
}
```

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, developer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique identifier of the LLM proxy |
| body | body | LLMProxyConfigurationRequest | true | Updated LLM proxy |

Example responses

200 Response

```json
{
  "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
  "kind": "LlmProxy",
  "metadata": {
    "name": "openai-proxy"
  },
  "spec": {
    "displayName": "OpenAI Proxy",
    "version": "v1.0",
    "context": "/openai-proxy",
    "provider": {
      "id": "wso2-openai-provider"
    },
    "policies": []
  },
  "status": {
    "id": "openai-proxy",
    "state": "deployed",
    "createdAt": "2026-04-24T07:21:13Z",
    "updatedAt": "2026-04-24T07:21:13Z",
    "deployedAt": "2026-04-24T07:21:13Z"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | LLM proxy updated successfully | LLMProxyConfiguration |
| 400 | Bad Request | Invalid configuration | ErrorResponse |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
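One way to drive an update is read-modify-write: GET the current resource, change the desired `spec` fields, and PUT the result back. Whether the server ignores or rejects a client-supplied `status` block is not stated here, so this sketch strips it before sending (an assumption):

```python
import copy

def with_updated_spec(current: dict, **spec_changes) -> dict:
    """Return a PUT-ready body: a copy of the fetched resource with spec
    fields overridden and the server-managed 'status' block removed."""
    body = copy.deepcopy(current)
    body.pop("status", None)  # read-only lifecycle fields (assumption: omit on PUT)
    body.setdefault("spec", {}).update(spec_changes)
    return body

# Resource shape as returned by GET /llm-proxies/{id}, trimmed.
current = {
    "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
    "kind": "LlmProxy",
    "metadata": {"name": "openai-proxy"},
    "spec": {"displayName": "OpenAI Proxy", "version": "v1.0"},
    "status": {"id": "openai-proxy", "state": "deployed"},
}
updated = with_updated_spec(current, displayName="OpenAI Proxy (EU)")
```

The deep copy keeps the fetched resource untouched, so a failed PUT can be retried or diffed against the original.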

Delete an LLM proxy

DELETE /llm-proxies/{id}

Code samples

```shell
curl -X DELETE http://localhost:9090/api/management/v0.9/llm-proxies/{id} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Delete an LLM proxy from the Gateway.

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, developer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique identifier of the LLM proxy |

Example responses

200 Response

```json
{
  "status": "success",
  "message": "LLM proxy deleted successfully",
  "id": "openai-proxy"
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | LLM proxy deleted successfully | Inline |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Response Schema

Status Code 200

| Name | Type | Required | Restrictions | Description |
|------|------|----------|--------------|-------------|
| » status | string | false | none | none |
| » message | string | false | none | none |
| » id | string | false | none | none |

Create a new API key for an LLM proxy

POST /llm-proxies/{id}/api-keys

Code samples

```shell
curl -X POST http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Generate a new API key for an LLM proxy in the Gateway.

Payload

```json
{
  "name": "my-production-key"
}
```

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, consumer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique handle of the LLM proxy to generate the key for |
| body | body | APIKeyCreationRequest | true | none |

Example responses

201 Response

```json
{
  "status": "success",
  "message": "API key generated successfully",
  "remainingApiKeyQuota": 9,
  "apiKey": {
    "name": "my-production-key",
    "displayName": "My Production Key",
    "apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
    "apiId": "reading-list-api-v1.0",
    "status": "active",
    "createdAt": "2026-04-01T10:30:00Z",
    "createdBy": "admin",
    "expiresAt": null,
    "source": "local"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 201 | Created | API key created successfully | APIKeyCreationResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 409 | Conflict | Conflict (duplicate key or conflicting update) | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
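The generated secret sits at `apiKey.apiKey` inside the creation response. A sketch of pulling it out, using a trimmed copy of the sample response above (the key value is a placeholder, not a real secret):

```python
# Trimmed shape of the 201 response shown above.
response_body = {
    "status": "success",
    "message": "API key generated successfully",
    "remainingApiKeyQuota": 9,
    "apiKey": {
        "name": "my-production-key",
        "apiKey": "apip_1234567890abcdef",  # placeholder secret
        "status": "active",
        "expiresAt": None,
    },
}

def extract_key(body: dict) -> str:
    """Pull the secret out of a key creation/regeneration response."""
    if body.get("status") != "success":
        raise RuntimeError(f"key generation failed: {body}")
    return body["apiKey"]["apiKey"]
```

`remainingApiKeyQuota` in the same response tells you how many more keys this caller may create.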

Get the list of API keys for an LLM proxy

GET /llm-proxies/{id}/api-keys

Code samples

```shell
curl -X GET http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

List all API keys for an LLM proxy in the Gateway.

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, consumer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique handle of the LLM proxy to retrieve keys for |

Example responses

200 Response

```json
{
  "apiKeys": [
    {
      "name": "my-production-key",
      "displayName": "My Production Key",
      "apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
      "apiId": "reading-list-api-v1.0",
      "status": "active",
      "createdAt": "2026-04-01T10:30:00Z",
      "createdBy": "admin",
      "expiresAt": null,
      "source": "local"
    }
  ],
  "totalCount": 1,
  "status": "success"
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | List of API keys | APIKeyListResponse |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
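When choosing a key from this listing, both `status` and `expiresAt` matter. This sketch treats a null `expiresAt` as "never expires", which matches the sample above but is an assumption about the API's semantics:

```python
from datetime import datetime, timezone
from typing import Optional

def usable_keys(body: dict, now: Optional[datetime] = None) -> list:
    """Names of keys that are active and not past their expiry.
    A null/None expiresAt is read as 'never expires' (assumption)."""
    now = now or datetime.now(timezone.utc)
    names = []
    for k in body.get("apiKeys", []):
        if k.get("status") != "active":
            continue
        exp = k.get("expiresAt")
        if exp is not None:
            expires = datetime.fromisoformat(exp.replace("Z", "+00:00"))
            if expires <= now:
                continue
        names.append(k["name"])
    return names

# Hypothetical listing with one usable, one expired, and one revoked key.
sample = {
    "apiKeys": [
        {"name": "my-production-key", "status": "active", "expiresAt": None},
        {"name": "old-key", "status": "active", "expiresAt": "2020-01-01T00:00:00Z"},
        {"name": "revoked-key", "status": "revoked", "expiresAt": None},
    ]
}
```

The `Z` suffix is rewritten to `+00:00` so `datetime.fromisoformat` yields a timezone-aware value comparable with `now`.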

Regenerate API key for an LLM proxy

POST /llm-proxies/{id}/api-keys/{apiKeyName}/regenerate

Code samples

```shell
curl -X POST http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys/{apiKeyName}/regenerate \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Regenerate an existing API key for an LLM proxy in the Gateway.

Payload

```json
{}
```

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, consumer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique handle of the LLM proxy |
| apiKeyName | path | string | true | Name of the API key to regenerate |
| body | body | APIKeyRegenerationRequest | true | none |

Example responses

200 Response

```json
{
  "status": "success",
  "message": "API key generated successfully",
  "remainingApiKeyQuota": 9,
  "apiKey": {
    "name": "my-production-key",
    "displayName": "My Production Key",
    "apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
    "apiId": "reading-list-api-v1.0",
    "status": "active",
    "createdAt": "2026-04-01T10:30:00Z",
    "createdBy": "admin",
    "expiresAt": null,
    "source": "local"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | API key rotated successfully | APIKeyCreationResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy or API key not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
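Because `apiKeyName` is a path segment, names containing spaces or other characters reserved in URLs should be percent-encoded when building the endpoint URL. A sketch with a placeholder base URL:

```python
from urllib.parse import quote

BASE_URL = "http://localhost:9090/api/management/v0.9"  # placeholder

def regenerate_url(proxy_id: str, api_key_name: str) -> str:
    """Build the regenerate endpoint URL, percent-encoding both
    path segments so reserved characters cannot break the path."""
    return (
        f"{BASE_URL}/llm-proxies/{quote(proxy_id, safe='')}"
        f"/api-keys/{quote(api_key_name, safe='')}/regenerate"
    )
```

`safe=''` also encodes `/`, so a slash inside a name cannot be misread as a path separator.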

Update an API key for an LLM proxy

PUT /llm-proxies/{id}/api-keys/{apiKeyName}

Code samples

```shell
curl -X PUT http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys/{apiKeyName} \
  -u {username}:{password} \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d @payload.json
```

Update an API key with a custom value instead of auto-generating one.

Payload

```json
{
  "name": "my-production-key"
}
```

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, consumer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique handle of the LLM proxy |
| apiKeyName | path | string | true | Name of the API key to update |
| body | body | APIKeyUpdateRequest | true | none |

Example responses

200 Response

```json
{
  "status": "success",
  "message": "API key generated successfully",
  "remainingApiKeyQuota": 9,
  "apiKey": {
    "name": "my-production-key",
    "displayName": "My Production Key",
    "apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
    "apiId": "reading-list-api-v1.0",
    "status": "active",
    "createdAt": "2026-04-01T10:30:00Z",
    "createdBy": "admin",
    "expiresAt": null,
    "source": "local"
  }
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | API key updated successfully | APIKeyCreationResponse |
| 400 | Bad Request | Invalid request (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy or API key not found | ErrorResponse |
| 409 | Conflict | Conflict (duplicate key or conflicting update) | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |

Revoke an API key for an LLM proxy

DELETE /llm-proxies/{id}/api-keys/{apiKeyName}

Code samples

```shell
curl -X DELETE http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys/{apiKeyName} \
  -u {username}:{password} \
  -H 'Accept: application/json'
```

Revoke an API key. Once revoked, it can no longer be used to authenticate requests.

Authentication

This operation requires Basic Auth authentication.

Required roles: admin, consumer

Parameters

| Name | In | Type | Required | Description |
|------|----|------|----------|-------------|
| id | path | string | true | Unique handle of the LLM proxy |
| apiKeyName | path | string | true | Name of the API key to revoke |

Example responses

200 Response

```json
{
  "status": "success",
  "message": "API key revoked successfully"
}
```

Responses

| Status | Meaning | Description | Schema |
|--------|---------|-------------|--------|
| 200 | OK | API key revoked successfully | APIKeyRevocationResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy or API key not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |