CRUD operations for LLM Proxy configurations
POST /llm-proxies
Code samples
curl -X POST http://localhost:9090/api/management/v0.9/llm-proxies \
-u {username}:{password} \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-d @payload.json
Add a new LLM proxy to the Gateway. A proxy defines how to interact with an LLM service deployed in the Gateway, including authentication and policies.
Payload
{
"apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
"kind": "LlmProxy",
"metadata": {
"name": "openai-proxy"
},
"spec": {
"displayName": "OpenAI Proxy",
"version": "v1.0",
"context": "/openai-proxy",
"provider": {
"id": "wso2-openai-provider"
},
"policies": []
}
}

Required roles: admin, developer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| body | body | LLMProxyConfigurationRequest | true | LLM proxy in YAML or JSON format |
Example responses
201 Response
{
"apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
"kind": "LlmProxy",
"metadata": {
"name": "openai-proxy"
},
"spec": {
"displayName": "OpenAI Proxy",
"version": "v1.0",
"context": "/openai-proxy",
"provider": {
"id": "wso2-openai-provider"
},
"policies": []
},
"status": {
"id": "openai-proxy",
"state": "deployed",
"createdAt": "2026-04-24T07:21:13Z",
"updatedAt": "2026-04-24T07:21:13Z",
"deployedAt": "2026-04-24T07:21:13Z"
}
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 201 | Created | LLM proxy created and deployed successfully | LLMProxyConfiguration |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 409 | Conflict | Conflict - Proxy with same name and version already exists | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
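The payload above can be assembled programmatically. The following is a minimal Python sketch (stdlib only; the helper name is hypothetical) that builds the request body and applies the documented client-side checks — displayName limited to letters, numbers, spaces, hyphens, underscores, and dots; context starting with `/` and having no trailing slash — so obviously invalid input fails fast instead of round-tripping as a 400.

```python
import json
import re

def build_llm_proxy_payload(name, display_name, version, context, provider_id):
    """Build an LlmProxy payload matching the structure shown above.

    Raises ValueError for inputs that would fail the documented
    server-side validation (400 Bad Request).
    """
    # displayName: only letters, numbers, spaces, hyphens, underscores, dots
    if not re.fullmatch(r"[A-Za-z0-9 ._-]+", display_name):
        raise ValueError("displayName contains disallowed characters")
    # context: must start with '/', no trailing slash
    if not context.startswith("/") or (len(context) > 1 and context.endswith("/")):
        raise ValueError("context must start with '/' and have no trailing slash")
    return {
        "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
        "kind": "LlmProxy",
        "metadata": {"name": name},
        "spec": {
            "displayName": display_name,
            "version": version,
            "context": context,
            "provider": {"id": provider_id},
            "policies": [],
        },
    }

payload = build_llm_proxy_payload(
    "openai-proxy", "OpenAI Proxy", "v1.0", "/openai-proxy", "wso2-openai-provider"
)
# Serialize to payload.json for the curl call shown above
print(json.dumps(payload, indent=2))
```

The emitted JSON matches the Payload example and can be written to `payload.json` for the `-d @payload.json` option in the curl sample.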
GET /llm-proxies
Code samples
curl -X GET http://localhost:9090/api/management/v0.9/llm-proxies \
-u {username}:{password} \
-H 'Accept: application/json'
List LLM proxies registered in the Gateway, optionally filtered by display name, version, context, deployment status, or vhost.
This operation requires Basic Auth authentication. Required roles: admin, developer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| displayName | query | string | false | Filter by LLM proxy displayName |
| version | query | string | false | Filter by LLM proxy version |
| context | query | string | false | Filter by LLM proxy context/path |
| status | query | string | false | Filter by deployment status |
| vhost | query | string | false | Filter by LLM proxy vhost |
Enumerated Values

| Parameter | Value |
|---|---|
| status | deployed |
| status | undeployed |
Example responses
200 Response
{
"status": "success",
"count": 2,
"proxies": [
{
"apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
"kind": "LlmProxy",
"metadata": {
"name": "openai-proxy"
},
"spec": {
"displayName": "OpenAI Proxy",
"version": "v1.0",
"context": "/openai-proxy",
"provider": {
"id": "wso2-openai-provider"
},
"policies": []
},
"status": {
"id": "openai-proxy",
"state": "deployed",
"createdAt": "2026-04-24T07:21:13Z",
"updatedAt": "2026-04-24T07:21:13Z",
"deployedAt": "2026-04-24T07:21:13Z"
}
}
]
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | List of LLM proxies | Inline |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
Status Code 200
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » status | string | false | none | none |
| » count | integer | false | none | none |
| » proxies | [allOf] | false | none | none |
allOf
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| »» anonymous | LLMProxyConfigurationRequest | false | none | none |
| »»» apiVersion | string | true | none | Proxy specification version |
| »»» kind | string | true | none | Proxy kind |
| »»» metadata | Metadata | true | none | none |
| »»»» name | string | true | none | Unique handle for the resource |
| »»»» labels | object | false | none | Labels are key-value pairs for organizing and selecting APIs. Keys must not contain spaces. |
| »»»»» additionalProperties | string | false | none | none |
| »»»» annotations | object | false | none | Annotations are arbitrary non-identifying metadata. Use domain-prefixed keys. |
| »»»»» additionalProperties | string | false | none | none |
| »»» spec | LLMProxyConfigData | true | none | none |
| »»»» displayName | string | true | none | Human-readable LLM proxy name (must be URL-friendly - only letters, numbers, spaces, hyphens, underscores, and dots allowed) |
| »»»» version | string | true | none | Semantic version of the LLM proxy |
| »»»» context | string | false | none | Base path for all API routes (must start with /, no trailing slash) |
| »»»» vhost | string | false | none | Virtual host name used for routing. Supports standard domain names, subdomains, or wildcard domains. Must follow RFC-compliant hostname rules. Wildcards are only allowed in the left-most label (e.g., *.example.com). |
| »»»» provider | LLMProxyProvider | true | none | none |
| »»»»» id | string | true | none | Unique id of a deployed llm provider |
| »»»»» auth | LLMUpstreamAuth | false | none | none |
| »»»»»» type | string | true | none | none |
| »»»»»» header | string | false | none | none |
| »»»»»» value | string | false | none | none |
| »»»» policies | [LLMPolicy] | false | none | List of policies applied only to this operation (overrides or adds to API-level policies) |
| »»»»» name | string | true | none | none |
| »»»»» version | string | true | none | none |
| »»»»» paths | [LLMPolicyPath] | true | none | none |
| »»»»»» path | string | true | none | none |
| »»»»»» methods | [string] | true | none | none |
| »»»»»» params | object | true | none | JSON Schema describing the parameters accepted by this policy. This itself is a JSON Schema document. |
| »»»» deploymentState | string | false | none | Desired deployment state - 'deployed' (default) or 'undeployed'. When set to 'undeployed', the LLM Proxy is removed from router traffic but configuration and policies are preserved for potential redeployment. |
and
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| »» anonymous | object | false | none | none |
| »»» status | ResourceStatus | false | read-only | Server-managed lifecycle fields. Populated on responses. |
| »»»» id | string | false | none | Unique identifier assigned by the server (equal to metadata.name) |
| »»»» state | string | false | none | Desired deployment state reported by the server |
| »»»» createdAt | string(date-time) | false | none | Timestamp when the resource was first created (UTC) |
| »»»» updatedAt | string(date-time) | false | none | Timestamp when the resource was last updated (UTC) |
| »»»» deployedAt | string(date-time) | false | none | Timestamp when the resource was last deployed (omitted when undeployed) |
Enumerated Values

| Property | Value |
|---|---|
| apiVersion | gateway.api-platform.wso2.com/v1alpha1 |
| kind | LlmProxy |
| type | api-key |
| deploymentState | deployed |
| deploymentState | undeployed |
| state | deployed |
| state | undeployed |
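The query-parameter filters above are all exact matches against fields of the proxy document. As a sketch of those semantics (the helper name and the in-memory representation are assumptions, not part of the API), the same filtering can be expressed client-side over a fetched proxy list:

```python
def filter_proxies(proxies, display_name=None, version=None,
                   context=None, status=None, vhost=None):
    """Mirror the list-endpoint query parameters client-side.

    Each filter is an exact match against the corresponding field;
    omitted filters match everything.
    """
    def matches(p):
        spec, st = p.get("spec", {}), p.get("status", {})
        return ((display_name is None or spec.get("displayName") == display_name)
                and (version is None or spec.get("version") == version)
                and (context is None or spec.get("context") == context)
                and (status is None or st.get("state") == status)
                and (vhost is None or spec.get("vhost") == vhost))
    return [p for p in proxies if matches(p)]

# Sample data shaped like the "proxies" array in the 200 response above
proxies = [
    {"spec": {"displayName": "OpenAI Proxy", "version": "v1.0",
              "context": "/openai-proxy"},
     "status": {"state": "deployed"}},
    {"spec": {"displayName": "Mistral Proxy", "version": "v1.0",
              "context": "/mistral-proxy"},
     "status": {"state": "undeployed"}},
]
print(len(filter_proxies(proxies, status="deployed")))  # 1
```

Note that `status` filters on the server-managed `status.state` field, whose only documented values are `deployed` and `undeployed`.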
GET /llm-proxies/{id}
Code samples
curl -X GET http://localhost:9090/api/management/v0.9/llm-proxies/{id} \
-u {username}:{password} \
-H 'Accept: application/json'
Get an LLM proxy by its ID.
This operation requires Basic Auth authentication. Required roles: admin, developer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique identifier of the LLM proxy |
Example responses
200 Response
{
"apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
"kind": "LlmProxy",
"metadata": {
"name": "openai-proxy"
},
"spec": {
"displayName": "OpenAI Proxy",
"version": "v1.0",
"context": "/openai-proxy",
"provider": {
"id": "wso2-openai-provider"
},
"policies": []
},
"status": {
"id": "openai-proxy",
"state": "deployed",
"createdAt": "2026-04-24T07:21:13Z",
"updatedAt": "2026-04-24T07:21:13Z",
"deployedAt": "2026-04-24T07:21:13Z"
}
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | LLM proxy details | LLMProxyConfiguration |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
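Because the id is equal to `metadata.name` and travels as a path segment, it is worth percent-encoding when building the URL. A minimal sketch (the base URL matches the curl samples above; the helper name is an assumption):

```python
from urllib.parse import quote

BASE = "http://localhost:9090/api/management/v0.9"

def llm_proxy_url(proxy_id):
    # Percent-encode the id so any reserved characters stay within
    # a single path segment instead of being parsed as path separators.
    return f"{BASE}/llm-proxies/{quote(proxy_id, safe='')}"

print(llm_proxy_url("openai-proxy"))
# http://localhost:9090/api/management/v0.9/llm-proxies/openai-proxy
```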
PUT /llm-proxies/{id}
Code samples
curl -X PUT http://localhost:9090/api/management/v0.9/llm-proxies/{id} \
-u {username}:{password} \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-d @payload.json
Update an existing LLM proxy in the Gateway.
Payload
{
"apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
"kind": "LlmProxy",
"metadata": {
"name": "openai-proxy"
},
"spec": {
"displayName": "OpenAI Proxy",
"version": "v1.0",
"context": "/openai-proxy",
"provider": {
"id": "wso2-openai-provider"
},
"policies": []
}
}

Required roles: admin, developer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique identifier of the LLM proxy |
| body | body | LLMProxyConfigurationRequest | true | Updated LLM proxy |
Example responses
200 Response
{
"apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
"kind": "LlmProxy",
"metadata": {
"name": "openai-proxy"
},
"spec": {
"displayName": "OpenAI Proxy",
"version": "v1.0",
"context": "/openai-proxy",
"provider": {
"id": "wso2-openai-provider"
},
"policies": []
},
"status": {
"id": "openai-proxy",
"state": "deployed",
"createdAt": "2026-04-24T07:21:13Z",
"updatedAt": "2026-04-24T07:21:13Z",
"deployedAt": "2026-04-24T07:21:13Z"
}
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | LLM proxy updated successfully | LLMProxyConfiguration |
| 400 | Bad Request | Invalid configuration | ErrorResponse |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
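A common update pattern is read-modify-write: GET the current configuration, apply spec changes, and PUT the result. The sketch below assumes the server-managed `status` block should be stripped from the request body before sending (the response schema marks it read-only; whether the server would simply ignore it is not stated here). Helper name and keyword-merge approach are illustrative only.

```python
def to_update_request(config, **spec_changes):
    """Turn a GET response into a PUT body: drop the server-managed
    status block and overlay the given spec field changes."""
    body = {k: v for k, v in config.items() if k != "status"}
    body["spec"] = {**body.get("spec", {}), **spec_changes}
    return body

# Shaped like the 200 response of GET /llm-proxies/{id}
current = {
    "apiVersion": "gateway.api-platform.wso2.com/v1alpha1",
    "kind": "LlmProxy",
    "metadata": {"name": "openai-proxy"},
    "spec": {"displayName": "OpenAI Proxy", "version": "v1.0",
             "context": "/openai-proxy",
             "provider": {"id": "wso2-openai-provider"},
             "policies": []},
    "status": {"id": "openai-proxy", "state": "deployed"},
}
body = to_update_request(current, displayName="OpenAI Proxy v2")
```

The resulting `body` is what would be written to `payload.json` for the curl sample above; untouched spec fields (context, provider, policies) carry over unchanged.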
DELETE /llm-proxies/{id}
Code samples
curl -X DELETE http://localhost:9090/api/management/v0.9/llm-proxies/{id} \
-u {username}:{password} \
-H 'Accept: application/json'
Delete an LLM proxy from the Gateway.
This operation requires Basic Auth authentication. Required roles: admin, developer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique identifier of the LLM proxy |
Example responses
200 Response
{
"status": "success",
"message": "LLM proxy deleted successfully",
"id": "openai-proxy"
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | LLM proxy deleted successfully | Inline |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
Status Code 200
| Name | Type | Required | Restrictions | Description |
|---|---|---|---|---|
| » status | string | false | none | none |
| » message | string | false | none | none |
| » id | string | false | none | none |
POST /llm-proxies/{id}/api-keys
Code samples
curl -X POST http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys \
-u {username}:{password} \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-d @payload.json
Generate a new API key for an LLM proxy in the Gateway.
Payload
{
"name": "my-production-key"
}

Required roles: admin, consumer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique handle of the LLM proxy to generate the key for |
| body | body | APIKeyCreationRequest | true | none |
Example responses
201 Response
{
"status": "success",
"message": "API key generated successfully",
"remainingApiKeyQuota": 9,
"apiKey": {
"name": "my-production-key",
"displayName": "My Production Key",
"apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
"apiId": "reading-list-api-v1.0",
"status": "active",
"createdAt": "2026-04-01T10:30:00Z",
"createdBy": "admin",
"expiresAt": null,
"source": "local"
}
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 201 | Created | API key created successfully | APIKeyCreationResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 409 | Conflict | Conflict (duplicate key or conflicting update) | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
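The `apiKey` value in the response is a bearer credential, so it should never land in logs or tickets verbatim. A minimal redaction sketch (the helper name and the prefix-plus-last-four convention are assumptions, not part of the API):

```python
def redact_api_key(key, visible=4):
    """Keep the key's prefix and last few characters for correlation,
    hiding the rest (e.g. 'apip_...cdef')."""
    prefix, _, secret = key.partition("_")
    if not secret or len(secret) <= visible:
        return "***"
    return f"{prefix}_...{secret[-visible:]}"

print(redact_api_key("apip_1234567890abcdef"))  # apip_...cdef
```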
GET /llm-proxies/{id}/api-keys
Code samples
curl -X GET http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys \
-u {username}:{password} \
-H 'Accept: application/json'
List all API keys for an LLM proxy in the Gateway.
This operation requires Basic Auth authentication. Required roles: admin, consumer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique handle of the LLM proxy to retrieve keys for |
Example responses
200 Response
{
"apiKeys": [
{
"name": "my-production-key",
"displayName": "My Production Key",
"apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
"apiId": "reading-list-api-v1.0",
"status": "active",
"createdAt": "2026-04-01T10:30:00Z",
"createdBy": "admin",
"expiresAt": null,
"source": "local"
}
],
"totalCount": 3,
"status": "success"
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | List of API keys | APIKeyListResponse |
| 404 | Not Found | LLM proxy not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
POST /llm-proxies/{id}/api-keys/{apiKeyName}/regenerate
Code samples
curl -X POST http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys/{apiKeyName}/regenerate \
-u {username}:{password} \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-d @payload.json
Regenerate an existing API key for an LLM proxy in the Gateway.
Payload
{}

Required roles: admin, consumer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique handle of the LLM proxy |
| apiKeyName | path | string | true | Name of the API key to regenerate |
| body | body | APIKeyRegenerationRequest | true | none |
Example responses
200 Response
{
"status": "success",
"message": "API key generated successfully",
"remainingApiKeyQuota": 9,
"apiKey": {
"name": "my-production-key",
"displayName": "My Production Key",
"apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
"apiId": "reading-list-api-v1.0",
"status": "active",
"createdAt": "2026-04-01T10:30:00Z",
"createdBy": "admin",
"expiresAt": null,
"source": "local"
}
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | API key rotated successfully | APIKeyCreationResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy or API key not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
PUT /llm-proxies/{id}/api-keys/{apiKeyName}
Code samples
curl -X PUT http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys/{apiKeyName} \
-u {username}:{password} \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-d @payload.json
Update an API key with a custom value instead of auto-generating one.
Payload
{
"name": "my-production-key"
}

Required roles: admin, consumer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique handle of the LLM proxy |
| apiKeyName | path | string | true | Name of the API key to update |
| body | body | APIKeyUpdateRequest | true | none |
Example responses
200 Response
{
"status": "success",
"message": "API key generated successfully",
"remainingApiKeyQuota": 9,
"apiKey": {
"name": "my-production-key",
"displayName": "My Production Key",
"apiKey": "apip_1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef",
"apiId": "reading-list-api-v1.0",
"status": "active",
"createdAt": "2026-04-01T10:30:00Z",
"createdBy": "admin",
"expiresAt": null,
"source": "local"
}
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | API key updated successfully | APIKeyCreationResponse |
| 400 | Bad Request | Invalid request (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy or API key not found | ErrorResponse |
| 409 | Conflict | Conflict (duplicate key or conflicting update) | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |
DELETE /llm-proxies/{id}/api-keys/{apiKeyName}
Code samples
curl -X DELETE http://localhost:9090/api/management/v0.9/llm-proxies/{id}/api-keys/{apiKeyName} \
-u {username}:{password} \
-H 'Accept: application/json'
Revoke an API key. Once revoked, it can no longer be used to authenticate requests.
This operation requires Basic Auth authentication. Required roles: admin, consumer
| Name | In | Type | Required | Description |
|---|---|---|---|---|
| id | path | string | true | Unique handle of the LLM proxy |
| apiKeyName | path | string | true | Name of the API key to revoke |
Example responses
200 Response
{
"status": "success",
"message": "API key revoked successfully"
}

| Status | Meaning | Description | Schema |
|---|---|---|---|
| 200 | OK | API key revoked successfully | APIKeyRevocationResponse |
| 400 | Bad Request | Invalid configuration (validation failed) | ErrorResponse |
| 404 | Not Found | LLM proxy or API key not found | ErrorResponse |
| 500 | Internal Server Error | Internal server error | ErrorResponse |