
[FEATURE] OpenSearch Plugin Connectors and other commands #178

Open
@uriofferup

Description

Is your feature request related to a problem?

I'm trying to automate deployments of OpenSearch together with its associated applications. In the past I tried using awscurl to create indices, but that approach has its limitations. Using the OpenSearch Terraform Provider helped, but I still found some other required initial configurations that are hard to automate.

What solution would you like?

I'm looking to create something like an opensearch_execute_command resource that sends an arbitrary request to the cluster. It would simplify many initial configuration steps before specialized resources that correctly manage the lifecycle are created.
It could be declared as something like:

resource "opensearch_execute_command" "connector" {
  method   = "POST"
  endpoint = "/_plugins/_ml/connectors/_create"
  # "$${...}" escapes Terraform's own interpolation so OpenSearch
  # receives the literal ${...} placeholders in the connector body.
  body     = <<-EOF
    {
      "name": "OpenAI Chat Connector",
      "description": "The connector to public OpenAI model service for GPT 3.5",
      "version": 1,
      "protocol": "http",
      "parameters": {
        "endpoint": "api.openai.com",
        "model": "gpt-3.5-turbo"
      },
      "credential": {
        "openAI_key": "..."
      },
      "actions": [
        {
          "action_type": "predict",
          "method": "POST",
          "url": "https://$${parameters.endpoint}/v1/chat/completions",
          "headers": {
            "Authorization": "Bearer $${credential.openAI_key}"
          },
          "request_body": "{ \"model\": \"$${parameters.model}\", \"messages\": $${parameters.messages} }"
        }
      ]
    }
  EOF
}

The resource could then expose the response body as a result attribute.
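As a sketch of how that result might be consumed downstream (the result attribute name is an assumption of this proposal, not an existing provider API; the connector_id field does appear in the _create response per the OpenSearch ML docs):

```hcl
# Hypothetical: parse the connector ID out of the response body so that
# later resources and outputs can reference it. "result" is the proposed
# attribute holding the raw response JSON.
output "connector_id" {
  value = jsondecode(opensearch_execute_command.connector.result)["connector_id"]
}
```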

What alternatives have you considered?

Using a null_resource with awscurl, but that has its challenges with authentication and with parsing the results.
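For reference, that workaround looks roughly like the following (the var.opensearch_endpoint variable, the region, and the connector.json payload file are illustrative, not part of any existing setup):

```hcl
# Illustrative workaround: shell out to awscurl from a null_resource.
# SigV4 credentials must be available in the environment, and the
# response body is only printed, not captured as an attribute -- the
# authentication and result-parsing pain points described above.
resource "null_resource" "create_connector" {
  provisioner "local-exec" {
    command = <<-EOT
      awscurl --service es --region us-east-1 \
        -X POST "https://${var.opensearch_endpoint}/_plugins/_ml/connectors/_create" \
        -H "Content-Type: application/json" \
        -d @connector.json
    EOT
  }
}
```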

Do you have any additional context?

For example, following the documentation for connecting to external ML models, it's required to send a request like:

POST /_plugins/_ml/connectors/_create
{
    "name": "OpenAI Chat Connector",
    "description": "The connector to public OpenAI model service for GPT 3.5",
    "version": 1,
    "protocol": "http",
    "parameters": {
        "endpoint": "api.openai.com",
        "model": "gpt-3.5-turbo"
    },
    "credential": {
        "openAI_key": "..."
    },
    "actions": [
        {
            "action_type": "predict",
            "method": "POST",
            "url": "https://${parameters.endpoint}/v1/chat/completions",
            "headers": {
                "Authorization": "Bearer ${credential.openAI_key}"
            },
            "request_body": "{ \"model\": \"${parameters.model}\", \"messages\": ${parameters.messages} }"
        }
    ]
}

Labels: enhancement (New feature or request)
Status: 📦 Backlog