Add new openapi spec for Watsonx.ai extension #322

Open
@dzzzzllz

Description

Hi, I have created a new all-in-one OpenAPI spec for the Watsonx.ai extension that can connect wxAssistant/wxOrchestrate to external AI services and to models/templates deployed in the watsonx.ai runtime.

The new OpenAPI spec includes 6 paths:

  • generate text with wx.ai
  • generate text (stream)
  • generate text with deployed model/template
  • generate text with deployed model/template (stream)
  • generate text with deployed AI service
  • generate text with deployed AI service (stream)

This could be useful for anyone who wants to use external models on watsonx.ai, skip prompt tuning inside wxA, or run a RAG pipeline inside watsonx.ai.

I want to add this to https://github.com/watson-developer-cloud/assistant-toolkit/tree/master/integrations/extensions/starter-kits/language-model-watsonx as an advanced use case.
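
For reference, here is a minimal sketch of how the first path (generate text with wx.ai) could look in the spec. The endpoint path and field names follow the public watsonx.ai text generation REST API, but the schemas are abbreviated and the model id is only an example; the actual spec covers all 6 paths in full.

```yaml
openapi: 3.0.3
info:
  title: Watsonx.ai text generation (sketch)
  version: "1.0.0"
servers:
  - url: https://{region}.ml.cloud.ibm.com
    variables:
      region:
        default: us-south
security:
  - bearerAuth: []
paths:
  /ml/v1/text/generation:
    post:
      summary: Generate text with wx.ai
      parameters:
        - name: version
          in: query
          required: true
          schema:
            type: string
            example: "2023-05-29"
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [model_id, input]
              properties:
                model_id:
                  type: string
                  example: ibm/granite-13b-instruct-v2  # example only
                input:
                  type: string
                project_id:
                  type: string  # in practice a project_id or space_id is needed
                parameters:
                  type: object
                  properties:
                    max_new_tokens:
                      type: integer
      responses:
        "200":
          description: Generated text
          content:
            application/json:
              schema:
                type: object
                properties:
                  results:
                    type: array
                    items:
                      type: object
                      properties:
                        generated_text:
                          type: string
components:
  securitySchemes:
    bearerAuth:
      type: http
      scheme: bearer
```

The streaming and deployment variants in the spec would follow the same request/response pattern against their respective watsonx.ai endpoints.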
