Caution
This package is in pre-release and not subject to backwards compatibility guarantees. The API may change based on feedback.
Pin to a specific minor version and review the changelog before upgrading.
This package provides an OpenAI integration for the LaunchDarkly AI SDK.
```
pip install launchdarkly-server-sdk-ai-openai-dev
```

```python
import asyncio

from ldai import AIClient
from ldai_openai import OpenAIProvider


async def main():
    # Initialize the AI client
    ai_client = AIClient(ld_client)

    # Get AI config. Pass a default for improved resiliency when the flag is unavailable or
    # LaunchDarkly is unreachable; omit for a disabled default. Example:
    #
    #   from ldai.models import AICompletionConfigDefault, LDMessage, ModelConfig, ProviderConfig
    #   default = AICompletionConfigDefault(
    #       enabled=True,
    #       model=ModelConfig("gpt-4"),
    #       provider=ProviderConfig("openai"),
    #       messages=[LDMessage(role="system", content="You are a helpful assistant.")]
    #   )
    #   ai_config = ai_client.config("my-ai-config-key", context, default)
    ai_config = ai_client.config("my-ai-config-key", context)

    # Create an OpenAI provider from the config
    provider = await OpenAIProvider.create(ai_config)

    # Invoke the model
    response = await provider.invoke_model(ai_config.messages)
    print(response.message.content)


asyncio.run(main())
```

- Full integration with OpenAI's chat completions API
- Automatic token usage tracking
- Support for structured output (JSON schema)
- Static utility methods for custom integrations
```
OpenAIProvider(client: OpenAI, model_name: str, parameters: Dict[str, Any], logger: Optional[Any] = None)
```

- `create(ai_config: AIConfigKind, logger: Optional[Any] = None) -> OpenAIProvider` - Factory method to create a provider from an AI config
- `get_ai_metrics_from_response(response: Any) -> LDAIMetrics` - Extract metrics from an OpenAI response
- `invoke_model(messages: List[LDMessage]) -> ChatResponse` - Invoke the model with messages
- `invoke_structured_model(messages: List[LDMessage], response_structure: Dict[str, Any]) -> StructuredResponse` - Invoke the model with structured output
- `get_client() -> OpenAI` - Get the underlying OpenAI client
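A sketch of how `invoke_structured_model` might be called. The `response_structure` below is a plain JSON-schema dict; the schema field names are illustrative assumptions, not part of this package:

```python
# Illustrative JSON schema passed as response_structure (field names assumed).
sentiment_schema = {
    "type": "object",
    "properties": {
        "sentiment": {
            "type": "string",
            "enum": ["positive", "neutral", "negative"],
        },
        "confidence": {"type": "number"},
    },
    "required": ["sentiment", "confidence"],
}

# Hedged usage, following this package's API reference:
#
#   provider = await OpenAIProvider.create(ai_config)
#   structured = await provider.invoke_structured_model(
#       ai_config.messages, sentiment_schema
#   )
```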
Apache-2.0