PSAISuite is a high-performance, unified interface designed for systems architects and developers to integrate multiple LLMs through a standardized engineering layer.
By providing a consistent abstraction, similar to OpenAI's SDK, it allows teams to swap, test, and benchmark responses across 15+ providers without modifying core application logic. This "Interface-First" approach applies decades of proven software-architecture principles to the rapidly evolving AI landscape.
Wondering which model is fastest? Most instruction-compliant? Best at reasoning?
PSAISuiteBenchmarks is a built-in benchmark suite that runs standardized tests across all your providers in parallel and prints a leaderboard.
```powershell
Import-Module .\PSAISuiteBenchmarks\PSAISuiteBenchmarks.psm1 -Force
Invoke-Benchmark -Models 'anthropic:claude-sonnet-4-6', 'xAI:grok-4-1-fast-non-reasoning', 'openai:gpt-4o' -Category 'InstructionFollowing'
```

Real results. Real latency. Across all supported providers.
Currently supported providers are:
- Anthropic
- Azure AI Foundry
- DeepSeek
- GitHub
- Groq
- Inception
- Mistral
- Nebius
- Ollama
- OpenAI
- OpenRouter
- Perplexity
- xAI
You can install the module from the PowerShell Gallery.
```powershell
Install-Module PSAISuite
```

## 🏗️ Architectural Core

- Provider Agnostic: Designed to prevent vendor lock-in through a decoupled interface.
- Parallel Execution: Built-in benchmarking for real-time latency and instruction-compliance testing.
- Context-Aware Piping: Engineered to handle massive data streams as context directly from the PowerShell pipeline.
To get started, you will need API keys for the providers you intend to use.
The API keys must be set as environment variables.
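Note that `$env:` assignments last only for the current session. To keep a key across sessions you can persist it at the user scope, a general PowerShell pattern that is not specific to PSAISuite (the key name below matches the examples that follow):

```powershell
# Persist a key for future sessions (current-user scope); restart the shell to pick it up
[Environment]::SetEnvironmentVariable('OpenAIKey', 'your-openai-api-key', 'User')

# Or set it for the current session only
$env:OpenAIKey = 'your-openai-api-key'
```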
Set the API keys:

```powershell
$env:OpenAIKey = "your-openai-api-key"
$env:AnthropicKey = "your-anthropic-api-key"
$env:NebiusKey = "your-nebius-api-key"
$env:GITHUB_TOKEN = "your-github-token" # Add GitHub token
# ... and so on for other providers
$env:INCEPTION_API_KEY = "your-inception-api-key"
```

For Azure AI Foundry, you will need to set both the AzureAIKey and AzureAIEndpoint environment variables:

```powershell
$env:AzureAIKey = "your-azure-ai-key"
$env:AzureAIEndpoint = "your-azure-ai-endpoint"
```

You can pipe data directly into `Invoke-ChatCompletion` (or its alias `icc`) to use it as context for your prompt. This is useful for summarizing files, analyzing command output, or providing additional information to the model.
For example:

```powershell
Get-Content .\README.md | icc -Messages "Summarize this document." -Model "openai:gpt-4o-mini"
```

You can also use the output of any command:

```powershell
Get-Process | Out-String | icc -Messages "What processes are running?" -Model "openai:gpt-4o-mini"
```

Tips:

- The `-Model` parameter supports tab completion for available providers and models. Start typing a provider (like `openai:` or `github:`) and press `Tab` to see suggestions.
- You can use the `icc` or `generateText` alias instead of `Invoke-ChatCompletion` in all examples above.
See PIPE-EXAMPLES.md for more details and examples.
## 🛠️ Tool Calling (Function Orchestration)

PSAISuite implements native tool-calling patterns, allowing LLMs to interact with your local environment securely. This bridges the gap between static chat and autonomous, agentic actions.
- Native PowerShell Integration: Register any cmdlet or function as a tool instantly.
- Standardized Schemas: Pass custom JSON/Hashtable definitions for complex API interactions.
- Cross-Provider Support: Consistent implementation across OpenAI, xAI, Anthropic, and Google.
You can pass tools to `Invoke-ChatCompletion` using the `-Tools` parameter. Tools can be specified as:

- Cmdlet objects: Pass the cmdlet directly (e.g., `Get-ChildItem`), and it will be registered as a tool
- Pre-defined tool schemas (hashtables): Custom tool definitions
Example using a built-in command:
```powershell
Invoke-ChatCompletion -Messages "List the files in the current directory" -Tools Get-ChildItem -Model "openai:gpt-4.1"
```

This allows the AI model to call the `Get-ChildItem` cmdlet to list directory contents. The model may respond with something like:

```
The files in the current directory are:
- README.md
- LICENSE
- PSAISuite.psd1
- ...
```
Example with custom tool definition:
```powershell
$customTool = @{
    Name        = "Get-Weather"
    Description = "Get current weather for a location"
    Parameters  = @{
        type       = "object"
        properties = @{
            location = @{
                type        = "string"
                description = "City name"
            }
        }
        required = @("location")
    }
}

Invoke-ChatCompletion -Messages "What's the weather in New York?" -Tools $customTool -Model "openai:gpt-4o"
```

Currently, tool calling is supported for the OpenAI, xAI, Anthropic, and Google providers. Support for other providers will be added in future updates.
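Since any PowerShell function can be registered as a tool, a user-defined function can be exposed the same way as a built-in cmdlet. A sketch, with the caveat that the function itself and the use of `Get-Command` to resolve it are illustrative assumptions, not documented PSAISuite behavior:

```powershell
# A hypothetical local function exposed as a tool
function Get-DiskFreeSpace {
    param([string]$DriveLetter = 'C')
    # Free bytes on the requested drive
    (Get-PSDrive -Name $DriveLetter).Free
}

Invoke-ChatCompletion -Messages "How much free space is on drive C?" `
    -Tools (Get-Command Get-DiskFreeSpace) -Model "openai:gpt-4o"
```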
The following examples show how to use PSAISuite to generate chat completion responses from different providers.
You can list all available AI providers using the `Get-ChatProviders` function:
```powershell
# Get a list of all available providers
Get-ChatProviders
```

You can list OpenRouter models by name using the `Get-OpenRouterModel` function. Use the `-Raw` switch to return all properties for matching models:

```powershell
# List all OpenRouter models with 'gpt' in their name
Get-OpenRouterModel -Name '*gpt*'

# List all OpenRouter models and return all properties
Get-OpenRouterModel -Raw

# List models by name and return all properties
Get-OpenRouterModel -Name '*gpt*' -Raw
```

The `-Raw` switch returns the full model object from the OpenRouter API, including all available properties.
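Because `-Raw` returns the full model objects, standard pipeline commands can filter and sort them. A sketch (the `id` and `context_length` property names come from the OpenRouter models API and are an assumption about the objects PSAISuite returns):

```powershell
# Show the five largest-context models (property names assumed from the OpenRouter API)
Get-OpenRouterModel -Raw |
    Where-Object { $_.context_length -ge 128000 } |
    Sort-Object context_length -Descending |
    Select-Object id, context_length -First 5
```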
```powershell
# Import the module
Import-Module PSAISuite

$models = @(
    "openai:gpt-4o",
    "anthropic:claude-3-5-sonnet-20240620",
    "azureai:gpt-4o",
    "nebius:meta-llama/Llama-3.3-70B-Instruct"
)
$message = New-ChatMessage -Prompt "What is the capital of France?"

foreach ($model in $models) {
    Invoke-ChatCompletion -Messages $message -Model $model
}
```

```powershell
# Import the module
Import-Module PSAISuite

$message = New-ChatMessage -Prompt "What is the capital of France?"
Invoke-ChatCompletion -Messages $message -Raw

# You can also use the alias:
generateText -Messages $message -Raw
```

```powershell
# Import the module
Import-Module PSAISuite

$model = "openai:gpt-4o"
$message = New-ChatMessage -Prompt "What is the capital of France?"
Invoke-ChatCompletion -Model $model -Messages $message

# or by setting the environment variable
$env:PSAISUITE_DEFAULT_MODEL = "openai:gpt-4o"
$message = New-ChatMessage -Prompt "What is the capital of France?"
Invoke-ChatCompletion -Messages $message
```

Note that the model name in the `Invoke-ChatCompletion` call uses the format `<provider>:<model-name>`.
Documentation coming soon.
