# Raix 2.0.0 - RubyLLM backend and `before_completion` hook
Breaking Changes:
- Migrated from OpenRouter/OpenAI gems to RubyLLM for unified multi-provider support
- API keys now configured through RubyLLM instead of separate client instances
New Features:
- Added before_completion hook for intercepting/modifying requests
- Supports global, class, and instance-level configuration
- Enables dynamic params, logging, PII redaction, content filtering
- Added CompletionContext for hook access to messages and params
- Added FunctionToolAdapter and TranscriptAdapter for RubyLLM integration
Documentation:
- Updated README with RubyLLM configuration and before_completion examples
- Added migration guide for upgrading from 1.x
**CHANGELOG.md** (41 additions, 0 deletions)
## [2.0.0] - 2025-12-17

### Breaking Changes

- **Migrated from OpenRouter/OpenAI gems to RubyLLM** - Raix now uses [RubyLLM](https://github.com/crmne/ruby_llm) as its unified backend for all LLM providers. This provides better multi-provider support and a more consistent API.
- **Configuration changes** - API keys are now configured through RubyLLM's configuration system instead of separate client instances.
- **Removed direct client dependencies** - `openrouter` and `ruby-openai` gems are no longer direct dependencies; RubyLLM handles provider connections.

### Added

- **`before_completion` hook** - New hook system for intercepting and modifying chat completion requests before they're sent to the AI provider.
  - Configure at global, class, or instance levels
  - Hooks receive a `CompletionContext` with access to messages, params, and the chat completion instance
  - Messages are mutable for content filtering, PII redaction, adding system prompts, etc.
  - Params can be modified for dynamic model selection, A/B testing, and more
  - Supports any callable object (Proc, Lambda, or object responding to `#call`)
**README.md**
Understanding how to use discrete AI components in otherwise normal code is key to productively leveraging Raix, and the subject of a book written by Raix's author Obie Fernandez, titled [Patterns of Application Development Using AI](https://leanpub.com/patterns-of-application-development-using-ai). You can easily support the ongoing development of this project by buying the book at Leanpub.
Raix 2.0 is powered by [RubyLLM](https://github.com/crmne/ruby_llm), giving you unified access to OpenAI, Anthropic, Google Gemini, and dozens of other providers through OpenRouter. Note that you can use Raix to add AI capabilities to non-Rails applications as long as you include ActiveSupport as a dependency.
### Chat Completions
### before_completion Hook

The `before_completion` hook lets you intercept and modify chat completion requests before they're sent to the AI provider. This is useful for dynamic parameter resolution, logging, content filtering, PII redaction, and more.

#### Configuration Levels

Hooks can be configured at three levels, with later levels overriding earlier ones:

```ruby
# Global level - applies to all chat completions
Raix.configure do |config|
  config.before_completion = ->(context) {
    # Return a hash of params to merge, or modify context.messages directly
    { temperature: 0.7 }
  }
end
```
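The class- and instance-level examples are elided from this excerpt. A sketch of what they could look like, assuming the class-level `configure` block shown later in this README and an instance-level `before_completion=` writer (the writer name is an assumption):

```ruby
# Class level - applies to all instances of a class
class AuditedAssistant
  include Raix::ChatCompletion

  configure do |config|
    config.before_completion = ->(context) {
      puts "Sending #{context.messages.size} messages to #{context.current_model}"
      {} # no param changes, just logging
    }
  end
end

# Instance level - applies to a single object (writer name assumed)
assistant = AuditedAssistant.new
assistant.before_completion = ->(context) { { max_tokens: 256 } }
```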
When hooks exist at multiple levels, they're called in order (global → class → instance), with returned params merged together. Later hooks override earlier ones for the same parameter.
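To make the merge behavior concrete, suppose the global hook above returns `{ temperature: 0.7 }` while a class-level hook returns an override. A sketch; `CostCapper` is hypothetical and only shows that any object responding to `#call` can serve as a hook:

```ruby
# Class hook: a plain object with #call instead of a lambda
class CostCapper
  def call(context)
    { temperature: 0.2, max_tokens: 256 }
  end
end

class SupportBot
  include Raix::ChatCompletion

  configure do |config|
    config.before_completion = CostCapper.new
  end
end

# For SupportBot completions the hooks run global -> class, and the returned
# hashes are merged: { temperature: 0.2, max_tokens: 256 }
# (the class-level temperature overrides the global 0.7)
```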
#### The CompletionContext Object

Hooks receive a `CompletionContext` object with access to:

```ruby
context.chat_completion        # The ChatCompletion instance
context.messages               # Array of messages (mutable, in OpenAI format)
context.params                 # Hash of params (mutable)
context.transcript             # The instance's transcript
context.current_model          # Currently configured model
context.chat_completion_class  # The class including ChatCompletion
context.configuration          # The instance's configuration
```
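For example, a hook can combine these attributes: mutate `context.messages` in place to redact email addresses and return a param override keyed off `context.current_model`. A sketch; the regex and symbol-keyed message shape are assumptions:

```ruby
Raix.configure do |config|
  config.before_completion = ->(context) {
    # Redact email addresses in outgoing message content (assumes symbol keys)
    context.messages.each do |message|
      next unless message[:content].is_a?(String)
      message[:content] = message[:content].gsub(/\S+@\S+\.\S+/, "[REDACTED EMAIL]")
    end

    # Return a param override only for a specific model
    context.current_model == "gpt-4o" ? { temperature: 0.3 } : {}
  }
end
```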
The second (optional) module that you can add to your Ruby classes after `ChatCompletion` is `FunctionDispatch`. It lets you declare and implement functions to be called at the AI's discretion in a declarative, Rails-like "DSL" fashion.
If bundler is not being used to manage dependencies, install the gem by executing:

    $ gem install raix
### Configuration

Raix 2.0 uses [RubyLLM](https://github.com/crmne/ruby_llm) as its backend for LLM provider connections. Configure your API keys through RubyLLM:
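The updated initializer itself is not shown in this excerpt; a minimal sketch, assuming RubyLLM's standard configuration keys (check the RubyLLM README for the exact keys your providers need):

```ruby
# config/initializers/raix.rb
RubyLLM.configure do |config|
  config.openai_api_key     = ENV["OPENAI_API_KEY"]
  config.anthropic_api_key  = ENV["ANTHROPIC_API_KEY"]
  config.openrouter_api_key = ENV["OPENROUTER_API_KEY"]
end
```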
Raix will automatically use the appropriate provider based on the model name:

- Models starting with `gpt-` or `o1` use OpenAI directly
- All other models route through OpenRouter
### Global vs Class-Level Configuration

You can configure Raix options globally or at the class level:

```ruby
# Global configuration
Raix.configure do |config|
  config.temperature = 0.7
  config.max_tokens = 1000
  config.model = "gpt-4o"
  config.max_tool_calls = 25
end

# Class-level configuration (overrides global)
class MyAssistant
  include Raix::ChatCompletion

  configure do |config|
    config.model = "anthropic/claude-3-opus"
    config.temperature = 0.5
  end
end
```
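Once configured, usage is unchanged. A sketch using the `transcript` and `chat_completion` API from the Chat Completions section:

```ruby
assistant = MyAssistant.new
assistant.transcript << { user: "Summarize the latest support ticket." }
assistant.chat_completion # uses anthropic/claude-3-opus at temperature 0.5
```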
### Upgrading from Raix 1.x

If upgrading from Raix 1.x, update your configuration from:
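(The original "before" snippet is elided here. Based on the OpenRouter-era lines removed elsewhere in this diff, a typical 1.x setup looked roughly like this sketch; the retry options and token filter are illustrative, not the exact original.)

```ruby
# config/initializers/raix.rb (Raix 1.x)
retry_options = { max: 2, interval: 0.05, backoff_factor: 2 }

OpenRouter.configure do |config|
  config.access_token = ENV["OPENROUTER_API_KEY"]
  config.faraday do |f|
    f.request :retry, retry_options
    f.response :logger, Logger.new($stdout), { headers: true, bodies: true, errors: true } do |logger|
      logger.filter(/(Bearer) (\S+)/, '\1[REDACTED]')
    end
  end
end

Raix.configure do |config|
  config.openrouter_client = OpenRouter::Client.new
end
```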
To the new RubyLLM-based configuration shown above.
## Development
After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.