feat(Guardrails Node): Require Chat model only for LLM checks #22241
base: master
Conversation
…s-node-only-show-model-connector-if-the-selected
BundleMon: unchanged files (2), no change in files bundle size; groups updated (2). Final result: ✅ View report on the BundleMon website.
Codecov Report: ❌
1 issue found across 7 files
Prompt for AI agents (all 1 issue)
Understand the root cause of the following issue and fix it.
<file name="packages/@n8n/nodes-langchain/nodes/Guardrails/Guardrails.node.ts">
<violation number="1" location="packages/@n8n/nodes-langchain/nodes/Guardrails/Guardrails.node.ts:15">
`model` availability is determined only from item 0 guardrails, so later items that enable LLM checks fail with `Chat Model is required`. Iterate all items (or evaluate per item) when deciding to load the chat model.</violation>
</file>
Reply to cubic to teach it or ask questions. Re-run a review with @cubic-dev-ai review this PR
  const items = this.getInputData();
  const operation = this.getNodeParameter('operation', 0) as 'classify' | 'sanitize';
- const model = operation === 'classify' ? await getChatModel.call(this) : null;
+ const model = hasLLMGuardrails(this.getNodeParameter('guardrails', 0) as GuardrailsOptions)
model availability is determined only from item 0 guardrails, so later items that enable LLM checks fail with Chat Model is required. Iterate all items (or evaluate per item) when deciding to load the chat model.
Prompt for AI agents
Address the following comment on packages/@n8n/nodes-langchain/nodes/Guardrails/Guardrails.node.ts at line 15:
<comment>`model` availability is determined only from item 0 guardrails, so later items that enable LLM checks fail with `Chat Model is required`. Iterate all items (or evaluate per item) when deciding to load the chat model.</comment>
<file context>
@@ -10,7 +12,9 @@ export class Guardrails implements INodeType {
const items = this.getInputData();
const operation = this.getNodeParameter('operation', 0) as 'classify' | 'sanitize';
- const model = operation === 'classify' ? await getChatModel.call(this) : null;
+ const model = hasLLMGuardrails(this.getNodeParameter('guardrails', 0) as GuardrailsOptions)
+ ? await getChatModel.call(this)
+ : null;
</file context>
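One way to address this, sketched under the assumption that `getChatModel` and `hasLLMGuardrails` keep the signatures shown in the diff (the `needsModel` variable is illustrative, not from the PR):

```typescript
// Check every input item, not just item 0, before deciding whether the
// Chat Model connection is actually needed for this execution.
const needsModel = items.some((_item, itemIndex) =>
	hasLLMGuardrails(this.getNodeParameter('guardrails', itemIndex) as GuardrailsOptions),
);
const model = needsModel ? await getChatModel.call(this) : null;
```

Loading the model once up front (rather than per item) keeps a single connection lookup while still covering items whose guardrails differ from item 0.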
E2E Tests: n8n tests passed after 10m 23.8s.
This message was posted automatically by currents.dev.
Summary
This PR makes the Chat model requirement more granular, requiring it only when LLM checks are present.
2025-11-24.14-48-01.mp4
Had to add versioning, so that old classification guardrails that use only non-LLM checks don't end up looking like this:

Related Linear tickets, GitHub issues, and Community forum posts
https://linear.app/n8n/issue/NODE-3984/guardrails-node-only-show-model-connector-if-the-selected-action-is
Review / Merge checklist
release/backport (if the PR is an urgent fix that needs to be backported)