Fixes invalid_request_error for tool calling with Anthropic models #192

Merged
zachdaniel merged 1 commit into ash-project:main from
kioopi:fix/prevent-invalid-request-for-anthropic-tool-calls
Apr 24, 2026

Conversation


@kioopi kioopi commented Apr 24, 2026

Anthropic does not allow "prefilling assistant messages", meaning the last message of the `messages` parameter to ReqLLM `generate_object/4` may not have the role `:assistant`.

See: #191

This commit prevents `run_loop/10` and `stream_iteration/1` in `AshAi.ToolLoop` from appending assistant messages to the context when the model provider is anthropic.
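The shape of the guard can be sketched as follows. This is a hypothetical, self-contained illustration, not the actual `AshAi.ToolLoop` code: the function name `append_assistant_message?/1` and the plain-map model are stand-ins (the real code would match on a `%ReqLLM.Model{}` struct).

```elixir
defmodule ToolLoopSketch do
  # Hypothetical sketch: decide whether the accumulated assistant message
  # may be appended back onto the context before the next provider call.
  # Anthropic rejects requests whose final message has role :assistant
  # ("prefilling"), so the append is skipped for that provider.
  def append_assistant_message?(%{provider: :anthropic}), do: false
  def append_assistant_message?(_model), do: true
end

# For anthropic the assistant message is not appended; for other
# providers the previous behavior is kept.
IO.inspect(ToolLoopSketch.append_assistant_message?(%{provider: :anthropic}))
IO.inspect(ToolLoopSketch.append_assistant_message?(%{provider: :openai}))
```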


I've changed `defp resolve_model(model, _opts), do: model` to return `ReqLLM.model!(model)` in `AshAi.ToolLoop`. This makes working with the model inside the module simpler and seems to be recommended by ReqLLM: https://github.com/agentjido/req_llm/blob/main/lib/req_llm.ex#L353
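To illustrate why the normalization helps: `ReqLLM.model!/1` turns a `"provider:model"` spec string into a struct, so the rest of the module can match on `:provider` directly. The sketch below uses a hypothetical stand-in parser rather than ReqLLM itself, purely to show the before/after shape.

```elixir
defmodule ResolveSketch do
  # Hypothetical stand-in for ReqLLM.model!/1: normalize a
  # "provider:name" spec string into a map with a :provider key.
  def resolve_model(spec) when is_binary(spec) do
    [provider, name] = String.split(spec, ":", parts: 2)
    %{provider: String.to_existing_atom(provider), model: name}
  end

  # Already-normalized maps/structs pass through unchanged.
  def resolve_model(%{provider: _} = model), do: model
end

IO.inspect(ResolveSketch.resolve_model("anthropic:claude-sonnet-4"))
```

With the raw spec passed through unchanged (the old behavior), every call site that needed the provider had to parse the string itself; normalizing once in `resolve_model/2` removes that duplication.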

I would have done the same in `AshAi.Actions.Prompt` for `resolve_model_spec`, but since the model is passed from `Prompt` to `ToolLoop`, for people using the `fn` variant of `resolve_model` this would be a breaking change.

Contributor checklist

Leave anything that you believe does not apply unchecked.

  • I accept the AI Policy, or AI was not used in the creation of this PR.
  • Bug fixes include regression tests
  • Chores
  • Documentation changes
  • Features include unit/acceptance tests
  • Refactoring
  • Update dependencies

zachdaniel merged commit 259b70a into ash-project:main Apr 24, 2026
21 checks passed