Releases: dimagi/open-chat-studio-docs
Weekly Release 2026.04.13
New Features
- Added date range and participant filters to the annotation queue view, making it easier to find specific sessions.
- Python Node code in pipelines can now call `end_session()` to programmatically end the current session, enabling more complex and deterministic session-ending logic.
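As a sketch, conditional session-ending logic in a Python node might look like the snippet below. The farewell keywords and the `main(input, **kwargs)` entry point are illustrative assumptions; inside a pipeline, `end_session()` is provided by the sandbox, so the stub here exists only to make the example self-contained.

```python
# Hypothetical stub standing in for the sandbox-provided end_session();
# inside an Open Chat Studio Python node you would not define this.
session_ended = {"value": False}

def end_session():
    session_ended["value"] = True

def main(input, **kwargs):
    # End the session deterministically on an explicit farewell keyword.
    if input.strip().lower() in ("goodbye", "bye", "stop"):
        end_session()
        return "Thanks for chatting. Your session has ended."
    return input

main("goodbye")
```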
Bug Fixes
- Fixed an issue where document indexing failed for certain markdown files containing multi-byte UTF-8 characters.
Weekly Release 2026.04.06
New Features
- Added the ability to import sessions from an existing evaluation dataset into an annotation queue.
- Added three session selection modes when adding sessions to an annotation queue: Selected only (default, hand-pick via checkboxes), All matching filters (bulk-add every session matching the current filter), and Sample (add a random percentage of matching sessions using a configurable slider). A confirmation modal is shown for bulk operations.
Weekly Release 2026.03.30
New Features
- When uploading files to a media collection, the upload view now indicates which channels cannot send each file. Hovering over a channel also shows the reason it cannot send the file.
- Added ElevenLabs as a speech service provider, supporting text-to-speech (TTS) and speech-to-text (STT). Providers can sync voices from the ElevenLabs catalog, and custom voices created in ElevenLabs are automatically synced to Open Chat Studio.
- The Meta Cloud API WhatsApp provider now supports media messages. Users can send and receive images, videos, audio, and documents through WhatsApp channels.
- The Meta Cloud API WhatsApp provider now supports template messages as a fallback when the 24-hour service window has expired. When a bot cannot send a message due to an expired window, it automatically sends a pre-configured WhatsApp template instead of silently dropping the message.
Improvements
- The default timeout for Custom Action HTTP calls has been increased from 10 seconds to 30 seconds to better accommodate complex or slow external services.
Bug Fixes
- Fixed an issue where chat poll API responses could not generate correct URLs due to missing request context in the response serializer.
- Fixed an authentication error that occurred when an invalid `chatbot_id` was provided in API requests.
- Fixed an error that could occur when displaying file sizes for files with no recorded content size.
- Fixed an issue where timeout triggers stopped firing after publishing a new experiment version. Sessions created before the publish were silently excluded from timeout detection.
Weekly Release 2026.03.23
New Features
- Python Node code in pipelines can now use `print()` to capture debug and diagnostic output. Printed output is collected and visible as `console` data in the node's trace span, including in Langfuse.
- The Meta Cloud API WhatsApp provider now supports voice messages in addition to text messages.
- Excel and Word document attachments are now automatically converted to text before being sent to the LLM, enabling these file types to be processed in conversations alongside PDFs and images.
- Added support for Meta Cloud API as a new WhatsApp messaging provider, enabling direct integration with the WhatsApp Business Platform without requiring a third-party intermediary. Configure it using your WhatsApp Business Account ID, System User Access Token, App Secret, and Webhook Verify Token.
- Added Set Session State Key and Get Session State built-in tools that allow LLM nodes to read and write data from the session state during a conversation.
- Added Append to Session State and Increment Session State Counter built-in tools, mirroring the existing participant data tools for managing lists and counters in session state.
- Session CSV exports now include a Session State column containing the data stored in the `session_state` field, making it easier to inspect pipeline state alongside conversation history.
- Session detail views now display participant data as of the latest trace, with a timestamp note. AI messages that triggered participant data changes show a diff icon — click it to see a color-coded popover of what was added, removed, or modified.
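The `print()` capture described above can be sketched as follows. The `main(input, **kwargs)` entry point is an assumption; inside a pipeline, the printed lines would surface as `console` data on the node's trace span rather than on stdout.

```python
def main(input, **kwargs):
    # Printed output is collected by the sandbox and shown as "console"
    # data in the node's trace span (visible in Langfuse as well).
    print(f"received input of length {len(input)}")
    result = input.upper()
    print(f"returning: {result}")
    return result

main("hello")
```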
Bug Fixes
- Fixed an issue where pipelines triggered by events (e.g. conversation end, timeout) silently discarded updates to participant data, session state, and session tags. These state changes are now correctly persisted.
Weekly Release 2026.03.16
New Features
- Added an Annotation Reviewer team role that grants scoped access to annotation queues. Users with this role can view and annotate queues they are assigned to, but cannot manage queues, add sessions, export results, or access other parts of the app.
- Participant data changes are now tracked per trace. The trace detail page shows a color-coded summary of what data was added, removed, or modified during each conversation turn, and CSV exports include a Participant Data column with the data snapshot at each message.
- The Send Email pipeline node's subject and recipient fields now support Jinja2 templates, and a new optional body field also accepts Jinja2 templates — the same variables available in the Render Template node. Existing pipelines are unaffected.
- Natural language filtering is now available to all users. Type plain-English queries (e.g., "sessions from last week excluding WhatsApp") on the session, message, trace, participant, and notification tables and click ✨ Generate to automatically build filter rows.
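As an illustration of the Jinja2 support in the Send Email node described above, a templated body might look like this. The variable names (`participant_data`, `input`) are assumptions for the sketch; the actual variables available are the same ones exposed to the Render Template node.

```jinja
Hello {{ participant_data.name }},

Your session has been reviewed. Summary: {{ input }}
{% if participant_data.follow_up %}
We will follow up within 2 business days.
{% endif %}
```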
Weekly Release 2026.03.02
New Features
- Tracing is now available to all users — no feature flag required. View and debug conversation traces directly from the session detail page. Learn more
- Natural language filter input added to session and message tables. Users can type plain-English queries (e.g., "sessions from last week excluding WhatsApp") and click ✨ Generate to automatically create filter rows. This feature is in beta and can be enabled by team admins from the team feature flags page.
- LLM deprecation notifications - Teams now receive in-app notifications when LLM models are deprecated or removed.
Weekly Release 2026.02.23
New Features
- Added support for the Claude Sonnet 4.6 model with adaptive thinking. Claude Sonnet 4.6 is now the default Anthropic model, replacing Claude Sonnet 4.5.
- Document source sync logs are now accessible directly from the Collections page via a "View Sync Logs" button, allowing users to inspect sync history, file counts (added/updated/removed), duration, and error details without leaving the page.
- Added notification events that alert you when something important or noteworthy happens in your system, including failures across custom actions (health checks, API failures), chat operations (pipeline execution, LLM errors, tool failures), media handling (audio synthesis/transcription), and message delivery (platform-specific failures).
Weekly Release 2026.02.16
New Features
- Python nodes can now attach files fetched via HTTP to AI response messages using the new `attach_file_from_response()` helper and `response_bytes` field on HTTP responses. Documentation
- Added an `http` global to the Python sandbox for making HTTP requests, with security guardrails including SSRF prevention, request/response size limits, timeout clamping, automatic retries, and authentication provider integration. Documentation
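Combining the two features above, a Python node might fetch a file and attach it to the AI response roughly as sketched below. This is only a sketch: the URL, the auth provider name, and the exact helper signatures are assumptions, and `http` and `attach_file_from_response()` exist only inside the sandbox, so hypothetical stubs are included here to keep the snippet self-contained.

```python
# --- hypothetical stubs for the sandbox-provided objects ------------------
attached = []

class _Response:
    def __init__(self, data):
        self.response_bytes = data  # raw body, per the release note above

class http:  # stub for the sandbox's `http` global
    @staticmethod
    def get(url, auth=None):
        return _Response(b"%PDF- (placeholder bytes)")

def attach_file_from_response(response, filename):  # stub for the helper
    attached.append((filename, response.response_bytes))

# --- the node code itself -------------------------------------------------
def main(input, **kwargs):
    # Fetch a file over HTTP using a configured authentication provider,
    # then attach it to the AI response message.
    response = http.get("https://example.com/report.pdf", auth="my-provider")
    attach_file_from_response(response, filename="report.pdf")
    return "Here is your report."

main("send me the report")
```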
Improvements
- Authentication provider names in Python node HTTP requests are now case-insensitive, allowing `auth="My-Provider"` and `auth="my-provider"` to match the same provider.
Weekly Release 2026.02.09
New Features
- Voice notes from users and bots are now displayed as attachments in the chat transcript, making it easier to review and access voice messages.
Improvements
- Indexed collections using OpenAI-hosted vectorstores are now limited to 2 remote collections per LLM node, enforcing OpenAI's vectorstore limit. Local indexes and non-OpenAI providers remain unaffected.
Bug Fixes
- Fixed character encoding issues when reading plaintext files by automatically detecting and converting different encoding schemes to Unicode.
- Fixed an issue where local collection index validation in LLM nodes incorrectly required all collections to use the same LLM provider as the node. This restriction now only applies to remote collections.
Weekly Release 2026.02.02
New Features
- Dataset messages table rows can now be highlighted and shared via URL. Each row has a link and copy button to easily share specific dataset messages with others, with automatic scrolling to the highlighted message.
- Custom actions now include health status monitoring. The system automatically checks custom action endpoints every 5 minutes to verify server availability, displaying the status in the custom actions table. Users can also manually trigger health checks.
Improvements
- Router keywords are now automatically converted to uppercase. All router configurations will only accept and match uppercase keywords.