An OpenMRS 3.x microfrontend that lets clinicians ask natural-language questions about a patient's chart and receive AI-generated answers with source citations.
A floating AI button appears on the patient chart page. Clicking it opens a search panel where clinicians can type questions like:
- "What medications is this patient on?"
- "Has she ever had a bad reaction to penicillin?"
- "Is her diabetes getting better or worse?"
The module streams an answer token-by-token (via SSE) with numbered citations (e.g. [1], [2]) that link back to the relevant section of the patient chart (Results, Orders, Allergies, etc.).
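The token-by-token streaming arrives as a Server-Sent Events stream. A minimal sketch of the client-side parsing, assuming each SSE `data:` line carries one chunk of the answer (the exact event format is defined by the backend module, so treat this as illustrative):

```typescript
// Minimal SSE line parser: feed it a decoded text chunk from the response
// body and it yields the payload of each `data:` line. The one-chunk-per-
// `data:`-line framing is an assumption; check the backend's stream format.
function* parseSseChunk(chunk: string): Generator<string> {
  for (const line of chunk.split("\n")) {
    if (line.startsWith("data:")) {
      yield line.slice("data:".length).trimStart();
    }
  }
}
```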
This frontend requires the Chart Search AI backend module, which uses a RAG (Retrieval Augmented Generation) architecture:
- Retrieval -- patient records are embedded with all-MiniLM-L6-v2 (ONNX, CPU) and narrowed to the top-K most relevant via cosine similarity.
- Generation -- the filtered records are sent to a local GGUF LLM (default: Llama 3.3 8B via llama.cpp) with a system prompt that produces cited, structured answers.
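The retrieval step can be sketched as follows (TypeScript here for brevity; the real backend embeds records with all-MiniLM-L6-v2 into its own vector space, and the `topK` helper and sample vectors below are purely illustrative):

```typescript
// Cosine similarity between two equal-length embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Keep the k records whose embeddings are most similar to the question's.
function topK<T>(
  query: number[],
  records: { item: T; vec: number[] }[],
  k: number,
): T[] {
  return records
    .map((r) => ({ item: r.item, score: cosine(query, r.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((r) => r.item);
}
```

Only these top-K records are passed to the generation step, which keeps the LLM prompt small and focused on relevant chart data.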
See the backend README for full setup instructions, model downloads, and global property configuration.
- OpenMRS 3.x with the Chart Search AI module installed and configured
- Node.js 18+
- Yarn 4.x
```sh
# Install dependencies
yarn install

# Start the dev server (proxies to a running OpenMRS instance)
yarn start
```

The following options can be set via the OpenMRS 3.x config system:
| Property | Type | Default | Description |
|---|---|---|---|
| `aiSearchPlaceholder` | string | `"Ask AI about this patient..."` | Placeholder text for the search input |
| `maxQuestionLength` | number | `1000` | Maximum characters allowed in a question |
| `useStreaming` | boolean | `true` | Use the SSE streaming endpoint for token-by-token responses |
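As an example, a distribution could override these defaults in its frontend configuration, keyed by the module name (the property names come from the table above; the values here are illustrative):

```json
{
  "@openmrs/esm-chartsearchai-app": {
    "aiSearchPlaceholder": "Ask AI about this chart...",
    "maxQuestionLength": 500,
    "useStreaming": false
  }
}
```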
All endpoints are served by the backend module under `/ws/rest/v1/chartsearchai/`:
| Method | Path | Description |
|---|---|---|
| POST | `/search` | Synchronous search (returns complete answer) |
| POST | `/search/stream` | SSE streaming search (tokens streamed in real time) |
Request body:

```json
{ "patient": "<uuid>", "question": "<text>" }
```

Response:

```json
{
  "answer": "The patient is currently on metformin [1] and lisinopril [2]...",
  "disclaimer": "AI-generated summary. Verify with the full chart.",
  "references": [
    { "index": 1, "resourceType": "order", "resourceId": 456, "date": "2025-12-01" },
    { "index": 2, "resourceType": "order", "resourceId": 789, "date": "2025-11-15" }
  ]
}
```

The required privilege is `AI Query Patient Data`.
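The synchronous endpoint can be called from TypeScript with a small helper. The request and response shapes below mirror the JSON documented above; the helper and type names (`buildSearchRequest`, `ChartSearchResponse`) are illustrative, not part of the module's API:

```typescript
// Types mirroring the documented response shape.
interface ChartSearchReference {
  index: number;
  resourceType: string;
  resourceId: number;
  date: string;
}

interface ChartSearchResponse {
  answer: string;
  disclaimer: string;
  references: ChartSearchReference[];
}

// Build the POST request for the synchronous search endpoint.
function buildSearchRequest(patientUuid: string, question: string) {
  return {
    url: "/ws/rest/v1/chartsearchai/search",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ patient: patientUuid, question }),
    },
  };
}

// Usage (inside an authenticated OpenMRS session):
// const { url, init } = buildSearchRequest(patientUuid, "What medications is this patient on?");
// const result: ChartSearchResponse = await (await fetch(url, init)).json();
```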
These steps work for the OpenMRS SDK, O3 Standalone, and Docker deployments.
```sh
git clone https://github.com/openmrs/openmrs-esm-chartsearchai.git
cd openmrs-esm-chartsearchai
yarn install
yarn build
```

Find the `frontend/` folder that contains `importmap.json`:
- OpenMRS SDK: `~/openmrs/<server-name>/frontend/`
- O3 Standalone: `<standalone-directory>/appdata/frontend/`
- Docker: the frontend files are inside the `frontend` container (see below)

Confirm by checking that `importmap.json` exists inside the directory.
For Docker, find the frontend directory inside the container:

```sh
# Find the frontend container name
docker ps --format '{{.Names}}' | grep frontend

# The frontend files are typically at /usr/share/nginx/html/
# Verify by checking for importmap.json
docker exec <frontend-container> ls /usr/share/nginx/html/importmap.json
```

SDK / Standalone:

```sh
mkdir -p <frontend-directory>/openmrs-esm-chartsearchai-app
cp dist/* <frontend-directory>/openmrs-esm-chartsearchai-app/
```

Docker:
```sh
# Create the directory inside the container
docker exec <frontend-container> mkdir -p /usr/share/nginx/html/openmrs-esm-chartsearchai-app

# Copy the built files into the container
docker cp dist/. <frontend-container>:/usr/share/nginx/html/openmrs-esm-chartsearchai-app/
```

Edit `importmap.json` and add this entry inside the `"imports"` object:

```json
"@openmrs/esm-chartsearchai-app": "./openmrs-esm-chartsearchai-app/openmrs-esm-chartsearchai-app.js"
```

For Docker, you can edit the file in place:
```sh
docker exec <frontend-container> sh -c "cat /usr/share/nginx/html/importmap.json | \
  sed 's/}}/,\"@openmrs\/esm-chartsearchai-app\":\"\.\/openmrs-esm-chartsearchai-app\/openmrs-esm-chartsearchai-app.js\"}}/' \
  > /tmp/importmap.json && mv /tmp/importmap.json /usr/share/nginx/html/importmap.json"
```

Or copy the file out, edit it locally, and copy it back:
```sh
docker cp <frontend-container>:/usr/share/nginx/html/importmap.json .
# Edit importmap.json with your editor
docker cp importmap.json <frontend-container>:/usr/share/nginx/html/importmap.json
```

Edit `routes.registry.json` and add the entry below to the top-level JSON object.
For Docker, copy the file out, edit it, and copy it back:

```sh
docker cp <frontend-container>:/usr/share/nginx/html/routes.registry.json .
# Edit routes.registry.json with your editor
docker cp routes.registry.json <frontend-container>:/usr/share/nginx/html/routes.registry.json
```

Add this entry:
```json
"@openmrs/esm-chartsearchai-app": {
  "$schema": "https://json.openmrs.org/routes.schema.json",
  "backendDependencies": {
    "webservices.rest": ">=2.44.0",
    "chartsearchai": ">=1.0.0-SNAPSHOT"
  },
  "extensions": [
    {
      "name": "ai-search-button",
      "component": "aiSearchButton",
      "slot": "patient-banner-tags-slot",
      "privilege": "AI Query Patient Data",
      "order": 100
    }
  ],
  "version": "1.0.0"
}
```

The logged-in user's role must include the "AI Query Patient Data" privilege. You can assign it via the OpenMRS admin UI under Administration > Manage Roles.
Press Cmd+Shift+R (Mac) or Ctrl+Shift+R (Windows/Linux) to bypass the cache. Navigate to a patient chart and the AI search button should appear in the patient banner.
After making changes, rebuild and copy the output again.

SDK / Standalone:

```sh
yarn build
cp dist/* <frontend-directory>/openmrs-esm-chartsearchai-app/
```

Docker:

```sh
yarn build
docker cp dist/. <frontend-container>:/usr/share/nginx/html/openmrs-esm-chartsearchai-app/
```

Then hard-refresh the browser. No server restart is needed.
```sh
# Run the test suite
yarn test

# Produce a production build
yarn build
```