The LMS API (Learning Management System API) is a web API built with FastAPI that provides endpoints for managing learning data.
The LMS frontend uses the LMS API to display items and dashboard charts.
Caddy serves as a reverse proxy that forwards requests to the backend.
The API key is used to authorize requests to the LMS API in:
- The Swagger UI
- The LMS frontend
The key should follow the API key format.
Store the key in the LMS_API_KEY variable in .env.docker.secret.
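For illustration, the relevant entries in .env.docker.secret might look like this (the values shown are placeholders, not real defaults):

```
# .env.docker.secret (example entries only)
LMS_API_KEY=<lms-api-key>
LMS_API_HOST_PORT=<lms-api-host-port>
```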
- The LMS API key (without < and >).
- The LMS API host port (without < and >): the port number at which the LMS API is available on the host, i.e. the value of LMS_API_HOST_PORT in .env.docker.secret.
Note: See URL.
- (REMOTE or LOCAL) When running the request on the host where the LMS API is deployed: http://localhost:<lms-api-host-port>
- (LOCAL) When running the request on the local machine while the LMS API is deployed on the VM: http://<your-vm-ip-address>:<lms-api-host-port>
Replace the placeholders:
- The LMS API base URL (without < and >).
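As a sketch, filling in the placeholders from the environment might look like this in Python (the X-API-Key header name and the default values are assumptions; check how the API actually expects the key):

```python
import os
from urllib.request import Request

# Example fallback values; in the real setup these come from .env.docker.secret
api_key = os.environ.get("LMS_API_KEY", "example-key")
host_port = os.environ.get("LMS_API_HOST_PORT", "8000")

# Base URL when running the request on the host where the LMS API is deployed
base_url = f"http://localhost:{host_port}"

# Build (but do not send) a request; the auth header name is an assumption
request = Request(f"{base_url}/items", headers={"X-API-Key": api_key})
print(request.full_url)
```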
In this project, Caddy is configured using the Caddyfile at caddy/Caddyfile, which specifies Caddy's duties:
- Listen on a specific port inside a Docker container.
- Forward requests to the backend.
- Forward requests to the Qwen Code API.
- Forward requests to pgAdmin.
- Serve the frontend files.
Caddy listens on the port whose number is the value of LMS_API_HOST_PORT in .env.docker.secret.
Caddy routes the following API endpoints to the backend service:
- /items*
- /learners*
- /interactions*
- /pipeline*
- /analytics*
- /docs*
- /openapi.json
Caddy routes endpoints under /utils/qwen-code-api* to the Qwen Code API.
Caddy routes endpoints under /utils/pgadmin* to pgAdmin.
Caddy serves static frontend files from /srv for all other paths.
The try_files directive falls back to index.html for client-side routing.
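Taken together, these duties might be expressed in a Caddyfile roughly like the following sketch. The upstream service names and ports are assumptions; only the path matchers, the {$LMS_API_HOST_PORT} port, and the /srv root come from the configuration described above:

```
# Sketch only: the upstream names/ports (backend:8000, qwen-code-api:8080,
# pgadmin:80) are assumptions, not the project's actual values.
:{$LMS_API_HOST_PORT} {
	# API endpoints proxied to the backend service
	@backend path /items* /learners* /interactions* /pipeline* /analytics* /docs* /openapi.json
	reverse_proxy @backend backend:8000

	# Qwen Code API and pgAdmin
	reverse_proxy /utils/qwen-code-api* qwen-code-api:8080
	reverse_proxy /utils/pgadmin* pgadmin:80

	# All other paths: serve the frontend from /srv, falling back to
	# index.html for client-side routing
	root * /srv
	try_files {path} /index.html
	file_server
}
```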