
feat: add max length validation for user chat messages #256

Open
sharma-sugurthi wants to merge 2 commits into jenkinsci:main from sharma-sugurthi:feat/message-length-validation

Conversation

@sharma-sugurthi
Contributor

Fixes #255

Problem

ChatRequest.message only validates non-emptiness; there is no upper bound. Users can submit arbitrarily large payloads (e.g., 1 MB) that flow directly into the LLM pipeline, wasting context window and compute. The WebSocket endpoint has no input validation at all.

By contrast, the file upload path already enforces MAX_TEXT_CONTENT_LENGTH = 10000, but the primary chat path has no limit.

Fix

Adds a configurable max_message_length (default: 5000 characters), enforced across all three input paths:

  1. ChatRequest (POST /sessions/{id}/message) - @field_validator rejects with 422
  2. ChatRequestWithFiles (POST /sessions/{id}/message/upload) - @model_validator rejects with 422
  3. WebSocket (/sessions/{id}/stream) - sends JSON error and continues the connection

The limit is configurable via config.yml under chat.max_message_length.
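For context, a Pydantic v2 validator along these lines would produce the 422 behavior described on the REST path (a sketch, not the PR's exact code; the hard-coded default here stands in for the config lookup):

```python
from pydantic import BaseModel, field_validator

# Stand-in for the configurable chat.max_message_length (default 5000).
MAX_MESSAGE_LENGTH = 5000


class ChatRequest(BaseModel):
    message: str

    @field_validator("message")
    @classmethod
    def check_length(cls, v: str) -> str:
        # Existing behavior: reject empty messages.
        if not v.strip():
            raise ValueError("message must not be empty")
        # New behavior: reject messages over the configured limit.
        if len(v) > MAX_MESSAGE_LENGTH:
            raise ValueError(
                f"message exceeds maximum length of {MAX_MESSAGE_LENGTH} characters"
            )
        return v
```

When such a model is used as a FastAPI request body, a failed validator surfaces as a 422 Unprocessable Entity response without any extra handler code.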

Changes

| File | What changed |
| --- | --- |
| schemas.py | Added max length check to both ChatRequest and ChatRequestWithFiles validators |
| chatbot.py | Added WebSocket length guard + CONFIG import |
| config.yml | Added chat.max_message_length: 5000 |
| config-testing.yml | Same |
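The config addition presumably looks like this (the exact nesting is an assumption based on the chat.max_message_length key path):

```yaml
chat:
  max_message_length: 5000
```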

Testing

  • pylint - 10.00/10
  • No existing tests reference ChatRequest - zero breakage risk
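Since no tests cover ChatRequest yet, a regression test along these lines could be added later (hypothetical; the schema is inlined here as a stand-in for importing the real one from schemas.py):

```python
import pytest
from pydantic import BaseModel, ValidationError, field_validator

MAX_MESSAGE_LENGTH = 5000  # mirrors the chat.max_message_length default


class ChatRequest(BaseModel):
    """Stand-in for the real schemas.ChatRequest."""

    message: str

    @field_validator("message")
    @classmethod
    def check_length(cls, v: str) -> str:
        if len(v) > MAX_MESSAGE_LENGTH:
            raise ValueError("message too long")
        return v


def test_accepts_message_at_limit():
    msg = "x" * MAX_MESSAGE_LENGTH
    assert ChatRequest(message=msg).message == msg


def test_rejects_oversized_message():
    with pytest.raises(ValidationError):
        ChatRequest(message="x" * (MAX_MESSAGE_LENGTH + 1))
```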

@sharma-sugurthi sharma-sugurthi requested a review from a team as a code owner March 7, 2026 06:08
```python
if not user_message:
    continue

max_msg_len = CONFIG.get("chat", {}).get("max_message_length", 5000)
```
Contributor
Getting the config like this isn't recommended. Better to add a helper so the default value is defined in one place; with the current approach, the default ends up set in multiple places.

Contributor Author
Extracted MAX_MESSAGE_LENGTH as a module-level constant in schemas.py so the default lives in one place; chatbot.py now imports it instead of reading the config directly.

Address review feedback: replace scattered CONFIG.get() calls
with a single MAX_MESSAGE_LENGTH constant defined in schemas.py.
The default value (5000) now lives in one place.

- schemas.py: export MAX_MESSAGE_LENGTH from chat config
- chatbot.py: import MAX_MESSAGE_LENGTH instead of reading CONFIG
- Remove unused CONFIG import from chatbot.py
@berviantoleo berviantoleo added the enhancement For changelog: Minor enhancement. use `major-rfe` for changes to be highlighted label Mar 18, 2026


Development

Successfully merging this pull request may close these issues.

[Enh] Add max length validation for user chat messages across REST and WebSocket endpoints
