
feat: Add comprehensive LLM exchange logging #144

Draft
gary-van-woerkens wants to merge 1 commit into main from feature/add-llm-logging

Conversation

@gary-van-woerkens
Contributor

  • Add LLMLogEntry interface for structured logging
  • Implement configurable log levels: disabled, metadata, truncated, full
  • Add logLLMRequestSent, logLLMResponseReceived, logLLMRequestFailed functions
  • Integrate logging in line-comments-sender with automatic timing and token tracking
  • Add LOG_LLM_EXCHANGES environment variable for configuration
  • Include comprehensive tests for all logging levels and scenarios
  • Update README with complete logging documentation and security considerations
  • Add .env.example configuration for the new logging feature

This enables full visibility into LLM exchanges while maintaining security and performance controls.

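The structures described in the bullets above might look roughly like the sketch below. The level names and the `LOG_LLM_EXCHANGES` variable come from the PR description; the field names, the truncation length, and the `formatForLevel` helper are assumptions for illustration, not the PR's actual code.

```typescript
// Hypothetical sketch of the logging structures described in this PR.
type LLMLogLevel = 'disabled' | 'metadata' | 'truncated' | 'full'

interface LLMLogEntry {
  timestamp: string
  model: string
  strategy: string
  durationMs?: number   // assumed field: automatic timing
  tokenCount?: number   // assumed field: token tracking
  prompt?: string       // dropped at 'metadata', shortened at 'truncated'
  response?: string
}

const TRUNCATE_AT = 500 // assumed truncation length

// Shape a log entry according to the configured level.
function formatForLevel(entry: LLMLogEntry, level: LLMLogLevel): LLMLogEntry | null {
  switch (level) {
    case 'disabled':
      return null
    case 'metadata': {
      // Keep only metadata; strip prompt and response bodies.
      const { prompt, response, ...meta } = entry
      return meta
    }
    case 'truncated':
      return {
        ...entry,
        prompt: entry.prompt?.slice(0, TRUNCATE_AT),
        response: entry.response?.slice(0, TRUNCATE_AT)
      }
    default: // 'full'
      return entry
  }
}
```

The `metadata` level is the one the ConfigMap comments later in this thread recommend for production, since it never emits prompt or response content.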
@revu-bot revu-bot Bot requested a review from revu-bot July 18, 2025 14:52
@sonarqubecloud

Contributor

@revu-bot revu-bot left a comment


An error occurred: Error processing line comments: Invalid analysis format:

```json
[
  {
    "expected": "array",
    "code": "invalid_type",
    "path": ["comments"],
    "message": "Invalid input: expected array, received string"
  }
]
```
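The error above says the analysis payload arrived with `comments` as a string where an array was expected. A minimal guard that produces this kind of issue record (a sketch mirroring the reported error, not Revu's actual validation code) could look like:

```typescript
// Hypothetical validation guard matching the issue shape reported above.
interface ValidationIssue {
  expected: string
  code: string
  path: (string | number)[]
  message: string
}

function validateAnalysis(analysis: unknown): ValidationIssue[] {
  const issues: ValidationIssue[] = []
  const comments = (analysis as { comments?: unknown } | null)?.comments
  if (!Array.isArray(comments)) {
    issues.push({
      expected: 'array',
      code: 'invalid_type',
      path: ['comments'],
      message: `Invalid input: expected array, received ${typeof comments}`
    })
  }
  return issues
}
```

Checking the shape before processing lets the bot fail with a structured report instead of crashing mid-review.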

Revu logs

@gary-van-woerkens gary-van-woerkens marked this pull request as draft July 18, 2025 14:55
@RealVidy RealVidy force-pushed the main branch 3 times, most recently from 2cd6e78 to ecbeb74 Compare August 20, 2025 14:55
@tokenbureau

tokenbureau Bot commented Aug 20, 2025

🎉 Deployment for commit 5b1d04d:

Ingresses
Docker images
  • 📦 docker pull harbor.fabrique.social.gouv.fr/fabrique/revu/app:v1.11.0
Debug

@gary-van-woerkens gary-van-woerkens force-pushed the main branch 2 times, most recently from d69789a to 05a51e8 Compare August 21, 2025 15:46
@roomote-v0

roomote-v0 Bot commented Nov 20, 2025

See task on Roo Cloud

Review completed. Found 2 issues that need attention:

  • Missing PR context in LLM logging calls
  • Invalid YAML syntax in Kubernetes ConfigMap

Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.


```typescript
try {
  // Log request being sent
  logLLMRequestSent(prompt, model, strategyName)
```


The LLM logging functions are called without PR context (pr_number, repository), even though this information is available in the call chain. The sendToAnthropic function receives a PlatformContext containing prNumber, repoOwner, and repoName, but this context isn't passed to lineCommentsSender. This makes it difficult to correlate LLM logs with specific PRs in production environments where multiple PRs may be processed concurrently.
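One way to address this (a sketch: the `PlatformContext` field names come from the review comment above, while the added parameter and the log-entry fields are assumptions) is to thread the context through to the logger and stamp each entry with the PR coordinates:

```typescript
// Hypothetical sketch: pass PR context from sendToAnthropic down to the logger.
interface PlatformContext {
  prNumber: number
  repoOwner: string
  repoName: string
}

interface LLMRequestLog {
  model: string
  strategy: string
  prompt_chars: number
  pr_number: number
  repository: string // "owner/name", for correlating logs of concurrent PR runs
}

function logLLMRequestSent(
  prompt: string,
  model: string,
  strategyName: string,
  ctx: PlatformContext // added parameter carrying PR context
): LLMRequestLog {
  const entry: LLMRequestLog = {
    model,
    strategy: strategyName,
    prompt_chars: prompt.length,
    pr_number: ctx.prNumber,
    repository: `${ctx.repoOwner}/${ctx.repoName}`
  }
  console.log(JSON.stringify(entry))
  return entry
}
```

With `pr_number` and `repository` on every entry, log aggregation can filter a single PR's exchanges even when several reviews run at once.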

Fix it with Roo Code or mention @roomote and request a fix.

Comment on lines 6 to +12

```yaml
REPOSITORY_FOLDER: "/app/repos"
LOG_LLM_EXCHANGES: "full"

#LOG_LLM_EXCHANGES=metadata # production mode
#LOG_LLM_EXCHANGES=truncated # quick review
#LOG_LLM_EXCHANGES=full # full LLM data
#LOG_LLM_EXCHANGES=disabled # no LLM logs
```


These comment lines use # which is invalid YAML syntax within the data section of a ConfigMap. YAML comments must be outside of string values or the ConfigMap will fail to parse during deployment. Either remove these lines or move them outside the data section as proper YAML comments.

Suggested change

```diff
-REPOSITORY_FOLDER: "/app/repos"
-LOG_LLM_EXCHANGES: "full"
-#LOG_LLM_EXCHANGES=metadata # production mode
-#LOG_LLM_EXCHANGES=truncated # quick review
-#LOG_LLM_EXCHANGES=full # full LLM data
-#LOG_LLM_EXCHANGES=disabled # no LLM logs
+# Configuration options for LOG_LLM_EXCHANGES:
+# - metadata: production mode (logs only metadata)
+# - truncated: quick review (logs metadata + truncated content)
+# - full: full LLM data (logs complete prompts and responses)
+# - disabled: no LLM logs
+data:
+  REPOSITORY_FOLDER: "/app/repos"
+  LOG_LLM_EXCHANGES: "full"
```

Fix it with Roo Code or mention @roomote and request a fix.

