
Conversation

@m1rl0k
Contributor

@m1rl0k m1rl0k commented Nov 5, 2025

Dynamic Reviews
  • Add custom prompt mode with customMode: on | off | auto
  • Auto mode routes complex code files to custom prompts (see the routing sketch below)
  • Capture documentation from the first custom review batch and include it in the overview
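
A minimal sketch of the auto-routing idea, assuming hypothetical names (isComplexFile, the line threshold, the extension list); the real criteria live in the PR's prompts.custom.ts:

```ts
// Sketch only: route "complex" files to the custom prompt when customMode is "auto".
type CustomMode = "on" | "off" | "auto";

interface ReviewFile {
  filename: string;
  patch?: string;
}

// Hypothetical complexity heuristic, not the PR's actual logic.
function isComplexFile(file: ReviewFile): boolean {
  const patchLines = file.patch?.split("\n").length ?? 0;
  return patchLines > 200 || /\.(ts|tsx|go|rs|java)$/.test(file.filename);
}

function usesCustomPrompt(mode: CustomMode, file: ReviewFile): boolean {
  if (mode === "on") return true;
  if (mode === "off") return false;
  return isComplexFile(file); // "auto"
}
```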

Context Batching
  • Batch PR files by character size to respect config.maxReviewChars (default 725k); see the batching sketch below
  • Aggregate comments across batches
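
A minimal sketch of the batching step, assuming a simple greedy pack over patch sizes (the FileDiff shape and the packing order are assumptions):

```ts
// Sketch only: greedily pack file diffs into batches whose combined character
// count stays under maxReviewChars (725_000 by default per the description).
interface FileDiff {
  filename: string;
  patch: string;
}

function batchByChars(files: FileDiff[], maxReviewChars = 725_000): FileDiff[][] {
  const batches: FileDiff[][] = [];
  let current: FileDiff[] = [];
  let size = 0;

  for (const file of files) {
    const len = file.patch.length;
    // Start a new batch once adding this file would exceed the character budget.
    if (current.length > 0 && size + len > maxReviewChars) {
      batches.push(current);
      current = [];
      size = 0;
    }
    current.push(file);
    size += len;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

Each batch is then reviewed separately and the resulting comments aggregated into a single review.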

AWS Bedrock + SAP AI Core
  • Support Bedrock with IAM credentials (no LLM_API_KEY required when AWS creds are present and the model is in the bedrock/anthropic/meta/amazon family); see the key-check sketch below
  • Keep the SAP AI Core path (no API key required with the proper SAP vars)
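
A minimal sketch of the key-requirement check, assuming the standard AWS credential variables and a simple prefix match for the model family; the SAP branch stands in for "proper SAP vars":

```ts
// Sketch only: decide whether LLM_API_KEY must be set for the chosen provider/model.
function isBedrockFamily(model: string): boolean {
  return /^(bedrock|anthropic|meta|amazon)[./-]/.test(model);
}

function hasAwsCredentials(env: NodeJS.ProcessEnv = process.env): boolean {
  return Boolean(env.AWS_ACCESS_KEY_ID && env.AWS_SECRET_ACCESS_KEY);
}

function apiKeyRequired(provider: string, model: string, env: NodeJS.ProcessEnv = process.env): boolean {
  if (provider === "sap-ai-sdk") return false;                        // SAP AI Core auth comes from its own env vars
  if (hasAwsCredentials(env) && isBedrockFamily(model)) return false; // IAM credentials are enough for Bedrock
  return true;
}
```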

GitHub Enterprise Server (GHES) preserved
  • Inputs: github_api_url, github_server_url
  • Pass baseUrl to Octokit via initOctokit(config.githubToken, config.githubApiUrl) (see the sketch below)
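
A minimal sketch of what initOctokit might boil down to; the real code may wrap @octokit/action with throttling, but the relevant part is forwarding the GHES API URL as baseUrl:

```ts
import { Octokit } from "@octokit/rest";

// Sketch only: point Octokit at a GHES API endpoint when one is configured,
// otherwise fall back to github.com.
function initOctokit(token: string, apiUrl?: string): Octokit {
  return new Octokit({
    auth: token,
    baseUrl: apiUrl ?? "https://api.github.com",
    // A GHES instance would pass e.g. "https://ghe.example.com/api/v3".
  });
}
```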

CLI
  • Add --full (force full review) and --out (save output to file) flags (see the usage sketch below)
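
A minimal sketch of the flag handling using Node's built-in parseArgs; runReview is a stand-in for the CLI's real entry point, not the PR's actual function:

```ts
import { parseArgs } from "node:util";
import { writeFileSync } from "node:fs";

// Placeholder for the actual review pipeline.
async function runReview(opts: { full: boolean }): Promise<string> {
  return `review output (full=${opts.full})`;
}

async function main(): Promise<void> {
  const { values } = parseArgs({
    options: {
      full: { type: "boolean", default: false }, // force a full review instead of an incremental one
      out: { type: "string" },                   // optional path to write the review output to
    },
  });

  const output = await runReview({ full: values.full ?? false });
  if (values.out) writeFileSync(values.out, output);
  else console.log(output);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Usage would look something like: node dist/cli.js --full --out review.md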

Comments improvements
  • Truncate code blocks in comments using config.maxCodeblockLines with the marker "… (truncated; more lines omitted) …" (see the truncation sketch below)
  • Support multiline threads (start_line) in generateCommentThreads
  • Paginate listReviewComments to fetch all pages
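
A minimal sketch of the truncation pass; the fence detection and marker placement are simplified relative to the PR's comments.ts:

```ts
// Sketch only: cap each fenced code block in a comment body at maxCodeblockLines,
// emitting the truncation marker once per oversized block.
function truncateCodeBlocks(body: string, maxCodeblockLines: number): string {
  const out: string[] = [];
  let inBlock = false;
  let blockLines = 0;

  for (const line of body.split("\n")) {
    const isFence = /^\s*(`{3,}|~{3,})/.test(line);
    if (isFence) {
      inBlock = !inBlock;
      blockLines = 0;
      out.push(line);
      continue;
    }
    if (inBlock) {
      blockLines += 1;
      if (blockLines === maxCodeblockLines + 1) {
        out.push("… (truncated; more lines omitted) …");
      }
      if (blockLines > maxCodeblockLines) continue;
    }
    out.push(line);
  }
  return out.join("\n");
}
```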

Providers
  • AI SDK default temperature now 1 (when undefined)
Action + Config
  • action.yml: added inputs for new config (custom_mode, llm_provider, llm_model, review_scopes, allow_title_update, max_comments, max_codeblock_lines, max_review_chars); see the config sketch below
  • branding: restored tags per request (ai, code-review, ai-assisted, automation, pull-requests, linting, ci, autofix, devtools)
  • Entry point remains dist/index.js
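
A minimal sketch of how the new action.yml inputs could map onto the config object using @actions/core; the fallback values shown here are assumptions, not the action's documented defaults:

```ts
import * as core from "@actions/core";

// Sketch only: read the inputs listed above into a config object.
const config = {
  customMode: core.getInput("custom_mode") || "auto",            // on | off | auto
  llmProvider: core.getInput("llm_provider") || "ai-sdk",
  llmModel: core.getInput("llm_model"),
  reviewScopes: core.getInput("review_scopes")
    .split(",")
    .map((s) => s.trim())
    .filter(Boolean),
  allowTitleUpdate: core.getInput("allow_title_update") !== "false",
  maxComments: Number(core.getInput("max_comments") || 0),
  maxCodeblockLines: Number(core.getInput("max_codeblock_lines") || 60),
  maxReviewChars: Number(core.getInput("max_review_chars") || 725_000),
};
```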

Build
  • Revert build script to two explicit esbuild outputs (dist/index.js, dist/cli.js)

Tests and Mocks
  • All tests green: 56 passed
  • Updated @octokit/action mock to expose Octokit.__lastOptions for throttle tests
  • Avoid custom prompt runner during tests to align with mocks

Removals
  • Removed push.ts and its tests per requirement

Misc
  • messages.ts: overview accepts an optional documentation section

@github-actions

github-actions bot commented Nov 5, 2025

Analyzing changes in this PR...

This might take a few minutes, please wait

📥 Commits

Analyzing changes from base (c860450) to latest commit (99763bc):

  • 99763bc: Refactor dotenv import and usage in CLI

Replaces legacy dotenv import and usage in dist/cli.js and related files with a new import style and updated variable names. This change improves consistency and maintainability by aligning with the latest build output and source conventions.

  • 9498278: Add support for custom LLM_BASE_URL for ai-sdk provider

Introduces the optional LLM_BASE_URL environment variable for OpenAI-compatible providers when using LLM_PROVIDER=ai-sdk. Updates documentation and configuration to clarify usage, enabling support for providers like OpenRouter, Anyscale, and Together AI. No changes for sap-ai-sdk provider.

  • 2c31a9b: Enable PR title update by default and remove max_comments

The action now allows PR title updates by default when @presubmit is present, updating the default in both action.yml and config.ts. The unused max_comments input and related logic have been removed for simplification.

  • 0a76989: Update README.md
  • 7017eaa: Update README.md
  • 7afa451: bump github
  • 6fa22d6: Merge origin/main into feature-custom-prompts (keep our dist builds and @actions/github version)
  • c90cb90: anolther fix

type safety + schema validation...


📁 Files being considered (23)

🔄 README.md (2 hunks)
🔄 action.yml (1 hunk)
🔄 dist/cli.js (0 hunks)
🔄 dist/index.js (0 hunks)
🔄 package-lock.json (8 hunks)
🔄 package.json (1 hunk)
🔄 src/mocks/@octokit/action.ts (2 hunks)
➕ src/tests/ai.test.ts (1 hunk)
➕ src/tests/comments.test.ts (1 hunk)
➕ src/tests/context.test.ts (1 hunk)
➕ src/tests/main.test.ts (1 hunk)
➕ src/tests/octokit.throttle.test.ts (1 hunk)
➕ src/tests/prompts.core.test.ts (8 hunks)
➕ src/tests/providers.ai-sdk.test.ts (1 hunk)
➕ src/tests/providers.sapaicore.test.ts (1 hunk)
🔄 src/cli.ts (4 hunks)
🔄 src/comments.ts (4 hunks)
🔄 src/config.ts (5 hunks)
🔄 src/messages.ts (1 hunk)
➕ src/prompts.custom.ts (1 hunk)
🔄 src/prompts.ts (3 hunks)
🔄 src/providers/ai-sdk.ts (1 hunk)
🔄 src/pull_request.ts (4 hunks)


autogenerated by presubmit.ai

@bstanga
Contributor

bstanga commented Nov 8, 2025

Great PR! Do you have an example of a review with your changes applied?

Contributor

@bstanga bstanga left a comment


You need to rebase

src/comments.ts (Outdated)

    if (isFence(line)) {
      if (inBlock) {
        // closing fence
        if (emittedTrunc) {
Contributor


delete this if statement, it's a no-op


@github-actions github-actions bot left a comment


LGTM!

Review Summary

Commits Considered (1)
  • type safety + schema validation...
Files Processed (2)
  • dist/cli.js (0 hunks)
  • dist/index.js (0 hunks)
Actionable Comments (0)
Skipped Comments (0)

@github-actions github-actions bot left a comment


LGTM!

Review Summary

Commits Considered (1)
  • type safety + schema validation...
Files Processed (3)
  • dist/cli.js (0 hunks)
  • dist/index.js (0 hunks)
  • src/prompts.ts (1 hunk)
Actionable Comments (0)
Skipped Comments (4)
  • src/prompts.ts [311-315]

    best practice: "Type annotation missing for raw variable"

  • src/prompts.ts [320-322]

    possible issue: "Hardcoded parameter name may not match all providers"

  • src/prompts.ts [325-327]

    maintainability: "Error message could be more informative"

  • src/prompts.ts [329-339]

    maintainability: "Default values may mask provider issues"


@github-actions github-actions bot left a comment


LGTM!

Review Summary

Commits Considered (1)
Files Processed (3)
  • dist/cli.js (0 hunks)
  • dist/index.js (0 hunks)
  • package.json (1 hunk)
Actionable Comments (0)
Skipped Comments (0)

@m1rl0k
Contributor Author

m1rl0k commented Nov 8, 2025

Okay, again, maybe confirm... sorry for the issues


@github-actions github-actions bot left a comment


LGTM!

Review Summary

Commits Considered (1)
Files Processed (1)
  • README.md (2 hunks)
Actionable Comments (0)
Skipped Comments (0)


@github-actions github-actions bot left a comment


LGTM!

Review Summary

Commits Considered (1)
Files Processed (1)
  • README.md (2 hunks)
Actionable Comments (0)
Skipped Comments (0)

@m1rl0k m1rl0k requested a review from bstanga November 26, 2025 14:30
@bstanga
Contributor

bstanga commented Nov 30, 2025

Sorry for the delay, I've been on an extended leave. Will review this ASAP next week.

@m1rl0k
Contributor Author

m1rl0k commented Dec 21, 2025

@bstanga fixed both issues and merged your changes in from main

@m1rl0k m1rl0k requested a review from bstanga December 22, 2025 00:57
@m1rl0k m1rl0k closed this Jan 16, 2026
