[None][infra] PLC nightly source code scanning #12124
yuanjingx87 wants to merge 6 commits into NVIDIA:main
Conversation
Signed-off-by: Yuanjing Xue <197832395+yuanjingx87@users.noreply.github.com>
Force-pushed: 3c76749 to fcf9a3f
📝 Walkthrough

The changes migrate vulnerability and SBOM reporting from Slack-only notifications to an Elasticsearch-backed system. The Jenkins pipeline is updated to conditionally trigger based on job location, and the Python submission script is refactored to process and index vulnerability and SBOM data into Elasticsearch while maintaining Slack notifications for newly detected dependencies.
Sequence Diagram

sequenceDiagram
actor Jenkins
participant Pipeline as Jenkins Pipeline
participant Python as submit_vulnerability_report.py
participant ES as Elasticsearch
participant Slack as Slack API
Jenkins->>Pipeline: Trigger with build metadata
Pipeline->>Python: Execute with env vars<br/>(ES_POST_URL, ES_QUERY_URL, etc.)
Python->>Python: Load vulnerability JSON
Python->>ES: get_last_scan_results(vulnerability)<br/>Query prior scan data
ES-->>Python: Return last scan documents
Python->>Python: Identify new dependencies<br/>vs. prior scan
Python->>Python: Build bulk_documents for ES
Python->>ES: es_post(ES_POST_URL, documents)
ES-->>Python: Indexing result (success/error)
Python->>Python: Load SBOM JSON
Python->>ES: get_last_scan_results(sbom)<br/>Query prior SBOM data
ES-->>Python: Return last SBOM documents
Python->>Python: Identify new GPL/LGPL licenses<br/>vs. prior scan
Python->>Python: Build license bulk_documents
Python->>ES: es_post(ES_POST_URL, documents)
ES-->>Python: Indexing result (success/error)
alt New dependencies detected
Python->>Slack: post_slack_msg()<br/>Send summary + Kibana link
Slack-->>Python: Notification sent
end
Python-->>Pipeline: Report completion
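The "Build bulk_documents for ES" and "es_post" steps in the diagram can be sketched as below. This is a hedged illustration only: the index name, field names, and `build_bulk_payload` helper are assumptions for the sketch, not the actual code of submit_vulnerability_report.py.

```python
import json

def build_bulk_payload(index_name, documents):
    """Build an NDJSON body for Elasticsearch's _bulk API: one action
    line followed by one source line per document, plus the trailing
    newline that _bulk requires."""
    lines = []
    for doc in documents:
        # Action line: tells ES which index receives the next source line.
        lines.append(json.dumps({"index": {"_index": index_name}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

# Illustrative document shape (assumed, not the script's actual schema).
payload = build_bulk_payload(
    "plc-vulnerability-scan",
    [{"s_package_name": "libfoo", "s_cve": "CVE-2024-0001"}],
)
```

The resulting payload would then be POSTed to the `_bulk` endpoint (the `es_post(ES_POST_URL, documents)` call in the diagram) with a `Content-Type: application/x-ndjson` header.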
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

🚥 Pre-merge checks: ✅ 1 passed | ❌ 2 failed (2 warnings)
Force-pushed: fcf9a3f to 093cf68
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@jenkins/scripts/submit_vulnerability_report.py`:
- Line 24: The KIBANA_DASHBOARD_URL environment variable is read into
KIBANA_DASHBOARD_URL but not used when composing the Slack dashboard link;
update the code that builds the Slack message (the link construction logic
around lines that reference the dashboard ID, e.g. the function or block that
concatenates the Kibana URL into the Slack text) to use KIBANA_DASHBOARD_URL as
the base URL (with a sensible fallback to the current hardcoded/base constant if
empty) so the Slack link is built from KIBANA_DASHBOARD_URL (and still appends
the dashboard ID/path as before).
- Around line 194-199: The loop that increments count_new_vulnerability over
bulk_documents counts duplicate package names multiple times; change the logic
to first collect unique dependency names from bulk_documents (use the
s_package_name key) and then count only those not present in
map_dependencies_last_report, updating
NEWLY_REPORTED_DEPENDENCIES["source_code_vulnerability"] with that unique count;
apply the same deduplication approach to the analogous logic referenced around
the other block (rows handling the other report) so each newly introduced
package is reported only once.
- Around line 1-10: Add the repository-mandated NVIDIA Apache 2.0
copyright/license header to the top of submit_vulnerability_report.py (before
any imports); include the full NVIDIA Apache-2.0 header block or SPDX identifier
plus the NVIDIA copyright line with the correct year of latest modification,
matching the project's header style and formatting convention.
- Around line 173-174: The CVE and BDSA fields are swapped: change the
assignments so s_cve uses the "CVE ID" value and s_bdsa uses the "Related Vuln"
value by updating the two lines that call safe(v.get(...)); specifically replace
the current s_cve = safe(v.get("Related Vuln")) and s_bdsa = safe(v.get("CVE
ID")) with s_cve = safe(v.get("CVE ID")) and s_bdsa = safe(v.get("Related
Vuln")) so Elasticsearch and downstream dashboards receive the correct
identifiers.
In `@jenkins/TensorRT_LLM_PLC.groovy`:
- Around line 225-228: The sh invocation that runs venv/bin/python
./jenkins/scripts/submit_vulnerability_report.py interpolates pipeline params
directly (e.g. ${params.branchName}), which can break or inject into the shell;
change the command to quote these CLI arguments (e.g. --build-url
"${pipelineUrl}", --build-number "${env.BUILD_NUMBER}", --branch
"${params.branchName}") when constructing the sh script so the values are passed
as single arguments to submit_vulnerability_report.py; update the sh block that
runs venv/bin/python ./jenkins/scripts/submit_vulnerability_report.py
accordingly.
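The deduplication finding above (duplicate package names inflating `count_new_vulnerability`) can be sketched as follows. The data shapes — `bulk_documents` rows keyed by `s_package_name` and a name-keyed `map_dependencies_last_report` — are assumptions taken from the review comment's wording, not from the actual script:

```python
def count_new_dependencies(bulk_documents, map_dependencies_last_report):
    """Count each newly introduced package exactly once, even when it
    contributes several vulnerability rows to the current scan."""
    # Collect unique dependency names first, then subtract what the
    # prior scan already reported.
    current_packages = {doc["s_package_name"] for doc in bulk_documents}
    return len(current_packages - set(map_dependencies_last_report))

# Illustrative data: libfoo appears twice (two CVEs, one package).
bulk_documents = [
    {"s_package_name": "libfoo", "s_cve": "CVE-2024-0001"},
    {"s_package_name": "libfoo", "s_cve": "CVE-2024-0002"},
    {"s_package_name": "libbar", "s_cve": "CVE-2024-0003"},
]
last_report = {"libbar": {"s_package_name": "libbar"}}  # already known

new_count = count_new_dependencies(bulk_documents, last_report)
```

A naive per-row loop would report libfoo twice here; the set difference reports it once, which is the behavior the comment asks for in both the vulnerability and license blocks.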
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 54514073-c084-4ffc-8c65-798ec4a8ab24
📒 Files selected for processing (2)
- jenkins/TensorRT_LLM_PLC.groovy
- jenkins/scripts/submit_vulnerability_report.py
# Required: TRTLLM_PLC_WEBHOOK — Slack incoming webhook URL
# Required: TRTLLM_KIBANA_DASHBOARD — Kibana dashboard URL for this report
SLACK_WEBHOOK_URL = os.environ.get("TRTLLM_PLC_WEBHOOK")
KIBANA_DASHBOARD_URL = os.environ.get("TRTLLM_KIBANA_DASHBOARD")
Actually use KIBANA_DASHBOARD_URL when building the Slack link.
KIBANA_DASHBOARD_URL is read from the environment but ignored, so this new configuration knob is ineffective and the Slack link will drift the next time the dashboard ID changes.
🐛 Proposed fix
- base = (
- "https://gpuwa.nvidia.com/kibana/s/tensorrt/app/dashboards"
- "#/view/f90d586c-553a-468e-b064-48e846e983a2"
- )
+ base = (
+ KIBANA_DASHBOARD_URL
+ or "https://gpuwa.nvidia.com/kibana/s/tensorrt/app/dashboards#/view/"
+ "f90d586c-553a-468e-b064-48e846e983a2"
+ )

Also applies to: 250-262
/bot skip --comment "no CI is needed"

PR_Github #38638 [ skip ] triggered by Bot. Commit:

PR_Github #38638 [ skip ] completed with state
Description
Posts scanning results to NVDF and compares them with the last run's results, notifying the plc channel when a new dependency issue is found.
Test Coverage
PR Checklist
Please review the following before submitting your PR:
PR description clearly explains what and why. If using CodeRabbit's summary, please make sure it makes sense.
PR Follows TRT-LLM CODING GUIDELINES to the best of your knowledge.
Test cases are provided for new code paths (see test instructions)
Any new dependencies have been scanned for license and vulnerabilities
CODEOWNERS updated if ownership changes
Documentation updated as needed
Update tava architecture diagram if there is a significant design change in this PR.
The reviewers assigned automatically/manually are appropriate for the PR.
Please check this after reviewing the above items as appropriate for this PR.
GitHub Bot Help
To see a list of available CI bot commands, please comment
/bot help.