fix: use json format directly if available and add information on debugging #1289

dbasunag merged 4 commits into opendatahub-io:main from …
Conversation
…ugging Signed-off-by: Debarati Basu-Nag <dbasunag@redhat.com> Co-Authored-By: Claude <noreply@anthropic.com>
The following are automatically added/executed:

Available user actions:

Supported labels: `/wip`, `/hold`, `/build-push-pr-image`, `/lgtm`, `/verified`, `/cherry-pick`
📝 Walkthrough

Adds a “Debugging test failures” doc section and a pytest …

Changes

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes

Security findings (actionable)

🚥 Pre-merge checks: ❌ 2 failed checks (2 warnings)
Actionable comments posted: 1
🧹 Nitpick comments (1)
utilities/opendatahub_logger.py (1)
Lines 117-124: `lineno` serialized as string instead of integer. Line 123 converts `lineno` to `str`, but JSON logging conventions typically preserve numeric types. This may complicate downstream log aggregation/filtering that expects an integer.

Proposed fix:

```diff
     "event": msg,
     "filename": record.pathname.split("/")[-1] if record.pathname else "",
-    "lineno": str(record.lineno),
+    "lineno": record.lineno,
 })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@utilities/opendatahub_logger.py` around lines 117 - 124, The JSON formatter is converting record.lineno to a string; change the serialization to keep it as an integer by removing the str() wrapper so "lineno" is assigned record.lineno (an int) in the dict returned by the formatter (refer to the JSON dict construction using record.lineno and the surrounding keys like "timestamp", "logger", and "level" to locate the code).
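As a concrete illustration of the fix, here is a minimal sketch of a JSON formatter that emits `lineno` as a JSON number. The class name `JSONLineFormatter` and the exact field set are assumptions for illustration, not the project's actual formatter:

```python
import json
import logging

class JSONLineFormatter(logging.Formatter):
    """Minimal sketch: serialize log records as JSON, keeping lineno numeric."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "timestamp": self.formatTime(record),
            "logger": record.name,
            "level": record.levelname,
            "event": record.getMessage(),
            "filename": record.pathname.split("/")[-1] if record.pathname else "",
            # No str() wrapper: downstream aggregators can filter on the number.
            "lineno": record.lineno,
        })
```

With `"lineno": str(record.lineno)` a backend query such as `lineno > 100` would need a cast; keeping the int avoids that.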
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@utilities/opendatahub_logger.py`:
- Line 116: The except clause in utilities/opendatahub_logger.py uses Python 2
syntax ("except json.JSONDecodeError, TypeError:") which is a SyntaxError in
Python 3; update the exception handling in that try/except to use a
parenthesized exception tuple (or separate except blocks) so it reads as
catching both json.JSONDecodeError and TypeError, ensuring the module imports
under Python 3.
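The change this prompt describes is a one-token fix. A hedged, self-contained sketch (the helper name `parse_maybe_json` is invented for illustration; the real module wraps this logic differently):

```python
import json

def parse_maybe_json(text):
    """Sketch of the fix: a parenthesized tuple catches both exception types
    on every Python 3 version, whereas the bare `except A, B:` form is a
    SyntaxError on interpreters without the PEP 758 grammar change."""
    try:
        return json.loads(text)
    except (json.JSONDecodeError, TypeError):
        # Not valid JSON (or not a string at all): signal fallback handling.
        return None
```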
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository YAML (base), Central YAML (inherited), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
Run ID: 586b1f72-6203-42d8-a058-060d5263c2d4
📒 Files selected for processing (2)
- docs/GETTING_STARTED.md
- utilities/opendatahub_logger.py
Signed-off-by: fege <fmosca@redhat.com>
Actionable comments posted: 2
♻️ Duplicate comments (1)
utilities/opendatahub_logger.py (1)
Line 116: ⚠️ Potential issue | 🟠 Major

Line 116 blocks the current lint toolchain. This Python 3.14-only `except A, B` form may be valid for the runtime, but Flake8 7.3.0 is still reporting E999 on it in this PR. Either keep the parenthesized tuple here or upgrade the parser/linter stack before merging. Expected result: the repo config will show whether the branch is intentionally on Python 3.14 while the lint toolchain still targets an older parser.
```bash
#!/bin/bash
set -euo pipefail
while IFS= read -r f; do
  printf '\n== %s ==\n' "$f"
  rg -n "requires-python|python_requires|flake8|ruff|external" "$f" || true
done < <(fd '^(pyproject\.toml|setup\.cfg|setup\.py|tox\.ini|\.pre-commit-config\.yaml)$')
printf '\n== Affected syntax ==\n'
sed -n '113,117p' utilities/opendatahub_logger.py
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@utilities/opendatahub_logger.py` at line 116, Change the invalid Python 3.14-only except syntax to a tuple form so the linter accepts it: replace the line using "except json.JSONDecodeError, TypeError:" with "except (json.JSONDecodeError, TypeError):" (keeping the existing exception handling body intact) so the block catching json.JSONDecodeError and TypeError is compatible with current Flake8/older parsers.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@utilities/logger.py`:
- Around line 24-30: The setup_logging function currently uses a predictable
default log_file "/tmp/pytest-tests.log" which is insecure; change setup_logging
to avoid a fixed path by either requiring callers to pass an explicit log_file
or creating a secure temp file inside the function using
tempfile.mkstemp()/NamedTemporaryFile and using that file descriptor/path for
logging; update the signature of setup_logging (remove the unsafe default or set
log_file to None) and ensure the rest of the function (QueueListener setup and
any FileHandler creation) uses the securely-created path or raises a clear error
if None.
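A sketch of the suggested hardening. The function shape is assumed; the real `setup_logging` in `utilities/logger.py` also wires up QueueListener/FileHandler plumbing that is omitted here:

```python
import logging
import os
import tempfile

def setup_logging(log_file=None):
    """Sketch: avoid a predictable default path like /tmp/pytest-tests.log.

    If no explicit path is given, create the file with tempfile.mkstemp,
    which uses an unpredictable name and 0o600 permissions.
    """
    if log_file is None:
        fd, log_file = tempfile.mkstemp(prefix="pytest-tests-", suffix=".log")
        os.close(fd)  # FileHandler below opens its own descriptor
    handler = logging.FileHandler(log_file)
    logging.getLogger().addHandler(handler)
    return log_file
```

The alternative the comment offers is stricter: drop the default entirely and raise a clear error when `log_file` is None, forcing callers to choose a path explicitly.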
In `@utilities/opendatahub_logger.py`:
- Around line 111-124: ThirdPartyJSONFormatter.format currently returns any
syntactically valid JSON string unchanged and serializes only
record.getMessage(), which allows untrusted payloads to override provenance
fields and drops traceback/stack info; update ThirdPartyJSONFormatter.format
(and the other formatter override) to always build a JSON object that merges the
original message payload under a safe key (e.g. "event" or "message_payload")
instead of replacing provenance fields, and include record.exc_info (formatted
via logging.Formatter.formatException) and record.stack_info when present plus
canonical fields timestamp, logger, level, filename, lineno; ensure you parse
record.getMessage() only to include its content rather than using it as the full
output so exc_info/stack_info and provenance are preserved.
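Putting those requirements together, a hedged sketch of the hardened formatter. The class name, envelope keys, and `message_payload` field are assumptions; the project's actual `ThirdPartyJSONFormatter` may differ:

```python
import json
import logging

class ThirdPartyJSONSketch(logging.Formatter):
    """Sketch: always emit our own JSON envelope, nesting any payload under
    "message_payload" so untrusted messages cannot override provenance fields,
    and preserve exc_info/stack_info when present."""

    def format(self, record: logging.LogRecord) -> str:
        msg = record.getMessage()
        try:
            payload = json.loads(msg)  # structured payload, kept nested
        except (json.JSONDecodeError, TypeError):
            payload = msg  # not JSON: keep the raw string
        out = {
            "timestamp": self.formatTime(record),
            "logger": record.name,
            "level": record.levelname,
            "filename": record.pathname.split("/")[-1] if record.pathname else "",
            "lineno": record.lineno,
            "message_payload": payload,
        }
        if record.exc_info:
            out["exc_info"] = self.formatException(record.exc_info)
        if record.stack_info:
            out["stack_info"] = record.stack_info
        return json.dumps(out)
```

A message like `{"level": "DEBUG"}` now lands under `message_payload` instead of replacing the record's real `level`.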
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository YAML (base), Central YAML (inherited), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
Run ID: d07ea629-ae49-4e3f-96f6-9795a3a5a3a7
📒 Files selected for processing (4)
- conftest.py
- docs/GETTING_STARTED.md
- utilities/logger.py
- utilities/opendatahub_logger.py
✅ Files skipped from review due to trivial changes (1)
- docs/GETTING_STARTED.md
Status of building tag latest: success.
…
Pull Request
Summary
Related Issues
How it has been tested
Additional Requirements
Summary by CodeRabbit