feat: Add S3 remote provider configs to LLS midstream container #138
mergify[bot] merged 1 commit into main
Walkthrough: Adds optional S3 support for the files API.

Sequence Diagram

```mermaid
sequenceDiagram
    participant Env as Runtime Env
    participant Runner as distribution/run.yaml
    participant FilesSvc as Files API
    participant S3 as remote::s3
    participant DB as sql_files (files_metadata)
    Env->>Runner: READ ENABLE_S3
    alt ENABLE_S3 set
        Runner->>FilesSvc: register provider_id "s3" (remote::s3) with config
        FilesSvc->>S3: authenticate (aws_access_key_id / aws_secret_access_key) and set region/endpoint
        FilesSvc->>S3: ensure bucket (auto_create_bucket?) and perform object operations
        S3-->>FilesSvc: object store/retrieve responses
        FilesSvc->>DB: persist metadata to sql_files.table_name=files_metadata
    else ENABLE_S3 not set
        Runner->>FilesSvc: do not configure remote::s3
        FilesSvc-->>FilesSvc: use existing/local file provider and metadata store
    end
```
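Assembled from the env vars and diff hunks discussed in this PR, the gated provider entry in `distribution/run.yaml` plausibly looks like the sketch below. The exact layout, indentation, and the `bucket_name`/`region` keys are assumptions, not verbatim content from the change:

```yaml
# Hypothetical sketch of the remote::s3 files provider entry.
# bucket_name and region keys are assumed; other fields mirror the
# env vars visible in the reviewed hunks.
files:
  - provider_id: s3
    provider_type: remote::s3
    config:
      aws_access_key_id: ${env.AWS_ACCESS_KEY_ID:=}
      aws_secret_access_key: ${env.AWS_SECRET_ACCESS_KEY:=}
      aws_session_token: ${env.AWS_SESSION_TOKEN:=}
      endpoint_url: ${env.S3_ENDPOINT_URL:=}
      auto_create_bucket: ${env.S3_AUTO_CREATE_BUCKET:=false}
      metadata_store:
        table_name: s3_files_metadata
        backend: sql_default
```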
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Actionable comments posted: 1
📜 Review details: CodeRabbit UI · Review profile: CHILL · Plan: Pro

📒 Files selected for processing (2)
- distribution/README.md (1 hunk)
- distribution/run.yaml (1 hunk)

⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: build-test-push (linux/amd64)
- GitHub Check: Summary
🔇 Additional comments (1)
distribution/README.md (1)
20-20: Verify the README reflects generated output from the script. Line 1 indicates this file is auto-generated by scripts/gen_distro_doc.py. Confirm that the S3 entry at line 20 resulted from running the generation script rather than manual editing, ensuring consistency with distribution/run.yaml.
@nathan-weinberg ptal
@skamenan7 PTAL
@nathan-weinberg @derekhiggins ptal, rebased and updated 👍🏽
This MR adds S3 remote provider configs to LLS midstream container.

Signed-off-by: Mustafa Elbehery <melbeher@redhat.com>
Closes https://issues.redhat.com/browse/RHAIENG-2131
@derekhiggins @skamenan7 @leseb ptal 👍🏽
distribution/run.yaml (Outdated)

```yaml
        auto_create_bucket: ${env.S3_AUTO_CREATE_BUCKET:=false}
      metadata_store:
        table_name: s3_files_metadata
        backend: sql_default
```
`sql_default` is missing its backend reference. Should a backend with `type: sqlite` be defined?
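If the reviewer's suggestion is taken, one possible shape for defining the referenced backend is a sqlite-backed SQL store. The key names and env var below are illustrative assumptions, not confirmed by the PR:

```yaml
# Hypothetical definition for the sql_default backend that metadata_store
# references; the schema and SQLITE_STORE_DIR variable are assumed.
storage:
  backends:
    sql_default:
      type: sqlite
      db_path: ${env.SQLITE_STORE_DIR:=~/.llama/distributions/starter}/sql_store.db
```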
```yaml
      aws_secret_access_key: ${env.AWS_SECRET_ACCESS_KEY:=}
      aws_session_token: ${env.AWS_SESSION_TOKEN:=}
      endpoint_url: ${env.S3_ENDPOINT_URL:=}
      auto_create_bucket: ${env.S3_AUTO_CREATE_BUCKET:=false}
```
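The `${env.VAR:=default}` placeholders above substitute an environment variable at startup, falling back to the default when the variable is unset or empty. A rough Python sketch of that substitution rule follows; the actual run.yaml parser may behave differently:

```python
import os

def expand(var: str, default: str = "") -> str:
    """Mimic ${env.VAR:=default}: use the env value if set and
    non-empty, otherwise fall back to the given default."""
    value = os.environ.get(var, "")
    return value if value else default

# Unset variable falls back to the default.
os.environ.pop("S3_AUTO_CREATE_BUCKET", None)
print(expand("S3_AUTO_CREATE_BUCKET", "false"))  # prints "false"

# A set variable wins over the default.
os.environ["S3_AUTO_CREATE_BUCKET"] = "true"
print(expand("S3_AUTO_CREATE_BUCKET", "false"))  # prints "true"
```

This also explains why `auto_create_bucket` ends up `false` in a container started without any S3-related environment configured.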
No connect_timeout and read_timeout are set. Please check the bedrock provider for example timeouts.
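Following that suggestion, the config could expose the timeouts via env vars as sketched here. The variable names and defaults are assumptions modeled on the reviewer's bedrock pointer, not values from the PR:

```yaml
      # Hypothetical timeout settings; names and defaults are illustrative.
      connect_timeout: ${env.S3_CONNECT_TIMEOUT:=60}
      read_timeout: ${env.S3_READ_TIMEOUT:=60}
```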