@@ -182,14 +182,18 @@ paths:
         - 'description': Extracted from 'openshift.io/description' annotation if present

   /api/v1/s3/file:
-    summary: Path used to get a file from S3.
+    summary: Path used to get or upload a file in S3.
     description: >-
-      The REST endpoint/path used to retrieve an arbitrary file from S3 storage.
-      Uses the credentials from a specified Kubernetes secret to access the S3 bucket.
-      Returns the file with transfer-encoding: chunked for efficient streaming.
+      The REST endpoint/path used to retrieve or upload files in S3 storage.
+      GET returns an arbitrary file with transfer-encoding: chunked for efficient streaming.
+      POST uploads a CSV file using multipart/form-data (part name `file`) and credentials
+      from a required Kubernetes secret; the stored object key may differ from the requested
+      `key` when a name collision is resolved (numeric suffix).
     get:
       tags:
         - S3Operation
+      security:
+        - Bearer: []
       parameters:
         - name: namespace
           in: query
@@ -227,6 +231,7 @@ paths:
           description: The S3 object key to retrieve
           schema:
             type: string
+            pattern: '^\S(.*\S)?$'
           example: documents/myfile.pdf
       responses:
         "200":
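The `pattern` constraint added here only rules out leading or trailing whitespace in the key (anything non-blank that starts and ends with a non-space character passes). A client can mirror the check before sending a request; this is a sketch with a hypothetical helper name, not part of the spec:

```python
import re

# Same pattern this diff adds to the `key` query parameter:
# first and last characters must be non-whitespace.
KEY_PATTERN = re.compile(r'^\S(.*\S)?$')

def is_valid_key(key: str) -> bool:
    """Client-side pre-check mirroring the server-side `pattern` constraint."""
    return bool(KEY_PATTERN.fullmatch(key))
```

Note that internal spaces are still allowed (`my dir/file.pdf` is a valid S3 key); only the edges are constrained.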
@@ -275,6 +280,106 @@ paths:
        **Explicit mode (override)** — when secretName is supplied, the specified Kubernetes
        secret must contain AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION,
        and AWS_S3_ENDPOINT using those exact field names.
+    post:
+      tags:
+        - S3Operation
+      security:
+        - Bearer: []
+      parameters:
+        - name: namespace
+          in: query
+          required: true
+          description: The Kubernetes namespace containing the secret
+          schema:
+            type: string
+            maxLength: 63
+            pattern: '^[a-z0-9]([-a-z0-9]*[a-z0-9])?$'
+          example: default
+        - name: secretName
+          in: query
+          required: true
+          description: >-
+            Name of the Kubernetes secret containing S3 credentials (required for POST).
+            The secret must provide AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION,
+            and AWS_S3_ENDPOINT using those exact field names. The bucket may come from this
+            secret (e.g. AWS_S3_BUCKET) when the bucket query parameter is omitted.
+          schema:
+            type: string
+            maxLength: 253
+            pattern: '^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$'
+          example: aws-secret-1
+        - name: bucket
+          in: query
+          required: false
+          description: >-
+            The S3 bucket name. When omitted, the bucket is resolved from the secret
+            (e.g. AWS_S3_BUCKET). Leading and trailing whitespace is trimmed.
+          schema:
+            type: string
+            pattern: '^\S(.*\S)?$'
+          example: my-bucket
+        - name: key
+          in: query
+          required: true
+          description: >-
+            Requested S3 object key for the upload. If an object already exists at this key,
+            the server may store the file under a non-colliding key (e.g. `file-1.csv`); see the
+            response body `key` field for the actual object key.
+          schema:
+            type: string
+            pattern: '^\S(.*\S)?$'
+          example: data/training.csv
+      requestBody:
+        required: true
+        content:
+          multipart/form-data:
+            schema:
+              type: object
+              required:
+                - file
+              properties:
+                file:
+                  type: string
+                  format: binary
+                  description: >-
+                    CSV file to upload. Accepted as text/csv, or as application/octet-stream /
+                    an empty Content-Type when the filename ends with .csv.
+      responses:
+        "201":
+          description: File uploaded successfully
+          content:
+            application/json:
+              schema:
+                $ref: "#/components/schemas/S3UploadSuccess"
+        "400":
+          $ref: "#/components/responses/BadRequest"
+        "401":
+          $ref: "#/components/responses/Unauthorized"
+        "403":
+          $ref: "#/components/responses/Forbidden"
+        "404":
+          $ref: "#/components/responses/NotFound"
+        "409":
+          $ref: "#/components/responses/Conflict"
+        "413":
+          $ref: "#/components/responses/PayloadTooLarge"
+        "500":
+          $ref: "#/components/responses/InternalServerError"
+      operationId: uploadS3CsvFile
+      summary: Upload CSV file to S3
+      description: >-
+        Uploads a CSV file to S3 using credentials from the specified Kubernetes secret.
+        The request must be multipart/form-data with a part named `file`.
+
+        Only CSV uploads are allowed: Content-Type `text/csv`, or `application/octet-stream`
+        (or an empty Content-Type) when the part filename ends with `.csv`. The body is
+        size-limited (declared Content-Length and total multipart size caps; the file part
+        may be at most 1 GiB); requests exceeding these limits are rejected with 413.
+
+        On success, returns JSON with `uploaded: true` and the resolved `key` (which may
+        differ from the requested key if a collision was avoided by probing existing keys).
+
+        Returns 409 if the object key chosen after collision resolution still conflicts at
+        upload time (e.g. a concurrent writer); the client should retry the upload.

   /api/v1/s3/file/schema:
     summary: Path used to get the schema (column names and types) of a CSV file from S3.
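The multipart request the new POST operation expects (a single part named `file`, sent as `text/csv`) can be assembled with the standard library alone. A minimal sketch; the helper name is hypothetical, and the request would additionally need the `namespace`, `secretName`, and `key` query parameters plus a `Bearer` token per the spec above:

```python
import io
import uuid

def build_csv_upload(filename: str, csv_bytes: bytes):
    """Build a multipart/form-data body with one part named `file`,
    matching the request shape of POST /api/v1/s3/file in this diff."""
    boundary = uuid.uuid4().hex
    body = io.BytesIO()
    # Opening boundary and the single `file` part with a CSV Content-Type.
    body.write(f'--{boundary}\r\n'.encode())
    body.write(
        f'Content-Disposition: form-data; name="file"; filename="{filename}"\r\n'.encode()
    )
    body.write(b'Content-Type: text/csv\r\n\r\n')
    body.write(csv_bytes)
    # Closing boundary terminates the multipart body.
    body.write(f'\r\n--{boundary}--\r\n'.encode())
    content_type = f'multipart/form-data; boundary={boundary}'
    return content_type, body.getvalue()
```

The returned content type and bytes would then go into e.g. a `urllib.request.Request` with an `Authorization: Bearer <token>` header and the query parameters appended to the URL.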
@@ -310,6 +415,8 @@ paths:
     get:
       tags:
         - S3Operation
+      security:
+        - Bearer: []
       parameters:
         - name: namespace
           in: query
@@ -347,6 +454,7 @@ paths:
           description: The S3 object key (CSV file) to retrieve schema from
           schema:
             type: string
+            pattern: '^\S(.*\S)?$'
           example: data/training.csv
       responses:
         "200":
@@ -781,6 +889,23 @@ components:
           description: Description from the 'openshift.io/description' annotation (if present)
           example: "S3 bucket for training data storage"

+    S3UploadSuccess:
+      description: Response body for a successful S3 file upload (POST /api/v1/s3/file)
+      required:
+        - uploaded
+        - key
+      type: object
+      properties:
+        uploaded:
+          type: boolean
+          example: true
+        key:
+          type: string
+          description: >-
+            S3 object key where the file was stored. May differ from the requested `key`
+            when a collision was resolved (e.g. `data/file-1.csv` if `data/file.csv` existed).
+          example: data/training.csv
+
     # Pipeline Run Schemas
     CreateAutoMLRunRequestBase:
       type: object
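The `S3UploadSuccess` schema above notes that the returned `key` may carry a numeric suffix when the requested key already existed. The probing the spec describes could look like this sketch (a hypothetical helper; the server's actual implementation is not part of this diff):

```python
def resolve_collision(requested_key: str, existing_keys: set) -> str:
    """Numeric-suffix collision resolution as described in this diff:
    if the requested key exists, probe stem-1.ext, stem-2.ext, ...
    until an unused key is found."""
    if requested_key not in existing_keys:
        return requested_key
    # Split the extension off the final path segment only, so dots in
    # directory-like prefixes are left alone.
    if "." in requested_key.rsplit("/", 1)[-1]:
        stem, ext = requested_key.rsplit(".", 1)
        ext = "." + ext
    else:
        stem, ext = requested_key, ""
    n = 1
    while f"{stem}-{n}{ext}" in existing_keys:
        n += 1
    return f"{stem}-{n}{ext}"
```

Clients should therefore always read the `key` field from the 201 response rather than assume the requested key was used.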
@@ -1277,6 +1402,23 @@ components:
                     type: string
                     example: "Invalid request parameters"

+    PayloadTooLarge:
+      description: Request entity too large (e.g. the declared Content-Length or the file part exceeds its limit)
+      content:
+        application/json:
+          schema:
+            type: object
+            properties:
+              error:
+                type: object
+                properties:
+                  code:
+                    type: string
+                    example: "413"
+                  message:
+                    type: string
+                    example: "request body exceeds maximum upload size (1 GiB plus an allowance for multipart framing)"
+
     Unauthorized:
       description: Unauthorized
       content:
@@ -1311,6 +1453,25 @@ components:
                     type: string
                     example: "Resource not found"

+    Conflict:
+      description: >-
+        Conflict (e.g. an S3 conditional upload failed because an object already exists at
+        the resolved key; the client may retry the request)
+      content:
+        application/json:
+          schema:
+            type: object
+            properties:
+              error:
+                type: object
+                properties:
+                  code:
+                    type: string
+                    example: "409"
+                  message:
+                    type: string
+                    example: "object key \"data/file.csv\" already exists in S3 (upload conflict); retry with a different key"
+
     InternalServerError:
       description: Internal Server Error
       content:
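The `Conflict` response added above tells the client to retry when a concurrent writer wins the conditional put. A minimal retry sketch; `do_upload` is a caller-supplied callable returning a status code and parsed body (a hypothetical interface for illustration, not defined by this spec):

```python
import time

def upload_with_retry(do_upload, max_attempts=3, backoff_s=0.0):
    """Retry an upload while the server answers 409, as the POST
    description in this diff recommends for upload conflicts."""
    for attempt in range(max_attempts):
        status, body = do_upload()
        if status != 409:
            return status, body
        # Optional exponential backoff between attempts (0 by default).
        time.sleep(backoff_s * (2 ** attempt))
    return status, body
```

A 409 on retry remains possible if writers keep colliding, so callers should still handle a final conflict after `max_attempts`.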