Description
@jiridanek
https://issues.redhat.com/browse/RHOAIENG-4574
yes, though I am speaking here more of running notebooks, i.e. anything related to S3 that happens before execution in the target environment (KFP, Airflow). Since there is that runtime config json, I'd like that mechanism to stay, just with a different way of getting the runtime config into Elyra in JupyterLab. Same as you do here with the runtimes json file.
But regardless of whether things are submitted to KFP or Airflow, I definitely do not want Elyra to depend on the config values cos_username and cos_password; they should at least be optional. Best for corporate environments is getting S3 info from a K8s Secret, be it with personal or non-personal bucket credentials.
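To make that concrete, here is a minimal sketch of wiring bucket credentials from a K8s Secret into a workbench container via `envFrom`. The Secret name, key names, and container name are illustrative assumptions, not taken from any ODH or Elyra manifest:

```yaml
# Hypothetical Secret holding bucket credentials; the key names follow the
# common AWS convention and are an assumption, not an Elyra requirement.
apiVersion: v1
kind: Secret
metadata:
  name: my-bucket-credentials
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: my-user
  AWS_SECRET_ACCESS_KEY: my-password
---
# Fragment of a workbench pod spec: the container receives the values as
# environment variables, so Elyra would not need cos_username/cos_password
# in a runtime config json.
spec:
  containers:
    - name: workbench
      envFrom:
        - secretRef:
            name: my-bucket-credentials
```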
https://issues.redhat.com/browse/RHOAIENG-133
I do not have access rights on that ticket.
https://issues.redhat.com/browse/RHOAIENG-4531
This is related to DSPA config and runtime behavior, not Jupyter. Seems like a good idea for your use cases outside running notebooks.
My point here is mainly that, for a running JupyterLab notebook on K8s, info like cos_username and cos_password should not come from a runtime json config file. People can specify the necessary env variables either via CI/CD or via the ODH Dashboard workbench env section, which saves them in and references them from a K8s Secret in the background.
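The resolution order described above could be sketched like this: a hypothetical helper (not Elyra's actual API) that prefers environment variables, populated from a K8s Secret or CI/CD, and only falls back to runtime config values. The env variable names follow the common AWS/ODH data-connection convention and are an assumption:

```python
import os

def resolve_cos_credentials(runtime_config=None):
    """Resolve object-storage settings, preferring environment variables
    (e.g. injected from a K8s Secret) over the runtime config json.

    Illustrative sketch only; key names are assumptions, not Elyra's schema.
    """
    runtime_config = runtime_config or {}
    creds = {
        "endpoint": os.getenv("AWS_S3_ENDPOINT") or runtime_config.get("cos_endpoint"),
        "username": os.getenv("AWS_ACCESS_KEY_ID") or runtime_config.get("cos_username"),
        "password": os.getenv("AWS_SECRET_ACCESS_KEY") or runtime_config.get("cos_password"),
        "bucket": os.getenv("AWS_S3_BUCKET") or runtime_config.get("cos_bucket"),
    }
    missing = [key for key, value in creds.items() if not value]
    if missing:
        raise ValueError(f"Missing object-storage settings: {missing}")
    return creds
```

With this shape, cos_username/cos_password in the runtime config become optional: a corporate deployment sets only the env variables, while a local setup can still rely on the config file.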
See my new ticket over at elyra. Caveat: I have not checked whether this has ever been addressed in odh-elyra, but from a conceptual perspective it looks like the issue is present there, too.
https://github.com/opendatahub-io/elyra/blob/main/elyra/util/cos.py#L68
Originally posted by @shalberd in opendatahub-io/kubeflow#513 (comment)