
Commit 7f983f1

New user doc feedback (#42)
* rhoai-7399 add new user feedback
* rhoai-7399 make format of notes consistent
* another minor update
1 parent 53da9bc commit 7f983f1

22 files changed: 70 additions (+70), 62 deletions (-62)

workshop/docs/modules/ROOT/pages/automating-workflows-with-pipelines.adoc

Lines changed: 9 additions & 8 deletions
@@ -19,7 +19,7 @@ image::pipelines/wb-pipeline-launcher.png[Pipeline buttons]
 +
 image::pipelines/wb-pipeline-editor-button.png[Pipeline Editor button, 100]
 +
-You've created a blank pipeline!
+You've created a blank pipeline.
 
 . Set the default runtime image for when you run your notebook or Python code.
 
@@ -35,7 +35,7 @@ image::pipelines/wb-pipeline-properties-tab.png[Pipeline Properties Tab]
 +
 image::pipelines/wb-pipeline-runtime-image.png[Pipeline Runtime Image0, 400]
 
-. Save the pipeline.
+. Select *File* -> *Save Python File*.
 
 == Add nodes to your pipeline
 
@@ -55,7 +55,7 @@ image::pipelines/wb-pipeline-connect-nodes.png[Connect Nodes, 400]
 
 Set node properties to specify the training file as a dependency.
 
-Note: If you don't set this file dependency, the file is not included in the node when it runs and the training job fails.
+NOTE: If you don't set this file dependency, the file is not included in the node when it runs and the training job fails.
 
 . Click the `1_experiment_train.ipynb` node.
 +
@@ -103,7 +103,7 @@ The secret is named `aws-connection-my-storage`.
 
 [NOTE]
 ====
-If you named your data connection something other than `My Storage`, you can obtain the secret name in the {productname-short} dashboard by hovering over the resource information icon *?* in the *Data Connections* tab.
+If you named your data connection something other than `My Storage`, you can obtain the secret name in the {productname-short} dashboard by hovering over the help (?) icon in the *Data Connections* tab.
 
 image::pipelines/dsp-dc-secret-name.png[My Storage Secret Name, 400]
 ====
@@ -136,16 +136,17 @@ image::pipelines/wb-pipeline-node-remove-env-var.png[Remove Env Var]
 
 .. Under *Kubernetes Secrets*, click *Add*.
 +
-image::pipelines/wb-pipeline-add-kube-secret.png[Add Kube Secret]
+image::pipelines/wb-pipeline-add-kube-secret.png[Add Kubernetes Secret]
 
 .. Enter the following values and then click *Add*.
-** *Environment Variable*: `AWS_ACCESS_KEY_ID`
++
+* *Environment Variable*: `AWS_ACCESS_KEY_ID`
 ** *Secret Name*: `aws-connection-my-storage`
 ** *Secret Key*: `AWS_ACCESS_KEY_ID`
 +
 image::pipelines/wb-pipeline-kube-secret-form.png[Secret Form, 400]
 
-.. Repeat Steps 2a and 2b for each set of these Kubernetes secrets:
+. Repeat Step 2 for each of the following Kubernetes secrets:
 
 * *Environment Variable*: `AWS_SECRET_ACCESS_KEY`
 ** *Secret Name*: `aws-connection-my-storage`
@@ -163,7 +164,7 @@ image::pipelines/wb-pipeline-kube-secret-form.png[Secret Form, 400]
 ** *Secret Name*: `aws-connection-my-storage`
 ** *Secret Key*: `AWS_S3_BUCKET`
 
-. *Save* and *Rename* the `.pipeline` file.
+. Select *File* -> *Save Python File As* to save and rename the pipeline. For example, rename it to `My Train Save.pipeline`.
 
 == Run the Pipeline
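
For orientation, here is a minimal sketch (not part of this commit) of how code in a pipeline node might consume the secret values configured above. The `boto3` package, the `AWS_S3_ENDPOINT` variable name, and the artifact path are assumptions for illustration; the other variable names match the Kubernetes secrets listed in the diff.

----
import os

import boto3

# Values injected into the node from the aws-connection-my-storage secret.
s3 = boto3.client(
    "s3",
    endpoint_url=os.environ["AWS_S3_ENDPOINT"],  # assumed variable name
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)

# Hypothetical artifact path: upload a trained model to the configured bucket.
s3.upload_file("models/fraud/model.onnx",
               os.environ["AWS_S3_BUCKET"],
               "models/fraud/model.onnx")
----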

workshop/docs/modules/ROOT/pages/conclusion.adoc

Lines changed: 1 addition & 3 deletions
@@ -5,9 +5,7 @@
 [.text-center.strong]
 == Conclusion
 
-Congratulations!
-
-In this {deliverable}, you learned how to incorporate data science and artificial intelligence (AI) and machine learning (ML) into an OpenShift development workflow.
+Congratulations. In this {deliverable}, you learned how to incorporate data science, artificial intelligence, and machine learning into an OpenShift development workflow.
 
 You used an example fraud detection model and completed the following tasks:
 
workshop/docs/modules/ROOT/pages/creating-a-workbench.adoc

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ A workbench is an instance of your development and experimentation environment.
 
 . Click the *Workbenches* tab, and then click the *Create workbench* button.
 +
-image::workbenches/ds-project-create-workbench.png[Create workbench button]
+image::workbenches/ds-project-create-workbench.png[Create workbench button, 300]
 
 . Fill out the name and description.
 +
workshop/docs/modules/ROOT/pages/creating-data-connections-to-storage.adoc

Lines changed: 21 additions & 16 deletions
@@ -1,7 +1,9 @@
 [id='creating-data-connections-to-storage']
 = Creating data connections to your own S3-compatible object storage
 
-NOTE: If you do not have your own s3-compatible storage, or if you want to use a disposable local Minio instance instead, skip this section and follow the steps in xref:running-a-script-to-install-storage.adoc[Running a script to install local object storage buckets and create data connections].
+If you have existing S3-compatible storage buckets that you want to use for this {deliverable}, you must create a data connection to one storage bucket for saving your data and models and, if you want to complete the pipelines section of this {deliverable}, create another data connection to a different storage bucket for saving pipeline artifacts.
+
+NOTE: If you do not have your own s3-compatible storage, or if you want to use a disposable local Minio instance instead, skip this section and follow the steps in xref:running-a-script-to-install-storage.adoc[Running a script to install local object storage buckets and create data connections]. The provided script automatically completes the following tasks for you: creates a Minio instance in your project, creates two storage buckets in that Minio instance, creates two data connections in your project, one for each bucket and both using the same credentials, and installs required network policies for service mesh functionality.
 
 .Prerequisite
 
@@ -15,45 +17,48 @@ To create data connections to your existing S3-compatible storage buckets, you n
 
 If you don't have this information, contact your storage administrator.
 
-.Procedures
-
-Create data connections to your two storage buckets.
+.Procedure
 
-*Create a data connection for saving your data and models*
+. Create a data connection for saving your data and models:
 
-. In the {productname-short} dashboard, navigate to the page for your data science project.
+.. In the {productname-short} dashboard, navigate to the page for your data science project.
 
-. Click the *Data connections* tab, and then click *Add data connection*.
+.. Click the *Data connections* tab, and then click *Add data connection*.
 +
 image::projects/ds-project-add-dc.png[Add data connection]
 
-. Fill out the *Add data connection* form and name your connection *My Storage*. This connection is for saving your personal work, including data and models.
+.. Fill out the *Add data connection* form and name your connection *My Storage*. This connection is for saving your personal work, including data and models.
++
+NOTE: Skip the *Connected workbench* item. You add data connections to a workbench in a later section.
 +
 image::projects/ds-project-my-storage-form.png[Add my storage form]
 
-. Click *Add data connection*.
-
-*Create a data connection for saving pipeline artifacts*
+.. Click *Add data connection*.
 
+. Create a data connection for saving pipeline artifacts:
++
 NOTE: If you do not intend to complete the pipelines section of the {deliverable}, you can skip this step.
 
-. Click *Add data connection*.
+.. Click *Add data connection*.
 
-. Fill out the form and name your connection *Pipeline Artifacts*.
+.. Fill out the form and name your connection *Pipeline Artifacts*.
++
+NOTE: Skip the *Connected workbench* item. You add data connections to a workbench in a later section.
 +
 image::projects/ds-project-pipeline-artifacts-form.png[Add pipeline artifacts form]
 
-. Click *Add data connection*.
+.. Click *Add data connection*.
 
 
 .Verification
+
 In the *Data connections* tab for the project, check to see that your data connections are listed.
 
 image::projects/ds-project-dc-list.png[List of project data connections]
 
 
 .Next steps
 
-* Configure a pipeline server as described in xref:enabling-data-science-pipelines.adoc[Enabling data science pipelines]
+If you want to complete the pipelines section of this {deliverable}, go to xref:enabling-data-science-pipelines.adoc[Enabling data science pipelines].
 
-* Create a workbench and select a notebook image as described in xref:creating-a-workbench.adoc[Creating a workbench]
+Otherwise, skip to xref:creating-a-workbench.adoc[Creating a workbench].
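
If you want to sanity-check an endpoint and credentials before filling out the *Add data connection* form, a minimal sketch follows (not part of this commit). It assumes Python with the `boto3` package; the endpoint, keys, and bucket name are placeholders.

----
import boto3

# Placeholder endpoint, credentials, and bucket name; substitute your own.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example.com",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# head_bucket succeeds only if the bucket exists and the credentials are valid.
s3.head_bucket(Bucket="my-storage-bucket")
print("Endpoint reachable and credentials accepted.")
----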

workshop/docs/modules/ROOT/pages/deploying-a-model-multi-model-server.adoc

Lines changed: 3 additions & 3 deletions
@@ -3,7 +3,7 @@
 
 {productname-short} multi-model servers can host several models at once. You create a new model server and deploy your model to it.
 
-.Prerequiste
+.Prerequisite
 
 * A user with `admin` privileges has enabled the multi-model serving platform on your OpenShift cluster.
 
@@ -13,7 +13,7 @@
 +
 image::model-serving/ds-project-model-list-add.png[Models]
 +
-*Note:* Depending on how model serving has been configured on your cluster, you might see only one model serving platform option.
+NOTE: Depending on how model serving has been configured on your cluster, you might see only one model serving platform option.
 
 . In the *Multi-model serving platform* tile, click *Add model server*.
 
@@ -43,7 +43,7 @@ image::model-serving/deploy-model-form-mm.png[Deploy model from for multi-model
 
 .Verification
 
-Wait for the model to deploy and for the *Status* to show a green checkmark.
+Notice the loading symbol under the *Status* section. It will change to a green checkmark when the deployment completes successfully.
 
 image::model-serving/ds-project-model-list-status-mm.png[Deployed model status]
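
Beyond watching the dashboard, the status can also be polled programmatically. A hedged sketch, assuming the deployed server exposes the KServe v2 REST protocol; the endpoint URL and model name are placeholders.

----
import requests

# Placeholder inference endpoint and model name.
url = "https://model-server.example.com/v2/models/fraud/ready"

resp = requests.get(url, timeout=30)
# HTTP 200 indicates the model reports itself ready to serve.
print(resp.status_code)
----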

workshop/docs/modules/ROOT/pages/deploying-a-model-single-model-server.adoc

Lines changed: 4 additions & 4 deletions
@@ -3,10 +3,10 @@
 
 {productname-short} single-model servers host only one model. You create a new model server and deploy your model to it.
 
-*Note:* Depending on how model serving has been configured on your cluster, you might see only one model serving platform option.
+NOTE: Depending on how model serving has been configured on your cluster, you might see only one model serving platform option.
 
 
-.Prerequiste
+.Prerequisite
 
 * A user with `admin` privileges has enabled the single-model serving platform on your OpenShift cluster.
 
@@ -16,7 +16,7 @@
 +
 image::model-serving/ds-project-model-list-add.png[Models]
 +
-*Note:* Depending on how model serving has been configured on your cluster, you might see only one model serving platform option.
+NOTE: Depending on how model serving has been configured on your cluster, you might see only one model serving platform option.
 
 . In the *Single-model serving platform* tile, click *Deploy model*.
 . In the form, provide the following values:
@@ -33,7 +33,7 @@ image::model-serving/deploy-model-form-sm.png[Deploy model from for single-model
 
 .Verification
 
-Wait for the model to deploy and for the *Status* to show a green checkmark.
+Notice the loading symbol under the *Status* section. It will change to a green checkmark when the deployment completes successfully.
 
 image::model-serving/ds-project-model-list-status-sm.png[Deployed model status]
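
Once the checkmark appears, a quick smoke test can confirm that the model answers requests. Again a sketch that assumes the KServe v2 REST protocol; the endpoint, model name, input name, and tensor shape are hypothetical.

----
import requests

# Hypothetical endpoint, model name, and input tensor.
url = "https://model-server.example.com/v2/models/fraud/infer"
payload = {
    "inputs": [
        {
            "name": "dense_input",
            "shape": [1, 5],
            "datatype": "FP32",
            "data": [0.3, 1.0, 1.0, 0.0, 0.0],
        }
    ]
}

resp = requests.post(url, json=payload, timeout=30)
# A v2 response contains an "outputs" list with the prediction tensors.
print(resp.json())
----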
