-Upload the pipeline on your cluster and run it. You can do so directly from the pipeline editor. You can use your own newly created pipeline for this or `6 Train Save.pipeline`.
+Upload the pipeline on your cluster and run it. You can do so directly from the pipeline editor. You can use your own newly created pipeline or the pipeline provided in the `6 Train Save.pipeline` file.
workshop/docs/modules/ROOT/pages/conclusion.adoc (2 additions, 1 deletion)
@@ -5,10 +5,11 @@
 [.text-center.strong]
 == Conclusion
 
-Congratulations. In this {deliverable}, you learned how to incorporate data science, artificial intelligence, and machine learning into an OpenShift development workflow.
+Congratulations. In this {deliverable}, you learned how to incorporate data science, artificial intelligence, and machine learning into an OpenShift development workflow.
 
 You used an example fraud detection model and completed the following tasks:
 
 * Explored a pre-trained fraud detection model by using a Jupyter notebook.
 * Deployed the model by using {productname-short} model serving.
 * Refined and trained the model by using automated pipelines.
+* Learned how to train the model by using Ray, a distributed computing framework.
workshop/docs/modules/ROOT/pages/creating-data-connections-to-storage.adoc (8 additions, 8 deletions)
@@ -1,11 +1,11 @@
 [id='creating-data-connections-to-storage']
 = Creating data connections to your own S3-compatible object storage
 
-If you have existing S3-compatible storage buckets that you want to use for this {deliverable}, you must create a data connection to one storage bucket for saving your data and models and, if you want to complete the pipelines section of this {deliverable}, create another data connection to a different storage bucket for saving pipeline artifacts.
+If you have existing S3-compatible storage buckets that you want to use for this {deliverable}, you must create a data connection to one storage bucket for saving your data and models. If you want to complete the pipelines section of this {deliverable}, create another data connection to a different storage bucket for saving pipeline artifacts.
 
-NOTE: If you do not have your own s3-compatible storage, or if you want to use a disposable local Minio instance instead, skip this section and follow the steps in xref:running-a-script-to-install-storage.adoc[Running a script to install local object storage buckets and create data connections]. The provided script automatically completes the following tasks for you: creates a Minio instance in your project, creates two storage buckets in that Minio instance, creates two data connections in your project, one for each bucket and both using the same credentials, and installs required network policies for service mesh functionality.
+NOTE: If you do not have your own s3-compatible storage, or if you want to use a disposable local Minio instance instead, skip this section and follow the steps in xref:running-a-script-to-install-storage.adoc[Running a script to install local object storage buckets and create data connections]. The provided script automatically completes the following tasks for you: creates a Minio instance in your project, creates two storage buckets in that Minio instance, creates two data connections in your project, one for each bucket and both using the same credentials, and installs required network policies for service mesh functionality.
 
-.Prerequisite
+.Prerequisites
 
 To create data connections to your existing S3-compatible storage buckets, you need the following credential information for the storage buckets:
 
@@ -27,11 +27,11 @@ If you don't have this information, contact your storage administrator.
 +
 image::projects/ds-project-add-dc.png[Add data connection]
 
-.. Fill out the *Add data connection* form and name your connection *My Storage*. This connection is for saving your personal work, including data and models.
+.. Complete the *Add data connection* form and name your connection *My Storage*. This connection is for saving your personal work, including data and models.
 +
 NOTE: Skip the *Connected workbench* item. You add data connections to a workbench in a later section.
 +
-image::projects/ds-project-my-storage-form.png[Add my storage form, 400]
+image::projects/ds-project-my-storage-form.png[Add my storage form, 500]
 
 .. Click *Add data connection*.
 
@@ -41,11 +41,11 @@ NOTE: If you do not intend to complete the pipelines section of the {deliverable
 
 .. Click *Add data connection*.
 
-.. Fill out the form and name your connection *Pipeline Artifacts*.
+.. Complete the form and name your connection *Pipeline Artifacts*.
 +
 NOTE: Skip the *Connected workbench* item. You add data connections to a workbench in a later section.
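Behind the *Add data connection* form that these docs walk through, the dashboard stores the connection as an opaque Kubernetes Secret in the project namespace. A minimal sketch of what the *My Storage* connection might look like; the secret name, labels, annotation keys, and placeholder values here are assumptions based on common OpenShift AI conventions, not taken from this change:

```yaml
# Hypothetical sketch of an S3 data connection Secret.
# Names and annotation keys are assumed conventions, not from this diff.
apiVersion: v1
kind: Secret
metadata:
  name: aws-connection-my-storage          # assumed naming pattern
  labels:
    opendatahub.io/dashboard: "true"       # surfaces the connection in the dashboard
  annotations:
    opendatahub.io/connection-type: s3
    openshift.io/display-name: My Storage
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: <access-key>          # replace with your bucket credentials
  AWS_SECRET_ACCESS_KEY: <secret-key>
  AWS_S3_ENDPOINT: https://s3.example.com  # placeholder endpoint
  AWS_DEFAULT_REGION: us-east-1
  AWS_S3_BUCKET: <bucket-name>
```

Using `stringData` rather than `data` lets the credentials be written in plain text and leaves base64 encoding to the API server; the cluster stores them encoded either way.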