Commit 0a513db

Bug fix single model serving (#89)
* rhoai-25155 fix model deploy mode
* bug-fix-single-model-serving
* bug-fix-single-model-serving 2
* bug-fix-single-model-serving 3
* bug-fix-single-model-serving'4
* bug-fix-single-model-serving-add new tab to links
1 parent 990e03c commit 0a513db

File tree

6 files changed: +18 −4 lines changed


3_rest_requests_multi_model.ipynb

Lines changed: 3 additions & 1 deletion

@@ -5,7 +5,9 @@
   "id": "f73046ff",
   "metadata": {},
   "source": [
-   "# REST Inference"
+   "# REST Inference for multi-model server deployment \n",
+   "\n",
+   "_Note: Use this procedure for testing a model that you deployed on a multi-model server. See `5_rest_requests_single_model.ipynb` for testing a model that you deployed on a single-model server._"
   ]
  },
  {

5_rest_requests_single_model.ipynb

Lines changed: 4 additions & 2 deletions

@@ -1,13 +1,15 @@
 {
  "cells": [
   {
-   "cell_type": "code",
+   "cell_type": "markdown",
    "execution_count": null,
    "id": "55c8afde-9b18-4b6a-9ee5-33924bdb4f16",
    "metadata": {},
    "outputs": [],
    "source": [
-    "# REST Inference"
+    "# REST Inference for single-model server deployment \n",
+    "\n",
+    "_Note: Use this procedure for testing a model that you deployed on a single-model server. See `3_rest_requests_multi_model.ipynb` for testing a model that you deployed on a multi-model server._"
    ]
   },
   {
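Both renamed notebooks walk through sending a REST inference request to a deployed model. A minimal sketch of what such a request looks like, assuming a KServe v2-style inference endpoint; the base URL, model name, input-tensor name, and feature values below are illustrative placeholders, not values from this repository:

```python
import json
import urllib.request


def build_infer_request(base_url: str, model_name: str, rows: list[list[float]]):
    """Build a v2-style REST inference request (URL + JSON payload).

    The endpoint path and payload shape follow the KServe v2 inference
    protocol; adjust both to match your actual model server.
    """
    url = f"{base_url}/v2/models/{model_name}/infer"
    payload = {
        "inputs": [
            {
                "name": "dense_input",  # illustrative input-tensor name
                "shape": [len(rows), len(rows[0])],
                "datatype": "FP32",
                "data": rows,
            }
        ]
    }
    return url, payload


# Hypothetical route and model name; one 5-feature input row.
url, payload = build_infer_request(
    "https://my-model-route.example.com", "my-model", [[0.3, 1.2, 0.0, 0.5, 1.0]]
)
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

The single-model and multi-model notebooks differ mainly in the route they target, which is why the added notes cross-reference each other.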
(binary image file changed, 16.4 KB)

workshop/docs/modules/ROOT/pages/creating-connections-to-storage.adoc

Lines changed: 5 additions & 0 deletions

@@ -55,6 +55,11 @@ In the *Connections* tab for the project, check to see that your connections are
 image::projects/ds-project-connections.png[List of project connections, 500]
 
 
+[IMPORTANT]
+====
+If your cluster uses self-signed certificates, your {productname-short} administrator might need to provide a certificate authority (CA) to securely connect to the S3 object storage, as described in link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/latest/html/installing_and_uninstalling_openshift_ai_self-managed/working-with-certificates_certs#accessing-s3-compatible-object-storage-with-self-signed-certificates_certs[Accessing S3-compatible object storage with self-signed certificates^] (Self-Managed) or link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_cloud_service/1/html/installing_and_uninstalling_openshift_ai_cloud_service/working-with-certificates_certs#accessing-s3-compatible-object-storage-with-self-signed-certificates_certs[Accessing S3-compatible object storage with self-signed certificates^] (Cloud Service).
+====
+
 .Next step
 
 If you want to complete the pipelines section of this {deliverable}, go to xref:enabling-data-science-pipelines.adoc[Enabling data science pipelines].
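The note added above says the administrator "provides a certificate authority (CA)"; on the client side that means pointing the TLS layer at the CA bundle so connections to the S3 endpoint verify. A minimal stdlib sketch of that idea (the bundle path is hypothetical; with boto3 you would typically pass the path via the client's `verify` parameter instead):

```python
import ssl
from typing import Optional


def make_tls_context(ca_bundle_path: Optional[str]) -> ssl.SSLContext:
    """Create a TLS context that trusts a custom CA bundle.

    If no bundle path is given, fall back to the system trust store;
    certificate verification stays enabled either way.
    """
    ctx = ssl.create_default_context()
    if ca_bundle_path:
        # Add the administrator-provided CA (e.g. the cluster's
        # self-signed CA) to the set of trusted roots.
        ctx.load_verify_locations(cafile=ca_bundle_path)
    return ctx


# System CAs only; pass a real path such as "/etc/pki/ca-trust/ca.crt"
# (hypothetical) when the S3 endpoint uses a self-signed certificate.
ctx = make_tls_context(None)
```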

workshop/docs/modules/ROOT/pages/index.adoc

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ If don't have access to a cluster that includes an instance of {productname-shor
 
 [IMPORTANT]
 ====
-If your cluster uses self-signed certificates, before you begin the {deliverable}, your {productname-short} administrator must add self-signed certificates for {productname-short} as described in link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/latest/html/installing_and_uninstalling_openshift_ai_self-managed/working-with-certificates_certs[Working with certificates] (Self-Managed) or link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_cloud_service/1/html/installing_and_uninstalling_openshift_ai_cloud_service/working-with-certificates_certs[Working with certificates] (Cloud Service).
+If your cluster uses self-signed certificates, before you begin the {deliverable}, your {productname-short} administrator must add self-signed certificates for {productname-short} as described in link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/latest/html/installing_and_uninstalling_openshift_ai_self-managed/working-with-certificates_certs[Working with certificates^] (Self-Managed) or link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_cloud_service/1/html/installing_and_uninstalling_openshift_ai_cloud_service/working-with-certificates_certs[Working with certificates^] (Cloud Service).
 ====
 
 If you're ready, xref:navigating-to-the-dashboard.adoc[start the {deliverable}].

workshop/docs/modules/ROOT/pages/running-a-script-to-install-storage.adoc

Lines changed: 5 additions & 0 deletions

@@ -119,6 +119,11 @@ spec:
 image::projects/ds-project-connections.png[Connections for Fraud Detection]
 
 
+[IMPORTANT]
+====
+If your cluster uses self-signed certificates, your {productname-short} administrator might need to provide a certificate authority (CA) to securely connect to the S3 object storage, as described in link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/latest/html/installing_and_uninstalling_openshift_ai_self-managed/working-with-certificates_certs#accessing-s3-compatible-object-storage-with-self-signed-certificates_certs[Accessing S3-compatible object storage with self-signed certificates^] (Self-Managed) or link:https://docs.redhat.com/en/documentation/red_hat_openshift_ai_cloud_service/1/html/installing_and_uninstalling_openshift_ai_cloud_service/working-with-certificates_certs#accessing-s3-compatible-object-storage-with-self-signed-certificates_certs[Accessing S3-compatible object storage with self-signed certificates^] (Cloud Service).
+====
+
 .Next step
 
 If you want to complete the pipelines section of this {deliverable}, go to xref:enabling-data-science-pipelines.adoc[Enabling data science pipelines].
