Commit a63e879

Fixing broken links with downstream link checker (#1035)
* Fixing broken links with downstream link checker
* typo
* typo2
1 parent a544b5d commit a63e879

File tree

31 files changed (+102 −91 lines)


assemblies/creating-and-importing-jupyter-notebooks.adoc

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ include::modules/emptying-trash-directory.adoc[leveloffset=+2]
 
 [role='_additional-resources']
 == Additional resources
-* link:{odhdocshome}/working-in-your-data-science-ide/#collaborating-on-jupyter-notebooks-by-using-git_{context}[Collaborating on Jupyter notebooks by using Git]
+* link:{odhdocshome}/working-in-your-data-science-ide/#collaborating-on-jupyter-notebooks-by-using-git_ide[Collaborating on Jupyter notebooks by using Git]
 
 
 ifdef::parent-context[:context: {parent-context}]

assemblies/installing-odh-v1.adoc

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ include::modules/creating-a-new-project-for-your-odh-instance.adoc[leveloffset=+
 
 include::modules/adding-an-odh-instance.adoc[leveloffset=+1]
 
-include::modules/accessing-the-odh-dashboard.adoc[leveloffset=+1]
+include::modules/accessing-the-dashboard.adoc[leveloffset=+1]
 
 ifdef::parent-context[:context: {parent-context}]
 ifndef::parent-context[:!context:]

installing-open-data-hub.adoc

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ include::modules/configuring-pipelines-with-your-own-argo-workflows-instance.ado
 
 include::modules/installing-the-distributed-workloads-components.adoc[leveloffset=+1]
 
-include::modules/accessing-the-odh-dashboard.adoc[leveloffset=+1]
+include::modules/accessing-the-dashboard.adoc[leveloffset=+1]
 
 include::assemblies/working-with-certificates.adoc[leveloffset=+1]
 

modules/about-gpu-time-slicing.adoc

Lines changed: 1 addition & 1 deletion
@@ -16,4 +16,4 @@ Consider the following points when using GPU time slicing:
 [role="_additional-resources"]
 .Additional resources
 * link:https://docs.nvidia.com/datacenter/cloud-native/gpu-operator/latest/gpu-sharing.html[NVIDIA GPU Sharing Documentation]
-* link:https://github.com/stratus-ss/openshift-ai/blob/main/docs/rendered/OpenShift_AI_CLI.md#nvidia---configuring-time-slicing[OpenShift AI CLI - Configuring Time Slicing]
+//* link:https://github.com/stratus-ss/openshift-ai/blob/main/docs/rendered/OpenShift_AI_CLI.md#nvidia---configuring-time-slicing[OpenShift AI CLI - Configuring Time Slicing]
modules/accessing-the-dashboard.adoc

Lines changed: 46 additions & 0 deletions

@@ -0,0 +1,46 @@
+:_module-type: PROCEDURE
+
+[id='accessing-the-dashboard_{context}']
+= Accessing the dashboard
+
+[role='_abstract']
+After you have installed {productname-short} and added users, you can access the URL for your {productname-short} console and share the URL with the users to let them log in and work on their models.
+
+ifndef::upstream[]
+.Prerequisites
+* You have installed {productname-short} on your {openshift-platform} cluster.
+* You have added at least one user to the user group for {productname-short}.
+
+.Procedure
+. Log in to the {openshift-platform} web console.
+. Click the application launcher (image:images/osd-app-launcher.png[The application launcher]).
+. Right-click *{productname-long}* and copy the URL for your {productname-short} instance.
+. Provide this instance URL to your data scientists to let them log in to {productname-short}.
+
+.Verification
+* Confirm that you and your users can log in to {productname-short} by using the instance URL.
+
+*Note:* In the {productname-long} dashboard, users can view the list of the installed {productname-short} components, their corresponding source (upstream) components, and the versions of the installed components, as described in link:{rhoaidocshome}{default-format-url}/getting_started_with_{url-productname-long}/logging-in_get-started#viewing-installed-components_get-started[Viewing installed components].
+
+[role="_additional-resources"]
+.Additional resources
+
+* link:{rhoaidocshome}{default-format-url}/getting_started_with_{url-productname-long}/logging-in_get-started[Logging in to {productname-short}]
+* link:{rhoaidocshome}{default-format-url}/managing_openshift_ai/managing-users-and-groups#adding-users-to-user-groups_managing-rhoai[Adding users to {productname-short} user groups]
+endif::[]
+
+ifdef::upstream[]
+.Prerequisites
+* You have installed the {productname-short} Operator.
+
+.Procedure
+. Log in to the {openshift-platform} web console.
+. Click the application launcher (image:images/osd-app-launcher.png[The application launcher]).
+. Right-click *{productname-long}* and copy the URL for your {productname-short} instance.
+. Give this URL to your users to let them log in to the {productname-short} dashboard.
+
+.Verification
+* Confirm that you and your users can log in to the {productname-short} dashboard by using the URL.
+
+*Note:* In the {productname-long} dashboard, users can view the list of the installed {productname-short} components, their corresponding source (upstream) components, and the versions of the installed components, as described in link:{odhdocshome}/getting-started-with-open-data-hub/#viewing-installed-components_get-started[Viewing installed {productname-short} components].
+endif::[]
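The procedure above locates the dashboard URL through the web-console application launcher. As a sketch of a CLI alternative, assuming an upstream Open Data Hub install where the dashboard Route is named `odh-dashboard` in the `odh` namespace (both names are assumptions, not confirmed by this commit), the same URL can be derived from the route host:

```shell
# Build the dashboard URL from a route host. The scheme is fixed to
# https because the dashboard route terminates TLS.
dashboard_url() {
  printf 'https://%s\n' "$1"
}

# Hypothetical usage on a live cluster (route name/namespace are assumptions):
#   dashboard_url "$(oc get route odh-dashboard -n odh -o jsonpath='{.spec.host}')"
dashboard_url "odh-dashboard-odh.apps.example.com"
# -> https://odh-dashboard-odh.apps.example.com
```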

modules/accessing-the-odh-dashboard.adoc

Lines changed: 0 additions & 21 deletions
This file was deleted.

modules/adding-a-tested-and-verified-runtime-for-the-multi-model-serving-platform.adoc

Lines changed: 1 addition & 1 deletion
@@ -129,7 +129,7 @@ The *Serving runtimes* page opens and shows the updated list of runtimes that ar
 [role='_additional-resources']
 .Additional resources
 ifndef::upstream[]
-* To learn how to configure a model server that uses a model-serving runtime that you have added, see link:{rhoaidocshome}{default-format-url}/deploying_models/deploying-models_rhoai-user#adding-a-model-server-for-the-multi-model-serving-platform_rhoai-user[Adding a model server to your data science project].
+* To learn how to configure a model server that uses a model-serving runtime that you have added, see link:{rhoaidocshome}{default-format-url}/deploying_models/deploying_models_on_the_multi_model_serving_platform#adding-a-model-server-for-the-multi-model-serving-platform_rhoai-user[Adding a model server to your data science project].
 endif::[]
 ifdef::upstream[]
 * To learn how to configure a model server that uses a model-serving runtime that you have added, see link:{odhdocshome}/deploying-models/#adding-a-model-server-for-the-multi-model-serving-platform_odh-user[Adding a model server to your data science project].

modules/adding-an-odh-instance.adoc

Lines changed: 3 additions & 3 deletions
@@ -12,18 +12,18 @@ By adding an Open Data Hub instance to your project, you can access the URL for
 
 .Procedure
 . In the OpenShift web console, select *Operators* -> *Installed Operators*.
-. On the *Installed Operators* page, click the *Project* list and select the `odh` project. The page filters to only display installed operators in the `odh` project.
+. On the *Installed Operators* page, click the *Project* list and select the `pass:attributes[{dbd-config-default-namespace}]` project.
 . Find and click the *Open Data Hub Operator* to display the details for the currently installed version.
 . On the *KfDef* tile, click *Create instance*. A `KfDef` object is a specification designed to control provisioning and management of a Kubeflow deployment. A default `KfDef` object is created when you install Open Data Hub Operator. This default configuration deploys the required Open Data Hub core components. For more information, see link:https://opendatahub.io/docs/tiered-components[Tiered Components].
 . On the *Create KfDef* page, leave *opendatahub* as the name. Click *Create* to create an Open Data Hub kfdef object named *opendatahub* and begin the deployment of the components.
 
 .Verification
 . Select *Operators* -> *Installed Operators*.
-. On the *Installed Operators* page, click the *Project* list and select the `odh` project.
+. On the *Installed Operators* page, click the *Project* list and select the `pass:attributes[{dbd-config-default-namespace}]` project.
 . Find and click *Open Data Hub Operator*.
 . Click the *Kf Def* tab and confirm that *opendatahub* is displayed.
 . Select *Home* -> *Projects*.
-. On the *Projects* page, find and select the *odh* project.
+. On the *Projects* page, find and select the `pass:attributes[{dbd-config-default-namespace}]` project.
 . On the *Project details* page, click the *Workloads* tab and confirm that the Open Data Hub core components are running. For a description of the components, see link:https://opendatahub.io/docs/tiered-components[Tiered Components].
 
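The `KfDef` object that the procedure creates can be sketched roughly as follows. This is a minimal illustration only, not the full default manifest; the `odh-dashboard` application entry, namespace, and repo URI are assumptions based on the upstream odh-manifests layout:

```yaml
# Minimal KfDef sketch (illustrative only; the default object created by
# the Operator carries the complete application list).
apiVersion: kfdef.apps.kubeflow.org/v1
kind: KfDef
metadata:
  name: opendatahub
  namespace: odh           # assumed install namespace
spec:
  applications:
    - name: odh-dashboard  # one example component entry (assumption)
      kustomizeConfig:
        repoRef:
          name: manifests
          path: odh-dashboard
  repos:
    - name: manifests
      uri: https://github.com/opendatahub-io/odh-manifests/tarball/master
```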

modules/api-workbench-creating.adoc

Lines changed: 7 additions & 8 deletions
@@ -23,7 +23,6 @@ ifdef::cloud-service[]
 endif::[]
 * You have created a data science project. In the example in this procedure, the project is named `my-data-science-project`.
 
-//will probably need to fix these links
 ifdef::upstream[]
 * You know the URL for the workbench image that you want to use in the workbench. The example in this procedure uses the custom image that you created in link:{odhdocshome}/api-workbench/#api-custom-image-creating_api-workbench[Creating a custom image by using the `ImageStream` CRD].
 endif::[]
@@ -54,7 +53,7 @@ metadata:
 annotations:
 notebooks.opendatahub.io/inject-oauth: 'true' # <1>
 opendatahub.io/image-display-name: My Custom Notebook # <2>
-notebooks.opendatahub.io/oauth-logout-url: 'https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com/projects/my-data-science-project?notebookLogout=my-workbench'
+notebooks.opendatahub.io/oauth-logout-url: 'https://<dashboard_URL>/projects/my-data-science-project?notebookLogout=my-workbench'
 opendatahub.io/accelerator-name: ''
 openshift.io/description: '' # <3>
 openshift.io/display-name: My Workbench # <4>
@@ -108,7 +107,7 @@ spec:
 --ServerApp.password=''
 --ServerApp.base_url=/notebook/my-data-science-project/my-workbench
 --ServerApp.quit_button=False
---ServerApp.tornado_settings={"user":"kube-3aadmin","hub_host":"https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com", "hub_prefix":"/projects/my-data-science-project"}
+--ServerApp.tornado_settings={"user":"<user>","hub_host":"<dashboard_URL>", "hub_prefix":"/projects/my-data-science-project"}
 - name: JUPYTER_IMAGE
 value: 'image-registry.openshift-image-registry.svc:5000/redhat-ods-applications/my-custom-notebook:1.0'
 - name: PIP_CERT
@@ -195,7 +194,7 @@ spec:
 - '--email-domain=*'
 - '--skip-provider-button'
 - '--openshift-sar={"verb":"get","resource":"notebooks","resourceAPIGroup":"kubeflow.org","resourceName":"my-workbench","namespace":"$(NAMESPACE)"}'
-- '--logout-url=https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com/projects/my-data-science-project?notebookLogout=my-workbench'
+- '--logout-url=<dashboard_URL>/projects/my-data-science-project?notebookLogout=my-workbench'
 enableServiceLinks: false
 serviceAccountName: my-workbench
 volumes:
@@ -246,7 +245,7 @@ kind: Notebook
 metadata:
 annotations:
 ...
-notebooks.opendatahub.io/oauth-logout-url: 'https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com/projects/my-data-science-project?notebookLogout=my-workbench' # <1>
+notebooks.opendatahub.io/oauth-logout-url: '<dashboard_URL>/projects/my-data-science-project?notebookLogout=my-workbench' # <1>
 ...
 ----
@@ -269,7 +268,7 @@ spec:
 - resources:
 ...
 args:
-- '--logout-url=https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com/projects/my-data-science-project?notebookLogout=my-workbench' # <1>
+- '--logout-url=<dashboard_URL>/projects/my-data-science-project?notebookLogout=my-workbench' # <1>
 ...
 ----
@@ -358,7 +357,7 @@ Annotations: notebooks.kubeflow.org/last-activity: 2024-07-30T20:27:25Z
 notebooks.opendatahub.io/last-image-selection: my-custom-notebook:1.0
 notebooks.opendatahub.io/last-size-selection: Small
 notebooks.opendatahub.io/oauth-logout-url:
-https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com/projects/my-data-science-project?notebookLogout=my-workbench
+<dashboard_URL>/projects/my-data-science-project?notebookLogout=my-workbench
 opendatahub.io/accelerator-name:
 opendatahub.io/image-display-name: My Custom Notebook
 opendatahub.io/username: kube:admin
@@ -383,7 +382,7 @@ Spec:
 --ServerApp.password=''
 --ServerApp.base_url=/notebook/my-data-science-project/my-workbench
 --ServerApp.quit_button=False
---ServerApp.tornado_settings={"user":"kube-3aadmin","hub_host":"https://rhods-dashboard-redhat-ods-applications.apps.my-cluster.com", "hub_prefix":"/projects/my-data-science-project"}
+--ServerApp.tornado_settings={"user":"kube-3aadmin","hub_host":"<dashboard_URL>", "hub_prefix":"/projects/my-data-science-project"}
 Name: JUPYTER_IMAGE
 Value: image-registry.openshift-image-registry.svc:5000/redhat-ods-applications/my-custom-notebook:1.0
 Name: PIP_CERT
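The changes above replace hard-coded dashboard hosts in the example manifest with a `<dashboard_URL>` placeholder. One way to consume such a manifest is to substitute the placeholder before applying it; a minimal sketch, where the URL value is a stand-in (in practice it would come from your cluster, for example via `oc get route`):

```shell
# Substitute the <dashboard_URL> placeholder in a manifest line.
# DASHBOARD_URL below is a stand-in value for illustration.
DASHBOARD_URL="https://odh-dashboard-odh.apps.example.com"
printf '%s\n' "- '--logout-url=<dashboard_URL>/projects/my-data-science-project?notebookLogout=my-workbench'" \
  | sed "s|<dashboard_URL>|${DASHBOARD_URL}|g"
```

Against a full manifest file, the same `sed` expression would be applied to the whole file before applying it, for example `sed "s|<dashboard_URL>|${DASHBOARD_URL}|g" my-workbench.yaml | oc apply -f -` (file name hypothetical).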

modules/configuring-monitoring-for-the-multi-model-serving-platform.adoc

Lines changed: 2 additions & 2 deletions
@@ -19,8 +19,8 @@ ifdef::cloud-service[]
 ** link:https://docs.redhat.com/en/documentation/openshift_dedicated/{osd-latest-version}/html/cli_tools/openshift-cli-oc#installing-openshift-cli[Installing the OpenShift CLI^] for OpenShift Dedicated
 ** link:https://docs.redhat.com/en/documentation/red_hat_openshift_service_on_aws_classic_architecture/{rosa-classic-latest-version}/html/cli_tools/openshift-cli-oc#installing-openshift-cli[Installing the OpenShift CLI^] for {rosa-classic-productname}
 endif::[]
-* You are familiar with link:https://docs.redhat.com/en/documentation/openshift_container_platform/{ocp-latest-version}/html/monitoring/index#preparing-to-configure-the-monitoring-stack-uwm[creating a config map] for monitoring a user-defined workflow. You will perform similar steps in this procedure.
-* You are familiar with link:https://docs.redhat.com/en/documentation/openshift_container_platform/{ocp-latest-version}/html/monitoring/index#enabling-monitoring-for-user-defined-projects-uwm_preparing-to-configure-the-monitoring-stack-uwm[enabling monitoring] for user-defined projects in OpenShift. You will perform similar steps in this procedure.
+* You are familiar with link:https://docs.redhat.com/en/documentation/openshift_container_platform/{ocp-latest-version}/html/monitoring/configuring-core-platform-monitoring#preparing-to-configure-the-monitoring-stack[creating a config map] for monitoring a user-defined workflow. You will perform similar steps in this procedure.
+* You are familiar with link:https://docs.redhat.com/en/documentation/openshift_container_platform/{ocp-latest-version}/html/monitoring/configuring-user-workload-monitoring#enabling-monitoring-for-user-defined-projects-uwm_preparing-to-configure-the-monitoring-stack-uwm[enabling monitoring] for user-defined projects in OpenShift. You will perform similar steps in this procedure.
 * You have link:https://docs.redhat.com/en/documentation/openshift_container_platform/{ocp-latest-version}/html/monitoring/configuring-user-workload-monitoring#granting-users-permission-to-monitor-user-defined-projects_preparing-to-configure-the-monitoring-stack-uwm[assigned] the `monitoring-rules-view` role to users that will monitor metrics.
 
 .Procedure
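The prerequisite about enabling monitoring for user-defined projects refers to standard OpenShift cluster monitoring configuration. As a reminder of its shape (the linked OpenShift documentation is the authoritative source), the enabling config map looks like this:

```yaml
# Enables monitoring for user-defined projects cluster-wide.
apiVersion: v1
kind: ConfigMap
metadata:
  name: cluster-monitoring-config
  namespace: openshift-monitoring
data:
  config.yaml: |
    enableUserWorkload: true
```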
