This guide will walk through installing Open Data Hub/Red Hat OpenShift AI and TrustyAI into your cluster. Starting from a completely blank cluster, you will be left with:
- An Open Data Hub/Red Hat OpenShift AI installation
- A namespace to deploy models into
- A TrustyAI Operator, to manage all instances of the TrustyAI Service
- A TrustyAI Service, to monitor and analyze all the models deployed into your model namespace.
- Make sure you are `oc login`'d to your OpenShift cluster.
- Create two projects, `opendatahub` and `model-namespace`. These names are arbitrary, but I'll be using them throughout the rest of this demo:

```shell
oc new-project opendatahub
oc new-project model-namespace
```
To enable ODH's monitoring stack, user-workload-monitoring must be configured:
- Enable user-workload-monitoring:

  ```shell
  oc apply -f resources/enable_uwm.yaml
  ```

- Configure user-workload-monitoring to retain metric data for 15 days:

  ```shell
  oc apply -f resources/uwm_configmap.yaml
  ```
Depending on how your cluster was created, you may need to enable a User Workload Monitoring setting from your cluster management UI (for example, on console.redhat.com).
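The contents of the two `resources/*.yaml` files are not shown here; as a sketch based on the standard OpenShift user-workload-monitoring setup, they likely resemble the following (the ConfigMap names and namespaces are the stock OpenShift ones, and the retention value matches the 15 days mentioned above):

```yaml
# resources/enable_uwm.yaml (sketch): turn on user-workload monitoring
apiVersion: v1
kind: ConfigMap
metadata:
  name: cluster-monitoring-config
  namespace: openshift-monitoring
data:
  config.yaml: |
    enableUserWorkload: true
---
# resources/uwm_configmap.yaml (sketch): retain user metrics for 15 days
apiVersion: v1
kind: ConfigMap
metadata:
  name: user-workload-monitoring-config
  namespace: openshift-user-workload-monitoring
data:
  config.yaml: |
    prometheus:
      retention: 15d
```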
- Install the Red Hat OpenShift Serverless operator.
- Install the Red Hat OpenShift Service Mesh operator.
- From the OpenShift Console, navigate to "Operators" -> "OperatorHub", and search for "Open Data Hub" or "Red Hat OpenShift AI".

- Click on "Open Data Hub Operator" or "Red Hat OpenShift AI".
- If the "Show community Operator" warning opens, hit "Continue"
- Hit "Install".
- From the "Install Operator" screen:
- Make sure "All namespaces on the cluster" is selected as the "Installation Mode":
- Hit install
- Wait for the Operator to finish installing
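If you prefer the CLI to the console, the same operator install can be expressed as an OLM Subscription. This is a sketch only: the `channel` and catalog `source` below are assumptions, so verify them against what OperatorHub shows for your cluster before applying:

```yaml
# Sketch: install the ODH operator via OLM instead of the console
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: opendatahub-operator
  namespace: openshift-operators
spec:
  name: opendatahub-operator       # package name in the catalog
  channel: fast                    # assumption: check the channels OperatorHub lists
  source: community-operators      # assumption: ODH ships in the community catalog
  sourceNamespace: openshift-marketplace
```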
- Navigate to your `opendatahub` project.
- From "Installed Operators", select "Open Data Hub Operator".
- Navigate to the "DSC Initialization" tab and hit "Create DSCInitialization", then install the default DSCI. Once the DSCI reports "Ready", move on to the next step.
- Navigate to the "Data Science Cluster" tab and hit "Create DataScienceCluster"
- In the YAML view, make sure `trustyai` is set to `Managed`:
- Hit the "Create" button
- Within the "Pods" menu, you should begin to see various ODH components being created, including the `trustyai-service-operator-controller-manager-xxx` pod.
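For reference, the relevant portion of the DataScienceCluster YAML looks roughly like this (other components trimmed for brevity; `managementState` is the field to check):

```yaml
apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc
spec:
  components:
    trustyai:
      managementState: Managed   # must be Managed, not Removed
```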
In order to ensure that TrustyAI can receive encrypted model payloads, we need to add TrustyAI's CA bundle to your model controller. Make sure to pick the right command for your chosen operator:
For Open Data Hub:

```shell
NAMESPACE=opendatahub
oc patch configmap inferenceservice-config -n $NAMESPACE --type merge -p '{"metadata": {"annotations": {"opendatahub.io/managed": "false"}}}'
IMAGE=$(oc get configmap inferenceservice-config -n $NAMESPACE -o json | jq -r '.data.agent | fromjson | .image')
oc patch configmap inferenceservice-config \
  -n "$NAMESPACE" \
  --type json \
  -p="[{
    \"op\": \"add\",
    \"path\": \"/data/logger\",
    \"value\": \"{\\\"image\\\" : \\\"$IMAGE\\\",\\\"memoryRequest\\\": \\\"100Mi\\\",\\\"memoryLimit\\\": \\\"1Gi\\\",\\\"cpuRequest\\\": \\\"100m\\\",\\\"cpuLimit\\\": \\\"1\\\",\\\"defaultUrl\\\": \\\"http://default-broker\\\",\\\"caBundle\\\": \\\"kserve-logger-ca-bundle\\\",\\\"caCertFile\\\": \\\"service-ca.crt\\\",\\\"tlsSkipVerify\\\": false}\"
}]"
```

For Red Hat OpenShift AI:

```shell
NAMESPACE=redhat-ods-applications
oc patch configmap inferenceservice-config -n $NAMESPACE --type merge -p '{"metadata": {"annotations": {"opendatahub.io/managed": "false"}}}'
IMAGE=$(oc get configmap inferenceservice-config -n $NAMESPACE -o json | jq -r '.data.agent | fromjson | .image')
oc patch configmap inferenceservice-config \
  -n "$NAMESPACE" \
  --type json \
  -p="[{
    \"op\": \"add\",
    \"path\": \"/data/logger\",
    \"value\": \"{\\\"image\\\" : \\\"$IMAGE\\\",\\\"memoryRequest\\\": \\\"100Mi\\\",\\\"memoryLimit\\\": \\\"1Gi\\\",\\\"cpuRequest\\\": \\\"100m\\\",\\\"cpuLimit\\\": \\\"1\\\",\\\"defaultUrl\\\": \\\"http://default-broker\\\",\\\"caBundle\\\": \\\"kserve-logger-ca-bundle\\\",\\\"caCertFile\\\": \\\"service-ca.crt\\\",\\\"tlsSkipVerify\\\": false}\"
}]"
```

Finally, deploy the TrustyAI Service into your model namespace:

```shell
oc project model-namespace
oc apply -f resources/trustyai.yaml
```

This will install the TrustyAI Service into your `model-namespace` project, which will then provide TrustyAI features, such as explainability, fairness monitoring, and data drift monitoring, to all subsequent models deployed into that project.
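The contents of `resources/trustyai.yaml` are not shown here; as a sketch, a TrustyAIService custom resource typically looks like the following (the metadata name, storage size, and data filename are illustrative, not prescribed by this guide):

```yaml
apiVersion: trustyai.opendatahub.io/v1alpha1
kind: TrustyAIService
metadata:
  name: trustyai-service
spec:
  storage:
    format: "PVC"        # persist inference data on a PersistentVolumeClaim
    folder: "/inputs"
    size: "1Gi"
  data:
    filename: "data.csv"
    format: "CSV"
  metrics:
    schedule: "5s"       # how often scheduled metrics are recalculated
```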