OpenShift AI Initialization
Supported configurations
OpenShift AI is supported in two configurations:
- A managed cloud service add-on for Red Hat OpenShift Service on Amazon Web Services (ROSA, with a Customer Cloud Subscription for AWS) or Red Hat OpenShift Dedicated (GCP). For information about OpenShift AI on a Red Hat managed environment, see Product Documentation for Red Hat OpenShift AI Cloud Service.
- Self-managed software that you can install on-premises or in the public cloud in a self-managed environment, such as OpenShift Container Platform. For information about OpenShift AI as self-managed software on your OpenShift cluster in a connected or a disconnected environment, see Product Documentation for Red Hat OpenShift AI Self-Managed 2.14.
In this course, we cover the installation of Red Hat OpenShift AI Self-Managed by using the OpenShift web console.
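Although the course uses the web console, installing an Operator from OperatorHub ultimately creates a Namespace, an OperatorGroup, and a Subscription on the cluster. The following is a minimal sketch of the equivalent CLI approach; the namespace, channel, and package names shown (redhat-ods-operator, stable, rhods-operator) are assumptions based on common defaults and should be verified against the OperatorHub entry for your version.

    # Sketch only: the namespace, channel, and package names are assumptions;
    # verify them in OperatorHub before applying.
    apiVersion: v1
    kind: Namespace
    metadata:
      name: redhat-ods-operator
    ---
    apiVersion: operators.coreos.com/v1
    kind: OperatorGroup
    metadata:
      name: rhods-operator
      namespace: redhat-ods-operator
    ---
    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: rhods-operator
      namespace: redhat-ods-operator
    spec:
      name: rhods-operator                   # assumed package name in the Red Hat catalog
      channel: stable                        # channel can differ between releases
      source: redhat-operators
      sourceNamespace: openshift-marketplace

You could save this as a file (for example, rhoai-operator.yaml) and apply it with oc apply -f rhoai-operator.yaml; the Operator then appears under Installed Operators in the web console.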
Applicable Operators
The product was recently renamed from Red Hat OpenShift Data Science to Red Hat OpenShift AI (RHOAI). In this course, most references to the product use the new name; however, some UI elements might still use the previous name.
In addition to the Red Hat OpenShift AI Operator, you might need to install other Operators, depending on which features and components of Red Hat OpenShift AI you want to use. The sketch after the following list shows how these Operators could be installed from the command line.
- Red Hat OpenShift Serverless Operator
  The Red Hat OpenShift Serverless Operator provides a collection of APIs that enable containers, microservices, and functions to run "serverless". The Red Hat OpenShift Serverless Operator is required if you want to install the single-model serving platform component.
- Red Hat OpenShift Service Mesh Operator
  The Red Hat OpenShift Service Mesh Operator provides an easy way to create a network of deployed services that provides discovery, load balancing, service-to-service authentication, failure recovery, metrics, and monitoring. The Red Hat OpenShift Service Mesh Operator is also required if you want to install the single-model serving platform component.
- Red Hat Authorino Operator (Technology Preview)
  Red Hat Authorino is an open source, Kubernetes-native external authorization service that protects APIs. The Red Hat Authorino Operator is required to enforce authentication policies in Red Hat OpenShift AI.
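For reference, the sketch below shows Subscription manifests that could install these three Operators from the command line. The package names (serverless-operator, servicemeshoperator, authorino-operator), target namespaces, and channels are assumptions based on common defaults; confirm them in OperatorHub for your OpenShift version.

    # Sketch only: package names, namespaces, and channels are assumptions;
    # the openshift-serverless namespace (and an OperatorGroup for it) must exist first.
    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: serverless-operator
      namespace: openshift-serverless
    spec:
      name: serverless-operator
      channel: stable
      source: redhat-operators
      sourceNamespace: openshift-marketplace
    ---
    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: servicemeshoperator
      namespace: openshift-operators         # installs cluster-wide
    spec:
      name: servicemeshoperator
      channel: stable
      source: redhat-operators
      sourceNamespace: openshift-marketplace
    ---
    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: authorino-operator
      namespace: openshift-operators
    spec:
      name: authorino-operator
      channel: stable                        # Technology Preview; channel may vary
      source: redhat-operators
      sourceNamespace: openshift-marketplace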
The following Operators are required to support the use of NVIDIA GPUs (accelerators) with OpenShift AI; an example of installing them from the command line follows this list.
- Node Feature Discovery Operator
  The Node Feature Discovery Operator is a prerequisite for the NVIDIA GPU Operator.
- NVIDIA GPU Operator
  The NVIDIA GPU Operator is required for GPU support in Red Hat OpenShift AI.
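A similar sketch for the two GPU-related Operators follows. The package names (nfd and gpu-operator-certified), namespaces, catalog sources, and channels are assumptions; in particular, the NVIDIA GPU Operator usually uses version-specific channels, so check OperatorHub and the NVIDIA documentation for the correct values.

    # Sketch only: names, namespaces, catalogs, and channels are assumptions;
    # each target namespace (and an OperatorGroup in it) must exist first.
    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: nfd
      namespace: openshift-nfd
    spec:
      name: nfd
      channel: stable
      source: redhat-operators
      sourceNamespace: openshift-marketplace
    ---
    apiVersion: operators.coreos.com/v1alpha1
    kind: Subscription
    metadata:
      name: gpu-operator-certified
      namespace: nvidia-gpu-operator
    spec:
      name: gpu-operator-certified
      channel: stable                        # NVIDIA channels are often version-specific (for example, v24.x)
      source: certified-operators
      sourceNamespace: openshift-marketplace

After both Operators are installed, you typically create a NodeFeatureDiscovery instance and an NVIDIA ClusterPolicy resource so that GPU nodes are labeled and the driver stack is deployed.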