Red Hat OpenShift AI on HyperScalers

Introduction
This course focuses on Red Hat OpenShift AI (RHOAI) running on Cloud Providers (Hyperscalers).
So grab a cushioned seat and let’s discover the insider information you need to make informed decisions when considering Red Hat’s OpenShift Cloud Service for your AI solutions.
Waiting for hardware can take weeks or months, which is why obtaining an OpenShift AI machine learning operations (MLOps) platform in minutes can be crucial for unlocking the full potential of generative AI for organizations on the move.
Scalable OpenShift Container Platform (OCP) clusters support containerized workloads and virtual machines, and with OpenShift AI running on OCP, they become an MLOps platform for generative and predictive AI model development, deployment, and management that can span cloud providers: a true multi-cloud solution encompassing cloud providers, on-premises environments, or a mix of the two.
We’ll cover the elements that matter to provide you with an inclusive real world experience that you can use to solve business problems.
Why a course on OpenShift AI in the Cloud
OpenShift and OpenShift AI fill the gap between data, compute, AI model lifecycles, and applications across cloud, on-premises, and managed environments.
Cloud providers are powering more and more open-source AI solutions and projects as SaaS offerings, some at zero cost to increase infrastructure usage. Red Hat must provide value-added services to help make Red Hat products, subscriptions, and services more compelling and valuable than competitor applications and infrastructure platforms.
This course focuses on sharing knowledge of Red Hat OpenShift cloud services, along with cloud provider SaaS solutions in the AI space, so you understand what actions should be taken to invent, create, optimize, or deploy AI solutions using Red Hat OpenShift AI on Amazon Web Services (AWS) and Microsoft Azure (Azure).
OpenShift AI is also supported as a managed cloud service on IBM Cloud and Google Cloud Platform; however, consumption of OpenShift on those providers is limited, so this course focuses solely on the AWS and Azure hyperscalers. Based on your feedback, we will consider splitting this course into two distinct learning journeys, one per cloud provider, if learning two clouds at once proves too mentally intensive.
So what’s included?
Looking for specific knowledge?
Visit these sections for information on:
- Section 1: Multi-Cloud OpenShift
  - Geared toward all audiences who want to know about OpenShift AI in cloud environments. We review the options for OpenShift Container Platform cluster offerings on cloud providers, including both managed offerings and a special-case self-managed offering.
  - Focuses on the decisions involved in instantiating an OpenShift Container Platform cluster using Red Hat's fully managed cloud service offering.
- Sections 2 and 3: Hyperscaler service offerings, AWS and Azure
  - Focus on AWS and Azure AI service offerings and how they compare to self-managed OpenShift AI.
- Sections 4 and 5: Hands-on labs, AWS and Azure
  - Labs that provide hands-on experience with a mix of cloud provider solutions and OpenShift / OpenShift AI services to solve tasks based on AI assistant business use cases.
  - There is one lab for AWS and another for Azure; choose the best option for your needs.
Duration
We recommend budgeting three hours if you plan to complete both labs.
Overall, the experience takes about one hour of consuming course materials; however, there is also roughly one hour of provisioning time for each lab environment to become available.
Objectives
Upon completing this course, you should be able to:
- Create an open-source multi-cloud playbook for OpenShift AI
- Understand the AWS options for OpenShift Container Platform (ROSA) cluster deployments supporting OpenShift AI
- Understand the Azure options for OpenShift Container Platform clusters supporting OpenShift AI
- Understand how OpenShift cloud services are requested and provisioned
- Analyze cloud providers' AI platform features
By understanding the AI features, solutions, and services offered by the cloud hyperscalers, you'll better understand the solutions customers could be interested in, and consider how those same solutions can be provided via RHOAI.
- Overview of AWS AI services
  - SageMaker | Bedrock | AWS Trainium | AWS Inferentia | EC2 UltraClusters
  - Amazon Q | AWS App Studio | PartyRock on Amazon Bedrock
- Overview of Azure services and their integration with OpenShift AI
  - Azure OpenAI Service | Azure AI Search | Azure AI Content Safety
  - Azure AI Vision | Phi small language models
Two lab environments: Azure and AWS
- Azure: interact with the Azure OpenAI Service using a front-end app hosted on OpenShift
  - Shows how easy it is to run a model service on a cloud provider
  - Exposes the cost per hour
  - Exposes the time to value
- AWS: starting from an OpenShift cluster, use the GUI to install OpenShift AI, deploy a RAG environment, populate the database with documentation exports, and deploy an AI model runtime
Appendix: additional learning aids for reference
- GitOps deployments of AI services and their integration with OpenShift AI
- A new lab environment created using code files and CLI commands, showing how automation would work
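As a taste of the CLI-driven automation the appendix explores, a minimal sketch of requesting managed clusters from the command line might look like the following. The cluster name, region, and resource group are placeholder values, and the actual provisioning commands are shown as comments because they require the `rosa` and `az` CLIs plus valid cloud credentials and (for Azure) a pre-created virtual network:

```shell
# Illustrative placeholders -- substitute your own values.
CLUSTER_NAME="rhoai-demo"
AWS_REGION="us-east-1"
AZ_RESOURCE_GROUP="rhoai-rg"

# AWS: request a managed ROSA cluster with STS credentials (requires the 'rosa' CLI):
#   rosa create cluster --cluster-name "$CLUSTER_NAME" --region "$AWS_REGION" --sts --mode auto --yes

# Azure: request an Azure Red Hat OpenShift (ARO) cluster (requires the 'az' CLI):
#   az aro create --resource-group "$AZ_RESOURCE_GROUP" --name "$CLUSTER_NAME" \
#     --vnet aro-vnet --master-subnet master-subnet --worker-subnet worker-subnet

echo "Would request cluster ${CLUSTER_NAME} (ROSA region: ${AWS_REGION}, ARO resource group: ${AZ_RESOURCE_GROUP})"
```

Either command kicks off the roughly one-hour provisioning window mentioned in the Duration section, which is why the labs have you start provisioning before consuming the course materials.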
Prerequisites
This course assumes that you have the following prior experience:
- Experience navigating cloud provider dashboards (consoles) and service offerings
- Understanding of cloud provider marketplace and solution purchasing, accounts, billing, and subscriptions
- Knowledge of Kubernetes, the underlying technology of OpenShift and OpenShift AI
- The OpenShift Cloud Services Opportunity course in the Red Hat LMS