Serving an LLM using OpenShift AI

Welcome to this quick course on Serving an LLM using OpenShift AI.

This course was designed to guide you through the process of installing the OpenShift AI Platform using the OpenShift Container Platform Web Console UI. We get hands-on experience with each component needed to enable the RHOAI Platform on an OpenShift Container Platform cluster.

Once we have an operational OpenShift AI Platform, we will log in and begin configuring model serving runtimes, data science projects, and data connections; finally, we will use a Jupyter notebook to infer the answers to simple questions.

There will be some challenges along the way, each designed to teach us about a component or give us the knowledge needed to utilize OpenShift AI and host a Large Language Model.

If you’re ready, let’s get started!

The hands-on labs in this course were updated for and tested with RHOAI v2.14. Labs should work without changes across minor dot-release upgrades of the product. Please open an issue in this repository if you face any problems, or feel free to send me a note on Slack.

Authors

The PTL team acknowledges the valuable contributions of the following Red Hat associates:

  • Christopher Nuland

  • Vijay Chebolu

  • Noel O’Conner

  • Hunter Gerlach

  • Karlos Knox

Classroom Environment

We will use the Red Hat OpenShift Container Platform Cluster (AWS) catalog item in the Red Hat Demo Platform (RHDP) to run the hands-on exercises in this course.

If you are planning on starting this course now, go ahead and launch the lab environment. It takes roughly 45 minutes to provision, which is enough time to read through the course once to understand the concepts, then follow along hands-on on a second pass.
Figure 1. Animated walkthrough of the Demo Hub order selections.

When ordering this catalog item in RHDP:

  1. Select the Red Hat OpenShift Container Platform Cluster (AWS) catalog item.

  2. Select Order on the pop-up lab menu page.

  3. Select Practice/Enablement for the Activity field.

  4. Select Learning about the Product for the Purpose field.

  5. Leave the Salesforce ID field blank.

  6. Scroll to the bottom, read the usage cost section, then check the box to confirm acceptance of the terms and conditions.

  7. Click Order.

For Red Hat partners who do not have access to RHDP, provision an environment using the Red Hat Hybrid Cloud Console. Note that the labs will NOT work in the trial sandbox environment. You need to provision an OpenShift cluster on premises, or in one of the supported cloud environments, by following the product documentation for installing Red Hat OpenShift AI 2.14.

Prerequisites

  • Basic experience with Red Hat OpenShift Container Platform is recommended but is not mandatory.

  • We will encounter & modify code segments, deploy resources using YAML files, & modify launch configurations, but we will not have to write code from scratch.

Objectives

The overall objectives of this course include:

  • Utilize Red Hat OpenShift AI to serve & interact with an LLM.

  • Install Red Hat OpenShift AI operators & dependencies.

  • Add a custom model serving runtime.

  • Create a data science project, workbench & data connections.

  • Load an LLM into the Ollama runtime framework.

  • Import notebooks from Git repositories and interact with the LLM via Jupyter notebooks.

  • Experiment with the Mistral and Llama3 large language models.
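To give a feel for the last few objectives, here is a minimal sketch of how a Jupyter notebook might send a prompt to a model served by the Ollama runtime over its REST API. The base URL is a hypothetical placeholder, not the route you will actually get; in the labs you would substitute the inference endpoint shown for your deployed model, and the exact model tags used in the course may differ.

```python
# Sketch: querying an Ollama-served model over HTTP.
# The base URL below is a placeholder; use your model's real inference route.
import json
import urllib.request


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}


def ask(base_url: str, model: str, prompt: str) -> str:
    """POST a prompt to the Ollama runtime and return the generated text."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the completion in the "response" field.
        return json.load(resp)["response"]


# Example call (placeholder route; replace with your endpoint):
# answer = ask("https://ollama-myproject.apps.example.com", "mistral",
#              "What is OpenShift AI?")
```

The same pattern works for any model tag loaded into the runtime, which is how the course lets you compare answers from different models against the same endpoint.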