OpenShift AI Configuration
This chapter assumes a running, configured OpenShift AI environment. If you don't already have your environment running, head back to Chapter 2.
There's a lot to cover in this section: we add the Ollama custom runtime, set up MinIO storage, create a workbench, and finally serve the Ollama framework, using the Single Model Serving Platform to deliver our model to our notebook application.
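To give a feel for the first of those steps, a custom runtime in OpenShift AI is typically defined as a KServe ServingRuntime resource. The sketch below is illustrative only, assuming the `ollama/ollama` container image and Ollama's default port 11434; the exact manifest used in this chapter is covered in the sections that follow.

```yaml
# Hypothetical sketch of a ServingRuntime for Ollama (names and image are assumptions)
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: ollama
spec:
  containers:
    - name: kserve-container
      image: ollama/ollama:latest   # assumed image; pin a tag in real use
      ports:
        - containerPort: 11434      # Ollama's default API port
          protocol: TCP
  supportedModelFormats:
    - name: ollama
      autoSelect: true
```

A ServingRuntime like this is what the Single Model Serving Platform uses under the hood to launch the model server container.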
Let’s get started!