Red Hat AI and AWS Together
Overview
Together with AWS’s cloud offering, Red Hat helps organizations create consistent, security-focused, and trusted solutions that simplify the complexities of the latest advancements in AI and accelerate the return on investment (ROI). Whether customers are starting their AI journey using Amazon SageMaker, or are already using Red Hat AI tools such as Red Hat Enterprise Linux AI and Red Hat OpenShift AI, they gain the tight integration of MLOps with DevOps platforms to reduce the friction and time needed to launch business-ready applications.
Red Hat AI provides a consistent, security-focused, and trusted solution that simplifies the development and delivery of containerized and AI-enabled applications. When combined with the flexibility and scalability of the AWS cloud, Red Hat AI helps enterprise organizations adopt AI innovation with a consistent environment across on-premises and cloud deployments, with the ability to scale when they choose.
Benefits
Red Hat’s open hybrid cloud approach allows AWS customers to take advantage of their existing hybrid IT while setting themselves up for future advances in AI. Customers can deploy, run, and manage AI workloads and AI-enabled applications consistently across any environment.
This flexibility allows enterprises to use their preferred tools and platforms while relying on Red Hat’s robust infrastructure for deploying AI-driven applications. Red Hat’s approach to AI draws on its commitment to open source and includes Red Hat OpenShift AI, which offers a more composable environment, supporting any GPU, datacenter, or cloud, including AWS.
As a native AWS service, Red Hat OpenShift Service on AWS (ROSA) provides flexible consumption-based billing, integrated cluster creation, and joint support and operation from Red Hat and AWS.
SageMaker integration
Red Hat OpenShift Service on AWS is designed to integrate with Amazon SageMaker and other tools, allowing for a streamlined workflow from AI model training to application deployment. This interoperability helps enterprises train models using preferred data science tools and deploy those models across various environments, whether on-premises, in the cloud, or in hybrid settings.
SageMaker works with Red Hat OpenShift Service on AWS (ROSA) to combine the benefits of the AWS managed service for model training, lifecycle management, and deployment with ROSA’s application lifecycle and integrated developer experience.
Models can be trained and developed in SageMaker, delivered to Red Hat OpenShift Service on AWS via AWS CodeDeploy, integrated into cloud-native applications built on the platform, and deployed anywhere Red Hat OpenShift can run.
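The hand-off described above can be sketched in a few lines. This is a minimal illustration, not a prescribed implementation: it builds a request body for SageMaker’s CreateTrainingJob API (the shape accepted by boto3’s sagemaker client) and a KServe InferenceService manifest pointing at the resulting model artifact; KServe is the serving layer underlying OpenShift AI’s single-model serving. The bucket, job, role, and image names are hypothetical placeholders.

```python
# Sketch: hand off a SageMaker-trained model for serving on OpenShift.
# All account IDs, bucket names, and image URIs below are placeholders.

def sagemaker_training_job(job_name: str, bucket: str) -> dict:
    """Build a request body for SageMaker's CreateTrainingJob API,
    as passed to boto3's sagemaker.create_training_job(**request)."""
    return {
        "TrainingJobName": job_name,
        "AlgorithmSpecification": {
            # Hypothetical training container image in ECR.
            "TrainingImage": "123456789012.dkr.ecr.us-east-1.amazonaws.com/train:latest",
            "TrainingInputMode": "File",
        },
        "RoleArn": "arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
        "OutputDataConfig": {"S3OutputPath": f"s3://{bucket}/models/"},
        "ResourceConfig": {
            "InstanceType": "ml.m5.xlarge",
            "InstanceCount": 1,
            "VolumeSizeInGB": 50,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }

def kserve_inference_service(name: str, model_uri: str) -> dict:
    """Build a KServe InferenceService manifest referencing the trained
    model artifact; applied to the cluster with oc/kubectl."""
    return {
        "apiVersion": "serving.kserve.io/v1beta1",
        "kind": "InferenceService",
        "metadata": {"name": name},
        "spec": {
            "predictor": {
                "model": {
                    "modelFormat": {"name": "sklearn"},  # assumed model format
                    "storageUri": model_uri,
                }
            }
        },
    }

job = sagemaker_training_job("churn-train-001", "example-ml-bucket")
svc = kserve_inference_service(
    "churn-model",
    job["OutputDataConfig"]["S3OutputPath"] + "churn-train-001/output/model.tar.gz",
)
```

In practice the training job would be submitted with boto3 and the manifest serialized to YAML and applied to the ROSA cluster, with a pipeline such as AWS CodeDeploy automating the promotion step.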
SageMaker appeals to data scientists who are fully invested in the AWS ecosystem for AI but want the ability to manage the deployment of models into applications anywhere Red Hat OpenShift can run.
AWS with OpenShift AI
AWS provides robust protections and certifications to safeguard business and customer data, while Red Hat’s security capabilities and Operator lifecycle management simplify the deployment and maintenance of models within intelligent applications.
Red Hat OpenShift AI is built on open source technologies and designed as a composable AI platform for developing models. It can train, tune, and publish models into MLOps workflows, and it runs on Red Hat OpenShift Service on AWS (ROSA) or on Red Hat OpenShift in other hybrid deployment models.
Red Hat OpenShift AI is fully integrated with the Red Hat OpenShift Service on AWS application platform, so MLOps and DevOps workflows can integrate and interact to accelerate the benefits of AI while providing a predictable support experience.
Red Hat OpenShift AI appeals to application teams, data engineers, and data scientists wanting more control over what components they use.