In this episode, we ask Red Hat about the platform requirements for AI applications in production. What makes AI applications special, and how does this change the infrastructure required to support them? The demand for flexibility, scalability, and distribution matches the capabilities of a hybrid cloud, which is emerging as the preferred model for AI infrastructure. Red Hat supports the container-centric hybrid cloud with OpenShift, and containers are also critical to AI workloads. Red Hat has customers in the healthcare, manufacturing, and financial services industries running ML workloads in production right now.
Episode Hosts and Guests
Abhinav Joshi, Senior Manager, Product Marketing, OpenShift Business Unit, Red Hat. Find Abhinav on Twitter at @Abhinav_Joshi.
Tushar Katarki, Senior Manager, Product Management, OpenShift Business Unit, Red Hat. Find Tushar on Twitter at @TKatarki.
Chris Grundemann, Gigaom Analyst and Managing Director at Grundemann Technology Solutions. Connect with Chris at ChrisGrundemann.com and on Twitter at @ChrisGrundemann.
Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen’s writing at GestaltIT.com and on Twitter at @SFoskett.