Leverage MLOps to scale AI/ML to the enterprise
Part of the For Cloud Professionals podcast series
AI/ML can help transform how companies glean insights into their business, but AI/ML models are often difficult to put into wide-scale production. Enter MLOps, a disciplined approach to scaling AI/ML to the enterprise.
Scaling the AI/ML process to the enterprise with MLOps
Artificial intelligence and machine learning (AI/ML) have spectacular potential to provide transformational insights. However, AI/ML models are often difficult to scale to the enterprise. In this episode, Mike Kavis and guest Sudi Bhattacharya of Deloitte discuss the emerging discipline of MLOps and how it’s helping organizations develop sound models and then scale them to enterprise production, thus closing the “train to production” gap for AI/ML. According to Sudi, MLOps takes a three-step approach: continuously training the models, developing a robust deployment framework (including the right team), and integrating the AI/ML models into business processes. One caveat: the approach to AI/ML should be problem first, technology later.
You have to think about building [models] for accuracy over time as well. So I think the right approach is problem first, technology–even if it’s AI/ML–later.
Sudi Bhattacharya is a managing director with Deloitte Consulting LLP in Deloitte’s Cloud Engineering practice. He leads big data and AI/ML-driven business process transformation projects in public cloud environments for Fortune 500 businesses in finance, retail, CPG, and foodservice.
Cloud machine learning presents an emerging opportunity for organizations to advance their AI strategies, using cloud ML to accelerate innovation, drive efficiencies, and meet customer demands at an unprecedented pace and scale. With a three-pronged promise of efficiencies in time, technology, and talent, organizations are already reaping the benefits.
What happens if the business problem we are trying to solve requires a large machine learning model, or an amount of training data that won’t fit into the memory of a single server? What if the computational needs are impossible for a single processor to support?
Put Cloud in context with the future of business and technology
Because cloud is never just about cloud, a podcast about cloud isn’t either. Our two hosts deliver two unique perspectives to help bring you closer to achieving what matters most—your possible.
For Cloud Professionals, hosted by David Linthicum, provides an enterprise-level, strategic look at key issues impacting clients’ businesses. David, ranked as the #1 cloud influencer in a recent Apollo Research report, has published 13 books on computing, written more than 5,000 articles, and delivered over 500 conference presentations, making his specialization in the power of cloud simply undeniable.
As a pioneer in cloud computing, Mike Kavis leads Architecting the Cloud, which offers insights from the POV of those who’ve had hands-on experience with cloud technology. Mike’s personal cloud journey includes leading the team that built the world's first high-speed transaction network in Amazon's public cloud—a project that ultimately won the 2010 AWS Global Startup Challenge.
With two leaders in your ear, you’ll have the content you need to drive the next conversation around cloud. Check out both talk tracks within the Deloitte On Cloud podcast to get the compelling stories on your schedule to help you understand the topics that are reshaping today’s market.
Contact us at firstname.lastname@example.org for information on this or any other On Cloud podcasts.
Or visit the On Cloud library for the full collection of episodes.