Hands-On MLOps on Azure: Automate, secure, and scale ML workflows with the Azure ML CLI, GitHub, and LLMOps
English | 2025 | ISBN: 1836200331 | 322 pages | True EPUB | 8.71 MB
A practical guide to building, deploying, automating, monitoring, and scaling ML and LLM solutions in production
Key Features
Build reproducible ML pipelines with Azure ML CLI and GitHub Actions
Automate ML workflows end to end, including deployment and monitoring
Apply LLMOps principles to deploy and manage generative AI responsibly across clouds
Book Description
Effective machine learning (ML) now demands not just building models but also deploying and managing them at scale. Written by a seasoned senior software engineer with deep expertise in both MLOps and LLMOps, Hands-On MLOps on Azure equips ML practitioners, DevOps engineers, and cloud professionals with the skills to automate, monitor, and scale ML systems across environments.
The book begins with MLOps fundamentals and their roots in DevOps, exploring training workflows, model versioning, and reproducibility using pipelines. You'll implement CI/CD with GitHub Actions and the Azure ML CLI, automate deployments, and manage governance and alerting for enterprise use. The author draws on their production ML experience to provide actionable guidance and real-world examples. A dedicated section on LLMOps covers operationalizing large language models (LLMs) such as GPT-4 using retrieval-augmented generation (RAG) patterns, evaluation techniques, and responsible AI practices. You'll also work with case studies across Azure, AWS, and GCP that offer practical context for multi-cloud operations.
Whether you're building pipelines, packaging models, or deploying LLMs, this guide delivers an end-to-end strategy for building robust, scalable systems. By the end of this book, you'll be ready to design, deploy, and maintain enterprise-grade ML solutions with confidence.
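To give a flavor of the kind of workflow described above, here is a minimal sketch of submitting a reproducible training job to Azure ML. It is illustrative only: it uses the azure-ai-ml Python SDK rather than the Azure ML CLI the book centers on, and the workspace, data asset, environment, and compute names are placeholders, not examples taken from the book.

from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient, Input, command

# Connect to an Azure ML workspace (all identifiers below are placeholders).
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Declare a reproducible training job: versioned code, pinned environment,
# named compute target, and a registered data asset as input.
job = command(
    code="./src",  # local folder containing train.py
    command="python train.py --data ${{inputs.training_data}}",
    inputs={"training_data": Input(type="uri_folder", path="azureml:training-data@latest")},
    environment="azureml:AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    compute="cpu-cluster",
    display_name="train-model",
    experiment_name="mlops-demo",
)

# Submit the job; in a CI/CD setup this call (or the equivalent
# `az ml job create` CLI command) would run from a GitHub Actions step.
returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)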
What you will learn
Understand the DevOps-to-MLOps transition
Build reproducible, reusable pipelines using the Azure ML CLI
Set up CI/CD for training and deployment workflows
Monitor ML applications and detect model/data drift
Capture and secure governance and lineage data
Operationalize LLMs using RAG and prompt flows
Apply MLOps across Azure, AWS, and GCP use cases
Who this book is for
This book is for DevOps engineers, cloud engineers, and SREs interested in or responsible for managing the lifecycle of machine learning models. Professionals who are already familiar with their ML workloads and want to improve their practices, as well as those who are new to MLOps and want to learn how to manage machine learning models effectively in production, will find this book beneficial. The book is also useful for technical decision-makers and project managers looking to understand the process and benefits of MLOps.