LLMs in Production: From language models to successful products

Posted By: GFX_MAN
English | 2025 | ISBN: 1633437205 | 456 pages | True EPUB | 15.31 MB

Learn how to put Large Language Model-based applications into production safely and efficiently.

This practical book offers clear, example-rich explanations of how LLMs work, how you can interact with them, and how to integrate LLMs into your own applications. Find out what makes LLMs so different from traditional software and ML, discover best practices for working with them outside the lab, and dodge common pitfalls with advice from experienced practitioners.

In LLMs in Production you will
Grasp the fundamentals of LLMs and the technology behind them
Evaluate when to use a premade LLM and when to build your own
Efficiently scale up an ML platform to handle the needs of LLMs
Train LLM foundation models and fine-tune an existing LLM
Deploy LLMs to the cloud and to edge devices using parameter-efficient techniques like PEFT and LoRA (see the fine-tuning sketch after this list)
Build applications leveraging the strengths of LLMs while mitigating their weaknesses
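
To make the fine-tuning and PEFT/LoRA bullets concrete, here is a minimal sketch using the Hugging Face transformers and peft libraries. The base model, target modules, and hyperparameters below are illustrative assumptions, not a recipe from the book.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = "facebook/opt-350m"  # small base model, chosen purely for illustration
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA attaches small low-rank adapter matrices to selected weights, so only
# a tiny fraction of parameters is trained while the base model stays frozen.
config = LoraConfig(
    r=8,                                  # rank of the adapter matrices
    lora_alpha=16,                        # adapter scaling factor
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt (model-specific)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # typically well under 1% of the base model's weights

The wrapped model can then be trained with an ordinary Trainer loop, and the small adapter weights can be shipped and swapped independently of the frozen base model.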

LLMs in Production delivers vital MLOps insights so you can guide an LLM-based application smoothly and seamlessly to production. Inside, you’ll find practical guidance on everything from acquiring an LLM-suitable training dataset to building a platform to compensating for these models’ immense size, plus tips and tricks for prompt engineering, retraining and load testing, handling costs, and ensuring security.

About the Technology
Most business software is developed and improved iteratively and can change significantly even after deployment. By contrast, because LLMs are expensive to create and difficult to modify, they require meticulous upfront planning, exacting data standards, and carefully executed technical implementation. Integrating LLMs into a production product affects every aspect of your operations plan, including the application lifecycle, data pipeline, compute cost, security, and more. Get it wrong, and you may have a costly failure on your hands.

About the Book
LLMs in Production teaches you how to develop an LLMOps plan that can take an AI app smoothly from design to delivery. You’ll learn techniques for preparing an LLM dataset, cost-efficient training hacks like LoRA and RLHF, and industry benchmarks for model evaluation. Along the way, you’ll put your new skills to use in three exciting example projects: creating and training a custom LLM, building a VSCode AI coding extension, and deploying a small model to a Raspberry Pi.
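
The book walks through its Raspberry Pi project in full; as a rough idea of what small-model edge inference can look like, here is a sketch using the llama-cpp-python package with a quantized GGUF model. The file path, model choice, and settings are placeholders, not the book's exact setup.

from llama_cpp import Llama

# Hypothetical local GGUF file; any small quantized chat model would do.
llm = Llama(
    model_path="./models/tinyllama-1.1b-chat.Q4_K_M.gguf",
    n_ctx=2048,    # context window
    n_threads=4,   # match the Pi's CPU core count
)

out = llm("Q: What does LoRA stand for?\nA:", max_tokens=64, stop=["\n"])
print(out["choices"][0]["text"])

Quantization is what makes this practical: a 4-bit model with roughly a billion parameters needs well under a gigabyte of RAM, which fits comfortably on recent Raspberry Pi boards.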

What's Inside
Balancing cost and performance
Retraining and load testing (see the load-testing sketch below)
Optimizing models for commodity hardware
Deploying on a Kubernetes cluster
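
As a taste of the load-testing topic, here is a rough sketch that fires concurrent requests at an OpenAI-compatible completions endpoint and reports latency percentiles. The URL, model name, and request counts are placeholders rather than anything prescribed by the book.

import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "http://localhost:8000/v1/completions"  # assumed endpoint; adjust to your deployment
PAYLOAD = {"model": "my-model", "prompt": "Hello", "max_tokens": 32}

def one_request(_):
    # Time a single completion request end to end.
    start = time.perf_counter()
    requests.post(URL, json=PAYLOAD, timeout=60)
    return time.perf_counter() - start

# Fire 64 requests through 8 concurrent workers and report latency percentiles.
with ThreadPoolExecutor(max_workers=8) as pool:
    latencies = sorted(pool.map(one_request, range(64)))

print(f"p50: {statistics.median(latencies):.2f}s")
print(f"p95: {latencies[int(0.95 * len(latencies))]:.2f}s")

Sweeping the worker count upward shows where throughput flattens and tail latency climbs, which is the kind of number you need when sizing hardware or a Kubernetes deployment.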