Generative AI and Open Source Models: Hands-On Practice with Hugging Face Models
.MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 1h 52m | 333 MB
Instructor: Harpreet Sahota
Large language models (LLMs) are becoming increasingly crucial across industries. This course with instructor Harpreet Sahota offers a deep dive into the inner workings of text generation with LLMs. Learn about the importance of tokenization, special tokens, and chat templates in text generation. Explore how to influence which token is selected next, and build a technical and intuitive understanding of generation parameters such as temperature, top-p, top-k, repetition penalty, length penalty, and the bad words list.
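To make those parameters concrete, here is a minimal sketch (not taken from the course) of how they are typically passed to the Hugging Face Transformers generate() method; the model name "gpt2", the prompt, and the blocked word are placeholder choices:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model; the course may use a different one.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Large language models are", return_tensors="pt")

# bad_words_ids expects a list of token-id sequences to block during generation.
bad_words_ids = tokenizer(["spoiler"], add_special_tokens=False).input_ids

# Sampling-based generation: temperature sharpens or flattens the distribution,
# top_p and top_k restrict the candidate pool, and repetition_penalty
# discourages the model from repeating itself.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.8,
    top_p=0.9,
    top_k=50,
    repetition_penalty=1.2,
    bad_words_ids=bad_words_ids,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))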
Discover how these parameters can be combined to form powerful decoding strategies, including greedy search, multinomial sampling, beam search, and contrastive search. Gain hands-on experience using the Hugging Face text generation API and get a sneak peek into interacting with the NVIDIA NIM API to explore larger models. By the end of this course, you'll have a solid foundation in controlling text generation with LLMs, enabling you to apply these skills in real-world scenarios.
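As an illustration only (not course material), the decoding strategies named above map onto generate() arguments roughly as follows, reusing the model, tokenizer, and inputs from the sketch above:

# Greedy search: always pick the highest-probability next token.
greedy = model.generate(**inputs, max_new_tokens=50, do_sample=False)

# Multinomial sampling: draw the next token at random from the distribution.
sampled = model.generate(**inputs, max_new_tokens=50, do_sample=True)

# Beam search: track several candidate sequences in parallel; length_penalty
# biases the final choice toward longer or shorter outputs.
beam = model.generate(**inputs, max_new_tokens=50, num_beams=4, length_penalty=1.0)

# Contrastive search: penalty_alpha trades off model confidence against
# similarity to tokens already generated, reducing degenerate repetition.
contrastive = model.generate(**inputs, max_new_tokens=50, penalty_alpha=0.6, top_k=4)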