Feature Selection For Machine Learning

Posted By: ELK1nG

Feature Selection For Machine Learning
Last updated 6/2022
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 2.39 GB | Duration: 5h 48m

Learn filter, wrapper, and embedded methods, recursive feature elimination, exhaustive search, feature shuffling & more.

What you'll learn
Learn about filter, embedded and wrapper methods for feature selection
Find out about hybrid methods for feature selection
Select features with Lasso and decision trees
Implement different methods of feature selection with Python
Learn why less is more when it comes to features
Reduce the feature space in a dataset
Build simpler, faster and more reliable machine learning models
Analyse and understand the selected features
Discover feature selection techniques used in data science competitions
Requirements
A Python installation
Jupyter notebook installation
Python coding skills
Some experience with NumPy and Pandas
Familiarity with Machine Learning algorithms
Familiarity with scikit-learn
Description
Welcome to Feature Selection for Machine Learning, the most comprehensive course on feature selection available online. In this course, you will learn how to select the variables in your data set and build simpler, faster, more reliable and more interpretable machine learning models.

Who is this course for?

You've taken your first steps into data science. You know the most commonly used machine learning models, and you have probably built a few linear regression or decision tree based models. You are familiar with data pre-processing techniques like removing missing data, transforming variables, and encoding categorical variables. At this stage you have probably realized that many data sets contain an enormous number of features: some of them are identical or very similar, some are not predictive at all, and for others it is harder to say.

You wonder how to go about finding the most predictive features. Which ones are OK to keep, and which ones could you do without? You also wonder how to code the methods in a professional manner. You probably did an online search and found that there is not much out there about feature selection. So you start to wonder: how are things really done in tech companies?

This course will help you! This is the most comprehensive online course on variable selection. You will learn a huge variety of feature selection procedures used worldwide in different organizations and in data science competitions to select the most predictive features.

What will you learn?

I have put together a fantastic collection of feature selection techniques, based on scientific articles, data science competitions and, of course, my own experience as a data scientist. Specifically, you will learn:

How to remove features with low variance
How to identify redundant features
How to select features based on statistical tests
How to select features based on changes in model performance
How to find predictive features based on importance attributed by models
How to code procedures elegantly and in a professional manner
How to leverage the power of existing Python libraries for feature selection

Throughout the course, you are going to learn multiple techniques for each of these tasks, and you will learn to implement them in an elegant, efficient, and professional manner, using Python, scikit-learn, pandas and MLXtend.

By the end of the course, you will have a variety of tools to select and compare different feature subsets and identify the ones that return the simplest, yet most predictive, machine learning model. This will allow you to minimize the time needed to put your predictive models into production.

This comprehensive feature selection course includes about 70 lectures spanning ~8 hours of video, and ALL topics include hands-on Python code examples which you can use for reference, for practice, and to re-use in your own projects.

In addition, I update the course regularly to keep up with new releases of the Python libraries and to include new techniques as they appear.

So what are you waiting for? Enroll today, embrace the power of feature selection, and build simpler, faster and more reliable machine learning models.
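
For a taste of the material, here is a minimal sketch of the first technique listed above: removing low-variance features with scikit-learn's VarianceThreshold. The toy DataFrame is an illustrative stand-in, not the course's own data.

```python
import pandas as pd
from sklearn.feature_selection import VarianceThreshold

# Toy DataFrame standing in for your own training data.
X_train = pd.DataFrame({
    "constant": [1, 1, 1, 1],
    "useful": [0.1, 0.9, 0.4, 0.7],
})

# threshold=0.0 removes features that take the same value in every row.
selector = VarianceThreshold(threshold=0.0)
selector.fit(X_train)

# get_support() flags the columns that survived the filter.
selected = X_train.columns[selector.get_support()]
print(list(selected))  # ['useful']
```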

Overview

Section 1: Introduction

Lecture 1 Course Curriculum Overview

Lecture 2 Course requirements

Lecture 3 Course Aim

Lecture 4 Optional: How to approach this course

Lecture 5 Course Material

Lecture 6 The code | Jupyter notebooks

Lecture 7 Presentations covered in this course

Lecture 8 Download the data sets

Lecture 9 FAQ: Data Science and Python programming

Section 2: Feature Selection

Lecture 10 What is feature selection?

Lecture 11 Feature selection methods | Overview

Lecture 12 Filter Methods

Lecture 13 Wrapper methods

Lecture 14 Embedded Methods

Lecture 15 Moving Forward

Lecture 16 Open-source packages for feature selection

Section 3: Filter Methods | Basics

Lecture 17 Constant, quasi-constant, and duplicated features – Intro

Lecture 18 Constant features

Lecture 19 Quasi-constant features

Lecture 20 Duplicated features

Lecture 21 Install Feature-engine

Lecture 22 Drop constant and quasi-constant with Feature-engine

Lecture 23 Drop duplicates with Feature-engine
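
For readers skimming the syllabus, here is a minimal sketch of what these Feature-engine selectors do; the toy data and the tol value are illustrative assumptions (in practice, tol is often set close to 1, e.g. 0.998).

```python
import pandas as pd
from feature_engine.selection import DropConstantFeatures, DropDuplicateFeatures

# Toy data: one constant, one quasi-constant, and one duplicated pair.
X_train = pd.DataFrame({
    "const": [1] * 10,
    "quasi": [0] * 9 + [1],
    "a": range(10),
    "a_copy": range(10),
})

# With tol < 1, quasi-constant features are also dropped: here, any feature
# where a single value accounts for 90% or more of the observations.
drop_const = DropConstantFeatures(tol=0.9)
X_train = drop_const.fit_transform(X_train)

# Drop features that are exact duplicates of another feature.
drop_dup = DropDuplicateFeatures()
X_train = drop_dup.fit_transform(X_train)

print(X_train.columns.tolist())  # one of 'a' / 'a_copy' remains
```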

Section 4: Filter methods | Correlation

Lecture 24 Correlation - Intro

Lecture 25 Correlation Feature Selection

Lecture 26 Correlation procedures to select features

Lecture 27 Correlation | Notebook demo

Lecture 28 Basic methods plus Correlation pipeline

Lecture 29 Correlation with Feature-engine

Lecture 30 Feature Selection Pipeline with Feature-engine

Lecture 31 Additional reading resources
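
A minimal sketch of correlation-based selection with Feature-engine's DropCorrelatedFeatures, assuming a recent Feature-engine release; the generated data and the 0.8 threshold are illustrative.

```python
import numpy as np
import pandas as pd
from feature_engine.selection import DropCorrelatedFeatures

# Toy data with two highly correlated columns.
rng = np.random.RandomState(0)
X_train = pd.DataFrame({"x1": rng.normal(size=100)})
X_train["x2"] = X_train["x1"] * 2 + rng.normal(scale=0.01, size=100)
X_train["x3"] = rng.normal(size=100)

# Drop all but one feature from every group whose pairwise
# correlation exceeds the threshold.
sel = DropCorrelatedFeatures(method="pearson", threshold=0.8)
X_train = sel.fit_transform(X_train)

print(X_train.columns.tolist())  # ['x1', 'x3']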

Section 5: Filter methods | Statistical measures

Lecture 32 Statistical methods – Intro

Lecture 33 Mutual information

Lecture 34 Mutual information demo

Lecture 35 Chi-square test

Lecture 36 Chi-square | Demo

Lecture 37 Chi-square considerations

Lecture 38 Chi2 - calculating the expected frequencies (Optional)

Lecture 39 ANOVA

Lecture 40 ANOVA | Demo

Lecture 41 Select features based on p-values

Lecture 42 Do you want to learn more about stats?

Lecture 43 Basic methods + Correlation + Filter with stats pipeline
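
A minimal sketch of statistical filter selection with scikit-learn's SelectKBest; the public breast cancer dataset and k=10 are illustrative choices, not the course's own material.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif

# Public demo dataset standing in for your own data.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Keep the 10 features with the strongest ANOVA F-statistic vs the target.
sel = SelectKBest(score_func=f_classif, k=10).fit(X, y)
print(X.columns[sel.get_support()].tolist())

# Mutual information works the same way; chi2 would too,
# but it requires non-negative features.
sel_mi = SelectKBest(score_func=mutual_info_classif, k=10).fit(X, y)
```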

Section 6: Filter Methods | Other methods and metrics

Lecture 44 Filter Methods with other metrics

Lecture 45 Univariate model performance metrics

Lecture 46 Univariate model performance metrics | Demo

Lecture 47 KDD 2009: Select features by target mean encoding

Lecture 48 KDD 2009: Select features by mean encoding | Demo

Lecture 49 Univariate model performance with Feature-engine

Lecture 50 Target Mean Encoding Selection with Feature-engine
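
A minimal sketch of univariate model performance selection: score each feature by the cross-validated ROC-AUC of a model trained on that feature alone. The dataset, tree depth and 0.6 cut-off are illustrative assumptions; Feature-engine also ships a SelectBySingleFeaturePerformance transformer for this.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Train a small model on each feature individually and record its ROC-AUC.
scores = {}
for feature in X.columns:
    clf = DecisionTreeClassifier(max_depth=2, random_state=0)
    scores[feature] = cross_val_score(
        clf, X[[feature]], y, cv=3, scoring="roc_auc"
    ).mean()

# Keep features whose single-feature model beats an (arbitrary) 0.6 AUC cut-off.
selected = [f for f, auc in scores.items() if auc > 0.6]
print(selected)
```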

Section 7: Wrapper methods

Lecture 51 Wrapper methods – Intro

Lecture 52 MLXtend

Lecture 53 Step forward feature selection

Lecture 54 SFS - MLXtend vs Sklearn

Lecture 55 Step forward feature selection | MLXtend

Lecture 56 Step forward feature selection | sklearn

Lecture 57 Step backward feature selection

Lecture 58 Step backward feature selection | MLXtend

Lecture 59 Step backward feature selection | Sklearn

Lecture 60 Exhaustive search

Lecture 61 Exhaustive search | Demo
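
A minimal sketch of step forward selection with MLXtend; the estimator, k_features=5 and scoring choices are illustrative. Scikit-learn's SequentialFeatureSelector offers an equivalent with direction='forward' or 'backward'.

```python
from mlxtend.feature_selection import SequentialFeatureSelector as SFS
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Step forward selection: add one feature at a time,
# keeping the subset with the best cross-validated score.
sfs = SFS(
    RandomForestClassifier(n_estimators=10, random_state=0),
    k_features=5,     # stop once 5 features are selected (illustrative)
    forward=True,     # forward=False gives step backward selection
    floating=False,
    scoring="roc_auc",
    cv=3,
)
sfs = sfs.fit(X, y)
print(list(sfs.k_feature_names_))
```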

Section 8: Embedded methods | Linear models

Lecture 62 Regression Coefficients – Intro

Lecture 63 Selection by Logistic Regression Coefficients

Lecture 64 Selection by Linear Regression Coefficients

Lecture 65 Coefficients change with penalty

Lecture 66 Basic methods + Correlation + Embedded method using coefficients
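
A minimal sketch of selection by regression coefficients, using scikit-learn's SelectFromModel on a logistic regression. Scaling first matters because coefficients are only comparable when features share a scale; the dataset and C value are illustrative.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Standardize so the coefficient magnitudes are comparable across features.
X_scaled = StandardScaler().fit_transform(X)

# By default, SelectFromModel keeps features whose absolute coefficient
# exceeds the mean absolute coefficient.
sel = SelectFromModel(LogisticRegression(C=1000, max_iter=1000))
sel.fit(X_scaled, y)
print(X.columns[sel.get_support()].tolist())
```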

Section 9: Embedded methods – Lasso regularisation

Lecture 67 Regularisation – Intro

Lecture 68 Lasso

Lecture 69 A note on SelectFromModel

Lecture 70 Basic filter methods + LASSO pipeline
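
A minimal sketch of Lasso-based selection: L1 regularisation shrinks some coefficients to exactly zero, and SelectFromModel discards those features. For a regression target you would use sklearn's Lasso estimator instead; the C value here is illustrative.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_scaled = StandardScaler().fit_transform(X)

# L1 (Lasso) penalty zeroes out some coefficients; SelectFromModel
# then drops the zero-coefficient features.
sel = SelectFromModel(
    LogisticRegression(penalty="l1", C=0.5, solver="liblinear", random_state=0)
)
sel.fit(X_scaled, y)
print(X.columns[sel.get_support()].tolist())
```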

Section 10: Embedded methods | Trees

Lecture 71 Feature Selection by Tree importance | Intro

Lecture 72 Feature Selection by Tree importance | Demo

Lecture 73 Feature Selection by Tree importance | Recursively

Lecture 74 Feature selection with decision trees | review
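
A minimal sketch of selection by tree importance with SelectFromModel, which by default keeps features whose impurity-based importance exceeds the mean importance; the dataset and forest size are illustrative.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Fit a forest and keep the features with above-average importance.
sel = SelectFromModel(RandomForestClassifier(n_estimators=100, random_state=0))
sel.fit(X, y)
print(X.columns[sel.get_support()].tolist())
```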

Section 11: Hybrid feature selection methods

Lecture 75 Introduction to hybrid methods

Lecture 76 Feature Shuffling - Intro

Lecture 77 Shuffling features | Demo

Lecture 78 Recursive feature elimination - Intro

Lecture 79 Recursive feature elimination | Demo

Lecture 80 Recursive feature addition - Intro

Lecture 81 Recursive feature addition | Demo

Lecture 82 Feature Shuffling with Feature-engine

Lecture 83 Recursive feature elimination with Feature-engine

Lecture 84 Recursive feature addition with Feature-engine
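
A minimal sketch of two of the hybrid methods above: recursive feature elimination with scikit-learn's RFE, and feature shuffling with Feature-engine's SelectByShuffling (assuming a recent Feature-engine release); all parameter values are illustrative.

```python
from feature_engine.selection import SelectByShuffling
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = RandomForestClassifier(n_estimators=10, random_state=0)

# Recursive feature elimination: repeatedly fit the model and drop the
# least important feature until only the requested number remains.
rfe = RFE(model, n_features_to_select=5)  # 5 is an arbitrary target
rfe.fit(X, y)
print(X.columns[rfe.support_].tolist())

# Feature shuffling: permute one feature at a time and drop the features
# whose shuffling barely degrades model performance.
shuffler = SelectByShuffling(model, scoring="roc_auc", cv=3, random_state=0)
X_reduced = shuffler.fit_transform(X, y)
print(X_reduced.columns.tolist())
```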

Section 12: Final section | Next steps

Lecture 85 Additional reading resources

Lecture 86 Congratulations

Lecture 87 Bonus lecture

Who this course is for:
Beginner data scientists who want to understand how to select variables for machine learning
Intermediate data scientists who want to level up their experience in feature selection for machine learning
Advanced data scientists who want to discover alternative methods for feature selection
Software engineers and academics switching careers or stepping into data science
Data analysts who want to level up their skills in data science