Learn Machine Learning Algorithms With Jax
Published 1/2024
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.99 GB | Duration: 4h 58m
Develop your data science skills
What you'll learn
Students will learn about Python's Jax library.
Students will learn how to code supervised classification machine learning algorithms in Jax.
Students will learn how to code supervised regression machine learning algorithms in Jax.
Students will learn how to code neural networks in Jax.
Requirements
Students should have a basic understanding of Python before taking this course.
Students should have taken my free Udemy courses, such as: Introduction to Python programming; Theoretical concepts of machine learning; and Practicalities involved in exploratory data analysis.
Description
Jax is a Python library developed by Google in 2018 and is set to overtake Google's other Python library, Tensorflow, for research purposes. There is significantly less code available in Jax than in Tensorflow, which is why I have decided to develop a course in Jax. Jax closely follows the numpy API, but there are a few differences that will be covered in the course.

The beginning of the course covers an introduction to Jax, discussing some of the code that appears in the 16 Jupyter Notebooks presented. Machine learning algorithms are then introduced across eight sections. The algorithms that will be introduced, with their code covered in depth, are:
1. Linear regression
2. Logistic regression
3. Naive Bayes
4. Decision tree
5. Random forest
6. K nearest neighbour
7. Support vector machine
8. Neural networks

So that the machine learning algorithms can be presented efficiently, each one is embedded in a complete machine learning project:
1. Import Jax and other Python libraries into the program
2. Load the appropriate dataset into the program from Google Colab, GitHub, or sklearn
3. Preprocess the data if necessary
4. Remove outliers if appropriate
5. Remove highly correlated features if appropriate
6. Standardise the data if needed
7. Define dependent and independent variables
8. Split the dataset into training, validating, and testing sets, whichever is appropriate
9. Define the Jax model
10. Compare the Jax model with its sklearn equivalent
11. Obtain predictions and test their accuracy or error, whichever is appropriate
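As a flavour of the style of code taught in the course, here is a minimal sketch of fitting a linear regression with Jax's `grad` and `jit`. The synthetic data and hyperparameters below are illustrative only; the lectures use real datasets such as mtcars and California house prices.

```python
import jax
import jax.numpy as jnp

# Illustrative synthetic data: y = 3*x + 1 (true slope 3, intercept 1).
x = jnp.linspace(0.0, 1.0, 50)
y = 3.0 * x + 1.0

def loss(params, x, y):
    # Mean squared error of a simple linear model y_hat = w*x + b.
    w, b = params
    pred = w * x + b
    return jnp.mean((pred - y) ** 2)

# jax.grad differentiates the loss w.r.t. its first argument;
# jax.jit compiles the gradient function for speed.
grad_fn = jax.jit(jax.grad(loss))

# Plain gradient descent; learning rate and step count are illustrative.
params = jnp.array([0.0, 0.0])
for _ in range(1000):
    params = params - 0.3 * grad_fn(params, x, y)

w, b = params
print(float(w), float(b))  # w ≈ 3.0, b ≈ 1.0
```

Note that `jax.numpy` mirrors the numpy API closely, which is why the loss function reads like ordinary numpy code; the course covers the differences (such as Jax arrays being immutable) where they matter.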
Overview
Section 1: Introduction
Lecture 1 Introduction
Lecture 2 Intro to Jax
Section 2: Linear regression
Lecture 3 Introduction to Linear Regression
Lecture 4 Simple Linear Regression (mtcars)
Lecture 5 Multiple linear regression (mtcars)
Lecture 6 Jax jit (California house prices)
Section 3: Logistic regression
Lecture 7 Introduction to Logistic Regression
Lecture 8 Logistic regression (binary classification - breast cancer)
Lecture 9 Multinomial logistic regression (softmax - iris)
Lecture 10 Calculate probabilities (Kaggle play 3.23)
Section 4: Naive Bayes
Lecture 11 Introduction to Naive Bayes
Lecture 12 Naive Bayes Classifier (wine)
Section 5: Decision tree
Lecture 13 Introduction to decision trees
Lecture 14 Decision tree classifier (wine)
Section 6: Random Forest
Lecture 15 Introduction to Random Forest
Lecture 16 Random forest classifier (wine)
Section 7: K Nearest Neighbour
Lecture 17 Introduction to KNN
Lecture 18 KNN classifier (titanic)
Section 8: Support Vector Machine
Lecture 19 Introduction to SVM
Lecture 20 SVM Classifier (titanic)
Section 9: Neural Networks
Lecture 21 Introduction to neural networks
Lecture 22 Perceptron (breast cancer)
Lecture 23 Regression neural network (Boston house prices)
Lecture 24 Binary classifier neural network (breast cancer)
Lecture 25 Multiclass neural network (seeds)
Lecture 26 Image classifier (pizza)
This course is for persons interested in expanding their knowledge of Python's Jax library, machine learning, and data science.