
    Master Statistics & Machine Learning: Intuition, Math, Code

    Posted By: ELK1nG
    Last updated 10/2022
    MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
    Language: English | Size: 13.05 GB | Duration: 38h 20m

    A rigorous and engaging deep dive into statistics and machine learning, with hands-on applications in Python and MATLAB.

    What you'll learn
    Descriptive statistics (mean, variance, etc)
    Inferential statistics
    T-tests, correlation, ANOVA, regression, clustering
    The math behind the "black box" statistical methods
    How to implement statistical methods in code
    How to interpret statistics correctly and avoid common misunderstandings
    Coding techniques in Python and MATLAB/Octave
    Machine learning methods like clustering, predictive analysis, classification, and data cleaning
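    As a taste of the coding side of the topics above, here is a minimal Python sketch (my own illustration with made-up numbers, not taken from the course materials) of two of them: descriptive statistics and z-score standardization.

    ```python
    import numpy as np

    # A small made-up sample
    data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

    mean = data.mean()          # arithmetic mean
    var = data.var(ddof=1)      # sample variance (n-1 denominator)
    std = np.sqrt(var)          # sample standard deviation

    # Z-score standardization: zero mean, unit standard deviation
    z = (data - mean) / std

    print(f"mean = {mean}, sample variance = {var:.4f}")
    print(f"z-scores have mean {z.mean():.1f} and std {z.std(ddof=1):.1f}")
    ```

    After standardization the z-scores always have mean 0 and (sample) standard deviation 1, which is what makes them comparable across variables with different units.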
    Requirements
    Good work ethic and motivation to learn.
    Previous background in statistics or machine learning is not necessary.
    Python -OR- MATLAB with the Statistics toolbox (or Octave).
    Some coding familiarity for the optional code exercises.
    No textbooks necessary! All materials are provided inside the course.
    Description
    Statistics and probability control your life. I don't just mean what YouTube's algorithm recommends you watch next, and I don't just mean the chance of meeting your future significant other in class or at a bar. Human behavior, single-cell organisms, earthquakes, the stock market, whether it will snow in the first week of December, and countless other phenomena are probabilistic and statistical. Even the most fundamental deep structure of the universe is governed by probability and statistics. You need to understand statistics.

    Nearly all areas of human civilization are incorporating code and numerical computations, which means that many jobs and areas of study are based on applications of statistical and machine-learning techniques in programming languages like Python and MATLAB. This is often called "data science," and it is an increasingly important topic. Statistics and machine learning are also fundamental to artificial intelligence (AI) and business intelligence.

    If you want to make yourself a future-proof employee, employer, data scientist, or researcher in any technical field (from data science to engineering to research to deep-learning modeling), you'll need to know statistics and machine learning. And you'll need to know how to implement concepts like probability theory and confidence intervals, k-means clustering and PCA, and Spearman correlation and logistic regression in languages like Python or MATLAB.

    There are six reasons why you should take this course:
    1. It covers everything you need to understand the fundamentals of statistics, machine learning, and data science, from bar plots to ANOVAs, regression to k-means, and t-tests to non-parametric permutation testing.
    2. After completing it, you will be able to understand a wide range of statistical and machine-learning analyses, even advanced methods that aren't taught here, because you will learn the foundations upon which advanced methods are built.
    3. It balances mathematical rigor with intuitive explanations.
    4. It includes hands-on explorations in code.
    5. Enrolling gives you access to the Q&A, in which I actively participate every day.
    6. I've been studying, developing, and teaching statistics for 20 years, and I'm, like, really great at math.

    What you need to know before taking this course:
    High-school level maths. This is an applications-oriented course, so I don't go into a lot of detail about proofs, derivations, or calculus.
    Basic coding skills in Python or MATLAB. This is necessary only if you want to follow along with the code; you can successfully complete this course without writing a single line of code! But participating in the coding exercises will help you learn the material. The MATLAB code relies on the Statistics and Machine Learning toolbox (you can use Octave if you don't have MATLAB or the toolbox). The Python code is written in Jupyter notebooks.

    I also recommend taking my free course, "Statistics literacy for non-statisticians". It's 90 minutes long and gives you a bird's-eye view of the main topics that this course covers in much more detail. That short course is not required for this one, but it complements it nicely, and you can get through the whole thing in about an hour if you watch it at 1.5x speed!

    You do not need any previous experience with statistics, machine learning, deep learning, or data science. That's why you're here!

    Is this course up to date?
    Yes, I maintain all of my courses regularly. I add new lectures to keep the course "alive," and I add new lectures (or sometimes re-film existing ones) to explain maths concepts better if students find a topic confusing or if I made a mistake in a lecture (rare, but it happens!). You can check the "Last updated" text at the top of this page to see when I last worked on improving this course.

    What if you have questions about the material?
    This course has a Q&A (question and answer) section where you can post questions about the course material (about the maths, statistics, coding, or machine-learning aspects). I try to answer all questions within a day. You can also see everyone else's questions and answers, which really improves how much you can learn, and you can contribute to ongoing discussions. You can even post your code for feedback or just to show off. I love it when students write better code than mine! (Ahem, that doesn't happen so often.)

    What should you do now?
    First of all, congrats on reading this far; it means you are seriously interested in learning statistics and machine learning. Watch the preview videos, check out the reviews, and, when you're ready, invest in your brain by learning from this course!
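    To make the "implement it in code" promise concrete, here is a minimal sketch (my own illustration on simulated data, not taken from the course materials) of one analysis the course covers: a one-sample t-test in Python with SciPy.

    ```python
    import numpy as np
    from scipy import stats

    # Simulate 50 measurements whose true mean is 0.5,
    # so the null hypothesis H0: mean == 0 is in fact false
    rng = np.random.default_rng(seed=1)
    sample = rng.normal(loc=0.5, scale=1.0, size=50)

    # One-sample t-test against the null hypothesis that the population mean is 0
    t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
    ```

    With 50 samples drawn from a distribution whose true mean is 0.5, the test will usually (though not always, because of sampling variability) reject the null at p < .05; re-running with loc=0.0 shows how p-values behave when the null hypothesis is true.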

    Overview

    Section 1: Introductions

    Lecture 1 [Important] Getting the most out of this course

    Lecture 2 About using MATLAB or Python

    Lecture 3 Statistics guessing game!

    Lecture 4 Using the Q&A forum

    Lecture 5 (optional) Entering time-stamped notes in the Udemy video player

    Section 2: Math prerequisites

    Lecture 6 Should you memorize statistical formulas?

    Lecture 7 Arithmetic and exponents

    Lecture 8 Scientific notation

    Lecture 9 Summation notation

    Lecture 10 Absolute value

    Lecture 11 Natural exponent and logarithm

    Lecture 12 The logistic function

    Lecture 13 Rank and tied-rank

    Section 3: IMPORTANT: Download course materials

    Lecture 14 Download materials for the entire course!

    Section 4: What are (is?) data?

    Lecture 15 Is "data" singular or plural?!?!!?!

    Lecture 16 Where do data come from and what do they mean?

    Lecture 17 Types of data: categorical, numerical, etc

    Lecture 18 Code: representing types of data on computers

    Lecture 19 Sample vs. population data

    Lecture 20 Samples, case reports, and anecdotes

    Lecture 21 The ethics of making up data

    Section 5: Visualizing data

    Lecture 22 Bar plots

    Lecture 23 Code: bar plots

    Lecture 24 Box-and-whisker plots

    Lecture 25 Code: box plots

    Lecture 26 "Unsupervised learning": Boxplots of normal and uniform noise

    Lecture 27 Histograms

    Lecture 28 Code: histograms

    Lecture 29 "Unsupervised learning": Histogram proportion

    Lecture 30 Pie charts

    Lecture 31 Code: pie charts

    Lecture 32 When to use lines instead of bars

    Lecture 33 Linear vs. logarithmic axis scaling

    Lecture 34 Code: line plots

    Lecture 35 "Unsupervised learning": log-scaled plots

    Section 6: Descriptive statistics

    Lecture 36 Descriptive vs. inferential statistics

    Lecture 37 Accuracy, precision, resolution

    Lecture 38 Data distributions

    Lecture 39 Code: data from different distributions

    Lecture 40 "Unsupervised learning": histograms of distributions

    Lecture 41 The beauty and simplicity of Normal

    Lecture 42 Measures of central tendency (mean)

    Lecture 43 Measures of central tendency (median, mode)

    Lecture 44 Code: computing central tendency

    Lecture 45 "Unsupervised learning": central tendencies with outliers

    Lecture 46 Measures of dispersion (variance, standard deviation)

    Lecture 47 Code: Computing dispersion

    Lecture 48 Interquartile range (IQR)

    Lecture 49 Code: IQR

    Lecture 50 QQ plots

    Lecture 51 Code: QQ plots

    Lecture 52 Statistical "moments"

    Lecture 53 Histograms part 2: Number of bins

    Lecture 54 Code: Histogram bins

    Lecture 55 Violin plots

    Lecture 56 Code: violin plots

    Lecture 57 "Unsupervised learning": asymmetric violin plots

    Lecture 58 Shannon entropy

    Lecture 59 Code: entropy

    Lecture 60 "Unsupervised learning": entropy and number of bins

    Section 7: Data normalizations and outliers

    Lecture 61 Garbage in, garbage out (GIGO)

    Lecture 62 Z-score standardization

    Lecture 63 Code: z-score

    Lecture 64 Min-max scaling

    Lecture 65 Code: min-max scaling

    Lecture 66 "Unsupervised learning": Invert the min-max scaling

    Lecture 67 What are outliers and why are they dangerous?

    Lecture 68 Removing outliers: z-score method

    Lecture 69 The modified z-score method

    Lecture 70 Code: z-score for outlier removal

    Lecture 71 "Unsupervised learning": z vs. modified-z

    Lecture 72 Multivariate outlier detection

    Lecture 73 Code: Euclidean distance for outlier removal

    Lecture 74 Removing outliers by data trimming

    Lecture 75 Code: Data trimming to remove outliers

    Lecture 76 Non-parametric solutions to outliers

    Lecture 77 Nonlinear data transformations

    Lecture 78 An outlier lecture on personal accountability

    Section 8: Probability theory

    Lecture 79 What is probability?

    Lecture 80 Probability vs. proportion

    Lecture 81 Computing probabilities

    Lecture 82 Code: compute probabilities

    Lecture 83 Probability and odds

    Lecture 84 "Unsupervised learning": probabilities of odds-space

    Lecture 85 Probability mass vs. density

    Lecture 86 Code: compute probability mass functions

    Lecture 87 Cumulative distribution functions

    Lecture 88 Code: cdfs and pdfs

    Lecture 89 "Unsupervised learning": cdf's for various distributions

    Lecture 90 Creating sample estimate distributions

    Lecture 91 Monte Carlo sampling

    Lecture 92 Sampling variability, noise, and other annoyances

    Lecture 93 Code: sampling variability

    Lecture 94 Expected value

    Lecture 95 Conditional probability

    Lecture 96 Code: conditional probabilities

    Lecture 97 Tree diagrams for conditional probabilities

    Lecture 98 The Law of Large Numbers

    Lecture 99 Code: Law of Large Numbers in action

    Lecture 100 The Central Limit Theorem

    Lecture 101 Code: the CLT in action

    Lecture 102 "Unsupervised learning": Averaging pairs of numbers

    Section 9: Hypothesis testing

    Lecture 103 IVs, DVs, models, and other stats lingo

    Lecture 104 What is an hypothesis and how do you specify one?

    Lecture 105 Sample distributions under null and alternative hypotheses

    Lecture 106 P-values: definition, tails, and misinterpretations

    Lecture 107 P-z combinations that you should memorize

    Lecture 108 Degrees of freedom

    Lecture 109 Type 1 and Type 2 errors

    Lecture 110 Parametric vs. non-parametric tests

    Lecture 111 Multiple comparisons and Bonferroni correction

    Lecture 112 Statistical vs. theoretical vs. clinical significance

    Lecture 113 Cross-validation

    Lecture 114 Statistical significance vs. classification accuracy

    Section 10: The t-test family

    Lecture 115 Purpose and interpretation of the t-test

    Lecture 116 One-sample t-test

    Lecture 117 Code: One-sample t-test

    Lecture 118 "Unsupervised learning": The role of variance

    Lecture 119 Two-samples t-test

    Lecture 120 Code: Two-samples t-test

    Lecture 121 "Unsupervised learning": Importance of N for t-test

    Lecture 122 Wilcoxon signed-rank (nonparametric t-test)

    Lecture 123 Code: Signed-rank test

    Lecture 124 Mann-Whitney U test (nonparametric t-test)

    Lecture 125 Code: Mann-Whitney U test

    Lecture 126 Permutation testing for t-test significance

    Lecture 127 Code: permutation testing

    Lecture 128 "Unsupervised learning": How many permutations?

    Section 11: Confidence intervals on parameters

    Lecture 129 What are confidence intervals and why do we need them?

    Lecture 130 Computing confidence intervals via formula

    Lecture 131 Code: compute confidence intervals by formula

    Lecture 132 Confidence intervals via bootstrapping (resampling)

    Lecture 133 Code: bootstrapping confidence intervals

    Lecture 134 "Unsupervised learning:" Confidence intervals for variance

    Lecture 135 Misconceptions about confidence intervals

    Section 12: Correlation

    Lecture 136 Motivation and description of correlation

    Lecture 137 Covariance and correlation: formulas

    Lecture 138 Code: correlation coefficient

    Lecture 139 Code: Simulate data with specified correlation

    Lecture 140 Correlation matrix

    Lecture 141 Code: correlation matrix

    Lecture 142 "Unsupervised learning": average correlation matrices

    Lecture 143 "Unsupervised learning": correlation to covariance matrix

    Lecture 144 Partial correlation

    Lecture 145 Code: partial correlation

    Lecture 146 The problem with Pearson

    Lecture 147 Nonparametric correlation: Spearman rank

    Lecture 148 Fisher-Z transformation for correlations

    Lecture 149 Code: Spearman correlation and Fisher-Z

    Lecture 150 "Unsupervised learning": Spearman correlation

    Lecture 151 "Unsupervised learning": confidence interval on correlation

    Lecture 152 Kendall's correlation for ordinal data

    Lecture 153 Code: Kendall correlation

    Lecture 154 "Unsupervised learning": Does Kendall vs. Pearson matter?

    Lecture 155 The subgroups correlation paradox

    Lecture 156 Cosine similarity

    Lecture 157 Code: Cosine similarity vs. Pearson correlation

    Section 13: Analysis of Variance (ANOVA)

    Lecture 158 ANOVA intro, part 1

    Lecture 159 ANOVA intro, part 2

    Lecture 160 Sum of squares

    Lecture 161 The F-test and the ANOVA table

    Lecture 162 The omnibus F-test and post-hoc comparisons

    Lecture 163 The two-way ANOVA

    Lecture 164 One-way ANOVA example

    Lecture 165 Code: One-way ANOVA (independent samples)

    Lecture 166 Code: One-way repeated-measures ANOVA

    Lecture 167 Two-way ANOVA example

    Lecture 168 Code: Two-way mixed ANOVA

    Section 14: Regression

    Lecture 169 Introduction to GLM / regression

    Lecture 170 Least-squares solution to the GLM

    Lecture 171 Evaluating regression models: R2 and F

    Lecture 172 Simple regression

    Lecture 173 Code: simple regression

    Lecture 174 "Unsupervised learning": Compute R2 and F

    Lecture 175 Multiple regression

    Lecture 176 Standardizing regression coefficients

    Lecture 177 Code: Multiple regression

    Lecture 178 Polynomial regression models

    Lecture 179 Code: polynomial modeling

    Lecture 180 "Unsupervised learning": Polynomial design matrix

    Lecture 181 Logistic regression

    Lecture 182 Code: Logistic regression

    Lecture 183 Under- and over-fitting

    Lecture 184 "Unsupervised learning": Overfit data

    Lecture 185 Comparing "nested" models

    Lecture 186 What to do about missing data

    Section 15: Statistical power and sample sizes

    Lecture 187 What is statistical power and why is it important?

    Lecture 188 Estimating statistical power and sample size

    Lecture 189 Compute power and sample size using G*Power

    Section 16: Clustering and dimension-reduction

    Lecture 190 K-means clustering

    Lecture 191 Code: k-means clustering

    Lecture 192 "Unsupervised learning:" K-means and normalization

    Lecture 193 "Unsupervised learning:" K-means on a Gauss blur

    Lecture 194 Clustering via dbscan

    Lecture 195 Code: dbscan

    Lecture 196 "Unsupervised learning": dbscan vs. k-means

    Lecture 197 K-nearest neighbor classification

    Lecture 198 Code: KNN

    Lecture 199 Principal components analysis (PCA)

    Lecture 200 Code: PCA

    Lecture 201 "Unsupervised learning:" K-means on PC data

    Lecture 202 Independent components analysis (ICA)

    Lecture 203 Code: ICA

    Section 17: Signal detection theory

    Lecture 204 The two perspectives of the world

    Lecture 205 d-prime

    Lecture 206 Code: d-prime

    Lecture 207 Response bias

    Lecture 208 Code: Response bias

    Lecture 209 F-score

    Lecture 210 Receiver operating characteristics (ROC)

    Lecture 211 Code: ROC curves

    Lecture 212 "Unsupervised learning": Make this plot look nicer!

    Section 18: A real-world data journey

    Lecture 213 Note about the code for this section

    Lecture 214 Introduction

    Lecture 215 MATLAB: Import and clean the marriage data

    Lecture 216 MATLAB: Import the divorce data

    Lecture 217 MATLAB: More data visualizations

    Lecture 218 MATLAB: Inferential statistics

    Lecture 219 Python: Import and clean the marriage data

    Lecture 220 Python: Import the divorce data

    Lecture 221 Python: Inferential statistics

    Lecture 222 Take-home messages

    Section 19: Bonus section

    Lecture 223 About deep learning

    Lecture 224 Bonus content

    Who this course is for:
    Students taking statistics or machine learning courses
    Professionals who need to learn statistics and machine learning
    Scientists who want to understand their data analyses
    Anyone who wants to see "under the hood" of machine learning
    Artificial intelligence (AI) students
    Business intelligence students