Supervised Learning Slides PDF

Document Details

Uploaded by TruthfulStrontium

Yonsei University

Tags

supervised learning, machine learning, linear regression, artificial intelligence

Summary

These slides provide an overview of supervised learning, covering concepts such as linear regression, classification, and related algorithms. They also touch on overfitting, regularization, and different types of supervised learning models.

Full Transcript

[Slide 1] Supervised Learning, Lecture 1: Supervised Learning Overview

[Slide 2] Image Classification (CIFAR10)

[Slide 3] Text Classification

[Slide 4] Next Word Prediction
- "The cat sat on the ___________": table (30%), chair (35%), mat (32%), …

[Slide 5] Translation

[Slide 6] Price Prediction

[Slide 7] Supervised Learning
- Data: x ∈ ℝ^d (usually a vector)
- Label: y
- Dataset: D = {(x_i, y_i)}, i = 1, …, N

[Slide 8] Image Classification (CIFAR10)

[Slide 9] Text Classification ("Positive")

[Slide 10] Next Word Prediction
- "The cat sat on the ___________" → mat

[Slide 11] Translation

[Slide 12] Price Prediction

[Slide 13] Before Machine Learning
- Rule-based algorithms (based on expertise and knowledge)
- What defines the image of a '0'? When do stock prices go up?
- Rules cannot solve complicated tasks
- Machine learning is data-based: provide data and let the algorithm decide → supervised learning

[Slide 14] Machine Learning
- Machine learning is "the field of study that gives computers the ability to learn without being explicitly programmed." (Arthur Lee Samuel; photo from Forbes.com)

[Slide 15] Supervised Learning
- Classification: discrete (finite) labels
- Regression: continuous (real-valued) labels

[Slide 16] Unsupervised Learning

[Slide 17] Supervised Learning: Setup
- True function: y = f*(x), where x ∈ X and y ∈ Y
- Want to find f ≈ f*
- Function class: H
- Goal: find an f in the function class H that approximates the true function well

[Slide 18] Approximate well?
- Similar function values for all inputs [infeasible]
- Similar function values on the given dataset

[Slide 19] Approximate well?
- Similarity of function values can be measured
- Pointwise loss ℓ(f(x_i), y_i): can be mean-squared error, cross-entropy, etc.
- Loss: L(f) = (1/N) Σ_i ℓ(f(x_i), y_i)

[Slide 20] Supervised Learning
- Given a dataset (with labels)
- Set a function class
- Set a loss function
- Find the f that minimizes the loss

[Slide 21] Linear regression example
- Heights vs. weights: the data has a positive correlation
- Function class (model): f(x) = ax + b

[Slide 22] Linear regression example
- Heights vs. weights: positive correlation
- Function class (model): f(x) = ax + b

[Slide 23] Linear regression example
- Loss function: mean-squared error (MSE), L(a, b) = (1/N) Σ_i (y_i − (a x_i + b))²

[Slide 24] Linear regression example
- Why not the absolute difference? Why not the perpendicular distance?
- Those are perfectly fine, but MSE is differentiable and has an analytic solution

[Slide 25] Solve Linear Regression
- Loss function, in simplified form

[Slide 26] Solve Linear Regression

[Slide 27] Summary: Supervised Learning and Linear Regression
- Dataset (with labels); set a function class and a loss function; minimize the loss
- Linear regression: f(x) = ax + b with MSE

[Slide 28] Thank you

[Slide 29] Supervised Learning, Lecture 2: Linear Regression

[Slide 30] Linear regression (revisited)
- Heights vs. weights
- Function class (model): f(x) = ax + b
- Loss function: MSE

[Slide 31] Linear regression (multidimensional)
- (Height, hand size) vs. weight
- Model class: f(x) = w_1 x_1 + w_2 x_2 + b
- Loss function: MSE

[Slide 32] Linear regression (multidimensional)
- The loss is again a quadratic function; set the gradient to 0

[Slide 33] Linear regression (multidimensional)

[Slide 34] Linear regression (multidimensional)
- Data: x ∈ ℝ^n
- Function class: f(x) = wᵀx + b
- Loss function: MSE

[Slide 35] Linear regression (multidimensional)
- Data, function class, and loss function as above

[Slide 36] Linear regression (multidimensional)
- Minimize the loss; set the gradient to zero
- This gives linear equations in n+1 variables, which can be solved directly

[Slide 37] Normal Equation
- w = (XᵀX)⁻¹ Xᵀ y

[Slide 38] Normal Equation

[Slide 39] What if we want to fit a quadratic equation?

[Slide 40] What if we want to fit a quadratic equation?
- Is this quadratic regression? → No

[Slide 41] We fit the parameter "a"
- Redefine the dataset (use x² as a feature)
- Model class: linear in the new features
- Loss function: the same MSE

[Slide 42] You can always add features
- For multidimensional input we can add more features
- More features, more expressive, but one should be careful (overfitting!)
- Features can be selected via inspection or based on expertise

[Slides 43-46] Overfitting: Bias-Variance Tradeoff (figure slides)

[Slide 47] More data is better (figure with an altered point)

[Slide 48] Underfitting vs. Overfitting
- Underfit (high bias), just right, overfit (high variance)

[Slide 49] How can we recognize overfitting?
- Data is often high-dimensional, hence hard to visualize
- Use a validation set: train set (red), validation set (blue)
- Train loss ~ 0, validation loss > train loss

[Slide 50] Train Loss ~ Validation Loss
- Generalization: train loss ~ validation loss

[Slide 51] Train-Validation-Test
- Train set: roughly 80% of the data; fit your model
- Validation set: the remaining 20%; validate your model
- Test set: another separate dataset; test your model
- The test set is often unknown at training time (think of a competition)
- Validation also influences the decisions you make

[Slide 52] Rules
- Based on prior knowledge (or theory)
- Occam's Razor: the explanation that requires the fewest assumptions is usually correct
- Inspect the data
- Low training error, and train error ~ validation error: good
- Train error …
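The closed-form solution for one-dimensional linear regression with MSE loss, as derived in the slides, can be sketched as follows. The height/weight numbers are made up for illustration and are not from the slides:

```python
import numpy as np

# Hypothetical heights (cm) vs. weights (kg); the numbers are invented
# for illustration, not taken from the lecture.
heights = np.array([150.0, 160.0, 170.0, 180.0, 190.0])
weights = np.array([50.0, 57.0, 65.0, 72.0, 80.0])

def fit_line(x, y):
    """Minimize MSE(a, b) = mean((y - (a*x + b))**2) in closed form.

    Setting dMSE/da = dMSE/db = 0 gives
        a = cov(x, y) / var(x),   b = mean(y) - a * mean(x).
    """
    dx, dy = x - x.mean(), y - y.mean()
    a = np.mean(dx * dy) / np.mean(dx ** 2)
    b = y.mean() - a * x.mean()
    return a, b

a, b = fit_line(heights, weights)
mse = np.mean((weights - (a * heights + b)) ** 2)
```

With this toy data the fit is a = 0.75, b = -62.7, matching what `np.polyfit(heights, weights, 1)` returns.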
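The multidimensional case solved via the normal equation, with a ones column appended so the bias is one of the n+1 parameters, might look like the sketch below. The (height, hand size) numbers are again assumptions for illustration:

```python
import numpy as np

# Hypothetical (height, hand size) -> weight data, assumed for illustration.
X = np.array([[150.0, 17.0],
              [160.0, 18.0],
              [170.0, 19.5],
              [180.0, 20.0],
              [190.0, 21.5]])
y = np.array([50.0, 57.0, 65.0, 72.0, 80.0])

# Append a column of ones so the bias b becomes one of the n+1 parameters.
X1 = np.hstack([X, np.ones((X.shape[0], 1))])

# Normal equation theta = (X^T X)^{-1} X^T y, solved as a linear system;
# np.linalg.solve is more stable than forming the inverse explicitly.
theta = np.linalg.solve(X1.T @ X1, X1.T @ y)
predictions = X1 @ theta
```

At the optimum the gradient XᵀX·θ − Xᵀy is zero, which is exactly the system of linear equations in n+1 variables mentioned in the lecture.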
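The "fit a quadratic by redefining the dataset" trick from Lecture 2 (still linear regression, because the model is linear in its parameters) can be sketched like this, with a noiseless toy quadratic as the assumed data:

```python
import numpy as np

# "Quadratic regression" is still linear regression: redefine each input x
# as the feature vector (x^2, x, 1) and fit the weights linearly.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = x ** 2 + 0.5 * x + 1.0   # noiseless toy quadratic, assumed for illustration

Phi = np.column_stack([x ** 2, x, np.ones_like(x)])  # the redefined dataset
coef = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)       # same normal equation
```

Because only the features changed, the machinery is unchanged; here the recovered coefficients are exactly (1, 0.5, 1).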
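The 80/20 train/validation split used in the slides to detect overfitting could look like the sketch below. The synthetic data and polynomial degrees are assumptions, not from the lecture; a flexible model drives the train loss down while typically leaving the validation loss larger:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a simple linear trend plus noise (assumed, not from
# the slides).
x = rng.uniform(-1.0, 1.0, size=30)
y = 2.0 * x + 0.3 * rng.normal(size=30)

# Roughly 80% train / 20% validation, as on the slide.
idx = rng.permutation(30)
train_idx, val_idx = idx[:24], idx[24:]

def poly_mse(degree):
    """Fit a degree-`degree` polynomial on the train split; return both MSEs."""
    c = np.polyfit(x[train_idx], y[train_idx], degree)
    err = lambda s: np.mean((np.polyval(c, x[s]) - y[s]) ** 2)
    return err(train_idx), err(val_idx)

train_1, val_1 = poly_mse(1)   # matches the data-generating trend
train_9, val_9 = poly_mse(9)   # overly flexible: also fits the train noise
```

Comparing `train_9` with `val_9` is the diagnostic from the slides: train loss near zero with a noticeably larger validation loss signals overfitting.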
