
# Machine Learning: A Probabilistic Perspective ## Chapter 1 Introduction ### 1.1 What is machine learning? Machine learning is about making computers that can learn from data. "Learning" means improving some measure of performance when executing some task. We want to design learning algorithms....

# Machine Learning: A Probabilistic Perspective

## Chapter 1 Introduction

### 1.1 What is machine learning?

Machine learning is about making computers that can learn from data. "Learning" means improving some measure of performance when executing some task. We want to design learning algorithms.

### 1.2 Types of machine learning

- Supervised learning
- Unsupervised learning
- Reinforcement learning

### 1.2.1 Supervised learning

Also known as predictive learning. The goal is to learn a mapping from inputs **x** to outputs **y**, given a labeled set of input-output pairs $D = \{(x_i, y_i)\}_{i=1}^N$. $D$ is called the training set, and $N$ is the number of training examples.

#### 1.2.1.1 Classification

If $y \in \{1, \dots, C\}$, where $C$ is the number of classes, the problem is known as classification.

#### 1.2.1.2 Regression

If $y \in \mathbb{R}$, the problem is known as regression.

### 1.2.2 Unsupervised learning

Also known as descriptive learning. We are given a data set $D = \{x_i\}_{i=1}^N$ without any labels. The goal is to find "interesting structure" in the data.

#### 1.2.2.1 Clustering

Discover groups of similar examples within the data.

#### 1.2.2.2 Dimensionality reduction

Find a low-dimensional representation of the data.

### 1.2.3 Reinforcement learning

An agent learns how to take actions in an environment so as to maximize some notion of cumulative reward.
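The supervised setup above — learning a mapping from inputs to labels given a training set $D = \{(x_i, y_i)\}_{i=1}^N$ — can be sketched in a few lines. This is a minimal, hypothetical nearest-centroid classifier (not an algorithm from the book), chosen only to make the "fit on $D$, then predict on new **x**" pattern concrete:

```python
# Minimal sketch of supervised classification: learn a mapping from
# inputs x to labels y from a training set D = {(x_i, y_i)}.
# Hypothetical nearest-centroid classifier, pure Python.

def fit(D):
    """Compute the mean (centroid) of the inputs for each class label."""
    sums, counts = {}, {}
    for x, y in D:
        sums[y] = [s + v for s, v in zip(sums.get(y, [0.0] * len(x)), x)]
        counts[y] = counts.get(y, 0) + 1
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def predict(centroids, x):
    """Assign x to the class whose centroid is closest in Euclidean distance."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda y: dist2(centroids[y]))

# Toy training set with N = 4 examples and C = 2 classes.
D = [([0.0, 0.1], 1), ([0.2, 0.0], 1), ([1.0, 0.9], 2), ([0.9, 1.1], 2)]
model = fit(D)
print(predict(model, [0.1, 0.0]))  # prints 1
print(predict(model, [1.0, 1.0]))  # prints 2
```

Here $y \in \{1, 2\}$, so this is classification in the sense of §1.2.1.1; replacing the discrete labels with real-valued targets and the centroid rule with, say, a fitted line would give regression instead.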
### 1.3 A brief history of machine learning

- 1950s: Artificial Intelligence (AI)
- 1960s: Neural Networks
- 1970s: Expert Systems
- 1980s: Machine Learning
- 1990s: Data Mining
- 2000s: Machine Learning
- 2010s: Deep Learning

### 1.4 Book outline

- Part I: Basics
  - Chapter 2: Probability
  - Chapter 3: Linear Algebra
  - Chapter 4: Optimization
- Part II: Supervised learning
  - Chapter 5: Linear Regression
  - Chapter 6: Logistic Regression
  - Chapter 7: Generalized Linear Models
  - Chapter 8: Model Selection and Regularization
  - Chapter 9: Neural Networks
  - Chapter 10: Support Vector Machines
  - Chapter 11: Graphical Models
- Part III: Unsupervised learning
  - Chapter 12: Mixture Models and EM
  - Chapter 13: Latent Linear Models
  - Chapter 14: Sparse Coding
  - Chapter 15: Independent Component Analysis
- Part IV: Advanced topics
  - Chapter 16: Approximate Inference
  - Chapter 17: Sampling
  - Chapter 18: Discrete Graphical Models
  - Chapter 19: Bayesian Nonparametrics
  - Chapter 20: Hidden Markov Models
  - Chapter 21: State Space Models
  - Chapter 22: Time Series
  - Chapter 23: Deep Learning Revisited
  - Chapter 24: Causality

### 1.5 Prerequisites

- Calculus
- Linear algebra
- Probability

### 1.6 Notation

- Scalars: $a, b, c, \dots$
- Vectors: $\mathbf{a}, \mathbf{b}, \mathbf{c}, \dots$
- Matrices: $\mathbf{A}, \mathbf{B}, \mathbf{C}, \dots$
- Sets: $\mathcal{A}, \mathcal{B}, \mathcal{C}, \dots$

### 1.7 Further reading

- Bishop, C. M. (2006). *Pattern recognition and machine learning*. Springer.
- Hastie, T., Tibshirani, R., & Friedman, J. (2009). *The elements of statistical learning*. Springer.
- Murphy, K. P. (2012). *Machine learning: A probabilistic perspective*. MIT Press.
