PCA (Principal Component Analysis) in Machine Learning

Document Details

Uploaded by WillingChimera

Tags

machine learning, principal components analysis, PCA, dimensionality reduction

Summary

This document provides an overview of Principal Component Analysis (PCA) in machine learning. It details the concept, steps involved in PCA, and its benefits. The document explains how PCA is used to reduce the dimensions of data while preserving important information.

Full Transcript

Machine Learning: Principal Components Analysis (ML course, lecture 10)

Principal Components Analysis (PCA)

Principal Component Analysis is abbreviated PCA. PCA falls under the unsupervised machine learning category. The main goal of PCA is to reduce the number of variables in a data set while retaining as much information as possible. In machine learning, PCA is mainly used for dimensionality reduction and for selecting important features; it converts correlated features into uncorrelated (independent) features.

Principal Components Analysis: Key Ideas

PCA describes the variance and covariance structure of the data through linear combinations of the original variables. The scattering of the rows (observations) can be analyzed with PCA to identify distribution-related properties. PCA is one of the techniques used to handle the curse of dimensionality in machine learning. Typically, having more data lets us build a more accurate prediction model, since there is more data to train on; however, working with high-dimensional data causes overfitting, and dimensionality reduction is used to address it. PCA helps locate the important characteristics and discover useful linear combinations of the original variables.

How Does PCA Work?

The steps involved in PCA are as follows (a NumPy sketch of these steps follows the transcript):

- Start from the original data.
- Normalize the original data (mean = 0, variance = 1).
- Calculate the covariance matrix.
- Calculate the eigenvalues, eigenvectors, and normalized eigenvectors.
- Calculate the principal components (PCs).
- Plot the graph to check the orthogonality between the PCs.

Example: Covariance of Two Dimensions

Now consider two dimensions, X = Temperature and Y = Humidity, with the sample values shown on the slide:

X (Temperature): 40, 40, 40, 30
Y (Humidity): 90, 90, 90, 90

Covariance measures how X and Y vary together:

- cov(X, Y) = 0: X and Y are independent.
- cov(X, Y) > 0: X and Y move in the same direction.
- cov(X, Y) < 0: X and Y move in opposite directions.
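To make the covariance rules above concrete, here is a minimal NumPy sketch (not part of the original lecture) that computes the sample covariance of the Temperature/Humidity values listed on the slide; because Humidity is constant at 90 in this sample, cov(X, Y) comes out as 0.

```python
import numpy as np

# Sample values from the slide: X = Temperature, Y = Humidity
temperature = np.array([40, 40, 40, 30], dtype=float)
humidity = np.array([90, 90, 90, 90], dtype=float)

# np.cov returns the 2x2 covariance matrix; the off-diagonal entry is cov(X, Y)
cov_matrix = np.cov(temperature, humidity)
cov_xy = cov_matrix[0, 1]

print("Covariance matrix:\n", cov_matrix)
print("cov(X, Y) =", cov_xy)  # 0.0 here, since humidity never varies

# Sign interpretation, following the slide:
#   cov(X, Y) = 0 -> no linear relationship (independent, in the slide's wording)
#   cov(X, Y) > 0 -> X and Y tend to move in the same direction
#   cov(X, Y) < 0 -> X and Y tend to move in opposite directions
```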
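The step list in "How Does PCA Work?" can also be sketched in code. The following is a minimal NumPy illustration of those steps, not the lecture's reference implementation; the function name pca, the random example data, and the number of components k are assumptions made for the example.

```python
import numpy as np

def pca(X, k):
    """Minimal PCA following the slide's steps: normalize, covariance,
    eigendecomposition, then project onto the top-k principal components."""
    X = np.asarray(X, dtype=float)

    # 1. Normalize the original data (mean = 0, variance = 1)
    X_std = (X - X.mean(axis=0)) / X.std(axis=0)

    # 2. Covariance matrix of the standardized features
    cov = np.cov(X_std, rowvar=False)

    # 3. Eigenvalues and eigenvectors (covariance matrix is symmetric -> eigh)
    eigvals, eigvecs = np.linalg.eigh(cov)

    # 4. Sort eigenvectors by decreasing eigenvalue and keep the top k
    order = np.argsort(eigvals)[::-1]
    components = eigvecs[:, order[:k]]

    # 5. Principal components (PCs): project the data onto the chosen directions
    scores = X_std @ components
    return scores, components, eigvals[order]

# Hypothetical example data: 100 samples, 5 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
scores, components, eigvals = pca(X, k=2)

# The PCs are orthogonal: their covariance matrix is (approximately) diagonal
print(np.round(np.cov(scores, rowvar=False), 6))
```

The final print corresponds to the last step in the list: instead of a plot, it checks orthogonality numerically by showing that the covariance matrix of the resulting PCs is essentially diagonal.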
