Deep Learning and Variants_Session 2_20240120.pdf
Document Details
Uploaded by PalatialRelativity
Related
- Deep Learning and Artificial Neural Networks for Spacecraft Dynamics, Navigation and Control PDF
- Artificial Intelligence - Neural Networks PDF
- Artificial Intelligence Lecture Notes PDF
- Lec 5. Artificial Intelligence PDF
- AI - Machine Learning & Deep Learning Lecture 08 PDF
- AI Notes - Introduction to Artificial Intelligence PDF
Full Transcript
Presents: Deep Learning & its variants (GGU DBA, Neural Networks)
Dr. Anand Jayaraman ([email protected]), Professor, upGrad; Chief Data Scientist, Agastya Data Solutions

Artificial neural model: Perceptron
[Diagram: a perceptron with weighted inputs and threshold θ]

MT cars data set: classification with a single logistic regression unit
Using the mtcars dataset, estimate the probability of a vehicle being fitted with a manual transmission if it has a 120 hp engine and weighs 2,800 lbs.

Example: Automatic or Manual Transmission
New data prediction (manual entry): there is a 64% probability of the car being fitted with a manual transmission.

Logistic regression fit (in mtcars, wt is recorded in thousands of pounds):
log(p / (1 - p)) = 18.8663 - 8.08035 * wt + 0.0363 * hp
(a worked computation appears in Sketch 1 after the transcript)

Neurons: sigmoid et al.

Limitations of the perceptron and how to overcome them

Will it work here? (the XOR truth table)

    x1  x2 | y
     0   0 | 0
     1   0 | 1
     0   1 | 1
     1   1 | 0

A line represents the function of a single perceptron: it outputs a positive value for the region above the line and a negative value for the region below it. No single line can separate the XOR classes, so a single perceptron will not work here.

How does the brain do non-linearity?
A single perceptron is not able to deal with non-linear data. What does your intuition, based on biological inspiration, tell you to do? Connect many neurons together. A network with one hidden layer can approximate any continuous function.

How to describe non-linearity
[Diagram: inputs x1, x2, x3 with weights w1, w2, w3 feeding a single neuron]
[Diagram: an input layer (x1, x2) connected by weights to hidden-layer neurons k1 and k2 (Layer 1), whose outputs l1 and l2 feed the next layer]

Each layer is a transformation into a new space

    x1  x2 | h1  h2 | y
     0   0 |  0   0 | 0
     0   1 |  0   1 | 1
     1   0 |  0   1 | 1
     1   1 |  1   1 | 0

[Plots: the four XOR points in the original (x1, x2) space and in the hidden (h1, h2) space, where they become linearly separable]
(Sketch 2 after the transcript implements this two-layer construction)

Remember the transformation matrix from 2D/3D geometry? All the connections into a layer represent a matrix W, and the output layer is then a simple regression/classification layer.

Non-linear problems
[Scatter plots: classes that are not linearly separable, with axes Attribute 1 and Attribute 2]

Artificial Neural Networks: feed-forward, densely connected neural networks.

Squashing/activation/threshold functions: sigmoid is good for classification; linear is used for regression.

Activation function in the output layer:
1) Linear: for regression problems
2) Sigmoid/tanh: for classification problems
For multi-class classification problems, use multiple output neurons with softmax as the activation function of the last layer (see Sketch 3 after the transcript).

Neural networks come in various shapes & sizes. Multi-layer perceptrons are networks made up of sigmoid or ReLU neurons (NOT perceptrons).

ANNs are characterized by parameters and hyperparameters.

Parameters:
- Weights & biases
- Learning / training: find the parameters that minimize the cost function

Hyperparameters:
- Network structure
- Activation functions
- Number of hidden layers
- Number of neurons in the hidden layer(s)
- Cost function

NN example 1: predict am (transmission type) from wt, hp, and cyl in the mtcars data; the parameters and hyperparameters are as listed above.

NN example 2: MNIST dataset, digit recognition. Each digit image is 28 x 28 (= 784) pixels. (From: https://youtu.be/aircAruvnKk)

Network structure + code (from: https://youtu.be/aircAruvnKk; a sketch of the structure appears in Sketch 4 after the transcript).
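
Sketch 1: the mtcars logistic-regression prediction. A minimal Python sketch of the fitted model from the slide. It assumes the mtcars convention that wt is measured in thousands of pounds, so a 2,800 lb car enters as wt = 2.8; the coefficients are taken directly from the slide.

    import math

    # Fitted model from the slide:
    # log(p / (1 - p)) = 18.8663 - 8.08035*wt + 0.0363*hp
    INTERCEPT, B_WT, B_HP = 18.8663, -8.08035, 0.0363

    def prob_manual(wt, hp):
        # Probability that the car has a manual transmission (am = 1).
        logit = INTERCEPT + B_WT * wt + B_HP * hp
        return 1.0 / (1.0 + math.exp(-logit))   # sigmoid of the logit

    # 120 hp engine, 2,800 lbs (wt = 2.8 in mtcars units)
    print(prob_manual(wt=2.8, hp=120))   # about 0.645, the slide's "64%"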
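
Sketch 2: solving XOR with one hidden layer. A sketch of the hidden-layer transformation shown in the slide's (h1, h2) table. The specific weights and thresholds below are illustrative choices, not taken from the slides; they realize h1 = AND(x1, x2) and h2 = OR(x1, x2) with step-function units, after which the output becomes linearly separable.

    def step(z):
        # Threshold (Heaviside) activation: fires 1 when z >= 0.
        return 1 if z >= 0 else 0

    def xor_net(x1, x2):
        # Hidden layer: transform (x1, x2) into the new space (h1, h2).
        h1 = step(x1 + x2 - 1.5)    # h1 = AND(x1, x2)
        h2 = step(x1 + x2 - 0.5)    # h2 = OR(x1, x2)
        # Output layer: a single linear separator in (h1, h2) space.
        return step(h2 - h1 - 0.5)  # y = h2 AND NOT h1 = XOR(x1, x2)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", xor_net(x1, x2))   # reproduces the XOR table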
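
Sketch 3: output-layer activations. A small sketch of the output activations named on the slides: linear for regression, sigmoid for classification, and softmax for multi-class problems. The example scores are invented for illustration.

    import numpy as np

    def sigmoid(z):
        # Squashes each score independently into (0, 1).
        return 1.0 / (1.0 + np.exp(-z))

    def softmax(z):
        # Exponentiates and normalizes so the outputs sum to 1,
        # i.e. a probability distribution over mutually exclusive classes.
        e = np.exp(z - np.max(z))   # subtract the max for numerical stability
        return e / e.sum()

    z = np.array([2.0, 1.0, 0.1])   # raw scores from three output neurons
    print(z)             # linear: use the scores as-is for regression
    print(sigmoid(z))    # per-neuron probabilities for classification
    print(softmax(z))    # multi-class probabilities; sums to 1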
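
Sketch 4: the MNIST network structure. A hedged sketch of the digit-recognition network, written with Keras since the slides do not name a framework. The structure (784 inputs, two hidden layers of 16 sigmoid neurons, 10 softmax outputs) follows the 3Blue1Brown video linked on the slides; the optimizer, loss, and training settings are illustrative assumptions.

    from tensorflow import keras

    model = keras.Sequential([
        keras.layers.Input(shape=(784,)),             # 28 x 28 = 784 pixels
        keras.layers.Dense(16, activation="sigmoid"), # hidden layer 1
        keras.layers.Dense(16, activation="sigmoid"), # hidden layer 2
        keras.layers.Dense(10, activation="softmax"), # one neuron per digit
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Train on MNIST (settings here are assumptions, not from the slides).
    (x_train, y_train), _ = keras.datasets.mnist.load_data()
    x_train = x_train.reshape(-1, 784) / 255.0   # flatten and scale to [0, 1]
    model.fit(x_train, y_train, epochs=5, batch_size=32)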