Week 13 COE305 Machine Learning PDF

Document Details

Uploaded by ToughSard1958

İstinye Üniversitesi

Femilda Josephin

Tags

machine learning, artificial neural networks, neural networks, computer science

Summary

This document provides an overview of Artificial Neural Networks (ANNs). It explains the basic concepts of ANNs, including perceptrons and their application in machine learning.

Full Transcript

WEEK 13 COE305 MACHINE LEARNING BY FEMILDA JOSEPHIN

Introduction to Artificial Neural Networks

What is an Artificial Neural Network (ANN)?
An ANN tries to mimic human behaviour. Human behaviour changes with the activation of neurons, and an ANN contains artificial neurons that mimic biological ones. Multiple artificial neurons are interlinked to form the artificial neural network.

The perceptron helps in understanding the concept of an ANN. It has existed in theoretical form since the 1950s. A perceptron takes multiple inputs and produces one output, e.g. deciding whether a person is allowed to take a loan or not. A perceptron is a single-layer neural network.

An ANN is based on the structure of the human brain. [Figure: a biological neural network compared with an artificial neural network.]

ML vs ANN

Definition: ML is a subset of AI focused on algorithms that learn from data; ANN is a subset of ML inspired by biological neural networks.
Scope: ML covers broad techniques including supervised, unsupervised, and reinforcement learning; ANN is specific to neural-network-based approaches.
Complexity: ML includes simple algorithms like regression and decision trees; ANNs are typically more complex, with multiple layers and parameters.
Data requirements: ML can work well with small to medium datasets; ANNs require large datasets to learn effectively.
Interpretability: ML is easier to interpret (e.g. linear regression, decision trees); ANNs are difficult to interpret due to their "black box" nature.
Applications: ML is used for predictive analytics, fraud detection, recommendation systems, and basic classification; ANNs are used for image recognition, speech recognition, autonomous vehicles, etc.
Training time: ML trains faster for simple models; ANNs take longer to train, especially deep networks.
Key algorithms: ML includes linear regression, logistic regression, decision trees, SVM, k-means, PCA, etc.; ANNs include feedforward NNs, convolutional NNs, recurrent NNs, etc.
Computational power: ML needs lower computational power for simpler models; ANNs need high computational power, especially for deep learning.

The Basic Architecture of Neural Networks
This is also called a single-layer neural network, or perceptron. Example input data:

Age  Salary  Tenure  Education   Country
25   $2500   1       Graduate    Spain
30   $3000   4       Undergrads  Denmark

Components of Neural Networks
Layers, weights, bias, and activation functions.

Components of neural networks - Layers
There are three layers: the input layer, the hidden layer, and the output layer. Neurons have to connect to each other to perform some action. Example dataset:

Age  Experience  Salary  Move out of company
25   1           $2000   N
30   3           $2500   Y
28   3           $2800   N
35   6           $3500   N

Components of neural networks - Weights
Weights are assigned because not all features are equally important, e.g. Age: 0.2, Salary: 0.8, Experience: 0.6. In the hidden layer, for each row in the dataset the values are multiplied by the corresponding weights and added together. For example: 25 * 0.2 + 1 * 0.6 + 2000 * 0.8 = 1605.6. In general:

y = Σ X_i w_i   (summed over i = 1..n)

Components of neural networks - Bias
The bias allows the activation function to be shifted to the left or right to better fit the data. Compare the straight line y = mx: adding 'c' gives y = mx + c, where c lets the line move up and down if needed to fit the data. With a bias b:

y = Σ X_i w_i + b

So, 25 * 0.2 + 1 * 0.6 + 2000 * 0.8 + 0.1 = 1605.7.

Components of neural networks - Activation Functions
The main activation functions are: threshold (binary step), sigmoid (logistic), ReLU (rectified linear unit), tanh (hyperbolic tangent), and leaky ReLU.

Threshold or Binary Step Activation Function
The output is set at one of two levels, depending on whether the total input is greater than or less than some threshold value.
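The weighted-sum-plus-bias computation described above can be sketched in a few lines of Python; the feature values, weights, and bias below are the illustrative numbers from the slides, not parameters of a trained model:

```python
# One neuron's pre-activation: y = sum(x_i * w_i) + b for a single input row.

def weighted_sum(features, weights, bias):
    """Multiply each feature by its weight, add them up, then add the bias."""
    return sum(x * w for x, w in zip(features, weights)) + bias

row = [25, 1, 2000]          # Age, Experience, Salary (one dataset row)
weights = [0.2, 0.6, 0.8]    # importance assigned to each feature
bias = 0.1                   # shifts the result up or down

y = weighted_sum(row, weights, bias)
print(round(y, 1))  # 1605.7, matching the slide's worked example
```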
The output will be 0 if x is less than 0 and 1 if x is greater than 0.

Sigmoid or Logistic Activation Function
The sigmoid function transforms a value into the range 0 to 1; generally 0.5 is set as the threshold:

S(y) = 1 / (1 + e^(-y))

A value less than the threshold is considered 0 and a value greater than the threshold is considered 1. An output of 1 means the neuron is activated; otherwise it is deactivated. For classification problems, the sigmoid activation function can be used in the output layer.

ReLU (Rectified Linear Unit)
ReLU computes max(y, 0), where y = Σ X_i w_i + b. If y is negative the output is 0 (e.g. max(-0.5, 0) = 0); if y is positive the output is y itself (e.g. max(0.5, 0) = 0.5).

Tanh or Hyperbolic Tangent Activation Function
Calculated as

y = (e^x - e^(-x)) / (e^x + e^(-x))

it transforms a value into the range -1 to +1: the more positive the input, the closer the output is to +1, and the more negative the input, the closer it is to -1.

Feed Forward Neural Network (FFNN)
In epoch 1, the inputs are passed forward through the network to produce an output. To reduce the loss, the weights should be adjusted with the help of an optimizer. The learning rate is a tuning parameter in an optimization algorithm that determines the step size at each iteration while moving toward a minimum of the loss function.
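The activation functions listed above can be written as short Python functions using only the standard library. This is a minimal sketch; the 0.01 slope for leaky ReLU is an assumed common default, since the slides name the function without giving its parameter:

```python
import math

def step(x):
    """Binary step: 0 if x < 0, else 1."""
    return 0 if x < 0 else 1

def sigmoid(y):
    """Logistic function: squashes y into the range (0, 1)."""
    return 1 / (1 + math.exp(-y))

def relu(y):
    """Rectified linear unit: max(y, 0)."""
    return max(y, 0)

def tanh(x):
    """Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x), range (-1, 1)."""
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def leaky_relu(y, slope=0.01):
    """Like ReLU, but passes a small fraction of negative inputs through."""
    return y if y > 0 else slope * y

print(sigmoid(0))             # 0.5 (exactly at the usual threshold)
print(relu(-0.5), relu(0.5))  # 0 0.5, as in the slide's max() examples
print(step(-1), step(1))      # 0 1
```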
Neural Network with Backpropagation
The learning rate is a small value such as 0.1, 0.01, and so on. With the weights updated by backpropagation, the forward pass is repeated in epoch 2.
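The loop described above — forward pass, loss, weight update scaled by the learning rate, next epoch — can be sketched for a single linear neuron. The dataset, the squared-error loss, and the 100-epoch count here are illustrative assumptions, not the network from the slides:

```python
# Gradient descent on one linear neuron y = w*x + b.
# Each epoch: forward pass, compute mean squared error (MSE),
# then move w and b a small step (the learning rate) downhill.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # inputs with target y = 2x
w, b = 0.0, 0.0
learning_rate = 0.01  # small step size, as on the slide (0.1, 0.01, ...)

def mse(w, b):
    return sum((w * x + b - t) ** 2 for x, t in data) / len(data)

losses = []
for epoch in range(100):
    # Gradients of the MSE with respect to w and b.
    grad_w = sum(2 * (w * x + b - t) * x for x, t in data) / len(data)
    grad_b = sum(2 * (w * x + b - t) for x, t in data) / len(data)
    w -= learning_rate * grad_w   # step toward the loss minimum
    b -= learning_rate * grad_b
    losses.append(mse(w, b))

print(losses[0] > losses[-1])  # True: the loss shrinks epoch by epoch
```

A larger learning rate takes bigger steps and can overshoot the minimum; a smaller one converges more slowly, which is why it is treated as a tuning parameter.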
