Supervised Learning: Classification PDF - CS 770 Notes
Wichita State University

Summary
These are lecture notes for CS 770 Machine Learning covering supervised learning and classification. The notes primarily discuss neural networks, activation functions, and training methods: what neural networks are, how they work, and common activation functions such as ReLU and Softmax.

Full Transcript
Welcome… Supervised Learning: Classification, CS 770 Machine Learning

Contents
Neural networks and activation functions
Training a neural network

What is a neural network?
A neural network is a method in artificial intelligence that teaches computers to process data in a way that is inspired by the human brain.

Types of neural network
Neural networks can be classified into several types depending on the requirements and the inputs, and each type can be further sub-classified.

How does a neural network work?
The working of a neural network can be broken down into the following steps for better understanding.
Step 1: Consider an image as the input.
Step 2: The image is split into pixels, and each pixel is fed as an input to a neuron in the input layer.
Step 3: Each neuron is connected to the neurons of the adjacent layer through links known as channels, each of which carries a weight.
Step 4: The weights are multiplied with the inputs and added to a bias.
Step 5: The value obtained is passed through a threshold function called an activation function. The result of the activation function decides whether the neuron will be activated.
Step 6: Activated neurons transmit data to the next layer's neurons over the channels. This process is called forward propagation.
Step 7: In the output layer, the neuron with the highest value fires and determines the output.
Step 8: If the result is wrong, the error is propagated backwards through the network and the weights are adjusted. This process is called back propagation, and repeating it is how the model is trained.

What are hidden layers?
Hidden layers are the layers of neurons between the input layer and the output layer, where the intermediate computations take place.

Neural network examples and applications
Face recognition, music composition, fingerprint recognition, speech recognition.
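The forward-propagation steps above (weighted sum of inputs, plus a bias, passed through an activation function) can be sketched as a minimal single-neuron example. This is an illustrative sketch, not code from the notes: the pixel values, weights, and bias below are made-up numbers, and a sigmoid is used as the example activation function.

```python
import math

def forward_neuron(inputs, weights, bias):
    """One forward-propagation step for a single neuron:
    Steps 4-5 above (weighted sum + bias, then activation)."""
    # Step 4: multiply each input by its channel's weight and add the bias
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Step 5: pass the value through an activation function (sigmoid here)
    return 1.0 / (1.0 + math.exp(-z))

# Example: three pixel intensities feeding one neuron (made-up numbers)
pixels = [0.2, 0.8, 0.5]
weights = [0.4, -0.6, 0.9]
output = forward_neuron(pixels, weights, bias=0.1)
print(round(output, 3))  # → 0.537
```

In a full network this computation is repeated for every neuron in a layer, and the layer's outputs become the next layer's inputs (Step 6, forward propagation).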
Activation functions

What is an activation function?
An activation function applies a non-linear transformation and decides whether a neuron should be activated or not. Let's see what this means.

Why do we need an activation function?
Without activation functions, the model is just a linear regression model that is not able to learn complex patterns. By introducing activation functions, the network learns and understands complex patterns and works more efficiently.

Types of activation function
Step function, Sigmoid function, TanH function, ReLU function, Leaky ReLU function, Softmax function.

Step function
If the input is greater than the threshold, the neuron is activated. The formula for the activation function is
f(x) = 1 if x >= threshold, else 0
This function is very simple and is rarely used in practice.

Sigmoid function
Outputs a probability between 0 and 1. The formula is
sigmoid(x) = 1 / (1 + e^(-x))
If the input is a negative number, sigmoid outputs a number close to 0; if the input is a positive number, sigmoid outputs a number close to 1. It is used in hidden layers and, most often, in the last layer for binary classification problems.

TanH function
A common choice for hidden layers. The formula for the activation function is
tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
Outputs values between -1 and 1.

ReLU function
The most common choice in hidden layers. The formula for the activation function is
ReLU(x) = max(0, x)
It looks simple, but it improves the learning of neural networks.

Leaky ReLU function
This function is used when we encounter the dying ReLU problem: a neuron can reach a dead state where no more updates take place in terms of its weights. It works the same as ReLU for positive numbers; for negative numbers, it applies a small scaling factor instead of outputting zero:
LeakyReLU(x) = x if x > 0, else a * x (for a small constant a)

Softmax function
Squashes a vector of raw input numbers into output numbers so that you get probabilities. The formula for the function is
softmax(x_i) = e^(x_i) / Σ_j e^(x_j)
The higher the raw input, the higher the probability. It is used in the last layer for multiclass classification problems.
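The activation functions above can be sketched as plain Python functions. This is a minimal illustration, assuming a threshold of 0 for the step function and a leaky-ReLU slope of 0.01 (a conventional default, not specified in the notes):

```python
import math

def step(x):
    """Step: fires (outputs 1) only when the input exceeds the threshold 0."""
    return 1.0 if x > 0 else 0.0

def sigmoid(x):
    """Sigmoid: squashes any input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """TanH: like sigmoid, but outputs in (-1, 1)."""
    return math.tanh(x)

def relu(x):
    """ReLU: passes positive inputs through, zeroes out negatives."""
    return max(0.0, x)

def leaky_relu(x, a=0.01):
    """Leaky ReLU: small slope `a` for negatives avoids dead neurons."""
    return x if x > 0 else a * x

def softmax(xs):
    """Softmax: turns a list of raw scores into probabilities summing to 1."""
    m = max(xs)  # subtract the max before exponentiating, for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.0), leaky_relu(-2.0))  # ReLU kills the negative; leaky ReLU scales it
print(softmax([2.0, 1.0, 0.1]))      # higher raw score -> higher probability
```

Note how softmax differs from the others: it operates on the whole output vector at once, which is why it suits the last layer of a multiclass classifier.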