Document Details

Uploaded by ReputableDalmatianJasper

Abhijit Boruah

Tags

logistic regression, machine learning, neural networks, mathematics

Summary

This document is a presentation on logistic regression, covering fundamental concepts, algorithms, and implementation details. It explains notations, cost functions, gradient descent, and backpropagation, providing a comprehensive overview of this machine learning technique.

Full Transcript

LOGISTIC REGRESSION: BASICS OF NEURAL NETWORK - ABHIJIT BORUAH

NOTATIONS

LOGISTIC REGRESSION

IMPORTANCE OF BIAS AND WEIGHTS

SIGMOID FUNCTION

LOGISTIC REGRESSION COST FUNCTION

LOSS VS. COST FUNCTION
◼ The loss function computes the error for a single training example; the cost function is the average of the loss functions over the entire training set.

GRADIENT DESCENT
◼ Gradient descent is an optimization algorithm used for minimizing the cost function in various machine learning algorithms.
◼ Want to find w, b to minimize J(w, b): the values of w and b that correspond to the minimum of the cost function J.
◼ J(w, b) is a convex function.

GD ALGORITHM

GRADIENT DESCENT
◼ (Figure: gradient descent steps on the convex cost function J, plotted over the parameters w and b.)

GRADIENT DESCENT
◼ Hence, the actual updates used to minimize the cost function J are
  w := w - α ∂J(w, b)/∂w
  b := b - α ∂J(w, b)/∂b
  where α is the learning rate.
◼ A convex function always has multiple local optima. T or F?

BACKPROPAGATION ALGORITHM
◼ In learning algorithms, we need the gradient of the cost function with respect to the parameters.
◼ Backpropagation allows information from the cost to flow backwards through the network in order to compute the gradient.
◼ The gradient descent algorithm is then used to perform learning using this gradient.
◼ To describe the back-propagation algorithm more precisely, it is helpful to have a more precise computational-graph language.

COMPUTATIONAL GRAPHS
◼ A computational graph is a way to represent a mathematical function in the language of graph theory, where
◼ nodes are either input values or functions for combining values;
◼ edges receive their weights as the data flows through the graph;
◼ outbound edges from an input node are weighted with that input value;
◼ outbound edges from a function node are weighted by combining the weights of the inbound edges using the specified function.
◼ Example expression: f(x, y, z) = (x + y) * z.

COMPUTATIONAL GRAPHS
◼ Let's say we have to compute J(a, b, c) = 3(a + bc).
◼ Let u = bc and v = a + u. Then J = 3v.
◼ (Figure: computational graph with inputs a, b, c; node u = bc, node v = a + u, output J = 3v.)
◼ Computational graphs are handy when we have an output variable that has to be optimized (like the cost function J in logistic regression).

COMPUTING DERIVATIVES WITH COMPUTATIONAL GRAPHS
◼ (Figure: the same graph evaluated forward with a = 5, b = 3, c = 2, giving u = 6, v = 11, J = 33, and then traversed backwards to propagate the derivatives, i.e. backpropagation.)

COMPUTING DERIVATIVES TO IMPLEMENT GD
◼ (Figure: computational graph for a single logistic regression example, with inputs x1, w1, x2, w2 and b feeding z = w1x1 + w2x2 + b, the prediction a = σ(z), and the loss L(a, y).)

GD ON M EXAMPLES
◼ Since the cost is the average of the per-example losses, each gradient is the average of the per-example gradients dw1^(i), i.e. dw1 = (1/m) Σ_i dw1^(i).

ALGORITHM FOR LOGISTIC REGRESSION ON M EXAMPLES
◼ For n = 2 features.
◼ For more features, we will need another loop to calculate the dw's.

VECTORIZATION
◼ z = wᵀx + b
◼ Vectorization in Python: z = np.dot(w.T, x) + b
◼ This takes far less time than an explicit for loop.

CODE CHECK

import numpy as np
import time

a = np.random.rand(1000000)
b = np.random.rand(1000000)

tic = time.time()
c = np.dot(a, b)
toc = time.time()
print(c)
print("Vectorized version: " + str(1000 * (toc - tic)) + "ms")

c = 0
tic = time.time()
for i in range(1000000):
    c += a[i] * b[i]
toc = time.time()
print(c)
print("For loop: " + str(1000 * (toc - tic)) + "ms")

VECTORIZING LOGISTIC REGRESSION
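The equations on the SIGMOID FUNCTION, LOGISTIC REGRESSION COST FUNCTION, and VECTORIZING LOGISTIC REGRESSION slides are not reproduced in the transcript. The following is a minimal NumPy sketch of the standard formulation the deck builds on, ŷ = σ(wᵀx + b) with the cross-entropy cost; the data layout (X with one example per column, Y as a (1, m) row of 0/1 labels) and the function names are illustrative assumptions rather than the slides' own code.

import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward_and_cost(w, b, X, Y):
    # X: (n_x, m) matrix, one training example per column (assumed layout)
    # Y: (1, m) labels in {0, 1}; w: (n_x, 1); b: scalar
    m = X.shape[1]
    Z = np.dot(w.T, X) + b          # z = w^T x + b for all m examples at once
    A = sigmoid(Z)                  # predictions y-hat
    # Cross-entropy cost: the average of the per-example losses
    J = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    return A, J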
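A companion sketch for the gradient computation and the update rule from the GRADIENT DESCENT and GD ON M EXAMPLES slides, assuming the same column-per-example layout as above. The per-example derivative dz = a - y for the cross-entropy loss with a sigmoid output is a standard result; the function names and the default learning rate are illustrative.

import numpy as np

def backward(A, X, Y):
    # Gradients of the cross-entropy cost, averaged over the m examples
    m = X.shape[1]
    dZ = A - Y                      # dL/dz for every example, shape (1, m)
    dw = np.dot(X, dZ.T) / m        # (n_x, 1): replaces the per-feature loop
    db = np.sum(dZ) / m             # scalar
    return dw, db

def gradient_descent_step(w, b, dw, db, alpha=0.01):
    # One update: move w and b a small step against the gradient of J
    w = w - alpha * dw
    b = b - alpha * db
    return w, b

Chaining forward_and_cost, backward, and gradient_descent_step inside a loop over iterations gives the training procedure the deck builds toward, with no explicit loop over the m examples.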
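For comparison, the ALGORITHM FOR LOGISTIC REGRESSION ON M EXAMPLES slide describes the explicit loop for n = 2 features, but its pseudocode is not shown in the transcript. The sketch below is one common way to write that loop; the names (w1, w2, x1, x2) follow the earlier figure, and everything else is an assumption.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_pass_two_features(w1, w2, b, x1, x2, y):
    # x1, x2: length-m arrays (the two features); y: length-m labels in {0, 1}
    m = len(y)
    J = dw1 = dw2 = db = 0.0
    for i in range(m):
        z = w1 * x1[i] + w2 * x2[i] + b
        a = sigmoid(z)
        J += -(y[i] * np.log(a) + (1 - y[i]) * np.log(1 - a))
        dz = a - y[i]            # derivative of the loss w.r.t. z for example i
        dw1 += x1[i] * dz        # accumulate the per-example gradients dw1^(i)
        dw2 += x2[i] * dz
        db += dz
    # Cost and gradients are averages over the m examples
    return J / m, dw1 / m, dw2 / m, db / m

With more than two features, dw1, dw2, ... become an array and the accumulation turns into a second inner loop, which is exactly the overhead the VECTORIZATION slides remove.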
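To make the COMPUTING DERIVATIVES WITH COMPUTATIONAL GRAPHS example concrete, here is the forward and backward pass through the toy graph J(a, b, c) = 3(a + bc), using the values a = 5, b = 3, c = 2 from the slide's worked figure; the derivative variable names are illustrative.

# Forward and backward pass through J(a, b, c) = 3 * (a + b*c)
a, b, c = 5.0, 3.0, 2.0

# Forward pass, left to right through the graph
u = b * c          # u = 6
v = a + u          # v = 11
J = 3 * v          # J = 33

# Backward pass, right to left (chain rule)
dJ_dv = 3.0              # J = 3v     ->  dJ/dv = 3
dJ_du = dJ_dv * 1.0      # v = a + u  ->  dv/du = 1
dJ_da = dJ_dv * 1.0      #                dv/da = 1
dJ_db = dJ_du * c        # u = b * c  ->  du/db = c
dJ_dc = dJ_du * b        #                du/dc = b

print(J, dJ_da, dJ_db, dJ_dc)   # 33.0 3.0 6.0 9.0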
