Artificial Intelligence Science Program Lecture 10 PDF
Galala University
Summary
This lecture covers the fundamentals of artificial intelligence, focusing on neural networks and deep learning. It explains deep learning as a subfield of machine learning concerned with learning hierarchical representations from data. The document includes diagrams and examples of neural networks.
Full Transcript
Artificial Intelligence Science Program
Chapter 5: Neural Networks and Deep Learning

Deep Learning
Deep learning is a specific subfield of machine learning: learning representations from data that puts an emphasis on learning successive layers of increasingly meaningful representations. It is also called layered representations learning or hierarchical representations learning. The number of layers that contribute to a model of the data is called the depth of the model. Modern deep learning often involves tens or even hundreds of successive layers of representations.

[Slide figure: a raw image and its learned representations — classical machine learning vs. deep learning, which learns the representations itself.]

In deep learning, the layered representations are (almost always) learned via models called neural networks, in which layers are stacked on top of each other. The term neural network is a reference to neurobiology.

Traditional Neural Network
Artificial Neural Networks (ANNs) are part of supervised machine learning, where both the inputs and the corresponding outputs are present in the dataset. ANNs can be used to solve both regression and classification problems. Neural networks form the base of deep learning: they take input data, train themselves to recognize patterns found in the data, and then predict the output for a new set of similar data. A neural network can therefore be thought of as the functional unit of deep learning, mimicking the behavior of the human brain to solve complex data-driven problems.
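The idea of "successive layers of representations" described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the lecture: the layer sizes, the random weights, and the ReLU nonlinearity are all illustrative choices, and no training takes place here — it only shows how stacked layers transform an input step by step, with depth equal to the number of stacked transformations.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_layer(x, w, b):
    """One layer: a linear combination of the inputs followed by a
    nonlinearity (here ReLU). Stacking such layers gives a deep model."""
    return np.maximum(0.0, x @ w + b)

x = rng.normal(size=(1, 4))                    # raw input (e.g. 4 pixel values)
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)  # layer 1 parameters (random)
w2, b2 = rng.normal(size=(8, 3)), np.zeros(3)  # layer 2 parameters (random)

h1 = dense_layer(x, w1, b1)   # first representation of the input
h2 = dense_layer(h1, w2, b2)  # deeper, more abstract representation

print(h1.shape)  # (1, 8)
print(h2.shape)  # (1, 3)
```

In a real deep network the weights would be learned from data (as described in the backpropagation steps later in the chapter), so that each successive representation becomes increasingly meaningful for the task.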
Neural Network: Perceptron and Neural Nets

[Slide figure: from biological neuron to artificial neuron (perceptron) — dendrites, soma, axon, and synapses mapped onto inputs x1, x2 with weights w1, w2, a linear combiner, a hard limiter (threshold), and output Y.]

General Structure of ANN
Perceptron
A perceptron is a neural network without any hidden layer: it has only an input layer and an output layer.

Steps involved in the implementation of a neural network:
1. Feedforward: we have a set of input features and some random weights. Notice that in this case we take random weights, which we will then optimize using backpropagation.
2. Backpropagation: we calculate the error between the predicted output and the target output, and then use an algorithm (gradient descent) to update the weight values.

[Slides: flow chart for a simple NN, a perceptron example, the sigmoid activation function, and a worked example on the logical OR gate.]
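The two steps above — feedforward with random initial weights, then backpropagation via gradient descent — can be sketched for the logical OR gate mentioned at the end of the lecture. This is a hedged illustration, not the lecture's own code: the squared-error loss, the learning rate, and the epoch count are illustrative assumptions, and a sigmoid activation stands in for the hard limiter so the error is differentiable.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Logical OR gate: all four input pairs and their target outputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(42)
w = rng.normal(size=2)   # step 1: start from random weights
b = 0.0
lr = 1.0                 # learning rate (illustrative choice)

for _ in range(5000):
    # Feedforward: linear combination of inputs, then sigmoid activation.
    y_hat = sigmoid(X @ w + b)
    # Backpropagation: gradient of the squared error with respect to the
    # weights and bias, via the chain rule through the sigmoid.
    error = y_hat - y
    grad = error * y_hat * (1.0 - y_hat)
    w -= lr * (X.T @ grad)   # gradient descent update for the weights
    b -= lr * grad.sum()     # gradient descent update for the bias

# After training, rounding the sigmoid outputs should recover the OR table.
print(np.round(sigmoid(X @ w + b)))
```

Because OR is linearly separable, this single neuron (a perceptron with no hidden layer) is enough; a problem like XOR would require hidden layers.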