Questions and Answers
Which of the following best describes the primary function of artificial neural networks (ANNs)?
- To strictly adhere to pre-defined rules without adapting to new information.
- To perfectly replicate the structure and function of the human brain.
- To perform computational tasks at a slower pace than traditional systems.
- To execute various computational tasks faster by emulating biological neural networks. (correct)
What is the role of weights associated with the connection links between neurons in an ANN?
- To store programs that dictate the behavior of the entire network.
- To delay or halt the transmission of signals between neurons.
- To modulate the input signal, influencing whether the signal excites or inhibits the receiving neuron. (correct)
- To uniformly amplify all signals passing through the connection.
In the context of biological neurons, which component is responsible for receiving information from other neurons?
- Dendrites (correct)
- Axon
- Synapse
- Soma
Which of the following is a key distinction between Artificial Neural Networks (ANNs) and Biological Neural Networks (BNNs) regarding processing speed?
What type of learning requires structured and formatted data to ensure precise outcomes and tolerance of ambiguity?
In the general model of an Artificial Neural Network (ANN), how is the net input calculated?
What are the three primary building blocks that the processing of Artificial Neural Networks (ANN) depends on?
What is the key characteristic of a feedforward network in the context of neural networks?
In neural networks, what distinguishes a multilayer feedforward network from a single-layer feedforward network?
What is a primary characteristic of a feedback network in neural networks?
If an ANN is undergoing supervised learning, what condition triggers the adjustment of weights in the network?
How does unsupervised learning differ from supervised learning in ANNs?
What is a key characteristic of reinforcement learning in the context of ANNs?
What is the primary role of an activation function in an Artificial Neural Network (ANN)?
What distinguishes a binary sigmoidal function from a bipolar sigmoidal function in neural networks?
What is the function of the bias in a perceptron?
Within the architecture of a perceptron, what is the role of the adder element?
According to the material, which activation function is the most basic and returns 1 if the input is positive, and 0 for any negative input?
In a Back Propagation Neural Network (BPN), what is the order of the three phases involved in its training?
Which of the following best describes an advantage of using neural networks?
Flashcards
Artificial Neural Networks (ANNs)
Parallel computing devices that attempt to model the brain to perform computations faster than traditional systems.

Artificial Neural Network (ANN)
An efficient computing system inspired by biological neural networks, composed of interconnected units for communication.

Dendrites
Tree-like branches of a neuron that receive information from other neurons it is connected to.

Soma
The cell body of the neuron, responsible for processing the information received from the dendrites.

Axon
A cable-like part of the neuron that sends information to other neurons.

Synapses
The connection between an axon and another neuron's dendrites.

Network Topology
The arrangement of a network's nodes and connecting lines.

Feedforward Network
A non-recurrent network with no feedback loop; signals flow in only one direction, from input to output.

Single Layer Feedforward Network
A feedforward network with only one weighted layer; the input layer is fully connected to the output layer.

Multilayer Feedforward Network
A feedforward network with one or more hidden layers between the input and output layers.

Feedback Network
A network with feedback paths, so signals can flow in both directions through loops; it changes state until equilibrium is reached.

Learning
The process of modifying the connection weights of a specified network.

Supervised Learning
Learning under the supervision of a teacher; weights are adjusted until the actual output matches the desired output.

Unsupervised Learning
Learning without a teacher; input vectors of similar type form clusters, and no desired output is provided.

Reinforcement Learning
Learning that reinforces or strengthens the network using evaluative critic information received from the environment.

Activation Function
The extra force or effort applied over the net input to obtain an exact output.

Linear Activation Function
The identity function, F(x) = x; it performs no editing of the input.

Binary Sigmoidal Function
F(x) = 1 / (1 + exp(-x)); output is always positive and bounded between 0 and 1.

Bipolar Sigmoidal Function
F(x) = 2 / (1 + exp(-x)) - 1; output is bounded between -1 and 1.

Back Propagation Network
A multilayer neural network with at least one hidden layer; the error computed at the output layer is propagated back toward the input layer.
Study Notes
Artificial Neural Networks (ANNs)
- ANNs are parallel computing devices attempting to model the brain.
- Main objective is to perform computational tasks faster than traditional systems.
- ANN is an efficient computing system based on biological neural networks.
- ANNs are referred to as "artificial neural systems," "parallel distributed processing systems," or "connectionist systems."
- ANN consists of interconnected units allowing communication.
- Units, or nodes/neurons, are simple processors operating in parallel.
- Neurons connect to each other through connection links
- Each connection link has a weight with information about the input signal.
- The weight excites or inhibits the signal being communicated.
- Each neuron has an internal state called an activation signal.
- Each neuron combines its input signals with an activation rule to produce an output signal, which is sent to other units.
Biological Neuron
- A nerve cell (neuron) is a biological cell that processes information.
- There are approximately 10¹¹ neurons, with around 10¹⁵ interconnections.
- A typical neuron consists of four parts: Dendrites, Soma, Axon and Synapses
Dendrites
- Tree-like branches responsible for receiving information from other neurons.
Soma
- The cell body of the neuron, responsible for processing received information.
Axon
- A cable that sends information to other neurons.
Synapses
- The connection between an axon and another neuron's dendrites.
ANN versus BNN
| BNN | ANN |
| --- | --- |
| Soma | Node |
| Dendrites | Input |
| Synapse | Weights or interconnections |
| Axon | Output |

| Criteria | BNN | ANN |
| --- | --- | --- |
| Processing | Massively parallel, slow | Massively parallel, fast |
| Size | 10¹¹ neurons, 10¹⁵ interconnections | 10²–10⁴ nodes (depends on the application and network designer) |
| Learning | Can tolerate ambiguity | Very precise, structured, and formatted data is required to tolerate ambiguity |
| Fault tolerance | Performance degrades with damage | Capable of robust performance, hence potentially fault tolerant |
| Storage capacity | Stores information in the synapse | Stores information in continuous memory locations |
Model of Artificial Neural Network
- The net input is calculated as: y_in = x1·w1 + x2·w2 + … + xn·wn, i.e. the sum of xi·wi for i = 1 to n.
- The output is obtained by applying an activation function over the net input: y = F(y_in).
- ANN processing depends on three building blocks: network topology, adjustment of weights (learning), and activation functions.
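A minimal sketch of this neuron model in Python (the step activation and all names and values here are illustrative assumptions, not part of the notes):

```python
# Minimal sketch of the general neuron model above: compute the net input,
# then apply an activation function.

def net_input(x, w):
    """y_in = x1*w1 + x2*w2 + ... + xn*wn"""
    return sum(xi * wi for xi, wi in zip(x, w))

def step(y_in, threshold=0.0):
    """Example activation: output 1 if the net input exceeds the threshold."""
    return 1 if y_in > threshold else 0

x = [0.5, 1.0, -0.3]          # input signals x1..xn
w = [0.4, 0.2, 0.9]           # weights w1..wn on the connection links
y = step(net_input(x, w))     # y = F(y_in)
print(y)
```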
Network Topology
- It is the arrangement of a network's nodes and connecting lines.
- ANN can be classified based on topology
Feedforward Network
- It is a non-recurrent network with processing units/nodes in layers.
- Nodes connect only to the previous layer's nodes.
- No feedback loop, signal flows in one direction (input to output).
Single layer feedforward network
- Has only one weighted layer
- The input layer is fully connected to the output layer.
Multilayer feedforward network
- Has one or more layers (hidden layers) between input and output.
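A short sketch of a forward pass through such a layered, non-recurrent network, assuming sigmoid units; the weights, biases, and names are made up for illustration:

```python
import math

# Multilayer feedforward pass: input -> hidden layer -> output layer, with no
# feedback loops. Each layer is a list of per-neuron weight vectors; all
# numbers below are arbitrary.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """Output of one fully connected layer of sigmoid neurons."""
    return [sigmoid(sum(i * w for i, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

x = [0.2, 0.7]                                             # input layer
hidden = layer(x, [[0.5, -0.3], [0.8, 0.1]], [0.0, 0.1])   # hidden layer
output = layer(hidden, [[1.0, -1.2]], [0.05])              # output layer
print(output)
```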
Feedback Network
- Has feedback paths, signal flows in both directions using loops.
- Non-linear dynamic system, changes until equilibrium is reached.
- Subdivided into Recurrent networks and Jordan networks
Recurrent networks
- Feedback networks with closed loops.
Fully Recurrent Network
- All nodes connect to each other and work as input and output.
Jordan network
- Closed loop network where the output goes to the input as feedback.
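A hedged sketch of the feedback idea: the previous output is fed back alongside the external input (Jordan-style), and the state is iterated until it stops changing; the weights, tolerance, and names are arbitrary assumptions:

```python
import math

# Feedback loop sketch: the output is routed back to the input and the state
# is updated repeatedly until it reaches (approximate) equilibrium.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def feedback_step(x, y_prev, w_in=0.6, w_back=0.9, bias=-0.2):
    """One update: external input plus the fed-back previous output."""
    return sigmoid(w_in * x + w_back * y_prev + bias)

x, y = 0.5, 0.0
for _ in range(100):
    y_new = feedback_step(x, y)
    if abs(y_new - y) < 1e-6:     # equilibrium reached
        break
    y = y_new
print(y)
```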
Adjustments of Weights or Learning
- Learning modifies connection weights in a specified network.
- Learning is classified into supervised, unsupervised, and reinforcement learning.
Supervised Learning
- Done under the supervision of a teacher; this learning process is dependent.
- During training, an input vector is presented to the network, giving an output vector.
- The output vector is compared with the desired output vector.
- If there's a difference, an error signal is generated.
- Weights are adjusted until the actual output matches the desired output.
Unsupervised Learning
- Done without the supervision of a teacher; this learning process is independent.
- Input vectors of similar type form clusters.
- The network gives an output response indicating the class to which a new input pattern belongs.
- There is no feedback as to what the desired output should be.
- The network discovers patterns, features, and relations between input data and output.
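As one concrete illustration of inputs of a similar type forming clusters without a teacher, here is a hedged sketch using a simple winner-take-all update; the specific rule, data, and names are assumptions, not something the notes prescribe:

```python
# Unsupervised clustering sketch: no desired output is given; each input simply
# pulls its nearest cluster center a little closer (winner-take-all update).

def nearest(x, centers):
    """Index of the cluster center closest to input vector x."""
    dists = [sum((xi - ci) ** 2 for xi, ci in zip(x, c)) for c in centers]
    return dists.index(min(dists))

def cluster(inputs, centers, rate=0.1, epochs=20):
    for _ in range(epochs):
        for x in inputs:
            k = nearest(x, centers)                 # winning cluster
            centers[k] = [ci + rate * (xi - ci)     # move winner toward x
                          for xi, ci in zip(x, centers[k])]
    return centers

data = [[0.1, 0.2], [0.0, 0.1], [0.9, 0.8], [1.0, 0.9]]
print(cluster(data, centers=[[0.5, 0.0], [0.5, 1.0]]))
```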
Reinforcement Learning
- Used to reinforce or strengthen the network on the basis of critic information.
- Similar to supervised learning, but with much less explicit information.
- The network receives feedback from the environment, which is evaluative (not instructive).
- Weights are adjusted to obtain better critic information in the future.
Activation Functions
- It may be defined as the extra force or effort applied over the input to obtain an exact output.
- Examples include linear and sigmoid activation functions.
Linear Activation Function
- Also called the identity function; it performs no editing of the input.
- Defined as: F(x) = x
Sigmoid Activation Function
- Two types: binary and bipolar sigmoidal
Binary sigmoidal function
- Performs input editing between 0 and 1.
- Always positive and bounded (output between 0 and 1).
- Strictly increasing, higher input results in higher output.
- Defined as: F(x) = sigm(x) = 1 / (1 + exp (-x))
Bipolar Sigmoidal Function
- Performs input editing between -1 and 1.
- Can be positive or negative.
- Always bounded (output between -1 and 1).
- Strictly increasing.
- Defined as: F(x) = sigm(x) = 2 / (1 + exp (-x)) - 1
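The three activation functions above can be written out directly from their definitions; a small sketch (function names are illustrative):

```python
import math

def linear(x):
    """Identity function: F(x) = x."""
    return x

def binary_sigmoid(x):
    """F(x) = 1 / (1 + exp(-x)); always positive, bounded between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    """F(x) = 2 / (1 + exp(-x)) - 1; bounded between -1 and 1."""
    return 2.0 / (1.0 + math.exp(-x)) - 1.0

for v in (-2.0, 0.0, 2.0):
    print(v, linear(v), round(binary_sigmoid(v), 3), round(bipolar_sigmoid(v), 3))
```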
Supervised Neural Networks
- Learning takes place under the supervision of a teacher; it is a dependent learning process.
- ANN presented with input vector, obtaining output vector.
- Output vector is compared with the desired/target output vector.
- An error signal is generated if there is a difference.
- Weights are adjusted until the actual output matches the desired output.
Perceptron
- Developed by Frank Rosenblatt using McCulloch and Pitts model.
- Is the basic operational unit of artificial neural networks using a supervised learning rule.
- Classifies data into two classes.
- Consists of a single neuron with multiple inputs with adjustable weights.
- Neuron output is 1 or 0 depending upon the threshold.
- Includes a bias whose weight is always 1.
- Perceptron has three basic elements:
- Links: connection links with weight and a bias with weight 1.
- Adder: adds the inputs multiplied with their respective weights.
- Activation function: limits the neuron's output, uses a Heaviside step function (output = 1 if input is positive, 0 if negative).
- Perceptron networks are single-layer feed-forward networks.
- Variations were designed by Rosenblatt and Minsky-Papert.
- Key points of a perceptron network:
- Consists of three units: sensory (input), associator (hidden), and response (output).
- Sensory units connect to associator units with fixed weights (1, 0, or -1) assigned at random.
- A binary activation function is used in the sensory and associator units.
- Response unit has an activation of 1, 0, or -1.
- Binary step with a fixed threshold is used as activation for the associator.
- Output signals to the response unit are only binary.
- The output of the perceptron network is: y = 1 if y_in > θ; y = 0 if −θ ≤ y_in ≤ θ; y = −1 if y_in < −θ.
- Perceptron learning rule for weight updating:
- The rule is applied to the weights between the associator unit and the response unit.
- For each training input, the net calculates the response to determine whether an error has occurred.
- Error calculation compares target values with calculated outputs.
- Weights on connections from units sending nonzero signals are adjusted.
- Weights are adjusted based on the learning rule if an error occurred for a training pattern.
- wi(new) = wi(old) + α·t·xi
- b(new) = b(old) + α·t
- If no error occurs, no weights are updated and the training process stops.
- The target value t is +1 or -1, and α is the learning rate.
Back Propagation Network (BPN)
- It is a multilayer neural network with at least one hidden layer.
- The error is calculated at the output layer by comparing the target output with the actual output, and is then propagated back toward the input layer.
- BPN architecture has three interconnected layers with weights.
- Hidden and output layers have a bias, whose weight is always 1.
- The BPN works in two phases: one phase sends the signal forward from the input layer to the output layer, and the other propagates the error back from the output layer to the input layer.
- The bias terms also act as weights.
- During back propagation, signals are sent in the reverse direction.
- Inputs to the BPN and output can be binary (0, 1) or bipolar (-1, +1).
- The activation function increases monotonically and is differentiable.
- BPN training uses a binary sigmoid activation function.
- BPN training has three phases: feed forward, back propagation of error and updating of weights
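A compact sketch of the three phases (feed forward, back propagation of error, updating of weights) for one hidden layer and a single binary-sigmoid output; the delta formulas assume a squared-error measure, and the XOR-style data and all names are illustrative assumptions:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_bpn(samples, n_in, n_hidden, alpha=0.5, epochs=5000):
    rnd = random.Random(0)
    w_h = [[rnd.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    b_h = [rnd.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    w_o = [rnd.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    b_o = rnd.uniform(-0.5, 0.5)

    for _ in range(epochs):
        for x, t in samples:
            # Phase 1: feed forward
            h = [sigmoid(sum(xi * wi for xi, wi in zip(x, ws)) + b)
                 for ws, b in zip(w_h, b_h)]
            y = sigmoid(sum(hi * wi for hi, wi in zip(h, w_o)) + b_o)

            # Phase 2: back propagation of error
            delta_o = (t - y) * y * (1 - y)
            delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(n_hidden)]

            # Phase 3: updating of weights (biases are treated as weights too)
            w_o = [w_o[j] + alpha * delta_o * h[j] for j in range(n_hidden)]
            b_o += alpha * delta_o
            for j in range(n_hidden):
                w_h[j] = [w_h[j][i] + alpha * delta_h[j] * x[i] for i in range(n_in)]
                b_h[j] += alpha * delta_h[j]
    return w_h, b_h, w_o, b_o

# XOR-style binary training data, purely illustrative.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
print(train_bpn(data, n_in=2, n_hidden=2))
```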
Advantages of Neural Networks
- Mimics human control logic.
- Uses imprecise language.
- Inherently robust.
- Fails safely.
- Modified and tweaked easily.
Disadvantages of Neural Networks
- Operator's experience required.
- System complexity.
Applications of Neural Networks
- Automobile subsystems, such as automatic transmissions, ABS, and cruise control (e.g., the Tokyo monorail).
- Air conditioning systems.
- Auto focus on cameras.
- Digital image processing (edge detection).
- Rice cookers, dishwashers, and elevators.