Questions and Answers
Why are different weights assigned to the inputs in an artificial neuron?
- To emphasize the relative importance of the different inputs (correct)
- To ensure all inputs are equally important in calculating the output
- To strengthen the less important inputs
- To weaken the more important inputs
What is the mathematical form of the operation performed by an artificial neuron?
- $x_1w_1 + x_2w_2 + x_3w_3$
- $x_1w_1 + x_2w_2 + x_3w_3 + b$
- $f(x_1w_1 + x_2w_2 + x_3w_3 + b)$ (correct)
- All of the above
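As an illustration of the correct option, here is a minimal sketch of one artificial neuron in plain Python; the example inputs, weights, bias, and the choice of sigmoid as the activation f are assumptions for demonstration, not from the source.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum: x1*w1 + x2*w2 + x3*w3 + b
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Activation function f applied to the sum (sigmoid chosen here)
    return 1 / (1 + math.exp(-z))

# Three inputs with different weights reflecting their relative importance
out = neuron([1.0, 2.0, 3.0], [0.5, -0.2, 0.1], bias=0.3)
print(out)  # ≈ 0.668
```

Dropping the activation `f` from the last line would reduce this to plain linear regression, which is exactly the distinction the next question asks about.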
What is the key difference between an artificial neuron and linear regression?
- Artificial neurons use weights and bias, while linear regression uses only coefficients
- Artificial neurons can model non-linear relationships, while linear regression can only model linear relationships
- Artificial neurons apply an activation function, while linear regression does not
- Both b and c (correct)
How are artificial neurons arranged in an Artificial Neural Network (ANN)?
What is the role of the bias term in an artificial neuron?
What is the purpose of the activation function in an artificial neuron?
What determines the number of neurons in the output layer in a neural network?
In binary classification, how many neurons are typically present in the output layer?
What is the purpose of introducing non-linearity in neural networks?
Which function scales the input value between 0 and 1 in neural networks?
What is the range of values output by the Tanh function in neural networks?
In a regression problem, how many neurons are typically found in the output layer of a neural network?
What is the function of a synapse in a neuron?
How are inputs to a neuron weighted before being summed in the cell body?
What is the role of the soma in a neuron?
In artificial neurons, what happens to the inputs received before they are processed?
Which of the following best describes the activation function of a neuron?
What distinguishes neurons in artificial neural networks from linear regression models?
Artificial neurons are arranged in layers.
Neurons in the same layer do not have any connections.
Which one is not a typical ANN layer?
The number of neurons in the input layer equals the number of inputs we feed to the network.
There is no computation in the input layer. It is just used for passing information from the outside world to the network.
Any layer between the input layer and the output layer is called ___________.
The input layer identifies the pattern in the dataset.
__________ identifies the patterns in the dataset and is responsible for deriving complex relationships between input and output.
The network is called a ___________________ when we have many hidden layers.
The number of neurons in the output layer is based on the number and type of problems to be solved.
_______________ is used to introduce non-linearity in neural networks.
The aim of the activation function is to introduce a nonlinear transformation to learn the complex underlying patterns in data.
The _____________ scales the input values between 0 and 1.
Sigmoid Function = 1 / (1 + exp(-x))
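A quick, illustrative sketch of the sigmoid formula above in plain Python:

```python
import math

def sigmoid(x):
    # 1 / (1 + exp(-x)) maps any real input into the open interval (0, 1)
    return 1 / (1 + math.exp(-x))

print(sigmoid(0))  # 0.5, the midpoint of the range
```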
The ____________ outputs values between -1 and +1.
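For comparison, tanh can be checked directly with Python's standard library (the sample inputs are illustrative):

```python
import math

# Tanh squashes any real input into (-1, +1) and is zero-centered;
# it is a rescaled sigmoid: tanh(x) = 2 / (1 + exp(-2x)) - 1
for x in (-5.0, 0.0, 5.0):
    print(f"tanh({x}) = {math.tanh(x):.4f}")
```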
____________ outputs values from 0 to infinity.
Being zero for all negative values is a problem called ________, and a neuron is said to be dead if it always outputs zero.
_____________ is a variant of the ReLU function that solves the dying ReLU problem. The value of alpha is typically set to 0.01.
ELU (Exponential Linear Unit), like Leaky ReLU, has a small slope for negative values.
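The three ReLU-family functions described above can be sketched as follows; a minimal illustration, with the default alpha values (0.01 for Leaky ReLU, 1.0 for ELU) assumed as the commonly used ones.

```python
import math

def relu(x):
    # Zero for all negative inputs -- the source of the dying ReLU problem
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # A small slope alpha for negatives keeps a nonzero gradient
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # A smooth exponential curve for negatives, saturating at -alpha
    return x if x > 0 else alpha * (math.exp(x) - 1)
```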
__________ is basically the generalization of the sigmoid function.
The softmax function is usually applied to the final layer of the network when performing multi-class classification tasks. It gives the probability of each class being the output, and thus the sum of the softmax values always equals 1.
Softmax Function = exp(x_i) / sum_j exp(x_j)
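A minimal sketch of the softmax formula above in plain Python; subtracting the maximum before exponentiating is a standard numerical-stability trick added here as an assumption beyond the bare formula, and the example scores are illustrative.

```python
import math

def softmax(xs):
    # Subtracting the max does not change the result but avoids overflow
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)  # per-class probabilities; they sum to 1
```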