1.2(part 2).pdf

Full Transcript


Module 1.2 (part 2): MP Neuron

Examples of NN to calculate net output (three worked examples shown as figures in the slides)

McCulloch-Pitts Neuron (MP neuron)

Boolean Functions Using M-P Neuron
This representation just denotes that, for the boolean inputs x1, x2 and x3, if g(x), i.e., the sum of the inputs, is >= θ, the neuron will fire; otherwise, it won't. This is the concise representation of the M-P neuron.

AND Function
An AND function neuron would only fire when ALL the inputs are ON, i.e., g(x) >= 3.

OR Function
An OR function neuron would fire if ANY of the inputs is ON, i.e., g(x) >= 1.

A simple M-P neuron is shown in the figure. Each input is either excitatory with weight w (w > 0) or inhibitory with weight -p (p > 0). The output will fire if it receives "k" or more excitatory inputs but no inhibitory inputs, where k*w >= θ > (k-1)*w.

The M-P neuron has no particular training algorithm. An analysis is performed to determine the weights and the threshold. It is used as a building block where any function or phenomenon is modeled based on a logic function.

A Function With An Inhibitory Input
Here we have an inhibitory input, x2, so whenever x2 is 1 the output will be 0. x1 AND !x2 outputs 1 only when x1 is 1 and x2 is 0.
Computation of threshold for the logical AND-NOT operation.

NOR Function
For a NOR neuron to fire, we want ALL the inputs to be 0, so the thresholding parameter should also be 0 and we take all the inputs as inhibitory.
Computation of threshold for the logical NOR operation (try).

NOT Function
For a NOT neuron, input 1 outputs 0 and input 0 outputs 1. So we take the input as an inhibitory input and set the thresholding parameter to 0.
Computation of threshold for the logical NOT operation (try).

Exercise: Design an NN using only MP-neurons for the NAND (2 inputs) function (try!).

Geometrical Interpretation of the OR Function MP-Neuron
A single MP neuron splits the input points (4 points for 2 binary inputs) into two halves: points lying on or above the line Σ xi - θ = 0 and points lying below this line. All inputs which produce an output 0 lie on one side of the line (Σ xi < θ) and all inputs which produce an output 1 lie on the other side (Σ xi >= θ).

Geometrical interpretation of the AND function MP-neuron (figure).

Geometrical interpretation of the Tautology MP-neuron (figure).

Geometrical Interpretation of the OR Function MP-Neuron with 3 Inputs
For the OR function, we want a plane such that the point (0,0,0) lies on one side and the remaining 7 points lie on the other side of the plane.

Thus, a single McCulloch-Pitts neuron can be used to represent boolean functions which are linearly separable.
Linear separability (for boolean functions): there exists a line (plane) such that all inputs which produce a 1 lie on one side of the line (plane) and all inputs which produce a 0 lie on the other side of the line (plane).

Limitations of the MP-Neuron
What about non-boolean (say, real) inputs? Do we always need to hand-code the threshold? Are all inputs equal? What if we want to assign more importance to some inputs? What about functions which are not linearly separable, say the XOR function?
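Before moving on to linear separability in general, here is a minimal Python sketch of the gates discussed above. It is not from the slides; the function name mp_neuron and the way inhibition is encoded are my own. The neuron outputs 0 if any inhibitory input is ON, and otherwise fires when the excitatory sum g(x) reaches the hand-chosen threshold θ.

```python
from itertools import product

def mp_neuron(inputs, inhibitory, theta):
    """McCulloch-Pitts neuron with a hand-chosen threshold theta.
    inputs     : tuple of binary values (0/1)
    inhibitory : set of indices whose inputs are inhibitory
    Fires (returns 1) only if no inhibitory input is ON and the
    sum of the excitatory inputs g(x) is >= theta."""
    if any(inputs[i] == 1 for i in inhibitory):
        return 0  # absolute inhibition: any active inhibitory input blocks firing
    g = sum(x for i, x in enumerate(inputs) if i not in inhibitory)
    return 1 if g >= theta else 0

# Two-input gates realized by a single M-P neuron
gates = {
    "AND (theta=2)":        lambda x: mp_neuron(x, set(), 2),
    "OR (theta=1)":         lambda x: mp_neuron(x, set(), 1),
    "x1 AND !x2 (theta=1)": lambda x: mp_neuron(x, {1}, 1),     # x2 inhibitory
    "NOR (theta=0)":        lambda x: mp_neuron(x, {0, 1}, 0),  # both inhibitory
}
for name, gate in gates.items():
    print(name, {x: gate(x) for x in product((0, 1), repeat=2)})

# NOT: a single inhibitory input with theta = 0
print("NOT (theta=0)", {(x,): mp_neuron((x,), {0}, 0) for x in (0, 1)})
```

For the NAND exercise above, one possibility is a small network of two such neurons: feed the output of the AND neuron into the single-input NOT neuron.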
Linear Separability
Linear separability is a concept wherein the separation of the input space into regions is based on whether the network response is positive or negative. A decision line is drawn to separate the positive and negative responses. The decision line is also called the decision-making line, decision-support line or linearly separable line. The net input to the output unit determines the region, and the boundary between the regions is called the decision boundary.

Linear separability of the network is based on this decision-boundary line. If there exist weights (and bias) for which all training input vectors having a positive response (+1) lie on one side of the line and all those having a negative response (-1) lie on the other side, we conclude that the problem is linearly separable.

Net input of the network:
    yin = b + x1*w1 + x2*w2
The separating line, on which the boundary between the values of x1 and x2 lies, is
    b + x1*w1 + x2*w2 = 0
If the weight w2 is not equal to 0, we get
    x2 = -(w1/w2)*x1 - b/w2
The requirement for a positive response is
    b + x1*w1 + x2*w2 > 0
During the training process, w1, w2 and b are determined from the training data.

Consider a network having a positive response in the first quadrant and a negative response in all other quadrants, with either binary or bipolar data. A decision line is drawn separating the two regions, as shown in the figure. Using the bipolar data representation, missing data can be distinguished from mistaken data; hence bipolar data is better than binary data. Missing values are represented by 0 and mistakes by reversing the input value from +1 to -1 or vice versa.

Numericals on linear separability to be added.
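As a concrete starting point for such numericals, here is a small sketch. The weights w1 = w2 = 1 and bias b = -1 are hand-picked for illustration (not taken from the slides): it evaluates the net input yin = b + x1*w1 + x2*w2 on the bipolar AND-like training set (positive response only in the first quadrant), prints the decision line x2 = -(w1/w2)*x1 - b/w2, and checks that the responses separate the two classes.

```python
# Hand-picked weights/bias giving a positive response only when both
# bipolar inputs are +1 (the "first quadrant" case described above).
w1, w2, b = 1.0, 1.0, -1.0

# Bipolar training data: inputs in {-1, +1}, target +1 only for (+1, +1)
training_data = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

def net_input(x1, x2):
    """Net input to the output unit: yin = b + x1*w1 + x2*w2."""
    return b + x1 * w1 + x2 * w2

# Decision line: b + x1*w1 + x2*w2 = 0  =>  x2 = -(w1/w2)*x1 - b/w2 (w2 != 0)
print(f"decision line: x2 = {-w1 / w2:+.2f} * x1 {-b / w2:+.2f}")

# The problem is linearly separable with (w1, w2, b) if every training
# point's response sign matches its target's side of the line.
separable = True
for (x1, x2), target in training_data:
    yin = net_input(x1, x2)
    response = 1 if yin > 0 else -1
    print(f"x=({x1:+d},{x2:+d})  yin={yin:+.1f}  response={response:+d}  target={target:+d}")
    separable &= (response == target)

print("linearly separable with these weights:", bool(separable))
```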
