Hopfield Nets PDF - Computational Methodologies
Document Details
FCUL
2014
Rob Mills, Luís Correia
Summary
This document is a slide presentation on Hopfield networks, covering recurrent neural networks, dynamical systems, and related computational methodologies. It is situated in the context of artificial life, and includes figures and mathematical equations for the models discussed.
Full Transcript
Hopfield nets
Rob Mills, Luís Correia
Artificial Life / Vida Artificial & Computational methodologies / Metodologias da Computação: Hopfield Networks
Informatics, FCUL, November 2014

Outline
1 Recurrent Neural Networks
2 Hopfield networks
3 Discrete Hopfield Networks

Recurrent Neural Networks
Feed-forward networks transform I → O, without feedback loops and without memory. Recurrent networks have feedback loops, and respond to both the input and the current state.

Networks with feedback:
- Local: at the level of the neuron
- Global: between parts of the overall network
Feedback represents time (implicitly).
There are problems with the stability of a recurrent network. We desire stable networks (in the sense of Lyapunov stability), which takes us to non-linear dynamical systems theory.

Motivation for Associative Memories
Pattern storage/recall: explicit training (Hebb's rule), then run the network to 'find out what it knows'. Two phases: training and recall.
Nativ 2007, Trappenberg 2002, Hopfield 1982, Hebb 1949
Watson et al 2011

Dynamical systems: Representation
For an order-N system, we write the state vector as
    x(t) = (x_1(t), x_2(t), ..., x_N(t)),    (1)
where the variables describe the states of the system as a function of time. In general, the dynamics are first-order differential equations of the form
    \frac{d}{dt} x(t) = F(x(t)),    (2)
where F is a vector of N non-linear functions.

Dynamical systems: Orbits
Flux lines (trajectories from different initial conditions) in a 2D phase space. The velocity is given by the tangent vector, dx/dt.

Dynamical systems: Attractors
Dissipative systems have contractive trajectories (onto a lower-dimensional set). Attractor types: point, limit cycle, hyperbolic, chaotic. Each attractor has a basin of attraction. Equilibrium is not necessarily static: a limit cycle is also a state of equilibrium.
Haykin 1999

Neurodynamics = dynamical systems + neural networks
Characteristics:
- Large number of degrees of freedom: many neurons, with collective dynamics
- Non-linearity: essential to provide universal computation
- Dissipation: convergence to an attractor over time
- Noise: neural networks characteristically exhibit noise (not treated here)

Neurodynamics: Manipulating attractors
We desire the dynamical attractors to correspond to input patterns: somehow control the positions of the attractors, and exploit the energy-minimisation dynamics to perform useful computation.
Watson et al 2011

Ideals of an associative memory: store a set of patterns {t_p} = {t_ip} somehow, such that when presented with a new pattern s, the network produces the (stored) pattern that most closely resembles s.
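The order-N system of eq. (2) can be simulated numerically. A minimal sketch, assuming forward-Euler integration and a made-up linear 2D system with a point attractor at the origin (both the system F and the step size are illustrative choices, not from the slides):

```python
import numpy as np

def simulate(F, x0, dt=0.01, steps=2000):
    """Forward-Euler integration of dx/dt = F(x(t)), eq. (2)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * F(x)  # step along the tangent vector (the velocity)
    return x

# A dissipative 2D system: a damped spiral with a point attractor at the origin.
F = lambda x: np.array([-x[0] - x[1], x[0] - x[1]])

# Trajectories from different initial conditions contract onto the same attractor.
print(simulate(F, [2.0, 0.0]))   # close to [0, 0]
print(simulate(F, [-1.0, 3.0]))  # close to [0, 0]
```

Different initial conditions give different orbits, but dissipation draws them all into the same basin's fixed point.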
How can we induce the attractors of a dynamical system that correspond to such a computational machine?

Hopfield network
A single-layer network: all neurons with feedback, unit delays on every neuron, and no self-connections.
Hopfield 1982

Hopfield network: model definition
The additive model:
    C_j \frac{dv_j(t)}{dt} = -\frac{v_j(t)}{R_j} + \sum_{i=1}^{N} w_{ji} \psi(v_i(t)) + I_j.    (3)
- The matrix of weights is symmetric (w_ij = w_ji ∀ i, j).
- Every neuron has a non-linear function, ψ(·).
- The non-linear function is invertible, i.e., v_i = ψ^{-1}(x_i).

Using a hyperbolic tangent as the squashing function:
    x_i = ψ_i(v_i) = \tanh\left(\frac{a_i v_i}{2}\right) = \frac{1 - \exp(-a_i v_i)}{1 + \exp(-a_i v_i)}.    (4)
Then the inverse is
    v_i = ψ_i^{-1}(x_i) = -\frac{1}{a_i} \log\frac{1 - x_i}{1 + x_i}.    (5)
It is also possible to write this as
    ψ^{-1}(x_i) = -\log\frac{1 - x_i}{1 + x_i},    (6)
    ∴ ψ_i^{-1}(x_i) = \frac{1}{a_i} ψ^{-1}(x_i).    (7)

Hopfield network: energy function
The energy function can be defined as
    E = -\frac{1}{2} \sum_i \sum_j w_{ji} x_i x_j + \sum_j \frac{1}{R_j} \int_0^{x_j} ψ_j^{-1}(x)\, dx - \sum_j I_j x_j.    (8)
The dynamics of the network minimise this energy function. Some useful resulting properties:
- ψ_j^{-1}(x_j) monotonically increases with x_j;
- (dx_j/dt)^2 ≥ 0 ∀ x_j, thus dE/dt ≤ 0.

Since the energy function monotonically decreases with time, the trajectory will reach fixed points in the state space/energy landscape. (N.b. the Hopfield network is asymptotically stable.)
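The squashing function of eq. (4) and its inverse, eq. (5), can be checked numerically. A minimal sketch (the gain value a = 1.5 is an arbitrary choice for illustration):

```python
import numpy as np

A = 1.5  # gain a_i; an arbitrary value for illustration

def psi(v, a=A):
    """Squashing function, eq. (4): psi_i(v_i) = tanh(a_i * v_i / 2)."""
    return np.tanh(a * v / 2)

def psi_inv(x, a=A):
    """Inverse, eq. (5): v_i = -(1/a_i) * log((1 - x_i) / (1 + x_i))."""
    return -(1.0 / a) * np.log((1 - x) / (1 + x))

v = np.linspace(-3.0, 3.0, 7)
print(np.allclose(psi_inv(psi(v)), v))  # True: psi is invertible
print(psi(0.0))                         # 0.0: the midpoint is at the origin
```

The monotonicity of psi_inv is what makes the energy function of eq. (8) non-increasing along trajectories.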
Key result: the fixed-point attractors are minima of the energy function, and vice versa.

Discrete Hopfield network: definition
We use the additive model, as in the McCulloch-Pitts network. The output of a neuron takes its asymptotic values,
    x_j = +1 if v_j = ∞;  x_j = -1 if v_j = -∞,    (9)
and the midpoint of the activation function is ψ(0) = 0.
- We avoid self-connections, i.e., w_jj = 0 ∀ j.
- We set the bias term I_j = 0 ∀ j.

The energy function can be defined as
    E = -\frac{1}{2} \sum_i \sum_j w_{ji} x_i x_j + \sum_j \frac{1}{R_j} \int_0^{x_j} ψ_j^{-1}(x)\, dx.    (10)
Or, using the result from (5), we can write it as
    E = -\frac{1}{2} \sum_i \sum_j w_{ji} x_i x_j + \sum_j \frac{1}{a_j R_j} \int_0^{x_j} ψ^{-1}(x)\, dx,    (11)
and then consider the extreme case where a_j = ∞, which provides the simplified form
    E = -\frac{1}{2} \sum_i \sum_j w_{ji} x_i x_j.    (12)

Discrete Hopfield network: activation functions
With the McCulloch-Pitts activation function, we have
    x_j = +1 if v_j > 0;  x_j = -1 if v_j < 0,    (13)
(which is equivalent to the sign function). States can be updated either synchronously or asynchronously. Synchronous updating can get into a 2-cycle (assumptions violated), so asynchronous updating is often more desirable.

The storage phase in Hopfield networks
Hebb's rule: neurons that fire together wire together,
    w_ji(t+1) = w_ji(t) + α x_j(t) x_i(t).
If we want to store M patterns, each of N binary dimensions, ξ_m, then according to a generalised Hebb's rule the weights should be
    w_{ji} = \frac{1}{N} \sum_{m=1}^{M} ξ_{mj} ξ_{mi},    (14)
where ξ_{mi} is the i-th component of pattern m.
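The generalised Hebb rule of eq. (14) is straightforward to implement. A minimal sketch (the two bipolar patterns are made up for illustration), which also confirms the symmetry and no-self-connection properties stated above:

```python
import numpy as np

def store(patterns):
    """Generalised Hebb rule, eq. (14): w_ji = (1/N) * sum_m xi_mj * xi_mi,
    with the self-connections w_jj forced to zero."""
    P = np.asarray(patterns, dtype=float)  # shape (M, N), entries +/-1
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)               # no self-feedback
    return W

# Two made-up bipolar patterns of dimension N = 4.
W = store([[1, -1, 1, -1],
           [1, 1, -1, -1]])
print(np.allclose(W, W.T))           # True: the weight matrix is symmetric
print(np.allclose(np.diag(W), 0.0))  # True: no self-connections
```

Storage is one-shot: the weights are computed directly from the patterns, with no iterative training.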
- The weight matrix is symmetric.
- The network is fully connected, except that there is no self-feedback.
Hebb 1949

The retrieval phase in Hopfield networks
What does the network "know"? To find out, we assert a state on all neurons, x_i = ±1:
    x_j(0) = ξ_{test,j}, ∀ j.    (15)
Then we run the state dynamics for each neuron,
    x_j(n+1) = \mathrm{sgn}\left(\sum_{i=1}^{N} w_{ji} x_i(n) + b_j\right),    (16)
asynchronously, until the network stabilises → we have recalled a memory.

Discrete Hopfield network: remarks
How many patterns can a network store? Error rates increase with more patterns, and spurious attractors arise; the approximate capacity is ∝ N. Symmetry in the weights is important for stability. Desirable characteristics include: associative memory, partial recall (completion), and noise removal. There are also applications to combinatorial optimisation.
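The retrieval dynamics of eqs. (15) and (16) can be sketched as follows, assuming zero biases and weights stored with the Hebb rule of eq. (14); the two orthogonal example patterns and the random update order are illustrative choices:

```python
import numpy as np

def recall(W, probe, max_sweeps=100, seed=0):
    """Asynchronous retrieval, eq. (16), with biases b_j = 0: update one
    neuron at a time, x_j <- sgn(sum_i w_ji x_i), until a full sweep
    makes no change (a fixed point of the dynamics)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(probe, dtype=float).copy()
    for _ in range(max_sweeps):
        changed = False
        for j in rng.permutation(len(x)):  # random asynchronous order
            new = 1.0 if W[j] @ x >= 0 else -1.0
            if new != x[j]:
                x[j] = new
                changed = True
        if not changed:  # stabilised: a memory has been recalled
            break
    return x

# Store two made-up orthogonal patterns with the Hebb rule of eq. (14).
P = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
              [1, -1, 1, -1, 1, -1, 1, -1]], dtype=float)
W = P.T @ P / P.shape[1]
np.fill_diagonal(W, 0.0)

probe = P[0].copy()
probe[0] = -probe[0]     # corrupt one bit of the first pattern
print(recall(W, probe))  # recovers P[0]: noise removal / pattern completion
```

With only two orthogonal patterns, the corrupted bit is repaired in a single sweep; as more patterns are stored, recall degrades and spurious attractors appear, as noted in the remarks above.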