Attractor Neural Networks
Presented by Janet Snape
CS 790R Seminar, University of Nevada, Reno
February 10, 2005
References
• Bar-Yam, Y. (1997) Dynamics of Complex Systems. Chapter 2: Neural Networks I
• Flake, G. (1998) The Computational Beauty of Nature. Chapter 18: Natural and Analog Computation
Introduction
• Bar-Yam -- Chapter Concepts
- Mathematical models of human information processing
- Associative, content-addressable memory
- Imprinting and retrieval of memories
- Storage capacity
• Flake -- Chapter Concepts
- Examination of artificial NNs
- Associative memory
- Combinatorial optimization problems and solutions
Development of Neural Networks
• McCulloch-Pitts discrete model (1940s)
• Associative Hebbian learning (1949)
• Hopfield continuous model (1980s)
McCulloch-Pitts
• McCulloch-Pitts NN (1940s)
- Simplified, idealized model of the neuron
- A neuron has a discrete state and changes in discrete time steps
A single McCulloch-Pitts neuron
McCulloch-Pitts cont’d...

A Neuron’s State Activation Rule:

a_i(t + 1) = Θ( Σ_{j=1}^{n} w_ij · a_j(t) − b_i )

IF the weighted sum of incoming signals exceeds the threshold b_i, the neuron fires with an activation of 1 (i.e., it sends a signal); ELSE the neuron’s activation is 0 (i.e., it does not send a signal).
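The activation rule above can be sketched in Python (the function name and the example weights and threshold are illustrative choices, not values from the slides):

```python
import numpy as np

def mcculloch_pitts_step(weights, activations, threshold):
    """One discrete-time update of a single McCulloch-Pitts neuron.

    Fires (returns 1) when the weighted sum of incoming signals
    exceeds the threshold b_i; otherwise the activation is 0.
    """
    weighted_sum = np.dot(weights, activations)
    return 1 if weighted_sum > threshold else 0

# Two active inputs with weights 0.6 and 0.5; threshold 1.0
print(mcculloch_pitts_step(np.array([0.6, 0.5]), np.array([1, 1]), 1.0))  # 1 (fires)
```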
Defining an Attractor Network
• Definition
• Operating and training attractor networks
• Energy
Features of Attractor Networks
• Binary variables for the neuron activity values: +1 (ON), −1 (OFF)
• No self-action by a neuron
• Symmetric synapses
• Synchronous / asynchronous neuron activity update, Eq. (2.2.4)
Attractor Network Process (Operation)
• Input pattern
• Continuous evolution to a steady state
• Output network state
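This operate-until-steady-state process can be sketched as an asynchronous update loop (a minimal sketch; `evolve` and the sweep limit are illustrative, assuming ±1 activities, symmetric weights, and a zero diagonal as listed in the features above):

```python
import numpy as np

def evolve(weights, state, max_sweeps=20):
    """Asynchronously update neurons until the network state stops changing.

    weights: symmetric matrix with zero diagonal (no self-action).
    state: vector of +/-1 neuron activities (the input pattern).
    Returns the steady state reached (the attractor).
    """
    state = state.copy()
    for _ in range(max_sweeps):
        changed = False
        # Visit neurons in random order: asynchronous update
        for i in np.random.permutation(len(state)):
            new = 1 if weights[i] @ state > 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:          # fixed point: an attractor
            break
    return state
```

Starting from a pattern inside a basin of attraction, the loop settles on the imprinted pattern at the basin's center.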
Training Attractor Networks
• Consists of imprinting a set of selected neuron firing patterns
• Described as an associative memory
ANNs Categorize
Energy analog
Imprinting on an attractor network.
Basin of Attraction
• The basin of attraction is the region of patterns near the imprinted pattern that evolve, under the neural updating, back to the imprinted pattern (a feedback loop).
Memory Capacity
• Overloaded (too many imprints) => basins of attraction become small
• The memory is no longer usable
• A pattern can only be recovered if we already know it
Stability of Imprinted Pattern
Probability distribution of the neuron activity times the local field, s_i h_i
Associative Memory
• Content-addressable memory
• How does associative memory work?
• Rule: Hebbian learning
Hebbian Learning
• Donald Hebb (1949)
• Reinforces connections between neurons that are on (or off) at the same time
• Stores memories that can later be recalled
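The Hebbian imprinting rule can be sketched as follows (a minimal sketch assuming the ±1 representation; the rule w_ij = (1/N) Σ_p x_i x_j follows Bar-Yam's treatment, and the function name is illustrative):

```python
import numpy as np

def hebbian_imprint(patterns):
    """Imprint +/-1 patterns with the Hebbian rule w_ij = (1/N) * sum_p x_i x_j.

    Connections between neurons that are on (or off) together are
    strengthened; the diagonal is zeroed (no self-action).
    """
    patterns = np.asarray(patterns, dtype=float)
    n = patterns.shape[1]            # number of neurons N
    w = patterns.T @ patterns / n    # sum of outer products, scaled by 1/N
    np.fill_diagonal(w, 0)           # no self-action
    return w
```

Note that the resulting weight matrix is automatically symmetric, matching the attractor-network features listed earlier.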
Hebbian learning cont’d...
• Computation
• How does recall work?
- Associative memory
- Recall letters
Recall Letters
• Train
• Find similarities between the seed pattern and the stored patterns
• Converges to a single stored pattern
• Computational biases
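The train-then-recall steps above can be sketched end to end (a toy sketch: the two ±1 "letter" patterns stand in for letter bitmaps and are chosen for illustration, not taken from the slides):

```python
import numpy as np

# Two stored "letter" patterns over 8 neurons (+1/-1), standing in
# for bitmaps of letters.
letters = np.array([[ 1,  1,  1,  1, -1, -1, -1, -1],
                    [ 1, -1,  1, -1,  1, -1,  1, -1]])
n = letters.shape[1]

# Train: Hebbian imprint of the stored patterns
w = letters.T @ letters / n
np.fill_diagonal(w, 0)                 # no self-action

# Seed: the first letter with two neurons corrupted
seed = letters[0].copy()
seed[:2] *= -1

# Recall: synchronous sweeps until the state stops changing
state = seed.copy()
for _ in range(10):
    new = np.where(w @ state > 0, 1, -1)
    if np.array_equal(new, state):
        break
    state = new

print(np.array_equal(state, letters[0]))   # True: converged to the stored letter
```

The seed shares more neurons with the first stored letter than with the second, so the updating pulls it back to that letter's imprint.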
Recall letters cont’d…
Recall letters cont’d…
Hebbian Unstable Imprints
• Fewer than 10 imprinted patterns: 100% stability of all stored patterns
• More than 10: unstable patterns increase until ALL patterns are unstable
Hebbian Stable Imprints
• Maximum number of stable patterns = 12
• Fewer than 10 imprinted patterns: all patterns stable
• More than 15 imprinted patterns: the number of stable patterns decreases to 0
Pros and Cons
• Advantages
- Efficient for some applications
- Fault tolerant
• Disadvantages
- Prone to recalling a composite of many of the stored patterns
Hopfield NN Model (1980s)
• Internal state is continuous and varies continuously over time
• External state is a function of the internal state
Hopfield cont’d...
• Network starts out in a random state with each neuron close to 0
• Update activations
• Set weights and inputs to solve a problem
• Application example: cost optimization
• Task assignment problem
• Task assignment solution
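The continuous dynamics described above can be sketched with Euler integration (a sketch only: the tanh output function, time step, and function name are illustrative choices, not values from the slides):

```python
import numpy as np

def hopfield_continuous(w, inputs, steps=200, dt=0.1):
    """Euler-integrate a continuous Hopfield network.

    The internal state u varies continuously over time; the external
    state is a function of it, here g(u) = tanh(u).
    """
    rng = np.random.default_rng(1)
    u = rng.normal(0, 0.01, len(inputs))   # random start, each neuron near 0
    for _ in range(steps):
        v = np.tanh(u)                     # external (output) state
        u += dt * (-u + w @ v + inputs)    # internal-state dynamics
    return np.tanh(u)
```

For an optimization problem such as task assignment, the weights w and inputs are set so that low-energy steady states correspond to good assignments.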
Task Assignment Problem
Task Assignment Solution
Size of Basin of Attraction
• More imprints => smaller basins of attraction
• More imprints => more unstable patterns
• More imprints => reduced storage capacity
• More imprints => patterns may become unretrievable
Basin of Attraction cont’d...
Hamming Distance
• The distance between two patterns = the number of neurons that differ between the two patterns
• Used in the Hopfield Java applet demonstration
• Eq. (2.2.16)
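The definition above is a one-liner in code (the function name is illustrative):

```python
import numpy as np

def hamming_distance(p, q):
    """Number of neurons whose activities differ between two patterns."""
    return int(np.sum(np.asarray(p) != np.asarray(q)))

print(hamming_distance([1, -1, 1, -1], [1, 1, 1, 1]))  # 2
```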
Demonstration • Hopfield Java Applet…
Summary
• Attractor neural networks can be used to model the human brain.
• These networks developed from the simple McCulloch-Pitts 1940s discrete NN model into other extensions:
- Associative memory led to Hebbian learning.
- The Hopfield continuous NN model led to more general uses.
• Thus, ANNs can be used to solve combinatorial problems.
• New patterns => learning. No new patterns => no learning.
• Storage capacity depends on the number of neurons!