Pattern association involves associating a new pattern with a stored pattern. It is a “simplified” model of human memory.
Types of associative memory:
Heteroassociative memory
Autoassociative memory
Hopfield Net
Bidirectional Associative Memory (BAM)
These are usually single-layer networks.
The neural network is first trained to store a set of pattern pairs s : t, where s is the input vector and t the corresponding output vector.
The network's "memory" is then tested by presenting it with patterns containing incorrect or missing information and checking whether it can still identify them.
Associative memory can be feed forward or recurrent.
An autoassociative memory can hold only a limited number of patterns. Its capacity depends on the complexity of each pattern and the similarity between the input patterns.
Autoassociative Memory
The input and output vectors s and t are identical.
The Hebb rule serves as the learning algorithm: the weight matrix is computed by summing the outer products of each input-output pair.
The autoassociative application algorithm is then used to test the network's recall.
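As a minimal sketch of these two steps, the following pure-Python example stores one bipolar pattern with the Hebb rule and then recalls it, even from a noisy input. The pattern values are illustrative, not from the original notes.

```python
# Autoassociative memory sketch: store a pattern as W = s s^T (Hebb rule),
# then recall by thresholding the net input x.W back to bipolar {-1, +1}.

def outer(a, b):
    """Outer product of two vectors, returned as a nested list."""
    return [[ai * bj for bj in b] for ai in a]

def recall(x, W):
    """Compute net input x.W and threshold each component to -1 or +1."""
    net = [sum(xi * W[i][j] for i, xi in enumerate(x)) for j in range(len(W[0]))]
    return [1 if v >= 0 else -1 for v in net]

s = [1, 1, -1, -1]          # hypothetical stored pattern
W = outer(s, s)             # Hebb rule: weight matrix is the outer product s s^T

print(recall(s, W))                 # → [1, 1, -1, -1]: exact pattern recovered
print(recall([1, -1, -1, -1], W))   # → [1, 1, -1, -1]: one flipped bit corrected
```

Note that a single flipped bit is corrected because the noisy input still correlates positively with the stored pattern.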
Heteroassociative Memory
The input and output vectors s and t are different.
The Hebb rule serves as the learning algorithm: the weight matrix is computed by summing the outer products of each input-output pair.
The heteroassociative application algorithm is then used to test the network's recall.
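A minimal heteroassociative sketch in pure Python: the weight matrix is the sum of outer products over all s : t pairs, and recall maps an input back to its associated (different-length) output. The two training pairs are made-up examples, not from the original notes.

```python
# Heteroassociative memory sketch: W = sum over pairs of s t^T (Hebb rule),
# mapping 3-component inputs to 2-component outputs.

def outer(a, b):
    return [[ai * bj for bj in b] for ai in a]

def mat_add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def recall(x, W):
    """Net input x.W, thresholded back to bipolar {-1, +1}."""
    net = [sum(xi * W[i][j] for i, xi in enumerate(x)) for j in range(len(W[0]))]
    return [1 if v >= 0 else -1 for v in net]

pairs = [([1, -1, 1], [1, -1]),     # hypothetical s : t training pairs
         ([-1, 1, -1], [-1, 1])]

W = [[0, 0], [0, 0], [0, 0]]        # n = 3 inputs, m = 2 outputs
for s, t in pairs:
    W = mat_add(W, outer(s, t))     # sum the outer products of each pair

for s, t in pairs:
    print(recall(s, W), "expected", t)   # each input recalls its paired output
```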
The Hebb Algorithm
Initialize all weights to zero: wij = 0, for i = 1, …, n and j = 1, …, m.
For each training pair s : t, repeat:
xi = si, for i = 1, …, n
yj = tj, for j = 1, …, m
Adjust the weights: wij(new) = wij(old) + xi yj, for i = 1, …, n and j = 1, …, m
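The steps above can be sketched directly as a training loop; the incremental weight updates here are equivalent to the batch outer-product sum described earlier. The training pairs are illustrative examples, not from the original notes.

```python
# Hebb training sketch, following the algorithm step by step.

def hebb_train(pairs, n, m):
    # Step 1: initialize weights to zero, w_ij = 0.
    W = [[0] * m for _ in range(n)]
    # Step 2: for each training pair s : t ...
    for s, t in pairs:
        x, y = s, t                       # set activations: x_i = s_i, y_j = t_j
        for i in range(n):
            for j in range(m):
                W[i][j] += x[i] * y[j]    # w_ij(new) = w_ij(old) + x_i y_j
    return W

pairs = [([1, -1], [1, 1, -1]),           # hypothetical s : t pairs
         ([-1, 1], [-1, -1, 1])]
print(hebb_train(pairs, 2, 3))            # → [[2, 2, -2], [-2, -2, 2]]
```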