HETERO ASSOCIATIVE MEMORY
Deependra Kumar Kori
ASSOCIATIVE MEMORY
Associative memory is the ability to learn and remember the relationship between otherwise unrelated items, for example remembering a person's name when you see their face.
There are two types of associative memory:
I. Auto-associative memory
II. Hetero-associative memory
In auto-associative memory the training input and target output vectors are identical.
We will now discuss hetero-associative memory in detail.
INTRODUCTION TO HETERO ASSOCIATIVE
MEMORY
It is a single-layer neural network.
The input training vectors and the output target vectors are not the same.
The weights are determined so that the network stores a set of pattern pairs.
The hetero-associative network is static in nature, so there are no non-linear or delay operations.
The weights may be found using the Hebb rule or the delta rule.
The input layer has 'n' units, the output layer has 'm' units, and there is a weighted interconnection between every input and output unit.
A hetero-associative memory network thus maps 'n'-dimensional input training vectors to 'm'-dimensional output target vectors.
TRAINING ALGORITHM
For training, this network uses the Hebb or delta learning rule.
Step 1 − Initialize all the weights to zero:
wij = 0 (i = 1 to n, j = 1 to m)
Step 2 − Perform steps 3-5 for each training pair (s : t).
Step 3 − Activate each input unit:
xi = si (i = 1 to n)
Step 4 − Activate each output unit with the target:
yj = tj (j = 1 to m)
Step 5 − Adjust the weights:
wij(new) = wij(old) + xi yj
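The training steps above can be sketched in plain Python. This is a minimal sketch, not part of the original deck; the helper name `train_hebb` is hypothetical, and bipolar vectors are assumed as in the worked example later in the slides.

```python
def train_hebb(pairs, n, m):
    """Hebb-rule training for a hetero-associative memory.

    pairs: list of (s, t) tuples, s of length n, t of length m.
    Returns the n x m weight matrix.
    """
    # Step 1: initialize all weights to zero.
    w = [[0] * m for _ in range(n)]
    # Step 2: loop over every training pair.
    for s, t in pairs:
        # Steps 3-4: input and output activations copy the pair directly.
        x, y = s, t
        # Step 5: wij(new) = wij(old) + xi * yj
        for i in range(n):
            for j in range(m):
                w[i][j] += x[i] * y[j]
    return w
```

Because the update is a sum of outer products, training is a single pass over the pairs with no iteration to convergence.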
TESTING ALGORITHM
Step 1 − Set the weights to those obtained during training with Hebb's rule.
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Set the activations of the input units equal to the input vector.
Step 4 − Calculate the net input to each output unit (j = 1 to m):
y_inj = Σ (i = 1 to n) xi wij
Step 5 − Apply the activation function to calculate the output:
yj = f(y_inj) =  1 if y_inj > 0
                 0 if y_inj = 0
                -1 if y_inj < 0
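The testing (recall) steps can likewise be sketched in plain Python; `recall` is a hypothetical helper name, and `w` is a weight matrix produced by the training procedure above.

```python
def recall(w, x):
    """Recall the stored output for input vector x using weight matrix w."""
    n, m = len(w), len(w[0])
    y = []
    for j in range(m):
        # Step 4: net input y_inj = sum over i of xi * wij
        y_in = sum(x[i] * w[i][j] for i in range(n))
        # Step 5: bipolar sign activation
        y.append(1 if y_in > 0 else (0 if y_in == 0 else -1))
    return y
```

The activation only keeps the sign of the net input, so recall is robust to the magnitude of the sums.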
Hetero Associative Architecture
[Figure: n input units x1 … xi … xn, fully connected through weights w11 … wn3 to m = 3 summing output units.]
Activation Functions:
For bipolar targets:
yj =  1 if y_inj > 0
      0 if y_inj = 0
     -1 if y_inj < 0
For binary targets:
yj =  1 if y_inj > 0
      0 if y_inj <= 0
EXAMPLE
GOAL: build a neural network that associates the following two sets of patterns using Hebb's rule:
s1 = ( 1 -1 -1 -1) f1 = ( 1 -1 -1)
s2 = (-1 1 -1 -1) f2 = ( 1 -1 1)
s3 = (-1 -1 1 -1) f3 = (-1 1 -1)
s4 = (-1 -1 -1 1) f4 = (-1 1 1)
The network uses 4 input neurons and 3 output neurons.
The algorithm finds the four outer products sk^T fk and adds them.
ALGORITHM
Pattern pair 1: s1^T f1 =
 1 -1 -1
-1  1  1
-1  1  1
-1  1  1
Pattern pair 2: s2^T f2 =
-1  1 -1
 1 -1  1
-1  1 -1
-1  1 -1
Pattern pair 3: s3^T f3 =
 1 -1  1
 1 -1  1
-1  1 -1
 1 -1  1
Pattern pair 4: s4^T f4 =
 1 -1 -1
 1 -1 -1
 1 -1 -1
-1  1  1
WEIGHT MATRIX
Add all four individual weight matrices to produce the final weight matrix:

 1 -1 -1    -1  1 -1     1 -1  1     1 -1 -1     2 -2 -2
-1  1  1     1 -1  1     1 -1  1     1 -1 -1     2 -2  2
-1  1  1  + -1  1 -1  + -1  1 -1  +  1 -1 -1  = -2  2 -2
-1  1  1    -1  1 -1     1 -1  1    -1  1  1    -2  2  2

Each column defines the weights for one output neuron.
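The matrix addition above can be checked mechanically. This sketch (not from the deck; the helper name `outer` is an assumption) builds each outer product sk^T fk and sums them element-wise.

```python
def outer(s, f):
    # Outer product: entry (i, j) is s[i] * f[j].
    return [[si * fj for fj in f] for si in s]

pairs = [
    (( 1, -1, -1, -1), ( 1, -1, -1)),
    ((-1,  1, -1, -1), ( 1, -1,  1)),
    ((-1, -1,  1, -1), (-1,  1, -1)),
    ((-1, -1, -1,  1), (-1,  1,  1)),
]

# Sum the four 4x3 outer products element-wise.
W = [[0] * 3 for _ in range(4)]
for s, f in pairs:
    o = outer(s, f)
    for i in range(4):
        for j in range(3):
            W[i][j] += o[i][j]
# W is now [[2, -2, -2], [2, -2, 2], [-2, 2, -2], [-2, 2, 2]]
```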
EXAMPLE ARCHITECTURE
General Structure
[Figure: inputs x1, x2, x3, x4 feed three summing output units through the weight matrix W below; each output unit applies the activation function.]
W =
 2 -2 -2
 2 -2  2
-2  2 -2
-2  2  2
yj =  1 if y_inj > 0
      0 if y_inj = 0
     -1 if y_inj < 0
EXAMPLE RUN 1
Try the first input pattern: s1 = ( 1 -1 -1 -1), stored target f1 = ( 1 -1 -1)
y_in1 = 2 - 2 + 2 + 2 = 4, so y1 = 1
y_in2 = -2 + 2 - 2 - 2 = -4, so y2 = -1
y_in3 = -2 - 2 + 2 - 2 = -4, so y3 = -1
The recalled output ( 1 -1 -1) matches f1.
Where is this information stored?
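The hand calculation above can be repeated for every stored pair. A minimal sketch using the final weight matrix from the deck (the helper names `sign` and `recall` are assumptions):

```python
W = [[ 2, -2, -2],
     [ 2, -2,  2],
     [-2,  2, -2],
     [-2,  2,  2]]

def sign(v):
    # Bipolar activation: 1, 0, or -1 depending on the sign of the net input.
    return 1 if v > 0 else (0 if v == 0 else -1)

def recall(x):
    # y_inj = sum over i of xi * wij, then the bipolar activation.
    return [sign(sum(x[i] * W[i][j] for i in range(4))) for j in range(3)]

# First stored pair: s1 = (1, -1, -1, -1) should recall f1 = (1, -1, -1).
print(recall([1, -1, -1, -1]))  # -> [1, -1, -1]
```

Running the same call on s2, s3, and s4 reproduces f2, f3, and f4, which answers the slide's question: the associations live only in the weights, distributed across all connections.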
GENERAL EXAMPLE
Key content: recognition of patterns based on hints or features.
"This man is Napoleon. He is French. He is a hero."
THANK YOU