Neural Networks
V.Saranya
AP/CSE
Sri Vidya College of Engineering and
Technology,
Virudhunagar
Natural Neural Networks
• Signals “move” via electrochemical signals
• The synapses release a chemical transmitter –
the sum of which can cause a threshold to be
reached – causing the neuron to “fire”
• Synapses can be inhibitory or excitatory
Natural Neural Networks
• We are born with about 100 billion neurons
• A neuron may connect to as many as 100,000
other neurons
Natural Neural Networks
• Many of these ideas are still used today, e.g.
– many simple units, “neurons” combine to give
increased computational power
– the idea of a threshold
Modelling a Neuron
• aj :Activation value of unit j
• wj,i :Weight on link from unit j to unit i
• ini :Weighted sum of inputs to unit i
• ai :Activation value of unit i
• g :Activation function
ini = Σj wj,i aj
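As a quick sketch (Python; the function name is mine, not from the slides), the weighted input to a unit is just the sum of weight × activation over its incoming links:

```python
def weighted_input(weights, activations):
    """in_i = sum over j of w_j,i * a_j: the weighted sum of inputs to unit i."""
    return sum(w * a for w, a in zip(weights, activations))

# Two incoming links with weights 2 and -1, both source units active:
print(weighted_input([2.0, -1.0], [1.0, 1.0]))  # 1.0
```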
Activation Functions
• Step_t(x) = 1 if x ≥ t, else 0 (threshold = t)
• Sign(x) = +1 if x ≥ 0, else –1
• Sigmoid(x) = 1/(1 + e^(–x))
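The three activation functions can be sketched directly in Python (function names are mine):

```python
import math

def step(x, t):
    """Step_t(x): 1 if x >= t, else 0 (hard threshold at t)."""
    return 1 if x >= t else 0

def sign(x):
    """Sign(x): +1 if x >= 0, else -1."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """Sigmoid(x) = 1 / (1 + e^-x): a smooth, differentiable threshold."""
    return 1.0 / (1.0 + math.exp(-x))

print(step(1.5, 2), sign(-0.3), sigmoid(0))  # 0 -1 0.5
```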
Building a Neural Network
1. “Select Structure”: Design the way that the
neurons are interconnected
2. “Select weights” – decide the strengths with
which the neurons are interconnected
– weights are selected so as to get a “good match” to
a “training set”
– “training set”: set of inputs and desired
outputs
– often use a “learning algorithm”
Basic Neural Networks
• Will first look at simplest networks
• “Feed-forward”
– Signals travel in one direction through net
– Net computes a function of the inputs
The First Neural Networks
Neurons in a McCulloch-Pitts network are connected by directed, weighted
paths
[Diagram: units X1 and X2 connect to Y with weight 2; unit X3 connects to Y with weight –1]
The First Neural Networks
If the weight on a path is positive, the path is excitatory;
otherwise it is inhibitory
The First Neural Networks
The activation of a neuron is binary. That is, the neuron
either fires (activation of one) or does not fire (activation of
zero).
The First Neural Networks
For the network shown here, the activation function for unit Y is
f(y_in) = 1 if y_in ≥ θ, else 0
where y_in is the total input signal received and
θ is the threshold for Y
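As a minimal sketch of this firing rule (Python; the naming is mine):

```python
def mp_fire(y_in, theta):
    """McCulloch-Pitts activation: fire (1) iff the total input reaches theta."""
    return 1 if y_in >= theta else 0

# With threshold theta = 2:
print(mp_fire(2, 2))  # 1 (fires: input meets the threshold)
print(mp_fire(1, 2))  # 0 (does not fire)
```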
The First Neural Networks
Originally, all excitatory connections into a particular neuron had the same
weight, although differently weighted connections could feed into different
neurons
Later, weights were allowed to be arbitrary
The First Neural Networks
Each neuron has a fixed threshold. If the net input into the neuron is
greater than or equal to the threshold, the neuron fires
The First Neural Networks
The threshold is set such that any non-zero inhibitory input will prevent the neuron
from firing
Building Logic Gates
• Computers are built out of “logic gates”
• Use threshold (step) function for activation
function
– all activation values are 0 (false) or 1 (true)
The First Neural Networks
AND Function
[Diagram: X1 and X2 each connect to Y with weight 1]
AND
X1 X2 | Y
1  1  | 1
1  0  | 0
0  1  | 0
0  0  | 0
Threshold(Y) = 2
The First Neural Networks
OR Function
[Diagram: X1 and X2 each connect to Y with weight 2]
OR
X1 X2 | Y
1  1  | 1
1  0  | 1
0  1  | 1
0  0  | 0
Threshold(Y) = 2
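Both gates can be checked against their truth tables; this is a sketch in Python (the helper names are mine):

```python
def mp_unit(weights, inputs, theta):
    """McCulloch-Pitts unit: fires iff the weighted input sum reaches theta."""
    y_in = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y_in >= theta else 0

def AND(x1, x2):  # weights 1, 1; threshold 2
    return mp_unit([1, 1], [x1, x2], theta=2)

def OR(x1, x2):   # weights 2, 2; threshold 2
    return mp_unit([2, 2], [x1, x2], theta=2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "AND:", AND(x1, x2), "OR:", OR(x1, x2))
```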
Perceptron
• Synonym for Single-Layer,
Feed-Forward Network
• First studied in the 1950s
• Other networks were known
about, but the perceptron
was the only one capable of
learning, so research
was concentrated in this area
Perceptron
• A single weight only affects
one output so we can restrict
our investigations to a model
as shown on the right
• Notation can be simpler, i.e.
O = Step_0(Σj Wj Ij)
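A sketch of this simplified notation in Python. Folding the step function's threshold in as an extra "bias" weight on a fixed input of −1 is my own illustration of why the simpler notation works, not something stated on the slide:

```python
def perceptron_output(weights, inputs):
    """O = Step_0(sum over j of W_j * I_j): threshold the weighted sum at 0."""
    total = sum(w * i for w, i in zip(weights, inputs))
    return 1 if total >= 0 else 0

# AND as a perceptron: the threshold becomes a bias weight 1.5 on a
# fixed input of -1, so the unit fires only when x1 + x2 >= 1.5.
print(perceptron_output([1.5, 1, 1], [-1, 1, 1]))  # 1
print(perceptron_output([1.5, 1, 1], [-1, 1, 0]))  # 0
```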
What can perceptrons represent?
AND
Input 1: 0 0 1 1
Input 2: 0 1 0 1
Output:  0 0 0 1

XOR
Input 1: 0 0 1 1
Input 2: 0 1 0 1
Output:  0 1 1 0
What can perceptrons represent?
[Plots: the four input points (0,0), (0,1), (1,0), (1,1) for AND and for XOR; for AND a single straight line separates the 1-output point (1,1) from the 0-output points, but for XOR no single straight line can separate the 1-output points from the 0-output points]
• Functions which can be separated in this way are called Linearly Separable
• Only linearly separable functions can be represented by a perceptron
• XOR cannot be represented by a perceptron
XOR
• XOR is not “linearly separable”
– Cannot be represented by a perceptron
• What can we do instead?
1. Convert to logic gates that can be represented by
perceptrons
2. Chain together the gates
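Steps 1 and 2 can be sketched in Python. The decomposition XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)) is my own choice of gates; each of them is linearly separable:

```python
def step(total, theta):
    """Threshold unit: fires iff the input sum reaches theta."""
    return 1 if total >= theta else 0

def OR(x1, x2):   return step(x1 + x2, 1)
def AND(x1, x2):  return step(x1 + x2, 2)
def NAND(x1, x2): return step(-x1 - x2, -1)  # fires unless both inputs are 1

def XOR(x1, x2):
    # Chain the gates: one "hidden layer" (OR and NAND), then an AND output unit.
    return AND(OR(x1, x2), NAND(x1, x2))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", XOR(x1, x2))
```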
Single- vs. Multiple-Layers
• Once we chain together the gates then we have “hidden
layers”
– layers that are “hidden” from the output lines
• Have just seen that hidden layers allow us to represent XOR
– Perceptron is single-layer
– Multiple layers increase the representational power, so
e.g. can represent XOR
• Generally useful nets have multiple-layers
– typically 2-4 layers