INTRODUCTION TO NEURAL
NETWORKS
BY : AHMED YOUSRY
AGENDA
Biological Background
Artificial Neuron
Classes of Neural Networks
1. Perceptrons
2. Multi-Layered Feed-Forward Networks
3. Recurrent Networks.
Modeling the neuron
Activation functions
BIOLOGICAL BACKGROUND
• Neuron consists of:
• Cell body
• Dendrites
• Axon
• Synapses
Neural activation:
 Through dendrites/axon
 Synapses have different strengths
ANN INGREDIENTS
SIMULATION ON ANN
ARTIFICIAL NEURON EXAMPLE
(Diagram: input links (dendrites) carry activations aj over weighted links Wj,i into the unit (cell body); the unit's activation ai leaves along the output links (axon).)

ai = g(ini), where ini = Σj Wj,i aj
CLASS I: PERCEPTRON
a = g(in), where in = Σj Wj aj

For the two-input unit shown:
a = g(-W0 + W1 a1 + W2 a2)

g(in) = { 0 if in < 0
        { 1 if in ≥ 0

(Diagram: inputs a1 and a2, plus a fixed bias input -1, pass through weights W1, W2 and W0 into the output unit a.)
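The perceptron above can be sketched directly in code. The step function g and the bias handling follow the slide; the weight values in the example call are illustrative choices (not from the slides) that happen to implement AND.

```python
def g(in_):
    """Step activation: 0 below zero, 1 at or above it."""
    return 0 if in_ < 0 else 1

def perceptron(a1, a2, w0, w1, w2):
    """a = g(-W0 + W1*a1 + W2*a2): weighted input sum with a bias
    weight w0 on a fixed -1 input, passed through the step function."""
    return g(-w0 + w1 * a1 + w2 * a2)

# Illustrative weights: w0 = 1.5, w1 = w2 = 1.0 give the AND function,
# since the sum only reaches 0 when both inputs are 1.
result = perceptron(1, 1, 1.5, 1.0, 1.0)  # -> 1
```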
CLASS II: MULTI-LAYER
FEED-FORWARD NETWORKS
• Feed-forward:
• Output links only connected
to input links in the next
layer
(Diagram: units arranged in an Input layer, a Hidden layer, and an Output layer.)
Multiple layers:
 hidden layer(s)
 Complex non-linear functions can be represented
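A minimal sketch of a feed-forward pass through one hidden layer, assuming sigmoid units throughout; the layer sizes and weight values below are arbitrary illustrations, not taken from the slides.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One layer: each unit applies sigmoid to its weighted input sum plus bias."""
    return [sigmoid(sum(w * a for w, a in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x, w_hidden, b_hidden, w_out, b_out):
    hidden = layer(x, w_hidden, b_hidden)   # input layer -> hidden layer
    return layer(hidden, w_out, b_out)      # hidden layer -> output layer

# Two inputs, two hidden units, one output unit (weights are arbitrary).
out = forward([1.0, 0.0],
              [[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0],
              [[1.0, 1.0]], [0.0])
```

Because outputs only feed forward into the next layer, the whole network is a fixed function of its inputs, which is what makes this class easy to analyse compared with recurrent networks.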
MULTI LAYER NN EXAMPLE
CLASS III: RECURRENT NETWORKS
(Diagram: Input, Hidden, and Output units with unrestricted connections, including feedback loops.)
No restrictions on connections
Behaviour more difficult to predict/understand
Applications: voice-to-text
MODELLING A NEURON
• aj : activation value of unit j
• Wj,i : weight on the link from unit j to unit i
• ini : weighted sum of inputs to unit i
• ai : activation value of unit i
• g : activation function

ini = Σj Wj,i aj
ACTIVATION FUNCTIONS
• Stept(x) = 1 if x >= t, else 0
• Sign(x) = +1 if x >= 0, else -1
• Sigmoid(x) = 1/(1 + e^(-x))
• Identity(x) = x
• ReLU(x) = max(0, x)
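These activation functions translate directly into code; the following is a straightforward transcription of the definitions above.

```python
import math

def step_t(x, t):
    """Step_t(x) = 1 if x >= t, else 0."""
    return 1 if x >= t else 0

def sign(x):
    """Sign(x) = +1 if x >= 0, else -1."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """Sigmoid(x) = 1 / (1 + e^-x): smooth squashing into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def identity(x):
    return x

def relu(x):
    """ReLU(x) = max(0, x)."""
    return max(0, x)
```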
BOOLEAN FUNCTION
THE FIRST NEURAL NETWORKS
AND Function
Weights: W1 = 1, W2 = 1; Threshold(Y) = 2

X1  X2 | Y
 1   1 | 1
 1   0 | 0
 0   1 | 0
 0   0 | 0
THE FIRST NEURAL NETWORKS
OR Function
Weights: W1 = 2, W2 = 2; Threshold(Y) = 2

X1  X2 | Y
 1   1 | 1
 1   0 | 1
 0   1 | 1
 0   0 | 0
THE FIRST NEURAL NETWORKS
AND NOT Function
Weights: W1 = 2, W2 = -1; Threshold(Y) = 2

X1  X2 | Y
 1   1 | 0
 1   0 | 1
 0   1 | 0
 0   0 | 0
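The three units above share one mechanism: Y fires exactly when the weighted input sum reaches Threshold(Y) = 2. A small sketch using the weights from the slides:

```python
def mp_neuron(x1, x2, w1, w2, threshold=2):
    """McCulloch-Pitts unit: output 1 iff the weighted sum reaches the threshold."""
    return 1 if w1 * x1 + w2 * x2 >= threshold else 0

def AND(x1, x2):     return mp_neuron(x1, x2, 1, 1)    # weights 1, 1
def OR(x1, x2):      return mp_neuron(x1, x2, 2, 2)    # weights 2, 2
def AND_NOT(x1, x2): return mp_neuron(x1, x2, 2, -1)   # weights 2, -1

inputs = [(1, 1), (1, 0), (0, 1), (0, 0)]
and_row = [AND(a, b) for a, b in inputs]   # -> [1, 0, 0, 0], the AND table
```

Only the weights change between the three functions; the thresholding unit itself is identical.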
G5BAIM Neural Networks
What can perceptrons represent?
(Figure: the four input points (0,0), (0,1), (1,0), (1,1) plotted twice, labelled AND and XOR; a single straight line separates the AND points by output class, but no straight line can separate the XOR points.)
• Functions which can be separated in this way are called linearly separable
• Only linearly separable functions can be represented by a perceptron
TRAINING A PERCEPTRON
(Diagram: inputs x and y plus a fixed -1 bias input feed a single threshold unit with t = 0.0; the weights are 0.3 on the bias and 0.5 and -0.4 on the two inputs, as used in the table below.)

I1  I2  I3  Summation                                Output
-1   0   0  (-1*0.3) + (0*0.5) + (0*-0.4) = -0.3     0
-1   0   1  (-1*0.3) + (0*0.5) + (1*-0.4) = -0.7     0
-1   1   0  (-1*0.3) + (1*0.5) + (0*-0.4) =  0.2     1
-1   1   1  (-1*0.3) + (1*0.5) + (1*-0.4) = -0.2     0
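The table can be reproduced with a few lines of code. The wiring assumed here matches the rows above: bias input I1 = -1 with weight 0.3, I2 with weight 0.5, I3 with weight -0.4, and output 1 when the summation exceeds t = 0.0.

```python
WEIGHTS = [0.3, 0.5, -0.4]   # bias weight first, then the two input weights
T = 0.0                      # threshold

def summation(inputs):
    """Weighted sum of the three inputs (bias included as I1 = -1)."""
    return sum(w * i for w, i in zip(WEIGHTS, inputs))

def output(inputs):
    """Fire (1) only when the summation exceeds the threshold."""
    return 1 if summation(inputs) > T else 0

for row in [(-1, 0, 0), (-1, 0, 1), (-1, 1, 0), (-1, 1, 1)]:
    print(row, round(summation(row), 2), output(row))
```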
LEARNING
While epoch produces an error
    Present network with next inputs from epoch
    Err = T - O
    If Err <> 0 Then
        Wj = Wj + LR * Ij * Err
    End If
End While
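The weight-update step of this loop can be sketched in Python; the example values (the weights from the training slide, target 1, output 0, LR = 0.1) are for illustration.

```python
def update(weights, inputs, target, output, lr=0.1):
    """Apply Wj = Wj + LR * Ij * Err, with Err = T - O, to every weight."""
    err = target - output
    return [w + lr * i * err for w, i in zip(weights, inputs)]

# The network output 0 where the target was 1, so Err = 1 and each
# weight moves by 0.1 * Ij: the bias weight falls, the others rise.
new_weights = update([0.3, 0.5, -0.4], [-1, 1, 1], target=1, output=0)
# new_weights ~= [0.2, 0.6, -0.3]
```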
LEARNING
Epoch : Presentation of the entire training set to the neural
network.
In the case of the AND function an epoch consists of
four sets of inputs being presented to the network (i.e.
[0,0], [0,1], [1,0], [1,1])
LEARNING
Training Value, T : When we are training a network we not
only present it with the input but also with a value that
we require the network to produce. For example, if we
present the network with [1,1] for the AND function
the training value will be 1
LEARNING
Error, Err : The error value is the amount by which the value
output by the network differs from the training value.
For example, if we required the network to output 0
and it output a 1, then Err = -1
LEARNING
Output from Neuron, O : the output value from the neuron
Ij : the inputs presented to the neuron
Wj : the weight from input Ij to the output neuron
LR : the learning rate, which dictates how quickly the network
converges. It is set by experimentation and is typically 0.1
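Putting the loop and the definitions together, a sketch of a complete training run on the AND function: the bias input is fixed at -1, the starting weights 0.3, 0.5, -0.4 are carried over from the training slide, and LR = 0.1 as suggested above.

```python
def g(x):
    """Step output: fire when the weighted sum exceeds the threshold 0."""
    return 1 if x > 0 else 0

def train_and(lr=0.1, max_epochs=1000):
    weights = [0.3, 0.5, -0.4]                       # bias weight first
    epoch_data = [([-1, 0, 0], 0), ([-1, 0, 1], 0),  # (inputs, T) pairs;
                  ([-1, 1, 0], 0), ([-1, 1, 1], 1)]  # I1 = -1 is the bias
    for epoch in range(max_epochs):
        had_error = False
        for inputs, target in epoch_data:
            o = g(sum(w * i for w, i in zip(weights, inputs)))
            err = target - o                         # Err = T - O
            if err != 0:
                had_error = True
                weights = [w + lr * i * err          # Wj = Wj + LR*Ij*Err
                           for w, i in zip(weights, inputs)]
        if not had_error:                            # a clean epoch: converged
            return weights, epoch
    return weights, max_epochs

final_weights, epochs_used = train_and()
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates with weights that reproduce the truth table.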