Neural Networks

Natural Neural Networks
• Signals travel between neurons as electrochemical impulses
• Synapses release chemical transmitters – the sum of
  which can cause a threshold to be reached – causing
  the neuron to “fire”
• Synapses can be inhibitory or excitatory

Natural Neural Networks
• We are born with about 100 billion neurons

• A neuron may connect to as many as 100,000
  other neurons




Natural Neural Networks
• Many ideas from natural neural networks are still
  used today, e.g.
  – many simple units (“neurons”) combine to give
    increased computational power
  – the idea of a threshold




Modelling a Neuron




The weighted input to unit i is

    in_i = Σ_j w_{j,i} · a_j        and        a_i = g(in_i)

• a_j     : Activation value of unit j
• w_{j,i} : Weight on link from unit j to unit i
• in_i    : Weighted sum of inputs to unit i
• a_i     : Activation value of unit i
• g       : Activation function
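The weighted-sum model above can be sketched in a few lines of Python (a minimal illustration; the function names here are our own):

```python
def unit_output(activations, weights, g):
    """Compute a_i = g(in_i), where in_i = sum_j w_ji * a_j."""
    in_i = sum(w * a for w, a in zip(weights, activations))
    return g(in_i)

# Example: a unit whose activation function is a step with threshold 2
step2 = lambda x: 1 if x >= 2 else 0
```

For instance, `unit_output([1, 1, 0], [2, 2, -1], step2)` gives in_i = 4 and output 1.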
Activation Functions




• Step_t(x)   = 1 if x ≥ t, else 0    (threshold = t)
• Sign(x)     = +1 if x ≥ 0, else –1
• Sigmoid(x)  = 1 / (1 + e^(–x))
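The three activation functions translate directly into code (a small sketch):

```python
import math

def step(x, t=0):
    """Step_t: 1 if x >= t, else 0."""
    return 1 if x >= t else 0

def sign(x):
    """+1 if x >= 0, else -1."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """1 / (1 + e^(-x)): a smooth, differentiable version of the step."""
    return 1 / (1 + math.exp(-x))
```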
Building a Neural Network
1. “Select Structure”: Design the way that the
    neurons are interconnected
2. “Select weights” – decide the strengths with
    which the neurons are interconnected
   – weights are selected so as to get a “good match”
      to a “training set”
   – “training set”: set of inputs and desired
      outputs
   – often use a “learning algorithm”

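The slide mentions a “learning algorithm” without specifying one; the classic perceptron learning rule is one common example. A minimal sketch (the bias term and function name are our own additions):

```python
def train_perceptron(samples, lr=1, epochs=10):
    """Perceptron learning rule: nudge weights toward each training
    example the unit currently misclassifies."""
    n = len(samples[0][0])
    w = [0.0] * n          # one weight per input
    b = 0.0                # bias (acts like a learned threshold)
    for _ in range(epochs):
        for x, target in samples:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            err = target - out          # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b
```

Training on the AND truth table as the training set yields weights that reproduce it exactly.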
Basic Neural Networks
• We will first look at the simplest networks
• “Feed-forward”
  – Signals travel in one direction through the net
  – The net computes a function of the inputs




The First Neural Networks

  X1 ---( 2)--->
  X2 ---( 2)---> Y
  X3 ---(-1)--->

 Neurons in a McCulloch-Pitts network are connected by directed, weighted
 paths

The First Neural Networks

 If the weight on a path is positive the path is excitatory;
 otherwise it is inhibitory

The First Neural Networks

The activation of a neuron is binary. That is, the neuron
either fires (activation of one) or does not fire (activation of
zero).

The First Neural Networks

For the network above, the activation function for unit Y is

    f(y_in) = 1 if y_in ≥ θ, else 0

where y_in is the total input signal received, and
      θ is the threshold for Y

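A McCulloch-Pitts unit of this kind is easy to write down (a sketch; using θ = 2 for the example network is our assumption):

```python
def mp_neuron(inputs, weights, theta):
    # Fires (1) iff the weighted input sum y_in reaches the threshold theta
    y_in = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y_in >= theta else 0
```

With the network above, `mp_neuron([1, 0, 0], [2, 2, -1], theta=2)` fires (2 ≥ 2), while input on X3 alone, `mp_neuron([0, 0, 1], [2, 2, -1], theta=2)`, does not.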
The First Neural Networks

 Originally, all excitatory connections into a particular neuron had the same
 weight, although differently weighted connections could feed into different
 neurons

 Later, weights were allowed to be arbitrary

The First Neural Networks

Each neuron has a fixed threshold. If the net input into the neuron is
greater than or equal to the threshold, the neuron fires

The First Neural Networks

The threshold is set such that any non-zero inhibitory input will prevent
the neuron from firing

Building Logic Gates

• Computers are built out of “logic gates”

• Use the threshold (step) function as the activation
  function
   – all activation values are 0 (false) or 1 (true)




The First Neural Networks

           AND Function

  X1 ---(1)--->
               Y
  X2 ---(1)--->

           X1   X2 | Y
            1    1 | 1
            1    0 | 0
            0    1 | 0
            0    0 | 0

           Threshold(Y) = 2

The First Neural Networks

           OR Function

  X1 ---(2)--->
               Y
  X2 ---(2)--->

           X1   X2 | Y
            1    1 | 1
            1    0 | 1
            0    1 | 1
            0    0 | 0

           Threshold(Y) = 2

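Both gates follow directly from a single threshold unit with the weights and threshold shown above (a minimal sketch):

```python
def threshold_unit(inputs, weights, theta):
    # Fires (1) iff the weighted input sum reaches the threshold
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def AND(x1, x2):
    return threshold_unit([x1, x2], [1, 1], theta=2)   # weights from the AND network

def OR(x1, x2):
    return threshold_unit([x1, x2], [2, 2], theta=2)   # weights from the OR network
```

Evaluating either gate over all four input pairs reproduces its truth table.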
Perceptron
         • Synonym for single-layer,
           feed-forward network

         • First studied in the 1950s

         • Other network types were known,
           but the perceptron was the only
           one capable of learning, so
           research concentrated on it

Perceptron
         • A single weight only affects
           one output so we can restrict
           our investigations to a model
           as shown on the right
         • Notation can be simpler, i.e.




             O = Step_0( Σ_j W_j · I_j )

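In code, this simplified perceptron output is one line (a sketch):

```python
def perceptron_output(inputs, weights):
    """O = Step_0(sum_j W_j * I_j): a step at threshold 0."""
    total = sum(w * i for w, i in zip(weights, inputs))
    return 1 if total >= 0 else 0
```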
What can perceptrons represent?

          AND               XOR
Input 1   0     0   1   1   0     0   1   1
Input 2   0     1   0   1   0     1   0   1
Output    0     0   0   1   0     1   1   0




What can perceptrons represent?
[Figure: the four input points (0,0), (0,1), (1,0), (1,1) plotted in the
plane for AND and for XOR. For AND, one straight line separates the single
“1” point (1,1) from the “0” points; for XOR, no single straight line can
separate the “1” points (0,1) and (1,0) from the “0” points (0,0) and (1,1).]


•     Functions which can be separated in this way are called Linearly Separable
•     Only linearly separable functions can be represented by a perceptron
•     XOR cannot be represented by a perceptron

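We can check this claim by brute force: search a small grid of integer weights and thresholds for a single threshold unit matching each truth table (an illustrative check over that grid, which happens to suffice for binary gates; the search range is our assumption):

```python
from itertools import product

def representable(truth):
    """Search small integer weights w1, w2 and threshold t for a single
    threshold unit computing the given 2-input truth table."""
    for w1, w2, t in product(range(-3, 4), repeat=3):
        if all((1 if w1 * a + w2 * b >= t else 0) == y
               for (a, b), y in truth.items()):
            return True
    return False

AND_TABLE = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR_TABLE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
```

The search finds a unit for AND (e.g. w1 = w2 = 1, t = 2) but none for XOR, in line with the linear-separability argument above.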
XOR
•    XOR is not “linearly separable”
    – Cannot be represented by a perceptron
•    What can we do instead?
    1. Convert to logic gates that can be represented by
       perceptrons
    2. Chain together the gates




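Chaining gates in this way gives a working XOR. One possible decomposition, XOR(a, b) = AND(OR(a, b), NAND(a, b)), uses only single threshold units (a sketch; the particular weights and decomposition are our choice):

```python
def unit(inputs, weights, theta):
    # Single threshold unit: fires iff the weighted sum reaches theta
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def XOR(a, b):
    # Hidden layer: OR and NAND, each a single threshold unit
    h_or = unit([a, b], [2, 2], 2)        # fires if a or b
    h_nand = unit([a, b], [-1, -1], -1)   # fires unless both a and b
    # Output layer: AND of the two hidden units
    return unit([h_or, h_nand], [1, 1], 2)
```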
Single- vs. Multiple-Layers

• Once we chain together the gates then we have “hidden
  layers”
   – layers that are “hidden” from the output lines

• Have just seen that hidden layers allow us to represent XOR
   – Perceptron is single-layer
   – Multiple layers increase the representational power, so
     e.g. can represent XOR

• Generally, useful nets have multiple layers
   – typically 2-4 layers

