Spiking Neural Network (SNN):
An Introduction I
Learning Group (29/Jun/2018)
Dalin Zhang
Outline
1. Recall
2. Spiking Neural Network
3. Leaky Integrate-and-Fire (LIF)
4. Encoding Approaches
Recall
Neurons in Humans
Recall
Problem:
• Temporal Information
Recall
Problem:
• React only when receiving a pulse
Spiking Neural Network
Spiking Neural Network Architecture
• Encoding
• Build Network
• Loss computation
• Parameter update (learning)
• Decoding
Spiking Neural Network
Spiking Neural Network
Leaky Integrate-and-Fire (LIF)
Pulse: Current/Electric Charge
A spike is fired when the membrane voltage
exceeds the threshold
Human Neuron
Leaky Integrate-and-Fire (LIF)
• Potassium Ion Channel
• Sodium Ion Channel
• Other Ion Channels
Hodgkin-Huxley (HH)
Leaky Integrate-and-Fire (LIF)
Leaky Integrate-and-Fire (LIF) Model
Hypothesis: The model makes use of the fact that neuronal action potentials of a given neuron
always have roughly the same form. If the shape of an action potential is always the same, then the
shape cannot be used to transmit information: rather information is contained in the presence or
absence of a spike. Therefore action potentials are reduced to ‘events’ that happen at a precise
moment in time.
No attempt is made to describe the shape of an action potential.
Leaky Integrate-and-Fire (LIF)
Integrate-and-fire models have two separate components that are both
necessary to define their dynamics:
First, an equation that describes the evolution of the membrane potential;
Second, a mechanism to generate spikes.
Leaky Integrate-and-Fire (LIF)
Zero Input
The passive membrane obeys the linear equation τm du/dt = −(u − u_rest) + R I(t), where τm is the membrane time constant.
We suppose that at time t = t0 the membrane potential takes a value u_rest + Δu. For t > t0 the input vanishes, I(t) = 0. Intuitively, we expect that if we wait long enough, the membrane potential relaxes to its resting value u_rest.
Indeed, the solution of the differential equation with this initial condition is
u(t) − u_rest = Δu · exp(−(t − t0)/τm).
Thus, in the absence of input, the membrane potential decays exponentially to its resting value. The membrane time constant τm is the characteristic time of the decay.
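The decay above can be checked numerically. A minimal sketch (the function name and parameter values are illustrative, not from the slides):

```python
import math

def decay(u0, t, u_rest=0.0, tau_m=10.0):
    """Closed-form zero-input solution of the LIF membrane equation:
    u(t) = u_rest + (u0 - u_rest) * exp(-t / tau_m)."""
    return u_rest + (u0 - u_rest) * math.exp(-t / tau_m)

# After one membrane time constant the deviation from rest
# has fallen to 1/e (about 37%) of its initial value.
```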
Leaky Integrate-and-Fire (LIF)
Constant Input
Suppose that the passive membrane is stimulated by a constant input current I0
which starts at t = t0 and ends at time t1, with the initial condition u(t0) = u_rest.
The solution for t0 < t < t1 is
u(t) = u_rest + R I0 [1 − exp(−(t − t0)/τm)].
If the input current never stopped, the membrane potential would approach u_rest + R I0 for t → ∞.
Once a steady state is reached, the charge on the capacitor no longer changes. All
input current must then flow through the resistor. The steady-state voltage at the
resistor is therefore R I0, so that the total membrane voltage is u_rest + R I0.
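The charging curve can be sketched in one line (parameter values here are illustrative):

```python
import math

def charge(t, i0, r=1.0, u_rest=0.0, tau_m=10.0):
    """Closed-form response to a constant current I0 switched on at t = 0:
    u(t) = u_rest + R * I0 * (1 - exp(-t / tau_m))."""
    return u_rest + r * i0 * (1.0 - math.exp(-t / tau_m))

# The potential climbs monotonically from u_rest toward the
# steady-state value u_rest + R*I0.
```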
Leaky Integrate-and-Fire (LIF)
Pulse Input
For short pulses the steady-state value is never reached. At the end of a pulse of
amplitude I0 and duration Δ, the value of the membrane potential is
u(t0 + Δ) = u_rest + R I0 [1 − exp(−Δ/τm)].
For pulse durations Δ ≪ τm, we find
u(t0 + Δ) ≈ u_rest + R I0 Δ/τm.
Thus, the voltage deflection depends linearly on the amplitude and the duration of the
pulse. As the amplitude I0 is increased and the duration Δ is made shorter and shorter,
the total charge q = I0 Δ delivered by the current pulse is kept the same.
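This limit can be checked numerically: shrinking the pulse while holding the charge q = I0·Δ fixed drives the deflection toward R·q/τm (the parameter values below are illustrative):

```python
import math

tau_m, r, q = 10.0, 1.0, 2.0          # fixed total charge q = I0 * delta
deflections = []
for delta in (5.0, 1.0, 0.1, 0.01):   # shorter and shorter pulses
    i0 = q / delta                    # amplitude grows as duration shrinks
    deflections.append(r * i0 * (1.0 - math.exp(-delta / tau_m)))
# deflections increase toward the limit R * q / tau_m = 0.2 as delta -> 0
```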
Leaky Integrate-and-Fire (LIF)
Pulse Input
Interestingly, the voltage deflection at the end of the pulse then remains unaltered.
What happens for times t > t0 + Δ? The membrane potential evolves from its new initial
value u(t0 + Δ) in the absence of any further input, so we can use the zero-input
solution:
u(t) = u_rest + [u(t0 + Δ) − u_rest] · exp(−(t − t0 − Δ)/τm).
Leaky Integrate-and-Fire (LIF)
Pulse Input
In the limit of an infinitely short pulse carrying a fixed charge q, the input becomes a
Dirac delta, I(t) = q δ(t − t0), so the LIF model equation with pulse input is
τm du/dt = −(u − u_rest) + R q δ(t − t0).
The solution of this linear differential equation is a jump of size R q/τm followed by
exponential decay:
u(t) = u_rest + (R q/τm) exp(−(t − t0)/τm) for t > t0.
Leaky Integrate-and-Fire (LIF)
Spike Generation
The term ‘firing time’ refers to the moment when a given neuron emits an action
potential. The firing time in the leaky integrate-and-fire model is defined by a threshold
criterion: the neuron fires when the membrane potential u(t) reaches the threshold ϑ.
The firing time is denoted t(f), and immediately after t(f) the potential is reset to a new
value u_r < ϑ.
For t > t(f) the dynamics is again given by the membrane equation
until the next threshold crossing occurs.
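The full model (leaky integration plus threshold/reset) can be sketched with a forward-Euler loop; all parameter values below are illustrative, not from the slides:

```python
def simulate_lif(i_input, dt=0.1, t_max=100.0, tau_m=10.0, r=1.0,
                 u_rest=0.0, theta=1.0, u_reset=0.0):
    """Forward-Euler integration of tau_m * du/dt = -(u - u_rest) + R*I(t),
    plus the threshold/reset rule that generates spikes."""
    u = u_rest
    spikes = []
    for step in range(int(t_max / dt)):
        t = step * dt
        u += (-(u - u_rest) + r * i_input(t)) * dt / tau_m
        if u >= theta:              # threshold criterion: u reaches theta
            spikes.append(t)        # record the firing time t(f)
            u = u_reset             # reset immediately after the spike
    return spikes

# A constant suprathreshold current (R*I0 = 1.5 > theta) makes the
# neuron fire at regular intervals; R*I0 = 0.5 < theta never fires.
```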
Leaky Integrate-and-Fire (LIF)
Encoding Approaches
Rate Encoding
The rate coding model of neuronal firing states that as the intensity of a stimulus
increases, the frequency or rate of action potentials ("spike firing") increases.
Example:
Images of size 28x28, greyscale (values in [0, 255]).
Each pixel value is used as the stimulus frequency,
and the stimulus is presented for 500 ms.
A pixel with value 20, encoded as 20 Hz, generates 10 spikes during the presentation.
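The example maps directly to code. A deterministic sketch (a Poisson spike generator is also common; the regular-interval version below matches the slide's arithmetic):

```python
def rate_encode(pixel_value, duration_ms=500.0):
    """Deterministic rate coding sketch: the pixel value (0-255) is read
    as a firing rate in Hz, and spikes are placed at regular intervals."""
    if pixel_value == 0:
        return []
    isi = 1000.0 / pixel_value          # interspike interval in ms
    times, t = [], isi
    while t <= duration_ms:
        times.append(t)
        t += isi
    return times

# Value 20 -> 20 Hz -> 10 spikes in a 500 ms presentation window.
print(len(rate_encode(20)))  # -> 10
```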
Encoding Approaches
Temporal Encoding
When precise spike timing is found to carry information, the neural code is
identified as a temporal code.
Latency Encoding
Example: a pixel with value 20 fires a single spike 20 ms after
stimulus onset; each input emits only one spike.
Interspike Interval Encoding
Example: a pixel with value 20 is encoded as a 20 ms interval
between two spikes.
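Both timing codes from the examples can be sketched directly (the linear value-to-milliseconds mapping follows the slide's example and is only illustrative):

```python
def latency_encode(pixel_value):
    """Latency coding: one spike per input, fired pixel_value ms after
    stimulus onset (so value 20 -> a single spike at 20 ms)."""
    return [float(pixel_value)]

def isi_encode(pixel_value, first_spike_ms=0.0):
    """Interspike-interval coding: two spikes whose gap encodes the
    value (so value 20 -> two spikes separated by 20 ms)."""
    return [first_spike_ms, first_spike_ms + pixel_value]
```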
Encoding Approaches
Population Encoding
Population coding is a method to represent stimuli by using the joint activities of a
number of neurons. In population coding, each neuron has a distribution of
responses over some set of inputs, and the responses of many neurons may be
combined to determine some value about the inputs.
Encode an input variable using multiple overlapping Gaussian Receptive Fields
(RFs). Gaussian RFs are used to generate firing times from real values.
Encoding Approaches
Population Encoding
Encode an input variable using multiple overlapping Gaussian Receptive Fields
(RFs), which generate firing times from real values. For a range [I_min, I_max] of a
variable, also called the coding interval, a set of m Gaussian RF neurons is used.
The center C_i and the width σ_i of each RF neuron i are determined by the
following equations:
where m is the number of receptive fields in each population and γ is a constant,
usually set to 1.5.
Encoding Approaches
When converting the activation values of the RFs into firing times, a threshold ϑ is
imposed on the activation value. A receptive field whose activation value falls
below this threshold is marked as not firing, and the corresponding input neuron
does not contribute to the post-synaptic potential.
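The center/width equations did not survive extraction, so the sketch below assumes the standard Bohte-style formulas, C_i = I_min + ((2i−3)/2)·(I_max−I_min)/(m−2) and σ_i = (I_max−I_min)/(γ(m−2)), which match the γ = 1.5 remark; `t_max` and the threshold value are also illustrative assumptions:

```python
import math

def gaussian_rf_params(i_min, i_max, m, gamma=1.5):
    """Centers and common width of m overlapping Gaussian receptive
    fields covering the coding interval [i_min, i_max]."""
    sigma = (i_max - i_min) / (gamma * (m - 2))
    centers = [i_min + ((2 * i - 3) / 2.0) * (i_max - i_min) / (m - 2)
               for i in range(1, m + 1)]
    return centers, sigma

def rf_firing_times(x, centers, sigma, t_max=10.0, threshold=0.1):
    """Map RF activations to firing times: strong activation -> early
    spike. Activations below the threshold are marked not-firing (None)."""
    times = []
    for c in centers:
        activation = math.exp(-((x - c) ** 2) / (2 * sigma ** 2))
        times.append((1.0 - activation) * t_max
                     if activation >= threshold else None)
    return times
```

A value near the middle of the interval excites the central receptive fields strongly (early spikes), while distant fields stay below ϑ and remain silent.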
Encoding Approaches
Hough Spiker Algorithm (HSA)
Decoding a spike train into an analog signal is a digital-to-analog conversion;
HSA performs the inverse, analog-to-digital conversion of an analog signal into a
spike train.
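A rough sketch of the HSA idea (my reconstruction, since the slide gives only the two labels): the decoder is assumed to be a linear filter, so the encoder greedily emits a spike whenever the remaining signal can still "afford" one copy of the filter, then subtracts that copy.

```python
def hough_spiker(signal, filt):
    """Hough Spiker Algorithm sketch: signal-to-spike (A/D) conversion
    that inverts an assumed filter-based spike-to-signal (D/A) decoder."""
    signal = list(signal)               # work on a copy
    spikes = [0] * len(signal)
    for t in range(len(signal) - len(filt) + 1):
        window = signal[t:t + len(filt)]
        if all(s >= f for s, f in zip(window, filt)):
            spikes[t] = 1               # fire, then remove the filter's
            for k, f in enumerate(filt):  # contribution from the signal
                signal[t + k] -= f
    return spikes
```

Decoding the resulting spike train by convolving it with the same filter then approximately reconstructs the original signal.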
Encoding Approaches
29/06/2018
THANK
YOU
