Why Information Flow in a Computer-Simulated Cortical
Neural Network Is Important
R. Jarvis (Supervisor: P. Junor)
Department of Electronic Engineering
La Trobe University
July 22, 2015
1 / 31
Introduction
Reverse-engineering principles of brain function: created a neural
network model of rat cortex including anatomically accurate cell shapes.
Observed information flow, making it possible to infer some functions of
individual brain cells.
Made with NEURON-7.3 as a Python module.
100 mm³
http://neuromorpho.org/neuroMorpho/LSVideo.jsp
2 / 31
From Reconstructed Cells to Reconstructed Networks.
To move from reconstructed cells to reconstructed networks,
one must connect neurons in a biologically informed way. Spatial and
anatomical data from the Allen Rodent Brain Atlas are used to inform
cell centre position and synapse polarity.
Neurons are connected by conducting an exhaustive search of the
neural volume for near-contact points and allocating a network
connection to the intervening synaptic cleft.
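The exhaustive near-contact search can be sketched with NumPy. This is an illustrative sketch, not the model's code: the array contents and the function name `find_near_contacts` are hypothetical, and the 10 µm threshold is taken from the HOC listing on a later slide.

```python
import numpy as np

def find_near_contacts(src_pts, tgt_pts, threshold_um=10.0):
    """Return index pairs (i, j) where a source neurite point lies
    within threshold_um of a target neurite point (candidate synapses)."""
    # Pairwise Euclidean distances between all 3-D neurite coordinates.
    diff = src_pts[:, None, :] - tgt_pts[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    return np.argwhere(dists < threshold_um)

# Toy example: two short neurite traces (coordinates in um).
src = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0]])
tgt = np.array([[3.0, 4.0, 0.0], [200.0, 0.0, 0.0]])
pairs = find_near_contacts(src, tgt)  # only src[0]-tgt[0] is within 10 um
```

The brute-force pairwise search mirrors the exhaustive volume search described above; a spatial index would be used for large reconstructions.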
3 / 31
From Neurite Coordinates to Synaptic Clefts
Figure: Two neurons with classified membrane and postsynaptic locations
4 / 31
Wiring Algorithm: Searching the volume of spike sources
HOC
proc wire_distances() {
    forsec $o1.spk_trig_ls {  // iterate sections on the spike-trigger list
        for (x) {
            if ((x > 0) && (x < 1)) {  // skip the zero-area end nodes
                if (ismembrane("xtra")) {
                    find_distances(x_xtra(x), y_xtra(x),
                                   z_xtra(x), secname(), x, $o2)
                }
            }
        }
    }
}
5 / 31
Self Reflection
HOC
// String processing performed elsewhere enables execution
// of strings here; this facilitates network connections.
sprint(synapse_post, "%s%s%g%s", sec_string2, "syn_ = new AMPA(", storex, ")")
execute(synapse_post)  // put a post synapse at the target
sprint(synapse_pre, "%s%s%g%s", sec_string1, "nc = new NetCon(&v(", num_seg, "), syn_)")
execute(synapse_pre)   // connect the presynaptic voltage source
6 / 31
Find Contact Points Algorithm
HOC
forsec $o7.spk_rx_ls {
    sec_string1 = $s4
    num_seg = $5
    if (strcmp(str_src, str_target) == 0) { break }  // no self-connections
    if (Cell[py.int(tail_src)].div.x[py.int(tail_target)] > 3) { break }
    if (ismembrane("xtra")) {
        for (x) {
            if ((x > 0) && (x < 1)) {
                r = sqrt((x_xtra(x) - $1)^2 + (y_xtra(x) - $2)^2 + (z_xtra(x) - $3)^2)
                if (r < 10) {  // units of um
                    if (Cell[py.int(tail_src)].div.x[py.int(tail_target)] < 3) {
                        // ... (code continues beyond this slide)
7 / 31
Pre and Post Synapses
In most of what follows I will be referring to presynaptic locations and
postsynaptic locations.
(sources:) http://www.igi.tugraz.at/maass/129/129a.htm
8 / 31
EPSP and IPSP
Neurons are sometimes thought of as analog-input, digital-output devices.
One or more IPSPs can prevent a spike from occurring by hyperpolarizing
the membrane potential.
(sources:) http://www.cerebromente.org.br/n12/fundamentos/neurotransmissores/neurotransmitters2.html
http://www.studyblue.com/notes/note/n/chapter-48-nervous-system/deck/4169450
9 / 31
Pre Processing Signals: Spike Trains
The continuous membrane potential is thresholded, and the times at which
each neuron fires are stored. Time binning coarse-grains the signal.
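This preprocessing step can be sketched in Python. The spike times and the 1 ms bin width here are illustrative values, not the model's actual parameters.

```python
import numpy as np

def bin_spike_train(spike_times_ms, t_stop_ms, bin_ms=1.0):
    """Coarse-grain a list of thresholded spike times into
    per-bin spike counts."""
    edges = np.arange(0.0, t_stop_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(spike_times_ms, bins=edges)
    return counts

spikes = [0.4, 2.7, 2.9, 7.1]          # firing times in ms
counts = bin_spike_train(spikes, t_stop_ms=10.0)
# counts per 1 ms bin: [1, 0, 2, 0, 0, 0, 0, 1, 0, 0]
```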
10 / 31
Connectivity and the Adjacency Matrix.
I→E=2073, fb1− = 1081
11 / 31
Information In the Spike Train
The number of spikes per bin ∆t is the source of uncertainty in the
spike train.
There are two major sources of information:
1st, the external (sensory) input.
2nd, sources of variability that are intrinsic to the brain's dynamics.
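The uncertainty carried by the binned spike counts can be quantified as Shannon entropy. This is a generic sketch with an illustrative function name and toy data, not the thesis code.

```python
import numpy as np

def entropy_bits(counts):
    """Shannon entropy (bits) of the empirical distribution
    of spike counts per bin."""
    _, freq = np.unique(counts, return_counts=True)
    p = freq / freq.sum()
    return float(-(p * np.log2(p)).sum())

# A bin sequence holding 0 or 1 spikes with equal frequency
# carries exactly 1 bit of uncertainty per bin.
h = entropy_bits([0, 1, 0, 1, 0, 1, 0, 1])
```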
12 / 31
An Interpretation of Information Flow: Prediction
Using transfer entropy we can say that neuronY influences neuronX
if neuronY's past activity reduces the uncertainty about neuronX's
future activity.
neuronY → neuronX
When a particular type of entropy (uncertainty) is reduced,
prediction is increased.
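The prediction-based definition above can be sketched for binary spike trains with history length one. This is a generic illustration in Python, not the thesis code (the model reports normalized TE, nTE); the function name and the toy spike trains are hypothetical.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE(Y -> X) with history length 1, in bits:
    sum over (x_{t+1}, x_t, y_t) of
    p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]."""
    x, y = np.asarray(x), np.asarray(y)
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (future, past_x, past_y)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

# If x is y delayed by one bin, y's past fully predicts x's future,
# so TE(Y -> X) is positive.
y = [0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
x = [0] + y[:-1]
te = transfer_entropy(x, y)
```

With such plug-in estimates, finite spike trains bias TE upward, which is one motivation for the normalization used in nTE.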
13 / 31
Resulting Information Flow
One cell can predict another cell's activity while acting through
intermediate neurons.
A reduction in firing probability still represents an
information-transmission influence.
Figure panels: Connectivity; nTE
14 / 31
Information Flow Between Neurons
The degree matrix shows the quantity and direction of information
transmitted between each pair of neurons.
Figure panels: nTE; Correlation Matrix
The presence of information flow contributes to validation of the
reconstruction, because when information does flow, it suggests
information is not randomly created and destroyed at every node.
15 / 31
Information Flow
In the von Neumann architecture there is no conflict between
transmitting, translating, and storing information; these tasks are
performed by the bus, the CPU, and RAM respectively. A single neuron,
in contrast, simultaneously engages in information transfer,
information alteration, and information storage.
Figure panels: Sums, Cells[0,24]; Sums, Cells[25,49]
16 / 31
Summary
Discussed how TE identified functions: TE distinguished
information-sending neurons from information-receiving neurons.
Transfer entropy was used to quantify information flow within and
between neurons.
I described how information flow in IPSPs can be detected.
I have discussed how the presence of information flow contributed to
the validation of the model.
Generally, I have described the analysis of information flow in a
cortical neural network, and why it is of interest.
17 / 31
References
Kerr, Cliff C et al (2012)
Electrostimulation as a prosthesis for repair of information flow in a
computer model of neocortex
IEEE 20(2), 153 – 160.
Neymotin, Samuel A et al (2011)
Synaptic information transfer in computer models of neocortical
columns
Journal of computational neuroscience 30(1), 69 – 84.
Gourévitch, Boris and Eggermont, Jos J (2007)
Evaluating information transfer between auditory cortical neurons.
Journal of Neurophysiology 97(3), 2533 – 2543
Lizier, Joseph T et al (2012)
Local measures of information storage in complex distributed
computation
Information Sciences 208, 39–54
18 / 31
The End
Made with LaTeX
19 / 31
