Advances in automating analysis
of neural time series
Mainak Jas
Advisor:
Alexandre Gramfort
Ph.D. defense
April 12, 2018
Where does the data come from?
Electroencephalography (EEG) and Magnetoencephalography (MEG)
The sensors pick up:
• Spontaneous brain activity
• Earth's magnetic field
• Traffic, electrical disturbances
• Sensor noise
We will analyze multivariate time series
(figure: multichannel recordings over time; a trigger channel segments the data into trials, and averaging the trials yields the evoked response)
N170: faces, P300: surprise, N400: words, etc.
Manual analysis cannot be scaled and is not reproducible
"Quite often in the course of a project parameters are modified, list of subjects are changed, and processing steps need to be rerun … automating instead of manual interventions can really pay off"
—Russell Poldrack¹
¹http://biorxiv.org/content/biorxiv/early/2016/02/12/039354.full.pdf
(A practical guide for improving transparency and reproducibility in neuroimaging research)
Contributions
Reproducibility
• A tutorial paper for group analysis of M/EEG data (submitted to Frontiers in Neuroscience)
• Brain Imaging Data Structure for MEG (submitted to Nature Scientific Data)
Automation
• Autoreject to remove artifacts (NeuroImage, 2017)
• AlphaCSC to learn brain waveforms (NIPS, 2017)
Contribution I: Brain Imaging Data Structure (BIDS) validator
• Automatic converter MNE-BIDS
• BIDS compatible dataset ds000248
Contribution II: Tutorial paper on group analysis
• Diagnostic plots
• Alternatives
• Statistics
Code: http://mne-tools.github.io/mne-biomag-group-demo/
[Jas, Larson, Engemann, Taulu, Hamalainen, Gramfort. 2017]
Contribution III: Automatic rejection of artifacts
(figure: sensors × trials data matrix with a bad trial highlighted)
[Jas, Engemann, Raimondo, Bekhti, Gramfort. NeuroImage. 2017]
Related work
• Riemannian Potato [Barachant, 2013]
• PREP (RANSAC) [Bigdely-Shamlo, 2015]
• FASTER [Nolan, 2010]
• Robust regression [Diedrichsen, 2005]
• Sensor Noise Suppression [Cheveigné, 2008]
Rejecting based on peak-to-peak amplitudes
Compute the peak-to-peak amplitude A of a trial.
Is A < threshold τ? Yes → good data. No → bad data.
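A minimal sketch of this rule, assuming `epochs` is an (n_trials, n_sensors, n_times) NumPy array; the function name and return convention are illustrative, not autoreject's API:

```python
import numpy as np

def reject_ptp(epochs, tau):
    """Keep trials whose peak-to-peak amplitude stays below tau on every sensor."""
    ptp = epochs.max(axis=2) - epochs.min(axis=2)  # (n_trials, n_sensors)
    good = (ptp < tau).all(axis=1)                 # a trial is good if all sensors pass
    return epochs[good], good
```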
Observation: The optimal threshold retains sufficient trials while rejecting outliers.
(figure: evoked responses for three thresholds — too strict: too few trials; too loose: outliers not removed; in between: optimal)
How do we measure data quality?
Many trials for a single sensor; some trials carry an artifact, and the trial average is the evoked response.
Split the trials into a training set and a validation set, reject on the training set with threshold τ, and compare:
$$\mathrm{RMSE}(\tau) = \left\| \overline{X}_{\mathrm{train}}(\tau) - \overline{X}_{\mathrm{val}} \right\|_{\mathrm{Fro}}$$
(lower error = cleaner training set)
But what if my validation data also has artifacts?
Answer: use the median $\widetilde{X}_{\mathrm{val}}$ of the validation set instead of the mean $\overline{X}_{\mathrm{val}}$:
$$\mathrm{RMSE}(\tau) = \left\| \overline{X}_{\mathrm{train}}(\tau) - \widetilde{X}_{\mathrm{val}} \right\|_{\mathrm{Fro}}$$
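A sketch of the resulting cross-validated threshold search, under the same array layout as above; `cv_error` is a hypothetical helper, and the real autoreject implementation differs in its details:

```python
import numpy as np
from sklearn.model_selection import KFold

def cv_error(epochs, thresholds, n_splits=5):
    """Mean CV error of the evoked response for each candidate threshold tau."""
    errors = np.zeros(len(thresholds))
    for train, val in KFold(n_splits).split(epochs):
        x_val = np.median(epochs[val], axis=0)  # median: robust to artifacts in val
        ptp = epochs[train].max(axis=2) - epochs[train].min(axis=2)
        for i, tau in enumerate(thresholds):
            keep = (ptp < tau).all(axis=1)
            if not keep.any():                  # nothing survives: infinite error
                errors[i] = np.inf
                continue
            x_train = epochs[train][keep].mean(axis=0)
            errors[i] += np.linalg.norm(x_train - x_val)  # Frobenius norm
    return errors / n_splits

# tau_star = thresholds[np.argmin(cv_error(epochs, thresholds))]
```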
Autoreject (global) vs. human threshold
Remove a trial if the data in any sensor exceeds τ; average the error with 5-fold cross-validation.
(figure: RMSE (μV) as a function of the threshold (μV); the cross-validated minimum picked by autoreject (global) lies close to the manually chosen threshold)
Remove trial if data in any sensor > τ
How to sample the thresholds?
16
How to sample the thresholds?
16
error
Thresholds
How to sample the thresholds?
16
error
Thresholds
How to sample the thresholds?
16
Grid search
error
Thresholds
How to sample the thresholds?
16
Grid search
Grid resolution
error
Thresholds
How to sample the thresholds?
16
Random search
error
Thresholds
Bayesian optimization: a more efficient approach
• Fit a Gaussian process to the observed (threshold, error) pairs
• An acquisition function resolves the exploration vs exploitation dilemma and picks the next threshold to evaluate
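A toy sketch of the idea, assuming a black-box `cv_error(tau)` like the helper above; it fits a Gaussian process and picks the next threshold by expected improvement. This illustrates the principle only; it is not the optimizer shipped with autoreject:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def bayes_opt(cv_error, lo, hi, n_iter=20, n_init=4, seed=0):
    rng = np.random.default_rng(seed)
    taus = list(rng.uniform(lo, hi, n_init))      # random initial design
    errs = [cv_error(t) for t in taus]
    grid = np.linspace(lo, hi, 200)[:, None]
    gp = GaussianProcessRegressor(normalize_y=True)
    for _ in range(n_iter):
        gp.fit(np.array(taus)[:, None], errs)
        mu, sd = gp.predict(grid, return_std=True)
        imp = min(errs) - mu                      # we are minimizing the error
        z = imp / np.maximum(sd, 1e-12)
        ei = imp * norm.cdf(z) + sd * norm.pdf(z) # expected improvement acquisition
        tau = float(grid[np.argmax(ei), 0])       # trades exploration vs exploitation
        taus.append(tau)
        errs.append(cv_error(tau))
    return taus[int(np.argmin(errs))]
```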
Each sensor has a different threshold
(figure: histogram of the learned thresholds across sensors — number of sensors vs. threshold, in fT/cm)
Schematic of the complexity of the problem
(schematic: trials × sensors grid showing a bad trial, a globally bad sensor, a locally bad sensor, and local artifacts)
Naïve solution: drop bad trials and sensors.
Proposed solution (see the sketch below):
if #bad-sensors > κ (=4): drop the trial
else: interpolate the ρ (=2) worst bad sensors per trial
Interpolation of a bad sensor:
• EEG: spherical splines [Perrin et al., 1989]
• MEG: MNE [Hamalainen et al., 1994]
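A sketch of this decision rule, assuming per-sensor thresholds `thresh` have already been learned and that an `interpolate(trial, bad_idx)` routine is available (spherical splines for EEG, MNE field interpolation for MEG); all names here are illustrative:

```python
import numpy as np

def repair_trials(epochs, thresh, interpolate, kappa=4, rho=2):
    cleaned, keep = [], []
    ptp = epochs.max(axis=2) - epochs.min(axis=2)      # (n_trials, n_sensors)
    for trial, amp in zip(epochs, ptp):
        bad = np.where(amp > thresh)[0]                # sensors over their threshold
        if len(bad) > kappa:
            keep.append(False)                         # too many bad sensors: drop trial
            continue
        worst = bad[np.argsort(amp[bad])[::-1][:rho]]  # the rho worst offenders
        cleaned.append(interpolate(trial, worst) if len(worst) else trial)
        keep.append(True)
    return np.array(cleaned), np.array(keep)
```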
Autoreject in action: MNE sample data (MEG)
http://autoreject.github.io/auto_examples/plot_auto_repair.html
Autoreject in action (EEG data)
Before autoreject / After autoreject
19-subject Faces dataset [Wakeman and Henson, 2015]
Quantitative evaluation
For each subject, compare $\| \overline{X}_{\mathrm{method}} - \overline{X}_{\mathrm{human}} \|$ against competing methods: FASTER [Nolan, 2010], RANSAC [Bigdely-Shamlo, 2015], SNS [Cheveigné, 2008].
(figure: scatter with one point per subject, split into an "Autoreject better" and a "Competing method better" region)
Transparency: an example diagnostic plot
• Sensor to be interpolated
• Bad sensor but not going to be interpolated
• Bad trials
http://autoreject.github.io/auto_examples/plot_visualize_bad_epochs.html
Contribution IV: Learn representations from neural data
#1 The shape of brain rhythms matters, e.g. the asymmetry of the μ rhythm [Cole and Voytek, 2017]
#2 Filtering
[Jas, Dupré la Tour, Şimşekli, Gramfort. NIPS. 2017]
Some attempts in the neuroscience community
• MoTIF [Jost et al., 2006]
• Multivariate temporal dictionary learning [Barthélemy et al., 2013]
• Learning recurrent waveforms in EEG [Brockmeier & Principe, 2016]
• Adaptive Waveform Learning [Hitziger, 2017] — template waveform, coefficient updates, waveform updates, different durations
• Sliding Window Method [Gips et al., 2017]
The problem setup (simulation)
• Two different atoms (can have more)
• Different amplitudes
• Different locations
• Atoms can even overlap
• Small Gaussian noise
Convolutional Sparse Coding (CSC) formulation
Each trial $x_n$ is modeled as a sum of K atoms $d_k$ convolved with sparse, non-negative activations $z_n^k$:
$$x_n = \sum_k z_n^k * d_k$$
$$\min_{d, z} \sum_n \left( \left\| x_n - \sum_k d_k * z_n^k \right\|_2^2 + \lambda \sum_k \left\| z_n^k \right\|_1 \right) \quad \text{s.t.} \quad z_n^k \geq 0, \;\; \left\| d_k \right\|_2^2 \leq 1$$
The ℓ1 penalty enforces sparsity of the activations.
[Kavukcuoglu et al., 2010] [Zeiler et al., 2010] [Gross et al., 2012] [Heide et al., 2015]
Basic strategy: alternate minimization
$$\min_{d, z} \sum_n \left( \frac{1}{2} \left\| x_n - \sum_k d_k * z_n^k \right\|_2^2 + \lambda \sum_k \left\| z_n^k \right\|_1 \right)$$
• Not jointly convex in both d and z
• Convex when d or z is fixed
• Alternate minimization (a z-step with d fixed, then a d-step with z fixed) guarantees the cost function goes down at every step
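A skeleton of the alternation, assuming univariate trials `X` of shape (n_trials, T), atoms `d` of shape (K, L) and non-negative activations `z` of shape (n_trials, K, T − L + 1); `update_z` and `update_d` stand for the two convex sub-solvers sketched on the next slides:

```python
import numpy as np

def objective(X, d, z, lam):
    """CSC cost: squared reconstruction error plus l1 penalty (z >= 0)."""
    recon = np.stack([sum(np.convolve(z_nk, d_k) for z_nk, d_k in zip(z_n, d))
                      for z_n in z])         # (n_trials, T), full convolutions
    return 0.5 * np.sum((X - recon) ** 2) + lam * z.sum()

def alternate_min(X, d, z, lam, n_iter=50):
    for it in range(n_iter):
        z = update_z(X, d, z, lam)           # convex in z for fixed d
        d = update_d(X, d, z)                # convex in d for fixed z
        print(it, objective(X, d, z, lam))   # non-increasing by construction
    return d, z
```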
z-step: The Toeplitz matrix trick
Convolution with a fixed atom is linear: $d_k * z_n^k = D_k z_n^k$, where $D_k$ is a Toeplitz matrix built from $d_k$, so the z-step reads
$$\min_{z} \sum_n \left\| x_n - \sum_k D_k z_n^k \right\|_2^2 + \lambda \sum_k \left\| z_n^k \right\|_1$$
Concatenating $D = [D_1 \cdots D_K]$ and stacking $z_n = [z_n^1; \ldots; z_n^K]$ gives $\sum_k D_k z_n^k = D z_n$, hence, per trial,
$$\min_{z_n \geq 0} \frac{1}{2} \left\| x_n - D z_n \right\|_2^2 + \lambda \left\| z_n \right\|_1$$
with gradient $D^T (D z_n - x_n) + \lambda \mathbf{1}$.
Now, we can feed the gradient to the L-BFGS-B optimization algorithm.
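A compact sketch of the z-step for one trial, under the same shapes as above; `make_D` materializes the Toeplitz blocks explicitly for clarity (real implementations avoid building D):

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import fmin_l_bfgs_b

def make_D(d, T):
    """Concatenate one Toeplitz block per atom so that D @ z = sum_k d_k * z^k."""
    blocks = []
    for d_k in d:                                 # d_k has length L
        col = np.r_[d_k, np.zeros(T - len(d_k))]  # first column of D_k
        row = np.zeros(T - len(d_k) + 1)
        row[0] = d_k[0]                           # first row of D_k
        blocks.append(toeplitz(col, row))
    return np.hstack(blocks)                      # (T, K * (T - L + 1))

def z_step(x, D, lam):
    def f_and_grad(z):
        resid = D @ z - x
        # l1 on z >= 0 is just a sum, so the gradient is D^T resid + lam * 1
        return 0.5 * resid @ resid + lam * z.sum(), D.T @ resid + lam
    z0 = np.zeros(D.shape[1])
    z_hat, _, _ = fmin_l_bfgs_b(f_and_grad, z0, bounds=[(0, None)] * len(z0))
    return z_hat
```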
d-step: Strided matrices
Symmetrically, $d_k * z_n^k = Z_n^k d_k$, where $Z_n^k$ is a strided matrix built from the activations $z_n^k$. The d-step becomes
$$\min_{\| d_k \|_2^2 \leq 1} \sum_n \frac{1}{2} \left\| x_n - \sum_k Z_n^k d_k \right\|_2^2$$
We remove the regularization term as it doesn't depend on d. This is a least-squares problem under a unit-norm constraint.
d-step: The atoms are updated using L-BFGS-B
The gradient with respect to $d_k$ is
$$\sum_n \left( Z_n^k \right)^T \left( \sum_k Z_n^k d_k - x_n \right)$$
Projected gradient descent:
1. Gradient step
2. Projection step: $\hat{d}_k \leftarrow \hat{d}_k \cdot \min\left( 1, 1 / \| \hat{d}_k \|_2 \right)$
In practice, it is more complicated … L-BFGS-B in the dual.
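A sketch of the simpler projected-gradient variant described on the slide (the paper's solver instead runs L-BFGS-B on the dual); `Z` holds the strided matrices, and the learning rate is an illustrative constant:

```python
import numpy as np

def d_step(X, Z, d, lr=1e-3, n_iter=100):
    # X: (n_trials, T); Z: (n_trials, K, T, L) strided matrices; d: (K, L)
    for _ in range(n_iter):
        resid = np.einsum('nktl,kl->nt', Z, d) - X  # sum_k Z_n^k d_k - x_n
        grad = np.einsum('nktl,nt->kl', Z, resid)   # (Z_n^k)^T residual, per atom
        d = d - lr * grad                           # 1. gradient step
        norms = np.linalg.norm(d, axis=1, keepdims=True)
        d = d / np.maximum(norms, 1.0)              # 2. project onto the unit ball
    return d
```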
Putting it all together: alternate minimization
• z update — Toeplitz D (L-BFGS-B)
• d update — strided Z (L-BFGS-B)
The quasi-Newton solvers outperform ADMM solvers
K = 10 atoms, L = 32 samples per atom
(figure: objective vs. time; markers at 30 minutes, 45 minutes, 1 hour and 2 hours)
The quasi-Newton solver outperforms ADMM solvers
(figure: convergence across different random seeds, quasi-Newton vs. the ADMM solvers)
Cross-frequency coupling uncovered via CSC
(figure: a learned atom with ~80 Hz activity coupled to a slower rhythm)
[Canolty, 2006] [Dupré la Tour, 2017]
Challenge: Neural data often contains transient artifacts
CSC in the presence of transient artifacts
(figure: recordings containing both neural signals and transient artifacts)
A probabilistic interpretation
The CSC objective
$$\min_{d, z} \sum_n \left( \left\| x_n - \sum_k d_k * z_n^k \right\|_2^2 + \lambda \sum_k \left\| z_n^k \right\|_1 \right)$$
is a maximum a posteriori (MAP) estimate:
$$(d^*, z^*) = \arg\max_{d, z} \sum_{n, t} \log p(x_{n,t} \mid d, z) + \sum \log p(z)$$
Data likelihood: $x_{n,t} \mid d, z \sim \mathcal{N}\left( \left[ \sum_{k=1}^{K} d_k * z_n^k \right]_t, 1 \right)$
Prior: $z_{n,t}^k \sim \mathcal{E}(\lambda)$ — a larger λ means sparser activations.
Alpha-stable distributions [Samorodnitsky et al., 1996]
$$X \sim \mathcal{S}(\alpha, \beta, \sigma, \mu)$$
• Characteristic exponent $\alpha \in (0, 2]$ — governs the heavy tail
• Skewness parameter $\beta \in [-1, 1]$
• Scale parameter $\sigma \in (0, \infty)$
• Location parameter $\mu \in (-\infty, \infty)$
Special cases:
• Normal distribution: $X \sim \mathcal{S}(2, 0, \sigma, \mu)$
• Cauchy distribution: $X \sim \mathcal{S}(1, 0, \sigma, \mu)$
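The heavy tail is easy to see numerically; a quick illustration with SciPy's `levy_stable` (its first two arguments are α and β):

```python
import numpy as np
from scipy.stats import levy_stable

gauss = levy_stable.rvs(2.0, 0.0, size=10_000, random_state=0)  # alpha = 2: Gaussian
heavy = levy_stable.rvs(1.2, 0.0, size=10_000, random_state=0)  # alpha < 2: heavy tails
print(np.abs(gauss).max(), np.abs(heavy).max())  # extreme draws far larger for alpha < 2
```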
From light tails to heavy tails
Replace the Gaussian likelihood
$$x_{n,t} \mid d, z \sim \mathcal{N}\left( \left[ \sum_{k=1}^{K} d_k * z_n^k \right]_t, 1 \right)$$
with a symmetric alpha-stable likelihood
$$x_{n,t} \mid d, z \sim \mathcal{S}\left( \alpha, 0, 1/\sqrt{2}, \left[ \sum_{k=1}^{K} d_k * z_n^k \right]_t \right)$$
(for α = 2 this reduces to the Gaussian above)
Symmetric α-stable distribution is conditionally Gaussian [Samorodnitsky et al., 1996]
$$x_{n,t} \mid d, z \sim \mathcal{S}\left( \alpha, 0, 1/\sqrt{2}, \left[ \sum_{k=1}^{K} d_k * z_n^k \right]_t \right)$$
is equivalent to
$$x_{n,t} \mid \phi_{n,t}, d, z \sim \mathcal{N}\left( \left[ \sum_{k=1}^{K} d_k * z_n^k \right]_t, \tfrac{1}{2} \phi_{n,t} \right)$$
with $\phi_{n,t}$ drawn from a positive stable distribution. Conditionally on φ, the MAP problem is
$$\min_{d, z} \sum_n \left\| \frac{1}{\sqrt{\phi_n}} \odot \left( x_n - \sum_k d_k * z_n^k \right) \right\|_2^2 + \lambda \sum_k \left\| z_n^k \right\|_1$$
Weighted CSC! But $\phi_{n,t}$ is not known …
EM algorithm is used to estimate the MAP when some variables are missing [Dempster et al., 1977]
Expectation step:
$$B^{(i)}(d, z) = \mathbb{E}_{p(\phi \mid x, d^{(i)}, z^{(i)})}\left[ \log p(x, \phi \mid d, z) \right] + \log p(z)$$
which reduces to a weighted CSC objective with weights $w_{n,t} = \mathbb{E}\left[ 1/\phi_{n,t} \right]$:
$$B^{(i)}(d, z) = -\sum_n \left\| \sqrt{w_n} \odot \left( x_n - \sum_k d_k * z_n^k \right) \right\|_2^2 - \lambda \sum_k \left\| z_n^k \right\|_1$$
Maximization step:
$$(d^{(i+1)}, z^{(i+1)}) = \arg\max_{d, z} B^{(i)}(d, z)$$
Iterate until convergence.
For the E-step, we compute the expectation using sampling [Bishop, 2007]
$$\mathbb{E}\left[ \frac{1}{\phi_{n,t}} \right] = \int \frac{1}{\phi_{n,t}} \, p(\phi_{n,t} \mid x, d, z) \, d\phi_{n,t} \approx \frac{1}{J} \sum_{j=1}^{J} \frac{1}{\phi_{n,t}^{(j)}}$$
where the $\phi_{n,t}^{(j)}$ are sampled from the posterior distribution with Markov Chain Monte Carlo (MCMC).
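As a sketch, assuming a `sample_phi(J)` routine that returns J posterior draws of φ (obtained with MCMC in the paper), the weights are just averaged reciprocals:

```python
import numpy as np

def estimate_weights(sample_phi, J=100):
    phi = sample_phi(J)               # (J, n_trials, T) posterior samples of phi
    return (1.0 / phi).mean(axis=0)   # Monte Carlo estimate of E[1 / phi_{n,t}]
```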
Let's recap
• Data likelihood term: $x_{n,t} \mid \phi_{n,t}, d, z \sim \mathcal{N}\left( \left[ \sum_{k=1}^{K} d_k * z_n^k \right]_t, \tfrac{1}{2} \phi_{n,t} \right)$ — conditionally Gaussian, with latent variable $\phi_{n,t}$
• EM algorithm:
  E step: MCMC to learn the weights $\mathbb{E}\left[ 1/\phi_{n,t} \right]$
  M step: weighted CSC
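Putting the recap in pseudo-Python: a skeleton of the full αCSC loop, where `estimate_weights` is the E-step sketched above, and `weighted_csc` and `sample_phi_posterior` are hypothetical names for the weighted alternating d/z solver and the MCMC sampler:

```python
def alpha_csc(X, d, z, lam, n_em_iter=10):
    for _ in range(n_em_iter):
        # E-step: MCMC posterior samples of phi give the weights E[1/phi]
        w = estimate_weights(lambda J: sample_phi_posterior(X, d, z, J))
        # M-step: weighted CSC, i.e. the alternating d/z solver with
        # per-sample weights in the quadratic term
        d, z = weighted_csc(X, w, d, z, lam)
    return d, z
```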
αCSC on simulated data
[Jost et al., 2006]
αCSC with α = 2 (reduces to Gaussian)
αCSC with α ≠ 2
Conclusion
• Data sharing and data standards
• Autoreject to remove artifacts
• αCSC to learn recurring waveforms
Credits
Alexandre Gramfort, Denis Engemann, Yousra Bekhti, Eric Larson, Umut Şimşekli, Tom Dupré la Tour
Thank you

More Related Content

PPT
Durkheim Project: Social Media Risk & Bayesian Counters
PPTX
Machine Learning for Data Extraction
PDF
Using Neural Net Algorithms to Classify Human Activity, with Applications in ...
PDF
IRJET-Sound-Quality Predict for Medium Cooling Fan Noise based on BP Neural N...
PDF
Ivy Zhu, Research Scientist, Intel at MLconf SEA - 5/01/15
PDF
A Dual congress Psychiatry and the Neurosciences
PDF
20190122_cohenadad_sc-mri-workshop
PDF
bbbPaper
Durkheim Project: Social Media Risk & Bayesian Counters
Machine Learning for Data Extraction
Using Neural Net Algorithms to Classify Human Activity, with Applications in ...
IRJET-Sound-Quality Predict for Medium Cooling Fan Noise based on BP Neural N...
Ivy Zhu, Research Scientist, Intel at MLconf SEA - 5/01/15
A Dual congress Psychiatry and the Neurosciences
20190122_cohenadad_sc-mri-workshop
bbbPaper

Similar to Advances in automating analysis of neural time series (20)

PDF
DSSV2022_np_eeg_toolbox
PDF
Deep Learning: concepts and use cases (October 2018)
PDF
Measuring EEG in vivo for Preclinical Evaluation of Sleep and Alzheimer’s Dis...
PPTX
Artificial neural network
PPTX
The second seminar
PPT
1_sLORETA_Tutorial.pptjjjjjjjjjjjjjjjjjjjjjjjjjjjjj
PDF
Analysis of EEG Signal using nonextensive statistics
PPTX
EEG course.pptx
PDF
Laskaris mining information_neuroinformatics
PDF
A04401001013
PPT
Summer Work
PDF
EEG Signal Classification using Deep Neural Network
PDF
Heuristic Algorithms.pdf
PDF
Spatial and Temporal Features of Noise in fMRI
PDF
Neuroscience: Myths, Metaphors and Marketing
PDF
Hacking Brain Computer Interfaces
PDF
Honey, I Deep-shrunk the Sample Covariance Matrix! by Erk Subasi at QuantCon ...
PDF
Signal Processing For Neuroscientists An Introduction To The Analysis Of Phys...
PDF
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
PPTX
Fetal Health Final ppt using machine learning.pptx
DSSV2022_np_eeg_toolbox
Deep Learning: concepts and use cases (October 2018)
Measuring EEG in vivo for Preclinical Evaluation of Sleep and Alzheimer’s Dis...
Artificial neural network
The second seminar
1_sLORETA_Tutorial.pptjjjjjjjjjjjjjjjjjjjjjjjjjjjjj
Analysis of EEG Signal using nonextensive statistics
EEG course.pptx
Laskaris mining information_neuroinformatics
A04401001013
Summer Work
EEG Signal Classification using Deep Neural Network
Heuristic Algorithms.pdf
Spatial and Temporal Features of Noise in fMRI
Neuroscience: Myths, Metaphors and Marketing
Hacking Brain Computer Interfaces
Honey, I Deep-shrunk the Sample Covariance Matrix! by Erk Subasi at QuantCon ...
Signal Processing For Neuroscientists An Introduction To The Analysis Of Phys...
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
Fetal Health Final ppt using machine learning.pptx
Ad

Recently uploaded (20)

PPTX
TORCH INFECTIONS in pregnancy with toxoplasma
PPTX
gene cloning powerpoint for general biology 2
PDF
Wound infection.pdfWound infection.pdf123
PDF
Worlds Next Door: A Candidate Giant Planet Imaged in the Habitable Zone of ↵ ...
PPTX
Presentation1 INTRODUCTION TO ENZYMES.pptx
PDF
Cosmic Outliers: Low-spin Halos Explain the Abundance, Compactness, and Redsh...
PPTX
A powerpoint on colorectal cancer with brief background
PPTX
ap-psych-ch-1-introduction-to-psychology-presentation.pptx
PPTX
perinatal infections 2-171220190027.pptx
PPTX
BODY FLUIDS AND CIRCULATION class 11 .pptx
PDF
Unit 5 Preparations, Reactions, Properties and Isomersim of Organic Compounds...
PPTX
INTRODUCTION TO PAEDIATRICS AND PAEDIATRIC HISTORY TAKING-1.pptx
PPT
Animal tissues, epithelial, muscle, connective, nervous tissue
PPT
LEC Synthetic Biology and its application.ppt
PPT
Heredity-grade-9 Heredity-grade-9. Heredity-grade-9.
PDF
Looking into the jet cone of the neutrino-associated very high-energy blazar ...
PPT
Presentation of a Romanian Institutee 2.
PDF
CHAPTER 3 Cell Structures and Their Functions Lecture Outline.pdf
PDF
CHAPTER 2 The Chemical Basis of Life Lecture Outline.pdf
PPTX
Probability.pptx pearl lecture first year
TORCH INFECTIONS in pregnancy with toxoplasma
gene cloning powerpoint for general biology 2
Wound infection.pdfWound infection.pdf123
Worlds Next Door: A Candidate Giant Planet Imaged in the Habitable Zone of ↵ ...
Presentation1 INTRODUCTION TO ENZYMES.pptx
Cosmic Outliers: Low-spin Halos Explain the Abundance, Compactness, and Redsh...
A powerpoint on colorectal cancer with brief background
ap-psych-ch-1-introduction-to-psychology-presentation.pptx
perinatal infections 2-171220190027.pptx
BODY FLUIDS AND CIRCULATION class 11 .pptx
Unit 5 Preparations, Reactions, Properties and Isomersim of Organic Compounds...
INTRODUCTION TO PAEDIATRICS AND PAEDIATRIC HISTORY TAKING-1.pptx
Animal tissues, epithelial, muscle, connective, nervous tissue
LEC Synthetic Biology and its application.ppt
Heredity-grade-9 Heredity-grade-9. Heredity-grade-9.
Looking into the jet cone of the neutrino-associated very high-energy blazar ...
Presentation of a Romanian Institutee 2.
CHAPTER 3 Cell Structures and Their Functions Lecture Outline.pdf
CHAPTER 2 The Chemical Basis of Life Lecture Outline.pdf
Probability.pptx pearl lecture first year
Ad

Advances in automating analysis of neural time series

  • 1. Advances in automating analysis of neural time series Mainak Jas Advisor: Alexandre Gramfort Ph.D. defense April 12, 2018
  • 2. Advances in automating analysis of neural time series 2 Where does the data come from?
  • 3. Advances in automating analysis of neural time series 2 Where does the data come from? Electroencephalography (EEG) Magnetoencephalography (MEG)
  • 4. Advances in automating analysis of neural time series 2 Where does the data come from? Electroencephalography (EEG) Magnetoencephalography (MEG)
  • 5. Advances in automating analysis of neural time series 2 Where does the data come from? Electroencephalography (EEG) Magnetoencephalography (MEG) Spontaneous brain activity
  • 6. Advances in automating analysis of neural time series 2 Where does the data come from? Electroencephalography (EEG) Magnetoencephalography (MEG) Earth’s magnetic field Spontaneous brain activity
  • 7. Advances in automating analysis of neural time series 2 Where does the data come from? Electroencephalography (EEG) Magnetoencephalography (MEG) Earth’s magnetic field Traffic, Electrical disturbance Spontaneous brain activity
  • 8. Advances in automating analysis of neural time series 2 Where does the data come from? Electroencephalography (EEG) Magnetoencephalography (MEG) Earth’s magnetic field Traffic, Electrical disturbance Spontaneous brain activity Sensor noise
  • 9. Advances in automating analysis of neural time series 3 We will analyze multivariate time series
  • 10. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time
  • 11. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time
  • 12. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time
  • 13. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time
  • 14. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time
  • 15. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time Trigger channel
  • 16. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time Trials Trigger channel
  • 17. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time Trials Trigger channel
  • 18. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time Trials Trigger channel Evoked response
  • 19. Advances in automating analysis of neural time series 3 We will analyze multivariate time series Time Trials Trigger channel Evoked response N170: faces P300: surprise N400: words etc.
  • 20. 4 Manual analysis cannot be scaled and is not reproducible
  • 21. 4 Manual analysis cannot be scaled and is not reproducible 1http://biorxiv.org/content/biorxiv/early/2016/02/12/039354.full.pdf (A practical guide for improving transparency and reproducibility in neuroimaging research) “Quite often in the course of a project parameters are modified, list of subjects are changed, and processing steps need to be rerun … automating instead of manual interventions can really pay off” —Russel Poldrack1
  • 22. 4 Manual analysis cannot be scaled and is not reproducible 1http://biorxiv.org/content/biorxiv/early/2016/02/12/039354.full.pdf (A practical guide for improving transparency and reproducibility in neuroimaging research) “Quite often in the course of a project parameters are modified, list of subjects are changed, and processing steps need to be rerun … automating instead of manual interventions can really pay off” —Russel Poldrack1 Advances in automating analysis of neural time series
  • 23. Advances in automating analysis of neural time series 5 Contributions
  • 24. Advances in automating analysis of neural time series 5 Contributions Reproducibility • A tutorial paper for group analysis of M/EEG data (submitted to Frontiers in Brain Imaging Methods) • Brain Imaging Data Structure for MEG (submitted to Nature, Scientific Data)
  • 25. Advances in automating analysis of neural time series 5 Contributions Reproducibility Automation • A tutorial paper for group analysis of M/EEG data (submitted to Frontiers in Brain Imaging Methods) • Brain Imaging Data Structure for MEG (submitted to Nature, Scientific Data) • Autoreject to remove artifacts (NeuroImage, 2017) • AlphaCSC to learn brain waveforms (NIPS, 2017)
  • 26. Advances in automating analysis of neural time series 5 Contributions Reproducibility Automation • A tutorial paper for group analysis of M/EEG data (submitted to Frontiers in Brain Imaging Methods) • Brain Imaging Data Structure for MEG (submitted to Nature, Scientific Data) • Autoreject to remove artifacts (NeuroImage, 2017) • AlphaCSC to learn brain waveforms (NIPS, 2017)
  • 27. Advances in automating analysis of neural time series 6 Contribution I: Brain Imaging Data Structure (BIDS) validator
  • 28. Advances in automating analysis of neural time series 6 Contribution I: Brain Imaging Data Structure (BIDS) validator - Automatic converter MNE-BIDS - BIDS compatible dataset ds000248
  • 35. Contribution II: a tutorial paper on group analysis [Jas, Larson, Engemann, Taulu, Hamalainen, Gramfort. 2017], covering diagnostic plots, alternatives, and statistics, with code at http://mne-tools.github.io/mne-biomag-group-demo/
  • 37. Contribution III: automatic rejection of artifacts [Jas, Engemann, Raimondo, Bekhti, Gramfort. NeuroImage. 2017] (figure: a sensors × trials view of the data with a bad trial highlighted).
  • 38. Related work: Riemannian Potato [Barachant, 2013], PREP (RANSAC) [Bigdely-Shamlo, 2015], FASTER [Nolan, 2010], robust regression [Diedrichsen, 2005], and Sensor Noise Suppression [Cheveigné, 2008].
  • 45. Rejecting based on peak-to-peak amplitudes: compute the peak-to-peak amplitude A of each trial; if A < threshold τ, the trial is kept as good data; otherwise it is marked as bad.
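This rule fits in a few lines of array code. A minimal sketch, assuming epochs are stored as a NumPy array of shape (n_trials, n_sensors, n_times); the threshold value is illustrative:

```python
import numpy as np

def reject_ptp(epochs, tau):
    """Keep only trials whose peak-to-peak amplitude A stays below tau
    on every sensor (the 'global' rejection rule)."""
    ptp = epochs.max(axis=2) - epochs.min(axis=2)  # A, shape (n_trials, n_sensors)
    good = (ptp < tau).all(axis=1)                 # A < tau on every sensor?
    return epochs[good]

rng = np.random.default_rng(0)
epochs = 100e-6 * rng.standard_normal((100, 60, 200))  # synthetic EEG-like data
clean = reject_ptp(epochs, tau=150e-6)                 # tau is illustrative
```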
  • 50. Observation: the optimal threshold retains sufficient trials while rejecting outliers. Too low a threshold leaves too few trials; too high a threshold leaves outliers in the data; the optimal threshold lies in between (figure: the resulting evoked response in each regime).
  • 58. How do we measure data quality? Split the many trials of a single sensor (which may contain artifacts) into a training set and a validation set, and score a candidate threshold τ by

$$\mathrm{RMSE}(\tau) = \big\| \overline{X}_{\mathrm{train}}(\tau) - \overline{X}_{\mathrm{val}} \big\|_{\mathrm{Fro}},$$

where $\overline{X}_{\mathrm{train}}(\tau)$ is the mean of the training trials that survive thresholding at τ (lower error = cleaner training set).
  • 63. But what if my validation data also has artifacts? Answer: use the median instead of the mean,

$$\mathrm{RMSE}(\tau) = \big\| \overline{X}_{\mathrm{train}}(\tau) - \widetilde{X}_{\mathrm{val}} \big\|_{\mathrm{Fro}},$$

where $\widetilde{X}_{\mathrm{val}}$ is the pointwise median over validation trials, which is robust to outlier trials.
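A minimal sketch of this cross-validated threshold search for a single sensor, assuming trials are stored as a (n_trials, n_times) NumPy array; the candidate grid and fold scheme are illustrative:

```python
import numpy as np

def cv_rmse(X_train, X_val, tau):
    """Score tau: mean of surviving training trials vs. pointwise
    median of validation trials (robust to artifacts in X_val)."""
    ptp = X_train.max(axis=1) - X_train.min(axis=1)
    kept = X_train[ptp < tau]
    if len(kept) == 0:          # everything rejected: no usable estimate
        return np.inf
    return np.linalg.norm(kept.mean(axis=0) - np.median(X_val, axis=0))

def best_threshold(X, taus, n_folds=5, seed=0):
    rng = np.random.default_rng(seed)
    scores = np.zeros(len(taus))
    for _ in range(n_folds):
        perm = rng.permutation(len(X))
        tr, va = perm[: len(X) // 2], perm[len(X) // 2:]
        scores += [cv_rmse(X[tr], X[va], tau) for tau in taus]
    return taus[int(np.argmin(scores))]
```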
  • 69. Autoreject (global) vs. human threshold: remove a trial if the data in any sensor exceed τ. (Figure: RMSE in μV as a function of the threshold in μV, averaged with 5-fold cross-validation; the threshold selected by autoreject (global) is marked against the manually chosen one.)
  • 75. How to sample the thresholds? Grid search evaluates the error on a fixed grid of thresholds, so its accuracy depends on the grid resolution; random search draws candidate thresholds at random. (Figure: error as a function of threshold under each strategy.)
  • 81. Bayesian optimization: a more efficient approach. A Gaussian process models the error as a function of the threshold, and an acquisition function resolves the exploration vs. exploitation dilemma when picking the next threshold to evaluate.
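A minimal sketch of this search, assuming the scikit-optimize package (gp_minimize fits a Gaussian process to past evaluations and uses an acquisition function to pick the next point); the toy objective below merely stands in for the real cross-validation error curve:

```python
import numpy as np
from skopt import gp_minimize

def cv_error(params):
    """Hypothetical stand-in for the cross-validated RMSE at threshold tau;
    in practice this would call a scorer like best_threshold above."""
    tau = params[0]
    return float((tau - 110e-6) ** 2 * 1e8 + 0.1 * np.sin(tau * 1e5))

res = gp_minimize(cv_error,                      # objective to minimize
                  dimensions=[(40e-6, 200e-6)],  # search range for tau
                  n_calls=25, random_state=0)    # 25 evaluations in total
print("estimated optimal threshold:", res.x[0])
```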
  • 86. Each sensor has a different threshold. (Figure: histogram of the number of sensors per estimated threshold, in fT/cm, next to a sensors × trials map of the data.)
  • 100. Schematic of the complexity of the problem (a trials × sensors grid): a trial can be bad, a sensor can be globally bad, and a sensor can be locally bad in only some trials (local artifacts). Naïve solution: drop bad trials and bad sensors. Proposed solution: in each trial, if the number of bad sensors exceeds κ (=4), drop the trial; else, interpolate the ρ (=2) worst bad sensors of that trial. Interpolation of a bad sensor uses spherical splines for EEG [Perrin et al., 1989] and MNE for MEG [Hamalainen et al., 1994]. A sketch of this decision rule follows below.
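A minimal sketch of that per-trial rule, assuming per-sensor thresholds taus of shape (n_sensors,) and epochs of shape (n_trials, n_sensors, n_times); κ and ρ take the slide's example values, and the actual interpolation is left to MNE:

```python
import numpy as np

def label_trials(epochs, taus, kappa=4, rho=2):
    """Decide, per trial, whether to drop it or which sensors to repair."""
    ptp = epochs.max(axis=2) - epochs.min(axis=2)   # (n_trials, n_sensors)
    bad = ptp > taus                                # locally bad sensors
    drop = bad.sum(axis=1) > kappa                  # too many bad sensors
    to_interp = []
    for i in range(len(epochs)):
        worst_first = np.argsort(ptp[i] / taus)[::-1]  # most deviant first
        bad_sorted = worst_first[bad[i][worst_first]]  # bad ones, in that order
        to_interp.append(bad_sorted[:rho])             # rho worst bad sensors
    return drop, to_interp
```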
  • 101. Autoreject in action: MNE sample data (MEG). http://autoreject.github.io/auto_examples/plot_auto_repair.html
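For reference, a sketch of how the released autoreject package is typically driven from MNE; the file name and the candidate grids for the consensus and interpolation parameters are illustrative:

```python
import mne
from autoreject import AutoReject, get_rejection_threshold

epochs = mne.read_epochs("sample-epo.fif")       # hypothetical epochs file
reject = get_rejection_threshold(epochs)         # autoreject (global) threshold
ar = AutoReject(n_interpolate=[1, 2, 4],         # candidate rho values
                consensus=[0.2, 0.5, 0.8])       # candidate consensus fractions
epochs_clean = ar.fit_transform(epochs)          # autoreject (local)
```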
  • 102. Autoreject in action (EEG data): data before and after autoreject on the 19-subject faces dataset [Wakeman and Henson, 2015].
  • 108. Quantitative evaluation against FASTER [Nolan, 2010], PREP (RANSAC) [Bigdely-Shamlo, 2015], and Sensor Noise Suppression [Cheveigné, 2008]: for each subject, compare $\| \overline{X}_{\mathrm{method}} - \overline{X}_{\mathrm{human}} \|$ for autoreject and for the competing method. (Figure: scatter plot with one point per subject, split into regions where the competing method is better and where autoreject is better.)
  • 112. Transparency: an example diagnostic plot, marking sensors to be interpolated, bad sensors that will not be interpolated, and bad trials. http://autoreject.github.io/auto_examples/plot_visualize_bad_epochs.html
  • 118. Contribution IV: learn representations from neural data [Jas, Dupré la Tour, Şimşekli, Gramfort. NIPS. 2017]. Motivation #1: the shape of brain rhythms matters, e.g. the asymmetry of the μ rhythm [Cole and Voytek, 2017]. Motivation #2: filtering, which smooths such waveform asymmetry away.
  • 122. Some attempts in the neuroscience community: Adaptive Waveform Learning [Hitziger, 2017] (a template waveform with alternating coefficient and waveform updates, allowing different durations), the Sliding Window Method [Gips et al., 2017], MoTIF [Jost et al., 2006], learning recurrent waveforms in EEG [Brockmeier & Principe, 2016], and multivariate temporal dictionary learning [Barthélemy et al., 2013].
  • 130. The problem setup (simulation): two different atoms (there can be more) appear at different locations and with different amplitudes; atoms can even overlap; small Gaussian noise is added.
  • 139. Convolutional Sparse Coding (CSC) formulation [Kavukcuoglu et al., 2010] [Zeiler et al., 2010] [Gross et al., 2012] [Heide et al., 2015]: each trial $x_n$ is modeled as a sum over atoms $d^k$ convolved with sparse activations $z_n^k$,

$$\min_{d, z} \; \sum_n \Big( \big\| x_n - \sum_k d^k * z_n^k \big\|_2^2 + \lambda \sum_k \| z_n^k \|_1 \Big) \quad \text{s.t.} \quad z_n^k \ge 0, \;\; \| d^k \|_2^2 \le 1,$$

where the $\ell_1$ penalty enforces sparsity of the activations.
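A minimal sketch of this generative model and its objective on synthetic data, for the single-atom case; all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_times_atom, n_times_valid = 32, 480
d = np.hanning(n_times_atom) * np.sin(np.arange(n_times_atom))  # one atom d^k
z = np.zeros(n_times_valid)
z[[50, 210, 400]] = [1.0, 0.7, 1.3]        # sparse nonnegative activations
x = np.convolve(z, d)                       # d * z
x += 0.01 * rng.standard_normal(x.shape)    # small Gaussian noise

lmbd = 0.1
objective = np.sum((x - np.convolve(z, d)) ** 2) + lmbd * np.abs(z).sum()
print(f"CSC objective at the true (d, z): {objective:.4f}")
```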
  • 147. Basic strategy: alternate minimization of

$$\min_{d, z} \; \sum_n \Big( \tfrac{1}{2} \big\| x_n - \sum_k d^k * z_n^k \big\|_2^2 + \lambda \sum_k \| z_n^k \|_1 \Big).$$

• Not jointly convex in both d and z
• Convex when d or z is fixed
• Alternating a z-step and a d-step guarantees the cost function goes down at every step.
  • 152. z-step: the Toeplitz matrix trick. Each convolution can be written as a matrix-vector product, $d^k * z_n^k = D^k z_n^k$, where $D^k$ is the Toeplitz matrix built from $d^k$, so with d fixed the z-step reads

$$\min_{z_n} \; \sum_n \big\| x_n - \sum_k D^k z_n^k \big\|_2^2 + \lambda \sum_k \| z_n^k \|_1.$$
  • 162. Concatenating the Toeplitz matrices into $D = [D^1, \dots, D^K]$ and stacking the activations into $z_n = [z_n^1; \dots; z_n^K]$ turns $\sum_k D^k z_n^k$ into $D z_n$, so the z-step becomes

$$\min_{z_n \ge 0} \; \sum_n \tfrac{1}{2} \| x_n - D z_n \|_2^2 + \lambda \| z_n \|_1,$$

with gradient $D^\top (D z_n - x_n) + \lambda \mathbf{1}$. Now, we can feed the gradient to the L-BFGS-B optimization algorithm, whose box constraints handle $z_n \ge 0$.
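A minimal sketch of this z-step for a single atom, assuming SciPy; conv_matrix builds the (dense, hence didactic-only) Toeplitz matrix D:

```python
import numpy as np
from scipy.optimize import minimize

def conv_matrix(v, n_cols):
    """Matrix M such that M @ u == np.convolve(u, v) for len(u) == n_cols."""
    M = np.zeros((n_cols + len(v) - 1, n_cols))
    for j in range(n_cols):
        M[j:j + len(v), j] = v
    return M

def z_step(x, d, lmbd):
    """min_{z >= 0} 0.5 * ||x - D z||^2 + lmbd * sum(z), via L-BFGS-B."""
    D = conv_matrix(d, len(x) - len(d) + 1)

    def f_and_grad(z):
        resid = D @ z - x
        return 0.5 * resid @ resid + lmbd * z.sum(), D.T @ resid + lmbd

    z0 = np.zeros(D.shape[1])
    bounds = [(0, None)] * len(z0)          # the nonnegativity constraint
    return minimize(f_and_grad, z0, jac=True,
                    method="L-BFGS-B", bounds=bounds).x
```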
  • 171. d-step: strided matrices. Symmetrically, the convolution can be rewritten with the activations in matrix form, $d^k * z_n^k = Z_n^k d^k$, where $Z_n^k$ is a strided matrix built from $z_n^k$. We remove the regularization term as it doesn't depend on d, leaving a least-squares problem under a unit-norm constraint:

$$\min_{\| d^k \|_2^2 \le 1} \; \tfrac{1}{2} \sum_n \big\| x_n - \sum_k Z_n^k d^k \big\|_2^2.$$
  • 176. d-step: the atoms can be updated by projected gradient descent. 1. Gradient step, using

$$\nabla_{d^k} = -\sum_n (Z_n^k)^\top \Big( x_n - \sum_k Z_n^k d^k \Big);$$

2. Projection step onto the unit ball, $\hat d^k \leftarrow \hat d^k \, \min\big(1, 1 / \| \hat d^k \|_2\big)$. In practice, it is more complicated: the constrained problem is solved with L-BFGS-B in the dual.
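A minimal sketch of the simpler projected-gradient variant for a single atom, reusing conv_matrix from the z-step sketch; the step size and iteration count are illustrative:

```python
import numpy as np

def d_step(x_list, z_list, n_times_atom, n_iter=100, lr=0.01, seed=0):
    """Projected gradient descent for one atom under ||d||_2 <= 1."""
    d = np.random.default_rng(seed).standard_normal(n_times_atom)
    d /= np.linalg.norm(d)
    for _ in range(n_iter):
        grad = np.zeros(n_times_atom)
        for x, z in zip(x_list, z_list):
            Z = conv_matrix(z, n_times_atom)   # Z @ d == np.convolve(z, d)
            grad -= Z.T @ (x - Z @ d)
        d = d - lr * grad                          # 1. gradient step
        d = d * min(1.0, 1.0 / np.linalg.norm(d))  # 2. projection step
    return d
```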
  • 183. Putting it all together: alternate minimization between the z update (Toeplitz D, L-BFGS-B) and the d update (strided Z, L-BFGS-B).
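The two sketches above compose into the alternate-minimization loop; again a single-atom, didactic version with illustrative iteration counts:

```python
import numpy as np

def csc_fit(x_list, n_times_atom, lmbd, n_outer=20, seed=0):
    """Alternate z-steps and d-steps, reusing z_step / d_step above."""
    d = np.random.default_rng(seed).standard_normal(n_times_atom)
    d /= np.linalg.norm(d)
    z_list = None
    for _ in range(n_outer):
        z_list = [z_step(x, d, lmbd) for x in x_list]  # z update (d fixed)
        d = d_step(x_list, z_list, n_times_atom)       # d update (z fixed)
    return d, z_list
```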
  • 190. The quasi-Newton solvers outperform ADMM solvers (K = 10 atoms, L = 32 samples per atom). (Figure: convergence curves, with runtimes of 30 minutes, 45 minutes, 1 hour, and 2 hours annotated on the curves.)
  • 193. The quasi-Newton solver outperforms the ADMM solvers consistently across different random seeds. (Figure: convergence curves, one per seed.)
  • 194. Cross-frequency coupling uncovered via CSC (~80 Hz) [Canolty, 2006] [Dupré la Tour, 2017].
  • 195. Challenge: neural data often contains transient artifacts.
  • 199. CSC in the presence of transient artifacts. (Figure: traces combining neural signals and transient artifacts.)
  • 206. A probabilistic interpretation. The CSC objective

$$\min_{d, z} \; \sum_n \Big( \big\| x_n - \sum_k d^k * z_n^k \big\|_2^2 + \lambda \sum_k \| z_n^k \|_1 \Big)$$

is a maximum a posteriori (MAP) estimate,

$$(d^*, z^*) = \arg\max_{d, z} \; \sum_{n,t} \log p(x_{n,t} \mid d, z) + \sum_{n,t,k} \log p(z_{n,t}^k),$$

with Gaussian data likelihood $x_{n,t} \mid d, z \sim \mathcal{N}\big( \sum_{k=1}^K (d^k * z_n^k)_t, \, 1 \big)$ and exponential prior $z_{n,t}^k \sim \mathcal{E}(\lambda)$, which favors sparser activations.
  • 215. Alpha-stable distributions [Samorodnitsky et al., 1996]: $X \sim S(\alpha, \beta, \sigma, \mu)$, with characteristic exponent $\alpha \in (0, 2]$ (smaller α gives a heavier tail), skewness parameter $\beta \in [-1, 1]$, location parameter $\mu \in (-\infty, \infty)$, and scale parameter $\sigma \in (0, \infty)$. Special cases: the normal distribution $X \sim S(2, 0, \sigma, \mu)$ and the Cauchy distribution $X \sim S(1, 0, \sigma, \mu)$.
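SciPy ships this family as scipy.stats.levy_stable; a minimal sketch of the tail behavior (stable-distribution parameterizations differ between references, so treat the scale convention here as illustrative):

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
for alpha in [2.0, 1.5, 1.0]:   # alpha = 2: Gaussian; alpha = 1: Cauchy
    x = levy_stable.rvs(alpha, 0.0, loc=0.0, scale=1.0,
                        size=10_000, random_state=rng)
    # Heavier tails show up as much larger extreme samples.
    print(f"alpha = {alpha}: max |x| = {np.max(np.abs(x)):.1f}")
```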
  • 216. From light tails to heavy tails: replace the Gaussian likelihood $x_{n,t} \mid d, z \sim \mathcal{N}\big( \sum_{k=1}^K (d^k * z_n^k)_t, 1 \big)$ by a symmetric alpha-stable one, $x_{n,t} \mid d, z \sim S\big( \alpha, 0, 1/\sqrt{2}, \sum_{k=1}^K (d^k * z_n^k)_t \big)$.
  • 222. A symmetric α-stable distribution is conditionally Gaussian [Samorodnitsky et al., 1996]: the likelihood above can be rewritten as

$$x_{n,t} \mid d, z, \phi_{n,t} \sim \mathcal{N}\Big( \sum_{k=1}^K (d^k * z_n^k)_t, \; \tfrac{1}{2} \phi_{n,t} \Big),$$

with $\phi_{n,t}$ drawn from a positive stable distribution. Given φ, MAP estimation is a weighted CSC problem,

$$\min_{d, z} \; \sum_n \Big\| \sqrt{w_n} \odot \Big( x_n - \sum_k d^k * z_n^k \Big) \Big\|_2^2 + \lambda \sum_k \| z_n^k \|_1, \qquad w_{n,t} = \frac{1}{\phi_{n,t}},$$

but $\phi_{n,t}$ is not known …
  • 228. The EM algorithm is used to estimate the MAP when some variables are missing [Dempster et al., 1977]. Expectation step:

$$B^{(i)}(d, z) = \mathbb{E}_{p(\phi \mid x, d^{(i)}, z^{(i)})}\big[ \log p(x, \phi \mid d, z) \big] + \log p(z),$$

which reduces to the weighted CSC objective with weights $\mathbb{E}\big[ 1/\phi_{n,t} \big]$. Maximization step:

$$(d^{(i+1)}, z^{(i+1)}) = \arg\max_{d, z} B^{(i)}(d, z).$$

Iterate until convergence.
  • 234. For the E-step, we compute the expectation using sampling [Bishop, 2007]:

$$\mathbb{E}\Big[ \frac{1}{\phi_{n,t}} \Big] = \int \frac{1}{\phi_{n,t}} \, p(\phi_{n,t} \mid x, d, z) \, d\phi_{n,t} \approx \frac{1}{J} \sum_{j=1}^{J} \frac{1}{\phi_{n,t}^{(j)}},$$

where the samples $\phi_{n,t}^{(j)}$ are drawn from the posterior distribution by Markov chain Monte Carlo (MCMC).
  • 241. Let's recap. The data likelihood term $x_{n,t} \mid d, z, \phi_{n,t} \sim \mathcal{N}\big( \sum_{k=1}^K (d^k * z_n^k)_t, \tfrac{1}{2} \phi_{n,t} \big)$ is conditionally Gaussian given the latent variable $\phi_{n,t}$. The EM algorithm alternates an E-step, where MCMC learns the weights $\mathbb{E}[1/\phi_{n,t}]$, and an M-step, which is a weighted CSC problem.
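A toy, hypothetical illustration of why these weights make the M-step robust; the crude formula below only stands in for the MCMC estimate of $\mathbb{E}[1/\phi_{n,t}]$, but shows the qualitative effect, namely that samples sitting on a transient artifact get strongly downweighted:

```python
import numpy as np

rng = np.random.default_rng(0)
resid = rng.normal(0.0, 1.0, 500)   # residuals of well-explained samples
resid[200:220] += 25.0              # a transient artifact

# Crude stand-in for E[1 / phi]: large residuals => small weights.
w = 1.0 / (1.0 + resid ** 2)
print("mean weight (clean):   ", w[:200].mean())
print("mean weight (artifact):", w[200:220].mean())
```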
  • 242. αCSC on simulated data [Jost et al., 2006].
  • 244. αCSC with α = 2 (reduces to the Gaussian case).
  • 246. αCSC with α ≠ 2.
  • 250. Conclusion: data sharing and data standards; autoreject to remove artifacts; αCSC to learn recurring waveforms.
  • 251. Credits: Alexandre Gramfort, Denis Engemann, Yousra Bekhti, Eric Larson, Umut Şimşekli, Tom Dupré la Tour.
  • 252. Thank you.