Linear-Size Approximations to the
     Vietoris-Rips Filtration

               Don Sheehy
             Geometrica Group
               INRIA Saclay




      This work appeared at SoCG 2012
The goal of topological data analysis is to extract meaningful
topological information from data.

Use powerful ideas from computational geometry to speed up persistent
homology computation when the data is intrinsically low-dimensional.
A filtration is a growing sequence of spaces.

    [Figure: a growing filtration, with its persistence diagram plotted as Birth vs. Death.]

A persistence diagram describes the topological changes over time.
The Vietoris-Rips Filtration encodes the topology of a
metric space when viewed at different scales.

    Input: A finite metric space (P, d).
    Output: A sequence of simplicial complexes {Rα}
    such that σ ∈ Rα iff d(p, q) ≤ 2α for all p, q ∈ σ.

    R∞ is the powerset 2^P. This is too big!
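
Below is a minimal brute-force sketch of this definition (my own illustration; the function rips_complex and its signature are not from the paper). It enumerates the simplices of Rα for a small finite metric given as a distance function.

from itertools import combinations

def rips_complex(points, d, alpha, max_dim=2):
    """All simplices sigma (up to dimension max_dim) with d(p, q) <= 2*alpha
    for every pair of vertices p, q of sigma."""
    simplices = [frozenset([p]) for p in points]      # vertices are always present
    for k in range(2, max_dim + 2):                   # k vertices = dimension k - 1
        for sigma in combinations(points, k):
            if all(d(p, q) <= 2 * alpha for p, q in combinations(sigma, 2)):
                simplices.append(frozenset(sigma))
    return simplices

# Four points on a line: at alpha = 0.6 only the two short edges appear.
print(rips_complex([0.0, 1.0, 2.0, 5.0], lambda p, q: abs(p - q), alpha=0.6))

Brute force is exponential in general; the point of the talk is precisely that the full filtration is too big to build this way.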
Persistence Diagrams describe the changes in
topology corresponding to changes in scale.

    [Figure: persistence diagram with Birth and Death axes.]

    Bottleneck Distance:  d∞B = maxi |pi − qi|∞

    In approximate persistence diagrams, birth and death times differ
    by at most a constant factor. This is just the bottleneck distance
    of the log-scale diagrams.
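
To make the log-scale remark concrete, here is a small hedged sketch (my own code, not from the talk): a brute-force bottleneck distance for tiny diagrams, applied after taking logs of birth and death times, so that a (1 + ε) multiplicative error shows up as an additive log(1 + ε) bound.

from itertools import permutations
from math import log

def bottleneck(D, E):
    """Brute-force bottleneck distance between two small diagrams, given as lists of
    (birth, death) pairs. Each diagram is padded with the diagonal projections of the
    other's points; matching two diagonal points is free."""
    diag = lambda p: ((p[0] + p[1]) / 2,) * 2
    A = [(p, False) for p in D] + [(diag(q), True) for q in E]
    B = [(q, False) for q in E] + [(diag(p), True) for p in D]
    def cost(a, b):
        (p, p_diag), (q, q_diag) = a, b
        return 0.0 if p_diag and q_diag else max(abs(p[0] - q[0]), abs(p[1] - q[1]))
    return min(max(cost(a, b) for a, b in zip(A, perm)) for perm in permutations(B))

def log_scale(D):
    """Multiplicative error in births/deaths becomes additive error after taking logs."""
    return [(log(b), log(d)) for b, d in D]

exact  = [(1.0, 4.0), (2.0, 3.0)]
approx = [(1.1, 3.8), (2.1, 2.9)]   # every coordinate off by a factor of at most 1.1
print(bottleneck(log_scale(exact), log_scale(approx)), "<=", log(1.1))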
The Vietoris-Rips complex is a nerve of boxes.
Embed the input metric in R^n with the L∞ norm.
In L∞, the Rips complex is the same as the Čech complex.
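
This observation can be checked directly: in (R^n, L∞) the α-balls are axis-aligned boxes, and a set of boxes has a common point iff the centers are pairwise within 2α. A hedged sketch of that check (my own code, names hypothetical):

from itertools import combinations

def boxes_have_common_point(centers, alpha):
    """L-infinity balls of radius alpha are axis-aligned boxes; they share a point
    iff, in every coordinate, the centers span at most 2*alpha."""
    dim = len(centers[0])
    return all(max(c[i] for c in centers) - min(c[i] for c in centers) <= 2 * alpha
               for i in range(dim))

def pairwise_within(centers, alpha):
    """Rips condition: every pair of centers is within 2*alpha in L-infinity."""
    return all(max(abs(a - b) for a, b in zip(p, q)) <= 2 * alpha
               for p, q in combinations(centers, 2))

# Cech (nerve-of-boxes) membership and Rips membership agree at every scale.
pts = [(0.0, 0.0), (1.5, 0.2), (0.7, 1.4)]
for alpha in (0.5, 0.8, 1.0):
    assert boxes_have_common_point(pts, alpha) == pairwise_within(pts, alpha)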
Some related work on sparse filtrations.

    Previous approaches to subsampling:
      Witness Complexes [dSC04, BGO07]
      Persistence-based Reconstruction [CO08]
    Other methods:
      Meshes in Euclidean Space [HMSO10]
      Topological simplification [Z10, ALS11]
Key idea: Treat many close points as one point.

    This idea is ubiquitous in computational geometry:
    n-body simulation, approximate nearest neighbor search,
    spanners, well-separated pair decomposition, ...
Consider a 2-dimensional filtration parameterized
by both scale and sampling density.
Intuition: Remove points that are covered
by their neighbors.

    [Figure: a covered point is removed (no change in topology), then the metric is perturbed.]
Two tricks:
 1 Embed the zigzag in a topologically equivalent filtration.
 2 Perturb the metric so the persistence module does not zigzag.

    Standard filtration:   → R̂α → R̂β → R̂γ →
                              ↑     ↑     ↑
    Zigzag filtration:     ← Qα → Qβ ← Qγ →

    Applying homology and inverting the maps that become isomorphisms (≅):

                           → H(Qα) → H(Qβ) → H(Qγ) →

                At the homology level, there is no zigzag.
The Result:
 Given an n-point metric (P, d), there exists a zigzag filtration of
 size O(n) whose persistence diagram (1 + ε)-approximates that of
 the Rips filtration.

 (The big-O hides constants depending on the doubling dimension and
 the approximation factor ε.)

 A metric with doubling dimension d is one for which every ball of
 radius 2r can be covered by 2^d balls of radius r, for all r.
How to perturb the metric.
    Let tp be the time when point p is removed.

    wp(α) =  0                  if α ≤ (1 − ε)tp
             α − (1 − ε)tp      if (1 − ε)tp < α < tp
             εα                 if tp ≤ α

    [Graph of wp(α) against α, with marks at (1 − ε)tp and tp.]

    d̂α(p, q) = d(p, q) + wp(α) + wq(α)

    Rips Complex:  σ ∈ Rα  ⇔  d(p, q) ≤ 2α for all p, q ∈ σ
    Relaxed Rips:  σ ∈ R̂α  ⇔  d̂α(p, q) ≤ 2α for all p, q ∈ σ

                   Rα/(1+ε) ⊆ R̂α ⊆ Rα
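
The weight and the relaxed distance transcribe directly into code. A hedged sketch (my own naming: t maps each point to its removal time tp, d is the input metric, eps is ε):

from itertools import combinations

def w(p, alpha, t, eps):
    """The weight w_p(alpha) from the piecewise definition above."""
    tp = t[p]
    if alpha <= (1 - eps) * tp:
        return 0.0
    if alpha < tp:
        return alpha - (1 - eps) * tp
    return eps * alpha

def relaxed_dist(p, q, alpha, d, t, eps):
    """The perturbed metric d_hat_alpha(p, q) = d(p, q) + w_p(alpha) + w_q(alpha)."""
    return d(p, q) + w(p, alpha, t, eps) + w(q, alpha, t, eps)

def in_relaxed_rips(sigma, alpha, d, t, eps):
    """sigma is in R_hat_alpha iff d_hat_alpha(p, q) <= 2*alpha for all p, q in sigma."""
    return all(relaxed_dist(p, q, alpha, d, t, eps) <= 2 * alpha
               for p, q in combinations(sigma, 2))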
Net-trees

A generalization of quadtrees to metric spaces.
One leaf per point of P.
Each node u has a representative rep(u) in P and a radius rad(u).
Three properties:

 1 Inheritance: Every nonleaf u has a child with the same rep.
 2 Covering: Every heir of u lies in ball(rep(u), rad(u)).
 3 Packing: Any children v, w of u have d(rep(v), rep(w)) > K·rad(u).

        Let up be the ancestor of all nodes represented by p.
        Time to remove p:  tp = rad(parent(up)) / (ε(1 − ε))
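
Building a real net-tree is beyond a slide, so the sketch below uses a greedy farthest-point permutation as a rough, hypothetical stand-in: each point's insertion radius plays the role of rad(parent(up)) and is scaled by 1/(ε(1 − ε)) to give a removal time. This only illustrates where the times tp come from; it is not the paper's construction.

def greedy_removal_times(points, d, eps):
    """Assign each point a removal time t_p from a greedy (farthest-point) permutation:
    lambda_p = distance to the previously chosen points, t_p = lambda_p / (eps*(1-eps))."""
    remaining = list(points)
    first = remaining.pop(0)
    order, radius = [first], {first: float('inf')}    # the first point is never removed
    nearest = {q: d(first, q) for q in remaining}     # distance to the chosen set
    while remaining:
        p = max(remaining, key=lambda q: nearest[q])  # farthest remaining point
        radius[p] = nearest[p]
        order.append(p)
        remaining.remove(p)
        for q in remaining:
            nearest[q] = min(nearest[q], d(p, q))
    return {p: radius[p] / (eps * (1 - eps)) for p in order}

# Example: t = greedy_removal_times([0.0, 1.0, 2.0, 5.0], lambda p, q: abs(p - q), eps=0.5)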
Projection onto a net

    Nα = {p ∈ P : tp ≥ α}

    Qα is the subcomplex of R̂α induced on Nα.

    πα(p) =  p                            if p ∈ Nα
             argmin{ d̂(p, q) : q ∈ Nα }   otherwise

    Nα is a Delone set:
     1 Covering: For all p ∈ P, there is a q ∈ Nα
                 such that d(p, q) ≤ ε(1 − ε)α.
     2 Packing: For all distinct p, q ∈ Nα,
                d(p, q) ≥ Kpack ε(1 − ε)α.

    Key Fact: For all p, q ∈ P, d̂(πα(p), q) ≤ d̂(p, q).
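
A short sketch of Nα and πα (my own code; dhat stands for the perturbed distance d̂α, for instance the relaxed_dist function sketched earlier, and t maps each point to its removal time):

def net_at_scale(t, alpha):
    """N_alpha: the points whose removal time t_p is at least alpha."""
    return {p for p, tp in t.items() if tp >= alpha}

def project(p, alpha, t, dhat):
    """pi_alpha(p): p itself if it survives to scale alpha, otherwise the surviving
    point nearest to p under the perturbed metric."""
    N = net_at_scale(t, alpha)
    if p in N:
        return p
    return min(N, key=lambda q: dhat(p, q, alpha))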
Key Fact: For all p, q ∈ P, d̂(πα(p), q) ≤ d̂(p, q).

This can be used to show that projection onto a net is a
homotopy equivalence.

    Relaxed Rips:          → R̂α → R̂β → R̂γ →
                              ↑     ↑     ↑
    Sparse Rips Zigzag:    ← Qα → Qβ ← Qγ →

    Applying homology and inverting the maps that become isomorphisms (≅):

                           → H(Qα) → H(Qβ) → H(Qγ) →
Why is the filtration only linear size?
Standard trick from Euclidean geometry:

 1 Charge each simplex to its vertex with the earliest deletion time.
 2 Apply a packing argument to the larger neighbors.
 3 Conclude average O(1) simplices per vertex.

    [Figure: the packing argument; annotations include 1/ε and O(d²).]
Really getting rid of the zigzags.

    Xα = ⋃β≤α Qβ

 This is (almost) the clique complex of a hierarchical spanner!
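
A hedged sketch of the union trick (my own code, reusing the removal times t and a perturbed-distance function dhat from the earlier sketches): scan a finite grid of scales and record, for each edge, the first scale β at which it enters some Qβ; by the union definition it then lies in every Xα with α ≥ β. The paper computes these birth times exactly; the grid is a simplification.

from itertools import combinations

def sparse_rips_edges(points, t, dhat, scales):
    """Edge birth times in X_alpha = union over beta <= alpha of Q_beta,
    found by scanning the given grid of scales."""
    birth = {}
    for beta in sorted(scales):
        alive = [p for p in points if t[p] >= beta]       # the net N_beta
        for p, q in combinations(alive, 2):
            e = frozenset((p, q))
            if e not in birth and dhat(p, q, beta) <= 2 * beta:
                birth[e] = beta                           # earliest Q_beta containing the edge
    return birth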
Summary
    Approximate the VR filtration with a Zigzag filtration.
    Remove points using a hierarchical net-tree.
    Perturb the metric to straighten out the zigzags.
    Use packing arguments to get linear size.
    Union trick eliminates zigzag.

 The Future
    Distance to a measure?
    Other types of simplification.
    Construct the net-tree in O(n log n) time.
    An implementation.


                   Thank You.