iMOD User Day 2019 – DSD-INT 2019
Parallelization project for the USGS
Jarno Verkaik (Deltares, groundwater management department)
SURFsara Cartesius supercomputer
(47,776 cores, 130 TB RAM)
Why (distributed memory) parallel computing?
Figure: a MODFLOW grid solved by serial computing on a single 256 GB RAM machine versus parallel computing on four 64 GB RAM machines coupled via MPI.
Serial: 1 day of computing on the 256 GB machine.
Parallel: 6 hours of computing on the 64 GB machines.
MPI = Message Passing Interface
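To make the MPI picture concrete, below is a minimal sketch (not MODFLOW 6 code) of splitting a structured grid over MPI ranks with mpi4py, so that each machine holds only its own strip of the grid plus a one-row halo. The grid size and the simple row-wise decomposition are illustrative assumptions.

```python
# Minimal sketch, assuming mpi4py is installed; run with e.g.
#   mpiexec -n 4 python halo_demo.py
# Illustrates distributed memory: each rank owns only a strip of the grid.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

nrow, ncol = 400, 600              # illustrative global grid dimensions
rows_local = nrow // size          # simple 1-D row decomposition

# Each rank allocates only its own strip, plus one halo row on each side.
local = np.full((rows_local + 2, ncol), float(rank))

# Halo exchange with the neighbouring ranks (the MPI messages in the figure).
up, down = rank - 1, rank + 1
if up >= 0:
    comm.Sendrecv(sendbuf=local[1], dest=up,
                  recvbuf=local[0], source=up)
if down < size:
    comm.Sendrecv(sendbuf=local[-2], dest=down,
                  recvbuf=local[-1], source=down)

print(f"rank {rank}: owns rows {rank * rows_local}..{(rank + 1) * rows_local - 1}")
```

In the figure above, the four 64 GB machines play the role of the four ranks; the halo rows are what the MPI messages carry.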
Contents
• Organization
• Project results and plans
• Global scale application
How it started…
• 2010: Email correspondence on parallel MT3DMS
• 2013: Visit to USGS, start of joint code development (in-kind)
• 2015: Start development of Parallel Krylov Solver for
MODFLOW-2005 and MODFLOW-USG
→ poster @ AGU Fall Meeting 2015, San Francisco
• 2016: First application of PKS at national and global scale
→ poster @ AGU Fall Meeting 2016, San Francisco
• Jul. 2017: PKS becomes a main feature of iMOD 4.0
& is applied as the default solver in the National Water Model
• Oct.2017: Start parallelization of MODFLOW 6
→ funded by USGS through USGS-Deltares co-op
(Posters presented at the AGU Fall Meetings 2015 and 2016)
Organization through (coastal morphology) USGS-Deltares co-op
Robert McCall
Applied Morphodynamics,
Delft
Kees Nederhoff
Deltares USA,
Silver Spring
Martijn Russcher
Numerical Simulation Software,
Delft
Jarno Verkaik
Groundwater management,
Utrecht
Joseph D. Hughes
Integrated Modeling and Prediction,
Reston
Christian D. Langevin
Integrated Modeling and Prediction,
Mounds View
Li Erikson
Pacific Coastal and Marine
Science Center, Santa Cruz
USGS project FY2018 (Oct.2017 – Sep.2018)
• Start parallelization of MODFLOW 6
• Such that it can be part of a future release
• Target application: CONUS model by
Wesley Zell and Ward Sanford (USGS)
• USGS requirements:
- Proof of concept applicable to CONUS model
- Low code footprint
- Version-controlled code on GitHub
- Easy to use
- No dependencies on third-party libraries
USGS project FY2018 (Oct.2017 – Sep.2018)
• A proof of concept was developed that is applicable to the CONUS model
• Parallelization of IMS linear solver using Schwarz domain decomposition
(similar to Parallel Krylov Solver in iMOD)
• Repository: https://github.com/verkaik/modflow6-parallel.git
→ MODFLOW 6 framework refactoring is required for
exchanges between models (subdomains):
- That is generic for both serial and parallel computing
- Such that numerical schemes can be evaluated more easily at model interfaces
- Such that XT3D option can be used with multiple models (serial and parallel)
Figure: halo v2 concept
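As a simplified, serial sketch of the Schwarz/block Jacobi idea behind the parallel IMS solver (the matrix, partition, and solver calls below are illustrative assumptions, not the MODFLOW 6 implementation): each subdomain's diagonal block is solved independently inside a preconditioned Krylov iteration.

```python
# Serial sketch of additive Schwarz / block Jacobi preconditioning,
# conceptually similar to the parallel IMS solver (illustrative only).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n, nblocks = 400, 4                     # toy 1-D "model" with 4 subdomains
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Non-overlapping partition of the unknowns: one block per subdomain.
parts = np.array_split(np.arange(n), nblocks)
# Factor each diagonal block once (in parallel, each rank would do its own).
lu_blocks = [spla.splu(A[p, :][:, p].tocsc()) for p in parts]

def block_jacobi(r):
    """Apply the preconditioner: solve every subdomain block independently."""
    z = np.zeros_like(r)
    for p, lu in zip(parts, lu_blocks):
        z[p] = lu.solve(r[p])
    return z

M = spla.LinearOperator((n, n), matvec=block_jacobi, dtype=float)
x, info = spla.cg(A, b, M=M)
print("CG converged" if info == 0 else f"CG returned info={info}")
```

In the parallel version each block lives on its own MPI rank and the matrix-vector product needs the halo exchange sketched earlier; the preconditioner itself is unchanged.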
USGS project FY2019 & FY2020
• FY2019 (Oct.2018 – Sep.2019)
• Support for the XT3D option with multiple models (serial only)
• Development of interface model concept (revised halo v2)
• FY2020 (Oct.2019 – Sep.2020)
(To be determined)
• Continue working on parallel MODFLOW
• Development of a Basic Model Interface (BMI) (see the sketch below)
Figure: interface model between two models, M1 and M2
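The Basic Model Interface planned for FY2020 refers to the CSDMS BMI standard: a small set of control functions (initialize, update, finalize) plus getters and setters for model variables. A generic sketch of such a control loop is shown below; DummyModel and the configuration file name are placeholders, not an existing MODFLOW 6 API.

```python
# Generic BMI-style control loop (sketch). DummyModel stands in for any
# BMI-compliant model wrapper; it is a placeholder, not MODFLOW 6 code.
class DummyModel:
    def initialize(self, config_file: str) -> None:
        self.t, self.t_end, self.dt = 0.0, 10.0, 1.0

    def update(self) -> None:            # advance the model one time step
        self.t += self.dt

    def get_current_time(self) -> float:
        return self.t

    def get_end_time(self) -> float:
        return self.t_end

    def finalize(self) -> None:
        pass

# The caller (e.g. a coupler or workflow script) only needs these calls:
bmi = DummyModel()
bmi.initialize("model_config.nam")       # placeholder file name
while bmi.get_current_time() < bmi.get_end_time():
    bmi.update()                         # exchange/override variables here
bmi.finalize()
```

Such an interface would allow an external coupler (for example an iMOD workflow) to drive the model time step by time step.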
USGS project FY2018 results: circle test 1250M cells
USGS project FY2018 results: circle test 12.5M cells
Work related to the USGS project
• PhD project (started 2018):
“Towards Exascale Computing for Large Scale Groundwater Simulation”
Goal: development of distributed parallel methods for large real-life groundwater models of
O(10^6)–O(10^9) cells.
• Mainly funded by Deltares research
• Directly relates to MODFLOW 6 kernel development for new iMOD 6
(see next presentation by Otto de Keizer)
Prof. Marc Bierkens
(Utrecht University)
Prof. Hai Xiang Lin
(Delft University of Technology)
Gualbert Oude Essink, PhD
(Deltares)
Contributions from PhD project
Short term coding:
• Improve linear solver convergence when using many subdomains:
→ add a coarse grid parallel preconditioner (implementation largely done; see the sketch after this list)
• Option to check parallel implementation
→ add serial block Jacobi preconditioner (first implementation done)
• Code profiling & optimizing parallel performance (ongoing)
Longer term coding:
• Improve robustness when using many subdomains:
→ add a recovery mechanism for failing hardware
• Add physics-based parallel preconditioner
Short term modeling:
• Run the USGS CONUS model in parallel @ 250 m
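A minimal sketch of the coarse-grid correction mentioned above (illustrative assumptions only, not the actual implementation): one coarse unknown per subdomain captures the global coupling that plain block Jacobi loses as the number of subdomains grows.

```python
# Sketch of a two-level (coarse grid) correction on top of block Jacobi.
import numpy as np
import scipy.sparse as sp

n, nblocks = 400, 8
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
parts = np.array_split(np.arange(n), nblocks)

# Restriction R: piecewise-constant aggregation, one coarse DOF per subdomain.
R = sp.lil_matrix((nblocks, n))
for k, p in enumerate(parts):
    R[k, p] = 1.0
R = R.tocsr()
A_c = (R @ A @ R.T).toarray()            # tiny nblocks x nblocks coarse matrix

def coarse_correction(r):
    """Restrict the residual, solve the coarse problem, prolong back."""
    return R.T @ np.linalg.solve(A_c, R @ r)

z = coarse_correction(np.ones(n))        # coarse part of a two-level preconditioner
# A two-level preconditioner combines this with the local block solves, e.g.
#   z_total = block_jacobi(r) + coarse_correction(r)
```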
Global groundwater model @ 1km and 428M cells
• Development of the PCR-GLOBWB global groundwater model
with 1 km x 1 km resolution, O(10^8) cells
• First experience with parallel MODFLOW 6 at this scale:
• Physics-based subdomain partitioning
• Model generation (pre-processing)
• Parallel computing
• Visualization of model results
→ Big data!
Typical raster: 43200 columns x 21600 rows, 3 GB binary
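Pre-processing at this resolution means handling rasters of roughly a billion cells. A minimal sketch of streaming such a binary raster strip by strip, so it never has to fit in memory at once (file name, dtype, and strip size are assumptions):

```python
# Stream a 43200 x 21600 raster from a raw binary file strip by strip,
# so that only a small window is in memory at any time (sketch).
import numpy as np

ncol, nrow = 43200, 21600
raster = np.memmap("global_raster.bin", dtype=np.float32, mode="r",
                   shape=(nrow, ncol))

strip = 1080                                    # rows per processing block
for r0 in range(0, nrow, strip):
    block = np.asarray(raster[r0:r0 + strip])   # reads only this strip
    # ... derive model input for the cells in this strip ...
```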
Ref: Verkaik, J., Sutanudjaja, E.H., Oude Essink, G.H.P., Lin, H.X., and Bierkens, M.F.P., 2019. Parallel global hydrology and water resources
PCR-GLOBWB-MODFLOW model at hyper-resolution scale (1 km): first results, in: EGU General Assembly Conference Abstracts. p. 13397.
MODFLOW 6 model characteristics:
• Steady-state, 2 layers, subsurface downscaled from 10 km
• Unstructured DISU grid with only “land cells”, 428M in total
• CHD for sea, RIV in layer 1 + DRN in layers 1 & 2 (HydroSHEDS)
Global groundwater model @ 1km and 428M cells
Parallel pre-processing using 128 subdomains
Global groundwater model @ 1km and 428M cells
Can we define subdomain boundaries in advance (e.g. along hydrological or administrative boundaries) such that they are useful for both the modeler and parallel computing?
→ How to partition the world into 1024 subdomains using 1.8M catchments?
→ How to approximately solve the resulting optimization problem (load balance + edge cuts)?
1. Determine independent regions for groundwater flow (continents, islands)
→ ~20k regions
2. Further divide large regions/catchments using a lumped graph method → define parallel models
3. Cluster small regions → define serial models (a greedy load-balancing sketch follows below)
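A minimal sketch of the load-balancing side of step 3 (region weights and partition count are illustrative; a graph partitioner such as METIS would additionally minimize edge cuts):

```python
# Greedy load balancing: assign weighted regions (cell counts) to partitions
# so the heaviest partition stays as small as possible (sketch only).
import heapq
import random

def balance(regions, nparts):
    """regions: iterable of (region_id, ncells); returns region_id -> partition."""
    heap = [(0, p) for p in range(nparts)]      # (current load, partition id)
    heapq.heapify(heap)
    assignment = {}
    for rid, ncells in sorted(regions, key=lambda r: -r[1]):
        load, p = heapq.heappop(heap)           # least-loaded partition so far
        assignment[rid] = p
        heapq.heappush(heap, (load + ncells, p))
    return assignment

# Hypothetical input: 20,000 regions with random sizes, 1024 partitions.
regions = [(i, random.randint(10, 500_000)) for i in range(20_000)]
assignment = balance(regions, 1024)
```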
Global groundwater model @ 1km and 428M cells
• Partitioning results in 52 separate MODFLOW 6 models:
• 38 serial, for small islands
• 13 parallel, the 3 largest run on a supercomputer
Largest MODFLOW 6 models (percentages are shares of the 428M total cells, 2 layers; the remainder sits in the smaller parallel and serial models):

Model              Cells   Share   Cores   Runtime      Memory
1. Africa+EurAsia  256M    60%     612     3 min 31 s   390 GB
2. America         120M    28%     286     1 min 36 s   112 GB
3. Australia        20M     5%      48     33 s          13 GB
Global groundwater model @ 1km and 428M cells
Figure: simulated groundwater table with subdomain boundaries (1024 subdomains in total)
Global groundwater model @ 1km and 428M cells
Take-home message:
USGS and Deltares are making progress on the MPI parallelization
of the MODFLOW 6 multi-model capability,
reducing computing times and memory usage.
THANK YOU!