This paper presents a parallel Smoothed Particle Hydrodynamics (SPH) framework for geophysical granular flows that scales to large CPU clusters. The framework is built on the Message Passing Interface (MPI) with a domain partitioning strategy, using the Orthogonal Recursive Bisection (ORB) technique to subdivide the computational domain. The ORB algorithm is implemented so that any number of MPI processes can be used, rather than being limited to powers of two. To avoid global communication when redistributing particles, a diffusion-based distribution algorithm is implemented and shown to be much faster than global communication approaches when distributing particles to non-neighbouring processes. The proposed parallel scheme achieves 95% weak scaling efficiency and a strong scaling speedup of up to 900 on 1024 CPU cores. The scheme enables previously infeasible simulations, and here we apply it to the granular column collapse experiment under full three-dimensional, axisymmetric conditions for aspect ratios up to 30, a regime not previously attempted with numerical techniques in the literature. Enabled by the parallel scheme, the simulations use up to 11.7 million SPH particles. The investigation employs two constitutive models commonly used for granular flows: the elasto-plastic model with the Drucker-Prager yield criterion and the μ(I) rheological model. While very good agreement with experimental data has been reported for both models at small and intermediate aspect ratios, the large-scale simulations at large aspect ratios show that the Drucker-Prager model tends to over-predict the final deposit height, whereas the μ(I) model under-predicts it.
Furthermore, because the parallel scheme allows the 3D axisymmetric column collapse to be modelled at higher resolutions, we demonstrate that the elasto-plastic approach captures arching effects in the stress profile, whereas the μ(I) model does not.
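To illustrate the idea of ORB partitioning for an arbitrary (non-power-of-two) number of processes, the following is a minimal serial sketch, not the authors' implementation: the function name `orb_partition` and the equal-load splitting rule are assumptions for illustration. A process count P is recursively split into ⌊P/2⌋ and ⌈P/2⌉ subgroups, and each geometric cut is placed along the longest axis of the subdomain at the load fraction matching the subgroup sizes:

```python
import numpy as np

def orb_partition(points, n_parts):
    """Recursively bisect a point cloud into n_parts load-balanced groups.

    Returns an integer label in [0, n_parts) for each point. Any n_parts is
    supported (not just powers of two): each recursion splits the part count
    into floor/ceil halves and cuts at the corresponding load fraction.
    """
    labels = np.empty(len(points), dtype=int)

    def bisect(idx, lo, hi):
        n = hi - lo                 # number of parts for this subdomain
        if n == 1:
            labels[idx] = lo        # leaf: assign all points to part `lo`
            return
        left = n // 2               # e.g. 3 parts -> 1 | 2
        pts = points[idx]
        # cut along the axis with the largest spatial extent
        axis = np.argmax(pts.max(axis=0) - pts.min(axis=0))
        order = np.argsort(pts[:, axis])
        # place the cut so the left side holds left/n of the particles
        k = int(round(left / n * len(idx)))
        bisect(idx[order[:k]], lo, lo + left)
        bisect(idx[order[k:]], lo + left, hi)

    bisect(np.arange(len(points)), 0, n_parts)
    return labels
```

In a distributed setting each cut would be found collectively (e.g. by a parallel median search over the subgroup's particles) rather than by a serial sort, but the recursion structure and the floor/ceil split of the process count carry over unchanged.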
- Geophysical flows
- Granular flows
- Message Passing Interface (MPI)
- Parallel computing
- Smoothed Particle Hydrodynamics (SPH)