

AuScope Projects at ESSCC

ESSCC is a key member of the AuScope SAM component. The team at ESSCC is developing state-of-the-art software environments for running large, complex numerical simulations on highly parallel supercomputers and in modern distributed user environments. In addition, ESSCC provides compute services to the community.

For additional information, please contact the Director of ESSCC, Prof Hans Muhlhaus.

ESyS-Particle

ESyS-Particle is a software package for particle-based numerical modelling. The software implements the Discrete Element Method (DEM), a widely used technique for modelling processes involving large deformations, granular flow and/or fragmentation. ESyS-Particle is designed for execution on parallel supercomputers, clusters or multi-core PCs running a Linux-based OS. The C++ simulation engine implements spatial domain decomposition via the Message Passing Interface (MPI). A Python wrapper API provides flexibility in the design of numerical models, specification of modelling parameters and contact logic, and analysis of simulation data. ESyS-Particle has been utilised to simulate earthquake nucleation, comminution in shear cells, silo flow, rock fragmentation, and fault gouge evolution, to name but a few applications.
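At its core, the DEM integrates particle motion under explicit contact forces. The following is a minimal, library-free 1-D sketch of that idea (plain Python, not ESyS-Particle's actual API; all names here are illustrative): two equal elastic spheres collide head-on, with a linear spring acting on their overlap.

```python
# Minimal 1-D Discrete Element Method (DEM) sketch -- illustrative only,
# NOT ESyS-Particle code. Two elastic spheres collide head-on; the contact
# force is a linear spring acting on the particle overlap.

def dem_collision(x1, x2, v1, v2, r=0.5, m=1.0, k=1.0e4, dt=1.0e-4, steps=20000):
    """Integrate two particles with a linear-spring contact law."""
    for _ in range(steps):
        overlap = (r + r) - (x2 - x1)            # positive while in contact
        f = k * overlap if overlap > 0 else 0.0  # repulsive contact force
        # Explicit (symplectic Euler) time stepping for each particle.
        v1 -= (f / m) * dt
        v2 += (f / m) * dt
        x1 += v1 * dt
        x2 += v2 * dt
    return v1, v2

# Equal masses in an elastic head-on collision (nearly) exchange velocities.
v1, v2 = dem_collision(x1=0.0, x2=2.0, v1=1.0, v2=-1.0)
```

A production DEM code such as ESyS-Particle adds neighbour search, rotational degrees of freedom, frictional and bonded contact laws, and the MPI domain decomposition mentioned above.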




Escript

Escript is a Python-based programming tool for mathematical modelling based on non-linear, time-dependent partial differential equations. It is designed to give modellers an easy-to-use environment for developing and running complex, coupled models without accessing the underlying data structures directly. This approach leads to highly portable code, allowing the user to run a simulation on a desktop computer as well as on a highly parallel supercomputer with no changes to the program. Escript is suitable for rapid prototyping (e.g. for a student project or thesis) as well as for large software projects. It has been used successfully in a broad spectrum of applications including Earth mantle convection, earthquakes, porous media flow, reactive transport, plate subduction, and tsunamis. Escript uses the Finley finite element method (FEM) solver library. The code has been parallelized efficiently with MPI, with OpenMP, and in hybrid mode. The Escript portal is available at
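To illustrate the kind of time-dependent PDE problem Escript targets, here is a minimal, library-free sketch (plain Python, not Escript code; in Escript the same model would be expressed through PDE coefficients on a domain object rather than explicit grid loops): the 1-D heat equation u_t = kappa * u_xx with fixed boundary values, advanced by an explicit finite-difference scheme.

```python
# 1-D heat equation u_t = kappa * u_xx on [0, 1] with u(0) = u(1) = 0,
# solved with an explicit finite-difference scheme. Illustrative only --
# Escript hides this level of detail behind its PDE interface.

def diffuse(u, kappa=1.0, dx=0.05, dt=1.0e-4, steps=1000):
    """Advance the temperature profile u by `steps` explicit time steps."""
    u = list(u)
    r = kappa * dt / dx**2             # stability requires r <= 0.5
    for _ in range(steps):
        new = u[:]
        for i in range(1, len(u) - 1):
            new[i] = u[i] + r * (u[i-1] - 2*u[i] + u[i+1])
        u = new                        # boundary values stay fixed at 0
    return u

n = 21                                 # grid points on [0, 1], dx = 0.05
u0 = [0.0] * n
u0[n // 2] = 1.0                       # initial heat spike in the middle
u = diffuse(u0)                        # spike spreads out and decays
```

Writing the solver by hand like this ties the model to one grid and one machine; Escript's coefficient-based formulation is what makes the same model script portable from a desktop to a parallel supercomputer.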

Applications of Escript:


- Gas from coal seams
- Plumes in the Earth Mantle
- Travelling Tsunami in the South Pacific
- Dynamic Rupture in Crustal Fault Systems


Useful links:

- the Escript download page
- the Escript user mailing list registration
- Gross, Lutz, Muhlhaus, Hans, Hale, Alina and Bourgouin, Laurent (2007) Interface modeling in incompressible media using level sets in Escript. Physics of the Earth and Planetary Interiors, 163 (1-4): 23-24.
- Bourgouin, Laurent, Muhlhaus, Hans-Bernd, Hale, Alina Jane and Arsac, Antonin (2006) Towards realistic simulations of lava dome growth using the level set method. Acta Geotechnica, 1 (4): 225-236.
- Olsen-Kettle, Louise, Weatherley, Dion, Saez, Estelle, Gross, Lutz, Muhlhaus, Hans and Xing, Huilin (2007) Analysis of slip-weakening frictional laws with static restrengthening and their implications on the scaling, asymmetry and mode of dynamic rupture on homogeneous and bi-material interfaces.





ESyS-Crustal

ESyS-Crustal provides the software infrastructure needed for a deeper understanding and better description of interacting fault systems, with potential applications in natural hazard forecasting and risk evaluation (e.g. earthquake and tsunami generation forecasting), green energy exploitation (e.g. geothermal reservoir modelling), deep geological disposal (e.g. radioactive waste treatment and CO2 geological storage), groundwater modelling, minerals exploration, and related environmental problems. ESyS-Crustal is designed for modern supercomputers and Linux-based multi-core desktop PCs.


Further reading:

  1. Xing, H. L. and J. Zhang (2009). Finite Element Modeling of Non-linear Deformation Behaviours of Rate-Dependent Materials using an R-minimum Strategy, Acta Geotechnica, 4, 139-148. doi: 10.1007/s11440-009-0090-7.

  2. Liu, Y., Y. Shi, E. O. D. Sevre, H. L. Xing and D. A. Yuen (2009). Probabilistic Forecast of Tsunami Hazards along Chinese Coast, Chapter VIII in Advances in Geocomputing, Springer-Verlag GmbH, pp. 279-317. doi:10.1007/978-3-540-85879-9_8.

  3. Xing, H. L., Makinouchi, A. and Mora, P. (2007). Finite element modeling of interacting fault system, Physics of the Earth and Planetary Interiors, 163, 106-121. doi:10.1016/j.pepi.2007.05.006

  4. Xing, H. L., Zhang, J. and Yin, C. (2007). A finite element analysis of tidal deformation of the entire Earth with a discontinuous outer layer, Geophysical Journal International, 170 (3), 961–970. doi:10.1111/j.1365-246X.2007.03442.x

  5. Xing, H. L., Mora, P. and Makinouchi, A. (2006). A unified friction description and its application to the simulation of frictional instability using the finite element method. Philosophical Magazine, 86, 3453-3475.

  6. Xing, H. L., Mora, P. (2006). Construction of an intraplate fault system model of South Australia, and simulation tool for the iSERVO institute seed project. Pure and Applied Geophysics, 163, 2297-2316.

  7. Xing, H. L., Mora, P., & Makinouchi, A. (2004). Finite element analysis of fault bend influence on stick-slip instability along an intra-plate fault, Pure and Applied Geophysics, 161, 2091-2102




Virtual Rock Lab

Virtual Rock Lab (VRL) is a web portal for running ESyS-Particle simulations on the AuScope computational grid. Requiring only a JavaScript-enabled web browser, the portal can be accessed from anywhere in the world. It contains intuitive interfaces for constructing simulation scripts, submitting them to a supercomputer, monitoring progress and downloading results, all from within the portal. There is thus no need to use SSH or other command-line tools to submit or access simulation jobs. VRL's dialogue-based Script Builder makes it possible to create ESyS-Particle simulation scripts without writing a single line of Python code, so first-time users can quickly gain experience with the software. Since VRL provides access to AuScope Grid resources, interested parties are required to register before being able to use it. The portal is available at



Compute Services

ESSCC operates an SGI Altix ICE 8200 EX (savanna) for running large-scale simulations and for code testing and development. The facility was acquired through competitive UQ funding. Its specifications are:

- 128 Intel Xeon quad-core processors (512 cores, 2.8 GHz each)
- InfiniBand 2xDDR interconnect
- 2 TB memory
- 14.4 TB disk on a Nexis 9000 NAS
- Peak performance of 5.7 TFlops
- Access to the SGI Data Migration Facility
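The quoted peak figure is consistent with each core retiring 4 double-precision floating-point operations per cycle, a typical value for Xeon cores of that generation (this per-cycle figure is an assumption here, not stated above):

```python
# Back-of-the-envelope check of the quoted peak performance.
cores = 512                    # 128 quad-core processors
clock_hz = 2.8e9               # 2.8 GHz per core
flops_per_cycle = 4            # assumed: 4 DP flops/cycle per core

peak = cores * clock_hz * flops_per_cycle
print(peak / 1e12)             # 5.7344 TFlops, quoted above as 5.7
```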