Fast, cheap, & turbulent — Global ocean modelling with GPU acceleration in Python

Dion Häfner, Roman Nuterman, Markus Jochum
2021 Journal of Advances in Modeling Earth Systems  
Virtually all of the most commonly used earth system models, for example those used for the CMIP climate model intercomparison studies (Meehl et al., 2000), are implemented in the Fortran programming language. Their ocean components, such as POP2 (Danabasoglu et al., 2012), NEMO (Madec et al., 2017), or MICOM (Bleck et al., 1995), are no exception. Our ocean model Veros follows a different route. Veros is implemented in the high-level programming language Python, which has several key advantages over Fortran (see Häfner et al., 2018, for a discussion) and is arguably the most popular programming language in science today. This allows even undergraduates to perform non-trivial numerical experiments. However, model performance has been a long-standing issue. The lack of a built-in optimizing compiler means that vectorized Python code is typically about 3-5 times slower than equivalent Fortran code. This may be fine for idealized experiments, but it is unacceptable for large setups that occupy thousands of central processing unit (CPU) cores for months. Recently, we succeeded in closing most of this performance gap by exploiting the just-in-time compiler of the JAX library (Bradbury et al., 2018). This results in competitive CPU performance, as we will demonstrate in this article. But using JAX has another advantage: it allows us to use graphics processing units (GPUs) without any additional code. With the advent of machine learning in general and deep learning in particular, GPUs have experienced a renaissance. Model training is vastly more efficient on massively parallel hardware, which has led to a feedback loop between supply and demand that has amplified their capabilities. Today, GPUs are the industry-standard devices for training artificial neural networks. This trend has also impacted the design of modern compute facilities; for
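To illustrate the mechanism (this is a minimal sketch, not Veros code): `jax.jit` traces a NumPy-style function once and compiles it via XLA, and the same source then runs on CPU or GPU depending on the available backend. The diffusion stencil below is a hypothetical stand-in for the kind of array operation that dominates an ocean model time step.

```python
import jax
import jax.numpy as jnp

def diffuse(field, kappa, dt, dx):
    """One explicit step of 1-D diffusion (hypothetical example stencil)."""
    # Second-order central difference on the interior points.
    lap = (field[:-2] - 2.0 * field[1:-1] + field[2:]) / dx**2
    # JAX arrays are immutable; .at[...].add(...) returns an updated copy.
    return field.at[1:-1].add(kappa * dt * lap)

# Compile once; subsequent calls with same-shaped inputs reuse the
# compiled kernel on whichever backend (CPU/GPU) JAX detects.
diffuse_jit = jax.jit(diffuse)

field = jnp.zeros(64).at[32].set(1.0)   # initial spike
field = diffuse_jit(field, kappa=0.1, dt=0.5, dx=1.0)
```

No GPU-specific code appears anywhere: swapping the backend is a matter of installing the CUDA-enabled JAX build, which is the property the article exploits.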
doi:10.1029/2021ms002717