Geoscientific Model Development: an interactive open-access journal of the European Geosciences Union
Discussion papers
https://doi.org/10.5194/gmd-2019-20
© Author(s) 2019. This work is distributed under the Creative Commons Attribution 4.0 License.

Development and technical paper | 15 Feb 2019

Review status
This discussion paper is a preprint. It is a manuscript under review for the journal Geoscientific Model Development (GMD).

How to use mixed precision in Ocean Models

Oriol Tintó Prims1,2, Mario C. Acosta1, Andrew M. Moore3, Miguel Castrillo1, Kim Serradell1, Ana Cortés2, and Francisco J. Doblas-Reyes1,4
  • 1Earth Sciences Department, Barcelona Supercomputing Center - Centro Nacional de Supercomputación, Barcelona, Spain
  • 2HPCA4SE research group, Computer Architecture and Operating Systems Department, Universitat Autònoma de Barcelona, Bellaterra, Spain
  • 3Ocean Sciences Department, University of California, Santa Cruz, CA, USA
  • 4ICREA, Barcelona, Spain

Abstract. Mixed-precision approaches can provide substantial speed-ups for both computing- and memory-bound codes while requiring little effort. Most scientific codes over-engineer their numerical precision, so models use more resources than required without any indication of where that precision is unnecessary and where it is really needed. There is therefore an opportunity to obtain performance benefits from a more appropriate choice of precision; all that is needed is a method to determine which real variables can be represented with fewer bits without affecting the accuracy of the results. This paper presents a novel method that enables modern and legacy codes to benefit from a reduction of precision without sacrificing accuracy. It rests on a simple idea: if we can measure how reducing the precision of a group of variables affects the outputs, we can evaluate the level of precision this group of variables needs. Modifying and recompiling the code for each case to be evaluated would require a prohibitive amount of effort. Instead, the method presented in this paper relies on a tool called the Reduced Precision Emulator (RPE), which significantly streamlines the process. Using the RPE together with a list of parameters specifying the precision of each real variable in the code, a single binary can emulate the effect of any particular choice of precision on the outputs. Once we can emulate the effects of reduced precision, we can design the tests required to characterize all the variables in the model. The number of possible combinations is prohibitively large and impossible to explore exhaustively. Screening the variables individually gives some insight into the precision each variable needs, but more complex interactions involving several variables may remain hidden. Instead, we use a divide-and-conquer algorithm that identifies the parts of the code that cannot handle reduced precision and builds a set of variables that can. The method has been put to the test using two state-of-the-art ocean models, NEMO and ROMS, with very promising results. This information is crucial for subsequently building an actual mixed-precision version of the code that delivers the promised performance benefits.
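To make the emulation step concrete, the following minimal Fortran sketch shows how a single variable can be run at reduced precision with the rpe library the abstract refers to (module rp_emulator, derived type rpe_var). The variable name sst and the chosen bit counts are illustrative only, not taken from NEMO or ROMS.

```fortran
! Minimal sketch of emulating reduced precision with the rpe library
! (module rp_emulator, type rpe_var). The variable name sst and the bit
! counts are illustrative only.
program rpe_demo
  use rp_emulator
  implicit none
  type(rpe_var) :: sst        ! a model variable run at emulated low precision

  RPE_DEFAULT_SBITS = 23      ! default significand width (single-like)
  sst%sbits = 10              ! per-variable override (half-like significand)

  ! Arithmetic and assignment are overloaded: every result is truncated
  ! to the requested number of significand bits.
  sst = 1.0d0 / 3.0d0
  print *, 'value with 10 significand bits: ', sst%val
end program rpe_demo
```

Because the truncation is applied through operator overloading, the same binary can emulate any assignment of precisions to variables simply by changing the list of sbits values it reads at start-up, which is what makes the large-scale search described above affordable.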

Interactive discussion
Status: open (until 12 Apr 2019)
Data sets

NEMO Reference configurations inputs (Version v4.0) NEMO Consortium https://doi.org/10.5281/zenodo.1472245

Short summary
Mixed-precision approaches can provide substantial speed-ups for both computing- and memory-bound codes while requiring little effort. A novel method that enables modern and legacy codes to benefit from a reduction of precision without sacrificing accuracy is presented. Using a precision emulator and a divide-and-conquer algorithm, it identifies the parts of the code that cannot handle reduced precision and the ones that can. The method has been demonstrated using two ocean models, NEMO and ROMS, with promising results.
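The paper's test interfaces are not reproduced on this page, so the sketch below is a hypothetical Fortran rendering of the divide-and-conquer search the summary describes: precision_search, find_reducible, and run_with_reduced_precision are invented names, and the stand-in test function would have to be replaced by a driver that actually runs the model and compares its outputs against an accuracy threshold.

```fortran
! Hypothetical sketch of the divide-and-conquer search over model variables.
! Every name here is invented for illustration; the real test would run the
! model and compare its outputs against an accuracy threshold.
module precision_search
  implicit none
contains

  ! Stand-in: run the model with only the listed variables at reduced
  ! precision and report whether the outputs stay within tolerance.
  logical function run_with_reduced_precision(vars) result(ok)
    character(len=*), intent(in) :: vars(:)
    ok = .false.   ! placeholder; a real driver would launch the model here
  end function run_with_reduced_precision

  ! Recursively accept whole groups that tolerate reduced precision and
  ! split the groups that do not.
  recursive subroutine find_reducible(vars, reducible, nred)
    character(len=*), intent(in)    :: vars(:)      ! candidate variable names
    character(len=*), intent(inout) :: reducible(:) ! accepted variables
    integer,          intent(inout) :: nred         ! count of accepted variables
    integer :: n, half
    n = size(vars)
    if (n == 0) return
    if (run_with_reduced_precision(vars)) then
       reducible(nred+1:nred+n) = vars   ! the whole group passes: keep it
       nred = nred + n
    else if (n > 1) then
       half = n / 2                      ! the group fails: split and recurse
       call find_reducible(vars(1:half), reducible, nred)
       call find_reducible(vars(half+1:), reducible, nred)
    end if
    ! a single failing variable is simply left at full precision
  end subroutine find_reducible

end module precision_search
```

With this scheme a group that passes the test is accepted in one shot, so the number of model runs grows with the number of failing variables rather than with the total number of variables in the code.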