Brainware for green HPC

  • Special Issue Paper
Computer Science - Research and Development

Abstract

The reduction of the infrastructure costs of HPC, in particular power consumption, is currently driven mainly by architectural advances in hardware. Recently, in the quest for the EFlop/s, hardware-software codesign has been advocated, owing to the realization that without software support only heroic programmers could use high-end HPC machines. However, in the topically diverse world of universities, the EFlop/s is still very far off for most users, yet it is their computational demands that will shape the HPC landscape for the foreseeable future. Based on experiences at RWTH Aachen University and in the context of the distributed Computational Science and Engineering support of the UK HECToR program, we argue on economic grounds that HPC hardware and software installations need to be complemented by a “brainware” component, i.e., trained HPC specialists who support the performance optimization of users’ codes. This claim itself is not new, and the establishment of simulation labs at HPC centers echoes it. Based on our experiences, however, we quantify the savings resulting from brainware, thus providing an economic argument that sufficient brainware must be an integral part of any “green” HPC installation. It follows that current HPC funding regimes, which favor iron over staff, are fundamentally flawed, and that long-term efficient HPC deployment must emphasize brainware development to a much greater extent.
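To make the economic argument concrete, here is a minimal back-of-the-envelope sketch of the break-even reasoning; the model and the symbols S, c, H, and s are illustrative assumptions, not figures taken from the paper. Suppose an HPC specialist costs S per year, machine operation (power, cooling, amortization) costs c per core-hour, and the specialist accelerates a workload of H core-hours per year by a factor s. Brainware then pays for itself whenever the operating costs saved exceed the staff costs:

\[
  \underbrace{c\,H\left(1 - \frac{1}{s}\right)}_{\text{operating cost saved per year}} \;\ge\; S
  \quad\Longleftrightarrow\quad
  H \;\ge\; \frac{S}{c\,(1 - 1/s)} .
\]

For illustration only: with S = 100,000 EUR, c = 0.05 EUR per core-hour, and a modest speedup of s = 2, break-even occurs at H = 4 million core-hours per year, and every core-hour tuned beyond that point is a net saving in both cost and energy.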




Author information


Corresponding author

Correspondence to Christian Bischof.


Cite this article

Bischof, C., an Mey, D. & Iwainsky, C. Brainware for green HPC. Comput Sci Res Dev 27, 227–233 (2012). https://doi.org/10.1007/s00450-011-0198-5
