NSF says high-performance computing shouldn't fixate on FLOPS


Declaring that high-performance computing can't be measured just in floating point operations per second, or FLOPS, the National Science Foundation says in an advanced computing infrastructure strategic plan released Feb. 23 that it will emphasize grant funding for areas such as "innovations in software and algorithms, data analytics, statistical techniques, fundamental operating system research, file systems, and innovative domain-centric applications."

"It's not just about speed, it's about how we deal with big data, it's about how we integrate computation into various disciplines," said Lisa-Joy Zgorski, an NSF spokeswoman. The strategy will affect agency grants across multiple disciplines that need computing support and wrestle with problems of big data.

With the number of cores per chip and accelerator-based hybrid systems continuing to grow, the strategy says high-performance computing efforts need increased attention to matters such as fault tolerance and power consumption. Power consumption is a key limitation for computers of all sizes, the strategy notes, adding that memory and growing data movement also require further fundamental research.

The strategy says NSF will pursue five basic strategic directions:

  • Foundational research to fully exploit parallelism and concurrency through innovations in computational models and languages; mathematics and statistics; algorithms; compilers; operating and run-time systems; middleware; software tools; application frameworks; virtual machines; and advanced hardware.
  • Research and development in the use of high-end computing resources in partnerships with scientific domains, including new computational, mathematical and statistical modeling; simulation, visualization and analytic tools; aggressive domain-centric applications development; and deployment of scalable data management systems.
  • Building, testing, and deploying both sustainable and innovative resources into a collaborative ecosystem that encompasses integration/coordination with campus and regional systems, networks, cloud services, and/or data centers in partnerships with scientific domains.
  • Development of comprehensive education and workforce programs, from building deep expertise in computational, mathematical and statistical simulation, modeling and computational and data-enabled science and engineering (CDS&E) to developing a technical workforce and enabling career paths in science, academia, government and industry.
  • Development and evaluation of transformational and grand challenge community programs that support contemporary complex problem solving through a comprehensive, integrated approach to science, drawing on high-end computing, data, networking, facilities, software, and multidisciplinary expertise across communities, other government agencies and international partnerships.

For more:
- download the NSF Advanced Computing Infrastructure strategic plan (.pdf)

Related Articles:
NIST scientists make quantum-computing device communication breakthrough 
Exaflop supercomputer receives full funding from Senate appropriators 
NNSA turns to computer modeling to plug infrastructure data gap