NAS Parallel Benchmarks
The NAS Parallel Benchmarks (NPB) are a set of benchmarks targeting performance evaluation of highly parallel supercomputers. They are developed and maintained by the NASA Advanced Supercomputing (NAS) Division (formerly the NASA Numerical Aerodynamic Simulation Program) based at the NASA Ames Research Center. NAS solicits performance results for NPB from all sources.[1]
History
Motivation
Traditional benchmarks that existed before NPB, such as the Livermore loops, the LINPACK Benchmark and the NAS Kernel Benchmark Program, were usually specialized for vector computers. They generally suffered from inadequacies including parallelism-impeding tuning restrictions and insufficient problem sizes, which rendered them inappropriate for highly parallel systems. Full-scale application benchmarks were equally unsuitable because of their high porting costs and the unavailability of automatic software parallelization tools.[2] As a result, the NPB were released in 1991[3] to address the resulting lack of benchmarks applicable to highly parallel machines.
NPB 1
The first specification of NPB recognized that the benchmarks should feature
- new parallel-aware algorithmic and software methods,
- genericness and architecture neutrality,
- easy verifiability of correctness of results and performance figures,
- capability of accommodating new systems with increased power,
- and ready distributability.
In light of these guidelines, the only viable approach was judged to be a collection of "paper and pencil" benchmarks: problems specified only algorithmically, with most implementation details left to the implementer's discretion, subject to certain necessary limits.
NPB 1 defined eight benchmarks, each in two problem sizes. Sample code written in Fortran 77 was supplied but not intended for benchmarking purposes.[2]
NPB 2
After its release, NPB 1 displayed two major weaknesses. First, because of its "paper and pencil" style of specification, computer vendors usually tuned their implementations so highly that their performance became difficult for scientific programmers to attain. Moreover, many of these implementations were proprietary and not publicly available, effectively concealing their optimization techniques. Second, the problem sizes of NPB 1 lagged behind the growing capability of supercomputers.[3]
NPB 2, released in 1996,[4][5] came with source code implementations for five of the eight benchmarks defined in NPB 1, supplementing rather than replacing NPB 1. It also updated the benchmarks with a new, larger problem size to keep pace with contemporary machines, and it amended the rules for submitting benchmarking results. The new rules included explicit requests for output files as well as modified source files and build scripts, to ensure public availability of the modifications and reproducibility of the results.[3]
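The NPB 2 implementations were written largely in Fortran 77, using MPI for message passing. As a rough illustrative sketch only (this is not the actual NPB source; the structure below is an assumption modeled on a generic MPI benchmark), a program of this kind typically initializes MPI, times a kernel, combines a value across processes, and reports from one process:

```fortran
      program npbsketch
c     Rough sketch only, not NPB source: the general shape of an
c     MPI benchmark. Start MPI, time a kernel, reduce a value
c     across processes, and report from rank 0.
      implicit none
      include 'mpif.h'
      integer ierr, rank, nprocs
      double precision t0, t1, local, total
      call MPI_INIT(ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
      call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)
      t0 = MPI_WTIME()
c     ... the benchmark kernel would run here ...
      local = 1.0d0
      call MPI_ALLREDUCE(local, total, 1, MPI_DOUBLE_PRECISION,
     &                   MPI_SUM, MPI_COMM_WORLD, ierr)
      t1 = MPI_WTIME()
      if (rank .eq. 0) then
         print *, 'processes =', nprocs, '  seconds =', t1 - t0
      end if
      call MPI_FINALIZE(ierr)
      end
```

A real NPB benchmark additionally verifies the computed results against reference values and reports a performance rate alongside the elapsed time.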
References
- ^ "NAS Parallel Benchmarks Changes". NASA Advanced Supercomputing Division. Retrieved 2009-02-23.
- ^ a b Bailey, D.; Barszcz, E.; Barton, J.; Browning, D.; Carter, R.; Dagum, L.; Fatoohi, R.; Fineberg, S.; Frederickson, P.; Lasinski, T.; Schreiber, R.; Simon, H.; Venkatakrishnan, V.; Weeratunga, S. (March 1994), "The NAS Parallel Benchmarks" (PDF), NAS Technical Report RNR-94-007, NASA Ames Research Center, Moffett Field, CA
- ^ a b c Bailey, D.; Harris, T.; Saphir, W.; van der Wijngaart, R.; Woo, A.; Yarrow, M. (December 1995), "The NAS Parallel Benchmarks 2.0" (PDF), NAS Technical Report NAS-95-020, NASA Ames Research Center, Moffett Field, CA
- ^ Saphir, W.; van der Wijngaart, R.; Woo, A.; Yarrow, M., "New Implementations and Results for the NAS Parallel Benchmarks 2" (PDF), NASA Ames Research Center, Moffett Field, CA
- ^ van der Wijngaart, R. (October 2002), "NAS Parallel Benchmarks Version 2.4" (PDF), NAS Technical Report NAS-02-007, NASA Ames Research Center, Moffett Field, CA
External links
- NAS Parallel Benchmarks Changes (official website)
- NAS Kernel Benchmark Program