Perspectives on the SPEC SDET Benchmark

Steven L. Gaede
Lone Eagle Systems Inc.

The year 1999 marks the twentieth anniversary of the benchmark that SPEC adopted in 1991 as the Software Development Environment Throughput (SDET) benchmark. That this benchmark has remained in use for twenty years is remarkable, and indicates that it continues to fill a niche today.

SDET had its beginnings at Bell Laboratories in 1979, where it was used to evaluate whether the UNIX operating system would scale to large mainframe platforms. The benchmark was designed to apply a representative model workload to a system at increasing levels of concurrency in order to generate a graph of throughput vs. offered load. Given a set of curves representing the performance characteristics of a set of systems, a unit-less scaling factor could be calculated for comparing system scalability. Because of its use throughout the industry as a means for evaluating system performance, SPEC adopted SDET as a standard in 1991.
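To make the methodology concrete, the following is a minimal sketch in Python of the measurement loop described above: run a model workload at increasing levels of concurrency, record throughput at each level, and compare systems with a unit-less scaling factor. The workload command, the concurrency levels, and the peak-throughput-ratio definition of the scaling factor are illustrative assumptions for this sketch, not the official SDET procedure.

```python
import subprocess
import time

# Illustrative offered-load levels (number of concurrent workload scripts).
CONCURRENCY_LEVELS = [1, 2, 4, 8, 16, 32]

# Placeholder for the model workload; the real SDET workload is a script of
# software-development commands (edits, compiles, file manipulation, etc.).
WORKLOAD_CMD = ["./sdet_script.sh"]

def measure_throughput(concurrency: int) -> float:
    """Run `concurrency` copies of the workload and return scripts per hour."""
    start = time.time()
    procs = [subprocess.Popen(WORKLOAD_CMD) for _ in range(concurrency)]
    for p in procs:
        p.wait()
    elapsed = time.time() - start
    return concurrency * 3600.0 / elapsed

def throughput_curve() -> dict[int, float]:
    """Collect the throughput vs. offered-load data behind the SDET graph."""
    return {n: measure_throughput(n) for n in CONCURRENCY_LEVELS}

def scaling_factor(curve: dict[int, float], reference_peak: float) -> float:
    """Unit-less comparison: this system's peak throughput relative to a
    reference system's peak (an assumed definition for illustration)."""
    return max(curve.values()) / reference_peak
```

Plotting each system's throughput curve and comparing the resulting scaling factors gives the kind of scalability comparison the benchmark was designed to support.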

The most plausible explanation for SDET's continued use is that it provides a measure of combined operating system and hardware platform performance. Even so, there are ways in which SDET could be improved to provide a higher-quality measure of system performance. Although it remains useful as a "system" benchmark, workloads today are significantly different from those of 1979, and care must be taken when using SDET - or any standard benchmark - to predict performance under today's workloads.