Standard Performance Evaluation Corporation

SPEC Organizational Information


 

SPEC's Background

The System Performance Evaluation Cooperative, now named the Standard Performance Evaluation Corporation (SPEC), was founded in 1988 by a small number of workstation vendors who realized that the marketplace was in desperate need of realistic, standardized performance tests. The key realization was that an ounce of honest data was worth more than a pound of marketing hype.

SPEC has grown to become one of the more successful performance standardization bodies with more than 60 member companies. SPEC publishes several hundred different performance results each quarter spanning a variety of system performance disciplines.

 

SPEC's Philosophy

The goal of SPEC is to ensure that the marketplace has a fair and useful set of metrics for differentiating candidate systems. The chosen path attempts to strike a balance between requiring strict compliance and allowing vendors to demonstrate their advantages. The belief is that a good test that is also reasonable to use will lead to a greater availability of results in the marketplace.

The basic SPEC methodology is to provide the benchmarker with a standardized suite of source code, based upon existing applications, that has already been ported to a wide variety of platforms by the SPEC membership. The benchmarker compiles this source code for the system in question and can then tune the system for the best results. The use of already accepted and ported source code greatly reduces the problem of making apples-to-oranges comparisons.
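
To make the arithmetic behind the published numbers concrete, here is a minimal sketch in Python of how a SPEC CPU-style composite metric takes shape. The timings and function names are hypothetical, not part of any SPEC tool (the real suites ship their own run harness); the sketch only illustrates the general scheme: each benchmark's run time is compared against a fixed reference machine, and the overall score combines those per-benchmark ratios.

    from math import prod

    def spec_style_ratio(reference_seconds, measured_seconds):
        # One benchmark's ratio: how much faster the system under
        # test ran the workload than the reference machine did.
        return reference_seconds / measured_seconds

    def overall_metric(ratios):
        # Composite score: the geometric mean of the per-benchmark
        # ratios, so no single benchmark dominates the result.
        return prod(ratios) ** (1.0 / len(ratios))

    # Hypothetical timings, in seconds, for three benchmarks.
    reference = [1000.0, 1400.0, 2200.0]   # reference machine
    measured = [250.0, 400.0, 500.0]       # system under test

    ratios = [spec_style_ratio(r, m) for r, m in zip(reference, measured)]
    print("per-benchmark ratios:", [round(x, 2) for x in ratios])
    print("overall metric (geometric mean): %.2f" % overall_metric(ratios))

SPEC's CPU suites do in fact report geometric means of such reference ratios; the geometric mean keeps one outlier workload from swamping the composite score.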

 

SPEC's Structure

SPEC is a non-profit corporation whose membership is open to any company or organization that is willing to support our goals (and pay our nominal dues). Originally just a bunch of people from workstation vendors devising CPU metrics, SPEC has evolved into an umbrella organization encompassing four diverse groups.

The Open Systems Group (OSG)

The OSG is the original SPEC committee. This group focuses on benchmarks for desktop systems, high-end workstations and servers running open systems environments.

Current OSG Subcommittees:

CLOUD
OSG's Cloud subcommittee is developing industry-standard benchmarks in areas related to cloud computing infrastructure, application spaces, and emerging technologies.
CPU
The people who brought you SPECmarks and the other CPU benchmarks (SPECint, SPECfp, SPECrates, etc.).
JAVA
The people who brought you the Java client and server-side benchmarks JVM98, JVM2008, JBB2000, and JBB2005, the SPECjAppServer and SPECjEnterprise Java Enterprise Application Server benchmarks, and the Java Message Service benchmark SPECjms2007.
HANDHELD
The Handheld subcommittee is working on the development of a compute intensive benchmark suite for handheld devices.
POWER
The Power subcommittee developed SPECpower_ssj2008, the SPEC benchmark for evaluating the energy efficiency of server-class computers, as well as the Server Efficiency Rating Tool (SERT).
SIP
The SIP subcommittee develops benchmarks that compare the performance of servers using the Session Initiation Protocol (SIP). SPECsip_Infrastructure2011 models a VoIP deployment for an enterprise, telco, or service provider, where the SIP server performs proxying and registration.
SOA
This subcommittee is developing a new industry-standard benchmark for measuring the performance of typical middleware, database, and hardware deployments of applications based on a Service-Oriented Architecture (SOA).
SFS
These are the people who brought you the NFS benchmarks SFS93 (LADDIS), SFS97, SFS97_R1, and SFS2008.
VIRTUALIZATION
The Virtualization subcommittee developed SPECvirt_sc2010, the first-generation SPEC benchmark for comparing virtualization performance for data center servers, and its successor, SPECvirt_sc2013.
 

The High-Performance Group (HPG)

The HPG is a forum for establishing, maintaining and endorsing a suite of benchmarks that represent high-performance computing applications for standardized, cross-platform performance evaluation.

These benchmarks target high performance system architectures, such as symmetric multiprocessor systems, workstation clusters, distributed memory parallel systems, and traditional vector and vector parallel supercomputers.

 

The Graphics and Workstation Performance Group (GWPG)

SPEC/GWPG is the umbrella organization for project groups that develop consistent and repeatable graphics and workstation performance benchmarks and reporting procedures. SPEC/GWPG benchmarks are worldwide standards for evaluating performance in a way that reflects user experiences with popular applications.

Current GWPG Project Groups:

SPECapc
The Application Performance Characterization (SPECapc℠) group was formed in 1997 to provide a broad-ranging set of standardized benchmarks for graphics and workstation applications. The group's current benchmarks span popular CAD/CAM, digital content creation, and visualization applications.
SPECgpc
The Graphics Performance Characterization (SPECgpc℠) group, begun in 1993, establishes performance benchmarks for graphics systems running under OpenGL and other application programming interfaces (APIs). The group's SPECviewperf® benchmark is the most popular standardized software for evaluating performance based on popular graphics applications.
SPECwpc
SPECwpc is the newest project group within SPEC/GWPG. The group has created a benchmark that measures the performance of workstations running algorithms used in popular applications, but without requiring the full application and associated licensing to be installed on the system under test. SPECwpc V1.0, released in November 2013, is easy to install and run, but still rigorous enough to provide meaningful, repeatable data for performance evaluation.

 

SPEC Research Group (RG)

The RG is a new group within SPEC created to promote innovative research on benchmarking methodologies and tools, facilitating the development of benchmark suites and performance analysis frameworks for established and newly emerging technologies. It is designed to encourage exchange among representatives from academia, industry, and research institutes. The scope of its research includes techniques and tools for performance measurement, load testing, profiling, workload characterization, and dependability and efficiency evaluation of computing systems. While the focus is on performance, other extra-functional system properties, such as scalability, availability, cost, and energy efficiency, are considered as well.

A major component of the RG is the development of standard scenarios and workloads—called research benchmarks—for emerging technologies and applications. Benchmarks from the research group are intended primarily for in-depth analysis and evaluation of early prototypes and research results. This differentiates them from conventional benchmarks used for direct comparison and marketing of existing products.

Other planned activities of the RG include publishing a newsletter and journal, establishing a portal for benchmarking-related resources, recognizing outstanding contributions to benchmarking research, and organizing conferences and workshops.

 

Frequently Asked Questions

If you are still curious, perhaps we have the answers in the SPEC FAQ.

 

In Memoriam

Kaivalya Dixit

Kaivalya Dixit, long-time SPEC president, passed away on November 22, 2004. Kaivalya touched many across the computer performance community and will be missed by all. SPEC has established a page where visitors may read remembrances of Kaivalya or share their own.


Tom Skornia

Tom Skornia, legal counsel to SPEC for much of its history, succumbed to cancer on Wednesday, April 27, 2005. Tom was not necessarily a public person, but behind the scenes his bright mind and hard work on behalf of many companies and consortia in the industry were key factors in our successes.


Larry Gray

Larry Gray, longtime SPEC treasurer, passed away on January 30, 2011. Larry was involved with SPEC from its inception, and in addition to serving as treasurer he participated in the development of many of SPEC's benchmarks.


Alan Adamson

Alan Adamson passed away on October 31, 2012. Alan served as chair of the Java and OSG Steering Committees. He helped establish the SPEC Research Group and the ICPE series of joint conferences with the ACM, and did much of the hard work behind both. In addition, Alan served as the Board's inspector of elections and was a supporting contributor to the Power subcommittee.