The FAQs About SPECapcSM

Q. Why was the SPECapcSM project group formed?

A. Within SPEC's Graphics Performance Characterization (GPC) Group there was a strong belief that graphics performance should be benchmarked with actual applications. Application-level benchmarks exist, but they are not standardized and they do not cover a wide range of application areas. The Application Performance Characterization (SPECapc) group believes that end users will benefit from a broad-ranging set of standardized benchmarks for graphics-intensive applications.

Q. What companies are members of the SPECapc project group?

A. Current members are 3DLabs, Compaq, Dell, Evans & Sutherland, Fujitsu, Hewlett-Packard, IBM, Intel, Intergraph, Real 3D, Siemens, SGI, Sun Microsystems, and the University of Central Florida.

Q. Is it appropriate that vendors drive this effort?

A. Industry vendors have the highest level of interest in developing these benchmarks, but the development process will stem from interaction with user groups, publications, application developers and others. There is sometimes a perception that "vendor-driven" benchmarks are less objective. In fact, these types of consortium-developed benchmarks are likely to be the most objective, since there is a natural system of checks and balances due to the specific market interests of different vendors.

Q. Why did the SPECapc project group decide to join the GPC Group?

A. This initiative stemmed from members of the GPC Group seeing a common need for better application benchmarks. The GPC Group, with its affiliation with SPEC (Standard Performance Evaluation Corp.), already has systems in place for running such a project and ensuring standardization of the workloads, measurement criteria, review processes, and reporting formats. This saves a great deal of administrative time and resources, allowing the project group to concentrate more fully on developing and maintaining benchmarks.

Q. How does the group identify applications in various market segments?

A. We use existing market data, member company knowledge, and ISV input to determine relevant market segments and important applications in those segments. These applications will be updated over time as user needs change and new applications become available.

Q. Does the SPECapc group stipulate that its benchmarks must run on all platforms?

A. They should run on a reasonable number of platforms, but they do not have to run on all platforms. Many applications are targeted to a specific level of hardware, such as PCs or high-end workstations. It would not be appropriate to require a vendor to run a benchmark that is not designed for its platform.

Q. How does the group identify appropriate workloads for selected applications?

A. SPECapc project group members sponsor applications and work with end users, user groups, publications and ISVs to select and refine workloads, which consist of data sets and benchmark script files. Workloads are determined by end users and ISVs, not SPECapc group members, and they will evolve over time as end users' needs change.

Q. What are the SPECapc project group's priorities in selecting benchmarks?

A. The first priority is to select benchmarks that are useful to end users who are evaluating and selecting platforms. The second is that the applications must use graphics prominently in at least some phase of user interaction; the workload is selected to capture this phase. Several existing application benchmarks are called "graphics-oriented," but their test results do not reflect the benefits of the improvements vendors are making in raw graphics performance. The SPECapc group focuses on benchmarks that tax the graphics subsystem, including hardware, software and the OS.

Q. How does the group define, publish, measure, report and review results?

A. This is where our SPEC/GPC affiliation pays dividends. We adopt existing SPEC/GPC processes as much as possible to minimize work and reduce the learning curve. Performance measurements must be accurate and repeatable across diverse computing environments, and reporting schemes are designed to meet end users' needs.
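
For illustration only, the C sketch below shows one simple way a benchmark harness might check that a measured workload is repeatable before a result is reported. The run_workload() function is a hypothetical stand-in; a real SPECapc test replays an application-level script against real data sets, and this is not the SPECapc measurement code itself.

    /* Illustrative sketch only: checking run-to-run repeatability. */
    #include <stdio.h>
    #include <time.h>

    #define RUNS 5

    static void run_workload(void)
    {
        /* Dummy CPU-bound work standing in for the real application workload. */
        volatile double x = 0.0;
        long i;
        for (i = 0; i < 10000000L; i++)
            x += (double)i * 0.5;
    }

    int main(void)
    {
        double min = 1e30, max = 0.0;
        int i;

        for (i = 0; i < RUNS; i++) {
            clock_t start = clock();   /* CPU time; a graphics test would use wall-clock time */
            run_workload();
            double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
            printf("run %d: %.3f s\n", i + 1, secs);
            if (secs < min) min = secs;
            if (secs > max) max = secs;
        }

        /* A large spread between the fastest and slowest runs suggests the
         * measurement is not stable enough to report. */
        printf("spread between runs: %.1f%%\n", 100.0 * (max - min) / min);
        return 0;
    }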

Q. How are SPECapc results reviewed and published?

A. Again, we take advantage of standardized practices used within SPEC/GPC. We have developed a review process and a set of universal reporting and display tools for publishing results in the GPC News.

Q. Are benchmark tools freely available to the public?

A. Yes. We make certain a wide audience has easy access to the workloads and the benchmark results published by the SPECapc project group. These tools are made available for free downloading through the GPC News, and results that have undergone peer review are published on the site. Making the tools and results accessible to the public allows independent reproduction of results and analysis of workloads. Anyone who wants to run the benchmarks, of course, needs copies of the applications.

Vendors, users and publications are free to publish their own results outside of the GPC News web site. These results, however, are not subject to the SPEC/GPC peer review process that takes place before results are published on the Web site.

Q. Applications often support different features on different platforms. This is especially true on the high end, where vendors seek to add value by providing advanced features. How does the SPECapc group accommodate advanced feature sets in its application benchmarks?

A. The project group wants to make certain that we avoid "least-common-denominator" tests. We are developing benchmarks based on "core" functionality, which represents the end users' "musts" for running the application. All vendors reporting on a benchmark must support this functionality, although it is the reporting vendor's choice about whether to implement the functions through hardware or software. In addition to core functionality, we might develop an "enhanced" aspect of the benchmark. The enhanced aspect would make it clear that there is extended functionality and performance available that has value to end users. An example of enhanced functionality might be the ability to do 3D texture mapping.
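
As an illustration only, the C sketch below shows how a benchmark harness might probe for 3D texture mapping support through the standard OpenGL extension string before deciding whether to run an enhanced sub-test. It assumes an OpenGL/GLUT development environment; the program and its decision logic are hypothetical and are not taken from any SPECapc benchmark.

    /* Illustrative sketch only: gating an "enhanced" sub-test on a capability probe. */
    #include <GL/glut.h>
    #include <stdio.h>
    #include <string.h>

    /* Returns 1 if the named extension appears in the GL_EXTENSIONS string.
     * (Simple substring check; a production harness would match whole tokens.) */
    static int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext != NULL && strstr(ext, name) != NULL;
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
        glutCreateWindow("capability probe");   /* a context must exist before glGetString */

        if (has_extension("GL_EXT_texture3D"))
            printf("3D texture mapping available: run enhanced sub-test\n");
        else
            printf("core functionality only: skip enhanced sub-test\n");
        return 0;
    }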

Q. Where is more information about membership and application benchmarking available?

A. More information and answers to specific questions can be obtained by contacting the SPECapc project group's e-mail alias: gpcapc-info@spec.org.

The Benchmarks

If your browser refuses to load the following links, use [Shift] [left click] to save the file to disk.


By downloading the following benchmarks, you acknowledge you have read, understand, and agree to abide by the terms of the License Agreement.

SPECapcSM for Pro/ENGINEER™ 2000i
SPECapcSM for SolidWorks 99™
SPECapcSM for Pro/ENGINEER™ Rev. 20
SPECapcSM for SolidWorks 98Plus™

Contents © Copyright 1999, Standard Performance Evaluation Corporation