A. Within SPEC's Graphics Performance Characterization (GPC) Group there was a strong belief that it was important to benchmark graphics performance based on actual applications. Application-level benchmarks exist, but they are not standardized, nor do they cover a wide range of application areas. The Application Performance Characterization (APC) group feels that end users will benefit from a broad-ranging set of standardized benchmarks for graphics-intensive applications.
Q. What companies are members of the APC project group?
A. Current members are 3D Labs, Compaq, Dell, HAL Computer, Hewlett-Packard, IBM, Intel, Intergraph, Real 3D, S3, Silicon Graphics, Sun Microsystems, Universidad Nacional de Mexico and University of Central Florida.
Q. Is it appropriate that vendors drive this effort?
A. Industry vendors have the highest level of interest in developing these benchmarks, but the development process will stem from interaction with user groups, publications, application developers and others. There is sometimes a perception that "vendor-driven" benchmarks are less objective. In fact, these types of consortium-developed benchmarks are likely to be the most objective, since there is a natural system of checks and balances due to the specific market interests of different vendors.
Q. Why did the APC project group decide to join the GPC Group?
A. This initiative stemmed from members of the GPC Group seeing a common need for better application benchmarks. The GPC Group, with its affiliation with SPEC (Standard Performance Evaluation Corp.), already has systems in place for running such a project and ensuring standardization of the workloads, measurement criteria, review processes, and reporting formats. This saves a great deal of administrative time and resources, allowing the project group to concentrate more fully on developing and maintaining benchmarks.
Q. How does the group identify applications in various market segments?
A. We use existing market data, member company knowledge, and ISV input to determine relevant market segments and important applications in those segments. These applications will be updated over time as user needs change and new applications become available.
Q. Does the APC group stipulate that its benchmarks must run on all platforms?
A. They should run on a reasonable number of platforms, but they do not have to run on all platforms. Many applications are targeted to a specific level of hardware, such as PCs or high-end workstations. It would not be appropriate to require a vendor to run a benchmark that is not designed for its platform.
Q. How does the group identify appropriate workloads for selected applications?
A. APC project group members sponsor applications and work with end users, user groups, publications and ISVs to select and refine workloads, which consist of data sets and benchmark script files. Although APC members facilitate this process, workload content is determined by end users and ISVs, not by the members themselves. Workloads will evolve over time in conjunction with end users' needs.
Q. What are the APC project group's priorities in selecting benchmarks?
A. First, we select benchmarks that are useful to end users who are evaluating and selecting platforms. Second, the applications must use graphics prominently in at least some phase of user interaction; the workload is selected to capture this phase. Several existing application benchmarks are called "graphics-oriented," but their test results do not reflect the benefits of the improvements vendors are making in raw graphics performance. The APC group focuses on benchmarks that tax the entire graphics subsystem, including hardware, software and the OS.
Q. How will the group define, publish, measure, report and review results?
A. This is where our SPEC/GPC affiliation pays benefits. We will adopt existing SPEC/GPC processes as much as possible to minimize work and reduce the learning curve. Performance measurements must be accurate and repeatable across diverse computing environments. Reporting schemes will be designed to meet end users' needs.
Q. How will APC results be reviewed and published?
A. Again, we will take advantage of standardized practices used within SPEC/GPC. We intend to develop a review process and a set of universal reporting and display tools for publishing results in the GPC News.
Q. Will benchmark tools be freely available to the public?
A. Yes. We will make certain a wide audience has easy access to the workloads and the benchmark results published by the APC project group. These tools will be made available for free downloading through this Web site, and results that have undergone peer review will be published on the site. Making the tools and results accessible to the public will allow independent reproduction of results and analysis of workloads. Anyone who wants to run the benchmarks, of course, will need copies of the applications.
Vendors, users and publications will be free to publish their own results outside of this Web site. These results, however, will not be subject to the SPEC/GPC peer review process that takes place before results are published on the Web site. The APC group has not yet established a policy regarding non-member submissions to the Web site.
Q. Applications often support different features on different platforms. This is especially true on the high end, where vendors seek to add value by providing advanced features. How does the APC group accommodate advanced feature sets in its application benchmarks?
A. The project group wants to make certain that we avoid "least-common-denominator" tests. We are developing benchmarks based on "core" functionality, which represents the end users' "musts" for running the application. All vendors reporting on a benchmark must support this functionality, although each reporting vendor may choose whether to implement it in hardware or software. In addition to core functionality, we might develop an "enhanced" aspect of the benchmark. The enhanced aspect would make it clear that extended functionality and performance of value to end users is available. An example of enhanced functionality might be the ability to do 3D texture mapping.
Q. Where is more information about membership and application benchmarking available?
A. More information and answers to specific questions can be obtained by contacting the APC project group's e-mail alias: firstname.lastname@example.org.