SPECjEnterprise2010 User's Guide

Version 1.03

Last Modified: August 19, 2010

Table of Contents

Section 1 - Introduction

The SPECjEnterprise2010™ benchmark is an end-to-end benchmark that enables performance measurement and characterization of Java EE 5 servers and supporting infrastructure such as the JVM, database, CPU, disks, and servers.

The workloads consist of an end-to-end, web-based order processing domain, an RMI- and Web Services-driven manufacturing domain, and a supply chain model utilizing document-based Web Services. The application is a collection of Java classes, Java Servlets, JavaServer Pages, Enterprise Java Beans, Java Persistence entities (POJOs) and Message Driven Beans.

This document is a guide for setting up and running the SPECjEnterprise2010 benchmark. For a description of the rules and restrictions pertaining to SPECjEnterprise2010 we strongly recommend you read the SPECjEnterprise2010 Run and Reporting Rules document contained in the SPECjEnterprise benchmark kit or available online at the SPECjEnterprise2010 website. For an overview of the benchmark architecture, see the SPECjEnterprise2010 Design Document also contained in the benchmark kit.

1.1 Terminology

  • SPECjEnterprise2010 EjOPS
    • The SPECjEnterprise2010 performance metric, denoting the Enterprise Operations Per Second completed during the Measurement Interval. "SPECjEnterprise2010 EjOPS" is composed of the total number of business transactions completed in the Dealer Domain, added to the total number of work orders completed by the Planned Lines in the Manufacturing Domain, normalized per second.

  • Domain
    • A logical entity that describes a distinct business sphere of operations. The three SPECjEnterprise2010 domains are: Dealer, Manufacturing and Supplier.

  • Driver
    • The client code that drives the benchmark, monitors requirements, and reports results.

  • SPECjEnterprise2010 Kit
    • The complete kit provided for SPECjEnterprise2010. This includes the SPECjEnterprise2010 Reference Beans, the Driver, load programs, and documentation.

  • SPECjEnterprise2010 Reference Beans
    • The implementation of the Enterprise Beans provided for the SPECjEnterprise2010 workload.

  • Supplier Emulator
    • A Java Servlet that can run inside any Java-enabled web server and emulates the process of sending and receiving orders to/from suppliers.

  • SUT (System Under Test)
    • Comprises all components that are being tested. This includes Application Servers/Containers, Database Servers, network connections, etc. The Driver and Supplier Emulator are not part of the SUT.

1.2 Overview

The SPECjEnterprise2010 workload emulates an automobile manufacturing company and its associated dealerships. Dealers interact with the system using web browsers (simulated by the driver) while the actual manufacturing process is accomplished via RMI and Web Services (also driven by the driver). This workload stresses the ability of Web and EJB containers to handle the complexities of memory management, connection pooling, passivation/activation, caching, etc. The SPECjEnterprise2010 Design Document includes a complete description of the workload and the application environment in which it is run. This section of the user's guide describes the software and hardware environment required to run the workload.

1.2.1 Hardware Environment

Although SPECjEnterprise2010 can be run on a single machine for testing purposes, compliance with the SPECjEnterprise2010 Run and Reporting Rules requires that the driver and supplier emulator be run on a machine outside the SUT. Therefore, a compliant hardware configuration must include a network and a minimum of two systems: one or more systems to run the components within the SUT, and at least one system to run the driver and supplier emulator outside the SUT. A typical configuration is illustrated below.

1.2.2 Software Environment

SPECjEnterprise2010 is a Java EE 5.0 application that requires a Java EE 5 compatible application server as well as a Relational Database Management System (RDBMS) to run as part of the SUT. Outside the SUT, a Java EE 5 compatible application server is required for the supplier emulator, and a Java Runtime Environment (JRE) version 5 or later is required for the driver.

Section 2 - Installing SPECjEnterprise2010

The SPECjEnterprise2010 Kit is supplied as a setup.jar file, which is invoked with:

  • java -jar setup.jar
During installation you are asked to provide an installation directory. We will call this top-level directory KIT_HOME. SPECjEnterprise2010 can be run on any Application Server that is compliant with the Java EE 5 specification and has passed the Compatibility Test Suite (CTS) for that specification. The code is implemented in Java, and the tooling, such as the build and deploy infrastructure, is implemented using Ant build scripts. Java EE 5 server vendors and database vendors are expected to supply Ant scripts for vendor-specific deployment and database initialization, e.g.:

  • KIT_HOME/appservers/_Vendor X_
  • KIT_HOME/database/_Vendor X_

These vendor specific ant files will not necessarily be available as part of the benchmark package.

2.1 Directory Structure

After extracting the kit and running the installation Ant scripts, the following directory structure will exist.

  • appservers
    • Application server specific build files and configuration
  • databases
    • Database specific build files and configuration
  • faban
    • The directory containing the build files and configuration related to setting up the faban harness and running the driver.
  • docs
    • A copy of the benchmark documentation (also available from the application web pages)
  • lib
    • Directory containing library jars used by the harness to run the workload. Application-server-dependent library jars will also be placed here.
  • schema
    • Schema information and static data to be loaded into the database.
  • schema/sql
    • Generic SQL scripts for creating the tables, etc. in the database. These should be copied and modified as necessary for a particular database product.
  • target
    • Default output directory for all of the compilation ant targets.
  • target/jar
    • Contains the compiled .jar/.war/.ear files.
  • src
    • Root directory of the source files.
  • src/web-docroot
    • HTML and JSP pages used for the web interface.
  • src/org
    • Entry into the org.spec.jent package source.
  • version.info
    • Contains the version number of the benchmark kit

Note: For some DBMS products, DBMS-specific files are located in schema/ .

2.2 Java Package Structure

All SPECjEnterprise2010 Java classes are located in the org.spec.jent package. The following lists the sub-packages of org.spec.jent :

  • common
  • driver
  • launcher
  • load
  • mfg
  • orders
  • servlet
  • supplier
  • util

2.2.1 EJB Domains

The SPECjEnterprise benchmark application consists of three different domains: Order, Mfg and Supplier. Each domain provides a set of business methods exposed to servlets, web services or RMI/EJB. The driver interacts with the Order domain via HTTP requests. The Mfg domain is accessed by the driver using EJB/RMI and web service calls. Communication between the domains is realized using persistent JMS messages. The roles of the domains and the interactions between the domains and the driver are described further in the SPECjEnterprise2010 Design Document.

Each domain uses a distinct set of tables and calls business methods of its own domain only. Hence it is possible, but not required, to set up a distributed environment, e.g. using a dedicated (clustered) application server and a dedicated (clustered) database instance for each domain. Please refer to section 3.1.4 to learn more about database access and alternatives for configuring data sources.

Section 3 - Building and Deploying SPECjEnterprise2010

There are several components required to build, deploy, test and run SPECjEnterprise2010. These are:

  • An EJB Container to deploy all the SPECjEnterprise2010 Beans.
  • A Web Container to deploy the JSPs for the HTTP Clients in the Dealer Domain as well as the Delivery Servlet in the Supplier Domain.
  • A Web Services provider to enable Web Services communication for the driver, SUT, and emulator.
  • An external application server to run the Supplier Emulator application. Since all communication to and from the Supplier Emulator is done over Web Services, this application server must support Java EE 5 Web Services. The emulator application server is not part of the SUT and must not be deployed on the same system as the EJB containers, Web containers, or Web Services providers running within the SUT.
  • The SPECjEnterprise2010 Driver that runs on client systems external to the SUT

Building and deploying the benchmark requires Ant 1.7.1, which is supplied as part of the SPECjEnterprise2010 kit.

3.1 Build and Deployment process

Several steps must be accomplished prior to running the SPECjEnterprise2010 benchmark:

  • Configure the environment
  • Create the SPECjEnterprise2010 database(s)
  • Build the .jar, .war, and .ear files
  • Load the database
  • Deploy the .ear files
  • Deploy the emulator
  • Test the SPECjEnterprise2010 deployment using the Web interface

3.1.1 Configure the benchmark environment(s)

In order for Ant to run correctly, you must have the JAVA_HOME and ANT_HOME environment variables set. If you extracted Ant using the extractant.sh script provided with the kit, you can use the following:
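For example, on a Unix-like system the variables might be set as follows. The paths shown are hypothetical placeholders (and assume extractant.sh unpacked Ant under KIT_HOME/ant); substitute your own installation locations:

```shell
# Hypothetical example paths -- adjust to your installation
export KIT_HOME=/opt/SPECjEnterprise2010
export JAVA_HOME=/usr/java/jdk1.5.0_22
# Location where extractant.sh unpacked Ant (assumed)
export ANT_HOME=$KIT_HOME/ant
export PATH=$ANT_HOME/bin:$PATH
```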


In the root directory of the benchmark kit, configure the build.properties file. The following properties need to be set for your environment:
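As a rough sketch, the entries select the vendor-specific directories described below. The property names and values shown here are hypothetical placeholders; use the names documented in the build.properties file shipped with the kit:

```properties
# Hypothetical entries -- select the vendor directories that match your setup
appserver=vendorX
database=vendorY
```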


The values you define dictate the directories used to find the vendor-specific Ant build targets. These directories are organized as follows:

  • KIT_HOME/appservers/APPSERVER/
  • KIT_HOME/databases/DATABASE/

Some of these directories and Ant build targets are provided for select products; alternatively, you can use the templates to create your own.

The instructions for modifying the KIT_HOME/faban/driver.properties will be provided in a later section.

3.1.2 Set initial state for benchmark kit

  • ant install
This target must be run to unpack various java archives and create a number of initial directories that are required for building, deploying and running the benchmark. Note: to return the benchmark kit to a clean initial state, you can at any time run the combination of:
  • ant clean.all
  • ant install
Note that this will return the configuration to its defaults and that any user configuration will be lost, e.g. KIT_HOME/build.properties will be returned to its default contents.

3.1.3 Create the Database(s)

For the benchmark, you can create a single database that houses all the domains, or the domains can be distributed across separate databases. Standard SQL scripts for creating the database schema are provided in schema/sql. These are intended as a starting point for creating schemas for other database products. It is beyond the scope of this user's guide to provide guidance for the installation and configuration of the various RDBMSs available on the market. For convenience, database creation Ant targets are provided for some databases in the databases/ directory. These are example scripts provided by the RDBMS vendors and can be modified, but serve as a starting place for deployment. However, see section 2.4 of the SPECjEnterprise2010 Run and Reporting Rules for restrictions on modifications to the schema scripts. Loading of the databases is deferred until the environment file and associated batch/script files have been updated in subsequent steps.

3.1.4 Building the Enterprise Archive (EAR) to Prepare for Deployment

Build the .JAR, .WAR and .EAR Files

The following commands, for both Unix and Windows, can be used to build the EJB JAR, Web Application aRchive (WAR) and Enterprise ARchive (EAR) files. We assume that the build environment has been configured as documented above. Only one Ant target needs to be called, as it will build the jar, the war, and the ear. The specj.ear target calls the vendor-specific appserver.specj.ear target, which allows vendors to replace or change, for example, XML descriptors delivered with the kit (if needed). The second command will create the emulator.ear.

  • ant specj.ear
    • Remark: calls vendor specific ant target appserver.specj.ear (if needed)
  • ant emulator.ear

These targets can be used to build a submission-compliant ear. Submission-compliant ears must use the class files as provided with the kit, so these targets do NOT recompile the source code. For research or other unofficial testing purposes, the code may be recompiled. Targets for this are provided:

  • ant specj.ear.withcompile
  • ant emulator.ear.withcompile

If any errors occur at this step, it is generally a setup or directory mismatch problem. Please review all settings carefully prior to contacting SPEC for support.

Create Database Tables

Reference schema scripts for the different domains can be found in the schema/sql directory of the benchmark kit.

Optionally, after setting up the configuration in the KIT_HOME/databases/$database directory according to your JDBC driver's documentation, you can call the following Ant target to set up the database in a vendor-specific way:

  • ant database.configure
    • Creates the tables using the schemas

Load the Database(s)

There are three different ways to populate the database tables:

  • web based database loader
  • standalone database loader
  • flat files generator (supported by the web based and standalone database loader)
The web based database loader uses the data sources of the benchmark application, so no additional configuration is needed. The standalone loader uses only a single data source; a distributed setup might therefore take longer, because the standalone loader may have to be called for each domain, populating all tables of the benchmark application each time. For faster loading, the flat files can be used with vendor-specific tools.

The injection rate is a measure of how much data will be loaded for use during the benchmark run. A higher injection rate equates to more client load, and therefore more data is created accordingly. Details regarding the mathematical relationship between injection rate and real table sizes can be found in the SPECjEnterprise2010 Run and Reporting Rules.

Web Based Database Loader

The web based database loader is located in the "Benchmark Management" section of the Web UI at http://host:port/specj. After specifying the txRate and the (maximum) parallelism, you can start loading the database by pressing the "Start" button. If you want to generate flat files (only), check "Generate Flat files ..." and choose a delimiter and directory before pressing "Start" (database tables are not deleted, not truncated and not loaded). Since loading the database happens in the background on the application server, the status can be refreshed by pressing the "Refresh" button. Loading can be stopped by pressing the "Cancel" button.

Database operations like deleting a table cannot be interrupted; hence deleting a huge table might take some time before stopping is successful. If an exception occurred, the stack trace is displayed in the status section. The application server log contains the same status as shown in the database loader UI. The Java Logging API is used, so the log level can be changed as provided by Java Logging.

Standalone Database Loader

The properties in the property file KIT_HOME/databases/$database/database.properties are used by the standalone loader to configure the data source:

  • database.driver
  • database.uri
  • database.user
  • database.password

The following properties in the property file KIT_HOME/build.properties control the loading process:

  • benchmark.txrate Injection rate (also used by benchmark driver)
  • loader.standalone.parallelism Max. number of threads used by loader
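Putting the two files together, a minimal loading configuration might look like the following sketch. The property names are those listed above; the driver class, URI, credentials, and numeric values are hypothetical placeholders for your environment:

```properties
# KIT_HOME/databases/$database/database.properties (hypothetical JDBC values)
database.driver=org.apache.derby.jdbc.ClientDriver
database.uri=jdbc:derby://dbhost:1527/specdb
database.user=spec
database.password=spec

# KIT_HOME/build.properties (loading controls, hypothetical values)
benchmark.txrate=10
loader.standalone.parallelism=8
```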

The following target starts loading:

  • ant load.database
    • Populates the tables directly in the database using the configured injection rate

Flat Files

Flat files can be generated by the web based database loader or by calling the following ant target:

  • ant load.flatfile
    • Creates flat files for loading the database

In the latter case the following properties of the property file KIT_HOME/build.properties should be configured:

  • loader.standalone.flatFileDelimiter Data written to the flat files are separated by a delimiter; by default it is "|".
  • loader.standalone.flatFileDirectory By default the flat files are written to KIT_HOME/target/dbflatfiles of the benchmark kit distribution.
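With the defaults described above, the corresponding build.properties entries would look roughly like this (the directory is shown here relative to KIT_HOME, which is an assumption; the kit may use an absolute path):

```properties
loader.standalone.flatFileDelimiter=|
loader.standalone.flatFileDirectory=target/dbflatfiles
```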

Database vendor-specific load scripts can be used to load these files into the database.

Deploy the Supplier Emulator

The Supplier Emulator must be deployed on a Java EE 5 Web Services compliant server on a separate machine outside of the defined SUT. A reference set of web services generated jars, created via the web services reference implementation's wsgen, is automatically included in the generated emulator.ear file. Your particular application server may require these to be regenerated. In many cases deploying the Supplier Emulator is as simple as deploying the generated emulator.ear file to the desired application server. Any other changes or requirements may be specific to the server implementation chosen.

3.1.5 Deployment Continued

The remaining steps to deploy the specj.ear file created in the target/jar directory depend on the Application Server you are using and whether or not you are using extended DDs provided by the vendor. Each Application Server is unique, and it is impossible to capture here the exact instructions to complete the deployment. Here is a list of steps required to complete the deployment if you are not using vendor-supplied DDs. Please refer to your Application Server documentation for help in accomplishing these steps:

  • Database table/column to JPA entity/field mapping
    • The JPA entities assume upper-case names for tables and columns. Per the SQL93 standard, the reference schema scripts are assumed to generate appropriate upper-case table and column names. However, if there are upper/lower-case issues or your database has limits on table or column name lengths, you can use the JPA orm.xml to change the mapping.

  • Optimizations
    • For example, the SPECjEnterprise run rules allow caching of Item entities. You might want to introduce optimizations that require changes to, for example, the persistence.xml.

  • Provide JNDI name mapping
    • Tells the Application Server what JNDI name to use for the reference beans as well as provides a mapping of reference names to their correct target bean.

  • Map JMS Queues
    • Like JNDI binding this step maps the JMS queue names to the underlying nomenclature used by the Application Server for the JMS resources defined in it.
    • A list of the required bindings:
      • LoaderMDB -> jms/LoaderQueue (administrative JMS messages for web based loader only)
      • ReceiveMDB -> jms/ReceiveQueue
      • LargeOrderMDB -> jms/LargeOrderQueue
      • FulfillOrderMDB -> jms/FulfillOrderQueue
      • BuyerMDB -> jms/BuyerQueue
        • purchaseOrderQueue property -> jms/PurchaseOrderQueue
        • queueConnFactory -> jms/BuyerQueueConnectionFactory
      • PurchaseOrderMDB -> jms/PurchaseOrderQueue
      • LargeOrderSenderSession
        • queue property -> jms/LargeOrderQueue
        • queueConnFactory property -> jms/LargeOrderQueueConnectionFactory
      • SupplierSession
        • queue property -> jms/ReceiveQueue
        • queueConnFactory property -> jms/SupplierQueueConnectionFactory
      • LoaderSession
        • loadQueue property -> jms/LoaderQueue
        • queueConnFactory property -> jms/LoaderQueueConnectionFactory
      • MessageSenderSession
        • fulfillQueue property -> jms/FulfillOrderQueue
        • buyerQueue property -> jms/BuyerQueue
        • fulfillOrderQueueConnFactory property -> jms/FulfillOrderQueueConnectionFactory
        • buyerQueueConnFactory property -> jms/BuyerQueueConnectionFactory

  • Provide JTA data sources for the JPA persistence units (named in parentheses) defined in META-INF/persistence.xml of specj.ear.
    • jdbc/SPECjOrderDS (Order)
    • jdbc/SPECjMfgDS (Mfg)
    • jdbc/SPECjSupplierDS (Supplier)
    • jdbc/SPECjLoaderDS (Loader) (administrative data for web based database loader only)
Those names are used by the web-based database loader to retrieve the JTA data source of the benchmark application.

While configuring the data sources, please comply with the SPECjEnterprise Run Rules, at least if you are preparing for a submission and are not using EAStress for research purposes. Remarks:

  • Typically the database isolation level is READ_COMMITTED because the minimum isolation of SPECjEnterprise (and JPA) is (strict) READ_COMMITTED.
  • The persistence units in persistence.xml are allowed to refer to the same data source, connecting to a single database instance that holds a schema containing the tables and data for all three SPECjEnterprise domains. For example, this could be achieved in a vendor-specific way by using the JTA data source names of persistence.xml as aliases for your data source. Before changing the names of the JTA data sources in persistence.xml, please consider that the web based loader uses the names as listed above.
  • More sophisticated setups use different data sources e.g. for connecting to different database instances. In this case each database instance might contain the data of a single domain only.
  • Loading data into different database instances:
    • When using the standalone database loader by calling the ant target load.database, please create and fully load all tables of all domains for each database instance, even though some tables and data might not be used. This is a limitation of the standalone database loader.
    • Since the web based loader uses the data source names as listed above to retrieve the data sources, it loads only the tables that are used by the corresponding domains.
    • Alternatively, use flat files to control manually which tables of the different database instances are loaded.
  • Faster loading with the web based database loader: the web based loader first checks whether data sources with the JNDI names jdbc/SPECj<domain>LoaderDS (<domain> = Order, Mfg, Supplier) are available and, if so, uses those instead. The idea is to allow data source configurations that differ from the configuration of the data sources used by the SPECjEnterprise application. This might result in faster loading.
  • When the persistent JMS messages are stored in a different location than the data of the application then it might be necessary to use XA-capable data sources to fulfill the SPECjEnterprise run rules. However in this case LAO (Last Agent Optimization) might be useful.

Section 4 - Running SPECjEnterprise2010

Running SPECjEnterprise2010 requires that the driver be configured for your environment and particular application server. Configuring the driver properly requires understanding how the driver works, which is described below.

4.1 The SPECjEnterprise2010 Driver

The SPECjEnterprise2010 Driver consists of several Java programs and is designed to run on multiple client systems, using an arbitrary number of JVMs to ensure that the Driver has no inherent scalability limitations. Note that none of the client systems are part of the SUT. SPECjEnterprise2010 uses the Faban testing harness to control the driver agents.

4.1.1 Driver Definitions

  • The Master machine
    • This is the machine from which you control the benchmark run. It is typically one of the client systems. The benchmark run can either be started using the faban harness web user interface, or using ant scripts to invoke the harness command line interface.
  • The Agent machines
    • These are the client systems on which the various workloads run. The Master machine can also be an Agent machine.

4.1.2 Driver Components

The Driver consists of manufacturing and dealer agents.

  • Manufacturing agents are client processes which interact with the SUT via Web Services and RMI. These interactions drive the car production portion of the benchmark, fulfilling order requirements with cars manufactured from the tracked inventory of components.
  • Dealer agents are client processes which interact with the SUT via web requests to a servlet- and JSP-based GUI. These interactions drive customer order creation and tracking, as well as inventory browsing.
The Driver consists of the following components:
  • The actual applications that implement the workload are defined in the specifications. These are OrderEntry, PlannedLine and LargeOrderLine.
  • There is one Agent type for each of OrderEntry and PlannedLine. The Agents are DealerAgent, and MfgAgent. You can configure as many DealerAgent and MfgAgent agents as you wish on any number of client systems.
  • There is one Controller which runs on the Master machine and with which all the Agents register.

4.1.3 How The Driver Works

The harness is based on Faban, the preferred means to schedule and automate benchmark runs. It also provides an embedded graphing tool that can map and compare run results. Additional information about the harness can be found at http://faban.sunsource.net. The harness runs as a web application hosted on a Tomcat server. It provides a Web user interface and acts as a container for the driver components of SPECjEnterprise2010. The Web user interface is used to schedule and manage runs, view run logs that update continuously at run time, and view the run results (including detailed graphs) once a run is finished. There is also a command line interface that can be used to access the same features.

The harness, the driver, and all agents communicate using RMI. The harness reads the run configuration and starts the master driver and the driver agents, distributing load to the driver systems as appropriate. The driver also reads the run configuration and configures the driver agents appropriately. Each agent then runs a number of threads of its respective workload; for example, the DealerAgent runs DealerEntry threads. The number of threads is determined by the scaling rules of the specification and is equally distributed among all driver agents. Each thread runs independently, executing the workload according to the rules defined in the specification.

When the run completes, the master driver coordinates with the agents to retrieve all the statistics and presents the user with the reports. The harness also post-processes the reports and presents several charts showing the behavior during the run and other statistical validations.

4.1.4 The Driver's InitialContext and Lookups

There are a few vendor-specific initial context configuration parameters [ic_config_1 ... ic_config_5] in the run.xml, which should be defined in the vendor-specific property file that serves as an override. These default to empty strings in the faban/driver.properties file.

4.1.5 The Web Services Configuration

Manufacturing Domain Web Services

The manufacturing agent uses Web Services to drive part of the createVehicle workload in the manufacturing domain. There are a few configuration parameters related to the WSDL location and the URL of the Web Service endpoint in the faban/driver.properties file, which are described below.

appserver.workorder.ws.uri -- this should be set to the URI of the Web Service endpoint. It currently defaults to "WorkOrderSessionService/WorkOrderSession", which means that if the benchmark is deployed on host lifeboat and port 8000, the Web Service is available at http://lifeboat:8000/WorkOrderSessionService/WorkOrderSession .
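For reference, the default described above would appear in faban/driver.properties as:

```properties
# Default endpoint URI; the driver prepends the SUT host and port
appserver.workorder.ws.uri=WorkOrderSessionService/WorkOrderSession
```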

workorder.web.services.wsdl.location -- this currently defaults to "http://ejb.workorderses.mfg.jent.spec.org/WorkOrderSessionService", and the driver uses the JAX-WS catalog service to find the physical location of the WSDL on the file system. The jax-ws-catalog.xml file is located under resources/driver/META-INF. As an alternative approach to finding the WSDL, the physical location of the WSDL file can also be specified for this property.

Buyer Domain and Emulator Web Services

The communication between the Buyer and the Emulator is based on web services. To configure the host and port information for each, add the appropriate overrides to the vendor-specific build.properties under appservers/[vendor]/build.properties.

buyer.host and buyer.port specify the host and port information for the buyer web service.

supplier.host and supplier.port specify the host and port information for the supplier web service.
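A sketch of the overrides in appservers/[vendor]/build.properties; the host names and port numbers shown are hypothetical:

```properties
# Hypothetical host/port values -- adjust to your deployment
buyer.host=sut1.example.com
buyer.port=8000
supplier.host=emulator1.example.com
supplier.port=8001
```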

Both of the Web Services configurations discussed above can be overridden in the harness user interface under the "Mfg Driver" tab.

4.2 Configuring and Running the Driver

4.2.1 Using the Faban Test Harness

Setting up the Faban Test Harness

By default the harness is installed in $KIT_HOME/faban/harness; this is done via the "install" Ant target (see section 3.1.2 above).

All the required binaries for the Faban harness are checked into the workspace under faban as faban-server.tar.gz. To install or re-install the harness on your system in a place other than the default location, you can:

  1. set the harness.install.dir property in driver.properties under faban;
  2. run the Ant target faban.harness.install.
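For example, the override in faban/driver.properties might look like this (the path is hypothetical):

```properties
# Hypothetical alternate install location
harness.install.dir=/export/faban
```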

You can start the harness (Tomcat) server using "ant faban.harness.start". At this point the harness has no deployed benchmark. For a deployed benchmark to work correctly, its client-side dependencies have to be satisfied. The client side depends on Java EE libraries such as javaee.jar and/or the Web Services runtime jars, etc.; some of these dependencies may be vendor-specific.

There are two ways these dependencies can be satisfied:

1. Bundle all the client-side jars into the jar that gets deployed on the harness

In this approach, the client-side dependencies are included in the jar file that gets deployed, and the harness is responsible for distributing these bits to the individual agent machines used to generate load. The jars to include are specified via the harness.lib.jar property, which is the path to the jar file containing all the dependencies that need to be bundled into specjdriverharness.jar. This jar file needs to be created once by each vendor and can be reused subsequently. When "ant faban.harness.jar" is invoked and the harness.lib.jar property is specified, the build process adds the appropriate jar files to specjdriverharness.jar.

Example of what needs to be in the harness.lib.jar file:

    • Web Services runtime bundle
    • ORB runtime bundle
    • javaee.jar
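The property itself is a single path to the bundle jar; a hypothetical example:

```properties
# Path to a jar bundling the client-side jars listed above (hypothetical)
harness.lib.jar=/opt/vendorx/lib/specj-client-deps.jar
```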

2. Use the classpath argument to the driver's JVM command

In this approach the client-side dependencies are satisfied by specifying the classpath to the individual jar files. It is the responsibility of the deployer to make sure that the same classpath is available to the drivers on each agent system. The classpath is specified in the faban.agent.classpath property of the build.xml under faban/ . This goes into the run.xml and can be overridden in the harness Web user interface.
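A hypothetical value for this property (the jar paths must exist at the same location on every agent machine):

```properties
# Hypothetical classpath entries -- substitute your vendor's client jars
faban.agent.classpath=/opt/vendorx/lib/javaee.jar:/opt/vendorx/lib/webservices-rt.jar
```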

Once one of the above approaches has been chosen to satisfy the client-side dependencies, deploy SPECjEnterprise2010 as follows:

  1. Run ant faban.harness.jar
  2. Run ant deploy-on-harness

Faban Harness User Interface

The Faban harness exposes its user interface through a web browser. Once the harness is set up, point your web browser to http://host:9980 to view the user interface. You will be asked to create a profile. Profiles allow you to schedule runs for multiple benchmarks from the same harness; at any time only one benchmark is run. The Faban harness in SPECjEnterprise2010 is completely integrated with the Manufacturing and Dealer drivers. Once you have created a profile, you can click on "Schedule Run" on the left-hand menu to start a run. Change the run parameters on the user interface as needed; these parameters are picked up from the run.xml and can be edited. Once edited, the run parameters persist across runs until you redeploy the benchmark with the "clear previous benchmark configuration" option selected.

There are currently five tabs available on the harness UI:

  • Java: allows you to specify the java executable for the driver and some JVM options, including endorsed directories and the agent classpath.
  • Driver: allows you to configure the run configuration common to the two drivers.
    • For running with multiple agents, potentially on remote systems, provide a space-separated list of hostnames in the hosts edit box. Also, for the individual drivers, change the number of agents appropriately.
  • Dealer Driver: Dealer domain specific configuration.
  • Manufacturing Driver: Manufacturing domain specific configuration. Vendor-specific IC configuration is covered in a note earlier in this guide. The supplier/buyer Web Services URLs can be specified here and are saved by the driver into the specj database before the run begins.
  • Servers: allows you to specify the SUT hostname and port, the tools to run during steady state, and whether the tools should be enabled. For example, if the tools entry in the "Application Server" subsection contains "iostat 10", the harness will collect I/O statistics on a Unix based system during steady state and make the report available at the end of the run.

Mousing over a text box displays descriptive text for that configuration parameter. Once all your edits are done, click the "Ok" button to start the run. The run can be monitored by clicking on the run id presented after the run has been submitted. Alternatively, you can navigate to the run using the "view results" menu and selecting the appropriate run id (arranged from latest to oldest). Once a run is completed, you can see the summary results by clicking on "Summary Result". To view the various graphs, click on the "Detailed Results" link (it appears shortly after the run completes). To review the run configuration, click on "Run Configuration". System statistics can be viewed under "Statistics"; to see I/O and CPU statistics you need to specify the "tools" under the "Driver" tab, for example on Solaris these can be specified as "iostat; mpstat", etc.

4.2.2 The Faban Harness Command Line Interface (faban cli)

The faban cli provides the same functionality as the harness web interface, from the command line. It is the recommended way of starting a benchmark run from the command line when using multiple agents on remote systems. The benchmark configuration is still picked up from the run.xml file. To run multiple agents, potentially on remote systems, make sure the number of agents under the driverConfig section is set appropriately and that the <host> element under <hostConfig> is a space-separated list of driver hosts. The following ant targets are provided for starting a run using the faban cli:
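
A minimal sketch of the relevant run.xml fragment, assuming the elements nest as named above (your generated run.xml may contain additional elements around these):

```xml
<hostConfig>
    <!-- Space-separated list of driver host names -->
    <host>driverhost1 driverhost2</host>
</hostConfig>
```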

  • faban.cli.run
  • faban.cli.killrun
The faban cli ant targets are, by default, meant to be used from the system on which the harness is installed. If the harness profile you would like to use is something other than "specj", make sure to change it accordingly.

4.2.3 Driver Configuration Tips

If no driver <host> is specified, the local host is used by default.

4.2.4 Live Statistics

Live statistics can be used to monitor whether the benchmark run is progressing according to expectations. To turn the feature on, set:

<runtimestats enabled="true">
    <interval>300</interval>
</runtimestats>

This results in run statistics being printed to the run log at 300-second intervals after the start of the run (starting when the log says "Ramp up started", i.e. including the ramp-up period). To get statistics at a different interval, change the interval value. The output format is explained using the example below:

INFO: 4500.00s - MfgDriver: CreateVehicleEJB/CreateVehicleWS CThru=15.533/13.867 OThru=14.821/14.822 CErr=0.000/0.000 CResp=0.018/0.015 OResp=0.010/0.013 CSD=0.034/0.022 OSD=0.014/0.009 C90%Resp=-/- O90%Resp=0.020/0.020

INFO: 4500.00s - DealerDriver: Purchase/Manage/Browse CThru=13.400/14.000/24.000 OThru=12.556/12.621/25.140 CErr=0.000/0.000/0.000 CResp=0.010/0.017/0.026 OResp=0.009/0.015/0.025 CSD=0.016/0.017/0.015 OSD=0.009/0.016/0.012 C90%Resp=-/-/- O90%Resp=0.020/0.030/0.040

These statistics are interpreted as follows: at 4500 seconds from the start of the run, the MfgDriver is reporting statistics for the two operations CreateVehicleEJB and CreateVehicleWS.

  • CThru - current throughput (as of the last interval period)
  • OThru - overall measured steady state throughput
  • CErr - current error rate (as of the last interval period)
  • CResp - current response time (average over the last interval period)
  • OResp - overall average steady state response time
  • CSD - current standard deviation of the response time (as of the last interval period). This number is interpreted in the context of the current response time and does not contribute to the overall standard deviation without taking the current response time into account.
  • OSD - overall standard deviation of the response time
  • C90%Resp - current 90th percentile response time (as of the last interval period)
  • O90%Resp - overall steady state 90th percentile response time

So, for example, the CreateVehicleEJB operation's current throughput over the last collection interval of 300 seconds was 15.533 transactions/sec, and the CreateVehicleWS operation's current throughput was 13.867 transactions/sec.
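
These log lines are plain text, so they can be post-processed with a small utility. The following sketch (not part of the kit, shown only to illustrate the field layout) extracts the per-operation current throughput (CThru) values from one such line:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LiveStatsParser {
    /**
     * Returns a map of operation name to current throughput (CThru)
     * for one live-statistics log line of the format shown above.
     */
    public static Map<String, Double> parseCThru(String line) {
        String[] tokens = line.trim().split("\\s+");
        String[] ops = null;
        String[] cthru = null;
        for (int i = 0; i < tokens.length; i++) {
            // The token after the last "<DriverName>:" token lists the
            // operation names, separated by '/'.
            if (tokens[i].endsWith(":") && i + 1 < tokens.length) {
                ops = tokens[i + 1].split("/");
            } else if (tokens[i].startsWith("CThru=")) {
                cthru = tokens[i].substring("CThru=".length()).split("/");
            }
        }
        Map<String, Double> result = new LinkedHashMap<>();
        if (ops != null && cthru != null) {
            for (int i = 0; i < ops.length && i < cthru.length; i++) {
                result.put(ops[i], Double.parseDouble(cthru[i]));
            }
        }
        return result;
    }
}
```

Fed the MfgDriver line above, this yields 15.533 for CreateVehicleEJB and 13.867 for CreateVehicleWS.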

4.2.5 Logging Configuration

By default, SPECjEnterprise2010 comes with logging pre-configured at level INFO. To change the log level, edit the file ${harness.install.dir}/config/logging.properties. Log levels are defined by the Java logging facilities, and are as follows: SEVERE, WARNING, INFO, CONFIG, FINE, FINER, FINEST. Setting the log level to FINER or FINEST for the whole system would generate a large amount of output; it is recommended to enable such logging only for the specific subsystems that need more detail. For example, to enable logging at level FINER for the DealerDriver, add the following line to logging.properties:

org.spec.jent.driver.DealerDriver.level = FINER

For further information on logging configuration and the logging.properties file format, please refer to the Java Logging Overview at http://java.sun.com/javase/6/docs/technotes/guides/logging/overview.html.

4.2.6 Auditing

During auditing, the driver executes some tests before the workload is started to verify that a few important run rules are met. If a test fails, the run is stopped.

For testing purposes, auditing can be skipped by setting the audit flag in faban/run.xml.template to false. Alternatively, setting the stopIfAuditFailed flag in faban/run.xml.template to false allows the run to continue even if auditing fails.
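
As a sketch, the two flags mentioned above might appear in faban/run.xml.template roughly as follows; the flag names come from the text above, but the element structure and nesting are assumptions, so verify them against your kit's template:

```xml
<!-- Hypothetical fragment: element structure assumed, not taken from the kit. -->
<audit>false</audit>
<stopIfAuditFailed>false</stopIfAuditFailed>
```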

4.2.7 Troubleshooting

To get a valid run, reload the database and restart your application server before starting the driver.

The database has to be reloaded because the driver executes some checks during auditing, including checks of table row counts.

The reasons for restarting the application server are as follows:

  • The txRate is stored in a static member variable which is initialized once, when its value is null. For technical reasons it is not reinitialized automatically before each run.
  • The database is loaded using fast techniques such as flat files, which do not use JPA. Hence, id values cached by a JPA TableGenerator could be reused, which leads to duplicate key errors on the database.

5 Results Submission

The Submission File contains a detailed description of the SUT in text format that is used by the SPECjEnterprise2010 reporter to produce a report suitable for publication.

5.1 Creating a Submission File

To create a SPECjEnterprise2010 Submission File:

  • Start with the Sample Submission.txt file provided in the reporter/sample directory.

  • Edit the fields to describe the details of your SUT. A detailed description of each field and example are provided in the sample file.

  • Supply this file to the report generation scripts as described below. The contents of this file will be included in the final submission properties file.

5.2 Generating a Report

SPECjEnterprise2010 includes a utility, called the reporter, for generating a results page. The report contains a summary of the key results and a complete system description in a format designed to be consistent with other SPEC reporting pages. In addition to the submission file, a sample report is generated which simulates the appearance of the report on the SPEC results website.

  • ant reporter.run: runs the reporter using the latest run results from either the harness or the ant driver. The reporter writes its output to the currently configured output directory.

To run the reporter by hand, use the following command:

$ java -classpath reporter.jar reporter [-a] [-r] <submission file>.txt* <result output directory>


  • <submission file>.txt is the Submission File created in the previous step.
  • <result output directory> is the path to the results directory for which a report should be generated, not including a trailing separator.


  • Sample report
    • An HTML file named <filename>.report.html is created by default. The "-a" option creates a text report page named <filename>.report.txt, and the "-r" option writes the output to the result output directory.

  • Submission properties file
    • This file is to be submitted via the SPEC submission process outlined below.

5.3 Submitting the Results

Once you have a successful run, you can submit the results to the SPEC OSG Java subcommittee for review. To submit a result to SPEC:

  • Create a configuration diagram (in PNG, JPEG or GIF format) for your SUT (see SPECjEnterprise2010 Run and Reporting Rules section 5.2 for a description of the Configuration Diagram).

  • Create a Full Disclosure Archive (in ZIP, TAR, or JAR format) with the results of your run (see SPECjEnterprise2010 Run and Reporting Rules section 5.3 for a list of the items in the Archive).

  • Create a Submission File as described in Sections 5.1 and 5.2 above. The Submission File should contain the file names of the Configuration Diagram and the Archive.

  • Put just these three files in a JAR file.
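
As a sketch, the packaging step might look like the following; the file names are placeholders for your actual submission file, configuration diagram, and archive:

```
# Hypothetical file names; substitute your own three files.
jar cf SPECjEnterprise2010-submission.jar submission.txt config-diagram.png fda.zip
```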

Every submission goes through a minimum two-week review process, starting on a scheduled SPEC OSG Java sub-committee conference call. During the review, members of the committee may ask for additional information or clarification of the submission. Once the result has been reviewed and accepted by the committee, it is displayed on the SPEC web site at http://www.spec.org/.

Appendix A - ant targets

The following is a list of commonly used ant targets and a brief description of their purpose.

Unless otherwise noted – KIT_HOME refers to the location where the SPECjEnterprise2010 Kit was extracted.

Target | Location | Description
all | KIT_HOME/build.xml | Generates the emulator.ear, specj.ear, and driver.jar files
appserver.specj.ear | KIT_HOME/appservers/appserver_type/build.xml | Runs any application server specific tooling on the precompiled and packaged specj.ear file
clean.all | KIT_HOME/build.xml | Removes all output generated by the build; for example, all jars in the target/jars directory are deleted
clean.classes | KIT_HOME/build.xml | Removes all classes generated by the build
database.configure | KIT_HOME/database/db_type/build.xml | Configures the database by creating the tables
deploy-on-harness | KIT_HOME/faban/build.xml | Deploys the driver on the Faban harness
driver.jar | KIT_HOME/faban/build.xml | Generates the driver.jar file
driver.run | KIT_HOME/faban/build.xml | Runs the driver outside the Faban harness by starting the Faban registry, the driver agents, and then the Faban master
emulator.ear | KIT_HOME/build.xml | Generates the emulator.ear file from existing jar files
emulator.ear.withcompile | KIT_HOME/build.xml | Generates the emulator.ear file after compiling any required classes
extract.faban.libs | KIT_HOME/faban/build.xml | Extracts the lib directory from the Faban server kit to the faban/lib directory
faban.cli.killrun | KIT_HOME/faban/build.xml | Stops the benchmark using the faban command line interface
faban.cli.run | KIT_HOME/faban/build.xml | Starts the benchmark using the faban command line interface
faban.harness.install | KIT_HOME/faban/build.xml | Installs the Faban harness on the current system
faban.harness.start | KIT_HOME/faban/build.xml | Starts the Faban harness on the current system
faban.harness.stop | KIT_HOME/faban/build.xml | Stops the Faban harness on the current system
generate.buyer-supplier.service | KIT_HOME/build.xml | Generates the buyer and supplier web service jar files
generate.workorder.service | KIT_HOME/build.xml | Regenerates the WSDL for the work order web service
install | KIT_HOME/build.xml | Finishes installation; can also be used to retrieve the original class, jar, and ear files delivered with the kit
kit | KIT_HOME/build.xml | Creates a jar that contains all of the bits needed to run the benchmark
load.database | KIT_HOME/build.xml | Loads the database using the standalone database loader
load.flatfiles | KIT_HOME/build.xml | Creates flat files to enable loading the database for the currently configured workload
print-class-path | KIT_HOME/build.xml | Prints the classpath of the kit
reporter.jar | KIT_HOME/build.xml | Creates the reporter.jar file in the target/jar directory
reporter.run | KIT_HOME/build.xml | Runs the reporter using the results from the last run (driver or harness)
specj.ear | KIT_HOME/build.xml | Generates the specj.ear file from existing jar files
specj.ear.withcompile | KIT_HOME/build.xml | Generates the specj.ear file after compiling any required classes
specj.jar | KIT_HOME/build.xml | Creates the specj.jar file in the target/jar directory
specj.war | KIT_HOME/build.xml | Generates the specj.war file
supplier.war | KIT_HOME/build.xml | Generates the supplier.war file

Product and service names mentioned herein may be the trademarks of their respective owners.

Copyright © 2001-2012 Standard Performance Evaluation Corporation
All Rights Reserved