Scenario: Developing performance tests to measure processor utilization

Company XYZ plans to deploy Maximo® Asset Management with extensive customization. To ensure a successful deployment, Company XYZ develops and runs performance tests.

Background

Company XYZ plans to use Maximo Asset Management for asset management, purchasing, and work order tracking. Because of specific business processes, Company XYZ has a customized deployment that uses automated workflows. The users in Company XYZ use the following applications:
  • Assets
  • Purchase Requisitions
  • Purchase Orders
  • Work Order Tracking

Step One: Determine the objectives to measure

The deployment team at Company XYZ considers the key business questions and prioritizes the risks, rewards, and costs of the deployment. Based on research, the deployment team determines that users are intolerant of web application transactions that take too long to respond. Interviews with focus groups show that users become frustrated when a transaction takes more than 2 seconds to respond.

The server management team at Company XYZ determines that if processor utilization remains below 80% for a target user load on the system, then the processor can provide adequate resources for applications to function at the required level. This threshold also leaves headroom for occasional spikes in processing, such as during month-end processing, without affecting response times. Based on the size of the company, the server management team identifies 950 users as the target concurrent load.
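
The following sketch, written in Python, shows one way to express these objectives as a simple pass/fail check. The threshold values come from this scenario; the variable and function names are illustrative only and are not part of any Maximo Asset Management interface.

MAX_RESPONSE_TIME_SECONDS = 2.0    # users become frustrated above 2 seconds
MAX_CPU_UTILIZATION_PERCENT = 80   # leaves headroom for spikes such as month-end processing
TARGET_CONCURRENT_USERS = 950      # target concurrent load for Company XYZ

def objectives_met(avg_response_time_s, cpu_utilization_pct, concurrent_users):
    """Return True if a measurement at the target load meets both objectives."""
    return (concurrent_users >= TARGET_CONCURRENT_USERS
            and avg_response_time_s <= MAX_RESPONSE_TIME_SECONDS
            and cpu_utilization_pct <= MAX_CPU_UTILIZATION_PERCENT)

# Example measurement taken at the target load
print(objectives_met(avg_response_time_s=1.8, cpu_utilization_pct=76.5,
                     concurrent_users=950))   # prints True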

Step Two: Develop use cases

The deployment team considers how users behave throughout the day. The team identifies that users generally log in after they arrive in the morning. Users typically complete a set of work activities and then log out. The deployment team estimates that each user completes a use case approximately 20 times per hour.

To approximate the login and logout behavior, the deployment team plans to create use cases in which automated test users log in, run six iterations of a use case, and then log out. The automated test users then log in again and repeat the cycle. A pause of 5 to 10 seconds is incorporated into steps in the scripts to represent actual user processing rates.
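
The following Python sketch outlines the pacing logic of that cycle. The login, use case, and logout actions are placeholders for whatever the performance test tool records; only the session structure and think times from this scenario are shown.

import random
import time

ITERATIONS_PER_SESSION = 6    # use case iterations between login and logout
THINK_TIME_RANGE = (5, 10)    # pause, in seconds, to represent user processing rates

def think():
    # Pause between steps to approximate a real user's processing rate.
    time.sleep(random.uniform(*THINK_TIME_RANGE))

def virtual_user_session(login, run_use_case, logout):
    # One login/logout cycle; the test tool repeats the cycle for the
    # duration of the test run.
    login()
    for _ in range(ITERATIONS_PER_SESSION):
        run_use_case()
        think()
    logout()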

The deployment team identifies the use cases that are required to test the deployment. The team also assigns a weight factor to each use case. The weight factor represents the proportion of automated users that run each use case; a sketch after the table shows how the weights translate into virtual-user counts.
Table 1. Use cases identified for testing at Company XYZ
Use case identifier Description Weight factor
AS01 Search assets and review safety information. 20%
PO01 Create a purchase requisition, and then create a purchase order from the purchase requisition. 5%
PO02 Change the status of a purchase order to In Progress. 5%
PO03 Receive a purchase order line item. 5%
PO04 Close a purchase order. 5%
WF01 Create a work order and route the work order through the Workflow application. 12%
WF02 View a work order and route the work order through the Workflow application for approval. 12%
WF03 Issue an item on a work order and route the work order through the Workflow application. 12%
WF04 Add labor to a work order and route the work order through the Workflow application for completion. 12%
WF05 Route a work order through the Workflow application for closure. 12%
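
The following Python sketch illustrates how the weight factors translate into virtual-user counts at the target load of 950 concurrent users. The method of assigning users that are lost to rounding is an assumption; the actual distribution depends on the performance test tool.

TARGET_CONCURRENT_USERS = 950

WEIGHTS = {
    "AS01": 20,
    "PO01": 5, "PO02": 5, "PO03": 5, "PO04": 5,
    "WF01": 12, "WF02": 12, "WF03": 12, "WF04": 12, "WF05": 12,
}

# Distribute the target load according to the weight factors.
users = {uc: TARGET_CONCURRENT_USERS * weight // 100 for uc, weight in WEIGHTS.items()}

# Give any users lost to rounding to the most heavily weighted use case
# so that the counts still sum to the target load.
users[max(WEIGHTS, key=WEIGHTS.get)] += TARGET_CONCURRENT_USERS - sum(users.values())

for uc, count in users.items():
    print(f"{uc}: {count} virtual users")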

Step Three: Develop tests

The deployment team writes each use case into a test case. Each test case lists the steps that are required to run the test. The following table provides an example of the test case for use case AS01, which searches for assets and then reviews safety information. A sketch of how this test case might be scripted follows the table.
Table 2. Example test case to search for assets and review safety information
Transaction Description Expected result
AS01_01_D_Launch Start Maximo Asset Management. The Welcome to Maximo screen is shown.
AS01_02_D_Logon Enter the user name ASSET0001 and the password maxasset. Click Sign In. The Start Center is shown.
Begin loop for multiple work items.
AS01_03_D_GoTo Click Go To. The Go To menu is shown.
AS01_04_D_LaunchAssets Select Assets > Assets. The Assets application is shown.
AS01_05_D_EnterAssetPrefix In the Asset field, enter CAC and press the Tab key. The background for the Asset field changes to white. The cursor moves to the next field.
AS01_06_D_FindAsset Click the Filter Table icon. A list of all assets that have CAC in their names is shown.
Loop up to 9 times to select a random page of data.
AS01_07_D_NextPage Click the Next Page icon. The next page of asset results is shown.
End loop for page data.
AS01_08_D_SelectAsset Select a random asset number. The details for the selected asset are shown.
AS01_09_D_TabSafety Select the Safety tab. The Safety tab is shown.
AS01_10_D_ReturnToStartCenter Click Start Center. The Start Center is shown.
End loop for multiple work items.  
AS01_11_D_Logoff Click Sign out. The logout is completed. The Welcome to Maximo screen is shown.
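
The following Python sketch shows how a test case such as AS01 might be scripted so that each step from Table 2 is timed as a named transaction. The browser object and its methods are hypothetical placeholders for the actions that the performance test tool records; only the transaction timing structure is shown, and several intermediate transactions are omitted.

import time
from contextlib import contextmanager

results = []   # list of (transaction_name, elapsed_seconds) pairs

@contextmanager
def transaction(name):
    # Time one test-case step under its transaction name so that response
    # times can be reported per transaction.
    start = time.perf_counter()
    yield
    results.append((name, time.perf_counter() - start))

def run_as01(browser, work_items=6):
    with transaction("AS01_01_D_Launch"):
        browser.launch_maximo()
    with transaction("AS01_02_D_Logon"):
        browser.sign_in("ASSET0001", "maxasset")
    for _ in range(work_items):                  # loop for multiple work items
        with transaction("AS01_03_D_GoTo"):
            browser.open_go_to_menu()
        with transaction("AS01_04_D_LaunchAssets"):
            browser.select_application("Assets > Assets")
        # ...the remaining AS01 transactions follow the same pattern...
        with transaction("AS01_10_D_ReturnToStartCenter"):
            browser.click("Start Center")
    with transaction("AS01_11_D_Logoff"):
        browser.sign_out()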

Step Four: Define the test environment

In the initial planning stages, the deployment team discusses whether the cost of a test environment that is identical to the production environment is a justifiable expense. In the end, the deployment team decides that the risks of an inadequate test environment outweigh any potential cost savings. Therefore, the test environment at Company XYZ is an exact duplicate of the production environment.

In preparation for deployment, existing data from the system that Maximo Asset Management is scheduled to replace is migrated into the test environment. The migration verifies that the team can move the existing data and also provides a database with a realistic volume and structure of data, which is then used for performance testing.

After the initial deployment into production, the team can test additional modifications in the test environment. The similarities between the test and production environments provide a high degree of confidence that similar results can be achieved when the additional modifications are moved to the production environment.

Step Five: Run tests

The deployment team can now record the test for the example test case and repeat the process to develop tests for all of the use cases. The deployment team uses a performance test tool to record the tests. After all tests are recorded and debugged, the deployment team runs the tests.

To learn how the system operates at different load levels, the deployment team begins the test with 750 concurrent users. The user load is increased by an additional 50 users after each 30-minute interval. The increase is repeated until 950 concurrent users are on the system. The test is configured to log in one virtual user every 2000 milliseconds until all of the users at each load level are logged in. This gradual login rate prevents the extra processing that is required to increase the load from distorting the results.
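
The following Python sketch computes the stepped-load schedule that is described above. The timing values come from this scenario; how a particular performance test tool expresses the schedule varies, so the code is only an illustration.

INITIAL_USERS = 750          # starting concurrent load
STEP_USERS = 50              # users added at each step
TARGET_USERS = 950           # final concurrent load
STEP_DURATION_MINUTES = 30   # time spent at each load level
LOGIN_INTERVAL_MS = 2000     # one virtual user logs in every 2000 milliseconds

elapsed = 0
for load in range(INITIAL_USERS, TARGET_USERS + 1, STEP_USERS):
    new_users = INITIAL_USERS if load == INITIAL_USERS else STEP_USERS
    ramp_seconds = new_users * LOGIN_INTERVAL_MS / 1000
    print(f"t={elapsed:>3} min: ramp to {load} users "
          f"({new_users} logins over {ramp_seconds:.0f} seconds), "
          f"then hold for {STEP_DURATION_MINUTES} minutes")
    elapsed += STEP_DURATION_MINUTES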

Step Six: Analyze test results

After the tests are run and the data is compiled, the deployment team extracts the response time and processor utilization results into a spreadsheet. The team can generate a summary chart to identify whether the performance criteria are met. The following chart shows an example of utilization results:

Figure 1. Example of performance test results for processor utilization
The image is described in the main body of the text.

In the results chart, the average response times are under 2 seconds. The processor utilization on the database server remains below 80% at the target load of 950 concurrent users. However, the application server processor exceeds 80% utilization with a load of 850 concurrent users. Therefore, the performance test criteria were not met.
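
A check of the criteria can also be automated against the extracted results. The following Python sketch assumes a hypothetical export file named test_results.csv with one row per load level; the file name and column names are illustrative only.

import csv

MAX_RESPONSE_TIME_SECONDS = 2.0
MAX_CPU_UTILIZATION_PERCENT = 80

with open("test_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        failures = []
        if float(row["avg_response_time_s"]) > MAX_RESPONSE_TIME_SECONDS:
            failures.append("response time")
        if float(row["app_server_cpu_pct"]) > MAX_CPU_UTILIZATION_PERCENT:
            failures.append("application server processor")
        if float(row["db_server_cpu_pct"]) > MAX_CPU_UTILIZATION_PERCENT:
            failures.append("database server processor")
        status = "PASS" if not failures else "FAIL: " + ", ".join(failures)
        print(f"{row['concurrent_users']} users: {status}")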

The deployment team must investigate to determine whether the excessive processor utilization can be resolved by tuning performance-related settings or by changing the automated workflows. The deployment team can also decide whether additional processor resources are required to meet the performance criteria in the production deployment.


