Saturday 8 June 2013

Case Studies – Identifying Performance-testing Objectives

Case Study 1
Scenario

A 40-year-old financial services company with 3,000 employees is implementing its annual Enterprise Resource Planning (ERP) software upgrade, including new production hardware. The last upgrade resulted in disappointing performance and many months of tuning in production.

Performance Objectives

The performance-testing effort was based on the following overall performance objectives:
  • Ensure that the new production hardware is no slower than the previous release. 
  • Determine configuration settings for the new production hardware. 
  • Tune customizations. 

Performance Budget/Constraints

The following budget limitations constrained the performance-testing effort (a monitoring sketch follows the list):
  • No server should have sustained processor utilization above 80 percent under any anticipated load. (Threshold) 
  • No single requested report is permitted to lock more than 20 MB of RAM or consume more than 15 percent of processor capacity on the Data Cube Server. 
  • No combination of requested reports is permitted to lock more than 100 MB of RAM or consume more than 50 percent of processor capacity on the Data Cube Server at one time. 
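
How these thresholds might be checked is worth pinning down early. Below is a minimal monitoring sketch in Python using the psutil library (the choice of Python/psutil is ours, not the case study's); the 80 percent ceiling comes from the budget above, while the one-second sample interval and the 30-sample definition of "sustained" are illustrative assumptions:

    import psutil

    CPU_CEILING = 80.0      # budget: no sustained utilization above 80 percent
    SUSTAINED_SAMPLES = 30  # assumption: "sustained" = 30 consecutive 1-second samples

    def watch_cpu(duration_seconds: int) -> int:
        """Sample CPU once per second and count sustained budget breaches."""
        streak = 0
        breaches = 0
        for _ in range(duration_seconds):
            usage = psutil.cpu_percent(interval=1)  # utilization over the last second
            streak = streak + 1 if usage > CPU_CEILING else 0
            if streak == SUSTAINED_SAMPLES:  # one full sustained window over budget
                breaches += 1
                streak = 0
        return breaches

    if __name__ == "__main__":
        print("Sustained CPU budget violations:", watch_cpu(duration_seconds=600))

Running this on each server while the load scenario executes gives a simple pass/fail signal for the threshold constraint; a nonzero count means the budget was not met.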

Performance-Testing Objectives

The following priority objectives focused the performance testing:
  • Verify that there is no performance degradation over the previous release (a comparison sketch follows this list). 
  • Identify the ideal configuration for the application in terms of response time, throughput, and resource utilization. 
  • Resolve the existing performance inadequacy of the Data Cube Server. 
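
The first objective only becomes testable once "no degradation" is quantified. One way to do that is to compare response-time samples captured on the old and new hardware; in this sketch the 5 percent tolerance, the sample values, and the transaction they represent are illustrative assumptions, not figures from the case study:

    from statistics import mean

    TOLERANCE = 0.05  # assumption: up to 5 percent slower still counts as "no slower"

    def no_degradation(previous: list[float], current: list[float]) -> bool:
        """Compare mean response times (seconds) of the old and new releases."""
        return mean(current) <= mean(previous) * (1 + TOLERANCE)

    # Hypothetical timings for one ERP transaction on each release
    old_release = [1.2, 1.4, 1.1, 1.3]
    new_release = [1.3, 1.2, 1.2, 1.4]
    print("No degradation:", no_degradation(old_release, new_release))

Medians or high percentiles could be substituted for the mean if outliers dominate; the point is that the comparison rule is agreed on before the test runs.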

Questions

The following questions helped to determine relevant testing objectives:
  1. What is the reason for deciding to test performance? 
  2. In terms of performance, what issues concern you most in relation to the upgrade? 
  3. Why are you concerned about the Data Cube Server? 

Case Study 2
Scenario

A financial institution with 4,000 users distributed among the central headquarters and several branch offices is experiencing performance problems with business applications that deal with loan processing.
Six major business operations have been affected by problems related to slowness, high resource consumption, and error rates identified by the company’s IT group. The consumption issue is due to high processor usage on the database server, while the errors are related to database queries that raise exceptions.

Performance Objectives

The performance-testing effort was based on the following overall performance objectives:
  • The system must support all users in the central headquarters and branch offices who use the system during peak business hours. 
  • The system must complete backups within the smallest possible timeframe. 
  • Database queries should be optimal, resulting in processor utilization no higher than 50-75 percent during normal and peak business activities. 
Performance Budget/Constraints

The following budget limitations constrained the performance-testing effort:
  • No server should have sustained processor utilization above 75 percent under any anticipated load (normal and peak) when users in headquarters and branch offices are using the system. (Threshold) 
  • When system backups are being performed, the response times of business operations should not degrade by more than 8 percent compared with the response times experienced when no backup is being done (a budget-check sketch follows this list). 
  • Response times for all business operations during normal and peak load should not exceed 6 seconds. 
  • No errors that could result in the loss of user-submitted loan applications are allowable during database transaction activity. 
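
The 8 percent and 6 second figures above can be checked mechanically once response times have been collected with and without a backup in progress. A minimal sketch, assuming timings arrive as plain lists of seconds (the use of medians here is our choice, not a requirement from the case study):

    from statistics import median

    MAX_DEGRADATION = 0.08      # budget: at most 8 percent slower during backups
    MAX_RESPONSE_SECONDS = 6.0  # budget: 6-second ceiling at normal and peak load

    def backup_budget_met(baseline: list[float], during_backup: list[float]) -> bool:
        """Check both backup-related budget items for one business operation."""
        within_degradation = (
            median(during_backup) <= median(baseline) * (1 + MAX_DEGRADATION)
        )
        within_ceiling = max(during_backup) <= MAX_RESPONSE_SECONDS
        return within_degradation and within_ceiling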
Performance-Testing Objectives

The following priority objectives focused the performance testing:
  • Help to optimize the loan-processing applications to ensure that the system meets stated business requirements. 
  • Test for 100-percent coverage of all six business processes affected by the loan-processing applications. 
  • Target database queries that were confirmed to be extremely sub-optimal, with improper hints and nested sub-query hashing. 
  • Help to remove superfluous database queries in order to minimize transactional cost. 
  • Tests should monitor the relevant component metrics: end-user response time, error rate, database transactions per second, and overall processor, memory, network, and disk status for the database server (a sampling sketch follows this list). 
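
A sketch of how the server-side half of that metric list might be sampled, again with psutil (end-user response time, error rate, and transactions per second come from the load-testing tool itself, so only the resource counters are shown; once-per-second sampling is an assumption):

    import psutil

    def sample_db_server() -> dict:
        """One snapshot of the resource metrics named in the objectives."""
        return {
            "cpu_percent": psutil.cpu_percent(interval=1),
            "memory_percent": psutil.virtual_memory().percent,
            "disk_io": psutil.disk_io_counters()._asdict(),
            "network_io": psutil.net_io_counters()._asdict(),
        }

    if __name__ == "__main__":
        for _ in range(5):  # a few one-second samples for demonstration
            print(sample_db_server())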
Questions

The following questions helped to determine relevant testing objectives:
  1. What is the reason for deciding to test performance? 
  2. In terms of performance, what issues concern you most in relation to the queries that may be causing processor bottlenecks and transactional errors? 
  3. What business cases related to the queries might be causing processor and transactional errors? 
  4. What database backup operations might affect performance during business operations? 
  5. What are the timeframes for backup procedures that might affect business operations, and what are the most critical scenarios within that timeframe? 
  6. How many users are there, and where are they located (headquarters, branch offices), during times of critical business operations? 

These questions helped performance testers identify the most important concerns in order to help prioritize testing efforts. The questions also helped determine what information to include in conversations and reports.

Case Study 3
Scenario

A Web site is responsible for conducting online surveys with 2 million users in a one-hour timeframe. The site infrastructure was built with wide area network (WAN) links all over the world. The site administrators want to test the site’s performance to ensure that it can sustain 2 million user visits in one hour. 

Performance Objectives

The performance-testing effort was based on the following overall performance objectives:
  • The Web site must be able to support a peak load of 2 million user visits in a one-hour timeframe. 
  • Survey submissions must not be compromised by application errors. 

Performance Budget/Constraints

The following budget limitations constrained the performance-testing effort (a reconciliation sketch follows the list):
  • No server can have sustained processor utilization above 75 percent under any anticipated load (normal and peak) during submission of surveys (2 million at peak load). 
  • Response times for all survey submissions must not exceed 8 seconds during normal and peak loads. 
  • No survey submissions can be lost due to application errors. 
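
The "no lost submissions" constraint suggests a reconciliation step after each test run. A minimal sketch, assuming each scripted submission carries a unique ID that can later be queried back out of the application's data store (both ID sets are assumptions about how the test harness is built):

    def verify_no_lost_submissions(submitted: set[str], stored: set[str]) -> None:
        """Budget check: every submission sent during the test must be stored."""
        missing = submitted - stored
        assert not missing, f"{len(missing)} submissions lost, e.g. {sorted(missing)[:5]}"

    # Hypothetical IDs recorded by the load script and read back from the database
    verify_no_lost_submissions({"s1", "s2", "s3"}, {"s1", "s2", "s3"})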

Performance-Testing Objectives

The following priority objectives focused the performance testing:
  • Simulate a single scripted user transaction with 2 million total virtual users in one hour, distributed between two datacenters with 1 million active users at each (a load-profile sketch follows this list). 
  • Simulate the peak load of 2 million user visits in a one-hour period. 
  • Test for 100-percent coverage of all survey types. 
  • Monitor for relevant component metrics: end-user response time, error rate, database transactions per second, and overall processor, memory, network, and disk status for the database server. 
  • Test the error rate to determine the reliability metrics of the survey system. 
  • Test by using firewall and load-balancing configurations.
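
For scale, 2 million visits in one hour works out to roughly 556 visits per second overall, or about 278 per second generated from each datacenter. A sketch of a load script for this scenario using the open-source Locust tool (our tool choice; the /survey path, payload, and 1-5 second think time are placeholders, not details from the case study):

    from locust import HttpUser, task, between

    # Target: 2,000,000 visits / 3,600 s ≈ 556 visits/s overall,
    # i.e. roughly 278 visits/s from each of the two datacenters.

    class SurveyUser(HttpUser):
        wait_time = between(1, 5)  # placeholder think time between actions

        @task
        def submit_survey(self):
            # /survey is a placeholder path; the real endpoint comes from the app
            self.client.post("/survey", json={"survey_id": 1, "answers": {"q1": "yes"}})

Each datacenter would run its own Locust workers pointed at the nearest entry point, so the WAN links named in the scenario are exercised as well.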

Questions

The following questions helped to determine relevant testing objectives:
  1. What is the reason for deciding to test performance? 
  2. In terms of performance, what issues concern you most in relation to survey submissions that might cause data loss or user abandonment due to slow response time? 
  3. What types of submissions need to be simulated for surveys related to business requirements? 
  4. Where are the users located geographically when submitting the surveys?