Collecting performance data from a performance test

You may already have defined a test that exercises your Web application, using Rational® Performance Tester tools. For information on creating tests with Rational Performance Tester, see Creating tests and Editing tests.

Usually, the test passes (that is, it does not fail because of a bug), but performance is poor, either in the test environment or as reported by users in a production environment. In that case, you want to collect performance data by using the performance and problem analysis tools.

Prerequisites:

First, you must specify that your test should generate the ARM monitoring data that will be collected and analyzed:

  1. Open the test in the Performance Test editor.
  2. Select the test element that exercises the part of the application that is experiencing the performance problem.
  3. On the Test Element Details page, select the Enable ARM monitoring check box.
  4. Repeat the previous two steps for each test element for which you want to gather data. Notes:
    • Click the Parent link to move to the Test Element Details page for the parent of the current test element.
    • If many tests have poor performance, go to a root test node and select Enable ARM monitoring for it; this will enable ARM monitoring for all of that node's children.
  5. Save the changes.

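The data collection infrastructure generates the ARM data for you; you do not add any instrumentation code to the application or to the test. Purely as a rough illustration of what each ARM-monitored test element contributes, the following Java sketch (the class names, fields, and URL are hypothetical and are not part of the Rational Performance Tester or ARM 4.0 APIs) models a transaction record with start and stop timestamps and a completion status, from which a response time is derived:

  public class ArmMonitoringSketch {

      // Completion status recorded when a monitored transaction ends.
      enum Status { GOOD, FAILED, ABORTED }

      // Hypothetical stand-in for one ARM-monitored test element's record.
      static final class Transaction {
          final String name;          // for example, the page or request the element drives
          long startNanos;
          long stopNanos;
          Status status;

          Transaction(String name) { this.name = name; }

          void start()         { startNanos = System.nanoTime(); }
          void stop(Status s)  { stopNanos = System.nanoTime(); status = s; }
          long elapsedMillis() { return (stopNanos - startNanos) / 1_000_000; }
      }

      public static void main(String[] args) throws InterruptedException {
          Transaction page = new Transaction("GET /myapp/checkout");  // hypothetical URL
          page.start();
          Thread.sleep(50);            // stands in for the real page request
          page.stop(Status.GOOD);
          System.out.println(page.name + ": " + page.elapsedMillis() + " ms (" + page.status + ")");
      }
  }
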
Next, to collect performance data for an application running under the test, create a new profiling configuration with the following specifics:

  1. Select J2EE Application with Performance Test as the configuration type.
  2. On the Test page, select the test you want to run, and select a deployment for that test. You will have to create a deployment if you have not done so already; see Creating a Test Deployment.
  3. On the Execution Results page, specify a location for the test results to be stored.
  4. On the Users page, specify the number of users to simulate. Usually, you will specify only one user, to avoid generating too much performance data.
  5. On the Profiling page, on the Overview tab, select J2EE Performance Analysis.
  6. Edit the profiling set, if desired, by clicking the Edit button.
    1. On the Components page, select the types of J2EE components to collect data from.
    2. On the Filters page, specify which hosts and URLs to collect data from.
    3. On the Sampling page, if you want to limit the amount of data you collect, specify sampling rates. See Customizing realtime profiling settings for details.
    4. Click Finish.
  7. Click Profile to attach to the agent, start monitoring, and run the test.

Tip: If you do not want to change any of the default settings for the profiling configuration (all components, no filters, no sampling), you can simply right-click the test in the Test navigator and select Profile > J2EE Application with Performance Test instead of following the steps above.

Note 1: Unlike in other data collection scenarios, when you click Profile for a J2EE Application with Performance Test configuration, both monitoring and the test case are automatically started so that data collection is synchronized with the test starting.

Note 2: Filters and sampling apply to root transactions. When data is collected from a test, the root transaction begins at the first ARM-instrumented test element to run. If the entire test is ARM-instrumented (that is, the Enable ARM monitoring check box is selected for the top-level test element), there is only one root transaction, so filtering and sampling have no effect.
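
As a rough illustration of why this matters, the following Java sketch (the class and URL names are hypothetical and do not reflect the agent's implementation) models transactions that nest under a single root; because a host or URL filter is consulted only for the root, it keeps or drops the whole tree and cannot single out the nested requests:

  import java.util.List;
  import java.util.function.Predicate;

  public class RootTransactionFilterSketch {

      // A transaction with an optional parent; the root transaction has no parent.
      record Txn(String url, Txn parent) {
          Txn root() { return parent == null ? this : parent.root(); }
      }

      public static void main(String[] args) {
          // Entire test ARM-instrumented: one root, everything nests under it.
          Txn root     = new Txn("/myapp/login", null);
          Txn search   = new Txn("/myapp/search", root);
          Txn checkout = new Txn("/myapp/checkout", root);

          // A URL filter is evaluated against root transactions only.
          Predicate<Txn> urlFilter = t -> t.url().startsWith("/myapp/checkout");

          for (Txn t : List.of(root, search, checkout)) {
              System.out.println(t.url() + " collected? " + urlFilter.test(t.root()));
          }
          // Every transaction resolves to the same root ("/myapp/login"), so the
          // filter cannot select just the checkout request: with a single root,
          // filtering (and likewise sampling) has no useful effect.
      }
  }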

When the test has finished running, the test results and profiling data are available in the locations specified in the profiling configuration. Only data generated by the test is included in the results. Any other activity on the Web application is filtered out and not included in the performance data collected. For example, if other people are using the application at the same time that the test is running, the collected data does not include data resulting from their actions. Likewise, if you want to use other tools to simulate a load in the background, you can do that without having to collect all the data associated with that load generation.

Once you have collected the performance data, you can begin analyzing it and diagnosing the problem. You can view the data in several views, including statistics views and sequence diagrams of class and object interactions.
