Course Registration System
Test Plan for the Architectural Prototype
Version 1.0
Revision History
Date | Version | Description | Author |
---|---|---|---|
7/March/1999 | 1.0 | Initial Release - Prototype Test Plan | K. Stone |
1. Objectives
1.1 Purpose
This document describes the plan for testing the architectural prototype of the C-Registration System. This Test Plan document supports the following objectives:
This Test Plan describes the integration and system tests that will be conducted on the architectural prototype following integration of the subsystems and components identified in the Integration Build Plan for the Prototype [16].
It is assumed that unit testing has already provided thorough white box testing, extensive coverage of source code, and testing of all module interfaces.
The purpose of assembling the architectural prototype was to test feasibility and performance of the selected architecture. It is critical that all system and subsystem interfaces be tested as well as system performance at this early stage. Testing of system functionality and features will not be conducted on the prototype.
The interfaces between the following subsystems will be tested:
The external interfaces to the following devices will be tested:
The most critical performance measures to test are:
Applicable references are:
The listing below identifies the items (use cases, functional requirements, and non-functional requirements) that are targets for testing; it represents what will be tested.
(Note: Future release of this Test Plan may use Rational RequisitePro for linking directly to the requirements in the Use Case Documents and Supplementary Specification.)
2.1 Data and Database Integrity Testing
Verify access to Course Catalog Database.
Verify simultaneous record read accesses.
Verify lockout during Course Catalog updates.
Verify correct retrieval and update of database data.
2.2 Function Testing
Vision Document, Section 12.2: "The system shall interface with the existing Course Catalog Database System. C-Registration shall support the data format as defined in [2]."
Vision Document, Section 12.2: "The system shall interface with the existing Billing System and shall support the data format as defined in [1]."
Vision Document, Section 12.2: "The server component of the system shall operate on the College Campus Server and shall run under the UNIX Operating System."
Supplementary Specification, Section 9.3: "The server component of the system shall operate on the Wylie College UNIX Server."
Vision Document, Section 12.2: "The client component of the system shall operate on any personal computer with a 486 Microprocessor or better."
Supplementary Specification, Section 9.3: "The client component of the system shall operate on any personal computer with at least a 486 Microprocessor."
Supplementary Specification, Section 9.1: "The system shall integrate with existing legacy system (course catalog database) which operates on the College DEC VAX Main Frame."
Supplementary Specification, Section 9.2: "The system shall integrate with the existing Course Billing System which operates on the College DEC VAX Main Frame."
2.3 Business Cycle Testing
None.
2.4 User Interface Testing
Verify ease of navigation through a sample set of screens.
Verify sample screens conform to GUI standards.
Vision Document Section 10: "The System shall be easy-to-use and shall be appropriate for the target market of computer-literate students and professors."
Vision Document, Section 12.1: "The desktop user-interface shall be Windows 95/98 compliant."
Supplementary Specification, Section 5.1: "The desktop user-interface shall be Windows 95/98 compliant."
Supplementary Specification, Section 5.2: "The user interface of the C-Registration System shall be designed for ease-of-use and shall be appropriate for a computer-literate user community with no additional training on the System."
2.5 Performance Testing
Verify response time to access external Finance system.
Verify response time to access external Course Catalog subsystem.
Verify response time for remote login.
Verify response time for remote submittal of course registration.
Vision Document, Section 12.3: "The system shall provide access to the legacy Course Catalog Database with a latency of 10 seconds or less."
Supplementary Specification, Section 7.2: "The system shall provide access to the legacy Course Catalog Database with a latency of 10 seconds or less."
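A minimal harness for the 10-second latency requirement might look like the sketch below; `query_course_catalog` is a hypothetical stub standing in for the real call to the legacy database.

```python
import time

# Hypothetical stub for the legacy Course Catalog query; the real call
# would go over the network to the DEC VAX system.
def query_course_catalog(course_id):
    time.sleep(0.01)          # placeholder for the real round-trip time
    return {"id": course_id, "title": "Calculus I"}

LATENCY_LIMIT_S = 10.0        # from Vision Document 12.3 / Supp. Spec 7.2

start = time.perf_counter()
result = query_course_catalog(101)
elapsed = time.perf_counter() - start

assert result["id"] == 101
assert elapsed <= LATENCY_LIMIT_S, f"latency {elapsed:.1f}s exceeds 10s limit"
```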
2.6 Load Testing
Verify system response when loaded with 200 logged-on students.
Verify system response with 50 simultaneous student accesses to the Course Catalog.
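The two load conditions above can be sketched with a thread pool; `login` and `read_catalog` are hypothetical stand-ins for the real client operations, and the sleep is a placeholder for network delay.

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Hypothetical stand-ins for the real client operations.
def login(student_id):
    return {"student": student_id, "session": f"S{student_id}"}

def read_catalog(session):
    time.sleep(0.001)        # placeholder for the catalog round trip
    return "Calculus I"

# 200 logged-on students (first load condition).
sessions = [login(i) for i in range(200)]
assert len(sessions) == 200

# 50 simultaneous catalog accesses (second load condition): issue them
# from a 50-worker pool and check every request completes correctly.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(read_catalog, sessions[:50]))
assert len(results) == 50 and all(r == "Calculus I" for r in results)
```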
2.7 Stress Testing
None.
2.8 Volume Testing
None.
2.9 Security and Access Control Testing
Verify Logon from a local PC.
Verify Logon from a remote PC.
Verify Logon security through user name and password mechanisms.
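The user name / password check above can be sketched as follows. This is an illustrative sketch only, with an invented account; it does not describe the C-Registration authentication code.

```python
import hashlib
import hmac
import os

# Minimal sketch of a user-name / password check of the kind the logon
# tests exercise; salted PBKDF2 hashing is an assumption, not a statement
# about the C-Registration implementation.
def _hash(password, salt):
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
users = {"jsmith": _hash("secret123", salt)}   # hypothetical account

def logon(username, password):
    stored = users.get(username)
    if stored is None:
        return False
    # compare_digest avoids timing-based comparison leaks.
    return hmac.compare_digest(stored, _hash(password, salt))

assert logon("jsmith", "secret123")        # valid credentials accepted
assert not logon("jsmith", "wrong")        # bad password rejected
assert not logon("nobody", "secret123")    # unknown user rejected
```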
2.10 Failover / Recovery Testing
None.
2.11 Configuration Testing
Vision Document, Section 12.2: "The client component of the system shall run on Windows 95, Windows 98, and Microsoft Windows NT."
Supplementary Specification, Section 9.4: "The web-based interface for the C-Registration System shall run in Netscape 4.04 and Internet Explorer 4.0 browsers."
Supplementary Specification, Section 9.5: "The web-based interface shall be compatible with the Java 1.1 VM runtime environment."
2.12 Installation Testing
None.
The Test Strategy presents the recommended approach to testing the software applications. The previous section, Test Requirements, described what will be tested; this section describes how it will be tested.
The main considerations for the test strategy are the techniques to be used and the criteria for knowing when testing is complete.
In addition to the considerations provided for each test below, testing should only be executed using known, controlled databases, in secured environments.
The following test strategy is generic in nature and is meant to apply to the requirements listed in Section 2 of this document.
3.1 Testing Types
3.1.1 Data and Database Integrity Testing
The databases and the database processes should be tested as separate systems. These systems should be tested without the applications (as the interface to the data). Additional research into the DBMS needs to be performed to identify the tools / techniques that may exist to support the testing identified below.
Test Objective: | Ensure Database access methods and processes function properly and without data corruption. |
Technique: |
|
Completion Criteria: | All database access methods and processes function as designed and without any data corruption. |
Special Considerations: |
|
3.1.2 Function Testing
Testing of the application should focus on any target requirements that can be traced directly to use cases (or business functions), and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the business rules. This type of testing is based upon black box techniques, that is, verifying the application (and its internal processes) by interacting with the application via the GUI and analyzing the output (results). Identified below is an outline of the testing recommended for each application:
Test Objective: | Ensure proper application navigation, data entry, processing, and retrieval. |
Technique: |
|
Completion Criteria: |
|
Special Considerations: |
|
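A black-box test of the kind described above can be sketched against a hypothetical registration function; the function, its capacity rule, and all names are illustrative, not the C-Registration implementation.

```python
# Hypothetical system-under-test: a registration function exercised purely
# through its inputs and outputs, as black box testing requires.
CAPACITY = 2   # invented business rule: at most 2 students per course

def register(roster, student_id, course_id):
    """Return the updated roster, or raise ValueError if the course is full."""
    enrolled = [s for s, c in roster if c == course_id]
    if len(enrolled) >= CAPACITY:
        raise ValueError("course full")
    return roster + [(student_id, course_id)]

# Valid input is accepted and reflected in the output.
roster = register([], 1, "MATH101")
roster = register(roster, 2, "MATH101")
assert roster == [(1, "MATH101"), (2, "MATH101")]

# The business rule (capacity limit) is enforced.
try:
    register(roster, 3, "MATH101")
    assert False, "expected rejection of over-capacity registration"
except ValueError:
    pass
```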
3.1.3 Business Cycle Testing
This section is not applicable to test of the architectural prototype.
3.1.4 User Interface Testing
User Interface testing verifies a user's interaction with the software. The goal of UI Testing is to ensure that the User Interface provides the user with the appropriate access and navigation through the functions of the applications. In addition, UI Testing ensures that the objects within the UI function as expected and conform to corporate or industry standards.
Test Objective: | Verify the following:
|
Technique: |
|
Completion Criteria: | Each window successfully verified to remain consistent with the benchmark version or within acceptable standards |
Special Considerations: |
|
3.1.5 Performance Profiling
Performance testing measures response times, transaction rates, and other time-sensitive requirements. The goal of Performance testing is to verify and validate that the performance requirements have been achieved. Performance testing is usually executed several times, each run using a different "background load" on the system. The initial test should be performed with a "nominal" load, similar to the normal load experienced (or anticipated) on the target system. A second performance test is then run using a peak load.
Additionally, Performance tests can be used to profile and tune a system's performance as a function of conditions such as workload or hardware configurations.
NOTE: Transactions below refer to "logical business transactions." These transactions are defined as specific functions that a user of the system is expected to perform using the application, such as adding or modifying a given contract.
Test Objective: | Validate system response time for designated transactions or business functions under the following two conditions:
- normal anticipated volume
- anticipated worst-case volume |
Technique: |
|
Completion Criteria: |
|
Special considerations: |
|
3.1.6 Load Testing
Load testing subjects the system-under-test to varying workloads to evaluate the system's ability to continue to function properly under these different workloads. The goal of load testing is to determine and ensure that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (response times, transaction rates, and other time-sensitive issues).
NOTE: Transactions below refer to "logical business transactions." These transactions are defined as specific functions that a user of the system is expected to perform using the application, such as adding or modifying a given contract.
Test Objective: | Verify System Response time for designated transactions or business cases under varying workload conditions. |
Technique: |
|
Completion Criteria: |
|
Special Considerations: |
|
3.1.7 Stress Testing
This section is not applicable to test of the architectural prototype.
3.1.8 Volume Testing
This section is not applicable to test of the architectural prototype.
3.1.9 Security and Access Control Testing
Security and Access Control Testing focuses on two key areas of security:
Application security, including access to the Data or Business Functions.
System Security, including remote access to the system.
Application security ensures that, based upon the desired security, users are restricted to specific functions or are limited in the data that is available to them. For example, everyone may be permitted to enter data and create new accounts, but only managers may delete them. If there is security at the data level, testing ensures that a user of "type one" can see all customer information, including financial data, while a user of "type two" sees only the demographic data for the same client.
System security ensures that only those users granted access to the system are capable of accessing the applications and only through the appropriate gateways.
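The data-level rule described above (one user type sees financial data, another does not) can be sketched as a field filter; the user types, field names, and record are illustrative.

```python
# Illustrative client record; field names are invented for this sketch.
CLIENT = {"name": "J. Smith", "address": "12 Elm St", "balance_due": 250.00}

FINANCIAL_FIELDS = {"balance_due"}

def view_client(record, user_type):
    """Return the fields of a client record visible to the given user type."""
    if user_type == "manager":          # sees everything, incl. financial data
        return dict(record)
    # all other user types see only non-financial (demographic) data
    return {k: v for k, v in record.items() if k not in FINANCIAL_FIELDS}

assert "balance_due" in view_client(CLIENT, "manager")
assert "balance_due" not in view_client(CLIENT, "registrar")
assert view_client(CLIENT, "registrar")["name"] == "J. Smith"
```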
Test Objective: | Function / Data Security: Verify that users can access only those functions / data for which their user type has been granted permissions.
System Security: Verify that only those users granted access to the system and application(s) are permitted to access them. |
Technique: |
|
Completion Criteria: | For each known user type, the appropriate functions / data are available, and all transactions function as expected and consistently with prior Application Function tests |
Special Considerations: |
|
3.1.10 Failover and Recovery Testing
This section is not applicable to test of the architectural prototype.
3.1.11 Configuration Testing
Configuration testing verifies operation of the software on different software and hardware configurations. In most production environments, the particular hardware specifications for the client workstations, network connections and database servers vary. Client workstations may have different software loaded such as applications, drivers, etc. At any one time many different combinations may be active and using different resources.
Test Objective: | Validate and verify that the client Applications function properly on the prescribed client workstations. |
Technique: |
|
Completion Criteria: | For each combination of the Prototype and PC application, transactions are successfully completed without failure. |
Special Considerations: |
|
3.1.12 Installation Testing
This section is not applicable to test of the C-Registration architectural prototype.
3.2 Tools
The following tools will be employed for testing of the architectural prototype:
Purpose | Tool | Version |
---|---|---|
Test Management | Rational RequisitePro, Rational Unified Process | TBD |
Test Design | Rational Rose | TBD |
Defect Tracking | Rational ClearQuest | TBD |
Functional Testing | Rational Robot | TBD |
Performance Testing | Rational Visual Quantify | TBD |
Test Coverage Monitor or Profiler | Rational Visual PureCoverage | TBD |
Other Test Tools | Rational Purify, Rational TestFactory | TBD |
Project Management | Microsoft Project, Microsoft Word, Microsoft Excel | TBD |
DBMS tools | TBD | TBD |
4. Resources
This section presents the recommended resources for testing the C-Registration architectural prototype, their main responsibilities, and their knowledge or skill set.
4.1 Roles
This table shows the staffing assumptions for the test of the Prototype.
Human Resources
Role | Minimum Resources Recommended (number of workers allocated full-time) | Specific Responsibilities/Comments |
---|---|---|
Test Manager | 1 - Kerry Stone | Provides management oversight |
Test Designer | Margaret Cox, Carol Smith | Identifies, prioritizes, and implements test cases |
System Tester | Carol Smith | Executes the tests |
Test System Administrator | Simon Jones | Ensures test environment and assets are managed and maintained |
Database Administration / Database Manager | Margaret Cox | Ensures test data (database) environment and assets are managed and maintained |
Designer | Margaret Cox | Identifies and defines the operations, attributes, and associations of the test classes |
Implementer | Margaret Cox | Implements and unit tests the test classes and test packages |
4.2 System
The following table sets forth the system resources for testing the C-Registration prototype.
System Resources
Resource | Name / Type / Serial No. |
---|---|
Wylie College Server | Serial No: X179773562b |
Course Catalog Database | Version Id: CCDB-080885 |
Billing System | Version Id: BSSS-88335 |
Client Test PCs: | |
3 Remote PCs (with internet access) | Serial Nos: A8339223, B9334022, B9332544 |
3 Local PCs (connected via LAN) | Serial Nos: R3322411 (Registrar's), A8832234 (IT Lab), W4592233 (IT Lab) |
Test Repository: | |
Wylie College Server | Serial No: X179773562b |
Test Development PCs - 6 | Serial Nos: A8888222, R3322435, I88323423, B0980988, R3333223, Y7289732 |
5. Project Milestones
Testing of the C-Registration Architectural Prototype incorporates test tasks for each of the test efforts identified in the previous sections. Separate project milestones are identified to communicate project status and accomplishments.
Refer to the Software Development Plan [13] and the E1 Iteration Plan [14] for the overall phase or master project schedule.
Milestone Task | Effort (pd) | Start Date | End Date |
---|---|---|---|
Prototype Test Planning | 2 | March 12 | March 15 |
Prototype Test Design | 3 | March 15 | March 18 |
Prototype Test Development | 4 | March 19 | March 23 |
Prototype Test Execution | 3 | March 24 | March 26 |
Prototype Test Evaluation | 1 | March 29 | March 29 |
6. Deliverables
The work products of the test tasks as defined in this Test Plan are outlined in the table below.
Work Products | Owner | Review / Distribution | Due Date |
---|---|---|---|
Test Plan | K. Stone | Senior Project Mgmt Team | March 15 |
Test Environment | S. Jones | - | March 18 |
Test Suite | C. Smith and M. Cox | Internal Peer Review | March 23 |
Test Data Sets | M. Cox | Internal Peer Review | March 23 |
Test Scripts | M. Cox | - | March 23 |
Test Stubs, Drivers | M. Cox | - | March 23 |
Test Defect Reports | C. Smith | Senior Project Mgmt Team | March 26 |
Test Results | C. Smith | - | March 26 |
Test Evaluation Report | C. Smith | Senior Project Mgmt Team | March 29 |
6.1 Test Suite
The Test Suite will define all the test cases and the test scripts which are associated with each test case.
6.2 Test Logs
It is planned to use RequisitePro to identify the test cases and to track the status of each test case. The test results will be summarized in RequisitePro as untested, passed, conditional pass, or failed. In summary, RequisitePro will be set up to support the following attributes for each test case, as defined in the Requirements Attributes Guidelines [17]:
- Test status
- Build Number
- Tested By
- Date Tested
- Test Notes
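The attribute set above could be represented as a simple per-test-case record, sketched here in Python with illustrative values; RequisitePro itself stores these attributes, so this is only a model of the data being tracked.

```python
from dataclasses import dataclass
from typing import Optional

# The four status values the plan defines for summarizing results.
STATUSES = {"untested", "passed", "conditional pass", "failed"}

@dataclass
class TestCaseRecord:
    """One test case's tracked attributes (names follow the list above)."""
    test_id: str
    test_status: str = "untested"
    build_number: Optional[str] = None
    tested_by: Optional[str] = None
    date_tested: Optional[str] = None
    test_notes: str = ""

    def record_result(self, status, build, tester, date, notes=""):
        assert status in STATUSES, f"unknown status: {status}"
        self.test_status = status
        self.build_number = build
        self.tested_by = tester
        self.date_tested = date
        self.test_notes = notes

# Illustrative usage: the test id, build number, and date are invented.
tc = TestCaseRecord("PERF-01")
tc.record_result("passed", "proto-build-3", "C. Smith", "1999-03-25")
assert tc.test_status == "passed"
```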
It will be the responsibility of the System Tester to update the test status in RequisitePro.
Test results will be retained under Configuration Control.
6.3 Defect Reports
Rational ClearQuest will be used for logging and tracking individual defects.
7. Project Tasks
Below are the test-related tasks for testing the C-Registration Architectural Prototype:
Plan Test
Design Test
Implement Test
Execute Test
Evaluate Test
- Determine if Test Completion Criteria and Success Criteria have been achieved
- Create Test Evaluation Report
Copyright (C) IBM Corp. 1987, 2004. All Rights Reserved. |
Course Registration Project Web Example |