7/30/2019 Sample TestPlan
1/29
MPS - MRG Integration Test Plan
Version 3.0
MPS - MRG Integration Version: 3.0
Master Test Plan Date: 11/18/2004
Revision History
Date        Version  Description                                                        Author
08/02/2004  1.0      Initial draft of MRG system test plan for LDS Quality Assurance    Y. Karob
11/10/2004  2.0      Updated based upon internal review                                 W. Savage
11/18/2004  3.0      Updated Test Coverage section                                      W. Savage
Confidential Lydian Data Services, 2013 Page 2
Table of Contents
1. Introduction 5
1.1 Purpose 5
1.2 Terminology and Acronyms 5
1.3 Background 5
1.4 Scope 5
1.5 Intended Audience 6
1.6 Referenced Material 6
2. Outline of Planned Tests 6
2.1 Outline of Test Inclusions 6
2.2 Outline of Other Candidates for Potential Inclusion 6
2.3 Outline of Test Exclusions 7
3. Test Approach 7
3.1 Testing Data and Data Strategy 8
3.2 Testing Techniques and Types 8
3.2.1 Function Testing 8
3.2.2 User Interface Testing 8
3.2.3 Load Testing 9
4. Entry and Exit Criteria 9
4.1 Entry Criteria 9
4.2 Exit Criteria 10
4.3 Suspension and Resumption Criteria 10
5. Deliverables 10
5.1 Reporting on Test Coverage 10
5.2 Reporting on Issues Encountered/Reported 10
5.3 Automated Test Scripts 11
5.4 Detailed Testing Results 11
5.5 Summary Report 11
6. Environmental Needs 11
6.1 Base System Hardware 11
6.2 Base Software Elements in the Test Environment 11
6.3 Productivity and Support Tools 12
7. Responsibilities, Staffing, and Training Needs 12
7.1 Human Resources and Roles 12
8. Risks, Assumptions, and Constraints 13
9. Management Process and Procedures 14
9.1 Problem Reporting, Escalation, and Issue Resolution 14
9.2 Test Plan Approval and Signoff 15
APPENDIX A Testing Type Definitions 15
APPENDIX B Coverage Summary 17
1. Introduction
1.1 Purpose
The purpose of this document is to gather all of the information necessary to coordinate the test effort. It
describes the approach to test the software, and is designed to provide a high-level overview of the testing
effort.
This document supports the following objectives:
Identifies the scope of the testing effort
Outlines the testing approach that will be used
Lists the deliverable elements of the test project
Identifies the required resources and provides an estimate of the test efforts
1.2 Terminology and Acronyms
The following list contains terms and acronyms used within this document. Testing type definitions are
contained within Appendix A.
MPS Mortgage Processing System
MRG Middleburg, Riddle, Gianna
LDS Lydian Data Services
PMO Project Management Office
AUT Application Under Test
1.3 Background
The MPS-MRG integration is envisioned to replace the existing process by which closing documentation is
produced; the current process involves usage of the Miracle application. The integration shall reduce the double
entry of information currently required to produce closing documentation. The introduction of the MRG
components shall not affect the existing functionality of MPS.
1.4 Scope
This section shall address the scope for the test phase to be performed by LDS Quality Assurance Department.
All testing outlined below pertains to the System Testing phase, including System Level Integration Testing, of
the AUT.
The following testing types will be considered in scope for the testing phase:
Functional
User Interface
Load
The following testing types' placement in scope has yet to be determined. The items listed below are available
testing candidates; however, the addition of these items will require a re-estimate of the test effort:
Data and Database Integrity
Business Cycle
Stress
Volume
Performance Profiling
Failover and Recovery
Configuration
Installation
Security and Access Control (Role based)
1.5 Intended Audience
This test plan is intended to provide guidance to the Development Resources, Quality Assurance Management,
Project Management, and Executive Sponsors of the application as to the scope and methodology utilized during the
testing phases conducted by the LDS Quality Assurance Department.
1.6 Referenced Material
The following list contains the name and location of referenced material.
MPS User Guide for Post Closing (Draft)
- Star Team, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
MRG Integration Software Requirements Specification for MPS (Version 1.0)
- Star Team, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
MPS Use Case Specification: MRG Integration
- Star Team, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
MPS MRG Quality Assurance Master Test Plan
- Star Team, Project Path: Development Services\CSFB Wholesale Operations\P26 MRG Integration
2. Outline of Planned Tests
This section provides a high-level outline of the testing that will be performed. The outline in this section represents
a high level overview of both the tests that will be performed and those that will not. It is important to note that
outlined coverage execution is dependent upon the scheduled testing window being available; changes to the overall
testing schedule may have a direct effect upon the execution of the test plan. A coverage summary section is located
within Appendix B.
2.1 Outline of Test Inclusions
The primary driver of the testing will be requirements based use cases specific to the new features provided within
MPS as a direct result of the MRG integration. In addition, general regression testing will be performed upon the
MPS application to ensure the behavior and operation of MPS were not adversely affected as a result of the integration.
LDS MPS Use-Case Specification: MRG Integration
MPS Regression Testing Activities
MPS-MRG limited scope load testing
If it is deemed that the above listed use/test cases do not provide sufficient coverage to address those areas
considered in scope, additional use/test cases will be incorporated.
2.2 Outline of Other Candidates for Potential Inclusion
There are several areas in which testing may be desired; however, the benefit has yet to be quantified, or the
responsible testing entity may reside outside of the Quality Assurance department. Those areas are outlined below.
The items listed are available testing candidates; however, their addition will require a re-estimate
of the test effort:
Project Specific Areas:
MPS Messaging Inspection (Customer Portal): messaging inspection is currently addressed during the
user acceptance testing (UAT) phase, which is performed outside of the Quality
Assurance department
MPS-MRG Generated Closing Documentation (sent via email) Review
General Testing Areas
Business Cycle
Stress
Volume
Failover and Recovery
Configuration
Installation
2.3 Outline of Test Exclusions
Items slated for exclusion from the Quality Assurance department's testing are listed below. The listing does
not include those items that are candidates for potential inclusion.
Unit Testing/Unit Integration Testing: envisioned to be the responsibility of development; if desired,
the scope will be adjusted to include the Quality Assurance department in this endeavor
3. Test Approach
It is the expectation of the Quality Assurance department that all use cases will be completed prior to the
commencement of testing. Use cases are envisioned to address all requirements and functionality of the application that require testing. It is from these use cases that the testing scope and directives originate.
The use cases will be developed into automated test cases using the Mercury Interactive Quick Test Pro (QTP)
application. The automated test cases will be exercised upon Quality Assurance client platforms within the LDS
Quality Assurance laboratory. Automated test cases will leverage established best practices and techniques for
development.
In the event the timeline of automation development exceeds the time available, use cases will be addressed by manual
testing of the application. In such an event, manual logs will be kept, detailing testing steps and observations.
In addition to the use cases being exercised, MPS regression testing activities, consisting of both manual and
automated test execution will occur to ensure that MPS functionality was not adversely affected by the introduction
of the MRG components.
Load testing shall be performed within a limited section of the application. Specifically targeted is the transaction
time related to processing the closing documentation request as reflected within the MPS user interface. Testing
shall be structured to vary the workload with relationship to time. More detailed information may be located within
Appendix B, Coverage Summary.
Testing exposure is currently limited to system conditions as experienced via the graphical user interface (GUI) of
the AUT. Actions that occur outside of the GUI are not currently in scope of the LDS Quality Assurance
department for this testing endeavor. Testing activity that is desired to occur outside of the GUI is expected to be
requested in the form of a use case.
Testing shall be performed under controlled environmental conditions. Currently the testing is envisioned to occur
within the LDS MPS staging environment. Code shall be promoted to the environment and strict environmental
control procedures will be adhered to in order to provide validity to the testing effort. In the event that control
practices are not adhered to, testing may be halted (reference suspension criteria).
Depending upon additional items added to the scope, as listed in Section 2.2 - Outline of Other Candidates for
Potential Inclusion, the testing approach may expand in response. Depending upon changes to the testing schedule
and the allotted Quality Assurance testing window, testing coverage may be modified in response.
3.1 Testing Data and Data Strategy
Data contained within the testing environment (LDS MPS Staging environment) shall be a copy of production data.
Unless specific data is required to exercise established use cases, data will be randomly selected and adjusted
throughout the course of testing. Randomization of data conditions should provide an increased level of system
coverage, as more combinations shall be exercised as compared to a static data path.
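The randomized-selection strategy above can be sketched in a few lines. This is a hypothetical illustration only: the record layout, field names, and the `select_test_data` helper are invented for the sketch, not part of the MPS data dictionary. Seeding the generator keeps a given run repeatable, which matters when an issue found against randomly chosen data must be reproduced.

```python
import random

# Stand-in for a copy of production data; the loan-record layout is invented.
loan_records = [
    {"loan_id": i, "amount": 100_000 + i * 5_000, "state": "FL"}
    for i in range(1, 101)
]

def select_test_data(records, sample_size, seed=None):
    """Randomly select records for a test run; a fixed seed makes the run repeatable."""
    rng = random.Random(seed)
    return rng.sample(records, sample_size)

# Each run draws a fresh sample rather than walking a static data path.
run_data = select_test_data(loan_records, sample_size=5, seed=42)
assert len(run_data) == 5
assert select_test_data(loan_records, 5, seed=42) == run_data  # reproducible
```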
3.2 Testing Techniques and Types
The following shall provide a more detailed view of the testing techniques and types to be utilized based upon the
current scope of testing. Areas listed as possibly in scope or out of scope will not be addressed within this section.
The testing techniques listed below shall employ the tools outlined in the Productivity and Support Tools section of
this document.
3.2.1 Function Testing
Function testing of the AUT should focus on requirements for test that can be traced directly to use cases or business
functions and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval,
and the appropriate implementation of the business rules. This type of testing is based upon black-box techniques;
that is, verifying the application and its internal processes by interacting with the application via the User Interface
(UI) and analyzing the output or results. The following high-level summary identifies an outline of the testing type.
Technique Objective: Exercise AUT functionality, including navigation, data entry, processing, and retrieval to
observe and log target behavior.
Technique: Execute each use-case scenario's individual use-case flows or functions and features, using valid and
invalid data, to verify that:
Expected results occur when valid data is used
Appropriate error or warning messages are displayed when invalid data is used
Each business rule is properly applied
Success Criteria: This technique supports the testing of the following:
Key use case scenarios
Key features
Special Considerations: Functional testing is dependent upon creation of use cases or published requirements
documentation.
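The valid/invalid-data technique above amounts to driving the AUT with both kinds of input and asserting on the result. The sketch below is hypothetical: the business rule (loan amount must be positive and at most $1,000,000), the `submit_loan_amount` function, and its return shape are all invented stand-ins; in the actual effort the same checks would be exercised through the UI via QTP rather than against a Python function.

```python
def submit_loan_amount(amount):
    """Stand-in for the AUT: applies one invented business rule and returns a result."""
    if amount <= 0 or amount > 1_000_000:
        return {"ok": False, "error": "Loan amount out of range"}
    return {"ok": True, "error": None}

# Expected results occur when valid data is used
assert submit_loan_amount(250_000) == {"ok": True, "error": None}
# Appropriate error message is produced when invalid data is used
assert submit_loan_amount(-5)["error"] == "Loan amount out of range"
assert submit_loan_amount(2_000_000)["ok"] is False
```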
3.2.2 User Interface Testing
UI testing verifies a user's interaction with the software. The goal of UI testing is to ensure that the user is provided
with the appropriate access and navigation through the functions of the AUT. In addition, UI testing ensures that the
objects within the UI function as expected and conform to corporate or industry standards.
Technique Objective: Exercise the following to observe and log standards conformance and target behavior:
Navigation through the AUT reflecting business functions and requirements, including window-to-window, field-to-field, and use of access methods (tab keys, mouse movements, accelerator keys)
Window objects and characteristics, such as menus, size, position, state, and focus, can be exercised
Technique: Create or modify tests for each window to verify proper navigation and object states for each application
window and object.
Success Criteria: The technique supports the testing of each major screen or window that will be used extensively by
the end user.
Special Considerations: Testing will refer to published requirements documentation to interpret object properties and
behavior; if not published, interaction with Project Management and Development may be required to determine
correct application appearance and/or behavior.
3.2.3 Load Testing
Load testing is a performance test that subjects the AUT to varying workloads to measure and evaluate the
performance behaviors and abilities of the AUT to continue to function properly under these different workloads.
The goal of load testing is to determine and ensure that the system functions properly beyond the expected
maximum workload. Additionally, load testing evaluates performance characteristics such as response times,
transaction rates, and other time-sensitive issues.
Technique Objective: Exercise designated transactions or business cases under varying workload conditions to
observe and log target behavior and system performance data.
Navigation through the AUT reflecting business functions and requirements, including window-to-
window, field-to-field, and use of access methods (tab keys, mouse movements, accelerator keys)
Window objects and characteristics, such as menus, size, position, state, and focus, can be exercised
Technique: Develop transaction test scripts based upon functional or business transactions, structured to measure a
given operation.
Success Criteria: The technique supports the testing of Workload Emulation, which is the successful emulation of
the workload without any failures due to test implementation problems.
Special Considerations: Testing will require a dedicated and isolated environment in order to ensure validity.
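The "varying workload" technique above can be sketched as a loop that exercises the designated transaction at increasing workload levels and logs per-call timings. This is a hypothetical illustration, not the Mercury tooling the plan actually names: `process_closing_request` is an invented stand-in for the closing-documentation request whose transaction time is the stated target.

```python
import time

def process_closing_request():
    """Stand-in for the real MPS closing-documentation transaction."""
    time.sleep(0.001)  # placeholder work so the timings are nonzero

def run_load_step(transaction, iterations):
    """Run the transaction `iterations` times; return per-call times in seconds."""
    timings = []
    for _ in range(iterations):
        start = time.perf_counter()
        transaction()
        timings.append(time.perf_counter() - start)
    return timings

# Vary the workload with relationship to time: light, medium, heavy steps.
for workload in (10, 50, 100):
    times = run_load_step(process_closing_request, workload)
    print(f"workload={workload:3d}  avg={sum(times)/len(times):.4f}s  max={max(times):.4f}s")
```

A real load test would issue the transactions concurrently against the staging environment; the single-threaded loop here only illustrates the measure-and-log structure.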
4. Entry and Exit Criteria
4.1 Entry Criteria
The following shall represent the entry criteria for the testing engagement:
Project established within TestDirector
Project established within TrackIt
Completion of requirements documentation
Completion of application development
Completion of development unit testing
o Documentation of unit test performed
o Unit testing sign off form completed
Functional walk-through of the AUT provided by Development to Project Management and Quality Assurance
Documentation of recommended client side settings (Browser Configurations, Java Applets, etc.)
Testing environment provided with established configuration control
Published roll schedule
o Date and time of release to Quality Assurance testing environment
o Duration of the Quality Assurance testing window, when different than estimated/planned
o Scheduled production release date
Updated/current data dictionary
Sign off/approvals for LDS Quality Assurance test plan
4.2 Exit Criteria
Completion of the following items, and availability of the corresponding documentation, shall represent the exit
criteria for the testing engagement.
All system testing activities considered in scope are complete.
Issues encountered are reported within TestDirector.
Reported issues are resolved to the satisfaction of the PMO.
Summary report is created and submitted to the PMO.
4.3 Suspension and Resumption Criteria
The following shall detail the suspension and resumption criteria for the testing engagement. Testing may be suspended due to any one of the following:
Lack of environmental control: in the event that environmental control is not enforced, testing may be
halted to investigate the underlying cause. Testing may resume once management has determined that
environment is acceptable for continued testing.
Critical defect: in the event a critical defect is located, testing may be halted to allow for modification to
code to allow for defect resolution. Testing may resume once code modifications have been submitted and
incorporated into the controlled environment.
Other Priorities: in the event other priorities are established by the business, testing may be halted. Testing
may resume once business determines the course.
Infrastructure downtime: in the event that components (network, client, servers, etc.) required for testing are unavailable, testing may be halted. Testing may resume once infrastructure is re-established.
Update of application: in the event updates are required to the application, testing may be halted while such
update occurs. Testing may resume once completion of update has occurred and is within a controlled
environment. In the event that updates are to core components, re-testing of areas addressed prior to update
may be required to ensure coverage.
Data needs: in the event that all available data is utilized during the course of testing, database
refresh/restore shall be sought. Testing shall resume once data issues are resolved.
5. Deliverables
The following shall list those items to be delivered by LDS Quality Assurance department during the course of or at
the completion of the testing effort.
5.1 Reporting on Test Coverage
Test coverage shall be reported on a daily basis; coverage will be calculated as the total number of use cases
executed as compared to the total number of available test cases. The report will be structured as text and is
envisioned to be a brief statement of current status.
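The daily coverage calculation described above reduces to a single ratio. A minimal sketch, with an invented `coverage_report` helper producing the brief textual status the plan calls for:

```python
def coverage_report(executed, total):
    """Return a one-line textual coverage status: executed / total available."""
    if total <= 0:
        raise ValueError("total test cases must be positive")
    pct = 100.0 * executed / total
    return f"Test coverage: {executed}/{total} use cases executed ({pct:.1f}%)"

print(coverage_report(36, 120))  # → "Test coverage: 36/120 use cases executed (30.0%)"
```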
5.2 Reporting on Issues Encountered/Reported
Issues encountered shall be reported on a daily basis. All issues submitted within TestDirector will be contained in
a summary report. Included within the report will be the current/updated status of issues and, if applicable, the
resolution of each reported issue.
In addition, all issues encountered during testing that may reside outside of the application will be contained
within this report, including but not limited to environmental control issues, periods of downtime, and
miscommunications between departments; in general, any item that affects testing shall be reported upon.
5.3 Automated Test Scripts
All automated testing scripts leveraged during the course of the testing shall be archived upon a specified network
location and/or reside within TestDirector.
5.4 Detailed Testing Results
All results generated from usage of automated scripts shall be archived upon a specified network location and/or
reside within TestDirector. In the event manual testing was performed, all reports associated with the manual testing
shall be contained in the same location.
5.5 Summary Report
At the completion of testing, a summary report shall be generated containing a listing of all defects/issues
encountered and their resolution. In addition, any lessons learned from the testing engagement shall be included within this summary report.
6. Environmental Needs
This section presents the non-human resources required for the testing endeavor.
6.1 Base System Hardware
A test environment, consisting of the application infrastructure, shall be provided for the duration of the Quality
Assurance testing cycle; the infrastructure coordinator is responsible for providing and monitoring this environment.
The environment shall be structured to resemble the production environment as a method of ensuring valid testing.
Environment shall be under strict configuration control, as stated within the entry criteria established within this
documentation. As the current application is served from a local client browser, databases, application and
networking layers are envisioned to be established and controlled by parties outside of the Quality Assurance
department. Method of access is expected to be provided, consisting of a valid Uniform Resource Locator (URL)
with an established login to be utilized for the duration of testing. Where required, in order to facilitate security
access testing, additional login IDs may be needed and shall be provided on an as-needed basis; such testing is
expected to be addressed within the context of the use cases.
Client testing platforms that will be utilized shall consist of the LDS Quality Assurance laboratory platforms.
Specific client settings shall be provided and administered by the Quality Assurance lead testing resource in an
effort to resemble that of a production environment. Established environmental control practices apply to all base
system hardware components.
6.2 Base Software Elements in the Test Environment
Infrastructural software elements shall be under the control of the infrastructure coordinator for all server and
network elements; such elements shall be structured to resemble a production environment as a method of ensuring
valid testing.
Elements that reside upon the Quality Assurance testing platforms shall fall under control and administration of the
Quality Assurance lead testing resource as assigned for this project. Those items will include all automated testing
software, any browser configuration settings, and any required client software load.
Items that have currently been identified as client software load specific to the MPS-MRG application testing
include the following:
Microsoft Internet Explorer: Version 6.0.28.00.1106.xpsp2.030422-1633CO
Microsoft Windows Operating System: Windows XP Professional Version 2002 Service Pack One
All version information refers to the expected client software load as present in the LDS Quality Assurance
laboratory. If additional and/or different software loads are required, such will need to be coordinated with the
Quality Assurance lead testing resource.
6.3 Productivity and Support Tools
The following shall list those productivity and support tools to be leveraged by the Quality Assurance department
for this testing endeavor.
Mercury Interactive TestDirector: utilized for defect tracking and test management
Mercury Interactive Quick Test Pro: utilized for test automation
Borland Corporation StarTeam: utilized as the project document repository
7. Responsibilities, Staffing, and Training Needs
This section presents the required resources to address the test effort outlined; the main responsibilities, and the
knowledge or skill sets required of those resources.
7.1 Human Resources and Roles
This table shows the staffing related to the Quality Assurance department's efforts for the test engagement.
Human Resources
Role
Minimum Resources Recommended and Associated Resource Name
Specific Responsibilities or Comments
Quality Assurance Test
Manager
Minimum Resources: 1
Resource Name: Joseph Spinner
Provides management oversight.
Responsibilities include:
Planning and logistics
Acquire appropriate resources
Present management reporting
Advocate the interests of test
Evaluate effectiveness of test effort
Human Resources
Role
Minimum Resources Recommended and Associated Resource Name
Specific Responsibilities or Comments
Quality Assurance Lead
Testing Resource
Minimum Resources: 1
Resource Name: Warren Savage
Provides quality assurance support
Responsibilities include:
Identify test ideas
Define test approach
Define test automation architecture
Verify test techniques
Define testability elements
Structure test implementation
Implement tests and test suites
Execute test suites
Log results
Analyse and recover from test failures
Document incidents
Create summary report
Ensure QA testing laboratory is properly
configured for test
Database Administrator, Database Manager
Minimum Resources: 1
Resource Name: TBD
Ensure test data (database) environment and assets are managed and maintained
Business Analyst
Minimum Resources: 1
Resource Name: Brian McDonald
Creation of use cases based upon published
requirements
Infrastructure
Coordinator
Minimum Resources: 1
Resource Name: Devon Walker
Ensure testing environments are maintained to
established configuration management standards
Project Manager
Minimum Resources: 1
Resource Name: Jane Somerville
Provide general project direction and is
responsible for client interaction and ensuring
project success
8. Risks, Assumptions, and Constraints
The following table shall identify potential risks along with the associated mitigation and contingency strategies.
Risk: Entry criteria have not been satisfied.
Mitigation Strategy: Align resources to address those outstanding items.
Contingency: Table the areas of testing that involve the uncompleted items and proceed with testing, with the
expectation that the tasks will be completed prior to completion of the testing cycle.

Risk: Automated test case development exceeds the available time.
Mitigation Strategy: Dedicate additional resources to development of automation, or reduce the scope of automation.
Contingency: Utilize manual testing to account for those areas in which automation has yet to be developed.

Risk: Modifications to project scope (scope creep).
Mitigation Strategy: Monitor product changes to atomic detail as facilitated by the configuration change management
board and/or project management.
Contingency: Adjust testing scope to account for changes to the initial plan.

Risk: Established QA testing time reduced.
Mitigation Strategy: Ensure realistic timelines are established within the project schedule.
Contingency: Reduce planned coverage to account for the reduction in the allotted testing window.
The following shall list the assumptions of the test effort. Assumptions, if incorrect, may affect the overall project.

Assumption to be proven: Testing environment will be managed.
Impact of Assumption being incorrect: Testing may yield false results in the event that environmental controls are
not established and followed.
Owners: Infrastructure Coordinator, Database Manager, and Project Manager.

Assumption to be proven: Testing environment will be available (clients, application servers, database servers,
network, etc.).
Impact of Assumption being incorrect: Testing schedule/activity is dependent upon availability of testing
environments; unavailability may cause delays to the overall schedule.
Owners: Infrastructure Coordinator and Database Manager.

Assumption to be proven: Human resources will be available and dedicated for the duration of the project life cycle.
Impact of Assumption being incorrect: Lack of dedicated resources may cause delays to the overall schedule.
Owners: Quality Assurance and Project Management.

Assumption to be proven: Use cases are correctly designed and provide the desired coverage.
Impact of Assumption being incorrect: Key areas of application functionality may go untested due to omissions in
use cases.
Owners: Business Analyst.
The following are viewed as constraints of the testing exercise.

Constraint: Dedicated environment.
Impact on test effort: The current process involves the sharing of an environment between MPS and MPS-MRG; this
has limited the activities performed to date.
Owners: Infrastructure Coordinator and Database Manager.
9. Management Process and Procedures
The following shall give a high-level overview of the management processes and procedures relating to this testing endeavor.
9.1 Problem Reporting, Escalation, and Issue Resolution
All issues shall be entered in TestDirector; escalation, if needed, will be done through Quality Assurance and Project
Management interaction, and issue resolution shall be documented within TestDirector.
9.2 Test Plan Approval and Signoff
The following individuals' approval and signoff are required.
Project Role Resource Name Signature
Project Manager: Jane Somerville _________________________________
Business Analyst: Brian McDonald _________________________________
Development Representative: TBD _________________________________
Operations Representative: Steve Pico _________________________________
Database Representative: Eric Calvino _________________________________
Quality Assurance Test Manager: Joseph Spinner _________________________________
Infrastructure Coordinator: Devon Walker _________________________________
APPENDIX A Testing Type Definitions
The following shall define the testing types referenced within this document.
Data and Database Integrity Testing: The databases and the database processes are tested as an independent subsystem. This testing should exercise the subsystems without the UI as the interface to the data.
Functional (Function) Testing: Function testing should focus on any requirements for test that can be traced directly
to use cases or business functions and business rules. The goals of these tests are to verify proper data acceptance,
processing, retrieval, and the appropriate implementation of the business rules. This type of testing is based upon
black box techniques; verifying the application and its internal processes by interacting with the application via the
UI and analyzing the output or results.
Business Cycle Testing: Business cycle testing should emulate the activities performed on the application over
time. A period should be identified, such as one year, and the transactions and activities that would occur during
that period should be executed. This includes all daily, weekly, and monthly cycles, and events that are
date-sensitive.
User Interface Testing: UI testing verifies a user's interaction with the software. The goal of UI testing is to ensure
that the UI provides the user with the appropriate access and navigation through the functions of the AUT. In
addition, UI testing ensures that the objects within the UI function as expected and conform to corporate or industry standards.
Performance Profiling: Performance profiling is a performance test in which response times, transaction rates, and
other time-sensitive requirements are measured and evaluated. The goal of Performance Profiling is to verify that
performance requirements have been achieved. Performance profiling is implemented and executed to profile and
tune application performance behaviors as a function of conditions such as workload or hardware configurations.
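Performance profiling of this kind can be approximated with a small timing harness. The sketch below is illustrative only: `profile` and its summary statistics are assumptions for demonstration, not part of the plan's tooling, and the timed operation is a stand-in for a real transaction under test.

```python
import time
import statistics

def profile(operation, runs=10):
    """Invoke `operation` repeatedly and summarize elapsed wall-clock times."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        timings.append(time.perf_counter() - start)
    # Report the time-sensitive measures named in the definition above.
    return {
        "min": min(timings),
        "max": max(timings),
        "mean": statistics.mean(timings),
    }

# Time a stand-in computation; real use would wrap a transaction under test.
stats = profile(lambda: sum(range(10_000)), runs=5)
print(sorted(stats))  # keys: ['max', 'mean', 'min']
```

Comparing these summaries across workloads or hardware configurations is what turns a single timing run into a profile.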
Load Testing: Load testing is a performance test that subjects the application to varying workloads to measure and
evaluate the performance behaviors and the ability of the application to continue to function properly under these
different workloads. The goal of load testing is to determine and ensure that the system functions properly beyond
the expected maximum workload. Additionally, load testing evaluates performance characteristics such as
response times, transaction rates, and other time-sensitive issues.
Stress Testing: Stress testing is a type of performance test implemented and executed to understand how a system fails due to conditions at the boundary of, or outside, the expected tolerances. This typically involves low resources
or competition for resources. Low-resource conditions reveal failure behavior that is not apparent under
normal conditions. Other defects might result from competition for shared resources, such as database locks or network
bandwidth, although some of these tests are usually addressed under functional and load testing.
Volume Testing: Volume testing subjects the AUT to large amounts of data to determine if limits are reached that
cause the software to fail. Volume testing also identifies the continuous maximum load or volume the AUT can handle for a given period.
Security and Access Control Testing: Security and Access Control Testing focuses on two key areas of security:
Application-level security, including access to the Data or Business Functions
System-level Security, including logging into or remotely accessing the system
Depending on the security desired, application-level security ensures that actors are restricted to specific functions or use cases, or are limited in the data that is available to them. For example, everyone may be permitted to enter
data and create new accounts, but only managers can delete them. If there is security at the data level, testing ensures
that one user type can see all customer information, including financial data, while another user type sees only the
demographic data for the same client. System-level security ensures that only those users granted access to the
system are capable of accessing the applications, and only through the appropriate gateways.
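The application-level example above (everyone may create accounts, only managers may delete them) can be expressed as a small permission check. The role and action names below are hypothetical illustrations, not drawn from the MPS specification.

```python
# Hypothetical role-to-permission map; role and action names are
# illustrative only and not taken from the MPS specification.
PERMISSIONS = {
    "clerk":   {"enter_data", "create_account"},
    "manager": {"enter_data", "create_account", "delete_account"},
}

def can_perform(role, action):
    """Application-level check: is this actor allowed this function?"""
    return action in PERMISSIONS.get(role, set())

# Checks mirroring the example in the text: everyone may create
# accounts, but only managers may delete them.
assert can_perform("clerk", "create_account")
assert not can_perform("clerk", "delete_account")
assert can_perform("manager", "delete_account")
```

Access-control test cases follow the same shape: one assertion per (role, function) pair, covering both the allowed and the forbidden direction.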
Failover and Recovery Testing: Failover and recovery testing ensures that the AUT can successfully fail over and
recover from a variety of hardware, software, or network malfunctions without undue loss of data or data integrity.
For those systems that must be kept running, failover testing ensures that, when a failover condition occurs, the
alternate or backup systems properly take over for the failed system without any loss of data or transactions.
Recovery testing is an antagonistic test process in which the application or system is exposed to extreme conditions,
or simulated conditions, to cause a failure, such as device Input/Output (I/O) failures, or invalid database pointers
and keys. Recovery processes are invoked, and the application or system is monitored and inspected to verify that
proper application, system, and data recovery has been achieved.
Configuration Testing: Configuration testing verifies the operation of the AUT on different software and hardware
configurations. In most production environments, the particular hardware specifications for the client workstations,
network connections, and database servers vary. Client workstations may have different software loaded (for
example, applications, drivers, and so on) and, at any one time, many different combinations may be active using
different resources.
Installation Testing: Installation testing has two purposes. The first is to ensure that the software can be installed
under different conditions, such as a new installation, an upgrade, and a complete or custom installation, under
normal and abnormal conditions. Abnormal conditions include insufficient disk space, lack of privilege to create directories, and so on. The second purpose is to verify that, once installed, the software operates correctly. This
usually means running a number of the tests that were developed for Function Testing.
APPENDIX B Coverage Summary
The following summarizes the testing coverage to be provided.
MPS Regression Testing process steps shall provide the following:
1.0 Access System
1.1 Log in to MPS system
2.0 Pre-Registration Queue
2.1 Access Pre-Registration Queue
2.2 Assign Loan to current user
3.0 Registration Queue
3.1 Access Registration Queue
3.2 Select Loan for Edit from Registration queue
3.3 Access 1003 Form
3.3.1 Enter 1003 Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.3.2 Save 1003 Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
3.4 Access Appraisal Form
3.4.1 Enter New Appraisal - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.4.2 Save Appraisal - due to adjusting the data values, testing will address invalid data types
and formats at time of save, error dialog behavior will be tested
3.4.3 Edit Appraisal - values shall be adjusted to ensure correct system behavior for invalid
data types, missing data, when combined with the following step, error dialog behavior
will be tested
3.4.4 Save Appraisal - due to adjusting the data values, testing will address invalid data types
and formats at time of save, error dialog behavior will be tested
3.5 Access 1008 Form
3.5.1 Exercise Adobe link for Borrower and Property Information
3.5.2 Verify data sections that auto populate
3.5.3 Edit 1008 form
3.5.4 Save 1008 form
3.6 Access Broker Info Form
3.6.1 Verify Data sections that auto populate
3.6.2 Save Broker Info Form
3.7 Access Loan Contact Form
3.7.1 Enter Loan Contact Information - values shall be adjusted to ensure correct system
behavior for invalid data types, missing data, when combined with the following step,
error dialog behavior will be tested
3.7.2 Save Loan Contact Information - due to adjusting the data values, testing will address
invalid data types and formats at time of save, error dialog behavior will be tested
3.7.3 Verify Loan Contact Information is correctly displayed in summary section
3.8 Access Comments Form
3.8.1 Enter Comment and Select Add Button
3.8.2 Verify Comment displayed
3.8.3 Save Comment Form
3.9 Access Credit Report Form
3.9.1 Select Pull Credit
3.9.2 Select Pull New Credit - this step is dependent upon the existence of a test loan
structured to pull test credit report
3.9.3 Verify Credit Report is returned
3.9.4 Select Raw Text button
3.9.5 Verify Credit Report is displayed in raw text format
3.9.6 Select Pull Credit
3.9.7 Select Reissue Credit - this step is dependent upon the existence of a test loan structured
to pull test credit report
3.9.8 Verify Credit Report is returned
3.9.9 Select Raw Text button
3.9.10 Verify Credit Report is displayed in raw text format
3.10 Access DU Response Form
3.10.1 Inspect DU information, if present, to ensure absence of error
3.11 Access Good Faith Estimate Form
3.11.1 Ensure field "818 Yield Spread Premium" is populated and read-only (MPS TD 719)
3.11.2 Exercise Adobe link for 1008 Form Information
3.11.3 Enter GFE Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.11.4 Save GFE Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
3.12 Access HMDA Form
3.12.1 Exercise MSA/MD code link
3.12.2 Enter HMDA Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.12.3 Save HMDA Information - due to adjusting the data values, testing will address invalid
data types and formats at time of save, error dialog behavior will be tested
3.13 Access Rate Lock Info Form
3.14 Access Registration Info Form
3.14.1 Verify data sections that auto populate
3.15 Access Truth In Lending Form
3.15.1 Enter Truth In Lending Information - values shall be adjusted to ensure correct system
behavior for invalid data types, missing data, when combined with the following step,
error dialog behavior will be tested
3.15.2 Exercise the Calculate button - due to adjusting the data values, testing will address
invalid data types and formats at time of calculate, error dialog behavior will be tested
3.15.3 Verify Calculated Information is displayed correctly
3.15.4 Save Truth In Lending Information
3.15.5 Exercise Adobe link for Truth In Lending Information
3.16 Access Loan Data Form
3.16.1 Enter Loan Data Information - values shall be adjusted to ensure correct system behavior
for invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.16.2 Save Loan Data Information - due to adjusting the data values, testing will address
invalid data types and formats at time of save, error dialog behavior will be tested
3.17 Access Apply Conditions Form
3.17.1 Adjust conditions
3.17.2 Save Apply Conditions Form
3.18 Access Product Data Form
3.18.1 Enter Product Data Main Information - values shall be adjusted to ensure correct system
behavior for invalid data types, missing data, when combined with the following step,
error dialog behavior will be tested
3.18.2 Save Product Data Main Information - due to adjusting the data values, testing will
address invalid data types and formats at time of save, error dialog behavior will be tested
3.18.3 Exercise ARM Index Value link
3.18.4 Select Rate Adjustments page in the Product Data Form
3.18.5 Enter New Rate Adjustment - values shall be adjusted to ensure correct system behavior
for invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.18.6 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
3.18.7 Edit Rate Adjustment - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.18.8 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
3.19 Access Closing Agent Form
3.19.1 Exercise New Button
3.19.2 Enter Closing Agent Information - values shall be adjusted to ensure correct system
behavior for invalid data types, missing data, when combined with the following step,
error dialog behavior will be tested
3.19.3 Save Closing Agent Information - due to adjusting the data values, testing will address
invalid data types and formats at time of save, error dialog behavior will be tested
3.20 Access Third Party Form
3.20.1 Exercise all available links upon page
3.20.1.1 Flood
3.20.1.2 Torque
3.20.1.3 Appintell
3.20.1.4 Geocode
3.20.1.5 Terrorist Alert
3.21 Access Escrow Form
3.21.1 Enter Escrow Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.21.2 Save Escrow Information - due to adjusting the data values, testing will address invalid
data types and formats at time of save, error dialog behavior will be tested
3.22 Access Insurance Form
3.22.1 Enter Insurance Information - values shall be adjusted to ensure correct system behavior
for invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.22.2 Save Insurance Information - due to adjusting the data values, testing will address invalid
data types and formats at time of save, error dialog behavior will be tested
3.23 Access MI Form
3.23.1 Enter MI Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.23.2 Save MI Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
3.24 Access Servicer Form
3.24.1 Select Servicer and verify information is populated
3.24.2 Save Servicer Information
3.25 Access Custodian Form
3.25.1 Select Custodian and verify information is populated
3.25.2 Save Custodian Information
3.26 Access Shipments Form
3.26.1 Enter New Shipment - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
3.26.2 Save Shipment - due to adjusting the data values, testing will address invalid data types
and formats at time of save, error dialog behavior will be tested
3.26.3 Edit Shipment - values shall be adjusted to ensure correct system behavior for invalid
data types, missing data, when combined with the following step, error dialog behavior
will be tested
3.26.4 Save Shipment - due to adjusting the data values, testing will address invalid data types
and formats at time of save, error dialog behavior will be tested
3.27 Access Checklist Form
3.27.1 Attempt to move loan from registration queue to underwriting queue
3.27.2 Ensure a message is displayed indicating that the checklist must be completed before the
queue move
3.27.3 Select all checklist items
3.27.4 Save Checklist Form
3.28 Exercise loan movement from registration queue to underwriting queue
4.0 Underwriting Queue
4.1 Access Underwriting queue
4.2 Select Loan for Edit from Underwriting queue
4.3 Access Clear Conditions form
4.3.1 Edit Conditions
4.3.2 Save Conditions
4.3.3 Verify that changes occurred to the conditions form
4.4 Exercise loan movement from underwriting queue to processing queue
5.0 Processing Queue
5.1 Access Processing queue
5.2 Select Loan for Edit from Processing queue
5.3 Exercise loan movement from processing queue to pre-closing coordinator queue
6.0 Pre-Closing Coordinator Queue
6.1 Access pre-closing coordinator queue
6.2 Select Loan for Edit from pre-closing coordinator queue
6.3 Access Doc Prep form
6.4 Access Pre Wire form
6.4.1 Ensure form is read-only
6.5 Access Post Wire form
6.5.1 Ensure form is read-only
6.6 Access Collateral form
6.6.1 Modify information upon form
6.6.2 Save form
6.6.3 Verify changes occurred to the collateral form
6.7 Access Checklist Form
6.7.1 Attempt to move loan from pre-closing coordinator queue to closing audit queue
6.7.2 Ensure a message is displayed indicating that the checklist must be completed before the
queue move
6.7.3 Select all checklist items
6.7.4 Save Checklist Form
6.8 Exercise loan movement from pre-closing coordinator queue to closing audit queue
7.0 Closing Audit Queue
7.1 Access closing audit queue
7.2 Select Show Only My Loans radio button
7.3 Ensure only the user's loans are displayed
7.4 Select Loan for Edit from closing audit queue
7.5 Access HUD form
7.5.1 Enter HUD Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
7.5.2 Save HUD Information - due to adjusting the data values, testing will address invalid
data types and formats at time of save, error dialog behavior will be tested
7.6 Access Final Details form
7.6.1 Verify Information displayed
7.7 Access Checklist Form
7.7.1 Attempt to move loan from closing audit queue to funding post closing queue
7.7.2 Ensure a message is displayed indicating that the checklist must be completed before the
queue move
7.7.3 Select all checklist items
7.7.4 Save Checklist Form
7.8 Exercise loan movement from closing audit queue to funding post closing queue
8.0 Funding Post Closing Queue
8.1 Access funding post closing queue
8.2 Select Loan for Edit from funding post closing queue
8.3 Access Exceptions form
8.3.1 Adjust selected exceptions, ensuring that exceptions are selected
8.3.2 Save Exceptions form
8.3.3 Ensure Exceptions are present upon form
8.4 Access Clear Exceptions form
8.4.1 Select Cleared box for created exception
8.4.2 Save Clear Exceptions form
8.4.3 Verify exception is updated and marked with user id
8.5 Access Exceptions form
8.5.1 Ensure cleared exceptions are marked
8.6 Access SVC Misc form
8.6.1 Enter SVC Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
8.6.2 Save SVC Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
8.7 Access SVC Review form
8.7.1 Exercise Excel Link
8.8 Access Checklist Form
8.8.1 Attempt to move loan from funding post closing queue to servicing transfer queue
8.8.2 Ensure a message is displayed indicating that the checklist must be completed before the
queue move
8.8.3 Select all checklist items
8.8.4 Save Checklist Form
8.9 Exercise loan movement from funding post closing queue to servicing transfer queue
9.0 Servicing Transfer Queue
9.1 Access servicing transfer queue
9.2 Select Loan for Edit from servicing transfer queue
9.3 Exercise loan movement from servicing transfer queue to shipping queue
10.0 Shipping Queue
10.1 Access shipping queue
10.2 Select Loan for Edit from shipping queue
11.0 Closing Audit (revisited)
11.1 Access Closing Audit Queue
11.2 Assign a loan
11.3 Select Loan for edit from Closing Audit queue
11.4 Select Withdraw button
11.5 Select Cancel button on pop up message
11.6 Ensure loan has not been withdrawn
11.7 Select Withdraw button
11.8 Select OK button on pop up message
12.0 Withdraw/Decline Queue
12.1 Access Withdraw/Decline queue
12.2 Select Loan for edit from withdraw/decline queue
12.3 Exercise loan movement from withdraw/decline queue to sales queue
13.0 Sales Queue
13.1 Access sales queue
13.2 Select Loan for edit from sales queue
13.3 Exercise loan movement from sales queue to registration queue
14.0 Registration Queue (revisited)
14.1 Access Registration queue
14.2 Select Show Only My Loans radio button
14.3 Ensure loan moved from sales queue is present
End of current MPS regression testing activities.
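The recurring "enter/save with adjusted values" steps above all follow one data-driven pattern: feed a form valid values, missing data, and invalid data types, then confirm a successful save versus an error dialog. A minimal sketch of that pattern follows, assuming a hypothetical `validate_and_save` handler; the field names and validation rules are illustrative, not taken from the MPS specification.

```python
# Hypothetical save handler standing in for an MPS form; the field
# names and validation rules are illustrative, not from the MPS spec.
def validate_and_save(form):
    errors = []
    if not form.get("borrower_name"):
        errors.append("missing data: borrower_name")
    if not isinstance(form.get("loan_amount"), (int, float)):
        errors.append("invalid data type: loan_amount")
    # An error dialog is expected at time of save whenever validation fails.
    return ("error dialog", errors) if errors else ("saved", [])

# One valid case, one missing-data case, one invalid-type case.
cases = [
    ({"borrower_name": "A. Borrower", "loan_amount": 200000}, "saved"),
    ({"borrower_name": "", "loan_amount": 200000}, "error dialog"),
    ({"borrower_name": "A. Borrower", "loan_amount": "2e5"}, "error dialog"),
]
for form, expected in cases:
    outcome, _ = validate_and_save(form)
    assert outcome == expected
```

Each enter/save pair in the coverage list above corresponds to one such table of cases against the relevant form's fields.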
Areas specifically targeted for this release, as defined by Lydian Data Services Mortgage Process System Use-Case
Specification: MRG Integration, are outlined in the following (all actions listed occur in the MPS closing audit
queue).
1.0 Access System
1.1 Log in to MPS system
2.0 Closing Audit Queue
2.1 Access 1003 Form
2.1.1 Enter 1003 Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.1.2 Save 1003 Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.2 Access Product Data Form
2.2.1 Enter Product Data Main Information - values shall be adjusted to ensure correct system
behavior for invalid data types, missing data, when combined with the following step,
error dialog behavior will be tested
2.2.2 Save Product Data Main Information - due to adjusting the data values, testing will
address invalid data types and formats at time of save, error dialog behavior will be tested
2.2.3 Exercise ARM Index Value link
2.2.4 Select Rate Adjustments page in the Product Data Form
2.2.5 Enter New Rate Adjustment - values shall be adjusted to ensure correct system behavior
for invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.2.6 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.2.7 Edit Rate Adjustment - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.2.8 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.3 Access Product Data Form
2.3.1 Enter Product Data Main Information - values shall be adjusted to ensure correct system
behavior for invalid data types, missing data, when combined with the following step,
error dialog behavior will be tested
2.3.2 Save Product Data Main Information - due to adjusting the data values, testing will
address invalid data types and formats at time of save, error dialog behavior will be tested
2.3.3 Exercise ARM Index Value link
2.3.4 Select Rate Adjustments page in the Product Data Form
2.3.5 Enter New Rate Adjustment - values shall be adjusted to ensure correct system behavior
for invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.3.6 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.3.7 Edit Rate Adjustment - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.3.8 Save Rate Adjustment - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.4 Access Good Faith Estimate Form
2.4.1 Ensure field "818 Yield Spread Premium" is populated and read-only (MPS TD 719)
2.4.2 Exercise Adobe link for 1008 Form Information
2.4.3 Enter 1008 Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.4.4 Save 1008 Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.5 Access Escrow Form
2.5.1 Enter Escrow Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog behavior will be tested
2.5.2 Save Escrow Information - due to adjusting the data values, testing will address invalid
data types and formats at time of save, error dialog behavior will be tested
2.6 Access MI Form
2.6.1 Enter MI Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.6.2 Save MI Information - due to adjusting the data values, testing will address invalid data
types and formats at time of save, error dialog behavior will be tested
2.7 Access HUD form
2.7.1 Enter HUD Information - values shall be adjusted to ensure correct system behavior for
invalid data types, missing data, when combined with the following step, error dialog
behavior will be tested
2.7.2 Save HUD Information - due to adjusting the data values, testing will address invalid
data types and formats at time of save, error dialog behavior will be tested
2.8 Access Pre Wire form
2.8.1 Ensure form is read-only
2.9 Access Doc Prep form
2.9.1 Access Title Info Tab
2.9.1.1 Exercise Fields upon Title Info
2.9.1.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.1.3 Save
2.9.2 Access Vesting Tab
2.9.2.1 Exercise Fields upon Vesting
2.9.2.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.2.3 Save
2.9.3 Access Land lease Tab
2.9.3.1 Exercise Fields upon Land lease Tab
2.9.3.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.3.3 Save
2.9.4 Access Settlement Tab
2.9.4.1 Exercise Fields upon Settlement Tab
2.9.4.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.4.3 Save
2.9.5 Access Package Tab
2.9.5.1 Exercise Fields upon Package Tab
2.9.5.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.5.3 Save
2.9.6 Access Addl Vesting Tab
2.9.6.1 Exercise Fields upon Addl Vesting Tab
2.9.6.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.6.3 Save
2.9.7 Access Seller Tab
2.9.7.1 Exercise Fields upon Seller Tab
2.9.7.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.7.3 Save
2.9.8 Access Addl Seller Tab
2.9.8.1 Exercise Fields upon Addl Seller Tab
2.9.8.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.8.3 Save
2.9.9 Access CEM Tab
2.9.9.1 Exercise Fields upon CEM Tab
2.9.9.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.9.3 Save
2.9.10 Access COOP Tab
2.9.10.1 Exercise Fields upon COOP Tab
2.9.10.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.10.3 Save
2.9.11 Access UW Comments Tab
2.9.11.1 Exercise Fields upon UW Comments Tab
2.9.11.2 Verify Field Limits and Formats (dependent upon additional documentation)
2.9.11.3 Save
2.9.12 Enter Email To: field
2.9.13 Select Disclose Button
2.9.14 Verify GUI response
2.9.15 Verify Email Received
MPS Load Testing activities, specific to this project, shall provide the following.
Measurement of MPS processing time for document generation as reported in MPS in relation to user load
over time.
Timings will be taken between use case steps 2.9.13 and 2.9.14.
Testing is envisioned to occur as follows:
1 User submits a document generation request every 5 minutes for a period of one hour
1 User submits a document generation request every 2.5 minutes for a period of one hour
1 User submits a document generation request every minute for a period of one hour
5 Users submit a document generation request every 5 minutes for a period of one hour
5 Users submit a document generation request every 2.5 minutes for a period of one hour
5 Users submit a document generation request every minute for a period of one hour
10 Users submit a document generation request every 5 minutes for a period of one hour
10 Users submit a document generation request every 2.5 minutes for a period of one hour
10 Users submit a document generation request every minute for a period of one hour
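The nine load cells above form a (users, request interval) matrix. The sketch below only derives the resulting request rates for each cell; `requests_per_hour` is an illustrative helper for sizing the runs, not part of the MPS tooling.

```python
# The nine load cells: users x minutes-between-requests, each cell
# run for a period of one hour.
SCHEDULE = [(users, interval) for users in (1, 5, 10)
            for interval in (5, 2.5, 1)]

def requests_per_hour(users, interval_minutes):
    """Total document generation requests issued per hour in one cell."""
    return int(users * 60 / interval_minutes)

for users, interval in SCHEDULE:
    print(f"{users} user(s) @ {interval} min intervals -> "
          f"{requests_per_hour(users, interval)} requests/hour")

# Heaviest cell: 10 users at 1-minute intervals -> 600 requests/hour.
assert requests_per_hour(10, 1) == 600
```

Ramping the cells from lightest (12 requests/hour) to heaviest (600 requests/hour) lets the document generation timings be plotted against load, as in the graph below.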
An example graphical representation of a one-user load is depicted below.
Figure 1.0 Example One User Load Graph: response time (seconds) versus requests per hour per user, plotted for Build 1, Build 2, and Build 3.