Automation Strategy
Signature:
Name:
Date: MM/DD/YYYY
Controlled copy Test Automation Plan
Table Of Contents
I. Overview
   A. Purpose & Scope
   B. Objective
   C. Test Automation Scope
II. Automation Approach
   A. Data Driven
   B. Script Development
   C. Automation and Conventions
   D. Test Automation Standards
   E. Automation Matrix
   F. Script Maintenance
   G. Evaluation of Errors and Failures
   H. Test Automation Environment
   I. Test Data
III. Problem Reporting and Data Recording
IV. Assumptions, Dependencies, Concerns and Risks
V. Automation Team Roles and Responsibilities
VI. Milestones and Deliverables
VII. Test Team Training Requirements
   A. Training Schedules for the Offshore Team
VIII. References and Glossary
IX. Appendix A
X. Change Log
I. Overview
A. Purpose & Scope
The purpose of this document is to define a plan for the test automation of the <Project Name> project. The scope of this document includes the following:
2) Automation Checklist
B. Objective
The objective of test automation is to automate the following modules present in <Project> – Item Groups:
• <Module 1>
• <Module 2>
• <Module 3>
• <Module 4>
• <Module 5>
• <Module 6>
2. The offshore QA testing team will also do test scripting, verification, and validation of results in coordination with the onsite testing team.
II. Automation Approach
The automation project will follow the Module Centric Approach as its framework, comprising a Requirements phase, a Module phase, a Design phase, Script Creation, Script Review, and Acceptance Testing.
All the test cases identified as part of this project will be translated into automated scripts. Since re-usability and modularity are key goals of the automation effort, we have created an automation framework called the Module Centric Approach. The main objectives of this approach are to increase code reusability, reduce code redundancy, and ease maintenance.
Similar functions are written as reusable actions and will be called from the script. Functions will also be written to perform alternative or negative steps, to verify negative and error conditions in the test cases. Rather than retrieving data from within a function, arguments and data will be passed as parameters in the function call. Making the scripts function-centric improves the automation team's ability to reuse code. All steps will be documented in the function for reference. Any prerequisites for the function to run successfully will be documented in the design document and in the comments in the function header.
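The parameter-passing convention above can be sketched as follows. This is a Python illustration of the principle only; a real script for the tool would be written in its own scripting language, and the function names, screen fields, and error text here are hypothetical:

```python
# Reusable action: the data is passed in as parameters rather than
# retrieved from inside the function, so any script can call it with
# its own data row.
def fn_create_item_group(group_name, group_code):
    # Prerequisite (documented in the header per the standard):
    # the Item Group screen must already be open.
    # Step 1: populate the screen fields from the parameters.
    screen = {"Name": group_name, "Code": group_code}
    # Step 2: return the resulting state for verification by the caller.
    return screen

# Negative-path variant: exercises the error condition instead of the
# happy path, as called for in the test cases.
def fn_create_item_group_negative(group_name, group_code):
    if not group_name:
        return "Error: group name is mandatory"
    return "OK"
```

Because the data arrives through the function call, the same function serves every test case that needs this flow; only the parameters change.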
The Datatable is a feature of the tool that will be used extensively to automatically read from a data file and populate screen fields with data during run-time. Data-driven testing will be done by maintaining a data file (an .xls file) containing the input data for the corresponding test case. This Excel file is divided into sections; each section provides the number of records to be added and the data itself. The data file name will be documented in the script for reference.
All scripts will have appropriate verification points. Verification points for key intermediate functional results or states in the flow will be inserted between steps and at the end of a script to verify the expected results. The verification points should also check for every error condition that can arise from the UI.
On execution of each test script, the test log will be used to review which verification points passed and which did not.
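The verification-point logging described above can be sketched as follows. This is a Python illustration only; the helper name, log structure, and messages are our own, and the actual tool records verification results in its own test log:

```python
# Minimal sketch of a verification point that records its outcome in a
# test log, so pass/fail results can be reviewed after execution.
test_log = []

def verify(description, expected, actual):
    status = "PASS" if expected == actual else "FAIL"
    test_log.append((description, status))
    return status == "PASS"

# Intermediate verification between steps ...
verify("Item group count after add", 5, 5)
# ... and a final verification at the end of the script.
verify("Status message", "Saved", "Error")

# After the run, the log shows which verification points did not pass.
failures = [entry for entry in test_log if entry[1] == "FAIL"]
```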
Entry Criteria
Once the test cases (scenarios) for automation are identified from the manual test cases, the test criteria are defined, and the application under test is ready for recording, the automation process can start.
Exit Criteria
Once acceptance testing for a script is complete and the status of the request is changed to Closed, the automation process stops.
A. Data Driven
A data-driven approach will be followed in the preparation and execution of scripts.
As mentioned earlier in this document, all the major functional flows to be automated using QTP will have one or more data tables created for screen input such as dates, names, etc. The script can read from these tables and enter the values on screen during run-time.
A sample Datatable used for searching the item group is attached in the appendix for reference.
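The data-driven loop can be sketched as follows. This is a Python stand-in: a real run would read the rows from the tool's own Datatable (.xls), and the column names and data here are illustrative:

```python
import csv
import io

# Stand-in for the .xls data file: each row holds the input data for
# one iteration of the test case.
data_file = io.StringIO(
    "GroupName,GroupCode\n"
    "Hardware,HW01\n"
    "Stationery,ST02\n"
)

results = []
for row in csv.DictReader(data_file):
    # Populate the search fields from the current data row at run-time.
    results.append("searched %s (%s)" % (row["GroupName"], row["GroupCode"]))
```

The same script body is thus executed once per data row, which is what makes the flow data-driven rather than hard-coded.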
B. Script Development
The automation scripts will be created in a modular fashion, which means scripts will be made re-usable. This also means breaking the functionality down into user-defined functions to make debugging and maintenance easier. The table below is an example of how a product-specific test case script can be broken down into child and parent scripts.
The <Project> module contains about <number> test cases. In the module phase a detailed analysis will be done, and the test cases will be modularized based on functions. Functions that are distinct to the test cases will be developed as parent scripts, and functions that are common to most of the test cases will be developed as child scripts.
Example:
FnTrim function
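The parent/child split can be illustrated as follows. These are Python stand-ins: FnTrim is the utility named in this plan, while the parent-script logic and names are hypothetical:

```python
# Child script: a function common to most test cases, callable from
# any parent script (FnTrim is the example named in this plan).
def fn_trim(value):
    return value.strip()

# Parent script: logic distinct to one test case, composed from
# child functions.
def parent_add_item_group(raw_name):
    name = fn_trim(raw_name)            # common behaviour via the child script
    return "added item group: " + name  # test-case-specific step
```

A change to common behaviour (e.g. trimming rules) is then made once, in the child script, instead of in every test case script.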
Design Document
A design document will be prepared for every script as part of the design phase. It includes the scope of the script, purpose, group strategy and logic frame, module descriptions, prerequisites, module flow, script flow, developer notes, and references.
The Design Document template has to be followed for all scripts.
Script Flow
A script flow will be created for every script, for better understanding, and will be attached to the design document.
Module Flow
A module flow will be created for every script, for better understanding, and will be attached to the design document.
Script Readability
The script, whether written or recorded, should be readable and presentable, so that the flow of the script can be traced easily. To achieve these goals, the following scripting standards apply:
• Every logical step should be preceded by a comment, and comments should follow the comment standards.
• Indentation should be strictly followed.
• Every script should have a script-level comment clearly stating the test case emulated by the script, the author, preconditions, parameters (if applicable), post-conditions, and a brief description. The script header template for this comment must be used consistently across all scripts.
Script Modularity
Modularizing scripts helps script maintainability. The following standards are followed for modularizing the scripts:
• Each logically separate operation should be placed in a user-defined procedure, function, or script. These sub-scripts can then be called wherever needed.
Exception Handling
To maintain a smooth flow of the script, all unexpected behaviors of the application must be handled during playback of the GUI script. The standards followed are:
• After every step, allow for the possibility of a warning or error message popup. An if-then-else structure should be used to handle it, so that no application-specific popup is treated as an unexpected active window by the tool.
• The existence of a window should be verified before any operation on that window, and a sufficient timeout should be used.
• The necessary verification points should be included to check whether each step was successful.
• There should be no possibility of infinite loops in the script. Typical risk spots are loops that wait for the application to respond, such as waiting for a tab to appear. In every scenario the script should terminate naturally and, if the test case fails, log the probable reason for the failure. Keeping a timeout on such loops ensures this.
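The timeout rule above can be sketched as follows. This is a Python illustration of the pattern only; the helper name and log messages are our own, and GUI tools typically provide an equivalent built-in wait-with-timeout:

```python
import time

# Wait loop with a timeout, so the script can never hang forever while
# waiting for the application to respond (e.g. for a tab to appear).
def wait_for(condition, timeout_s=10.0, poll_s=0.1):
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll_s)
    return False  # terminate naturally instead of looping forever

log = []
appeared = wait_for(lambda: True)  # window already exists
if not wait_for(lambda: False, timeout_s=0.3):
    # if-then-else handling: log the probable reason for failure
    log.append("FAIL: window did not appear within timeout")
```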
A consistent naming convention in scripting will improve readability. A description of what each script does, along with other comments where necessary, will be given at the beginning of every script.
All user-interaction recordings will be carried out consistently with mouse-click actions and a few keyboard actions (where required), and menus and combo-box items should be identified by name, not by ID. Recordings will not be based on keyboard actions because no hot keys are available for the screen-level controls, which would make screen navigation via the keyboard cumbersome. Recording will be carried out on a known desktop configuration, which will be documented so that the same settings can be duplicated on another machine.
#######################################################
Test Tool/Version        :
Test Tool Settings       :
Recorded Browser/Version :
Function Automated       :
Preconditions            : What must be set before the script runs
Post-conditions          : The state of the system when the script completes
Test Case Automated      :
Parameterization Done    :
#######################################################
E. Automation Matrix
The list of automation tasks that have been identified is given in the document Automation Matrix.doc.
F. Script Maintenance
Automated test scripts carry a high maintenance overhead: they have to be kept continuously up to date, or hundreds of hours of work may have to be abandoned because too many changes to the application have accumulated to make modifying the test scripts worthwhile. It is therefore important that the test scripts are kept up to date.
Thus, once the scripts are created, their changes need to be version-controlled.
When WinRunner is connected to a TestDirector (a defect-tracking/test-management tool) project with version-control support, one can update and revise the automated test scripts while maintaining old versions of each test. This makes it possible to keep track of the changes made to each test script, to see what was modified from one version of a script to another, and to return to a previous version of the test script.
Here, we describe how to version-control the scripts using CM Synergy, a third-party version-control tool. The process is similar if another tool, such as VSS or CVS, is used.
1. The requester submits a change request in accordance with the "Script Request Process".
2. The automation engineers review the change request and assign it to an appropriate engineer. To determine who can best address the request, the following items are taken into account:
4. The engineer performs the changes to the script and unit-tests it.
5. If the unit test passes, the status of the automation request in TestDirector is changed to "Development Completed"; otherwise, the process continues from step 4.
7. If the status of the automation request is Closed, the test case status in TestDirector is changed to "Baselined".
8. The scripter then checks in all the modified scripts with an updated readme.
9. The status of the test case will be changed to "Baselined" once the automation of the test case is complete.
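The request lifecycle implied by these steps can be summarized as a simple state machine. This Python sketch is our own illustration: the status names come from the workflow above, but the transition table itself is an assumption:

```python
# Allowed status transitions for an automation request, as implied by
# the maintenance workflow above (the mapping is illustrative).
TRANSITIONS = {
    "Submitted": {"Assigned"},
    "Assigned": {"Development Completed", "Assigned"},  # loops back on unit-test failure
    "Development Completed": {"Closed"},
    "Closed": {"Baselined"},
}

def advance(current, new):
    # Reject any status change the workflow does not allow.
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError("illegal transition: %s -> %s" % (current, new))
    return new
```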
G. Evaluation of Errors and Failures
Some of the failure conditions during script execution have been tabulated for reference.

S. No | Symptom                                                               | Problem Description                                                              | Possible Action                                                                                                    | Remarks
1     | The script stops abruptly                                             | Failure to find a window                                                         | Re-run the script using a breakpoint.                                                                              |
      |                                                                       | The system resources are low                                                     | Re-boot the machine.                                                                                               |
      |                                                                       | Failure to find a file or directory (as defined in the Excel spreadsheet)        | Correct the spreadsheet and re-start the script, or put the files in the correct location and re-start the script. |
2     | The script runs to completion, but there are context-sensitive errors | Failure of a verification point                                                  | Analyze the failure to confirm whether it is a defect, and log the defect in TestDirector. The verification point may need to be overwritten, as the baseline may have changed; fix the problem by updating the baseline. |
      |                                                                       | Break in flow due to a failure to identify a UI feature or control on the screen | Fix the problem by tweaking the script or by re-running it.                                                        |
H. Test Automation Environment
User IDs will be set up by the administrator so that the team can log in and work on the repository from the QA machine.
The test environment for executing the scripts has to be set up as per the notes available in the design document of every script.
I. Test Data
• The offshore team will prepare the Datatable based on the test data required for automation.
• The tool, with working licenses, should be available so that the environment can be set up and the scripts run.
• Onsite team leads have to inform the offshore team about changes in the test cases, test data, and GUI layouts.
• The offshore testing team has to update the Datatable if there is any change in the test data.
VIII. References and Glossary
UI – User Interface
IX. Appendix A
X. Change Log