
Controlled copy

Automation Strategy

                    Prepared By      Reviewed By      Approved By
Name
Role
Signature
Date (MM/DD/YYYY)

<Project Name>_Automation Project Sign-Off

Signature
Name
Date (MM/DD/YYYY)

Table Of Contents
I. Overview
   A. Purpose & Scope
   B. Objective
   C. Test Automation Scope
II. Automation Approach
   A. Data Driven
   B. Script Development
   C. Automation and Conventions
   D. Test Automation Standards
   E. Automation Matrix
   F. Script Maintenance
   G. Evaluation of Errors and Failures
   H. Test Automation Environment
   I. Test Data
III. Problem Reporting and Data Recording
IV. Assumptions, Dependencies, Concerns and Risks
V. Automation Team Roles and Responsibilities
VI. Milestones and Deliverables
VII. Test Team Training Requirements
   A. Training Schedule for the Offshore Team
VIII. References / Glossary
IX. Appendix A
X. Change Log

I. Overview

A. Purpose & Scope

The purpose of this document is to define the plan for test automation of the <Project Name> project. The scope of this document includes the following:

1) Plan for Test Automation and coverage

2) Automation Checklist

3) Plan for Environment

B. Objective
The objective of test automation is to automate the following modules present in the <Project> Item Groups:

• <Module 1>

• <Module 2>

• <Module 3>

• <Module 4>

• <Module 5>

• <Module 6>

C. Test Automation Scope

1. The test automation environment will be set up at offshore by the offshore QA testing team.

2. The offshore QA testing team will also do test scripting, verification, and validation of results, in coordination with the onsite testing team.

3. <Mercury's Automation Tools> will be used for automation.

4. The automation testing scope is limited to playback of the scripts on the following tools and operating systems:

Tool        | Tool Version | Operating System
<Tool Name> | <Version>    | Windows XP/NT/2000

II. Automation Approach

The automation project will follow the Module Centric Approach as its framework. It will have a Requirements Phase, Module Phase, Design Phase, Script Creation, Script Review, and Acceptance Testing.

 The Requirements Phase involves gathering automation test requirements from the manual test cases. This involves identifying test cases for automation and prioritizing them based on need. The complexity factor for each script is also defined. The Automation Project Plan and Automation Request Matrix are the deliverables at the end of this phase.
 The Module Phase involves grouping similar test cases and analyzing the functions performed in each group. Common functions are written as a single function that can be reused across scripts. The module flow diagram and script flow diagram will be incorporated in the design document.
 The Design Phase involves creation and review of the design document for all automation tasks. The deliverable for this phase is the Automation Design Document.
 The Script Creation phase will start once the design review is complete. Script creation will be done at offshore and involves creating scripts for all modules based on the priorities. QTP 6.5 will be used to create the scripts; if a higher version of the tool is found appropriate, it will be used instead.
 Unit testing will be performed by the developer with sample data.
 Script review will be done at offshore. The respective leads and automation leads at offshore will perform the review.
 Acceptance testing will be done by the onsite leads. The respective leads will perform acceptance testing based on real-time data. Training will be provided to the entire onsite team once acceptance testing is completed.

All the test cases identified as part of this project will be translated into automated scripts. Since re-usability and modularity are key goals of the automation effort, we have created an automation framework called the Module Centric Approach. The main objectives of this approach are to increase code reuse, reduce code redundancy, and ease maintenance.

Similar functions are written as reusable actions and will be called from the scripts. Functions will also be written to perform alternative or negative steps to verify negative and error conditions in the test cases. Rather than retrieving data from within a function, arguments and data will be passed as parameters in the function call. Making the scripts function-centric improves the automation team's ability to reuse the code. All steps will be documented in the function for reference. Any pre-requisites for the function to run successfully will be documented in the design document and in the comments in the function header.
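
For illustration, a minimal sketch of such a parameterized function follows. The window and field names (Login, User Name:, and so on) are assumptions for the example, not names taken from the actual application.

' FnLogin: logs in to the application.
' Pre-requisite (per the function-header standard): the Login window is open.
' Data is passed in as parameters rather than read inside the function.
Function FnLogin(sUserName, sPassword)
    If Window("Login").Exist(10) Then
        Window("Login").WinEdit("User Name:").Set sUserName
        Window("Login").WinEdit("Password:").Set sPassword
        Window("Login").WinButton("OK").Click
        FnLogin = True
    Else
        Reporter.ReportEvent micFail, "FnLogin", "Login window not found"
        FnLogin = False
    End If
End Function

A calling script can then drive the function from the Datatable, for example:

blnLoggedIn = FnLogin(DataTable("UserName", dtGlobalSheet), DataTable("Password", dtGlobalSheet))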

The Datatable is a feature of the tool that will be used extensively to automatically read from a data file and populate screen fields with data during run-time.

Data driving will be done by maintaining a data file (an .xls file) containing the input data for the corresponding test case. This Excel file is divided into sections, and each section provides the number of records to be added and the data itself. The data file name will be documented in the script for reference.

There will be a master script (action) calling all the child scripts (actions).
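
A minimal sketch of such a master action is shown below; the child action names are illustrative.

' Master action: drives the child actions in sequence.
' oneIteration runs each child action once; data-driven children
' iterate internally over their Datatable rows.
RunAction "Login", oneIteration
RunAction "SearchItemGroup", oneIteration
RunAction "VerifyResults", oneIteration
RunAction "Logout", oneIteration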

All scripts will have appropriate verification points. Verification points for key intermediary functional results or states in the flow will be inserted between steps, and at the end of a script, to verify the expected results. The verification points should also check for every error condition possible from the UI.

After execution of each test script, the test log will be used to verify the results: which verification points passed and which items did not pass.
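
A hedged sketch of a scripted verification point follows; the object names and the ExpectedStatus Datatable column are assumptions for the example.

' Verification point: compare an on-screen value with the expected
' value from the Datatable and write the outcome to the test log.
sExpected = DataTable("ExpectedStatus", dtLocalSheet)
sActual = Window("ItemGroups").WinEdit("Status").GetROProperty("text")
If sActual = sExpected Then
    Reporter.ReportEvent micPass, "VP_Status", "Status is '" & sActual & "'"
Else
    Reporter.ReportEvent micFail, "VP_Status", _
        "Expected '" & sExpected & "' but found '" & sActual & "'"
End If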

Entry Criteria

Once the test cases (scenarios) for automation are identified from the manual test cases, the test criteria are defined, and the application under test is ready for recording, the automation process can start.

Exit Criteria

Once acceptance testing for a script is complete and the status of the request is changed to Closed, the automation process for that script stops.

A. Data Driven

A data-driven approach will be followed in the preparation and execution of scripts.

As mentioned earlier in this document, all the major functional flows to be automated using QTP will have one or more Datatables created for screen input such as dates, names, etc. The script can read from the table and input to the screen during run-time.

The Datatable feature is available in QTP under Test > Settings > Resources > Data Table. The script will read the datatable name and the field name and populate the corresponding edit controls on the screen.

A sample Datatable used for searching the Item group is attached in the appendix for
reference.
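
A minimal sketch of reading such a data file at run-time is shown below; the file path, sheet names, and object names are assumptions for the example.

' Import the external Excel data file into the run-time Datatable,
' then loop through the rows and drive the search screen.
DataTable.ImportSheet "C:\AutomationData\ItemGroupSearch.xls", "SearchData", "Action1"
For i = 1 To DataTable.GetSheet("Action1").GetRowCount
    DataTable.GetSheet("Action1").SetCurrentRow i
    Window("ItemGroups").WinEdit("Search").Set DataTable("ItemGroupName", "Action1")
    Window("ItemGroups").WinButton("Find").Click
Next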

B. Script Development

The automation scripts will be created in a modular fashion; that is, scripts will be made re-usable. This also means breaking the functionality down into user-defined functions to make debugging and maintenance easier. The table below is an example of how a product-specific test case script can be broken down into parent and child scripts.

The <Project> module contains about <number> test cases. In the Module Phase, a detailed analysis will be done and the test cases will be modularized based on functions. Functions that are distinct to a test case will be developed as parent scripts, and functions that are common to most of the test cases will be developed as child scripts.

Example:

Test Automation Document Name | Major Flow Name (Parent Script Name) | Major functions that occur more than once (Child Script) | Script Names
                              |                                      | FnTrim function                                          |
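
As an illustration, a child script such as the FnTrim function above could be a small utility; the following is a sketch, not the actual implementation.

' FnTrim: utility child function that strips leading and trailing
' spaces from a value read from the screen or the Datatable.
Function FnTrim(sValue)
    FnTrim = Trim(sValue)
End Function

A parent script would call it, for example, as:

sName = FnTrim(Window("ItemGroups").WinEdit("Name").GetROProperty("text"))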

C. Automation and Conventions

Design Document

A Design Document will be prepared for all the scripts as part of the Design Phase. It includes the scope of the script, purpose, group strategy and logic frame, module descriptions, pre-requisites, module flow, script flow, developer notes, and references.

The Design Document template has to be followed for all the scripts.

Script Flow

The script flow will be created for all the scripts for better understanding and will be attached to the design document.

Module Flow

The module flow will be created for all the scripts for better understanding and will be attached to the design document.

D. Test Automation Standards

The Scripting standards have to be followed for all the scripts.

Script Readability
The script written/recorded should be readable and presentable; the flow of the script should be easy to trace. To achieve these goals, the following scripting standards apply:
 Every logical step should be preceded by a comment. Comments should follow the comment standards.
 Indentation should be strictly followed.
 Every script should have a script-level comment clearly mentioning the test case that is emulated by the script, the author, preconditions, parameters (if applicable), post-conditions, and a brief description. The script header template for this comment must be maintained across all the scripts.

Script Modularity
Modularization of scripts helps script maintainability. The following standards are followed for modularizing the scripts:
 Each logically separate operation should be in a user-defined procedure, function, or script. These sub-scripts can be called wherever needed.

 Subroutines are of three types, based on the operation performed. Different domains of functionality exist in the application; accordingly, the common subroutines are classified as follows (see the sketch after this list):
• General purpose subroutines, which are not domain specific and are shared between the domains.
• Domain specific subroutines, which are specific to one domain and are used for emulating operations within the boundary of that domain. Only the scripts that touch this particular domain will need to use these subroutines.
• Common functions or utilities, which are at a lower layer of abstraction and cover a single or small piece of functionality.
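
A hedged sketch of this layering is given below; all function, window, and menu names are assumptions for the example.

' General purpose subroutine: not domain specific, shared between domains
Function FnSelectMenu(sMenuPath)
    Window("MainApp").WinMenu("Menu").Select sMenuPath
    FnSelectMenu = True
End Function

' Domain specific subroutine: Item Groups domain only
Function FnOpenItemGroupSearch()
    FnOpenItemGroupSearch = FnSelectMenu("Items;Item Groups;Search")
End Function

' Common utility: lower layer of abstraction, a single small piece of
' functionality (compare the FnTrim example under Script Development)
Function FnPadValue(sValue, iLength)
    FnPadValue = Left(sValue & Space(iLength), iLength)
End Function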

Exception Handling
To maintain a smooth flow of the script, one should ensure that all unexpected behaviors of the application are handled during playback of the GUI script. The standards followed are listed below (a sketch follows the list):
 After every step, care should be taken to handle any possible warning/error message popup. An if-then-else structure should be used to ensure this. No application-specific popup should be treated as an unexpected active window by the tool.
 The existence of a window should be verified before any operation on that window, with a sufficient timeout.
 The necessary verification points should be included to check whether each step was successful.
 There should be no possibility of infinite loops in the script. Places where such a possibility exists are loops that simulate waiting for the application to respond, such as waiting for a tab to appear. In any scenario, the script should terminate naturally and should log the possible reason for failure in case the test case fails. Keeping a timeout for such loops ensures this.
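
The following sketch shows these standards applied in a QTP script; the window, button, and tab names are assumptions for the example.

' Verify the window exists (with a timeout) before operating on it
If Window("ItemGroups").Exist(30) Then
    Window("ItemGroups").WinButton("Save").Click
    ' Handle a possible warning popup with if-then-else rather than
    ' letting it surface as an unexpected active window
    If Window("Warning").Exist(5) Then
        Reporter.ReportEvent micWarning, "Save", "Warning popup handled"
        Window("Warning").WinButton("OK").Click
    End If
Else
    ' Terminate naturally and log the probable reason for failure
    Reporter.ReportEvent micFail, "Save", "ItemGroups window not found"
    ExitTest
End If

' Bounded wait loop: the counter guarantees the loop cannot run forever
iWait = 0
Do While Not Window("ItemGroups").WinTab("Details").Exist(1)
    iWait = iWait + 1
    If iWait > 30 Then
        Reporter.ReportEvent micFail, "Wait", "Details tab never appeared"
        ExitTest
    End If
Loop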

Consistent naming conventions in scripting will improve readability. A description of what each script does, and other comments wherever necessary, will be given at the beginning of every script.

All user interaction recordings will be carried out consistently with mouse click actions and a few keyboard actions (wherever required), and identification of menus and combo box items should be by name and not by ID. Recordings will not be based on keyboard actions because no hot keys are available for screen-level controls, so screen navigation using the keyboard would be cumbersome. The recording will be carried out on a known desktop configuration, which will be documented so that the same settings can be duplicated on another machine.

Format of the header

Note: the header lines should be commented.

#######################################################################
Test Tool/Version        :
Test Tool Settings       :
Recorded Browser/Version :
Function Automated       :
Preconditions            : What must be set before the script runs
Post-conditions          : The state of the system when the script completes
Test Case Automated      :
Parameterization Done    :
Author                   : Testing Services
Script Name              :
Script Created On        :
Reviewed By              :
Review Comment           :
Modified On              :
Modified Comment         :
#######################################################################

E. Automation Matrix

The list of automation tasks that have been identified is given in the accompanying document, Automation Matrix.doc.

F. Script Maintenance

There is a high maintenance overhead for automated test scripts, as they have to be kept continuously up to date; otherwise, hundreds of hours of work may be abandoned because too many changes have been made to the application to make modifying the test scripts worthwhile. It is therefore important that the test scripts are kept up to date.

Thus, once the scripts are created, their changes need to be version controlled. When WinRunner is connected to a TestDirector (a defect tracking/test management tool) project with version control support, one can update and revise the automated test scripts while maintaining old versions of each test. This helps keep track of the changes made to each test script, see what was modified from one version of a script to another, or return to a previous version of the test script.

Here, we describe how to version control the scripts using CM Synergy, a third-party version control tool. The process would be similar if another tool, such as VSS or CVS, were used.
1. The requester submits a change request in accordance with the "Script Request Process".
2. The automation engineers review the change request and assign it to an appropriate engineer. To determine who can best address the request, the following items are taken into account:
   a. Each developer's availability
   b. The level of effort of the change request
   c. Each developer's familiarity with similar requests
   d. Each developer's familiarity with the module(s) to which the script/tool applies
   e. For updates to existing scripts, each developer's familiarity with the script's source language
3. The scripter then checks out the latest version of the scripts from CM Synergy and marks the "Automation Request" status in TestDirector as "In Development".
4. The scripter performs the changes to the script and does a unit test of the same.
5. If the unit test passes, the status of the automation request in TestDirector is changed to "Development Completed"; otherwise, continue from Step 4.
6. Acceptance testing will be performed by the person who raised the automation request. If it passes, the status of the automation request in TestDirector is changed to "Closed"; if it fails, start again from Step 4.
7. If the status of the automation request is "Closed", the test case status in TestDirector is changed to "Baselined".
8. The scripter then checks in all the modified scripts, with an updated readme document, to CM Synergy and makes the scripts ready for use.
9. The status of the test case will be changed to "Baselined" once automation of the test case is complete.

G. Evaluation of Errors and Failures

Some of the failure conditions during script execution are tabulated below for reference.

S. No | Symptom | Problem Description | Possible Action | Remarks
1 | The script stops abruptly | Failure to find a window | Re-run the script using a breakpoint |
  |  | The system resources are low | Re-boot the machine |
  |  | Failure to find a file or directory (as defined in the Excel spreadsheet) | Correct the spreadsheet and re-start the script, or put the files in the correct location and re-start the script |
2 | The script runs to completion but there are context-sensitive errors | Failure of a verification point | Analyze the failure to confirm whether it is a defect and log the defect in TestDirector. The verification point may need to be overwritten if the baseline has changed; fix the problem by updating the baseline. |
  |  | Break in flow due to a failure to identify a UI feature or control on the screen | Fix the problem by tweaking the script or by re-running it |

H. Test Automation Environment


Licensed <Tool> to be installed on Offshore QA machine with at least 128 MB RAM and
at least 5 GB hard disk space should be available.

User ids will be setup by the administrator to login and work on the repository from the
QA machine.

Test environment to execute the scripts has to be performed as per the notes available
in the design document of every script.

Recording and Execution of the script will be done as mentioned below



Tool                  | Tool Version | Operating System
<Mercury's tool name> | <Version>    | Windows XP/NT/2000

I. Test Data

 Test data will be taken from the database.

 The onsite team will provide test data on request.

 The offshore team will prepare the Datatable, based on the test data, for automation needs.

III. Problem Reporting and Data Recording

 The results of the script execution of the <Project> application test cases will be reported in the log files of the tool used for automation.
 The results of the script execution of the component cases will be reported as defect files.

IV. Assumptions, Dependencies, Concerns and Risks

The tool, with working licenses, should be available for the environment to be set up and the scripts to be run.

The onsite team leads have to inform the offshore team about changes in the test cases, test data, and GUI layouts.

The offshore testing team has to update the Datatable if there is any change in the test data.

V. Automation Team Roles and Responsibilities

Role                 | Responsibilities                                                                     | Name
Onsite Lead          | Reviews design document and scripts; acceptance testing                             |
Onsite Coordinator   | Review and coordination                                                              |
Offshore Coordinator | Review and coordination                                                              |
Offshore Team Member | Develops design document, develops scripts, script modification, acceptance testing |

VI. Milestones and Deliverables

Milestone Task           | Start Date | End Date | Estimated Effort (Person Days) | Deliverables
Automation Requirements  |            |          |                                | Project Plan; Project Plan Approval
Design Document Creation |            |          |                                | Automation design document
Script Creation          |            |          |                                | Scripts
Script Review            |            |          |                                | Approved Scripts
Acceptance Testing       |            |          |                                | Final Signoff
Automation Summary       |            |          |                                | Automation Result; Automation Report

VII. Test Team Training Requirements

Training Requirement | Training Approach | Target Date for Completion | Roles/Resources to be Trained | Trainers
Script Execution and Datatable modification on Tools | Training will be conducted after acceptance testing | | Onsite Team |
Tool Advanced Scripting | Training will be conducted on a need basis | | Offshore Team | Testing Services Automation Team

A. Training Schedule for the Offshore Team

Product Training | Description | Target Date of Completion | Resources to be Trained | Trainers

VIII. References / Glossary

Term | Definition                      | Pertains To
TD   | TestDirector                    | Test management tool
APS  | Ahold Pricing System            | Application
RBRR | Rules Based Recommended Retails | Application
QTP  | QuickTest Professional          | QTP 6.5
GUI  | Graphical User Interface        | Objects in a screen
OS   | Operating System                |
UI   | User Interface                  |

IX. Appendix A

Automation Check List

X. Change Log

Please note that this table needs to be maintained even if a Configuration Management tool is used.

Version Number | Changes Made
V1.0           | Initial Draft
