

TESTING

1.1 Overview

Infosys quality assurance procedures have been built from experience and best practices and solidified by practising industry standards. In the product space, we fit seamlessly into the customer's QA processes and systems and aim to re-use their best practices, with enhancements, to ensure that optimal benefits are obtained in a global delivery model. If the customer has no rigorous QA practices, we have a well-defined process that can be used. Infosys quality procedures spell out the roles, responsibilities, templates to be used, and guidelines across different types of programs and projects.

Industry standards: Infosys was assessed at CMMI Level 5 in September 2003; this appraisal covered all types of projects (development, maintenance, etc.). Infosys is also ISO 9000 (2001) certified. Apart from CMMI and ISO, Infosys also holds the telecom TL 9000 certification, the BS 7799 security certification and the ISO 14001 certification.

The Infosys product verification methodology typically involves test requirements gathering, test strategizing, test management and execution planning, test environment set-up, test planning, automated test script generation, test execution and bug reporting, and defect analysis and result reporting. For every stage, Infosys follows a stage-gate approach in which the output of the preceding stage is reviewed and formally approved as the input to the next stage. The sections below explain the various stages in detail, and the following figure encapsulates the overall process as a pictorial workflow.



Figure 1: Approach to Product Testing

1.2 Testing Methodology

The Infosys testing methodology consists of the following phases:


- Test Requirements Gathering
- Test Strategizing
- Test Management & Execution Planning
- Test Environment Set-up
- Test Case Development
- Test Automation
- Test Execution and Bug Reporting
- Defect Analysis and Result Reporting

Each of these phases is described below.



1.2.1 Test Requirements Gathering

Overview: This phase involves going through the requirements documents and interacting with the business users to define the scope of testing and to establish the details required for project initiation and execution. Both functional and non-functional requirements are captured in this phase.

Entry Criteria:
- Project authorized in writing

Inputs:
- Business requirements documents
- Non-functional requirements document

Key Participants:
- Onsite: Product Architect, Test Architect, Product Engineering Manager, Test Manager, WMS Functional Consultant, Fashion Functional Consultant, Business Analysts
- Offshore: Manual Test Lead, Automation Test Lead, Test Team
- DHL: Subject Matter Experts, Business Users

Major Activities:
- Identify the input documents that form the basis of testing (such as the requirements specification, migration plan and design documents)
- Understand the domain and the product
- Understand DHL processes and methodologies
- Understand the requirements and review their testability
- Understand the scope of testing (functional, performance, multi-platform testing, etc.), the expected level and depth of testing, and any special user requirements
- Identify test automation areas
- Define the standards to be followed during the various test life cycle stages
- Identify, and familiarize the team with, the tools and utilities to be used
- Gather information on the operating environment
- Define entry criteria (to decide when to start testing)
- Define acceptance criteria (to decide when to stop testing)
- Define test suspension/resumption criteria (to identify inadequacies on the part of the development team and return the work product to the development team for more thorough testing)

Outputs and Deliverables:
- Test Requirements document
- Requirements Traceability Matrix with the requirements column completed

Client Contribution and Dependencies:
- Business or functional requirements
- Performance requirements
- Standards requirements, if any
- Other non-functional requirements

Exit Criteria:
- Client sign-off of the Test Requirements document

Standards, Guidelines and Tools:
- MS Office 2000
- Test tools
- Test Requirements Analysis Checklist
- Test Requirements Review Checklist
- Test Requirements Specification document template


1.2.2 Test Strategizing

Overview: This phase involves defining the scope and type of testing and other details, and obtaining sign-off for project initiation. It builds on the requirements gathered in the previous phase and may go through several iterations, depending on the amount of information available, until final sign-off is obtained.

Entry Criteria:
- Completion of Test Requirements Gathering

Inputs:
- Test Requirements document

Key Participants:
- Onsite: Product Anchor (Engineering)
- Offshore: Test Manager, Test Architect, Product Architect, Manual Test Lead, Automation Test Lead
- DHL: Program Anchor

Major Activities:
- Assimilate the scope of testing and prioritise the test requirements
- Produce a high-level effort and schedule estimate based on the scope and prioritisation of testing
- Define the test strategy, including the number of test iterations, the scope of each iteration, and the entry and exit criteria for each iteration
- Define the test approach
- Define bug severity criteria
- Prepare the Test Strategy document
- Evaluate and select the tool to be used for automation
- Prepare the Automation Test Strategy document
- Review the test strategy documents with DHL

Outputs and Deliverables:
- Test Strategy document
- Automation Test Strategy document

Client Contribution and Dependencies:
- Review of the test strategy documents

Exit Criteria:
- Client sign-off of the test strategy documents

Standards, Guidelines and Tools:
- MS Office 2000
- Test estimation guidelines



1.2.3 Test Management & Execution Planning

Overview: This phase involves planning for the testing. It includes detailed scheduling, planning for the tool procurement and infrastructure set-up required for test execution, and any training required during test execution.

Entry Criteria:
- Signed-off Test Requirements document
- Signed-off Test Strategy document

Inputs:
- Test Requirements document
- Test Strategy document
- Product architecture document
- Design document

Key Participants:
- Onsite: Product Anchor (Engineering)
- Offshore: Test Architect, Test Manager, Manual Test Lead, Automation Test Lead

Major Activities:
- Prepare the detailed project plan
- Plan tool procurement
- Identify and allocate resources
- Finalize the plan for test environment set-up
- Define the training plan
- Define the standards and checklists for the project

Outputs and Deliverables:
- Project plan
- Test environment set-up plan
- Tools procurement plan
- Standards and checklists

Client Contribution and Dependencies:
- None

Exit Criteria:
- Approved project plan
- Approved environment set-up plan

Standards, Guidelines and Tools:
- MS Office 2000
- MS Project
- Existing standards and checklists



1.2.4 Test Environment Set-up

Overview: This phase involves setting up the test beds used in the various stages of testing.

Entry Criteria:
- Signed-off Test Requirements document
- Signed-off Test Strategy document
- Test infrastructure available
- Test environment set-up plan

Inputs:
- Test Requirements document
- Test Strategy document
- Product architecture document
- Design documents

Key Participants:
- Offshore: Test Architect, Manual Test Lead, Automation Test Lead, Test Team

Major Activities:
- Understand the required architecture
- Address environment requirements (power supply, temperature control, etc.)
- Address connectivity requirements and verify connectivity
- Set up the infrastructure as per the environment set-up plan
- Check the test infrastructure for test readiness

Outputs and Deliverables:
- Test environment set up and ready for testing

Exit Criteria:
- Test environment set up as per plan



1.2.5 Test Case Development

Overview: This phase involves preparation of the test plan and the test matrix for the features to be tested. The test matrix maps the requirements against the test case IDs; each requirement must be covered by at least one scenario (a small illustrative sketch of this mapping follows the table below). Test cases are developed for:
- Function Test Phase 1 (UI, relocation programs and interfaces)
- Function Test Phase 2
- Integration Test Stage 1 (module-spreading)
- Integration Test Stage 2 (system-spreading)

Entry Criteria:
- Signed-off Test Requirements document
- Signed-off Test Strategy document
- Baselined standards and checklists

Inputs:
- Test Requirements document
- Test Strategy document
- Product architecture document
- Design document
- Standards and checklists

Key Participants:
- Test Manager, Test Architect, Manual Test Lead, Manual Test Team

Major Activities:
- Go through the templates and checklists
- Gain a detailed understanding of the test requirements
- Document the test cases as per the template given in Section 5 of do_Projektkonventionen_des_Auftraggebers_051130_do_ProjectConventions_of_Customer_051130.doc
- Review the test cases
- Update the test matrix
- Prioritize the test cases. All test cases, irrespective of priority, are produced in the test case document; before test execution, priority parameters are determined and used to decide which test cases need to be executed in each round

Outputs and Deliverables:
- Test cases, documented as per the template given in Section 5 of do_Projektkonventionen_des_Auftraggebers_051130_do_ProjectConventions_of_Customer_051130.doc
- Updated Requirements Traceability Matrix

Exit Criteria:
- Baselined test cases
- Updated Requirements Traceability Matrix

Standards, Guidelines and Tools:
- MS Office 2000
- Selected test tools
- Existing standards and checklists
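To make the requirement-to-test-case mapping concrete, here is a minimal sketch of a traceability matrix in Python; the IDs and the interface are illustrative assumptions, not part of the project templates referenced above:

```python
# Minimal sketch of a requirements-to-test-case traceability matrix.
# Requirement and test case IDs are illustrative, not from the project.
from collections import defaultdict

class TraceabilityMatrix:
    def __init__(self):
        # requirement ID -> set of test case IDs covering it
        self._links = defaultdict(set)

    def link(self, req_id, tc_id):
        self._links[req_id].add(tc_id)

    def uncovered(self, all_req_ids):
        # Every requirement must have at least one scenario.
        return [r for r in all_req_ids if not self._links[r]]

requirements = ["REQ-001", "REQ-002", "REQ-003"]
rtm = TraceabilityMatrix()
rtm.link("REQ-001", "TC-010")
rtm.link("REQ-001", "TC-011")
rtm.link("REQ-002", "TC-020")

print(rtm.uncovered(requirements))  # ['REQ-003'] -> a coverage gap to close
```

The `uncovered` check is the mechanical version of the rule stated above: at least one scenario per requirement.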



1.2.6 Test Automation

Overview: This phase consists of automation analysis, designing the automation framework, and generating automated test scripts. Infosys analyses the test cases to determine which ones can be automated, so that maximum ROI is derived from the automation effort.

Entry Criteria:
- Approved Automation Test Strategy document
- Baselined test cases
- Automation test script standards and checklists

Inputs:
- Test Requirements document
- Automation Test Strategy document
- Product architecture document
- Design document
- Standards and checklists

Key Participants:
- Test Manager, Test Architect, Automation Test Lead, Automation Test Team

Major Activities:
- Review the test cases
- Perform an automatability analysis (an illustrative scoring sketch follows this table). This involves the following steps:
  - Filter out test cases which cannot be automated, whether because the software is not supported, because of limitations of the test tools, or because the area is not amenable to automation (usability, database access over the link, etc.)
  - Identify test cases which are candidates for automation: those with a high frequency of execution, high manual execution effort, long execution time, complex execution steps, many simple repetitive steps, stable functionality (and hence low maintenance effort), or multiple data values used for the same action
  - Remove from that list the test cases with a high cost of automation: those requiring high scripting effort, changes to the automation framework, or frequent updates due to frequently changing requirements, as well as one-time and usability test cases
  - Based on the above analysis, arrive at the list of test cases to be automated
  - Draw up a phase-wise automation plan based on the priority of the test cases to be automated
- Design and develop the basic automation framework; build common libraries
- Develop the automation scripts
- Review the test scripts
- Log and fix the defects noticed during script review

Outputs and Deliverables:
- Automation project plan
- Automation scripts

Exit Criteria:
- Baselined automation scripts

Standards, Guidelines and Tools:
- Selected automation test tools
- Automation standards and checklists
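The automatability analysis above lends itself to a simple scorecard. The sketch below is one possible encoding; the weights, thresholds and field names are assumptions for illustration, not Infosys standards:

```python
# Illustrative automation-candidacy scoring, following the filter/score
# logic described above. Weights and thresholds are assumptions.
def automation_score(tc):
    # Hard filters: cases that cannot or should not be automated.
    if tc["tool_unsupported"] or tc["usability_test"] or tc["one_time"]:
        return 0
    score = 0
    score += 3 if tc["runs_per_release"] >= 5 else 0   # high execution frequency
    score += 2 if tc["manual_minutes"] >= 30 else 0    # high manual effort
    score += 2 if tc["stable_functionality"] else 0    # low maintenance risk
    score += 1 if tc["data_driven"] else 0             # same action, many data values
    score -= 2 if tc["high_scripting_effort"] else 0   # high cost of automation
    return max(score, 0)

test_cases = [
    {"id": "TC-010", "tool_unsupported": False, "usability_test": False,
     "one_time": False, "runs_per_release": 8, "manual_minutes": 45,
     "stable_functionality": True, "data_driven": True,
     "high_scripting_effort": False},
]
ranked = sorted(test_cases, key=automation_score, reverse=True)
print([tc["id"] for tc in ranked])  # highest-value automation candidates first
```

The ranked list then feeds the phase-wise automation plan, highest scores first.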



1.2.7 Test Execution and Bug Reporting

Overview: The project team members carry out testing based on the test plans and scripts prepared or available. This phase also covers bug reporting and documentation (updating test results into the test plans).

Entry Criteria:
- Baselined test cases
- Test environment set-up ready
- Product passes the Build Verification Test

Inputs:
- Test Requirements document
- Test Strategy document
- Product architecture document
- Design document
- Test cases
- Standards and checklists

Key Participants:
- Test Manager, Manual Test Lead, Manual Test Team

Major Activities:
- Perform testing activities as per the manual test plan and checklists
- Run the automation test scripts
- Report bugs and update the defect tracking system
- Check whether the suspension criteria apply and the work product needs to be returned to the development team
- Check whether testing can be resumed once the resumption criteria are fulfilled
- Update the test results into the test plans and reports matrix (a sketch of the computation follows this table). The results include:
  - Number of test cases executed against number of test cases planned
  - Number and percentage of test cases passed
  - Number of defects per module, classified by severity and status (open/closed)
  - Number of defects per day
- Verify the fixed defects and track them to closure

Outputs and Deliverables:
- Test results

Exit Criteria:
- Product meets the defined acceptance criteria
- All test cases executed

Standards, Guidelines and Tools:
- MS Office 2000
- Selected test tools
- Existing standards and checklists
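The result metrics listed under Major Activities can be computed mechanically from the execution and defect logs. A minimal sketch (field names and sample data are illustrative):

```python
# Sketch of the reporting metrics listed above, computed from simple
# result and defect logs. Field names and data are illustrative.
from collections import Counter

results = [
    {"tc": "TC-010", "status": "pass"},
    {"tc": "TC-011", "status": "fail"},
    {"tc": "TC-020", "status": "pass"},
]
defects = [
    {"module": "UI", "severity": "High", "state": "Open", "day": "2006-01-10"},
    {"module": "UI", "severity": "Low", "state": "Closed", "day": "2006-01-10"},
    {"module": "Interfaces", "severity": "High", "state": "Open", "day": "2006-01-11"},
]

planned = 5
executed = len(results)
passed = sum(1 for r in results if r["status"] == "pass")

print(f"Executed {executed}/{planned}, pass rate {passed / executed:.0%}")
print("Defects per module:", Counter(d["module"] for d in defects))
print("By severity/state:", Counter((d["severity"], d["state"]) for d in defects))
print("Defects per day:", Counter(d["day"] for d in defects))
```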



1.2.8 Defect Analysis and Result Reporting

Overview: This phase involves analysis of the defects found during testing. The quality of the work product is reported both qualitatively and quantitatively. The test results are analysed to find the defect distribution by type and severity, which helps the development team devise a strategy to prevent such defects and thus improve the quality of delivery.

Entry Criteria:
- Test completion

Inputs:
- Test results
- Observations made by the test team

Key Participants:
- Onsite: Product Anchor (Engineering)
- Offshore: Test Manager, Test Architect, Product Architect, Project and Engineering Managers, Manual Test Lead, Manual Test Team
- DHL: Program Manager

Major Activities:
- Collect the list and details of the defects reported from the defect tracking system
- Perform causal analysis (a small distribution sketch follows this table)
- Report the findings
- Highlight the state of the work product quantitatively and qualitatively
- Review the report with the DHL team

Outputs and Deliverables:
- Causal Analysis Report

Exit Criteria:
- Product meets the defined acceptance criteria
- Submission and approval of the Causal Analysis Report

Standards, Guidelines and Tools:
- MS Office 2000

The key deliverable for this phase is the Causal Analysis Report.
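A causal analysis report typically starts from a defect distribution by cause. The sketch below shows the basic tabulation; the defect categories are illustrative assumptions, not the project's actual taxonomy:

```python
# Sketch of a causal-analysis summary: defect distribution by cause,
# as described above. Categories and counts are illustrative.
from collections import Counter

defects = [
    {"severity": "High", "cause": "Requirements gap"},
    {"severity": "High", "cause": "Coding error"},
    {"severity": "Medium", "cause": "Coding error"},
    {"severity": "Low", "cause": "Environment/config"},
]

by_cause = Counter(d["cause"] for d in defects)
total = len(defects)
for cause, n in by_cause.most_common():
    print(f"{cause}: {n} ({n / total:.0%})")
```

The dominant causes in this distribution are what the development team targets in its prevention strategy.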



1.3 Types of Testing

Infosys would perform the following types of testing:


- Functional Testing
- User Interface & Usability Testing
- Scalability & Capacity Testing
- Regression Testing
- Security Testing
- Integration Testing
  - Inter-Module Testing (internal interfaces are tested)
  - System Testing (external interfaces are tested)
- Internationalisation & Localisation Testing
- Performance Testing
- Platform & Compatibility Testing
- Acceptance Testing
- Build Verification Testing

Each of these is described below.


1.3.1 Functional Testing

Tests the product functionality and business processes.
- Treats the system as a black box and focuses solely on the outputs generated in response to selected inputs and execution conditions.
- Involves all the test process life cycle stages mentioned above.
- Apart from checking compliance with the business requirements, it also covers specific areas such as user interface checks, boundary conditions, exception handling and error handling.

1.3.2 Regression Testing

Selective re-testing of a work product.
- Verifies that the current modifications or enhancements have not caused unintended effects and that the work product still complies with its specified requirements.

1.3.3 Security Testing

Conducted to test the vulnerability of a system or facility to unintended and/or unauthorized users and processes.
- Tests the restrictions applicable to the different functions of the system or facility.
- Security testing is conducted at two levels: infrastructure security testing and application security testing.

1.3.4 Usability Testing

Tests how easy it is for users to learn and operate the system to accomplish their tasks. This is done primarily by validating whether the user interface follows the product usability guidelines.
- Measures the human-computer interaction characteristics of the system; weaknesses are identified for correction.

1.3.5 Integration Testing

The objective of interface testing is to ensure that the different entities of an application, and any external applications, work cohesively so that the business requirements are met. This testing is performed at two levels:
- Inter-Module Testing (module spread): carried out at application level to ensure the different modules work together to deliver the application functionality.
- External Interface Testing (system spread): carried out at system level to ensure that the application works in conjunction with external applications to meet the business requirements.

1.3.6 Scalability and Capacity Testing

Done to ensure that the application is scalable up to the specified limit (with respect to volume, load, etc.).
- Capacity testing is designed to load a site's hardware and infrastructure to find breakpoints and potential bottlenecks.

1.3.7 Internationalisation/Localisation Testing

Conducted to ensure that the application works properly across different languages, cultures, currencies and locales.
- Internationalisation testing needs to be done up front.
- Localisation testing is required as and when the product is rolled out to each new locale, and requires people with the relevant language skills.



1.3.8 Performance/Load/Stress/Reliability Testing

Performance testing is conducted to evaluate the compliance of a system or component with specified performance requirements. It is performed in a specified or simulated environment.
- Load testing evaluates the compliance of a system or component with specified performance requirements under a specific load.
- Stress/reliability testing ensures that a system or component performs its required functions under stated conditions for a specified period of time, under varied stress.

1.3.9 Platform & Compatibility Testing

As a system evolves it may need to be installed or to work in new environments, as new versions of hardware, operating systems, browsers and EAI tools come onto the market.
- Platform and compatibility testing is conducted to verify that the system or component runs successfully on the multiple platforms for which it is designed.

1.3.10 Acceptance & Installation Testing

Validation performed by DHL to verify that the system performs the required functions.

1.3.11 Build Verification Testing

A set of tests run on each new build of a product to verify that the build is testable before it is released to the test team.
- Generally a subset of the regression test suite that exercises the mainstream functionality of the application is executed.
- Any build that fails the build verification test is rejected, and testing continues on the previous build (provided there has been at least one build that has passed the acceptance test). Build acceptance tests are important because they let developers know right away if there is a serious problem with the build, and they save the test team wasted time.
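A build verification gate can be expressed as a short loop over the smoke subset: reject the build on the first failure, otherwise release it to the test team. A minimal sketch, where the test names and runner interface are assumptions:

```python
# Sketch of a build verification gate: run a small mainstream-functionality
# subset and reject the build on any failure. Names are illustrative.
def run_bvt(build, smoke_tests):
    for test in smoke_tests:
        if not test(build):
            print(f"BVT FAILED on {test.__name__}: build {build} rejected;"
                  " testing continues on the previous passing build")
            return False
    print(f"Build {build} accepted for full test execution")
    return True

def login_works(build):
    return True   # placeholder smoke check

def order_flow_works(build):
    return True   # placeholder smoke check

run_bvt("build-142", [login_works, order_flow_works])
```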



1.4 Test Automation Methodology

Figure 2: Test Automation Methodology

For automating the application, we propose a feasibility analysis to ensure that the requirements of automation are fulfilled. The following methodology is used:

- Application readiness for automation is checked.
- Technical feasibility and ROI analyses are carried out.
- Test automation tools are compared, evaluated, selected and deployed. The right tool and the compatible add-ins and plug-ins are decided once the technical feasibility study is done.
- The automation framework is designed to suit the application.
- A scorecard is prepared to identify the right manual/regression test cases suitable for automation.

The automation framework design is independent of the application. It is built mainly from data-driven and keyword-driven functions, designed to maximize reusability, readability and maintainability while incorporating well-defined standards. Proven frameworks are readily available and are adopted for a project when found suitable for the application under test; when no existing framework is suitable, a new automation framework is created. The main components of the automation framework are:

- Driver script
- Control script
- Utility script
- Data table (Excel)
- Control table (Excel)
- User-defined functions

A sample test automation framework is sketched below.
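To make the driver-script/control-table relationship concrete, here is a minimal keyword-driven sketch in Python. In the framework described above the control and data tables live in Excel and the scripts in the selected test tool's language; plain Python structures stand in for them here, and all names are illustrative:

```python
# Minimal keyword-driven sketch: a driver script walks a control table and
# dispatches each row to a registered user-defined function ("keyword").
KEYWORDS = {}

def keyword(fn):
    """Register a user-defined function under its name."""
    KEYWORDS[fn.__name__] = fn
    return fn

@keyword
def open_app(target):
    print(f"opening {target}")

@keyword
def enter_data(value):
    print(f"entering {value}")

@keyword
def verify(expected):
    print(f"verifying {expected}")

# Control table: ordered (keyword, argument) rows.
control_table = [
    ("open_app", "order-entry"),
    ("enter_data", "SKU-123"),
    ("verify", "order created"),
]

def driver(table):
    # Driver script: execute the control table row by row.
    for kw, arg in table:
        KEYWORDS[kw](arg)

driver(control_table)
```

Separating the control table from the keyword implementations is what gives the framework its reusability: new scenarios are new table rows, not new code.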

Test automation is divided into a three-phased approach:

- Test Automation Planning & Design
- Build Test Automation Suite
- Integrate with Test Management Tool (Test Director or Quality Center)

Phase 1: Test Automation Planning & Design

Activities:
- Understand the system and the environment in which the application has to be installed.
- Set up the automation test bed, which involves installing the automation tools and configuring them.
- Analyse the manual test scripts' functionality from an automation standpoint.
- Identify test cases that cannot be automated due to tool limitations and/or complexity of the functionality.
- Define automation scripting standards (common language functionality).
- Identify common functionality across test cases and design compiled modules.
- Categorize test scripts based on functionality.
- Identify functionality which needs to be parameterised.

Exit Criteria:
- Application set up and the required environment ready.
- Test cases to be automated finalized and signed off.
- Automation Plan published to the client.
- Design of compiled modules completed.
- Standards tailored and communicated to the team.

Deliverables:
- Final list of test cases that can be automated
- Test Automation Plan
- Design of compiled modules

Phase 2: Build Test Automation Suite

Activities:
- Generate automated scripts making use of the platform built in the planning and design phase.
- Create the data set required for parameterisation.
- Rerun the scripts iteratively to debug them and make them robust.
- Run the scripts in batch mode.
- Review the automated scripts to ensure that they are correct and consistent and that scripting standards are followed; check for proper comments in the scripts.
- Check for correct results file creation and log file generation.

Exit Criteria:
- Successful dry-run of the automated scripts in the offshore environment.

Deliverables:
- Automated functional test scripts

Phase 3: Integrate with Test Management Tool

Activities:
- Generate batch scripts.
- Provide routines for uploading scripts to the test management tool server.
- Provide the documentation required for organising test scripts into test suites, based on the existing test tree, to facilitate execution.
- Provide support during runs of the test scripts from the test management tool and during analysis of the results.

Exit Criteria:
- Successful execution of the automated test scripts in the client environment.

Deliverables:
- Documentation on usage of the test suite
- Final version of the test scripts

Figure 3: Three-phased Automation Approach
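As an illustration of the batch-mode run with results and log file generation called for in the Build Test Automation Suite phase, here is a minimal sketch; the script names and log format are assumptions:

```python
# Sketch of a batch-mode run that writes a results/log file, as called for
# in the build phase above. Script names and log layout are assumptions.
import datetime

def run_batch(scripts, log_path="batch_run.log"):
    with open(log_path, "w") as log:
        for name, script in scripts:
            try:
                script()
                status = "PASS"
            except AssertionError as exc:
                status = f"FAIL: {exc}"
            log.write(f"{datetime.datetime.now().isoformat()} {name} {status}\n")

def tc_login():
    assert True, "login failed"          # placeholder check

def tc_order_entry():
    assert 1 + 1 == 2, "order entry failed"  # placeholder check

run_batch([("tc_login", tc_login), ("tc_order_entry", tc_order_entry)])
```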



1.5 Testing Best Practices

ROI-Based Testing - ROI-based testing stems from the Requirements Traceability Matrix (RTM), which forms the basis of all our testing engagements. The RTM is the heart of the entire testing process: requirements are captured and classified by key parameters such as business criticality, market priority and the risk of not testing the requirement. The RTM aids optimal test coverage of the requirements. Its inputs feed the test strategy, which is evolved to address the features most critical to the business. The RTM helps in deciding what level and type of testing is required, and in determining the depth of testing required in specific areas of the application, thereby optimising the testing effort. Based on this, priority-based scheduling is done, sequencing the test activities on market priorities; features critical to each market release are also considered.

Risk-Based Test Execution - Risk-based testing applies triage to the set of all possible test cases and divides them into three categories (a minimal sketch of this triage follows the list):

- Critical test cases with a reasonable probability of finding major defects (10% to 15%)
- Medium-priority test cases, which may be run if sufficient time and resources are available (15% to 25%)
- Low-yield test cases, which are run only if the overall level of system risk is very high (60% to 75%)
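A minimal sketch of the triage, assuming each test case carries an estimated probability of exposing a major defect; the scoring input and thresholds are illustrative:

```python
# Sketch of the three-way triage described above. The probability field
# and cut-off values are illustrative assumptions.
def triage(test_cases):
    buckets = {"critical": [], "medium": [], "low": []}
    for tc in test_cases:
        if tc["defect_probability"] >= 0.5:
            buckets["critical"].append(tc["id"])   # always run
        elif tc["defect_probability"] >= 0.2:
            buckets["medium"].append(tc["id"])     # run if time permits
        else:
            buckets["low"].append(tc["id"])        # run only when risk is very high
    return buckets

cases = [
    {"id": "TC-001", "defect_probability": 0.7},
    {"id": "TC-002", "defect_probability": 0.3},
    {"id": "TC-003", "defect_probability": 0.05},
]
print(triage(cases))
```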

Based on the strategy and prioritisation, the risks involved in not testing certain features are evaluated. The test cases are classified accordingly and tests are conducted based on criticality. This becomes important when there is a time crunch for execution. With this proven approach, the predictability of the delivery date is improved and time to market is reduced, giving maximum return on investment.

Usage of Infosys Checklists - Based on our experience of working with various clients over the years, we have developed checklists for all stages of testing. These ensure that DHL benefits from the experience we have gained and that the Infosys team is much more productive. They include checklists for test case creation, test case review, automation guidelines, automation coding standards, root cause analysis, etc.

InFlux Methodology for Performance Testing - The Infosys InFlux performance engineering methodology comprises a set of tools and techniques for explicitly capturing non-functional requirements (response time, throughput, resource utilization), specifically through workload modelling. It allows the NFRs to be understood from multiple viewpoints, viz. transaction, workload profile, infrastructure, external systems and data retention. Gathering NFRs in a systematic way minimizes ambiguity, omissions and people dependencies, with tools for workload modelling, workload characterization and predictive workload forecasting. The tool enables us to model current and future workloads.

Testing in Parallel with Development - Infosys does not wait until the end of the development cycle to begin testing. We define integration milestones within the development cycle of each project which allow early testing of the product. These milestones are defined so that the end-to-end product architecture is validated much earlier in the cycle. This ensures integration defects are identified at an early stage, which saves significant development effort.
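As a small worked example of the workload arithmetic behind the InFlux NFRs above: given a throughput requirement and a target response time, the interactive form of Little's Law, N = X x (R + Z), gives the number of concurrent virtual users to simulate. The numbers below are illustrative assumptions:

```python
# Illustrative workload-model arithmetic for the NFRs named above.
# Interactive Little's Law: users N = throughput X * (response R + think Z).
throughput_tps = 50      # required transactions per second (assumed NFR)
response_time_s = 2.0    # target response time (assumed NFR)
think_time_s = 8.0       # assumed user think time

concurrent_users = throughput_tps * (response_time_s + think_time_s)
print(f"Simulated load: {concurrent_users:.0f} concurrent virtual users")  # 500
```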

