
How to write effective test cases, procedures, and definitions
Writing effective test cases is a skill, and it can be acquired through experience and in-depth
study of the application for which the test cases are being written.

Here I will share some tips on how to write test cases, test case procedures and some basic
test case definitions:

What is a test case?


“A test case has components that describe an input, action, or event and an expected
response, to determine if a feature of an application is working correctly.” (definition from a
standard software testing glossary)

Each test case falls into one of the following levels, which helps avoid duplication of effort.
Level 1: At this level you write the basic test cases from the available specification and
user documentation.
Level 2: This is the practical stage, in which writing test cases depends on the actual
functional and system flow of the application.
Level 3: This is the stage in which you group some test cases and write a test
procedure. A test procedure is simply a group of small test cases, ten at most.
Level 4: Automation of the project. This minimizes human interaction with the system, so
QA can focus on newly updated functionality to test rather than staying busy
with regression testing.

So you can observe a systematic progression from having no testable items to a full automation suite.

Why do we write test cases?

The basic objective of writing test cases is to validate the testing coverage of the
application. If you work in a CMM-level company, you will strictly follow test case
standards. Writing test cases thus brings a degree of standardization and minimizes
the ad-hoc approach to testing.

How to write TEST CASES?

Fields in test cases:

Test case id:
Unit to test: What is to be verified?
Assumptions:
Test data: Variables and their values
Steps to be executed:
Expected result:
Actual result:
Pass/Fail:
Comments:
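The fields above can also be captured as a simple record when you manage test cases in code rather than in a spreadsheet. Here is a minimal sketch in Python; the class and field names are chosen for illustration, not taken from any particular tool:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One manual test case, mirroring the fields listed above."""
    test_case_id: str
    unit_to_test: str                                 # what is to be verified
    assumptions: str = ""
    test_data: dict = field(default_factory=dict)     # variables and their values
    steps: list = field(default_factory=list)         # steps to be executed
    expected_result: str = ""
    actual_result: str = ""
    status: str = "Not Run"                           # Pass/Fail once executed
    comments: str = ""

# Example: a filled-in login test case
tc = TestCase(
    test_case_id="TC-001",
    unit_to_test="Login page accepts a valid user",
    test_data={"loginID": "valid loginID", "password": "valid"},
    steps=["visit LoginPage", "enter userID", "enter password", "click login"],
    expected_result="PersonalPage is shown with a welcome message",
)
```

Keeping test cases in a structured form like this makes it easy to export them to Excel or import them into a test management tool later.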

So here is a basic format of test case statement:

Verify
Using [tool name, tag name, dialog, etc]
With [conditions]
To [what is returned, shown, demonstrated]

Verify: Used as the first word of the test case statement.


Using: To identify what is being tested. You can use ‘entering’ or ‘selecting’ here instead of
‘using’, depending on the situation.

For any application, you will basically cover all types of test cases, including functional,
negative, and boundary value test cases.

Keep in mind while writing test cases that all of them should be simple and easy to
understand. Don’t write essay-like explanations; be to the point.

Try writing simple test cases in the format above. I generally use
Excel sheets to write the basic test cases. Use a tool like ‘Test Director’ when you are
going to automate those test cases.

Test Case Format


Process impact: This reference page documents the format of test cases
and gives tips on writing test cases. You can copy and paste the sample test
case into your test-cases.html file. This file itself should not be edited to hold
specific test cases.
This test case format is suitable for manual system test cases.
The test cases should be written in enough detail that they could be given to
a new team member who would be able to quickly start to carry out the
tests and find defects.

unique-test-case-id: Test Case Title


Purpose: Short sentence or two about the aspect of the
system that is being tested. If this gets too long, break
the test case up or put more information into the
feature descriptions.
Prereq: Assumptions that must be met before the test case
can be run. E.g., "logged in", "guest login allowed",
"user testuser exists".
Test Data: List of variables and their possible values used in
the test case. You can list specific values or
describe value ranges. The test case should be
performed once for each combination of values.
These values are written in set notation, one per
line. E.g.:
loginID = {Valid loginID, invalid loginID, valid
email, invalid email, empty}
password = {valid, invalid, empty}
Steps: Steps to carry out the test. See the step formatting
rules below.

1. visit LoginPage
2. enter userID
3. enter password
4. click login
5. see the terms of use page
6. click agree radio button at page bottom
7. click submit button
8. see PersonalPage
9. verify that the welcome message shows the correct
username

Notes and Questions: NOTE; QUESTION
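The "Test Data" section above says the test case should be performed once for each combination of values. Those combinations can be enumerated mechanically; here is a quick sketch using the sample values from above:

```python
from itertools import product

# Set notation from the "Test Data" section above
login_ids = ["valid loginID", "invalid loginID", "valid email", "invalid email", "empty"]
passwords = ["valid", "invalid", "empty"]

# One test run per combination: 5 loginIDs * 3 passwords = 15 runs
combinations = list(product(login_ids, passwords))
print(len(combinations))  # 15
```

Enumerating the combinations up front makes it obvious how many runs the test case implies, and whether that number is still practical for manual execution.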

Format of test steps


Each step can be written very tersely using the following keywords:
login [as ROLE-OR-USER]
Log into the system with a given user or a user of the given type.
Usually only stated explicitly when the test case depends on the
permissions of a particular role or involves a workflow between
different users.
visit LOCATION
Visit a page or screen. For web applications, LOCATION may be a
hyperlink. The location should be a well-known starting point (e.g., the
Login screen); drilling down to specific pages should be part of the
test.
enter FIELD-NAME [as VALUE] [in SCREEN-LOCATION]
Fill in a named form field. VALUE can be a literal value or the name of
a variable defined in the "Test Data" section. The FIELD-NAME itself
can be a variable name when the UI field for that value is clear from
context, e.g., "enter password".
enter FIELDS
Fill in all fields in a form when their values are clear from context or
when their specific values are not important in this test case.
click "LINK-LABEL" [in SCREEN-LOCATION]
Follow a labeled link or press a button. The screen location can be a
predefined panel name or English phrase. Predefined panel names are
based on GUI class names, master template names, or titles of boxes
on the page.
click BUTTON-NAME [in SCREEN-LOCATION]
Press a named button. This step should always be followed by a "see"
step to check the results.
see SCREEN-OR-PAGE
The tester should see the named GUI screen or web page. The general
correctness of the page should be testable based on the feature
description.
verify CONDITION
The tester should see that the condition has been satisfied. This type
of step usually follows a "see" step at the end of the test case.
verify CONTENT [is VALUE]
The tester should see the named content on the current page; the
correct values should be clear from the test data or given explicitly.
This type of step usually follows a "see" step at the end of the test
case.
perform TEST-CASE-NAME
This is like a subroutine call. The tester should perform all the steps of
the named test case and then continue on to the next step of this test
case.
Every test case must include a verify step at the end so that the expected
output is very clear. A test case can have multiple verify steps in the middle
or at the end. Having multiple verify steps can be useful if you want a
smaller number of long tests rather than a large number of short tests.
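In an automated suite, the step keywords above map naturally onto small helper calls: "perform" becomes an ordinary function call, and "verify" becomes an assertion. Everything below (the FakeApp class and its pages) is a hypothetical in-memory stub standing in for a real UI driver:

```python
# Hypothetical in-memory stand-in for the application under test.
class FakeApp:
    def __init__(self):
        self.page = "LoginPage"
        self.fields = {}

    def visit(self, page):            # "visit LOCATION"
        self.page = page

    def enter(self, name, value):     # "enter FIELD-NAME as VALUE"
        self.fields[name] = value

    def click(self, control):         # "click BUTTON-NAME"
        if self.page == "LoginPage" and control == "login":
            self.page = "TermsOfUsePage"
        elif self.page == "TermsOfUsePage" and control == "submit":
            self.page = "PersonalPage"

def perform_login(app, user):
    """A reusable test case; other tests invoke it like "perform Login"."""
    app.visit("LoginPage")
    app.enter("userID", user)
    app.enter("password", "valid password")
    app.click("login")
    assert app.page == "TermsOfUsePage"   # verify: terms of use page shown

def test_welcome_message():
    app = FakeApp()
    perform_login(app, "testuser")        # subroutine call, then continue
    app.click("agree")
    app.click("submit")
    assert app.page == "PersonalPage"     # see PersonalPage
    # Final verify step: the expected output is explicit.
    assert app.fields["userID"] == "testuser"

test_welcome_message()
```

Note how `perform_login` contains its own verify step mid-flow, while `test_welcome_message` ends with one, matching the rule that every test case must finish with a verify step.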
Further Information
For more information, see:

- Words of wisdom on test case suites.
- Words of wisdom on test cases.

Testing: Test Plan Development - Step 1


Step I - Assembling the test team

The test team should be organized concurrently with the development
team. The purpose of the test team is to perform verification and
validation as it relates to implementation. For a specific project, the
purpose of the test team is:

1) To perform verification and validation for the deliverables from
development and solution delivery.

2) To act as consultants to the development team during unit
testing.

Task I.I - Identify Key Application Areas

This task identifies the key application areas that must be involved in
testing. It should also identify the testing group's responsibilities to
those areas. For example, testing might be responsible to development
for integration testing and system testing, and to solution delivery for
release testing.

Output: Statement of Application Areas

Task I.II - Identify Key Individuals

This task identifies important individuals who will be involved, both
directly and indirectly, in the testing process. The persons selected as
members of the test team will be directly responsible for testing
activities, while others acting as sponsors will be indirectly involved.

Specific individuals involved in testing should include the following:

1. Quality Assurance Manager


2. Quality Assurance Analysts

3. Test Manager

4. Test Analysts

5. Project Manager

6. Project Team Leader(s)

7. Analysts

8. Programmers

9. Database Services Personnel

10. Network Services Personnel

11. Data Center (Operations) Personnel

12. Customers (Application Users)

Output: Statement of team member responsibilities - This statement
assigns specific responsibilities to the members of the test team. This
should be the first step in the creation of the test work plan that is
described in Task I.III. The work plan should be developed in Microsoft
Project or in the management component of an automated test tool.

The first action is to list the testing tasks to be completed. This should
be followed by a review of the tasks by all of the test team members.
When consensus has been reached that the list is correct and complete,
an individual team member must be assigned to each task. A final
review based on each member's share of the workload should be
completed. MS Project makes this easy, as it has several reports that
provide workload as well as other statistics.

Task I.III - Assign Individual Responsibilities

The test team members will be responsible for:

- Developing the test plan

- Developing the required test resources

- Designing test cases

- Constructing test cases

- Executing test cases according to the test plan

- Managing test resources

- Analyzing test results

- Issuing test reports

- Recommending application improvements

- Maintaining test statistics

Individual assignments must be made so that each area of
responsibility is covered and someone can be held accountable.

Output: Team Work Plan - The work plan defines milestones and
tentative completion dates for all assigned tasks. A project
management tool such as Microsoft Project can make this task very
easy, and the resulting document is a Gantt chart that illustrates who
is responsible for what, and when.

Each QA activity below is listed with its coverage or frequency and a description:

Preconditions
Coverage or frequency: every public method; every public method in COMPONENT-NAME; all public methods that modify data.
Description: We will use if-statements at the beginning of public methods to validate each argument value. This helps to document assumptions and catch invalid values before they can cause faults.

Assertions
Coverage or frequency: every private method; every private method in COMPONENT-NAME; all private methods that modify data.
Description: Assertions will be used to validate all arguments to private methods. Since these methods are only called from our other methods, arguments passed to them should always be valid, unless our code is defective. Assertions will also be used to test class invariants and some postconditions.

Static analysis
Coverage or frequency: strict compiler warnings; automated style checking; XML validation; detection of common errors.
Description: We will use source code analysis tools to automatically detect errors. Style checkers will help make all of our code consistent with our coding standards. XML validation ensures that each XML document conforms to its DTD. Lint-like tools help detect common programming errors. E.g.: lint, lclint/splint, jlint, checkstyle, Jcsc, PyLint, PyChecker, Tidy.

Buddy review
Coverage or frequency: all changes to release branches; all changes to COMPONENT-NAME; all changes.
Description: Whenever changes must be made to code on a release branch (e.g., to prepare a maintenance release), the change will be reviewed by another developer before it is committed. The goal is to make sure that fixes do not introduce new defects.

Review meetings
Coverage or frequency: weekly; once before release; every source file.
Description: We will hold review meetings where developers perform formal inspections of selected code or documents. We choose to spend a small, predetermined amount of time and try to maximize the results by selecting review documents carefully. In the review process we will use and maintain a variety of checklists.

Unit testing
Coverage or frequency: 100% of public methods, and 75% of statements.
Description: We will develop and maintain a unit test suite using the JUnit framework. We will consider the boundary conditions for each argument and test both sides of each boundary. Tests must be run and passed before each commit, and they will also be run by the testing team. Each public method will have at least one test, and the overall test suite will exercise at least 75% of all executable statements in the system.

Manual system testing
Coverage or frequency: 100% of UI screens and fields; 100% of specified requirements.
Description: The QA team will author and maintain a detailed written suite of manual tests to test the entire system through the user interface. This plan will be detailed enough that a person could repeatably carry out the tests from the test suite document and other associated documents.

Automated system testing
Coverage or frequency: 100% of UI screens and fields; 100% of specified requirements.
Description: The QA team will use a system test automation tool to author and maintain a suite of test scripts to test the entire system through the user interface.

Regression testing
Coverage or frequency: run all unit tests before each commit; run all unit tests nightly; add new unit tests when verifying fixes.
Description: We will adopt a policy of frequently re-running all automated tests, including those that have previously been successful. This will help catch regressions (bugs that we thought were fixed, but that appear again).

Load, stress, and capacity testing
Coverage or frequency: simple load testing; detailed analysis of each scalability parameter.
Description: We will use a load testing tool and/or custom scripts to simulate heavy usage of the system. Load will be defined by scalability parameters such as the number of concurrent users, the number of transactions per second, or the number/size of data items stored or processed. We will verify that the system can handle loads within its capacity without crashing, producing incorrect results, mixing up results for distinct users, or corrupting the data. We will also verify that when capacity limits are exceeded, the system safely rejects, ignores, or defers requests that it cannot handle.

Beta testing
Coverage or frequency: 4 current customers; 40 members of our developer network; 1000 members of the public.
Description: We will involve outsiders in a beta test, or early access, program. We will give beta testers directions to focus on specific features of the system, and we will actively follow up with them to encourage them to report issues.

Instrumentation and monitoring
Coverage or frequency: monitor our ASP servers; remotely monitor customer servers.
Description: As part of our SLA, we will monitor the behavior of servers to automatically detect service outages or performance degradation. We have policies and procedures in place for failure notification, escalation, and correction.

Field failure reports
Coverage or frequency: prompt users to report failures; automatically report failures.
Description: We want to understand each post-deployment system failure and actively take steps to correct the defect. The system has built-in capabilities for gathering detailed information from each system failure (e.g., error message, stack traceback, operating system version). This information will be transmitted back to us so that we may analyze it and act on it.
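The preconditions, assertions, and boundary-testing activities above can be illustrated in one short example. This is a generic sketch, not tied to any particular system; the function names and the [0, 100] range are invented for illustration:

```python
def set_discount(percent):
    """Public method: precondition checked with an explicit if-statement,
    so invalid argument values are caught before they can cause faults."""
    if not 0 <= percent <= 100:
        raise ValueError(f"percent must be in [0, 100], got {percent}")
    return _apply(percent)

def _apply(percent):
    """Private helper: argument validated with an assertion, since callers
    inside our own code should never pass an invalid value."""
    assert 0 <= percent <= 100, "internal error: invalid percent"
    return percent / 100.0

# Boundary testing: exercise both sides of each boundary of the argument.
assert set_discount(0) == 0.0       # lower boundary, valid side
assert set_discount(100) == 1.0     # upper boundary, valid side
for bad in (-1, 101):               # invalid side of each boundary
    try:
        set_discount(bad)
        raise AssertionError("expected ValueError for out-of-range input")
    except ValueError:
        pass
```

The split mirrors the table: if-statement preconditions guard the public surface, assertions document internal invariants, and the unit tests probe both sides of every boundary.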
