
Spell out the Project Name

(Project Acronym)
Testing Specifications and Procedures

Month DD, YYYY

UNITED STATES PATENT AND TRADEMARK OFFICE


INFORMATION TECHNOLOGY TESTING BRANCH

[Project Name]

Testing Specifications and Procedures

TABLE OF CONTENTS

Qualification Statement...............................................................................................................iii

1  SCOPE.................................................................................................................................1-1
   1.1  System Overview............................................................................................................1-1
   1.2  Overview.........................................................................................................................1-1
   1.3  Performance Testing.......................................................................................................1-1
   1.4  Security Test and Evaluation..........................................................................................1-1
   1.5  System Compatibility Testing.........................................................................................1-1

2  REFERENCED DOCUMENTS........................................................................................2-1
   2.1  Regulatory References....................................................................................................2-1
   2.2  Department of Commerce Document References..........................................................2-1
   2.3  Project Specific Document References...........................................................................2-1
   2.4  Commercial Document References................................................................................2-2

3  TEST SPECIFICATIONS AND PROCEDURES............................................................3-1
   3.1  Features to be Tested......................................................................................................3-1
   3.2  Features Not to be Tested...............................................................................................3-1
   3.3  Approach Refinements....................................................................................................3-1
   3.4  Feature Pass/Fail Criteria................................................................................................3-1
   3.5  Input Specifications........................................................................................................3-2
   3.6  Output Specifications......................................................................................................3-2
   3.7  Environmental Needs......................................................................................................3-2
        3.7.1  Hardware................................................................................................................3-2
        3.7.2  Software.................................................................................................................3-2
        3.7.3  Other Needs...........................................................................................................3-2
   3.8  Test Specifications and Procedures................................................................................3-4

APPENDIX A  ACRONYMS AND ABBREVIATIONS....................................................A-1
APPENDIX B  [REQUIREMENTS or USE CASES] TO TEST ALLOCATION MATRIX...B-1

LIST OF TABLES

Table 3-1  Severity Rankings for Discrepancies.....................................................................3-1

GENERAL comments on the Header and Footer: The footer date on the right should be
the same as the cover date, which is also the deliverable date.
A slash and a version number follow the date. You assign the version number based on
the iteration of the document (e.g., if this is the first time the document is being written,
it is version 1; if you are revising a version 1 document that has been delivered to
USPTO, it is version 2; and so on). The version number is NOT the AIS version #,
which is in the left header.

QUALIFICATION STATEMENT
The paragraph below is required.
All product names and services identified throughout this document are trademarks or registered
trademarks of their respective companies.


1  SCOPE

1.1 System Overview


The purpose of the United States Patent and Trademark Office (USPTO) [project name and
version #] project is to enhance its existing system and further address customer needs. Add
anything that provides further insight into the project.
1.2 Overview
The Testing [Specifications or Scenarios] and Procedures contains the scope, referenced
documents, test [specifications or scenarios], and test procedures. It provides a detailed
description of each test [specification or scenario] and the [requirement(s) or Use Case(s)] it
tests. Each requirement from the [RTMx Report (RTMx) or Use Cases] includes a unique
identification (ID) and specified functionality. The test procedures describe the step-by-step
actions, expected results, and any special conditions necessary for testing.
During System Integration Testing, the System Development Lead (SDL) edits the Testing
[Specifications or Scenarios] and Procedures, which the testing staff then uses as a basis for
conducting Functional Qualification Testing (FQT) and determining the system's ability to meet
requirements. The testing staff will update the Testing [Specifications or Scenarios] and
Procedures as needed during the FQT process.
1.3 Performance Testing
This document does not address Performance Testing for [project acronym and version].
1.4 Security Test and Evaluation
This document does not address Certification and Accreditation (C&A) Security Test and
Evaluation (ST&E) testing for [project acronym and version].
1.5 System Compatibility Testing
This document does not address System Compatibility Testing (SCT) for [project acronym and
version].


2  REFERENCED DOCUMENTS

The following documents are either referenced in or were used in preparation of this document:
2.1 Regulatory References
Section 508 of the Rehabilitation Act (29 U.S.C. 794d), as amended by the Military Construction
Appropriations Act for Fiscal Year 2001 (P.L. 106-246), July 2000.
2.2 Department of Commerce Document References
Technical Reference Model, Version 7.0, February 2003.
Systems Development Life Cycle, Version 1.00, June 5, 2007.
Quality Assurance for Automated Information Systems Technical Standard and Guideline,
IT-212.2-04, January 30, 1997.
Testing Technical Standards and Guidelines, USPTO IT-212.3-01, October 2001.
Accommodation Technology Compliance (ATC) Functional Qualification Testing Specifications
and Procedures, June 1, 2001.
2.3 Project Specific Document References
Use the exact title of the AIS document rather than writing the skeleton/shortcut titles
shown below.
Keep month, day, and year together by using non-breaking spaces [CTRL + SHIFT +
Space] and non-breaking hyphens [CTRL + SHIFT + dash].
Do not write a shortcut name such as QA Plan for ABC System; rather, write Quality Assurance
Plan for Act Baseline Cat 2.5 (ABC 2.5), January 23, 2006.
Requirements Specification for [name of AIS and version], Month XX, YYYY.
Requirements Traceability Matrix for [name of AIS and version], Month XX, YYYY.
Concept of Operations for [name of AIS and version], Month XX, YYYY.
Detailed Design Document for [name of AIS and version], Month XX, YYYY.
Quality Assurance Plan for [name of AIS and version], Month XX, YYYY.
Test Plan for [name of AIS], Month XX, YYYY.
Configuration Management Plan for [name of AIS and version], Month XX, YYYY.


Office of Procurement Check List for Processing EIT Request (Electronic Information
Technology), http://ptoweb/ptointranet/cio/section508/508.htm.
2.4 Commercial Document References


3  TEST SPECIFICATIONS AND PROCEDURES

3.1 Features to be Tested


Specific functionality and features of [project acronym and version] to be tested are included in
Appendix B, [Requirements or Use Cases] To Test Allocation Matrix.
The principal features to be tested are categorized into the following areas:
a) Accessibility (Section 508).
b) External Interfaces.
c) Performance.
d) Important feature.
e) Important feature.
f) Important feature.
3.2 Features Not to be Tested
[Copy the FIRST paragraph from Section 5.5 of the Test Plan.] [Testing will also not include
product features that have not been modified from the previous version.]
3.3 Approach Refinements
There are no changes to the testing approach that was defined in the [name of AIS] Test Plan.
[OR]
[Describe changes that have been made to the test approach. In particular, was 508 planned or
added? Are automated scripts now expected to be used to test system functionality? Are
Performance Testing scripts now planned? Has the expected location of testing changed?]
3.4 Feature Pass/Fail Criteria
Any discrepancies identified are classified as one of three types defined in Table 3-1:
Table 3-1  Severity Rankings for Discrepancies

Severity | Description
Critical | Discrepancies that halt further program execution. Example: run-time errors that cause the system to lock up in an unexplained or unexpected manner.
Major    | Discrepancies that cause the application not to perform as functionally required. Example: inability to print a report.
Minor    | Discrepancies that are not considered critical or major. Examples: misspellings on a screen, ambiguous Help messages.
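If the testing staff track discrepancies in a simple script or log, the rankings in Table 3-1 might
be represented as in the minimal sketch below; the class and field names are hypothetical and
are not part of the FQT process.

```python
# Minimal sketch (hypothetical names): recording a discrepancy using the
# Table 3-1 severity rankings.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    CRITICAL = "Critical"  # halts further program execution
    MAJOR = "Major"        # application does not perform as functionally required
    MINOR = "Minor"        # neither critical nor major

@dataclass
class Discrepancy:
    test_case: str   # e.g., "Test Case 1"
    step: int        # step at which the discrepancy was observed
    severity: Severity
    description: str

# Example: an inability to print a report is ranked Major per Table 3-1.
entry = Discrepancy("Test Case 1", 4, Severity.MAJOR, "Unable to print report")
print(f"{entry.test_case} step {entry.step}: {entry.severity.value} - {entry.description}")
```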


3.5 Input Specifications


See the Operator Action column for the detailed input specifications in Section 3.8.
3.6 Output Specifications
See the Expected Results column for the expected outputs of each operator action in Section 3.8.
3.7 Environmental Needs
Copy the hardware and software from the Test Plan unless you have more current
information.
Include Testing Support software (Bobby, JAWS, etc.) in the Software subsection.
The SDL will supply the test environment and is responsible for building the test database.
3.7.1 Hardware
3.7.2 Software
3.7.3 Other Needs
a) System IDs and passwords.
b) PTOnet.
c) Interface with [System name] using a [full or partial copy/production version] and
[similar/production] hardware. [Repeat this item for each system that interfaces with the AIS.]


3.8 Test Specifications and Procedures

Examples of Testing [Specifications or Scenarios] and Procedures are provided below.
Remember to include testing support software (Bobby, JAWS, etc.) in the Prerequisites or Setup.

Test Name: Test Case 1: Create Workload Navigator Basic Pie Chart Test Case.
Description: The Pie and Text Chart features should display independently for Assigned and Unassigned cases in the Workload screen.
Requirement(s): TICRS-2.
Prerequisites: The user is logged on as an SLIE/BA.
Setup: Tester must point to the Mock P drive in the test environment. Verify that the most recent file is currently in Trademarks\FAST 2.1\BIN\Fast Application.exe of the current CM Build. Create a desktop shortcut of the FAST executable file from Mock P and launch the FAST exe. Map to the TICRS drive(s):

Steps:
Step | Operator Action | Expected Results | Observed Results | Pass/Fail
1. | Logon to the system as an SLIE user with a valid ID. | User should be able to log in to the system, and the Workload screen appears for the SLIE. | |
2. | Click on the Assigned tab. | The Assigned work for the Law Office to which the SLIE is assigned appears on screen by default. The Suspension cases appear in the Workload screen. | |
3. | Click on the Pie Chart. | The Pie Chart screen will appear. | |
4. | Click Close. | The Suspension cases screen displays. | |
5. | Click on the Unassigned tab. | The Unassigned cases will appear. | |
6. | | | |
7. | | | |
8. | | | |
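The Setup steps above can also be scripted. The sketch below assumes the Mock P drive is
mapped to P: and uses a placeholder TICRS share path; confirm the actual drive letters and
share names with the SDL for your test environment.

```python
# Sketch of the test case setup: map a TICRS drive, verify the current
# CM Build executable on Mock P, and launch FAST. The drive letters and
# the \\ticrs-server\ticrs share are placeholders, not confirmed values.
import subprocess
from pathlib import Path

# FAST executable location from the Setup instructions (Mock P assumed as P:).
FAST_EXE = Path(r"P:\Trademarks\FAST 2.1\BIN\Fast Application.exe")

# Map a TICRS drive letter to the share (placeholder path).
subprocess.run(["net", "use", "T:", r"\\ticrs-server\ticrs"], check=True)

# Verify the executable is present before launching.
if not FAST_EXE.exists():
    raise FileNotFoundError(f"FAST executable not found: {FAST_EXE}")

# Launch FAST without blocking the script.
subprocess.Popen([str(FAST_EXE)])
```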

Test Name: Test Case 2: Prosecution History Webpage.
Description: In the Case Details screen, a Prosecution History Webpage link, which is provided by USPTO, will be available.
Requirement(s): TICRS-4.
Prerequisites: The user is logged on as an LIE/SLIE/BA.
Setup: Tester must point to the Mock P drive in the test environment. Verify that the most recent file is currently in Trademarks\FAST 2.1\BIN\Fast Application.exe of the current CM Build. Create a desktop shortcut of the FAST executable file from Mock P and launch the FAST exe. Map to the TICRS drive(s):

Steps:
Step | Operator Action | Expected Results | Observed Results | Pass/Fail
1. | Logon to the system as an SLIE user with a valid ID. | User should be able to log in to the system, and the Workload screen appears for the SLIE. | |
2. | Click on the Assigned tab. | The Assigned work for the Law Office to which the SLIE is assigned appears on screen by default. The Suspension cases appear in the Workload screen. | |
3. | | | |
4. | | | |
5. | | | |
6. | | | |

Test Name: Test Case 3: Display list of all TICRS documents.
Description: The Case Details screen tab will provide users with access to view or display a list of all TICRS documents.
Requirement(s): TICRS-6.
Prerequisites: The user is logged on as an LIE/SLIE/BA.
Setup: Tester must point to the Mock P drive in the test environment. Verify that the most recent file is currently in Trademarks\FAST 2.1\BIN\Fast Application.exe of the current CM Build. Create a desktop shortcut of the FAST executable file from Mock P and launch the FAST exe. Map to the TICRS drive(s):

Steps:
Step | Operator Action | Expected Results | Observed Results | Pass/Fail
1. | Logon to the system as an SLIE user with a valid user ID. | User should be able to log in to the system, and the Workload screen appears for the SLIE. | |
2. | Click on the Work List tab. | The Work List screen will appear, and the Suspension cases category page displays by default. | |
3. | | | |
4. | | | |
5. | | | |


APPENDIX A  ACRONYMS AND ABBREVIATIONS

ATC      Accommodation Technology Compliance
C&A      Certification & Accreditation
CD       Change Document
CM       Configuration Management
COTS     Commercial Off-The-Shelf
EIT      Electronic Information Technology
FQT      Functional Qualification Testing
GUI      Graphical User Interface
HP       Hewlett-Packard
ID       Identification
OCIO     Office of Chief Information Officer
PTOnet   USPTO's campus-wide network
QA       Quality Assurance
RTMx     RTMx Report
SCT      System Compatibility Testing
SDL      System Development Lead
ST&E     Security Test and Evaluation
USPTO    United States Patent and Trademark Office


APPENDIX B  [REQUIREMENTS or USE CASES] TO TEST ALLOCATION MATRIX

Requirement Number | Requirement Description | Test Case Where Verified
RE-V1.0001 | The system shall provide for the initial receipt and initial scanning of reexamination applications in OIPE when a final filing date is established, and will provide in OIPE the ability to create, maintain, and print Contents entries for the electronic file corresponding to the separate papers in the paper reexamination application. | 1
RE-V1.0002 | The system shall provide the ability to preclude reexamination applications from automated initial classification. | 2
RE-V1.0003 | The system shall provide the ability to retrieve an as-filed version of the reexamination electronic file wrapper. | 2
RE-V1.0004 | The system shall provide the ability to view and print an electronic Contents listing of all papers for each electronic file folder in the REPS system. |
RE-V1.0036 | The system shall provide an interface with the Revenue Accounting Management System (RAM), so that at any time during a REPS session, a user may enter or modify fee information. | Deferred
RE-V1.0037 | The system shall distinguish between a reexam requester and other types of filers when the bibliographic data is captured. Additionally, the system shall determine the patent number on which the reexamination request is based from the bibliographic data entered on the form. |
RE-V1.0067 | The PSR printer must print a separator sheet containing this information as the first page of the print job. | 3
RE-V1.0095 | PTCS will have the capability to print an electronic CD-R contents list to be stored on the CD-R as a file. | 3
