
vREST User Guide

http://vrest.io/

Introduction
vREST is a hosted online web application for automated testing, mocking, automated recording, and specification of REST / RESTful / HTTP APIs. vREST at a glance:

A simple and intuitive tool to quickly validate your REST APIs.

vREST is designed to validate / test REST APIs with simplicity and intuitiveness in mind. Apart from validating REST APIs, it also provides functionality for:

- API specification

- Mock Server

Deliver zero-defect web applications with very little effort spent on API testing.

Users can quickly record an entire test suite with the help of the vREST Chrome Extension. Very little effort is then required to maintain the recorded test cases, which can be executed again and again to deliver zero-defect web applications under continuous development.

No skilled resources required to validate your web application.

No extensive programming skills are required for a tester to operate vREST, although we recommend the following basic skills to speed up testing:

- Basic knowledge of JavaScript, only if the user needs to write a custom validator

- A basic understanding of what REST APIs are

- Adequate knowledge of the web application under test and its API structure

That's it.
Quickly generate documentation for your API specifications.

Users can quickly generate documentation from the recorded test cases with the help of the "Export as API Specs" function.

Ease of maintenance over a span of releases.

Various operations are provided in vREST to make managing test cases, API specs, etc. easier. With functions like Bulk Operation and Search and Replace, a user can quickly modify a large number of records. With the "Make Replica" function, a user can copy an entire version with a single click. These functions make maintenance much easier when a new version of the application under test is released.

De-couple your frontend development from backend development.

With the Mock Server functionality, users can create API mocks in vREST. With the help of the Mock Server:

- Users can start developing the frontend directly against mock HTTP requests.

- Development speeds up, because frontend work can begin before the backend is in place. Both teams (backend and frontend) can work in parallel.

- Demo applications can be written without the backend being in place.

Getting Started
In vREST, a group of HTTP requests can be organized to be executed in sequence. Each HTTP request is represented as a test case. For each HTTP request:

- the user can assign a response validator to check whether the test case passes or not;

- if the HTTP request produces a JSON response, variables can be extracted from the response data, and the values of those variables can be used in subsequent test cases.

If you haven't created an account on vREST yet, first create your account at vREST. Then create an instance on vREST and follow the steps below to quickly get started:

1. Create Test Case(s)

A test case can be created in the following ways:

- Manual Insertion

A test case can be created by simply clicking the New button in the Test Cases tab. A popup will appear as shown below; fill in the basic details to create a new test case.

After clicking the Save button, a test case will be created as shown below. Further details can be entered after creating the test case.

- Automated recording through vREST Chrome Extension (Browser Extension)

Test case(s) can also be created by recording HTTP requests of the web application under test. To record HTTP requests, a Chrome browser extension (vREST Chrome Extension) needs to be installed. Refer to the section Extension, which explains how to download and use the extension.

- Exporting from API Specs

If API specs (API specifications) are already inserted in vREST, then test case(s) for the selected API spec(s) can be created by selecting the More menu and then Export to Test Cases in the API Specs tab.

Tip: For creating a large number of test cases, manual insertion can be tedious and error-prone. We recommend either using the vREST Chrome Extension or exporting from API Specs (if API specs are already written).

2. Response Validation

Response validation is done with the help of response validators. A response validator is simply a JavaScript function. A test case passes if the assigned response validator returns true, and fails otherwise. For more details, view Response Validation.

3. Run the Test Case(s)

A test case or a set of test cases can be executed by simply clicking the Run button in the Test Cases tab. A Test Runner window will appear, and the test case results will be displayed in that window.

4. Generate Documentation (Optional)

API documentation can be generated by defining API specs in vREST. An API spec can be created in the following ways:

- Manual Insertion

An API spec can be created by simply clicking the New button in the API Specs tab. A popup will appear as shown below; fill in the details to create a new API spec.

- Exporting from Test Cases

If test cases have already been created in vREST, either through the extension recorder or manual insertion, then API spec(s) for the selected test case(s) can be created by selecting the More menu and then Export to API Specs in the Test Cases tab.

Tip: For creating a large number of API specs, manual insertion can be tedious and error-prone. We recommend using the vREST Chrome Extension to first record test cases and then using the Export to API Specs feature in the "Test Cases" tab.

Example
Objective of this example
To demonstrate:

- how we can validate the REST APIs of a web application using vREST;

- how we can manage our test suites / test cases / API specs over a span of releases of a web application in vREST.

For the demo, we have developed a sample test web application, Contacts.

About Contacts Application


Contacts is a sample CRUD application built with Backbone.js, Twitter Bootstrap, Node.js, Express,
and MongoDB. The application allows you to browse through a list of contacts, as well as add,
update, and delete contacts.
The application has the following REST API endpoints:

Method    API Endpoint        Action
GET       /contacts           Retrieve all contacts
GET       /contacts/{{id}}    Retrieve the contact with the specified id
POST      /contacts           Add a new contact
PUT       /contacts/{{id}}    Update the contact with the specified id
DELETE    /contacts/{{id}}    Delete the contact with the specified id

The application has two versions released in the market.

Version 1.0.0: In this version, the application has the following schema with some validations:

{
  contact: {
    name: {type: String, required: true, maxLength: 35},
    email: {type: String},
    designation: {type: String, maxLength: 35},
    organization: {type: String, maxLength: 35},
    country: {type: String, maxLength: 35},
    aboutMe: {type: String},
    githubId: {type: String},
    facebookId: {type: String},
    twitterId: {type: String}
  }
}
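The validations in the schema above can be sketched as a plain JavaScript check. This is a hypothetical illustration of the v1.0.0 rules only (the helper name and error format are our own, not part of the Contacts demo or vREST):

```javascript
// Sketch of the v1.0.0 validations: `name` is required, and `name`,
// `designation`, `organization`, and `country` are capped at 35 characters.
// `email` and `aboutMe` are free-form in v1.0.0.
const MAX = 35;
const cappedFields = ['name', 'designation', 'organization', 'country'];

function validateContactV1(contact) {
  const errors = [];
  if (!contact.name) errors.push('name is required');
  for (const field of cappedFields) {
    const value = contact[field];
    if (typeof value === 'string' && value.length > MAX) {
      errors.push(`${field} exceeds ${MAX} characters`);
    }
  }
  return errors; // empty array means the contact is valid
}
```

A contact with only a short `name` passes; a missing `name` or an over-long field produces an error entry.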

The version 1.0.0 of the Contacts Application can be accessed at the following URLs:
Test Instance: http://example.vrest.io/contacts/v1/test
Production Instance: http://example.vrest.io/contacts/v1/prod

Version 2.0.0: In this version, we made the following functional additions to the application:

- Added an email validator for the email field

- The aboutMe field can now be at most 500 characters long

{
contact: {
name: {type: String, required: true, maxLength: 35},
email: {type: String, validator: emailValidator},
designation: {type: String, maxLength: 35},
organization: {type: String, maxLength: 35},
country: {type: String, maxLength: 35},
aboutMe: {type: String, maxLength: 500},
githubId: {type: String},
facebookId: {type: String},
twitterId: {type: String}
}
}

The version 2.0.0 of the Contacts Application can be accessed at the following URLs:
Test Instance: http://example.vrest.io/contacts/v2/test
Production Instance: http://example.vrest.io/contacts/v2/prod

Usage Scenarios

Testing an existing web application under test using vREST

Upgrading Test Cases for the next version of the web application under test

Scenario I: Testing an existing web application (Contacts Application v1.0.0) under test
using vREST

Prerequisites

vREST instance:

You will need to create your own vREST instance; you can create one after signing up on vREST. The instance which we have used to test the Contacts application is available at
http://vrest.io/i/contacts
and the credentials to access this instance are as follows:
Email: shared@optimizory.com
Password: shared
Note: Since this is a demo instance, the above credentials give you only read-only access to the instance, along with permission to execute the test cases stored in it.

Test Application:
The web application which we are going to test using vREST. In our case, it is Contacts Application v1.0.0, which can be accessed at the following URL:
http://example.vrest.io/contacts/v1/test

Browser Extension (vREST Chrome Extension):

A Chrome browser extension which is used to record and replay vREST test cases. For more info about this extension, please see vREST Chrome Extension.

Steps
Step-by-step instructions to validate the REST APIs of a web application (in our case, Contacts) are as follows:

1. Setup Project in vREST

The first step is to set up a project in vREST. By default, a vREST instance comes with a sample project; you can either create a new project or use/rename the default one. For more info on how to configure projects, please see Configuration Page.

2. Configure vREST Chrome Extension (a browser extension) for your vREST Instance

We need to configure the vREST Chrome Extension with the following parameters:
Instance Name: contacts (In your case, it will be the name of your own vREST instance.)
Filters >> URL Pattern Filters: Has Substring - /contacts (because all of our test application's API endpoints contain this string. Specifying this rule will record only these HTTP requests and will prevent all other HTTP requests from being recorded.)

3. Configure Default Recorder Parameters in vREST

We need to set the extension's default parameters so that every recorded test case carries that information automatically. We have set the following parameters for our Contacts application in vREST:
Project: Contacts Application
Validator: Default Validator

4. Add a test case to initialize the test application

This is similar to the setup method of a test case, but in our case it applies to a group of test cases or to a test suite. The setup method is simply yet another test case in vREST. The idea is that you need to implement some logic over a REST API to initialize your test application. This can be done in two ways:

1. Implement a REST API endpoint in the test application itself, which is open only in the test environment and which inserts the initialization data when invoked.

2. Create another application which provides REST APIs to execute commands on the machine where the test application's data resides. With the help of this application, we can insert database dumps through commands, and these commands can be invoked via REST APIs.
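Option 1 can be sketched as follows. This is a framework-free, hypothetical illustration (the handler shape and the `seedContacts` helper are our own assumptions, not part of the Contacts demo): the initialization logic is only reachable in the test environment.

```javascript
// Sketch of a test-environment-only initialization handler.
// `seedContacts` is a hypothetical helper that loads fixture data
// into the datastore; `env` is the current deployment environment.
function makeInitHandler(seedContacts, env) {
  return function handleInit() {
    if (env !== 'test') {
      // Refuse to reset data outside the test environment.
      return { status: 403, body: { error: 'init endpoint is test-only' } };
    }
    seedContacts(); // insert the initialization data
    return { status: 200, body: { initialized: true } };
  };
}
```

In a real Express app, this handler body would sit behind a route such as a test-only POST endpoint, invoked by the setup test case before the rest of the suite runs.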

5. Manually test the Contacts application; test cases will be recorded automatically through the vREST Chrome Extension

This step creates all your test cases automatically. All you have to do is test the web application manually, covering all scenarios/conditions; the browser extension will record the HTTP traffic according to your configuration rules and store the requests as test cases in vREST.


After the vREST Chrome Extension has recorded all the test cases as shown previously, one can fill in details like the summary and description of the recorded test cases, or modify other details of the test cases as shown below.

6. Assign the recorded test cases to a version

Versions for a project can be configured in the "Configuration" tab. After you have created versions for the project, you can bulk-select the recorded test cases and use a bulk operation to assign them a version. For more info on bulk operations, please see Test Cases.

7. Execute Test Cases

Now try to execute the test cases. A built-in test runner will execute the set of test cases one by one with the help of the vREST Chrome Extension.

In the context of our Contacts application, some of the test cases will fail, as shown below.

The test cases can fail for the following reasons:


1. Dynamic results of test cases

The Default Validator checks for an exact match between expected and actual results. Since our results are dynamic, the Default Validator will flag all these dynamic test cases as failed upon execution.
To resolve this issue, we can do one of the following:

1. Define our own custom validator

2. Check only the schema of the actual results with the help of the Default Schema Validator

e.g. In our Contacts Application, the test cases failed due to the following dynamic properties:

1. _id

2. createdOn

In the Contacts Application, we have created a new response validator named "Contacts Dynamic Response Validator", which checks only the type of the dynamic properties in the response rather than their exact values, while all other properties are still checked for an exact match. We have assigned this validator via a bulk operation to the test cases whose responses are dynamic.
Note: More info on how vREST checks whether a test case passed or failed can be found at Response Validation.
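The idea behind such a validator can be sketched as a plain JavaScript function. This is a hypothetical illustration, not the actual "Contacts Dynamic Response Validator": properties listed as dynamic are checked by type only, everything else by exact value.

```javascript
// Dynamic properties and their expected JavaScript types: these are
// checked by type only, because their values change on every run.
const dynamicProps = { _id: 'string', createdOn: 'string' };

function dynamicMatch(expected, actual) {
  for (const key of Object.keys(expected)) {
    if (key in dynamicProps) {
      // Dynamic property: only the type must match.
      if (typeof actual[key] !== dynamicProps[key]) return false;
    } else if (actual[key] !== expected[key]) {
      // Everything else must match exactly.
      return false;
    }
  }
  return true;
}
```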

2. Dynamic URLs of test cases

In the Contacts application, a contact's id is a dynamically generated 32-character hash. Every time we add a contact, a new id is generated. To resolve this issue, we can store the id of the newly created contact in a variable with the help of the variable extractor, and then use that variable in subsequent test cases.

After assigning the custom validator as described in the previous step, we executed the test cases again. The results are as shown below:

The single remaining test case failed because the id used in its URL is dynamic. To fix this, we can store the id of the generated contact (step 3 in the figure) in a variable, as shown below:

After that, we need to replace the id used in URLs and expected results with the variable "contactId". For that, we can use the "Search and Replace" feature of vREST.

Now, after the "Search and Replace" operation, our test cases will look as shown in the figure below.

After running the test cases again, we found that all the test cases now pass, as shown below.

Scenario II: Upgrading Test Cases for the next version of the web application (Contacts Application v2.0.0) under test

Prerequisites

The prerequisites are the same as in Scenario I, except that the test application (Contacts Application) is now upgraded to v2.0.0, which can be accessed at the following URL:
http://example.vrest.io/contacts/v2/test

Steps
Step-by-step instructions to manage test cases for the REST APIs of the upgraded web application (in our case, Contacts) are as follows:

1. Make a Replica of the previous version (v1.0.0) in vREST

The first step is to create a new version in vREST, create copies of the API specs / test cases, and assign them to the newer version. In vREST, this whole task can be done with a single click of the "Make Replica" button in Configuration >> Project Configuration. For more info on how to make a replica of a version, please see Configuration. After that, rename the newly created version to whatever name you want (in our case, 2.0.0).

2. Commit the previous version (v1.0.0) in vREST

Since the previous version of the application is already in the market, ideally we should not change its API specs / test cases. So we commit the previous version of our web application in vREST. This can be done with a single click of the "Commit" button in Configuration >> Project Configuration. For more info on how to commit a version, please see Configuration.

3. Add/modify/remove test cases for the newer version

In this step, we will:

- add test cases, if there are any new functional additions in the newer version. E.g. for our Contacts Application, we added two validations in the newer version, so we add test cases to check that these additional validations work as expected.

- remove test cases, if there are any APIs which are no longer supported, or any checks which have been removed.

- modify test cases, if there are any changes to the previous version of the APIs.

4. Execute test cases for the newer version

Now try to execute the test cases of the newer version. Some of the test cases may fail for the following reasons:

- A functional bug in the functionality covered by the test case. If this is the case, correct the bug and execute the test cases again to see the changes.

- API responses might have changed, whether because new fields were added, old fields were removed, or some algorithm changed. If this is the case, review the failed test case's diff report and modify the expected results accordingly.

Reporting
In this section, we review all the reports provided by vREST in the context of the sample Contacts application. vREST provides the following reports:

Report: Filter by Test Runs

In vREST, one can filter on test runs to view a test case's history across the filtered test runs. After filtering, a small pie chart indicator is shown for each test case, summarizing in how many of those test runs the test case passed, failed, or was not executed.

Report: Expected/Actual Results Report

To view the expected/actual results of a test case:

- First, filter on test run(s).

- Then go to the "Results" subtab of the test case to view the expected/actual results against a test run, along with the status of whether the test case passed or not.

Report: Diff Report

To view the diff report for a test case:

- First, filter on test run(s).

- The diff report can then be seen in the test case's "Diff" subtab, as shown in the figure below.

Report: Test Runs Summary Report

To view a test run's summary report, simply go to the "Test Runs" tab. The summary report will be shown as in the figure below:

Test Cases
Test Case Creation

A test case can be created in the following ways:

- Manual Insertion

A test case can be created by simply clicking the New button in the Test Cases tab. A popup will appear as shown below; fill in the basic details to create a new test case.

After clicking the Save button, a test case will be created as shown below. Further details can be entered after creating the test case.

- Automated recording through vREST Chrome Extension (Browser Extension)

Test case(s) can also be created by recording HTTP requests of the web application under test. To record HTTP requests, a Chrome browser extension (vREST Chrome Extension) needs to be installed. Refer to the section Extension, which explains how to download and use the extension.

- Exporting from API Specs

If API specs (API specifications) are already inserted in vREST, then test case(s) for the selected API spec(s) can be created by selecting the More menu and then Export to Test Cases in the API Specs tab.

Tip: For creating a large number of test cases, manual insertion can be tedious and error-prone. We recommend either using the vREST Chrome Extension or exporting from API Specs (if API specs are already written).

Test Case Organization

Test cases can be grouped on the following attributes:

Test Suites

Tags

Versions

Project

Reordering of test cases can be done in the following ways:

- via Cut/Copy/Paste operations

- via the drag/drop feature, to reorder a test case within a page

Test Case Operations

Commit/Un-commit Test Case(s)

A test case can be made non-editable by clicking the Commit button in the Test Cases tab's toolbar, and can be made editable again by clicking the Un-Commit button.

The API specs, test cases, and API mocks of an entire project version can be committed / un-committed by following the steps below:

- First, go to Configuration > Project Configuration.

- Select the project from the project selector dropdown.

- In the Versions section, click the Commit/Un-Commit button to commit/un-commit the API specs or test cases of the selected versions.

Delete Test Case(s)

A test case or a set of test cases can be deleted by clicking the Delete button in the Test Cases tab's toolbar.

Cut/Copy/Paste Test Case

Cut/Copy/Paste operations are available in the More menu in the Test Cases tab's toolbar.

Export to API Specs

A filtered or selected set of test cases can be exported as API specs by selecting the More > Export to API Specs option. Test cases in the filtered/selected set which share a common URL and request method are converted to a single API spec.
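The merge rule — one spec per (method, URL) pair — can be sketched as follows. This is a hypothetical illustration of the grouping behavior, not vREST internals:

```javascript
// Group test cases into specs keyed by "METHOD url"; test cases that
// share a method and URL collapse into a single spec entry.
function groupIntoSpecs(testCases) {
  const specs = new Map();
  for (const tc of testCases) {
    const key = `${tc.method} ${tc.url}`;
    if (!specs.has(key)) {
      specs.set(key, { method: tc.method, url: tc.url, cases: [] });
    }
    specs.get(key).cases.push(tc.summary);
  }
  return [...specs.values()];
}
```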

Search and Replace

The Search and Replace operation can be performed on a filtered/selected set of test cases by selecting the More > Search and Replace option. As of now, simple text-based search is available on the following attributes:

- URL

- Summary

- Description

- Parameters

- Expected Results

Bulk Operation

Bulk operations make the management of a large number of test cases easier. Bulk operations can be performed on the following attributes:

Method

Project

Version

Tag

Test Suite

Validator

Authorization

Filters

Filters on the following attributes are available to filter test cases:

Test Run

Project

Status

Version

Test Suite

Tags

API Specifications

The API Specs tab is used to specify HTTP REST API specifications.

API Spec Creation

An API spec can be created in the following ways:

- Manual Insertion

An API spec can be created by simply clicking the New button in the API Specs tab. A popup will appear as shown below; fill in the details to create a new API spec.

- Exporting from Test Cases

If test cases have already been created in vREST, either through the extension recorder or manual insertion, then API spec(s) for the selected test case(s) can be created by selecting the More menu and then Export to API Specs in the Test Cases tab.

Tip: For creating a large number of API specs, manual insertion can be tedious and error-prone. We recommend using the vREST Chrome Extension to first record test cases and then using the Export to API Specs feature in the "Test Cases" tab.


API Spec Organization

API specs can be grouped on the following attributes:

- Project

- Tags

- Versions

Reordering of API specs can be done in the following ways:

- via Cut/Copy/Paste operations

- via the drag/drop feature, to reorder an API spec within a page

API Spec Operations

Commit / Un-commit API Spec(s)

An API spec can be made non-editable by clicking the Commit button in the API Specs tab's toolbar, and can be made editable again by clicking the Un-Commit button.

The API specs, test cases, and API mocks of an entire project version can be committed / un-committed by following the steps below:

- First, go to Configuration > Project Configuration.

- Select the project from the project selector dropdown.

- In the Versions section, click the Commit/Un-Commit button to commit/un-commit the API specs, test cases, and API mocks of the selected versions.

Delete API Spec(s)

An API spec or a set of API specs can be deleted by clicking the Delete button in the API Specs tab's toolbar.

Cut/Copy/Paste API Spec

Cut/Copy/Paste operations are available in the More menu in the API Specs tab's toolbar.

Export to Test Cases

A filtered or selected set of API specs can be exported as test cases by selecting the More > Export to Test Cases option. By default, all the fields which are common to both test cases and API specs are populated in the resulting test cases.

Search and Replace

The Search and Replace operation can be performed on a filtered/selected set of API specs by selecting the More > Search and Replace option. As of now, simple text-based search is available on the following attributes:

URL

Summary

Description

Parameters

Bulk Operation

Bulk operations make the management of a large number of API specs easier. Bulk operations can be performed on the following attributes:

Method

Project

Version

Tag

Filters

Filters on the following attributes are available to filter API specifications:

Project

Version

Tag

Mock Server
The Mock Server provides functionality to mock any HTTP request. This is helpful in decoupling backend development from frontend development:

- One can start developing the frontend directly against mock HTTP requests.

- Development speeds up, because frontend work can begin before the backend is in place. Both teams (backend and frontend) can work in parallel.

- Demo applications can be written without the backend being in place.
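The decoupling idea can be sketched as follows: the frontend depends only on a base URL (and a fetch-like function), so the real backend can be swapped for a mock server purely through configuration. The base URL and the canned response below are hypothetical placeholders, not real vREST mock endpoints:

```javascript
// The frontend client takes its base URL and HTTP function as inputs,
// so pointing it at a mock server is just a configuration change.
function makeContactsClient(baseUrl, fetchFn) {
  return {
    async listContacts() {
      const res = await fetchFn(`${baseUrl}/contacts`);
      return res.json();
    },
  };
}

// A fake fetch standing in for a mock server's canned response.
const fakeFetch = async () => ({
  json: async () => [{ name: 'Jane Doe' }],
});
```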

API Mock Creation

An API mock can be created by simply clicking the New button in the Mock Server tab. A popup will appear as shown below; fill in the details to create a new API mock.

API Mock Organization

API mocks can be grouped on the following attributes:

- Tags

- Versions

- Project

Reordering of API mocks can be done in the following ways:

- via Cut/Copy/Paste operations

- via the drag/drop feature, to reorder an API mock within a page

API Mock Operations

Commit/Un-commit API Mock(s)

An API mock can be made non-editable by clicking the Commit button in the Mock Server tab's toolbar, and can be made editable again by clicking the Un-Commit button.

The API specs, test cases, and API mocks of an entire project version can be committed / un-committed by following the steps below:

- First, go to Configuration > Project Configuration.

- Select the project from the project selector dropdown.

- In the Versions section, click the Commit/Un-Commit button to commit/un-commit the API specs, test cases, and API mocks of the selected versions.

Delete API Mock(s)

An API mock or a set of API mocks can be deleted by clicking the Delete button in the Mock Server tab's toolbar.

Cut/Copy/Paste API Mock

Cut/Copy/Paste operations are available in the More menu in the Mock Server tab's toolbar.

Search and Replace

The Search and Replace operation can be performed on a filtered/selected set of API mocks by selecting the More > Search and Replace option. As of now, simple text-based search is available on the following attributes:

URL

Summary

Description

Parameters

Bulk Operation

Bulk operations make the management of a large number of API mocks easier. Bulk operations can be performed on the following attributes:

Method

Project

Version

Tag

Filters

Filters on the following attributes are available to filter API mocks:

Project

Version

Tags

Response Validation
Response validation is done with the help of response validators. A response validator is simply a JavaScript function. A test case passes if the assigned response validator returns true; otherwise it fails. Any variables used in the expected results are first replaced, and new variables are extracted from the actual results of the test case for use in subsequent requests, provided the response type is JSON and extraction rules were defined in the Variable Extractor tab during test case definition.
A response validator function receives the following input parameters:

1. testcase

The first parameter is the test case, having all the details which have been provided in the Test Cases tab.

2. response

The second parameter is the actual response of the HTTP request. It is a JSON object with the following keys:

- headers: the HTTP response headers retrieved

- actualResults: the HTTP response result retrieved. It is also a JSON object with the following keys:

  - content: (String) the actual HTTP response body

  - resultType: (String) the content type of the HTTP response body

3. methods

The third parameter is the methods object. It contains the following predefined utility methods:

- compareJSON: compares two JSON objects property by property, and also provides special checks for given properties via the "customJSONPropertyChecker" parameter.

- validateJSONSchema: validates JSON objects using JSON Schemas. For more information, please see the source code at the following link:

  Source: https://github.com/kriszyp/json-schema/blob/master/lib/validate.js

  Version: 0.2.2

Users can define their own custom response validators. By default, the following two response validators are provided:

1. Default Validator

This validator checks the contents of the expected results against the actual results retrieved for a test case, requiring an exact match.
Note: If a test case always produces dynamic results, the Default Validator will always fail that test case. In this scenario, in order to validate the test case correctly, define your own custom response validator and, in that validator, either ignore the dynamic attributes in the actual results or just check the schema of the dynamic attributes.

2. Default Schema Validator

This validator checks the schema of the actual results (only applicable for JSON responses) against the expected schema defined during test case definition.
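Following the (testcase, response, methods) signature described above, a minimal custom validator might look like the sketch below. This is a hypothetical illustration: the property name `testcase.expectedResults` and the JSON parsing of `response.actualResults.content` are assumptions for the example, not confirmed vREST API details.

```javascript
// Hypothetical custom validator: reject non-JSON responses, ignore the
// dynamic `_id` field, and require all other expected fields to match.
function myValidator(testcase, response, methods) {
  if (response.actualResults.resultType !== 'application/json') return false;
  const actual = JSON.parse(response.actualResults.content);
  const expected = testcase.expectedResults;
  return Object.keys(expected).every(
    key => key === '_id' || actual[key] === expected[key]
  );
}
```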


Authorization
This functionality serves the purpose of making authenticated/authorized HTTP requests. Users can set up any number of authorizations in vREST. Authorizations can be set up via the "More" > "Setup Authorization(s)" option in the "Test Cases" tab's toolbar. At present, the following types of authorization are possible:

1. Basic Authorization

In Basic Authorization, the system asks for a username and password for making authenticated HTTP requests, and a basic authorization header is sent with each HTTP request.
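The header sent is the standard HTTP Basic scheme (RFC 7617): the word "Basic" followed by the base64 encoding of "username:password". A sketch of its construction:

```javascript
// Standard HTTP Basic authorization header: "Basic " + base64("user:pass").
function basicAuthHeader(username, password) {
  const token = Buffer.from(`${username}:${password}`).toString('base64');
  return `Basic ${token}`;
}
```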

2. OAuth 1.0

In OAuth 1.0 authorization, the system asks for the following attributes:

- Signature Method (supports HMAC-SHA1, PLAINTEXT)

- Consumer Key

- Consumer Secret

- Scope
After filling in these attributes, the user needs to follow the two steps mentioned below:

Step 1: Access Token Generator (Optional)

If you already have the access token key and secret (generated from an external application), you can skip this step completely; otherwise it is required. Fill in the following details to complete this step:

- Request Token URL

- Authorize URL

- Access Token URL

After filling in the above attributes, click on "Authorize & Generate Access Token". The system will redirect you to authorize the given consumer in a popup window. (If the popup window doesn't open, please check in the address bar whether popups are blocked; if they are, allow access to complete the authorization process.) After successful authorization, close the popup window and click the "Refresh" button in the previous window to fill in the access token key/secret in Step 2 automatically.

Step 2: If you already have the access token, or generated it in Step 1

If you have generated the access token key/secret pair from an external application, just fill in the details and save the authorization. If you have completed Step 1, click Refresh to have the system fill in these details automatically.
Note: For more details on OAuth 1.0, please refer to the OAuth 1.0 Spec.

2. OAuth 2.0
In OAuth 2.0 authorization, system asks for the following attributes:
o

Client Id

Client Secret

Scope (Optional)

After filling these attributes, user need to follow the two steps mentioned below:

51

Step 1: Access Token Generator (Optional)

If you already have the access token key and secret (generated from an external
application), you can skip this step completely; otherwise it is required. Fill in the
following details to complete this step:

- Grant Type: vREST supports the following grant types:
  1. Authorization Code Flow
  2. Resource Owner Password Credentials
  3. Client Credentials
- Authorize URL
- Access Token URL

In addition, if you have chosen the grant type "Resource Owner Password Credentials",
the system asks for the following additional information:

- Username
- Password

After filling in the above attributes, click on "Authorize & Generate Access Token". The
system will redirect you to authorize the consumer in a popup window. (If the popup
window doesn't open, check the address bar to see whether popups are blocked; if they
are, allow popups to complete the authorization process.) After successful
authorization, close the popup window and click on the "Refresh" button in the previous
window to fill in the access token in Step 2 automatically.
Step 2: Fill in the access token (either already available or generated in Step 1)

If you have generated the access token from an external application, just fill in the
details and save the authorization. If you have completed Step 1, click on "Refresh" to
let the system fill in these details automatically.
Note: For more details on OAuth 2.0, please refer to the OAuth 2.0 Spec.
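For the "Resource Owner Password Credentials" grant type, the generator's job amounts
to a single form-encoded POST to the Access Token URL (RFC 6749, section 4.3). A
hedged sketch in Python; the URL, client credentials and user credentials below are
placeholders, not real vREST values:

```python
import urllib.parse
import urllib.request


def build_password_grant_request(token_url, client_id, client_secret,
                                 username, password, scope=None):
    """Build (but do not send) the POST request that exchanges resource-owner
    credentials for an access token."""
    form = {
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
    }
    if scope:
        form["scope"] = scope  # Scope is optional, as in the vREST dialog.
    body = urllib.parse.urlencode(form).encode()
    return urllib.request.Request(
        token_url,
        data=body,  # presence of a body makes this a POST
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )


# Hypothetical Access Token URL and credentials, for illustration only:
req = build_password_grant_request(
    "https://auth.example.com/token",
    "my-client", "my-secret", "alice", "pa55word", scope="read")
```

Sending `req` would return a JSON body containing `access_token`, which is the value
Step 2 stores.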

3. Raw Authorization

In Raw Authorization, the system asks for an authorization header. This header will be
sent with each HTTP request to which this authorization is assigned.
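For example, a common raw authorization header is HTTP Basic auth, whose value can be
produced as follows (illustrative sketch; the credentials are placeholders):

```python
import base64


def basic_auth_header(username, password):
    """Return the value for a raw 'Authorization' header using HTTP Basic auth:
    'Basic ' followed by base64("username:password")."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"


# The resulting string is what you would paste into the Raw Authorization field:
print(basic_auth_header("alice", "secret"))  # Basic YWxpY2U6c2VjcmV0
```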


Test Runner

vREST provides an inbuilt test runner via the vREST Chrome Extension. With this Test
Runner, you can make HTTP requests to a web application whether it is deployed locally
or over the intranet/internet. To run the test cases, follow the steps below:

- First select the test cases you want to execute; otherwise the filtered list of test
  cases will be executed.
- Then click on the "Run" button in the "Test Cases" tab's toolbar to start the test
  runner.


All the test cases will be executed in sequence. A popup window will appear showing the
results of the executed test cases, their status indicators and the current test run
progress.

Note: The Test Runner functionality works only in the Google Chrome browser, with the
help of the vREST Chrome Extension.

Variables

Variables in vREST provide a way to dynamically change the test case configuration at
run time. Variables in vREST have the format {{VARIABLE_NAME}} and are of two types:

1. Predefined/Global (Project-wise) Variables

Predefined variables are defined before the execution of test case(s). To define a
predefined variable:


- Select the "More" > "Variables" option in the "Test Cases" tab's toolbar
- Then select the project and define the variables for the selected project
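At run time, vREST replaces each {{VARIABLE_NAME}} placeholder in the test case
configuration with the variable's value. A minimal sketch of that substitution (the
helper and the variable names are illustrative, not vREST internals):

```python
import re


def substitute(template, variables):
    """Replace {{VARIABLE_NAME}} placeholders, the format vREST uses, with
    values from a variables dict; unknown placeholders are left untouched."""
    def repl(match):
        name = match.group(1)
        return str(variables.get(name, match.group(0)))
    return re.sub(r"\{\{(\w+)\}\}", repl, template)


# Hypothetical project variables:
vars_ = {"baseURL": "http://localhost:3000", "userId": 42}
print(substitute("{{baseURL}}/users/{{userId}}", vars_))
# http://localhost:3000/users/42
```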

2. Variables extracted from test cases

Variables can also be extracted from a test case's response and used in subsequent
HTTP requests (test cases). To extract variable values from a test case's actual
results, follow the steps below:

- Expand the test case from which you want to extract variables
- Go to the "Variable Extractor" sub-tab
- Define the variable names and their corresponding paths in the JSON response

Note:

- Variable extractor is only applicable for JSON responses.
- We are using jsonPath version 0.8.5. Please use syntax starting with "$." or "$[" as
  provided in the link.
- These variables will be available in all subsequent test cases within a test run.
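To illustrate the idea behind the Variable Extractor, here is a toy extractor for very
simple "$."-style paths. It supports only dot and index access, a small illustrative
subset; vREST itself uses the full jsonPath 0.8.5 library:

```python
import json
import re


def extract(response_json, path):
    """Resolve a simple jsonPath-like expression such as "$.user.tokens[1]"
    against a JSON response. Illustrative subset only: dots and [n] indexes."""
    node = json.loads(response_json) if isinstance(response_json, str) else response_json
    # Split "$.user.tokens[1]" into ["user", "tokens", "1"].
    tokens = re.findall(r"\w+", path.lstrip("$."))
    for tok in tokens:
        node = node[int(tok)] if tok.isdigit() else node[tok]
    return node


# A hypothetical login response body:
body = '{"user": {"id": 7, "tokens": ["abc", "def"]}}'
print(extract(body, "$.user.tokens[1]"))  # def
```

A variable defined this way (e.g. name `authToken`, path `$.user.tokens[1]`) would then
be usable as {{authToken}} in later test cases of the same run.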

vREST Chrome Extension

Purpose

The vREST Chrome Extension (a browser extension, currently available only for Chrome)
is a very simple tool which records the HTTP requests of the web application under
test, along with their responses, and automatically stores them as test cases in vREST.
It can be configured to filter the HTTP requests to record according to content type
and URL pattern.

Note: Please note that

- Any network communication (via HTTP requests) which is not in the knowledge of the
  browser tab will not be recorded by the extension.
- For the extension to work, the user must be logged in to the vREST instance that has
  been configured in the extension configuration.

Installation

The vREST Chrome Extension is a browser extension and can be installed directly from
the Chrome Web Store.


Configuration

The vREST Chrome Extension can be configured as follows:

- Click on the extension icon in the browser navigation bar
- Then click on the "Configuration" menu item


After that, the vREST Chrome Extension Configuration page will open as shown below:

Provide the following details in the vREST Chrome Extension Configuration screen:

- Instance Name: Provide the instance name of the vREST application
- Filters: Filters specify whether an HTTP request will be recorded or not, according
  to its content type and URL. There are two types of filters, applied in the
  following order:

  1. Request Type Filter: Records HTTP requests according to their content type. For
     example, if the user has applied a filter for XHR, then any HTTP request that
     represents an XHR will be recorded.
  2. URL Pattern Filter: Records HTTP requests according to their URL. If the URL
     matches the specified pattern, the HTTP request will be recorded; otherwise it
     will be ignored.

Note: The user can also check whether a particular URL will be recorded or not
according to the URL Pattern Filter by using the Check URL section.
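The two filters can be thought of as one short predicate applied to every request in
order. The sketch below assumes a glob-style URL pattern; the names and the exact
matching rules are illustrative and may differ from vREST's implementation:

```python
from fnmatch import fnmatch


def should_record(url, content_type, allowed_types, url_pattern):
    """Decide whether the recorder keeps a request, applying the filters in
    the documented order: request type first, then URL pattern."""
    if content_type not in allowed_types:  # 1. Request Type Filter
        return False
    return fnmatch(url, url_pattern)       # 2. URL Pattern Filter


# Hypothetical configuration: record only XHR calls to the local API.
print(should_record("http://localhost:3000/api/users", "xhr",
                    {"xhr"}, "http://localhost:3000/api/*"))  # True
```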


How to use

- First open the web application under test in Google Chrome.
- Then click on the extension icon in the browser navigation bar
- Then select the "Start Recording" menu item from the list of options
- The extension icon will turn green to signal that the Recorder is in recording mode
- Manually test the web application; the vREST Chrome Extension will record the
  requests (filtered by the rules) and save them as test cases in the vREST
  application.
- Stop the recording by selecting the "Stop Recording" menu item from the list of
  options.

Diagnostics

If you find any issues while recording the requests, you can diagnose the issue by
following the steps below:

- Go to the vREST Chrome Extension Configuration screen
- Click on the "Diagnose Extension Issues" button

Note: If you still have issues after running the diagnostics, contact support@vrest.io

Error Console

If for some reason the extension doesn't record the HTTP requests, errors can also be
seen in the error console provided by Chrome. To view the error console:

- Go to the chrome://extensions page
- Click on the background page for the vREST Chrome Extension
- A new window will open; go to the "Network" tab
- Every recorded HTTP request triggers an API call to the vREST instance. These API
  requests will be shown in the Network tab, and if there are any errors, they will be
  shown in the responses of these requests.

Configuration

Application Configuration

Application Configuration can be accessed via "Configuration" > "Application
Configuration". It has the following sections:

General Configuration

In this section, the following basic information about the application is provided:

- Application Name
- Version
- Instance Name
- Organization Name

Projects


In this section, the user can configure the various projects for this instance and
their managers.

Users

In this section, an admin user can configure the various users for this instance and
their instance-wide roles.

Roles and Permissions

This section is non-editable and available for viewing purposes only. In this section,
the user can view the instance/project-wide roles and their individual permissions.


Response Validators

In this section, default response validators are already provided. A custom response
validator can be created by clicking on the "Create New" link on the left-hand side.
For more information on response validators, please refer to Response Validation.

Project Configuration

Project Configuration can be accessed via "Configuration" > "Project Configuration".
After selecting the "Project Configuration" sub-tab, select the project from the
right-hand-side dropdown. It has the following sections:

Versions

In this section, the user can define the versions of the selected project. Other than
the basic CRUD operations, the following operations are also possible:
Make Replica

To make a replica of a version's API specs, test cases and API mocks, follow the steps
below:

- First select the version of which you want to make a replica.
- Then click on the "Make Replica" button to make a copy of the selected version. All
  the API specs, test cases and API mocks assigned to the selected version are
  replicated and assigned to the new version.
Commit/Un-Commit

To commit/un-commit a version or a set of versions, follow the steps below:

- First select the version or set of versions which you want to commit/un-commit.
- Then click on the "Commit"/"Un-Commit" button to perform the operation. All the API
  specs, test cases and API mocks assigned to the selected version(s) are
  committed/un-committed.

Test Suites

In this section, the user can define the test suites of the selected project, which
can then be assigned to individual test cases.

Project Users

In this section, the user can define the different team members for the selected
project and manage their individual roles in the project.

Frequently Asked Questions (FAQ)

1. How can I update the Chrome extension "vREST Chrome Extension" manually?

By default, Chrome auto-updates its extensions at some interval. Between two intervals,
if vREST throws an error about an incompatible version of the Chrome extension, the
extension needs to be updated manually. To manually update the Chrome extension, follow
the steps below:

- Open the chrome://extensions page in a new browser tab (Google Chrome)
- Enable developer mode
- The "Update extensions now" button will then appear; click on this button to update
  the extensions.

2. How to specify a JSON payload or multi-level parameters in a test case?

To specify a JSON payload or multi-level parameters, follow the steps below:

- First specify a "Content-Type" header with an appropriate value (e.g.
  "application/json" for a JSON payload or "application/xml" for an XML payload) in
  the "Headers" sub-tab of the test case
- Then specify the request payload (JSON, XML or any other type) in the "Raw Body"
  sub-tab
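The header plus raw-body combination described above corresponds to an HTTP request
like the following (the URL and payload are illustrative):

```python
import json
import urllib.request

# Multi-level parameters are expressed as nested JSON in the raw body.
payload = {"name": "Jane", "roles": ["admin", "editor"]}

req = urllib.request.Request(
    "http://localhost:3000/api/users",           # hypothetical endpoint
    data=json.dumps(payload).encode(),           # the "Raw Body" sub-tab content
    headers={"Content-Type": "application/json"},  # the "Headers" sub-tab entry
    method="POST",
)
```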

3. How to perform cookie-based authentication?

To perform cookie-based authentication in vREST, follow the steps below:

- First record the login request, or create a test case manually for the login request
  with form parameters (e.g. username/email, password etc.) in the "Parameters"
  sub-tab.
- vREST will automatically save the cookie after executing this test case and will use
  this cookie in subsequent test cases.
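Behind the scenes, cookie-based authentication amounts to capturing the login
response's Set-Cookie header and replaying it on later requests. A minimal
illustration using Python's standard library (the header values are made up):

```python
from http.cookies import SimpleCookie

# Set-Cookie header returned by a hypothetical login test case:
login_response_header = "sessionid=s3cr3t; Path=/; HttpOnly"
jar = SimpleCookie()
jar.load(login_response_header)  # parse name/value plus attributes

# Build the Cookie header that subsequent test cases send back automatically:
cookie_header = "; ".join(f"{k}={v.value}" for k, v in jar.items())
print(cookie_header)  # sessionid=s3cr3t
```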

