
Quantitative and Qualitative Research Design

Prepared by: Dr. Nik Rahimah Yacob
Positivist Paradigm vs Interpretivist Paradigm

• Positivist paradigm → quantitative methods → verification
• Interpretivist paradigm → qualitative methods → discovery
The Research Process by Cavana et al. (2001)

1. Catalyst for business research
   • Opportunity
   • Problem
2. Preliminary information gathering and literature survey
3. Problem definition
4. Framework development
   • Conceptual
   • Theoretical
5. Research objectives
   • Research questions
   • Hypotheses
6. Research design
7. Data collection
   • Qualitative
   • Quantitative
8. Data analysis
   • Qualitative
   • Quantitative
9. Interpretation of findings
10. Report preparation and presentation
11. Management action
    • Plan
    • Implement
    • Monitor
Research Design by Cavana et al. (2001)

Problem Statement

• Purpose of the study: exploration, description, hypotheses testing
• Types of investigation: clarification, causal, correlational, experimental, case study
• Extent of researcher interference: minimal, manipulation
• Measurement and measures: operational definition, items (measure), scaling
• Qualitative data collection: interviews, focus groups, observation
• Unit of analysis: individuals, dyads, groups, organisations, machines, etc.
• Study setting: contrived, non-contrived
• Time horizon: one-shot (cross-sectional), longitudinal
• Sampling design: probability/non-probability, sample size
• Quantitative data collection: questionnaires, experimental designs
• Data analysis
A Classification of Research Data

Research data
• Secondary data
  – Qualitative data
  – Quantitative data
• Primary data
  – Qualitative data
  – Quantitative data
    · Descriptive: survey data, observational & other data
    · Causal: experimental data
A Classification of the Qualitative Research Methods

Qualitative research procedures
• Direct (nondisguised): focus groups, depth interviews, case study, secondary data
• Indirect (disguised): projective techniques
  – Association techniques
  – Completion techniques
  – Construction techniques
  – Expressive techniques
Quantitative Research Approaches/Methods
A Classification of Survey Methods

Survey methods
• Telephone: traditional telephone interviewing, computer-assisted telephone interviewing
• Personal: in-home, mall intercept, computer-assisted personal interviewing
• Mail: mail interview, mail panel
A Classification of Observation Methods

Observation methods
• Personal observation
• Mechanical observation
• Audit
• Content analysis
• Trace analysis
A Classification of Experimental Designs

Experimental designs
• Preexperimental: one-shot case study, one-group pretest-posttest, static group
• True experimental: pretest-posttest control group, posttest-only control group, Solomon four-group
• Quasi-experimental: time series, multiple time series
• Statistical: randomized blocks, Latin square, factorial design
Qualitative Research Approaches/Methods
The continuum model for interviews

• Structured interviews: standardised interviews, survey interviews
• Semi-structured interviews: in-depth interviews, survey interviews, group interviews
• Unstructured interviews: in-depth interviews, group interviews, oral or life-history interviews
The Pattern of an Interview
(Source: Delahaye, 2000, p. 166)

[Figure: the level of defence barriers plotted against time elapsed.
Entrance time investment: ritual, pass time, then reason, rules and
preview. Body of the interview (intimacy zone, rapport): activity
no. 1 and activity no. 2 – a series of question sequences. Exit time
investment: final questions, future action, pass time, ritual.]
Focus Groups

Depth interviewing of a group of 5 to 12 people; the researcher
serves as a moderator.

• Logistics
• Group Composition
• Homogeneity
• Representation
• Strangers vs acquaintances
• Size of group
Case Study

By understanding a single system a researcher can better understand
similar instances and address the problems and issues identified in
one case.

As bounded systems of time and space (context), cases rely on
multiple sources of information to provide an in-depth picture of an
organisation or situation (phenomenon) under study.

Data are gathered via observation, interviews, documents, or surveys.
Survey Methods
Survey as a Research Approach
• It is a quantitative method
• Capitalizes on the communication approach (respondents are required
to communicate their responses to the researcher through a
structured or unstructured questionnaire)
• Involves the creation and selection of the measurement questions
• Sampling issues which drive contact and call-back procedures
• Instrument design which incorporates attempts to reduce error and
create respondent-screening procedures
• Data collection processes which create the need for follow-up
procedures and possible interviewer training
A Classification of Survey
Methods
Survey
Methods

Cross- Longitudinal
sectional design
design
On Sampling…..
• Capitalizes on a relatively large sample size
• The sample can either be drawn on a probability
sampling procedure or a nonprobability sampling
procedure, depending on the purpose of the study
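The distinction between the two procedures can be sketched as follows (a minimal Python illustration; the sampling frame, sample size, and seed are hypothetical, not from the source):

```python
import random

# Hypothetical sampling frame of 1,000 respondent IDs (illustrative only).
frame = list(range(1, 1001))

def simple_random_sample(frame, n, seed=42):
    """Probability sampling: every element has an equal, known
    chance of selection."""
    rng = random.Random(seed)
    return rng.sample(frame, n)

def convenience_sample(frame, n):
    """Nonprobability sampling: take whoever is easiest to reach --
    here, simply the first n elements of the frame."""
    return frame[:n]

prob = simple_random_sample(frame, 100)
nonprob = convenience_sample(frame, 100)
print(len(prob), len(nonprob))  # 100 100
```

Only the probability sample supports statistical inference to the frame; the convenience sample trades that away for speed and cost.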
On Instrument Design….
• The instrument for a survey is a questionnaire.
• The questionnaire can either be highly structured (closed-ended
questions) or highly unstructured (open-ended questions)
• The norm is to utilize a structured questionnaire for ease of
data coding and analysis.
• The questionnaire design has to take into consideration the
data collection method (personal interview, telephone
interview, mail survey, Internet survey, mall intercept or self-
administered)
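Because a structured questionnaire eases data coding and analysis, responses to closed-ended items are typically mapped to numeric codes before analysis. A minimal sketch (the Likert labels and numeric codes are illustrative assumptions, not from the source):

```python
# Hypothetical coding scheme for one 5-point Likert item.
LIKERT_CODES = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def code_responses(raw):
    """Map verbatim closed-ended answers to numeric codes;
    unrecognised answers are coded None for later data cleaning."""
    return [LIKERT_CODES.get(r.strip().lower()) for r in raw]

answers = ["Agree", "strongly agree", "Neutral", "maybe"]
print(code_responses(answers))  # [4, 5, 3, None]
```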
Data Collection Method
Survey methods
• Telephone: traditional telephone interviewing, computer-assisted telephone interviewing
• Personal: in-home, mall intercept, computer-assisted personal interviewing
• Mail: mail interview, mail panel

Examine the pros and cons of each method prior to selecting a
particular method for your study.
Observation Methods
Observation as a Research
Approach
• It can either be a quantitative method or a qualitative method
depending on whether the purpose of your study is to verify or
to discover
• Capitalizes on visual data collection. It also involves listening,
reading, smelling and touching
• Monitors a full range of behavioural (nonverbal, linguistic,
extralinguistic and spatial analysis) and nonbehavioural
(record, physical condition and physical process analysis)
activities and conditions
• Sampling issues
• Instrument design
• Data collection process
A Classification of the Observation Methods

Observation methods
• Direct observation
• Indirect observation
On Sampling…..
• Capitalizes on a relatively large sample size for a
quantitative study and a small sample size for a
qualitative study
• The sample can either be drawn on a probability
sampling procedure or a nonprobability sampling
procedure for a quantitative study
• The sample is almost always drawn on a
nonprobability sampling procedure for a qualitative
study
On Instrument Design….
• The instrument for an observation study is known as an
observation checklist or an observation form
• The observation checklist or form can either be highly
structured (clear indications of what to observe and how to
tally the observation) or highly unstructured (vague ideas on
the scope of observation and what to observe)
• Observation checklist or form for quantitative research would
tend to be more structured than that of a qualitative research
• The observation checklist or form has to take into
consideration the data collection method (personal
observation, mechanical observation, audit, content analysis or
trace analysis)
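For content analysis with a structured checklist, the tallying step often reduces to counting predefined categories in a transcript. A minimal sketch (the transcript and keyword categories are hypothetical, for illustration only):

```python
import re
from collections import Counter

def keyword_frequencies(text, keywords):
    """Count how often each predefined keyword occurs in a
    transcript -- the tallying step of a structured content analysis."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    return {k: counts[k] for k in keywords}

transcript = "Service was slow. Slow service again, but staff were friendly."
print(keyword_frequencies(transcript, ["slow", "friendly", "price"]))
# {'slow': 2, 'friendly': 1, 'price': 0}
```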
Data Collection Methods for
Observation Studies
Observation methods
• Personal observation
• Mechanical observation
• Audit
• Content analysis
• Trace analysis

Examine the pros and cons of each method before selecting a
particular method for your study.
Experimentation
“Ex post facto research designs, where a
researcher interviews respondents or
observes what is or what has been, also have
the potential for discovering causality. The
distinction between these methods and
experimentation is that the researcher is
required to accept the world as it is found,
whereas an experiment allows the researcher
to alter systematically the variables of
interest and observe what changes follow.”
Source: Cooper & Schindler (2003, pp. 424-425)

Beaumind Soft sdn bhd


What is experimentation?
• Involves at least one independent variable (IV) and
one dependent variable (DV) in a causal
relationship.
• The IV constitutes the intervention or manipulation
and its effect on the DV is measured.
• Three requirements for drawing a causal conclusion
are:
Concomitant variation
Time occurrence of variables
Control over extraneous factors
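Concomitant variation is typically checked by seeing whether the IV and DV move together. A minimal sketch using the Pearson correlation coefficient (the exposure and sales figures are hypothetical):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient: a simple numerical check
    for concomitant variation between an IV and a DV."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical ad-exposure levels vs sales (illustrative numbers only).
exposure = [1, 2, 3, 4, 5]
sales = [10, 14, 15, 19, 22]
print(round(pearson_r(exposure, sales), 3))  # 0.989
```

A high correlation satisfies only the first requirement; time order and control over extraneous factors must still be established.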

Advantages & Disadvantages of
Experimentation
ADVANTAGES:
• Ability to manipulate the IV
• Increases the probability that changes in the DV are a function of the
changes in the IV
• Use of control group strengthens the causality finding
• Convenient and cost effective
• Ease of replication
DISADVANTAGES:
• Artificiality of the setting
• Generalizability of the study findings
• For some experiments, they can be costly
• Limited to issues of the present and immediate future, not the past
• Due to ethical considerations, there are limits on manipulating
people
IV and Experimental Treatments
• Experimental treatments are the various levels in the
manipulation of the IV
• Normally, one level of the IV constitutes the control
• For the following RQ of an experiment:

Does the new ad generate more sales than the existing ad?
IV = Advertisement
DV = Sales
Levels of IV = Treatments = 1. New ad (experimental treatment)
2. Existing ad (control)

Sampling and Randomization
• With two treatments in the experiment, there must be two
experimental groups
• Randomization can happen at three stages:
Stage 1 - Sample selection
Stage 2 – Group division
Stage 3 - Assignment of treatment to group
• Stage 1 randomization is not as important as the other stages.
• Stage 2 randomization is important to equate the characteristics of
the two groups. When this is not possible then use a matching
technique.
• Stage 3 randomization defines the experimental method. Without
this randomization the experimental design is flawed.
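The three randomization stages above can be sketched as follows (a minimal Python sketch; the population size, sample size, and treatment labels are illustrative assumptions based on the new-ad vs existing-ad example):

```python
import random

rng = random.Random(7)  # fixed seed so the sketch is reproducible

# Stage 1: randomly select a sample from the population frame.
population = list(range(1, 201))
sample = rng.sample(population, 40)

# Stage 2: randomly divide the sample into two equal groups.
rng.shuffle(sample)
group_a, group_b = sample[:20], sample[20:]

# Stage 3: randomly assign a treatment to each group.
treatments = ["new ad (experimental)", "existing ad (control)"]
rng.shuffle(treatments)
assignment = {treatments[0]: group_a, treatments[1]: group_b}
```

Stage 2 aims to equate the groups' characteristics; stage 3 is what makes the design a true experiment.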


Experimental Research Designs
• Pre-Experimental Designs
    One-Shot Case Study              X  O
    One-Group Pretest-Posttest       O  X  O
    Static Group Comparison          X  O1
                                        O2
• True Experimental Designs
    Pretest-Posttest Control Group   R  O1  X  O2
                                     R  O3     O4
    Posttest-Only Control Group      R  X  O1
                                     R     O2
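In the pretest-posttest control group design (R O1 X O2 / R O3 O4), the treatment effect is commonly estimated as the treatment group's change minus the control group's change. A minimal sketch, with hypothetical mean scores:

```python
def pretest_posttest_effect(o1, o2, o3, o4):
    """Difference-in-differences estimate for the
    R O1 X O2 / R O3 O4 design: treatment-group change (O2 - O1)
    minus control-group change (O4 - O3)."""
    return (o2 - o1) - (o4 - o3)

# Hypothetical mean scores (illustrative numbers only).
print(pretest_posttest_effect(o1=50, o2=65, o3=51, o4=54))  # 12
```

Subtracting the control group's change removes the influence of effects common to both groups, such as history and maturation.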


Internal Validity of Experiments
• Maturation
• Testing
• Instrumentation
• Selection
• Statistical regression
• Experiment mortality

Depth Interview
Accuracy and Replicability
1. Trustworthiness
2. Verification
3. Acknowledging subjectivity and bias
4. Process and sequence
5. Interpretation
6. Referential adequacy
7. Paint the path
Field Interview Techniques

"It takes two people to speak the truth - one to speak, and the
other to listen." (Henry Thoreau, 1849)

• Interviewing in the field requires a researcher to develop a
rapport with interviewees or informants.
• Researchers are often apprehensive about engaging strangers in
conversation.
• The objective is to keep the informant talking.
Field Interviews
Building Rapport / Relationships

Gain confidence through purposeful conversation:
• Explain what you want to know.
• Repeat key phrases used by the interviewee.
• Restate what they say in your own words.

Avoid questions asking for meanings or motives, such as "what do you
mean?" or "why would you?" These contain hidden judgments;
interviewees may think they have not clearly explained things or
answered your questions!
Improving Interviewee Responses

 The basic probe: repeat the initial question; useful when the
interviewee wanders off track.
 Explanatory probe: build onto incomplete or vague statements;
asks: "What did you mean by that?"
 Focused probe: to obtain specific information; asks: "What sort
of X?"
 The silent probe: pause and let the interviewee break the silence.
 Drawing out: when the interviewee has halted or slowed, simply
repeat the last few words from the interviewee, then look expectant
or say, "Tell me more."
 Giving ideas or suggestions: offer an idea or suggestion to think
about; "Have you thought about...?" "Did you know...?"
 Mirroring or reflection: express in your own (the interviewer's)
words what the interviewee just said; "What you seem to be saying
is X?"
Focus Group Method
Design Issues

• Logistics
• Group Composition
• Homogeneity
• Representation
• Strangers vs acquaintances
• Size of group
Conducting the Focus Group

1. Use the pattern of an interview as a guide


2. Specific considerations
– Facilitator team
– Recording
– Use of visual aids
– Thinking time
– Group dynamics
Procedure for Planning and Conducting Focus Groups

1. Determine the objectives of the marketing research project and
define the problem
2. Specify the objectives of the qualitative research
3. State the objectives/questions to be answered by the focus group
4. Write a focus group protocol
5. Develop a moderator's outline
6. Conduct the focus group interviews
7. Review tapes and analyze the data
8. Summarize the findings and plan follow-up research or action

Case Study Method
Case studies

 Case studies can be single or multiple
 Within-case analysis or across-case analysis

Case studies are where a researcher explores a single entity or
phenomenon (the case) bounded by time and activity (a program,
event, institution, process or group) and collects detailed
information by using a variety of data collection procedures
(Yin 2001).
Selecting a Research Method

Research Strategy   Type of Research Question   Requires Control   Focus on
                                                over Events        Contemporary Events
Experiment          how, why                    yes                yes
Survey              who, what, where,           no                 yes
                    how many, how much
Archival Analysis   who, what, where,           no                 yes/no
                    how many, how much
History             how, why                    no                 no
Case Study          how, why                    no                 yes

Source: Yin (1989, p. 17)


Case study tactics for four design tests

Test                 Case Study Tactic                          Phase of Research
Construct validity   Use multiple sources of evidence           Data collection
                     Establish chain of evidence                Data collection
                     Have key informants review draft report    Composition
Internal validity    Do pattern matching                        Data analysis
                     Do explanation building                    Data analysis
                     Do time-series analysis                    Data analysis
External validity    Use replication logic in multiple          Research design
                     case studies
Reliability          Use case study protocol                    Data collection
                     Develop case study database                Data collection

Source: Yin (1989, p. 41)
Case Study Protocol Components

Component                          Component Requirements
Overview                           Objectives & auspices
                                   Study issues
                                   Relevant readings
Field procedures                   Credentials
                                   Access to site
                                   General sources of information
                                   Procedural reminders
Case study instrument questions    Specific questions
                                   Potential sources for answers
Case report guide                  Outline
                                   Format
                                   Additional documentation

Source: Yin (1989, p. 70)
Quality Case Study Designs
(design & collection issues)

Construct validity:
Establish and follow appropriate (correct) operational measures for
the concepts (phenomenon) being studied.

External validity:
Establish the domain to which the findings can be generalised.

Reliability:
Demonstrate that the operations of the study (data collection
methods) can be repeated, achieving similar results.

• All aspects must be adhered to throughout the case study life; in
effect there is a strong link between design and collection.
Quality Case Study Designs
(design & collection issues)

Establishing the Chain of Evidence (construct validity)

Case study questions → case study protocol → reference to specific
sources in the database → case study database → case study report

Source: Yin, 2003
Quality Case Study Designs
(design & collection issues)
Generalisation (external validity)

• Generalisation is to THEORY, not populations.
• Criticism: cases cannot be generalised (or generalise poorly).
• Generalisation to a sample's universe is not correct: survey
designs rely on statistical generalisation, whereas cases use
theoretical generalisation through the results.
• Generalisation requires replication via multiple cases of the
phenomenon (this is replication logic).
Quality Case Study Designs
(design & collection issues)

Reliability

• Emphasis on replication.
• The key is doing the same case again, not replicating the
findings, i.e., the results.
• The objective of reliability is to minimise error and bias.
• This is achieved through correctly detailed plans of the actions
undertaken (a priori and post-gathering) through protocols,
questions, histories, etc.
Basic Types of Case Design
Holistic vs Embedded, Single vs Multiple Designs

[Figure, after Yin (2003): a 2x2 matrix classifying case designs as
single-case or multiple-case on one axis, and holistic (a single
unit of analysis per case) or embedded (multiple units of analysis
within each context) on the other, giving four basic types: holistic
single-case, holistic multiple-case, embedded single-case, and
embedded multiple-case.]

Source: Yin, 2003
Basic Types of Case Design

Multiple Case Studies

Literal Replication vs Theoretical Replication

• Literal replication sees the purpose of the cases as being to
predict similar results (findings).
• Theoretical replication sees the purpose of the cases as being to
predict different results (outcomes) for theoretically predictable
reasons.
Basic Types of Case Design
Single Case Study

• Critical case: tests whether a well-established theory is correct
or whether alternative explanations are found and acceptable.
• Extreme case: a rare situation is found and studied to establish
theory.
• Representative case: a situation identified as common is studied,
and theory explains how the situation occurs or functions.
• Revelatory case: a situation not previously able to be examined
is studied.
• Longitudinal case: the same situation is studied at multiple
points in time to see how things change over the stages (time).
