
Survey methodology

From Wikipedia, the free encyclopedia

For the Statistics Canada publication, see Survey Methodology.


A field of applied statistics, survey methodology studies the sampling of individual units from
a population and the associated survey data collection techniques, such as questionnaire
construction and methods for improving the number and accuracy of responses to surveys.
Statistical surveys are undertaken with a view towards making statistical inferences about the
population being studied, and this depends strongly on the survey questions
used. Polls about public opinion, public health surveys, market research surveys, government
surveys and censuses are all examples of quantitative research that use contemporary survey
methodology to answer questions about a population. Although censuses do not include a
"sample", they do include other aspects of survey methodology, like questionnaires and interviewers.

Survey methodology topics


The most important methodological challenges of a survey methodologist include making
decisions on how to:[2]

Identify and select potential sample members.


Contact sampled individuals and collect data from those who are hard to reach (or
reluctant to respond)

Evaluate and test questions.

Select the mode for posing questions and collecting responses.

Train and supervise interviewers (if they are involved).

Check data files for accuracy and internal consistency.

Adjust survey estimates to correct for identified errors.
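
The last task above, adjusting estimates for identified errors, is commonly done by weighting. As a minimal sketch (the function name and the strata are hypothetical, not from a particular survey package), post-stratification weights can be computed by matching the sample's stratum shares to known population shares:

```python
def poststratify_weights(sample_counts, population_shares):
    """Compute post-stratification weights.

    Each respondent in stratum h receives the weight
    population_shares[h] / sample_share[h], so that weighted
    stratum totals match the known population distribution.
    """
    n = sum(sample_counts.values())  # total number of respondents
    return {h: population_shares[h] / (sample_counts[h] / n)
            for h in sample_counts}

# Example: men are under-represented relative to a 50/50 population,
# so they are weighted up and women are weighted down.
weights = poststratify_weights({"men": 40, "women": 60},
                               {"men": 0.5, "women": 0.5})
```

After weighting, the weighted stratum totals sum back to the sample size while reproducing the population's composition.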

Selecting samples
Main article: Survey sampling
Survey samples can be broadly divided into two types: probability samples and non-probability
samples. These are discussed in several sources, including Salant and Dillman.[3] Stratified
sampling is a method of probability sampling in which sub-populations within an overall
population are identified and included in the sample selected in a balanced way.
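
A proportionally allocated stratified sample can be sketched in a few lines of Python. The helper below is illustrative rather than a production sampler, and the "region" strata are hypothetical:

```python
import random
from collections import defaultdict

def stratified_sample(population, stratum_of, total_n, seed=0):
    """Draw a proportionally allocated stratified sample.

    population: list of units; stratum_of: function mapping a unit
    to its stratum label; total_n: overall sample size. Rounding the
    per-stratum allocation can shift the total by a unit or two.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for unit in population:
        strata[stratum_of(unit)].append(unit)
    sample = []
    for members in strata.values():
        # Allocate the sample to each stratum in proportion to its size.
        n_h = round(total_n * len(members) / len(population))
        sample.extend(rng.sample(members, min(n_h, len(members))))
    return sample

# Usage: 1,000 units split 800/200 across two regions.
population = [{"id": i, "region": "north" if i < 800 else "south"}
              for i in range(1000)]
sample = stratified_sample(population, lambda u: u["region"], total_n=100)
# Each region is represented in proportion to its size (80 north, 20 south).
```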

Modes of data collection


Main article: Survey data collection
There are several ways of administering a survey. The choice between administration modes is
influenced by several factors, including
1. costs,
2. coverage of the target population,
3. flexibility of asking questions,
4. respondents' willingness to participate, and
5. response accuracy.
Different methods create mode effects that change how respondents answer, and different
methods have different advantages. The most common modes of administration can be
summarized as:[4]

Telephone

Mail (post)

Online surveys

Personal in-home surveys

Personal mall or street intercept survey

Hybrids of the above.

Cross-sectional and longitudinal surveys


There is a distinction between one-time (cross-sectional) surveys, which involve a single
questionnaire or interview administered to each sample member, and surveys which repeatedly
collect information from the same people over time. The latter are known as longitudinal surveys.
Longitudinal surveys have considerable analytical advantages but they are also challenging to
implement successfully.
Consequently, specialist methods have been developed to select longitudinal samples, to collect
data repeatedly, to keep track of sample members over time, to keep respondents motivated to
participate, and to process and analyse longitudinal survey data.[5]

Response formats
Usually, a survey consists of a number of questions that the respondent has to answer in a set
format. A distinction is made between open-ended and closed-ended questions. An open-ended
question asks the respondent to formulate his or her own answer, whereas a closed-ended
question has the respondent pick an answer from a given number of options. The response
options for a closed-ended question should be exhaustive and mutually exclusive. Four types of
response scales for closed-ended questions are distinguished:

Dichotomous, where the respondent has two options

Nominal-polytomous, where the respondent has more than two unordered options

Ordinal-polytomous, where the respondent has more than two ordered options

(Bounded) continuous, where the respondent is presented with a continuous scale

A respondent's answer to an open-ended question can be coded into a response scale
afterwards,[4] or analysed using more qualitative methods.
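
To illustrate, the closed-ended scale types above can be coded numerically before analysis. A minimal sketch in Python, with hypothetical scales and a hypothetical helper function:

```python
# Hypothetical closed-ended response scales.
DICHOTOMOUS = ["yes", "no"]                # two options
NOMINAL = ["bus", "car", "bicycle"]        # more than two unordered options
ORDINAL = ["never", "sometimes", "often"]  # more than two ordered options

def code_response(answer, scale):
    """Code a closed-ended answer as its 1-based position on the scale.

    For an ordinal scale the numeric code preserves the ordering of the
    options; for a nominal scale it is only an arbitrary category label.
    """
    if answer not in scale:
        # Response options should be exhaustive; anything else is invalid.
        raise ValueError(f"{answer!r} is not a valid option")
    return scale.index(answer) + 1
```

Because the options of a well-designed closed-ended question are mutually exclusive, each answer maps to exactly one code.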

Nonresponse reduction
The following ways have been recommended for reducing nonresponse[6] in telephone and face-to-face surveys:[7]

Advance letter. A short letter is sent in advance to inform the sampled respondents about
the upcoming survey. The style of the letter should be personalized but not overdone. First, it
announces that a phone call will be made, or that an interviewer wants to make an appointment
to do the survey face-to-face. Second, the research topic is described. Last, it allows
both an expression of the surveyor's appreciation of cooperation and an opening to ask
questions on the survey.

Training. The interviewers are thoroughly trained in how to ask respondents questions,
how to work with computers, and how to schedule callbacks to respondents who were
not reached.

Short introduction. The interviewer should always start with a short introduction of him
or herself: name, the institute he or she is working for, the length of the
interview, and the goal of the interview. It can also be useful to make clear that the interviewer is not selling
anything: this has been shown to lead to a slightly higher response rate.[8]

Respondent-friendly survey questionnaire. The questions asked must be clear, non-offensive and easy to respond to for the subjects under study.

Brevity is also often cited as increasing response rate. A 1996 literature review found mixed
evidence to support this claim for both written and verbal surveys, concluding that other factors
may often be more important.[9] A 2010 study looking at 100,000 online surveys found that the response
rate dropped by about 3% at 10 questions and about 6% at 20 questions, with the drop-off slowing
thereafter (for example, only a 10% reduction at 40 questions).[10] Other studies showed that the quality of
responses degraded toward the end of long surveys.[11]

Interviewer effects
Survey methodologists have devoted much effort to determining the extent to which interviewee
responses are affected by physical characteristics of the interviewer. Main interviewer traits that
have been demonstrated to influence survey responses are race,[12] gender,[13] and relative body
weight (BMI).[14] These interviewer effects are particularly operant when questions are related to
the interviewer trait. Hence, race of interviewer has been shown to affect responses to measures
regarding racial attitudes,[15] interviewer sex responses to questions involving gender issues,[16]
and interviewer BMI answers to eating and dieting-related questions.[17] While interviewer
effects have been investigated mainly for face-to-face surveys, they have also been shown to
exist for interview modes with no visual contact, such as telephone surveys and video-enhanced
web surveys. The explanation typically provided for interviewer effects is social
desirability bias: survey participants may attempt to project a positive self-image in an effort to
conform to the norms they attribute to the interviewer asking questions.

See also

Data Documentation Initiative

Enterprise feedback management (EFM)

Likert scale

Official statistics

Paid survey

Quantitative marketing research

Questionnaire construction

Ratio estimator

Social research

Total survey error

References

1. "WhatIsASurvey.info". WhatIsASurvey.info. Retrieved 2013-10-03.

2. Groves, R.M.; Fowler, F.J.; Couper, M.P.; Lepkowski, J.M.; Singer, E.; Tourangeau, R. (2009). Survey Methodology. New Jersey: John Wiley & Sons. ISBN 978-1-118-21134-2.

3. Salant, Priscilla, and Don A. Dillman (1994). How to Conduct Your Own Survey. Wiley.

4. Mellenbergh, G.J. (2008). Chapter 9: Surveys. In H.J. Adèr & G.J. Mellenbergh (Eds.) (with contributions by D.J. Hand), Advising on Research Methods: A Consultant's Companion (pp. 183–209). Huizen, The Netherlands: Johannes van Kessel Publishing.

5. Lynn, P. (Ed.) (2009). Methodology of Longitudinal Surveys. Wiley. ISBN 0-470-01871-2.

6. Lynn, P. (2008). "The problem of non-response", chapter 3, pp. 35–55, in International Handbook of Survey Methodology (eds. Edith de Leeuw, Joop Hox & Don A. Dillman). Erlbaum. ISBN 0-8058-5753-2.

7. Dillman, D.A. (1978). Mail and Telephone Surveys: The Total Design Method. Wiley. ISBN 0-471-21555-4.

8. De Leeuw, E.D. (2001). "I am not selling anything: Experiments in telephone introductions". Kwantitatieve Methoden, 22, 41–48.

9. Bogen, Karen (1996). "The Effect of Questionnaire Length on Response Rates: A Review of the Literature" (PDF). Proceedings of the Section on Survey Research Methods (American Statistical Association): 1020–1025. Retrieved 2013-03-19.

10. "Does Adding One More Question Impact Survey Completion Rate?". 2010-12-10. Retrieved 2013-03-19.

11. "Respondent engagement and survey length: the long and the short of it". research. April 7, 2010. Retrieved 2013-10-03.

12. Hill, M.E. (2002). "Race of the interviewer and perception of skin color: Evidence from the multi-city study of urban inequality". American Sociological Review 67 (1): 99–108. doi:10.2307/3088935. JSTOR 3088935.

13. Flores-Macias, F.; Lawson, C. (2008). "Effects of interviewer gender on survey responses: Findings from a household survey in Mexico". International Journal of Public Opinion Research 20 (1): 100–110. doi:10.1093/ijpor/edn007.

14. Eisinga, R.; Te Grotenhuis, M.; Larsen, J.K.; Pelzer, B.; Van Strien, T. (2011). "BMI of interviewer effects". International Journal of Public Opinion Research 23 (4): 530–543. doi:10.1093/ijpor/edr026.

15. Anderson, B.A.; Silver, B.D.; Abramson, P.R. (1988). "The effects of the race of the interviewer on race-related attitudes of black respondents in SRC/CPS national election studies". Public Opinion Quarterly 52 (3): 1–28. doi:10.1086/269108.

16. Kane, E.W.; MacAulay, L.J. (1993). "Interviewer gender and gender attitudes". Public Opinion Quarterly 57 (1): 1–28. doi:10.1086/269352.

17. Eisinga, R.; Te Grotenhuis, M.; Larsen, J.K.; Pelzer, B. (2011). "Interviewer BMI effects on under- and over-reporting of restrained eating: Evidence from a national Dutch face-to-face survey and a postal follow-up". International Journal of Public Health 57 (3): 643–647. doi:10.1007/s00038-011-0323-z. PMC 3359459. PMID 22116390.

Further reading

Abramson, J.J. and Abramson, Z.H. (1999). Survey Methods in Community Medicine:
Epidemiological Research, Programme Evaluation, Clinical Trials (5th edition). London:
Churchill Livingstone/Elsevier Health Sciences. ISBN 0-443-06163-7

Adèr, H.J., Mellenbergh, G.J., and Hand, D.J. (2008). Advising on Research Methods: A
Consultant's Companion. Huizen, The Netherlands: Johannes van Kessel Publishing.

Andres, Lesley (2012). Designing and Doing Survey Research. London: Sage.

Dillman, D.A. (1978). Mail and Telephone Surveys: The Total Design Method. New York:
Wiley. ISBN 0-471-21555-4

Engel, U., Jann, B., Lynn, P., Scherpenzeel, A. and Sturgis, P. (2014). Improving Survey
Methods: Lessons from Recent Research. New York: Routledge. ISBN 978-0-415-81762-2

Groves, R.M. (1989). Survey Errors and Survey Costs. Wiley. ISBN 0-471-61171-9
