SOC 10 Surveys


Questions and Answers

Why is survey research considered effective for gathering original data about a population?

  • It guarantees 100% participation rates, leading to comprehensive data sets.
  • It's suited for understanding the intricate details of individual behaviors.
  • It allows researchers to observe subjects directly without interference.
  • It is the best approach for gathering data about a population too large to observe directly. (correct)

In survey design, what is the primary goal of using both questions and statements within a questionnaire?

  • To reduce the length of the survey by covering more topics with fewer items.
  • To ensure that every question is understood in exactly the same way by all respondents.
  • To confuse respondents, making it difficult to predict the intent of the survey.
  • To allow researchers to measure concepts and collect empirical measures of attitudes. (correct)

How do open-ended questions primarily benefit researchers in survey design?

  • By capturing issues the respondent considers important that the researcher may have overlooked. (correct)
  • By offering uniformity in responses, simplifying the analysis process.
  • By allowing for the inclusion of a greater number of questions, improving research efficiency.
  • By making data collection easier for large populations.

What is a critical consideration when constructing closed-ended questions for a survey?

  • The response categories should be exhaustive and mutually exclusive. (correct)

Why should survey designers avoid using double-barreled questions?

  • Because they confuse respondents by combining multiple questions into one. (correct)

Why is it important for survey designers to use clear and positive language in their questions?

  • To avoid confusion and ensure that questions are easily understood. (correct)

What is the primary risk associated with using biased language in survey questions?

  • It skews the results by encouraging respondents to choose one answer over others. (correct)

In terms of survey questionnaires, what does an 'uncluttered layout' primarily ensure?

  • It promotes ease of understanding and reduces potential confusion for respondents. (correct)

What is the purpose of contingency questions in a survey?

  • To ask follow-up questions that are only relevant based on the respondent's answer to a previous question. (correct)

What is a potential drawback of using matrix questions in surveys?

  • They may encourage respondents to check the same answer for all statements, a response set. (correct)

In survey design, why is pretesting with a pilot group considered essential before administering the survey widely?

  • To anticipate order effects, refine questions, and gather feedback for revisions. (correct)

What is one of the main strategies to increase response rates in self-administered mail surveys?

  • Sending reminders 2-3 times over several weeks. (correct)

What is a key aspect that differentiates survey interviews from self-administered questionnaires?

  • Survey interviews ensure greater standardization and comparability of responses. (correct)

Which of the following is a potential advantage of face-to-face interviews in survey research?

  • Ability for the interviewer to clarify confusing items and observe the respondent's demeanor. (correct)

What is a primary limitation of online surveys compared to traditional methods (e.g., face-to-face interviews)?

  • Online surveys may suffer from representativeness issues due to unequal internet access. (correct)

What does the process of 'operationalizing constructs' as variables primarily involve in survey research?

  • Designing survey items to measure theoretical concepts. (correct)

What is the possible outcome if an interviewer does not present survey items in a neutral way?

  • The data can be less reliable. (correct)

What could be missed if questionnaires are designed to be minimally appropriate for the average respondent?

  • The particularities of any given respondent. (correct)

What is a limitation of how surveys collect data?

  • Surveys rely on self-report rather than direct observation of social action and its social context. (correct)

What do researchers assume about questionnaire items?

  • That a questionnaire item will mean the same thing to every respondent. (correct)

What do results from a probability sample allow researchers to do?

  • They can be generalized to the population. (correct)

What are some things that sociology surveys are commonly used to measure?

  • Attitudes, opinions, values, orientations toward social issues, and self-reported behavior. (correct)

What do statement items usually ask respondents to do?

  • Indicate whether they agree or disagree with the statement. (correct)

What do open-ended questions ask the respondent to do?

  • Provide their own answer. (correct)

The longer and more involved a question is, the greater the likelihood of what?

  • Misunderstanding. (correct)

What may encourage respondents to choose one answer over the alternatives?

  • Leading questions. (correct)

Matrix questions consist of several questions that have what?

  • The same answer categories. (correct)

What can be anticipated by trying out different question orderings with a pilot group?

  • Order effects. (correct)

What do you call the practice of trying out a survey with a pilot group?

  • Pre-testing. (correct)

What is something that should happen before administering a survey?

  • Researchers should always pre-test their questionnaire. (correct)

What is something that researchers may sometimes offer to participants?

  • Monetary or other compensation. (correct)

What should happen if a respondent gives an insufficient answer to a question?

  • Researchers should probe the participant to clarify or elaborate. (correct)

Face-to-face interviews tend to yield what?

  • High response rates. (correct)

What decreases because of the presence of an interviewer?

  • "I don't know" and "no answer" responses. (correct)

Many phone surveys are conducted via what?

  • CATI systems. (correct)

What is one advantage of online surveys?

  • They are time and cost effective. (correct)

What two things give researchers the ability to generalize from sample to population?

  • A probability sample and proper standardization. (correct)

Flashcards

Survey research

A method for collecting original data about a population too large to observe directly.

Survey items

The individual questions or statements within a survey.

Open-ended questions

Questions that allow respondents to provide their own answers.

Closed-ended questions

Questions that provide a list of answers for the respondent to select.

Double-barreled questions

Complex questions that contain two or more questions in one.

Pretesting

Trying out a survey with a small group before the main study.

Contingency questions

Questions relevant only to respondents who provide a specific answer to a preceding question.

Matrix questions

Several questions with the same answer categories, increasing comparability among items.

Self-administered questionnaire

A questionnaire that respondents complete themselves, usually sent by mail with a letter explaining the study and a self-addressed, stamped return envelope.

Survey interviews

A highly structured questioning technique designed to ensure standardization and comparability of responses.

Face-to-face interviews

Interviews conducted in person; they tend to yield high response rates.

Telephone interview

An interview conducted over the telephone.

Online surveys

Surveys through a website or app for large population access.

Probes

Neutral follow-up prompts used to help a respondent clarify or elaborate on an answer.

Study Notes

  • Surveys: SOC 10, 3/10/2025

Overview

  • This lesson covers: describing populations, constructing surveys, administering surveys, and measuring variables and testing hypotheses
  • Surveys should have validity and reliability

Describing Populations

  • Survey research is useful for collecting original data about populations too large to observe directly
  • Probability sample results can be generalized to the population
  • Surveys are used in sociology to measure attitudes, opinions, values, orientations toward social issues, and self-reported behavior like religious attendance
  • Surveys facilitate secondary analysis by deidentifying data
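The claim that probability-sample results generalize to the population can be made concrete with the standard margin-of-error arithmetic. This is a minimal sketch, not from the lesson itself, using the usual 95% normal approximation for a sample proportion:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the ~95% confidence interval for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# A probability sample of 1,000 respondents with a 50% sample proportion:
print(round(margin_of_error(0.50, 1000), 3))  # → 0.031, i.e. about ±3 points
```

Larger samples shrink the margin, which is why a well-drawn sample of only a few thousand people can describe a population of millions.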

Survey Content: Questions and Statements

  • Questionnaires use survey items, which are a mix of questions and statements
  • Questions and statements are useful for researchers to operationalize concepts for measurement.
  • Questions about attitudes toward abortion, for instance, can render empirical measures of attitude
  • Statements ask respondents to agree or disagree, sometimes indicating the strength of agreement
  • Question items can be open-ended or closed-ended

Open- and Closed-Ended Questions

  • Open-ended questions prompt respondents to provide their own answers
    • An example: "What do you feel is the most important issue facing the US today?"
  • Open-ended responses must be coded for quantitative analysis
  • Open-ended questions capture important issues but are hard to code consistently
  • Closed-ended questions ask respondents to select an answer from a provided list
    • An example: "Do you agree with the recent tax increase? Yes or no."
  • Closed-ended questions allow response uniformity and are easier to process, but may omit issues important to respondents
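The coding step for open-ended answers can be illustrated with a toy keyword codebook. The categories and keywords below are hypothetical; real coding schemes are developed and checked by human coders, which is exactly why consistency is hard:

```python
# Hypothetical codebook mapping analytic categories to keywords.
CODEBOOK = {
    "economy": ["economy", "inflation", "jobs"],
    "health care": ["health", "insurance"],
    "environment": ["climate", "environment"],
}

def code_response(text: str) -> str:
    """Assign a free-text answer to the first matching category."""
    text = text.lower()
    for category, keywords in CODEBOOK.items():
        if any(word in text for word in keywords):
            return category
    return "other"  # residual category keeps the scheme exhaustive

print(code_response("Rising inflation and the cost of living"))  # → economy
```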

Constructing Closed-Ended Questions

  • Closed-ended questions should be constructed according to two structural requirements
    • Response categories should be exhaustive
    • Answer categories must be mutually exclusive
  • Questions should be precise for clarity.
    • If the question is unclear, it won't measure what it is designed to measure, i.e., it won't be a valid measure
    • Avoid jargon and technical vocabulary

Examples of bad questions

  • "Is this person's origin or descent (check one): Mexican, Puerto Rican, Cuban, Central or South American, Other Spanish, or No, none of these" (From the 1970 US Census)
  • Double-barreled questions are complex questions with two parts
    • An example: "The United States should abandon its space program and spend the money on domestic programs. Do you agree or disagree?"
    • The word "and" in a question or statement may indicate a double-barreled question
  • Respondents should be able to answer questions
    • An example: "At what age did you first talk back to your parents?"
  • Respondents must be willing to answer honestly
    • Someone may be afraid of repercussions or sensitive to social desirability
  • Questions should be relevant enough for participants to have an opinion.
    • Participants may sometimes claim expertise they don't have, as when they endorse fictional politicians
  • Questions should be kept short and succinct.
    • Lengthy, involved questions increase the likelihood of misunderstanding
  • Questions should be phrased positively to avoid negative constructions and confusion
    • "The US should not recognize Cuba. Do you agree or disagree?" is an example of a bad question
  • Avoid biased items or terms
    • Questions should not encourage respondents to choose one answer over others

Question Wording and Bias

  • Respondents to the General Social Survey were asked their opinions on government poverty reduction efforts (Rasinski, 1989)
    • 62.8% felt there was too little assistance for the poor
    • 23.1% felt there was too little welfare
  • A survey about Obamacare (Hart Research, 2013) found:
    • 35% had "very negative" feelings toward Obamacare
    • 24% had "very negative" feelings toward the Affordable Care Act

Survey Format

  • Survey questionnaires should have spaced questions and an uncluttered layout
  • Do not use abbreviations or place more than one question on the same line because it causes confusion
  • Respondents indicate answers by checking a box or circling the answer

Contingency Questions

  • Not all questions are relevant for each respondent
  • Contingency questions are follow-up questions for respondents with particular answers to preceding questions
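In programmatic survey tools, a contingency question is implemented as skip logic. A minimal sketch, with invented question and field names:

```python
def administer(answers: dict) -> dict:
    """Record the follow-up only when the filter question gets a 'yes'."""
    collected = {"has_children": answers["has_children"]}
    if collected["has_children"] == "yes":  # contingency condition
        collected["num_children"] = answers["num_children"]
    return collected

print(administer({"has_children": "no", "num_children": None}))
# → {'has_children': 'no'}  (follow-up skipped as irrelevant)
```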

Matrix Questions

  • Matrix questions consist of multiple questions with the same answer categories
    • An example is questions in the Likert scale format
  • Matrix questions increase comparability among items
  • Matrix questions may encourage respondents to check the same answer (e.g., "agree") for all statements, creating a response set
  • Reduce this problem by alternating the emphasis of successive statements with different perspectives
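A simple data-cleaning heuristic for response sets is to flag respondents who chose the identical category for every matrix item. This is a sketch of one possible check, not a standard named procedure:

```python
def is_response_set(answers: list) -> bool:
    """Flag a respondent who checked the same category for every statement."""
    return len(answers) > 1 and len(set(answers)) == 1

print(is_response_set(["agree"] * 6))                   # → True
print(is_response_set(["agree", "disagree", "agree"]))  # → False
```

If statements alternate in emphasis, a straight-line "agree" pattern is internally contradictory, which makes such flags more informative.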

Order Effects

  • Item order can influence responses
  • An example: If terrorism questions in the US came before an open-ended question about the greatest dangers to the country, the respondent is likely to list terrorism
  • Order effects can be anticipated by testing different question orderings with a pilot group, i.e., pretesting

Pretesting

  • Trying out a survey with a pilot group is known as pre-testing
  • Researchers should always pre-test the entire questionnaire before administering it in whole or in part
  • Researchers should ask respondents to comment on the questionnaire and incorporate feedback in revisions

Administering Surveys

  • Survey research administration can be achieved in these ways:
    • Self-administered mail surveys
    • Face-to-face interviews
    • Telephone interviews
    • Online surveys

Self-Administered Questionnaires

  • Self-administered questionnaires are usually sent to respondents by mail
  • Questionnaires include a letter explaining the study and a self-addressed, stamped return envelope; in some cases the questionnaire itself can be sealed and returned without a separate envelope
  • Response rates can be increased via follow-up mailings sent approximately 2-3 times over several weeks
  • Low response rates can be a problem because the respondents not returning questionnaires may differ in important ways from those who return them

Self-Administered Questionnaires and Nonresponse Bias

  • In cases of low response rate, testing for nonresponse bias should be done
    • Comparing early responders with those who respond only after 2-3 reminders (a proxy for nonrespondents) can reveal nonresponse bias if the two groups differ significantly
  • Researchers may incentivize responses by offering money or other compensation
  • Delivering or picking up questionnaires in person improves response rates
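The early-versus-late responder comparison can be sketched with made-up item scores, where late responders stand in as a rough proxy for nonrespondents. A real analysis would apply a significance test rather than eyeballing the gap:

```python
from statistics import mean

# Hypothetical 1-5 item scores, grouped by whether the questionnaire
# came back before or only after the follow-up reminders.
early = [4, 5, 4, 3, 5, 4]
late = [2, 3, 2, 3, 2, 3]  # proxy for nonrespondents

gap = mean(early) - mean(late)
print(round(gap, 2))  # → 1.67; a large gap suggests nonresponse bias
```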

Interviews

  • Typically done face-to-face, but sometimes over the phone
  • Survey interviews are structured toward standardization and comparability of responses
  • Interviewers receive training to follow standardized protocols
  • Protocols include probes which help respondents clarify or elaborate on an answer
    • If a respondent gives an insufficient answer the probe might be something like, "can you tell me more?"
  • As with survey questions, probes should be worded neutrally to avoid biasing the response

Face-To-Face Interviews

  • Face-to-face interviews commonly yield high response rates, often 80-85%
  • The presence of an interviewer decreases the frequency of "I don't know" and "no answer" responses
  • Interviewers clarify confusion and observe respondent demeanor, dress, tone of voice, surroundings, etc
  • Face-to-face interviews are time and resource intensive, and may raise safety concerns
  • The presence of an interviewer may prompt respondents to give socially desirable answers

Telephone Interviews

  • Telephone interviews are more cost and time effective than face-to-face interviews
  • They used to be a source of class bias, but this is no longer a problem
  • Many phone surveys are conducted via Computer Assisted Telephone Interview Systems (CATI)
  • Response rates tend to be lower than face-to-face interviews, due partly to telemarketing and phony-survey prevalence
  • Telephone surveys have higher response rates than mail surveys, but these rates have declined recently

Online Surveys

  • Online surveys have many advantages: they are time and cost effective, reach large populations, and can incorporate voice recognition and virtual "face-to-face" interviews
  • Not everyone has internet access or is equally likely to receive or respond to the survey, which can lead to nonresponse issues
  • Many researchers therefore take a mixed-mode approach, combining online and paper-based questionnaires

Measuring Variables and Specifying Hypotheses

  • Survey items allow researchers to operationalize constructs as measurable variables
  • Items that measure attitudes, opinions, self-reported behavior, etc., can be examined for relationships with each other
    • For example, responses to independent-variable items can be used to predict responses on a dependent variable

General Social Survey Measurement

  • Multiple items designed to measure the same construct can be combined into an index
  • 1996 General Social Survey questions for measuring racial prejudice:
    • If your party nominated an African-American for President, would you vote for him if he were qualified for the job? (yes = 0/no = 1)
    • Do you think there should be laws against marriages between African-Americans and whites? (yes = 0/no = 1)
    • Black-white relations opinions: White people have a right to keep blacks out of their neighborhoods if they want to, and blacks should respect that right. (0 = disagree, 1 = agree)
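Summing the 0/1 codes for these three items yields an index ranging from 0 to 3. A minimal sketch, with invented field names standing in for the three GSS items:

```python
def prejudice_index(items: dict) -> int:
    """Sum the three 0/1 item codes; higher scores = more prejudiced responses."""
    return items["vote"] + items["laws"] + items["neighborhood"]

respondent = {"vote": 0, "laws": 1, "neighborhood": 0}
print(prejudice_index(respondent))  # → 1 (on a 0-3 scale)
```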

Racial Prejudice Index

  • The index can be analyzed to see whether level of education affects racial prejudice
  • Tests can be performed to determine if increased education leads to reduced prejudice
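One way to eyeball the education hypothesis is to compare mean index scores across education groups. The scores below are fabricated for illustration; a real test would use the GSS data and an inferential procedure such as a chi-square test or regression:

```python
from statistics import mean

# Fabricated prejudice-index scores (0-3) grouped by education level.
scores = {
    "high school or less": [2, 3, 1, 2, 2],
    "college degree": [1, 0, 1, 0, 1],
}
for level, values in scores.items():
    print(level, round(mean(values), 2))
```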

Validity and Reliability

  • "Survey research is generally weak on validity and strong on reliability." (Babbie 2021:282)
  • Data score high on reliability when interviewers present survey items neutrally and follow standardized protocols
  • The standardized format gives only superficial access to how respondents interpret questions, making it hard to know whether the items validly measure the intended attitudes and beliefs

Strengths and Limitations

  • Surveys are useful for describing large populations' characteristics
  • With a proper probability sample, surveys generalize from sample to population
  • Because questionnaires are designed to be minimally appropriate for the average respondent, the particularities of any given respondent get missed, and the interviewer's flexibility is limited
  • Surveys rely on self-report rather than direct observation of social action
  • Survey research rests on an unrealistic stimulus-response theory: researchers assume that a questionnaire item means the same thing to every respondent and that a given response means the same thing from every respondent; although this ideal is impossible to achieve entirely, survey questions are drafted to approximate it as closely as possible (Babbie 2021:269)
