This document covers a history of survey research, its methodology, different types of surveys, and their advantages and disadvantages. It emphasizes the importance of careful survey design to avoid biases, including social desirability effects. The document also explores how surveys are used to gather data on behaviors, attitudes, beliefs, characteristics, and expectations.
SURVEY RESEARCH

A History of Survey Research
The Logic of Survey Research
Construction of the Questionnaire
Types of Surveys: Advantages and Disadvantages
Survey Interviewing
The Ethical Survey
Conclusion

Every method of data collection, including the survey, is only an approximation to knowledge. Each provides a different glimpse of reality, and all have limitations when used alone. Before undertaking a survey the researcher would do well to ask if this is the most appropriate and fruitful method for the problem at hand. The survey is highly valuable for studying some problems, such as public opinion, and worthless for others.
—Donald P. Warwick and Charles A. Lininger, The Sample Survey, pp. 5–6

In public opinion polls, most Americans say they would vote for a qualified female presidential candidate. Support for a qualified female candidate has steadily risen from 33 percent in 1937 to more than 92 percent in 2005. However, when survey researchers ask about controversial issues, they know that social desirability effects are a possibility (i.e., people give a false opinion so they will conform to general social norms). Streb et al. (2008) hypothesized that many Americans were being untruthful about this issue on surveys. Testing such a hypothesis required creativity. They created a list of four issues (e.g., gasoline prices rising, being required to wear seat belts) and asked how many "make you angry or upset." They created a second, identical list with the same questions but including a fifth issue, "A woman serving as president." They randomly selected more than 1,000 people for each list and conducted telephone interviews. The authors learned that when the woman-as-president item was on the list, the number of items that make people angry or upset was 26 percent higher. This suggests that about one in four people are giving a false, socially desirable answer on opinion polls and actually oppose a female presidential candidate.
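The list-experiment design described above reduces to a simple difference in group means: randomly split the sample, give one half the baseline list and the other half the same list plus the sensitive item, and difference the mean counts. A minimal sketch, reading the reported "26 percent higher" as that difference in means; the group means below are hypothetical stand-ins, not Streb et al.'s actual figures:

```python
# Sketch of the list-experiment ("item count") estimate. The means below
# are hypothetical illustrations, not Streb et al.'s actual data.
def list_experiment_estimate(mean_baseline, mean_treatment):
    """Estimated share upset by the sensitive item: the difference in mean
    item counts between the two randomly assigned groups."""
    return mean_treatment - mean_baseline

baseline_mean = 2.00   # mean items marked "angry or upset" on the 4-item list
treatment_mean = 2.26  # mean items on the 5-item list with the sensitive item

share = list_experiment_estimate(baseline_mean, treatment_mean)
print(f"Estimated share upset by the sensitive item: {share:.2f}")
```

Because no individual's answer to the sensitive item is ever observed, a respondent can report a count without revealing his or her view, which is what makes the design resistant to social desirability pressure.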
The survey is the most widely used social science data-gathering technique. Surveys have many uses and take many forms—phone interviews, Internet opinion polls, and various types of questionnaires. All rely on the principles of the professional social research survey. Many people say that they will do a survey to get information when they should say that they need the most appropriate way to get good data. Surveys can provide us accurate, reliable, and valid data, but to do so they require serious effort and thought. General public familiarity with the survey technique and the ease of conducting a survey can be a drawback: despite their widespread use and popularity, without care, surveys can easily yield misleading results. As the issue of social desirability bias (discussed later in the chapter) described in the chapter's opening box shows, the survey methodology requires diligence. In this chapter, you will learn about survey research as well as its limitations.

Survey research grew within a positivist approach to social science.1 As Groves remarked, "Surveys produce information that is inherently statistical in nature. Surveys are quantitative beasts" (1996:389). Most surveys ask a large number of people (usually called respondents) about their beliefs, opinions, characteristics, and past or present behaviors (see Expansion Box 1, What Is Asked in a Survey). For this reason, surveys are appropriate when we want to learn about self-reported beliefs or behaviors. Most surveys ask many questions at once, thereby measuring many variables. This allows us to gather descriptive information and test multiple hypotheses in a single survey.

EXPANSION BOX 1
What Is Asked in a Survey

Although the categories overlap, the following can be asked in a survey:

1. Behavior. How frequently do you brush your teeth? Did you vote in the last city election? When did you last visit a close relative?
2. Attitudes/beliefs/opinions. What type of job do you think the mayor is doing? Do you think other people say many negative things about you when you are not there? What is the biggest problem facing the nation these days?
3. Characteristics. Are you married, never married, single, divorced, separated, or widowed? Do you belong to a union? What is your age?
4. Expectations. Do you plan to buy a new car in the next 12 months? How much schooling do you think your child will get? Do you think the population in this town will grow, decrease, or stay the same?
5. Self-classification. Do you consider yourself to be liberal, moderate, or conservative? Into which social class would you put your family? Would you say you are highly religious or not religious?
6. Knowledge. Who was elected mayor in the last election? About what percentage of the people in this city are non-White? Is it legal to own a personal copy of Karl Marx's Communist Manifesto in this country?

We can use surveys for exploratory, descriptive, or explanatory research. However, we should be cautious when asking "why" questions of respondents (e.g., Why do you think crime occurs?).2 Such questions may tell us about people's beliefs and subjective understandings, but people often have incomplete, mistaken, or distorted views. We do not want to confuse what people say or believe about why things occur with actual cause-effect relations in the social world.

A HISTORY OF SURVEY RESEARCH

The modern survey goes back to ancient forms of the census.3 A census is government-collected information on characteristics of the entire population in a territory. For example, the Domesday Book was a census of England conducted from 1085 to 1086 by William the Conqueror. The early census assessed property for taxation or identified young men for military service. After representative democracy developed, officials used the census to assign elected representatives based on the population in a district and to allocate funds for public improvements.

Surveys for social research started with nineteenth-century social reform movements in the United States and Great Britain. Surveys helped people document urban conditions and poverty produced by early industrialization. The early surveys were descriptive and did not use scientific sampling or statistical analyses. For example, between 1851 and 1864, Henry Mayhew published the four-volume London Labour and the London Poor based on conversations with street people and observations of daily life. Later studies, such as Charles Booth's seventeen-volume Labour and Life of the People of London (1889–1902) and B. Seebohm Rowntree's Poverty (1906), documented urban poverty in England; the Hull House Maps and Papers of 1895 and W. E. B. DuBois's Philadelphia Negro (1899) documented urban conditions in the United States.

In the early twentieth century, the Social Survey Movement in Canada, Great Britain, and the United States used the survey method as part of qualitative community field studies. The Social Survey Movement was an action-oriented community research program that interviewed people and documented conditions to gain support for sociopolitical reforms. By the 1940s, the positivist, quantitative survey had largely displaced this early form of survey research.

Early social surveys offered a detailed empirical picture of specific areas and combined sources of quantitative and qualitative data. Their goal was to inform the public of the problems of rapid industrialization. Early leaders of the social survey—Florence Kelly and Jane Addams of the Hull House and settlement movement and African American W. E. B. DuBois—were outside the mainstream of academic life. Kelly, Addams, and DuBois had difficulties securing regular academic employment because of the race and gender discrimination of that era. The early social surveys provide impressive pictures of daily community life in the early twentieth century. For example, the six-volume Pittsburgh Survey published in 1914 includes data from face-to-face interviews, statistics on health, crime, and industrial injury, and direct observations.

By the 1920s and 1930s, researchers began to use statistical sampling techniques, especially after the Literary Digest debacle. They created attitude scales and indexes to measure opinions and subjective beliefs in more precise, quantitative ways. Professionals in applied areas (e.g., agriculture, education, health care, journalism, marketing, public service, and philanthropy) adapted the survey technique for measuring consumer behavior, public opinion, and local needs.

By the 1930s, professional researchers who embraced a positivist orientation were fast displacing the social reformers who had used the survey to document local social problems. The professional researchers incorporated principles from the natural sciences and sought to make the survey method more objective, quantitative, and nonpolitical. Many academic researchers sought to distance themselves from social reform politics after the Progressive Era (1895–1915) ended. Competition among researchers and universities for status, prestige, and funds accelerated a reorientation or positivist "modernization" of the survey method. This period saw the creation of several survey research centers: the Office of Public Opinion Research at Princeton University, the Division of Program Surveys in the U.S. Department of Agriculture under Rensis Likert, and the Office of Radio Research at Columbia University. A publication devoted to advancing the survey research method, Public Opinion Quarterly, began in 1937. Several large private foundations (Carnegie, Rockefeller, and Sage) funded the expansion of quantitative, positivist-oriented social research.4

Survey research dramatically expanded during World War II, especially in the United States. Academic social researchers and practitioners from industry converged in Washington, D.C., to work for the war effort. Survey researchers received generous funding and government support to study civilian and soldier morale, consumer demand, production capacity, enemy propaganda, and the effectiveness of bombing. Wartime cooperation helped academic researchers and applied practitioners learn from one another and gain valuable experience in conducting many large-scale surveys. Academic researchers helped practitioners appreciate precise measurement, sampling, and statistical analysis. Practitioners helped academics learn the practical side of organizing and conducting surveys. Officials in government and business executives saw the practical benefits of using information from large-scale surveys. Academic social scientists realized they could advance understanding of social events and test theories with survey data.

After World War II, officials quickly dismantled the large government survey establishment. This was done to cut costs and because political conservatives feared that reformers might use survey methods to document social problems. They feared that such information about ill treatment and poor conditions could be used to advance policies that conservatives opposed, such as helping unemployed workers or promoting racial equality for African Americans in the segregated southern states.

After the war, many researchers returned to universities and founded new social research organizations, such as the National Opinion Research Center at the University of Chicago in 1947. Likert moved from the Department of Agriculture to create what became the Institute for Survey Research at the University of Michigan in 1949.

At first, universities were hesitant to embrace the new survey research centers. They were expensive and employed many people. Traditional social researchers were wary of quantitative research and skeptical of bringing a technique popular within private industry into the university. The culture of applied research and business-oriented poll takers clashed with an academic culture of basic researchers, yet survey use quickly increased in the United States and other advanced nations. By 1948, France, Norway, Germany, Italy, the Netherlands, Czechoslovakia, and Britain had each established national survey research institutes.5

Publications that included survey research accelerated in the 1950s to 1960s. For example, about 18 percent of articles in sociology journals used the survey method in the period 1939–1950; this rose to 55 percent by 1964–1965. In the 1960s, higher education and social science rapidly expanded, also spurring survey research. During the 1970s, computers first became available; they provided the statistical analysis of large-scale quantitative datasets, and hundreds of graduate students learned survey research techniques.6

Since the 1970s, quantitative survey research has become huge in private industry, government, and many academic fields (e.g., communication, education, economics, political science, public health, social psychology, and sociology). The professional survey industry employs more than 60,000 people in the United States alone. Most are part-time workers, assistants, or semiprofessionals; about 6,000 full-time professional survey researchers design and analyze surveys.7 Weisberg (2005:11) sees survey research becoming a separate discipline from the many fields (e.g., sociology, political science, marketing) that use it.

Professionals in education, health care, management, marketing, policy research, and journalism use survey research. Governments from the local to national levels around the world sponsor surveys to inform policy decisions. The private-sector survey industry includes opinion polling (e.g., Gallup, Harris, Roper, Yankelovich and Associates), marketing (e.g., Nielsen, Market Facts, Market Research Corporation), and nonprofit research (e.g., Mathematica Policy Research, Rand Corporation).8 In addition, survey research has several professional organizations.9

Over the past two decades, researchers have increasingly studied the survey process itself and developed theories of the communication-interaction process of a survey interview. They can pinpoint the effectiveness of visual and other clues in questionnaire design, recognize the impact of question wording or ordering, adjust for social desirability, incorporate computer-related technologies, and theorize about survey respondent cooperation or refusals.10

THE LOGIC OF SURVEY RESEARCH

In experimental research, we divide small numbers of people into equivalent groups, test one or two hypotheses, manipulate conditions so that certain participants receive the treatment, and control the setting to reduce threats to internal validity (i.e., confounding variables). At the end of an experiment, we have quantitative data and compare participant responses on the dependent variable.

Survey research follows a different logic. We usually sample many respondents and ask all of them the same questions. We measure many variables with the questions and test multiple hypotheses simultaneously. We infer temporal order from questions about past behavior, experiences, or characteristics; for example, years of schooling completed or race are prior in time to a person's current attitudes. We statistically analyze associations among the variables to identify causal relationships. We also anticipate possible alternative explanations and measure them with other survey questions (i.e., control variables). Later, we statistically examine their effects to rule out alternative explanations. Surveys are sometimes called correlational because the researchers do not control and manipulate conditions as in an experiment. In survey research, we use control variables to statistically approximate an experimenter's physical controls on confounding variables.

Steps in Conducting a Survey

To conduct a survey, researchers start with a theoretical or applied research problem. We can divide the steps in a survey study as outlined in Figure 1. The first phase is to create an instrument—a survey questionnaire or interview schedule. Respondents read the questions in a questionnaire themselves and mark the answers themselves. An interview schedule is a set of questions read to the respondent by an interviewer, who also records responses. To simplify the discussion, I will use only the term questionnaire.

Survey research proceeds deductively. First, we conceptualize variables and then operationalize each variable as one or more survey questions. This means we write, rewrite, and again rewrite survey questions for clarity and completeness. Once we have a collection of survey questions, we must organize them on the questionnaire and group and sequence the questions. Our research question, the types of respondents, and the type of survey (see types of surveys later in this chapter) should guide how we do this.

Let us say you are going to conduct a survey. As you prepare a questionnaire, think ahead to how you will record and organize the data. You also should pilot test the questionnaire with a small set of respondents who are similar to those in the final survey. If you use interviewers, you must train them with the questionnaire. In the pilot test and interviewer training, you ask respondents and interviewers whether the questions were clear, and you need to explore their interpretations to see whether your intended meaning was clear (see pilot testing and cognitive interviewing later in the chapter).11 This is the stage at which you would draw the sample of respondents.

After the planning phase, you are ready to collect data. You must locate sampled respondents in person, by telephone, by mail, or over the Internet. You provide respondents the instructions on completing the questionnaire or interview. The questions usually follow a simple stimulus/response or question/answer pattern. You must accurately record the answers or responses immediately after they are given. After all respondents have completed the questionnaire and you thank them for participating, you organize the quantitative data and prepare them for statistical analysis.

FIGURE 1
Steps in the Process of Survey Research

Step 1: Develop hypotheses. Decide on type of survey (mail, interview, telephone). Write survey questions. Decide on response categories. Design layout.
Step 2: Plan how to record data. Pilot test survey instrument.
Step 3: Decide on target population. Get sampling frame. Decide on sample size. Select sample.
Step 4: Locate respondents. Conduct interviews. Carefully record data.
Step 5: Enter data into computers. Recheck all data. Perform statistical analysis on data.
Step 6: Describe methods and findings in research report. Present findings to others for critique and evaluation.

Conducting survey research requires good organization. A large survey can be complex and expensive. It involves coordinating other people, moving through multiple steps, and accurate record keeping.12 You must keep track of each respondent's answer to every question on each questionnaire. To help with this task, you should assign each sampled respondent an identification number and attach the number to the questionnaire.

After collecting all of the data, you will want to review responses on individual questionnaires, store original questionnaires, and transfer information from questionnaires to a computer-readable format for statistical analysis. Meticulous bookkeeping and labeling are essential. If you are sloppy, you can lose the data or end up with worthless, inaccurate data.

There are many ways to make mistakes or errors in survey research (see Expansion Box 2, Sources of Errors in Survey Research). Errors can occur in sampling and respondent selection, in creating questionnaires or interviewing, and in handling or processing the data. Next we look at possible errors to avoid when you write questions for a survey research questionnaire.

EXPANSION BOX 2
Sources of Errors in Survey Research

Error is the difference between obtained values and "true values." It occurs when survey data (obtained values) do not accurately reflect the actual behaviors, beliefs, and understandings of respondents in a population that a survey researcher seeks to understand (true values).

1. Errors in selecting the respondent
   a. Sampling errors (e.g., using a nonprobability sampling method)
   b. Coverage errors (e.g., a poor sampling frame omits certain groups of people)
   c. Nonresponse errors at the level of a sampled unit (e.g., a respondent refuses to answer)
2. Errors in responding to survey questions
   a. Nonresponse errors specific to a survey item (e.g., certain questions are skipped or ignored)
   b. Measurement errors caused by the respondent (e.g., the respondent does not listen carefully)
   c. Measurement errors caused by interviewers (e.g., the interviewer is sloppy in reading questions or recording answers)
3. Survey administration errors
   a. Postsurvey errors (e.g., mistakes in cleaning data or transferring data into an electronic form)
   b. Mode effects (e.g., differences due to survey method: by mail, in person, over the Internet)
   c. Comparability errors (e.g., different survey organizations, nations, or time periods yield different data for the same respondents on the same issues)

See: Weisberg (2005:10–28) and Willis (2005:13–17).
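The bookkeeping described above (one identification number per sampled respondent, with every answer transferred to a computer-readable format) can be sketched in a few lines. The question fields and answers below are invented for illustration:

```python
import csv
import io

# Sketch of survey record keeping: each sampled respondent gets an ID number,
# and every answer is stored under that ID in a computer-readable format
# (CSV here). The question fields and answers are invented examples.
respondents = [
    {"resp_id": 1, "q1_jogs": "once a week", "q2_age": 34},
    {"resp_id": 2, "q1_jogs": "about once a day", "q2_age": 51},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["resp_id", "q1_jogs", "q2_age"])
writer.writeheader()           # column names first, so the file documents itself
writer.writerows(respondents)  # one row per respondent, keyed by resp_id

print(buffer.getvalue())
```

Keeping the identification number as the first field makes it straightforward to match an electronic record back to the stored paper questionnaire when a data-entry error is suspected.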
mixture of art and science. It is best to see the entire questionnaire as an integrated whole with the ques- tions flowing smoothly from one to another after a CONSTRUCTION OF few introductory remarks and instructions for ease TH E QU ESTION NAI RE of entry and clarity. Two key principles guide writing good survey Principles of Good Question Writing questions: Avoid possible confusion and keep the Dozens of books have been published on writing sur- respondent’s perspective in mind. Avoiding confu- vey questionnaires, so only the basics are reviewed sion is easier said than done. You want the survey here. Writing good survey questions involves a questions to provide a valid and reliable measure. SURVEY RESEARCH Being valid and reliable means that the respondents dozen others that use the MSS abbreviation. I belong should quickly grasp each question’s meaning as to a professional association, the Association you intended, answer completely and honestly, and for Asian Studies, or AAS. Six other academic believe that their answers are meaningful. organizations use the same acronym: American You do not want questions that confuse or frus- Astronomical Society, American Association of trate respondents. This means that you must exer- Suicidology, American Audiology Society, Ameri- cise extra care if the respondents are heterogeneous, can Astronautical Society, American Antiquarian come from life situations unfamiliar to you, or have Society, and the Assyrian Academic Society. different priorities than yours. You must be vigilant When you survey the public, you should use if the respondents use a different vocabulary or think the language of popular culture (i.e., what is on in different ways than you do. television or in a local newspaper with about an You want the questions to be equally clear, rel- eighth-grade reading vocabulary). Survey research- evant, and meaningful to all respondents, but you face ers have found that respondents often misun- a dilemma. 
If the respondents have diverse back- derstand basic terms and are confused by many grounds and frames of reference, the same question words. For example, a survey asked respondents wording may not have the same meaning for every- whether they thought television news was impar- one, yet you want everyone to hear the same ques- tial. Researchers later learned that large numbers of tion because you will combine all answers into respondents had ignored the word impartial—a numerical data for analysis. If each question is tai- term the researchers assumed everyone would lored to each respondent, you would not know know. Less than half of the respondents had inter- whether variations in the data are due to question preted the word as intended with its proper mean- wording or real differences among the respondents. ing. More than one-fourth had no idea of its Survey question writing takes skill, practice, meaning; others gave it unusual meanings, and one- patience, and creativity. You can understand princi- tenth thought it was directly opposite to its true ples of question writing by knowing ten things to meaning. In another case, one in four respondents avoid when you write survey questions. The list who had less than a high school degree (about 20 includes only frequently encountered potential percent of the U.S. adult population) did not know problems.13 what vaginal intercourse meant.14 2. Avoid ambiguity, confusion, and vague- 1. Avoid jargon, slang, and abbreviations. ness. Ambiguity and vagueness plague most ques- Jargon and technical terms come in many forms. tion writers. It is very easy to make implicit Plumbers talk about snakes, lawyers about a con- assumptions that can confuse respondents. For tract of uberrima fides, and psychologists about the example, the question “What is your income?” Oedipus complex. Slang is a kind of jargon within could mean weekly, monthly, or annually; family or a subculture. 
For example, people who are home- personal; before taxes or after taxes; for this year or less talk about a snowbird, and snowboarders talk last year; from salary or from all sources. Such con- about goofy foot. People inside a profession or fusion can cause inconsistencies in respondents’ members of a distinct subculture may be familiar answers to the question. If you want before-tax and comfortable with the jargon or slang terms but annual family income for last year, you should only confuse outsiders. Also, avoid using abbrevi- explicitly ask for it. Many respondents may not ations and acronyms. The same ones often have know this, but they tell you their weekly take-home many meanings. For example, I received a letter pay (see item 6 following as to questions beyond from the Midwest Sociological Society (MSS). respondent capabilities).15 Indefinite words or Look up the acronym, and you will see that MSS response categories are also sources of ambiguity. refers to Manufacturers Standardization Society, For example, an answer to the question “Do you Marine Systems Simulator, Medical Student Soci- jog regularly? Yes _____ No _____ ” hinges on the ety, and Minnesota Speleological Society, among a meaning of the word regularly. Some respondents SURVEY RESEARCH may define regularly as every day, others as once a avoid words with emotional “baggage” because week. To reduce confusion and get more informa- respondents may be reacting to the emotional words tion, be more specific: Rather than ask if a person rather than the substantive issue. For example, the regularly jogs, ask whether a person jogs “about question “What do you think about paying murder- once a day,” “a few times a week,” “once a week,” ous terrorists who threaten to steal the freedoms of and so on. (See Expansion Box 3, Improving peace-loving people?” is full of emotional words: Unclear Questions.) murderous, freedoms, steal, and peace. 3. 
Avoid emotional language and prestige Prestige bias occurs when questions include bias. Words have implicit connotative as well as terms about a highly prestigious person, group, or explicit denotative meanings. Likewise, titles or institution and a respondent’s feelings toward the positions in society (e.g., president, expert) carry prestige and status. Words with strong emotional Prestige bias A problem in survey research question connotations and issues connected to high-status writing that occurs when a highly respected group or people can color how respondents answer survey individual is associated with an answer choice. questions. It is best to use neutral language and EXPANSION BOX 3 Improving Unclear Questions Here are three survey questions written by experi- inadequate answers (e.g., don’t know). As you can enced professional researchers. They revised the see, question wording is an art that may improve with original wording after a pilot test revealed that 15 per- practice, patience, and pilot testing. cent of respondents asked for clarification or gave ORIGINAL QUESTION PROBLEM REVISED QUESTION Do you exercise or play What counts as Do you do any sports or hobbies, physical sports regularly? exercise? activities, or exercise, including walking, on a regular basis? What is the average number of Does margarine This next question is just about butter— days each week you have butter? count as butter? not including margarine. How many days a week do you have butter? [Following question on eggs] How many eggs is a On days when you eat eggs, how many What is the number of servings serving? What is a eggs do you usually have? in a typical day? typical day? 
PERCENTAGE OF PERCENTAGE RESPONSES TO ASKING FOR QUESTION CLARIFICATION Original Revision Original Revision Exercise question (saying “yes”) 48% 60% 5% 0% Butter question (saying “none”) 33 55 18 13 Egg question (saying “one”) 80 33 33 0 Source: Survey questions adapted from Fowler, Survey Research Methods, Sage Publications. 1992. SURVEY RESEARCH prestigious person or group overshadows how he leads respondents to answering no. A question or she answers a question. You would not know phrased, “Should the mayor allocate funds to fix whether you are measuring their feelings about a streets with large potholes that have become prestigious person or their real thoughts on the dangerous and are forcing drivers to make costly issue. For example, you ask, “Most doctors say that repairs?” leads respondents to say yes. cigarette smoke causes lung disease for those who 6. Avoid questions beyond respondents’capa- are near a smoker. Do you agree?” People who think bilities. Asking something that respondents do not it best to agree with doctors might agree even if they know creates confusion, frustration, and inaccurate personally disagree. responses. Respondents cannot always recall past 4. Avoid double-barreled questions. This details and may not know specific information. For is a version of avoiding ambiguity. You want example, asking a 40-year-old, “How did you feel each question to be about one and only one topic. about your brother when you were 6 years old?” is A double-barreled question consists of two or probably worthless, as is asking about an issue more questions mixed together. For example, you respondents know nothing about (e.g., a technical ask, “Does your employer offer pension and health issue in foreign affairs or an internal policy of an insurance benefits?” A respondent working for a organization). Respondents may give you an answer company that offers health insurance benefits but but an unreliable and meaningless one. 
When many not a pension could answer either yes or no. respondents are unlikely to know about an issue, use A respondent who hears the word and and thinks it special question formats (we discuss this later in the means and/or will say yes. A respondent who hears chapter). and and thinks it means both or and also will say Try to rephrase questions into the terms in “no.” With double-barreled questions, you cannot which respondents think. For example, few respon- be certain of the respondent’s intention. If you want dents can answer, “How many gallons of gasoline to ask about the joint occurrence of two things, ask did you buy last year for your car?” Yet they might two separate questions, each about a single issue. be able to answer a question about gasoline pur- During data analysis, you can see whether people chases in a typical week. You can do the calcula- who answered yes to one question also answered tions to estimate annual purchases.16 yes to another. Clear, relevant questions increase accuracy 5. Avoid leading questions. You always want and reduce errors. Clear questions contain built-in respondents to believe that all response choices are clues and make contrasts explicit. Instead of asking equally legitimate and never want them to become “Do you pay money to the children of your past aware of an answer that you expect or want. marriage?” it would be better to ask “Do you pay A leading (or loaded) question is one that leads child support?” For those answering yes, follow- the respondent to one response over another by its up questions could be “Did you pay alimony in wording. There are many kinds of leading ques- addition to child support?” and “Did you have any tions. For example, the question “You don’t smoke, other financial obligations, such as paying health do you?” leads respondents to state that they do not insurance, tuition, or contributing to the mortgage smoke. or rent payments?”17 Loaded questions can lead respondents to 7. 
7. Avoid false premises. If you begin a question with a premise with which respondents disagree and offer choices regarding it, respondents may become frustrated and not know how to answer. About two years ago, I experienced a false premise question, but it was not in a survey. I was an airline passenger shortly after the airlines ceased providing free in-flight snacks. A flight attendant handed me an optional snack and asked, "Will you be paying by cash or credit card?" I hesitated a second and then realized that it was a ploy to get me to purchase the now optional snack that I did not want. I replied "neither" and returned it quickly. The false premise in this situation was that I wanted to buy the snack. I became a little irritated with this premise. Apparently, the false premise had irritated others because six months later, flight attendants no longer tried to trick passengers into buying the snacks.

Double-barreled question A survey inquiry that contains more than one issue and can create respondent confusion or ambiguous answers.

8. Avoid asking about distant future intentions. Avoid asking people about what they might do under hypothetical circumstances. Questions such as "Suppose a new grocery store opened down the road. Would you shop at it?" are usually a waste of time. It is best to ask about current or recent attitudes and behavior. Respondents give more reliable answers to specific, concrete, and relevant questions than to questions about things remote from immediate experiences.

9. Avoid double negatives. Double negatives in ordinary language are grammatically incorrect and confusing. For example, "I ain't got no job" grammatically and logically means that I have a job. Some people use the second negative for emphasis. Such blatant errors are rare, but subtle forms of the double negative are also confusing. They can arise when we ask respondents to agree or disagree with a statement. For example, you ask "Do you agree or disagree that students should not be required to take a comprehensive exam to graduate?" This is confusing. To disagree is a double negative; it is to disagree with not doing something. You always want to keep questions simple and straightforward.
10. Avoid overlapping or unbalanced response categories. Make response categories or choices mutually exclusive, exhaustive, and balanced. Mutually exclusive means that the response categories do not overlap. It is easy to fix overlapping categories that are numerical ranges (e.g., 5–10, 10–20, 20–30 become 5–9, 10–19, 20–29). Ambiguous verbal choices can be overlapping response categories: for example, "Are you satisfied with your job, or are there things you do not like about it?" Assume that I am satisfied overall with my job, but it has some specific things I really dislike. Exhaustive means that every respondent has a choice—a place to go. For example, asking respondents, "Are you working or unemployed?" omits respondents who are not working and who are not unemployed, such as full-time homemakers, people on vacation, full-time students, people who are permanently disabled and cannot work, and people who are retired. To avoid such problems, first think seriously about what you really want to measure and consider the circumstances of all possible respondents. For example, if you ask about employment, do you want information on a primary job or on all jobs, on full-time work only or on both full- and part-time work, on jobs for pay only or on unpaid or volunteer jobs as well?

Keep response categories balanced. Unbalanced response categories create a type of leading question. An unbalanced choice is "What kind of job is the mayor doing: outstanding, excellent, very good, or satisfactory?" Another type of unbalanced question omits information—for example, "Which of the five candidates running for mayor do you favor: Eugene Oswego or one of the others?"

You can balance categories by offering polar opposites. It is easy to see that the terms honesty and dishonesty have different meanings and connotations. Asking whether a mayor is highly, somewhat, or not very honest is not the same as asking whether a mayor is very honest, somewhat honest, neither honest nor dishonest, somewhat dishonest, or very dishonest. The way that you ask a question could give you very different pictures of what people think. Unless you have a specific reason for doing otherwise, offer polar opposites at each end of a continuum18 (see Table 1).

TABLE 1 Summary of Survey Question Writing Pitfalls

THINGS TO AVOID: Jargon, slang, abbreviations
NOT GOOD: Did you drown in brew until you were totally blasted last night?
A POSSIBLE IMPROVEMENT: Last night, about how much beer did you drink?

THINGS TO AVOID: Vagueness
NOT GOOD: Do you eat out often?
A POSSIBLE IMPROVEMENT: In a typical week, about how many meals do you eat away from home, at a restaurant, cafeteria, or other eating establishment?

THINGS TO AVOID: Emotional language and prestige bias
NOT GOOD: "The respected Grace Commission documents that a staggering $350 BILLION of our tax dollars are being completely wasted through poor procurement practices, bad management, sloppy bookkeeping, 'defective' contract management, personnel abuses and other wasteful practices. Is cutting pork barrel spending and eliminating government waste a top priority for you?"*
A POSSIBLE IMPROVEMENT: How important is it to you that Congress adopt measures to reduce government waste? Very Important / Somewhat Important / Neither Important nor Unimportant / Somewhat Unimportant / Not Important at All

THINGS TO AVOID: Double-barreled questions
NOT GOOD: Do you support or oppose raising Social Security benefits and increased spending for the military?
A POSSIBLE IMPROVEMENT: Do you support or oppose raising Social Security benefits? Do you support or oppose increasing spending on the military?

THINGS TO AVOID: Leading questions
NOT GOOD: Did you do your patriotic duty and vote in the last election for mayor?
A POSSIBLE IMPROVEMENT: Did you vote in last month's mayoral election?

THINGS TO AVOID: Issues beyond respondent capabilities
NOT GOOD: Two years ago, how many hours did you watch TV every month?
A POSSIBLE IMPROVEMENT: In the past two weeks, about how many hours do you think you watched TV on a typical day?

THINGS TO AVOID: False premises
NOT GOOD: When did you stop beating your girl- or boyfriend?
A POSSIBLE IMPROVEMENT: Have you ever slapped, punched, or hit your girl- or boyfriend?

THINGS TO AVOID: Distant future intentions
NOT GOOD: After you graduate from college, get a job, and are settled, will you invest a lot of money in the stock market?
A POSSIBLE IMPROVEMENT: Do you have definite plans to put some money into the stock market within the coming two months?

THINGS TO AVOID: Double negatives
NOT GOOD: Do you disagree with those who do not want to build a new city swimming pool?
A POSSIBLE IMPROVEMENT: There is a proposal to build a new city swimming pool. Do you agree or disagree with the proposal?

THINGS TO AVOID: Unbalanced responses
NOT GOOD: Did you find the service at our hotel to be Outstanding, Excellent, Superior, or Good?
A POSSIBLE IMPROVEMENT: Please rate the service at our hotel: Outstanding, Very Good, Adequate, or Poor.

*Actual question taken from a mail questionnaire that was sent to the author in May 1998 by the National Republican Congressional Committee. It is also a double-barreled question.
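The overlapping-range fix under point 10 (5–10, 10–20, 20–30 becoming 5–9, 10–19, 20–29) can be checked mechanically before a questionnaire is printed. A minimal sketch; the function name and ranges are illustrative only:

```python
# Response categories as (low, high) ranges, inclusive at both ends.
# With 5-10, 10-20, 20-30, the values 10 and 20 each fall into two
# categories; the revised ranges are mutually exclusive.
overlapping = [(5, 10), (10, 20), (20, 30)]
fixed = [(5, 9), (10, 19), (20, 29)]

def is_mutually_exclusive(categories):
    """True if no value can fall into more than one category."""
    ordered = sorted(categories)
    return all(prev_high < next_low
               for (_, prev_high), (next_low, _) in zip(ordered, ordered[1:]))

print(is_mutually_exclusive(overlapping))  # False
print(is_mutually_exclusive(fixed))        # True
```

A companion check for exhaustiveness would verify that the ranges leave no gap between the lowest and highest values a respondent could report.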
Respondent Recall

We often want to ask respondents about past behaviors or events. Respondents vary in their ability to recall accurately when answering survey questions.19 Recalling past events often takes more time and effort than the few seconds we give respondents to answer a survey question. Also, the ability of people to recall accurately declines quickly over time. They might accurately recall a significant event that occurred 2 weeks ago, but few can be accurate about minor events that happened 2 years ago.

Survey researchers recognize that memory is less trustworthy than was once assumed. Many factors influence recall: the topic (threatening or socially desirable), events occurring simultaneously and subsequently, the significance of an event for a person, the situational condition (question wording and interview style), and a respondent's need for internal consistency. Also, recall (e.g., what is the name of your town's mayor) is more difficult than recognition (e.g., look at this list of names and please identify which one is your town's mayor).

The issue of respondent recall does not mean that we cannot ask about past events; rather, we must write survey questions specifically for that purpose and interpret results with caution. To improve recall, we can offer special instructions and extra thinking time. We can provide aids to respondent recall, such as a fixed timeframe or location references. Rather than ask "How often did you attend a sporting event last winter?" you should say, "I want to know how many sporting events you attended last winter. Let's go month by month. Think back to December. Did you attend any sporting events for which you paid admission in December? Now, think back to January. Did you attend any sporting events that charged admission in January?" (See Example Box 1, How to Measure TV Watching in a Survey.)

EXAMPLE BOX 1
How to Measure TV Watching in a Survey

Two studies by Prior (2009a, 2009b) illustrate the difficulty of using recall survey questions to measure television watching. The primary way we measure media usage is by self-reports on surveys. In the past 10 years, nearly fifty studies in leading scholarly journals used survey self-reports of media usage as data. Unfortunately, people do not recall accurately and can dramatically overstate media usage in surveys. Survey self-reports of watching television news during the past week are three times higher than the media company Nielsen has found based on its in-set usage-monitoring technology. While most demographic groups overreport, Prior found overreporting was highest in the 18–34-year-old age group. About thirty-five percent in this age group said they watch TV news each day, but the Nielsen technology shows that only 5 percent really do. Even older age groups who are much more accurate overstate by a factor of 2. Prior looked at three explanations for inaccurate recall of behavior on surveys from the literature on how respondents answer in surveys: satisficing, flawed estimates, and social desirability.
Satisficing is a word that describes people having inaccurate recall because they lack motivation or do not try hard enough to search their memories. Flawed estimates result when people do not use good memory searching strategies to remember. Social desirability indicates that people report what they believe to be a socially appropriate or normative answer. In a series of experiments with survey question formats, Prior found little support for satisficing or social desirability, at least for TV news recall. Even when given extra time to think, told that their answer was important, and asked a second time, people highly overstated. When people were told how much others watched TV news, they changed answers dramatically to conform. However, when given some assistance in recall, extreme overstating decreased. When people were given an "anchor" or some additional factual information to assist their recall, their estimates improved. Respondents were asked, "The next question is about the nightly national network news on CBS, ABC, and NBC. This is different from local news shows about the area where you live and from cable news channels such as CNN or Fox News channel. How many days in the past week did you watch national network news on television?" One group of respondents heard the following introductory statement: "Television news audiences have declined a lot lately. Less than one out of every ten Americans watches the national network news on a typical weekday evening." Respondents who heard this introductory statement took longer to answer and gave lower reports of news watching. Prior's research suggests that respondents may give more accurate recalls in survey questions if they are both given more time to respond and are helped along in the recall process.

Respondents often telescope, or compress time, when asked about past events. They recall an event but place it earlier (backward telescoping) or later (forward telescoping) than it actually occurred. Several techniques reduce telescoping (see Expansion Box 4, Four Techniques to Reduce Telescoping).

Telescoping Survey research respondents' compressing time when answering about past events, overreporting recent events, and underreporting distant past ones.

EXPANSION BOX 4
Four Techniques to Reduce Telescoping

1. Situational framing. Ask the respondent to recall a specific situation and ask details about it ("Tell me what happened on the day you were married, starting with the morning").
2. Decomposition. Ask the respondent about several specific events and then add them up ("Last week did you buy anything from a vending machine? Now, for the week before that, did you buy any items?").
3. Landmark anchoring. Ask the respondent whether something occurred before or after a major event ("Did that occur before or after the major earthquake here in June 2010?").
4. Bounded recall (for panel surveys). Ask the respondent about events that occurred since the last interview ("We last talked 2 years ago; since that time, what jobs have you held?").

Honest Answers

Questions about Sensitive Topics. We sometimes ask about sensitive issues or ones that people believe threaten their presentation of themselves. These include questions about sexual behavior, drug or alcohol use, mental health problems, law violations, or socially unpopular behavior. Respondents may be reluctant to answer completely and truthfully. To ask about such issues, we adjust how we ask and are especially cautious about the results20 (see Table 2).

TABLE 2 Threatening Questions and Sensitive Issues

TOPIC                                  PERCENTAGE VERY UNEASY
Masturbation                                     56
Sexual intercourse                               42
Use of marijuana or hashish                      42
Use of stimulants and depressants                31
Getting drunk                                    29
Petting and kissing                              20
Income                                           12
Gambling with friends                            10
Drinking beer, wine, or liquor                   10
Happiness and well-being                          4
Education                                         3
Occupation                                        3
Social activities                                 2
General leisure                                   2
Sports activity                                   1

Source: Adapted from Bradburn and Sudman, Improving Interview Method and Questionnaire Design. Jossey-Bass, 1980. ISBN-10: 087589402X.

Questions on sensitive issues are part of the larger issue of ego protection. Most of us try to present a positive image of ourselves to others. We may be ashamed, embarrassed, or afraid to give truthful answers, or may find confronting our actions honestly to be emotionally painful, let alone admitting them to others. When this occurs, we underreport the behaviors or attitudes we wish to hide or believe to violate social norms. People often underreport having an illness or disability (e.g., cancer, mental illness, venereal disease), engaging in illegal or deviant behavior (e.g., evading taxes, taking drugs, consuming alcohol, engaging in uncommon sexual practices), or revealing their financial status (e.g., income, savings, debts).

We can increase honest answering about sensitive topics in four ways: create comfort and trust, use enhanced phrasing, establish a desensitizing context, and use anonymous questioning methods. Each is discussed next.
1. Create comfort and trust. Establish trust and a comfortable setting before asking questions. Before starting an interview, we can explicitly restate guarantees of anonymity and confidentiality and emphasize the need for obtaining honest answers from respondents. We also can ask sensitive questions only after a "warm-up period" of asking nonthreatening questions and creating feelings of trust or comfort.

2. Use enhanced phrasing. Modify question wording to reduce threat. For example, you could ask "Have you ever shoplifted?" which carries an accusatory tone and uses the emotional word shoplift that names an illegal act. You could get at the same behavior by asking "Have you ever taken anything from a store without paying for it?" This only describes the behavior, avoids using emotional words, and leaves open the possibility that it happened under acceptable conditions (e.g., accidentally forgetting to pay).

3. Establish a desensitizing context. We can also reduce threat and make it easier for respondents to answer honestly about sensitive topics by providing desensitizing contextual information. One way is to first ask about behaviors more serious than the ones of real interest to us. For example, a respondent may hesitate to answer a question about shoplifting, but if it follows questions regarding a long list of serious crimes (e.g., armed robbery, burglary), it will appear less serious and might be answered honestly.

4. Use anonymous questioning methods. The questioning format significantly affects how respondents answer sensitive questions. Formats that permit increased anonymity, such as a self-administered questionnaire or a Web-based survey, increase the likelihood of honest responses to sensitive questions over formats that require interacting with another person, as in a face-to-face interview.21

Technological innovations such as computer-assisted self-administered interviewing (CASAI) and computer-assisted personal interviewing (CAPI) enable respondents to have a degree of anonymity. CASAI "interviews" a respondent by having the person read questions on a computer screen or listen to them with earphones. The respondent answers by moving a computer mouse or typing on a keyboard. Even when an interviewer or others are present in the same room, the respondent is semi-insulated from human contact and interacts only with an automated system. In CAPI, the respondent uses a laptop computer, and an interviewer is available to help or answer questions. Respondents hear questions over earphones and/or read them on a screen and then enter answers without the interviewer directly observing. While completing computer-based interviews, respondents appear to believe they have privacy even if others are present.22

Computer-assisted self-administered interviewing (CASAI) Technique in which a respondent reads questions on a computer screen or listens over earphones and then answers by moving a computer mouse or typing on a keyboard.

Computer-assisted personal interviewing (CAPI) Technique in which an interviewer sets up a laptop computer and is available to help respondents who hear questions over earphones and/or read them on a screen and then enter answers.

A complicated method for asking sensitive questions in face-to-face interview situations is the randomized response technique (RRT). The technique uses statistics beyond the level of this book but is similar to the method described in the chapter's opening box on female presidential candidates. The basic idea is to use known probabilities to estimate unknown proportions. Here is how RRT works. An interviewer gives the respondent two questions: One is threatening (e.g., "Do you use heroin?"), the other not threatening (e.g., "Were you born in September?"). A random method (e.g., a toss of a coin, using heads to indicate the heroin question and tails for the birthdate question) is used to select the question to answer. The interviewer does not see the question and records the respondent's answer (yes or no). By using the probability of the random outcomes (e.g., the percent of people born in September), we can estimate the frequency of the sensitive behavior.

Randomized response technique (RRT) A specialized method in survey research used for very sensitive topics; the random receipt of a question by the respondent without the interviewer being aware of the question to which the respondent is answering.
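The "known probabilities" arithmetic behind RRT can be written out. With a fair coin, the overall probability of a "yes" is 0.5 times the true sensitive proportion plus 0.5 times the share of people who answer yes to the innocuous question, so the sensitive proportion can be solved for. A sketch with invented numbers, assuming roughly 1 person in 12 is born in September:

```python
# Randomized response estimate: back out the sensitive proportion from
# the overall "yes" rate using the known probabilities.
P_SENSITIVE_Q = 0.5        # fair coin: heads -> the heroin question
P_YES_INNOCUOUS = 1 / 12   # assumed share of people born in September

def rrt_estimate(observed_yes_rate):
    """Solve observed = 0.5 * pi + 0.5 * P_YES_INNOCUOUS for pi."""
    return (observed_yes_rate
            - (1 - P_SENSITIVE_Q) * P_YES_INNOCUOUS) / P_SENSITIVE_Q

# Hypothetical survey: 9 percent of all recorded answers were "yes".
print(round(rrt_estimate(0.09), 3))  # 0.097, i.e., about 9.7 percent
```

No individual answer reveals anything, because the interviewer never knows which question a given respondent received; only the aggregate rate is informative.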
We want honest answers to questions on sensitive topics and want to reduce the chances that respondents will give a less-than-honest, socially acceptable answer as described in this chapter's opening box. However, social desirability bias is widespread. It occurs when respondents distort answers to conform to popular social norms. People tend to overstate being highly cultured (e.g., reading, attending cultural events), giving money to charity, having a good marriage, loving their children, and so forth. One study found that 34 percent of people who reported in a survey that they gave money to a local charity really did not.23 Because a norm says that one should vote in elections, many report voting when they did not. In the United States, those under the greatest pressure to vote (i.e., highly educated, politically partisan, highly religious people who had been contacted by an organization that urged them to vote) are the people most likely to overreport voting. This patterned misrepresentation of voting "substantially distorts" studies of voting that rely on self-reported survey data (Bernstein et al., 2001:41).

Social desirability bias A problem in survey research in which respondents give a "normative" response or a socially acceptable answer rather than an honest answer.

One way to reduce social desirability bias is to phrase questions in ways that make norm violation appear less objectionable or give respondents "face-saving" alternatives. For example, Belli et al. (1999) reduced overreporting of voting and permitted respondents to "save face" by including in their voting question statements such as "A lot of people were not able to vote because they were not registered, were sick, or just didn't have time." They offered four response choices: "I did not vote in the November 5 election; I thought about voting but did not vote; I usually vote but did not vote this time; I am sure I voted on November 5." Only the last response choice is a clear, unambiguous indication that the person voted. Phrased in this manner, more people admitted that they did not vote.
Knowledge Questions. Studies suggest that a large majority of the public cannot correctly answer elementary geography questions, name their elected leaders, or identify major documents (e.g., the Declaration of Independence). If we use knowledge questions to learn what respondents know, we need to be careful because respondents may lie because they do not want to appear ignorant.24 Knowledge questions are important because they address the basis on which people make judgments and form opinions. They tell us whether people are forming opinions based on inaccurate information.

Nadeau and colleagues (1993) found that most Americans seriously overestimate the percent of racial minorities in the population. Only 15 percent (plus or minus 6 percent) of U.S. adults accurately report that 12.1 percent of the U.S. population is African American. More than half believe it is above 30 percent. Similarly, Jews make up about 3 percent of the U.S. population, but a majority (60 percent) of Americans believe the proportion to be 10 percent. A follow-up study by Sigelman and Niemi (2001:93) found that "African Americans themselves overestimate the black population by at least as much" as other respondents. Nearly twice as many African Americans (about 30 percent) versus 15 percent of Whites thought that African Americans were one-half of the U.S. population. Apparently, many Americans have a distorted view of the true racial composition of their country.

Race is not the only issue of which the public has a distorted picture. For example, when we ask Americans about government spending for foreign aid, a large percentage will say that it is too high. However, if we ask them how much the government should be spending on foreign aid, people report an amount that is actually more than the government is currently spending. This situation creates a dilemma. If we ask about the issue in one way, we find that the public says the spending is too high, but if we ask in a different way, we find the public says (indirectly) that it is lower than it should be.

Such a dilemma is not unique to the foreign aid issue. In many issue areas—university expenses, health care programs, aid to poor people—respondents offer an opinion to support or oppose an issue or policy position, but if we ask them about the issue in a different way, their position reverses.

This dilemma does not mean that we cannot obtain valid measures of public opinions with surveys. It reminds us that social life is complex and writing good surveys to learn about what people think requires effort and diligence. If we carelessly ask for an opinion, we may receive a superficial one offered without serious thought or based on inaccurate knowledge. Or we might get an opinion parroted from what a neighbor said or what was heard in a television advocacy "sound bite."

You may think having an inaccurate view of the country's racial composition or foreign aid spending occurs because the information is beyond people's everyday experiences, but people can also give inaccurate answers to questions about the number of people living in their household. This is not due to ignorance but comes from the complexity of their daily lives. Some people will not report as part of their households marginal persons (e.g., a boyfriend who left for a week, the adult daughter who ran out after an argument about her pregnancy, or the uncle who walked out after a dispute over money). However, such marginal people may not have another permanent residence. If we asked them where they live, they would say they are still living in the household that did not include them, and they plan to return to it.25

Our goal in survey research is to obtain accurate information (i.e., a valid and reliable measure of what a person really thinks, does, or feels). Pilot testing questions (discussed later in this chapter) helps to achieve this. Pilot tests reveal whether questions are at an appropriate level of difficulty. We gain little if 99 percent of respondents cannot answer the question. We must word questions so that respondents feel comfortable saying they do not know the answer—for example, "How much, if anything, have you heard about...?"

We can check whether respondents are overstating their knowledge with a sleeper question to which a respondent could not possibly know the answer. For example, in a study to determine which U.S. civil rights leaders respondents recognized, researchers added the name of a fictitious person. This person was "recognized" by 15 percent of the respondents. This implies that 15 percent of the actual leaders that respondents "recognized" were probably unknown. Another method is to ask respondents an open-ended question after they recognize a name, such as "What can you tell me about the person" (see the next section, open- versus closed-ended questions).

Sleeper question Survey research inquiry about nonexistent people or events to check whether respondents are being truthful.

Contingency Questions. Some questions apply only to specific respondents, and researchers should avoid asking questions that are irrelevant for a respondent. A contingency question (sometimes called a screen or skip question) is a two- (or more) part question.26 The answer to the first part of the question determines which of two different questions to ask a respondent next. Contingency questions identify respondents for whom a second question is relevant. On the basis of the answer to a first question, the researchers instruct the respondent or the interviewer to go to another or to skip certain questions (see Expansion Box 5, Example of a Contingency Question).

Contingency question A two-part survey item in which a respondent's answer to a first question directs him or her either to the next questionnaire item or to a more specific and related second question.

EXPANSION BOX 5
Example of a Contingency Question

QUESTION VERSION 1 (NOT CONTINGENCY QUESTION)
In the past year, how often have you used a seat belt when you have ridden in the backseat of a car?

QUESTION VERSION 2 (CONTINGENCY QUESTION)
In the past, have you ridden in the backseat of a car?
  No [Skip to next question]
  Yes → When you rode in the backseat, how often did you use a seat belt?

Results           Always Use    Never Use
Version 1            30%           24%
Version 2            42             4

During pilot testing, researchers learned that many respondents who answered "never" to Version 1 did not ride in the backseat of a car. Version 1 created ambiguity because respondents who never rode in the backseat plus those who rode there but did not use a seat belt both answered "Never." Version 2, using a contingency question format, clarified the question.

Source: Adapted from Presser, Evaluating Survey Questionnaires, Hoboken, NJ: Wiley (2004). Reprinted by permission of John Wiley & Sons, Inc.
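The skip pattern in Expansion Box 5 is, in programming terms, a simple branch: the screening answer decides whether the follow-up is asked at all. A minimal sketch of such skip logic as it might appear in a computer-assisted interview script; the function name and wording are illustrative:

```python
def ask_contingency(rode_in_backseat):
    """Version 2 skip logic: only backseat riders get the follow-up."""
    if rode_in_backseat == "no":
        return None  # skip to the next question; follow-up is irrelevant
    return ("When you rode in the backseat, "
            "how often did you use a seat belt?")

print(ask_contingency("no"))   # None -> the follow-up is skipped
print(ask_contingency("yes"))  # the seat-belt question is asked
```

Encoding the branch explicitly is what removes the ambiguity of Version 1: non-riders never see the seat-belt item, so "never" can only mean "rode but did not buckle up."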
Open-Ended versus Closed-Ended Questions

Researchers actively debate the merits of open versus closed survey questions.27 An open-ended question (requiring an unstructured, free response) asks a question (e.g., "What is your favorite television program?") to which respondents can give any answer. A closed-ended question (asking for a structured, fixed response) asks a question and offers a fixed set of responses from which a respondent can choose (e.g., "Is the president doing a very good, good, fair, or poor job, in your opinion?").

Open-ended question A type of survey research inquiry that allows respondents freedom to offer any answer they wish to the question.

Closed-ended question A type of survey research inquiry in which respondents must choose from a fixed set of answers.

Each question form has advantages and disadvantages (see Table 3). The crucial issue is not which form is better, but which form is most appropriate for a specific situation. Your choice of an open- or closed-ended question depends on the purpose and the practical limits of a study. The demands of using open-ended questions, requiring interviewers to write verbatim answers followed by time-consuming coding, may make them impractical for many studies.

We use closed-ended questions in large-scale surveys because they are faster and easier for both respondents and researchers, yet we can lose something important whenever we force an individual's beliefs and feelings into a few fixed, predetermined categories. To learn how a respondent thinks and discover what is important to him or her, or for questions with numerous answer categories (e.g., age), open questions are best.

You can reduce the disadvantages of a question format by mixing open-ended and closed-ended questions in a questionnaire. Mixing them also offers a change of pace and helps interviewers establish rapport. Periodic probes (i.e., follow-up questions by interviewers, discussed later) with closed-ended questions can reveal a respondent's reasoning. Having interviewers periodically use probes to ask about a respondent's thinking can check on whether the respondent understands the questions as you intended. However, probes are not substitutes for writing clear questions or creating a framework of understanding for the respondent. Unless carefully stated, probes might influence a respondent's answers or obtain answers for respondents who have no opinion, yet flexible or conversational interviewing (discussed later in this chapter) encourages many probes. For example, to the question "Did you do any work for money last week?" a respondent might hesitate and then reply, "Yes." An interviewer probes, "Could you tell me exactly what work you did?" The respondent may reply, "On Tuesday and Wednesday, I spent a couple of hours helping my buddy John move into his new apartment. For that he gave me $40, but I didn't have any other job or get paid for doing anything else." If your intention is to get reports of only regular employment, the probe revealed a misunderstanding. We also use partially open questions (i.e., a set of fixed choices with a final open choice of "other"), which allow respondents to offer an answer other than one of the fixed choices.

Partially open question A type of survey research inquiry in which respondents are given a fixed set of answers to choose from, but an "other" category is added so that they can specify a different answer.

A total reliance on closed questions can distort results. For example, a study compared open and closed versions of the question "What is the major problem facing the nation?" Respondents ranked different problems as most important depending on the form of the question. As Schuman and Presser (1979:86) reported, "Almost all respondents work within the substantive framework of the priorities provided by the investigators, whether or not it fits their own priorities" [emphasis added]. In a study that asked respondents open and closed questions about what was important in a job, half of the respondents who answered the open-ended version gave answers outside the closed-ended question responses.

Open-ended questions are especially valuable in early or exploratory stages of research. For large-scale surveys, we can use open questions in pilot tests and later develop closed-ended questions from the open question answers.

TABLE 3 Closed versus Open Questions

ADVANTAGES OF CLOSED
They are easier and quicker for respondents to answer.
The answers of different respondents are easier to compare.
Answers are easier to code and statistically analyze.
The response choices can clarify a question's meaning for respondents.
Respondents are more likely to answer about sensitive topics.
There are fewer irrelevant or confused answers to questions.
Less articulate or less literate respondents are not at a disadvantage.
Replication is easier.

DISADVANTAGES OF CLOSED
They can suggest ideas that the respondent would not otherwise have.
Respondents with no opinion or no knowledge can answer anyway.
Respondents can be frustrated because their desired answer is not a choice.
It is confusing if many (e.g., 20) response choices are offered.
Misinterpretation of a question can go unnoticed.
Distinctions between respondent answers may be blurred.
Clerical mistakes or marking the wrong response is possible.
They force respondents to give simplistic responses to complex issues.
They force respondents to make choices they would not make in the real world.

ADVANTAGES OF OPEN
They permit an unlimited number of possible answers.
Respondents can answer in detail and can qualify and clarify responses.
They can help us discover unanticipated findings.
They permit adequate answers to complex issues.
They permit creativity, self-expression, and richness of detail.
They reveal a respondent's logic, thinking process, and frame of reference.

DISADVANTAGES OF OPEN
Different respondents give different degrees of detail in answers.
Responses may be irrelevant or buried in useless detail.
Comparisons and statistical analysis become very difficult.
Coding responses is difficult.
Articulate and highly literate respondents have an advantage.
Questions may be too general for respondents who lose direction.
Responses are written verbatim, which is difficult for interviewers.
An increased amount of respondent time, thought, and effort is necessary.
Respondents can be intimidated by questions.
Answers take up a lot of space in the questionnaire.

Closed-ended questions require us to make many decisions. How many response choices do we provide? Should we offer a middle or neutral choice? What should be the order of responses? What types of response choices should be included? Answers to these questions are not easy. For example, two response choices are too few, but more than seven are rarely a benefit. We want to measure

an issue and really having no true opinion or view on it.

3. False negative. Caused when a respondent refuses to answer some questions or withholds an answer when he or she actually has information or really holds an opinion.

The three types of responses overlap.
The first meaningful distinctions, not collapse them. More involves an inaccurate direction of a response toward specific answer choices yield more information, but a normative position, the second substitutes wild too many specifics create respondent confusion. For guesses for a serious response, and the last type is the example, rephrasing the question “Are you satisfied partial and selective nonresponse to the survey.28 with your dentist?” (which has a yes/no answer) to Neutral Positions. Survey researchers debate “How satisfied are you with your dentist: very sat- whether they should offer respondents who lack isfied, somewhat satisfied, somewhat dissatisfied, knowledge or have no position a neutral position or not satisfied at all?” gives us more information and a “no opinion” choice.29 and a respondent more choices. Some argue against offering a neutral or middle position and the no opinion option and favor pres- Neutral Positions, Floaters, and Selective suring respondents to give a response.30 This per- Refusals spective holds that respondents engage in satisficing; that is, they pick no opinion or a neutral response to Failing to get valid responses from each respondent avoid the cognitive effort of answering. Those with weakens a survey. Respondents may answer three this position maintain that the least educated respon- ways that yield invalid responses. dents may pick a no opinion option when they actu- 1. Swayed opinion. This involves falsely over- ally have one they believe that pressuring respondents stating a position as with the social desirability for an answer does not lower data quality. bias, or falsely understating or withholding a Others argue that it is best to offer a neutral (“no position as with sensitive topics. opinion”) choice because people often answer ques- 2. False positive. This results from selecting an tions to please others or not to appear ignorant. 
attitude position but lacking any knowledge on Respondents may give opinions on fictitious issues, objects, and events. By offering a nonattitude (mid- dle or no opinion) choice, we can identify respon- Satisficing Avoiding exerting cognitive effort when dents without an opinion and separate them from answering survey questions and giving the least respondents who really have one. demanding answer that will satisfy the minimal require- ments of a survey question or interview situation. Floaters. Survey questions address the issue of Standard-format question A survey research inquiry nonattitudes with three types of attitude questions: for which the answer categories do not include a “no standard-format, quasi-filter, and full-filter ques- opinion” or “don’t know” option. tions (see Expansion Box 6, Standard-Format, Quasi-filter question A survey research inquiry that Quasi-Filter, and Full-Filter Questions). The includes the answer choice “no opinion,” “unsure,” or standard-format question does not offer a “don’t “don’t know.” know” choice; a respondent must volunteer it. Full-filter question A survey research inquiry that A quasi-filter question offers a “don’t know” first asks respondents whether they have an opinion alternative. A full-filter question is a special type or know about a topic; then only those with an opin- ion or knowledge are asked specifically about the of contingency question. It first asks whether topic. respondents have an opinion, and then asks for the opinion of those who state that they do have one. SURVEY RESEARCH EXPANSION BOX 6 Standard-Format, Quasi-Filter, and Full-Filter Questions STANDARD FORMAT Here is a question about another country. Do you agree or disagree with this statement? “The Russian leaders are basically trying to get along with America.” QUASI-FILTER Here is a statement about another country: “The Russian leaders are basically trying to get along with America.” Do you agree, disagree, or have no opinion on that? 
FULL FILTER
Here is a statement about another country. Not everyone has an opinion on this. If you do not have an opinion, just say so. Here's the statement: "The Russian leaders are basically trying to get along with America." Do you have an opinion on that? No (go to next question), Yes (continue). Do you agree or disagree?

Example of Results from Different Question Forms

              Standard Format (%)   Quasi-Filter (%)   Full Filter (%)
Agree               48.2                 27.7               22.9
Disagree            38.2                 29.5               20.9
No opinion          13.6*                42.8               56.3

*Volunteered

Source: Adapted from Schuman and Presser (1981), Questions and Answers in Attitude Surveys: Experiments in Question Form, Wording, and Context (116–125). Academic Press. With permission from Elsevier. Standard format is from Fall 1978; quasi- and full-filter forms are from February 1977.

The logic behind these three formats is that many respondents will answer a question if a "no opinion" choice is missing but will pick "don't know" when we offer it, or will say they do not have an opinion if asked directly. These respondents are floaters because they "float" from responding to questions they understand and have knowledge about to responding to questions about which they have no knowledge and do not understand. Minor wording changes are likely to change their answers. Quasi-filter or full-filter questions help screen out floaters. Filtered questions may not eliminate all answering about nonexistent issues, but they reduce the problem.

Floaters: Survey research respondents without the knowledge or an opinion to answer a survey question but who answer it anyway, often giving inconsistent answers.

Recency effect: A result in survey research that occurs when respondents choose the last response offered rather than earlier ones.

Middle alternative floaters will choose a middle position when we offer it but another alternative if we do not. They feel ambivalent or less intense about the issue, and their answers tend to show a recency effect; that is, they choose the last alternative offered. The recency effect suggests that we should present responses on a continuum and place the neutral position in the middle.

Attitudes have two aspects: direction (for or against) and intensity (strongly held or weakly held). For example, two respondents both oppose abortion. One is fiercely attached to the opinion and strongly committed to it; the other holds the opinion weakly and is wavering. If we ask only an
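A full-filter item is, in practice, skip logic plus tabulation. The sketch below is purely illustrative: the function names and the response counts are invented for this example (they are not the Schuman and Presser data). It shows how full-filter routing might be scripted and how recorded answers could be turned into a percentage distribution like the one in Expansion Box 6.

```python
from collections import Counter

def ask_full_filter(has_opinion, direction=None):
    """Full-filter routing: only respondents who say they have an
    opinion get the agree/disagree follow-up; the rest are recorded
    as 'no opinion' and skipped ahead to the next question."""
    if not has_opinion:
        return "no opinion"
    if direction not in ("agree", "disagree"):
        raise ValueError("follow-up must be 'agree' or 'disagree'")
    return direction

def tabulate(responses):
    """Percentage distribution of recorded answers, rounded to 0.1."""
    counts = Counter(responses)
    total = sum(counts.values())
    return {answer: round(100.0 * n / total, 1) for answer, n in counts.items()}

# Hypothetical respondents as (has_opinion, direction) pairs; the
# counts below are invented for illustration only.
respondents = ([(True, "agree")] * 23 + [(True, "disagree")] * 21
               + [(False, None)] * 56)
recorded = [ask_full_filter(h, d) for h, d in respondents]
print(tabulate(recorded))  # {'agree': 23.0, 'disagree': 21.0, 'no opinion': 56.0}
```

A quasi-filter item would differ only in that "no opinion" is read aloud as an explicit choice, while the standard format records it only when a respondent volunteers it.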