
Summary

This document provides an introduction to research design, differentiating between basic and applied research. It explores qualitative and quantitative approaches, including various data collection tools and methods. The text also includes information on ethical considerations in research.


Research design introduction 1

Research is a systematic and "objective" attempt to study a problem for the purpose of deriving general principles. A researcher seeks exhaustively:
1. More explanations
2. Verifiable truths
3. To make discoveries

According to its specific goal, research is classified into:
1. Basic or pure research (problem understanding and knowledge creation; induction)
2. Applied research (problem solving; deduction)

The difference between applied and basic research is straightforward: the findings of applied research can be applied to resolve issues, whereas fundamental studies are used to explore certain issues and elements.
1. Difference in purpose. The purpose of applied studies is closely associated with the solution of specific problems, while the purpose of fundamental studies relates to the creation of new knowledge, or the expansion of current knowledge, without any concern for applicability.
2. Difference in context. In applied studies, research objectives are set by clients or sponsors as a solution to a specific problem they are facing. Fundamental studies, on the other hand, are usually self-initiated in order to expand the level of knowledge in certain areas.
3. Difference in methods. Research validity is an important point to be addressed in all types of studies. Nevertheless, applied studies are usually more concerned with external validity, whereas internal validity is typically the main point of concern for fundamental researchers.

Applied research is defined as research used to answer a specific question, determine why something failed or succeeded, solve a specific pragmatic problem related to product development, or gain a better understanding.

Applied research: real issue -> research -> intervention/action
1. The presence of a real social problem and of an external client
2. The development of a "problem -> research -> problem" sequence to favour/support an intervention (building knowledge to guide actions)
3. A flexible articulation of objectives and methodology, oriented by the research towards the best answers to the problem

Applied research examines a specific set of circumstances, and its ultimate goal is relating the results to a particular situation; that is, it uses the data directly for real-world application. It studies the relationship and applicability of theories or principles to the solution of a problem. The level and type of involvement of faculty researchers can differ based on the scope of work. Applied research is designed to solve practical problems of the modern world rather than to acquire knowledge for its own sake; it is usually performed by consultants or professionals (rather than academics) and is typically motivated by the need to solve a specific problem in a particular organisation. The goal of an applied scientist is to improve the human condition.

Examples:
- Researching which strategies work best to motivate workers
- Investigating which treatment approach is the most effective for reducing anxiety
- A hospital might conduct applied research on how to prepare patients for certain types of surgical procedures

Research design -> qualitative (interviews) / quantitative (surveys)

Data collection tools
Qualitative research tools fall into two main categories:
1. Direct approach to phenomena: behaviour observation -> conversations/exchanges -> analysis of texts, speeches, stories, conversations
2. Mediated approach to phenomena: focus on individuals (interview, questionnaire) -> focus on groups (focus group, ideative group)

TOOLS FOR DATA COLLECTION: CHOOSING OR CREATING NEW ONES?
1. Choice: select within the opportunities offered by the literature and previous research
2. Construction: generate a tool from scratch
- Apart from formal differences, both options refer to a similar conceptual process (described, for a better understanding, in the situation of construction).

Research Design 2

Qualitative research focuses on understanding deep, subjective meanings and experiences. It uses:
- Words as data
- Interviews, focus groups, observations
- Small, non-representative samples
- Data interpreted and described in detail, focusing on the essence of things

Quantitative research focuses on measurement and generalisation, using:
- Numbers and statistics
- Surveys, experiments
- Larger, representative samples
- Formal data analysis for statistical verification

Differences
- Epistemological roots:
  - Qualitative: constructionism (reality is socially constructed)
  - Quantitative: positivism (objective reality measured and quantified)
- Research focus:
  - Qualitative: meanings, deep feelings, behaviours
  - Quantitative: facts, opinions, behaviours measured statistically

Mixed methods research
- Combines qualitative and quantitative approaches in either sequential or concurrent phases.
- Concurrent design: both methods are used simultaneously to cover different aspects of the research problem.
- Sequential design: phases are used progressively, e.g., qualitative research to explore a phenomenon, followed by quantitative research to generalise the findings.

Research questions
- Qualitative questions focus on "why" and "how", to explore processes or meanings.
  - Example: Why do people choose sustainable brands?
- Quantitative questions focus on "how many" and "what", aimed at quantifying relationships.
  - Example: How many people are satisfied with public transport services?

Sampling techniques
- Qualitative sampling (non-probabilistic):
  - Methods include extreme cases, intensive experiences, maximum variability, critical cases.
  - Focuses on the quality and richness of data, not generalisability.
  - Sample size depends on theoretical saturation: the point at which new data no longer brings additional insights.
- Quantitative sampling (probabilistic):
  - Strategies include:
    - Simple random sampling: every individual has an equal chance of being selected.
    - Systematic sampling: selects every nth element after a random starting point.
    - Stratified sampling: divides the population into subgroups and randomly selects from each.
    - Cluster sampling: clusters are randomly selected, and data is gathered from within those clusters.
  - The goal is representativeness and generalisation to the population (a small sketch of these strategies follows at the end of this section).

Sampling strategies and size in quantitative research
- Probabilistic sampling ensures representativeness and allows the findings to be generalised to the broader population.
- Sample size should consider:
  - Accuracy: dependent on sample size and sampling strategy.
  - Representativeness: the extent to which the sample reflects the population.
- Larger samples reduce sampling error, but representativeness is crucial, even for smaller samples.
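As an illustration of the probabilistic strategies listed above, the sketch below draws simple random, systematic, and stratified samples from a toy population using only the Python standard library. The population, group labels, and sample size are invented for the example and are not part of the notes.

```python
import random
from collections import defaultdict

# Toy population: 1,000 people tagged with an age group (invented for the example).
population = [{"id": i, "age_group": random.choice(["18-29", "30-49", "50+"])}
              for i in range(1000)]
n = 100  # desired sample size

# Simple random sampling: every individual has an equal chance of selection.
simple_random = random.sample(population, n)

# Systematic sampling: every k-th element after a random starting point.
k = len(population) // n
start = random.randrange(k)
systematic = population[start::k][:n]

# Stratified sampling: split into subgroups, then sample each proportionally.
strata = defaultdict(list)
for person in population:
    strata[person["age_group"]].append(person)
stratified = []
for group, members in strata.items():
    share = round(n * len(members) / len(population))  # proportional allocation (may be off by 1 after rounding)
    stratified.extend(random.sample(members, share))

print(len(simple_random), len(systematic), len(stratified))
```

The design choice mirrors the notes: the stratified draw keeps each subgroup's share of the sample close to its share of the population, which is what protects representativeness when the strata differ in size.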
Exercise example: preventing unhealthy eating habits
- Research objectives:
  - Identify the target audience for communication.
  - Detect the best media channels for the campaign.
  - Gather insights for communication style and content.
- Research design:
  - Phase 1 (qualitative): focus groups to explore people's views on unhealthy eating.
  - Phase 2 (quantitative): surveys to quantify the prevalence of unhealthy eating habits and media preferences.
- Population: inclusion criteria could be based on demographics (age, income, education, etc.) and health behaviours.
- Sampling:
  - Phase 1: non-probabilistic sampling (e.g., quota sampling) for the focus groups.
  - Phase 2: probabilistic sampling (e.g., stratified sampling) for the surveys.

Developing a data collection tool
- Goal: to collect the information needed to answer your research questions.
- Steps to develop a tool:
  1. Define the rationale: identify the information units (topics) needed to answer the research aims.
  2. Establish a sequence: organise these topics logically.
  3. Articulate the guideline/questionnaire: turn these topics into questions.
  4. Pilot phase: test the tool to ensure it is effective before full use.

Step-by-step breakdown
a) Define a rationale:
- The rationale is the conceptual structure of your data collection tool.
- Break the research topics down into subtopics and then further into minimum units of information.
- At this stage, focus on what you need to ask, not on how you will ask it.
- The rationale acts as an index to ensure all necessary areas are covered.
- Quantitative research requires a finished rationale: everything is clearly defined upfront.
- Qualitative research may work with an unfinished rationale, meaning some areas will evolve during fieldwork.

Example:
- For a study on unhealthy eating, the rationale should cover:
  - Size and description of the phenomenon (e.g., how many people consume unhealthy food).
  - Knowledge, opinions, and attitudes towards unhealthy eating.
  - Educational and informational needs on the topic.

Importance of the rationale:
- Without a clear rationale, your tool may have missing questions, the wrong flow, or irrelevant content.
- Start with a draft rationale and refine it according to your research objectives.

Example: prototype rationale for the unhealthy eating study (a data-structure sketch appears at the end of this section):
1. Behaviour: focus on how often people engage in unhealthy eating.
2. Knowledge, opinions, attitudes: explore what people think and feel about their eating habits.
3. Classification descriptors: collect demographic information to profile at-risk groups.

Questionnaire structure:
- A good questionnaire should have a logical flow, and questions should be grouped into categories based on the rationale:
  - Behavioural questions: how frequently do people consume unhealthy food?
  - Opinion questions: what is their attitude towards healthy eating campaigns?
  - Descriptive questions: demographic details such as age, gender, income.

Pilot testing:
- Before using the tool in a full study, conduct a pilot phase to test for clarity, relevance, and flow.
- The pilot phase helps identify issues and allows the tool to be refined to improve its effectiveness.

Differences in approaches:
- Quantitative approach: develop a fully explicit rationale, ensuring every piece of data to be collected is predefined.
- Qualitative approach: the rationale may evolve during the research process, allowing flexibility to probe deeper into certain areas based on participant responses.
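One way to make the rationale concrete before writing any questions is to hold it as a simple nested structure, so each topic maps to its minimum units of information. The sketch below does this in Python for the prototype rationale above; the topic names come from the notes, while the individual information units are illustrative assumptions.

```python
# A draft rationale held as plain data: topics -> minimum units of information.
# The three topics mirror the prototype rationale for the unhealthy-eating study;
# the wording of the units is illustrative, not prescribed by the notes.
rationale = {
    "behaviour": [
        "frequency of unhealthy-food consumption",
        "typical contexts of consumption (home, work, out)",
    ],
    "knowledge_opinions_attitudes": [
        "perceived health risks of unhealthy eating",
        "attitude towards healthy-eating campaigns",
    ],
    "classification_descriptors": [
        "age", "gender", "income", "education",
    ],
}

# The rationale acts as an index: checking coverage is just iterating over it.
for topic, units in rationale.items():
    print(f"{topic}: {len(units)} information units")
```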
Group exercise example:
- Research topic: unhealthy eating habits.
- Objective: understand the prevalence, attitudes, and knowledge about unhealthy eating.
- Step 1: break the research question into clear subtopics, such as behaviour patterns, demographic profiles, and educational needs.
- Step 2: develop specific questions based on these subtopics, ensuring that each research aim is addressed.

Developing good questions
1. Instruments for data collection
- Interviews
- Questionnaires
- Focus groups
- Ideative groups
- Each instrument serves as a way to ask questions and gather answers from respondents.
2. Formulation of questions (lexical issues):
- The goal is to create clear, unambiguous questions that provide reliable data.
- Avoid vague or misleading wording.
Formulation of answers (metric issues):
- Decide what kind of data to collect (open or closed questions, different scales).
- The answer type should align with the research objective (quantitative or qualitative).
3. Types of questions
- Specific questions: focused on immediate, concrete actions or timeframes (e.g., "In the last week, how many times did you exercise?").
  - They produce more reliable and informative responses.
- General questions: broader, covering larger periods (e.g., "In the past year, have you exercised regularly?").
  - Less reliable, due to the cognitive challenge and memory recall issues.
4. Priority tips:
- Use short, simple sentences (subject -> verb -> object).
- Introduce the topic first, and then ask the question.
- Use direct interrogation (avoid negative phrasing or "not" in questions).
- Use colloquial language to make questions understandable.
- Frame questions within a specific context or timeframe.
- Example: instead of asking "Do you like sports?", ask "In the last month, how many times did you engage in sports activities?"
5. Common mistakes to avoid
- Prejudicial language: avoid language that may insult or bias the respondent.
  - Example: instead of "What do you think about people defacing city walls?", ask "What do you think about people writing on city walls?"
- Leading questions: do not suggest a particular answer.
  - Example: "Don't you think the government is doing a good job?" leads the respondent to agree.
- Technical terms and jargon: use plain language that everyone can understand.
  - Avoid terms like "habeas corpus" or jargon that might confuse respondents.
- Vague or ambiguous questions: make sure terms are universally understood.
  - Example: instead of asking "How important is it that a brand shares your values?", ask "How important is it that a brand supports sustainability as a value?"
- Double-barrelled questions: only ask one thing per question.
  - Example: "Do you like chocolate and ice cream?" asks two things in one question.
- Assumptive questions: avoid assumptions about beliefs or behaviours.
  - Example: "Are you worried about the increasing crime rate?" assumes the respondent is worried.
- Hypothetical questions: avoid questions that ask for speculative answers, as these do not predict behaviour accurately.
- Evaluative questions: don't make people feel judged.
  - Example: avoid asking "Do you think it's wrong to...?"
6. To sum up: dos and don'ts
- Avoid:
  - Technical or slang terms.
  - Vague or ambiguous terms.
  - Double-barrelled questions.
  - Assumptive or evaluative questions.
- Use:
  - Justificatory premises for sensitive questions (e.g., "Some people find it difficult to quit smoking; do you agree?").
  - Temporal framing (e.g., "In the past week, how many times...?").
  - Concrete examples to provide context.
7. Examples of poorly formulated questions
- Poor example: "Do you think the government is doing a great job?"
  - Problem: leading question.
  - Improved version: "Can you tell me what you think about the government's performance?"
- Poor example: "Don't you think the banking system is untrustworthy?"
  - Problem: negative phrasing and leading.
  - Improved version: "Do you think the banking system is trustworthy?"
- Poor example: "Do you read newspapers?"
  - Problem: too vague.
  - Improved version: "In the past week, how often did you read a national newspaper?"

Introduction to qualitative research

Qualitative research tools:
- Use direct and mediated approaches to understand phenomena.
- Methods include:
  - Observation of practices and exchanges.
  - Analysis of texts, conversations, and stories.
  - Interviews: individual interviews, focus groups, ideative/creative techniques.

Plurality of tools:
- Observation, analysis, interviews, focus groups, and creative techniques help in understanding behaviours, stories, and conversations. These tools provide deeper insight into social phenomena through non-numerical data.

What is qualitative interviewing?
- A qualitative interview is a professional interaction that follows specific rules and techniques. It involves an exchange of views based on trust and a common interest in producing knowledge.
- According to Corbetta (1999), a qualitative interview is a conversation promoted by an interviewer with subjects selected on the basis of a data collection plan. The interview is flexible and not standardised, encouraging the interviewee to provide detailed narratives.

Why use qualitative interviewing?
- Storytelling: humans have a natural storytelling ability, which can lead to astonishing results when nurtured.
- It is a personal and intimate encounter in which the interviewer asks open, direct questions, eliciting detailed stories.

Key characteristics:
- The questions asked are not predefined.
- Interviewees answer in their own words, offering unique discourses.
- The information gathered is not treated with statistical procedures but is analysed qualitatively.

Types of qualitative interviews
1. Unstructured interviews:
- Focus on the subject.
- Non-directive style: the interviewer builds rapport and encourages the respondent to express themselves freely. There is no structured guide; instead, the interview evolves organically.
2. Semi-structured interviews:
- A balance between open-ended and guided questions.
- The interviewer has a set of predetermined topics but remains flexible, allowing the interviewee's responses to shape the conversation.
- Focus on both object and subject.
3. Structured interviews:
- Directive style with predetermined questions.
- The interview focuses on specific topics, and the data collection is standardised.

Rationale for interview design: case study on unhealthy eating
- Objective: understand the behaviour and attitudes of people with unhealthy eating habits, with the goal of identifying patterns that could inform communication strategies or interventions.
- Target group: the Italian population, particularly those exhibiting unhealthy eating behaviours.
- Key topics to explore:
  - Attitudes towards unhealthy eating.
  - Personal experiences and beliefs about unhealthy food.
  - Risk perception: how do people perceive the risks associated with their eating habits?
  - Knowledge and awareness about unhealthy eating.
  - Information needs: what gaps in knowledge exist, and how can they be addressed through communication?
- Methodology:
  - Combine quantitative surveys with qualitative interviews for a comprehensive understanding.
  - Use focus groups and in-depth interviews to explore opinions, needs, and insights that can shape health campaigns or interventions.

Carrying out qualitative interviews

Key points for conducting qualitative interviews
1. Preparing for the interview:
- Do not begin right away: start with a friendly greeting and explain the process. Present the interviewer as a learner ("cultural ignorance") to position yourself as open and curious.
- Begin with easy-to-answer questions and gradually move to more complex or personal topics. This technique, known as funnelling, helps the interviewee feel comfortable.
- Move from general to focused topics.
2. Listening and engaging:
- Interviews should feel like a friendly conversation rather than a strict Q&A. However, remain neutral: do not approve or disapprove of the interviewee's answers.
- Use the interviewee's own language to ask follow-up questions and encourage them to expand on their answers. Use prompts like:
  - "Describe..."
  - "Tell me about..."
- Do not shift to a new topic until you feel the current one has been thoroughly explored.
3. Questioning techniques:
- Ask one question at a time.
- Ensure questions are clear and avoid ambiguous or jargon-laden phrases.
- Avoid terms that suggest an evaluation or judgement, as these can bias the responses.
- Use broad, expansive questions: open-ended questions encourage participants to share information that might not have been anticipated, often revealing important insights.
4. The art of probing:
- Effective probing is crucial for extracting deeper insights without influencing the responses. Techniques include:
  - Silent probe: remain silent and let the interviewee continue, often when they sense you are writing or reflecting on their previous answer.
  - Echo probe: repeat the last thing the interviewee said and ask them to elaborate.
  - Uh-huh probe: use affirmative noises like "Uh-huh", "I see", or "Right" to encourage them to continue speaking.
- Probing helps uncover new meanings while keeping the interviewer's influence minimal. The key is to let the informant lead the conversation while staying within the overall topic.
5. Maintaining a fluid conversation:
- Ensure smooth transitions between topics to keep the conversation natural and logical.
- The interview should remain conversational, with the interviewer acting primarily as a listener, not rushing the pace or pushing for answers.
- Let the informant determine the direction of the conversation, staying aligned with the research objectives but allowing flexibility for unexpected insights.

Common pitfalls and final tips:
1. Background noise can affect the quality of recordings: choose a quiet setting.
2. Faulty equipment can ruin valuable data: ensure all recording devices are working properly.
3. Participants sometimes say important things after the recording is turned off: keep listening even after the formal interview concludes.
4. Interviews may not always last the full time allocated: be prepared to adapt to the flow of the conversation.
5. Pay attention to non-verbal cues: gestures, pauses, silence, or laughter can provide additional insights.
6. Transcribing is time-consuming (roughly 3 hours of transcription for 1 hour of interview) but is crucial for thorough data analysis.
Practical activity:
- Students are encouraged to practise interviewing in class:
  1. Groups select an interviewer and an interviewee, with the others acting as observers.
  2. Conduct a non-structured interview on a chosen topic, focusing on communication and relational strategies.
  3. Observers note the interviewer's strategies and provide insights into the dynamics of the interview.

Theory and techniques of group interviews
- Group interviews involve gathering multiple people in one setting to explore their opinions, attitudes, and perceptions. They serve to:
  - Collect group-based insights and understand social interactions.
  - Analyse how ideas evolve through group discussion, highlighting consensus and negotiation processes.
- Key formats:
  - Focus groups: structured for insights into group consensus.
  - Ideative/creative groups: designed for brainstorming and cooperative problem-solving without social comparisons.

Key characteristics of group techniques
- Dual role of the group setting:
  - The group acts as both a container (holding content) and an influencer (shaping ideas).
  - According to Lewin's theory, a group generates unique insights that cannot be obtained from individuals alone.
- Applications:
  - Used to uncover insights, develop hypotheses, or refine research instruments.
  - Effective in studying social norms and the formation of collective attitudes.

Benefits of group techniques
- Exploratory value: useful in early research stages to shape the study design.
- Insight into processes: provides a deeper understanding of collective thoughts and emotions.
- Hypothesis formation: helps generate hypotheses in areas where they are not yet established.
- Real-world dynamics: the partial structure allows for more "natural" data collection that reflects real social exchanges.

Focus groups: structure and conduct
- Style of moderation:
  - Moderation should be adaptable, balancing structured guidance with openness to the participants' flow.
  - Techniques include free associations, analogies, and projective techniques to elicit authentic responses.
- Group composition:
  - Homogeneous groups (similar demographic or social backgrounds) enhance focus and comfort.
  - Typically, 6-8 participants per group and 2-4 groups per project are ideal for sufficient data.
- Session duration: generally 2.5 to 4 hours.

Setting up the group interview environment
- Comfort and confidentiality: arrange seating to make participants comfortable and to facilitate observation and recording.
- Recording and assistance: ensure a clear recording process and consider an assistant to manage logistics or record observations.

How to conduct a group discussion
Step 1: Conceptualisation and planning
- Define the research purpose and the target population.
- Outline procedural details such as timelines, costs, recruitment methods, and any incentives.
- Develop a screener for recruiting participants and select an appropriate venue.
Step 2: Interviewing
- Develop a topic guide to structure the questions.
- Select moderation techniques to match the topic:
  - Use projective techniques (e.g., free associations, analogies) to explore analogical thinking.
  - Use role-playing to help participants "re-live" experiences.
- Record the discussion, encourage openness, and allow diverse viewpoints.
Step 3: Analysis and reporting
- Analyse relationship dynamics and content.
- Record non-verbal cues and relational aspects that add context to the verbal responses.
Interview techniques for eliciting responses
- Deep emotional and analogical exploration:
  - Free associations: prompt spontaneous thoughts related to a topic.
  - Analogies: encourage participants to describe something as a colour, animal, or landscape, facilitating metaphorical thinking.
  - Projective techniques: use hypothetical or imaginative prompts to elicit deeper insights.
- Needs analysis and idea generation:
  - Divergent thinking: individual and subgroup brainstorming, creating a wide range of ideas.
  - Convergent thinking: organising, ranking, and prioritising ideas collaboratively to focus on key themes.

Moderation techniques
- Successful group interviews depend on skilled moderation:
  - Establishing trust: creating an environment where participants feel comfortable sharing honest opinions.
  - Managing dynamics: encouraging quieter members to speak and managing dominant personalities to ensure balanced input.
  - Permissive environment: ensures there is no pressure to agree or reach consensus, enabling genuine, diverse insights.

Practical applications of group interviews
- Pre-study stages: generate insights or hypotheses.
- Ongoing studies: develop or refine study tools, or explore programme effectiveness.
- Post-study evaluation: assess programme impact and generate ideas for further research or improvements.

Quantitative approach: surveys and questionnaires

Questionnaires: a structured set of questions designed by the researcher. A questionnaire can include:
- Closed questions with predefined answer options.
- Open questions that allow respondents to give their own answers.

Historical context:
- Invented in the 19th century by Sir Francis Galton.
- Developed during the Victorian era, a time of scientific and industrial revolutions and a strong belief in quantifying reality.

Philosophical basis:
- Early questionnaires were seen as neutral, objective tools.
- Modern perspectives, however, recognise that the design reflects the subjectivity of the researcher.

Important note:
- A questionnaire does not purely record reality: it frames reality through the lens of the research goals.

Purpose of surveys
- Surveys are used to gather data about specific issues or conditions within a target population. They are based on the desire to collect information about a well-defined issue or situation from a well-defined population, using a predetermined set of questions.
- Key purposes:
  - Describe: document current attitudes or behaviours.
  - Profile: identify characteristics within a population.
  - Explain: explore relationships between variables.
  - Identify and solve problems: pinpoint issues and potential solutions.
  - Measure change: track changes over time, often with longitudinal studies.

Types of surveys
- Descriptive surveys:
  - Aim to capture current conditions, preferences, or characteristics within a population.
  - Examples: surveys to assess lifestyle choices or audience preferences.
- Analytical surveys:
  - Go beyond description to explore relationships between two or more variables.
  - Examples: examining how lifestyle affects media habits, or the impact of gaming on adolescents.

Survey research designs
- Cross-sectional studies:
  - Capture data from a population at a single point in time.
  - Useful for describing characteristics, but cannot establish causality.
- Before-and-after studies:
  - Measure changes resulting from an intervention (e.g., before and after a training programme).
  - Ideal for evaluating programme effectiveness.
- Longitudinal studies:
  - Follow the same sample over time to track changes.
  - Effective for studying natural developments or trends.

The survey development process
Step 1: Identify the study focus and objectives
- Clearly define the research purpose and expected outcomes.
- Establish specific goals for what the survey should achieve.
Step 2: Establish an information base
- Gather relevant background information to inform the survey's scope and questions.
- Consider previous studies or theories related to the topic.
Step 3: Determine the sampling frame, size, and selection
- Define the sampling frame (target population).
- Choose a sample size that balances accuracy with feasibility.
- Use a selection method (random sampling, stratified sampling, etc.) to ensure representativeness.
Step 4: Design the survey instrument
- Develop questions that align with the research objectives and are easy to understand.
- Decide between open-ended (qualitative) and closed (quantitative) questions depending on the data needed.
Step 5: Pretest the survey instrument
- Conduct a pilot test with a smaller sample to check for clarity, flow, and effectiveness.
- Refine questions based on feedback from the pilot phase.
Step 6: Implement the survey
- Choose a delivery method (online, in person, mail, etc.).
- Ensure participants have access and understand the procedure.
Step 7: Analyse the data
- Enter the data, clean it for errors, and analyse it statistically if necessary.
- Use statistical methods appropriate to the research question (e.g., frequency analysis, correlation); a small worked sketch appears at the end of this section.
Step 8: Interpret the results
- Move from numerical results to meaningful interpretations.
- Extract insights that explain the observed data in terms of the research objectives.
Step 9: Communicate the results
- Present the findings in a clear, accessible format.
- Tailor the communication style to the intended audience, emphasising insights and implications.

Key considerations for effective surveys
- Validity: ensure the survey measures what it intends to measure.
- Reliability: achieve consistent results over repeated administrations or samples.
- Clear language: use straightforward, neutral language to avoid misunderstandings.
- Avoid bias: design questions that are free from leading or prejudicial language.

Examples of common survey applications
- Audience surveys: assess the preferences or demographics of a specific group (e.g., media preferences).
- Customer satisfaction surveys: measure satisfaction with products or services.
- Employee surveys: evaluate workplace satisfaction, engagement, or the effectiveness of initiatives.
- Health surveys: track public health behaviours, such as vaccination rates or exercise habits.

Developing good answers

Formulation of answers: open vs. closed responses
- Open-ended answers:
  - Require respondents to think and express their views independently, allowing them to elaborate freely.
  - Provide insight into the issues that are salient from the respondent's perspective.
  - May have higher non-response rates and social desirability bias.
  - Require post-survey coding by the researcher.
- Closed answers:
  - Provide a list of response options, guiding the respondent's answer.
  - Simplify analysis, as responses are already structured.
  - Allow for consistent data collection and analysis.
  - Require advance coding by the researcher to ensure all relevant options are available.
- Intermediate option: pre-coded spontaneous answers, where the structure is predefined but not entirely restrictive.
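To make Step 7 of the survey development process above more concrete, the sketch below computes a frequency table and a simple correlation on an invented response file with pandas. The file name and column names are assumptions for the example, not part of the notes.

```python
import pandas as pd

# Invented example data: one row per respondent (the file and columns are assumptions).
responses = pd.read_csv("survey_responses.csv")

# Clean: drop rows where the key answers are missing.
responses = responses.dropna(subset=["fast_food_per_week", "age", "satisfaction"])

# Frequency analysis: how often respondents report eating fast food per week.
frequencies = responses["fast_food_per_week"].value_counts().sort_index()
print(frequencies)

# Share of respondents in each category, as percentages.
print((frequencies / len(responses) * 100).round(1))

# Correlation between two numeric variables (e.g., age and reported satisfaction).
print(responses["age"].corr(responses["satisfaction"]))
```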
Designing closed answer choices
- Requirements for closed answers:
  - Single concept focus: each question should address only one concept (e.g., preference, frequency).
  - Exhaustive options: ensure each respondent can find an answer that reflects their view.
  - Mutually exclusive options: options should not overlap.
  - Consistent specificity: all options should be at the same level of detail.
- Additional response options:
  - Include "I don't know" or "Prefer not to answer" for sensitive questions.
  - Use an "Other, please specify" option if needed, especially for exploratory questions.

Types of scales in questionnaires
1. Likert scales:
- Measure attitudes or agreement levels using options from Strongly Disagree to Strongly Agree.
- The number of response points (odd or even) can influence neutrality or require a lean towards agreement or disagreement.
- A small scoring sketch follows at the end of this section.
2. Self-anchoring scales:
- Often used without labelled intervals, relying on visual cues.
- Allow respondents to gauge their answers against personal standards (e.g., a health scale from "poor" to "excellent").
3. Semantic differential scales:
- Measure the meanings attributed to a concept, using opposing adjectives (e.g., "Good" vs. "Bad").

Survey filters
- Purpose: filter questions streamline the survey by branching questions based on prior answers.
- Example: "Do you smoke?" -> if "Yes", then ask "How many cigarettes per day?"
- Challenges:
  - Filters can complicate the survey flow.
  - They may halve the sample size if separate paths are created, affecting sample consistency.

Introducing a questionnaire
- Components of a good introduction:
  1. Thank-you statement: makes respondents feel valued.
  2. Study purpose: clarifies the survey's objective and who is conducting it.
  3. Time estimate: sets expectations for the completion time.
  4. Confidentiality assurance: reinforces trust by ensuring anonymity.

Common mistakes to avoid in questionnaire design
- Leading questions:
  - Avoid strong or emotionally charged language that may bias responses.
  - Example: replace "The government should force taxes" with "The government should increase taxes."
- Overlapping or non-mutually exclusive options:
  - Ensure that each choice is distinct.
  - Example issue: age ranges like "20-30" and "30-40" overlap at 30. Correct this by using "20-29" and "30-39".
- Vague questions:
  - Avoid general questions that could be interpreted in multiple ways.
  - Example: "What suggestions do you have for improving Tom's Tomato Juice?" Specify whether it is about taste, packaging, etc.
- Sensitive or intrusive questions:
  - For private or potentially uncomfortable topics, include a "Prefer not to answer" option.
  - Example: questions on income, political beliefs, or health.
- Incomplete answer choices:
  - Include all relevant options or allow an "Other" option.
  - Conduct a pretest: if over 10% of responses fall under "Other", consider expanding the options.
- Double-barrelled questions:
  - Avoid questions that ask two things simultaneously.
  - Example: "What is the fastest and most economical internet service for you?" Separate this into two distinct questions.
- Questionnaire length:
  - Keep it concise to avoid participant fatigue, which can lower response quality and completion rates.
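As a minimal illustration of how Likert responses are handled at the analysis stage, the sketch below maps answer labels to numeric codes, reverse-scores a negatively worded item, and averages the items into a scale score. The item names and the reverse-scoring choice are assumptions made for the example.

```python
# Map 5-point Likert labels to numeric codes (1 = Strongly Disagree ... 5 = Strongly Agree).
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3, "Agree": 4, "Strongly Agree": 5}

# One respondent's answers to a hypothetical 3-item attitude scale.
answers = {
    "healthy_food_is_important": "Agree",
    "campaigns_are_useful": "Neutral",
    "healthy_eating_is_a_waste_of_time": "Disagree",  # negatively worded item
}

# Items worded in the opposite direction must be reverse-scored (6 - code on a 5-point scale).
reverse_items = {"healthy_eating_is_a_waste_of_time"}

scores = []
for item, label in answers.items():
    code = LIKERT[label]
    if item in reverse_items:
        code = 6 - code
    scores.append(code)

scale_score = sum(scores) / len(scores)  # mean of the coded items
print(round(scale_score, 2))
```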
Practice exercises
- Identify problems and revise questions:
  - Example exercise: "Do you enjoy practising sports and spending time outdoors?" This is double-barrelled. Revise it to "Do you enjoy practising sports?" and a separate question, "Do you enjoy spending time outdoors?"
- Developing questions for a specific study:
  - Use clear objectives to design relevant questions, such as for a study of unhealthy eating habits among students. Create questions that address behaviour (e.g., frequency of fast-food consumption), attitudes, and educational needs.

Pilot phase and implementation

Pilot phase: purpose and importance
- The pilot phase involves testing the research instrument (questionnaire, interview guide) on a small sample representative of the target population.
- Purpose:
  - Detect and correct issues in the instrument.
  - Ensure the tool effectively collects the intended data.
  - Prevent errors that could disrupt the actual study.

Evaluation criteria in the pilot phase:
- Sustainability: duration and intrusiveness of the survey.
- Sequence and flow: check for interference effects between questions.
- Comprehensibility: ensure vocabulary and phrasing are clear.
- Social appropriateness: avoid embarrassment or prestige bias.
- Attention and fatigue: monitor participant engagement throughout.
- Scale effects: check for asymmetry or issues with Likert or other scales.
- Coverage: confirm that all relevant information is addressed.
- Technical aspects: ensure all logistical tools (e.g., online platforms) function correctly.

How to conduct a pilot phase
- Steps:
  1. Administer the instrument to a small group (usually around 10 participants).
  2. Ensure participants share characteristics with the target population.
  3. Administer it in the presence of the researcher (for feedback collection).
  4. Collect feedback on:
     - Completion time.
     - Difficulty of the questions.
     - Suggestions for improvement.
- Cognitive pretesting techniques:
  - Think-aloud method: participants verbalise their thoughts while completing the survey.
  - Probing: researchers ask follow-up questions to clarify participant responses.
  - Confidence rating: participants rate how confident they are about their answers.
- Outcome: revise the instrument based on pilot feedback to ensure it functions as intended.

Questionnaire administration: methods
- Types of administration:
  1. Personal (PAPI/CAPI): conducted in person with assistance.
  2. Telephone (CATI): conducted over the phone, with responses recorded.
  3. Internet (CAWI): self-administered online.
  4. Postal surveys: paper-based surveys distributed and returned by mail.
  5. Pen-and-paper self-administered surveys: participants complete the survey independently on paper.

Comparison of administration methods
- CAPI (computer-assisted personal interviews):
  - Conducted in person; the researcher asks questions using a computer.
  - Advantages: high response rate, fast data transfer, minimal interaction bias.
  - Disadvantages: time-consuming, expensive, potential for interviewer influence.
- CATI (computer-assisted telephone interviews):
  - Phone-based interviews recorded on a computer.
  - Advantages: rapid data collection, relative anonymity.
  - Disadvantages: high non-response rate, limited duration and complexity.
- CAWI (online surveys):
  - Conducted online via platforms like Qualtrics or Google Forms.
  - Advantages: cost-effective, large reach, flexible timing for participants.
  - Disadvantages: sampling issues, self-selection bias, potential non-response.
- Key considerations:
  - Evaluate costs, time, response rates, privacy, and the complexity of the questionnaire.
  - Consider participant convenience and the influence of social desirability on answers.
Platforms for online surveys
- Professional solutions: require licences (e.g., Qualtrics) and allow complex designs with randomisation and control.
- Freemium tools: offer basic functionality for free; advanced features are available at a cost.
- Free platforms (e.g., Google Forms):
  - Simple, user-friendly, and sufficient for many projects.
  - Limited advanced functionality and graphical customisation.

Practical features in questionnaire platforms
- Basic interface:
  - Add and customise questions.
  - Use titles and descriptions for clarity and section transitions.
- Question types:
  - Multiple choice: single or multiple answers.
  - Open-ended: short or paragraph responses for qualitative data.
  - Likert scales: agreement or frequency scales (e.g., 5-7 steps).
- Additional options:
  - Mandatory questions: ensure essential data is collected, but balance this against participant convenience.
  - Filters: use logic (e.g., "If Yes, then ask...") for tailored question paths (a branching sketch follows at the end of this section).
- Design and distribution:
  - Customise graphics and format.
  - Distribute via anonymous links to ensure privacy.

Steps in research phases
1. Identify the research problem: define what the research aims to address.
2. Formulate objectives: specify what the study seeks to achieve.
3. Choose a research design: decide between qualitative, quantitative, or mixed methods.
4. Define tools: develop instruments such as questionnaires or interview guides.
5. Set conditions: determine the mode of instrument delivery.
6. Select a sample: identify and recruit the target population.
7. Collect data: implement the survey or interviews.
8. Process data: clean and organise responses for analysis.
9. Analyse data: extract insights and patterns from the responses.
10. Communicate results: present findings to stakeholders or the research audience.

Advantages and disadvantages of administration methods
- Face-to-face:
  - Advantages: rich data collection, high response rates.
  - Disadvantages: high costs, interviewer bias.
- Online:
  - Advantages: cost-effective, rapid implementation, large reach.
  - Disadvantages: risk of non-response and self-selection bias.
- Telephone:
  - Advantages: quick and relatively anonymous.
  - Disadvantages: distrust from respondents, high refusal rates.
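The filter logic mentioned above ("If Yes, then ask...") can be sketched as a tiny branching flow. The questions, answer options, and routing below are illustrative and are not taken from any specific platform.

```python
# A minimal questionnaire flow with one filter question. Each entry names the next
# question, optionally depending on the answer. Questions and routing are invented.
questions = {
    "q1": {"text": "Do you smoke?", "next": {"Yes": "q2", "No": "q3"}},
    "q2": {"text": "How many cigarettes per day?", "next": {None: "q3"}},
    "q3": {"text": "How old are you?", "next": {None: None}},
}

def run(answers):
    """Walk the flow with pre-supplied answers and return the questions actually asked."""
    asked, current = [], "q1"
    while current is not None:
        node = questions[current]
        asked.append(node["text"])
        answer = answers.get(current)
        # Follow the answer-specific branch if one exists, otherwise the default (None) branch.
        current = node["next"].get(answer, node["next"].get(None))
    return asked

# A non-smoker skips the cigarettes question; a smoker is asked it.
print(run({"q1": "No", "q3": "34"}))
print(run({"q1": "Yes", "q2": "10", "q3": "34"}))
```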
Qualitative data analysis

What is qualitative data analysis?
- It is the process of making sense of qualitative data in order to answer the research questions.
- It involves:
  - Consolidating, reducing, and interpreting data.
  - Moving back and forth between concrete data and abstract concepts, inductive and deductive reasoning, and description and interpretation.
- It is a systematic search for meaning, allowing researchers to:
  - Identify patterns, themes, and relationships.
  - Develop explanations and interpretations, and potentially generate theories.
- Wolcott's "mindwork": analysis always involves intellectual effort to interpret data meaningfully.

General aim of qualitative data analysis
- Begin by identifying units of data: segments responsive to your research questions.
- A unit of data should:
  1. Reveal information relevant to the study and stimulate thinking (heuristic value).
  2. Be interpretable on its own within the broader study context.
- Process:
  - Break the data down into small information units.
  - Assign these units to categories or classes.
  - Compare units, looking for recurring regularities.

The coding process
- Coding means assigning labels (codes) to data segments to capture their meaning.
- Codes can take various forms: single words, phrases, numbers, or colours.
- Purpose:
  - Facilitate data retrieval and comparison.
  - Identify patterns or relationships in the data.
- Types of codes:
  - Descriptive codes: label segments with a descriptive term.
  - Analytical codes: go beyond description to interpret the meaning of the data.

From codes to categories
- Categories are groups of related codes.
- Steps:
  1. Assign codes to the data.
  2. Identify patterns among the codes to construct categories.
- Category construction:
  - Categories may be hierarchical (with subcategories) or flat (non-hierarchical).
  - Example: data on groceries could be categorised by type (fresh, frozen), price range, or purpose.
- Criteria for good categories:
  1. Responsiveness: reflect the study's purpose.
  2. Exhaustiveness: cover all significant data points.
  3. Mutual exclusivity: each data unit belongs to one category only.
  4. Conceptual clarity: names should clearly reflect their contents.
  5. Conceptual congruence: maintain a consistent level of abstraction across categories.

From categories to themes
- Themes are unifying ideas identified across categories.
- They mark the transition from description to interpretation.
- Themes help:
  - Synthesise the data.
  - Highlight dominant patterns or stories in the dataset.

Steps in qualitative data analysis (a small coding sketch follows at the end of this section)
Step 1: Transcribing the data
- Convert audio/video recordings into text.
- Types of transcription:
  1. Denaturalised: include every verbal detail (e.g., "um", "uh").
  2. Naturalised: use standard conventions to enhance readability.
- Include paralinguistic features (tone, pauses, laughter) for context.
Step 2: Coding
- Examine the data for notable behaviours, patterns, and interactions.
- Label these segments with codes for easier analysis.
Step 3: Categorisation
- Organise related codes into categories.
- Refine categories over time, adjusting their names and structure as necessary.
Step 4: Developing themes
- Identify recurring ideas that unify the categories.
- Transition from organising the data to interpreting its meaning.

Data interpretation
- Analysis identifies patterns; interpretation explains them.
- Challenges include:
  - Understanding the data in context.
  - Balancing the values and perspectives of all participants.
- Key tools:
  - Codes: link raw data to broader ideas.
  - Memos: record thoughts and hypotheses during the analysis.
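As a minimal illustration of the coding and categorisation steps described above, the sketch below attaches codes to interview segments, groups the codes into categories, and counts how often each category occurs. The segments, codes, and categories are invented for the example.

```python
from collections import Counter

# Invented interview segments, each already labelled with one or more codes.
segments = [
    {"text": "I snack when I'm stressed at work", "codes": ["stress eating", "work context"]},
    {"text": "I know fries are bad but they're cheap", "codes": ["risk awareness", "price"]},
    {"text": "After a long shift I just order takeaway", "codes": ["work context", "convenience"]},
]

# Categories group related codes (flat, non-hierarchical in this example).
categories = {
    "triggers": ["stress eating", "work context"],
    "barriers to change": ["price", "convenience"],
    "knowledge": ["risk awareness"],
}

# Count how many coded segments fall under each category.
counts = Counter()
for seg in segments:
    for category, codes in categories.items():
        if any(code in seg["codes"] for code in codes):
            counts[category] += 1

for category, count in counts.most_common():
    print(category, count)
```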
Developing effective presentations with survey data

Communicating research results
- Effective communication involves tailoring the presentation to the target audience, using appropriate media and channels.
- Key components:
  - Target audience: determine who will receive the results (e.g., peers, stakeholders, the public).
  - Purpose: identify why the communication is being made (e.g., to inform, persuade, or educate).
  - Media and context:
    - Written reports (e.g., online, print).
    - Oral presentations (e.g., lectures, TED-style talks).
    - Audio/video content (e.g., podcasts, TV).
    - Consider the context: formal conferences vs. public-facing media.

Sections of a research report
- A good research report includes the following sections:
  1. Introduction and background: who is the client? What is the context or issue being addressed?
  2. Objectives: what does the research aim to achieve?
  3. Methods: describe the research design, tools, sample, and population.
  4. Results: present the key findings with evidence.
  5. Practical advice: provide actionable recommendations or implications for the client.
- Tailor the presentation style depending on whether the research is qualitative or quantitative.

Presenting qualitative vs. quantitative data
- Qualitative research:
  - Focus on key themes and insights.
  - Use visuals such as metaphors, analogies, photographs, or diagrams to convey the findings.
  - Include quotes or verbalisations from participants to provide depth.
- Quantitative research:
  - Focus on numeric data and statistical relationships.
  - Use charts and graphs to represent the findings.
  - Include descriptive and multivariate statistics (e.g., averages, frequencies, cross-tabulations).

How to present quantitative data
- Purpose of charts:
  1. Simplify the data to highlight key findings.
  2. Aid decision-making by providing clear evidence.
- General rules for charts:
  - Title: should convey the main message clearly.
  - Sample information: include the sample size and population context.
  - Units of measure: clearly define what the numbers represent (e.g., percentages, averages).
  - Logo: of the organisation presenting the results.
  - Simplified scales: avoid overcrowding the chart with excessive intervals.
  - Labels and legends: place them near the relevant data points for easy understanding.
  - Colours: use them meaningfully, avoiding poor contrast or excessive variety.

Selecting the right chart (a plotting sketch follows at the end of this section)
- Choose a chart type that best represents the data:
  - Bar charts: for comparisons between groups.
  - Line charts: for trends over time.
  - Pie charts: for proportional data (avoid misuse; ensure proportions are clear).
  - Scatter plots: to show relationships between variables.
- Common errors to avoid:
  - Misleading proportions (e.g., incorrectly sized pie slices).
  - Overlapping or unclear categories.
  - Crowded or overly complex visuals.

Tips for effective data visualisation
1. Tell a story:
- Ensure the visual leads the audience through a clear narrative.
- Highlight key points while avoiding unnecessary detail.
2. Avoid clutter:
- Focus on simplicity and clarity.
- Use whitespace strategically to improve readability.
3. Check relevance:
- Ensure every visual answers a question or supports a point.
- If a chart or graph doesn't add value, reconsider its inclusion.

Steps to create a presentation
- Select the data:
  - Focus on data relevant to the objectives and the audience.
- Design the slides:
  - Use concise text and impactful visuals.
  - Avoid heavy text blocks; rely on bullet points and charts.
- Practise the delivery:
  - Ensure smooth transitions between sections.
  - Be prepared to explain key findings and answer questions.
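As a small illustration of the chart rules above, the sketch below builds a labelled bar chart with matplotlib, with a title that carries the main message, the sample size, and a clear unit of measure. The figures shown are invented for the example.

```python
import matplotlib.pyplot as plt

# Invented results: share of respondents eating fast food at least weekly, by age group.
age_groups = ["18-29", "30-49", "50+"]
shares = [42, 31, 18]  # percentages (made up for the example)

fig, ax = plt.subplots(figsize=(6, 4))
bars = ax.bar(age_groups, shares, color="#4c72b0")

# The title states the main message and includes the sample information.
ax.set_title("Younger respondents eat fast food most often (n = 500)")
ax.set_xlabel("Age group")
ax.set_ylabel("Respondents eating fast food weekly (%)")

# Place value labels near the relevant bars, as the notes recommend.
ax.bar_label(bars, fmt="%d%%")
ax.set_ylim(0, 50)

plt.tight_layout()
plt.show()
```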
Ethical perspectives in applied research

1. Importance of ethics in research
- Ethics form the foundation of responsible research by:
  - Protecting participants' dignity, rights, and well-being.
  - Maintaining the integrity and credibility of the results.
- Ethical research fosters trust, ensures professional standards, and produces socially responsible outcomes.
- Questionable experiments:
  - Zimbardo: the Stanford prison experiment.
  - The Milgram experiment.
- Ethics:
  - Derives from the Greek ethos (character) and the Latin moralis (customs or manners).
  - Defined as a code of professional conduct distinguishing acceptable from unacceptable behaviour.
- Moral vs. ethical:
  - Morality concerns personal or cultural beliefs about right and wrong.
  - Ethics pertains to professional norms and standards.

2. Principles of ethics in research
- Reasons to adhere to ethical norms:
  - Promote research aims such as knowledge and truth.
  - Facilitate collaboration and accountability.
  - Build public trust and support for research.
  - Uphold broader moral and social values, such as human rights and public safety.

3. Three key ethical dimensions
1. Personal ethical decision-making: the individual researcher's judgement and conscience.
2. Professional ethical standards: guidelines established by professional organisations (e.g., CERPS, CNOP for psychologists).
3. Regulatory mechanisms: legal frameworks governing research practices.

4. Key ethical practices
- Informed consent: participants must understand the research, the risks, and their rights.
- Confidentiality: protect private information and maintain anonymity.
- Right to withdraw: participants can leave the study at any point without consequences.
- Risk assessment: anticipate potential harm and mitigate risks.
- Deception: avoid deception unless it is scientifically justified and approved by an ethics board.
- Debriefing: provide full disclosure after participation, explaining the study and its purpose.
- Use of incentives: avoid undue influence or coercion through rewards.

5. Core ethical principles
1. Respect for people:
- Protect autonomy by ensuring informed and voluntary participation.
- Avoid coercion; participants must freely decide to take part.
- Disclose conflicts of interest transparently.
2. Beneficence:
- Take positive steps to further participants' legitimate interests.
- Minimise harm and unnecessary risks.
- Respect participants' rights and dignity.
3. Justice:
- Ensure fairness and equity in distributing research benefits and burdens.
- Treat all participants with equal respect and consideration.

6. Ethical review processes
- Research must be evaluated by ethics committees (e.g., CERPS at Unicatt).
- Key questions when planning:
  - Does the researcher have the competence to conduct the study?
  - Are participants recruited fairly?
  - Has the study been reviewed and approved by the relevant ethics boards?
- Conducting the research:
  - Ensure participants are protected from harm and can withdraw without penalty.
  - Avoid data fabrication or falsification.
- Post-research obligations:
  - Fulfil promises made to participants and deliver any promised incentives.
  - Accurately represent the findings in reports and publications.

7. Problematic ethical areas
- Dependent relationships: avoid coercion in relationships such as teacher-student or employer-employee.
- Freedom to participate: ensure there are no repercussions for choosing not to participate.
- Use of images: obtain explicit consent for using participants' images.
- Use of AI: address privacy and bias concerns when integrating AI tools.

Practical ethical considerations
- During planning:
  - Ensure the researcher is knowledgeable about ethical standards and local customs.
  - Select participants fairly and in line with guidelines.
- During the research:
  - Protect participants' rights to privacy, confidentiality, and informed consent.
  - Avoid harm and deception unless scientifically justified.
- Post-research:
  - Deliver accurate reports and ensure proper attribution of authorship.
  - Provide raw data for replication when requested.
