CS2008 FOR MASTER DOCUMENT NOTES.docx
Uploaded by SecureSousaphone
Nanyang Technological University
- [Week 1 Content](#week-1-content)
- [Week 2 Content](#week-2-content)
- [Week 3 Content](#week-3-content)
- [Week 4 Content](#week-4-content)
- [Week 5 Content](#week-5-content)

Week 1 Content:
===============

Epistemology is the theory of knowledge, especially with regard to its methods, validity, and scope, and the distinction between justified belief and opinion.

- Empiricism is the theory of knowledge at the heart of science and the scientific method
- Empiricism posits that knowledge comes from experience: truth claims must be based on evidence
- Empirical research is defined as any study whose conclusions are exclusively derived from concrete, verifiable evidence
  - Guided by proper scientific experimentation and/or evidence

What is research?

- A systematic investigation of a problem that involves gathering evidence to make inferences
- Scientific research attempts to reveal objective reality
- However, certain aspects of research are inherently subjective: the meaning of evidence can involve interpretation

Four Goals of Social Scientific Research

- Prediction
- Causation
- Explanation
- Description

Inductive and Deductive Reasoning:

- Deductive reasoning is based on formal steps of logic.
- Inductive reasoning refers to verification through empirical observation.
- Inductive-deductive cycle: moving from observations to generalizations (i.e., theories), then back to implications (e.g., hypotheses)

**Research in Public Relations**

- Research is a vital function in the process of public relations.
- Provides the initial information necessary to plan public relations action and to evaluate its effectiveness.
- Related to but distinct from advertising research
- Can be qualitative, quantitative, or both

Types of Public Relations research:

- **Applied Research:** Examines specific practical issues; conducted to solve a specific problem
  - **Strategic Research (branch of applied research):** Used to develop public relations campaigns and programs
  - **Evaluation Research (branch of applied research):** Conducted to assess the effectiveness of a public relations program
  - Problem solving; strategic and evaluative
- **Introspective Research:** Examines the field of public relations
  - Examining the profession and its function in society
- **Basic Research:** Creates knowledge that cuts across public relations situations
  - Knowledge creation; theoretical

**PR Research helps:**

1. To define PR problems
2. With PR program planning
3. With program implementation
4. To evaluate PR programs

**Research in the Public Relations Process:**

Problem definition techniques

- Environmental monitoring (two phases)
  - Early warning analyses to identify emerging issues
  - Tracking public opinion on major issues
  - Observe trends in public opinion and social events that may have a significant impact on an organization
- Public relations audits
  - Comprehensive study of the public relations position of an organization
  - May unearth basic issues that the organization might not be aware of
  - What does the public think of the company? Do they recognise the brand? How well do they know this brand?
- Communications audits
  - Resembles a public relations audit but has narrower goals
  - Concerns the internal and external means of communication used by an organization rather than the company's entire public relations program
  - May be general, examining all of the company's communication efforts, or may specifically target a certain methodology or means of communication the company has used.
- Media audits
  - A survey of reporters, editors, and other media personnel that asks about their preferences for stories and how they perceive the public relations agency's clients.

Program planning

- Designing a campaign or program to address the problem and achieve specific goals

Implementing Public Relations Programs

- Gatekeeping Research
  - Analyzes and filters the characteristics of press releases and video news releases that allow them to appear in a mass medium.
  - Shortened and simplified versions of these press releases and video news releases are prepared for publication
- Output Analysis
  - Measures how well the organization presents itself to others and the amount of exposure or attention that the organization receives.
  - E.g: Measure the total number of stories or articles that appear in selected mass media and examine the tone of the articles

Program evaluation through all phases

- Implementation checking
  - Investigates whether the intended target audience is actually being reached by the message.
- In-progress monitoring
  - Shortly after the campaign starts, researchers check to see whether the program is having its intended effects.
  - If there are unanticipated results or if results seem to be falling short of objectives, the program can still be modified or adapted to maximise its efficacy.
- Outcome evaluation
  - Results are assessed after the campaign ends
  - Findings are used to suggest changes for the future.
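Output analysis as described above boils down to two tallies: how much coverage the organization received, and what its tone was. A minimal sketch of that tally is below; the clippings data, outlet names, and tone labels are invented for illustration (real output analysis would draw on a media-monitoring service and human or automated tone coding).

```python
from collections import Counter

# Hypothetical clippings: each record is one story mentioning the client.
clippings = [
    {"outlet": "The Straits Times", "tone": "positive"},
    {"outlet": "The Straits Times", "tone": "neutral"},
    {"outlet": "Channel NewsAsia",  "tone": "positive"},
    {"outlet": "TODAY",             "tone": "negative"},
    {"outlet": "Channel NewsAsia",  "tone": "positive"},
]

def output_analysis(clippings):
    """Tally exposure (stories per outlet) and the tone of coverage."""
    stories_per_outlet = Counter(c["outlet"] for c in clippings)
    tone_counts = Counter(c["tone"] for c in clippings)
    n = len(clippings)
    tone_share = {tone: count / n for tone, count in tone_counts.items()}
    return stories_per_outlet, tone_share

outlets, tone_share = output_analysis(clippings)
# With the sample data: 2 stories each in The Straits Times and
# Channel NewsAsia, and 3 of 5 stories (60%) positive in tone.
```

The same counts feed directly into outcome evaluation later: comparing tone shares before and after a campaign is one simple way to assess whether coverage shifted.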
**Qualitative Methods in Public Relations**

Critical Incident Technique

- Combination of in-depth interviewing and the case study approach
- A critical incident is defined as an incident in which the purpose or intent of the act is clear to the observer and the consequences are definite
- Content analysis of incident descriptions

Advantages and Disadvantages:

- Advantages:
  - Focuses on real-world incidents as seen through the eyes of those who are directly involved
- Disadvantages:
  - Depends on the memory of the informants for its data; questionable reliability
  - Some people may remember more details than others, and some may have their memories distorted by selective perceptions.
  - Accounts might not be accurate due to imperfect retrospection or the sensitive nature of the incident
  - E.g: Employees might be reluctant to divulge information that might reflect badly on superiors or the organization for which they work.

Discourse Analysis

- Interpretative approach to understanding language use related to the campaign
- Focuses on larger linguistic units, such as whole conversations or written messages
- How language is being used to convey messages, ideas, and beliefs
- Examines how the form and content of the language is being used
- Differs from quantitative content analysis

Advantages and Disadvantages:

- Advantages
  - Can be used to study different situations and subjects
  - Allows public relations researchers to uncover deeply held attitudes and perceptions that are important to an organization's image and communication practices and that might not be uncovered by any other method
- Disadvantages
  - Can take large amounts of time and effort
  - Focuses solely on language, which rarely tells the full story

**PR Research and Social Media**

- Social media are increasingly important
- Analytics (scraping the web)
- UX research (user experience)

**Copy Testing**

- Important in PR and Advertising (Strat Comm)
- Developing effective messages
- Two key criteria or dependent variables:
  - Recall (memory)
  - Liking (attitudes)
- Strat Comm and the hierarchy of effects
  - Knowledge
  - Attitudes
  - Behavior

**Dimensions of Impact**

- Cognitive
  - Attention (predicated on exposure)
  - Comprehension
  - Recognition/recall
- Affective
  - Liking or favorability
  - Involvement
- Conative (i.e., behavioral)
  - Intention to act
  - Action
  - Consolidation (repetition, establishment of a pattern or habit)
  - Advocacy

**Reasons for Copy Testing**

- Copy testing to improve message quality
  - Comprehensibility
  - Liking
  - Involvement
- Copy testing to avoid unintended outcomes

**Copy Testing Methods**

- Can be done individually or in groups
- Research method depends on the question asked
- Self-reports
  - Surveys
  - CRM (continuous response measurement)
  - Susceptible to demand characteristics
- Direct measures
- Unobtrusive measures
  - Psychophysiology
- Trade-offs between ease and validity
- Multi-method can be useful, but costly

Week 2 Content:
===============

**What is the Scientific Method?**

- Science as a formal approach to finding answers
- One method of knowing (C. S. Peirce)
- "Scientific research is an organized, objective, and controlled empirical analysis of one or more variables."
  -- Wimmer & Dominick

**Characteristics of the Scientific Method:**

- Public
  - Freely available information, shared routinely
  - Published reports need to include information on sampling methods, measurements, and data-gathering procedures
  - To allow other researchers to independently verify a given study and support or refute the initial research finding
- Objective
  - Fact-based and rule-based, meant to minimize errors by researchers
- Empirical
  - Concerned with a world that is knowable and measurable
  - Moving from abstract concepts to observable things
  - Concepts must be strictly defined to allow for objective observation and measurement
  - Use of operational definitions
  - Must link abstract concepts to the empirical world through observations, which may be made either directly or indirectly via various measurement instruments.
  - Researchers must be able to perceive and classify what they study and reject metaphysical and nonsensical explanations of events
- Systematic and Cumulative
  - Use previous studies as building blocks
  - Theories guide scientific inquiry
- Predictive
  - A theory's adequacy lies in its ability to predict a phenomenon or event successfully
  - Relating the present to the future
  - Data supporting predictions prompt extension of the theory
  - Theories that generate predictions supported by the data can be used to make predictions in other situations.
  - Prevent and mitigate harm from occurring
- Self-correcting
  - When errors in previous research are uncovered, theories can be altered or rejected
  - If theories no longer account for the data at present, theories can be revised and reviewed to fit changing contexts
  - Not static

**Research Procedures**

1. Select a problem.
2. Review existing research and theory (when relevant).
3. Develop hypotheses or research questions.
4. Determine an appropriate methodology/research design (qualitative vs. quantitative research).
5. Collect relevant data.
6. Analyze and interpret the results.
7. Present the results in an appropriate form.
8. Replicate the study (when necessary).

**Commercial Research vs Academic Research**

Commercial Research:

- Generally applied research
- Results are intended to facilitate decision making
- E.g: Media content and consumer preferences, acquisitions of additional businesses or facilities, analysis of on-air talent, advertising and promotional campaigns, public relations approaches to solving specific informational problems, sales forecasting, and image studies of the properties owned by the company
- Value is put on time and money; the private sector often has more money and time to invest in research
- Proprietary (owned) data is private and kept confidential
- Private research tends to be more rigorous, as there are tight deadlines to fulfill

Academic Research:

- Public data is made available in a public repository
- Generally, no specific deadlines for research projects
- Generally less expensive to conduct than research in the private sector

**What makes for a good research topic?**

- Solves a problem
- Fills a gap in our understanding
- Eight questions:
  1. Is the topic too broad?
  2. Can it actually be investigated (as stated)?
  3. Can the data be analyzed?
  4. Is the problem significant?
  5. Can the results be generalized?
  6. Are the costs or time required too great?
  7. Is the approach appropriate?
  8. Is there any potential harm to participants?

**The Need for Validity**

- Validity can be critiqued when something does not make sense for a certain context
- A conclusion is not justified if it is not valid
- Research must be validated to be meaningful

**Internal validity**

- Internal = within the study
- We need to be able to rule out plausible but incorrect explanations of results
- Extraneous variables: potentially influence the relationship between the variables under investigation
- Confounding variables: things that actually do influence the relationship being studied
  - They are a subset of extraneous variables
  - The presence of confounding variables indicates a lack of internal validity
- The extent to which a study is well designed and permits inferences that are warranted
- Threats to internal validity → alternative explanations
  - **History:** Various events that occur during a study may affect the subjects' attitudes, opinions, and behavior.
  - **Maturation:** Subjects' biological and psychological characteristics change over the course of the study, especially if it runs over a long period of time.
    - E.g: Growing hungry or tired, or becoming older, may influence how subjects respond in a research study.
  - **Testing:** When subjects are given similar pretests and posttests.
    - A pretest may sensitize subjects to the material and improve their posttest scores regardless of the type of experimental treatment given
    - Subjects learn how to answer questions and to anticipate researchers' demands
  - **Instrumentation:** Refers to the deterioration of research instruments or methods over the course of a study.
    - Equipment may wear out, observers may become more casual in recording their observations, and interviewers who memorize frequently asked questions might fail to present them in the proper order.
  - **Statistical Regression:** An extreme outlier is likely to score closer to the mean on a second testing (e.g., the Sports Illustrated curse); extreme performance tends to worsen in subsequent tests.
  - **Experimental Mortality:** All research studies face the possibility that subjects will drop out for one reason or another.
    - Loss of subjects over time
  - **Sample Selection:** Comparing two or more groups of subjects to determine whether differences exist on the dependent measurement
    - Groups must be selected randomly and tested for homogeneity to ensure that results are not due to the type of sample used
  - **Demand Characteristics:** Subjects who are aware that they are under observation may subconsciously behave in a certain way to produce "good" data for researchers.
    - Cross-validation is necessary to verify subjects' responses, by giving subjects the opportunity to answer the same question phrased in different ways.
    - Helps the researcher to identify discrepancies, which are generally error-producing responses
    - Demand characteristics can also be controlled by disguising the real purpose of the study from participants
  - **Experimenter bias:** When an experiment ends up being skewed to support the researcher's hypothesis, it becomes a biased study.
  - **Evaluation Apprehension:** Subjects are essentially afraid of being measured or tested.
    - Interested in receiving only positive evaluations from the researcher and from the other subjects involved in the study
    - Hesitant to exhibit behavior that differs from the norm, and tend to follow the group even though they may totally disagree with the others
  - **Causal time order:** The organization of an experiment may create problems with data collection and interpretation
  - **Diffusion of treatments:** Contamination of subjects through interactions with one another at different times of the day or different periods of the study
    - Respondents may have the opportunity to discuss the project with someone from another session and contaminate the research project
  - **Compensation:** Individuals who work with a control group (the one that receives no experimental treatment) may unknowingly treat the group differently because the group is "deprived" of something.
  - **Compensatory rivalry:** Subjects who know they are in a control group may work harder or perform differently to outperform the experimental group.
  - **Demoralization:** Control group subjects may literally lose interest in a project because they are not experimental subjects.
    - May give up or intentionally do badly because they feel demoralized or angry that they are not in the experimental group

**External Validity**

- How well can the results of a study be generalised across populations, settings, and time?
- Can be affected by interactions among threats to internal validity
- How to improve external validity:
  - Use random samples
  - Use heterogeneous samples and replicate
  - Use representative samples
- Representativeness is key to external validity

**Elements of Research**

- The role of theory in research
- Concepts as abstract ideas
- Constructs as special kinds of concepts
  - Specified in detail
  - Often have dimensions or hierarchical structure
  - Not directly observable
  - Specific to a research context
- At the empirical level, we have variables, which can be measured in a variety of ways

**Concepts, Constructs, and Variables**

- At a high level of abstraction, concepts are derived as abstract ideas and notions
- To be useful in research, concepts require thoughtful specification
  - Delineation and articulation
  - Explication: theorising what the concept may be; concepts need explication for research to be conducted
- Constructs require operationalization
- Once constructs are operationalized, they become variables
- Variables have measures that must be specified
- Concepts > Constructs > Variables
  - One overall concept is broken down into narrower constructs, from which subdimensions/variables are derived

Concepts are:

- A term that expresses an abstract idea
- Represents a wide variety of observable objects
- E.g: Advertising effectiveness, message length, media usage, and readability
- Help simplify the research process by combining particular
characteristics, objects, or people into general categories
- Simplify communication among those who have a shared understanding of them.

Constructs are:

- An abstract idea that is usually broken down into dimensions represented by lower-level concepts; a combination of concepts
- Constructs usually cannot be observed directly
- Usually designed for a specific research purpose, so that their exact meaning relates only to the context in which they are found

Variables are:

- The empirical counterpart of a construct or concept
- A phenomenon, characteristic, or event whose value varies within a sample
- Independent variable (IV)
  - Can be manipulated to test a hypothesis
  - Presumed to cause/determine/influence a dependent variable
  - Is systematically varied by the researcher
- Dependent variable (DV)
  - Not manipulated by the researcher; the variable that is observed and measured, and that varies with respect to the independent variable
  - Presumed to be caused or influenced by another variable
  - The dependent variable is what the researcher wishes to explain.
- Discrete variables (fixed answers)
  - Take only a finite set of values; cannot be divided into subparts.
  - Fixed whole numbers/values.
  - E.g: Number of children in a family, political affiliation, population, and gender
- Continuous variables (a spectrum)
  - Can take on any value, including fractions, and can be meaningfully broken into smaller subsections.
  - E.g: Height, time spent watching television, average number of children in a family
  - For continuous variables, scale and range matter
- Predictor variable
  - The variable that is used for prediction or is assumed to be causal (analogous to the independent variable)
- Criterion variable
  - The variable that is predicted or assumed to be affected (analogous to the dependent variable).
- Predictor vs. Criterion
  - Labels used for non-experimental research
  - Example: contact and ATLG (attitudes towards lesbians and gays)
- Control variables
  - Used to ensure that the results of the study are due to the independent variables, not to another source
  - Control variables vs. variables of interest
  - E.g: Left-handedness, gender, educational attainment
- Variables under study require operationalization

**Qualitative Research vs Quantitative Research**

Qualitative Research

- Involves several methods of data collection, such as focus groups, field observation, in-depth interviews, and case studies.
- Varied questioning methods
  - E.g: Although the researcher enters the project with a specific set of questions, follow-up questions are developed and asked as needed.
- Variables in qualitative research may or may not be measured or quantified.

Advantages of Qualitative Research:

- Allows a researcher to view behavior in a natural setting without the artificiality that sometimes surrounds experimental or survey research
- Can increase a researcher's depth of understanding of the phenomenon under investigation
  - Especially true when the phenomenon has not been investigated before
- Flexible; allows the researcher to pursue new areas of interest

Disadvantages of Qualitative Research:

- Sample sizes are sometimes too small to allow the researcher to generalize the data beyond the sample selected for the particular study
  - Qualitative research is often the preliminary step to further investigation rather than the final phase of a project
- Data reliability: single observers describing unique events may lose objectivity
  - May lose the necessary professional detachment
- If qualitative research is not properly planned, the project may produce nothing of value

Quantitative Research

- Involves several methods of data collection, such as telephone surveys, mail surveys, and Internet surveys
- Questioning is static or standardized: all respondents are asked
the same questions, and there is no opportunity for follow-up questions
- Variables under consideration need to be measured

Advantages of Quantitative Research:

- Use of numbers allows greater precision in reporting results

**Measurement**

- Stevens (1951): Measurement "is the assignment of numbers to objects or events according to rules"
- "The process of linking abstract concepts to empirical indicators"
- Problematic for social scientists, as many measured phenomena are neither objects nor events
- Quantification allows for:
  - Mathematical processing and probabilistic analyses
  - Making inferences about unobservable phenomena

**Levels of Measurement:**

- Nominal
  - Categorical in nature; classifies cases into one or more qualitative categories that describe the characteristic of interest.
  - Does not provide any ordering of the different categories, nor does it measure distance between values
  - Categories can be listed in any order without affecting the relationship between them
  - Categories are mutually exclusive; no logical order exists between categories
  - E.g: Gender (male, female, non-binary); hair color (black, brown, blonde); blood type (A, B, O, AB)
- Ordinal
  - Categories are ordered or ranked; differences between ranks are not quantifiable
  - There is an inherent order to the relationship among the different categories
  - Implied ordering of the categories, although the quantitative distance between levels is unknown and may not be the same
  - E.g: Smallest to largest; socioeconomic status (categorizing families according to class: lower, lower middle, middle, upper middle, or upper, where a rank of 1 is assigned to lower, 2 to lower middle, 3 to middle, and so forth); rankings of football or basketball teams; military ranks; restaurant ratings; beauty pageant results
- Interval
  - Has all the properties of an ordinal scale, and the intervals between adjacent points on the scale are of equal value
  - Inherent order among the categories, and the intervals between the numbers represent something real
  - Distances between values are measured and are constant and equal between adjacent values
  - When using a Likert scale, we deal with scores instead of the original descriptions
  - Numbers are assigned to the positions of objects on an interval scale in such a way that one may carry out arithmetic operations on the differences between them
  - Lacks a true zero point (zero does not indicate the absence of the quantity; zero has a meaning)
  - The absence of a true zero point means that a researcher cannot make statements of a proportional nature; for example, someone with an IQ of 100 is not twice as smart as someone with an IQ of 50, and a person who scores 30 on a test of aggression is not three times as aggressive as a person who scores 10.
  - E.g: Temperature, IQ
- Ratio
  - Has a true zero point within the measurement
  - Zero means none of the quantity; a true zero is a bottom line that you cannot go below
  - Allows researchers to talk about ratios between subjects
    - E.g: A person's lung capacity can be twice someone else's lung capacity
  - E.g: Height, weight, income (no negative weight, height, or income)

**Composite Measures (Scales)**

- "Scale" can refer to two things:
  - Response option range (e.g., not at all -- a lot)
  - Composite measures (items combined)
- Response range depends on:
  - The nature of the construct
  - Desired and available precision
  - Human capability to differentiate
- Scales can be transformed:
  - Rescaled (i.e., change range)
  - Flipped (i.e., reversed)
- Likert scales are very popular in mainstream usage
  - Assess level of agreement with various statements
  - Summation versus averaging
- Knowledge measures are usually not composite measures
- Composite measures can help to increase measurement reliability (e.g., Cronbach's α)

**Reliability and Validity of Measures**

- Reliability: consistently giving the same response
  - Test-retest reliability
  - Internal consistency
  - Intercoder reliability
- Four types of measurement validity:
  1. Face (judgment-based)
     - Achieved by examining the measurement device to see whether, on the face of it, it measures what it appears to measure
  2. Predictive (criterion-based; sometimes called empirical validity)
     - Checking a measurement instrument against some future outcome
  3. Concurrent (criterion-based)
     - The measuring instrument is checked against some present criterion
  4. Construct (theory-based)
     - Relating a measuring instrument to some overall theoretic framework to ensure that the measurement is logically related to other concepts in the framework
     - The researcher should be able to suggest various relationships between the property being measured and the other variables.
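The internal consistency of a composite measure is commonly summarized with Cronbach's α, mentioned above. A minimal sketch of the computation is below; the Likert responses are invented for illustration, and the formula used is the standard one based on item variances and the variance of the summed scores.

```python
def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    where k is the number of items in the composite measure.
    """
    k = len(scores[0])
    items = list(zip(*scores))  # transpose: one tuple of responses per item
    item_var_sum = sum(variance(item) for item in items)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical data: 5 respondents answering three 5-point Likert items
# intended to tap the same construct.
scores = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [1, 2, 1],
    [3, 3, 4],
]

alpha = cronbach_alpha(scores)  # high here, since the items move together
```

By convention, α of roughly 0.7 or above is often treated as acceptable internal consistency, though the appropriate threshold depends on the research context. Note that any reverse-worded (flipped) items must be re-scored before computing α, or they will drag the coefficient down.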
Week 3 Content:
===============

**Research Ethics**

- An integral part of doing research in the social sciences
- Most interactions with people will need to undergo review
  - Used to be called Human Subjects Review
  - Exemption for journalism
- Researchers must ensure that the rights of the participants are not violated and that the data are analyzed and reported correctly

What are Ethics?

- Ethics concerns examining, understanding, and enforcing what is "right" and what is "wrong"
- Rules of acceptable behavior based on moral judgement of what is good and what is bad
- When applied to most research settings, ethics involves the application of formal rules to the research process to systematically recommend and defend behaviors designed to protect wellbeing and curb misconduct
- A set of practices good researchers have to abide by

Why must we be ethical?

- Unethical behavior may have an adverse effect on research participants.
  - Just one experience with an ethically questionable research project may completely alienate a respondent
- Reflects poorly on the profession and may result in an increase in negative public opinion
- Ethical behavior is the right thing to do

Ethical Theories:

1. Deontological theories (rule-based/process-focused)
   - Kant: in order to act morally, one must act from duty
   - "Right" actions are independent of the outcome of the action
   - Focuses on the process: how you do things
2. Teleological theories (balancing/utilitarian)
   - Focus on the goodness/badness of the outcome, not the action
   - Weighing pros and cons
   - Focuses on the outcome
3. Relativistic theories (no universal right/wrong)
   - No one is objectively right or wrong
   - We ought to tolerate the actions of others even when we don't agree about their morality
   - If we are not part of another culture, we have no right to interfere and intervene
   - E.g: Female genital mutilation

Notable unethical experiment examples:

- 1932-1972 Tuskegee syphilis study
  - Impoverished African American men suffering from syphilis were studied without their consent and left untreated so that researchers could study the progress of the disease
  - 600 poor African American men were deceived into thinking they were going to receive free health care
  - 399 already had syphilis, 201 did not; eventually their partners were also infected
  - Researchers were not testing the efficacy of syphilis medication (a cure existed), but studying the health complications of syphilis
- The Stanley Milgram experiments
  - A series of studies conducted at Yale University in 1961
  - Why would someone obey an order that would be harmful to themselves or others?
  - A learner would go into a booth to memorise words; a teacher (the participant) would flip switches that increased the voltage of shocks given to the learner as punishment for making mistakes.
  - The researcher tells the teacher to increase the shocks; some comply, some are reluctant and stop.
  - Informed consent was obtained, but participants were not safe from harm
- The Stanford Prison Study
  - How people behave under certain conditions, when individuals are given a role to act out
  - Participants roleplayed as prisoners/guards
  - Through the roleplay of guards or prisoners, assessed how cruel individuals can be
  - What happens to individuals when they are incarcerated and given power and authority?
- Movie "The Lazarus Effect" made inspired from this experiment - Participants were not given free will to stop the study or leave **Four Ethical Principles:** - Autonomy - Maintain their identity as unique and free individuals - Have the right and capacity to decide - Have values and dignity respected, preserve their dignity - Have the right to informed consent - Typical legal standard of consent - In order to give consent, individuals must have - The capacity to make rational, mature decisions (Need to reach the age of majority, those prior to age of majority need to attain consent from parents) - Adequate information about what is to occur. Good description of research process and experimentation. - Adequate comprehension of possible effects - Freedom from coercion or undue pressure - Implied consent - Consent that is not given explicitly, but which can be inferred based on the individual\'s actions and the facts of a particular situation. - Consent procedures overseen by IRBs - Institutional Review Board (IRB) - Formally checks the plans of researchers to maintain ethical standards - Researchers must submit applications that must be approved by the IRB prior to the start of the research project - oversee treatment of research animals - Help to ensure that there is a minimal risk of harm for human participants in research (incl. 
use of deception) -- - Help to ensure research integrity, prevent data fabrication - Vulnerable Participants - E.g: Children, Elderly, Mentally ill - Assent needed - Used with people who cannot make their own decisions - Gain consent of the guardian - Ask permission of the participant - **Nonmaleficence** - Ensure that your actions will not harm participants - Primum non nocere - "first, do no harm" - Respect and protect: - Privacy - Physical wellbeing (e.g., discomfort, fatigue) - Psychological wellbeing (e.g., emotional state) - Deception allowed in special cases - Debriefing as a means to reduce harm - Need to be transparent with social experiments/roleplay, need to inform participants about the study's true nature - Immediately after participant completes the study -- Delayed debriefing (full explanation which reveals everything to prevent contamination) - Reasons for debriefing: - Helps return participants to their pre-participation state (before the study affected their beliefs, attitudes, feelings, etc.) 
        - Diminishes anxiety and other unpleasant emotional reactions
        - Gives subjects a sense of the true value of their participation
        - Explains the ethical use of deception (if deception was used)
        - Promotes perception of researcher honesty
    - Harm can include mental stress and trauma
    - Minimal harm is offset by benefits
- **Beneficence**
    - Actions done to benefit others; the justification for why the research is being conducted
    - Weighing risks against benefits
- **Justice**
    - Holds that people who are equal in relevant respects should be treated equally
    - Positive results of research should be shared with all

**Specified Ethical Problems Researchers May Face:**

- Voluntary Participation and Informed Consent
    - Individuals are entitled to decline to participate in any research project or to terminate participation at any time
    - Participation must always be voluntary; any form of coercion is unacceptable
    - Researchers who are in a position of authority over subjects (as when a teacher/researcher hands questionnaires to university students) should be especially sensitive to implied coercion
    - Researchers should not attempt to induce subjects to participate by misrepresenting the organization sponsoring the research or by exaggerating its purpose or importance
    - Researchers have the responsibility to inform potential subjects or respondents of all features of the project that can reasonably be expected to influence participation
    - Potential subjects must be warned of any possible discomfort or unpleasantness that might be involved
    - Clear consent must be obtained through consent forms
- Concealment and Deception
    - Concealment is withholding certain information from the subjects
    - Deception is deliberately providing false information
    - Five conditions, all of which must hold, under which deception can be justified:
        - When there is no other feasible way to obtain the desired information
        - When the likely benefits substantially outweigh the likely harm
        - When subjects are given the option to withdraw at any time without penalty
        - When any physical or psychological harm to subjects is temporary
        - When subjects are debriefed about all substantial deception and the research procedures are made available for public review
    - Other questions to consider:
        - (1) Have all reasonably possible costs and benefits been accounted for in considering whether deception may be justified? (Y/N)
        - (2) Is there any way this study could be done either without deception or with a lesser degree of it? (Y/N)
        - (3) Is the deception associated with more than minimal risk? (Y/N)
        - (4) Are there possible risks that may have been overlooked in the description of this study? (Y/N)
- Protection of Privacy
    - Respondents have a right to know whether their privacy will be maintained and who will have access to the information they provide
    - Privacy is protected by assuring anonymity and confidentiality

Week 4 Content:
===============

**Paradigms**

- An accepted set of theories, procedures, and assumptions about how researchers look at the world

Different kinds of researchers:

- Positivist and post-positivist (aka objectivist), i.e., positivism
    - Oldest and most widely used
    - Spans the natural and social sciences
    - Believes that the world is knowable and that there is an [objective reality] out there
    - [Deterministic] and originally very hardline, but the approach has been softened over the years (post-positivism), which allows some level of subjectivity
- Interpretive (aka constructivist)
    - Focus on the [creation of meaning]
    - Meaning comes from a shared understanding: the social construction of reality
    - Reality is something we determine and make; we give meaning to things and create our own narratives
    - [Qualitative research]
- Critical (can be either objectivist or constructivist)
    - Focus on power distribution, dynamics, and relations
    - Normative theories
    - E.g., racism, social justice, feminism, sexism
    - Very applied research, trying to solve real-life problems

Paradigmatic Differences, Positivist vs. Interpretive Paradigms:

- Nature of reality
    - Objectivity (positivist) vs. subjectivity (interpretive)
    - "How long is this lecture period?" "50 minutes" (positivist) vs. "It depends on how the lecturer feels; he could overrun" (interpretive)
    - Reality is purely objective (positivist) vs. personal experience guides one's understanding of reality (interpretive)
- Nature of the individual
    - Focus on commonalities vs. uniqueness: the general trend of things vs. what makes things different
    - Qualitative research focuses on observing things at the margins, the tails
- Focus: general trends vs. particular cases
    - Aggregation vs. individuation
- Paradigms differ in their ontology, epistemology, and methodology

Practical Differences:

- Role of the researcher
    - Objective observer vs. active participant
    - Analyst vs. interpreter
- Research design
    - Pre-planned vs. evolving
    - In the evolving approach, research questions change along the way
- Research setting
    - Control prioritized vs. naturalness prioritized
- Measurement
    - Distinct from the researcher vs. the researcher is the instrument
- Analysis: statistical vs. interpretative
    - Coming up with new measures
- Theory building
    - Hypothetico-deductive vs. inductive (data-emergent)
    - Hypothetico-deductive: predicting the outcome with a specified hypothesis, then asking whether the data support the hypothesis
    - Qualitative researchers describe what they see, moving from observation of something particular to a general notion of what is going on

Method vs. Methodology

- Method refers to a specific research technique
- Methodology is the study of methods
    - Reviewing techniques and their suitability
    - Efforts to improve methods
- Quantitative methodology examines:
    - Experimentation as a method
    - Surveys as a method
    - Content analysis as a method
- Qualitative methodology examines:
    - Focus groups as a method
    - In-depth interviews as a method
    - Participant observation as a method

**Qualitative and Quantitative**

- Qualitative and quantitative are not completely separate domains
- Can be used to answer the same research questions
- Can be complementary in a single research project
    - Concurrent mixed methods
    - Sequential mixed methods, which combine quantitative and qualitative in sequence:
        - Quantitative then qualitative (surveys first, then focus group discussions)
        - Qualitative then quantitative (interview people first, then content analysis, then an experiment)

When to Use Qualitative Research

- When concepts cannot be quantified easily
    - Research may be needed just to understand the main topic of investigation
- When concepts are [best understood in their natural setting]
    - Behaviour is situational and depends on the people involved *(in situ: watch them where they are)*
- When studying intimate details of roles, processes, and groups
- When the paramount objective is "understanding"

Limitations of Qualitative Research

- Usually not generalizable
    - Very niche, with no representative sampling
- Time/resource intensive
- Reliability, in the sense of high consistency across measurements, is not applicable
- Validity means something different
    - Strive for verification by leaving an audit trail, member checks, etc.

**Qualitative Analysis**

- Data preparation: reduction and display
    - Separate ideas and come up with categories
    - Order chronologically
    - Categorize on preliminary themes
        - Note: some categorization can occur during the interview
    - Recategorize as more data comes in and new themes emerge
- Constant comparative technique
    - Part of Grounded Theory
    - Categorize via comparison among data
    - Interpret data to refine categories
    - Identify relationships and specify themes among categories
    - Simplify and interpret data into a coherent theoretical structure
- Analytic induction strategy: blends analysis and "hypothesis generation"
    - 1) Formulate general question(s)
        - "From which media sources do people seek information?"
    - 2) Examine a case (participant) to see if the question is appropriate
        - "I watch some television, but I prefer newspapers and the Internet."
    - 3) Refine the question in reference to other cases
        - "Information seekers engage with many different media sources, so the question is: why go to different sources?"
    - 4) Look for negative cases
        - "I love to seek information, and mostly get it from talking with my friends."
    - 5) Continue until the question is adequately addressed
        - Are negative cases typical? What are the most common themes?

Qualitative Research Methods

- Field observation
- Focus groups
- In-depth interviews
- Case studies
- Ethnography

Field Observation

- Data collection in natural settings or habitats
    - E.g., "student use of tech in lecture": you go out to observe them
- Advantages
    - Good for framing research and identifying an issue or problem out in the world
    - Ideal for answering some RQs
    - Can be inexpensive
- Disadvantages
    - Questionable representativeness (other students? other classrooms?)
    - Observers may have confirmation bias
        - You already have a theory or an answer you want to hear, so your research ends up distorted by this preconceived bias
        - You look for answers and sources that seem to confirm your bias, choosing to see or hear things that align with your opinions
    - More potential for reactivity
        - Observing subjects may affect the behaviour under study
        - Because they know they are being observed, participants may (even subconsciously) act in a manner that is unnatural
- Field observation process: six stages
    - Choosing a research site
    - Gaining access
    - Sampling
    - Collecting data
    - Analyzing data
    - Exiting (extricate yourself and leave the setting of the study untouched; try not to leave things altered)

Focus Groups

- Focused group discussions
- Groups of 6 to 10 participants
    - Can have just one group, but often have more
    - The goal is to achieve saturation: the point in your research where you get no new information
- Groups are homogeneous on a key characteristic
    - E.g., attitudes towards lesbians and gay men (ATLG) focus groups based on S377A position, age, religion
- Moderator follows a structured guide or questioning route
    - Opening/intro
    - Transition/funnel
    - Key questions
    - Summarizing/ending

Focus Groups Pros and Cons:

- Advantages
    - Good source of preliminary information (e.g., for scale development)
    - Quick turnaround, generally inexpensive
    - Conversational, detailed, and uninhibited responses
    - Can be adapted for online groups
- Disadvantages
    - One member may monopolize the conversation
    - Concerns about saturation; inadequate group size/number
    - Data collected can be voluminous

In-depth Interviews

- Similar to focus groups, but with only one participant at a time
- Generally long, but providing in-depth observations
- Broadly structured, with lots of room for customization
- Advantages
    - The greatest depth of information per case
    - Allows more accurate recording of sensitive information
- Disadvantages
    - A small sample may not be representative; cannot generalize
    - A large sample is very resource-intensive
    - High potential for reactivity

Case Studies

- Combining available information to describe a case in detail
- Characteristics
    - Particularistic: specific to a situation
    - Descriptive: detailed description of a case
    - Heuristic: serves as an example of a "type" of case
    - Inductive: attempts to define a general phenomenon
- Advantages
    - Can address various goals, e.g., description, explanation
    - Accommodates a wide spectrum of evidence
- Disadvantages
    - Often lacks scientific precision
    - Difficult to generalize

Ethnography

- Originated in anthropology and sociology
- Observation of:
    - Whole cultures (macro-ethnography)
    - Elements within cultures (micro-ethnography)
    - The self (auto-ethnography)
- Characteristics
    - The researcher is immersed in the topic under study, adopting the culture and building bonds and trust with subjects
    - Data collection occurs over long periods of time
    - Emphasis on the participant's frame of reference
    - Can involve observation, interviewing, photo/video, etc.
- Disadvantage
    - High potential for reactivity
- Other considerations:
    - Gaining access to sites/participants
    - Various sampling strategies
    - Online vs. face-to-face considerations
    - Data collection via social/new media
    - Ethical issues with observation/recording
    - Exiting/debriefing
    - Status conferral from researcher to subject
    - Writing the research report

Ethnography vs. Field Observation

- Ethnography has more of a cultural aspect to it and takes place over a longer duration than field observation

Quantitative Research vs. Qualitative Research

Qualitative research:

- Uses a flexible questioning approach
- The researcher can change the questions or ask follow-up questions at any time

Quantitative research:

- Uses a static or standardised set of questions
- All respondents are asked the same questions
- Although follow-up questions (and skips) can be designed into a questionnaire, they must be included in the questionnaire or measurement instrument before the research project begins
- Interviewers conducting the interview are not allowed to stray from the questionnaire

Week 5 Content:
===============