
211_Exam 1 Study Guide_FA23.docx



**[PSY 211 -- Statistics and Experimental Methods I]** **[Study Guide - Exam 1]** 1. **What are the core components of science?** - **Values** an approach to understanding - Critical though + empiricism - Honesty - Open and flexible - **Methods** "Rules" for collecting...

**PSY 211 -- Statistics and Experimental Methods I**
**Study Guide - Exam 1**

1. **What are the core components of science?**
    - **Values:** an approach to understanding
        - Critical thought + empiricism
        - Honesty
        - Open and flexible
    - **Methods:** "rules" for collecting evidence
        - Objective approaches
        - Systematic measurement
        - Standardized analyses of data
    - **Content:** facts and findings
        - Empirical evidence
        - Ways of classifying nature
        - Well-supported theories
2. **What are the "ways of knowing" science/scientific facts, and what are some limitations of each?**
    - **Observation**
        - Limitations:
            - The senses can be easily fooled
            - People see things differently
            - What you can see at a glance may differ from what you can see with measuring tools
            - One should not generalize from a limited set of observations
    - **Intuition**
        - Limitations:
            - Relies unquestioningly on personal judgement
            - Involves cognitive and motivational biases that lead to erroneous conclusions about cause and effect
            - **Illusory correlation:** perceiving a relationship between variables even when no such relationship exists
            - **Confirmation bias:** people's tendency to process information by looking for, or interpreting, information that is consistent with their existing beliefs
            - **Emotional salience of information:** a cognitive bias that predisposes individuals to focus on or attend to items, information, or stimuli that are more prominent, visible, or emotionally striking
    - **Authority**
        - Limitations:
            - Many accept statements based on faith in the authority
            - The scientific approach rejects the notion of authority and requires much more evidence before conclusions can be drawn
3. **What are the essential components of critical/scientific thinking? If presented with information, how would you use critical thinking to evaluate its validity or to make a decision?**
    - **Scientific thinking:** using the cognitive skills required to generate, test, and revise theories
        - Keep belief and evidence distinct
        - Make testable claims
        - Evaluate strengths and weaknesses of the evidence
        - Try to disconfirm your idea after it has been confirmed
        - Have your belief follow the best evidence
    - **Critical thinking:**
        - Active, skeptical, creative
4. **What are some "red flags" that may indicate information is pseudoscience?**
    - The scientific claim/theory/hypothesis must be falsifiable (able to be proved false)
    - The theory/hypothesis:
        - Should be backed up by (peer-reviewed) research
        - Should be published in scientific journals
        - Should be robust enough to be replicated (repeated in other experiments)
        - Should not be based on anecdote
        - Can always be modified by new evidence
        - Should fit with other evidence-based theories
        - Should not confuse the presence of any relationship with a causal relationship (correlation does not equal causation)
5. **What are some common sources of bias in scientific thinking and/or scientific research in general?**
    - Confirmation bias
    - Cognitive bias
    - Motivational bias
6. **What are the goals of behavioral science?**
    - To better understand human behaviors and apply this understanding to improving the quality of life for people
7. **What are the essential steps to forming a research question and testable hypothesis? If given a "bad" or nonspecific question/hypothesis, could you change it to make it testable like we did in class? (Essay question)**
    - Identify a general area of interest
    - Narrow down the focus
    - Ask open-ended questions
    - Refine your research question
    - Formulate a testable hypothesis: specific, measurable, falsifiable
    - Refine and finalize
8. **What does it mean to operationalize a variable?**
    - Turning abstract conceptual ideas into measurable observations
9. **What are the goals and steps of the scientific method?**
    - **Goals:** description, prediction, explanation, and control
    - **Steps (OPTICR):** observation, prediction, testing, interpretation, communication, replication
10. **Can you define the types of scientific misconduct and explain why we should avoid them?**
    - Plagiarism, falsification, fabrication
    - Why should we avoid them?
        - If clinically relevant, harm to patients
        - Damages the careers of colleagues
        - Wastes the time/effort of other scientists
        - Loss of public confidence in the field
11. **Can you list and explain the various ethical principles for research as established by the APA Ethics Code and the Belmont Report?**
    - **Beneficence:** maximize benefits and minimize risks
    - **Integrity:** promote accuracy, honesty, and truthfulness in the science, teaching, and practice of psychology
    - **Justice:** equal access to and benefit from the contributions, procedures, and services of the study and of psychology in general
    - **Respect for persons:** dignity and autonomy of participants, especially vulnerable populations
    - **Privacy and confidentiality:** privacy pertains to the participant's right to control access to personal information; confidentiality pertains to the researcher's obligation to protect and prevent unauthorized disclosure of this personal information
    - **Informed consent:** a consent process in which participants are provided with all relevant information about a study that a reasonable person would need, and in which they fully comprehend the information they are provided
12. **Can you discuss the main ethical issues violated by the Tuskegee, Milgram, and Stanford Prison studies? (Essay question)**
    - **Tuskegee**
        - Lack of informed consent
        - Refusal of treatment
        - Coercive enrollment and retention
        - Deception
        - Exploitation
        - Co-opted population members
    - **Milgram**
        - Participants were deceived about the true nature of the study and subjected to severe emotional distress
    - **Stanford Prison**
        - Lack of informed consent
        - Not protected from harm (psychological distress)
        - Abuse of participants
        - Lack of appropriate debriefing
13. **In the context of psychological measurements, could you explain the difference between a conceptual and operational definition?**
    - **Conceptual:** outlines what a concept means
        - Abstract characteristics
        - Relationship to other concepts
    - **Operational:** outlines how to measure a concept
        - How you measure the concept in an experiment
        - Operationalization: turning abstract conceptual ideas into measurable observations
14. **What is the purpose/job of an IRB?**
    - Responsible for reviewing research at the institution to ensure that studies:
        - Comply with applicable regulations
        - Meet commonly accepted ethical standards
        - Follow institutional policies
        - Adequately protect research participants
15. **What components should be included when obtaining informed consent?**
    - Purpose of the research
    - Procedures that will be used
    - Risks, benefits, and compensation
    - Confidentiality
    - Assurance of voluntary participation and permission to withdraw
    - Contact information for questions
16. **What is the purpose of a debriefing?**
    - Occurs after completion of the study
    - Opportunity for the researcher to deal with issues of withholding information, deception, and harmful effects of participation
    - Explains why deception was necessary
    - Provides additional resources, if necessary
    - Ensures the participant leaves the experiment without any ill feelings toward the field of psychology
17. **Can you list, describe, and give examples of the various categories of measurement?**
    - **Nominal:** labels and categorizes; no quantitative distinctions
        - Gender, diagnosis, experimental or control group
    - **Ordinal:** categorizes observations; categories organized by size or magnitude
        - Rank in class, clothing sizes, Olympic medals
    - **Interval:** ordered categories; intervals between categories of equal size; arbitrary or absent zero point
        - Temperature, IQ, golf scores
    - **Ratio:** ordered categories; equal intervals between categories; absolute zero point
        - Number of correct answers, time to complete a task, gain in height and/or weight since last year
18. **Can you explain the differences between reliability and validity?**
    - **Reliability:** consistency of a measure; whether the results can be reproduced under the same conditions
    - **Validity:** accuracy of a measure; whether the results really do represent what they are supposed to measure
19. **What qualities of instruments might you consider when choosing a particular instrument for your experiments?**
    - Ease of use
    - Access
    - Appropriateness
    - Accuracy
    - Cost
20. **What are some sources of measurement error?**
    - **General characteristics of the individual**
        - Level of ability
        - Test-taking skills
        - Ability to understand instructions
    - **Lasting characteristics of the individual**
        - Level of ability related to the trait being measured
        - Test-taking skills specific to the type of items on the test
    - **Reliability of the measurement tool**
    - **Temporary individual factors**
        - Health, fatigue, motivation, emotional strain, test environment
    - **Factors affecting test administration**
        - Conditions of test administration
        - Interaction between examiner and test taker
        - Bias in grading
    - **Other factors**
21. **How might you increase and assess reliability?** (a sketch of assessing test-retest reliability appears at the end of this guide)
    - Increase the number of observations or items
    - Eliminate items that are unclear
    - Standardize the test-taking conditions
    - Adjust the easiness or difficulty of the test
    - Minimize the effects of external events
    - Standardize instructions
    - Maintain consistent scoring procedures
22. **What are the different types of validity?**
    - **Convergent validity:** the extent to which the new measure correlates well with measures of the same construct
    - **Discriminant validity:** the extent to which the new measure correlates poorly with measures of different, unrelated constructs
23. **What types of effects might participants have on experimental measures?**
    - **Reactivity:** change in behavior due to awareness of being measured
    - **Social desirability:** behaviors influenced by social norms or preferences
    - **Demand characteristics:** experimental measures cue participants to the experimenters' expectations
24. **What is the relationship between populations and samples?**
    - **Population:** the entire group about whom you have a research question
    - **Sample:** the portion of a population that is actually observed
25. **What is the difference between a discrete and continuous variable? Can you define the various scales of measurement and identify the scale if given a specific measure (e.g., temperature, hair color, etc.)?**
    - **Discrete variable:** one that can only take on specific, distinct values (e.g., the number of students in a class)
        - Hair color, # of ___
    - **Continuous variable:** one that can take on any value within a given range (e.g., height or temperature)
        - Weight, time, distance, temperature
26. **What is the difference between an independent and dependent variable? If given descriptions of research studies, could you identify the IVs and DVs?**
    - **Independent variable:** the variable that is **MANIPULATED** under controlled conditions by the experimenter
    - **Dependent variable:** the outcome or response variable that is measured by the experimenter
    - **Examples:**
        1. **IV:** Amount of sunlight a plant receives; **DV:** Plant growth (height)
        2. **IV:** Hours of study per day; **DV:** Exam score
        3. **IV:** Type of therapy (cognitive-behavioral vs. psychodynamic); **DV:** Reduction in anxiety symptoms
        4. **IV:** Temperature of water; **DV:** Time taken for sugar to dissolve
        5. **IV:** Number of hours of sleep; **DV:** Reaction time in a driving test
        6. **IV:** Type of diet (high-protein vs. low-carb); **DV:** Weight loss
        7. **IV:** Frequency of exercise; **DV:** Blood pressure levels
27. **What is random assignment and why do we need it?** (a random-assignment sketch appears at the end of this guide)
    - **Random assignment:** the use of chance procedures in psychology experiments to ensure that each participant has the same opportunity of being in any specific group
28. **What are the differences between experimental, nonexperimental, and correlational research?**
    - **Experimental:** trying to determine **CAUSE AND EFFECT** relationships
        - Comparative analysis in which you study two or more variables and observe a group under a certain condition or groups experiencing different conditions
    - **Nonexperimental:** relies on descriptive, observational, or correlational data
        - Not dependent on the manipulation of an independent variable
    - **Correlational:** measures two or more variables and their relationship to one another
        - Strength and direction of the relationship
        - Can be positive or negative
29. **What is the relationship between confounding variables and internal validity?**
    - To ensure the internal validity of your research, you must account for confounding variables; if you do not, your results may not reflect the actual relationship between the variables you are interested in, biasing your results
30. **What is the purpose of random assignment?**
    - Assign different participants to each of the conditions without regard to any personal characteristics
    - Reduces the chance of selection differences (systematic differences between the conditions in variables other than the independent variable)
31. **What are the various ways we can attempt to achieve experimental control?**
    - **Controlling environmental factors:** similar computers, desks, chairs, etc.
    - **Controlling procedural factors:** guided by a scripted protocol; ensures all participants receive the same thing
    - **Measure any potentially confounding variables**
        - To achieve high internal validity, confounding variables need to be controlled for
        - The degree of control over confounding variables is related to the level of confidence in the results
        - The possibility of confounding variables cannot be fully eliminated from an experiment; that is why experiments do not prove anything; instead, experiments provide support for a hypothesis
32. **Do you understand the differences between independent-groups and repeated-measures experimental designs? Advantages and disadvantages of each? Could you create a simple experiment using one of these designs?** (a repeated-measures sketch appears at the end of this guide)
    - **Independent groups:** participants are assigned to only one group
        - **Between-subjects design:** comparisons are made between different groups of participants
        - **Advantages:**
            - No order effects
            - Less time/money
            - Increased external validity
        - **Disadvantages:**
            - Individual differences among participants can sometimes lead to differences in the groups' results
    - **Repeated measures:** participants are assigned to multiple groups
        - **Within-subjects design:** comparisons are made within the same group of participants
        - **Advantages:**
            - Requires fewer participants
            - Extremely sensitive to statistical differences
        - **Disadvantages:**
            - **Order effect:** the order of presenting the treatments affects the dependent variable
            - **Practice (learning) effect:** performance improves because of the practice gained from previous tasks
            - **Fatigue effect:** performance deteriorates because the participant becomes tired, bored, or distracted by previous tasks
            - **Carryover effect:** effects of the previous treatment carry over to influence the response to the next treatment
33. **What are the major characteristics of survey research, and in what ways does survey research differ from experimental research?**
    - **Survey research:** gathers information
        - Provides a methodology for asking people about themselves
        - Helps study relationships (correlations) between/among variables
        - Important complement to experimental research findings
    - **Experimental research:** establishes cause-and-effect relationships between independent and dependent variables
34. **What are the major factors to consider when constructing survey questions and response types/options?**
    - Clear questions
    - No loaded questions
    - Try not to confuse people

**Example Multiple Choice Questions**

A recent study reported that students who had just finished playing a prosocial video game were more likely to help others than students who had just finished playing a neutral or antisocial game. For this study, the kind of game given to the students was the _____.
a. control group
b. independent variable
c. quasi-independent variable
d. dependent variable

The distinction between basic research and applied research is that basic research:
a. relies on the social sciences such as psychology or sociology, whereas applied research relies on the fundamental sciences such as chemistry or biology.
b. focuses on fundamental questions, often of a theoretical nature, whereas applied research focuses on identifying and resolving practical problems.
c. relies on the fundamental sciences such as chemistry or biology, whereas applied research relies on the social sciences such as psychology or sociology.
d. focuses on identifying and resolving practical problems, whereas applied research focuses on fundamental questions, often of a theoretical nature.

The participants in a research study self-report their sleep quality levels by choosing the response option that best characterizes their average sleep quality per night from the following response options: 1 = extremely low sleep quality, 2 = very low sleep quality, 3 = low sleep quality, 4 = extremely high sleep quality. Which measurement scale is being used to classify sleep quality?
a. interval
b. ratio
c. nominal
d. ordinal

Mike takes his temperature with a thermometer three times over a 20-minute period and observes the following measurements: 98, 106, and 89 degrees. In this context, Mike concludes that the _____ of the thermometer is _____.
a. reactivity; high
b. reliability; low
c. variability; low
d. validity; high
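**Sketch for item 21 (assessing reliability).** A minimal illustration, not from the course materials: one common way to assess test-retest reliability is to correlate scores from two administrations of the same measure. The scores below are made-up example data, and `statistics.correlation` assumes Python 3.10 or newer.

```python
from statistics import correlation  # Pearson's r; available in Python 3.10+

# Hypothetical scores for the same seven participants, measured twice
time1 = [12, 15, 9, 20, 18, 11, 14]   # first administration
time2 = [13, 14, 10, 19, 17, 12, 15]  # second administration, same people

r = correlation(time1, time2)
print(f"Test-retest reliability (Pearson r) = {r:.2f}")
# r close to +1 suggests a consistent (reliable) measure.
# The sign and size of r are also how correlational research (item 28)
# describes the direction and strength of a relationship between two variables.
```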
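**Sketch for items 27 and 30 (random assignment).** A minimal illustration under assumed, hypothetical participant IDs and condition names: a chance procedure (shuffling) decides which condition each participant receives, so pre-existing individual differences are spread across groups rather than systematically favoring one condition.

```python
import random

participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]
conditions = ["treatment", "control"]

random.shuffle(participants)  # chance ordering, ignoring any personal characteristics
assignment = {person: conditions[i % len(conditions)]  # alternate conditions for equal group sizes
              for i, person in enumerate(participants)}

for person in sorted(assignment):
    print(person, "->", assignment[person])
```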
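**Sketch for item 32 (a simple repeated-measures experiment).** A minimal, hypothetical example of a within-subjects design: every participant completes both conditions, and the order of conditions is alternated (counterbalanced) across participants, which is one standard way to offset the order, practice, and fatigue effects listed above. An independent-groups version would instead assign each person to a single condition using random assignment, as in the previous sketch.

```python
import itertools

participants = ["P01", "P02", "P03", "P04"]
orders = list(itertools.permutations(["caffeine", "decaf"]))  # the two possible condition orders

for i, person in enumerate(participants):
    order = orders[i % len(orders)]  # alternate orders across participants (counterbalancing)
    print(person, "completes:", " then ".join(order))
    # the dependent variable (e.g., reaction time) would be measured after each condition
```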
