Research Methods: Epistemology and Methodology
92 Questions

Questions and Answers

Which area of study focuses primarily on the nature and scope of knowledge itself?

  • Empiricism
  • Experiential Reality
  • Epistemology (correct)
  • Methodology

What is the primary focus of methodology in the context of research?

  • Developing techniques for data analysis
  • Discovering the relationship between variables (correct)
  • Identifying common societal beliefs and norms
  • Establishing the philosophical basis of knowledge

What type of knowledge is derived primarily from direct observation and experience?

  • Traditional Knowledge
  • Empirical Knowledge (correct)
  • Authoritative Knowledge
  • Agreement Reality

Which of the following is the best example of experiential reality?

Knowing that touching a hot stove will cause pain after having burned yourself (B)

Agreement reality can be problematic in the pursuit of knowledge because:

It can hinder critical thinking and questioning of assumptions. (B)

How does replication help guard against overgeneralization in personal inquiry?

By testing findings in different settings or with different groups (D)

Which of the following is an example of applied research?

Testing a new therapeutic intervention in a clinical setting (D)

What role does replication play in addressing selective observation?

It tests the consistency of findings across different samples or settings. (D)

Why is it impossible to definitively 'prove' a hypothesis in empirical research?

Alternative explanations for observed relationships always exist. (D)

How does deductive reasoning differ from inductive reasoning in research?

Deductive reasoning starts with a general theory and tests it with data, while inductive reasoning develops a theory from specific observations. (D)

In research, what is the critical difference between a policy question and a research question?

Policy questions address overarching issues of interest, while research questions specify how to study the issue. (D)

In the context of research, what distinguishes a construct from a variable?

A construct is an abstract idea, whereas a variable is a concrete representation that can be measured. (D)

What best describes the relationship between variables and attributes?

Attributes are characteristics that describe an object; variables are logical sets of attributes. (C)

In a study examining the effect of poverty (independent variable) on crime rates (dependent variable), what kind of relationship are we examining?

Whether the DV changes in response to values of the IV. (A)

What is the significance of 'units of analysis' in research design?

They specify what or whom is being studied. (D)

What type of error occurs when a researcher draws conclusions about individual behavior based solely on group-level data?

Ecological Fallacy (C)

What is the main characteristic of a cross-sectional study?

Data is collected at a single point in time. (D)

What is subject attrition, and why is it a problem in longitudinal studies?

It involves participants dropping out of the study, potentially biasing the results. (B)

What distinguishes retrospective studies from prospective studies?

Retrospective studies collect data about past events, while prospective studies collect data as events occur. (C)

What is the primary goal of descriptive questions in research?

To describe the characteristics of a particular group or situation (A)

What is the main objective of causal research questions?

To determine if changes in one variable cause changes in another (A)

What additional criterion must be met to establish causality beyond just observing a correlation between two variables?

There must be no plausible alternative explanations for the observed relationship. (B)

In the context of research, what does validity refer to?

The extent to which conclusions about cause and effect are true (A)

What does conclusion validity primarily assess in research?

Whether the independent variable is genuinely associated with the dependent variable (B)

What does 'power' refer to in the context of statistical testing?

It refers to the likelihood that the test will detect a real effect if one exists. (C)

What is the key difference between 'a priori' and 'post hoc' power analyses?

A priori analyses determine power before data collection, while post hoc analyses determine power after data collection. (B)

What does internal validity primarily ensure in a research study?

The observed effects are genuinely due to the independent variable. (B)

What is the primary concern of external validity in research?

Ensuring the findings can be generalized to other populations and settings (A)

What is the focus of construct validity in research?

Whether the test measures the construct you are trying to measure. (D)

How do conceptions relate to concepts in the research process?

Conceptions are mental images; concepts are words/symbols to represent those images (C)

In the context of conceptualization, what is a 'dimension'?

A specifiable aspect or facet of a concept (B)

What is the role of 'indicators' in research?

To provide evidence of the presence or absence of a dimension of a concept (B)

What is the purpose of a conceptual definition in research?

To assign a working definition to a construct so researchers can agree on it (B)

In research, what does 'operationalization' involve?

Specifying the procedures for measuring constructs. (D)

Why is it essential for attributes of a variable to be exhaustive?

To ensure that each observation can be classified into an attribute. (D)

What problem arises if the attributes of a variable are not mutually exclusive?

Observations may be incorrectly classified, inflating results. (A)

Which of the following is an example of a nominal measure?

Types of crime (e.g., assault, robbery) (B)

What is a key characteristic of ordinal measures?

They rank attributes along a continuum, but the intervals may not be equal. (B)

What differentiates interval measures from ordinal measures?

The equal distance between values (C)

Which type of measurement includes a true zero point?

Ratio (D)

How do systematic and random errors affect the observed score in measurement?

Systematic errors are predictable biases, while random errors are chance variations. (A)

What does reliability, as a measurement quality, primarily indicate?

The extent to which the tests are free of random error (B)

Test-retest reliability is used to assess:

Stability (A)

What does inter-rater reliability assess?

The degree of agreement between different observers or scorers (C)

Which of the options listed would increase reliability?

Ensure there are longer tests (C)

What does measurement validity indicate?

The extent to which the measurement reflects the construct (A)

What is face validity?

Common agreement or mental images about a construct (B)

What is the goal of the Pretest/Posttest methodology?

To show the change in the DV after exposure to the IV (D)

In the context of research, applying logical arguments helps to specifically guard against which potential error in personal inquiry?

Illogical reasoning (A)

Which type of research primarily aims to address immediate practical problems rather than expanding theoretical knowledge?

Applied Research (C)

How would you describe a hypothesis?

A testable statement that proposes a relationship between variables (A)

Why is it impossible to definitively prove a hypothesis, even with strong empirical support?

Alternative explanations for the observed relationships may always exist (B)

What is the primary characteristic of deductive reasoning?

Testing a general theory through specific data (A)

How does inductive reasoning contribute to the development of theories?

By starting with specific observations and developing a general explanation (C)

How does a research question differ from a policy question?

A research question is specific and testable, while a policy question is broad and relates to an issue of public concern. (A)

What is the distinction between a construct and a variable in research?

A construct is abstract, while a variable is its measurable representation. (B)

What is the relationship between variables and attributes in research?

Variables are characteristics, attributes are categories. (C)

In a study examining the impact of education levels (independent variable) on income (dependent variable), what type of relationship is being explored?

Causal (D)

Why are 'units of analysis' important in research design?

They define what or whom the study is about, shaping data collection and interpretation (D)

What type of error occurs when researchers make inferences about group behavior based solely on individual-level data?

Exception fallacy (A)

What is the primary characteristic of a cross-sectional study design?

Data collected at a single point in time (B)

Subject attrition is a common problem in longitudinal studies. What does it refer to, and why is it an issue?

Participants dropping out of the study over time; it can introduce bias and affect the study's validity (C)

How do retrospective studies differ from prospective studies?

Retrospective studies look backward in time, while prospective studies look forward. (B)

Beyond correlation, what additional factor is essential to establish a causal relationship between two variables?

Temporal order (cause precedes effect) (D)

In research, what does validity refer to?

The extent to which a measure accurately reflects the concept it is intended to measure (C)

In the context of statistical testing, what does 'power' refer to?

The ability of a test to detect a true effect (C)

What is the primary focus of internal validity in a research study?

Ensuring that there is a causal relationship between the independent and dependent variables (C)

In the process of conceptualization, what does a 'dimension' refer to?

A specifiable aspect or facet of a concept (B)

What is the primary purpose of a conceptual definition in research?

To establish a common understanding of a concept under study (B)

What does 'operationalization' involve in the context of research?

Specifying the procedures for measuring a concept (B)

Which of the options listed would most likely increase measurement reliability?

Using clear and standardized instructions (C)

What does face validity assess?

Whether a measure appears, on the face of it, to measure the construct of interest (C)

How does 'agreement reality' potentially impede the pursuit of knowledge?

It can limit critical thinking by uncritically accepting cultural norms and expertise. (A)

How does specifying observations help mitigate the risk of selective observation in personal inquiry?

By establishing clear criteria for what data is relevant, reducing the impact of bias (A)

In the context of research, what is the relationship between exploration, description, and explanation?

Exploration is used to generate data which leads to description, which then can lead to explanation. (C)

How does the statement 'Increased levels of education lead to decreased crime rates' function as a hypothesis?

As it delineates a testable relationship between variables. (B)

What is the critical element that differentiates quantitative from qualitative data?

Whether the data is numerical or descriptive in nature. (D)

How would a researcher use policy questions to formulate research questions?

Research questions break down policy issues into specific, researchable inquiries. (D)

What role do variables play in relation to constructs in research?

Constructs are abstract while variables are concrete representations that can be observed or measured. (B)

What is the difference between an independent variable and a dependent variable? (Select all that apply)

The dependent variable is the outcome that is measured to see if it is affected by the independent variable. (B), The independent variable is manipulated by the researcher. (D)

In a study examining the effects of community policing initiatives on reducing crime rates, what would be the unit of analysis?

The geographical areas (e.g., neighborhoods, cities) where the initiatives are implemented. (A)

What is the 'ecological fallacy,' and why is it problematic in research?

Making individual-level inferences from group data, which can lead to inaccurate conclusions about individuals. (D)

How do retrospective and prospective studies differ in their approach to investigating potential causal relationships?

Retrospective studies begin with the outcome and look backward for causes, while prospective studies start with potential causes and track outcomes forward in time. (A)

Why is temporal precedence a necessary criterion for establishing causality?

To confirm that the cause occurs before the effect, rather than vice versa. (D)

A researcher finds a strong correlation between ice cream sales and crime rates. However, temperature is a confounding variable, as both ice cream sales and crime rates increase in warmer weather. What is this an example of?

A spurious relationship where the correlation is not causal. (B)

In the context of research, how does a larger sample size typically affect the power of a statistical test?

It increases power, making it more likely to detect a real effect. (B)

Which of the following scenarios illustrates a threat to internal validity due to 'maturation'?

A training program appears effective, but participants naturally improve their skills over time regardless of the program. (C)

What purpose do indicators serve in the conceptualization process?

Indicators specify the presence of a dimension. (C)

Why is it important for the attributes of a variable to be mutually exclusive?

To ensure that each observation can be classified into only one attribute. (B)

Which of the following best illustrates an example of an ordinal measure?

Ranking of satisfaction levels (e.g., very dissatisfied, dissatisfied, neutral, satisfied, very satisfied). (D)

How do systematic errors affect the observed score in measurement, and why are they a concern?

Systematic errors introduce consistent bias, distorting the observed score in a particular direction and affecting accuracy. (B)

How does increasing the length of a test typically impact its reliability, and why?

Longer tests generally increase reliability by providing a larger sample of behavior, thereby reducing the impact of any single item. (D)

Flashcards

Epistemology

The study of knowledge itself.

Methodology

The science of discovering and understanding.

Empirical

Knowledge based on observation and experience.

Experiential Reality

Knowledge gained through direct, personal experience.

Agreement Reality

Knowledge accepted as true due to cultural or social consensus.

Problem with Agreement Reality?

Tradition/authority can hinder critical thinking.

Errors in Personal Inquiry

Inaccurate observation, overgeneralization, selective observation, illogical reasoning, ideology/politics.

Purposes of Research

Basic (exploration, description, explanation) and Applied (evaluation, policy analysis).

Theory

A systematic explanation of relationships among phenomena.

Hypothesis

A testable statement predicting associations between variables.

Why Hypotheses Can't be Proven

Alternative explanations always exist.

Deductive Reasoning

Reasoning from general principles to specific instances.

Inductive Reasoning

Reasoning from specific observations to general theories.

Qualitative vs. Quantitative

Qualitative: descriptive data. Quantitative: numerical data.

Policy vs. Research Questions

Policy: broad issue; Research: specific study.

Constructs

Abstract ideas forming the foundation of research and theory.

Variables

Concrete, observable representations of constructs.

Attributes

Characteristics that describe something; categories of a variable.

Independent vs. Dependent Variable

IV: Predictor, treatment. DV: Outcome, changes in response to IV.

Units of Analysis

The entities being studied (individuals, groups, organizations, etc.).

Ecological Fallacy

Drawing conclusions about individuals from group-level data.

Exception Fallacy

Drawing conclusions about groups based on a few individuals.

Cross-Sectional Studies

Examining a phenomenon at one point in time.

Longitudinal Study

Examining a phenomenon over an extended period.

Types of Longitudinal Studies

Trend, cohort, and panel studies.

Retrospective vs. Prospective

Retrospective: Looks back for causes. Prospective: Looks forward for outcomes.

Descriptive Questions

Describe a situation.

Causal Questions

How change in one variable affects another.

Correlational vs. Causal Relationships

Correlational: synchronized behavior. Causal: one variable causes change in another.

Criteria for Causality

Correlation, temporal precedence, and no alternative explanations.

Validity

The truthfulness of cause-and-effect statements.

Conclusion Validity

Whether the IV is really correlated with the DV.

Power

Likelihood of a test finding a true effect in your data.

A Priori vs. Post Hoc

A priori: Determined before data collection. Post Hoc: Determined after data collection.

Internal Validity

Ability to draw cause-and-effect inferences from a study.

External Validity

Generalizing findings to the real world.

Construct Validity

Extent to which variables measure their intended constructs.

Conceptions vs. Concepts

Mental images vs. words and symbols.

Dimensions

A specifiable aspect of a concept.

Indicators

Indicate presence/absence of a dimension.

Conceptual Definition

Working definition assigned to a construct.

Operationalization

Procedure for measuring constructs by specifying operations.

Operational Definition

What we will observe, how we will do it, and what interpretations will be made.

Attribute Classification

Attributes must be exhaustive and must not overlap, since overlapping categories inflate results.

Nominal Measures

Qualities that do not represent anything except difference among groups.

Ordinal Measures

More or less of a variable; gaps between categories are not uniform (e.g., aggravated assault to homicide).

Interval Measures

Equal distances between values (e.g., age at first violent offense).

Ratio Measures

Interval with a true zero.

Observed Score

True score + systematic error + random error.

Reliability

A measurement technique applied repeatedly yields the same value.

Study Notes

  • These notes cover key concepts in research methods, focusing on epistemology, methodology, research design, validity, and more.

Epistemology and Methodology

  • Epistemology is the study of knowledge itself.
  • Methodology is the science of finding out or the procedures for scientific investigation.
  • Empirical knowledge is derived from experience or observation.

Reality and Knowledge

  • Experiential reality refers to knowledge gained through direct experiences.
  • Agreement reality encompasses things accepted as knowledge due to cultural consensus.
  • Tradition and authority can be problematic aspects of agreement reality, necessitating critical thinking.

Potential Errors in Personal Inquiry

  • Inaccurate observation can be mitigated by specifying observations.
  • Overgeneralization can be guarded against by using samples and replication.
  • Selective observation can be avoided by specifying observations in advance.
  • Illogical reasoning can be addressed through logical arguments.
  • Ideology or politics can be avoided by maintaining neutrality and eliminating bias.

Purposes of Research

  • Basic research includes exploration, description, and explanation.
  • Applied research includes evaluation and policy analysis.

Theory and Hypothesis

  • A theory is a systematic explanation of relationships among phenomena.
  • A hypothesis is a testable statement outlining associations among variables.
  • Hypotheses cannot be proven due to the existence of alternative explanations.

Reasoning

  • Deductive reasoning starts with a theory and tests it against data (general to specific).
  • Inductive reasoning starts with data and develops a theory (specific to general).

Data

  • Qualitative data is descriptive information.
  • Quantitative data is numerical information.

Questions

  • Policy questions are broad issues of interest.
  • Research questions are specific ways to study a policy issue.

Constructs, Variables, and Attributes

  • Constructs are abstract ideas forming the basis of research and theory.
  • Variables are concrete, observable representations of constructs.
  • Attributes are characteristics describing something and composing a variable.

Variables

  • Independent variables (IV) predict or define treatment conditions.
  • Dependent variables (DV) change in response to the IV and are outcome variables.

Units of Analysis

  • Units of analysis are the what or whom being studied.
  • The ecological fallacy involves drawing individual-level conclusions from group-level data.
  • The exception fallacy draws conclusions about groups based on a small number of people.

Types of Studies

  • Cross-sectional studies examine a phenomenon at a single point in time.
  • Longitudinal studies examine a phenomenon over an extended period, facing subject attrition.
    • Trend studies, cohort studies, and panel studies are types of longitudinal studies.
  • Retrospective studies collect data about past outcomes and look back for causes.
  • Prospective studies begin with potential causes and collect data about future outcomes.

Questions

  • Descriptive questions describe a situation. Example: What percentage of violent offenders in Vancouver have a history of drug addiction?
  • Causal questions explore how changes in one variable affect another. Example: Does chronic drug addiction increase the likelihood of committing violent crimes in Vancouver?

Relationships

  • Correlational relationships involve synchronized movement between two variables.
  • Causal relationships involve one variable causing a change in another.

Criteria for Causality

  • Variables must be correlated.
  • Temporal precedence: the IV must occur before the DV.
  • No plausible alternative explanations.

Validity

  • Validity refers to the truthfulness of statements about cause and effect.

Types of Validity

  • Conclusion validity assesses whether the relationships in the data are reasonable and whether the IV is genuinely related to the DV.
    • Lack of conclusion validity leads to bias; sample size affects power.
  • Power is the likelihood a test will detect a true effect. A bigger sample size means more power.
    • A priori power is determined before data collection.
    • Post hoc power is determined after data collection.
  • Internal validity is the extent to which cause-and-effect inferences can be drawn from a study.
  • External validity is the extent to which findings can be generalized to real-world settings.
  • Construct validity is the extent to which variables measure their intended constructs.
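The link between sample size and power can be illustrated with a short simulation. This is a minimal sketch, not part of the course material: it assumes two groups of equal size, normally distributed scores with a standardized effect of 0.5, and a simple two-tailed z-test at alpha = .05 (the function name `estimate_power` is hypothetical).

```python
import random
import statistics

def estimate_power(effect=0.5, n=50, alpha=0.05, trials=2000, seed=1):
    """Monte Carlo estimate of power for a two-group mean comparison.

    Draws a control sample (mean 0) and a treatment sample (mean `effect`),
    both with SD 1, and counts how often a two-sample z-test rejects the null.
    """
    random.seed(seed)
    z_crit = 1.96  # two-tailed critical value for alpha = 0.05
    hits = 0
    for _ in range(trials):
        a = [random.gauss(0.0, 1.0) for _ in range(n)]
        b = [random.gauss(effect, 1.0) for _ in range(n)]
        se = ((statistics.pvariance(a) + statistics.pvariance(b)) / n) ** 0.5
        z = (statistics.mean(b) - statistics.mean(a)) / se
        if abs(z) > z_crit:
            hits += 1
    return hits / trials

# Larger samples yield more power for the same effect size.
print(estimate_power(n=30))   # roughly 0.48
print(estimate_power(n=100))  # roughly 0.94
```

Run before data collection (an a priori analysis), this kind of simulation shows how many participants are needed to have a good chance of detecting the expected effect.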

Conceptions, Concepts, Dimensions, and Indicators

  • Conceptions are mental images.
  • Concepts are words and symbols representing these images.
  • Dimensions are specifiable aspects of a concept.
  • Indicators signify the presence or absence of a dimension.

Definitions

  • Conceptual definitions assign working definitions to constructs for research agreement.
  • Operationalization specifies procedures for measuring constructs.
  • Operational definitions detail what will be observed, how, and what interpretations will be made.

Attributes

  • Attributes need to be exhaustive to classify every observation.
  • Attributes need to be mutually exclusive to avoid inflating results.
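Both requirements can be checked mechanically. The sketch below (the `classify` helper and the age-group scheme are hypothetical, not from the course) rejects an observation that fits no category (not exhaustive) or more than one (not mutually exclusive):

```python
def classify(value, attributes):
    """Classify an observation into exactly one attribute category.

    `attributes` maps an attribute label to a predicate. Raises if the
    scheme is not exhaustive (no match) or not mutually exclusive
    (more than one match) for this observation.
    """
    matches = [label for label, pred in attributes.items() if pred(value)]
    if not matches:
        raise ValueError(f"not exhaustive: no category for {value!r}")
    if len(matches) > 1:
        raise ValueError(f"not mutually exclusive: {value!r} fits {matches}")
    return matches[0]

# Hypothetical age-group scheme: exhaustive and non-overlapping.
age_groups = {
    "youth":  lambda a: a < 18,
    "adult":  lambda a: 18 <= a < 65,
    "senior": lambda a: a >= 65,
}
print(classify(17, age_groups))  # youth
print(classify(70, age_groups))  # senior
```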

Levels of Measurement

  • Nominal measures are qualitative, representing differences among groups (e.g., crime types).
  • Ordinal measures rank-order attributes on a continuum with non-uniform gaps (e.g., severity of assault).
  • Interval measures have meaningful distances between attributes (e.g., age).
  • Ratio measures are interval measures with a true zero point (e.g., number of offenses).
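The four levels form a cumulative hierarchy: each level supports every comparison of the levels below it plus one more. A small sketch (the names `LEVELS`, `OPS`, and `permissible_ops` are illustrative, not standard terminology):

```python
# Each level of measurement adds one comparison on top of the levels below it.
LEVELS = ["nominal", "ordinal", "interval", "ratio"]
OPS = {
    "nominal":  {"equality"},     # same/different category
    "ordinal":  {"ordering"},     # more/less
    "interval": {"differences"},  # meaningful distances
    "ratio":    {"ratios"},       # true zero allows "twice as much"
}

def permissible_ops(level):
    """Return every comparison that is meaningful at a given level."""
    ops = set()
    for lvl in LEVELS[: LEVELS.index(level) + 1]:
        ops |= OPS[lvl]
    return ops

print(sorted(permissible_ops("ordinal")))  # ['equality', 'ordering']
print(sorted(permissible_ops("ratio")))    # all four comparisons
```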

Observed Scores

  • An observed score consists of a true score + systematic error + random error.
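The formula can be made concrete with a simulation, a sketch assuming a constant bias of +2 for the systematic error and zero-mean Gaussian noise for the random error (all parameter values are illustrative):

```python
import random

def observed_score(true_score, bias=2.0, noise_sd=1.0):
    """Observed = true score + systematic error (constant bias)
    + random error (zero-mean noise)."""
    return true_score + bias + random.gauss(0.0, noise_sd)

random.seed(0)
scores = [observed_score(100.0) for _ in range(10000)]
mean = sum(scores) / len(scores)
# Random error averages out over many measurements;
# systematic error (the bias) does not.
print(round(mean, 1))  # close to 102.0, not 100.0
```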

Reliability

  • Reliability indicates whether a measurement technique yields consistent results when applied repeatedly.
  • Reliability is freedom from random error but does not ensure accuracy.
  • Test-retest reliability measures consistency by using the same test on two occasions.
  • Inter-rater reliability assesses how similarly two different scorers would score a test.
  • Longer tests, large variation on measured factor, clear instructions and no distractions increase reliability.

Measurement Validity

  • Measurement validity is the extent to which a measure reflects the construct.
  • Face validity is agreement on mental images about a construct.
  • Criterion-related validity compares a measure with an external criterion.
    • Convergent validity checks if a measure predicts scores on another accepted criterion.
    • Divergent validity checks if a measure does not predict scores on another accepted criterion that it shouldn't be related to.
  • Content validity measures the range of meanings in a construct.

Types of Measurement

  • Multiple measures compare your measure with other measures of the same construct.

Experimental Design Components

  • Pretest measures the DV before the IV to establish a baseline.
  • Posttest measures the DV after exposure to the IV to see the change.
  • An experimental group is exposed to the IV.
  • A control group is not exposed to the IV.

Bias

  • Research subject bias arises from participant characteristics leading to skewed study outcomes.
  • The Hawthorne effect is a change in behavior due to awareness of being studied.
  • A double-blind experiment prevents both experimenters and participants from knowing treatment assignments.

Random Assignment

  • Random assignment assigns participants to experimental and control conditions by chance to minimize differences between groups.
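The procedure amounts to shuffling the participant list and splitting it. A minimal sketch (the function name `random_assignment` is hypothetical):

```python
import random

def random_assignment(participants, seed=None):
    """Shuffle participants and split them evenly into two conditions."""
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)  # chance alone determines group membership
    half = len(pool) // 2
    return {"experimental": pool[:half], "control": pool[half:]}

groups = random_assignment(range(1, 21), seed=42)
print(len(groups["experimental"]), len(groups["control"]))  # 10 10
```

With large enough samples, chance assignment makes the groups comparable on both measured and unmeasured characteristics.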

Experimental Design Essentials

  • The independent variable is manipulated; the dependent variable is measured.
  • Experimental and control groups are used for comparison.
  • Random assignment of participants is of the utmost importance.
  • Pretests and posttests measure changes.
  • Control of extraneous variables ensures accurate results.
  • Replication verifies findings.

Randomized Controlled Trials

  • Randomized controlled trials (RCTs) randomly assign individuals to experimental or control groups.

Causality Considerations

  • Variables must be correlated.
  • The IV must precede the DV.
  • Changes in the DV should result from changes in the IV, not a third variable (random assignment helps).

Bias

  • Omitted variable bias occurs when a variable affects the DV and is correlated with the IV.

Threats to Internal Validity

  • History: External events affect the experimental group.
  • Maturation: Natural changes over time affect participants.
  • Testing: The pretest affects posttest results.
  • Instrumentation: Changes in measurement instruments over time.
  • Statistical regression: Extreme scores regress toward the mean upon retest.
  • Attrition: Reduction in participant numbers from pretest to posttest.
  • Causal time order: The IV must precede the DV temporally.
  • Selection bias: Nonequivalent groups are compared (e.g., through "creaming," selecting the most promising participants), producing DV differences not caused by the IV.
  • Diffusion of treatment: Communication between groups contaminates results.
  • Compensatory equalization: The control group is deprived of something valuable, creating pressure to offer compensation.
  • Compensatory rivalry: The control group works harder to compensate for being deprived of the stimulus.
  • Demoralization: The control group gives up due to deprivation.
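Statistical regression from the list above can be demonstrated with a toy simulation: select an extreme group on one noisy test, and its retest mean falls back toward the population mean even though nothing changed. A minimal sketch (all values are invented for illustration):

```python
import random

rng = random.Random(0)

# Each person has a fixed true ability; every test score adds noise.
ability = [rng.gauss(100, 10) for _ in range(1000)]
test1 = [a + rng.gauss(0, 10) for a in ability]
test2 = [a + rng.gauss(0, 10) for a in ability]

# Select the "extreme" group: the top 100 scorers on the first test.
cutoff = sorted(test1)[-100]
extreme = [i for i in range(1000) if test1[i] >= cutoff]

m1 = sum(test1[i] for i in extreme) / len(extreme)
m2 = sum(test2[i] for i in extreme) / len(extreme)
print(round(m1, 1), round(m2, 1))  # the retest mean regresses toward 100
```

The drop happens because extreme first-test scores are partly lucky noise, and luck does not repeat, so an uncontrolled pretest-posttest study of an extreme group can mistake this artifact for a treatment effect.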

Minimizing Threats to Validity

  • Argue the threat is not reasonable.
  • Demonstrate the threat is not a problem.
  • Add a control, use random assignment, keep groups separate.
  • Conduct tests to rule out alternative explanations.
  • Implement increased controls or structure.

Internal and External Validity Conflict

  • Greater control for internal validity may reduce the natural setting and external validity.

Threats to Statistical Conclusion Validity

  • Low statistical power, such as samples that are too small.
  • Weakly defined or implemented independent variable.
  • Unreliable implementation of a treatment.
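The low-power threat can be illustrated with a small simulation: even when a real effect exists, small groups usually fail to detect it. A hedged sketch assuming a true effect of 0.5 SD and a simple z-test on group means (all names here are illustrative):

```python
import random
import statistics

def detects_effect(n, effect=0.5, z_crit=1.96, rng=random):
    """Simulate one two-group experiment and test whether a simple
    z-test on the difference in group means reaches significance."""
    control = [rng.gauss(0, 1) for _ in range(n)]
    treated = [rng.gauss(effect, 1) for _ in range(n)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.pvariance(control) / n +
          statistics.pvariance(treated) / n) ** 0.5
    return abs(diff) / se > z_crit

def power(n, trials=2000, seed=42):
    """Estimate power: the fraction of simulated experiments that
    detect the (real) effect at a given sample size per group."""
    rng = random.Random(seed)
    return sum(detects_effect(n, rng=rng) for _ in range(trials)) / trials

print(power(10))   # small groups: the real effect is usually missed
print(power(100))  # larger groups: the effect is detected far more often
```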

Experiment Notation

  • R = random assignment
  • O = observations
  • X = the treatment (key IV)
  • [ ] = no treatment (control group)
  • X1 = a 2nd treatment
  • T = time

Designs

  • Randomized 2-group
  • Before-After 2-Group Design
  • Solomon 4-group design
  • Latin square design
  • Factorial design
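Using the notation above, the first three designs can be diagrammed as follows (a minimal sketch in the standard style, where each row is one group read left to right over time):

```
Randomized 2-group (posttest-only):
  R      X    O
  R     [ ]   O

Before-After 2-Group:
  R  O   X    O
  R  O  [ ]   O

Solomon 4-group (combines both, so testing effects can be isolated):
  R  O   X    O
  R  O  [ ]   O
  R      X    O
  R     [ ]   O
```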

Canadian Uniform Crime Reports (UCR)

  • The Canadian Uniform Crime Reports (UCR) gather crime information from police and are the main source of official crime statistics in Canada.
  • Its repeated cross-sectional design enables easy year-to-year comparison.
  • Data is collected directly from police, detailing over 100 crime types.
  • Data includes offense type, location, victim/accused characteristics, and case status.
  • Crimes use standardized codes.
  • The "most serious offense" rule records only the most severe crime in an incident.
  • UCR1 reports basic incident numbers.
  • UCR2 provides detailed incident reports.
  • The UCR only covers reported crimes, potentially underrepresenting total criminal activity.

Sherman 1984 Study

  • The Sherman 1984 study aimed to test police responses to misdemeanor domestic violence.
  • The study was conducted in Minneapolis, Minnesota.
  • It employed a randomized controlled trial (RCT) comparing three police responses:
    • Arresting the suspect.
    • Separating the suspect and victim.
    • Mediation without arrest.
  • Arrest was the most effective deterrent.

Limits of Sherman Study

  • Sherman study's Minneapolis location limits external validity due to unique social factors.
  • Uncontrolled factors like offender history limit internal validity.
  • The study's focus on recidivism omits broader impacts like emotional well-being, undermining construct validity.
  • Random assignment raised ethical issues, as some assigned responses may have put victims at risk.

Grounded Theory

  • Grounded theory is an inductive approach where theories emerge from field observations.

Intersubjective Agreement

  • Intersubjective agreement is when different researchers reach the same conclusion.

Paradigm

  • A paradigm is a fundamental perspective shaping our worldview and research approach.
