RD #5: PART 2

Questions and Answers

What does a baseline measure in single-subject research designs primarily indicate?

  • The potential for replication across different environments.
  • The amount of variability present in the control condition. (correct)
  • The effectiveness of post-hoc analyses in identifying significant differences.
  • The degree of statistical significance achieved during the intervention.

In single-subject design analysis, what is the key reason for comparing the slopes of the baseline and treatment phases?

  • To assess the degree of variability within each phase.
  • To assess the impact of the treatment on the behavior of interest. (correct)
  • To determine the statistical power of the study.
  • To evaluate the social validity of the treatment.

When using the two standard deviation band method in single-subject design analysis, what is the purpose of extending the baseline SD band into the treatment phase?

  • To create a visual representation of the statistical significance of the treatment.
  • To examine how many data points during treatment fall outside the expected range based on baseline variability. (correct)
  • To determine the percentage of non-overlapping data points between the baseline and treatment phases.
  • To calculate the mean of the treatment data more accurately.

In the context of single-subject research, what does social validation primarily aim to establish?

  • The practical importance and acceptability of the treatment and its outcomes. (correct)

What is the primary focus of correlational analysis?

  • Examining the linear relationship between two variables. (correct)

What does covariance indicate in the context of correlation analysis?

  • The extent to which two variables change in similar patterns. (correct)

In correlational studies, what is the purpose of partial correlations?

  • To control for the effects of extraneous variables on the correlation between two variables. (correct)

Why is adequate variability in both scores necessary when calculating a correlation?

  • To provide a sufficient range of data points for detecting a relationship. (correct)

What does homoscedasticity refer to in the context of correlation assumptions?

  • The homogeneity of variance across all levels of the variables. (correct)

In interpreting correlations, which aspect does 'variance shared' relate to?

  • The practical or clinical significance of the correlation. (correct)

What is represented by a scattergram in correlational analysis?

  • A plot showing the relationship between two variables. (correct)

In the context of interpreting relationships, what does the 'line of best fit' represent?

  • The unique line that best describes the trend in a data set, minimizing the distance from each point. (correct)

How is the direction of a relationship determined from a scatterplot?

  • By whether the slope goes up to the right (positive) or down to the right (negative). (correct)

What does the visual proximity of data points to the line of best fit indicate?

  • The strength of the relationship. (correct)

How is the 'strength' of a relationship expressed using correlation coefficients?

  • As a numerical value between -1.00 and +1.00. (correct)

What does the coefficient of determination ($r^2$) represent in correlational analysis?

  • The amount of variance in one variable explained by the other. (correct)

In statistical testing for correlations, what is the role of the null hypothesis ($H_0$)?

  • To assume there is no relationship between variables. (correct)

What do confidence intervals around a correlation coefficient provide?

  • The range in which the 'true' correlation likely lies. (correct)

What is the primary characteristic of a Pearson product-moment correlation?

  • It is a parametric test that assesses the linear relationship between two continuous variables. (correct)

In the context of correlation, what is a key difference between parametric and non-parametric tests?

  • Parametric tests make assumptions about the distribution of the data, while non-parametric tests do not. (correct)

Why can't correlation be interpreted as causation?

  • Because there might be an unobserved intermediary or confounding variable. (correct)

Under what circumstances might a correlation imply causality?

  • When there is a plausible biological explanation, a logical time sequence, and consistency of findings across studies. (correct)

What is a critical consideration when interpreting correlations to avoid overgeneralization?

  • Generalizing only within the tested range of variables. (correct)

In the context of regression analysis, what is the primary assumption regarding the relationship between the independent variable (IV) and the dependent variable (DV)?

  • The relationship is linear. (correct)

What is the main goal of regression analysis?

  • To predict the value of the DV from the IV. (correct)

In regression analysis, what is the 'predictor variable' another name for?

  • Independent Variable (correct)

What characterizes bivariate regression?

  • The use of a single independent variable to predict a single dependent variable. (correct)

In bivariate regression, what does the term 'residuals' refer to?

  • The difference between the actual (observed) and predicted values of the dependent variable. (correct)

In the equation for a regression line, Ŷ = a + bX, what does 'a' represent?

  • The predicted value of Y when X is zero (Y intercept). (correct)

What effect do outliers typically have on correlations and regression lines?

  • Outliers can have a dramatic effect on correlations and regression lines. (correct)

What does the Coefficient of Determination indicate?

  • The proportion of the total variation in the dependent measure explained by the relationship with the independent variable. (correct)

In regression analysis, what kind of test is used on beta weights?

  • A test of statistical significance. (correct)

What does a Regression ANOVA tell you?

  • Whether the observed relation occurred by chance. (correct)

What is a reliability analysis?

  • Extensions to the Pearson Product Moment Correlation. (correct)

If you want to test the Intraclass Correlation Coefficient, what must be true of the variables?

  • Two or more continuous variables must be used. (correct)

Which is true of the Kappa score?

  • It involves nominal variables with 2 or more categories. (correct)

What is the Multiple Regression Analysis tool used for?

  • Multiple independent variables predicting a single dependent variable. (correct)

What is multicollinearity?

  • IVs are highly correlated with each other. (correct)

In multiple regression analysis, if the assumption of a linear relationship between IVs (predictors) and DV is not met, what should be done?

  • The data must be transformed. (correct)

One of the assumptions of multiple regression is homoscedasticity, and outliers in one of the variables can violate it. What should be done?

  • The outliers may produce heteroscedasticity; transform the data. (correct)

What is a goal of multiple linear regression?

  • The best possible prediction with the fewest variables. (correct)

In single-subject design analysis, what is the primary reason for computing a regression line for each participant's baseline and treatment phases?

  • To compare the trends and slopes between the baseline and treatment conditions. (correct)

When applying the two standard deviation band method in single-subject research, what does it indicate if a substantial number of data points during the treatment phase fall outside the baseline's standard deviation band?

  • The treatment has a statistically significant effect compared to the baseline. (correct)

What is the key consideration when evaluating the acceptability of a treatment procedure through social validation in single-subject designs?

  • Whether the treatment is preferred by the patient, and is safe, cost-effective, and simple. (correct)

In correlational research, how does the concept of covariance relate to the variables being studied?

  • It indicates the degree to which two variables vary together in a similar pattern. (correct)

What problem does calculating partial correlations address in correlational studies?

  • Removing the effect of a third variable on the relationship between two variables. (correct)

Why is it important to ensure that there are no floor or ceiling effects when conducting correlational analyses?

  • They reduce the variability in the scores, leading to potentially low correlations. (correct)

How does the 'line of best fit' on a scatterplot aid in interpreting relationships between two variables?

  • It visually represents the strength and direction of the relationship. (correct)

If a correlation coefficient ($r$) between two variables is found to be 0.75, what is the coefficient of determination ($r^2$) and how should it be interpreted?

  • $r^2 = 0.5625$; about 56% of the variance in one variable is predictable from the other. (correct)

In a bivariate regression analysis with the equation Ŷ = 2 + 0.5X, what does the value '2' represent?

  • The Y intercept: the predicted value of Y when X is zero, where the regression line crosses the Y axis. (correct)

In multiple regression, what does a simultaneous approach to adding independent variables involve?

  • All independent variables are entered into the regression equation at the same time. (correct)

Flashcards

Statistical Significance

Determines if results are likely not due to chance.

Quasi-experimental

Involves manipulation but lacks random assignment.

Multivariate Designs

Tests multiple dependent variables simultaneously.

Paired T-test

Compares means of two related samples.

ANOVA

Analysis of variance; compares means across groups.

Post-hocs

Follow-up tests to ANOVA, pinpointing specific group differences.

Baseline

The initial control condition, showing existing variability.

Replication

Repeating a study to confirm findings.

Baseline Phase

The first phase in a single-subject design.

Intervention Phase

The phase where the intervention is implemented.

Non-overlapping Data

Percentage of data points that do not fall within the overlapping ranges of the phases.

Social Validation

Establishing the importance of the treatment effect.

Correlations

Measures linear association between two variables.

Partial Correlations

Effects of a third variable held constant.

Homoscedasticity

Variables show equal variability across their ranges (homogeneity of variance).

Scattergram

Visually depicts the relationship between two variables.

Positive Correlation

Variables move in the same direction.

Negative Correlation

Variables move in opposite directions.

Strength of Relationship

Tells how close dots are to the line of best fit.

Variance Shared

How much variance is accounted for.

Significance of Relationship

Statistical testing of strength.

Confidence Intervals

Range in which 'true' score likely lies.

Parametric Correlation

Interval and ratio data assumed.

Correlation coefficient

A statistic representing the relationship between two or more variables.

Correlation vs. Causation

States correlation does not imply causation.

Correlation Interpretation

Generalize scores only within the tested range.

Linear Regression

Assumes linear relationship between IV and DV

Bivariate Regression

Uses single predictor to predict single outcome

Line of Best Fit

The line that minimizes the sum of squared errors

Residuals

Actual minus the predicted value

Sample Regression

A sample regression line is an approximation of the population regression line.

Outliers

Potential to dramatically impact regression

Accuracy

Measured by the coefficient of determination (r²).

Multiple Regression

Regression with multiple predictors

Reliability Analysis

Assesses the consistency of measurements using extensions of the Pearson correlation (e.g., ICC, kappa).

Canonical Correlation

Predicts multiple outcome variables (DVs) from a set of predictors.

Discriminant Analysis

A multivariate statistical analysis used to predict group membership (a single categorical DV).

Factor Analysis

Looks for factors (groups of correlated variables) that predict the DV.

Study Notes

  • Statistical significance determines whether results are likely not due to chance
  • Quasi-experimental designs involve manipulation but lack random assignment
  • Multivariate designs test multiple dependent variables simultaneously
  • Paired t-tests compare the means of two related samples
  • ANOVA (one-way vs. two-way) compares means across groups
  • Post-hoc tests follow ANOVA to pinpoint specific group differences
  • Baselines indicate the amount of variability in the control condition
  • Replication increases power and validity
  • Replication may occur across different behaviors, environments, and over time

Single-Subject Design Analysis

  • Single-subject design analysis fits separate lines for the baseline and treatment phases, then compares their slopes
  • Alternatively, a regression line can be computed for each participant's baseline and treatment phases
  • Single-subject design analysis compares level, trend, slope, and variability

Two Standard Deviation Band Method

  • Calculate the mean and standard deviation for the baseline data
  • Draw a band at the baseline mean ± 2 SD and extend it through the treatment phase
  • Examine how many treatment-phase data points fall outside that band (see the sketch below)
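
A minimal sketch of the two standard deviation band method in Python, using hypothetical baseline and treatment session values (the data and variable names are illustrative, not from the lesson):

```python
import numpy as np

# Hypothetical session data for one participant (illustrative values only)
baseline = np.array([12, 14, 13, 15, 12, 14])
treatment = np.array([18, 20, 19, 22, 21, 23])

# 1) Mean and SD of the baseline phase
m, sd = baseline.mean(), baseline.std(ddof=1)

# 2) Band at mean +/- 2 SD, conceptually extended through the treatment phase
lower, upper = m - 2 * sd, m + 2 * sd

# 3) Count treatment points falling outside the baseline band
outside = np.sum((treatment < lower) | (treatment > upper))
print(f"Baseline band: [{lower:.2f}, {upper:.2f}]")
print(f"{outside} of {treatment.size} treatment points fall outside the band")
```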

Percentage of Non-Overlapping Data

  • Range is determined in each phase
  • The percentage of data points that do not fall within the overlapping ranges is then determined (see the sketch below)
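
A short sketch of one common way to compute the percentage of non-overlapping data, assuming higher treatment values are the desired direction and reusing the hypothetical data above:

```python
import numpy as np

# Same hypothetical data as above; assumes an increase in behavior is the goal
baseline = np.array([12, 14, 13, 15, 12, 14])
treatment = np.array([18, 20, 19, 22, 21, 23])

# Percentage of non-overlapping data (PND): share of treatment points that
# do not overlap the baseline range (here, points above the baseline maximum)
pnd = 100 * np.mean(treatment > baseline.max())
print(f"PND = {pnd:.1f}%")
```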

Single-Subject Designs: Social Validation

  • Establishing the importance of the treatment effect
  • Also, assesses the acceptability of treatment procedures regarding patient preference, comfort, safety, cost, and practicality
  • Determines the social importance of the target behavior and the magnitude of treatment effects
  • Evaluates whether treatments produced functional change, beyond statistical significance

Analysis of Relationships

  • Uses correlations to look at the linear relationship between 2 variables

Correlations

  • Correlation is based on covariance: the extent to which two variables vary in similar patterns
  • Partial correlations find the relationship between two variables while holding the effects of a third variable constant, effectively removing its influence (see the sketch below)
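
A brief sketch of a zero-order Pearson correlation and a partial correlation computed from the standard three-correlation formula; the simulated variables x, y, and z are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)          # hypothetical data for illustration
z = rng.normal(size=100)                # third variable
x = z + rng.normal(size=100)
y = z + rng.normal(size=100)

# Zero-order (simple) correlations
r_xy = np.corrcoef(x, y)[0, 1]
r_xz = np.corrcoef(x, z)[0, 1]
r_yz = np.corrcoef(y, z)[0, 1]

# Partial correlation of x and y, holding z constant
r_xy_z = (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))
print(f"r(x,y) = {r_xy:.2f}, partial r(x,y | z) = {r_xy_z:.2f}")
```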

Correlations Assumptions

  • Assumes a linear relationship
  • Requires adequate variability in both sets of scores to obtain a correlation
  • There should be no floor or ceiling effects; otherwise, correlations will be low
  • Homoscedasticity assumes homogeneity of variance in the multivariate sense: the variables have equal variability across their ranges
  • As a result, participant and control groups vary to the same degree

Interpreting Relationships

  • Involves direction, strength, variance shared, significance, and confidence intervals

Interpreting Relationship - Scattergram

  • Shows the relationship between two variables
  • Plotted using dots
  • A line of best fit can be used for the scattergram
  • The line of best fit is straight and unique to the data set
  • It is found by minimizing the sum of the squared distances of each point from the line (see the sketch below)
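
A minimal least-squares sketch of a line of best fit, using hypothetical paired scores; np.polyfit is one of several ways to obtain the slope and intercept:

```python
import numpy as np

# Hypothetical paired scores
x = np.array([1, 2, 3, 4, 5, 6, 7, 8])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 7.8, 9.2])

# Line of best fit: slope and intercept that minimize the sum of squared
# vertical distances from each point to the line (least squares)
slope, intercept = np.polyfit(x, y, deg=1)
print(f"y-hat = {intercept:.2f} + {slope:.2f} * x")

# A scattergram with this line could be drawn with matplotlib, e.g.:
#   plt.scatter(x, y); plt.plot(x, intercept + slope * x)
```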

Direction of Relationship

  • Positive relationships mean both variables increase or decrease in the same direction, with a slope up to the right
  • Negative relationships mean variables move in opposite directions, with a slope down to the right

Strength of Relationship

  • Determined visually by how close the dots are to the line of best fit
  • Correlation coefficients exist ('r') to determine strength of relationships
  • Strength is typically described with verbal labels attached to ranges of r
  • r ranges from 0.00 to ±1.00
  • 0.00 - 0.25 is little or no relationship
  • 0.25 - 0.49 is a fair relationship
  • 0.50 - 0.69 is a moderate relationship
  • 0.70 - 0.89 is a high relationship
  • 0.90 - 1.00 is a very high relationship
  • Strict cut-offs do not exist
  • Correlation is affected by sample size, measurement error, and the type of variables
  • Meaningfulness depends on context and how closely related variables are

Variance Shared

  • Looks at practical/clinical significance by measuring how much variance is accounted for
  • The coefficient of determination is r squared: the proportion of variance in one variable explained by the other
  • Effect sizes based on variance include eta squared and omega squared

Significance of Relationship

  • Uses statistical testing of the strength of the relationship for all correlations
  • Tests the null hypothesis, H0, that no relationship exists between the variables (r = 0)
  • Statistical significance is reached when the p value is at or below the alpha level
  • The null hypothesis can then be rejected, and it can be concluded that the two variables are correlated (see the sketch below)
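
A small sketch of testing H0: r = 0 with scipy's pearsonr, using simulated (hypothetical) paired data and an alpha of .05:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)          # hypothetical paired data
x = rng.normal(size=30)
y = 0.6 * x + rng.normal(size=30)

# Test H0: r = 0 (no relationship between the variables)
r, p = stats.pearsonr(x, y)
alpha = 0.05
print(f"r = {r:.2f}, p = {p:.4f}")
if p <= alpha:
    print("Reject H0: the variables are significantly correlated")
else:
    print("Fail to reject H0")
```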

Confidence Intervals

  • Represents the range in which the "true" correlation likely lies
  • Usually set at 95%
  • Expressed as a range around the observed coefficient
  • Larger sample sizes yield narrower confidence intervals (a sketch using the Fisher z transformation follows)
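
One common way to build a confidence interval around r is the Fisher z transformation; the sketch below assumes a hypothetical r of .62 from a sample of n = 50:

```python
import numpy as np
from scipy import stats

r, n = 0.62, 50                          # hypothetical correlation and sample size

# Fisher z transformation gives an approximately normal sampling distribution
z = np.arctanh(r)
se = 1 / np.sqrt(n - 3)
z_crit = stats.norm.ppf(0.975)           # 95% confidence level

# Back-transform the interval to the correlation scale
lo, hi = np.tanh(z - z_crit * se), np.tanh(z + z_crit * se)
print(f"95% CI for r: [{lo:.2f}, {hi:.2f}]")
```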

Correlations - Parametric

  • Uses Pearson Product Moment Correlation - r
  • Uses same assumptions as other parametric tests

Correlation coefficient types

  • A correlation coefficient is a statistic representing the relationship between two or more variables
  • Pearson product moment correlation (r) is for continuous numbers and compares two items
  • Intraclass correlation is for continuous numbers and compares two or more items
  • Spearman rank order correlation (rs) is for ordinal (ranked) numbers and compares two items
  • Kendall's tau (τ) is for ordinal (ranked) numbers and compares two items
  • Cohen's kappa (κ) is for nominal numbers and compares two items but can be adapted for more
  • Phi coefficient (φ) is for nominal numbers and compares two items
  • Cramer's V is for nominal numbers and compares two items
  • Biserial correlation (rb) is for interval & ordinal numbers and compares two items
  • Point-biserial correlation is for interval & nominal numbers and compares two items

Correlations Interpretations

  • Interpreting correlation coefficients must acknowledge correlation ≠ causation
  • Observed relationships may be caused by intermediary variables
  • Correlations might imply causality when a logical time sequence is identified
  • Other indications are a plausible biological explanation, a dose-response relationship, and consistency of findings across studies

Correlation Interpretation - Generalization

  • Generalizations must stay within the tested range of scores
  • It is impossible to know how the relationship behaves outside that range
  • A restricted range of scores might not reflect the true relationship
  • Therefore, measure over the full range

Linear Regression

  • Assumes a linear relationship between the IV and DV
  • Seeks to predict DV from IV
  • The IV is sometimes called the predictor variable
  • There are bivariate and multivariate types

Bivariate Regression

  • Uses the correlation between one independent predictor variable (X) and one dependent variable (Y) to predict Y, e.g. reading predicted by phonemic awareness
  • Calculates a "line of best fit" known as a regression line
  • The line yields the smallest residuals (i.e., deltas) across all participants or data points
  • The equation of any sample regression line is an approximation of the population regression line

Bivariate Regression Data

  • The actual values of Y and the predicted values of Y will differ for any data set
  • The distances of the actual Y values from the regression line are called residuals
  • The intercept and slope allow prediction of an individual's score, e.g., a predicted reading score
  • This is done using the equation Ŷ = a + bX (a worked sketch follows this list), where:
  • Ŷ = predicted value of Y
  • a = Y intercept, e.g. Value of Y when X = 0
  • b = slope of regression line
  • X = value of independent variable
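
A worked sketch of Ŷ = a + bX with hypothetical phonemic awareness (X) and reading (Y) scores; the slope and intercept come from the usual least-squares formulas:

```python
import numpy as np

# Hypothetical example: predicting reading score (Y) from phonemic awareness (X)
x = np.array([10, 12, 15, 18, 20, 22, 25, 28])
y = np.array([55, 60, 66, 70, 75, 78, 84, 90])

# Least-squares estimates of slope (b) and intercept (a)
b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
a = y.mean() - b * x.mean()

# Predicted values and residuals (actual minus predicted)
y_hat = a + b * x
residuals = y - y_hat
print(f"Y-hat = {a:.2f} + {b:.2f} * X")
print("Residuals:", np.round(residuals, 2))
```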

Bivariate Regression - Outliers

  • Bivariate Regression is susceptible to outliers
  • Therefore, outliers may be omitted
  • Alternatively, run a comparative analysis with and without the outlier to estimate its effect

Bivariate Regression Accuracy

  • Accuracy of prediction is measured by the coefficient of determination, r²
  • It represents the proportion of variation in the dependent measure explained by the predictor
  • Prediction is more accurate when the coefficient of determination is higher
  • For example, when r = .50, r² = .25, meaning that 25% of the variance in the dependent measure can be explained by the predictor variable
  • Accuracy is also evaluated with the beta weight (regression coefficient), e.g., Y' = a + b₁X₁, which indicates how much the variable contributes

Regression ANOVA

  • Tests whether the observed relation between X and Y occurred by chance
  • This is done using the F test and p value from the ANOVA (see the sketch below)
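
A minimal sketch using statsmodels to fit a bivariate regression and read off the ANOVA F test and p value; the data are simulated for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)                  # hypothetical data
x = rng.normal(size=40)
y = 3 + 0.8 * x + rng.normal(size=40)

# Fit the bivariate regression and read the ANOVA F test for the model
X = sm.add_constant(x)                          # adds the intercept term
model = sm.OLS(y, X).fit()
print(f"R^2 = {model.rsquared:.2f}")
print(f"F = {model.fvalue:.2f}, p = {model.f_pvalue:.4f}")  # did the relation occur by chance?
```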

Advanced Procedures

  • Advanced procedures include reliability analysis, multiple regression analysis, canonical correlation analysis, discriminant analysis, and factor analysis

Reliability Analysis

  • Reliability analysis uses extensions of the Pearson product-moment correlation: paired t-tests, documentation of slope and intercept, and the SEM with confidence intervals
  • Intraclass correlation coefficients (ICC) require two or more continuous variables
  • Kappa is used with two or more nominal variables having 2 or more categories; it accounts for chance agreement and ranges from .00 to 1.00 (see the sketch below)
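
A short sketch of a kappa calculation with scikit-learn, assuming two hypothetical raters assigning nominal severity categories to the same samples:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of the same 10 samples by two clinicians (nominal categories)
rater_a = ["mild", "mild", "moderate", "severe", "mild",
           "moderate", "moderate", "severe", "mild", "severe"]
rater_b = ["mild", "moderate", "moderate", "severe", "mild",
           "moderate", "mild", "severe", "mild", "severe"]

# Kappa measures agreement between raters after accounting for chance agreement
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")   # 0 = chance-level, 1 = perfect agreement
```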

Multiple Regression Analyses

  • This approach uses more than one IV to predict a single DV
  • Equation: Y' = a + β₁X₁ + β₂X₂ + β₃X₃ + ...
  • The goal is the best possible prediction
  • R is the multiple correlation coefficient across the IVs; R² is the amount of variance explained by the model
  • The analysis can be concurrent or predictive
  • The model accounts for the predictive power of the set of IVs
  • Examples: predicting reading scores or hearing aid compliance (see the sketch below)
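
A sketch of a multiple regression with three hypothetical predictors (PA, vocabulary, MLU) predicting a simulated reading score, fit with statsmodels OLS:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 60
# Hypothetical predictors: phonemic awareness (PA), vocabulary, MLU
pa, vocab, mlu = rng.normal(size=(3, n))
reading = 50 + 4 * pa + 3 * vocab + 1 * mlu + rng.normal(size=n)

# Multiple regression: Y' = a + b1*PA + b2*Vocab + b3*MLU
X = sm.add_constant(np.column_stack([pa, vocab, mlu]))
model = sm.OLS(reading, X).fit()
print(f"R^2 (variance explained by the model) = {model.rsquared:.2f}")
print("Coefficients (a, b1, b2, b3):", np.round(model.params, 2))
```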

Multiple Regression Analysis - Assumptions

  • Requires a linear relationship between each predictor and the DV
  • Requires homoscedasticity; outliers that produce heteroscedasticity can be handled via data transformations
  • Requires no multicollinearity, i.e., the IVs should not be highly correlated with each other
  • For variable selection, about 10 participants are required per variable

Multiple Regression Analysis - Types

  • Simultaneous method adds the IVs all at once
  • Stepwise adds IVs one at a time, based on which adds the most variance (via the Pearson correlation coefficient)
  • Hierarchical is used to control the order of addition based on theory

Multiple Regression Analysis - Beta weights

  • Beta weights are standardized regression coefficients, e.g., Y' = a + β₁X₁ + β₂X₂ + β₃X₃
  • Each beta weight is tested for statistical significance (whether the observed relation occurred by chance) via a p value from the ANOVA
  • Obtain each R² and the R² change as predictors are added (see the sketch below)
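
A sketch of standardized beta weights (obtained by z-scoring the DV and IVs before fitting) and an R² change computed by comparing a reduced and a full model; all data are simulated:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 80
x1, x2 = rng.normal(size=(2, n))
y = 2 + 0.7 * x1 + 0.3 * x2 + rng.normal(size=n)

def zscore(a):
    return (a - a.mean()) / a.std(ddof=1)

# Standardized beta weights: regress the z-scored DV on the z-scored IVs
Xz = sm.add_constant(np.column_stack([zscore(x1), zscore(x2)]))
full = sm.OLS(zscore(y), Xz).fit()
print("Beta weights:", np.round(full.params[1:], 2))
print("p values:", np.round(full.pvalues[1:], 4))

# R-squared change when x2 is added to a model containing only x1 (hierarchical step)
reduced = sm.OLS(zscore(y), sm.add_constant(zscore(x1))).fit()
print(f"R^2 change = {full.rsquared - reduced.rsquared:.3f}")
```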

Multivariate Analyses

  • One or more IVs predicting multiple DVs taken as a group
  • Example DVs: word decoding, reading fluency, and reading comprehension
  • Example IVs: PA, vocabulary, and MLU
  • Logistic regression is used for predicting dichotomous outcomes
  • A typical dichotomous DV: hearing aids worn vs. left in the drawer, or hearing aid improvement vs. none (a logistic regression sketch follows)
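
A minimal logistic regression sketch for a dichotomous outcome (hearing aids worn vs. left in the drawer); the predictor and data are hypothetical:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 100
hours_counseling = rng.uniform(0, 10, size=n)        # hypothetical predictor
# Dichotomous DV: 1 = wears hearing aids, 0 = leaves them in the drawer
p = 1 / (1 + np.exp(-(-2 + 0.6 * hours_counseling)))
wears_aids = rng.binomial(1, p)

# Logistic regression predicts the probability of a dichotomous outcome
model = sm.Logit(wears_aids, sm.add_constant(hours_counseling)).fit(disp=0)
print(np.round(model.params, 2))        # intercept and log-odds coefficient
```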

Multivariate Analyses - Discriminant Analysis

  • IVs predict group membership (the DV)
  • This obtains the equations needed for group prediction, while evaluating positives and negatives (classification accuracy)
  • Can be descriptive, examining which variables differentiate given participants
  • Can also be predictive (prescriptive), assigning individuals not yet diagnosed to groups (see the sketch below)
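
A brief sketch of a linear discriminant analysis with scikit-learn, predicting group membership for two hypothetical diagnostic groups from two IVs:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
# Hypothetical IVs (two measures) for two diagnostic groups (e.g., typical vs. disordered)
group0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(30, 2))
group1 = rng.normal(loc=[1.5, 1.0], scale=1.0, size=(30, 2))
X = np.vstack([group0, group1])
y = np.array([0] * 30 + [1] * 30)

# Discriminant analysis: predict group membership from the IVs
lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"Classification accuracy: {lda.score(X, y):.2f}")
```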

Multivariate Analyses - Factor Analysis

  • Seeks factors among variables in predicting a DV
  • Analyzes subcomponents
  • Determines eigenvalues from the percentage of variability explained
  • Typically, an eigenvalue ≥ 1 is considered important
  • Factor loadings identify the variables that define each factor (see the eigenvalue sketch below)
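
A sketch of the eigenvalue (Kaiser) step only, not a full factor analysis: eigenvalues of the correlation matrix of six simulated measures, retaining those ≥ 1; the data and measure structure are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
# Hypothetical battery of six measures; the first three and last three share common factors
f1, f2 = rng.normal(size=(2, n))
data = np.column_stack([f1 + rng.normal(scale=0.5, size=n) for _ in range(3)] +
                       [f2 + rng.normal(scale=0.5, size=n) for _ in range(3)])

# Eigenvalues of the correlation matrix; factors with eigenvalue >= 1 are typically retained
corr = np.corrcoef(data, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]          # sorted largest first
print("Eigenvalues:", np.round(eigenvalues, 2))
print("Retained factors (eigenvalue >= 1):", int(np.sum(eigenvalues >= 1)))
```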
