Study Notes
Critical Analysis Aspects
- Examine the title (important words)
- Review the abstract for an overview
- Analyze the introduction and conclusion (context, research aims, gaps addressed, conclusions)
- Note the research date/year
- Identify the argument presented
- Identify problems and limitations
- Assess the logical flow of conclusions
- Point out assumptions
- Evaluate the research's novelty and value
- Examine the use of theory
Five Ws of Critical Analysis
- Who: Author's background, other publications
- What: Author's interpretation of key concepts
- Where: Location of interviews or data collection
- When: Date of data collection/publication
- Why: Rationale for method choices
Paragraph Structure for Critical Analysis
- Introduce a point
- Elaborate on the point
- Provide evidence from sources
- Analyze the evidence
- Conclude the point
Evaluating Resources (ABC)
- A: Authority and accuracy
- B: Bias
- C: Currency
Peer Review
- Evaluation of work by professionals in the same field
Open Peer Review
- Review process with known identities of author and reviewers
Closed Peer Review
- Review process in which the author's and/or reviewers' identities are concealed (covers single- and double-blind review)
Single-Blind Peer Review
- Author's identity known to reviewer, but reviewer's identity unknown to author
Double-Blind Peer Review
- Author's and reviewer's identities unknown to each other
Questionnaire
- Structured set of questions for data collection
- Measures variables or gathers information
- Facilitates data gathering for research
Questionnaire Examples
- Self-administered
- Postal questionnaires
- Face-to-face interviews
- Phone interviews
Questionnaire Development (7 Steps)
- Define research objectives, resources, and constraints.
- Determine data collection methods (question types).
- Decide on question wording and format.
- Determine question layout and flow.
- Evaluate the questionnaire design.
- Conduct a pilot study to evaluate the questionnaire.
- Produce final copies and implement the questionnaire.
Research Problem
- Area requiring further knowledge for practice
- Requires background information using secondary data
- Specific research questions should guide questionnaire design
Information Needed Before Questionnaire
- Establish required results tables to prevent extraneous data collection.
- Determine data type, level, and format.
Resources and Constraints
- Staff availability and skills
- Facilities (software)
- Budget
- Time constraints
Factors Influencing Data Collection Method
- Budget: Limits available resources.
- Stimulus/Task Exposure: Some methods necessitate interactions or tasks.
- Questionnaire Length: Affects suitability of different methods.
- Question Structure: Impacts the necessary level of interaction.
- Sample Precision: How precisely the sample must represent the target population.
- Incidence Rate: Proportion of the population that qualifies as potential participants.
Open Question
- Open-ended question format allowing free-form responses
- Produces qualitative data
Closed Question
- Questions with fixed response options
- Produces quantitative data
Closed Question Types
- Nominal: Categorical responses without order
- Ordinal: Categorical responses with a specific order
Ratio Scale
- Scale with a meaningful zero point (e.g., age, salary)
- Allows comparisons of "twice as much"
Interval Scale
- Ordered categories with equal spacing but no absolute zero (e.g., temperature); see the sketch below
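A quick numeric illustration (a Python sketch with made-up salary and temperature values) of why "twice as much" comparisons only hold when the scale has a true zero:

```python
# Ratio scale (salary): true zero, so ratios are meaningful.
salary_a, salary_b = 20_000, 40_000
print(salary_b / salary_a)  # 2.0 -> B really earns twice as much as A

# Interval scale (Celsius): zero is arbitrary, so ratios mislead.
temp_a, temp_b = 10, 20
print(temp_b / temp_a)      # 2.0, but 20 C is not "twice as hot" as 10 C

def to_kelvin(celsius: float) -> float:
    # Kelvin has an absolute zero, making it a ratio scale for temperature.
    return celsius + 273.15

print(to_kelvin(temp_b) / to_kelvin(temp_a))  # ~1.04, the meaningful ratio
```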
Question Wording Considerations
- Target Population Appropriateness: Question wording is appropriate for the group being questioned.
- Comprehension: Respondents must understand questions.
- Recall: Questions are worded to encourage accurate recall.
- Unambiguity: Questions have a singular meaning.
- Respondent Sensitivity: Sensitive questions are worded so respondents feel able to answer truthfully.
Avoiding Pitfalls in Question Wording
- Double-Barrelled Questions: Avoid asking multiple things in one question
- Proverbs and Sayings: Avoid them, as they invite unthinking agreement
- Double Negatives: Phrasing should be positive
- Abbreviations, Jargon, and Technical Terms: Avoid confusing or intimidating respondents
- Ambiguous Words: Use words with clear meanings
- Words with Varying Meanings: Ensure clarity
- Leading Questions: Avoid influencing the respondent's answers
- Loaded Words: Avoid emotionally charged terms that create bias
Questionnaire Introduction and Instructions
- Start by explaining study purpose, organisation, and participant rights (confidentiality/anonymity).
- Use clear section introductions within the survey.
Routing Directions and Filters
- Routing Directions guide respondents through the survey based on their answers.
- Filters determine which questions a respondent should answer based on prior answers (illustrated in the sketch below).
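A minimal Python sketch of filter-and-routing logic; the question IDs (Q5 to Q9) are hypothetical, not taken from the notes:

```python
# Respondents who answer "No" to the filter question Q5 skip the
# follow-up questions Q6-Q8 and are routed straight to Q9.
def route(answers: dict) -> list[str]:
    """Return the question IDs this respondent should see next."""
    if answers.get("Q5_uses_library") == "No":
        return ["Q9"]            # filter out Q6-Q8
    return ["Q6", "Q7", "Q8"]    # otherwise continue with the follow-up block

print(route({"Q5_uses_library": "No"}))   # ['Q9']
print(route({"Q5_uses_library": "Yes"}))  # ['Q6', 'Q7', 'Q8']
```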
Pilot Testing Importance
- Identify and address issues with questions, wording, layout, and instructions.
- Find confusing language and ambiguous phrasing
- Improve the survey for the target population
Improving Response Rates
- Advance warning
- Explanation of participant selection method
- Sponsorship letters
- Professional-looking envelopes
- Incentives
- Publicity
- Confidentiality statements
- Anonymity option
- Follow-up attempts if necessary
Confidentiality vs. Anonymity
- Confidentiality: Researcher knows participant identity but keeps the information private.
- Anonymity: Researcher doesn't know participant identity; responses are untraceable
Non-response Bias in Questionnaires
- Potential bias arises when those who do not respond differ systematically from those who do
- Assess sample representativeness, response bias, and factors impacting response.
Quantitative Research Methods
- Surveys (e.g., module evaluation)
- Experiments (e.g., Milgram study)
Experiment
- Controlled observation testing a hypothesis to infer a cause-and-effect relationship
Causality
- Idea that one factor's change causes another factor to change
- One event's occurrence increases the likelihood of another occurring
Evidence of Causality
- Concomitant Variation: Factors vary together predictably
- Time Order Occurrence: Cause precedes or occurs simultaneously with effect
- Absence of Other Causal Factors: No alternative explanations for the observed relationship.
Essential Experiment Components
- Dependent Variable: The outcome variable that is measured
- Independent Variable: The variable the researcher manipulates (the treatment)
- Effect/Outcome: Result of the treatment, observed in the dependent variable
Validity
- Extent to which the test measures what it intends to measure
Extraneous Factors Affecting Validity
- History & Maturation: External and internal changes influencing results.
- Repeated Testing: Familiarity affecting results
- Researcher Impacts: Researcher presence altering subject behaviour
- Mortality: Loss of respondents over time
- Selection Errors: Biases within subject selection
- Regression to the Mean: Extreme scores tend toward averages.
Confounding Variables
- Variables of no direct interest that could influence the dependent variable and distort the apparent effect of the independent variable.
- Example: overall diet acting as a confound when studying the effect of increased meat intake.
Controlling Confounding Variables
- Randomisation: Random assignment of participants to groups (see the sketch after this list)
- Matching: Pair subjects based on key variables
- Statistical control: Measuring extraneous variables, controlling for their effect.
- Design Control: Using experimental design to address factors
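A minimal Python sketch of randomisation with hypothetical participant IDs; it illustrates the idea, not a prescribed procedure:

```python
import random

# Participants are shuffled and split evenly into treatment and control
# groups, so confounding variables are, on average, balanced across groups.
participants = [f"P{i:02d}" for i in range(1, 21)]
random.seed(42)               # fixed seed only so the example is reproducible
random.shuffle(participants)

midpoint = len(participants) // 2
treatment_group = participants[:midpoint]
control_group = participants[midpoint:]

print("Treatment:", treatment_group)
print("Control:  ", control_group)
```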
Experimental Designs
- Pre-experimental: No random assignment
- True Experimental: Random assignment
- Quasi-experimental: Lack full experimental control
- Statistical Designs: Statistical control of extraneous variables
Symbols in Experimental Design
- X: Exposure to treatment
- O: Observation/Measurement of the dependent variable
- R: Random assignment
Pre-Experimental Designs
- One-Shot Case Study: One group exposed to treatment, measured once
- One-Group Pretest-Posttest: One group, pretest, treatment, posttest.
- Static-Group Comparison: Two groups, one with treatment, one without, measured only once.
True Experimental Designs
- Pre-test/Post-test Control Group: Random assignment to groups, pretest, treatment, posttest
- Post-test Only Control Group: Two groups, randomly assigned, only posttest measurement
- Solomon Four-Group: Four groups, two with pretest, two without, to see if pretest affects outcomes.
Quasi-Experimental Designs
- Time Series Design: Repeated measurements over time, before and after treatment.
- Multiple Time Series Design: Repeated measurements plus a control group, which helps rule out confounding.
Questionnaire Evaluation and Analysis
- Wording/appearance: ease of use
- Targeted Data Collection: Ensure questions cover research objectives
- Continuous vs Categorical Data Differentiation: Collect highest data level possible
- Missing data identification and treatment
- Group comparisons: consistency across groups
- Pre-coding: facilitate easier analysis
- Coding sheet preparation: systematic data entry (see the sketch after this list)
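A minimal Python sketch of pre-coding and a coding sheet; the variables, response options, and the -9 missing-data code are hypothetical choices, not taken from the notes:

```python
import csv

# Closed-question responses are mapped to numeric codes, with a reserved
# code (-9) for missing answers, then written out one row per respondent.
CODES = {
    "gender": {"Female": 1, "Male": 2, "Prefer not to say": 3},
    "satisfaction": {"Very dissatisfied": 1, "Dissatisfied": 2,
                     "Neutral": 3, "Satisfied": 4, "Very satisfied": 5},
}
MISSING = -9

responses = [
    {"id": 1, "gender": "Female", "satisfaction": "Satisfied"},
    {"id": 2, "gender": "Male", "satisfaction": None},  # missing answer
]

with open("coding_sheet.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "gender", "satisfaction"])
    writer.writeheader()
    for r in responses:
        writer.writerow({
            "id": r["id"],
            "gender": CODES["gender"].get(r["gender"], MISSING),
            "satisfaction": CODES["satisfaction"].get(r["satisfaction"], MISSING),
        })
```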
Pilot Testing
- Small-scale trial to evaluate research project feasibility.
- Identify problems with questions, wording, order, and layout.
Questionnaire Distribution
- Personal distribution/collection
- Personal distribution/postal/email returns
- Postal/email distribution/returns
Continuous vs. Categorical Data
- Continuous: Numerical data with infinite possible values (e.g., height)
- Categorical: Data grouped into categories (e.g., gender)
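A short pandas sketch (made-up respondents; assumes the pandas library is available) showing how continuous and categorical variables are typically stored and summarised differently:

```python
import pandas as pd

df = pd.DataFrame({
    "height_cm": [162.5, 178.0, 171.3],       # continuous
    "gender": ["Female", "Male", "Female"],   # categorical
})
df["gender"] = df["gender"].astype("category")

print(df.dtypes)                    # height_cm: float64, gender: category
print(df["height_cm"].mean())       # means make sense for continuous data
print(df["gender"].value_counts())  # counts make sense for categorical data
```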
Data Analysis Preparation
- Coding sheets: Organize data for analysis
- Reverse scoring: Consistent scoring for negatively worded items
Reverse Scoring
- Reversing the scoring of negatively worded questions so that all responses run in the same direction (see the sketch below)
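A minimal Python sketch of reverse scoring a 1-5 Likert item (the scores are made up), using the usual formula reversed = (scale minimum + scale maximum) - raw score:

```python
SCALE_MIN, SCALE_MAX = 1, 5

def reverse_score(raw: int) -> int:
    # A 5 on a negatively worded item counts the same as a 1 on a positive one.
    return (SCALE_MIN + SCALE_MAX) - raw

raw_scores = [5, 4, 2, 1]
print([reverse_score(s) for s in raw_scores])  # [1, 2, 4, 5]
```

After reversing, item scores point in a consistent direction and can be summed or averaged across the scale.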