Evidence-Informed Practice PDF

Document Details

Uploaded by PlayfulHarmony

Canadian College of Naturopathic Medicine

Tags

evidence-based practice, research methods, evidence-based medicine, health sciences

Summary

This document provides an overview of evidence-informed practice, including its benefits and drawbacks. It also covers various research concepts, including correlation, causation, and different types of biases.

Full Transcript

Research Week 1

History + Purpose
• 1972, Archie Cochrane: most treatment decisions are not based on a systematic review of the evidence
• 1992: evidence-based medicine introduced
  o Thus shifted from "intuition, unsystematic clinical experience, and pathophysiologic rationale" → clinically relevant scientific research
• Research-practice gap: 17 years

Evidence-Informed Practice
• Arose from evidence-based medicine → evidence-based practice → evidence-informed practice

Arguments FOR/BENEFITS of EBP
• Avoid (or decrease) biases from clinical experience alone: false attribution, lack of follow-up, small sample size, rose-coloured glasses
• Use the vast amount of literature that exists; use more credible sources (vs. "just google it")
• Efficient use of resources
• Improved clinical care (which treatment is most effective, safest)
• Stop ineffective practices (treatments, diagnostic tools)
• Consistency within/across professions; communication and collaboration
• Promotes inquiry and continual improvement (we can't possibly be taught everything!)

Arguments AGAINST EBP
• "Practitioner uses only modalities or treatments that have been proven effective by empirical means": a misconception
• May reduce treatment options (under-studied modalities)
• Challenging to study complex clinical situations and complex interventions
• Excluded factors: can't apply results to complex clinical situations
• Concerns about undermining naturopathic philosophy (less individualization, a 'lost art')
• Studies show that on average improvement was seen; that doesn't mean your patient will benefit
• Doesn't capture significance and meaning to the patient
• Gold-standard studies are expensive and don't always exist (undermines other types of evidence)
• Reduced emphasis on professional judgement and creativity

Doing EIP
1. Formulate an answerable research question (Ask)
2. Find the best available evidence (Acquire)
3. Critically appraise/evaluate the evidence (Appraise)
4.
Apply the evidence by integrating it with clinical expertise and the patient's values (Apply)
5. Evaluate performance (Assess)

Critical Appraisal
• The process of systematically examining research evidence to judge its trustworthiness, value, and relevance in a particular context
• Essential to understand and critically evaluate research in order to apply it properly
• Conclusions from research studies may not reflect the truth; all research is open to bias
• Presentation in the media is aimed at generating attention and interest rather than accuracy
• Exciting headline: "Study reveals that smelling your partner's farts is the secret to a longer life" (the actual finding: a synthetic hydrogen sulphide donor chemical protects mitochondria from oxidative damage)

Science, Research, Some Basic Concepts
• Science: the systematic study of the structure and behaviour of the physical and natural world through observation and experimentation
• An empirical method of acquiring knowledge (accessible to sense experience or experimental procedures)

Correlation
• Correlation: a measurement of the size and direction of the relationship between 2 or more variables
• Positive correlation: height and weight (taller people tend to be heavier)
• Negative correlation: mountain altitude and temperature (as you climb higher, it gets colder)

Examples of Correlation
• As margarine consumption decreased, divorce rate also decreased: random chance?
• Sugar intake and rates of diabetes in the UK

Confounding Variable
• An additional variable that causes a change in the dependent variable

Causation
• A relationship where one variable (the independent variable) CAUSES the other (the dependent variable), i.e., is responsible for its occurrence
• Ex. decapitation causes death
• Generally very difficult to prove a causal relationship

Not All Associations Are Causal
Associations may APPEAR causal due to:
• Confounding
• Chance: ever-present randomness. Ex.
flip a coin 100 times and you may get 58 heads and 42 tails

Bias
• Bias: anything that systematically influences the conclusion or distorts comparisons
• Can impact any kind of research study

Selection Bias
• Systematic differences between groups, often due to inadequate randomization
• Ex. a research study with 2 locations: one location is in an upscale neighbourhood and those participants get the treatment; the other location is in an inner-city neighbourhood and those participants get the placebo
• Ex. a survey of the naturopathic profession about evidence-based practice: 100 NDs respond, with overall largely favourable views of EBP (those with unfavourable views may have been less likely to respond)

Performance Bias
• Systematic differences in the care provided apart from the intervention being assessed
• Ex. participants in the treatment group spend 10 hours with the researchers; the control group spends 1 hour

Attrition Bias
• Systematic differences in withdrawals from the trial
• Ex. participants who have a negative reaction to (or no benefit from) the study treatment drop out more often than people who find the treatment helpful

Detection Bias
• Systematic differences in outcome assessment
• Ex. a study of the effect of working with radioactive material on skin cancer risk: more cases of skin cancer are discovered in patients who report working with radioactive material
• Ex. a researcher genuinely believes that the study drug will help psoriasis; if they know who is receiving the real drug, they may underestimate when measuring the psoriasis skin lesions

Observation Bias
• When participants are aware of being observed, they alter their behaviour
• Ex. the diet diary!

Publication Bias
• Studies with negative findings are less likely to be submitted and published

Recall Bias
• When asked about things in the past, people may have difficulty remembering and respond inaccurately
• Ex. what did you eat for breakfast 10 years ago?

Bias: the KEY is 'SYSTEMATIC'
There will always be random factors. Ex.
One day the researcher is tired and less observant, noticing fewer cancer lesions (variation in detection)
• Ex. some people in a study group move away, lose interest, etc. (attrition)
• Ex. some people over/under-estimate their vegetable servings

Principles of Causation

1. Temporality
• The cause came before the effect
• Some study types are limited in their ability to detect this (cross-sectional, case-control)
• Ex. a survey of men who currently have prostate cancer finds higher fish intake: does dietary fish cause prostate cancer? vs. measuring fish intake and following over time to see who develops cancer

2. Strength
• A stronger association is better evidence of a cause/effect relationship

3. Dose-response
• Varying amounts of the cause result in varying amounts of the effect
• A dose-response relationship is good evidence of a cause/effect relationship
• Ex. number of cigarettes smoked per day and lung cancer risk; BUT risk of confounding: heavy smokers are more likely to consume more alcohol

4. Reversibility
• The association between the cause and the effect is reversible
• Ex. people who quit smoking have a lower risk of cancer; STILL think about confounding: people who quit may start other healthy lifestyle behaviours too!

5. Consistency
• Several studies conducted at different times, in different settings, and with different kinds of patients all come to the same conclusions
• Some inconsistency does not invalidate other trials: look at trial design and quality

6. Biological plausibility
• The relationship between cause and effect is consistent with our current knowledge of mechanisms of disease
• When present, it strengthens the case for cause/effect
• Ex. cigarette ingredients cause cancer in cell cultures and animal models
• Challenges with homeopathy, energy medicine

7. Specificity
• One cause → one effect (A only causes B)
• Ex. vitamin C deficiency → scurvy
• Absence of specificity is weak evidence against cause; ex. smoking causes cancer, bronchitis, and periodontal disease

8.
Analogy
• A cause-and-effect relationship is strengthened if there are examples of well-established causes that are analogous to the one in question
• Ex. if we know a virus can cause chronic, degenerative CNS disease (subacute sclerosing panencephalitis), it is easier to accept that another virus might cause degeneration of the immunologic system (e.g., HIV and AIDS)
• Analogy is weak evidence for cause

Week 2: Research Study Methodologies

New + UNIMPROVED Nutrition Hierarchy of Evidence

Other Types of Evidence
• Textbooks, peers, qualitative research, traditional/historical use, N-of-1, whole systems research

Experimental/Intervention Studies
• Do something to the patient + OBSERVE what happens
• Does the treatment change the likelihood of the outcome?

RCT: Characteristics
• Defined population (inclusion/exclusion criteria)
• 2(+) groups: treatment arm + comparison arm
• Prospective: looks forward in time
• Randomized: equal chance of being assigned to the intervention or control group
• Control group: accounts for the natural course of illness, placebo effect, confounding factors
• May have blinding: minimizes expectation effects
• RCT use: **best design for confirming cause/effect**

Cross-Over Intervention Study
• Everyone gets the intervention + the comparison

Preference Intervention Study
• Controlled, not randomized: participants choose between different arms (ex. cancer survivors pick MBT or Tai Chi)

Open-Label, Pre/Post Intervention Study
• Everyone gets the intervention (and knows it); assess changes before and after the intervention

Observational Studies
• Exposure is NOT controlled by the researcher
• They ask: is there a relationship between a risk factor (or health factor) and an outcome (harm or benefit)?
• Ex. Is high intake of blueberries associated with a lower risk of cancer? Is increased stress associated with an increased risk of a heart attack?
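The correlation examples above can be made concrete with a small computation. This is an illustrative sketch only: the data are invented for demonstration (not from any study), and the `pearson` helper is written here from the standard Pearson formula rather than taken from the course material.

```python
# Illustrative sketch: Pearson correlation measures the size (magnitude
# up to 1) and direction (+/-) of a linear relationship between two
# variables. All numbers below are invented for demonstration.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient: covariance divided by the
    product of the standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

heights_cm = [150, 160, 165, 170, 180, 185]   # taller people...
weights_kg = [50, 58, 63, 68, 80, 84]         # ...tend to be heavier
altitude_m = [0, 500, 1000, 2000, 3000, 4000]  # climbing higher...
temp_c     = [30, 27, 23, 17, 10, 4]           # ...it gets colder

print(pearson(heights_cm, weights_kg))  # near +1: positive correlation
print(pearson(altitude_m, temp_c))      # near -1: negative correlation
```

Note that a coefficient near +1 or -1 still says nothing about causation: the margarine/divorce example in the notes would also produce a strong coefficient.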
Cohort Study
• Recruit the cohort (outcome is NOT present)
• Assess risk/health factors (creates comparison groups)
• Follow over time; see who develops the outcome
• "Longitudinal," "prospective"
• Cohort example: saturated fat

Case-Control Study
• Outcome is PRESENT at the beginning of the study
• Looks backwards in time for exposure (how much meat did you eat 10 years ago?)

Observational Study Strengths
• Can study any question
• Can be less expensive/faster

Cross-Sectional Studies
• Outcome is PRESENT at the beginning of the study
• Assess exposure and outcome at ONE time point
• Ex. patients with CVD and healthy controls are asked about CURRENT meat intake

Case Reports, Case Series
• Report previously undocumented events (a success, an adverse reaction)
• May lead to further action
• Real patients and real clinical approaches, BUT concerns about bias and generalizability

N-of-1 Study
• Randomized, double-blind, multiple-crossover comparisons in an individual patient: an "individualized RCT"
• Compare the patient to themselves while taking the real treatment vs. the comparison (ex. a patient with hypertension: magnesium vs. placebo or blood pressure medication)

Preclinical Studies
• In vitro (outside the body): cell lines, organs
• In vivo (in a non-disease model): healthy humans to study pharmacokinetics (absorption, elimination); animal models
• Base of the EBM hierarchy
• Basic knowledge to build on; allows innovation; highly controlled; 'ethical'; studies mechanisms of action

Synthesis Research
• Why? TIME!!
• Incorporate the results of individual studies together
• Draw bigger conclusions

Narrative Reviews
• A researcher combines some of the research on a topic and reports on the collection of evidence
• Often does NOT describe how they searched or how they decided to include certain studies
• High risk of bias: results are often consistent with the author's hypothesis

Systematic Reviews
• Explicit and rigorous methods to identify (2+ databases, specific inclusion/exclusion criteria), critically appraise, and synthesize (combine) the evidence
• A scientific investigation with a pre-planned methodology
• Enormous effort to minimize bias

Meta-Analyses
• Statistically combine the results of the studies in a systematic review
• Ex. 5 studies with 20 participants each → 1 study with 100 participants
• Visual representation of the studies (forest plot)

Whole Practice Research
• Need: do the results of RCTs apply to the real clinical practice of naturopathic medicine?
• Issue: RCTs often use one intervention to treat one disease in a uniform patient population; naturopathic medicine often uses complex interventions, prescribed in an individualized way, for patients with complex health conditions

Whole Practice/Systems Research
• Assess an entire system of care vs. individual treatments or modalities
• Ex. individualized acupuncture treatments vs. testing 3 set acupuncture points
• Ex. individualized homeopathy vs. a single remedy
• Ex. naturopathic medicine as a whole

WSR Trial Design
• Goal: accurately study what is actually done in the real world
• Modified RCT design: use an entire system of medicine vs. an individual therapy

What's the Methodology?
• Systematic review
• Intervention study
• Cohort study
• Case-control study
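How a meta-analysis "statistically combines" studies can be sketched with one standard approach: a fixed-effect, inverse-variance weighted average (one of several pooling methods; the notes do not specify which is used). The five effect sizes and standard errors below are invented for illustration, not real study results.

```python
# Illustrative sketch of fixed-effect meta-analysis pooling.
# Each study is weighted by 1/variance, so more precise studies
# (smaller standard error, usually larger n) count for more.
from math import sqrt

def pooled_effect(effects, std_errors):
    """Inverse-variance weighted average of study effect sizes.
    Returns the pooled estimate and its standard error."""
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = sqrt(1 / sum(weights))
    return pooled, pooled_se

# Five invented small studies (e.g. mean reduction in blood pressure, mmHg):
effects = [4.0, 5.5, 3.0, 6.0, 4.5]
std_errors = [2.0, 2.5, 1.5, 3.0, 2.0]

est, se = pooled_effect(effects, std_errors)
print(f"pooled effect = {est:.2f}, standard error = {se:.2f}")
```

The pooled standard error is smaller than any single study's, which is the "5 studies of 20 participants behave like 1 study of 100" idea from the notes; a forest plot is simply this table drawn with one row per study plus the pooled diamond.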
