Evaluation in Healthcare Education

Summary

This document provides an overview of evaluation in healthcare education. It describes the types of evaluation and the steps of the evaluation process, from designing the evaluation through reporting and using results. It also covers barriers to evaluation, including lack of clarity, lack of ability, and fear of punishment or loss of self-esteem.

Full Transcript

EVALUATION IN HEALTHCARE EDUCATION
Dr. Muhammad Arsyad Subu
MEETING 2

IMPACT EVALUATION
 Purpose: to determine the relative effects of education on the institution or community
 Scope: broad, complex, sophisticated, long-term; occurs infrequently

TOTAL PROGRAM EVALUATION
 Purpose: to determine the extent to which the total program meets or exceeds its long-term goals
 Scope: broad, long-term/strategic; lengthy, therefore conducted infrequently

STEP TWO: DESIGNING THE EVALUATION
 An important question to answer when designing an evaluation is "How rigorous should the evaluation be?"
 All evaluations should be systematic, and should be carefully planned and structured before they are conducted.
 An evaluation design can be structured from a research perspective.
 Essential questions to ask when designing an evaluation tool:
1. What types of data will be collected? Complete data (people, program, environment).
2. From whom or what will data be collected? From participants, surrogates, documents, and/or preexisting databases; include the population or a sample.
3. How, when, and where will data be collected? By observation, interview, questionnaire, test, record review, or secondary analysis; consistent with the type of evaluation and with the questions to be answered.
4. By whom will data be collected? By the learner, educator, evaluator, and/or a trained data collector; select to minimize bias.

EVALUATION BARRIERS
1. Lack of clarity: resolve by clearly describing the five evaluation components; specify and operationally define terms.
2. Lack of ability: resolve by making the necessary resources available and soliciting support from experts.
3. Fear of punishment or loss of self-esteem: resolve by being aware of the fear that exists among those being evaluated; focus on data and results without personalizing or blaming; point out achievements; encourage ongoing effort; COMMUNICATE!!!

SELECTING AN EVALUATION INSTRUMENT
 Identify existing instruments through a literature search and a review of similar evaluations conducted in the past.
 Critique potential instruments for:
   fit with the definitions of the factors to be measured;
   evidence of reliability and validity, especially with a similar population;
   appropriateness for those being evaluated;
   affordability and feasibility.

STEP THREE: CONDUCTING AN EVALUATION
 Conduct a pilot test first: assess the feasibility of conducting the full evaluation as planned, and assess the reliability and validity of the instruments (a reliability sketch follows this section).
 Include extra time; be prepared for unexpected delays.
 Keep a sense of humor!
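The slides above recommend assessing instrument reliability during the pilot test. As a minimal sketch of one standard way to do that, the Python function below computes Cronbach's alpha for internal consistency. It is not from the presentation; the function name and the six-respondent, four-item pilot matrix are invented for illustration.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability for an items matrix
    (rows = respondents, columns = instrument items)."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot-test data: 6 respondents rating 4 Likert items (1-5)
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
```

A value of roughly 0.70 or above is a common rule of thumb for acceptable internal consistency, though the appropriate threshold depends on the stakes of the evaluation.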
STEP FOUR: DATA ANALYSIS AND INTERPRETATION
 The purpose of data analysis is two-fold:
1. to organize the data so that they provide meaningful information, such as through tables and graphs, and
2. to provide answers to the evaluation questions.
 Data can be quantitative and/or qualitative in nature; a minimal tabulation sketch follows.
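To make the two-fold purpose above concrete, here is a minimal sketch of organizing quantitative evaluation data into a summary table with pandas. The session labels, column names, and Likert-scale (1 to 5) values are hypothetical, invented for illustration rather than taken from the presentation.

```python
import pandas as pd

# Hypothetical post-session evaluation responses (Likert scale, 1-5)
responses = pd.DataFrame({
    "session":        ["A", "A", "A", "B", "B", "B"],
    "objectives_met": [5, 4, 4, 3, 2, 3],
    "content_clear":  [4, 5, 4, 3, 3, 2],
})

# Organize the raw responses into a table that answers an evaluation
# question directly: how did ratings differ between sessions?
summary = responses.groupby("session").agg(["mean", "std"]).round(2)
print(summary)
```

The same table can be rendered as a graph, for example with summary.plot(kind="bar") when matplotlib is installed, if a visual presentation suits the audience better.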

STEP FIVE: REPORTING AND USING EVALUATION RESULTS
1. Be audience focused.
   Begin with a one-page executive summary.
   Use a format and language clear to the audience.
   Present results in person and in writing.
   Provide specific recommendations.
2. Stick to the evaluation purpose.
   Directly answer the questions that were asked.
3. Use data as intended.
   Maintain consistency between the results and the interpretation of the results; identify limitations.

SUMMARY OF THE EVALUATION PROCESS
 The process of evaluation in healthcare education is to gather, summarize, interpret, and use data to determine the extent to which an educational activity is efficient, effective, and useful to learners, teachers, and sponsors.
 Each aspect of the evaluation process is important, but all of them are meaningless unless the results of the evaluation are used to guide future action in planning and carrying out interventions.

FIVE FOCI OF EVALUATION
 Evaluation focus includes five basic components:
1. Audience
2. Purpose
3. Questions
4. Scope
5. Resources
 To determine these components, ask the following five questions:
1. For which audience is the evaluation being conducted?
2. For what purpose is the evaluation being conducted?
3. What questions will be asked in the evaluation?
4. What is the scope of the evaluation?
5. What resources are available to conduct the evaluation?

RSA EVALUATION MODEL
 Five types of evaluation, from simple (process) to complex (impact); total program evaluation summarizes all four levels.
 [Diagram: the five evaluation types ordered by increasing time, frequency, and cost, from Process through Content, Outcome, and Impact, with Total Program spanning all levels.]

PROCESS (FORMATIVE) EVALUATION
 Audience: individual educator
 Purpose: to make adjustments as soon as they are needed during the education process
 Question: What can better facilitate learning?
 Scope: limited to a specific learning experience; frequent; concurrent with learning
 Resources: inexpensive and readily available

CONTENT EVALUATION
 Audience: individual educator/clinician or team
 Purpose: to determine whether learners have acquired the knowledge/skills just taught
 Question: To what degree did learners achieve the specified objectives?
 Scope: limited to a specific learning experience and its objectives; conducted immediately after education is completed (short-term)
 Resources: relatively inexpensive; available

OUTCOME (SUMMATIVE) EVALUATION
 Audience: educator, education team/director, education funding group
 Purpose: to determine the effects of teaching
 Question: Were goals met? Did the planned change in behaviors occur?
 Scope: broader, more long-term, and less frequent than content evaluation
 Resources: expensive, sophisticated; may require expertise that is less readily available

IMPACT EVALUATION
 Audience: institutional administration, funding agency, community
 Purpose: to determine the relative effects of education on the institution or community (is it worth the cost?)
 Question: What is the effect of education on long-term changes at the organizational or community level?
 Scope: broad, complex, sophisticated, long-term; occurs infrequently
 Resources: extensive, resource-intensive

TOTAL PROGRAM EVALUATION
 Audience: education department, institutional administration, funding agency, community
 Purpose: to determine the extent to which the total program meets or exceeds its long-term goals
 Question: To what extent did all program activities meet annual departmental/institutional/community goals?
 Scope: broad, long-term/strategic; lengthy, therefore conducted infrequently
 Resources: extensive, resource-intensive

THANK YOU