Chapter 7
Changing Views on Assessment for STEM Project-Based Learning

Robert M. Capraro
Department of Teaching, Learning & Culture, Aggie STEM, Texas A&M University

M. Sencer Corlu
Graduate School of Education, Bilkent University, Turkey

Science, technology, engineering, and mathematics (STEM) project-based learning (PBL) integrates assessment methods across different aspects of learning experiences. While STEM PBL shifts the focus of attention from summative to formative assessment, greater attention is given to the interpersonal domain. Because of the nature of STEM PBL, which is centered on developing real-world projects where students can apply their understandings of various concepts, authentic assessment underlies both formative and summative assessment tasks through technology, such as classroom response systems, and rubrics. Authentic assessment in STEM PBL helps students transition from an authority-imposed regulation to the self-regulation of their learning. Therefore, assessment in STEM PBL is inextricably linked to and interwoven with pedagogy through integrated assessment methods that develop the whole person, stimulate creativity, and foster individualized group responsibility.

Chapter Outcomes

When you complete this chapter, you should better understand and be able to explain
- the nature of STEM PBL assessment
- various rubrics used in the development of STEM PBL
- complexities teachers face when assessing STEM PBL

When you complete this chapter, you should be able to
- develop an assessment plan that matches your selected learning outcomes or student expectations for your STEM PBL activity
- communicate early and effectively with administrators and parents about valuing student learning and not just evaluating it
- assess student learning in terms of academic progress instead of meeting arbitrary decision points (e.g., 90, 80, 70, 60)

Chapter Overview

The major focus of this book has been on the practical integration of knowledge so that students can demonstrate what they learn in meaningful ways and be academically successful. This chapter concentrates on determining what students can do and on facilitating students to do more than what they think they can. The particular emphasis is on formative assessment, beginning with a discussion of the various types of assessment, though connections to grading and evaluating knowledge products are also discussed as a necessity in the current age of accountability. Furthermore, we describe the importance of teacher assessments for successfully implementing STEM PBL and finally provide an example of how assessment can be used to support a STEM PBL activity in the classroom.

Overview of Assessment

The Role of Assessment

STEM PBL requires a whole new perspective on what assessment means. As an integral component of STEM PBL, assessment holds the project components together, maintains student motivation for learning (Brophy, 2004), and provides both the teacher and the student with useful information about learning (Kulm, 1994). In STEM PBL assessment, teachers need to change their focus from summative to formative assessment. When the focus is formative, (1) assessment is not seen as simply quantifying a product but is more concerned with the learning process (Ashcroft & Palacio, 1996), (2) test scores or grades have minimal impact on the process (Wright, 2008), and (3) students are keenly aware of their own learning, or metacognition.
Students are not accustomed to encountering STEM PBL assessment. In typical classroom practice, assessment is synonymous with grading, which determines individual success or failure at school. This typical approach leads students to strive to do well on tests in order to get a good grade rather than to develop learning strategies through self-improvement, self-regulation, and understanding. For students, an authority-imposed regulation of learning through grading precludes the interpretation of assessment as a means of feedback toward the desired learning objectives. For teachers, the typical approach to assessment reinforces the common belief that teachers need to understand what students do not know so they can document their performance. Over the course of their education, students have already developed a preconceived notion of what assessment is and how it is done. Sometimes breaking the mold requires confronting student conceptions as well as shifting the practices of teachers. Regardless, assessment must shift toward formative practices so that teachers can use the acquired information to adjust teaching content, teaching style, or the ways they assess learning to improve student learning.

Changing one's beliefs about assessment can be a hurdle for teachers, but teachers may face even more resistance from students. Of course, that resistance will not come from struggling students but from the students who have learned the old assessment system and mastered it. Teachers need to be prepared to help students transition to STEM PBL assessment. Based on our experiences with teachers, it is common at the beginning stages of STEM PBL projects for students to struggle with the ambiguity, new structure, and new outcome expectations. Students often ask for further clarification of the assessment and the outcome, and some even want to ask what is expected. To teachers who experience this, our response is that students must be taught how to interpret a rubric and how to interpret the teacher's comments on a rubric. Students need to learn that formative assessments are a checkpoint rather than a grade. The criteria are explicit and clearly communicate what is expected and what students can do to improve their grade. It is also common that overachieving students might be tempted to translate the rubric score into a grade and could be turned off at the initial stages of STEM PBL. Therefore, it is paramount that the teacher set the stage by discussing how formative rubrics are used and that rubrics are designed to help students identify areas for improvement rather than to evaluate their success or failure.

The STEM PBL perspective on assessment requires a change in both teachers' and students' views on assessment. The inclusion of formative assessments is a major change in the classroom, and how those assessments are provided is evolving. In our classrooms, we use "digital check-ins." These are templates that provide highly descriptive prompts, allow the instructor to apply check marks to the appropriate descriptor when it is met, and enable students to have immediate access to feedback. There are also less obtrusive electronic grade books and quick links that can be completed on a smartphone as the instructor moves between groups while the class progresses. Most importantly, not every student needs the same number of formative assessments.
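To make the digital check-in idea concrete, the sketch below shows one way such a template could be represented in code. This is a minimal illustration under our own assumptions: the `CheckIn` class, its fields, and the sample descriptors are invented for this example and do not describe any particular grade-book product.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CheckIn:
    """One digital check-in: highly descriptive prompts that the
    instructor checks off as each descriptor is met."""
    student: str
    prompts: dict[str, bool]  # descriptor -> met at this check-in?
    note: str = ""
    when: date = field(default_factory=date.today)

    def feedback(self) -> str:
        """Immediate feedback for the student: descriptors met and
        descriptors that still need work."""
        lines = [f"Check-in for {self.student} ({self.when}):"]
        lines += [f"  [x] {p}" for p, ok in self.prompts.items() if ok]
        lines += [f"  [ ] {p}" for p, ok in self.prompts.items() if not ok]
        if self.note:
            lines.append(f"  Note: {self.note}")
        return "\n".join(lines)

# Example: an instructor moving between groups taps off descriptors.
print(CheckIn(
    student="Jordan",
    prompts={
        "Cites at least two reliable sources": True,
        "Journal entry explains today's design change": True,
        "Data table includes units and labels": False,
    },
    note="Revisit the data table before the next check-in.",
).feedback())
```

Because check-ins are recorded per student, nothing in this structure forces every student to receive the same number of them, which matches the point above.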
Formative assessments should be used to provide guidance to students who are moving quickly through a project and doing a great job and to provide support for students who are struggling with their progress. Formative assessment is the linchpin in the valuing of learning and potentially the most transformative practice that can happen in today's classrooms.

Formative and Summative Assessment

There are two broad categories of assessment: formative and summative. Formative assessment provides students with regular feedback to help them learn to regulate their own learning processes. Summative assessment primarily concentrates on evaluating the learning that has taken place after a predetermined instructional period. In the most general terms, almost any assessment can be used in a formative or summative way, although some assessment tasks, such as multiple-choice tests, provide only limited information for learning to self-regulate and to improve one's own learning process.

Summative STEM PBL assessment tasks are ideally planned concurrently with lesson development. It is, however, not unusual for preplanned rubrics to be modified or new rubrics to be created during the later stages. In this perspective, summative assessment is not relegated to the last day of instruction. Summative assessments can occur in smaller increments throughout the STEM PBL activity. Teachers may choose to use short summative assessment tasks to guide students toward improved collaboration with other team members by emphasizing a sense of individual accountability or toward the development of their content knowledge. Yet such short summative assessment tasks should be accompanied by advance preparation of the students for the tasks rather than come as a surprise. Just as teachers would not be happy to have their teaching assessed without preparation or without knowing the criteria on which their teaching will be assessed, using surprise summative assessment demoralizes students, diminishes their intrinsic motivation, causes discontinuity in group and individual learning, and can even break down the learning process extensively. Summative STEM PBL assessment should only be used after closely aligned formative assessment tasks have been introduced to the students.

Formative STEM PBL assessment encompasses an accumulation of learning artifacts, which are assembled by students through clear and explicit directions from the teacher. Teacher-driven directions align the expected learning outcomes to the STEM PBL projects, while the artifacts are used as summaries of student knowledge or as knowledge products that depict a richer and more complete picture of what students have learned. In this regard, formative STEM PBL assessment should be a means for helping students apply their knowledge, thereby owning the knowledge rather than acing a test. Thus, formative STEM PBL assessment helps to move beyond evaluating student success based on rote memorization. In short, formative assessment can differ based on several aspects of the STEM PBL environment, including the following:

- The setting (e.g., group or individual)
- The content
- Outcome expectations
- Allotted time frame
- The time students spend on the activity
- Constraints in the design brief
- Criteria

In the age of accountability, success on multiple-choice tests continues to be an important benchmark, measuring the effectiveness of teaching for tests.
Multiple-choice tests are limited in their ability to guide learning but are easy to grade; however, by focusing on critical assessment of students' progress in thinking, through their writing about what they learn and why they believe they learned it, teachers are more likely to lead students to be flexible with their knowledge (Boaler, 1998). Being flexible with their knowledge may help students develop certain test-taking skills, such as the critical-reading skills needed to comprehend the readings presented in multiple-choice items on high-stakes state tests.

Group Responsibility and Individual Accountability (Both Formative and Summative)

STEM PBL assessment evaluates both individual and group performance. It is important to match the assessment to the learning activity and the setting in which the learning takes place. For instance, individualized assessment of a group activity is less productive than a more encompassing, group-based assessment of learning. Conversely, if students pursue learning individually, group-based assessment may create dissonance with individualized learning and thus have a negative impact on student learning. For group-based assessment, if group membership is heterogeneously assigned, less customization of the assessment is required. When students are randomly or self-assigned to groups, the assessment needs to be modified for each group's personality and academic idiosyncrasies. In cases where a high degree of customization occurs, groups may demonstrate only one specific learning goal of the STEM PBL, as compared to student groups with less customization, who may be able to produce more comprehensive artifacts (see Figure 1).

Simply put, the content is an essential variable that should be accommodated when designing the assessment. Some content is more easily assessed by some methods than others. For example, it is a challenge to assess knowledge-level content through creative assessment tasks. Thus, it may be difficult to assess knowledge-level content at the analysis or evaluation levels.

Group assessments should value tasks that are clearly defined by group responsibility. Tasks include both social and intellectual components. On the social side of group responsibility are collaboration, communication, accepting ideas, working with others, and meeting deadlines, among many other possibilities. On the intellectual side are the quality of the final product and the intermediate steps that lead to it. Failing to maintain group responsibility can undermine the activity and make group work more challenging for students, especially for those who may feel that working in groups is a hindrance to their success.

Each person must be individually accountable in a STEM PBL activity. Again, there are social and intellectual aspects to this notion. On the social side, personal behavior contracts should be initiated, and these should remain separate from any group assessment of social behaviors. Intellectually, students must be able to explain the final project and the intermediate steps that led to the final product. It is often helpful to have several small assessments to value individual intellectual contributions.
For example, a teacher might use an exit ticket with two items on a regular, easily recognizable schedule (e.g., every other day), on which a student (1) explains what he or she learned today related to the project and (2) explains the steps he or she completed for the project. Checklists, three or four multiple-choice questions, or peer ratings can also help to achieve estimates of individuals' intellectual contributions to the project. These assessment strategies build self-regulation and promote individual accountability.

Authentic Assessment

Authentic assessment is the most complicated assessment method compared to other formative and summative schemes. Despite the lack of an agreed definition, there is a consensus among educators that authentic assessment tasks should focus on the knowledge of the products, which makes the assessment relevant to the learner through real-world applications. Authentic assessment matches the content being learned and knowledge products with student interests, guided by clearly defined outcomes. Examples of authentic assessment can include tasks as simple as students listing what they learned to get to a certain stage of the project or as complicated as filing a report on their progress and the steps involved in solving one or more problems encountered in the STEM PBL activity.

Authentic assessment fits into various aspects of STEM PBL to different degrees. For example, when assessment of procedural skills is the focus, authentic assessment is less relevant than when the goal of assessment is to understand how students apply those procedural skills in real-world contexts. Another example is "just-in-time assessment," a form of authentic assessment that utilizes technology. In one just-in-time assessment model, tablets (e.g., iPads or Android-based mobile devices) can be used in the role of a data collector. The tablet easily captures student performance as a video and audio file, which the teacher can use to digitally record information into rubrics made immediately available to students. Just-in-time assessment is performed by the teacher with minimal delay between the time the assessment is performed and the time students receive information regarding their progress.

Another just-in-time assessment example is classroom response systems, or classroom clickers (Duncan, 2005). Clickers provide the teacher with the opportunity to carefully plan assessment, be it alphanumeric (the students type in a response), multiple choice, or numeric. With the help of clickers, all students simultaneously participate in the learning process through both group and individual feedback. The group feedback can help the teacher make decisions about how the rest of the lesson should proceed. In return, students get a firm understanding of what the whole class understands and their corresponding learning level compared to peers. Because individual identities are masked, students can see only the individualized feedback provided to them, while the feedback to their peers remains anonymous. These forms of just-in-time assessment can be powerful in differentiating STEM PBL instruction from more traditional practices in a cost-effective way (Cavanaugh, 2006). Just-in-time assessment methods clarify the utilization of authentic assessment methods in the digital domain (see Chapters 11 and 12 on technology).
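The sketch below illustrates the feedback loop just described for a clicker-style item: the class sees an anonymous distribution of answers, while each student receives only his or her own result. The response data and function names are hypothetical; real classroom response systems expose their own vendor-specific interfaces.

```python
from collections import Counter

def class_feedback(responses: dict[str, str], correct: str) -> str:
    """Group feedback: an anonymous distribution of answers that helps
    the teacher decide how the rest of the lesson should proceed."""
    counts = Counter(responses.values())
    total = len(responses)
    lines = [f"Class results (n={total}, correct answer: {correct})"]
    for choice in sorted(counts):
        lines.append(f"  {choice}: {100 * counts[choice] / total:.0f}%")
    return "\n".join(lines)

def individual_feedback(responses: dict[str, str], correct: str) -> dict[str, str]:
    """Individual feedback: each student sees only his or her own
    result; peers' identities stay masked."""
    return {
        student: ("Correct" if answer == correct
                  else f"Your answer was {answer}; review this item")
        for student, answer in responses.items()
    }

# Hypothetical multiple-choice item with four clicker responses.
responses = {"s01": "B", "s02": "B", "s03": "C", "s04": "B"}
print(class_feedback(responses, correct="B"))
# Each entry below would be delivered privately to one student.
print(individual_feedback(responses, correct="B"))
```

In practice, the class-level distribution is what drives the teacher's decision about how the rest of the lesson should proceed, while the private per-student entries preserve anonymity.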
The Venn diagram in Figure 1 categorizes the assessment methods explained in this chapter, some of which are more closely aligned to the intent of PBL than those peripherally associated.

Figure 1. Comparison of Assessment Methods in STEM PBL and Traditional Instruction

STEM PBL Assessment

It is essential to integrate assessment and instruction in each STEM PBL lesson (Solomon, 2003). In the practical design of STEM PBL, the standards are clearly delineated so that assessment and instruction are intertwined. If teachers are keenly aware of the standards in their content area, then they can base their student expectations on these standards and develop a STEM PBL environment that addresses those expectations and the assessments that follow from them. It is not necessary for the teacher to predetermine every aspect of the assessment methods to be used with the STEM PBL lesson at the onset. Different assessment methods may be chosen after the initial selection of standards, and perhaps even during the actual STEM PBL activity, because assessment needs to be aligned with the learning environment, and sometimes unforeseen changes occur, so the assessment should be fluid and responsive. For instance, teachers can adjust the assessment method based on the setting because the assessment of the same content or standard can differ depending on whether learning occurs in groups or individually. When students learn in group settings, it is important to respect the group intelligence and assess work by groups while maintaining individual accountability. We present some examples of common rubrics, as well as other examples and helpful tools, in the Appendices that might be helpful to teachers in setting up their STEM PBL environments.

Individual Accountability

There are several accountability strategies that facilitate group intelligence yet encourage individual accountability at the same time. Peer assessment is one strategy that can provide the teacher with valuable insights about individuals' contributions to group intelligence. Furthermore, it is useful to set up requirements whereby students are randomly or pseudo-randomly selected by the teacher to explain the group's results so that the team's score is based in part on that person's individual responses. Reflection is another way to gain insight into individual performance. When the teacher uses reflection strategically, students can respond to questions about what would have improved the project, what would have improved the group's product, and how their performance could have changed to improve the quality of the deliverable. These questions can yield surprising insights about both the respondent and the team members. There are several examples of reflection assessments in Appendix Q (Self-Reflections) and Appendix R (Reflection on Team Collaboration).

To help guide individual accountability, teachers may consider the use of contracts, both social and intellectual, to establish common goals (common to the teacher and students) that clearly articulate expectations. The contracts can be agreed upon between groups when they define group behaviors (whether those behaviors are social or intellectual), between a group member and his or her group, or between the teacher and an individual group member or several members.
Appendix O (Comprehensive Group Contract) provides an example of a completed contract, and several other contract types that can be used or modified to meet specific classroom and instructional needs are also included in the Appendices.

Additionally, it is important to use individualized assessment that mirrors assessment tasks at the state level because students need to be able to demonstrate their learning on high-stakes testing formats too. As long as schools, teachers, and student performances are measured with high-stakes tests, any educational innovation that fails to provide measurable impact on high-stakes assessment is doomed. Therefore, it is paramount to achieve an equilibrium between authentic and high-stakes assessment when considering individual accountability. In a STEM PBL environment where the instruction focuses on designing, constructing, and synthesizing, it is important that assessment is similarly focused and that sufficient weight is given to these concepts as opposed to the high-stakes variety. One effective way to reflect student accountability in authentic assessment is through the careful design and application of rubrics.

Development of Rubrics

This book contains many rubrics, which are designed to provide educators with important guidance. Some of the rubrics have been tried and tested for many years, while some are newer. However, all are developed, used, and shared by the teachers with whom we work. Rubrics should be used with an important principle in mind: teachers should always prepare students before they use rubrics in class. Rubric use and grading have to be taught just like any other classroom practice so that they become the routine and not the exception. It is our honest goal that the included rubrics are viewed as intellectually stimulating and prompt you, the reader, to try your hand at developing the rubrics you will use in your classroom to facilitate student learning and to stimulate creativity and in-depth STEM learning.

Rubrics are one means of providing students with formative and summative feedback about their learning processes. Rubrics can help teachers evaluate students' learning efficiently (Andrade, 2000). Rubrics also provide guidance for students throughout the self- and peer-assessment processes (Andrade, 2000). The specific and clear criteria identified in rubrics are particularly helpful for professionals who are not teachers, and thus not familiar with assessing student performance, as they evaluate projects. A well-defined rubric contains components that reflect the specifics of the standards, the conceptual generalities of an activity, and intangible aspects like those reflected in the Secretary's Commission on Achieving Necessary Skills (2000) report. Various attainment degrees of learning goals are specified in rubrics (Andrade, 2000). Rubrics should also provide sufficient information to help students understand what they know and do not know and some guidance about what they need to learn.

The rubric's scale can be closely related to the grading system or can obfuscate the relation between the scale score and the A to F grade equivalency. For example, a rubric can be interpreted by its point value, with the points converted to a percentage score, or a six-point mastery rubric can be interpreted directly from A+ to F. Alternatively, a rubric can be based on a three- or four-point scale that does not align well with the conventional A to F grading scale.
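As a concrete illustration of the scale question, this short sketch converts weighted rubric points to a percentage and then to a conventional letter grade. The criterion names, weights, and cutoffs are invented for the example; a teacher who deliberately chooses a three- or four-point scale that resists grade conversion would simply skip the final step.

```python
def rubric_percentage(scores: dict[str, int], weights: dict[str, float],
                      max_points: int = 4) -> float:
    """Convert weighted rubric scores to a percentage. Heavier weights
    go to the critical and important aspects of the task."""
    earned = sum(scores[c] * weights[c] for c in scores)
    possible = sum(max_points * weights[c] for c in scores)
    return 100 * earned / possible

def letter(pct: float) -> str:
    """Conventional cut points; a rubric need not map onto these."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if pct >= cutoff:
            return grade
    return "F"

# Hypothetical journal rubric: analysis weighted most heavily.
weights = {"research notes": 1.0, "reflection": 1.0, "analysis": 2.0}
scores = {"research notes": 3, "reflection": 4, "analysis": 2}
pct = rubric_percentage(scores, weights)
print(f"{pct:.0f}% -> {letter(pct)}")  # 69% -> D
```

The point of the weighting step is exactly the design principle discussed next: the critical aspects of the task should count for more than the tangential ones.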
An even number of ratings (such as four or six) precludes a midpoint decision on the part of the rater, which is often considered desirable. What is most important when designing a rubric is to assign more weight to the critical and important aspects of the task while placing less emphasis on things tangential to the clearly defined outcomes.

Figure 2. Sample Generic Rubric

Summative Rubric 1 (Journal): Research Notes, Data Use, Analysis, Reflections. Each criterion is scored Exemplary (4 points), Proficient (3 points), Novice (2 points), or Emerging (1 point).

Research Notes
- Exemplary (4): The student journal shows correct reference citations to key ideas from research-based sources of thorough information, such as Google Scholar or a similar peer-reviewed source. Synthesized descriptions for integrating their findings and accompanying diagrams are aligned to the text notes.
- Proficient (3): The student journal shows mostly correct references to acceptable, sourced ideas (for example, wikis, blogs, general Internet searches).
- Novice (2): References are present but mostly incorrectly formatted or presented, and sources are unreliable. Conclusions or judgements are based on weak fundamental principles.
- Emerging (1): Notes are typically a cataloging of what was done or performed, lack evidence of new knowledge or new understandings, and show little evidence of modifications or changes based on the influence of research.

Active Reflection
- Exemplary (4): There are clear, sequential, and logical thoughts about design improvements. Notes reflect knowledge of background content. Successes and failures are clearly identified, as is what to retain or modify. There is evidence (signatures) of consideration of others' suggestions as well as evaluation of suggestions and modifications.
- Proficient (3): The student journal shows some active evidence of reflection about design and improvements and some notes about successes and failures. Evidence (signatures) of consideration of others' suggestions is partial.
- Novice (2): The student journal lacks reflections about design, indicated improvements, and rationale, and does not show clear evaluation of tests. No evidence (signatures) is noted in the changes of design.
- Emerging (1): Lacks connection to prior knowledge; thoughts are unformed or disconnected. There is a lack of knowledge evidence in key words and key concepts. Reflections lack evidence of engagement with the related topics.

Journal Representations
- Exemplary (4): Representations show an organizational hierarchy of thoughts and ideas. Comparisons, contrasts, and judgements are shown. The journal shows use of a neat and clear graphic organization of ideas with measurement data to show the BLE RSSI score in order to make decisions for improving their antenna.
- Proficient (3): Representations show visual organization with thoughts labeled. Comparison of measured data, including the BLE RSSI score, to make decisions for improving their antenna is evident.
- Novice (2): Representations show some thoughts, labeling, and measured data without regard for the BLE RSSI score to make improvement decisions for the antenna.
- Emerging (1): The student's journal was not neatly presented in an organized way to show data and ideas based on the BLE RSSI score. Representations were not presented in a neat and organized fashion using an acceptable, grade-appropriate choice; lacked headings or other necessary information; or were based on data only.
Information Analysis
- Exemplary (4): After representing the data, thoughts, and ideas, the student showed use of reflection and analysis to compare and contrast information, demonstrating evaluation of design.
- Proficient (3): The student journal showed orderly formulations indicating some thought about shape and properties to evaluate design decisions.
- Novice (2): The student journal showed basic effort to organize information and think about antenna shape and properties to propose a few design decisions.
- Emerging (1): Thinking, reflection, and analysis are not demonstrated in journal notations. Comparisons are not made between designs for design decisions.

Design (Scientific) Process Demonstrated (overall design process and representation)
- Exemplary (4): The student journal displays a clear, neat understanding of the design (scientific) process with sequenced research, notes, representations, information analysis, evaluation, and adjustments.
- Proficient (3): The student journal displays neat work and understanding of the design (scientific) process with the research, notes, representations, tools, and evaluation method shown.
- Novice (2): The student journal displays only a basic understanding of the design (scientific) process, without using process steps or tools for process or evaluation.
- Emerging (1): The student journal does not display process steps or information in order to show understanding of the design (scientific) process or orderly, organized thinking.

Total: out of 20 points

Note. This rubric meets some of the tenets of rubric design, but from this rubric the student would not have sufficient information about the knowledge gaps indicated in the non-exemplary categories, just that he or she has gaps. To improve the above rubric, one could replace words like "knowledge," "skills," "content," "concepts," and "mostly" with the specific knowledge and/or skills necessary to the learning outcome, as in the exemplary category.

Rubrics are an essential component of STEM PBL, serving different purposes for those involved in the assessment process, both at the stage of the rubric's development and during its utilization in evaluation. There are many stakeholders involved in the assessment process, and the whole group should have some level of responsibility in the development of rubrics, including students, peers, the supervisor (teacher), and possibly even external evaluators, such as other content-area teachers, administrators, coaches, or interested community members. When all stakeholders are involved in rubric development, they not only understand the criteria but also own them.

The use of rubrics by students through teacher modeling can help them develop important self- and peer-assessment skills. However, in urban schools it is often difficult to enculturate self- and peer-assessment, and teachers can find the enculturation process time consuming relative to the positive impact these assessments are intended to achieve. Still, some groups of students and/or school cultures are less resistant, and teachers can be surprised by how rapidly students own the self- and peer-assessment methods. Sometimes students may be overly critical, whereas at other times they are overly accommodating. It is important to model critical feedback (Falchikov, 1995) that is both honest and constructive. Students should understand that identifying a weakness without an accompanying suggestion for improvement does not foster intellectual development.
To foster the development of self- and peer-assessment, it is important for students to (1) be involved in the development of rubrics, (2) be reflective by learning to self-assess, and (3) receive critical commentary on their assessment of peers. An enhanced understanding of learning goals and assessment criteria helps students develop metacognitive awareness and intrinsic motivation (Peckham & Sutherland, 2000). Students who regularly engage in PBL activities should be able to thoughtfully answer the following:

- How can I tell if I have learned _____ well enough?
- Does the learning serve my current needs?
- Did I learn it in a way that I will be able to use it in the future?
- Will I be able to transfer this learning to new situations?
- Do I know what I do not know?
- Do I have the necessary foundation to learn more?

Self-Regulation

Explicit assessment helps students to self-regulate their behavior. Two different levels of self-regulation are present when students are integrally involved in the assessment process. The first level of self-regulation emerges as students co-develop rubrics for assessing various aspects of the PBL activity. Through involvement in the development of the rubrics, students establish ownership of the assessment model and clearly understand what aspects of learning will be evaluated and how (Fraile et al., 2017). This process allows students to decide the degree to which their artifact meets expectations. A thorough understanding of the rubric can guide students as they implement self-regulation to plan their learning activities to achieve the objectives of the rubric. Thus, involving students in the development of rubrics fosters a sense of self-determination, as they feel like agents of their own learning.

The second level of self-regulated behavior takes place when students learn peer- and self-assessment through the application of the rubrics they develop. As students self-assess, they get to know their areas of weakness and strength and allocate their effort across the learning objectives accordingly, thus holding themselves responsible. Students also start to align the requirements of the rubric with their own learning process and to meet the requirements for their own benefit and purposes rather than merely to satisfy the teacher. Peer assessment can also function as information for students' own learning, especially when assessment focuses on the development of particular skills in a non-competitive environment. Informational feedback can further enhance students' self-regulation. The implementation of this second level of self-regulation may require several attempts and clarification by the teacher. Although applying the rubric to assess one's own learning and behavior may be difficult initially, repetition will lead to success, and the student will eventually develop an appreciation for the assessment and value for the learning task.

Formative Assessment of Teacher Enactments of STEM PBL

It is important to include the teacher in a chapter about assessment. The teacher, too, should be formatively assessed in his or her enactment of STEM PBL. We have included a sample document, which was developed by the Aggie STEM team. The Aggie STEM teacher assessment instrument follows from our STEM PBL model as well as our professional development training program.
However, this teacher assessment instrument should never be used as a summative assessment of teachers. The document is designed to provide criteria-specific information (Stearns et al., 2012). In order to improve the quality of STEM education classes, which are designed to encourage conceptual development (i.e., STEM PBL activities), teachers need feedback and support, too.

"There is considerable evidence from different studies suggesting that how teachers behave in the classroom, the instructional approaches they employ, significantly affect the degree to which students learn" (Van Tassel-Baska et al., 2008, p. 85). In fact, research shows that ineffective teachers can depress student achievement in mathematics by as much as 54% regardless of students' abilities (Sanders & Rivers, 1996). Without some form of classroom observation, teachers' assimilation of professional development ideas cannot be assessed, and continuous improvement may be compromised (Van Tassel-Baska et al., 2008). Observations can be either peer or professional in nature, but the observer needs to provide feedback to the educator so he or she may evaluate and adjust their teaching to benefit students (Patrick, 2009). See Appendix S for an example of a teacher peer evaluation. Therefore, to ensure translation of any professional development into classroom practice, assessment must be present in some form during actual teaching activities. When carefully aligned with professional development, a classroom observation instrument can be an effective tool for providing feedback about the assimilation of professional development teaching strategies.

An effective way of evaluating teaching behaviors is with the use of a specifically designed observational instrument (Guskey, 2002; O'Malley et al., 2003; Simon & Boyer, 1969). An observation tool can yield a descriptive account of targeted performances. This can be achieved with a conceptual rubric that contains a numeric range of descriptors for each predetermined objective. Observational data can also be structured with a frequency-counting or coding system (Taylor-Powell & Steele, 1996). Observational tools can serve to monitor progress toward increasing a desirable trait or diminishing an undesirable behavior based on some theoretical framework. For example, the Aggie STEM teacher assessment instrument includes this category: "The teacher worked with members of all small groups." Noting that a teacher did this well and providing a score of 4 or 5 would confirm that the actions were noteworthy and meritorious and would likely reinforce the practice. However, assigning a low score of 1 or 2 and noting in the discussion with the teacher that "too much time was spent in only one group, resulting in not checking in with or visiting other groups; this resulted in some students not making as much progress as others toward the completion of the project" would identify the issue and describe both the condition and the effect. Thus, with all these points taken together, the teacher has a solid structure for altering instruction to meet the intent of the category. The information gained through an observation tool can also be used for teacher reflection and to customize subsequent professional development. See the Project-Based Learning Observation Record (Stearns et al., 2012) in Appendix T.
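To suggest how such an observation record can be structured as data, here is a minimal sketch of a single indicator: a rating with a required written justification and a not-observed option. The indicator text is quoted from the example above; the class and field names are our own invention, not the actual Aggie STEM instrument.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObservationItem:
    """One indicator on an observation record: a 1 (no evidence) to
    5 (to a great extent) rating, justified in writing, or marked
    not applicable / not observed."""
    indicator: str
    score: Optional[int]  # None = not applicable / not observed
    justification: str

    def __post_init__(self):
        if self.score is not None and not 1 <= self.score <= 5:
            raise ValueError("Score must be 1-5 or None.")
        if not self.justification.strip():
            raise ValueError("Every score must be justified.")

# Low-score example mirroring the discussion above.
item = ObservationItem(
    indicator="The teacher worked with members of all small groups.",
    score=2,
    justification=("Too much time was spent in one group, so other "
                   "groups were not visited and made less progress."),
)
```

Requiring the justification at the data level mirrors the instrument's rule that the observer justifies every score assigned, which is what makes the feedback usable for the teacher.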
The Aggie STEM teacher assessment instrument was specifically created to evaluate observable teaching and learning objectives when teachers develop and implement STEM PBL activities in their classrooms. Teachers being evaluated with this instrument should have participated in sustained professional development (five or more full days) focusing on STEM PBL. The professional development should focus on each of the measured objectives. Both the observers and the teachers should be trained on the components and purposes of the instrument.

The instrument contains 22 items organized by six objectives: (a) STEM PBL Structure, (b) STEM PBL Facilitation, (c) Student Participation, (d) Resources, (e) Assessment, and (f) Classroom Learning Environment. The number of indicators under each objective varies. Each indicator is evaluated on a scale ranging from 1 (no evidence) to 5 (to a great extent), with the observer justifying every score assigned to each item. Occasionally, an item will not apply to what is taught during a particular observation. When this happens, or when the observer is present for only part of a STEM PBL activity, a well-documented lesson plan can provide insights and further details. The observer may still choose to indicate that a particular behavior was not applicable or not observed during the class period. Finally, the authors of the instrument at the Aggie STEM Center encourage you to seek professional development prior to using it and to participate in an observer's workshop for teachers who are already expert STEM PBL implementers in order to learn to provide constructive, formative feedback and to carefully rate teaching enactments.

Guiding Thoughts for Teachers About PBL Activities and Assessments

Think about the content you teach. Think about what makes your content area and the assessments you traditionally use distinct from assessments in other content areas. Consider the changes that PBL requires in both teaching practices and assessments (Moursund, n.d.). A sample project development rubric is included in Appendix U.

Think about how students learn. Much is known about the value of metacognition, self-assessment, and reflection on student learning. Do you think self-assessment is a valuable attribute for students who enter the workforce in a field related to your content area? How important is it in your content area or field to learn to assess one's own work and learning and that of peers or co-workers (Moursund, n.d.)?

Think about your PBL activity. Critically examine your PBL activity and the lessons or smaller activities that comprise it. Did the PBL activity cover the standards and objectives in your curriculum? Did you align your assessment tools with your standards and objectives? Did you balance formative and summative assessments? Think about providing useful formative feedback within the constraints imposed by the length of your instructional time allotment. How will you ensure the feedback is timely so students' efforts can reflect this information before the next assessment occurs (Moursund, n.d.)?

Think about your PBL activity versus traditional instruction practices. Consider the substantive adaptations or modifications you need to make in the structure of your curriculum and teaching practice. What aspects of PBL attracted you to make the effort and go through these changes (Moursund, n.d.)?
If you are satisfied with the results of your current teaching practices, then one reason to implement PBL is to infuse the social responsibility so prevalent in PBL. Perhaps you are ready to try something new that will provide a new challenge and add rigor to your activities to build on previous successes. You may have considered that times have changed and that students will need to be prepared to thrive in a STEM world where the ability to creatively solve problems in dynamic and fluid situations is a necessity. Regardless, students who are preparing to enter college will benefit from their experiences with PBL, and those students who do not participate in post-secondary education will develop a deeper and more salient understanding of the working world they will enter. All students will have the opportunity to develop the cooperation and collaboration skills that are in demand regardless of whether they become factory workers or engineers.

PBL Sample and Assessments

In the "Who Killed Bob Krusty?" PBL activity (see Appendix V), the scenario contains all the salient information a student needs to successfully engage the problem. The activity integrates calculus and science with a forensic science and criminology spin. There are important skills that need to be assessed before the start of the project and then again after its completion. In this STEM PBL, students are given the same assessment form before and after the activity. The pretest serves as one formative assessment: it provides students with a structure for what they are expected to be able to do upon completion of the STEM PBL. The posttest provides a direct measure of how much improvement was achieved through the STEM PBL. Another summative assessment may be included, such as asking students to keep a daily journal in which they reflect on their learning, record their thought processes during the STEM PBL, and discuss what mathematics they need to employ or learn more about. For teachers, the assessments provide insights about students' strengths and weaknesses so that the teacher can adjust the STEM PBL process to meet students' needs, such as by providing whole-group instruction on specific topics.

This activity can facilitate the incorporation of knowledge from additional disciplines. For example, a drawing of the crime scene can be useful to determine whether the conditions are aligned with falling from the window or being thrown. This aspect of the activity may involve the contribution of the engineering or computer-aided design teacher. Geometry and trigonometry as well as physics and chemistry topics may easily be integrated into the STEM PBL activity. Nevertheless, it is always essential to foster scientific process skills in any PBL lesson, such as those employed by medical examiners during a death investigation; that is, they rule out causes of death based on death scene characteristics, medical history, and various other factors. Additionally, in real life, coroners, forensic examiners/investigators, and police officers are included in the process as case reporters; therefore, within this activity students should also be expected to write reports to meet learning objectives, thereby facilitating connections to language arts classes. At periodic intervals during the activity, to check on learning, students should provide forensic reports that rule out possible causes of death.
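As one example of the physics that could be brought in when judging whether the scene is consistent with a fall or a throw, the sketch below applies basic projectile kinematics: a body that simply falls lands at the base of the wall, while one that leaves a window of height h with horizontal speed v lands about v * sqrt(2h/g) away. The window height and landing distance are invented numbers, air resistance is ignored, and the actual activity may call for different mathematics.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(height_m: float, horizontal_speed_ms: float) -> float:
    """Horizontal distance from the wall for an object leaving a window
    of the given height with the given horizontal speed (no drag)."""
    fall_time = math.sqrt(2 * height_m / G)
    return horizontal_speed_ms * fall_time

# Hypothetical crime scene: a window about 12 m up, body found
# 3.0 m from the base of the wall.
height = 12.0
print(f"Fall only: lands {landing_distance(height, 0.0):.1f} m from the wall")
speed_needed = 3.0 / math.sqrt(2 * height / G)
print(f"Landing 3.0 m out requires about {speed_needed:.1f} m/s of "
      "horizontal speed, which a simple fall would not supply.")
```

A worked check of this kind is exactly the sort of intermediate evidence students could cite in their periodic forensic reports when ruling causes of death in or out.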
The final report should incorporate these preliminaries and provide a detailed hypothesis and a conclusion so that students can demonstrate a clear final explanation that incorporates the mathematical and scientific processes supporting their hypothesis and conclusion.

Understanding STEM PBL

Given that this chapter is focused on assessment, it is important to connect the discussions in the book through an assessment model. The PBL Refresher Quick Quiz (see Appendix W) should be considered a formative assessment task. Some answers are not obvious initially from just reading this book. In fact, STEM PBL is much like riding a bicycle. No matter how many technical manuals one reads about riding a bike, one must still get on, fall off, and reflect on both actions and suggestions in order to master the task. What makes riding a bike so complex? It is not just one task; it is composed of many small tasks that must be mastered to enjoy success. You must be able to balance, coordinate your pedaling and steering, remember that maintaining your balance is easier as long as you are moving forward, remember how to brake, and understand that loose gravel can result in a painful lesson. Just like riding a bike, STEM PBL is not just one task but the interaction of several smaller tasks, including choosing learning outcomes, planning content, determining a scenario, writing the scenario, developing formative assessment tasks, creating rubrics, and designing summative assessment tasks. Then, once the PBL activity starts, two new tasks arise: managing the materials and managing the students. Therefore, as one reads about and implements a STEM PBL activity, one will gradually become more confident about the answers to the PBL Refresher Quick Quiz. Only the iterative process of reading about STEM PBL and implementing it in the classroom can make it second nature. Only through practice is it possible to perfect one's teaching, because it is teachers' own experiences and reflections that offer the best opportunities to improve student achievement.

Conclusion

Formative assessment is an important tool for supporting STEM PBL learning and teaching. By using formative assessment in conjunction with summative assessments, and especially authentic assessment strategies, teachers can support in-depth learning in students while also balancing specific learning goals with high-stakes testing requirements. Rubrics in particular are effective assessment tools, and understanding how to both develop and utilize rubrics for student learning and mastery of PBL is vital for the effective implementation of PBL in STEM classrooms. It is thus important for teachers to receive professional development in using and developing rubrics so that they can begin the process of practicing and evaluating their implementation of STEM PBL.

Reflection Questions and Activities

1. What are the two main types of assessment, and how do they differ, especially in relation to STEM PBL? What is authentic assessment?
2. How do assessments keep students accountable and facilitate self- and peer-reflection?
3. What are the key aspects of a rubric? Think about an activity you have used or currently use in your own classroom. Create a rubric for it using what you learned in this chapter. If there is already a rubric for it, think about how that rubric might be changed.
4. Try the PBL Refresher Quick Quiz in Appendix W. How did you do? What can you learn about your own knowledge and learning from the results?
Further Readings

Chang, C. C., Kuo, C. G., & Chang, Y. H. (2018). An assessment tool predicts learning effectiveness for project-based learning in enhancing education of sustainability. Sustainability, 10(10), 3595.

López, M. J. T., Archilla, Y. B., & Quintana, P. J. V. (2020). Individual assessment procedure and its tools for PBL teamwork. The International Journal of Engineering Education, 36(1), 352–364.

Surahman, E., Wedi, A., Soepriyanto, Y., & Setyosari, P. (2018). Design of peer collaborative authentic assessment model based on group project based learning to train higher order thinking skills of students. In Proceedings of the International Conference on Education and Technology (ICET 2018) (pp. 75–78). Atlantis Press.

Wahyuni, V., Kartono, K., & Susiloningsih, E. (2018). Development of project assessment instruments to assess mathematical problem solving skills on a project-based learning. Journal of Research and Educational Research Evaluation, 7(2), 147–153.

References

Andrade, H. G. (2000). Using rubrics to promote thinking and learning. Educational Leadership, 57(5), 13–18.

Ashcroft, K., & Palacio, D. (1996). Researching into assessment and evaluation in colleges and universities. Kogan Page.

Boaler, J. (1998). Open and closed mathematics approaches: Student experiences and understandings. Journal for Research in Mathematics Education, 29, 41–62.

Brophy, J. (2004). Motivating students to learn (2nd ed.). Erlbaum.

Cavanaugh, S. (2006). Technology helps teachers home in on student needs. Education Week, 26(24), 12.

Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. Addison Wesley.

Falchikov, N. (1995). Peer feedback marking: Developing peer assessment. Innovations in Education and Teaching International, 32, 175–187.

Fraile, J., Panadero, E., & Pardo, R. (2017). Co-creating rubrics: The effects on self-regulated learning, self-efficacy and performance of establishing assessment criteria with students. Studies in Educational Evaluation, 53, 69–76. https://doi.org/10.1016/j.stueduc.2017.03.003

Guskey, T. R. (2002). Does it make a difference? Evaluating professional development. Educational Leadership, 59(6), 46–51.

Kulm, G. (1994). Mathematics assessment: What works in the classroom. Jossey-Bass.

Moursund, D. (n.d.). Part 7: Assessment. ICT-Assisted Project-Based Learning. https://darkwing.uoregon.edu/~moursund/PBL/part_7.htm

O'Malley, K. J., Moran, B. J., Haidet, P., Seidel, C. L., Schneider, V., Morgan, R. O., Kelly, P. A., & Richards, B. (2003). Validation of an observation instrument for measuring student engagement in health professions settings. Evaluation & the Health Professions, 26(1), 86–103.

Patrick, P. (2009). Professional development that fosters classroom application. Modern Language Journal, 93, 280–287.

Peckham, G., & Sutherland, L. (2000). The role of self-assessment in moderating students' expectation. South African Journal for Higher Education, 14(1), 75–78.

Sanders, W. L., & Rivers, J. C. (1996). Cumulative and residual effects of teachers on future students' academic achievement. University of Tennessee, Value-Added Research and Assessment Center.

Secretary's Commission on Achieving Necessary Skills. (2000). What work requires of schools: A SCANS report for America 2000. U.S. Department of Labor.

Simon, A., & Boyer, E. G. (1969). Mirrors for behavior: An anthology of classroom observation instruments (ED031613). ERIC. https://eric.ed.gov/?id=ED031613

Solomon, G. (2003). Project-based learning: A primer. Technology & Learning, 23(6), 20–30.

Stearns, L. M., Morgan, J., Capraro, M. M., & Capraro, R. M. (2012). The development of a teacher observation instrument for PBL classroom instruction. Journal of STEM Education: Innovations and Research, 13(3), 25–34.

Taylor-Powell, E., & Steele, S. (1996). Collecting evaluation data: Direct observation. University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation.

Van Tassel-Baska, J., Feng, A. X., Brown, E., Bracke, B., Stambaugh, T., French, H., & Bai, W. (2008). A study of differentiated instructional change over 3 years. The Gifted Child Quarterly, 52, 297–312.

Wright, R. J. (2008). Educational assessment: Tests and measurement in the age of accountability. Sage.