Using AI to Teach AI: Lessons from an Online AI Class PDF

Summary

This article describes an online AI class and lessons learned about using AI to teach AI. The class incorporates principles and practices of the learning sciences and uses intelligent tutoring agents to guide students. The authors highlight the importance of interactivity and community engagement for effective online learning.

Full Transcript

Articles Using AI to Teach AI: Lessons from an Online AI Class Ashok K. Goel, David A. Joyner A I In fall 2014, we laun...

Articles Using AI to Teach AI: Lessons from an Online AI Class Ashok K. Goel, David A. Joyner A I In fall 2014, we launched a founda- I is in vogue again. As a result the demand for AI cours- tional course in artificial intelligence es and programs is growing in universities and colleges (CS7637: Knowledge-Based AI) as part across the world. This presents an opportunity for of Georgia Institute of Technology’s spreading knowledge of AI globally. Some of the increase in Online Master of Science in Computer demand comes from industry where many IT professionals Science program. We incorporated prin- ciples and practices from the cognitive want to renew their knowledge of AI or learn about it for the and learning sciences into the develop- first time. This affords an opportunity to influence the prac- ment of the online AI course. We also tice of AI in the real world. But these opportunities also pose integrated AI techniques into the major challenges. How can we satisfy the rapidly growing instruction of the course, including desire for learning about AI? How can we scale learning of AI embedding 100 highly focused intelli- so that it is repeatable and testable? How can we ensure that gent tutoring agents in the video les- the quality of learning AI at scale is comparable to that in sons. By now, more than 2000 students small residential classes? have taken the course. Evaluations have indicated that OMSCS students enjoy Recent trends in computing technology provide new affor- the course compared to traditional dances for both education in AI and AI in education. On one courses, and more importantly, that hand, the ubiquity of the Internet and the rise of cloud com- online students have matched residen- puting have enabled scaling for teaching almost any topic to tial students’ performance on the same large segments of the world’s population. This has led to the assessments. In this article, we present development of numerous massive open online courses the design, delivery, and evaluation of (MOOCs). The successes of Peter Norvig and Sebastian the course, focusing on the use of AI for teaching AI. We also discuss lessons we Thrun’s MOOC, Introduction to Artificial Intelligence, and learned for scaling the teaching and learning of AI. 48 AI MAGAZINE Copyright © 2017, Association for the Advancement of Artificial Intelligence. All rights reserved. ISSN 0738-4602 Articles Andrew Ng’s MOOC on machine learning, both based discussion forums. The video lessons are creat- launched at Stanford University in 2011, are well ed specifically for the online program; while many known (for example, Leckart , Raith ). online programs operate by recording professors live On the other hand, cognitive systems research on AI in residential classrooms, all OMSCS material is cus- in education over the last few decades has developed tom-produced for the program. The video lessons human-centered AI techniques to personalize stu- and the class forums together form the virtual class- dent learning and improve learning outcomes. These room (Joyner, Goel, and Isbell 2016). techniques are often embodied in intelligent tutor- ing systems and intelligent learning environments (for example, Azevedo and Aleven ; Koedinger Knowledge-Based AI (CS7637) and Corbett 2006; Jonassen, Peck, and Wilson 1999; It was within the OMSCS program that in January Sleeman and Brown 1982). Thus, at least in principle, 2014 we began work on an online version of CS7637: we now have computing technology for scaling Knowledge-Based AI. Ashok Goel, the first author of teaching as well as cognitive technology for support- this article, was the instructor for the course after cre- ing and assessing personalized learning. ating and teaching it on campus for some 15 years. In January 2014, Georgia Institute of Technology David Joyner, the second author, was the course inaugurated its fully accredited online Master of Sci- developer for the course after previously taking the ence in Computer Science (OMSCS) program. In residential knowledge-based artificial intelligence August 2014, we launched the first foundational (KBAI) class one year, and working as a teaching assis- course in AI, CS7637: Knowledge-Based AI (KBAI), as tant (TA) for it in another year. He also completed his part of the program. As a foundational course, the Ph.D. in human-centered computing with Goel. This material presupposed no prior experience with artifi- extant working relationship between the two proved cial intelligence; the only prerequisites are those for highly valuable in developing the online KBAI course admission into the program, including literacy in in 2014. English and training in computer programming. The KBAI class focuses on the “cognitive systems From the beginning, we adopted the methodology of school of AI” (Langley 2012) that we characterize as design-based research in developing the course, creating human-level, humanlike, and human-cen- incorporated lessons from cognitive and learning sci- tered AI (Goel and Davies 2011). The KBAI class ences in the design of the course, and integrated AI adopts a design stance toward learning about AI techniques and tools for teaching AI (Goel and Joyn- (Goel 1994), and thus much of the learning is organ- er 2016a). In this article, we present the design, deliv- ized around intensive design and programming proj- ery, and evaluation of the course, focusing on the use ects that build on one another. The design for the of AI for teaching AI. We also discuss lessons we online KBAI class follows a four-tiered learning hier- learned for scaling the teaching and learning of AI. archy consisting of learning goals, outcomes, assess- ments, and strategies. Learning goals represent what we expect students to know by the end of the course; The Georgia Tech OMSCS Program outcomes describe what we expect them to be able to Georgia Tech launched its online OMSCS program1 do in terms we can measure; assessments provide in January 2014. The video lessons for the OMSCS mechanisms for evaluating their achievement of the courses are delivered by the online education startup outcomes; and strategies prescribe methods of ensur- Udacity.2 The OMSCS program currently has about ing they accomplish the goals and outcomes, thus 4000 students, an order of magnitude more than the succeeding on the assessments. number of students in the Georgia Tech residential At a high level, the goals of the class were to under- MS in CS program, and now is the largest MS in CS stand the tasks that KBAI addresses; the methods it program in the United States (Goodman, Melkers, employs to address those tasks; the systems that com- and Pallais 2016). However, while the residential prise those methods and tasks; and the relationship degree costs several tens of thousands of dollars, the between creating those systems and understanding OMSCS program charges only $170 per credit hour human cognition. To demonstrate mastery of these and thus costs only several thousand dollars, an order learning goals, students build systems that address of magnitude less than the residential program. complex problems, and reflect on the relationship The goal of the OMSCS program is to offer the between those systems and human cognition. A full same courses online that we offer to residential MS articulation of the class’s goals, outcomes, assess- students, and with the same depth, substance, and ments, and strategies can be found in our paper An rigor. Students take the same classes and complete Experiment in Teaching Cognitive Systems Online the same assessments as residential students, receive (Goel and Joyner 2016a). grades from the same graders, and must meet the same requirements for graduation. The online stu- dents interact with the professor and the teaching Design of the Online Course assistants during virtual office hours and on web- The online KBAI course comprises 26 lessons on the SUMMER 2017 49 Articles following topics: (1) introduction to the course, (2) Enrollment in the class has varied from 200 to 400 introduction to KBAI, (3) semantic networks, (4) gen- students per term; thus, at this writing more than erate and test, (5) means-ends analysis and problem 2000 students have taken the course. The teaching reduction, (6) production systems, (7) frames, (8) staff consists of the instructor of record, a head teach- learning by storing cases, (9) case-based reasoning, ing assistant (TA), and an additional TA for every 50 (10) incremental concept learning, (11) classification, students that enroll in the course. Each of the TAs (12) logic, (13) planning, (14) understanding, (15) work for about 20 hours per week; this results in each commonsense reasoning, (16) scripts, (17) explana- student receiving roughly 7 hours of dedicated TA tion-based learning, (18) analogical reasoning, (19) time per semester. In the KBAI course, the TAs are pri- generalization and version spaces, (20) constraint marily responsible for grading assignments, while the propagation, (21) configuration, (22) diagnosis, (23) instructor and head TA take care of interacting with learning by correcting mistakes, (24) metareasoning, students on the forum and organizing the remaining (25) advanced topics, and (26) course wrap-up. The elements of class administration. lessons vary in length based on the topic (one of the One of the major lessons we have learned is that advantages of preparing the class in this medium), delivery of the online KBAI class is as important to but average approximately one hour per lesson when student learning as developing the video lectures. A including the time students spend completing the common misconception about online learning interactive exercises in each lesson. The videos of all appears to be that the video lessons are the online 26 lessons are now available freely through Udacity.2 equivalent of the traditional classroom for residential Ou, Goel, Joyner, and Haynes (Ou et al. 2016) pro- students. However, we quickly realized that the video vide an analysis of the student perceptions of the lessons were more like a textbook for the online class, video lessons. During the first offering of CS7637 in and that the true online classroom is in the discus- the fall 2014 term, only online students had access to sion forum. It is the forum that replicates most activ- these materials; however, the visuals and exercises ities that happen in a physical classroom, such as produced for the online course were reused as the class announcements, discussions, student collabora- materials for the residential section, with the same tion, and instructional support through question structure for the online and residential classes. In the answering. The discussion forum’s asynchronous, next two offerings of the residential class in the fall persistent, and self-documentation nature, however, 2015 and fall 2016 terms, residential students were fundamentally change how the discussions unfold also provided access to the online lecture materials as and the instructional support they require (Joyner, part of experiments in flipped classrooms and blend- Goel, and Isbell 2016), an observation confirmed and ed learning. repeated by instructors of other classes in the pro- The recommended readings came from several gram (Carey 2016). Generally, we and other instruc- textbooks, including Winston (1993), Stefik (1995), tors have observed that the online experience can be Nilsson (1998), and Russell and Norvig (2009). In more richly interactive than the residential experi- addition, we included several optional readings on ence, but this requires properly understanding the selected topics in cognitive systems such as Lehman, ideal roles for the video material and the online Laird, and Rosenbloom (2006) on the SOAR cognitive forum. architecture. While the course does not teach AI pro- gramming, it provides access to AI programming Evaluation resources such as the reimplementation of several We concentrate on two variables in evaluating the classic AI systems in Python (Connelly and Goel online KBAI course: class assessment outcomes and 2013) described in Norvig (1992). student experience. During semesters in which the residential section of the KBAI class is offered simul- Development and Delivery taneously, we approach evaluating learning out- Development of the online KBAI course began in Feb- comes using a quasi-experimental approach. Resi- ruary of 2014 with an intense two-day boot camp at dential and online students are given the same Udacity and ran through the launch of the course in assessments on the same schedule, and they are eval- August 2014. We estimate that during this six-month uated by the same graders. Graders evaluate the period Joyner spent approximately 750 to 800 hours assignments blind as to whether a given student is of his time and Goel spent about 200 to 250 hours enrolled online or residentially. Thus, we compare on the course development. This investment of time online and residential students’ grades on the assess- was needed because we developed all the videos from ments to ensure that the learning outcomes online scratch and specifically for the online course. The are at least as good as those on campus. As table 1 paper by Goel and Joyner (2016a) provide more illustrates, in fall 2014 we found that the online stu- details of the process of development. dents outperformed residential students on all 14 We have offered the online KBAI course each fall, assessments, with statistical significance on 7 of spring, and summer term since the fall 2014 term. To those assessments. There may be multiple explana- date we have offered the course eight times so far. tions of this phenomenon. On one hand, it may be 50 AI MAGAZINE Articles OMSCS Residential Item Max (Mean) (Mean) Assignment 1 4 3.90 3.52 Assignment 2 4 3.94 3.70 Assignment 3 4 3.95 3.52 Assignment 4 4 3.92 3.83 Assignment 5 4 3.89 3.75 Assignment 6 4 3.86 3.62 Assignment 7 4 3.91 3.77 Assignment 8 4 3.97 3.90 Project 1 100 94.47 92.61 Project 2 100 92.74 89.64 Project 3 100 93.10 92.17 Project 4 100 92.0 88.5 Midterm 100 70.2 70.0 Final Exam 75 93.76 93.48 Final Grade 100 92.32 91.31 Table 1. Average Grades Given on Each Assignment for the Residential and Online Sections of the KBAI Course during the Fall 2014 Term. possible that the instruction online is comparable to, found the vast majority of students rate the online or perhaps even superior to, the residential instruc- KBAI course as superior to courses in all three other tion; a similar dynamic has been echoed by other categories. Although student evaluations and survey instructors in the program (Carey 2016). On the oth- data are always not always completely reliable, we er hand, online students tend to be older and more attribute more credibility to these results given their experienced, and so this superior performance may consistency and the experienced midcareer status of be solely due to their superior professional back- the vast majority of students in the program. Perhaps ground and maturity in managing the coursework most interestingly, students reliably rate the KBAI and learning the course materials. Although online course more favorably compared to other college students’ superior performance is interesting, the courses than compared to other OMSCS courses, sug- greater point is that online students’ performance is gesting an underlying belief that many courses in the at least as good as residential students’ performance, OMSCS program are likely to be better on the whole providing some support to the claim that the online than traditional residential courses. degree is equivalent to the residential degree. With regard to the student experience, we ask stu- dents to compare the KBAI class to other OMSCS Using AI to Teach AI courses, to online courses and general, and to college Besides the detailed approach taken to the course’s courses as a whole. Each and every semester, we have initial creation, the KBAI course is unique in its usage SUMMER 2017 51 Articles What are the next possible states for each of the current states? REVIEW INSTRUCTIONS SUBMIT ANSWER CONTINUE TO ANSWER Figure 1. Example Exercise. This is an example exercise from the fourth lesson of CS7637: Knowledge-Based AI. Here, students are asked to fill in 24 boxes to represent the possible next states of a problem in means-ends analysis in accordance with rules provided. of AI not only as the subject matter of the course, but gent tutoring systems (Joyner and Goel 2015a, also as a tool to teach the course. In this section, we 2015b), we created about 100 “nanotutors” to sup- describe two ways in which we chose to use AI to port the exercises and embedded them in the video teach AI, which we would advocate other advanced lessons. The nanotutors are highly focused intelligent courses on artificial intelligence adopt. tutoring agents guiding students’ understanding of one narrowly defined skill such as completing a Intelligent Tutoring of AI Concepts semantic network for a particular problem or simu- While traditionally intelligent tutoring systems cre- lating an agent’s planning in the blocks world. ate computer-aided learning activities, the KBAI Figure 2 shows some of the behaviors of the nan- course already is online. The Udacity infrastructure otutor for the exercise in figure 1. The nanotutor for video lessons provides a facility for creating flexi- operates by first assessing the readability of the stu- ble interactive exercises involving multiple input dent’s input; for example, in the exercise shown in types that can be evaluated by custom Python code. figure 1, if a student entered a noninteger as input, Using that framework, we equipped the lecture mate- the nanotutor would alert the student that the input rial for the course with about 150 interactive exercis- did not match the rules of the problem, and would es. Figure 1 illustrates an example of an exercise; this reiterate the exercise’s acceptable input. In this way, exercise can be completed in the video lesson itself. the nanotutor first operates by taking open-ended In addition, building on our prior work on intelli- student text input and guiding it toward the narrow- 52 AI MAGAZINE Articles 3 0 Try again 3 0 It looks like there’s a few problems 3 0 with your answer. Please note of the issues below; note boxes are 3 0 numbered from top to bottom. 2 1 2 1 In the third box, no one moved from the right to the left. 3 0 What are the Remember, the boat can’t travel 2 1 alone! next possible states for each You’ve written the same state 3 0 twice in the first three boxes. 3 0 of the current There should be three different states? possible next states. 3 0 Remember, in this exercise we’re 3 0 looking for possible next states, for all possible next states, not just the legal ones! SUBMIT ANSWER CONTINUE TO ANSWER 3 0 Correct! Very nicely done! You’ve written 2 1 3 0 every possible next state. Next well look at how our dumb tester 3 0 would rule out some of these 2 1 states. 3 0 3 0 What are the 2 1 next possible 3 0 states for each 3 0 of the current states? 3 0 3 0 SUBMIT ANSWER CONTINUE TO ANSWER Figure 2. Feedback from the Nanotutor. Examples of two pieces of feedback the student may receive from the nanotutor based on her input. On the top (a), there are errors with the student’s reasoning in two of the states, and the nanotutor provides the guidance to correct these errors. On the bottom (b), the nan- otutor confirms that the student has successfully completed the assigned exercise. SUMMER 2017 53 Articles er set of inputs the agent can understand. The agent al. 2013). The online KBAI class too has used the would then test whether the now-readable input same kind of design projects since its inception in obeyed the rules of the problem. In figure 2a, for 2014. example, the student disobeyed a rule of the prob- To allow students to participate in these projects lem. The nanotutor explains the rule to the student. authentically, we supply them a set of RPM-style Then, if the student input is readable and all rules are problems that we developed, and then test their obeyed, the nanotutor assesses whether the final agents against the real Raven’s Progressive Matrices state matches the goal state. If not, the nanotutor (which is never provided to the students directly for directs the student to the difference between their copyright reasons). Two examples of these RPM-style answer and the goal state. At every step of the problems are shown in figure 3. This authenticity has process, the nanotutor contextualizes the feedback in several pedagogical benefits. First, it contextualizes terms of the concept demonstrated. the challenging elements of the project as inherent to Altogether, the 150 exercises in the course are the problem rather than artificially for the sake of dif- equipped with approximately 100 nanotutors; some ficulty. Second, it encourages students to think not exercises share nanotutors, and others have no indi- just about the established principles and methods of vidualized feedback. The construction of these nan- the community, but also the dynamic and emerging otutors addresses the problem of labor in constructing theories. Third, it provides to students a fundamen- intelligent tutoring systems; each nanotutor required tal view on the types of questions and methods the on average of less than an hour to build, ranging from community asks and uses. The quality of some stu- a few minutes to several hours depending on the dent projects is high enough that it already has led to extent to which generalizable frameworks could be one publication (Joyner et al. 2015), and more are leveraged for the individual tutor’s reasoning. forthcoming as former students in the class have Evaluation of the nanotutors is embedded in the begun follow-on projects building on their classwork. two forms of course evaluation described previously. First, the nanotutors act in support of the course’s video lessons, which is assessed through the written Lessons Learned assignments and examinations. Only online students Delivering the online version of CS7637 has been an received access to these exercises in nanotutors in the incredible learning experience for us over the past fall 2014 term, and therefore it is possible that this two years. We have both been struck by the owner- access is responsible for the online students’ superior ship of online students over their class experience. performance on the assessments. Second, in surveys Every semester we have strived to improve the class of student satisfaction, we explicitly ask about stu- and leverage lessons we learned during the previous dents’ perceptions of the interactive exercises and semester. The following subsections are five of the accompanying nanotutors. In general we found that lessons we have learned along the way that we would about 80 percent of students agree that the interac- recommend transferring both to future online class- tive exercises improve their understanding of the es in AI and to other online learning programs in material, and about 75 percent of students agree that general. Interestingly, these lessons generally nanotutors also help enhance their understanding of demand considerable expertise and commitment to the material. developing strong online experiences; however, they do not necessarily demand enormous resource Authentic Engagement in AI Research investment. Although producing the class required Research in cognitive and learning sciences informs an enormous number of person-hours from Goel, us that student learning is enhanced through engage- Joyner, and the video production team at Udacity, ment with authentic scientific practices (for example, many of the elements that contribute to the success Edelson ). Thus, for several years, our residen- of the class are not reliant on this kind of resource tial KBAI classes have using design and programming investment. projects that derive from real AI research (Goel et al. 2013). In particular, our research laboratory is inves- Integrating Interactivity tigating problem solving on the Raven’s Progressive from the Beginning Matrices Test of human intelligence (RPM; Raven While most educators know the value of active learn- ) and has developed several techniques for AI ing in delivering superior learning outcomes, the lack agents to address RPM problems with human-level of experience with online education and in the performance (Kunda, McGreggor, and Goel 2013; intensity of the production process can lure many McGreggor, Kunda, and Goel 2014). Thus, for the last first-time online course creators into a rote instruc- few years residential KBAI classes have been using tional approach. Thus, many online courses simply design projects derived from our research on RPM record traditional lectures with no interactivity what- problem solving: the students re-create the AI agents soever, while others inject token rather than deep we have developed in our laboratory but are also interactivity. For example, simple multiple-choice or encouraged to design their own techniques (Goel et unevaluated essay prompts are common in MOOCs, 54 AI MAGAZINE Articles Basic Problem B-12 1 4 A B 2 5 C # ? 3 6 Basic Problem E-11 A B C 1 5 D E F 2 6 3 7 ? G H # 4 8 Figure 3. RPM-Style Problems. The figure provides examples of 2x2 (top) and 3x3 (bottom) RPM-style problems used during the projects in CS7637: Knowledge-Based AI. Students begin by working on simpler 2x2 problems, and over time start to approach more complex 3x3 problems. SUMMER 2017 55 Articles program. The community knowledge surpasses the material we could ever deliver intentionally through Training/Learning AI agent preprepared lecture material. Nothing we do can replicate the power of having actual AI researchers as An instructor thinks this is a good note students (and later, teaching assistants) in the class. Not only are the students fantastically qualified, but Which basic D problems are you having... they also take significant ownership over the class experiences. Figure 4 shows a snapshot of the level of Funny AI-Gone-Wrong Shirt student activity in the class: students created four dis- cussions, three questions and a poll in 36 hours dur- Tip: Java snippet to render image Calcu... ing the Summer 2015 semester, drawing over three An instructor thinks this is a good note dozen responses and two student answers to class- 2 Unresolved Followups mates’ questions. Three of these posts involved stu- dents sharing to help their classmates, while a fourth I was wondering what should we write i... posed a philosophical discussion question and a fifth was purely social. No incentive was given for partici- Do assignments have to focus exclusiv... pation; this student ownership is purely organic. In response to this discovery, we have learned to WEEK 7/5 - 7/11 take active steps to empower the student communi- ty in the online KBAI class. Thus, we have created a What is creativity? more accommodating collaboration policy to maxi- mize the extent to which students may learn from Question about lecture 23 their well-qualified classmates. We also stress usage of a peer review system that pairs each student with sev- eral classmates on each assignment, allowing them to benefit from the professional experience of other students. Figure 4. Discussion Snapshot. Leveraging Research for Authentic Projects A snapshot of the discussions created by students in a 36-hour period dur- As noted previously, one of our approaches to using ing the summer 2015 offering of the KBAI course. AI to teach AI is to engage students in authentic research projects that can immediately translate to publications or participation in active groups. This but these approaches do not take full advantage of requires two unique efforts. First, the projects that the interactivity possible in this medium. The video students work on within the class must be designed lessons in the online KBAI class, by contrast, was con- in such a way that there is the potential they may structed with interactivity as its foundation. Every translate to real-world publications and research. In lesson is built around an example of some type of rea- KBAI, students re-create and contribute to an ongo- soning an AI agent could perform, and students are ing body of research pursued by the community. frequently asked to simulate or predict the results of However, creating projects that have the potential this reasoning themselves. Rather than adding sim- to carry over into real-world research does not guar- ple questions after the fact, this interactivity was the antee they actually will. Steps must also be taken to foundation of the initial course scripting process. As support students interested in continuing to pursue noted above, we observed that online students out- those projects. We have accomplished this in a num- performed residential students in fall 2014 on the ber of ways: by specifically offering students the course assessments; if this result is due to the superi- opportunity to collaborate on a publication based on or instructional material, it suggests this design deci- their project work; by opening master’s projects and sion improves students’ learning outcomes and satis- theses for students to continue developing their proj- faction with the course. ects for class credit; and by setting up a research lab targeted at online students. Empowering the Student Community As noted above, arguably our greatest lesson from Recreating Features of the Residential Class teaching this course has been the role of the com- One of the common criticisms of online education is munity of learning in the online KBAI class. First, the the perception that to offer the class online, certain student community in the program is remarkably material, relationships, or procedures must be well-qualified: nearly a fifth of students already have removed, thus weakening the class. We have graduate-level degrees of some kind, and many have observed that many of these, such as student-student worked professionally in software development, data and student-instructor interaction, are no weaker science, or related fields for years prior to entering the online than in person (and in fact, may be stronger). 56 AI MAGAZINE Articles However, there are other elements of the residential been one of the most satisfying educational experi- class experience that are taken for granted and must ences of our careers, and we wholeheartedly encour- be re-created manually. For example, we initially age anyone with the opportunity to participate in underestimated the extent to which having a regu- this new environment to try it out for themselves. larly scheduled meeting time sets up what we call a The level of student motivation, engagement and classroom cadence, a rhythm to the class’s interac- ownership are worth the massive time needed to cre- tion. We replicated this in part through weekly rou- ate and deliver these courses. That said, there are tine announcements to create an online equivalent many open issues left to address. With regard to of the in-person routine. using AI to teach AI, we are still exploring the range AI may play a key role in this lesson as well. The of topics that can be addressed by nanotutors, the dynamics that create a classroom cadence are rou- level of authenticity that can be provided through tine, predictable, and foreseeable; thus, it should be class projects, as well as the development of virtual possible to equip an AI agent with the ability to inter- teaching assistants can answer automatically some act in a way that establishes that rhythm. AI agents classes of questions on the discussion forums. With like our nanotutors may also play a role in re-creating regard to online education as a whole, we must con- natural features of the residential class; the online tinue to explore metrics for ensuring that learning environment does not have a natural equivalent of a outcomes rival or exceed residential equivalents and class exercise in which an instructor can intervene represent real value to students. live to give feedback, but our interactive exercises play exactly this role. Acknowledgements We are grateful to several people for their support for Using Automated Evaluation for the development of the OMS CS7637 KBAI class: the Frequent Formative Feedback staff of the Georgia Tech OMSCS program, especially As education scales up to classes with hundreds or Zvi Galil, Charles Isbell, and David White; the Geor- thousands of students, one of the pushes is for an gia Tech team within Udacity, especially Sebastian increased emphasis on automatic evaluation. This Thrun, Jason Barros, and Jennie Kim; and the staff of can range from simple multiple-choice quizzes to Georgia Tech Professional Education, especially Nel- more complex simulation-graded assignments. What son Baker. We are especially grateful to our video pro- is often lost in this emphasis is the incredible influ- ducer, Aaron Gross, for producing the MOOC mate- ence these forms of automated evaluation can have rial for the course, and to Joe Gonzales for creating on students’ individual feedback cycles in working the peer review system we use in the class. Lastly, we on assignments. Many classes only run these auto- are grateful to the numerous who have worked as mated evaluators after students have submitted their CS7637 Teaching Assistants since Fall 2014. This arti- work. The emphasis here is on generating grades, not cle is based in part on Goel and Joyner (2016a, generating feedback or supporting learning experi- 2016b). ences. If the evaluation is generated automatically, though, it presents a wonderful opportunity to equip Notes students with the tools necessary to rapidly iterate in 1. www.omscs.gatech.edu. their understanding. 2. classroom.udacity.com/courses/ud409/. As noted previously, students in the KBAI class 3. www.youtube.com/watch?v=WbCguICyfTA. design agents that can answer a set of problems the student can see, and we then evaluate them against a set of problems the student cannot see. Prior to the References summer 2016 semester, this latter step was only con- Azevedo, R., and Aleven, V. 2013. International Handbook of ducted after the submission deadline. In summer Metacognition and Learning Technologies. Berlin: Springer. 2016 term, however, we launched a new automated doi.org/10.1007/978-1-4419-5546-3 grader that would allow students to see their agents’ Carey, K. 2016. An Online Education Breakthrough? A Mas- results on those unseen problems without having ter’s Degree for a Mere $7,000. The New York Times, Septem- access to the problems themselves. This presents a ber 28. (www.nytimes.com/2016/09/29/upshot/an-online- education-breakthrough-a-masters-degree-for-a-mere-7000. lesson to any course developing automated evalua- html) tion solutions: while it is natural to focus on such Connelly, D., and Goel, A. 2013. Paradigms of AI Program- solutions for generating the grades necessary to scale, ming in Python. In Proceedings of the Fourth Symposium on make sure to extend the benefits of those automated Educational Advances in AI. Palo Alto, CA: AAAI Press. evaluators to the students as well through frequent Edelson, D. 1998. Realising Authentic Science Learning formative feedback. Through the Adaptation of Science Practice. In Internation- al Handbook of Science Education, ed. B. Fraser and K. Tobin, 317–331. Dordrecht, The Netherlands: Kluwer. Conclusions doi.org/10.1007/978-94-011-4940-2_19 Creating and delivering the online KBAI class has Goel, A., and Joyner, D. A. 2016a. An Experiment in Teach- SUMMER 2017 57 Articles ing Cognitive Systems Online. International Journal for Schol- Leckart, S. 2012. The Stanford Education Experiment Could arship of Technology — Enhanced Learning 1(1). Change Higher Learning Forever. Wired, March 20. Goel, A. 1994. Teaching Introductory Artificial Intelligence: Lehman, J.; Laird, J.; and Rosenbloom, P. 2006. A Gentle A Design Stance. In Improving Instruction of Introductory Introduction to Soar, An Architecture for Human Cogni- Artificial Intelligence: Papers from the AAAI Fall Symppo- tion: 2006 Update. Unpublished Paper, The University of sium. Technical Report FS-94-05. Menlo Park, CA: AAAI Michigan, Ann Arbor, MI. Press. McGreggor, K.; Kunda, M.; and Goel, A. 2014. Fractals and Goel, A., and Davies, J. 2011. Artificial Intelligence. In Hand- Ravens. Artificial Intelligence 215 (October): 1–23. book of Intelligence, 3rd edition, ed. R. Sternberg and S. Kauff- doi.org/10.1016/j.artint.2014.05.005 man, 468-484. Cambridge, UK: Cambridge University Press. Nilsson, N. 1998. Principles of Artificial Intelligence: A New Goel, A. and Joyner, D. 2016a. An Experiment in Teaching Synthesis. San Francisco: Morgan Kauffman Publishers. Cognitive Systems Online. International Journal for Scholar- Norvig, P. 1992. Paradigms of AI Programming. San Francisco: ship of Technology-Enhanced Learning 1(1). Morgan Kauffman Publishers. Goel, A., and Joyner, D. 2016b. Design of an Online Course Ou, C.; Goel, A.; Joyner, D.; and Haynes, D. 201). Designing on Knowledge-Based AI. In Proceedings of the Thirtieth AAAI Videos with Pedagogical Strategies: Online Students’ Per- Conference on Artificial Intelligence, 4089–4094. Palo Alto, CA: ceptions of Their Effectiveness. In Proceedings of the Third AAAI Press. (2016) ACM Conference on Learning @Scale 2016, 141–144. Goel, A.; Kunda, M.; Joyner, D.; and Vattam, S. 2013. Learn- New York: Association for Computing Machinery. ing about Representational Modality: Design and Program- doi.org/10.1145/2876034.2893391 ming Projects for Knowledge-Based AI. In Proceedings of the Raith, A. 2011. Stanford for Everyone: More Than 120,000 Fourth Symposium on Educational Advances in Artificial Intelli- Enroll in Free Classes. MindShift. San Francisco: KQED Inc. gence. Palo Alto, CA: AAAI Press. (ww2.kqed.org/mindshift/2011/08/23/stanford-for-every- Goodman, J.; Melkers, J.; and Pallais, A. 2016. Does Online one-more-than-120000-enroll-in-free-classes) Delivery Increase Access to Education? Harvard University Raven, J. C. 1941. Standardization of Progressive Matrices, Kennedy School Faculty Research Working Paper Series 1938. British Journal of Medical Psychology 19(1): 137–150. RWP16-035. Cambridge, MA: Harvard University. doi.org/10.1111/j.2044-8341.1941.tb00316.x Jonassen, D.; Peck, K.; and Wilson, B. 1999. Learning with Russell, S., and Norvig, P. 2009. Artificial Intelligence: A Mod- Technology in the Classroom: A Constructivist Perspective. New ern Approach, 3rd ed. Engelwood Cliffs, NJ: Prentice Hall. York: Merrill/Prentice-Hall. Sleeman, D. H., and Brown, J. S. 1982. Intelligent Tutoring Joyner, D., and Goel, A. 2015a. Improving Inquiry-Driven Systems. London: Academic Press. Modeling in Science Education Through Interaction with Intelligent Tutoring Agents. In Proceedings of the 20th ACM Stefik, M. 1995. Knowledge Systems. San Francisco: Morgan Conference on Intelligent User Interfaces, 5-16. New York: Asso- Kauffman Publishers. ciation for Computing Machinery. Winston, P. 1993. Artificial Intelligence, 3rd Ed. Cambridge, doi.org/10.1145/2678025.2701398 MA: The MIT Press. Joyner, D., and Goel, A. 2015b. Improving Scientific Mod- eling through Metacognitive Tutoring Based on Functional Roles of Teachers. In Proceedings of the 37th Annual Meeting Ashok Goel is a professor of computer science in the School of the Cognitive Science Society. Wheat Ridge, CO: Cognitive of Interactive Computing at Georgia Institute of Technolo- Science Society. gy in Atlanta, USA. He is also the director of Georgia Tech’s Design and Intelligence Laboratory, and the Ph.D. program Joyner, D.; Bedwell, D.; Graham, C.; Lemmon, W.; Martinez, in human-centered computing. For more than 30 years, O.; and Goel, A. 2015. Using Human Computation to Ashok has conducted research into artificial intelligence, Acquire Novel Methods for Addressing Visual Analogy Prob- cognitive science, and human-centered computing, with a lems on Intelligence Tests. Paper presented at the Sixth focus on computational design, modeling, and creativity. International Conference on Computational Creativity, He is the editor-in-chief of AI Magazine. As part of the Park City, UT, 29 June–2 July. OMSCS KBAI class described here, he developed Jill Watson, Joyner, D. A.; Goel, A.; and Isbell, C. 2016. The Unexpected a virtual teaching assistant for answering questions in Pedagogical Benefits of Making Higher Education Accessi- online discussion forums.3 ble. In Proceedings of the Third Annual ACM Conference on Learning at Scale. New York: Association for Computing David Joyner is the product lead in charge of the Georgia Machinery. doi.org/10.1145/2876034.2893383 Tech Online Master of Science in Computer Science Koedinger, K., and Corbett, A. 2006. Cognitive Tutors: Tech- (OMSCS) at Udacity, as well as a lecturer in the Georgia Tech nology Bringing Learning Science to the Classroom. In The College of Computing, teaching three online classes — Cambridge Handbook of the Learning Sciences, ed. K. Sawyer, CS6460: Educational Technology; CS6750: Human-Com- 61–78. Cambridge, UK: Cambridge University Press. puter Interaction; and CS1301: Introduction to Computing. He is the founder and director of LucyLabs, a research lab Kunda, M.; McGreggor, K.; and Goel, A. K. (2013). A Com- dedicated to research by and about online students, as well putational Model for Solving Problems from the Raven’s as the 2016 recipient of the College of Computing’s Lock- Progressive Matrices intelligence Test Using Iconic Visual heed Excellence in Teaching award, and the 2017 recipient Representations. Cognitive Systems Research 22-23(June): 47– of the College of Computing’s Outstanding Instructor 66. doi.org/10.1016/j.cogsys.2012.08.001 award. Langley, P. 2012. The Cognitive Systems Paradigm. Advances in Cognitive Systems 1: 3-13. 58 AI MAGAZINE

Use Quizgecko on...
Browser
Browser