Moulinette PDF
Document Details
42
Jean-Baptiste Austry
Tags
Summary
This document describes the Moulinette, the automated evaluation system for student projects at 42. It explains what the Moulinette is, how the evaluation process works, and how its mark is combined with the peer evaluations. It is written for campus staff.
Full Transcript
👁️ Moulinette

Tag: Verified
Owner: Jean-Baptiste Austry
Created time: August 11, 2023 4:13 PM
Status:

Abstract

The Moulinette is an automated evaluation of the student's code that runs after they finish the peer evaluations. The final mark of the project is calculated taking both kinds of evaluation into account. At 42, we also use the name "Moulinette" for the program and VM running this evaluation, and for the website where we can follow it up. There are several articles related to each of these concepts.

Index

General Information
    What is the Moulinette? (v.1)
    Disambiguation
    What is the Moulinette? (v.2)
Pedago PoV
    Why do we have a Moulinette?
    On which projects does it work?
    When does it run?
    How is the Moulinette mark taken into account?
The student PoV
Traces

General Information

What is the Moulinette? (v.1)

The Moulinette is an automated evaluation of the student's code. It applies to the first projects that candidates and students do at 42. When they finish a project, it is first corrected by their peers; once the peer evaluations are done, they receive the Moulinette mark. The final mark of the project is calculated taking both kinds of evaluation into account.

Disambiguation

At 42 we use "Moulinette" for four different concepts, none of them referring to the "food mill" that the word actually means in French. At 42, the word "Moulinette" can mean:

1. The pedagogical tool that evaluates the students' projects with a set of tests the pedago team has previously written. This is the meaning this page focuses on.
2. The program running this process. It receives the student's code, runs the tests and submits the results.
3. The Moulinette web interface: it shows the information related to each evaluation and lets you take different actions. For more information about the second or third definition, we recommend reading 🌎 Moulinette Website.
4. The Moulinette VM: where the Moulinette program runs. It hosts the website and contains all the code with the tests for the evaluations. If you need information about this service, have a look at the Moulinette section in 🖥️ Intra & Tools Settings.

What is the Moulinette? (v.2)

Now that we can talk about the different meanings of the Moulinette without getting lost, let's recap what the Moulinette actually is and how it works. [We refer to the student below, but it could also be a candidate on the piscine.]

1. When the student finishes a project, the intra checks whether this project has a Moulinette and what its configuration is (see the section "On which projects does it work?").
2. If the project has a Moulinette, the intra checks whether the student needs to do peer evaluations or whether the Moulinette runs right after the student closes the project (see the section "When does it run?"). If the student needs to do peer evaluations, the intra waits for them to be done, or for a maximum of 24 hours, and then triggers the Moulinette.
3. The intra sends the Moulinette program the information needed to run the tests: team id, repo URL, name of the project, etc. This is done by sending a webhook to the Moulinette program, which has an endpoint listening on an open port of the VM (see the sketch after this list).
4. The next step is more related to the technical side of the Moulinette and may not be of interest to you, so let's jump directly to the point where the Moulinette has already run the tests. (*)
5. The Moulinette uploads the mark from the tests to the intra. It also updates the trace and the mark on the Moulinette Website.
6. Once the intra has received this upload, it calculates the final mark of the team and then updates the project's final mark if needed. This is governed by the specific rules of the project or the project session. More information in the section "How is the Moulinette mark taken into account?".
7. The student may not be very happy with the output of this whole process and will argue that the Moulinette is wrong and that another student with the same code passed the tests. More information in the section "The student PoV".

(*) To sum up step 4: the information sent by the intra is transformed through different services into a task containing the student's repo and the code to test (coming from the VM's corrections folder). This task is sent to the queue to be dispatched to a computer on your campus, on which the tests are run. In the end, the mark is calculated and the information is sent back through all the services to the intra and the Moulinette website.
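To make step 3 more concrete, here is a minimal sketch of such a webhook receiver, assuming a JSON payload. The endpoint path, the field names (team_id, repo_url, project_name) and the in-memory queue are hypothetical illustrations, not the Moulinette's actual implementation.

```python
# Minimal sketch of a webhook endpoint like the one described in step 3.
# Route, field names and queue are hypothetical illustrations.
from queue import Queue

from flask import Flask, request

app = Flask(__name__)
tasks = Queue()  # stands in for the real dispatch queue of step (*)

@app.route("/moulinette/webhook", methods=["POST"])
def receive_evaluation_request():
    payload = request.get_json()
    # Information the intra sends: team id, repo URL, project name, etc.
    tasks.put({
        "team_id": payload["team_id"],
        "repo_url": payload["repo_url"],
        "project_name": payload["project_name"],
    })
    return {"status": "queued"}, 200

if __name__ == "__main__":
    app.run(port=8080)  # the "open port of the VM"
```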
Pedago PoV

Why do we have a Moulinette?

The 42 pedagogical system is based on peer learning, including the peer evaluations. Since 42 does not require a minimum coding level to start the piscine, the candidates may not yet be able to judge whether their peers' code is correct, functional and crash-free. That is why, for the first projects, we have an automated set of tests that aims to raise the quality of the students' progression.

There is another reason why a project may include a Moulinette: when peer evaluation does not apply. This is the case of the rushes (not the piscine ones, but those of the main cursus), for example. Don't hesitate to contact us if you want more information about this.

On which projects does it work?

At the moment, the projects with a Moulinette are the piscine projects (except the rushes) and the first 3 C projects of the Common Core: Libft, get_next_line and ft_printf. To see whether a project has a Moulinette, check its project scale page. Ex: ft_printf (has a Moulinette) and born2beroot (doesn't). For a project to have a Moulinette, it must be included and configured in its repository on 42born2git: the file scale/scale_common.yml.

When does it run?

The Moulinette is triggered automatically after the peer evaluations. For time-boxed projects, if the student hasn't finished their evaluations, the Moulinette is triggered automatically at the end of the time box.

How is the Moulinette mark taken into account?

Once the Moulinette has calculated the mark from the tests, two calculations come into play: the team_mark and the project_mark. To tell them apart: each time a student retries a project, a new team_mark is generated, and when the team_mark is updated the intra recalculates the project_mark.

About the team_mark:

It is calculated according to the eval_compilation rule on your PS (or the default one if you have not created your own). If there is none, the team_mark is the average between the Moulinette mark and the peer-evaluations average. When this rule is included and set to Piscine C, the team_mark is calculated as follows: if the difference between the Moulinette mark and the peer evaluations is bigger than 15 points, the team_mark is the Moulinette mark and the peer evaluations are not taken into account; otherwise, the default rule applies.

About the project_mark:

It is calculated according to the final_compilation rule included on the PS (or the default). The most common ones are (1) last_mark and (2) best_mark; both pretty self-explanatory.
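As an illustration of these rules, here is a small sketch of both calculations. The 15-point threshold and the rule names come from the description above; the function signatures and averaging details are assumptions, not the intra's actual code.

```python
# Sketch of the team_mark / project_mark rules described above.
# Signatures and averaging details are assumptions, not the intra's code.

def team_mark(moulinette: float, peer_marks: list[float],
              eval_compilation: str | None = None) -> float:
    peer_avg = sum(peer_marks) / len(peer_marks)
    if eval_compilation == "Piscine C" and abs(moulinette - peer_avg) > 15:
        # Divergent peer evaluations are discarded: the Moulinette mark wins.
        return moulinette
    # Default rule: average of the Moulinette and the peer-evaluation average.
    return (moulinette + peer_avg) / 2

def project_mark(team_marks: list[float],
                 final_compilation: str = "last_mark") -> float:
    # Each retry generates a new team_mark; the rule picks the final one.
    if final_compilation == "best_mark":
        return max(team_marks)
    return team_marks[-1]  # "last_mark"

# Example: Moulinette gives 100, peers gave 80 and 84 -> 18-point gap,
# so under the Piscine C rule the team_mark is the Moulinette's 100.
print(team_mark(100, [80, 84], eval_compilation="Piscine C"))  # 100.0
print(project_mark([45.0, 100.0], final_compilation="best_mark"))  # 100.0
```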
The student PoV

While the Moulinette is running, the student sees on their project page a message saying "Waiting for upload". Once the Moulinette has finished, its result is displayed on the page [screenshots of an unlucky and a more lucky result omitted], and the project_mark is displayed at the top of the page. The student also receives a mail with the trace of the tests performed on their code, provided the Moulinette of the project is configured with a public trace (for the student and the staff).

The students see the Moulinette as a wall they have to jump over, so when they receive an unfavourable result they will come to the pedago team claiming that their code is right and the Moulinette is not. If you are reading this page because of an issue like that, please read the pedago section on 🧐 Troubleshooting moulinette.

Traces

The trace is the log output by the tests the Moulinette runs on the student's code. The student receives their traces by mail once the Moulinette has finished grading their project.

Note: the student gets their trace only if this is activated in the yaml file that defines the exam assignment. On your local servers you should have a copy of the exam-assignment repository, and the yaml files are in the pools directory. The second line of these files contains the parameter has_trace:, which should be set to 'true'. Example: https://42born2git.42.fr/pedago_world/common/exam-assignments/-/blob/master/pools/exam-rank-02.yml on line 2.
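As a quick way to check that parameter, a script like the following could read a pool file and report whether traces are enabled. It assumes PyYAML is installed, that has_trace is a top-level key, and an illustrative local path; none of this is confirmed by the repository itself.

```python
# Sketch: check whether an exam pool sends traces to students.
# Assumes PyYAML, a top-level has_trace key, and an illustrative path.
import yaml

with open("exam-assignments/pools/exam-rank-02.yml") as f:
    config = yaml.safe_load(f)

# has_trace is documented above as living on line 2 of the file.
print("students receive traces:", config.get("has_trace", False))
```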
As a staff member, you can always have a look at the student's trace using the Moulinette Web Interface.

The trace is not always easy to read; here is the list of its different parts and what to look at in each one:

1. Host-specific information: the information of the computer where the tests were run (as explained before, each task is dispatched to a different computer of your cluster). It also includes the date and the compiler versions.

```
= Host-specific information ====================================================
$> hostname; uname -msr
e2r1p9.clusters.42paris.fr
Linux 5.4.0-96-generic x86_64
$> date
Sun 06 Feb 2022 02:20:09 PM CET
$> gcc --version
gcc (Ubuntu 10.3.0-1ubuntu1~20.04) 10.3.0
Copyright (C) 2020 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
$> clang --version
Ubuntu clang version 12.0.0-3ubuntu1~20.04.4
Target: x86_64-pc-linux-gnu
Thread model: posix
InstalledDir: /usr/bin
```

2. User files collection: the URL of the user's Vogsphere repository.

```
= User files collection ========================================================
Collecting user files from Vogsphere
Repository URL: [email protected]:vogsphere/intra-uuid-93cda832-c4e4-4f86-b863-7004d5edb0db-3984399-gamoreno
```

3. Git history: the git log of the repo.

```
= Git history ==================================================================
$> git -C /tmp/tmpnnorznl3/user log --pretty='%H - %an, %ad : %s'
a2eb6f0511f4ef6a0aa2843a977215a28c73092b - Gabriel Moreno, Sat Feb 5 22:54:17 2022 +0100 : first commit
```

4. Collected files: a listing of the files inside the repo. The top listing shows the folders; below it, the files in each folder.

```
= Collected files ==============================================================
$> ls -lAR /tmp/tmpnnorznl3/user
/tmp/tmpnnorznl3/user:
total 40
drwxr-xr-x 2 root root 4096 Feb  6 14:20 ex00
drwxr-xr-x 2 root root 4096 Feb  6 14:20 ex01
-rw-r--r-- 1 root root  910 Feb  6 14:20 __GIT_HISTORY

/tmp/tmpnnorznl3/user/ex00:
total 4
-rw-r--r-- 1 root root 958 Feb  6 14:20 ft_putchar.c

/tmp/tmpnnorznl3/user/ex01:
total 4
-rw-r--r-- 1 root root 1060 Feb  6 14:20 ft_print_alphabet.c
```

5. Exercises and each test [here is a modified example of how the traces look, from C Piscine - C 00 - Ex 00]. In order, the trace shows: the norminette check; the compilation of the user's code, taking into account the allowed functions; the compilation of the reference code; the run of the user's test; the run of the reference test; the diff between the reference and the user outputs; and, since there is no difference, the 10 points of this exercise being granted (see the sketch after this list).

```
= ex00 =========================================================================
$> /usr/bin/norminette -R CheckForbiddenSourceHeader ft_putchar.c | grep -E '^(Error|Warning)'
$> gcc -Wextra -Wall -Werror test_ft_putchar.c ft_putchar.c -o user_exe
_GLOBAL_OFFSET_TABLE_
write
$> gcc -Wextra -Wall -Werror test_ft_putchar.c ft_putchar.c -o ref_exe
= Test 1 =======================================================================
.:
total 20
-rwxr-xr-x 1 deepthought root 16672 Feb  6 13:20 test_1
$> ./test_1 > user_output_test_1
!"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~
.:
total 20
-rwxr-xr-x 1 deepthought root 16672 Feb  6 13:20 k2t0pp1sqsuzuuvt51dxdese
$> ./test_1 > ref_output_test_1
!"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~
$> diff -U 3 user_output_test1 ref_output_test_1 | cat -e
Diff OK :D
Grade: 10
```
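To summarise what that ex00 trace shows, here is a small sketch of the per-exercise flow: norm check, compile user and reference code, run both binaries, diff the outputs, grade. It only mirrors the trace above, reusing its file names; it is not the Moulinette's actual test harness.

```python
# Sketch of the per-exercise flow visible in the ex00 trace above:
# norminette -> compile user & reference -> run both -> diff -> grade.
# It mirrors the trace; it is not the Moulinette's real test harness.
import subprocess

def run(cmd):
    return subprocess.run(cmd, capture_output=True, text=True)

def grade_ex00() -> int:
    # 1. Norm check (errors or warnings would show up in the trace).
    run(["norminette", "-R", "CheckForbiddenSourceHeader", "ft_putchar.c"])
    # 2. Compile the user's code and the reference code with the same flags.
    flags = ["gcc", "-Wextra", "-Wall", "-Werror", "test_ft_putchar.c"]
    if run(flags + ["ft_putchar.c", "-o", "user_exe"]).returncode != 0:
        return 0  # code that does not compile gets no points
    run(flags + ["ft_putchar.c", "-o", "ref_exe"])  # reference build
    # 3. Run both binaries and capture their outputs.
    user_out = run(["./user_exe"]).stdout
    ref_out = run(["./ref_exe"]).stdout
    # 4. No difference ("Diff OK :D") -> the exercise's 10 points.
    return 10 if user_out == ref_out else 0

if __name__ == "__main__":
    print("Grade:", grade_ex00())
```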