
How to grade fast

Why would you want to grade fast?

Because the time to announce grades to students is short  

  • Normally, you have 15 working days to announce grades (and 10 working days before the next assessment).  
  • For Q4 assessments, in addition to the previous points, all course grades must be announced the Friday following the last regular exam week (i.e. Friday 7 July in week 5.1). 
  • For bachelor year one courses that have a resit in Q4, all grades must be announced the Friday following the summer resit week (i.e. Friday 28 July in week 5.4). All other courses have until 11 August to hand in their last grades of the year.  

For more information on the academic year, the quartiles Q1-Q4, and the summer resit period Q5, see the academic calendar.

 

Guiding principles

For all assessments, adhere to the following general quality requirements: 

Validity
  • Assessments assess all LOs (at the correct level)
  • Match between assessment method and learning objectives

Reliability
  • Unbiased grades
  • Correct pass/fail decisions (and all grades)

Transparency
  • Clear instructions:
    • Before the assessment
    • During the assessment
    • After the assessment (during inspection or discussion)

Feasibility
  • Feasible time windows to finish the assessment (students)
  • Feasible time windows to create and grade (lecturers)


Additional requirements for digital exams:
Specifically for digital exams, the following requirements must also be met: 

  • Practice with the assessment tools and question types: students (and you) need to practice with both the question types and the assessment tools before the exam. 
  • Arrange for technical support for your students and yourself during the exam. 

 

How to reduce grading time in the short term?

Minimize the time to grade your assessment 

The following measures can help you to minimise the time between the assessment and publishing the grade. The measures are divided into ideas that are relevant while constructing the questions and the exam/assignment, and ideas on optimising the grading process. 

 

Optimise the grading process 

  • Organise a second examiner who can take over if you happen to be unavailable during the course, exam or grading. Make sure they have access to all of your material. This is especially relevant during the pandemic. Share the following documents with your colleague: 
    • Exam / assignment 
    • Answer template 
    • Answer model / rubric / assessment sheet 
  • Find teaching assistants (TAs) either via ESA in your faculty (e.g. TPM) or contract them via your department, early in the academic year (otherwise, they might have other plans). Train and supervise the TAs and let them exchange experiences, to ensure good-quality and efficient grading and to update the answer model on the go. Your faculty can arrange a blended training for your TAs. 
  • Fully organise and prepare the grading of the exam/assignment as well as the resit/addition. Organise help for the process long before TAs and colleagues plan their holidays. 

 

Optimise the questions, the exam/assignment, and the use of digital assessment tools 

For each measure below: what to do, why it is faster, and any warnings, examples or tips on how to do it.

  • Divide large questions into multiple smaller questions.
    Why is this faster: Answers to short questions are more uniform and therefore faster to grade.
    Warning: Make sure that the answers to ‘analyse’ and ‘evaluate’ level questions are not given away by the follow-up questions: students might no longer need the analyse level of Bloom's taxonomy to determine the steps themselves, because the a-b-c-d sub-questions already hint at them.

  • Prevent continuation errors (doorrekenfouten) in follow-up (sub)questions.
    Why is this faster: You avoid having to check students' calculations/answers in follow-up questions that start from incorrect numbers.
    How: For sub-questions that can result in continuation errors, ask students to continue working with a specific (different) value for each sub-question. Make sure that this value is not the answer to the previous sub-question.

  • Limit the answer length.
    Why is this faster: Fewer words to read.
    How: Add a maximum number of words, or instruct students to write their answer in a frame (downside: students may write too small to read).

  • Provide scrap paper.
    Why is this faster: Answers are easier to read/follow because fewer texts/graphs contain struck-through parts.
    How: Instruct students to use the scrap paper, so that only neatly written text and neatly drawn graphs end up on the exam paper.

  • Instruct students to use a given template for their answer.
    Why is this faster: You know where to look.
    Examples: a diagram in which they need to draw something; a table in which they need to fill in either all values/cells or only the ones relevant for the applicable calculation; pre-drawn axes for graphs; a starting point for a situational sketch; a picture.

  • Have students type their answers.
    Why is this faster: Reduces time lost to legibility issues.
    Warning: It might take students more time to type their answers than to write them, for example for mathematical questions or questions in which students have to use many symbols.

  • In case of handwritten exams: instruct students to write from top to bottom.
    Why is this faster: The answers are easier to read.

  • In case of numerical questions: instruct students to first write out their solution as a formula, before filling in the given/calculated values of the variables.
    Why is this faster: This makes it easier to spot errors.
    Warning: You will need to have students practise this during the course and give them feedback on it. Furthermore, inform students about this before and during the exam.

  • In case of open-ended problems with multiple solution routes: ask students which solution strategy they took, and divide the options over graders.
    Why is this faster: It is faster to grade answers that used the same solution route.
    How: Add a short question (for 0 points) in which you ask students which solution strategy they took, and divide the options over the graders.

  • Use automatically gradable question types.
    Why is this faster: It is faster to have software grade the questions automatically.
    Examples of question types that can be graded automatically: numerical questions; symbolic questions (not possible in all assessment tools); multiple choice questions (although for many learning objectives this is not an option); short answer questions (e.g. one word).
    Downside: Setting up the exam may cost more time.
    The following tools support automatically graded questions: Ans, Möbius and Grasple (only available for math service education).

 

Mitigating negative effects of automatic grading

Negative effects of automatic grading 

Using digital assessment tools to grade answers automatically decreases the time to grade considerably. However, automatic grading decreases the reliability of the grade: 

  • If exams are scored automatically, you have no insight into why students ended up with an incorrect or a correct answer, and therefore you have fewer indications of whether questions contained errors or unclarities. In manually graded questions in which students explain their answers, unclarities or errors in the question would become apparent in the given answers. 
  • It is often not possible to grant partial points for partially correct answers. As a result, grades will be lower than for manually graded questions, where most lecturers would grant partial points for partially correct answers. 
  • In case of e.g. multiple choice questions, guessing can result in higher grades. 
  • As with any assessment tool, students might be hindered in their performance by technical problems or lack of practice with the tool.

 

Measures to mitigate the negative effects of automatic grading 

These negative effects can be mitigated by the following measures, taken before and after the exam: 

Before the exam 

  • Grant partial points for partially correct answers 
  • Set up the questions so that partial points can be given for partially correct answers; all-or-nothing scoring reduces the precision of the grade. 
  • Give a low weight to all-or-nothing questions 
  • Questions that are scored with either all or no points should not carry too many points, to keep the step size of the resulting grades small; split up questions if necessary. For example, with formula 1 below, a 10-point all-or-nothing question on a 40-point exam moves the grade by 2.25 points at once. 
  • Closed-ended questions like multiple choice: correct the grade for guessing 
  • In the case of multiple choice questions (or other closed-ended questions), correct for guessing when calculating the grade from the scores. See the examples below, where grade is the exam grade ∈ [1,10], score is an individual student's sum of points received for their (partially) correct answers in the assessment, guessing_score is the expected score obtained by randomly guessing the answer options, and max_score is the maximum score that students can receive by answering all questions fully correctly. A short calculation sketch follows the formulas. 
     
    • Original formula 1: grade = 1 + 9*(score/max_score) 
      New formula 1: grade = 1 + 9*((score-guessing_score)/(max_score-guessing_score)) 
    • Original formula 2: grade = 10*(score/max_score) 
      New formula 2: grade = 10*((score-guessing_score)/(max_score-guessing_score)) 
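
As an illustration, a minimal Python sketch of formula 1 with and without the guessing correction. The function names and the example numbers are made up; the variables follow the definitions above:

    # Minimal sketch of formula 1, with and without the guessing correction.
    def grade_original(score, max_score):
        # Original formula 1: grade on the 1-10 scale, no guessing correction
        return 1 + 9 * (score / max_score)

    def grade_guess_corrected(score, max_score, guessing_score):
        # New formula 1: subtract the score expected from random guessing
        return 1 + 9 * ((score - guessing_score) / (max_score - guessing_score))

    # Illustrative example: 40 four-option multiple choice questions, 1 point each
    max_score = 40
    guessing_score = 40 * (1 / 4)  # expected score when guessing every question
    print(grade_original(30, max_score))                         # 7.75
    print(grade_guess_corrected(30, max_score, guessing_score))  # 7.0
    # Note: for scores below guessing_score the corrected grade drops below 1;
    # whether you cap the grade at 1 is up to you (not part of the formulas above).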

 

After the exam 

Check if you need to correct your answer model: 

  • Use test result analysis to spot issues in the question/answer model, especially for automatically graded questions. If a question has a negative or low correlation with the scores on the other questions, consider possible causes and change the answer model accordingly. Fix issues for all students. For more information on test result analysis, see the UTQ ASSESS reader. A short calculation sketch follows this list.
  • If students who contact you have valid arguments on why their answer is (partially) correct, grant all students with similar answers full or partial points. 
  • You could ask students to hand in their elaborations in addition to entering their final answer in the assessment tool. That way, you can check the reason when some students come up with unexpected answers, and grant all students with similar answers partial or full points. 
  • Communicate clearly to students whether their elaborations will be used to grade their exam, or whether you will only use them in specific cases, as described here.  
  • If you change the answer model, make sure that no student receives a lower grade due to the correction (see the second sketch after this list).
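
As an illustration of the test result analysis mentioned above, a minimal Python sketch, assuming you can export a per-student, per-question score matrix from your assessment tool (the numbers below are made up). For each question it computes the correlation between its scores and the total score on the other questions:

    # Minimal sketch: item-total correlation per question. Low or negative values
    # flag questions whose answer model may need checking.
    import numpy as np

    def item_total_correlations(scores):
        # scores: one row per student, one column per question (points obtained)
        scores = np.asarray(scores, dtype=float)
        result = []
        for q in range(scores.shape[1]):
            rest_total = scores.sum(axis=1) - scores[:, q]  # total excluding this question
            result.append(np.corrcoef(scores[:, q], rest_total)[0, 1])
        return result

    # Made-up example: 5 students, 3 questions
    scores = [[2, 1, 0],
              [3, 2, 1],
              [1, 0, 2],
              [3, 2, 0],
              [0, 1, 2]]
    print(item_total_correlations(scores))

And a minimal sketch of re-publishing grades after an answer model correction without lowering any student's grade. The student numbers and grades are made up, and how you export and re-import grades depends on your assessment tool:

    # Minimal sketch: apply the corrected answer model, but keep each student's
    # old grade when the corrected grade would be lower.
    def regrade_without_lowering(old_grades, new_grades):
        return {student: max(old_grades[student], new_grades[student])
                for student in old_grades}

    old = {"1001": 6.5, "1002": 8.0, "1003": 5.4}
    new = {"1001": 7.0, "1002": 7.8, "1003": 5.9}  # after fixing the answer model
    print(regrade_without_lowering(old, new))
    # {'1001': 7.0, '1002': 8.0, '1003': 5.9}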

 

Schedule additions in advance for e.g. projects/assignments/practicals/fieldwork 

If students are allowed to hand in an addition, for example, because they missed a practical or received a low grade for an assignment, schedule this process according to the deadlines: 

  1. Communicate the grades, and the assignment for the students who are eligible for the addition, on 7 July 2023 (or earlier). If the addition is an extra assignment, make sure that it is available to the students in time. 
  2. Set the students’ deadline for handing in the addition in week 5.3 (e.g., Friday 21 July 2023). 
  3. In the case of first-year bachelor students, communicate the grades via Osiris on 28 July (or earlier). In all other cases, the deadline for handing in all grades is 15 working days after the students' deadline, and no later than 11 August. 

 

How to reduce grading time in the long term?

For next year you may want to make larger course changes; two such changes are suggested below. These changes require permission from the programme director and the Board of Studies. Please keep in mind that in most cases a new course code and a transition regulation are needed. In addition, adjustments to the assessment also require changes in the study guide. Contact your ESA department for the timeline and templates for course changes in your faculty. As a rule of thumb, changes for courses in the next academic year need to be proposed to your educational committee (Board of Studies) by your programme director (Director of Studies) in February.
