This article discusses the use of rubrics to grade student assessments at Southern Cross University.
What is a rubric?
Assessment rubrics (sometimes referred to as criteria sheets) are guides that help markers make consistent and reliable judgements about the quality of student work. They are also used to give students feedback about the quality of their work and how they might improve. Rubrics are commonly presented in the form of a matrix that includes:
- Marking criteria – the elements that the marker will consider when judging a piece of work (such as quality of argument, research, technical aspects, etc.)
- Performance levels – the grade standards or SCU Grade Descriptors that apply to the assessment (e.g. High Distinction, Distinction, Credit, Pass, Fail for graded units)
- Descriptors – detailed and descriptive statements about the performance level of each criterion (these should be objective and measurable)
Rubrics may vary in complexity from simple tables to detailed matrices that provide descriptions for each dimension of quality and characterise each level of accomplishment. Rubrics can be adapted to grade many different types of assignments including essays, reports, oral presentations, group work and research papers. It is important to ensure that rubrics are simple and clear so that students can readily understand and engage with them. The following video introduces rubrics in more depth.
The simple illustration of an analytic rubric (below) highlights the key elements used to grade student assessment against SCU Grade Descriptors.
Figure 1. Elements of an analytic rubric.
Note that the criteria used in a rubric should be weighted based on their importance to the assessment task and the related learning outcomes. All criteria should be relevant to a student demonstrating the assessment learning outcomes (e.g. essay writing mechanics could demonstrate a unit learning outcome about 'written communication skills', but the graphical design of a title page wouldn't).
Grading Standards at Southern Cross University
- The University’s Grade Descriptors apply to all assessment tasks. Avoid using other performance levels (such as 'competent', 'advanced' or 'poor') in rubrics, as these confuse markers and students alike.
- See the Grade Description guidelines and Rule 3 Coursework Awards to learn more about the grading standard policies in use at Southern Cross University.
- Turnitin digital rubrics can be used to grade student work, which can automate the calculation of rubric percentage ranges and criteria weightings.
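As a rough sketch of the arithmetic such tools automate, the following Python snippet combines per-criterion scores and weightings into an overall percentage and maps it to a grade descriptor. The criteria names, weightings and percentage bands here are illustrative assumptions only, not official SCU values; check the Grade Description guidelines for the actual bands.

```python
# Illustrative sketch only: combining weighted rubric criteria into an
# overall percentage and grade descriptor. The criteria, weightings and
# grade bands below are hypothetical, not official SCU values.

def overall_percentage(scores, weights):
    """Combine per-criterion percentage scores using fractional weightings."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1.0"
    return sum(scores[c] * weights[c] for c in weights)

def grade_descriptor(percentage):
    """Map an overall percentage to a grade descriptor (assumed bands)."""
    bands = [(85, "High Distinction"), (75, "Distinction"),
             (65, "Credit"), (50, "Pass")]
    for cutoff, grade in bands:
        if percentage >= cutoff:
            return grade
    return "Fail"

scores = {"creative thinking": 80, "prototype design": 70, "communication": 60}
weights = {"creative thinking": 0.5, "prototype design": 0.25, "communication": 0.25}

total = overall_percentage(scores, weights)
print(total, grade_descriptor(total))  # 72.5 Credit
```

Tools like Turnitin perform this calculation automatically; the value of careful rubric design is ensuring the weightings reflect each criterion's importance to the learning outcomes.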
Why use rubrics?
Rubrics bring transparency to assessment and marking for both staff and students. Rubrics play three main roles in assessment:
- Assist markers to make consistent and reliable judgements about the quality of student work, enable them to form a shared understanding about how grades should be awarded, and increase the efficiency and consistency of marking and moderation processes.
- Provide feedback to students about the quality of their work and how they might improve.
- Provide guidance to students by explicitly communicating how their work will be graded, and what they need to focus on when addressing the marking criteria. This can help to clarify and articulate industry or discipline standards to students.
The use of rubrics is a key element of Southern Cross University's approach to assessment policy.
Rubrics and assessment policy
Under the SCU Assessment, Teaching and Learning Procedures, all graded assessment tasks are assessed using clear, explicit criteria (in the form of a rubric, where appropriate) that are aligned to the Unit Learning Outcomes. Approved exceptions exist where the assessment task is designed to fulfil an accreditation requirement that does not count towards a grade, or where a unit is specified as ungraded in the Unit Outline (assessed as Satisfied Requirements or Fail). Other exceptions (such as examinations) must be approved by the faculty's Associate Dean of Teaching and Learning.
Meeting rubric guidelines
An appropriately designed rubric meets the assessment policy guidelines by:
- including explicit and detailed specifications of "Marking criteria" and "Expected student performance" (see Assessment, Teaching and Learning Procedures: 11c & d).
- ensuring that "Marks will be based solely on merit and academic achievement assessed against academic standards with explicit criteria" (see Assessment, Teaching and Learning Procedures (44)).
- using the Grade Description guidelines to assess and grade student work (also see Rule 3 Coursework Awards).
When can rubrics help with assessment?
Rubrics are more than just an assessment tool used by markers to grade work. Rubrics can support all three stages of the unit assessment process:
| | Rubric benefits – Markers/Assessors | Rubric benefits – Students |
| --- | --- | --- |
| Before starting assessment | For the marking team, the rubric provides an opportunity to explain and moderate understandings about criteria and standards before marking commences, using the unit calibration process before term starts. | For students, a rubric provides a scaffold for assessment as learning: it explains what is required in the assessment task and provides important cues about the expected elements and approaches (Ragupathi & Lee, 2020). |
| During the assessment | Rubrics give the Unit Assessor a specific point of reference to direct students to if they are unsure of, or confused about, assessment task expectations. This also tends to reduce questions and confusion overall. | During the assessment process, rubrics provide students with a roadmap to ensure they are on the right track, and help them self-assess their work in progress against the marking criteria. |
| After the assessment has been submitted | Rubrics provide a detailed framework for consistently judging individual student submissions and encourage systematic feedback on student performance against each criterion. Quality rubrics help to reduce marker bias (Chakraborty et al., 2021). | Rubrics provide important information to students about the quality of their performance against the specified criteria, allowing them to diagnose their strengths and weaknesses and see where they can improve. Rubrics also give students transparency about academic standards and how grades are derived. |
Calibration and rubrics
Calibration is a process of co-creation and peer review carried out by the Unit Assessor and all academic staff involved in teaching and marking a unit, before the start of the teaching term. It ensures that the learning activities and assessment support student achievement of the learning outcomes, and that this is reflected in the design of marking rubrics.
The teaching team involved in calibration will:
- establish that the learning tasks used are valid preparation for ULOs
- agree on the interpretation of assessment marking rubric field descriptors, and
- outline the ‘best practice’ strategies to guide students towards achievement of ULOs and assessment LOs in class.
See the Assessment Moderation Guidelines for more information about calibration.
Rubrics and constructive alignment
For rubrics to validly assess student work at Southern Cross University, they must be constructively aligned with the learning outcomes and assessment task requirements. Constructive alignment is the design of learning, assessment and rubrics around the unit learning outcomes (Biggs & Tang, 2007). Well-designed rubrics provide a clear illustration of student progress towards learning outcomes.
Figure 2. Constructive alignment between learning outcomes, teaching and learning activities and assessment tasks (adapted from Biggs 1999, p. 27)
The following video illustrates how rubrics can be constructively aligned with assessment and learning outcomes, ensuring they are valid and reliable tools for assessing student work.
Developing a rubric using constructive alignment
Ideally, the development of an effective rubric starts early in the unit development process, when learning outcomes are being decided (however, this is not always possible). It is critical to clearly identify the criteria and standards for the rubric and adapt these to the assessment context (Ragupathi & Lee, 2020). This rubric development process is illustrated in Figure 3 and covered in the following four steps.
Figure 3. Rubric development as part of course design (in Ragupathi & Lee, 2020, adapted from Huba and Freed, 2000).
Step 1. Start with the learning outcomes
Formulate the unit learning outcomes and start thinking about ways that these could be assessed.
e.g. Learning Outcome 1: Use an iterative creative process to develop interactive projects
Step 2. Develop an assessment method
When thinking about a suitable assessment, unpack the learning outcome phrasing:
- Actions: the verbs or behaviours that have been used. What will students demonstrate or do?
- Content: the skills or content that is being covered. What is this about knowing or doing?
- Context: the situations, conditions or circumstances. In what context and to what level will students demonstrate this?
Consider adding specific elements to each assessment task that provide evidence of mastery of the mapped learning outcome/s. Ensure that these elements are the focus of the assessment and that other requirements are supportive.
e.g. Project 1: Develop an interactive prototype website. You will storyboard at least three ideas and then use iterative design to develop a final prototype of one of your ideas.
What if the assessment has already been decided?
Start with the unit outline/UCMS unit report and find the information about learning outcomes and assessment. It's worth double-checking that the assessment design provides opportunities for students to demonstrate the learning outcomes (e.g. an essay would be an inappropriate task for demonstrating the development of an interactive project). If you establish that the assessment is not aligned with the mapped learning outcome/s, discuss changes urgently with the course coordinator or the Associate Dean of Teaching and Learning in your faculty.
Step 3. Choose relevant marking criteria
Your criteria should be based on the key elements identified in the learning outcomes: Actions, Content, Context.
For example, the criteria to assess could include:
- creative thinking
- prototype design
Choose the right marking criteria for written work
In the last video, the learning outcome was about 'writing an argument in an essay form', so it made sense to use writing mechanics (grammar, spelling, punctuation, tone and style) as a key marking criterion: demonstrating the essay format depends on them. However, other units may not require students to write essays to demonstrate learning outcomes. Ask yourself: what (if any) academic writing criteria are the focus of the learning outcomes for this assessment? Be careful not to assess students solely on their ability to demonstrate academic writing, unless this is specified in a learning outcome and directly related to the assessment.
A common place to come unstuck is the use of 'generic rubrics', which may contain criteria unrelated or unimportant to the assessment task or learning outcomes being assessed. Sometimes criteria are given a large weighting when they are of minor or secondary importance to demonstrating the learning outcomes. For example:
- 'referencing' as a criterion, where the task is to write a personal reflection and the learning outcome is about 'reflecting on personal experiences'
- 'student attendance' or 'writing mechanics' as criteria in an oral presentation, where the learning outcome is about 'presenting information orally'
- 'graphic design' as a criterion in a mind-mapping task, where the learning outcome is about 'linking concepts and relating ideas'
- 'APA referencing' assigned a 20% weighting in a written report, where the learning outcome is 'developing a project proposal'. (Instead, 'APA referencing' could form part of a 'report communication' criterion covering tone, style, formatting and structure.)
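To make the weighting point concrete, the hypothetical calculation below recomputes the same student's overall mark under two weighting schemes. Every criterion name, score and weighting is invented for illustration.

```python
# Hypothetical illustration of how over-weighting a secondary criterion
# (here 'referencing' at 20%) can distort an overall mark. All scores
# and weightings are invented for this example.

def weighted_total(scores, weights):
    # Percentage scores combined by fractional weightings (summing to 1.0).
    return sum(scores[c] * weights[c] for c in weights)

# A student who is strong on the learning-outcome criteria but weak on referencing.
scores = {"project proposal": 85, "analysis": 80, "referencing": 40}

# Scheme A: 'referencing' over-weighted at 20% of the mark.
weights_a = {"project proposal": 0.4, "analysis": 0.4, "referencing": 0.2}

# Scheme B: 'referencing' reduced to a minor 5% weighting, with the
# weight returned to the learning-outcome criteria.
weights_b = {"project proposal": 0.5, "analysis": 0.45, "referencing": 0.05}

print(round(weighted_total(scores, weights_a), 1))  # 74.0
print(round(weighted_total(scores, weights_b), 1))  # 80.5
```

Under Scheme A, a strong submission slides several marks lower purely because of a secondary criterion; keeping weightings proportional to each criterion's importance to the learning outcomes avoids this.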
Step 4. Write rubric descriptors
Develop a rubric descriptor for each criterion. Ensure that what you are measuring or observing for each criterion clearly demonstrates a component of the learning outcome. You may wish to start with the High Distinction level.
e.g. High Distinction for creative thinking: Evaluates and reflects on the creative process, draws perceptive insights. Adopts robust design thinking methodology to develop a range of novel or unique ideas.
More rubric resources
The following resources provide further guidance on the design and writing of rubrics:
Biggs, J., & Tang, C. (2007). Teaching for Quality Learning at University (3rd ed.). Buckingham, UK: McGraw Hill & Open University Press.
Chakraborty, S., Dann, C., Mandal, A., Dann, B., Paul, M., & Hafeez-Baig, A. (2021). Effects of rubric quality on marker variation in higher education. Studies in Educational Evaluation, 70: 1–12. https://doi.org/10.1016/j.stueduc.2021.100997
Ragupathi, K., & Lee, A. (2020). Beyond Fairness and Consistency in Grading: The Role of Rubrics in Higher Education. In Diversity and Inclusion in Global Higher Education. Springer. https://doi.org/10.1007/978-981-15-1628-3_3
Bolton, C. F. (2006). Rubrics and adult learners: Andragogy and assessment. Assessment Update 18(3): 5–6.
Orrell, F. (2021). TEQSA Online learning good practice: Designing an assessment rubric. https://www.teqsa.gov.au/sites/default/files/designing-assessment-rubric.pdf.
Petkov, D., and Petkova, O. (2006). Development of Scoring Rubrics for IS Projects as an Assessment Tool. Issues in Informing Science and Information Technology 3: 499–510.
Reddy, Y. M., and Andrade, H. (2010). A Review of Rubric Use in Higher Education. Assessment & Evaluation in Higher Education 35(4): 435–448. https://doi.org/10.1080/02602930902862859.
University of Western Sydney (2015). Developing an assessment guide to support the implementation of a criteria and standards-based approach to assessment at UWS (2nd ed.). Learning and Teaching Unit. https://www.westernsydney.edu.au/__data/assets/pdf_file/0004/449860/PVC5557_Assessment_Guide_LR3.pdf