Interactions > Assessment improvement

Changes to assessment are often driven by previous learner performance, as well as by internal and external reviews of programs. These evaluations can frame and shape assessment design. When developing assessments, educators should consider both prior evaluation data and how they will generate evaluation data to ‘feed forward’ into the next iteration of development. Prior evaluation data can include formal data gathered by the institution, as well as indicators of learner performance such as an inability to complete tasks well, disengagement, or successful achievement of the desired learning outcomes. It may also be data on a similar type of task or unit from outside the institution but within the educator’s own experience. Generating evaluation data can be as simple as this year’s teachers keeping a reflective journal on the successes and difficulties of the unit’s assessment to inform next year’s teachers, or as complex as a formal external evaluation intended for a broader audience. Some types of assessment tasks with many easily quantified items (e.g. multiple choice questions) can undergo psychometric checks for quality assurance. It is also worth remembering that asking learners to contribute to the piloting or design of assessment activities can be very valuable.

Assessment considerations:

  • What experience do you have regarding the successes or challenges faced by this or similar types of assessment? How will this, with a focus on learner performance, shape your current assessment?
  • What data are available (formal and informal) regarding the successes or challenges faced by this or similar types of assessment? How will this shape your current assessment?
  • How might you engage learners in future design of assessments?
  • What data will you generate for future iterations of the unit? What data are valuable for your type of assessment (e.g. psychometric analysis, qualitative review, learner feedback)? How can these data inform your understanding of the assessment task?
  • How will you remind yourself, or inform others, of how learners performed in this task previously?
  • How will you let learners, colleagues and your institution know about the outcomes of any evaluations?

Also refer to:

Context > Institutional assessment principles and policies

Tasks > Rationale

Tasks > Activities which drive learning

Interactions > Resistance or engagement

Educator experiences

Evaluating every semester saved time in the long run

The first time I taught my current unit, the students practically revolted over the assessment. I hadn’t taught this sort of cohort before, and I’d made a lot of incorrect assumptions about things like workload and prior knowledge. After that experience, I now devote some time in class at the start and end of semester for the students to evaluate the assessment. I get them to work through the checklist on the Assessment Futures website in groups and anonymously feed back to me. This has led to rapid improvement in the assessment, particularly around clarity of the task; I spend so much less time explaining procedural stuff about the assessment now. – Education lecturer

Keeping notes and reflections

Every semester I dedicate a two-page spread in my notebook to jotting down feedback or ideas about my unit. Much of this relates to assessment. I might have explained something poorly; a student might have misinterpreted the task in a way I should have foreseen; or alternatively someone might just have a great idea. Students seem to really like that I write down their feedback, and at the start of each semester I look back at my notes from last time and make adjustments where they’re needed. – Science lecturer

Formal course review led to improvement

Last year we went through a formal course review process. It was gruelling and meticulous, but revealed some issues with our assessment we weren’t aware of. We had huge variations in the workload between our units, which we’ve now changed. But more troublingly it showed that we were mostly just getting students to write a lot, which doesn’t really lead to the sorts of outcomes our short practically-focused postgraduate course should produce. We’re still working on that one. – Education lecturer

Review of quantitative data

We debate for ages over the midterms and the questions, and we look at the averages. We’re always trying to look at it and say, “How can we make these questions better?” Not easier, better. – Engineering lecturer

Resource: