In this rapidly changing digital world, I am often left quite disappointed by the lack of regard given to the online learning environment in this country, particularly in the VET sector. Sure, we have some pretty cool Learning Management Systems (LMSs) that will give you all the bells, coloured lights and progress bars you can handle as you work your way through your course, but what are you actually learning (I mean really, REALLY learning) during your course, and how is that learning being measured?
You don’t need to be an adult education expert to know that you cannot gauge competence without assessment, but it should be equally evident that you cannot judge competence through a range of spurious and ineffective assessments, even if they are “just formative assessment questions”. Simply because we have moved online does not release us from our obligations under the Principles of Assessment (any assessment) and the Rules of Evidence (all evidence).
Of particular concern is the trend to take the physical, real-life, experienced assessor out of the process altogether, relying instead on magical algorithms and other devices to ‘assess’ competence. In other words, if you get, say, 80% of the proffered multiple-choice questions correct, you’re competent. Well done you! You have clearly shown that you can refer back to a select piece of text to find an answer (at least 80% of the time, anyway).
That, in my opinion, is not evidence of ‘learning’ at any AQF level, and most definitely not at Level 4 and beyond. The simple regurgitation of information, or the oft-cited strategy of “always picking ‘C’ and/or ‘all of the above’”, will never provide enough insight into the learner’s comprehension of the skills and knowledge they are there to acquire. The fact that this lazy assessment method usually offers zero real feedback only compounds the problem. What about that 20% you got wrong? Let’s just hope it was nothing important, right?
It is through formative assessment that we, as educators, should get a definitive understanding of the learner’s comprehension of the subject matter, including application in a range of situations, together with their preparedness to undertake the more rigorous activities associated with summative assessments.
Quality education needs quality processes, and quality people driving them. We need real, experienced assessors providing real, insightful feedback to students on their individual and unique responses to questions that are open to interpretation, not a formulaic approach as provided in some automated, online rubrics. We need less “From the list below…” or “Which of the following…” and much more “In your own words…” or “Describe a situation where…”.
Yes, it will cost money to resource these assessment demands, and, yes, the whole assessment process may slow down to accommodate feedback and re-assessment. But to achieve the end goal of ensuring our students actually get what they are paying for (real learning through quality training and quality assessment), it is a commitment worth making by every genuine VET provider working in an online environment. We owe it to the students, and to the industries relying on VET to produce quality candidates who can drive and lead change in our future workforce.