Intelligent Assessment – A window into the mind of the learner
For decades, we’ve heard CLOs use all sorts of buzzwords and catchphrases (alternate delivery strategies, modality mix, Second Life, avatars, digital learning, online simulations, virtual reality) to describe one common desire: deliver less learning in the classroom.
The common goal has been on-point: reserve in-person classroom time for topics where it is most beneficial to have people together to participate in immersive, highly valuable experiences that can’t be replicated in any other mode of delivery.
COVID-19 has changed the game for organizations practically overnight. Live learning programs have all been canceled. Some learning organizations are more optimistic than others, but nobody is planning much for the summer, and some are not planning any live deliveries for the remainder of 2020. With this unforeseen change, the “mad dash” to virtualize planned classroom deliveries is well underway.
Measuring the learning and business impact of classroom learning has always been a challenge, and that challenge is magnified in a virtual classroom environment. We often rely heavily on the experienced facilitator to be our “eyes and ears” and provide anecdotal measures of how well content was received, understood, and engaged with. In a virtual world, that line of sight is greatly diminished.
A few questions to consider:
How many of your most important and critical programs have been forced to migrate to virtual?
In the absence of a pandemic, would you have ever considered moving those strategic classes away from a live classroom?
How are you determining the effectiveness of those programs for learners in their new virtual delivery setting?
In our informal discussions, it appears the vast majority of clients have not considered measurement beyond the traditional Level 1 satisfaction survey. Further, deeper measurement strategies for existing live classes have in some cases been eliminated due to changes in overall program design during the move to virtual.
All that said, we “are where we are,” and we are all working diligently to put forth the best possible offerings and make the best of a less-than-optimal situation. We all know the common realities of working virtually, let alone learning virtually. We have categorized the common challenging realities of learning in a virtual setting as:
We believe that, for your most “mission-critical” programs that have been migrated to vILT, it may be worth considering another layer of measurement to identify and isolate the behaviors depicted in the image above. To that end, we thought creatively about how to leverage our adaptive learning solution, using it less as a truly “adaptive” learning experience and more as what we have dubbed an ‘Intelligent Assessment’.
The goal of the Intelligent Assessment is simple: provide a line of sight into the hearts and minds of your learners who are participating in virtual learning. The assessment focuses on three key data points from the learner’s point of view:
“I don’t know…” – the learner has, for whatever reason, not grasped a particular learning objective and answers “I don’t know” in the assessment.
“I am not certain…” – the learner may have struggled to completely grasp the content and, when it comes to the assessment, is unable to confidently and correctly answer questions. They submit a response but are able to “admit” they are not sure.
“I am confident…” – and right or wrong? Confident and right is the desired result. However, confident and wrong (what we affectionately refer to as Confidently-Held Misinformation) is the outcome that causes the most concern.
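The three data points above can be thought of as a simple classification of each assessment response. As a minimal sketch only (the field names and scoring rules here are our illustrative assumptions, not the actual data model of any assessment product), it might look like this:

```python
from dataclasses import dataclass

@dataclass
class Response:
    answered: bool   # False if the learner chose "I don't know"
    confident: bool  # the learner's self-reported certainty
    correct: bool    # whether the submitted answer was right

def categorize(r: Response) -> str:
    """Bucket one response into the categories described above."""
    if not r.answered:
        return "doesn't know"
    if not r.confident:
        return "uncertain"
    # Confident responses split on correctness; confident-and-wrong is
    # the "Confidently-Held Misinformation" case.
    return "confident and correct" if r.correct else "confidently-held misinformation"
```

For example, a learner who confidently submits a wrong answer would be categorized as "confidently-held misinformation", the case flagged above as the most concerning.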
These Intelligent Assessments will most commonly be used as post-tests immediately following a vILT. They could also be used as pre-tests to give a facilitator a sense of where their learner audience is starting from relative to the learning content. In either case, the result is a set of data in dashboard reports that can isolate problem areas by learner and by learning objective, allowing for immediate remediation, additional coaching, or other appropriate interventions for the learner.
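The rollup described above, isolating problem areas by learner and by learning objective, can be sketched as a simple aggregation. This is illustrative only; the record format and the choice of which categories warrant follow-up are assumptions for the sketch, not a description of any product's reporting engine:

```python
from collections import defaultdict

def summarize(records):
    """records: iterable of (learner, objective, category) tuples.
    Returns a nested mapping {learner: {objective: {category: count}}}."""
    summary = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
    for learner, objective, category in records:
        summary[learner][objective][category] += 1
    return summary

def flag_for_remediation(summary,
                         risky=("doesn't know", "uncertain",
                                "confidently-held misinformation")):
    """Return (learner, objective) pairs with any risky responses,
    i.e. the spots where coaching or remediation might be targeted."""
    return [
        (learner, obj)
        for learner, objectives in summary.items()
        for obj, counts in objectives.items()
        if any(counts[c] for c in risky)
    ]
```

A facilitator-facing dashboard would essentially render this nested summary, with drill-down from learner to objective to response category.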
Recall our cast of characters from the image referenced earlier, each coming at the vILT from a different mindset? Here is what the Intelligent Assessment was able to tell us about each of them:
Notice that two of our learners, Nina and James (don’t worry, names have been changed), both have a high level of uncertainty after completing the vILT. What you don’t see is that the data visualization allows you to drill down into exactly where that uncertainty exists, aligned to learning objectives. Likely, there needs to be some level of intervention and remediation for both James and Nina. Peter, on the other hand, has a confidence problem. He believes he knows the content; however, the assessment shows otherwise. Peter will require a different type of follow-up to get to the root cause of this confidence issue. Peter is the person who causes the most concern: someone who thinks they know something is highly likely to share that “knowledge” with others.
For your most strategic learning programs that have now made the unplanned, and possibly reluctant, move to virtual, would this level of insight into your learners not provide incremental value? Only you can answer that question. You have heard the cliché “knowledge is power”… We believe this line of sight into this level of detail allows learner impact to be taken to another level quickly and easily, with a very modest incremental investment in time and budget beyond what is already required to deliver your vILT. We are confident. Only an Intelligent Assessment could tell us if we are confident and correct.