How to Choose Effective eLearning - Part 3
The first two articles in this series focused on how to spot good-quality eLearning, and both apply equally to instructor-led training. Links to both articles can be found at the end of this one. In this article I want to explore a couple of other instructional design techniques that can also make a huge difference to the effectiveness of learning, regardless of the delivery method.
Active and Passive Content
Active and passive learning were first introduced when we discussed cognitive load. Most people learn better by ‘doing’ something rather than listening to or reading about it; however, effective training will always aim to strike a balance between active participation and passive learning methods. The ‘active’ component should not be included simply to add engagement without purpose. A typical example of this is having to click ‘next’ on a screen after every paragraph. This is called behavioural engagement, and while it might make the process of reading a long document less ‘passive’, it does not engage the reader mindfully. What we really want to achieve is psychological engagement: learners actively processing and organising content and connecting it to what they already know. Adding diagrams and worked examples to content can be far more effective than just breaking up text with a ‘next’ button.
The active component also needs to be at the right level for the course. Let’s assume our goal is for learners to be able to bake a cake. Spending hours in a lab experimenting to work out how much raising agent to add, rather than following a recipe, is probably not a great use of time. Experimenting is certainly more ‘active’ and will teach our learners a great deal about raising agents, but does the time required for this activity add any real value to cake baking? If our learners just want to bake a simple cake, the answer is no.
Scenarios
Unless your ultimate goal is to win the local pub quiz, most learning doesn’t become useful until you start to apply it. Scenarios are a great way to apply knowledge gained from a course to real-world situations. Many training courses only ever get to the point of ensuring you understand something, without really teaching you how to apply what you have learnt to your job. Ideally you want to go a step further and not only apply what you have learnt but modify and adapt it to other, real-life scenarios. Going back to our bakers, not only do we want them to finish our course able to bake a specific cake, we also want them to be able to use the same basic recipe and techniques to make cupcakes or a chocolate cake. We want our learners not just to remember, but to understand, apply and adapt what they have learnt.
Questioning
Writing good questions for a course is a skill in itself. It is simple enough to check whether a learner can recall or has understood something, but writing questions that check they can apply knowledge is much harder. This is particularly difficult when feedback is computer-generated rather than provided by an instructor, as is often the case in an eLearning environment. Some sort of process for checking that learning is taking place is essential; otherwise neither the learner nor the purchaser can accurately gauge how much is being learnt. If questioning is limited to remembering facts, it doesn’t check whether the material has been understood and, likewise, if questioning is limited to measuring understanding, there is no measurement of the application of knowledge.
In the last article we touched on Bloom’s Taxonomy and how this methodology applies to writing objectives. Bloom’s Taxonomy can also be used both to write questions and, more importantly for someone looking to purchase eLearning, to identify the level at which questions have been written. You don’t have to be an expert question writer to identify the level of a question; all you need to do is look at the verb. An internet search on ‘Bloom’s Taxonomy question verbs’ will bring up many sites that map verbs to question levels. Let’s assume we are writing a course on how to make a peanut butter sandwich:
Remember: What are the ingredients of a peanut butter sandwich?
Understand: Explain the purpose of using two slices of bread rather than one.
Apply: Which of the following implements would be best for spreading peanut butter?
Analyse: When would someone not want to eat a peanut butter sandwich?
Evaluate: What judgement would you make about someone who doesn’t like peanut butter?
Create: What changes could you make to the design of a sandwich for someone struggling with mobility?
At the very minimum you want a course to ask questions up to the ‘apply’ level. Whether it needs to go higher will depend on the subject matter and the expectations of the course. The ‘apply’ level was certainly high enough for our peanut butter sandwich makers, but a course on people management, for example, would need to go higher. Questions should also go up to the same level as the objective. So if your objective is to ‘classify cakes into those suitable for vegetarians’, which is at the ‘analyse’ level, then your questions should also go up to that level.
Looking out for relevant active content, scenario-based activities and questions at the right level can help you gauge the quality of a potential eLearning course. Combined with an understanding of cognitive load and the ability to spot good objectives, you should now have the tools you need to fully evaluate learning content before purchasing. Remember to evaluate how effective the learning has been at the end of the course and, more importantly, three and six months later.