Deploying a training programme also means measuring the benefits it delivers, and an array of KPIs (Key Performance Indicators) exists for this purpose. Whether gathered on the training platform or via external, more general indicators, these statistics represent a significant source of information. But which statistics are most relevant when measuring the impact of asynchronous training?
On the platform
Quantitative feedback from learners on the course
It’s important to be able to analyse learner satisfaction in the moment and at a glance. Nothing beats a rating system for this, allowing learners to grade their course out of five or ten: it provides an overall picture which is easy to interpret. Discover the example of Roche Diabetes Care France, which managed to capitalise on learner satisfaction.
The satisfaction survey should come at the end of the course and, if it’s included in all of your courses, should ideally be very short. Of course, there’s no point in looking at the grade on its own.
Qualitative feedback from learners about each course
Obtaining a quantitative grade from learners gives you a quick score and lets you know if a piece of content is really sub-standard or, on the other hand, if it’s a resounding success.
Once you have garnered this information, you need to ask yourself why the course earned the score it did. Learners therefore need to be able to justify their answer and explain what they loved and what they hated, and it’s important for you to take this information on board in order to make improvements, showcase the plus-points, or even draw out best practices to apply to future courses. Choose an open-ended question and give learners freedom when responding; above all, refrain from influencing their answer. For example:

“What feedback can you provide about the course?”

“What did you like about the course? What didn’t you like?”
Learners’ average basket
How much of the course are learners doing on average? Your learners’ average basket will provide you with particularly important information: do you have a well-developed learning culture within your organisation or within your team? Even if your courses are marvellously designed, if your employees don’t get on board, perhaps you should be investing your time and energy elsewhere, for example by propagating a culture of learning (awareness-raising for managers, executive sponsorship, setting up communication drives, etc.).
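As a purely illustrative sketch (the data shapes and field names below are hypothetical, not taken from any specific platform export), the average basket can be computed as the mean share of the available catalogue each learner has completed:

```python
# Hypothetical per-learner activity: modules completed out of modules available.
# Names and numbers are illustrative only.
learners = [
    {"name": "A", "completed": 8, "available": 10},
    {"name": "B", "completed": 3, "available": 10},
    {"name": "C", "completed": 5, "available": 10},
]

# Average basket: mean proportion of the catalogue consumed per learner.
average_basket = sum(l["completed"] / l["available"] for l in learners) / len(learners)
print(f"Average basket: {average_basket:.0%}")  # → Average basket: 53%
```

A low figure here points to an adoption problem rather than a content problem, which is exactly the distinction this indicator is meant to surface.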
Time spent training, completion rate and success rate
Time spent training is a supplementary factor to the average basket. Indeed, the average length of your courses can vary, especially if your catalogue contains both microlearning and longer pieces of content.
This statistic gives you an insight into learner retention, and into which content learners spend the most time on (for courses of comparable average length). The completion and success rates are two indicators which allow you to track a learner’s interest in the subject. In this way, you can take a more measured look at the pedagogical design and adapt it to suit specific needs: it’s hard to make an impact if no one is finishing your courses. Another interesting statistic is the number of times a course is launched; if learners return regularly, it means they have embraced the course, find it of interest and, more importantly, that your application has won their loyalty.
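These three indicators can be derived from a simple activity log. The record structure below is a hypothetical example, not a real LMS export format:

```python
# Hypothetical course activity records; field names are illustrative.
events = [
    {"learner": "A", "launched": 2, "completed": True,  "passed": True},
    {"learner": "B", "launched": 1, "completed": True,  "passed": False},
    {"learner": "C", "launched": 3, "completed": False, "passed": False},
]

enrolled = len(events)
completion_rate = sum(e["completed"] for e in events) / enrolled
# Success rate among finishers only, so in-progress learners aren't penalised.
finished = [e for e in events if e["completed"]]
success_rate = sum(e["passed"] for e in finished) / len(finished)
launches = sum(e["launched"] for e in events)

print(f"Completion: {completion_rate:.0%}, success: {success_rate:.0%}, launches: {launches}")
```

One design choice worth flagging: success can be computed over all enrolled learners or over finishers only; the two denominators answer different questions, so pick one and apply it consistently.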
After measuring usage, it’s worth placing the data in context to gain an overview. At this point, it’s important to reconnect the course’s pedagogical aims with the concrete steps the learner has put in place.
Ideally, you should always have pre-course statistics and compare them with post roll-out data.
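This pre/post comparison boils down to simple arithmetic. The figures below are invented for illustration (e.g. monthly sales per consultant), and, as the article notes, any uplift should be cross-checked against other factors before being attributed to the training:

```python
# Hypothetical KPI readings before and after the training roll-out.
pre_rollout  = [12.0, 9.5, 11.0, 10.5]
post_rollout = [13.5, 10.0, 12.5, 11.0]

pre_mean = sum(pre_rollout) / len(pre_rollout)
post_mean = sum(post_rollout) / len(post_rollout)
# Relative uplift of the post-roll-out average over the baseline.
uplift = (post_mean - pre_mean) / pre_mean

print(f"Pre: {pre_mean:.2f}, post: {post_mean:.2f}, uplift: {uplift:.1%}")
```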
It’s also important to consult statistics not related to your solution.
Outside the platform
Improved performance of trained individuals

The most direct indicator is an increase in the results of trained individuals between before and after the course. In the context of sales force training, for example, managers can observe the sales performance of consultants prior to and after the course. The impact could not be more visible, especially for business or product-specific courses. Depending on the courses you offer, this statistic can rely on product sales, customer satisfaction or better mastery of a tool.
It’s therefore particularly interesting to cross-check this statistic with other data to ensure it isn’t affected by other changes (for example a sales increase of a product in-store could also be due to an advertising campaign).
Another statistic which may be paired with this, particularly for customer service and consulting courses, is the end-customer satisfaction survey. Is the end customer more satisfied than they were before the course? Such a change can be a true indicator.
A drop in turnover, talent retention
This statistic is longer-term in nature, of course, but it is particularly relevant. Did you know that an organisation’s learning culture can help attract and retain talent? The impact of your training programme can thus be monitored using this indicator.
The average length of time an employee spends working for an organisation can be increased if your training programme is successful. Once again, this data should be contextualised in order for it to remain relevant.
Calculating the impact of your courses therefore requires several types of quantitative and qualitative data, so that you can analyse, measure and describe the different aspects of your training programme and anticipate its impact.
Statistical analysis has a sole purpose here: to understand and address areas for improvement, while scaling up the drivers of success. This data should form the basis of a continuous action plan for developing your mobile learning training programme.
Julia began her career as a school textbook editor, then worked as a teacher and pedagogical coordinator at university level, before joining the Learning Experience team at Teach on Mars to contribute her expertise in pedagogy. Gamification and differentiated instruction are among her particular hobby-horses.