How to Leverage Data in Training Needs Analysis?

The training needs analysis is a real staple in the corporate L&D field. Everyone does it, yet the real value-add is ambiguous. The traditional ways of doing it are not employee-centric, which results in learning activities that inconvenience rather than enable. While extensive use of data to support the analysis is clearly the best direction to take, organisations often don’t understand how. Thus, we wanted to explain a few different ways you could leverage data in your training needs analysis.

Using search data to understand what your learners really need

One of the biggest problems in training needs analysis is that the people doing it often don’t really talk to the end users of the training. And naturally, they don’t have the time either. While it would be nice to sit down for a 1-on-1 with each learner, that’s often neither practical nor feasible. But what if the learners could talk to us anyway? That’s where data collection comes in handy.

Monitoring what your employees search for during their work can be a really good indicator of the types of things they need to learn. As most workplace learning happens that way – employees searching for quick performance support resources – you should really aim to understand that behaviour. So, why not start playing Google? You should already have the capability to track search history on company devices or within your learning systems. These searches are highly contextual, as they happen within the direct context of learning or work. It’s just a matter of compiling this data and using it to support your training decisions.
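To make this concrete, here’s a minimal Python sketch of the idea – the queries, topic keywords and counts are purely illustrative, and a real implementation would read from your actual search logs:

```python
from collections import Counter

# Hypothetical sample of search queries captured from company devices
# or the search bar of a learning platform (all content is illustrative).
search_log = [
    "how to export pivot table excel",
    "excel pivot table tutorial",
    "gdpr customer data retention",
    "excel vlookup not working",
    "gdpr consent form template",
    "how to export pivot table excel",
]

def top_search_topics(queries, keywords):
    """Count how often each candidate training topic appears across queries."""
    counts = Counter()
    for query in queries:
        for topic in keywords:
            if topic in query.lower():
                counts[topic] += 1
    return counts.most_common()

# Map raw queries onto a few candidate training topics.
print(top_search_topics(search_log, ["excel", "gdpr"]))
```

Even this naive keyword count surfaces which topics your people are repeatedly hunting for, and thus where performance support or training might pay off.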

Using real-time learning data to identify organisational skill gaps

Another stream of data that you should be looking into when doing training needs analysis comes directly from the learning activities themselves. First of all, you should make sure that the learning data you collect is relevant and actually gives an accurate representation of learning. If you’re not yet using xAPI, start now. You’ll unlock a whole new level of analytical power.

Once you’ve got that covered, you should track that data across the board. This gives you access to insights at the individual, group and subject-matter levels. For subject matter (i.e. training topics), you’re better off tagging all your learning content appropriately. By keeping an up-to-date mapping of which learning experiences relate to which topics or competencies, you enable quick glances into your organisation’s learning. For instance, a skills heat map might aggregate this tagging data and learning data to give you a visual representation of the areas in which your learners lack competence. Then you can start drilling down to the group and individual levels to determine why some are succeeding and some are not. This helps you craft better and much more personalised training activities and learning solutions.
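As a rough illustration of how a skills heat map could aggregate tagged learning data – the learner names, topic tags and scores below are entirely hypothetical:

```python
from collections import defaultdict

# Hypothetical learning records: (learner, topic_tag, score in 0..1).
# In practice these would be derived from tagged learning activity data.
records = [
    ("alice", "data-privacy", 0.9),
    ("bob",   "data-privacy", 0.4),
    ("alice", "negotiation",  0.5),
    ("bob",   "negotiation",  0.3),
]

def skills_heat_map(records):
    """Average score per topic tag - low averages flag competence gaps."""
    scores_by_topic = defaultdict(list)
    for _, topic, score in records:
        scores_by_topic[topic].append(score)
    return {topic: sum(s) / len(s) for topic, s in scores_by_topic.items()}

heat = skills_heat_map(records)
# "negotiation" averages 0.4 vs 0.65 for "data-privacy":
# that is the area to drill down into first.
```

A real heat map would add a visual layer on top, but the aggregation logic is essentially this simple.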

Using performance data to understand the business needs

Naturally, organisational learning should always support the business rather than inconvenience it. Therefore, it’s important to measure and understand performance. If you don’t keep track of performance, it’s impossible to measure real learning impact and, consequently, to do effective training needs analysis. Performance data is everywhere, often scattered across the business in various systems and silos. Different departments might have their own data, and some of it may be centralised. But whether it’s sales, marketing, customer-facing staff, operations, finance or HR, the data is often there already. And it’s incredibly important to tap into this data, regardless of where it sits.

However, one extremely important thing to note is that you should not use performance data in isolation. Rather, you should always compare it with your learning data. For instance, looking at performance data alone, you might see that the performance of department X is lacking. The easy answer would be to “assign” more training. However, the learning data could reveal that training has not solved the problem before, and thus you should be looking at completely different solutions. Furthermore, you should always be careful about jumping to conclusions when linking learning to performance impact. The L&D department might claim a performance improvement as a quick win, but a deeper cross-analysis with learning data could reveal that the improvement wasn’t actually caused by the training.
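A toy sketch of that cross-check might look like this – the departments, figures, thresholds and decision rule are illustrative only, not a prescription:

```python
# Hypothetical per-department figures: a performance index (0..1) and the
# number of training hours already delivered on the problem area.
departments = {
    "sales-east": {"performance": 0.62, "prior_training_hours": 120},
    "sales-west": {"performance": 0.61, "prior_training_hours": 5},
}

def recommend(dept):
    """Naive rule: only suggest more training where little has been tried.
    Where heavy training hasn't moved performance, look for other causes."""
    if dept["performance"] >= 0.7:
        return "no action"
    if dept["prior_training_hours"] > 50:
        return "investigate non-training causes"
    return "assign training"

for name, data in departments.items():
    print(name, "->", recommend(data))
```

The two departments perform almost identically, yet the learning data leads to opposite recommendations – which is exactly why performance data alone is not enough.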

Final words

Overall, there are tremendous amounts and types of both learning and non-learning data we can leverage in training needs analysis. The above provides just a few examples. With better analysis, we can provide better learning experiences and positively impact business performance. Not to leverage the vast amounts of data available would simply be foolish.

If you need help crafting more data-driven learning strategies or adopting technology to do so, let’s talk. Just drop us a note here.

More Learning Ideas

How to Use Learning Analytics? 3 Value-add Cases

As corporations become more data-driven in their decision-making, learning & development has to follow suit. To make better decisions, you naturally need to collect a lot more learning data. But that alone isn’t enough. You also need the capability to analyse the data and understand what it means. While there’s a lot of ambiguity around corporate training analytics – and some organisations intentionally make it sound extremely difficult – it doesn’t have to be. To clear up some of that ambiguity, here are 3 different use cases for learning analytics that are applicable to organisations of all sizes.

1. How to use learning analytics to increase engagement?

One of the bottleneck issues in corporate learning today is engagement. It’s not always an easy task to put out learning experiences that resonate with the learners and keep them engaged. Naturally, your content has to be of good quality, and you should likely use a fair bit of interactivity. But once all that is said and done, you should unleash the analytics.

Through learning content analytics, we can get a much better understanding of our users. We can see which pieces of content are used the most or the least. We can also understand ‘when’ and ‘where’ learners tend to drop off, which enables us to start figuring out ‘why’. Furthermore, we can drill down to each interaction between the learner and the content, instructors or other learners to really understand what is working and what is not. All of this (and a fair bit more!) enables us to constantly develop our learning experiences based on real information instead of gut feelings and opinions. And when we make our content more relevant and to-the-point, a lot of the engagement tends to come naturally.
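For instance, a simple content-level metric like completion rate could be computed along these lines – the event log and content IDs are made up for illustration:

```python
# Hypothetical event log: (content_id, learner, completed?) tuples,
# e.g. derived from "launched" vs "completed" learning records.
events = [
    ("video-01", "alice", True),
    ("video-01", "bob",   False),
    ("quiz-03",  "alice", True),
    ("quiz-03",  "bob",   True),
    ("video-02", "alice", False),
    ("video-02", "bob",   False),
]

def completion_rates(events):
    """Share of launches that ended in completion, per piece of content."""
    launched, completed = {}, {}
    for content, _, done in events:
        launched[content] = launched.get(content, 0) + 1
        completed[content] = completed.get(content, 0) + int(done)
    return {c: completed[c] / launched[c] for c in launched}

rates = completion_rates(events)
# "video-02" completes for nobody: a prime candidate for rework.
```

Completion rate is only a starting point, but even it immediately separates content that engages from content that needs a rethink.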

2. How to use learning analytics to personalise learning experiences?

Our professional learners – the employees – come with various skills, degrees of experience, education and backgrounds. As they certainly don’t represent a uniform sample, we shouldn’t put them through a one-size-fits-all learning experience either. As organisations have understood this, the hype around personalised learning has grown significantly over the past few years. But it’s not just hype: there’s real value to personalisation that learning analytics can help us unlock.

First of all, learning analytics help us understand the different individuals and groups of learners in our organisation. By drilling down all the way to the level of an individual’s interactions, we can understand our learners’ needs and challenges much better. This enables us to cater to their various strengths, diverse learning histories and varying interests. Instead of providing a one-size-fits-all learning experience, we can use this information to design personalised learning paths for different groups, or even for individuals. These learning paths can branch out and reconnect based on difficulty of content, experience, current job and various other factors. The learning experience thus becomes a spider’s web instead of a straight line, and you’ll be able to catch many more of your learners.
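A heavily simplified sketch of such branching logic – the fields, thresholds and module names below are hypothetical:

```python
def assign_path(learner):
    """Branch a hypothetical course into different entry points based on a
    prior-knowledge score and role - a toy version of a personalised path."""
    if learner["prior_score"] >= 0.8:
        # Experienced learners skip straight to the final stretch.
        return ["capstone-project"]
    if learner["role"] == "manager":
        return ["core-module", "leadership-lab", "capstone-project"]
    return ["basics", "core-module", "capstone-project"]

novice = {"prior_score": 0.2, "role": "analyst"}
expert = {"prior_score": 0.9, "role": "analyst"}
# All paths reconnect at the capstone, like branches of a web.
```

In a real system, the branching inputs would come from analytics (interaction history, assessment data) rather than two hand-set fields, but the principle is the same.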

3. How to use learning analytics to prove the impact of learning?

Proving the impact or ROI of learning is something that L&D professionals often struggle with. One of the reasons for that struggle is not using learning analytics. For measuring learning results in terms of knowledge acquisition, a data-driven approach beats traditional multiple-choice testing and feedback forms by a long shot. Furthermore, it enables a much more formative way of assessment, thanks to all the data points collected and available.

But knowledge acquisition alone isn’t enough to demonstrate corporate learning impact. After all, what good is the learning if no one applies it? Thus, it’s imperative that we combine learning analytics with performance metrics and indicators. By doing this, we get a lot closer to real learning results. For example: how did the sales training affect the sales staff’s routines, behaviours and performance? How much risky behaviour did the compliance training help to eliminate? Is our training on team management actually resulting in teams being managed better? By enabling this level of analytics, you can answer a lot more questions. Furthermore, you can start asking questions you weren’t even aware of before.

In our work, learning analytics and data-driven approaches play a big part. While technology matters, there’s obviously more to it. For instance, you want to be sure that you’re setting your corporate learning objectives in a way that enables this. If you’re looking to move towards more data-driven learning strategies or understand your training impact better, we can probably help. Just reach out to us here.

How to Set Better Corporate Learning Objectives?

When designing learning activities, one of the first things to consider is what you want to accomplish with the training. Without proper goals, you can’t really know what to measure, let alone demonstrate the effects of the learning. While all L&D departments probably do set goals, not all of those goals are very meaningful. Specifically, learning professionals tend to set knowledge-based goals (e.g. “after this training, the participant will know good practices of leadership in the workplace”). However, the accumulation of knowledge, while a noble goal, doesn’t in itself provide any value to the business. It’s the enactment of new, desired behaviours, i.e. implementing the learning on the job, that determines the value-add. Thus, to effectively demonstrate the value of learning in our organisations, we need to set our corporate learning objectives differently. Here’s a 3-step process to do that.

1. Define the workplace behaviours that you want to affect with training

First, you need to determine the specific behaviours you’d like to affect through training. And really, it means getting specific (you’ll run into trouble in #2 if you don’t). To continue with the leadership example: “we want our managers to become better leaders”. Bad. “We want our managers to have more frequent conversations with their direct reports”. Better.

The behaviours will naturally vary by topic, and some are easier to drill down to than others. However, “loose” learning objectives masked as performance objectives, like the first example above, will turn out to be near impossible to measure.

2. Figure out what to measure and how. Don’t rely on self-reported data

If the first step is critical, the what and how of measurement is often the decisive one in the context of corporate learning objectives. When trying to assess behavioural change (i.e. the impact of said learning) in organisations, two major mistakes happen across the board.

First, not understanding what to measure. In similar fashion to setting the learning objectives, the ‘what’ is often too vague. If you’re doing sales training, measuring sales growth directly is too broad: you’re cutting corners and making dangerous assumptions. Sales may increase, but the increase may have no connection with the training. Rather, the effect could be due to the external environment, team relationships, incentives, seasonality, etc. Therefore, you need to drill down deeper. A proper level for sales training, for example, would be individual metrics, such as conversion ratios, time on calls, etc. These may or may not result in performance improvement, but that’s for you to find out without making ill-founded assumptions.
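For example, tracking per-person conversion ratios before and after a training could be as simple as this sketch (all names and figures are invented):

```python
# Hypothetical per-rep call data before and after a sales training:
# (calls_made, deals_closed).
before = {"alice": (200, 20), "bob": (180, 9)}
after  = {"alice": (210, 31), "bob": (190, 10)}

def conversion(calls, deals):
    """Conversion ratio: share of calls that closed a deal."""
    return deals / calls

def conversion_change(before, after):
    """Per-person change in conversion ratio, in percentage points."""
    return {
        rep: round(100 * (conversion(*after[rep]) - conversion(*before[rep])), 1)
        for rep in before
    }

print(conversion_change(before, after))
```

An individual-level metric like this still doesn’t prove the training caused the change, but it gives you a far more honest signal than top-line sales growth.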

Second, the ‘how’ of measurement is often lacking as well. If you really want to make an impact through better corporate learning objectives, it’s important to get this right. First, never rely on self-reported results. People lie, exaggerate, underestimate and aim to please, and even anonymity doesn’t remove the barrier to giving honest answers. Rather, you should always use hard data. If the data is not readily available through non-learning channels (e.g. HR systems, performance management systems, ERPs, CRMs, etc.), find a way to capture the needed information.

3. Quantify your corporate learning objectives

The relieving thing is that once you really drill down on the specific behaviours and get objective data sources, quantifying your learning objectives becomes much easier. In fields like sales, finance, marketing or operations, it is naturally easier still. But even in the earlier leadership example, there’s quite a large difference between “we want our managers to be 50% better leaders” and “we want our managers to have 50% more conversations with their direct reports”. The first is impossible to measure accurately, hence the quantification is moot. The second can be measured e.g. through internal network analysis, communication metadata and even calendar appointments.
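As a toy illustration of measuring that second objective from calendar metadata – the names, dates and meeting records are fictitious:

```python
from datetime import date

# Hypothetical calendar metadata: (manager, direct_report, meeting_date).
meetings = [
    ("maria", "tom",  date(2019, 1, 8)),
    ("maria", "tom",  date(2019, 1, 22)),
    ("maria", "lisa", date(2019, 1, 15)),
    ("james", "omar", date(2019, 1, 10)),
]

def one_on_ones_per_manager(meetings):
    """Count 1-on-1s per manager - the hard-data counterpart of the
    objective 'managers should have 50% more conversations'."""
    counts = {}
    for manager, _, _ in meetings:
        counts[manager] = counts.get(manager, 0) + 1
    return counts

counts = one_on_ones_per_manager(meetings)
```

Compare the counts for one period against a baseline period, and the “50% more conversations” objective becomes directly verifiable.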

Furthermore, once you quantify the learning objectives, you’re setting a transparent set of expectations. Consequently, you’ll have a much easier job selling the idea to your management and subsequently reporting the results. Once we analyse things a bit more deeply, we can assign “dollar values” to the changes in workplace behaviour. The value of sales staff converting 10% more of their calls is real and tangible, and it’s easy to track whether the learning investment is paying off. When the behaviours are less tangible (e.g. that leadership practice), you should agree with the business heads on what those behaviours are worth to the business. For things like learning company values, this might seem silly, but you should consider doing it nonetheless to enable transparency in assessment and reporting. Of course, as you probably haven’t measured learning this way before, it’s important to acknowledge that in the beginning. So don’t punish yourself if you don’t “hit the target” right away.
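Here’s the arithmetic behind such a dollar value, using invented baseline figures:

```python
# Hypothetical baseline: 5,000 calls per quarter, 10% conversion,
# $400 average deal value. What is a 10% relative lift in conversion worth?
calls_per_quarter = 5_000
baseline_conversion = 0.10
avg_deal_value = 400

baseline_deals = calls_per_quarter * baseline_conversion        # 500 deals
lifted_deals = calls_per_quarter * baseline_conversion * 1.10   # 550 deals
added_revenue = (lifted_deals - baseline_deals) * avg_deal_value

print(added_revenue)  # a tangible figure to weigh the training cost against
```

With a number like this agreed upfront, reporting the results of the training becomes a matter of comparing actuals against a transparent target.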

Final words

By using this simple 3-step approach to setting corporate learning objectives, understanding the link between learning, impact and performance becomes a lot less burdensome. On an important note: once you’ve put this in place, you really need to measure things and commit to using the data. Collecting data and insights, even if done properly, is itself a bad investment if you or your management still resort to making assumptions rather than trusting hard facts.

If you need help in understanding your organisation’s learning on a deeper level or to develop a data-driven learning strategy, contact us. We’ll walk you through what it takes.

Omnichannel Learning – Steps Towards Unified Experiences

The concept of omnichannel comes from the retail sector, where retailers strive to provide a seamless, unified and personalised shopping experience across different channels, such as online, mobile and physical stores. Organisations that fail to utilise some of the individual channels, or to integrate them seamlessly, tend to struggle with low customer engagement. While omnichannel is not much of a buzzword in the learning and development space yet, we should adopt the same ideology. After all, learning engagement, as well as tracking learning across different channels, is a challenge for many organisations. Here’s how we could move towards an omnichannel learning approach to tackle these problems.

Omnichannel learning starts with cross-platform functionality

We live in the era of learning apps. For almost every need, there’s an app. On top of that, you have your corporate LXP (or LMS) systems, learning portals, intranets and co-working platforms. The problem is that these systems often don’t communicate very well with each other. Your learner may complete a learning activity in a dedicated application, but that completion isn’t reflected in any way in the content that e.g. your LMS pushes to them. Running multiple platforms easily results in an incredible amount of duplicate work and activities. Furthermore, it tends to hide information in silos within the confines of each platform.

The aim of successful omnichannel learning is to abolish the boundaries between individual platforms. While running a single learning platform for all learning needs would be ideal from a systems management standpoint, it’s often not feasible in practice. Hence, when you’re looking at “yet another app” to solve your learning challenges, you should pay attention to its interoperability with your existing infrastructure. An important aspect of that is the Application Programming Interfaces (APIs) the systems can use to exchange information with each other.
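As a toy illustration of what such interoperability buys you, here’s a sketch where completions from one (mocked) system are mirrored into another so neither pushes duplicate content – in reality both sides would be REST API calls rather than in-memory dictionaries, and all names here are invented:

```python
# Completions recorded in a hypothetical micro-learning app vs. the LMS.
app_completions = {"alice": ["gdpr-basics"], "bob": []}
lms_records = {"alice": [], "bob": ["gdpr-basics"]}

def sync_completions(source, target):
    """Copy completions from source into target, skipping duplicates,
    so the target system stops re-assigning already-completed content."""
    for learner, courses in source.items():
        for course in courses:
            if course not in target.setdefault(learner, []):
                target[learner].append(course)
    return target

sync_completions(app_completions, lms_records)
# Alice's app-side completion is now visible to the LMS as well.
```

The real work in an integration project is mapping identities and course IDs between systems; the sync logic itself stays this simple.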

Omnichannel learning should aim for a unified user experience

Another omnichannel feature that may be equally challenging to create is a unified user experience across platforms. If we use a retail analogy, the aim is not only for the mobile app to match the design of the responsive website/web application, but for the physical environment (the retail stores) to match it as well. A seamless transition between online and offline is key to delivering a great user experience and sustaining engagement. Interestingly, the online-to-offline transition is a particular challenge in learning as well (more on that later).

This area of omnichannel learning is the one where running multiple platforms usually kills the game. However, with a bit of effort on visual and functional design, we can do quite a lot. Naturally, visual design, colour schemes etc. should match across platforms, as this is a low-effort, high-return situation. In terms of functionality, you’re better off if your applications follow a similar logic for accessing and consuming learning. Furthermore, you shouldn’t unreasonably restrict functionality on mobile platforms; otherwise you may lose a lot of engagement.

How do we collect uniform learning data from all the different channels – even offline?

To understand and further develop omnichannel learning experiences, we first of all need comprehensive learning data. As we want to eliminate unnecessary overlaps in delivery, we need to grasp how the different channels work together. While each app or learning tool may very well have its own analytics, those don’t necessarily help with the bigger picture. Furthermore, a major challenge is bringing offline (face-to-face) activities into the mix and collecting data from them. Thus, we need a unified framework for recording all the different learning activities, whether mobile, online or classroom-based.

Luckily, we already have the technological answer to the problem: the Experience API (xAPI). The xAPI specification enables us to track and collect uniform data from all learning activities, even offline ones, and pass it on to a single data store (a Learning Record Store) for analysis. It helps not only with learning analytics, but also enables a better understanding of content engagement and learner-centric design.
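For the curious, a minimal xAPI statement is just structured JSON describing an actor, a verb and an object. The sketch below builds one – the learner, course URI and helper function are examples, and actually sending the statement to a Learning Record Store’s /statements endpoint (with authentication headers) is omitted:

```python
import json

def make_statement(email, name, verb_id, verb_name, activity_id):
    """Build a minimal xAPI statement: who did what to which activity."""
    return {
        "actor": {"mbox": f"mailto:{email}", "name": name},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {"id": activity_id},
    }

stmt = make_statement(
    "alice@example.com", "Alice",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "https://example.com/courses/gdpr-basics",
)
body = json.dumps(stmt)  # the JSON payload a client would POST to the LRS
```

Because classroom activities can be recorded with exactly the same statement shape (a trainer or an app logging “attended”/“completed”), offline learning ends up in the same data store as everything else.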

What about content development for omnichannel?

Finally, content development is an important topic in an omnichannel approach to learning. Naturally, all digital content should be fully responsive, so it can be accessed via a browser on all devices and wrapped into mobile applications for native use. Interoperability and accessibility are imperative, as the concept of omnichannel expands the “mobile learning paradigm” of “anytime, anywhere” to “any content, anytime, anywhere”.

Integrating this mode of operation to offline activities is again the biggest challenge. The approach requires a degree of flexibility from the trainers, coaches and mentors. They need to adapt their classroom content to form a natural continuum to the prior (digital) learning experiences. But thanks to xAPI and learning analytics, they nowadays have the power to understand each learner on a very individual level.

Are you delivering seamless and unified learning experiences across different channels? If you want to move away from siloed learning approaches, we can help. Our advisory services cover both technology implementations and strategic learning consulting. Just contact us.

Learning Technology Trends for 2019 – What’s Ahead?

During the past few years, we’ve witnessed an unprecedented speed of development in the learning technology space, and 2019 looks to be no different. At Learning Crafters, we are lucky to have an inside view into much of this development thanks to our work with some of the leading technology vendors. Therefore, we thought it would be worthwhile to share some of our thoughts, views and first-hand experiences on what’s ahead for the industry next year. Hence, here are four key learning technology trends for 2019.

Learning Technology Trend #1: Big Data will deliver exponential impact in 2019

For the past few years, organisations have been adopting tools and technologies to capture, analyse and execute on business data. While the human resources function in general seems to be lagging slightly behind in that adoption, 2019 looks to be a big year for big data. For learning and development, the holy grail of learning data – the Experience API (xAPI) – has already been available for several years. While adoption of the xAPI standard has been slower than expected, any organisation claiming to do “learning analytics” today cannot remain credible without getting involved with xAPI. The old, commonplace ways of capturing learning data (e.g. SCORM) are simply not powerful enough. As we move into data-driven decision-making in the L&D space, big data capabilities are an absolute requirement – and they will be delivered with xAPI.

Learning Technology Trend #2: Artificial Intelligence (AI) will undergo rapid developments

Naturally, in the era of machines, the xAPI learning data will not only be used for analytics. Rather, this type of behavioural data (comparable e.g. to Google Analytics) will be used to develop more advanced AI. Now, what is AI good for in the learning space? 

Currently, AI in learning is being used to build adaptive as well as personalised learning. Furthermore, the more advanced AI applications currently available are able to curate learning content based on the individual roles, needs and preferences of the learner. In 2019, we’ll definitely see major developments on both fronts. Additionally, we predict another AI application: learning analysis, in other words the use of artificial intelligence to form insights on the link between learning and performance.

Learning Technology Trend #3: Virtual Reality (VR) will become more “commercial” 

If you’re a learning professional and didn’t hear about VR in 2018, it’s time to get out more! While a lot of the hype surrounding VR is arguably just that – hype – 2019 looks interesting. In addition to developing an industry understanding of what VR is good for, we are likely to see some major enablers.

The first major problem with VR is currently the price tag. Arguably, building VR the way companies currently build it does not enable long-term adoption. Since VR content is mostly developed with game engines, there are few possibilities for the non-tech-savvy to build it. If you look at e.g. how video has grown to its current dominance, it’s because every single individual can produce it.

The second major problem with VR this year has been the lack of data capabilities. Without the ability to record big data from VR experiences, organisations cannot possibly prove the investment worthwhile. While VR experiences are definitely great showpieces, many organisations have vastly over-invested in them. However, there’s light at the end of the tunnel already in 2019. In fact, we are already seeing some of the first VR content editors emerge. These tools require no technical knowledge, game engines or programming, and they come with big data capabilities. Hence, they overcome both of the current major problems and are set for wider adoption.

Learning Technology Trend #4: Augmented Reality (AR) will redefine workflow learning 

While VR has been on everyone’s news feed, augmented reality has gone largely unnoticed in 2018. However, several companies both in- and outside of the learning field are developing their AR tools. With the current pipeline of technological development, AR is likely to have a major impact on bringing learning into the workflow. A lot of the initial impact will focus on the technical fields, such as engineering. 

For the first time in history, people will actually be able to learn without interrupting their work. This will happen with specialised AR headsets, which can open learning content in an additional layer of reality. The best tools will have voice control and come with remote capabilities. This enables e.g. trainers and experts to follow the learners and guide them through activities. Through a live connection, the trainers may influence the “reality” visible to the learner. Furthermore, the advanced headsets will likely incorporate cameras and tracking capabilities to capture great amounts of data. This data will be incredibly useful both for learning and for the business as a whole, as it enables a totally new level of recording work, understanding workflows and the learning happening during them.

Now, the four technologies here represent only a part of the future of learning, but arguably they’re the most hyped. Later, we’ll look at some other technologies as well as emerging methodological trends in L&D. 

Is your organisation ready to take advantage of the upcoming technological developments in the learning space? If not, we’re happy to work with you in building that capability. Just contact us. 

Leveraging Learning Content Analytics for Better Learning Experiences

We published this article first on eLearning Industry, the largest online community of eLearning professionals. You may find the original article here.

One area where Learning and Development professionals could learn a lot from e.g. marketing experts is content analytics. Whereas marketing has embraced the need to constantly iterate and redevelop content based on real-time campaign analytics, learning professionals tend to take the easier route. Once an eLearning activity is produced and published, it’s easy to just leave it there and be done with it. But at that point the work is really only at its midway point. How do you find out whether the content resonated with the audience? If it didn’t, how do you figure out the problem areas in the content? This is where learning content analytics come in handy.

Example Of Learning Content Analytics On A Training Video

When analysing the effectiveness of eLearning content, you should pay attention to what kind of metrics you are tracking. For instance, in the case of a training video, traditional metrics like how many times the video was opened don’t necessarily carry a lot of value. Instead, we should be looking at the content consumption behaviour on a wider scale, throughout the content and the learning journey. Let’s take a look at an analytical view of a training video.

With learning content analytics, you can easily capture where your learners lose interest and drop off.

In this example, you can see the users’ behaviour at various stages of the training video. As usual, you see a slump immediately in the beginning, followed by another bigger slump later on. We’ve coloured the 2 main points of interest to break them down.

1. Initial Attrition

You are always bound to lose some learners in the beginning due to a plethora of reasons. However, if you constantly see big drops starting from 0 seconds, you might want to double-check, e.g. the loading times of the content, to make sure your learners are not quitting because of inability to access the material in a timely manner.

2. Learning Content Engagement Failure

Going further in the video, we see another big slump where we lose around 40% of the remaining learners in just 30 seconds. Clearly, this represents a learning engagement failure. Something is not right there. Learners are likely dropping off because the content is not engaging, relevant or presented in an appealing way.
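Finding such slumps programmatically is straightforward. This sketch, using made-up viewer counts sampled every 30 seconds, flags the interval with the steepest relative drop:

```python
# Hypothetical viewer counts sampled every 30 seconds of a training video
# (index 0 = start). Mirrors the drop-off chart described above.
viewers = [100, 82, 80, 78, 45, 44, 43]

def biggest_drop(viewers):
    """Return (interval_index, % of remaining viewers lost) for the
    steepest single-interval drop - the 'engagement failure' point."""
    losses = [
        (i, round(100 * (viewers[i] - viewers[i + 1]) / viewers[i]))
        for i in range(len(viewers) - 1)
    ]
    return max(losses, key=lambda pair: pair[1])

print(biggest_drop(viewers))  # interval 3: ~42% of remaining viewers lost
```

Note that the drop is measured against the *remaining* audience, not the original one – that is what separates a genuine engagement failure from ordinary initial attrition.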

How Should I Incorporate Content Analytics In The eLearning Development Process?

The above-mentioned video analytics is just a single example of how you can use content analytics to support your learning. Ideally, you should be running these kinds of analytics across all your learning content. xAPI tracking capabilities give you a lot of possibilities in this regard. Once you’re collecting the data and running the analytics, this is how you could build analytics into your eLearning development process:

  1. Develop an initial version of eLearning materials
  2. Roll it out to a test group of learners, monitor the analytics
  3. Identify potential learning engagement failures and re-iterate content accordingly
  4. Mass roll-out to a wider audience
  5. Revisit the content analytics at regular milestones (e.g. when a new group of learners is assigned the content) to ensure continued relevance and engagement

This type of approach helps ensure that the learning activities you provide and invest money in perform at their best at all times.

How Can I Use Learning Content Analytics To Provide Better Learning Experiences?

By now, you’ve surely come up with many use cases for content analytics yourself. To summarise, here’s how you could provide a better learning experience through data-driven insights:

1. Identify The Types Of Content Your Learners Like

In the case of videos, you could benchmark the performance of different types of videos (e.g. talking heads, animations, storytelling videos) against each other and see what type of content keeps your learners engaged the best.

2. Develop Engaging Content

With the power of analytics, you’ll be able to develop better learning. You can find out immediately what works and what doesn’t. No need to run extensive surveys: the behaviour of the learners is the best feedback.

3. Personalise Learning Experiences

You can naturally run analytics for individuals and defined groups, in addition to the whole mass of learners. This helps you personalise the learning experiences according to e.g. skill levels, seniority, experience, previous learning history, etc.

All in all, learning content analytics provide a powerful tool for increased transparency and visibility into the performance of your eLearning. As learning becomes more on-demand and just-in-time, they help ensure that you're delivering the right content to the right audience.

Are you interested in developing more analytical, data-driven approaches to your L&D? Or want to know more about different content analytics possibilities? Just drop us a note, and we’ll get back to you. 


Digital Training Evaluation – Using the Kirkpatrick Model and Learning Analytics

Ask an L&D professional how they measure training effectiveness and learning, and the likely answer is that they use the Kirkpatrick four-level evaluation model. The model has been a staple in the L&D professional's toolbox for a long time. However, if you dig deeper, you'll find that many organisations are only able to assess levels 1 and 2 of the model. While these levels do yield valuable information, they help very little in determining the true ROI of learning. Luckily, thanks to technological development, we can nowadays do digital training evaluation on all four levels. Here are some best practices on how to do it.

Level 1: Reaction – Use quick feedback and rating tools to monitor engagement

The first level of Kirkpatrick is very easy to implement across all learning activities. You should use digital tools to collect quick feedback on every activity, whether in the form of likes, star ratings, scores or Likert scales. Three questions should be enough to cover the ground:

  1. How did you like the training?
  2. How would you rate the value-add of the training?
  3. Was the training relevant to your job?

Generally, scale- or rating-based feedback works best for level 1. Free-text feedback requires too much effort to analyse effectively.

Level 2: Learning – Use digital training evaluation to get multiple data points

For level 2, it all starts with the learning objectives. These should be very specific and tied to concrete business outcomes (we'll explain why under level 4). Once you have defined them, it's relatively easy to build assessments around them. Naturally, we are measuring the increase in knowledge rather than just the knowledge itself. Therefore, it is vital to record at least two data points throughout the learning journey. A handy way to do this is to design pre-learning and post-learning assessments. The former captures the knowledge and skill level of the employee before the training starts. Comparing that with the latter, we can confidently identify the increase in knowledge. You can easily run this kind of assessment with interactive quizzes and short tests.
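The pre/post comparison boils down to simple arithmetic. A minimal sketch, with hypothetical assessment scores on a 0-100 scale:

```python
# Hypothetical assessment scores (0-100) before and after the training.
pre = {"ann": 55, "ben": 70, "cat": 40}
post = {"ann": 80, "ben": 75, "cat": 85}

def knowledge_gain(pre, post):
    """Per-learner score increase between the two data points."""
    return {name: post[name] - pre[name] for name in pre}

gains = knowledge_gain(pre, post)
print(gains)                              # per-learner increase
print(sum(gains.values()) / len(gains))   # average gain across the group
```

Measuring twice is what makes the number meaningful: a high post-training score alone cannot distinguish learning from pre-existing knowledge.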

“If you’re measuring only once, it’s almost as good as not measuring at all”

Level 3: Behaviour – Confirm behavioural change through data and analytics

Finally, level 3, measuring behaviour, delves into somewhat uncharted territory. There are a couple of different angles for digital training evaluation here.

First, you could engage the learners in self-assessment. For this, two questions should be enough. If no behavioural change is reported, the second question captures the reason behind it, and L&D can intervene accordingly.

  1. Have you applied the skills learnt? (linking to specific learning, can be a yes/no question)
  2. If not, why not?

Secondly, since self-assessment is often highly biased, it's not necessarily meaningful to collect more data directly from the learners themselves. To get factual insight into level 3, you should be using data and analytics. On the business level, we record a lot of data on a daily basis; just think about all the information that is collected or fed into the systems we use every day. You should combine the data from these systems with the self-assessment to confirm the reported behavioural change. For instance, a salesperson could show an increase in calls made after training, or a marketer an increase in the number of social media posts they put out. The organisation already has all the necessary data – it's just a matter of tapping into it.
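The system-data confirmation described above could look something like this minimal sketch. All names, call counts and the 10% lift threshold are invented; in practice the figures would come from your CRM:

```python
# Hypothetical weekly call counts pulled from a CRM, per salesperson,
# for the four weeks before and after a sales training.
calls_before = {"ann": [20, 22, 19, 21], "ben": [15, 14, 16, 15]}
calls_after = {"ann": [27, 30, 29, 28], "ben": [15, 16, 14, 15]}

def behaviour_change(before, after, min_lift=0.10):
    """Flag people whose average weekly calls rose by at least min_lift (10%)."""
    changed = {}
    for person in before:
        avg_b = sum(before[person]) / len(before[person])
        avg_a = sum(after[person]) / len(after[person])
        changed[person] = (avg_a - avg_b) / avg_b >= min_lift
    return changed

print(behaviour_change(calls_before, calls_after))
```

Cross-checking this against the self-assessment answers shows where reported change is backed by actual behaviour, and where L&D should follow up.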

Level 4: Results – Combining Learning Analytics and Business Analytics

Finally, level 4 evaluation is the pot of gold for L&D professionals. This is where you link learning to business performance and demonstrate ROI through business impact. With modern digital training evaluation you can eliminate the guesswork and deliver facts.

It is important to understand that the evaluation levels are not standalone: level 4 is linked to levels 2 and 3. If there was no increase in knowledge, or behavioural change did not happen, there is no business impact. You might see a positive change in results, but you should not mistake it for a product of learning if the previous levels have not checked out. Once levels 2 and 3 have come out positive, however, you can look at the bigger picture.

Firstly, look back at the learning objectives, especially the business outcomes they were tied to. If your aim with the sales training was to increase the number of calls made, it's important to look at what happened in that specific metric. If you see a change, you can then look at the business outcomes: how much additional revenue would those extra sales calls have produced? The results can also be changes in production, costs, customer satisfaction, employee engagement and so on. In any business, you should be able to assign a dollar value to most, if not all, of these metrics. Once you have the dollar value, it's simple maths to figure out the ROI.
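The "simple maths" is the classic ROI formula: net benefit divided by cost. All figures in this sketch are hypothetical:

```python
def training_roi(monetary_benefit, training_cost):
    """Classic ROI: net benefit divided by cost, as a percentage."""
    return (monetary_benefit - training_cost) / training_cost * 100

# Hypothetical numbers: 120 extra sales calls attributed to the training,
# each worth an estimated $50 in revenue, against a $2,000 training spend.
extra_calls = 120
value_per_call = 50
benefit = extra_calls * value_per_call   # $6,000
print(training_roi(benefit, 2000))       # 200.0 -> a 200% return
```

The hard part is not the formula but the attribution, which is exactly why levels 2 and 3 must check out before you trust the number.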

All in all, there's really no excuse for not dealing with levels 3 and 4 of Kirkpatrick. You can manage digital training evaluation and learning analytics even on a limited budget. It's just a matter of embracing data and the benefits of data-driven decision making.

Want to start evaluating your learning on all levels? Click here to start.



Why xAPI is the Most Important Thing in the Future of Learning?

The new interoperability standard and specification for eLearning, the Experience API (xAPI), is replacing SCORM. In today's mobile world, where learning happens all the time and data is everywhere, a future-proof framework for digital learning was necessary. Project Tin Can set out to fulfil the dreams of L&D and HR professionals, and with the Experience API it achieved just that. The specification enables us to capture vast amounts of previously unavailable data, run powerful analytics and link learning to business performance. In fact, xAPI is so powerful that it will be the cornerstone on which future learning is built, and here are just a few reasons why.

If you're not familiar with the Experience API, you can read more about it here.

xAPI enables us to track behaviours and interactions

Whereas SCORM enabled us to track test scores, completions and other basic measures, xAPI goes much deeper. Similar in concept to Google Analytics, xAPI tracks interactions. This means that we can record every single click, comment, learning interaction and activity, which gives an immensely rich picture of how learning happens in the organisation. For the first time, learning professionals really know whether learning content works, i.e. whether learners actually use it. This makes content curation and decision making much easier. Learning professionals can also pinpoint the individuals or groups who require learning interventions. Furthermore, xAPI enables us to truly measure the ROI of learning against all possible KPIs.

xAPI can track all learning activities, including informal and offline

Nowadays, learning increasingly happens outside of workplaces and schools – outside monitored environments. For a long time, learning professionals have struggled to get a complete picture of an individual's life-long learning journey. With xAPI, that is now possible. The technology can track all imaginable learning activities, whether they happen outside the employer's systems or even completely offline. For instance, it can track the websites and articles that employees read and engage with. Capturing learning data is no longer confined within the borders of the learning management system (LMS): every single interaction, anywhere, can be communicated via xAPI to a learning record store (LRS), which acts as the database.
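The statements xAPI sends to a learning record store follow a simple actor-verb-object pattern defined by the specification. Here's a minimal sketch of one such statement in Python; the learner, activity URL and article name are illustrative, and a real LRS client library would handle authentication and delivery for you:

```python
import json

# A minimal xAPI statement: "actor - verb - object", the core pattern
# the specification defines. The names and URLs below are illustrative.
statement = {
    "actor": {
        "mbox": "mailto:ann@example.com",
        "name": "Ann Example",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://example.com/articles/sales-negotiation-tips",
        "definition": {"name": {"en-US": "Sales negotiation tips"}},
    },
}

# An LRS client would POST this JSON to the store's statements endpoint.
print(json.dumps(statement, indent=2))
```

Because the object can be any URI, the same pattern captures an article read on an external website just as easily as a module completed inside the LMS.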

xAPI enables us to finally link learning and business performance

Finally, the greatest struggle of learning professionals has been identifying the business value of different learning activities. Establishing links between performance and learning has been a guessing game since the beginning. xAPI is here to change that too. We can use it to pull data from all systems (think ERPs, CRMs, HRMs, PMSs). Hence, with the right use of analytics, we can monitor business performance across all imaginable metrics and track it against the learning happening in the organisation. We can pinpoint whether employees apply the learning, i.e. whether there is a behavioural change. Furthermore, we can confidently assign dollar values to these behavioural impacts, and hence to the learning activities as well. Measuring the ROI of learning goes from a guessing game to data-driven science, so you can be comfortable knowing that you're getting the most out of your limited L&D resources.

Do you already use xAPI for advanced learning insights? Do you want to finally link your learning to business performance? If yes, contact us, and let’s transform your learning together. 


Seamless Learning Tracking vs. Formal Assessments

Previously, we have had very limited tools for tracking corporate learning activities – namely, formal assessments. These have come in a variety of forms: tests, learning surveys, self-assessments, performance evaluations and so on. Their common denominator is that they are relatively hard to administer and produce vague results. Luckily, thanks to advances in learning technology, we can now do all this much more efficiently. Let's look at how we can use seamless learning tracking to make our learning assessments far more effective.

First, let's consider two traditional assessment tools used in corporate L&D: tests and learning evaluations.

Seamless Learning Tracking vs. Conventional Tests

The first problem with formal tests is that they often become the end instead of a means to an end. Tests should be a tool for learning, not the reason we learn. Tests are also quite dreadful for learners, many of whom experience stress and anxiety when taking examinations and therefore do not perform at their best in terms of problem solving or creativity. Frankly, formal test scores are also subject to misinformation caused by cheating and the like.

By utilising modern tracking methods instead of archaic testing, we can extract far more information without subjecting our learners to these adverse effects. With modern digital learning environments, we can track everything the learner does and follow every individual learner's journey: how many times did they watch a training video? How much time did they spend on topic X? How many times did they attempt a quiz before passing? With access to learning data like this, old formal test scores become practically meaningless. Rather than assessing our learners bi-annually or at defined intervals, we can track them continuously and tackle problems in real time, rather than six months later.

Seamless Learning Tracking vs. Learning Evaluations

Other things we could do more reliably with modern technology are feedback and learning evaluations.

For feedback, the fundamental problem is the one all market researchers know: people lie. In organisations, especially hierarchical ones, extracting genuine feedback is often mission impossible. Instead of being truthful, learners give mid-range positive feedback to avoid confrontation – perhaps for fear that even anonymous feedback may be traced back to them. And if we are not getting honest feedback, we shoot ourselves in the foot trying to improve the learning. By enabling comprehensive learning tracking, however, we can let behaviour be our best feedback. We can accurately pinpoint the learning materials that are not engaging or effective and work to improve them.

For learning evaluations, we can pull information from a whole history of learning, rather than just the occasional formal test. This helps us provide much more personalised feedback to learners. Instead of focusing on what ("you scored 88/100 in test X"), we can focus on how ("you took twice as long to complete the training as your peers"), and most importantly, why ("could another style of learning work better for you?"). This gives us a far more comprehensive view of our people, their skill sets and capabilities than we could ever achieve with traditional formal assessments.

Are you using advanced learning tracking methods in your organisation? Would you like to take a more data-driven approach to your HR? We can help, just leave us a note here.  


Learning Data – How to Derive Meaningful Insights?

As organisations move towards more data-driven decision making, we often find ourselves requiring more sophisticated data collection and analytics tools. In HR generally, we have become quite comfortable deriving important insights through data analytics. For learning and development, however, these types of operations on learning data are still in a bit of a grey area.

As learning moves beyond "just another compliance exercise", we find that tracking different streams of data can provide real value: insights into our employees' skill levels, the effectiveness of our learning content and, ultimately, the learning ROI. Here are a few illustrations of how modern tracking technologies and data analytics can help you deliver better, more efficient learning.

Assessing employees' skill levels and learning through data

Assessing our employees' learning should be one of the most important parts of the whole learning architecture. However, many organisations fare poorly in this regard (to be fair, the technology has been around for only a few years). Previously, our tracking of completed learning activities has relied on very narrow streams of data: the user clicking a "mark as complete" button and/or user input to formal assessments and tests.

First of all, the fact that a user has marked a learning module complete has very little value beyond compliance. These types of tick-box exercises tell us nothing about how the learning content was consumed (if at all!). Thus, they provide little to no insight into whether learning actually happens.

Formal assessments and tests are not much better. Sure, they again help us meet compliance requirements, but they are quite dreadful for the learner and can effectively assess only a limited part of learning. They do give us a glimpse into how well the learner knows the theory (or how well Google does…). However, they tell us nothing about whether the learning carries through to the job and results in a behavioural change.

What type of data should we collect and utilise instead?

Instead of this kind of tick-the-box data, we should be collecting and leveraging richer behavioural data. How was the content interacted with? In which order were the learning activities completed? How long did it take to complete the learning, and how was that time divided between the different sub-activities?

By collecting data to answer these types of questions, we produce actionable insights. For example, a learner with a lower time-to-complete likely had higher skills and more confidence in their abilities than someone who took longer. Similarly, learners who tackle the difficult content first are more likely to possess advanced skills.
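A time-to-complete comparison like the one above can be sketched in a few lines. The learners and durations here are hypothetical; the point is simply to surface who might need extra support:

```python
# Hypothetical time-to-complete (minutes) for one learning module.
times = {"ann": 18, "ben": 45, "cat": 22, "dan": 60}

def slower_than_median(times):
    """Learners taking longer than the group median may need extra support."""
    ordered = sorted(times.values())
    mid = len(ordered) // 2
    if len(ordered) % 2 == 0:
        median = (ordered[mid - 1] + ordered[mid]) / 2
    else:
        median = ordered[mid]
    return sorted(name for name, t in times.items() if t > median)

print(slower_than_median(times))
```

A long completion time is a signal, not a verdict: it should trigger a conversation about learning style or content difficulty, not a penalty.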

Using data analytics to measure the effectiveness of learning content

Nowadays, learning content plays a major part in the whole learning experience. As we invest time and money in providing better content, we certainly want to keep track of the results we get in return.

Previously, we would track whether a piece of content was viewed, consumed or marked complete. But again, these kinds of metrics don't provide much value. If we commit resources to producing e.g. training videos, we surely prefer a much more detailed view of what's happening. In the case of videos, we would want to track how long they are actually watched: if learners only watch our fancy and expensive video halfway through, that looks like money wasted. We would also want to track how learners move along the video timeline. For instance, if many learners jump back and forth multiple times, the video may have failed to communicate its key messages clearly enough.
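The drop-off tracking described above can be approximated with very little code. A sketch, assuming we log the furthest position reached in each viewing session (all values hypothetical):

```python
# Hypothetical furthest watch position (as a fraction of video length)
# recorded for each viewing session of one training video.
watch_positions = [0.35, 0.40, 0.38, 0.95, 0.42, 1.0, 0.39]

def dropoff_rate(positions, cutoff=0.5):
    """Share of sessions abandoned before the cutoff point of the video."""
    abandoned = sum(1 for p in positions if p < cutoff)
    return abandoned / len(positions)

rate = dropoff_rate(watch_positions)
print(f"{rate:.0%} of sessions stopped before the halfway mark")
```

A cluster of drop-offs just before the cutoff would tell you exactly which part of the video to rework.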

Simple analytics like these can be a tremendous help in justifying investments in learning. It's much easier to get buy-in from the top of the organisation when you can show quality insights into the performance of the content rather than guesswork and gut feelings. This also helps us analyse the return on investment of learning. However, a good measure of ROI should also incorporate metrics from the operational side of the business.

Using learning data to determine the Learning ROI

As mentioned, relying only on learning data to determine the learning ROI gets us only so far. We are rarely learning just for the sake of learning; rather, we provide our employees with learning experiences in the hope that they translate into better business results. Therefore, it is equally important to measure the benefits to the business itself, rather than just the fun/participation/likes index.

By pooling the learning data we collect with other data sets from the operational side, we can start to assess how well the learning translates into the learners' daily jobs and how their skills develop. Let's take sales training as an example, as it is an area where business benefits are easily demonstrable.

Example: combining CRM and learning data for better insights

To assess the effect of learning on sales performance, we could take the results from the sales training and plot them against several sales metrics, such as the number of calls made or conversion rates. This kind of data should be easy to extract from the company's CRM software. When you see an increase in any of the metrics that correlates with the training you have given and the learning data, you may have managed to produce positive results. It's also easy to dig deeper and look at cross-performance at both group and individual levels.
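A minimal sketch of plotting training results against a CRM metric: the Pearson correlation between post-training assessment scores and each person's change in weekly calls. All numbers are invented for illustration:

```python
from math import sqrt

# Hypothetical per-person data: post-training assessment score (learning
# data) and the change in weekly calls made (pulled from the CRM).
scores = [85, 60, 75, 90, 55]
call_increase = [8, 2, 3, 9, 0]

def pearson(x, y):
    """Pearson correlation coefficient between two equally long series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(round(pearson(scores, call_increase), 2))  # close to 1 = strong link
```

Correlation is of course not causation, which is why this check belongs alongside, not instead of, the level 2 and 3 evidence discussed earlier.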

Going a bit deeper, we can also identify the individuals who have benefited the most from the learning. Could you gain even more performance by giving these individuals additional, targeted learning? Similarly, you can identify the individuals who are lagging in their KPIs and have merely ticked the boxes with their learning. It's time to explain to these individuals that marking e-learnings complete after a brief glance simply isn't enough.

Additionally, you'll be able to pinpoint the types of content that seem to drive the increase in performance. As with your learners, you can track every piece of content individually. If a certain type of content, e.g. animations or simulations, seems to be the most effective, start using more of it!

Finally, there are so many streams of data that the applications are practically endless. First, however, pay attention to whether you are collecting meaningful data. Then think about how to deliver insights that benefit the overall business.

Are you looking to implement a more data-driven approach to learning and development? Would you like to drive business performance through learning and be able to show proof of it? We are happy to help you with your learning data – just drop us a note here.
