How to Leverage Data in Training Needs Analysis?

The training needs analysis is a real staple of the corporate L&D field. Everyone does it, yet the real value-add is often ambiguous. The traditional ways of doing it are not employee-centric, which results in learning activities that inconvenience employees rather than enable them. While extensive use of data to support the analysis is clearly the best direction to take, organisations often don't know how to get there. So we wanted to explain a few different ways you could leverage data in your training needs analysis.

Using search data to understand what your learners really need

One of the biggest problems in training needs analysis is that the people doing it rarely talk to the end users of the training. Naturally, they don't have the time either. While it would be nice to sit down for a 1-on-1 with each learner, that's often neither practical nor feasible. But what if we could have the learners talk to us anyway? That's where data collection comes in handy.

Monitoring what your employees search for during their work can be a really good indicator of the types of things they need to learn. As most workplace learning happens that way – employees searching for quick performance-support resources – you should really aim to understand that behaviour. So why not start playing Google? You should already have the capability to track search history on company devices or within your learning systems. These searches are highly contextual, as they happen in the direct context of learning or work. It's just a matter of compiling this data and using it to support your training decisions.
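As a sketch of how simple that compiling can be, the snippet below tallies a made-up search log and surfaces the most frequent queries as candidate training topics. The queries are purely illustrative.

```python
from collections import Counter

def top_search_topics(queries, n=3):
    """Rank normalised search queries by frequency to surface
    candidate training topics. Queries here are illustrative,
    not from any real system."""
    normalised = [q.strip().lower() for q in queries if q.strip()]
    return Counter(normalised).most_common(n)

# Hypothetical search log from company devices or an LMS search bar
log = [
    "how to export pivot table",
    "pivot table excel",
    "How to export pivot table",
    "vlookup tutorial",
    "gdpr data retention rules",
    "vlookup tutorial",
    "pivot table excel",
]

print(top_search_topics(log))
```

A real pipeline would add stemming and stop-word handling, but even a raw frequency count like this already points at recurring knowledge gaps.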

Using real-time learning data to identify organisational skill gaps

Another stream of data that you should be looking into when doing training needs analysis comes directly from the learning activities themselves. First of all, you should make sure that the learning data you collect is relevant and actually gives an accurate representation of learning. If you’re not yet using xAPI, start now. You’ll unlock a whole new level of analytical power.
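To make the xAPI point concrete: every learning event becomes an actor-verb-object statement. The sketch below builds a minimal statement in Python; the learner's email and the course URL are illustrative, while the verb URI is one of the standard ADL verbs.

```python
import json

def make_statement(actor_email, verb_id, activity_id):
    """Build a minimal xAPI statement (actor-verb-object), the core
    structure the Experience API uses to record a learning event."""
    return {
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_id.rsplit("/", 1)[-1]}},
        "object": {"id": activity_id, "objectType": "Activity"},
    }

stmt = make_statement(
    "jane.doe@example.com",                            # illustrative learner
    "http://adlnet.gov/expapi/verbs/completed",        # standard ADL verb
    "https://lms.example.com/courses/onboarding-101",  # hypothetical course
)
print(json.dumps(stmt, indent=2))
```

A real statement would be POSTed to a learning record store; the point here is simply that any activity, on any platform, can be described in this one uniform shape.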

Once you've got that covered, you should track that data across the board. This gives you access to insights at the individual, group and subject-matter levels. For subject matter (i.e. training topics), you're better off tagging all your learning content appropriately. An up-to-date map of which learning experiences relate to which topics or competencies enables quick glances into your organisation's learning. For instance, a skills heat map might aggregate this tagging data and learning data to give you a visual representation of the areas where your learners lack competence. Then you can start drilling down to the group and individual levels to determine why some are succeeding and some are not. This helps you craft better and much more personalised training activities and learning solutions.
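A skills heat map of this kind is, at its core, just an aggregation of tagged learning records. Here's a minimal sketch; the tags, learners and scores are invented for illustration.

```python
def skills_heat_map(records):
    """Aggregate tagged learning records into an average score per
    competency tag - the raw numbers behind a skills heat map."""
    totals = {}
    for rec in records:
        for tag in rec["tags"]:
            totals.setdefault(tag, []).append(rec["score"])
    return {tag: round(sum(s) / len(s), 2) for tag, s in totals.items()}

# Made-up learning records, each tagged with one or more competencies
records = [
    {"learner": "a", "tags": ["negotiation"], "score": 0.9},
    {"learner": "b", "tags": ["negotiation", "compliance"], "score": 0.4},
    {"learner": "c", "tags": ["compliance"], "score": 0.5},
]

heat = skills_heat_map(records)
print(heat)  # low averages flag competency gaps worth drilling into
```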

Using performance data to understand the business needs

Naturally, organisational learning should always support the business rather than inconvenience it. Therefore, it's important to measure and understand performance. If you don't keep track of performance, it's impossible to measure real learning impact and, consequently, to do effective training needs analysis. Performance data is everywhere, often scattered across the business in various systems and silos. Different departments might have their own data, and some of it may be centralised. But whether it's sales, marketing, customer-facing staff, operations, finance or HR, the data is usually already there. And it's incredibly important to tap into it, regardless of where it lives.

However, one extremely important thing to note is that you should not use performance data in isolation. Rather, always compare it with your learning data. Looking at performance data alone, you might see that the performance of department X is lacking. The easy answer would be to "assign" more training. However, the learning data could reveal that training has not solved the problem before, and that you should be looking at completely different solutions. Furthermore, always be careful about jumping to conclusions when linking learning to performance impact. The L&D department might claim a performance improvement as a quick win, but a deeper cross-analysis with learning data could reveal that the improvement wasn't actually caused by the training.
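The cross-check described above can be as simple as joining two lookups before deciding on an intervention. A hedged sketch, with made-up departments, scores and training hours:

```python
def training_already_tried(perf, learning, dept, threshold=0.7):
    """Before 'assigning more training' to an underperforming
    department, check whether it has already consumed recent
    training on the topic. Data shapes are illustrative."""
    low = perf[dept] < threshold
    prior_training_hours = learning.get(dept, 0)
    if low and prior_training_hours > 0:
        return "low performance despite training - look beyond training"
    if low:
        return "low performance, no recent training - training may help"
    return "performance on target"

perf = {"sales": 0.55, "support": 0.82}       # normalised performance scores
learning = {"sales": 14, "support": 3}        # relevant training hours, last quarter

print(training_already_tried(perf, learning, "sales"))
```

The real analysis is of course richer than a threshold check, but the principle stands: the recommendation changes entirely once learning history enters the picture.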

Final words

Overall, there are tremendous amounts and types of both learning and non-learning data we can leverage in training needs analysis. The above are just a few examples. With better analysis, we can provide better learning experiences and positively impact business performance. Not to leverage the vast amounts of data available would simply be foolish.

If you need help in crafting more data-driven learning strategies or adopting technology to do so, let's talk. Just drop us a note here.

More Learning Ideas

How to Use Learning Analytics? 3 Value-add Cases

As corporations become more data-driven in their decision making, learning & development has to follow suit. To make better decisions, you naturally need to collect a lot more learning data. But that alone isn't enough: you also need the capability to analyse the data and understand what it means. While there's a lot of ambiguity around corporate training analytics – and some organisations intentionally make it sound extremely difficult – it's not as hard as it seems. To clear up some of that ambiguity, here are 3 use cases for learning analytics that apply to organisations of all sizes.

1. How to use learning analytics to increase engagement?

One of the bottleneck issues in corporate learning today is engagement. It’s not always an easy task to put out learning experiences that resonate with the learners and keep them engaged. Naturally, your content has to be of good quality, and you should likely use a fair bit of interactivity. But once all that is said and done, you should unleash the analytics.

Through learning content analytics, we can get a much better understanding of our users. We can see which pieces of content are used the most or the least. We can also get an understanding of 'when' and 'where' learners tend to drop off, which then enables us to start figuring out 'why'. Furthermore, we can drill down to each interaction between the learner and the content, instructors or other learners to really understand what is working and what is not. All of this (and a fair bit more!) enables us to constantly develop our learning experiences based on real information instead of gut feeling and opinions. And when we make our content more relevant and to-the-point, a lot of the engagement tends to come naturally.

2. How to use learning analytics to personalise learning experiences?

Our professional learners – the employees – come with various skills, degrees of experience, education and backgrounds. As they certainly don't come in one size, we shouldn't be putting them through one-size-fits-all learning experiences either. As organisations have understood this, the hype around personalised learning has grown significantly over the past few years. But it's not just hype: there's real value to personalisation that learning analytics can help us unlock.

First of all, learning analytics help us understand the different individuals and groups of learners in our organisation. By drilling down all the way to an individual's interactions, we can understand our learners' needs and challenges much better. This enables us to cater to their various strengths, diverse learning histories and varying interests. Instead of providing a simple one-size-fits-all learning experience, we can use this information to design personalised learning paths for different groups, or even for individuals. These learning paths can branch out and reconnect based on content difficulty, experience, current job and various other factors. The learning experience thus becomes a spider's web instead of a straight line, and you'll catch many more of your learners.
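A branching learning path ultimately boils down to routing rules over learner attributes. The sketch below is illustrative only – the rules and module names are invented, and a real system would derive them from analytics rather than hard-code them.

```python
def next_step(learner):
    """Pick the next node in a branching learning path from a few
    learner attributes. Rules and module names are hypothetical."""
    if learner["experience_years"] < 1:
        return "foundations"
    if learner["last_assessment"] < 0.6:
        return "refresher"
    if learner["role"] == "manager":
        return "advanced-coaching"
    return "advanced-core"

# Two made-up learner profiles taking different branches
cohort = [
    {"name": "new hire", "experience_years": 0, "last_assessment": 0.0, "role": "analyst"},
    {"name": "veteran", "experience_years": 7, "last_assessment": 0.9, "role": "manager"},
]

for learner in cohort:
    print(learner["name"], "->", next_step(learner))
```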

3. How to use learning analytics to prove the impact of learning?

Proving the impact or ROI of learning is something L&D professionals often struggle with. One reason for the struggle is not using learning analytics. For learning results in terms of knowledge acquisition, a data-driven approach beats traditional multiple-choice testing and feedback forms by a long shot. Furthermore, it enables a much more formative way of assessment, thanks to all the data points collected and available.

But knowledge acquisition alone isn't enough to demonstrate corporate learning impact. After all, what's the learning good for if no one applies it? Thus, it's imperative that we combine learning analytics with performance metrics and indicators. By doing this, we get a lot closer to real learning results. How did the sales training affect the sales staff's routines, behaviours and performance? How much risky behaviour did the compliance training help to eliminate? Is our training on team management actually resulting in better-managed teams? With this level of analytics, you can answer a lot more questions. Furthermore, you can start asking questions you were not even aware of.

In our work, learning analytics and data-driven approaches play a big part. While technology matters, there's obviously more to it. For instance, you want to be sure that you're setting your corporate learning objectives to enable this. If you're looking to move to more data-driven learning strategies or understand your training impact better, we can probably help you. Just reach out to us here.

How to Use Formative Assessment in Corporate Learning?

Wherever there’s learning, there should always be assessment. Assessment naturally comes in many types and formats, but generally a good distinction to draw is that between summative assessment and formative assessment.

In simplified terms, summative assessment tries to gauge the learning outcomes at the end of a learning activity. Students may be compared against each other, and the assessment is a "high stakes" one. Formative assessment, on the other hand, attempts to measure learning throughout the learning activities. Formative evaluation can be considered less competitive – as the evaluation is criterion-based – and relatively "low stakes".

How does formative assessment benefit corporate learning?

In our experience, a lot of corporate learning assessment is summative. L&D practitioners may dread the extra effort or may not even be familiar with formative practices. Furthermore, the prevalent tendency to develop slide-based courses with an exam at the end feeds this behaviour. While building formative evaluation does require a bit more effort, the benefits tend to far outweigh the time investment.

Here are some of the benefits of formative assessment in corporate learning:

  • Trainers and L&D are able to recognise learning problems and skill gaps more effectively – on both individual and group levels
  • Learners are able to identify their own problem areas, self-correct and monitor their own progress
  • It provides valuable feedback to L&D to improve learning experiences and activities
  • It promotes active learning on the employees’ part
  • The focus shifts from achieving arbitrary outcomes (test scores, tick-box compliance etc.) to the learning process itself

In general, a well thought-out formative assessment approach helps all the stakeholders – trainers, learners and managers alike.

How to use formative assessment in practice?

Now that you've considered the benefits, let's look at how to make assessment more formative in practice.

The tools for formative assessment are plentiful, and the benefits are not limited to evaluation either. By replacing purely summative assessment with formative approaches, you'll also create much more engaging and learner-centric experiences. Furthermore, the approach is more data-driven by nature, helping you make more informed L&D decisions. So start investing the time!

If you need help on designing digitally enabled assessments to support your learning strategy, we are happy to help. Just contact us.

Omnichannel Learning – Steps Towards Unified Experiences

The concept of omnichannel comes from the retail sector, where retailers strive to provide a seamless, unified and personalised shopping experience across different channels, such as online, mobile and physical stores. Organisations that fail to utilise some of the individual channels, or to integrate them seamlessly, seem to struggle in business because of low customer engagement. While omnichannel is not much of a buzzword in the learning and development space, we should adopt the same ideology. After all, learning engagement, as well as tracking learning across different channels, is a challenge for many organisations. Here's how we could move towards an omnichannel learning approach to tackle these problems.

Omnichannel learning starts with cross-platform functionality

We live in the era of learning apps. For almost every need, there's an app. On top of that, you have your corporate LXP (or LMS), learning portals, intranets and co-working platforms. The problem is that these systems often don't communicate well with each other. Your learner may complete a learning activity in a dedicated application, but that doesn't in any way reflect in the content that e.g. your LMS pushes to them. Running multiple platforms easily results in an incredible amount of duplicate work and activities. Furthermore, it tends to hide information in silos, within the confines of each platform.

The aim of successful omnichannel learning is to abolish the boundaries of individual platforms. While running a single learning platform for all learning needs would be ideal from a systems management standpoint, it's often not feasible. Hence, when you're looking at "yet another app" to solve your learning challenges, you should pay attention to its interoperability with your existing infrastructure. An important aspect of that is the Application Programming Interfaces (APIs) the systems can use to fetch information from, and send it to, each other.

Omnichannel learning should aim for a unified user experience

Another omnichannel feature that may be equally challenging to create is a unified user experience across platforms. To use a retail analogy, the aim is not only for the mobile app to match the design of the responsive website or web application, but for the physical environment (the retail stores) to match it as well. A seamless transition between online and offline is key to delivering a great user experience and sustaining engagement. Interestingly, the online-to-offline transition is a particular challenge in learning as well (more on that later).

This area of omnichannel learning is the one where running multiple platforms usually kills the game. However, with a bit of effort on visual and functional design, we can do quite a lot. Naturally, visual design, colour schemes and the like should match across platforms, as it's a low-effort, high-return situation. In terms of functionality, you're better off if your applications follow similar logic for accessing and consuming learning. Furthermore, you shouldn't unreasonably restrict functionality on mobile platforms, or you may lose a lot of engagement.

How do we collect uniform learning data from all the different channels – even offline?

First of all, to understand and further develop omnichannel learning experiences, we need comprehensive learning data. As we want to eliminate unnecessary overlaps in delivery, we need to grasp how the different channels work together. While each app or learning tool may very well have its own analytics, they don't necessarily help with the bigger picture. Furthermore, a major challenge is bringing offline (face-to-face) activities into the mix and collecting data from them. Thus, we need a unified framework for recording all the different learning activities, whether mobile, online or classroom-based.

Luckily, we already have a technological answer to the problem: the Experience API (xAPI). The xAPI specification enables us to track and collect uniform data from all learning activities – even offline ones – and pass it to a single learning record store for analysis. It helps not only with learning analytics, but also enables a better understanding of content engagement and learner-centric design.

What about content development for omnichannel?

Finally, content development is an important topic in an omnichannel approach to learning. Naturally, all digital content should be fully responsive, so it can be accessed via a browser on all devices and wrapped into mobile applications for native use. Interoperability and accessibility are imperative, as the concept of omnichannel expands the "mobile learning paradigm" of "anytime, anywhere" to "any content, anytime, anywhere".

Integrating this mode of operation with offline activities is again the biggest challenge. The approach requires a degree of flexibility from trainers, coaches and mentors. They need to adapt their classroom content to form a natural continuum with the prior (digital) learning experiences. But thanks to xAPI and learning analytics, they nowadays have the power to understand each learner on a very individual level.

Are you delivering seamless and unified learning experiences across different channels? If you want to move away from siloed learning approaches, we can help. Our advisory services cover both technology implementations and strategic learning consulting. Just contact us.

Learning Technology Trends for 2019 – What’s Ahead?

During the past few years, we've witnessed an unprecedented speed of development in the learning technology space, and 2019 looks to be no different. At Learning Crafters we are lucky to have an inside view into much of the development happening in the space, thanks to our work with some of the leading technology vendors. Therefore, we thought it would be worthwhile to share some of our thoughts, views and first-hand experiences on what's ahead for the industry next year. Hence, here are four key learning technology trends for 2019.

Learning Technology Trend #1: Big Data will deliver exponential impact in 2019

For the past few years, organisations have been adopting tools and technologies to capture, analyse and execute on business data. While the human resources function in general seems to be lagging slightly behind in that adoption, 2019 looks to be a big year for big data. For learning and development, the holy grail of learning data – the Experience API (xAPI) – has already been available for several years. While adoption of the xAPI standard has been slower than expected, any organisation claiming to do "learning analytics" today cannot remain credible without involving xAPI. The old, commonplace ways of capturing learning data (e.g. SCORM) are simply not powerful enough. As we move into data-driven decision making in the L&D space, big data capabilities are an absolute requirement – and they will be delivered with xAPI.

Learning Technology Trend #2: Artificial Intelligence (AI) will undergo rapid developments

Naturally, in the era of machines, the xAPI learning data will not only be used for analytics. Rather, this type of behavioural data (comparable e.g. to Google Analytics) will be used to develop more advanced AI. Now, what is AI good for in the learning space? 

Currently, AI in learning is being used to build adaptive as well as personalised learning. Furthermore, the more advanced AI applications currently available are able to curate learning content based on the individual roles, needs and preferences of the learner. In 2019, we'll definitely see major developments on both fronts. Additionally, we predict another AI application: learning analysis – that is, the use of artificial intelligence to form insights on the link between learning and performance.

Learning Technology Trend #3: Virtual Reality (VR) will become more “commercial” 

If you're a learning professional and didn't hear about VR in 2018, it's time to get out more! While a lot of the hype surrounding VR is arguably just that – hype – 2019 looks interesting. Alongside a maturing industry understanding of what VR is good for, we are likely to see some major enablers emerge.

The first major problem with VR currently is the price tag. Arguably, building VR the way companies currently build it does not enable long-term adoption. Since VR content is mostly developed with game engines, there are few possibilities for the non-tech-savvy to build it. Look at how video has grown to its current dominance: that happened because every single individual can produce videos.

The second major problem with VR this year has been the lack of data capabilities. Without the ability to record big data from VR experiences, organisations cannot possibly prove the investment worthwhile. While VR experiences are definitely a great gimmick, many organisations have vastly over-invested in them. However, there's light at the end of the tunnel already in 2019. In fact, we are already seeing some of the first VR content editors emerge. These tools require no technical knowledge, game engines or programming, and they come with big data capabilities. Hence, they overcome both of the current major problems and are set for wider adoption.

Learning Technology Trend #4: Augmented Reality (AR) will redefine workflow learning 

While VR has been on everyone’s news feed, augmented reality has gone largely unnoticed in 2018. However, several companies both in- and outside of the learning field are developing their AR tools. With the current pipeline of technological development, AR is likely to have a major impact on bringing learning into the workflow. A lot of the initial impact will focus on the technical fields, such as engineering. 

For the first time in history, people will actually be able to learn without interrupting their work. This will happen with specialised AR headsets, which can open learning content in an additional layer of reality. The best tools will have voice control and come with remote capabilities. This enables e.g. trainers and experts to follow learners and guide them through activities. Through a live connection, the trainers may influence the "reality" visible to the learner. Furthermore, advanced headsets will likely incorporate cameras and tracking capabilities to capture great amounts of data. This data will be incredibly useful both for learning and for the business as a whole, as it enables a totally new level of recording work, understanding workflows and the learning happening during them.

Now, the four technologies here represent only a part of the future of learning, but arguably they’re the most hyped. Later, we’ll look at some other technologies as well as emerging methodological trends in L&D. 

Is your organisation ready to take advantage of the upcoming technological developments in the learning space? If not, we’re happy to work with you in building that capability. Just contact us. 

Rapid Learning Interventions – Removing & Redesigning Process Bottlenecks

As the business and corporate landscape changes faster than ever, learning and development faces a difficult time. Skills evolve at such a fast pace that predictions about the future are no longer meaningful. Business models are also becoming more complex, and employees seem to have many more responsibilities than before. Delivering effective corporate learning that stays on top of the change is not easy. However, organisations could help themselves by eliminating some of the traditional barriers in the learning delivery process. Here are two common and often major bottlenecks that hinder rapid learning interventions – and how to get rid of them.

Rethinking training needs analysis

One of the major bottlenecks in the corporate learning design process is the training needs analysis. The process itself is often too infrequent to respond to rapidly evolving needs. It’s also often reactive, rather than proactive. Finally, the common top-down approach where the L&D department assumes they could even have a chance at grasping the complexity of the roles in their organisation is outright infeasible.

The assumption that learning professionals know best when it comes to training needs is causing more harm than good. In fact, the people actually doing the day-to-day jobs often have much better visibility. Thus, learning professionals should leverage that visibility by polling for needs and crowdsourcing ideas. Further, to respond faster and enable rapid learning interventions, organisations need to go real-time. Learning data analytics can provide real-time insights into the skill gaps, competencies and training needs in the organisation. No more guesswork and fabricated evaluation intervals – the company can see the learning as it happens.

Redesigning the learning design process

In its current state, the learning design process seems to be a bit broken as well. In our experience, the lead times for developing learning materials can extend to 6 months or even a year in some organisations. A lot of this is attributable to the traditional, tedious development processes of learning. Initially, rapid eLearning tools emerged to combat this problem. However, even they still require quite long lead times. While everyone would like to develop a perfect product, that's most likely not going to happen. Hence, it probably makes sense to enable rapid learning interventions through more agile learning design.

Rather than perfecting and fine tuning the learning product for ages, you should start audience exposure already at the “minimum viable product (MVP)” phase. By involving employees and subject matter experts through a more user-centric design process, you can collect timely feedback and improve gradually. A more collaborative approach has the added benefit of potentially greater impact, as the involvement of the different stakeholders results in more personalised learning.

Actually, does L&D even need to be the one designing content?

There are two things we’ve noticed with the emergence of the online economy. One, anyone can create content for global audiences. Two, there are endless amounts of content publicly available on the internet. Wouldn’t it make sense to leverage these?

How about enabling rapid learning interventions by flipping the paradigm altogether? Since your employees are the best subject matter experts anyway, why not have them create learning content for each other? Or how about leveraging what’s already out there and replacing time-consuming design with learning content curation? There are a lot of tools out there to power up these types of approaches and further customise learning. Here’s an example for curating interactive microlearning videos.

Does your organisation face challenges in deploying rapid learning interventions and responding to business needs? We may be able to help. Just drop us a note here.

Leveraging Learning Content Analytics for Better Learning Experiences

We published this article first on eLearning Industry, the largest online community of eLearning professionals. You may find the original article here.

One area where Learning and Development professionals could learn a lot from, say, marketing experts is content analytics. Whereas marketing has embraced the need to constantly iterate and redevelop content based on real-time campaign analytics, learning professionals tend to take the easier route. Once an eLearning activity is produced and published, it's easy to just leave it there and be done with it. But in reality, the work is only halfway done. How do you find out whether the content resonated with the audience? If it didn't, how do you figure out the problem areas in the content? This is where learning content analytics come in handy.

Example Of Learning Content Analytics On A Training Video

When analysing the effectiveness of eLearning content, you should pay attention to what kind of metrics you are tracking. For instance, in the case of a training video, traditional metrics like how many times the video was opened don’t necessarily carry a lot of value. Instead, we should be looking at the content consumption behaviour on a wider scale, throughout the content and the learning journey. Let’s take a look at an analytical view of a training video.

With learning content analytics, you can easily capture where your learners lose interest and drop off.

In this example, you can see the users' behaviour at various stages of the training video. As usual, you see a slump immediately at the beginning, followed by another, bigger slump later on. We've coloured the two main points of interest to break them down.

1. Initial Attrition

You are always bound to lose some learners in the beginning due to a plethora of reasons. However, if you constantly see big drops starting from 0 seconds, you might want to double-check, e.g. the loading times of the content, to make sure your learners are not quitting because of inability to access the material in a timely manner.

2. Learning Content Engagement Failure

Going further in the video, we see another big slump where we lose around 40% of the remaining learners in just 30 seconds. Clearly, this represents a learning engagement failure. Something is not right there. Learners are likely dropping off because the content is not engaging, relevant or presented in an appealing way.
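Slumps like these can also be found programmatically from a retention curve. The sketch below scans per-checkpoint viewer retention (made-up numbers, e.g. 10-second buckets from video analytics) for the steepest drop over a short window.

```python
def biggest_drop(retention, window=3):
    """Find the steepest fall in a per-checkpoint retention curve
    (fraction of viewers still watching). Returns the checkpoint
    where the worst drop starts and its size."""
    worst = (0, 0.0)  # (start index, size of drop)
    for i in range(len(retention) - window):
        drop = retention[i] - retention[i + window]
        if drop > worst[1]:
            worst = (i, drop)
    return worst

# Illustrative 10-second checkpoints: small initial attrition,
# then a big slump partway through the video
retention = [1.00, 0.92, 0.90, 0.89, 0.88, 0.60, 0.52, 0.50, 0.49]

start, drop = biggest_drop(retention)
print(f"steepest drop starts at checkpoint {start}: -{drop:.2f}")
```

Flagging the worst window automatically lets you jump straight to the offending segment of the content instead of eyeballing every chart.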

How Should I Incorporate Content Analytics In The eLearning Development Process?

The above video analytics is just a single example of how you can use content analytics to support your learning. Ideally, you should be running these kinds of analytics across all your learning content. xAPI tracking capabilities give you a lot of possibilities in this regard. Once you're collecting the data and running the analytics, this is how you could build analytics into your eLearning development process:

  1. Develop an initial version of eLearning materials
  2. Roll it out to a test group of learners, monitor the analytics
  3. Identify potential learning engagement failures and re-iterate content accordingly
  4. Mass roll-out to a wider audience
  5. Revisit the content analytics at regular milestones (e.g. when a new group of learners is assigned the content) to ensure continued relevance and engagement

This type of approach helps ensure that the learning activities you provide and invest money in perform at their best at all times.

How Can I Use Learning Content Analytics To Provide Better Learning Experiences?

By now, you’ve surely thought of many use cases for content analytics. To summarise, here’s how you could provide a better learning experience through data-driven insights:

1. Identify The Types Of Content Your Learners Like

In the case of videos, you could benchmark the performance of different types of videos (e.g. talking heads, animations, storytelling videos) against each other and see what type of content keeps your learners engaged the best.
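As a rough sketch, such benchmarking can be as simple as grouping completion records by content type, assuming you can derive per-session records from your xAPI data. The field names and sample records below are made up for illustration.

```python
# Sketch: benchmark engagement across video formats.
# `sessions` is illustrative mock data standing in for records derived
# from xAPI "completed"/"terminated" statements.
from collections import defaultdict

sessions = [
    {"type": "talking_head", "completed": False},
    {"type": "talking_head", "completed": True},
    {"type": "animation",    "completed": True},
    {"type": "animation",    "completed": True},
    {"type": "storytelling", "completed": True},
    {"type": "storytelling", "completed": False},
]

def completion_rate_by_type(sessions):
    """Share of sessions completed, per content type."""
    totals, done = defaultdict(int), defaultdict(int)
    for s in sessions:
        totals[s["type"]] += 1
        done[s["type"]] += s["completed"]
    return {t: done[t] / totals[t] for t in totals}
```

The same grouping works for any engagement metric, such as average watch time or quiz scores, not just completion.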

2. Develop Engaging Content

With the power of analytics, you’ll be able to develop better learning. You can find out immediately what works and what doesn’t, with no need to run extensive surveys. The learners’ behaviour is the best feedback.

3. Personalise Learning Experiences

You can naturally run analytics for individuals and defined groups, in addition to the whole learner population. This helps you personalise learning experiences according to, for example, skill level, seniority, experience and previous learning history.

All in all, learning content analytics provides a powerful tool for increased transparency and visibility into the performance of your eLearning. As learning becomes more on-demand and just-in-time, analytics help ensure that you’re delivering the right content to the right audience.

Are you interested in developing more analytical, data-driven approaches to your L&D? Or want to know more about different content analytics possibilities? Just drop us a note, and we’ll get back to you. 

More Learning Ideas


User-Centred Learning Design – Using the 5Di Model for Learning Activity Development

A few weeks back, we touched on the topic of delivering engaging experiences with learner-centric design. While that article covered some general principles of user-centred learning design, we wanted to introduce you to an actual design framework. Naturally, we picked a framework that we’ve adopted and keep adapting at Learning Crafters, called 5Di. The 5Di is not something we developed ourselves; it was spearheaded by Nick Shackleton-Jones. We recognised the value-add of the approach and have since adapted it to our learning design process. So what’s the 5Di all about?

The 5Di User-centred learning design model

The model outlines a six-step learning design process: five Ds and one I.

  1. Define
  2. Discover
  3. Design
  4. Develop
  5. Deploy
  6. Improve

And here’s a rundown of the activities within each part of the process.

1. Define

As with any project, user-centred learning design should start with identifying the problem. It’s important to partner with the business to define the desired outcomes. These should be based on business results, not learning objectives per se. After all, you’re developing learning to achieve business impact. However, don’t confine yourself to a familiar set of solutions during the definition phase – a course, or even training at all, is not always the right answer.

2. Discover

Then, partner with the assumed audience of the learning to gain a deeper understanding of the business problem. Involve subject-matter experts to identify the required behaviours and the barriers to improved performance. It’s very difficult to translate learning into behaviour later on if you don’t take the time to understand the line of business first.

3. Design

Next, formulate an approach to solving the learning problem and document it for presentation to the decision-makers. Develop scripts, wireframes or storyboards outlining the approach. A good wireframe helps to divide up tasks later on, enabling quicker and more agile development.

4. Develop

Next, develop a Minimum Viable Product (MVP) to get user and stakeholder feedback on. Iterate and refine the learning design accordingly. Test the “product” for usability, interoperability with existing systems, etc. And remember, collecting feedback is paramount: if you don’t focus on gathering user feedback, the whole concept of an MVP becomes moot. Furthermore, it’s important that designers continue to partner with subject-matter experts to guarantee a truly user-centred learning design.

5. Deploy

Roll out the learning activity to the users while promoting it through the communication and marketing channels available to you. Good communication is essential to a successful learning activity, so treat the roll-out as a marketing campaign. A single informative email is not enough; rather, you should drum it up over time and incorporate user feedback, referrals and success stories where possible. It also often pays to get line managers to recommend the learning activities to their teams.

6. Improve

Finally, we arrive at the most important step! The learning development process doesn’t stop once learners have completed the course. Rather, you should keep monitoring content performance and user engagement levels and make improvements accordingly. A data-driven approach is well suited to this, and xAPI capabilities help tremendously in analysing engagement. And remember, it’s not only subject-matter refinement you should focus on: the delivery and user experience are often more important.
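As a rough illustration of that monitoring, engagement analysis on xAPI data can start as simply as comparing who attempted an activity with who completed it. The statements below are heavily simplified mock data (real xAPI statements carry fuller actor/verb/object structures); the verb IRIs are standard ones from the ADL verb registry, while the actors and activity ID are made up.

```python
# Sketch: completion ratio for one activity from simplified xAPI data.
# Standard ADL verb IRIs; actors and activity IDs are illustrative.
COMPLETED = "http://adlnet.gov/expapi/verbs/completed"
ATTEMPTED = "http://adlnet.gov/expapi/verbs/attempted"

statements = [
    {"actor": "alice", "verb": ATTEMPTED, "object": "course/5di-intro"},
    {"actor": "alice", "verb": COMPLETED, "object": "course/5di-intro"},
    {"actor": "bob",   "verb": ATTEMPTED, "object": "course/5di-intro"},
]

def completion_ratio(statements, activity):
    """Unique completers divided by unique attempters for an activity."""
    attempts = {s["actor"] for s in statements
                if s["verb"] == ATTEMPTED and s["object"] == activity}
    completions = {s["actor"] for s in statements
                   if s["verb"] == COMPLETED and s["object"] == activity}
    return len(completions) / len(attempts) if attempts else 0.0
```

In a real setup you would pull the statements from your Learning Record Store rather than hard-code them, and track the ratio over time as part of the Improve step.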

That’s 5Di, a user-centred learning design approach, in a nutshell. With this agile method, we’ve been able to significantly reduce our learning development times, and the results have been much better in terms of measurability, user experience and learning outcomes.

Are you using 5Di or a similar learning design approach? If you’d like to implement a more agile learning development approach with your learning designers, we can help you. Just drop us a note.


Digital Training Evaluation – Using the Kirkpatrick Model and Learning Analytics

Ask an L&D professional how they measure training effectiveness and learning, and the likely answer is that they use the Kirkpatrick four-level evaluation model. The model has been a staple in the L&D professional’s toolbox for a long time. However, if you dig deeper, you’ll find that many organisations are only able to assess levels 1 and 2 of the model. While these levels do yield valuable information, they help very little in determining the true ROI of learning. Luckily, thanks to technological development, we nowadays have the capability to do digital training evaluation on all four levels. Here are some best practices on how to do it.

Level 1: Reaction – Use quick feedback and rating tools to monitor engagement

The first level of Kirkpatrick is very easy to implement across all learning activities. Use digital tools to collect quick feedback on every activity, in the form of likes, star ratings, scores or Likert scales. Three questions should be enough to cover the ground:

  1. How did you like the training?
  2. How would you rate the value-add of the training?
  3. Was the training relevant to your job?

Generally, scale- or rating-based feedback works best for level 1; free-text feedback requires too much effort to analyse effectively.
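As a minimal sketch, aggregating those ratings and flagging underperforming trainings takes only a few lines. The training names, scores and the flagging threshold below are illustrative assumptions.

```python
# Sketch: summarise Level 1 (Reaction) star ratings per training.
# Names, scores and the flagging threshold are made up for illustration.
from statistics import mean

ratings = {
    "Onboarding 101": [5, 4, 4, 5],
    "GDPR Refresher": [2, 3, 1, 2],
}

def level1_summary(ratings, flag_below=3.0):
    """Average each training's ratings and flag those under the threshold."""
    return {name: {"avg": round(mean(scores), 2),
                   "flagged": mean(scores) < flag_below}
            for name, scores in ratings.items()}
```

Flagged trainings are candidates for the kind of content review described in the analytics-driven development process above.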

Level 2: Learning – Use digital training evaluation to get multiple data points

For level 2, it all starts with the learning objectives. These should be very specific and tied to specific business outcomes (we’ll explain why under level 4). Once you have defined them, it’s relatively easy to build assessments around them. Naturally, we are measuring the increase in knowledge rather than just the knowledge itself. Therefore, it is vital to record at least two data points throughout the learning journey. A handy way to go about this is to design pre-learning and post-learning assessments. The former captures the knowledge and skill level of the employee before the training. Comparing that with the latter, we can confidently identify the increase in knowledge. You can easily do this kind of assessment with interactive quizzes and short tests.

“If you’re measuring only once, it’s almost as good as not measuring at all”
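The pre/post comparison itself is simple arithmetic. A minimal sketch, with made-up quiz scores on a 0–100 scale:

```python
# Sketch: Kirkpatrick Level 2 knowledge gain from pre/post quiz scores.
# Learners and scores are illustrative mock data.

def knowledge_gain(pre, post):
    """Per-learner score delta plus the cohort's average gain."""
    gains = {learner: post[learner] - pre[learner] for learner in pre}
    avg = sum(gains.values()) / len(gains)
    return gains, avg

pre  = {"alice": 40, "bob": 55, "carol": 60}
post = {"alice": 75, "bob": 70, "carol": 85}
gains, avg_gain = knowledge_gain(pre, post)
```

Per-learner gains also let you spot individuals who didn’t improve and may need follow-up support, not just the cohort average.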

Level 3: Behaviour – Confirm behavioural change through data and analytics

Level 3, measuring behaviour, delves into somewhat uncharted territory. There are a couple of different angles for digital training evaluation here.

First, you could engage the learners in self-assessment. For this admittedly biased method, two questions should be enough. If no behavioural change is reported, the second question captures the reason behind it, and L&D can intervene accordingly.

  1. Have you applied the skills learnt? (linking to specific learning, can be a yes/no question)
  2. If not, why not?

Secondly, since self-assessment is often highly biased, it’s not necessarily meaningful to collect more data directly from the learners themselves. To really get factual insight into level 3, you should be using data and analytics. Businesses record a lot of data on a daily basis – just think about all the information that is collected by, or fed into, the systems we use every day. You should combine the data from these systems with the self-assessments to corroborate the reported behavioural change. For instance, a salesperson could see an increase in calls made post-training, or a marketer an increase in the number of social media posts they put out. The organisation already has all the necessary data – it’s just a matter of tapping into it.
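As a sketch, corroborating the reported change against system data could look like the following, assuming you can export a metric such as weekly call counts from your CRM. The data and the 10% lift threshold are illustrative assumptions.

```python
# Sketch: cross-check self-reported behavioural change (Level 3)
# against activity data from a business system, e.g. weekly sales
# calls logged in a CRM. All figures are illustrative.

def behaviour_changed(calls_before, calls_after, min_lift=0.10):
    """True if average weekly calls rose by at least `min_lift` (10%)."""
    before = sum(calls_before) / len(calls_before)
    after = sum(calls_after) / len(calls_after)
    return (after - before) / before >= min_lift

# Four weeks before vs. four weeks after the sales training:
calls_before = [20, 22, 18, 20]
calls_after = [25, 24, 26, 25]
improved = behaviour_changed(calls_before, calls_after)
```

Where the system data and the self-assessment agree, you have a much stronger case for genuine behavioural change than either source alone provides.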

Level 4: Results – Combining Learning Analytics and Business Analytics

Finally, level 4 evaluation is the pot of gold for L&D professionals. This is where you link learning to business performance and demonstrate ROI through business impact. With modern digital training evaluation, you can eliminate the guesswork and deliver facts.

Note that the evaluation levels are not standalone: level 4 is linked to levels 2 and 3. If there was no increase in knowledge, or behavioural change did not happen, there is no business impact. You might see a positive change in results, but you should not mistake it for a product of learning if the previous levels have not checked out. Once levels 2 and 3 have come out positive, you can look at the bigger picture.

First, look back at the learning objectives, especially the business outcomes they were tied to. If the aim of your sales training was to increase the number of calls made, look at what happened to that specific metric. If you see a change, you can then look at the business outcomes: how much additional revenue would those extra sales calls have produced? The results can also be changes in production, costs, customer satisfaction, employee engagement, etc. In any business, you should be able to assign a dollar value to most, if not all, of these metrics. Once you have the dollar value, it’s simple math to figure out the ROI.
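The final calculation really is simple math. A sketch using the sales-call example, with entirely made-up figures:

```python
# Sketch: Kirkpatrick Level 4 ROI from a dollar-valued business outcome.
# All figures are made up for illustration.

def training_roi(extra_calls, revenue_per_call, training_cost):
    """ROI as a percentage: (benefit - cost) / cost * 100."""
    benefit = extra_calls * revenue_per_call
    return (benefit - training_cost) / training_cost * 100

# 200 extra calls at $150 expected revenue each; $10,000 training cost:
roi = training_roi(200, 150, 10_000)  # 200.0 (% ROI)
```

The hard part is not the arithmetic but agreeing with the business on the dollar value per unit of outcome; once that is settled, the ROI follows directly.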

All in all, there’s really no excuse for neglecting levels 3 and 4 of Kirkpatrick. You can manage digital training evaluation and learning analytics even on a limited budget. It’s just a matter of embracing data and the benefits of data-driven decision making.

Want to start evaluating your learning on all levels? Click here to start.
