Data-driven Learning Design – How to Get Started?

As a whole, the L&D industry hasn’t always done a terribly good job when it comes to designing learning. However, we have started to recognise that one-size-fits-all activities are probably not the way to go, and that we should design learning for the people doing the actual jobs, not for the company HR department. Fundamentally, designing better learning is about knowing your learners. In that respect, the overall capabilities of the industry have developed tremendously over the past few years (with tools like xAPI). However, as we accumulate more data and information, it’s important to know how to use it well. Thus, we decided to look at data-driven learning design: how to get started, and the different types of data you can use in design decisions. We’ll divide this article into two parts, covering the initial round of design and the subsequent rounds of iteration.

Understanding who you are designing for

At the start of any design process, you should always spend time understanding the problem and the “customers”. In corporate learning, this discovery is equally important, yet something that many organisations skip almost entirely. Here’s where data-driven learning design approaches already come in handy, albeit perhaps not in the way you expect.

Since it’s your people and employees you are designing for, you have an abundance of data available to you. However, this data doesn’t necessarily sit within L&D’s own systems or records. Rather, you might have to look for it in other places. For instance, demographic data might sit in an HR system, while assignment- and task-related data might sit in a performance management system. These kinds of data can help you create rough archetypes, or “personas”, of your learners: who they are, what they do and so on.
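To make this concrete, here’s a minimal sketch of what pulling those silos together might look like, assuming you can export HR and performance data as CSV files. All file and column names here are hypothetical; substitute whatever your own systems export.

```python
# A minimal sketch of combining siloed HR and performance data into rough
# learner archetypes. File and column names are hypothetical.
import pandas as pd

hr = pd.read_csv("hr_export.csv")             # e.g. employee_id, role, location, tenure_years
perf = pd.read_csv("performance_export.csv")  # e.g. employee_id, main_task, tasks_per_week

people = hr.merge(perf, on="employee_id")

# Group by role and location to get a first cut at "personas":
# who they are, where they work, and what they spend their time on.
personas = (
    people.groupby(["role", "location"])
    .agg(
        headcount=("employee_id", "count"),
        avg_tenure=("tenure_years", "mean"),
        typical_task=("main_task", lambda s: s.mode().iat[0]),
    )
    .reset_index()
)
print(personas)
```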

However, if we leave it there, we might still miss the mark. At the initial design stage, we should also explore how our learners can engage with the learning content at the workplace. As we don’t want to inconvenience them, it’s important to get to know their workflows and the ways we could instil learning into them. Now, this is a part of data-driven learning design that you don’t have an easy tool or dashboard for. Rather, you have to get out there, start observing and exploring, and collect qualitative data. Different service design methods prove quite effective in this regard.

Understanding how learners engage with the content

Unfortunately, once you’ve put a learning activity together, your job doesn’t end there. Although the initial time spent on learning design does pay off, it’s still unlikely that everything works perfectly. Maybe there are pieces of content that the learners don’t engage with. Maybe they engage in ways different to what you initially thought. Whatever the actual usage and engagement behaviour is, it’s our job to find out.

To start out, tools like web analytics can provide handy insights into e.g. engagement times, devices used and geographical locations. Then, more specific tools for learning content analytics can tell us stories about how the content is being consumed. Finally, it’s tools like xAPI that enable us to practically follow the learners’ journeys through the material, tracking and seeing every interaction along the way.
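For illustration, recording one of those interactions as an xAPI statement is conceptually simple: an “actor – verb – object” record posted to a Learning Record Store. Here’s a minimal sketch; the LRS endpoint and credentials are placeholders, while the statement structure follows the xAPI specification.

```python
# A minimal sketch of recording a learner interaction as an xAPI statement.
# The LRS endpoint and credentials are placeholders.
import requests

LRS_ENDPOINT = "https://your-lrs.example.com/xapi/statements"  # placeholder
AUTH = ("lrs_key", "lrs_secret")                               # placeholder

statement = {
    "actor": {"mbox": "mailto:jane.doe@example.com", "name": "Jane Doe"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://lms.example.com/courses/onboarding/video-1",
        "definition": {"name": {"en-US": "Onboarding video 1"}},
    },
}

resp = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()
```

Every click, answer and video view can be captured this way, which is what makes following the learner’s journey possible in the first place.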

Once we know what’s not working, we can fix it. Maybe we need to cater to different device sets than initially thought. Maybe the video we produced doesn’t actually engage the learners. Or perhaps the sequencing of learning activities seems to be wrong, as the data might show they jump between sections rather than following a linear path. Regardless of what it is, smart data-driven learning design enables us to get information, understand its magnitude, and make design decisions accordingly. Remarkable results are not produced in one iteration.

Final thoughts

If we want to improve as an industry, L&D has to start working with data to be able to produce better outcomes. It’s easy to view data-driven learning design as something daunting and terrifying, but it’s really not. Sure, we need to adjust our mentality a bit. We need to become more comfortable with “betas” and iterations, and the fact that we may not always get it right the first time. But once we get past that, there should be a great future ahead. And if you’re not entirely comfortable with all this just yet, we are happy to hold your hand. Just contact us here.

How to Use Brinkerhoff’s Success Case Method in Workplace Learning

There are a lot of different frameworks that organisations use to evaluate the impact of their workplace learning initiatives. The Kirkpatrick model and the Phillips ROI model may be the most common ones. While Brinkerhoff’s Success Case Method is perhaps less known, it too can provide value when used correctly. In this post, we’ve compiled a quick overview of the method and how to use it to support L&D decisions in your organisation.

What’s Brinkerhoff’s Success Case Method?

The method is the brainchild of Dr. Robert Brinkerhoff. While many of its original applications relate to organisational learning and human resources development, the method is applicable to a variety of business situations. The aim is to understand impact by answering the following four questions:

  • What’s really happening?
  • What results, if any, is the program helping to produce?
  • What is the value of the results?
  • How could the initiative be improved?

As you may guess from the questions, the Success Case Method’s focus is on qualitative analysis and on learning from both successes and failures at the program level to improve for the future. On one hand, you’ll be answering what enabled the most successful to succeed; on the other, what barred the worst performers from success.

How to use the Brinkerhoff Method in L&D?

As mentioned, the focus of the method is on qualitative analysis. Therefore, instead of using large-scale analytics, the process involves surveys and individual learner interviews. By design, the method is not concerned with measuring “averages” either. Rather, the aim is to learn from the most resounding successes and the worst performances, and then either replicate or redesign based on that information.

So ideally, you’ll want to find just a handful of individuals from both ends of the spectrum. Well-designed assessment or learning analytics can naturally help you in identifying those individuals. When interviewing people, you’ll want to make sure that their view on what’s really happening can be backed with evidence. It’s important to keep in mind that not every interview will produce a “success case”, one reason being the lack of evidence. After all, you are going to be using the information derived with this method to support your decision making, so you’ll want to get good information.
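As a simple illustration of that shortlisting step, assuming your assessment results can be exported as a CSV (file and column names are hypothetical), picking both ends of the spectrum could look like this:

```python
# A minimal sketch of shortlisting interview candidates from both ends of
# the spectrum. File and column names are hypothetical.
import pandas as pd

scores = pd.read_csv("assessment_results.csv")  # e.g. employee_id, score

top = scores.nlargest(5, "score")      # the most resounding successes
bottom = scores.nsmallest(5, "score")  # the weakest performances

print("Interview shortlist (successes):", top["employee_id"].tolist())
print("Interview shortlist (non-successes):", bottom["employee_id"].tolist())
```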

Once you’ve established the evidence, you can start looking at results. How are people applying what they’ve newly learnt? What kind of results are they seeing? This phase requires great openness. Every kind of outcome and result is a valuable one for the sake of analysis, and they are not always the outcomes that you expected when creating the program. Often training activities may have unintended application opportunities that only the people on the job can see.

When should you consider using Brinkerhoff’s Success Case Method?

It’s important to acknowledge that while the method doesn’t work on everything, there are still probably more potential use cases than we can list. But these few situations are ones that in our experience benefit from such qualitative analysis.

  • When introducing a new learning initiative or a pilot. It’s always good to understand early on where a particular learning activity might be successful and where not. This lets you make changes, improvements and even pivots early on.
  • When time is of the essence. Quantitative data and insights take time to compile (even assuming you have the necessary infrastructure in place). Sometimes we need to prove impact fast. In such cases, using the Brinkerhoff method to extract stories from real learners helps to communicate impact.
  • Whenever you want to understand the impact of existing programs on a deeper level. You may already be collecting a lot of data. Perhaps you’re already using statistical methods and tools to illustrate impact on a larger scale. However, for the simple fact that correlation doesn’t mean causation, it’s sometimes important to engage in qualitative analysis.

Final thoughts

Overall, Brinkerhoff’s Success Case Method is a good addition to any L&D professional’s toolbox. It’s a great tool for extracting stories of impact, passing them on and learning from past successes and failures. But naturally, there should be other things in the toolbox too. Quantitative analysis is equally important, and should be “played” in unison with the qualitative. Especially nowadays, when the L&D function is getting increased access to powerful analytics, it’s important to keep exploring beyond the surface level to make decisions that are as informed as possible to support the business.

If you are struggling to capture or demonstrate the impact of your learning initiatives, or if you’d like to start doing L&D in a more agile manner, let us know. We can help you implement agile learning design methods as well as the analytical tools and processes to support the business.

How to Use Data to Support Face-to-face Training?

Organisational learning and development is becoming increasingly data-driven. This is fuelled by the need to demonstrate impact, be more effective and direct resources more efficiently. With the advent of new learning technologies and platforms – many of which come with built-in analytics capabilities – we are increasingly better equipped to measure all kinds of learning in a meaningful way. However, for the most part, the collection and especially the use of this data has been limited to digital learning experiences. But there’s no reason to impose that kind of limitation. In fact, traditional face-to-face training could benefit greatly from having access to data and analytics. So, let’s explore how we could support face-to-face training with data!

Current challenges with face-to-face training

Face-to-face training faces its fair share of challenges. On one hand, it’s rather expensive once you factor in all of the lost productivity and indirect costs. However, cost becomes less of an issue as long as you can demonstrate impact and value. And that’s perhaps a business challenge. The real learning challenges, on the other hand, are related to the delivery.

Overall, face-to-face learning is not particularly personalised. Trainers are often not aware of the existing knowledge of the participants, let alone their personal context: jobs, tasks, challenges, problems, difficulties, team dynamics and so on. Hence, the training – especially in subject-matter-intensive topics – often results in a more or less one-size-fits-all approach: the trainer goes through the slide deck, perhaps with a few participatory activities and some feedback at the end. Even if you’re an experienced trainer, it’s difficult to improvise and go off-course in the heat of the moment to pursue the emerging (personal) needs of the learners.

So, wouldn’t it make sense to put that information to good use and start to support face-to-face training with data? Yes, it would. Here are two easy ways you can get a lot more out of your “classroom” sessions.

1. Determining existing knowledge and skill level with pre-work

One of the simplest things you can do to get more value out of your face-to-face training is to start using pre-work. Have your learners go through digital learning materials before coming to the session. Build in some seamless assessment and collect information in the form of user submissions and feedback. With good design and proper use of learning analytics, this already gives you a lot of valuable information.

As a trainer, you can then check e.g. what your learners already know and what they are having difficulties with. It probably doesn’t make sense to spend a lot of time in the classroom on things they already know. Rather, you’re better off using the time to address problem areas, challenges and personal experiences that have come up during the pre-work. Or if you want to make things even more impactful, try an approach like flipped learning, where you use digital to deliver the knowledge while focusing the classroom time solely on discussions, practice and hands-on activities.
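As a rough sketch of what that trainer-facing summary could look like, assuming your pre-work tool can export item-level answers (file and column names are hypothetical):

```python
# A minimal sketch: summarise pre-work quiz results per topic so classroom
# time goes to the weakest areas. File and column names are hypothetical.
import pandas as pd

answers = pd.read_csv("prework_answers.csv")  # learner_id, topic, correct (0/1)

mastery_by_topic = answers.groupby("topic")["correct"].mean().sort_values()

for topic, mastery in mastery_by_topic.items():
    # The 60% cut-off is arbitrary -- tune it to your own context.
    plan = "focus in classroom" if mastery < 0.6 else "skim or skip"
    print(f"{topic}: {mastery:.0%} correct -> {plan}")
```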

2. Using learning records history to understand the people you’re training

Another area where we could do better is understanding the people we’re training. Learning records, at their best, provide a whole history of a person’s learning. As digital platforms compile more and more data about our learning experiences, it would be beneficial to give trainers access to that data as well. By understanding prior experiences, the trainer can create scaffolding – build on what the employees already know from before, even when that knowledge is unrelated to the current topic.

Furthermore, having access to an “HR” history of the employees might be beneficial too, especially in large organisations where the trainer doesn’t necessarily know the people personally. For instance, what are the attendees’ jobs? Where do they work? Where have they worked before? In what kind of roles? Information like this brings additional data points on which to personalise the learning experience. In some cases, you might even find that there’s a subject matter expert in the group, or someone who has dealt in practice with the issues the training covers. These are assets you can leverage, and ones you perhaps wouldn’t even know about without the data.

Final thoughts

All in all, there’s a whole lot that data and analytics can offer to “traditional” training. The need for personalisation is real, and smart use of learning data helps to cater to that need. Of course, you can use data to support face-to-face training in many more ways; these are just two examples. For instance, post-session feedback is much handier to collect digitally. This feedback can then be used to improve future sessions on the same topic (or with the same participants).

If you feel you could do more with data and smart learning design, don’t hesitate to reach out. We can help you design blended learning experiences that deliver impact and value.

Personal Learning Analytics – Helping Learners with Their Own Data

Generally, the use of data and analytics in workplace learning is reserved for a small group of senior people. Naturally, learning analytics can provide a lot of value for that group. For instance, data-driven approaches to training needs analysis and measuring corporate learning are quickly gaining ground out of the need to prove the impact of workplace learning initiatives. However, there could be further use cases for those analytical powers. One of them is helping the learners themselves with personal learning analytics.

What is ‘personal learning analytics’?

As the title may give away, personal learning analytics is just that: individualised information made available to the learner. The major difference from conventional “managerial” analytics is that most of the information is about the learner in question. Whenever that’s not the case, the information of others would always be anonymised. A few exceptions could include e.g. gamification elements which display user names and achievements. So, effectively, it’s all about giving users access to their own data and anonymised “averages”.

How can we use personal analytics to help learners?

One of the challenges in conventional approaches to workplace learning is that the process is not very transparent. Often, the organisation controls the information, and the learners may not even gain access. However, a lot of this information could help the learners. Here are a few examples.

  • Comparing performance against others. While cutthroat competition is probably not a good idea, and learners don’t necessarily want others to know how they fared, they can still benefit from being able to compare their performance against the group’s. Hence, they’ll know if they’re falling behind and can adjust their effort or seek new approaches.
  • Understanding the individual learning process. All of us would benefit greatly from information about how we learn: for instance, how we have progressed, how we are developing, and how and when we engage with learning. Luckily, personal learning analytics could tell us about all of that. Progress information helps to keep us motivated, while engagement patterns help us to identify habits and build on existing behaviour.
  • Access to one’s learning history. We are learning all the time, and all kinds of things. However, we are not necessarily very good at keeping track ourselves. If we could just pull all that data into one place, we could have a real-time view into what we have learned in the past. Potentially, this could enable us to identify new skills and capabilities – something that the organisation would likely be interested in too.

Towards self-regulated learning

Across the globe, organisations are striving to become more agile in their learning. One key success factor for such transformation is the move towards more self-regulated learning. However, achieving that is going to be difficult without slightly more democratised information.

If the learners don’t know how they are doing, they cannot really self-regulate effectively. And no, test scores, completion statistics and annual performance reviews are not enough. Learning is happening on a daily basis and the flow of information and feedback should be continuous. Thankfully, the technology to provide this sort of individual learning analytics and personalised dashboards is already available. For instance, xAPI and Learning Record Stores (LRS) enable us to store and retrieve this type of “big learning data” and make it available to the learners. Some tools even provide handy out-of-the-box dashboards.
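As a small sketch of the DIY end of this, a personal dashboard can start from fetching the learner’s own statements back out of the LRS. The endpoint and credentials below are placeholders; the query parameters follow the xAPI specification, and timestamps are ISO 8601 per the spec.

```python
# A minimal sketch of a personal learning dashboard: fetch the learner's own
# xAPI statements from an LRS and summarise when they tend to learn.
# Endpoint and credentials are placeholders.
from collections import Counter
import requests

LRS_ENDPOINT = "https://your-lrs.example.com/xapi/statements"  # placeholder
AUTH = ("lrs_key", "lrs_secret")                               # placeholder

params = {"agent": '{"mbox": "mailto:jane.doe@example.com"}', "limit": 500}
resp = requests.get(
    LRS_ENDPOINT,
    params=params,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()
statements = resp.json()["statements"]

# When does this learner engage? Count statements per hour of day
# (characters 11:13 of an ISO 8601 timestamp are the hour).
hours = Counter(s["timestamp"][11:13] for s in statements)
for hour, count in sorted(hours.items()):
    print(f"{hour}:00 -> {'#' * count}")
```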

On a final note, we do acknowledge that the immediate applications of “managerial” learning analytics likely provide greater initial value to any given organisation. And if you’re not already employing learning analytics to support your L&D decision making, you should start. However, once we go beyond that stage, providing access to personal learning analytics may be a good next step that also helps to facilitate a more modern learning culture in the organisation.

If you’re excited about learning analytics, whether on an organisational or personal level, but think you need help in figuring out what to do, we can help. Just drop us a note here, and let’s solve problems together.

How to Optimise Learning Experiences? 3 Advanced Methods

Good and effective learning is not just about the content. Rather, it’s the sum of content, user experience and fit-for-purpose that defines the success of a learning experience. Nowadays, as we develop digital learning experiences, we need to pay increasing attention to how everything works. Frankly, there are a lot of factors to take into consideration. Luckily, the prevalence of digital and web-based tools gives us the capability to optimise learning like never before. Therefore, we summed up three different methods for optimising learning experiences.

1. Using A/B testing to discover the best design or content

If you’ve ever done digital marketing, or UX design, you’re probably familiar with A/B testing. The underlying idea of A/B testing is to try out two versions of a piece of content or design, and measure the response. To optimise a learning experience, we could for instance measure:

  • Whether a text element or video conveys the required information faster
  • Which typeface/colour scheme/structure creates the most positive response
  • Task performance after using immersive simulations vs. a conventional e-learning module
  • Ease of use of navigation and user flow between two different design versions

By comparing different options with each other in live use, we can get a lot of data. This enables us to optimise the learning experience and get a little closer to the best solution. However, while A/B testing is a good tool, use it wisely. You should always make sure you’re only testing one variable at a time. Otherwise, you can’t be certain of the contributing factors.
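To illustrate, here’s a minimal sketch of evaluating such a test on completion rates with a two-proportion z-test. The numbers are hypothetical, and, in line with the advice above, the two variants differ in a single variable only.

```python
# A minimal sketch of judging an A/B test on completion rates,
# using a two-sided two-proportion z-test (standard library only).
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided p-value for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # P(|Z| > |z|)
    return p_a, p_b, p_value

# Hypothetical results: variant A = video intro, variant B = text intro.
p_a, p_b, p = two_proportion_z(success_a=130, n_a=200, success_b=104, n_b=200)
print(f"A: {p_a:.0%} completed, B: {p_b:.0%} completed, p-value = {p:.3f}")
```

A small p-value suggests the difference is unlikely to be noise; with small sample sizes, keep the test running longer before calling a winner.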

2. Using web analytics to optimise the learning experience

Just like with A/B testing, if you’ve been involved with marketing, you’re likely familiar with web analytics. Nowadays, as a lot of the learning platforms out there are in fact “websites”, we can leverage web analytics to understand how a particular platform is being used.

The most famous web analytics tool is probably Google Analytics. But it’s not really about the tool itself; it’s about how you use the data it collects. Some traditional web analytics data points that can be used to optimise learning experiences include:

  • Device information. How many of the learners are using mobile? What about tablets? Desktop?
  • Bounce rates. How many learners don’t go beyond the first page? Where do they exit?
  • Time of usage. When are learners engaging on the platform? Are they learning during the workday or in their free time?
  • Frequency. How many times have your learners visited your platform? Are they coming back?

All of these data points, and many more, help us to further optimise the learning experience. While these types of web analytics are handy, you may also consider xAPI compatible platforms and analytics. The advantage of xAPI is that whereas e.g. Google’s data is largely anonymised, xAPI lets you drill down to the level of individual learners, and all their interactions within the platform.
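As a small illustration, if you can export raw page-view data from your analytics tool (file and column names here are hypothetical), several of the questions above reduce to simple aggregations:

```python
# A minimal sketch of answering the questions above from a raw page-view
# export. File and column names are hypothetical.
import pandas as pd

views = pd.read_csv("pageviews.csv")  # user_id, page, device, timestamp

# Device split: how many unique learners per device type?
print(views.groupby("device")["user_id"].nunique())

# Frequency: share of learners active on more than one day
# (distinct active days as a rough proxy for visits).
views["day"] = pd.to_datetime(views["timestamp"]).dt.date
visits = views.groupby("user_id")["day"].nunique()
print(f"Returning learners: {(visits > 1).mean():.0%}")

# Rough bounce rate: learners who only ever saw a single page.
pages = views.groupby("user_id")["page"].nunique()
print(f"Rough bounce rate: {(pages == 1).mean():.0%}")
```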

3. Using heatmaps and user recordings to understand the flow of the learning experience

A handy new tool in the analytics space is the “heatmap”. While these tools collect largely similar types of data to web analytics, they go slightly further. With heatmaps and user recordings, we can find out for instance:

  • The scrolling behaviour of our learners
  • Mouse movements / taps / clicks
  • The “flow” within the page or learning activity

This type of information helps us to further address problem areas, as we’ll know exactly where the learners tend to pause (perhaps there’s an unclear explanation?), where they progress to (does it happen linearly or as intended?) and how they flow through the activity. For instance, you might find out that only 25% of the learners reach the piece of content you spent a lot of time on. In that case, you might want to rework the activity.
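A sketch of that “reach” analysis, assuming your platform logs one event per section view (file and column names are hypothetical):

```python
# A minimal sketch of a "reach" funnel: what share of learners ever get to
# each section of an activity? File and column names are hypothetical.
import pandas as pd

events = pd.read_csv("section_views.csv")  # learner_id, section_order

total_learners = events["learner_id"].nunique()
reach = events.groupby("section_order")["learner_id"].nunique() / total_learners

for section, share in reach.sort_index().items():
    print(f"Section {section}: reached by {share:.0%} of learners")
```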

Final words

Learning design as a process is becoming much more agile. We can no longer justify developing large amounts of content or designing in a specific way without validating the assumptions with data. By working to optimise learning experiences, we ensure that learners receive the right resources in the right way, which greatly contributes to their learning success. While the above are great methods and tools for optimisation, you can do quite well even with more traditional means, e.g. surveys or focus groups. In the end, it’s all about getting the right data and letting it guide your decisions.

If you’d like to explore more agile or learner-centric ways of designing workplace learning, feel free to drop us a note. Let’s optimise your learning experiences together!

Quantitative vs. Qualitative Data in Learning

Corporate learning and development is becoming increasingly data-driven. On one hand, learning teams need to be able to track and assess all the different learning experiences. On the other hand, they also need to demonstrate business value. This requires smart use of data collection and analytics. While all efforts towards more data-driven strategies are a step in the right direction, we’ve noticed a strong bias towards quantitative data. While quantitative data is important, it’s not quite enough to effectively evaluate learning as a process. To achieve that, we’ll also need to pay attention to qualitative data in learning. To help clear up some of the ambiguity, let’s look at each of the two and how to use them smartly.

What is quantitative data in learning?

Quantitative data, by definition, is something that can be expressed in numerical terms. This type of learning data is used to answer questions such as “how many” or “how often”. In learning, organisations often use quantitative data in the form of tracking:

  • Enrolment rates
  • Completion rates
  • Time spent on learning
  • Quiz scores

Unfortunately, this is often where most organisations stop with their learning analytics capabilities. However, the problem is that this type of data tells us absolutely nothing about the learning efficacy or the process itself. While we can always employ better quantitative metrics, such as engagement rates, that will never be enough. In any case, we are always better off with access to qualitative data as well.

What is qualitative data in learning?

There are a lot of questions that we cannot effectively answer with numbers; hence we need qualitative data. Qualitative data, by definition, is non-numerical and used to answer questions such as “what”, “who” or “why”. Examples of qualitative data in learning could include:

  • How the learners progressed through the activities
  • The nature and topics of discussion between the learners
  • How employees accessed the learning
  • How the employees applied the learning on the job

It’s quite evident that these types of questions go a lot further in understanding behaviours, efficacy and the learning process as a whole. Without this kind of data, you effectively have no idea what works and what doesn’t. Or, you may be able to see the effect (e.g. low completion or engagement rates) but have no idea of the underlying cause (e.g. irrelevant content, bad user experience). From a learning design perspective, these types of data points are immensely valuable.

How to use quantitative and qualitative learning data meaningfully?

So, in general, there’s a lot of untapped value in qualitative data and understanding the learning process on a holistic level. But of course, that doesn’t mean that you should forget about quantitative data either. Instead, you should always aim to validate ideas and insights derived from qualitative learning data through the use of quantitative data. How else would you know the impact of things at scale?

For instance, we think there’s a lot of value in employee discussions and sharing. These provide a great opportunity for organisations to source knowledge and transfer information between their layers. It often happens that employees in these learning discussions bring up their own best practices and work methods (qualitative) that even the L&D team is not aware of. However, to understand whether a practice can be applied across the organisation, we may need to run a survey or a poll to understand the magnitude of the idea (quantitative).
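As a tiny worked example of that quantification step, estimating how widespread a surfaced practice is from poll results (the numbers are hypothetical; a normal approximation is used for the interval):

```python
# A minimal sketch of quantifying a qualitative insight: a practice surfaced
# in learner discussions, then polled across the organisation.
import math

yes, n = 180, 600  # hypothetical poll: 180 of 600 already use the practice

p = yes / n
margin = 1.96 * math.sqrt(p * (1 - p) / n)  # 95% CI, normal approximation
print(f"Estimated adoption: {p:.0%} (95% CI: {p - margin:.0%} to {p + margin:.0%})")
```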

Final words

Overall, we believe that a lot of the traditional “LMS metrics” are quite useless for anything other than compliance purposes (and even for that, there are better ways…). To really deliver great learning experiences, organisations need to understand learning as a process and not strip it down to simple numbers of how many people took part. In essence, companies need to focus more on the quality of their data and the ability to quantify the impact of insights derived from qualitative data in learning.

This often requires technical capabilities, such as xAPI, but once again, buying technology is not enough. Rather, organisations have to understand the meaningful things to measure and cut through the noise. If your organisation needs help with that, or with crafting more data-driven learning strategies in general, we are happy to help. Just drop us a note here and tell us about your problem.

How to Use Social Analytics in Organisational Learning?

Nowadays, the HR and L&D functions of organisations are increasingly data-driven. Many employ analytics to aid in decision making processes and to try to analyse e.g. the effectiveness of learning initiatives. While there’s a lot of ways to use learning analytics, we found that organisations are underutilising a particular type of data. While digital learning platforms increasingly come with social features (walls, blogs, news feeds, etc.), not many are yet paying attention to how people use these social elements, and the potential implications for the organisation. Thus, here are three cases for using social analytics in organisational learning.

1. Measuring interactions between learners

If we want to understand learning on a holistic level, it’s important to also understand it granularly. Hence, one good use of social analytics is to analyse the interactions between learners. Some example data points for these interactions could be:

  • How many times was a particular piece of content or user submission liked/shared?
  • The number of comments that a post or a piece of content attracted
  • How often/for how long are users interacting with each other?

The first two examples above could help you to understand what kind of content works the best or sparks the most discussion. The latter one could help in understanding how people collaborate with each other.

2. Measuring the quality of interactions and organisational influence

Naturally, quantitative data only gets us so far, and it’s important to understand the quality of the “social” as well. Empty comments that don’t contribute to the discussion are not likely to create value. Hence, organisations could consider using semantic analysis, powered by NLP algorithms, to gauge “what” is being talked about, and whether the social discourse consists of real contributions or mere commenting. The benefits of semantic analysis are two-fold. It may, again, help you to spot problem areas in your content (e.g. when learners need to clarify concepts to each other). But perhaps more importantly, it can tell you who the “contributors” in your organisation are.

Also, it’s important to understand “who” is interacting and “how” they interact. This level of analysis could be helpful in determining organisational influence. Who are the individuals with networks across the organisation, liked by their peers, or helping everyone? These people might go unnoticed if not for the social analytics, yet they could be among the future leadership potential in the organisation. Even if not, there’s a good chance that they are local opinion leaders whom you could engage to execute your strategy in the future.
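To sketch the idea: interaction records (who replied to or commented on whom) form a graph, and standard centrality measures then surface the connectors. A minimal example using networkx, with a hypothetical edge list you would in practice derive from your platform’s comment and reply data:

```python
# A minimal sketch of spotting potential opinion leaders from interaction
# data. The edge list is hypothetical.
import networkx as nx

interactions = [
    ("amy", "ben"), ("amy", "cho"), ("amy", "dev"),
    ("ben", "cho"), ("eve", "amy"), ("dev", "eve"),
]

g = nx.Graph()
g.add_edges_from(interactions)

# Betweenness centrality highlights people who bridge otherwise separate
# groups -- often the cross-organisational connectors described above.
for person, score in sorted(
    nx.betweenness_centrality(g).items(), key=lambda kv: -kv[1]
):
    print(f"{person}: {score:.2f}")
```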

3. Sourcing ideas and innovation from the ground up

Finally, a potentially highly impactful application of social analytics is in sourcing information, ideas and innovation from within your own organisation. Often, the people doing a particular job have a lot of ideas on how to improve. It’s just that these ideas rarely reach the top, due to organisational layers, bureaucracy, culture etc. Could we help in that?

With social analytics, you could effectively set up a hidden knowledge collection tool. By analysing discussions, sharing and likes around content or user submissions, you could establish a direct flow of information from the line of duty all the way to the decision makers in the upper echelons of the organisation. The decision makers would see which work practices, ideas and methods gain the most traction, and then find ways of replicating them across the organisation. On a technical level, such flows are not hard to set up. Mostly, you just need quantitative data, or a combination of quantitative data and semantics, depending on the case.

Final words

All in all, there’s a lot of under-utilised value in social analytics for workplace learning and organisational development purposes. As learning is fundamentally a social experience, this data helps in understanding the learning that is taking place. So, as you’ll get deeper into the world of learning data, don’t just focus on the traditional metrics like course completions etc. A more social data set might provide much better insights. And if you need help in social learning or hidden knowledge collection, we can help. Just contact us here.

How to Leverage Data in Training Needs Analysis?

The training needs analysis is a real staple in the corporate L&D field. Everyone does it, yet the real value-add is ambiguous. The traditional ways of doing it are not employee-centric, which results in learning activities that are irrelevant and inconveniencing rather than enabling. While extensive use of data to support that analysis is clearly the best direction to take, organisations often don’t understand how. Thus, we wanted to explain a few different ways you could leverage data in your training needs analysis.

Using search data to understand what your learners really need

One of the biggest problems in training needs analysis is that the people doing it don’t often really talk to the end users of the training. And naturally, they don’t have the time either. While it would be nice to sit down for a 1-on-1 with each learner, often that’s neither practical nor feasible. But what if we could have the learners talk to us anyway? That’s where data collection comes in handy.

Monitoring e.g. what your employees search for during their work can be a really good indicator of the types of things they need to learn. As most workplace learning happens that way – employees searching for quick performance support resources – you should really aim to understand that behaviour. So, why don’t you start playing Google? You should already have the capability to track search history on company devices or within your learning systems. These searches are highly contextual, as they happen within the direct context of learning or work. It’s just a matter of compiling this data and using it to support your training decisions.
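Compiling that data can start very simply. A minimal sketch, assuming you can dump search queries one per line (the file format is hypothetical):

```python
# A minimal sketch of "playing Google": surface the most common searches
# from your own systems' search logs. The file format is hypothetical.
from collections import Counter

with open("search_log.txt") as f:  # one query per line, assumed
    queries = [line.strip().lower() for line in f if line.strip()]

print("Top candidate learning needs:")
for query, count in Counter(queries).most_common(10):
    print(f"  {count:4d}x  {query}")
```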

Using real-time learning data to identify organisational skill gaps

Another stream of data that you should be looking into when doing training needs analysis comes directly from the learning activities themselves. First of all, you should make sure that the learning data you collect is relevant and actually gives an accurate representation of learning. If you’re not yet using xAPI, start now. You’ll unlock a whole new level of analytical power.

Once you’ve got that covered, you should track that data across the board. This gives you access to individual-, group- and subject-matter-level insights. For subject matter (i.e. training topics), you’re better off tagging all your learning content appropriately. By having an up-to-date representation of which learning experiences relate to which topics or competencies, you enable quick glances into your organisation’s learning. For instance, a skills heat map might aggregate this tagging data and learning data to give you a visual representation of the areas in which your learners are lacking competence. Then, you can start drilling down to the group and individual levels to determine why some are succeeding and some are not. This helps you to craft better and much more personalised training activities and learning solutions.
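A minimal sketch of such a heat map, assuming tagged learning records can be exported with team and score columns (file and column names are hypothetical):

```python
# A minimal sketch of a skills heat map: average mastery per team and topic,
# built from tagged learning records. File and column names are hypothetical.
import pandas as pd

records = pd.read_csv("learning_records.csv")  # e.g. team, topic_tag, score

heatmap = pd.pivot_table(
    records, values="score", index="team", columns="topic_tag", aggfunc="mean"
)
print(heatmap.round(2))

# Low cells flag where to drill down to group and individual level.
```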

Using performance data to understand the business needs

Naturally, organisational learning should always support the business rather than inconvenience it. Therefore, it’s important to measure and understand performance. If you don’t keep track of performance, it’s impossible to measure real learning impact and consequently do effective training needs analysis. Performance data is everywhere, often scattered across the business in various systems and silos. Different departments might have their own data and some of it may be centralised. But whether it’s sales, marketing, customer facing staff, operations, finance or HR, the data is often there already. And it’s incredibly important to tap into this data, regardless of where it is.

However, one extremely important thing to note is not to use performance data in isolation. Rather, you should always compare it with your learning data. For instance, if looking at performance data alone, you might see that performance of department X is lacking. The easy answer would be to “assign” more training. However, looking at learning data could reveal that training has not solved the problem before and thus you should be looking at completely different solutions to it. Furthermore, you should always be careful in jumping to conclusions when linking learning to performance impact. Again, the L&D department might see performance improvement as a quick win, but a deeper cross-analysis with learning data could reveal that the performance improvement wasn’t actually caused by the training.

Final words

Overall, there are tremendous amounts and types of both learning- and non-learning data we can leverage in training needs analysis. The above provides just a few examples. With better analysis we can provide better learning experiences and positively impact business performance. To not leverage the vast amounts of data available to do that is simply foolish.

If you need help in crafting more data-driven learning strategies or adopting technology to do so, let’s talk. Just drop us a note here.

How to Use Learning Analytics? 3 Value-add Cases

As corporations become more data-driven in their decision making, learning & development has to follow suit. To make better decisions, you naturally need to collect a lot more learning data. But that alone isn’t enough. You also need the capabilities to analyse the data and understand what it means. While there’s a lot of ambiguity around corporate training analytics, and some intentionally try to make it sound extremely difficult, it doesn’t have to be. To clear up some of that ambiguity, here are 3 different use cases for learning analytics that are applicable to organisations of all sizes.

1. How to use learning analytics to increase engagement?

One of the bottleneck issues in corporate learning today is engagement. It’s not always an easy task to put out learning experiences that resonate with the learners and keep them engaged. Naturally, your content has to be of good quality, and you should likely use a fair bit of interactivity. But once all that is said and done, you should unleash the analytics.

Through learning content analytics, we can get a much better understanding of our users. We can see which pieces of content are used the most and the least. We can also get an understanding of ‘when’ and ‘where’ learners tend to drop off, which then enables us to start figuring out ‘why’. Furthermore, we can drill down to each interaction between the learner and content/instructors/other learners to really understand what is working and what is not. All of this (and a fair bit more!) enables us to constantly develop our learning experiences based on real information instead of gut feelings and opinions. And when we make our content more relevant and to the point, a lot of the engagement tends to come naturally.

2. How to use learning analytics to personalise learning experiences?

Our professional learners – the employees – come with various skills, degrees of experience, education and backgrounds. As they certainly don’t represent a one-size sample, we shouldn’t be putting them through a one-size-fits-all learning experience either. As organisations have understood this, the hype around personalised learning has grown significantly over the past few years. But it’s not just hype; there’s real value to personalisation that learning analytics can help us unlock.

First of all, learning analytics help us to understand the different individuals and groups of learners in our organisation. By being able to drill down all the way to an individual’s interactions, we can understand our learners’ needs and challenges much better. This enables us to cater to their various strengths, diverse learning histories and varying interests. Instead of providing a simple one-size-fits-all learning experience, we can use this information to design personalised learning paths for different groups, or even for individuals. These learning paths can branch out and reconnect based on difficulty of content, experience, current job and various other factors. The learning experience thus becomes a spider’s web instead of a straight line, and you’ll be able to catch many more of your learners.

3. How to use learning analytics to prove the impact of learning?

Proving the impact or the ROI of learning is something that L&D professionals often struggle with. One of the reasons for the struggle is not using learning analytics. For learning results in terms of knowledge acquisition, a data-driven approach beats the traditional multiple-choice testing or feedback forms by a long shot. Furthermore, it enables a much more formative way of assessment, thanks to all the data points collected and available.

But knowledge acquisition alone simply isn’t enough to demonstrate corporate learning impact. After all, what’s the learning good for if no one applies it? Thus, it’s imperative that we combine learning analytics with performance metrics and indicators. By doing this, we’ll get a lot closer to real learning results. E.g. how did the sales training affect the sales staff’s routines, behaviours and performance? How much of the risky behaviour did the compliance training help to eliminate? Is our training on team management actually resulting in teams being managed better? By enabling this level of analytics, you can answer a lot more questions. Furthermore, you can also start asking questions that you were not even aware of.
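As a simple illustration of combining the two data sets (column names are hypothetical, and remember that correlation alone isn’t proof of causation):

```python
# A minimal sketch of linking learning data to a performance indicator --
# e.g. sales-call conversion before vs. after training. File and column
# names are hypothetical; uplift differences are suggestive, not proof.
import pandas as pd

df = pd.read_csv("training_vs_performance.csv")
# e.g. employee_id, completed_training (0/1), conversion_before, conversion_after

df["uplift"] = df["conversion_after"] - df["conversion_before"]
print(df.groupby("completed_training")["uplift"].agg(["mean", "count"]))
```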

In our work, learning analytics and data-driven approaches play a big part. But while technology matters, there’s obviously more to it. For instance, you want to be sure that you’re setting your corporate learning objectives in a way that enables this. If you’re looking to move to more data-driven learning strategies or to understand your training impact better, we can probably help you. Just reach out to us here.

How to Set Better Corporate Learning Objectives?

When designing learning activities, one of the first things to consider is what you want to accomplish with the training. Without proper goals, you can’t really know what to measure, let alone demonstrate the effects of the learning. While all L&D departments probably do set goals, not all of the goals are very meaningful. Specifically, learning professionals tend to set knowledge-based goals (e.g. “after this training, the participant will know good practices of leadership in the workplace”). However, the accumulation of knowledge, while a noble goal, doesn’t really provide any value to the business. It’s the enactment of new, desired behaviours and change, i.e. implementing the learning on the job, that determines the value-add. Thus, to effectively demonstrate the value of learning in our organisations, we need to set our corporate learning objectives in another way. And here’s a 3-step process to do that.

1. Define the workplace behaviours that you want to affect with training

First, you need to determine the specific behaviours you’d like to affect through training. And really, it means getting specific (you’ll run into trouble in #2 if you don’t). To continue with the leadership example: “we want our managers to become better leaders”. Bad. “We want our managers to have more frequent conversations with their direct reports”. Better.

The behaviours will naturally vary by topic, and some are easier to drill down to than others. However, “loose” learning objectives masked as “performance objectives”, like in example #1, will turn out to be near impossible to measure.

2. Figure out what to measure and how. Don’t rely on self-reported data

If the first step is critical, the what and how of measurement is often where things go wrong in the context of corporate learning objectives. When trying to assess behavioural change (i.e. the impact of said learning) in organisations, there are two major mistakes that happen across the board.

First, not understanding what to measure. In similar fashion to setting the learning objectives, the ‘what’ is often too vague. If you’re doing sales training, measuring sales growth directly is too broad: you’re cutting a lot of corners and making dangerous assumptions. Sales may increase, but that may have no connection with the training. Rather, the effect could be due to the external environment, team relationships, incentives, seasonality, etc. Therefore, you need to drill down deeper. A proper level in, for example, sales training would be individual metrics, such as conversion ratios, time on calls, etc. These may or may not result in performance improvement, but that’s for you to find out without making ill-founded assumptions.

Second, the ‘how’ part of measurement is often lacking as well. If you really want to make an impact through better corporate learning objectives, it’s important to get this right. First, never rely on self-reported results. People lie, exaggerate, underestimate and aim to please, and even anonymity doesn’t remove the barriers to giving honest answers. Rather, you should always use hard data. If the data is not readily available through non-learning channels (e.g. HR systems, performance management systems, ERPs, CRMs etc.), find a way to capture the needed information.

3. Quantify your corporate learning objectives

The relieving thing is that once you really drill down on the specific behaviours and get objective data sources, quantifying your learning objectives becomes much easier. In areas like sales, finance, marketing or operations, that is naturally a lot easier. But even in the earlier leadership example, there’s quite a large difference between “we want our managers to be 50% better leaders” and “we want our managers to have 50% more conversations with their direct reports”. The first is impossible to measure accurately, hence the quantification is moot. The second can be measured e.g. through internal network analysis, communication meta-data and even calendar appointments.
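To illustrate tracking that second objective, here’s a minimal sketch over conversation counts derived from communication meta-data (the data source and column names are hypothetical):

```python
# A minimal sketch of tracking a quantified objective: % change in
# manager / direct-report conversations, counted from communication
# meta-data. File and column names are hypothetical.
import pandas as pd

convos = pd.read_csv("conversation_counts.csv")
# e.g. manager_id, period ("before" / "after"), conversations

pivot = convos.pivot_table(
    values="conversations", index="manager_id", columns="period", aggfunc="sum"
)
pivot["change"] = (pivot["after"] - pivot["before"]) / pivot["before"]
print(pivot["change"].map("{:+.0%}".format))

# Objective met where change >= +50%.
print(f"Managers at target: {(pivot['change'] >= 0.5).mean():.0%}")
```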

Furthermore, once you quantify the learning objectives, you’re setting a transparent set of expectations. Consequently, you’ll have a much easier job selling the idea to your management and subsequently reporting the results. Once we analyse things a bit more deeply, we can assign “dollar values” to the changes in workplace behaviour. The value of sales staff converting 10% more of their calls is real and tangible, and it’s easy to track whether the learning investment is paying off. When the behaviours become less tangible (e.g. that leadership practice), you should agree with the business heads on what the value of those behaviours is to the business. For things like learning company values, it might seem silly, but you should consider doing it nonetheless to enable transparency in assessment and reporting. Of course, as you probably haven’t measured learning this way before, it’s important to acknowledge that in the beginning. So don’t punish yourself if you don’t “hit the target” right away.

Final words

By using this simple 3-step approach to setting corporate learning objectives, understanding the link between learning, impact and performance becomes a lot less burdensome. On an important note, once you’ve put this in place, you really need to actually measure things and commit to using the data. Collecting the data and insights, even if done properly, is itself a bad investment if you or your management still resort to making assumptions rather than trusting hard facts.

If you need help in understanding your organisation’s learning on a deeper level or to develop a data-driven learning strategy, contact us. We’ll walk you through what it takes.
