Learning Technology Trends for 2019 – What’s Ahead?

During the past few years, we’ve witnessed an unprecedented speed of development in the learning technology space, and 2019 looks to be no different. At Learning Crafters, we are lucky to have an inside view into much of the development happening in the learning technology space, thanks to our work with some of the leading technology vendors. We therefore thought it would be worthwhile to share some of our thoughts, views and first-hand experiences on what’s ahead for the industry next year. Here are four key learning technology trends for 2019.

Learning Technology Trend #1: Big Data will deliver exponential impact in 2019

For the past few years, organisations have been adopting tools and technologies to capture, analyse and execute on business data. While the human resources function in general seems to be lagging slightly behind in that adoption, 2019 looks to be a big year for big data. For learning and development, the holy grail of learning data – the Experience API (xAPI) – has already been available for several years. While adoption of the xAPI standard has been slower than expected, any organisation claiming to do “learning analytics” today cannot remain credible without engaging with xAPI. The old, commonplace ways of capturing learning data (e.g. SCORM) are simply not powerful enough. As we move into data-driven decision making in the L&D space, big data capabilities are an absolute requirement – and they will be delivered with xAPI.
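
To make this concrete, here is a minimal sketch of what recording a learning event as an xAPI statement can look like. The LRS endpoint, credentials and course identifiers below are hypothetical, and the snippet only illustrates the actor–verb–object structure of a statement, not any particular vendor’s implementation.

```python
# A minimal sketch of posting a learning event to a Learning Record Store (LRS).
# The endpoint, credentials and identifiers are hypothetical.
import requests

LRS_URL = "https://lrs.example.com/xapi/statements"  # hypothetical LRS endpoint
AUTH = ("lrs_key", "lrs_secret")                     # hypothetical credentials

statement = {
    "actor": {"mbox": "mailto:jane.doe@example.com", "name": "Jane Doe"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding/module-3",
        "definition": {"name": {"en-US": "Onboarding Module 3"}},
    },
    "result": {"score": {"scaled": 0.85}, "completion": True},
}

response = requests.post(
    LRS_URL,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
print(response.status_code, response.text)
```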

Learning Technology Trend #2: Artificial Intelligence (AI) will undergo rapid developments

Naturally, in the era of machines, the xAPI learning data will not only be used for analytics. Rather, this type of behavioural data (comparable e.g. to Google Analytics) will be used to develop more advanced AI. Now, what is AI good for in the learning space? 

Currently, AI in learning is being used to build adaptive as well as personalised learning. Furthermore, the more advanced AI applications currently available are able to curate learning content based on the individual roles, needs and preferences of the learner. In 2019, we’ll definitely see major developments on both fronts. Additionally, we predict another AI application in learning analysis: the use of artificial intelligence to form insights into the link between learning and performance.
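
As a rough illustration of what role- and preference-based curation means in practice, the toy scoring sketch below ranks catalogue items by how well they match a learner’s role topics and preferred formats. Real AI-driven curation engines are far more sophisticated; every name, weight and data structure here is invented for the example.

```python
# A toy sketch of preference-based content curation: rank items by overlap with
# a learner's role topics plus a bonus for preferred formats. Illustrative only.
from dataclasses import dataclass

@dataclass
class ContentItem:
    title: str
    topics: set
    fmt: str  # e.g. "video", "article", "simulation"

@dataclass
class Learner:
    role_topics: set
    preferred_formats: set

def curation_score(item: ContentItem, learner: Learner) -> float:
    topic_overlap = len(item.topics & learner.role_topics)
    format_bonus = 0.5 if item.fmt in learner.preferred_formats else 0.0
    return topic_overlap + format_bonus

catalogue = [
    ContentItem("Negotiation basics", {"sales", "communication"}, "video"),
    ContentItem("GDPR refresher", {"compliance"}, "article"),
    ContentItem("Objection handling", {"sales"}, "simulation"),
]
learner = Learner(role_topics={"sales", "communication"}, preferred_formats={"video"})

for item in sorted(catalogue, key=lambda c: curation_score(c, learner), reverse=True):
    print(f"{curation_score(item, learner):.1f}  {item.title}")
```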

Learning Technology Trend #3: Virtual Reality (VR) will become more “commercial” 

If you’re a learning professional and didn’t hear about VR in 2018, it’s time to get out more! While a lot of the hype surrounding VR is arguably just that, hype, 2019 looks interesting. In addition to the industry developing an understanding of what VR is good for, we are likely to see some major enablers.

The first major problem with VR currently is the price tag. Arguably, building VR the way companies currently build it does not enable long-term adoption. Since VR is currently mostly developed with game engines, there are few possibilities for the non-tech-savvy to build content. If you look at e.g. how videos have grown to their current dominance, it’s because every single individual can produce them.

The second major problem with VR this year has been the lack of data capabilities. Without the ability to record big data from the VR experiences, organisations cannot possibly prove the investment worthwhile. While VR experiences are definitely a great gimmick, many organisations have vastly over-invested in them. However, there’s light at the end of the tunnel already in 2019. In fact, we are already seeing some of the first VR content editors emerge. These tools require no technical knowledge, game engines or programming, and they come with big data capabilities. Hence, they overcome both of the current major problems and are set for wider adoption.

Learning Technology Trend #4: Augmented Reality (AR) will redefine workflow learning 

While VR has been on everyone’s news feed, augmented reality has gone largely unnoticed in 2018. However, several companies both in- and outside of the learning field are developing their AR tools. With the current pipeline of technological development, AR is likely to have a major impact on bringing learning into the workflow. A lot of the initial impact will focus on the technical fields, such as engineering. 

For the first time in history, people will actually be able to learn without interruption to their work. This will happen with specialised AR headsets, which you can use to overlay learning content onto an additional layer of reality. The best tools will have voice control and come with remote capabilities. This enables e.g. trainers and experts to follow the learners and guide them through activities. Through a live connection, the trainers may influence the “reality” visible to the learner. Furthermore, the advanced headsets will likely incorporate cameras and tracking capabilities to capture great amounts of data. This data will be incredibly useful both for learning and for the business as a whole, as it enables a totally new level of recording work, understanding workflows and capturing the learning happening during them.

Now, the four technologies here represent only a part of the future of learning, but arguably they’re the most hyped. Later, we’ll look at some other technologies as well as emerging methodological trends in L&D. 

Is your organisation ready to take advantage of the upcoming technological developments in the learning space? If not, we’re happy to work with you in building that capability. Just contact us. 

Leveraging Learning Content Analytics for Better Learning Experiences


We published this article first on eLearning Industry, the largest online community of eLearning professionals. You may find the original article here.

An area where Learning and Development professionals could learn a lot from e.g. marketing experts is content analytics. Whereas marketing has embraced the need to constantly iterate and redevelop content based on real-time campaign analytics, learning professionals tend to take the easier route. Once an eLearning activity is produced and published, it’s easy to just leave it there and be done with it. But at that point the work is really only halfway done. How do you find out whether the content resonated with the audience or not? If it didn’t, how do you figure out which parts of the content are the problem? This is where learning content analytics come in handy.

Example Of Learning Content Analytics On A Training Video

When analysing the effectiveness of eLearning content, you should pay attention to what kind of metrics you are tracking. For instance, in the case of a training video, traditional metrics like how many times the video was opened don’t necessarily carry a lot of value. Instead, we should be looking at the content consumption behaviour on a wider scale, throughout the content and the learning journey. Let’s take a look at an analytical view of a training video.

[Figure: Learning content analytics on a training video. With learning content analytics, you can easily capture where your learners lose interest and drop off.]

In this example, you can see the users’ behaviour at various stages of the training video. As usual, you see a slump immediately in the beginning, followed by another bigger slump later on. We’ve coloured the 2 main points of interest to break them down.

1. Initial Attrition

You are always bound to lose some learners in the beginning due to a plethora of reasons. However, if you constantly see big drops starting from 0 seconds, you might want to double-check e.g. the loading times of the content to make sure your learners are not quitting because they cannot access the material in a timely manner.

2. Learning Content Engagement Failure

Going further in the video, we see another big slump where we lose around 40% of the remaining learners in just 30 seconds. Clearly, this represents a learning engagement failure. Something is not right there. Learners are likely dropping off because the content is not engaging, relevant or presented in an appealing way.
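
One simple way to surface such engagement failures automatically is to flag any interval where a large share of the remaining viewers drops off. The sketch below assumes you have exported the number of viewers still watching at each timestamp (for instance from xAPI video statements); the figures are illustrative and roughly mirror the example above.

```python
# A minimal sketch of spotting engagement failures from video retention data.
# The viewer counts are illustrative; real data would come from your analytics export.
viewers_at_second = {0: 200, 30: 170, 60: 165, 90: 160, 120: 95, 150: 90, 180: 88}

def find_dropoffs(retention: dict, threshold: float = 0.25):
    """Flag intervals where more than `threshold` of the remaining viewers leave."""
    times = sorted(retention)
    flagged = []
    for start, end in zip(times, times[1:]):
        remaining = retention[start]
        lost = remaining - retention[end]
        if remaining and lost / remaining > threshold:
            flagged.append((start, end, lost / remaining))
    return flagged

for start, end, share in find_dropoffs(viewers_at_second):
    print(f"{share:.0%} of remaining viewers dropped off between {start}s and {end}s")
```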

How Should I Incorporate Content Analytics In The eLearning Development Process?

The above-mentioned video analytics is just a single example of how you can use content analytics to support your learning. Ideally, you should be running these kinds of analytics across all your learning content. xAPI tracking capabilities open up a lot of possibilities in this regard. Once you’re collecting the data and running the analytics, this is how you could build the use of analytics into your eLearning development process:

  1. Develop an initial version of eLearning materials
  2. Roll it out to a test group of learners, monitor the analytics
  3. Identify potential learning engagement failures and re-iterate content accordingly
  4. Mass roll-out to a wider audience
  5. Revisit the content analytics at regular milestones (e.g. when a new group of learners is assigned the content) to ensure continued relevance and engagement

This type of approach helps to ensure that the learning activities you provide and invest money in perform at their best at all times.

How Can I Use Learning Content Analytics To Provide Better Learning Experiences?

By now, you’ve surely developed many use cases for content analytics. To summarise, here’s how you could provide a better learning experience through data-driven insights:

1. Identify The Types Of Content Your Learners Like

In the case of videos, you could benchmark the performance of different types of videos (e.g. talking heads, animations, storytelling videos) against each other and see what type of content keeps your learners engaged the best.

2. Develop Engaging Content

With the power of analytics, you’ll be able to develop better learning. You can find out immediately what works and what doesn’t – no need to run extensive surveys. The behaviour of the learners is the best feedback.

3. Personalise Learning Experiences

You can naturally run analytics for individuals and defined groups, in addition to the whole mass of learners. This helps you personalise the learning experiences according to e.g. skill levels, seniority, experience, previous learning history, etc.

All in all, learning content analytics provide a powerful tool for increased transparency and visibility into the performance of your eLearning. As learning moves towards more on-demand and just-in-time delivery, they help to ensure that you’re delivering the right content to the right audience.

Are you interested in developing more analytical, data-driven approaches to your L&D? Or want to know more about different content analytics possibilities? Just drop us a note, and we’ll get back to you. 

Digital Training Evaluation – Using the Kirkpatrick Model and Learning Analytics

Ask an L&D professional about how they measure training effectiveness and learning. The likely answer is that they are using the Kirkpatrick 4-level evaluation model. The model has been a staple in the L&D professionals’ toolbox for a long time. However, if you dig deeper, you’ll find that many organisations are only able to assess levels 1 & 2 of the model. While these levels do constitute valuable information, they help very little in determining the true ROI of learning. Luckily, thanks to technological development, we nowadays have the capability to do digital training evaluation on all 4 levels. And here are some best practices on how to do it.

Level 1: Reaction – Use quick feedback and rating tools to monitor engagement

The first level of Kirkpatrick is very easy to implement across all learning activities. You should use digital tools to collect quick feedback on all activities. That can be in the form of likes, star ratings, scoring or Likert scales. Three questions should be enough to cover the ground.

  1. How did you like the training?
  2. How do you consider the value-add of the training?
  3. Was the training relevant to your job?

Generally, scale- or ratings-based feedback works best for level 1. Verbal feedback requires too much effort to analyse effectively.

Level 2: Learning – Use digital training evaluation to get multiple data points

For level 2, it all starts with the learning objectives. Learning objectives should be very specific and tied to specific business outcomes (we’ll explain why in level 4). Once you have defined them, it’s relatively easy to build assessments around them. Naturally, we are measuring the increase in knowledge rather than just the knowledge itself. Therefore, it is vital to record at least two data points throughout the learning journey. A handy way to go about this is to design a pre-learning and a post-learning assessment. The former captures the knowledge and skill level of the employee before starting the training. Comparing that with the latter, we can comfortably identify the increase in knowledge. You can easily do this kind of assessment with interactive quizzes and short tests.

“If you’re measuring only once, it’s almost as good as not measuring at all”
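
As a minimal sketch, the increase in knowledge can be expressed as a normalised gain: the share of the possible improvement that each learner actually achieved between the pre- and post-assessment. The names and scores below are made up for illustration.

```python
# A small sketch of turning pre- and post-assessment scores into knowledge gain.
# Scores are percentages of correct answers; all values are illustrative.
pre_scores = {"alice": 45, "bob": 70, "carol": 55}    # before the training
post_scores = {"alice": 80, "bob": 78, "carol": 85}   # after the training

def knowledge_gain(pre: float, post: float) -> float:
    """Normalised gain: share of the possible improvement actually achieved."""
    if pre >= 100:
        return 0.0
    return (post - pre) / (100 - pre)

for learner in pre_scores:
    gain = knowledge_gain(pre_scores[learner], post_scores[learner])
    print(f"{learner}: {pre_scores[learner]} -> {post_scores[learner]} "
          f"(normalised gain {gain:.0%})")
```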

Level 3: Behaviour – Confirm behavioural change through data and analytics

Next, level 3, measuring behaviour, delves into somewhat uncharted territory. There are a couple of different angles for digital training evaluation here.

First, you could engage the learners in self-assessment. For the often highly biased self-assessment, two questions should be enough. If no behavioural change is reported, another question captures the reason behind it, and L&D can intervene accordingly.

  1. Have you applied the skills learnt? (linking to specific learning, can be a yes/no question)
  2. If not, why not?

Secondly, since self-assessment is often highly biased, it’s not necessarily meaningful to collect more data directly from the learners themselves. To really get factual insight into level 3, you should be using data and analytics. On the business level, we record a lot of data on a daily basis. Just think about all the information that is collected or fed into the systems we use daily. Thus, you should be using the data from these systems alongside the self-assessment to confirm the reported behavioural change. For instance, a salesperson could see an increase in calls made post-training. A marketing person could see an increase in the number of social media posts they put out. The organisation has all the necessary data already – it’s just a matter of tapping into it.
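
As a rough sketch of what this confirmation step can look like, the snippet below compares self-reported application of skills with call volumes taken from a hypothetical CRM export before and after a sales training. The threshold, names and numbers are purely illustrative.

```python
# Confirming self-reported behavioural change against operational data.
# All names, call counts and the 10% threshold are illustrative assumptions.
self_reported = {"alice": True, "bob": True, "carol": False}  # "I applied the skills"
calls_before = {"alice": 40, "bob": 38, "carol": 42}          # weekly average, pre-training
calls_after = {"alice": 55, "bob": 39, "carol": 41}           # weekly average, post-training

for learner, applied in self_reported.items():
    change = (calls_after[learner] - calls_before[learner]) / calls_before[learner]
    confirmed = applied and change > 0.10  # require a >10% lift to count as confirmed
    print(f"{learner}: self-reported={applied}, call volume change={change:+.0%}, "
          f"confirmed={confirmed}")
```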

Level 4: Results – Combining Learning Analytics and Business Analytics

Finally, the level 4 evaluation is the pot of gold for L&D professionals. This is where you link the learning to business performance and demonstrate the ROI through business impact. With modern ways of digital training evaluation, you can eliminate the guesswork and deliver facts.

Note that the evaluation steps are not standalone: level 4 is linked to levels 2 and 3. If there was no increase in knowledge, or no behavioural change happened, there is no business impact. You might see a positive change in results, but you should not mistake it for the product of learning if the previous levels have not checked out. But once levels 2 and 3 have come out positive, you can look at the bigger picture.

Firstly, you should look back at the learning objectives, especially the business outcomes they were tied to. If your aim with the sales training was to increase the number of calls made, it’s important to look at what happened in that specific metric. If you see a change, then you can look at the business outcomes. How much additional revenue would those extra sales calls have produced? The results can also be changes in production, costs, customer satisfaction, employee engagement etc. In any business, you should be able to assign a dollar value to most, if not all, of these metrics. Once you have the dollar value, it’s simple math to figure out the ROI.
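
For illustration, here is what that simple math can look like once the business change has been translated into dollars. Every figure below is made up; the point is only the shape of the calculation.

```python
# A back-of-the-envelope level 4 ROI calculation with illustrative numbers.
extra_calls_per_rep = 8          # weekly increase attributed to the training
revenue_per_call = 120.0         # average revenue value of one additional call
reps_trained = 25
weeks_measured = 12

benefit = extra_calls_per_rep * revenue_per_call * reps_trained * weeks_measured
training_cost = 60_000.0         # development + delivery + time off the job

roi = (benefit - training_cost) / training_cost
print(f"Benefit: ${benefit:,.0f}, Cost: ${training_cost:,.0f}, ROI: {roi:.0%}")
```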

All in all, there’s really no excuse for not dealing with levels 3 and 4 of Kirkpatrick. You can manage digital training evaluation and learning analytics even with a limited budget. It’s just a matter of embracing data and the benefits of data-driven decision making.

Want to start evaluating your learning on all levels? Click here to start.

Why xAPI is the Most Important Thing in the Future of Learning?

The new interoperability standard and specification for eLearning, the Experience API (xAPI), is replacing SCORM. In today’s mobile world, where learning happens all the time and data is everywhere, it was necessary to develop a framework for the future of digital learning. Project Tin Can achieved just that – and with the Experience API, it also set out to fulfil the dreams of L&D and HR professionals. The specification enables us to capture vast amounts of data previously unavailable, run powerful analytics and link learning to business performance. In fact, xAPI is so powerful that it will be the cornerstone that future learning is built on, and here are just a few reasons why.

If you’re not familiar with the Experience API, you can read more about it here.

xAPI enables us to track behaviours and interactions

Whereas SCORM enabled us to track test scores, completions and other basic factors, xAPI goes much deeper. With a concept similar to e.g. Google Analytics, xAPI tracks interactions. This means that we can record every single click, comment, learning interaction and activity. This gives an immensely rich picture of how learning happens in the organisation. For the first time, learning professionals really know whether learning content works or not, i.e. whether learners really use it. This makes content curation and decision making much easier. Learning professionals can also pinpoint the individuals or groups who require learning interventions. Furthermore, xAPI enables us to truly measure the ROI of learning in relation to all possible KPIs.
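
As a hedged sketch of what working with this interaction data can look like, the snippet below queries a hypothetical LRS for one learner’s statements and counts the interactions per verb. The endpoint and credentials are assumptions, while the query parameters and version header follow the xAPI specification.

```python
# Pulling statements back out of an LRS and counting interactions per verb.
# The LRS URL and credentials are hypothetical.
from collections import Counter
import requests

LRS_URL = "https://lrs.example.com/xapi/statements"  # hypothetical LRS endpoint
AUTH = ("lrs_key", "lrs_secret")                     # hypothetical credentials

response = requests.get(
    LRS_URL,
    params={"agent": '{"mbox": "mailto:jane.doe@example.com"}', "limit": 500},
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
statements = response.json().get("statements", [])

verb_counts = Counter(
    s["verb"].get("display", {}).get("en-US", s["verb"]["id"]) for s in statements
)
for verb, count in verb_counts.most_common():
    print(f"{verb}: {count}")
```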

xAPI can track all learning activities including informal and offline

Nowadays, learning is increasingly happening outside of the workplace and schools – outside monitored environments. For a long period of time, learning professionals have struggled to get a complete picture of the whole life-long learning journey of individuals. However, with xAPI that is possible. The technology can track all imaginable learning activities, whether they happen outside of the employer’s system or even completely offline. For instance, it can track websites and articles that employees read and engage with. Capturing learning data is no longer confined within the borders of the learning management system (LMS). Every single interaction anywhere can be communicated with xAPI to a learning records store, which acts as the database.

xAPI enables us to finally link learning and business performance

Finally, the greatest struggle of learning professionals has been identifying the business value of different learning activities. Establishing links between performance and learning has been a guessing game since the beginning. However, xAPI is here to change that too. We can use it to pull data from all systems (think ERPs, CRMs, HRMs, PMSs). With the right use of analytics, we can monitor business performance in all imaginable metrics and track it against the learning that is happening in the organisation. Thus, we can pinpoint whether employees in the organisation apply the learning, i.e. whether there is a behavioural change. Furthermore, we can confidently assign dollar values to these behavioural impacts, and hence to the learning activities as well. Measuring the ROI of learning goes from a guessing game to a data-driven science. You can be comfortable knowing that you’re getting the most out of your limited resources in L&D.

Do you already use xAPI for advanced learning insights? Do you want to finally link your learning to business performance? If yes, contact us, and let’s transform your learning together. 

Seamless Learning Tracking vs. Formal Assessments

Previously, we have had very limited tools for tracking corporate learning activities – namely, formal assessments. Formal assessments may have come in a variety of forms: tests, learning surveys, self-assessments, performance evaluations etc. The common denominator for these is that they are relatively hard to administer and produce vague results. Luckily, thanks to advances in learning technology, we can do all this much more efficiently. Let’s look at how we can use seamless learning tracking to make our learning assessments much more efficient.

First, let’s consider two traditional assessment tools used in corporate L&D: tests and learning evaluations.

Seamless Learning Tracking vs. Conventional Tests

The first problem with formal tests is that they often become the end instead of a means to an end. Tests should be a tool for learning, not the reason we learn. Tests are also quite dreadful to the learners: many are subject to stress and anxiety when taking examinations, and hence they are not performing at their best in terms of problem solving or creativity. Frankly, formal tests and scores are also subject to misinformation caused by cheating, etc.

By utilising modern tracking methods instead of archaic testing, we can extract a superior amount of information without subjecting our learners to these adverse effects. With modern digital learning environments, we can track everything the learner does. We can follow every individual learner’s journey: how many times did they watch a training video? How much time did they spend on topic X? How many times did they try a quiz before passing? When we have access to learning data like this, the old formal test scores become practically meaningless. Rather than assessing our learners bi-annually or at defined intervals, we can track them continuously. Hence, we can tackle problems in real time, rather than six months later.
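
As a small illustration of this kind of journey tracking, the sketch below aggregates raw tracking events (which could be derived from xAPI statements) into per-learner metrics such as video watch time and quiz attempts before passing. The event names and values are invented for the example.

```python
# Aggregating a learner's journey from raw tracking events. Illustrative data only.
from collections import defaultdict

events = [
    {"learner": "alice", "type": "video_watched", "topic": "GDPR", "seconds": 310},
    {"learner": "alice", "type": "quiz_attempted", "topic": "GDPR", "passed": False},
    {"learner": "alice", "type": "quiz_attempted", "topic": "GDPR", "passed": True},
    {"learner": "bob", "type": "video_watched", "topic": "GDPR", "seconds": 95},
    {"learner": "bob", "type": "quiz_attempted", "topic": "GDPR", "passed": True},
]

journeys = defaultdict(lambda: {"watch_seconds": 0, "quiz_attempts": 0, "passed": False})
for e in events:
    j = journeys[e["learner"]]
    if e["type"] == "video_watched":
        j["watch_seconds"] += e["seconds"]
    elif e["type"] == "quiz_attempted":
        j["quiz_attempts"] += 1
        j["passed"] = j["passed"] or e["passed"]

for learner, j in journeys.items():
    print(f"{learner}: {j['watch_seconds']}s of video, "
          f"{j['quiz_attempts']} quiz attempt(s), passed={j['passed']}")
```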

Seamless Learning Tracking vs. Learning Evaluations

Other things we could do more reliably with modern technology are feedback and learning evaluations.

For feedback, the fundamental problem is the one that all marketing research people know – people lie. In organisations, especially hierarchical ones, it’s often nearly impossible to extract genuine feedback. Instead of being truthful, the learners give mid-range positive feedback to avoid confrontation – perhaps for fear that even anonymous feedback may be traced back to them. And if we are not getting honest feedback, we’ll shoot ourselves in the foot trying to improve the learning. However, by enabling comprehensive learning tracking, we can let behaviour be our best feedback. We can accurately pinpoint the learning materials that are not engaging or effective and work to improve them.

For learning evaluations, we can pull information from a whole history of learning, rather than just a formal test here and there. This helps us to provide much more personalised feedback to the learners. Instead of focusing on the what (“you scored 88/100 in test X”), we can focus on the how (“you took twice as long to complete the training as your peers”), and most importantly, the why (“could another style of learning work better for you?”). This provides us a much more comprehensive view of our people, their skill-sets and capabilities than we could ever achieve with traditional, formal assessments.

Are you using advanced learning tracking methods in your organisation? Would you like to take a more data-driven approach to your HR? We can help, just leave us a note here.  

Learning Data – How to Derive Meaningful Insights?

As organisations move towards more data-driven decision making, we often find ourselves requiring more sophisticated data collection and analytics tools. For HR in general, we have become quite comfortable with deriving important insights through data analytics. However, for learning and development, these types of operations on the learning data are still in a bit of a grey zone.

As learning moves beyond “just another compliance exercise”, we find that tracking different streams of data can provide real value. These include insights into our employees’ skill levels, the effectiveness of our learning content and, ultimately, the learning ROI. Here are a few illustrations of how modern tracking technologies and data analytics can help you deliver better and more efficient learning.

Assessing employees’ skill levels and learning through data

Assessing our employees’ learning should probably be one of the most important parts of the whole learning architecture. However, many organisations generally fare poorly in this regard (to be fair, the technology has been around for only a few years). Previously, our tracking of completed learning activities has relied on very narrow streams of data: the user’s click on the “mark as complete” button and/or user input to formal assessments/tests.

First of all, the fact that the user has marked the learning module complete has very little value other than compliance. These types of tick-box exercises tell nothing about the way the learning content was consumed (if at all!). Thus, they provide little to no insight into whether learning actually happens.

Moreover, the formal assessments or tests are not much better. Sure, they again help us to fulfil the compliance requirements. However, the problem with formal tests is that they are quite dreadful to the learner and can effectively assess only a limited part of learning. They do give us a glimpse into how well the learner knows the theory (or how well Google does…). However, they tell us nothing about whether the learning carries through to their jobs, resulting in a behavioural change.

What type of data should we collect and utilise instead?

Instead of this kind of tick-the-box data, we should be collecting and leveraging more qualitative data. How was the content interacted with? In which order were the learning activities completed? How long did it take to complete the learning, and how was that time divided between the different sub-activities?

By collecting data to respond to these types of queries, we are producing actionable insights. For example, a learner with lower time-to-complete more likely had higher skills and confidence in their ability than someone who took longer. Similarly, learners who start tackling the difficult content first are more likely to possess advanced skills.

Using data analytics to measure the effectiveness of learning content

Nowadays, learning content plays a major part in the whole learning experience. As we are investing time and money in providing better content, we sure want to keep track of what kind of results we are getting in return.

Previously, we would track whether a piece of content was viewed/consumed/marked complete. But again, these kinds of metrics really don’t provide much value to us. If we commit resources to the production of e.g. training videos, we surely prefer a much more detailed view of what’s happening. In the case of videos, we would want to track how long our videos are being watched. If the learners only watch our fancy and expensive video halfway through, that sure seems like money wasted. We would also want to track how learners proceed along the video timeline. For instance, if many of the learners seem to be jumping back and forth multiple times, our video might have failed to communicate the key messages clearly enough.

Using this kind of simple analytics can be a tremendous help in justifying the investments in learning. It’s much easier to get buy-in from the top of the organisation once you can show quality insights into the performance of the content rather than just guesswork and gut feelings. This also helps us to analyse the Return On Investment of learning. However, a good measure of ROI should also incorporate metrics from the operational side of the business.

Using learning data to determine the Learning ROI

As mentioned, relying only on learning data to determine the learning ROI gets us only so far. As it is, we are rarely learning just for the sake of learning. Rather, we are providing our employees with learning experiences in the hope that they translate into better business results. Therefore, it is equally important that we try to measure the benefits to the business itself, rather than just the fun/participation/liked/etc. index.

By pooling the learning data we collect with other sets of data from the operational side, we can start to assess how well the learning translates to the learners’ daily jobs and how the skills develop. Let’s take sales training as an example, as it is an area where business benefits are easily demonstrable.

Example: combining CRM and learning data for better insights

To assess the effect of learning on sales performance, we could look at results from the sales training and plot them against several sales metrics, such as the number of calls made or conversion rates. This kind of data should be easy to extract from the company’s CRM software. When you see an increase in any of the metrics that correlates with the training you have given and the learning data, you have likely managed to produce positive results. It’s also easy to dig deeper and look at cross-performance on both the group and the individual level.
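
As a minimal sketch, the snippet below merges a hypothetical learning data export with a hypothetical CRM export by employee ID and checks how post-training scores correlate with changes in the sales metrics. The column names and figures are illustrative only.

```python
# Joining learning data with CRM data and checking the correlation. Illustrative data.
import pandas as pd

learning = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "post_training_score": [82, 55, 91, 67],
})
crm = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "calls_change_pct": [18, 2, 25, 6],      # change in weekly calls after training
    "conversion_change_pct": [4, -1, 6, 1],  # change in conversion rate after training
})

merged = learning.merge(crm, on="employee_id")
metrics = merged[["post_training_score", "calls_change_pct", "conversion_change_pct"]]
print(metrics.corr()["post_training_score"])
```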

Going a bit deeper, we can also identify the individuals who have benefited the most from learning. Could you perhaps gain even more in performance by giving these individuals additional, targeted learning? Similarly, you can identify the individuals who are lagging in their KPIs and have just ticked the boxes with their learning. It’s time to explain to these individuals that it’s simply not enough to mark the e-learnings completed after a brief glance.

Additionally, you’ll also be able to pinpoint the types of content which seem to be driving the increase in performance. Just like your learners, you can track every piece of content individually. If a certain type of content, e.g. animations or simulations, seems to be the most effective, start using more of it!

Finally, there are so many streams of data that the applications are practically endless. However, you should first pay attention to whether you are collecting meaningful data or not. And then, secondly, think about how to deliver better insights that benefit the overall business.

Are you looking to implement a more data driven approach to learning and development? Would you like to be able to drive business performance through learning and be able to show proof for it? We are happy to help you with your learning data, just drop us a note here.
