Learning Technology Trends for 2019 – What’s Ahead?

During the past few years, we’ve witnessed an unprecedented speed of development in the learning technology space, and 2019 looks to be no different. At Learning Crafters, we are lucky to have an inside view into much of the development happening in this space, thanks to our work with some of the leading technology vendors. We thought it would be worthwhile to share some of our thoughts, views and first-hand experiences on what’s ahead for the industry next year. So here are four key learning technology trends for 2019.

Learning Technology Trend #1: Big Data will deliver exponential impact in 2019

For the past few years, organisations have been adopting tools and technologies to capture, analyse and act on business data. While the human resources function in general seems to be lagging slightly behind in that adoption, 2019 looks to be a big year for big data. For learning and development, the holy grail of learning data – the Experience API (xAPI) – has already been available for several years. While adoption of the xAPI standard has been slower than expected, any organisation claiming to do “learning analytics” today cannot remain credible without involving xAPI. The old, commonplace ways of capturing learning data (e.g. SCORM) are simply not powerful enough. As we move into data-driven decision making in the L&D space, big data capabilities are an absolute requirement – and they will be delivered with xAPI.
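To make the data format concrete, here is a minimal sketch of an xAPI statement built in Python. The actor–verb–object structure and the “completed” verb URI come from the xAPI specification; the learner, activity URL and score values are purely illustrative placeholders.

```python
import json

# Minimal xAPI statement: the actor-verb-object triple defined by the spec.
# The verb URI is a standard ADL verb; the learner, activity id and score
# below are purely illustrative placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/onboarding-101",
        "definition": {"name": {"en-US": "Onboarding 101"}},
    },
    "result": {"completion": True, "score": {"scaled": 0.85}},
}

# In production this JSON would be POSTed to a Learning Record Store (LRS);
# here we only serialise it to show the shape of the data.
print(json.dumps(statement, indent=2))
```

Because every statement follows this same free-form but consistent triple, xAPI can describe far richer behaviour (videos watched, simulations attempted, on-the-job activities) than a SCORM completion status ever could.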

Learning Technology Trend #2: Artificial Intelligence (AI) will undergo rapid developments

Naturally, in the era of machines, the xAPI learning data will not only be used for analytics. Rather, this type of behavioural data (comparable e.g. to Google Analytics) will be used to develop more advanced AI. Now, what is AI good for in the learning space? 

Currently, AI in learning is being used to build adaptive as well as personalised learning. Furthermore, the more advanced AI applications currently available are able to curate learning content based on the individual roles, needs and preferences of the learner. In 2019, we’ll definitely see major developments on both fronts. Additionally, we predict another AI application in learning analysis: the use of artificial intelligence to form insights into the link between learning and performance.

Learning Technology Trend #3: Virtual Reality (VR) will become more “commercial” 

If you’re a learning professional and didn’t hear about VR in 2018, it’s time to get out more! While a lot of the hype surrounding VR is arguably just that – hype – 2019 looks interesting. In addition to the industry developing an understanding of what VR is good for, we are likely to see some major enablers.

The first major problem with VR currently is the price tag. Arguably, building VR the way companies currently build it does not enable long-term adoption. Since VR is currently mostly developed with game engines, there are few possibilities for the non-tech-savvy to build content. If you look at, e.g., how videos have grown to their current dominance, it’s because every single individual can produce them.

The second major problem with VR this year has been the lack of data capabilities. Without the ability to record big data from VR experiences, organisations cannot possibly prove the investment worthwhile. While VR experiences are definitely a great gimmick, many organisations have vastly over-invested in them. However, there’s light at the end of the tunnel already in 2019. In fact, we are already seeing some of the first VR content editors emerge. These tools require no technical knowledge, game engines or programming, and they come with big data capabilities. Hence, they overcome both of the current major problems and are set for wider adoption.

Learning Technology Trend #4: Augmented Reality (AR) will redefine workflow learning 

While VR has been on everyone’s news feed, augmented reality has gone largely unnoticed in 2018. However, several companies both in- and outside of the learning field are developing their AR tools. With the current pipeline of technological development, AR is likely to have a major impact on bringing learning into the workflow. A lot of the initial impact will focus on the technical fields, such as engineering. 

For the first time in history, people will actually be able to learn without interruption to work. This will happen with specialised AR headsets, which overlay learning content onto an additional layer of reality. The best tools will have voice control and come with remote capabilities. This enables, e.g., trainers and experts to follow the learners and guide them through activities. Through a live connection, the trainers may influence the “reality” visible to the learner. Furthermore, the advanced headsets will likely incorporate cameras and tracking capabilities to capture great amounts of data. This data will be incredibly useful both for learning and for the business as a whole, as it enables a totally new level of recording work, understanding workflows and the learning happening during them.

Now, the four technologies here represent only a part of the future of learning, but arguably they’re the most hyped. Later, we’ll look at some other technologies as well as emerging methodological trends in L&D. 

Is your organisation ready to take advantage of the upcoming technological developments in the learning space? If not, we’re happy to work with you in building that capability. Just contact us. 

Rapid Learning Interventions – Removing & Redesigning Process Bottlenecks

As the business and corporate landscape changes faster than ever, learning and development faces a difficult time. Skills evolve at such a fast pace that predictions into the future are no longer meaningful. Business models are also becoming more complex, and employees seem to have many more responsibilities than before. Delivering effective corporate learning to stay on top of the change is not easy. However, organisations could help themselves by eliminating some of the traditional barriers in the learning delivery process. Here are two common and often major bottlenecks that hinder rapid learning interventions, and how to get rid of them.

Rethinking training needs analysis

One of the major bottlenecks in the corporate learning design process is the training needs analysis. The process itself is often too infrequent to respond to rapidly evolving needs. It’s also often reactive rather than proactive. Finally, the common top-down approach – in which the L&D department assumes it can grasp the complexity of every role in the organisation – is outright infeasible.

The assumption that learning professionals know best when it comes to training needs analysis is causing more harm than good. In fact, the people actually doing the day-to-day jobs often have much better visibility. Thus, learning professionals should leverage that visibility by polling for needs and crowdsourcing ideas. Further, to respond faster and enable rapid learning interventions, organisations need to go real-time. Learning data analytics can provide real-time insights into the skill gaps, competencies and training needs in the organisation. No more guesswork and arbitrary evaluation intervals: the company can see the learning as it is happening.

Redesigning the learning design process

In its current state, the learning design process seems to be a bit broken as well. In our experience, the lead times for developing learning materials can extend to 6 months or even a year in some organisations. A lot of this is attributable to the traditional, tedious development processes of learning. Initially, rapid eLearning tools emerged to combat this problem. However, even they still require quite long lead times. While everyone would like to develop a perfect product, that’s most likely not going to happen. Hence, it probably makes sense to enable rapid learning interventions through more agile learning design.

Rather than perfecting and fine tuning the learning product for ages, you should start audience exposure already at the “minimum viable product (MVP)” phase. By involving employees and subject matter experts through a more user-centric design process, you can collect timely feedback and improve gradually. A more collaborative approach has the added benefit of potentially greater impact, as the involvement of the different stakeholders results in more personalised learning.

Actually, does L&D even need to be the one designing content?

There are two things we’ve noticed with the emergence of the online economy. One, anyone can create content for global audiences. Two, there are endless amounts of content publicly available on the internet. Wouldn’t it make sense to leverage these?

How about enabling rapid learning interventions by flipping the paradigm altogether? Since your employees are the best subject matter experts anyway, why not have them create learning content for each other? Or how about leveraging what’s already out there and replacing time-consuming design with learning content curation? There are a lot of tools out there to power up these types of approaches and further customise learning. Here’s an example for curating interactive microlearning videos.

Does your organisation face challenges in deploying rapid learning interventions and responding to business needs? We may be able to help. Just drop us a note here.

Leveraging Learning Content Analytics for Better Learning Experiences

We published this article first on eLearning Industry, the largest online community of eLearning professionals. You may find the original article here.

An area where Learning and Development professionals could learn a lot from, e.g., marketing experts is content analytics. Whereas marketing has embraced the need to constantly iterate and redevelop content based on real-time campaign analytics, learning professionals tend to take the easier route. Once an eLearning activity is produced and published, it’s easy to just leave it there and be done with it. But the work is really only halfway done. How do you find out whether the content resonated with the audience or not? If it didn’t, how do you figure out the problem areas within the content? This is where learning content analytics come in handy.

Example Of Learning Content Analytics On A Training Video

When analysing the effectiveness of eLearning content, you should pay attention to what kind of metrics you are tracking. For instance, in the case of a training video, traditional metrics like how many times the video was opened don’t necessarily carry a lot of value. Instead, we should be looking at the content consumption behaviour on a wider scale, throughout the content and the learning journey. Let’s take a look at an analytical view of a training video.

With learning content analytics, you can easily capture where your learners lose interest and drop off.

In this example, you can see the users’ behaviour at various stages of the training video. As usual, you see a slump immediately in the beginning, followed by another bigger slump later on. We’ve coloured the 2 main points of interest to break them down.

1. Initial Attrition

You are always bound to lose some learners in the beginning due to a plethora of reasons. However, if you constantly see big drops starting from 0 seconds, you might want to double-check, e.g. the loading times of the content, to make sure your learners are not quitting because of inability to access the material in a timely manner.

2. Learning Content Engagement Failure

Going further into the video, we see another big slump where we lose around 40% of the remaining learners in just 30 seconds. Clearly, this represents a learning engagement failure. Something is not right there. Learners are likely dropping off because the content is not engaging, relevant or presented in an appealing way.
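The drop-off analysis described above can be sketched in a few lines of Python. Assuming we have per-segment viewer counts exported from an analytics tool (the numbers below are made up to mirror the curve in the example: an initial slump, then a much bigger one), we can locate the steepest slump automatically:

```python
# Hypothetical viewer counts per 30-second segment of a training video,
# made up to mirror the curve described above.
viewers = [100, 82, 80, 79, 78, 77, 76, 45, 44, 43]

def steepest_drop(counts):
    """Return (segment_index, fraction_of_remaining_viewers_lost) for the
    segment transition with the biggest relative drop-off."""
    worst_i, worst = 0, 0.0
    for i in range(1, len(counts)):
        if counts[i - 1] == 0:
            continue  # nobody left to lose
        drop = (counts[i - 1] - counts[i]) / counts[i - 1]
        if drop > worst:
            worst_i, worst = i, drop
    return worst_i, worst

segment, drop = steepest_drop(viewers)
print(f"Steepest drop at segment {segment}: {drop:.0%} of remaining viewers lost")
# → Steepest drop at segment 7: 41% of remaining viewers lost
```

Measuring the drop relative to the *remaining* audience, rather than the original one, is what flags the mid-video engagement failure rather than the expected initial attrition.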

How Should I Incorporate Content Analytics In The eLearning Development Process?

The above-mentioned video analytics is just a single example of how you can use content analytics to support your learning. Ideally, you should be running these kinds of analytics across all your learning content. xAPI tracking capabilities offer a lot of possibilities in this regard. Once you’re collecting the data and running the analytics, here is how you could build the use of analytics into your eLearning development process:

  1. Develop an initial version of eLearning materials
  2. Roll it out to a test group of learners, monitor the analytics
  3. Identify potential learning engagement failures and re-iterate content accordingly
  4. Mass roll-out to a wider audience
  5. Revisit the content analytics at regular milestones (e.g. when a new group of learners is assigned the content) to ensure continued relevance and engagement

This type of approach helps to ensure that the learning activities you provide and invest money in perform at their best at all times.

How Can I Use Learning Content Analytics To Provide Better Learning Experiences?

By now, you’ve surely developed many use cases for content analytics. To summarise, here’s how you could provide a better learning experience through data-driven insights:

1. Identify The Types Of Content Your Learners Like

In the case of videos, you could benchmark the performance of different types of videos (e.g. talking heads, animations, storytelling videos) against each other and see what type of content keeps your learners engaged the best.

2. Develop Engaging Content

With the power of analytics, you’ll be able to develop better learning. You can find out immediately what works and what doesn’t. No need to run extensive surveys – the behaviour of the learners is the best feedback.

3. Personalise Learning Experiences

You can naturally run analytics for individuals and defined groups, in addition to the whole mass of learners. This helps you personalise the learning experiences according to e.g. skill levels, seniority, experience, previous learning history, etc.

All in all, learning content analytics provide a powerful tool for increased transparency and visibility into the performance of your eLearning. As learning moves toward more on-demand and just-in-time delivery, they help to ensure that you’re delivering the right content to the right audience.

Are you interested in developing more analytical, data-driven approaches to your L&D? Or want to know more about different content analytics possibilities? Just drop us a note, and we’ll get back to you. 

User-Centred Learning Design – Using the 5Di Model for Learning Activity Development

A few weeks back, we touched on the topic of delivering engaging experiences with learner-centric design. While that article covered some general principles of user-centred learning design, we wanted to further introduce you to an actual design framework. Naturally, we picked a framework that we’ve adopted and keep adapting at Learning Crafters, called 5Di. The 5Di is not something we’ve developed ourselves; rather, it was spearheaded by Nick Shackleton-Jones. We recognised the value-add in the approach and have since adapted it into our learning design process. So what’s the 5Di all about?

The 5Di User-centred learning design model

The model outlines a six-step learning design process: the five Ds and the I.

  1. Define
  2. Discover
  3. Design
  4. Develop
  5. Deploy
  6. Improve

And here’s a rundown of the activities within each part of the process.

1. Define

As with any project, user-centred learning design should start with identifying the problem. It’s important to partner with the business to define the desired outcomes. The desired outcomes should be based on results, not learning objectives per se. After all, you’re developing learning to achieve business impact. However, don’t be too confined to a familiar set of solutions during the definition – a course, or even training, is not always the right answer.

2. Discover

Then, partner with the assumed audience of the learning to gain a deeper understanding of the business problem. Involve subject-matter experts to identify the behaviour required and the barriers to improved performance. It’s very difficult to translate learning into behaviour later on if you don’t take the time to understand the line of business initially.

3. Design

Next, develop a formulated approach to solving the learning problem and document it for presentation to the decision-maker. Develop scripts, wireframes or storyboards outlining the approach. A good wireframe helps to divide up tasks later on, enabling quicker and more agile development.

4. Develop

Next, develop a Minimum Viable Product (MVP) to get user and stakeholder feedback on. Reiterate and refine the learning design accordingly. Test the “product” for usability, interoperability with existing systems, etc. And remember, collecting feedback is paramount. If you don’t focus on gathering user feedback, the whole concept of an MVP renders itself obsolete. Furthermore, it’s important that designers continue to partner with subject-matter experts to guarantee a truly user-centred learning design.

5. Deploy

Roll out the learning activity to the users while drumming it up with communications and marketing through the channels available to you. Good communication is needed for a successful learning activity, so you should treat the roll-out as a marketing campaign. A single informative email is not enough. Rather, you should build momentum over time and involve user feedback, referrals and success stories where possible. In business units, it also often pays to get line managers to recommend the learning activities to their teams.

6. Improve

Finally, we arrive at the most important step! The learning development process doesn’t stop even after learners have completed the course. Rather, you should keep monitoring content performance and user engagement levels and make improvements accordingly. A learning-data-driven approach is well suited to this, and xAPI capabilities help tremendously in analysing engagement. Remember, it’s not only subject-matter refinement you should focus on! The delivery and user experience are often more important.

That’s 5Di, a user-centred learning design approach, in a nutshell. With this agile method, we’ve been able to significantly reduce our learning development times, and the results have been much better in terms of measurability, user experience and learning outcomes.

Are you using 5Di or a similar learning design approach? If you’d like to implement a more agile learning development approach with your learning designers, we can help you. Just drop us a note.

Digital Training Evaluation – Using the Kirkpatrick Model and Learning Analytics

Ask an L&D professional how they measure training effectiveness and learning. The likely answer is that they are using the Kirkpatrick 4-level evaluation model. The model has been a staple in the L&D professional’s toolbox for a long time. However, if you dig deeper, you’ll find that many organisations are only able to assess levels 1 and 2 of the model. While these levels do constitute valuable information, they help very little in determining the true ROI of learning. Luckily, thanks to technological development, we nowadays have the capability to do digital training evaluation on all 4 levels. Here are some best practices on how to do it.

Level 1: Reaction – Use quick feedback and rating tools to monitor engagement

The first level of Kirkpatrick is very easy to implement across all learning activities. You should use digital tools to collect quick feedback on all activities. That can be in the form of likes, star ratings, scoring or Likert scales. Three questions should be enough to cover the ground.

  1. How did you like the training?
  2. How do you consider the value-add of the training?
  3. Was the training relevant to your job?

Generally, scale- or ratings-based feedback is best for level 1. Verbal feedback requires too much effort to analyse effectively.

Level 2: Learning – Use digital training evaluation to get multiple data points

For level 2, it all starts with the learning objectives. Learning objectives should be very specific and tied to specific business outcomes (we’ll explain why in level 4). Once you have defined them, it’s relatively easy to build assessment around them. Naturally, we are measuring the increase in knowledge rather than just the knowledge. Therefore, it is vital to record at least 2 data points throughout the learning journey. A handy way to go about this is to design pre-learning and post-learning assessments. The former captures the knowledge and skill level of the employee before starting the training. Comparing that with the latter, we can comfortably identify the increase in knowledge. You can easily do this kind of assessment with interactive quizzes and short tests.

“If you’re measuring only once, it’s almost as good as not measuring at all”
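As a sketch of how the two data points can be compared, here is a small Python example. The learner names and scores are hypothetical; alongside the raw gain, it also computes a normalised gain (gain relative to each learner’s remaining headroom), one common way to make pre/post comparisons fairer across skill levels.

```python
# Hypothetical pre- and post-assessment scores (0-100) for three learners.
pre_scores = {"alice": 55, "bob": 40, "carol": 70}
post_scores = {"alice": 80, "bob": 72, "carol": 85}

def knowledge_gain(pre, post, max_score=100):
    """Average absolute gain, plus the average normalised gain
    (gain divided by how much headroom each learner had)."""
    gains, norm = [], []
    for learner, before in pre.items():
        after = post[learner]
        gains.append(after - before)
        if before < max_score:  # avoid dividing by zero headroom
            norm.append((after - before) / (max_score - before))
    return sum(gains) / len(gains), sum(norm) / len(norm)

avg_gain, avg_norm = knowledge_gain(pre_scores, post_scores)
print(f"Average gain: {avg_gain:.1f} points; normalised gain: {avg_norm:.0%}")
# → Average gain: 24.0 points; normalised gain: 53%
```

The same comparison works per learner, per team or per course, which is exactly the level 2 evidence that the later levels build on.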

Level 3: Behaviour – Confirm behavioural change through data and analytics

Level 3, measuring behaviour, delves into somewhat uncharted territory. There are a couple of different angles for digital training evaluation here.

First, you could engage the learners in self-assessment. For the often highly biased self-assessment, two questions should be enough. If no behavioural change is reported, the second question captures the reason behind it, and L&D can intervene accordingly.

  1. Have you applied the skills learnt? (linking to specific learning, can be a yes/no question)
  2. If not, why not?

Secondly, since self-assessment is often highly biased, it’s not necessarily meaningful to collect more data directly from the learners themselves. To really get factual insight into level 3, you should be using data and analytics. On the business level, we record a lot of data on a daily basis. Just think about all the information that is collected by or fed into the systems we use daily. Thus, you should combine the data from these systems with the self-assessment to get confirmed insight into the reported behavioural change. For instance, a salesperson could see an increase in calls made post-training. A marketing person could see an increase in the number of social media posts they put out. The organisation has all the necessary data already – it’s just a matter of tapping into it.

Level 4: Results – Combining Learning Analytics and Business Analytics

Finally, level 4 evaluation is the pot of gold for L&D professionals. This is where you link the learning to business performance and demonstrate the ROI through business impact. With modern ways of digital training evaluation, you can eliminate the guesswork and deliver facts.

It is highly important to understand that the evaluation steps are not standalone. Level 4 is linked to levels 2 and 3. If there was no increase in knowledge, or behavioural change did not happen, there is no business impact. You might see a positive change in results, but you should not mistake it for a product of learning if the previous levels have not checked out. Once levels 2 and 3 have come out positive, you can look at the bigger picture.

First, you should look back at the learning objectives, especially the business outcomes they were tied to. If your aim with the sales training was to increase the number of calls made, it’s important to look at what happened in that specific metric. If you see a change, you can then look at the business outcomes. How much additional revenue would those extra sales calls have produced? The results can also be changes in production, costs, customer satisfaction, employee engagement, etc. In any business, you should be able to assign a dollar value to most, if not all, of these metrics. Once you have the dollar value, it’s simple math to figure out the ROI.
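The “simple math” can be sketched as follows. All figures here are hypothetical and would in practice come from your own CRM and finance systems; the point is the calculation, not the numbers.

```python
# All figures are hypothetical; in practice they would come from your
# business systems once levels 2 and 3 have checked out.
extra_calls_per_rep_per_week = 3   # observed post-training increase (level 3)
revenue_per_call = 120.0           # average revenue attributed to one call
reps_trained = 25
weeks_measured = 12
training_cost = 40_000.0

# Dollar value of the behavioural change over the measurement period.
benefit = (extra_calls_per_rep_per_week * revenue_per_call
           * reps_trained * weeks_measured)

# Classic ROI formula: (benefit - cost) / cost.
roi = (benefit - training_cost) / training_cost

print(f"Benefit: ${benefit:,.0f}; ROI: {roi:.0%}")
# → Benefit: $108,000; ROI: 170%
```

The same structure applies whether the metric is calls, production volume, cost savings or customer satisfaction, as long as you can assign a dollar value to the change.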

All in all, there’s really no excuse for not dealing with levels 3 and 4 of Kirkpatrick. You can manage digital training evaluation and learning analytics even on a limited budget. It’s just a matter of embracing data and the benefits of data-driven decision making.

Want to start evaluating your learning on all levels? Click here to start.