If you’ve had experience with any of the various learning management systems and platforms, particularly as an admin, you have most likely come across various types of learner data: content completion rates, assessment pass rates, and user adoption rates, to name a few. Accessing and compiling these data is easy enough, but knowing what to do with them isn’t as straightforward. In this post, we’ll discuss these common types of learning data and the actionable steps you can take to improve your users’ learning outcomes.
Content Completion Rate
We’ll start by talking about the Content Completion Rate. This generally refers to one of two things:
- The proportion of users who have completed a particular piece of content
- The progress users have made in specific pieces of content
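Both readings can be computed from a simple export of per-user progress records. Here’s a minimal sketch; the record fields (`user`, `content`, `progress`) are illustrative, not taken from any specific LMS:

```python
# Hypothetical per-user progress export for one piece of content.
records = [
    {"user": "ana",  "content": "onboarding-101", "progress": 1.0},
    {"user": "ben",  "content": "onboarding-101", "progress": 0.4},
    {"user": "cara", "content": "onboarding-101", "progress": 1.0},
]

# Reading 1: proportion of users who have fully completed the content.
completed = sum(1 for r in records if r["progress"] >= 1.0)
completion_rate = completed / len(records)

# Reading 2: average progress users have made in the content.
average_progress = sum(r["progress"] for r in records) / len(records)

print(f"Completion rate: {completion_rate:.0%}")    # 67%
print(f"Average progress: {average_progress:.0%}")  # 80%
```

Which reading your platform reports matters: the first tells you how many users finished, the second how far the average user got, and the two can diverge sharply.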
A high content completion rate is usually positive and hints that your users are engaged with the content on your platform. However, it could also just mean that users are completing content you’ve made compulsory.
On the other hand, a poor content completion rate is a clear indication that something isn’t working out. That said, the reasons for the low rates aren’t always clear. It’s important to assess the factors that could be at play, as this helps you address the issue’s root cause.
One reason could simply be that the content is new and users have not yet had the chance to look at it. In that instance, it’s probably best to give users more time to complete the course before collecting the data.
If a piece of content has been accessible for many months, however, and the completion rate is still low, here are some possible reasons:
- Users are simply unaware of the content. They don’t check the platform regularly, or even if they do, they don’t know that the content has been uploaded. In these instances, switching on notifications, such as email, in-app, or push notifications, may be useful when publishing new pieces of content. This ensures that your users are updated whenever there is a new piece for them to start reading. If you’re using EDMs to let users know when there is new content available, it may be worthwhile to check that your lists are up to date and that the software you use is working properly.
- Users don’t like the content. Although a hard pill for any course designer or content creator to swallow, there’s a real possibility that users just aren’t enjoying what you’ve uploaded and are dropping off before making it to the end. In this situation, a useful next step would be to gather feedback on what your users don’t like about the current content. Perhaps there are too many long videos or too much jargon – you can then take this feedback and make your content more engaging. Alternatively, they simply aren’t interested in the topic at hand; in that case, find out what topics they want to learn and create your content around those.
Your content should be made for your users, and you want to keep them engaged.
Assessment Pass Rates
The next learning metric we’ll look at is the Assessment Pass Rate. Similar to content completion, a high assessment pass rate is usually a positive indication. It suggests that your learners understand the concepts they’ve learned and can successfully apply them during the assessments. However, high assessment pass rates could also mean that the assessments are too simple and pose no challenge at all. You want assessments that make your users think, as this helps them retain important knowledge better. In addition, if you’re tying an assessment to a particular certification, it may not be a good look if everyone who enrols can easily obtain it.
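Computing the pass rate itself is straightforward; the harder part is deciding what counts as “too high” or “too low”. The sketch below uses 95% and 50% as purely illustrative thresholds – pick cut-offs that suit your own assessments:

```python
# One pass/fail result per user for a single assessment (sample data).
attempts = [True, True, False, True, True, False, True, True]

pass_rate = sum(attempts) / len(attempts)

# Assumed thresholds for demonstration, not universal benchmarks.
if pass_rate > 0.95:
    note = "Very high - the assessment may be too simple."
elif pass_rate < 0.5:
    note = "Low - review the content and the question wording."
else:
    note = "Within a typical range."

print(f"Pass rate: {pass_rate:.0%} ({note})")  # Pass rate: 75% ...
```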
On the flip side, having poor passing rates is also not good. Here are some potential reasons for an assessment’s poor pass rate:
- The content prior to the assessment was designed poorly. Poor passing rates could indicate that learners don’t understand the concepts or can’t apply them skillfully. One recommendation is to review your content and make it more digestible for learners. Also, some learning systems allow users to take the assessments even before completing the learning components of the course. In such cases, consider enforcing completion of the learning content before the assessment can be attempted.
- The assessment was designed poorly. If you find that a large bulk of users are answering specific questions wrongly, there may be an issue with the wording of the questions or answers. For instance, two options may not be distinct enough in a multiple-choice question. If the similarity is intentional to test a specific concept, however, be sure to explain the difference thoroughly in your explanation. It’s also up to the course designer to ensure that the assessment questions are grammatically accurate and not ambiguous.
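Spotting those problem questions can be as simple as counting wrong answers per question in your response export. A minimal sketch, with assumed field names and an assumed 60% flagging threshold:

```python
from collections import Counter

# Hypothetical per-response export: which question, and whether the
# user answered it correctly.
responses = [
    {"question": "Q1", "correct": True},
    {"question": "Q1", "correct": True},
    {"question": "Q2", "correct": False},
    {"question": "Q2", "correct": False},
    {"question": "Q2", "correct": True},
]

total = Counter(r["question"] for r in responses)
wrong = Counter(r["question"] for r in responses if not r["correct"])

THRESHOLD = 0.6  # assumed cut-off: flag questions missed by 60%+ of users
flagged = [q for q in total if wrong[q] / total[q] >= THRESHOLD]
print(flagged)  # ['Q2']
```

Questions that surface here are candidates for a wording review before you conclude that the learning content itself is at fault.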
Poorly designed content and questions make it frustrating for users and make them less willing to continue learning.
User Adoption Rate
User Adoption Rate refers to the proportion of users who are actively using your learning platform out of those who have access to it.
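As a formula, this is a single ratio – the subtlety is how you define “active”. The sketch below assumes a definition such as “logged in within the last 30 days”; choose one that fits your reporting cadence:

```python
# Sample figures: users who hold an account vs. users counted as
# "active" under your chosen definition (an assumption here).
users_with_access = 200
active_users = 48

adoption_rate = active_users / users_with_access
print(f"User adoption rate: {adoption_rate:.0%}")  # 24%
```

Keeping the definition of “active” consistent between reporting periods is what makes the trend, not just the single number, meaningful.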
If a sizeable majority of users are using the platform, it signals a healthy user adoption rate. On the other side of the spectrum, a low user adoption rate is not something any administrator wants to see. This is especially true since resources are spent on the platform and have to be justified to other organisational stakeholders.
If you find that only a small minority of your user base is actually using the platform, it’s time to take a step back and find out why. It will be helpful to ask these questions:
- Is there enough learning content being produced? Even the most fantastic platform is dead in the water if insufficient content is created. After all, why would your users log into the platform if there’s nothing new for them to consume? If you don’t have a team of content creators to provide regular updates, consider opening the floor up to other members of your organisation to start contributing. For example, subject matter experts can give insight into their areas of expertise or interests. With a steady stream of interesting content, you’ll soon find that users will begin to use your platform more.
- Does the learning programme have support within the organisation? In our experience, a key person or department driving the programme leads to better user engagement on learning platforms. This ensures someone is responsible for maintaining the quality of content on top of facilitating an ongoing stream of content. It also seems to make a difference in user adoption rates when a senior staff member drives the programme. This is because they have a greater circle of influence in the organisation and can also help to keep other top stakeholders interested in the initiative. If you’ve seen low user adoption rates, it could be a good idea to rope in a senior member of the organisation to help push the initiative forward.
- Is the learning platform easy to use? This question may seem simple, but it’s an important one. As administrators, you have to keep your users’ experiences in mind. Are users finding it hard to navigate the platform? Do they have trouble accessing the content they are interested in? Are they finding it difficult to continue learning? The poorer the user experience, the more likely you are to see a poor user adoption rate. If the platform you’re using isn’t intuitive, there’s no need to consider switching just yet. Instead, consider creating some simple user guides to help users along. This could include compiling the expected user flows and answering frequently asked questions. However, if that does not seem to help, perhaps it’s time to consider switching.
Finding a platform you and your users enjoy using is important.
These three forms of learning data merely scratch the surface of what’s available on most learning platforms. If you’re willing to dive deeper, you’ll be sure to uncover significant insights and discover learning trends among your users. Analysed properly, these insights can help you enhance their learning.
In our experience at SmartUp, we’ve found that data without the proper interpretation isn’t particularly meaningful. As such, we are working on providing smarter analytical insights in the future. Stay tuned!