Numbers & Unicorns

Written by Helen Bailey, Head of L&D.

Recently I had the pleasure and the privilege of speaking to the Derby Notts CIPD Learning and Development Community about Evidence-Based L&D, and for those of you who are interested in the topic and couldn’t make it – here is a summary of my thoughts…

I have always been interested in evaluation in L&D – to me, there is a natural thread that runs from an intervention through to what happens back in the workplace. In essence, you can do all the L&D activity in the world, but if nobody is using it, what’s the point?

In the past, I designed and facilitated an award-winning management development programme, and one of the key components of our bid was showing how the learning had impacted the workplace.

During the pandemic I pivoted from L&D to a communications-focused role and became fascinated by the metrics side – how the data guides what to publish and when. My most notable learning from this period was how similar it is to L&D, which you can read more about here.

The fascination has remained, and here I am, still pondering.

What Is Data Literacy?

Stella Lee from Paradox Learning describes this as “the ability to work with and communicate effectively with data.” In essence, what we are trying to do with data is tell a story or build a narrative for our audience so they understand our business case.

So Why Is an Evidence-Based Approach Important?

As an L&D professional, the most obvious place to start is the HR Profession Map, which defines the key purpose, knowledge, and behaviours that underpin being an effective HR practitioner, as outlined below.

  • Purpose – Evidence-based – defined as “Adding weight to your professional judgement by supporting your case with strong evidence from diverse sources”
  • Core Knowledge – Evidence-Based Practice
  • Core Behaviours – Insight Focused – defined as asking questions and evaluating evidence and ideas, to create insight and understand the whole
  • Specialist Knowledge – People Analytics – using analytics to inform decision making

It is clear that an evidence-based approach runs through the map and this is reflected in wider L&D discussions as shown below.

  • We are moving from an L&D-driven world to a learner-driven world of just-in-time learning. With this in mind, is Kirkpatrick still relevant? It is great for measuring training courses, but less so for content-driven interventions.
  • Using an evidence-based approach means we can focus on the areas where we can have the most impact, particularly in what could be a resource-short world
  • Research by Emerald Works (formerly Towards Maturity) suggests that high-performing learning organisations that embrace business insights are four times more likely to measure business-specific metrics when evaluating learning.

What’s Happening at the Moment?

There is some evaluation activity taking place, potentially driven by the need to show value to stakeholders. The data also indicates that L&D activity is not always focused in the right place, given the apparent misalignment between content and skills gaps.

So What Could We Measure?

As you might expect, there are lots of thoughts around what we could measure – for example, Dr Hannah Gore stresses that we need to move away from vanity metrics that please us, and Ben Betts makes the argument against relying on a single measure.

Let’s have a look at a few more examples:

  1. The L&D Value Spectrum by Laura Overton

This demonstrates a clear split between Learning Value and Business Value with the narrative in the latter being concerned with Performance and Culture.

What is interesting to note here is that the focus tends to be on engagement and efficiency with business performance coming in as the fourth most popular measure – perhaps an example of the vanity metrics Dr Gore was referring to.

  2. Kevin M. Yates (The L&D Detective)

Yates notes that people who analyse L&D data are like unicorns in organisations – rare (hence the name of this blog) – and goes on to highlight three areas organisations should focus on:

  • Operational efficiency – how many people took part, how many hours were involved, what we offered, and how much was consumed
  • Learning effectiveness – the increase in knowledge, and whether people are using what they have learned to do things differently. This is most credibly measured through business/environmental simulation; alternatively, use a paper test or 360-degree feedback
  • Business and performance outcomes – are people performing differently back in the business, and how are they impacting business goals?

There are some clear themes in what Yates and Overton suggest: measures around activity, effectiveness, and business outcomes all matter. This correlates with Ben Betts’ idea of using a mixture of measures, and perhaps the key is to focus on those that matter to the business.

What Have I Learned on My Travels in L&D?

General Thoughts

  • Ask questions of our LMS providers before we start – what metrics can the system help us see, and can we customise reports?
  • Think about using a Return on Expectation approach rather than Return on Investment: talk to stakeholders about what they would like to see done differently as a result of the intervention, and build in metrics to measure that.

Learning Management Systems/Intranet

ASK

  • What are people looking at?
  • When are they looking at it?
  • How long do they spend viewing content?

Consider the themes and trends – what are they telling you?

Remember that lurking is not a bad thing! Consider why people are reviewing and not commenting.

Nelson Sivalingham recommends using data to personalise and push relevant content (there is a small illustrative sketch after this list). Look at:

  • What skills do people have?
  • Performance – is anyone not hitting performance targets?
  • Career history – what systems have people used before, and do they need that learning again?
  • Behavioural – for example, people sending emails outside of working hours
  • Benchmark – what are people accessing the most and what does this tell you?
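
If your LMS or intranet lets you export raw usage data, even a few lines of analysis can surface this kind of benchmark. Below is a minimal, purely illustrative sketch in Python using pandas – the file name and column names are my own assumptions, not a real LMS export, so adapt them to whatever your system provides.

```python
import pandas as pd

# Hypothetical CSV export from an LMS with columns:
# learner_id, content_title, minutes_viewed, viewed_at
views = pd.read_csv("lms_export.csv", parse_dates=["viewed_at"])

# Benchmark: which content is accessed most, and how long do people spend on it?
summary = (
    views.groupby("content_title")
    .agg(unique_learners=("learner_id", "nunique"),
         avg_minutes=("minutes_viewed", "mean"))
    .sort_values("unique_learners", ascending=False)
)
print(summary.head(10))  # the ten most-consumed pieces of content

# "When are they looking at it?" - views by hour of the day
print(views["viewed_at"].dt.hour.value_counts().sort_index())
```

The point is not the tool – the same questions can be answered in Excel – but that the questions above translate directly into simple queries once you have the raw data.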

Teams/Slack

ASK

  • How many people interact?
  • What posts get the most engagement?
  • Who posts most often?
  • What are the hot topics?

At the end of posts, ask a question to start a discussion, e.g. ‘What do you do to help manage your mental health?’

Culture Change

Use 360-degree feedback before and after to monitor changes in behaviour.

Apprenticeships/Working with Third Party Suppliers

  • Gather feedback from learners and managers, and set up regular feedback sessions with suppliers so issues are addressed early on
  • Ask to see any feedback gathered by the provider and compare it to internal feedback.

Leadership Development

  • Using online forms improves the quality of feedback – incentivise completion where possible, or allow time in the session
  • Change the focus of questions from ‘The facilitator was able to engage me’ to ‘I was fully engaged in the session’
  • Use four response options on rating scales to stop people defaulting to ‘the middle’
  • Use Menti or Slido for instant reaction feedback
  • Use a Knowledge Uplift (the same short quiz at the start and end of a session) to demonstrate the increase in knowledge – for example, average scores rising from 45% to 80% give a 35-point uplift
  • Ask people to write a blog
  • Hold line manager discussions before, during, and at the end of the programme to measure impact in the workplace
  • Support line managers with implementation through guides, briefing sessions, etc.
  • Use stretch assignments/internal projects
  • Use systems, e.g. the LMS, where possible to automate these processes

Whatever You Measure – Remember

  • Incomplete data is better than no data at all
  • Recent data is better than older data
  • Relevant data is better than more data

These principles are suggested by Sahin Guvenilir and are a useful reminder that there is no perfect moment to begin with data – we just have to start somewhere.

Some Thoughts on Presenting Data

You may have number-crunched to your heart’s content; however, the key is to present the results to your audience in a format they will understand. Aim to make your data…

  • Accessible to as many people as possible
  • Clear in what you are saying
  • Easy to understand with minimum effort

To that end…

  • Use pre-attentive attributes to help your audience’s brains process information visually (see the sketch after this list)
  • Use the Visual Vocabulary to identify the most effective way to present your data
  • Think about the story you want to tell with your data
  • Consider the titles you use – for example, ‘July Results’ doesn’t draw your audience in, whereas ‘Consumer Behaviour Has Shifted’ does.
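
To make the pre-attentive attributes point concrete, here is a minimal, purely illustrative sketch in Python using matplotlib: every bar is a muted grey except the one you want the audience to notice, and the title states the takeaway rather than just labelling the chart. The teams and figures are invented for the example.

```python
import matplotlib.pyplot as plt

# Invented completion rates (%) for illustration only
teams = ["Sales", "Ops", "Finance", "HR"]
completion = [62, 58, 91, 60]

# Pre-attentive attribute: colour draws the eye to the one bar that matters
colours = ["#c0c0c0" if team != "Finance" else "#d62728" for team in teams]

fig, ax = plt.subplots()
ax.bar(teams, completion, color=colours)
ax.set_ylabel("Completion rate (%)")
ax.set_title("Finance completes far more learning than any other team")
plt.show()
```

The same idea applies in PowerPoint or Excel: mute everything except the element that carries your story, and put the headline in the title.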

Final Thoughts

There’s a lot to think about here… If there are three thoughts to take away, they are:

  • Ask questions
  • Be curious
  • Use data to support your decisions

And remember – be more unicorn!

For more information on how we can support your business, contact us today at [email protected]
