Why Measuring Mentoring in Hours is a Terrible Idea

A hand holding a watch, tracking the time.

How do you measure the relationships in your life? By hours spent together? We hope not.  

Just as ‘quality time’ matters in personal and family relationships, it matters in professional relationships like mentoring and coaching too.

It’s why seeing a therapist can sometimes feel transformative, while other times it can feel like a rip-off. Measuring the impact of our relationships in ‘hours’ is never really accurate.

Human relationships are difficult to measure. When it comes to mentoring programs this flaw is compounded – the business metrics take a long time to track and the impact gets lost along the way if a measure of quality is absent. 

So why do other mentoring platforms measure mentoring by number of sessions, or worse, number of hours?

As Harvard Business Review says, positive professional relationships have three traits in common: 

1. The individual understands what the relevance of their relationship is; 
2. They understand whether, and why, they are transactional or transformational; 
3. And they are committed to maintaining the relationship even when they are in conflict. 

Thinking about these three traits can help you assess your key relationships and identify opportunities to engage and connect in ways that deliver results when needed. If you’ve realised that some of your professional relationships need work, identify the key relationships that have the most influence on your success, and conduct a quick audit of each of the three traits.

A lot of time and energy is invested in developing and maintaining an organisational mentoring program, and organisations want to know that those efforts are well spent. As they rightfully should! They should seek these answers from the people who run their mentoring programs, and expect the tools used to run those programs to surface this information.

You’ll notice that other mentoring platforms often measure and focus solely on ‘vanity metrics’ cloaked as ‘engagement activities’. These may include recording and reporting on the number of messages and meetings, along with: 

  • Number of sign ups
  • Number of relationships
  • Number of mentoring sessions
  • Number of hours mentoring
  • Number of goals set

These numbers, on their own, fail to surface the quality and sentiment of your program. They merely describe whether your program is growing or shrinking and, very loosely, whether anything is happening.

So, how should you measure a mentoring program?


Every mentoring program should be monitored and evaluated across its lifetime. Tracking the program’s health regularly and course-correcting as you go keeps the program on track and ensures all of your participants continue to enjoy their experience.

Collecting and banking this information regularly also makes evaluating the success of a program far easier at its conclusion (or at key intervals if your program is ‘always-on’).

If you’re using Mentorloop, there are a number of ways you can assess program health along the way.

The Art and Science of Measuring Mentoring Programs, with Mentorloop.

A healthy mix of qualitative and quantitative data.


The Art: Qualitative data

  • Mentoring Stories – these are surfaced throughout your program’s lifecycle and give you a deeper insight into the journey that different mentoring pairs in your program go on, what they have learned and how they’ll tackle their next goals. For example, meet Adrian from HP.

  • Relationship Quality Sentiment – active reflection on every participant’s matching experience is encouraged via the platform, ensuring everybody is pleased with their match and providing an opportunity to find another match should the pairing not work out. This feedback is shared with the Program Coordinator as well as fed back into the matching algorithm to improve future matches. For example, meet Jackie at REA Group.

  • Pulse reports – deep-dive surveying. These surveys are automatically executed at key intervals throughout the program to understand how participants feel and help you understand things like:
    • Whether mentoring should continue at your organisation
    • The likelihood of recommending the program to others or repeat participation
    • How participants would rate their opportunities in various areas such as learning, growth, cross-team collaboration, skill development or goals
    • Exploring other program specific success indicators

The Science: Quantitative Data

  • Quality at Scale – tracking your MQS (Mentorloop Quality Score), which weights all of the inputs and generates an average. It measures the satisfaction and quality of mentoring at the individual level and aggregated at the program level.

  • Milestone data – who is at what stage? This is important when you have asynchronous or ‘always-on’ mentoring at play, where participants may have started their mentoring journey at different times. You can use this data to group participants and segment your communication: for example, you might bulk message all of your mentees who have completed their Milestones and suggest they try being a mentor in your next cohort.

  • Goals and Tasks – track participants’ progress towards their goals at any time via the Goals tab, with no need to check in manually. These goals can be professional, such as earning a promotion, or personal, such as developing a new skill or building a personal advisory board.

  • Refer-a-friend suggestions – a measure of the program’s virality.
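As a rough illustration of the idea behind a score like MQS – weighting several quality inputs and averaging them – here is a minimal sketch. The actual Mentorloop formula, inputs and weights are not public; the input names and numbers below are assumptions made purely for illustration.

```python
# Hypothetical weighted quality score. Input names and weights are
# illustrative only, not Mentorloop's actual MQS formula.

def quality_score(scores: dict, weights: dict) -> float:
    """Weighted average of per-input scores on a 0-100 scale."""
    total_weight = sum(weights[name] for name in scores)
    weighted_sum = sum(value * weights[name] for name, value in scores.items())
    return weighted_sum / total_weight

# One participant's (made-up) inputs:
scores = {"match_sentiment": 90.0, "meeting_feedback": 70.0, "goal_progress": 80.0}
weights = {"match_sentiment": 2.0, "meeting_feedback": 1.0, "goal_progress": 1.0}
individual = quality_score(scores, weights)  # (2*90 + 70 + 80) / 4 = 82.5

# A program-level view can then aggregate individual scores:
program = sum([individual, 75.0, 88.0]) / 3
```

The same pattern – score each relationship, then aggregate – is what lets a single number describe quality both per participant and across the whole program.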

Always On: Live data

If you’re running a program without a mentoring platform, please, PLEASE, don’t wait until the end to evaluate it. Programs that only assess their health at the conclusion have waited until it’s too late. Just as you wouldn’t have an annual check-in with your romantic partner, we don’t suggest doing so with mentoring partners either.

Bringing this data together helps build your organisational mentoring narrative – evaluating in real time.


We’re no stranger to communications platforms; whether you’re a fan of Microsoft Teams, Zoom, WhatsApp, text, phone calls or a good old-fashioned coffee, all are equally great ways of communicating. Check out our suite of apps and integrations to weave Mentorloop seamlessly into your organisation’s communications ecosystem.

The best mentoring happens where you’re already operating and communicating, and where people feel most comfortable. As controversial as it sounds, the best mentoring sometimes happens outside of the Mentorloop platform – which makes it worth asking:

How do you know if in-person mentoring is going well?


Put simply, we just ask.  

Evaluating mentoring programs can often be an afterthought, pushed to the end of a program – 6, 12 or even 18 months in, when it’s too late to course-correct or make vital changes that could have taken your program down a completely different path.

By asking participants a simple question after key moments (like post-meeting) and periodically throughout their mentoring lifetime, Mentorloop aggregates these responses to make it easy to know how, and when, to intervene by surfacing program sentiment data.   
At the same time, participants have the opportunity to add private feedback on how the relationship is faring. This is also collected to build mentoring stories – making it easier than ever to craft a narrative around the impact of your mentoring program and surface Mentoring Champions who can help you promote it in subsequent recruitment drives.


To learn more about how Mentorloop can surface the data that matters, book a demo:

Book Your Demo


Em is our Marketing Manager at Mentorloop. That's a lot of 'm's! | She is passionate about crafting messages, crafternoons and craft beer.
