Product Agility

How to Predict Success Before It’s Too Late – The Power of Leading Metrics

Ben Maynard Season 3 Episode 2


In this episode of the Product Agility Podcast, host Ben Maynard explores the role of leading and lagging metrics in product strategy, OKRs, and business impact. Many teams rely on lagging metrics to measure success, but when the results are in, it’s often too late to make meaningful changes. Leading metrics provide an opportunity to adjust course earlier, yet they are frequently misunderstood or overlooked.

This episode examines the difference between leading and lagging metrics, why both are essential, and how they can be effectively integrated into OKRs so that they act as a steering tool rather than just a report card. Ben also discusses the Kellogg Change Model, a framework that helps teams without direct customer impact measure meaningful success.

Topics include:

  • Why lagging metrics alone can lead to missed opportunities
  • How leading metrics provide early indicators of success
  • The connection between strategy, tactics, and measurement
  • The Kellogg Change Model and how it applies to internal teams

  • Practical ways to use metrics for learning and adaptation rather than just reporting
Whether you are a product manager, an agile coach, or a leader looking to improve strategic decision-making, this episode offers insights into how to measure success more effectively and avoid common pitfalls when setting objectives and key results.

Host Bio

Ben is a seasoned expert in product agility coaching, unleashing the potential of people and products. With over a decade of experience, his focus now is product-led growth & agility in organisations of all sizes.

Stay up-to-date with us on our social media📱!

Ben Maynard

🔗 https://www.linkedin.com/in/benmaynard-sheev/

🐦 https://x.com/BenWMaynard

💻 https://sheev.co.uk/

Product Agility Podcast

🔗 https://www.linkedin.com/company/productagilitypod/

💻 https://productagilitypod.co.uk/

🖇️ https://linktr.ee/productagility


Listen & Share On Spotify & iTunes


Want to come on the podcast?

Want to be a guest or have a guest request? Let us know here https://bit.ly/49osN80

Are your OKRs actually steering you and your team towards success? Or are they just another mechanism, another fad, being hoisted upon you by your leadership or by somebody else in the organisation? Maybe some expensive consultant like me.
But what if you could know you were not on the path to success way before it's too late for course correction, before you've got egg on your face?
In this episode, we're diving into the real power of leading and lagging metrics: how they can help shape strategy, drive execution, and connect impact way beyond just revenue.
We'll break it down in the context of the Kellogg change model to show how every team, even those without direct customer impact, can measure meaningful success.
Whether you are a product manager trying to align your teams, or an agile coach guiding strategic decisions, or at least trying to, this episode is chock-full of insights you can apply today.
So let's begin. Welcome to the product agility podcast, the missing link between agile and product. 
The purpose of this podcast is to share practical tips, strategies, and stories from world class thought leaders and practitioners. 
Why, I hear you ask? Well, I want to increase your knowledge and your motivation to experiment so that together we can create ever more successful products. My name is Ben Maynard and I'm your host. What has driven me for the last decade to bridge the gap between agility and product is a deep-rooted belief that people and products evolving together can achieve mutual excellence.
Hey, welcome back to the podcast. I'm not sure if you can hear that, but that's me making a cup of coffee, because I'm not in my usual studio. I'm in fact in a hotel in Lithuania. A rather nice hotel, but a hotel all the same. And I'm finding myself sat at a desk writing a script and recording a podcast episode for you all. I know that we haven't been as consistent as perhaps we would have liked to be. That's just because I've been very busy, with a number of different clients on the go. There's no shortage of things I want to talk to you all about; I've just been lacking the time. So I committed myself this morning to getting up extra early, fixing a cup of coffee and sharing some thoughts on
leading and lagging metrics, specifically in OKRs and in relation to product strategy. Because 
you know, leading and lagging metrics, for those people that get them, just make perfect sense. But I see a consistent failure in how they're articulated, and I think people struggle to pick up the concept.
So today we're going to look at the nuances of what it means to measure leading and lagging metrics, so that you can build great products, align teams, and drive really meaningful business impact.
But we're not just going to talk about what they are. We're going to explore specifically how they connect to OKRs, how that is then an echo of your product or business strategy, and how they drive business impact. And to do that, we'll take a little tour around the Kellogg change model to explain how teams that don't have direct revenue impact can still measure meaningful success.
So if you've ever really struggled to measure success beyond just shipping features or, dare I say, looking at velocity, and you want to make sure that what you are doing is having a real impact, OKRs or not, and to see how a team can really take ownership for steering things in the right direction, well, this episode is just for you.
Now, why do leading and lagging metrics really matter? Well, let's start with a common struggle.
You have an OKR, and hopefully this is something the team have come up with themselves, and it's to improve, oh, let's pick something really obvious, user retention. And we want to hit a target of, I don't know, 30%.
Time goes by. It's been three months, it was a quarterly OKR, we're getting towards the end, and we're at 12%. Not where we wanted to be. And I'm not saying that, with an OKR and the key results you have as part of it, you need to hit those targets. But if you're less than half of the way there and you're only finding out at the very end, then, yeah, I would say that some people may start to panic at this point. May start to question their life choices. May begin to wonder how we're going to have the difficult conversation with our leaders, not about missing the target, but about only finding out now.
Do we game it? Do we massage it in some way? Do we point the finger of blame at somebody else in the organisation, because it's their fault we didn't do this, not ours?
Or do we just promise from the bottom of our hearts that next quarter it will be better?
For me, this is a classic example of a lagging metric problem. It's a result of what happened in the past. But what if we could have known weeks ago that retention wasn't improving?
And this is where leading metrics come in. 
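To make that concrete, here's a minimal sketch of the kind of early-warning check a leading signal enables, reusing the illustrative 30% retention target from the story. The straight-line pacing assumption, the function name and the 13-week quarter are my own illustration, not anything prescribed in the episode.

```python
def on_track(target: float, actual: float, week: int, total_weeks: int = 13) -> bool:
    """True if progress so far meets the straight-line pace needed to hit the target."""
    expected = target * week / total_weeks  # linear pacing assumption
    return actual >= expected

# At week 6 of a 13-week quarter, a 30% target implies roughly
# 30 * 6 / 13 ≈ 13.8 points of progress. Sitting at 12, we'd know
# mid-quarter, not at the review, that we were drifting off course.
print(on_track(target=30.0, actual=12.0, week=6))  # False
```

The point isn't the arithmetic, it's the timing: a check like this fires while there is still runway to change tactics.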
So when we think about leading and lagging metrics, 
we have to consider the impact we're trying to have. 
So in this instance it was retention, but that's a bit of a rubbish impact, really. If we're talking about an impact here, we should be talking about something that's really going to affect the bottom line: something that's making, saving or protecting the organisation's money. And the OKR is there to create a real, definite link between the activities we're doing and a specific business impact. Somebody challenged me once on a
lean product discovery course I was facilitating, saying, well, impact needs to be something other than just revenue, the bottom line, or making, saving or protecting money. And yeah, that made for some interesting challenges, let's say. And there is a seed of truth in what he said, even though it was slightly vitriolic.
I think that when we talk about impact here, really we want to be talking about something that's making a difference to the business.
So if we can't, and it is an internal impact, then we make do with that. And our lagging metric tells us that we have achieved what we set out to achieve, and that the achievement is having the impact we were endeavouring to have.
So a lagging metric tells you if you have succeeded, and you know it's a lagging metric because it comes too late for you to fix anything. It comes too late in the day.
Leading metrics tell you if you are on the right path before it's too late to adjust. 
Okay, so what are the things that we believe need to be true? What are the causes that will affect the lagging metric? 
So how do we know we're on the right path before it's too late to adjust? 
So if we think about this in the context of OKRs, everybody's topic of the year, perhaps: OKRs without the right mix of leading and lagging metrics can become just a scorecard rather than a steering tool.
A good OKR needs to have a mix of leading and lagging, so that the leading metrics help you steer where you are going. And this is where many teams struggle, because what they realise is that their OKRs are missing the real signals that are going to drive success.
So if we think about connecting leading and lagging metrics to strategy and tactics, we need to think about the difference between strategy and tactics. I don't think it's something many people are taught. I can't remember many people speaking about it, to be fair, even though I've worked with many people who have very fancy MBAs. I'm sure somebody did mention it at some point, but it gets lost. And so I thought it was really important for us to take a moment to think about the difference between strategy and tactics.
So for this we're going to need to bring in a bigger picture, because leading and lagging metrics actually help bridge the gap between strategy, which is where we are going to play, how we are going to win and what obstacles we are going to overcome, and tactics, which is what we are going to do to execute that strategy.
So I always used to say, and maybe it's not a hundred percent something I stand behind now, but I used to say: your product roadmap is your strategy, your product backlog is your tactics. And I still think there's a seed of truth in that.
So if we think about the strategy, this is having that winning aspiration that says we want to be the fastest, the most reliable, the most successful X, Y, Z, whatever it might be. And the tactics are how we're going to go about doing that. So if it was an online marketplace for training, as an example, and we wanted to be the most reliable, trusted source for telling people where they should invest their money and time when it comes to education, we're going to look at things like improving the quality of the ratings, improving the checkout experience, and making sure we've got a great supply of fantastic trainers. These are the tactics we're going to undertake to achieve our strategic aspiration.
So how do we then know if these tactics are working? And this is where leading and lagging indicators come in and play a role. 
So leading metrics you can think of as predictive, hopefully real-time, indicators: things like checkout speed. Lagging metrics are the results that confirm success: things like the number of completed transactions, the number of new customers coming on board, and the attachment rate for the new courses they're purchasing.
So let's think of an example OKR here. And excuse me, it is early in the morning and my brain maybe wasn't working at full throttle, so I just picked a really easy one: improve the customer experience to increase revenue. Let me know on LinkedIn how that could have been better; it's the equivalent of five o'clock in the morning here.
And I think the key results could be: a leading key result of reducing checkout time, another leading key result of increasing the add-to-cart rate, and a lagging key result of growth in revenue. Maybe it's even something to do with customer satisfaction, but anyone who knows me knows that I have an allergic reaction to seeing things like customer satisfaction on key results, especially when it's just broad, like "improve customer satisfaction". I think it's a bit lame. I think we can do better.
So when we think about those key results, that mixture of leading and lagging, it's a system of measures that allows us to track short-term progress whilst keeping an eye on the long-term impact.
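If it helps to see that "system of measures" written down, here's a small sketch of an OKR as data, with a check that the key results mix leading and lagging signals. The class names, field names and the balance check are my own illustration of the idea, not a tool or schema mentioned in the episode.

```python
from dataclasses import dataclass, field

@dataclass
class KeyResult:
    name: str
    kind: str  # "leading" (predictive) or "lagging" (result-confirming)

@dataclass
class OKR:
    objective: str
    key_results: list[KeyResult] = field(default_factory=list)

    def is_balanced(self) -> bool:
        """A steering OKR needs at least one leading and one lagging measure."""
        kinds = {kr.kind for kr in self.key_results}
        return {"leading", "lagging"} <= kinds

# The example OKR from the episode, expressed as a structure:
okr = OKR(
    objective="Improve the customer experience to increase revenue",
    key_results=[
        KeyResult("Reduce checkout time", "leading"),
        KeyResult("Increase add-to-cart rate", "leading"),
        KeyResult("Grow revenue", "lagging"),
    ],
)
print(okr.is_balanced())  # True: it can steer, not just keep score
```

An OKR with only lagging key results would fail the check, which is exactly the scorecard-not-steering-tool trap described above.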
So let's now take a moment to consider the Kellogg change model, and I do like the Kellogg change model. It was something I was aware of earlier in my career, but it was only when I came under the tutelage of Jeff Gothelf and Josh Seiden (check out Lean UX, the book, and Who Does What By How Much) that I became much more familiar with it. And I like to think about this model because it really helps teams, and organisations, groups, value streams, products, whatever it might be, when they feel that they don't drive direct impact on the revenue of an organisation.
And, you know, even within a group in an organisation, one of the biggest frustrations teams have is when they can't tie their work directly to revenue or to real external customer impact. For example: implementing a new identity and access management system, a security team looking to improve compliance, a DevOps team trying to reduce deployment time. I mean, yes, okay, I know you can find a way to create a connection to a customer. I've tried, and I have failed many times in doing this, and there are teams I know that have tried and succeeded. But for most of them, that link to the customer is so tenuous that it's hard for them to really believe it, keep it in their minds and actually let it drive them forward.
So if we can't do that, and we do always struggle to give these teams that don't have a direct impact in the traditional sense an impact of their own, then the Kellogg change model can help us, because it helps us show a link between the work and business success.
So if you break down the Kellogg change model, we have five areas. It all begins with an input: the people, time and tools, the investment we are putting into our system.
When we make that input investment, what we get are activities. These are very much like your tactics: the actual work being done by those teams.
Then come the outputs, the middle stage of it all: the immediate and intermediate results. This is where we see the solutions we hypothesised would have the impact we'd like come to fruition. We see that stuff is being produced, that outputs are being created. And this is traditionally the type of thing people would want to see on a project plan, or the type of thing your senior leaders will ask to see, because they believe that is progress.
But when we move up a notch, what we're then looking for is for those outputs to have some form of outcome: a behavioural change. So if we go back to that DevOps example, the customers, the users of what is being produced from that output, are other teams, the engineers. And what we want to be able to see is that those engineers are able to ship updates to the product faster. Is someone doing something different, by a certain amount, within a certain amount of time?
And that's a valid outcome in this instance, because the impact, the business value, is that those engineers, those teams, are going to be more confident in the changes they make. They're going to be able to get their changes to market faster. We're going to be reducing costs, because there'll be less time invested by them in handholding something through the deployment.
So in this instance, for the DevOps team trying to reduce deployment time, the impact, the business value, isn't something directly related to a customer per se, but it is saving the organisation money.
And that can be just as important at times as going out and having a shit hot product idea and making loads of money. 
So to recap, if we think again about this DevOps team trying to reduce deployment time: the impact is faster time to market and reduced costs. The outcome is the engineers being able to ship updates faster. The output was the faster deployments. The activities were all the things those teams were doing to improve that deployment pipeline. And the input was the people, time and tools.
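That recap can be sketched as an ordered chain from investment through to business impact. The five stage names follow the model as described in the episode; the wording of each entry and the variable names are my own paraphrase of the DevOps example.

```python
# The five Kellogg stages, in order, from investment to business impact.
KELLOGG_STAGES = ["input", "activities", "outputs", "outcomes", "impact"]

# The DevOps example mapped onto each stage.
devops_chain = {
    "input":      "People, time and tools invested in the team",
    "activities": "Work done to improve the deployment pipeline",
    "outputs":    "Faster deployments",
    "outcomes":   "Engineers ship product updates faster (behaviour change)",
    "impact":     "Faster time to market and reduced cost for the business",
}

# Walk the chain in order, so the causal story reads top to bottom.
for stage in KELLOGG_STAGES:
    print(f"{stage:>10}: {devops_chain[stage]}")
```

Laying the chain out like this is the same exercise as the canvas facilitation described later: each stage should plausibly cause the one after it.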
And so when we begin to break it down in this respect, and we think of the impact as the difference to the business and the outcome as the behavioural change, these become the key results for our OKR. Then we begin to find a framework, in a really business-centric and meaningful way, for those teams that don't have a direct customer impact. And I've found over the years that this structure really helps teams measure impact beyond just revenue, and that allows every team to see how their work contributes towards success.
So get on the internet and look up the Kellogg change model; there's a huge amount of information available, all free of charge. Also, check out Who Does What By How Much by Jeff Gothelf and Josh Seiden; there are some great angles and viewpoints on this model and on OKRs in that book too.
So how do we apply this to our product, and how do we apply this to our OKRs? Here are some practices that I believe in for using leading and lagging metrics effectively.
First and foremost, it's about creating a balance of both types of metrics: OKRs having both leading (predictive) and lagging (result-based) metrics. I like to think of these as a system. Each of these metrics should affect the others in some way, and we can create a hypothetical chain of effect. We can look at the system, make sure each of these measures balances the others out, and check that the things we think are going to predict success actually do.
So: get a balance, view them as a system, find out how they connect, and visualise them. That can help us make sure we've got a good mix of leading and lagging metrics, because I believe we need both.
My second tip: I'm a keen advocate that leading metrics should be actionable. They should focus on things we can change today, not things we can only measure later. Now, that may sound obvious, but you'd be surprised at how difficult it can sometimes be to find things we can measure today. So take a look at what you believe your leading metrics are, or if you've got a blank sheet of paper, think about what you can change today that you believe will have an impact later and give you the result you're looking for, and try to make those leading metrics actionable and observable.
My third tip is to connect internal and external impact. So use the Kellogg change model to track progress. Use it as a facilitation tool, as a template or, dare I say, a canvas, to get people putting up on a wall what they think their outputs are and what they think their activities are, and to see if they can come up with a business impact they can have as a team. So use the Kellogg change model as a template, as an idea, to connect internal and external impact.
And importantly, and I think there is a theme coming out from people that I respect, whether it's Jeff and Josh or whether it's Christina Wodtke talking about OKRs: I think that we should use metrics to learn and to experiment, not just to report.
So if your leading metrics aren't improving, if they aren't coming out as you would like, then they are an invitation. No, not an invitation, a compulsion, for us to change our tactics and not just wait for the lagging results. So don't see your metrics as something to report out or to track progress with. See them as mechanisms by which to learn, mechanisms by which to articulate hypotheses. And if they are not working, then change your tactics. Don't wait for the lagging results.
What we want to do with all of those tips is ensure teams have short-term progress measures to keep them motivated, whilst keeping an eye on their bigger business goal, whatever that business goal might be.
So, I can almost smell the buffet breakfast in the hotel, and it's time for me to wrap this up, because I have to go and facilitate a lean product discovery course this morning, which I'm super excited about, but I do need to get some sustenance before I do. So let's wrap up today's episode.
And how do I want to do that? Well, I suppose if I think of leading and lagging metrics, they're there to help us bridge the gap between strategy and execution. They're there to give us early signals to guide our decisions while ensuring we stay focused on long-term impact.
If you found this useful, then I'd love to hear from you. Tell me how you are using leading and lagging indicators, or how you've been confused by them. You know, it's nice: a couple of times a week I get messages from people, and that's a lovely thing. It makes it all worthwhile. So feel free to reach out to me on LinkedIn and share your challenges. Don't worry about making a post; you can just send me a message and that's absolutely fine. Or if you do want to do a post to thank me for such an amazing episode, or if you have a question on leading and lagging indicators, or if you think I am full of shit, call me out. Let's have a conversation on LinkedIn. Let's all learn together, because I don't think dissent should be kept private. I think we should make it public, because it's only then that we can learn, and hopefully we'll make some new friends online.
So, as always, if you've enjoyed this episode, please subscribe, leave me a review on Spotify or iTunes, and share it with a friend or colleague. It helps us grow and reach more awesome people just like you.
Thank you very much for listening. This has been the Product Agility Podcast, and we'll see you again next week.
