How to Match Your KPI Calculation and Reporting Cadence
by Stacey Barr

KPI reporting cadence and calculation cadence need some kind of synchronisation, but just because performance might be published monthly, doesn't mean all your KPIs should be measured monthly.
We don’t need to make our KPI calculation cadence directly match our reporting cadence. In other words, if a customer service team reviews performance each month, it doesn’t mean that all their KPIs, like Customer Satisfaction, Net Promoter Score, On-Time Delivery and the like, have to be measured monthly too.
While direct matching might feel like a neat alignment, it has a few problems:
- If we calculate our KPIs monthly, a monthly report will only ever contain one new value per KPI – not enough data to pick up valid and real signals. (Too many people still use the wrong analysis methods to look for signals in their KPIs – don't be one of them!)
- If our KPI measures a result that can change very quickly, quarterly or annual calculations will be too infrequent to show signals of change in time to act on them.
- If our KPI measures a result that takes a long time to change, monthly calculations will show only random variation (not real change) and will be a waste of data collection.
Of course, this goes for any other cadence of calculation and reporting, like annual, quarterly, or weekly. So the question is, what is a useful matching of KPI calculation and reporting cadence? There are five things to consider:
- Get the most information out of your KPIs by calculating them frequently enough but not too frequently.
- Respond as quickly as possible to your KPIs by setting a reporting cadence that prevents signals from going unnoticed.
- Focus your performance review conversations on the KPIs signaling that attention is needed most urgently.
- For low-cadence KPIs, it's okay to turn attention to the progress of actions underway to improve them.
- For high-cadence KPIs, it's okay to look in on them between reporting periods.
Let’s take a closer look at each consideration:
Calculate KPIs frequently enough, but not too frequently.
We need our KPIs to tell us when the result they’re measuring changes in a relevant way. All changes in performance are subject to the law of natural variation, so just because a KPI shows a difference between one month and the next, it doesn’t mean the result has changed. KPIs are numbers and analysis of numbers is mathematical statistics, so the laws of statistics need to be followed to get truth out of our KPIs.
Using XmR charts for our KPIs is the simplest way to do this, and it means we need somewhere between three and eight consecutive KPI values behaving differently before we can be sure there's a signal. This is why we want to measure as frequently as we can, without measuring too frequently.
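To make this concrete, here's a minimal sketch of the standard XmR chart arithmetic in Python: the centre line is the mean of the values, the natural process limits sit 2.66 average moving ranges either side of it, and two of the standard signal tests are a point outside the limits and a long run of 8 consecutive points on one side of the centre line. (The exact rule set you use may differ, and the Crew Availability numbers are hypothetical, for illustration only.)

```python
from statistics import mean

def xmr_limits(values):
    """Centre line and natural process limits for an XmR chart:
    the mean of the values +/- 2.66 times the average moving range
    (2.66 is the standard XmR chart constant)."""
    centre = mean(values)
    avg_mr = mean(abs(b - a) for a, b in zip(values, values[1:]))
    return centre, centre - 2.66 * avg_mr, centre + 2.66 * avg_mr

def signals(values):
    """Two common XmR signal tests: a point outside the natural
    process limits, and a long run of 8 consecutive points on one
    side of the centre line."""
    centre, lower, upper = xmr_limits(values)
    flags = [(i, "outside limits")
             for i, v in enumerate(values) if v < lower or v > upper]
    above = [v > centre for v in values]
    for i in range(len(values) - 7):
        if all(above[i:i + 8]) or not any(above[i:i + 8]):
            flags.append((i + 7, "long run of 8 on one side"))
            break
    return flags

# Hypothetical monthly Crew Availability values (percent):
crew_availability = [82, 85, 81, 84, 83, 86, 88, 89, 90, 91, 92, 93]
print(signals(crew_availability))  # -> [(2, 'outside limits'), (11, 'outside limits')]
```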
Imagine a fire department has a result of ‘fire crews are available to respond to every alarm’, and it’s measured by Crew Availability, the percentage of alarms for which a full crew was ready to deploy. Measuring this weekly might be too frequent because programs to recruit and train intakes of new fire fighters might only run once or twice a quarter, and we wouldn’t see any meaningful change in Crew Availability between weeks. Weekly data collation and analysis would be a waste of time. But that also means that measuring quarterly might be too infrequent, because we can certainly get new fire fighters assigned to crews every month or two. So a monthly calculation cadence might suit Crew Availability just right, in this instance at least.
To get the most information out of our KPIs, we need to carefully consider which cadence of calculation is the most useful.
Set a reporting cadence that prevents signals from going unnoticed.
A performance report is based on a collection of related KPIs. This collection of KPIs might monitor the set of goals of a team, a business unit, a function, a business process, or a strategic direction. These KPIs might each have varying calculation cadences, some calculated weekly, some monthly, some quarterly, for example.
The idea is to report frequently enough on this related set of measures to make sure that signals needing action don't fall through the cracks. But we also don't want to waste time producing and reviewing a performance report too often, when the KPIs haven't accumulated enough new values to show anything new.
A good guideline is to set our report cadence one step less frequent than our highest-cadence KPI. In other words, if the KPI we calculate most frequently is calculated weekly, then we'd set our report cadence to monthly. This works out just fine, as long as we've taken the first step above and set our KPI calculation cadence properly.
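As a sketch of this guideline in Python (the cadence labels and their ordering are assumptions for illustration):

```python
# Ordered from most to least frequent; the labels are illustrative.
CADENCES = ["daily", "weekly", "monthly", "quarterly", "annual"]

def reporting_cadence(kpi_cadences):
    """Report one step less frequently than the highest-cadence
    (most frequently calculated) KPI in the set."""
    fastest = min(CADENCES.index(c) for c in kpi_cadences)
    return CADENCES[min(fastest + 1, len(CADENCES) - 1)]

print(reporting_cadence(["weekly", "monthly", "quarterly"]))  # -> monthly
```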
Focus performance review conversations on the KPIs signaling that attention is needed most urgently.
A performance report is produced for a team conversation. That team is the group of people who own and are accountable for the KPIs in that report. And that conversation is about answering three basic questions of each KPI: what is it doing, why is it doing that and what needs to be done about it? Furthermore, we want to answer these questions, and take improvement actions, before the KPIs move too far away from their targets.
After we’ve got enough historic data for each of our KPIs to interpret them, and after we’ve done the analysis to identify historic signals, our performance review conversations are focused on new signals about how these KPIs are changing relative to their targets. This means that every performance reporting period, we’ll look over all the KPIs, and see if any new signals have emerged. It won’t matter what the calculation cadence of each KPI is; we’re looking just for the new values since the last reporting period, if there are any, and whether those values have formed a signal of change. Like these, for example, in a particular monthly performance report for our fire department:
- Crew Availability, measured monthly, will only have one new value since last time, but if it has become the fourth point in a short run above the central line, it’s a signal this KPI has changed.
- Emergency Response Time, measured weekly, will have four or five new values, but if they are simply following the same pattern of variability as previous values, there’s no signal. And if this measure is already on target, there’s no priority for action needed either.
- Residential Fires per 100,000 Households, measured quarterly, won’t have any new value this month (if this month doesn’t start a new quarter), and so this measure won’t need any attention.
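The three cases above amount to a simple triage, sketched below in Python. The KPI names are from the hypothetical fire department example, the values are made up, and the has_new_signal flag would come from signal tests like the XmR sketch earlier.

```python
from dataclasses import dataclass, field

@dataclass
class KPI:
    name: str
    cadence: str                  # e.g. "weekly", "monthly", "quarterly"
    new_values: list = field(default_factory=list)  # values since last report
    has_new_signal: bool = False  # e.g. from XmR signal tests

def triage(kpis):
    """Sort a report's KPIs into the three cases above, so the review
    conversation focuses where attention is needed most urgently."""
    for kpi in kpis:
        if not kpi.new_values:
            print(f"{kpi.name}: no new values - review improvement actions instead")
        elif kpi.has_new_signal:
            print(f"{kpi.name}: NEW SIGNAL - discuss what, why, and what to do")
        else:
            print(f"{kpi.name}: new values follow the same pattern - no action needed")

# A hypothetical monthly report for the fire department example:
triage([
    KPI("Crew Availability", "monthly", new_values=[91.0], has_new_signal=True),
    KPI("Emergency Response Time", "weekly", new_values=[8.2, 7.9, 8.4, 8.1]),
    KPI("Residential Fires per 100,000 Households", "quarterly"),
])
```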
For low-cadence KPIs, it’s okay to turn attention to the progress of actions underway to improve them.
Just like the example of Residential Fires per 100,000 Households, there will be KPIs with a calculation cadence slower than the cadence of the report they feature in. Some reports will come out with these low-cadence KPIs showing nothing new. We don't exactly ignore them, though. We have two constructive things to give our attention to instead.
Firstly, we can take a look at the progress of the improvement actions, or change initiatives, designed to move those KPIs towards their targets. Are they on track? Do they need any intervention? We turn from performance management to project management.
Or, we can look at any lead indicators of our low-cadence KPIs and check that they're tracking as we'd like. Lead indicators are often higher-cadence than their lag measures. For example, aircraft Turn Time, the time between an aircraft landing and taking off again for a new flight, is a lead indicator of customer loyalty and profit in the airline industry. Our lead indicators can give us forewarning about future signals in our low-cadence KPIs.
For high-cadence KPIs, it’s okay to look in on them between reporting periods.
Just like the example of Emergency Response Time, we’ll also have KPIs with a faster cadence than our reporting cadence. It’s not efficient to make the report cadence faster, just for the few high-cadence KPIs we might have. That will waste too much time and effort.
But there is a chance that those KPIs might show a signal we should know about, and perhaps respond to, before the next report comes out. No problem: there's no law that says we can't check in on those high-cadence KPIs in between reporting periods. As dashboards that draw directly from our database systems become the norm, doing this will only get easier.
It’s about smoother reporting that makes us faster at responding to our KPIs.
In motorsport there is a technique known as rev matching: when you downshift, you blip the throttle a little before letting out the clutch, to raise the engine revs to match the wheel speed in the lower gear. This makes for a smoother gear change, by avoiding the jolt of engine braking that otherwise usually happens. And it sounds pretty cool, too.
That’s what we’re trying to do with matching our KPI calculation and reporting cadence: we want a smoother experience of reviewing our KPIs at the same time as picking up the important signals faster.
KPI calculation and reporting cadence need to sync, but just because reporting is monthly, doesn’t mean the KPIs should be measured monthly.
ACTION:
Wouldn’t it be interesting to know how well we’re all currently matching our KPI cadences and report cadences? Share your current KPI and reporting cadences with me, in this quick poll, and I’ll update this article with the data.