Death By Metrics

Measuring what you do and how you do it is critical to correctly identifying what is of value to your customers and what is not. For Enterprise Architects, metrics matter even more than usual, precisely because the industry has had mixed success consistently defining what EA actually is and what it can do for a company. Additionally, it can often be difficult to ‘see’ the value an EA program provides merely by observing. It is incumbent on us as Architects to demonstrate value by measuring and refining what we do in a clear way that business and IT stakeholders can readily consume.

However, measuring things just to say you measured them doesn’t help anyone.

We seem to be measuring a lot of things lately. The sheer volume of status reports and dashboards and timelines and updates would seem to indicate we have lots of metrics being captured and reported. I’d venture that most of this is unnecessary and ultimately confuses everyone as to the success and overall merit of programs, especially the architecture components of programs. More ominously, I’ve seen firsthand how numbers are shoddily derived, over-reported, and incorrectly reused and re-reported, and it doesn’t inspire confidence. If I don’t believe the numbers I’m seeing have any actual relevance to how a program is performing, what does it say that leadership bases decisions on these values?

Measuring the wrong things is endemic in IT. I’m reminded of a report from McDonald’s claiming that a $1k social media campaign resulted in a 33% increase in check-ins at franchise locations. This report was bandied about as an indication that social media works wonders for corporate balance sheets. The problem, of course, is that what was measured was the number of check-ins, not the number of purchases. In fact, it didn’t even measure the number of people walking through the door, despite various pronouncements that the report demonstrated an increase in foot traffic as well. All that was measured was how many folks were checking in, whether inside the restaurant or merely driving past the store. It didn’t measure foot traffic. It didn’t measure revenue. It is difficult to support the claim that it increased either.
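To make that gap concrete, here is a minimal, hypothetical sketch in Python. All of the names and numbers are invented for illustration (they are not drawn from the actual report); the point is simply that the same event log can yield a soaring check-in count alongside a flat revenue figure, depending on which metric you choose to compute.

```python
# Hypothetical illustration (all numbers invented): the same event log can
# support a flattering vanity metric and a flat outcome metric at once.

# Each event: (did the person check in on social media?, amount they spent)
before_campaign = [(False, 6.50), (False, 0.00), (True, 5.00), (False, 7.25)]
after_campaign  = [(True, 0.00), (True, 6.50), (True, 0.00), (True, 5.00),
                   (False, 7.25), (True, 0.00)]

def check_ins(events):
    """Vanity metric: how many people checked in (whether or not they bought anything)."""
    return sum(1 for checked_in, _ in events if checked_in)

def revenue(events):
    """Outcome metric: how much money actually changed hands."""
    return sum(spend for _, spend in events)

for label, events in (("before", before_campaign), ("after", after_campaign)):
    print(f"{label}: check-ins={check_ins(events)}, revenue=${revenue(events):.2f}")

# Check-ins jump from 1 to 5 while revenue stays at $18.75 in both periods:
# exactly the distinction the check-in report glossed over.
```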

If you think it is a problem to measure the wrong things, imagine measuring the wrong things in great quantities. The sheer volume of measurements and the dramatic increase in emphasis on metrics and reporting over the past few years are partly attributable to corporate profits being squeezed by economic uncertainty. Companies are looking for any increase in efficiency. To find it, they need metrics. They need people reporting what they’re doing, how they’re doing it and what the results are on a weekly or daily basis. With this increased demand for demonstrable indicators of value-add, numbers are flying. Various reports rehash the same numbers, entered manually into multiple different systems (a spreadsheet here, an ERP system there, a slide deck everywhere). Saying the same wrong thing over and over again doesn’t make it any more correct.

Metrics are important, no doubt. How else are we to know that what we’re doing is the right thing? My caution is merely to a) make sure you’re measuring data that actually tells you something about the effectiveness of your program, and b) avoid making your people report it multiple times to different audiences in different formats! Sadly, that second point is the one least likely to be heeded.

I wonder whether anyone has collected data on how much time and labor are wasted measuring irrelevant things, drawing irrelevant conclusions, and reporting and dashboarding them to superiors in a multitude of redundant status report formats.