A crucial aspect of your organization’s agility lies in the speed at which your IT function can deliver change. Not the small, run-of-the-mill types of change, but the mission-critical delivery of the new enabling technologies, digital platforms and IT solutions that your business needs to thrive. Speed gives you a competitive edge in your respective markets, and as such the momentum of your corporate IT team stands as a key strategic enabler. Let’s be honest, however: corporate IT is often branded with all sorts of deprecatory qualifiers related to the pace at which it can deliver.
But what is corporate IT speed? And how is it measured? The answers you’ll find below are probably not what you thought, and are certainly not what you’d want them to be.
In the case of cars, trains or marathon runners, the formula is the one we’ve learned at school: distance traveled divided by the time it takes to travel that distance.
That’s why we often use kilometers per hour to gauge the speed of traveling things. All of this seems obvious because we all have a sense of what distance means: it’s part of the tangible world we live in. Same for time: even if some of us (you know them!) have an elastic conception of time, we have standardized measures and tools, such as the clock.
That’s fine for transportation, but speed can be so many other things. The “speed” at which an automobile factory produces cars is measured by the number of cars built, divided by the time it takes to build them. In the end, speed can be viewed as the measurement of some achievement divided by the time taken to reach it.
Now that we have a formula applicable to any situation, let’s try to answer the questions above (what corporate IT speed is and how it is measured). The divisor is always time, so we can set it aside for now and focus exclusively on the dividend.
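The generalized formula above can be sketched in a few lines of code. This is only an illustration; the units used in the examples (kilometers, cars, hours) are the ones from the text, not prescriptions:

```python
# A minimal sketch of the generalized speed formula:
# some measured achievement divided by the time taken to reach it.
def speed(achievement: float, elapsed_time: float) -> float:
    """Return achievement per unit of time."""
    if elapsed_time <= 0:
        raise ValueError("elapsed time must be positive")
    return achievement / elapsed_time

# A car covering 300 km in 3 hours: 100 km/h (distance / time).
car_speed = speed(300, 3)

# A factory building 240 cars in 8 hours: 30 cars/hour (output / time).
factory_speed = speed(240, 8)
```

Note that the function works only because each achievement comes with a quantitative, standardized unit: kilometers for the car, finished cars for the factory. That precondition is exactly what the rest of the argument hinges on.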
To assess IT speed, you need to know what an achievement is and be able to measure it. But to be eligible, achievement measures must have certain characteristics:
- They have to be measurable quantitatively; and
- Their units of measure must be standardized.
That’s sensible since measures of speed should not be left to qualitative interpretations and should be applicable to all solutions yielded by IT. Same for the standardization of the units of achievement, an absolute must if you want to compare speeds. After all, what’s the point of measuring speed if you cannot draw comparative conclusions?
That’s where the whole corporate IT speed thing collapses. In the case of the car factory, you count cars, but in the case of corporate IT, what are the units? There are documented units of productivity for some types of IT work, but that’s not sufficient because:
- these units vary from one work product to the other;
- they also vary from one part of your IT to the other;
- they do not cover the whole process that yields what you pay for; and
- I suspect that the processes to systematically measure them aren’t implemented.
So what is the equivalent of the cars that you count on the shipping dock of the automotive factory? The sad but true answer is that there is likely no such equivalent in your IT shop. Hence, everyone falls back on project delivery or the tangible outputs delivered through them. Speed gauges become statements such as: “We delivered the new version of the CRM in 14 one-month sprints,” or “Release 3 of system XYZ took four months to deliver, compared to six months each for releases 1 and 2.”
But you cannot fairly compare the new version of the CRM with the preceding one. What you delivered in releases 1, 2 and 3 may be quite different in their nature and size. Neither can you compare anything between system XYZ, your CRM application and the majority of the hundreds of disparate business solutions you own. Thus, this gauge of speed is not sufficient either, because the units are not standardized.
When units of achievement vary from one project or one team to another, they cannot serve as a valid measure of speed. That’s anecdotal evidence, nothing more.
Regardless, someone still needs to show that something has been provided at a certain speed. Since IT deliverables vary so much in size and nature, the only thing left to assess speed is money. You have to make the leap of faith that on average, higher-priced projects (or phases, releases, or whatever units of delivery you choose) yield more throughput. By doing so, cost actuals become a proxy to measure what has been delivered.
Assuming you can bear that assumption, the result is disconcerting: speed of delivery becomes the budget size of what has been delivered, divided by the time it took to deliver it. Factored into the formula above, it yields the following:

corporate IT speed = cost actuals ÷ delivery time
In other words, corporate IT speed is measured by the speed at which money is burnt.
Which also means that if you ask your corporate IT function to go any faster, the only thing it can do is spend your money sooner, leaving you with the onus of believing that more was achieved per unit of time. This is far from a valid measure of speed.
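To make the absurdity concrete, here is a hedged sketch with entirely made-up project figures. When cost actuals stand in for achievement, the same budget spent faster registers as higher "speed", regardless of what was actually delivered:

```python
# Hedged sketch with hypothetical figures: when cost actuals are the
# proxy for achievement, "speed" reduces to the burn rate of money.
def it_speed_proxy(cost_actuals: float, months: float) -> float:
    """Budget consumed per month -- the only 'speed' left to measure."""
    if months <= 0:
        raise ValueError("duration must be positive")
    return cost_actuals / months

# Two hypothetical projects delivering the exact same scope:
slow = it_speed_proxy(1_200_000, 12)  # $1.2M burnt over 12 months
fast = it_speed_proxy(1_200_000, 6)   # same budget burnt in 6 months

# The proxy declares the second project "twice as fast",
# even though nothing says more was actually achieved.
assert fast == 2 * slow
```

The design flaw is visible in the signature: the function never sees what was delivered, only what it cost.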
Corporate IT’s unenviable reputation with respect to pace is not unrelated to the formula above. You have within your organization a function for which speed of delivery is a critical competitive element, but it is not measured adequately.
We all know that what is not measured will not improve, and measuring it in such a grotesque way as in the formula above is like not measuring it at all.
This is the reality of corporate IT today because no one has ever had enough motivation to develop better and more accurate ways to measure throughput. Do not get sweet-talked into believing that such measures are too difficult to develop. It’s not that such measures don’t exist, nor that your IT staff doesn’t have the skills to make them happen. Furthermore, it has nothing to do with technology; rather, it is about how accountabilities are distributed and how team or personal performance measures are defined.
Next week’s article will provide more insights on what performance really means in corporate IT. My book gives a broader view of the problem and a deeper understanding of the non-technological root causes behind the poor state of speed in corporate IT.