As an IT leader, I often find myself walking a fine line: I am the company’s voice before the employees, and the employees’ voice before the company. This extends to mediating between internal parties and vendors, auditors, consultants, and other external entities as well. While not an absolute rule (and certainly not the case at my current workplace), higher leadership often pushes an IT management model that is ultimately a fallacy. Curiously enough, the other parties’ retort is also deeply flawed. Both are rooted in good intentions, but they tangle into a vicious circle that does more harm than good, even though no one overtly intends it. In a sense, a lot of IT leadership and management effort is spent bridging these two fallacies.
The Total Control Fallacy
Many traditional IT management ideas rest on the premise that you must have absolute control over software and infrastructure processes. The waterfall development model and its brethren, production release calendars, and plenty of the standard activity-tracking and reporting methods all, in my mind at least, fall into this scheme. As IT evolved into a high-profile, crucial component of the value chain across almost every company in almost every industry, it became evident that its immaturity could pose serious problems to organizations. But the drive to bring in what seem like perfectly acceptable controls and procedures from other disciplines has resulted, more often than not, in overly rigid, unresponsive IT organizations that satisfy neither their host organization nor the IT people who actually put all their effort into trying to bring about success. A few choice examples:
- Activity tracking and reporting: one usually finds Microsoft Project to be the tool of choice for tracking and reporting, but even the most buttoned-up companies find it too demanding for all but the largest projects. Therefore, much of the drudgery becomes even more “drudgerier” by virtue of having to write it up in some other task control system and then provide weekly, monthly, quarterly, and yearly summaries of achievements, roadblocks, and benefits. I’ll fully agree that it’s important to bring visibility into the processes, but it pains me to see how many hours are wasted each week at many, many IT organizations piling up data that carries only a marginal chance of being converted into valuable information. The cost of providing status rivals that of delivering value
- Estimation: a lot of what passes for estimation in project planning (both software and overall systems delivery) is actually guesswork, usually shaped by some arbitrary demand for a given date or timeframe. Highly skilled technical personnel may quickly fall into the routine of simply divining which dates upper management wants to hear, after having careful, thoughtful estimation exercises thrown out in less time than it takes to read them, let alone understand them
- Release processes: these are particularly offensive to me because, unlike the previous two, they’re brought into effect more by IT itself than by its customers. Few other areas, in my experience, lend themselves so well to attempts to solve technical problems with non-technical measures. People come to accept that “systems are hard to get right” and build all sorts of convoluted, bureaucratic procedures to release software instead of actually working to deliver value and minimize risk without stifling agility. I suspect this is what drives the kind of horror stories beautifully dismantled by Jez Humble and David Farley in their masterpiece, Continuous Delivery
I could go on and on, but the idea remains the same: too much of modern IT management is about creating the illusion of total control, with no actual chance of achieving the level of control that’s sold to the CEO and the CIO’s peers, nor of letting go of the tight grip and having bright people figure out better solutions to IT issues, much as they are expected to do with business issues day in and day out. IT is unique enough that it should be given leeway to develop its own controls and procedures, but that’s seldom the case: even within the profession, we tend to import a lot of “look-good” processes that don’t really suit us. Alas, many of these useless measures are the creation of IT and not of the business, and in truly perverse instances, IT will bend its own rules to cater to a special request from the business: if the request fails, the need for excessive rules is reinforced, and if it works, it won’t matter enough to bring the rules into question.
The Total Anarchy Fallacy
When Agile methodologies started to spread, hope shone for many a battered IT professional - and for some leaders as well, for not all process conflict emerges along hierarchical lines. Here, at last, was an encoding of, and a drive to return to, common-sense practices that had allowed teams all over the world to delight their customers and instill in their members the sense that their work was valuable, meaningful, important. However, beyond the natural challenges of changing any well-established process, Agile was done more harm than good by overzealous proponents selling their own philosophical and ideological viewpoints, and their grudges against the status quo, under a thin Agile veneer. Read this interview with David J. Anderson for a very concise and well-argued expansion of this point.
In a sense, it’s only natural. I’ve long held the view that many things in life behave like a ruler, one end of which you hold in a vise while bending the other end all the way to one side: when let go, it obviously won’t return to a centered, peacefully balanced position; it will swing wildly to the opposite edge, then back toward its former position, and only after quite a few swings will it find balance. After so many years of mismanagement and, in some cases, downright oppression, it’s only natural that IT experts see in Agile an escape valve to shed all the cruft that has been imposed on them for so long, and which actually hinders progress toward delivering value to the host organization.
But not all management is cruft. Even when the ideas are poorly implemented, they usually attempt to mitigate some risk, allow timely exploitation of some opportunity, or simply provide some basic amount of visibility into what’s being done. In my opinion, a lot of the brilliance in Agile lies not in shunning everything management needs from its teams, but in providing truly effective alternatives, where the value provided and the cost incurred are “just right”, and in getting out of the team’s way when it’s best to do so. By being radical about any given implementation, many proponents eschew basic change-management tenets, resulting in even harsher conditions as the proponents of the total control fallacy entrench their position ever more deeply.
Slowly Restoring The Ruler To The Center
What, then, are we to do? Much has been written about this, and although the point of this post is not simply to point out the fallacies, it is also not realistic to expect a silver bullet. What I’ve found to make a difference when situations like this arise is to follow a very simple rule:
Don’t marry yourself to paradigms, old or new.
It is tempting to walk into a dysfunctional organization or team, either as a new team member or as someone with authority (even if it initially stems only from a title or position), throw everything you find into a bag, and declare it useless. This is often followed by ceremoniously opening a new package, taking everything out of it, and declaring it the New, Correct Way To Do Things From Now On(TM).
I realize there may be cases where that’s exactly what needs to be done, but I believe them to be few and far between. As a developer, a team lead, a manager, or a consultant, you can’t honestly expect that you can throw out all controls, practices, and procedures, bring in Agile (or any other paradigm, for that matter), and have it (a) be fully and unquestioningly accepted, and (b) actually produce order-of-magnitude improvements in a matter of months. Only in certain crisis scenarios can something like this pan out. More often, you follow Gerry Weinberg’s excellent advice from The Secrets of Consulting: find the real, people problem behind the non-people problem they’d have you believe you’re facing, and focus on a 10% improvement that no one can deny, but that also doesn’t alienate anyone on the team. That may sound blander and more political than what many revolutionary types expect, but I find it more valuable for results than sheer bravado and slate-wiping action.
And while I can’t say for sure, I think this is more in line with what the original Agile proponents were about. A Boeing 787 is fast, but not agile; a cheetah is agile. If you want to bring flexibility into IT - the ability to change direction quickly and keep pace with the business - it’s fine to fight the prevailing fallacy. Just make sure that, along with your purported solution, you don’t bring in an equally toxic fallacy instead.
Warlin Garcia - April 28 2013 9:54 PM
I see that very often. People making decisions far from the teams actually doing the work. All of a sudden you get stuck with the “great new thing” that works everywhere else (and is the new buzzword) and that “must” work for us as well. Guess what: it may work, but, as you said, it won’t work like “plug & play”. You need to consider transition periods and, most importantly (at least to me), buy-in from the people doing the work. The lack of support and feedback from the teams most affected by the changes can make a great idea look really bad in the end.
Ramón Jiménez - April 28 2013 10:21 PM
Dead on, Warlin. And it cuts both ways: the people down in the trenches might expect management to blindly take up anything they propose because, well, they are the ones in the trenches. The fallacies at their best.
Thanks for commenting, and feel free to share!