I have a request. I hope it is not too onerous, because something is really starting to grind my gears.
Can we in IT please all stop claiming that any technology is going to kill another?
No, it won’t.
My hyperbole aside, I know this with complete and utter certainty, even though I am barely conversant in database technologies. Seriously, SQL hasn’t even killed off VSAM – first released in 1974 – which is still the foundation for a huge volume, perhaps even the majority, of our daily financial, logistics, retail, and government business. In fact, not only are we still storing data in VSAM, we are still programming in COBOL, and even doing it on 20-year-old mainframes. So realistically, an upstart like NoSQL has no chance of killing anything.
Similarly, virtualization will not kill the physical computing infrastructures that came before it. Even most early adopters are struggling to get over 50% of their servers virtualized, while the average penetration is, by some reports, as low as 16%. Meanwhile, the percentage of desktops that have been virtualized is still in single digits. In some cases, so-called “legacy” systems are actually becoming their own hypervisors (e.g. Windows, z/VM, Solaris, and Linux).
The same is true of cloud computing. Even if, as Gartner predicts, 20 percent of businesses will own no IT assets by 2012 – which I find highly dubious – and even if the cloud computing market is worth $160bn by 2011 – also somewhat dubious – the vast majority of organizations will still continue to own their IT assets. Even allowing for some substantial private cloud deployment (much less dubious), there is no chance cloud computing will kill the on-premises, installed and owned, IT environment.
Historically, this has always been true. Distributed computing never fully replaced mainframe computing. Indeed, the mainframe is actually experiencing record levels of growth, particularly among heavy mainframe users (over 500 MIPS). Personal computing never replaced distributed computing either. The Internet did not kill local computing; thin clients did not kill desktops; Firefox did not kill IE (although IE did eventually kill Netscape); Java did not kill COBOL, let alone C; disk did not kill tape; Salesforce.com did not kill Siebel; Google did not kill Yahoo; Gmail did not kill Exchange.
In fact, it is really quite rare that any new technology completely kills off any other. We in IT are the magpies of the business world, collecting and hoarding all the shiny technologies we can. These are not just collector’s items or museum pieces though; these are real, mission-critical systems and applications. So we end up with a hybrid of critical technologies spanning not just years, but decades.
(Which is why I am such a strong proponent of heterogeneous IT management … but that is another article.)
Perhaps it is just semantics, or a philosophical distaste for absolutes. Perhaps the rampant pace of IT development just makes it seem like we don’t replace technology (when of course we do).
But I would still be really happy if we could all refrain from declaring the death of any technology.
Because chances are it is simply never going to happen.