When exactly did enterprises become late adopters of technology? We know that they were among the first buyers of computers. IBM sold tabulating and later computing machines to businesses starting in the 1910s. During the 1980s it was businesses that bought PCs in significant numbers to augment, and later replace, their centralized computing resources. Networking was in use in government and in business long before consumers saw any value in it.
In my talks I often point out that if you wanted to create a near-monopoly in computing in the 1990s, all you needed was to convince 500 people to adopt your technology: the IT managers of the Fortune 500. If the largest companies used your product, they would impose the standard on all their suppliers and distributors, and pretty soon there would be no alternatives.
So what happened during the last decade or so?
Today the IT department is better known as the Information Denial department. I recall that when the DVD drive first became an option on desktop and laptop computers, IT departments were the first to decline it (presumably because it would be used for entertainment rather than work). When instant messaging first became available, it was IT departments that blocked the ports. When mobile devices gained cameras, signs went up announcing that no cameras would be allowed on company premises. When USB sticks became available, USB ports started getting glued shut. When iOS became available, no devices running it were allowed on the network. Then came Facebook, Instagram and dozens of other social media services.
This pattern of not just refusing to adopt but outright banning new technologies made enterprises fall off the radar of technology developers. Quite simply, everyone outside the supply chain into enterprises stopped developing new markets around them. From venture funds to developers, enterprises fell out of business plans.
The enterprise came to stand for “legacy” and “security,” both of which precluded mobile and other new forms of computing. Paradoxes emerged wherein an administrative assistant had more computing power in his pocket than the CEO had in her data center, and the same assistant would know what was happening faster than any of the bosses. Homes had better connectivity than offices, and productivity at small firms increased faster than at big firms. Incidentally, even the slowest enterprises were faster than the government. The bigger the firm, the slower and stupider it seemed. Were large firms employing dumb managers, or did being a manager in a large firm make you dumb?
One resolution to this paradox might be that mobility, and the movement of processing onto consumer devices, increased the cadence of product development to such a degree that the purchase cycles and dollar amounts involved fell outside the range that enterprises could absorb.
A simple way to explain it is this: a company takes longer to decide whether to purchase a device than the device stays on the market. In other words, by the time all the salespeople and committees and standards setting and golf playing and dining and site visits have run their course, the object whose purchase was being discussed has been discontinued.
A more onerous issue is that companies have procedures for accepting technologies (as capital expenditures) which require high degrees of interaction and decision making. To step through these procedures, vendors need salespeople who must invest a great deal of their time and who therefore need to be compensated with large commissions. If those commissions are a percentage of the sale, then the total sales price needs to be large enough “to make it worthwhile to all parties”. To put rough numbers on it: a salesperson earning, say, a 10 percent commission who needs a six-figure income must close at least a million dollars in deals a year. As a result, paradoxically, an enterprise technology must be sufficiently slow and expensive to be adopted.
Mobility was disruptive to the enterprise because the new computing paradigm was both too fast and too cheap to be absorbed.
This implies that the problem with enterprises is not the stupidity of their buyers. They are no less smart than the average person; indeed, they are as smart in their personal computing choices as anybody. The problem is that enterprises have an obsolete model for the use and allocation of capital. That decision process assumes capital goods are expensive, subject to depreciation, and therefore to be regulated, governed and carefully chosen. The processes built for capital goods are then extended to ephemera like devices, software and networking.
It does not help that these new capital goods are used to manage what has become the company’s most important asset: information. We thus have a perfect storm of increasingly inappropriate resource allocation applied to firms’ increasingly important processes. The result is lost productivity, ever more bizarre rules, and the prohibition of the most desirable tools.
Which brings us to the latest announcements of collaboration between Apple, the new disruptor of computing, and the vendors that supply enterprises, such as IBM and Cisco.
Apple was the loser in the standardization of computing during the 1990s but is the winner in the mobilization of computing during the 2010s. In both eras the company positioned itself around consumer computing, yet it never gave up on the enterprise.
Apple’s approach seems to be to enable the larger suppliers of technology to enterprises to bundle iOS devices into the acceptable set of services and products. In essence, Apple is complying with the requirement to be slow and expensive in order to be adopted. It can maintain its own cadence of product development while attaching itself to the purchase cycle of the enterprise.
In a way it’s like an automatic transmission in a car. Operating through gears, the engine can rev at a different rate than the wheels turn. Occasionally a shift happens, but the fluid coupling keeps both the engine and the wheels from absorbing any damaging shocks.