Bill Gates retired last week from daily management of Microsoft, after a 33-year career to rank alongside those of Thomas Edison and Henry Ford. But those who identify him mainly with the personal computer, the machine he loved and sought to put in every home and on every office desk, underestimate the role he played as system builder.
A visionary who recognized practically from the beginning the gains that would accrue from an arrangement in which computers of all sorts would work smoothly with one another, Gates also resembled William Forbes and Theodore Vail, the men who in the late nineteenth century put together the Bell telephone system – part technological genius, part business strategist, part social engineer.
If the architect of so ubiquitous a new infrastructure hadn’t accumulated great wealth, it would have been surprising indeed. And Gates became, for thirteen years, the world’s richest man. Even today, with Microsoft’s market value well off its peaks, his fortune is estimated to be around $50 billion, still enough to put him third on Forbes’ Rich List, just behind Warren Buffett and Mexican telecommunications magnate Carlos Slim Helu.
There is, however, another aspect to Gates’ career. For all the glory, it is a story of defeat. The Internet that today connects all those computers is not at all the world that Gates imagined, any more than the contemporary telephone network maps onto the designs – if anyone can remember them – of AT&T, the giant corporation once known as “Ma Bell.”
The personal computer was little more than a glimmer when in 1975 Gates quit college to write a language program for the Altair 8800 hobby computer – on paper tape. In a lengthy series of adroit moves, he wrested leadership of the personal computer industry from its dominant firm, IBM Corp., then from its pioneer, Apple Computer, and, in due course, from minicomputer manufacturers led by Digital Equipment Corp. He was the first to understand that software was the key to interoperability.
Each of these industrial leaders had an opportunity to take the lead and failed to act decisively. At every juncture, Gates seized the day. Sometimes, as when he sold IBM an operating system that he did not yet possess, his advantage was mainly daring; other times, as with the 1990 introduction of the applications programming interface (an innovation designed to stimulate while still controlling developers), technological acumen ranked high. Grit counted for much; so did boisterous high spirits.
Then came the “Internet tidal wave.”
It’s a famous story: How in 1993 Microsoft president Steve Ballmer returned to Harvard College to find that undergraduates who twenty years before had been excited about personal computers now could only talk about “chat” – bulletin boards and discussion groups squirreled away behind impenetrable addresses on the Internet. How Netscape stole a march with its new-fangled “browser,” designed to prowl the newly implemented World Wide Web with application interfaces of its own. Perhaps it would become a network platform, an alternative to Microsoft’s Windows system, which was then in use on 90 percent of the world’s PCs. How Gates identified in a famous memo, “The Internet Tidal Wave,” the threat it posed and, a few months later, on Pearl Harbor Day 1995, declared war on Netscape with a browser of Microsoft’s own.
Netscape crumbled in short order, but the chief result was a ferocious US government antitrust suit.
The Internet and the Web, it turned out, had been put together in a very different way from the Microsoft/personal computer empire. It was a technology developed by university scientists and engineers, corporate executives and government bureaucrats, working together far from the limelight in voluntary committees and groups, spinning out vertically disintegrated companies to manufacture routers, servers, switches, fiber-optic cable. The applications the engineers devised – email, file-sharing and search technology – proved to be more popular and customer-friendly than any of the myriad applications created for the Microsoft desktop.
Gates crippled Netscape, but he lost the subsequent antitrust case. His company was judged to have abused its monopoly position. On appeal, he won a reversal of the government’s plan to break Microsoft into two pieces, then outlasted the US Justice Department after the deadlocked election of 2000 was decided in favor of George W. Bush. The 9/11 attacks rendered the whole business ancient history.
But by then the Web had changed everything. Google’s search and advertising-based business was merely the most visible aspect; behind the scenes, far from metropolitan centers, were vast server farms of linked computers, a network “cloud” of processing power and data storage that was slowly replacing the awkward strength expensively packed into every PC. The desktop device that had so enchanted the youthful Gates, was, like the telephone, on its way to becoming a dumbed-down terminal to connect to a Web where tailored services of every imaginable type were increasingly available on a more economical basis. And in this new world Gates’ company was far from nimble. A particularly good survey of these developments can be found in The Economist’s briefing, After Bill. Its leader writer put it this way: “Watching Microsoft in the company of Google and Facebook is a bit like watching your dad try to be cool.”
Exit Gates, 52, to philanthropy, pledged to eradicating malaria – a mission he could in fact hope to accomplish.
There are two equally valid ways to read this story. One is strictly personal, from the perspective of the business magazines and cinema westerns – that the innovator is inevitably a transitory figure, that Gates, as The Economist put it, “was perfectly suited to his time – but he is less well equipped for the collaborative and fragmented era of internet computing.” The other is as social policy – the recognition that there’s more than one way to skin a cat, and that well-modulated social innovation often achieves better results than entrepreneurial zeal.
It’s hard now to recall the demoralized atmosphere in the United States in which Gates began his career in the early 1970s – Vietnam, Watergate, Harold Geneen, inflation, OPEC, The Club of Rome, all that. That he and Ballmer, his college chum, could together take on IBM and win, was widely taken to signify that individuals once again were autonomous, that start-ups were the engines of economic growth, that truly it was “morning in America” again. Never mind that they were privileged kids from private schools.
So it is no small irony that the technology that eventually blunted Gates’ drive had been imagined by Defense Department bureaucrats; designed by a free-floating intelligentsia of university scientists and corporate engineers; characterized at every juncture by open instead of proprietary standards; developed by government contractors; paid for by Congress; privatized, if not exactly invented, by legislators led by Al Gore. For all that, it remained nearly invisible until it threatened to engulf, not only Microsoft, but the vaunted Baby Bell telephone companies as well, in a single amazing wave. Not only had all this group-think created an enormous and highly profitable Internet industry, but its services were cheaper and easier to use than those clunky PCs with their shrink-wrapped software.
Microsoft, like IBM, is a great company. Gates is a thoroughly remarkable man, who is likely to leave a lasting mark on philanthropy. But like Ford and Edison, he was unable to enter the technological Promised Land – in this case, of cloud computing, of Amazon and Google – even though he played a key role in bringing it about. Ford and Edison wound up cranky, hanging out together in their Florida resort. Bill Gates learned from his defeat.