The history of the personal computer’s infiltration into almost every home is an improbable one.
The Xerox corporation developed a prototype personal computer, the Alto, as early as 1973, complete with a graphical user interface and a handheld pointing device, the mouse (adapted from Douglas Engelbart’s earlier work at the Stanford Research Institute). The company shelved any plan to sell it, judging the market to be too small. The first successful personal computer was marketed by a small startup, Apple Computer. But the Apple II remained a boutique product, purchased only by those with an avid interest in computing.
The PC became attractive beyond this demographic only when a relatively low-cost model was introduced by International Business Machines, the behemoth whose origins stretch back to the 1880s, long before the invention of the electronic computer itself. IBM’s predecessors originated the punched-card method of high-speed tabulation in the nineteenth century, and the company (which was not renamed International Business Machines until 1924, taking the name from its Canadian subsidiary) successfully exploited the latest innovations in information technology, before and after the invention of the modern computer, to become one of the largest companies in the world (complete with its own company town).
From punched cards, IBM moved on to the mainframes of the 1950s and ’60s, with almost all of its business going to government or other large corporations. At the beginning of the 1980s, its executives sensed a great opportunity in shifting to the consumer marketplace. The company, with so much capital at its disposal, simply updated the old industrial method, going back to long before Henry Ford, of manufacturing by assembly line on a scale great enough to make the device affordable to the workers who manufactured it. The IBM personal computer, “the PC” as it came to be known, was judged inferior to its main competitor (the Apple, and later the “Mac” or Macintosh), but its relative cheapness proved unbeatable in the mass market.
Moreover, the PC’s software was “open,” to the degree that any startup firm could create programs for it (again unlike the Apple, which came with most common software pre-loaded). Apparently, however, IBM, a company with several hundred thousand employees, did not have the expertise on hand to quickly create the necessary software (what came to be known as the disk operating system) to make a personal computer functional. After being turned down by Digital Research, maker of the then-dominant CP/M operating system, IBM turned to an obscure Washington state firm, Microsoft.
That company’s president, Bill Gates, was no computing genius, but he had great business sense. His firm did not, however, actually have the necessary software, nor, apparently, the expertise to write it. Microsoft discreetly purchased another firm’s “quick and dirty” operating system (QDOS, from Seattle Computer Products), patching it up as best it could before passing it off to IBM as its own work. IBM was not able to purchase MS-DOS outright, however, instead licensing it from Microsoft. It was thus that Gates was able to leverage his very small firm into what became, by market value, the biggest corporation in the world, ultimately dwarfing IBM. It was, again, old-fashioned business methods, rather than the superiority of the product on offer, that allowed Microsoft to become such a monstrosity, and Gates the world’s richest man.
Very different from the “new economy” described by some, Gates’ rise was almost a parody of the saga of the robber barons of the Gilded Age. Microsoft not only managed to achieve a monopoly in a key product, but Gates (like Carnegie and Rockefeller) ultimately turned to philanthropy in penance for his fifty-billion-dollar fortune.
The Macintosh computer, the first successful graphical-user-interface PC, is held in reverent esteem for its user-friendliness. The great costs of designing and manufacturing the GUI PC nearly bankrupted Apple Computer, however, such that its bohemian operatives were compelled to accept the leadership of an old business hand (John Sculley, recruited from Pepsi), someone heretofore uninvolved with the computing industry. Thereafter, the company’s co-founder, Steve Jobs, was forced out. Again, however, the ultimate standard for the graphical-user PC was not the Mac but the IBM model with its substandard Microsoft Windows operating system, which came out a couple of years after the Mac was first marketed in 1984. Mass manufacturing won out over the boutique model, once again.
Given the plain facts of the development of the computer industry in the last few decades, it is hard to understand the credence given to the notion that computing will somehow overcome the difficulties associated with the “bricks and mortar” economy. The computer itself became a staple precisely through economies of scale and standardization of product, the old-fashioned methods that information technology was supposedly going to supersede. For many years now, IBM has been but a minor player in the personal computer hardware market. Its place was taken by other manufacturers, such as Dell Computer, which followed the IBM model by manufacturing on a mass scale (often in low-wage countries such as Mexico).
The new-economy utopians somehow convinced themselves that information processing would, on its own, cause the lion to lie down with the lamb, and that the conflicts and stresses associated with the “industrial” age would disappear. As to the value of software as opposed to hardware (i.e., “bits and bytes” as opposed to “bricks and mortar,” as though the latter were the most advanced material of the industrial age), software became valuable only when the hardware on which it runs was mass-produced cheaply enough for the common household. The biggest computer companies in the world (IBM, Intel, Cisco, Hewlett-Packard, Dell) are makers of hardware, not software. The exception is, of course, the biggest company in the world, Microsoft. But, as mentioned, Microsoft became so large because it had (and maintains) a proprietary hold on the computer operating system, essentially the interface between the hardware and software of the personal computer.
Besides, as the Economist noted in the 1990s, a good chunk of Microsoft’s profits has come from the sale of hardware, such as mice and other peripherals (which operated best, naturally, under the MS-Windows system). Initially, independent software producers such as WordPerfect and Lotus were able to make millions off their “killer” applications (in word-processing and spreadsheets, respectively). But inevitably, Microsoft introduced copycat programs of its own, which eventually marginalized both WordPerfect and Lotus 1-2-3 as the standard applications. Because Word and Excel (the MS spreadsheet program) were integrated into the Windows programming, they were easier to learn and manipulate than either WordPerfect or 1-2-3. However, after the introduction of Windows 95 (code-named “Chicago”) in 1995, direct sales of operating-system software became relatively small. Like the IBM PC fourteen years earlier, Windows 95 was launched with a massive advertising campaign, which included the multimillion-dollar licensing of the Rolling Stones’ “Start Me Up.”
Mostly, Windows 95 and its successors came pre-loaded on virtually all PCs sold anywhere in the world, and the fee paid for this privilege was incorporated into the cost of the computer itself. In fact, software became less and less valuable the more accessible computer hardware became during the 1990s. Owing to software “piracy,” the number of people using any particular program was far greater than the number who actually purchased it, directly or through the purchase of a computer. Other innovations in computer hardware, involving the digital recording (“ripping”) and burning of writeable compact discs, made not only software programs easily distributable but also left commercially sold compact discs open to piracy.
When the resulting digital encoding of songs was made available through the peer-to-peer networks of the Internet, it ultimately caused the collapse of the value of the music and movie industries’ software in the general marketplace. Software producers have gone to great lengths to guard against piracy, often causing problems with the programs themselves. The failure of Microsoft’s ballyhooed Windows Millennium operating system was due in large part to problems caused by its excessive safeguards.
A publicly accessible Internet became possible only after computing was made a household appliance via traditional methods of manufacturing and marketing. As mentioned, the Internet resulted from the needs of the U.S. defence department to build a computer network that was (relatively) safe from enemy nuclear attack. The Internet is, at its base, a triumph of hardware, not software, with its revolutionary method of fragmenting data into packets, routing them independently, and then reassembling them between any two nodes on a network.
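The fragment-and-reassemble idea can be sketched in a few lines of Python. This is only a toy illustration of the principle, not actual Internet protocol code; the function names and the chunk size are invented for the example:

```python
# Toy illustration of packet switching: a message is split into
# numbered packets, which may arrive out of order, and is then
# reassembled at the destination using the sequence numbers.
import random


def fragment(message, size):
    """Split a message into (sequence_number, chunk) packets."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return list(enumerate(chunks))


def reassemble(packets):
    """Restore the original message regardless of arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))


packets = fragment("packets can take different routes", 8)
random.shuffle(packets)      # simulate out-of-order arrival
print(reassemble(packets))   # prints: packets can take different routes
```

In the real network, of course, the packets also carry addressing information and travel through many intermediate routers, but the essential trick, numbered fragments that can be put back in order at the far end, is the same.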
Again, without the subsidy provided by the government and universities, this hardware would not have been developed at all. Moreover, it took twenty-five years after its invention before the Internet was made commercially available to the general public. At first, commercial network service providers such as CompuServe and America On-Line resisted the adoption of Internet technology and protocols. Even Microsoft had, at first, planned to construct its own proprietary network when setting up its online service.
The tiny, mostly local Internet service providers that sprang up in major centres in North America and Europe around 1994 (when the development of the World Wide Web made going online a graphic experience) were remarkable for not being very profitable at all. Many of these startups went belly-up, or merged into ever-bigger regional and national companies; eventually, long-established telecoms and cable-TV firms swallowed up most of the independents. As for the hardware side, the market for Internet equipment was long dominated by one company, Cisco Systems, which continues to hold the lion’s share even now. This dominance of each niche of the computer industry by a single entity, whether Cisco, Microsoft, or Intel, is of course more akin to nineteenth-century monopoly capitalism than to the twenty-first-century vision offered by the new-economy prophets.
There is, in fact, good reason why virtual-monopoly concerns came to predominate in the computer industry, on both the hardware and software ends. As the Economist also noted at the beginning of the tech boom, widespread networked computing depends upon the adoption of an operating standard. Thus, either all players had to agree on hardware and software standards at the outset (which, as we know, they did not), or a single commercial provider would come to dominate a market so completely as to shut out all other players (which is what in fact occurred).
During the nineteenth century, the cutthroat practices of Gould, Rockefeller, Edison and the other robber barons made thousands of goods and services available to the common lot. Would the oil, railway, electrical, telephone and other industries have been better served by a situation of perfect or ruinous competition? That is what in fact prevailed in the early years of most machine-technological industries from the nineteenth century on. Engineered technology, by contrast, seems to promote oligopoly, and even monopoly; this was even more true of computer engineering than of “industrial era” technologies such as the railway and the motorcar. During the last quarter of the twentieth century, as advances in computer engineering led to ever-fresh opportunities for commercial exploitation, hundreds of thousands of startup firms came and went, with a relative handful, such as Microsoft, becoming behemoths. Monopolists such as Bill Gates and Andy Grove made computing available to the masses, for better or worse.