Size, Speed, and Price
The desktop personal computer has become the dominant computer type for business and home use. Hardware has grown in size in order to accommodate more devices, provide more storage capacity, and generally enhance performance. In the mid-1980s, the typical PC consisted of a 12-inch (30 cm) monitor, a keyboard, and a central processing unit (CPU) that accommodated a 20-Mb (megabyte) hard-disk drive and a 5.25-inch (13 cm) floppy-disk drive, and had a number of ports (connection sockets) for optional peripheral devices. The printer was the most common peripheral device. By the late 1990s, 14- and 17-inch (36 cm and 43 cm) monitors were standard, in order to provide improved display of pictorial content. The CPU, usually in a tower format, typically accommodated a several-gigabyte hard-disk drive, a 3.5-inch (9 cm) floppy-disk drive, a CD-ROM (compact disc read-only memory) or DVD (digital versatile disc) drive, and a modem. It had sufficient ports to take a range of peripheral devices, including scanners, digital cameras, CD writers, loudspeakers, and extra storage drives, such as the Zip or Jaz drives. The CD-ROM, introduced in 1984, has become the standard format for applications software, games programs, and educational software such as multimedia encyclopedias. Recordable CDs (CD-Rs), requiring a CD writer, became available in 1990. Introduced in 1995, the DVD can hold a full-length motion picture.
However, the key determinants of performance, the microprocessor and the RAM chip, have grown in processing power, speed, and capacity without growing in size. Since its introduction in 1993, the 32-bit Intel Pentium chip, the dominant PC microprocessor chip, has evolved from running at a speed of 60 MHz (megahertz) to 600 MHz. A 64-Mb RAM chip is now regarded as no more than average. Therefore, while desktop models dominate, portable computers are now available in sizes ranging from the palmtop to the notebook. In the 1970s, portability was more of a relative concept. The first portable computer, the Baby suitcase computer of 1976, was a CPU without a monitor, like the Altair desktop. Even by 1981, when the Osborne I portable computer was introduced, portable computers were still suitcase-size and referred to as luggables rather than portables. Compaq, which introduced a portable PC in 1982, was the first company to really focus on the portable computer market. By 1986, the portable computer had shrunk from the luggable to the briefcase-size laptop, and by 1989, from the laptop to the thinner notebook. The notebook is the smallest type of PC that retains full functionality; it can accommodate hard-disk, floppy-disk, and CD-ROM drives as well as an internal modem. Subnotebooks and hand-held palmtops or personal organizers economize on size by having limited data storage facilities and small keyboards, but can transfer data to desktop or notebook computers by wired or infrared linkages. Some palmtop computers, such as the Apple Newton, introduced in 1993, omit the keyboard entirely and instead allow input to be written onto an LCD “notepad” using a stylus. They have built-in handwriting-recognition software.
Prices have continued to fall, thanks to economies of scale, increases in production efficiency, and competitive market forces. While big American manufacturers, including IBM, Dell, and Compaq, still have sizeable market shares, the nature of computer retailing has allowed small companies to prosper. Purchasing direct from the manufacturer or from computer warehouses via mail order or electronic commerce has become a significant feature of the personal computer trade. Although personal computers may be ostensibly made by European, Canadian, or U.S. companies, in this context “making” means assembling, and the majority of the manufacturing process takes place in the Far East. Japanese companies, such as Toshiba, Sony, and Fujitsu, have been particularly successful in the portable computer market. Portable computers continue to be significantly more expensive than desktop models offering equivalent performance. This largely reflects the relatively high cost of flat liquid crystal display (LCD) screens in comparison with conventional cathode ray tube-based monitors.
Computer Software
The key to the mass-market success of the microcomputer lay, not in the hardware itself, however small or cheap it became, but in the development of a range of generic applications. Compatible computer hardware meant that there was a huge incentive for companies to develop software that would enable users to exchange data easily. From the beginning to the present day, the business-software industry has been dominated by American companies. The first mass-market applications provided the means of computerizing the tasks that were common to all businesses—accounting and word-processing. In 1979, Software Arts introduced VisiCalc, the first commercial spreadsheet, to run on the Apple II computer. Spreadsheets create files in the form of tables in which numerical data can be sorted and manipulated. Launched in 1982, the Lotus Development Corporation’s 1–2–3 application for PCs, which added database and graphics display functions to the core spreadsheet functions, soon became the market leader. In the same year, Ashton-Tate released dBASE II, the first commercial relational database. While the pre-PC WordStar was the first commercial word-processing package, WordPerfect—launched in 1982—became the market leader of the 1980s.
Personal Computers
In terms of mass-market potential, the problem with the microcomputer industry in the late 1970s was the proliferation of incompatible machines. No company was able to establish a sufficiently large market share to shape the direction of microcomputer production. IBM initially adopted a disdainful approach to the nascent microcomputer industry. However, once the demand for single-user computers became evident, IBM entered the market in 1981 with the launch of the 5150 PC. The key features of this IBM PC were an Intel 16-bit microprocessor, 64K RAM, and the Microsoft Disk Operating System (MS-DOS). IBM appropriated the term “personal computer,” which—shortened to PC—became used to describe the system architecture. Reputation, marketing channels, and immense research and development resources soon gave IBM a decisive competitive edge in the business market, in spite of its relatively high prices. In 1983, IBM introduced an upgraded PC, the 5160 PC XT, which had a hard-disk drive as well as a floppy-disk drive, and the cheaper IBM PC jr, aimed at the home consumer. (The floppy disk had been introduced as a convenient portable storage medium in 1971.) By the end of 1983, IBM had sold 800,000 PCs. In 1984 came the IBM 5170 PC AT, which introduced the 16-bit ISA (industry standard architecture) data bus, which accelerated the flow of data.
PC architecture was soon cloned by other companies to create a range of IBM-compatible models. At first, would-be imitators had to use the practice of “reverse engineering,” whereby they deconstructed an IBM PC to analyze its technical design. This became unnecessary when IBM decided to publish its system architecture in order to encourage software companies to develop PC applications and thus stimulate the growth of PC ownership. While IBM achieved its goal of making the PC the industry standard for microcomputers, it lost out in terms of computer sales to companies making cheaper clones. For example, the British Amstrad PC1512 personal computer, introduced in 1986, was both cheaper and faster than the IBM PC. In the United States, Compaq, a spin-off from Texas Instruments, was so successful with its IBM clones that in 1986 it superseded Apple as the fastest-growing American corporation ever.
The Home Computer Arrives
In 1975, Micro Instrumentation and Telemetry Systems (MITS), a small firm based in Albuquerque, New Mexico, introduced the world’s first microcomputer, the Altair 8800. Lacking its own monitor and keyboard, the Altair 8800 was intended for the serious home enthusiast. Bill Gates (William Henry Gates III) and Paul Allen developed a modified version of the BASIC programming language for the Altair. They registered the Microsoft trade name in November 1976 to market the new language as MS-BASIC. Steven Jobs and Stephen Wozniak, two computer enthusiasts based in Silicon Valley, the heart of the semiconductor industry, were inspired by the example of the Altair. Using a cheaper 8-bit microprocessor, the MOS Technology 6502, they built their own microcomputer. Encouraged by the response of fellow enthusiasts, they began small-scale production of the Apple I computer in 1976. Snubbed by the companies to which they offered the commercial rights, but convinced of the commercial potential of the microcomputer, Jobs and Wozniak raised venture finance and set up Apple Computer in 1977. The Apple II computer, the world’s first commercial microcomputer, had generated $2.5 million in sales revenue by the end of the year.
The immediate success of the Apple II energized the computer industry. Other companies, particularly calculator manufacturers, were quick to see the potential of the standalone, desktop computer and began to develop rival products. Like Apple, they hoped to appeal simultaneously to the potential home user and the small business. The U.S. company Commodore Business Machines, founded by Jack Tramiel in 1958, introduced the PET 2001 only two months after the launch of the Apple II. By 1980, a number of U.S. companies were producing microcomputers (all of which were mutually incompatible) and companies such as Epson were selling compact, cheap printers to complement microcomputers. In Britain, Clive Sinclair, developer of the first pocket calculator, introduced the Sinclair ZX80 home computer in 1980. The ZX80 became the cheapest microcomputer on the market. It was designed to use a television set as a display screen rather than a dedicated monitor.
The fall in the price of microcomputers was largely due to the astonishing decrease in the costs of microchip manufacture. No other industry has matched the semiconductor industry for sustained reduction in costs coupled with faster performance. While U.S. companies such as Intel and Motorola dominated the microprocessor market, Japanese companies such as Fujitsu and NEC (Nippon Electric Company) began to make major inroads into the memory-chip market. In 1970, Intel’s first RAM (random access memory) chip had a mere 1K (kilobit) capacity. Over the next decade, the capacity of RAM chips rose to 4K in 1973, 16K in 1976, and 64K in 1979. Japanese manufacturers were able to rapidly penetrate the memory-chip market by taking an approach different from that of the U.S. memory-chip companies. Instead of investing time trying to get more memory on the same size of chip, they opted for the simpler approach of making bigger chips. They also championed the CMOS chip design, which consumed less power than the NMOS chip and was more resilient.
Microelectronics
While minicomputers widened the market for computers, they were still too expensive and complex for small businesses, let alone individuals. For computers to be brought within the reach of a mass market, they needed to become still smaller, cheaper, and easier to use. The next advance in fundamental electronic technology after the transistor was the integrated circuit. While it had taken the research resources of the world’s largest company, AT&T, to invent the transistor, within ten years transistor manufacture was dominated by new specialist semiconductor companies. The first integrated circuit was created in 1958 by Jack Kilby of Texas Instruments. It consisted of five components on a single germanium chip. A year later, Robert Noyce of Fairchild Semiconductor, founded in 1957, produced the first planar transistor. The planar process involved oxidizing a silicon wafer, coating it with a photosensitive material, photographing a pattern onto it and etching the pattern into the oxide, washing off the coating, and selectively introducing impurities. It was a repeatable process that enabled complex circuits to be built on a silicon wafer. By 1970, the price of an integrated circuit, also known as a silicon chip, had fallen from about $30 to $1, and an integrated circuit might contain up to 100 components.
The use of integrated circuits meant that the printed circuit boards of devices such as calculators became much more compact. Integrated circuits began to be used in computers in the late 1960s, but the central processing unit of a computer required thousands of integrated circuits. In 1968, Robert Noyce cofounded Intel, which began to develop large-scale integrated circuits. While Noyce had predicted that the way forward would be to fit the whole central processing unit onto a single chip, it was one of his employees, Ted Hoff, who actually achieved that. Hoff developed the Intel 4004 chip, the world’s first microprocessor, which made the pocket calculator possible. In terms of mathematical processing power, the Intel 4004 chip was virtually the equivalent of ENIAC. However, its limitation was that, as a 4-bit chip (meaning that it could handle four binary digits simultaneously), it could define only 16 distinct 4-bit values, too few to encode alphabetical characters. The IBM 7030 computer of 1961 had established the 8-bit character, or byte, as the standard for general computing. Intel launched its first 8-bit microprocessor, the 8008 chip, in 1972, followed by the improved 8080 chip in 1974, paving the way for the first generation of home computers. The 8-bit chip could define 256 different 8-bit characters, enough for a full set of letters, digits, and symbols.
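The character-count arithmetic above follows directly from the word width: an n-bit word can represent 2^n distinct values, which gives 16 codes for 4 bits and 256 for 8 bits. The short C++ sketch below (an illustration in a modern language, not period code) makes the calculation explicit:

```cpp
#include <cstdint>
#include <iostream>

// An n-bit word can represent 2^n distinct values (codes).
constexpr std::uint64_t distinct_codes(unsigned bits) {
    return std::uint64_t{1} << bits;
}

int main() {
    std::cout << "4-bit chip: " << distinct_codes(4) << " codes\n";  // 16: too few for letters and digits
    std::cout << "8-bit chip: " << distinct_codes(8) << " codes\n";  // 256: room for a full character set
    return 0;
}
```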
Early Computing
While punched-card calculating machines proved an effective means of speeding up lengthy tabulations, they were not suitable for carrying out more complex mathematical tasks, such as solving differential equations. In 1876, the Irish physicist Sir William Thomson (later Lord Kelvin) put forward the concept of a mechanical differential analyzer for solving differential equations. However, as with Babbage, Thomson’s ideas were too advanced for contemporary engineering capabilities. The idea of the differential analyzer resurfaced in the 1920s and 1930s. In the mid-1920s, the American scientist and electrical engineer Vannevar Bush began work on a mechanical-electrical differential analyzer, which he called the product integraph. As the product integraph could only solve the simplest differential equations, in 1930 Bush began to develop a more complex differential analyzer that could handle eighteen independent variables.
The leading European computing pioneers of the 1930s included Douglas Hartree, who constructed the first British differential analyzer, and Konrad Zuse, a German engineer who built the first binary calculator, fed by a punched-tape reader, in 1938. The significance of Zuse’s work is that it laid the foundations for digital computing. Earlier mechanical and electromechanical calculating machines were analogue computers, meaning that each of their components yielded a range of values, which combined to produce a result. Zuse’s binary calculator was based on the binary algebraic method of the nineteenth-century English mathematician George Boole, who demonstrated that equations could be reduced to a series of true or false propositions. This is known as Boolean logic, and in binary code the values of 0 and 1 are used to represent false and true. The advantages of the binary system became more apparent when electronic computers were developed in the late 1940s. The binary system lends itself perfectly to circuits where the state at any point depends on the presence or absence of a pulse of current or the low or high voltage of a component. A long series of two-state electronic operations is much simpler to engineer reliably, and much more flexible in terms of program routines, than fewer operations with many possible values. In 1939, John V. Atanasoff and Clifford Berry of Iowa State University built the world’s first electronic calculator, which stored binary data on rotating drum memory.
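To illustrate how Boolean logic maps onto two-state circuits, the following minimal C++ example (a present-day illustration, not code from any machine described here) treats each proposition as a value that is either 0 (false) or 1 (true) and combines propositions with the basic Boolean operations:

```cpp
#include <iostream>

int main() {
    // Two elementary propositions, each either false (0) or true (1).
    bool p = true;   // e.g., "a current pulse is present"
    bool q = false;  // e.g., "the component is at high voltage"

    // The basic Boolean operations; each result is again just 0 or 1.
    std::cout << "p AND q = " << (p && q) << '\n';  // prints 0
    std::cout << "p OR q  = " << (p || q) << '\n';  // prints 1
    std::cout << "NOT p   = " << !p << '\n';        // prints 0
    return 0;
}
```

However complex the expression, the result is always one of the same two states, which is exactly the property that made binary circuits easy to engineer reliably.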
Intelligence and Interoperability
In 1981, the Japanese government announced the launch of the Fifth-Generation Computer Project. While the preceding four generations were defined by their core electronic characteristics—valves, transistors, integrated circuits, and microprocessors—the fifth generation was a more holistic concept. The objective of the project was to develop computers with artificial intelligence over a ten-year period. The idea of artificial intelligence was not a new one. The term came into use in the mid-1950s, and long-term research had been undertaken in the United States at universities including Stanford and Carnegie-Mellon. However, the focus had been much narrower and progress had been limited. While the algorithmic logic advocated by Von Neumann was expressed in a sequential data processing architecture, artificial intelligence required a parallel processing architecture, which was more akin to the heuristic reasoning patterns of the human brain. Heuristic reasoning is the process whereby we draw on our knowledge of many things to infer solutions to problems. The success of the solutions depends on our expertise, in terms of the quality and range of our knowledge. By 1980, only the most powerful computers, known as supercomputers, used parallel processing. In 1965, the U.S. company Control Data Corporation introduced the first supercomputer, the CDC 6600, designed by Seymour Cray, who went on to set up Cray Research, which became the leading producer of supercomputers. These computers were designed for complex tasks such as weather and aerodynamic modeling.
Aside from artificial intelligence, the other significant strategic trend of the 1980s and 1990s has concerned the development of software that allows greater interoperability of computers. While hardware compatibility is one way of achieving interoperability, it was evident that total hardware compatibility was extremely unlikely to occur. UNIX, the pioneering hardware-independent operating system, had provided a means of establishing a common platform across a network of nonmatched computers. In the early 1990s, another operating system on similar lines, Linux, named after its Finnish inventor Linus Torvalds, was released as “open source (code),” a term for nonproprietary systems that are made freely available. Further developments in open systems were stimulated by the introduction of the World Wide Web in 1993. As the whole concept of the Web is that it is accessible to all computers irrespective of platform (hardware and operating systems), it fostered new languages and applications. It has become accepted for Web applications, such as browsers and document readers, to be made available as freeware. One of the first of these applications was Adobe’s Acrobat Reader, which allows documents to be downloaded onto any platform. The leading Web browsers, Netscape Navigator and Microsoft’s Internet Explorer, were introduced as freeware in 1994 and 1995, respectively. Released in 1995, Sun Microsystems’s Java programming language has become the main language for Web applications.
The Mainframe and Mini Computers
In 1948, IBM decided not to manufacture computers commercially, believing, based on market research, that expense and size were prohibitive factors. Howard Aiken, who had collaborated with IBM on the Harvard Mark I, remarked in 1950 that he could not ever see the need for more than six computers in the world. However, other scientists who had built prototype computers thought otherwise and assisted in the development of commercial models. The Manchester University team collaborated with the Manchester-based electrical engineering and electronics company, Ferranti, to create the first commercial computer, the Ferranti Mark I, launched in 1951. Eckert and Mauchly set up in commercial partnership in 1947, but sold their business to Remington Rand three years later. They developed the first commercial American computer, UNIVAC, for Remington Rand in 1951. The original UNIVAC model, supplied to the U.S. Bureau of the Census, was the first computer to use magnetic tape for storage. More unusually, the Cambridge University team entered into collaboration with the catering company J. Lyons, which operated a chain of tea shops, to develop the LEO (Lyons Electronic Office) computer for processing business data.
IBM soon reassessed its position and in 1952 introduced its first commercial computer, which was also its first electronic computer, the model 701. IBM soon acquired a reputation for innovation in computing and overtook Remington Rand’s early lead in the U.S. computer market. It recognized that the high power consumption (up to 100 kilowatts) and heat output of valve computers were disadvantageous, causing valves to burn out too frequently and creating uncomfortable working conditions. The alternative to the valve was the smaller and more resilient solid-state transistor, invented in December 1947 by John Bardeen, Walter Brattain, and William Shockley at Bell Laboratories of AT&T. At first, commercial transistor production had a high failure rate, so transistors were expensive. As an intermediate measure, IBM began to manufacture hybrid computers incorporating valves and transistors, which brought some gains in size and power reduction. The experimental hybrid model 604 computer, built in 1954, led to the commercial model 608 computer of 1957. IBM contracted Texas Instruments, a company that began as a manufacturer of geophysical instruments in the 1930s and moved into the semiconductor industry in the 1950s, as its transistor supplier. Two years later, the IBM model 7090 computer was fully transistorized. Reductions in size were not only beneficial to customers in terms of space savings, but also increased the speed of data processing because the electric impulses had less distance to travel.
Computer storage capacity also improved during the 1950s. In 1953, Jay Forrester of the Massachusetts Institute of Technology installed the first magnetic core memory in the Whirlwind computer, which had been developed specifically for the U.S. Navy in the 1940s. IBM’s contract to develop a successor to the Whirlwind, the SAGE computer of 1956, provided the opportunity to work on magnetic core memory and magnetic drum storage. The magnetic drum evolved into the magnetic disk. In 1957, IBM’s 305 Random Access Method of Accounting and Control (RAMAC) became the world’s first commercial computer disk storage system. In the 1950s, there was no concept of generic software, as each computer was programmed to perform the specific tasks required by the individual client. The programming process was simplified by the development of high-level computer languages that were designed for particular programming purposes. The high-level languages were supported by interpreter or compiler programs, which translated the language into binary machine code. The first of these languages, introduced by IBM in 1956, was FORTRAN (FORmula TRANslation), which was intended for scientific and mathematical programs. For business applications, COBOL (COmmon Business Oriented Language) was introduced in 1959.
These large computers running specialized programs became known as mainframe computers. IBM had sold 1,800 mainframe computers by 1960 and 12,000 by 1964. IBM’s sales philosophy placed great emphasis on a continuing close relationship with customers. However, by the early 1960s, it became clear that smaller customers might favor a more generic approach. In 1965, the American company Digital Equipment Corporation (DEC) introduced the PDP-8, the world’s first minicomputer. The launch of the more generalist minicomputer came at much the same time as the development of the first general-purpose computer language, BASIC (Beginner’s All-purpose Symbolic Instruction Code), in 1964. BASIC was written by John Kemeny and Thomas Kurtz at Dartmouth College. IBM did not immediately embrace the change in business strategy that the minicomputer represented, as it had too much invested in its mainframe strategy. However, it did respond by developing a more flexible type of mainframe architecture. In 1964, IBM launched the System/360 computer, which was conceived of as a “family” of mainframe equipment. System/360 was modular rather than highly tailored and offered a choice of processors, peripherals, and complementary software packages, allowing upgrading or expansion over time. It was a commercial success and total sales of IBM computers rose to 35,000 by 1970.
The long-term future of the mainframe was threatened by developments that made it possible to link up, or network, separate computers. AT&T’s core business gave it a vested interest in computer systems that were interoperable and accommodated multiple users, such as individual telephone exchanges. In 1969, Bell Laboratories developed the UNIX operating system, which became widely used for networking computers. Bell researchers developed a high-level, general-purpose computer language, C, which made UNIX compatible with virtually any of the existing minicomputers. When C became too restrictive for more demanding computer applications, it was modified by a Bell Laboratories researcher, Bjarne Stroustrup, to become C++, introduced in 1983. C++ incorporates object-oriented programming, a more flexible way of modeling data relationships, and has become one of the most widely used programming languages.
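To give a flavor of what object-oriented programming adds, the short C++ sketch below (an invented example; the class names are hypothetical and not drawn from any real library) bundles data and behavior into classes, with a derived class refining the behavior of its base class:

```cpp
#include <iostream>
#include <string>
#include <vector>

// A base class models the data and behavior common to all peripherals.
class Peripheral {
public:
    explicit Peripheral(std::string name) : name_(std::move(name)) {}
    virtual ~Peripheral() = default;
    virtual void describe() const { std::cout << "peripheral: " << name_ << '\n'; }
protected:
    std::string name_;
};

// A derived class inherits that model and overrides only what differs.
class Printer : public Peripheral {
public:
    Printer() : Peripheral("printer") {}
    void describe() const override { std::cout << "printer: " << name_ << " (prints on paper)\n"; }
};

int main() {
    Printer printer;
    Peripheral scanner("scanner");

    // The same call is resolved to the right version at run time (dynamic dispatch).
    std::vector<const Peripheral*> devices{&printer, &scanner};
    for (const Peripheral* device : devices) {
        device->describe();
    }
    return 0;
}
```

In the procedural style of C, the data structure and the functions that operate on it are declared separately; tying them together in classes is the greater flexibility in modeling data relationships referred to above.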
The First True Computers
World War II stimulated computer development as military advantages could be gained through designing weapons according to more sophisticated ballistic calculations and deciphering the encoded communications of the opposing side. In the early 1930s, the U.S. Navy Board of Ordnance sponsored the American mathematician Howard Aiken, and in 1939, in collaboration with engineers at the International Business Machines Corporation (IBM), he was contracted by the Navy to develop a machine for ballistic calculations. Aiken’s electromechanical Automatic Sequence Controlled Calculator, also known as the Harvard Mark I, was completed in 1944 at a cost of $500,000. It was operated by a punched-tape program and weighed 5 tons.
In Britain, computer research efforts were concentrated on code breaking. Alan Turing, the British mathematician who in 1936 had formulated his vision of a “universal computing machine,” was one of the team that created the Colossus code-breaking machine. Colossus succeeded in breaking the supposedly impregnable Lorenz cipher used for high-level German communications, but, for obvious reasons, the project was kept top secret. The most influential of the computers developed in the course of military research was not completed until 1946. This was the Electronic Numerical Integrator and Calculator (ENIAC), commissioned by the U.S. Army Ordnance Department. ENIAC was built by a team at the University of Pennsylvania, led by John Presper Eckert and John William Mauchly. ENIAC, which drew on Atanasoff and Berry’s design, was the world’s first electronic computer. Weighing 30 tons and occupying 160 square meters (1,600 square feet) of floor space, it contained 19,000 thermionic valves, which acted as gates controlling the flow of electric current. Each calculation was programmed by operators feeding in punched cards, and the results were also presented on punched cards.
Feeding in punched cards was a slow and laborious process, so university scientists elsewhere began working on methods of internal program storage. In 1945, the eminent Hungarian-born American mathematician John Von Neumann outlined his theory of a stored-program computer with a central unit to control and process operations in sequence and with read-write random access memory. In Britain, teams at the Universities of Manchester and Cambridge were also addressing this issue. The Manchester team was led by Freddie Williams and Tom Kilburn and assisted by Alan Turing. In 1948, the Manchester electronic computer, known as the Small Scale Experimental Machine (SSEM) and nicknamed the Baby, ran the world’s first stored program, which was stored on cathode ray tubes. Von Neumann’s ideas first came to fruition in the Electronic Delay Storage Automatic Calculator (EDSAC), built at Cambridge University and operational from 1949. EDSAC used mercury delay line storage, a technology developed at the Massachusetts Institute of Technology. EDSAC was completed in advance of the Von Neumann computers developed in the United States, namely the Electronic Discrete Variable Computer (EDVAC) at the University of Pennsylvania and the IAS computer at the Institute for Advanced Study at Princeton.
Consumers
The growth of industrialization in the nineteenth century was stimulated by, and linked to, a rising population that created bigger markets. Modern capitalism became established in association with many of these developments. The innovations within technology and science were not driven only by “pure” experimentation but also by the desire to commercially develop the results. This culture of mass consumption was already advanced in Europe, Canada, and the United States at the beginning of the twentieth century and was initially enjoyed by the middle classes. The post-1945 increase in prosperity allowed more and more working people to purchase consumer durables.
Designers and manufacturers of the earlier twentieth-century domestic appliances were certainly aware of their potential markets insofar as they wanted their products to sell. Nevertheless, such market research as was carried out was largely unscientific and anecdotal. Initially they relied on the nineteenth-century premise that there were “natural” preexisting markets for a product. The role of promotion and advertising was to make sure that potential customers were attracted to a particular product. Branding, the process of giving a product an identity, was beginning to develop and was accelerated during the Depression years of the 1930s. Economists and politicians looked to increased consumption as a way out of economic slumps. The late 1920s and 1930s saw the introduction of the marketing methods and psychological selling techniques familiar today. There was a change from “getting commodities to consumers” to “getting consumers to commodities.”
This was achieved by advertising techniques that, in the case of domestic appliances, were aimed specifically at women. Advertisements prompted purchase through a combination of guilt and desire. In the United Kingdom and the United States, advertisements began to depict the housewife, not the servant, using the appliances and exploited rising standards of cleanliness and fears about “household germs.” The increasing use of labor-saving appliances may have saved time in some areas, but social and cultural pressures led to rising standards and more time spent on other areas of housework. The desire to consume was stimulated by aspirational advertisements and planned obsolescence of products.
As Americans were encouraged to become patriotic consumers, many of them felt that they needed to make informed choices about the increasing range of products. In 1926, Frederick Schlink, an engineer from White Plains, New York, organized a consumer club that distributed lists of products that were seen as good value and also those “one might well avoid, whether on account of inferior quality, unreasonable price, or of false and misleading advertising.” Schlink used these lists to produce a book, Your Money’s Worth, which led to the founding of Consumers’ Research and the Consumers’ Research Bulletin in 1928.
The Consumers Union was a splinter group from Consumers’ Research and was established in 1936, following acrimonious labor relations. Its founding group of professors, labor leaders, journalists, and engineers had a mission to “maintain decent living standards for ultimate consumers,” a rhetoric born of the Depression and the strike-breaking tactics of Schlink. It remains independent of both government and industry and depends on membership subscriptions. It first published its magazine Consumer Reports in the same year, establishing a tradition of testing and rating products and services. The initial circulation was around 4,000. Appliances were and continue to be tested for performance, energy efficiency, noise, convenience, and safety. Subscriptions had risen to 100,000 by 1946 and continued to grow, even during the McCarthy era when Consumer Reports was listed as a subversive magazine. The Consumers Union now has over 4.6 million subscribers, a children’s magazine (launched in 1980 as Penny Power, now known as Zillions) and a web site.
In the United Kingdom, the Good Housekeeping Magazine was established in 1922, largely aimed at the servantless middle-class woman. It founded the Good Housekeeping Institute in 1924 to test recipes and “submit all domestic appliances to exhaustive tests and bring those approved to the notice of all housewives,” which it continues to do today. The UK Consumers Association, based on the U.S. Consumers Union, was founded in 1956 and first published Which?, a quarterly magazine of tests and reports, in 1957. Which? became a monthly magazine in 1959. The UK Consumers Association currently has over a million members. The International Organization of Consumers Unions was established in 1960 and includes consumer associations from the United States, the Netherlands, Belgium, and Australia.
The marketing trends of the 1930s continued after 1945, and in-depth market research developed throughout corporate America in the 1950s. The British Market Research Association was established in 1957, the same year that Vance Packard’s critical study of advertising, The Hidden Persuaders, was published in the United States. The following quotations from Packard’s book illustrate how the advertising industry continued to use the twin themes of guilt and desire in the postwar boom years.
The cosmetic manufacturers are not selling lanolin, they are selling hope. . . . We no longer buy oranges, we buy vitality, we do not buy just an auto, we buy prestige.
If you tell the housewife that by using your washing machine, drier or dishwasher she can be free to play bridge, you’re dead! She is already feeling guilty about the fact that she is not working as hard as her mother. You are just rubbing her up the wrong way when you offer her more freedom. Instead you should emphasize that the appliances free her to have more time with her children and to be a better mother.
Advertisements of the period support this. A Hotpoint ad from Good Housekeeping of June 1951 carries the copy “Save 8 Hours Every Week with a Hotpoint All-Electric Kitchen—Gain Extra Time for All Your Extra Duties.” The time saved, the advertisement suggests, is “for your family as well as the many added duties you’re called on to shoulder these days.” Needless to say, the “you” in question was female.
These quotes reflect a set of cultural values that were already in the process of being challenged by the feminist, civil rights, and youth movements of the 1950s and 1960s. Unsafe at Any Speed, by the American lawyer and consumer advocate Ralph Nader, was published in 1965 and exposed the lack of safety in the General Motors Corvair automobile. Nader joined the Consumers Union in 1967. Congress passed twenty-five pieces of consumer legislation between 1966 and 1973.
The advertisers and manufacturers varied in their ability to respond to these social and cultural changes. The rise of the affluent teenager created a new market, one that clothing, publishing, and cosmetics companies responded to with vigor. The domestic appliance companies also had to change. By the late 1970s the impact of feminism had been such that the latter comment quoted in Packard was no longer tenable as an advertising concept, even though it was still a reality for many women. A mid-1960s ad for a Nevastik Teflon-coated frying pan from the UK Good Housekeeping Magazine had the copy, “Even a Man Can’t Go Wrong with Nevastik Pans.”
Market research had become more sophisticated, and markets were increasingly divided into socioeconomic groups that could become target markets. This analysis became more sophisticated during the 1980s and 1990s as markets were segmented by postal areas and lifestyles.
It has been assumed that manufacturers and consumers stood in opposition to each other, with the consumer organizations acting as monitors and protectors of the latter’s interests. Indeed, the efforts of consumer organizations have led to legislation to improve safety standards and consumers’ rights after a purchase has been made. But it would be wrong to believe that consumers have been passive recipients of what the producers have given them and that a docile and uncritical public leads to low standards of design. It has been argued that consumers’ desires and needs have been created by the producers and, with the aid of their advertisers, have been satisfied by those producers. This implies that consumption is a less authentic and satisfying activity than, for example, working. It also seems to imply that popular forms of culture and material culture are superficial. Given the sophisticated nature of advanced capitalist societies, this attitude can be contested: needs are often no longer natural, but cultural, informed by the many connections and discontinuities within those societies. Many modern objects do not simply—or, indeed, primarily—have “use or exchange” value but more importantly have “identity” value. This can clearly be seen in some of the more fashionable domestic appliances of the 1980s and 1990s. A Dyson vacuum cleaner or a Sony Walkman is a successful piece of technology, but each equally has become a purchase that reinforces its own brand identity and defines the identity of the consumer. The same can be said of older products such as the Aga cooker or the more self-knowing products from the Alessi stable.
The late twentieth century has produced a society where manufacturers, designers, and consumers are linked, knowingly or not. Companies continue to conduct market research but also are quicker to respond to and appropriate ideas that often bubble up from within popular or mass culture. This “circuit of culture” links the identity, production, consumption, regulation, and representation of a commodity within a circular relationship. This model has applied increasingly to domestic appliances over the last twenty years. Many domestic products that were once almost culturally invisible are now recognized as having a meaning. Consumers are now generally more sophisticated and are able to “read” the intended meanings of the manufacturers and to construct or appropriate their own, which will in turn influence the manufacturers and affect how that product is marketed or modified. Nevertheless, the findings of the 1960 UK Molony Report on consumer protection remain valid.
The business of making and selling is highly organized, often in large units, and calls to its aid at every step complex and highly expert skills. The business of buying is conducted by the smallest unit, the individual consumer, relying on the guidance afforded by experience, if he possesses it, and if not, on instinctive but not always rational thought processes.
The Graphical User Interface
By the mid-1980s, personal computers were becoming common in the workplace, but they were still rare in the home. Expense was not the only factor; operational skills and functionality also played a part. While the microcomputer was domestic in scale, it made few concessions to the casual user in terms of usability. Personal computers were marketed as “user-friendly,” but many people were intimidated by disk operating systems that offered only an enigmatic prompt, signifying the active disk drive, on the opening display screen. Apple again demonstrated its inventiveness when it introduced the Lisa in 1983. The Lisa introduced the graphical user interface (GUI), a screen display that showed program options as graphic icons, pull-down menus from menu bars, and “windows,” screens that could be overlaid and sized. It also offered a pointing device called a mouse as an alternative to the keyboard for navigation and activating menu commands. The computer mouse had been developed in the 1960s at the Stanford Research Institute by Douglas Engelbart, who obtained a patent in 1970. It was commercially developed by the Xerox Corporation in the 1970s, but only became a standard computer device when GUI displays arrived.
Although the Lisa was too expensive to have a major impact on the microcomputer market, the launch of its cheaper sibling, the Apple Macintosh, in 1984 established the GUI as the truly user-friendly face of computing. The Macintosh, familiarly known as the Mac, became particularly popular with graphic designers as it ran the first commercial desktop publishing (DTP) package, Aldus PageMaker (later acquired by Adobe). With its streamlined shell, the Mac was also the first microcomputer to be hailed as a design icon. While purist DOS users disparaged the Mac as a WIMP (windows, icons, menus, pointing device), Microsoft was quick to recognize the mass-market appeal of the GUI. As the developer of the Word and Excel applications for the Mac, Microsoft had privileged access to the Apple GUI program code, which became a bone of contention when Microsoft began to develop its own GUI operating system, Windows, for PCs. A legal judgment imposed restrictions on the design of the first version (1.0) of Windows, launched in 1985, but the restrictions ceased to apply thereafter. Nevertheless, it was only with the release of version 3.0 in 1990 that Windows achieved equivalent user-friendliness to the Mac interface. The later versions, Windows 95 and 98, improved the multitasking performance of the interface, which allows separate applications to be open at the same time.
Microsoft’s monopoly of the PC operating system market gave it a clear advantage in the development of PC applications, as its applications programmers had first access to new code. Microsoft’s first PC application was the PC version of the Excel spreadsheet, introduced in 1987. Since then, its Office and Office Pro suites of business applications have become the PC market leaders.
Computers
Since the creation of the first electronic computer in 1946, computer technology has evolved with unparalleled speed. Conceived as a machine to automate and accelerate the calculation of complex sums, the computer became the universal machine for business and personal use because of its ability to process verbal as well as numerical data. Ownership of computers in the home became feasible in the late 1970s when computers of desktop size were developed. As the price of personal computers plummeted and the functionality of the computer became more diverse, home ownership rose. In 1995, the United States led the rankings for home computer ownership, with computers in 37 percent of homes, while Britain, ranked sixth, had computers in 25 percent of homes. Today, with appropriate software and peripheral devices, the home computer can provide many services, including processing of household financial accounts, word-processing, electronic mail, entertainment, and information.