Computer Software

The key to the mass-market success of the microcomputer lay, not in the hardware itself, however small or cheap it became, but in the development of a range of generic applications. Compatible computer hardware meant that there was a huge incentive for companies to develop software that would enable users to exchange data easily. From the beginning to the present day, the business-software industry has been dominated by American companies. The first mass-market applications provided the means of computerizing the tasks that were common to all businesses—accounting and word-processing. In 1979, Software Arts introduced VisiCalc, the first commercial spreadsheet, to run on the Apple II computer. Spreadsheets create files in the form of tables in which numerical data can be sorted and manipulated. Launched in 1982, the Lotus Development Corporation’s 1–2–3 application for PCs, which added database and graphics display functions to the core spreadsheet functions, soon became the market leader. In the same year, Ashton-Tate released dBASE II, the first commercial relational database. While the pre-PC WordStar was the first commercial word-processing package, WordPerfect—launched in 1982—became the market leader of the 1980s.
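
As an illustration of the tabular model that VisiCalc and Lotus 1–2–3 popularized, the following short Python sketch builds a small table, adds a derived "total" column in the manner of a spreadsheet formula, and sorts the rows. The column names and figures are invented purely for illustration.

```python
# A minimal, illustrative sketch of the spreadsheet's tabular data model:
# rows of named columns that can be given derived values and sorted.
# The regions and figures are invented for illustration only.

sales = [
    {"region": "North", "q1": 1200, "q2": 1350},
    {"region": "South", "q1": 980, "q2": 1100},
    {"region": "West", "q1": 1430, "q2": 1290},
]

# Derived column, analogous to a spreadsheet formula such as =B2+C2.
for row in sales:
    row["total"] = row["q1"] + row["q2"]

# Sort the table by the derived column, highest total first.
sales.sort(key=lambda row: row["total"], reverse=True)

for row in sales:
    print(f'{row["region"]:<6} {row["q1"]:>6} {row["q2"]:>6} {row["total"]:>7}')
```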

Japanese Domination of the Camera Market

After World War II, Japanese companies began to compete very effectively in the international camera market. At the top end of the professional camera market, Rolleiflex and other European companies, including Sweden’s Hasselblad and Austria’s Voigtlander, maintained their supremacy. For example, Voigtlander introduced the zoom lens, widely used in motion-picture photography since the 1930s, for still photography in 1958. However, Japanese companies made serious in-roads into the lower end of the professional camera market. Nippon Kogaku launched its first camera, the Nikon I rangefinder 35 mm camera, in 1948. Exceeding a million units in sales, its Nikon F SLR camera of 1959 was the first commercially successful SLR model. In 1953, Nikon was also the first maker to produce a camera with motorized drive, but motorized drives were uncommon until the 1960s. Another Japanese company, Olympus, produced the first compact SLR camera, the Olympus Trip, in 1968. Twenty years later, sales of the Olympus Trip reached 10 million.

Even the smallest SLR cameras could not be described as pocket-size, so there was a gap in the market between low-performance cartridge cameras that were small and lightweight and the bulkier SLR cameras. In the late 1970s, this gap was filled by the introduction of fully automatic, compact 35 mm cameras. These cameras improved in the early 1980s as a result of the development of auto-focus lenses. As cameras gained more electronic functions, their styling reflected this transition by becoming increasingly high tech. The harder lines of the older metal-bodied mechanical SLR cameras gave way to the sleek lines of plastic-bodied electronic compact and SLR cameras.

The serious image of the camera was only challenged by the modern equivalents of the Brownie. Eastman Kodak had continued to periodically reinvent the simple “point and press” camera. The cheap Kodak Instamatic camera of 1963 used easy-to-load film cartridges and achieved sales of 50 million units by 1970. Its successor, a pocket-size model introduced in 1972, was equally successful. In 1982, a new Kodak format, the film disc, was launched. Single-use disposable cameras followed in the late 1980s. Kodak’s colorful Fun Saver disposable cameras achieved dramatic market penetration, reaching sales of 50 million units by 1995. In the United States, single-use cameras accounted for 75 percent of annual camera sales. In keeping with environmental concerns, Eastman Kodak recovers more than 80 percent by weight of the materials in disposable cameras by reuse or recycling.

Cooker Design

While gas-cooker manufacturers tended to be more innovative in design terms than their electric-cooker counterparts during the interwar period, the time lag before electric cookers caught up was much shorter than it had been previously. After 1920, gas and electric cookers gradually evolved their own identity through the use of new materials and surface finishes. Manufacturers began to apply vitreous enamel, which had previously been used sparingly on splashbacks and cooktops, to all surfaces, outside and inside. Although mottled black enamel was used in conjunction with white, mottled grey enamel and white enamel became more common, marking a visible break from the traditional black-leaded range. In the 1930s, other colors, such as mottled blue and green, were also popular. Aside from its appearance, the great advantage of the enameled surface was that it was easily cleaned. By 1930, the typical gas or electric cooker stood on four short legs and consisted of an oven, surmounted by a grill compartment, and a cooktop with between two and four boiling rings.

Sheet steel, which was lighter and more flexible than cast iron, was available in the 1920s, but was too expensive to be used extensively. The pioneer of the sheet steel cooker was the American designer Norman Bel Geddes, who produced the Oriole cooker design for the Standard Gas Corporation in 1932. Sheet steel was a logical choice for Bel Geddes who, as an advocate of streamlining, sought materials that could provide a seamless profile. The construction process entailed the clipping of bendable sheets to a steel chassis rather than the bolting of rigid plates to a cast-iron frame. The Oriole cooker in white porcelain-enameled steel was notable for its rounded edges, flush front with plinth, and folding splashboard-cum-tabletop. The plinth served the dual purpose of inhibiting the accumulation of dust and food debris under the cooker and providing storage space. The full-line cooker with a warming drawer below the oven became standard by the 1940s. In Britain, the first white steel gas cooker was the Parkinson Renown, designed for the 1935 George V Jubilee House and produced commercially from 1937. The use of sheet steel encouraged the standardization of core components, which could then be assembled in different combinations, and this standardization lowered production costs.

Instant Foods

The term “instant food” covers any dried product that is prepared for cooking simply by adding a measure of liquid, usually water or milk. The first ready-mix food was Aunt Jemima’s pancake flour, produced in St. Joseph, Missouri, in 1889. Other instant baking products, such as cake mixes, had their heyday in the 1960s, when the level of female employment rose. These products were marketed on the basis that home baking was a badge of good housewifery, so instant mixes enabled the busy working woman to cheat a little. In 1946, the R. T. French Company of Rochester, New York, introduced the first instant mashed potato product. General Foods introduced Minute Rice, a dried precooked rice, in 1950.

Before the 1950s, all instant products were produced by traditional air-drying, either at ambient temperature or with added heat. By 1940, a new method, freeze-drying, had been developed in Sweden. Food was rapidly frozen and then placed in a vacuum chamber to dry, because, at low pressures, water passes directly from the solid state to the gaseous state, a process known as sublimation. This was particularly effective for any foods with a high water content, as the water is removed rapidly without damaging the structure of the food. The freeze-dried food is sponge-like in texture and therefore absorbs water rapidly. However, the high speed of freezing and drying required for effective results means that the food pieces need to be no more than 2.5 cm (1 inch) thick. The first factory for freeze-drying food opened in Russia in 1954. Freeze-drying is used commercially for drying vegetables and meat, as well as coffee.

Personal Computers

In terms of mass-market potential, the problem with the microcomputer industry in the late 1970s was the proliferation of incompatible machines. No company was able to establish a sufficiently large market share to shape the direction of microcomputer production. IBM initially adopted a disdainful approach to the nascent microcomputer industry. However, once the demand for single-user computers became evident, IBM entered the market in 1981 with the launch of the 5150 PC. The key features of this IBM PC were an Intel 16-bit microprocessor, 64K RAM, and the Microsoft Disk Operating System (MS-DOS). IBM appropriated the term “personal computer,” which—shortened to PC—came to describe the system architecture. Reputation, marketing channels, and immense research and development resources soon gave IBM a decisive competitive edge in the business market, in spite of its relatively high prices. In 1983, IBM introduced an upgraded PC, the 5160 PC XT, which had a hard-disk drive as well as a floppy-disk drive, and the cheaper IBM PCjr, aimed at the home consumer. (The floppy disk had been introduced as a convenient portable storage medium in 1971.) By the end of 1983, IBM had sold 800,000 PCs. In 1984 came the IBM 5170 PC AT, which introduced the 16-bit ISA (industry standard architecture) data bus, which accelerated the flow of data.

PC architecture was soon cloned by other companies to create a range of IBM-compatible models. At first, would-be imitators had to use the practice of “reverse engineering,” whereby they deconstructed an IBM PC to analyze its technical design. This became unnecessary when IBM decided to publish its system architecture in order to encourage software companies to develop PC applications and thus stimulate the growth of PC ownership. While IBM achieved its goal of making the PC the industry standard for microcomputers, it lost out in terms of computer sales to companies making cheaper clones. For example, the British Amstrad PC1512 personal computer, introduced in 1986, was both cheaper and faster than the IBM PC. In the United States, Compaq, a spin-off from Texas Instruments, was so successful with its IBM clones that in 1986 it superseded Apple as the fastest-growing American corporation ever.

Film and Flash Technology

Photograph quality relied as much on the performance of film and the lighting of the subject as on lens technology. In the days of long exposures, professional photographers became proficient at calculating exposure times through trial and error and controlled the light entering the camera merely by removing and replacing a lens cap. Exposure tables, calculators, and strips of photochromatic paper assisted the amateur photographer. Cameras began to incorporate simple mechanical shutters to control light input, and the first leaf shutter, the Deckel, was developed in Germany in 1902. Accurate short exposures only became possible when the photoelectric cell (or photocell) was invented. The first practical photoelectric cell was invented by the German physicists Julius Elster and Hans Friedrich Geitel in 1904. The photoelectric cell was first incorporated in a separate exposure meter in 1931 by the American William Nelson Goodwin Jr. and was developed commercially by the Weston Electrical Instrument Company of Newark, New Jersey, in 1932 as the Photronic Photoelectric Cell. The U.S. Time Corporation (Timex) produced an “electric eye” camera in 1950, but the first camera to feature a built-in photoelectric cell positioned behind the lens was the Pentax Spotmatic, introduced in 1964 by the Japanese Asahi Optical Corporation. Built-in exposure meters meant that cameras could be operated by the selection of foolproof automatic settings. The first automatic-exposure SLR camera was the Pentax ES model of 1971.

From the 1880s, electric lights could be used to enhance indoor lighting, and flash photography was possible through the use of flash powders and magnesium ribbon. Flash photography became easier with the invention of the electric flashbulb, patented by the German inventor Johannes Ostermeier in 1930. The electronic flash was invented in the United States in 1935. Flashbulbs were initially mounted in separate flashguns, but as they became smaller, they were incorporated into attachments that fitted directly onto the camera. The hot shoe flashgun attachment, a metal “shoe” on top of the camera into which a metal “foot” on the flash was slid, was introduced in 1939. As cameras became fully electronic, the flash became an integral part of the camera.

In the 1930s, highly flammable celluloid film was replaced by nonflammable cellulose acetate. While the Scottish physicist James Clerk Maxwell expounded the principles of the three-color photographic process in 1861, it was another thirty years before the French physicist Gabriel Lippmann developed a process for color photography. In the early twentieth century, a number of color processes were developed, including the Lumière brothers’ Autochrome process, but these were for plates rather than film and were time-consuming and expensive. In the 1930s, Eastman Kodak and the German company Agfa (A.G. für Anilin, the Aniline Company) began to develop color film in transparency form. Kodachrome film became available in 35 mm cartridges and roll film in 1936, and Agfacolor film was available in 35 mm cartridges the next year. The first color negative film, Kodacolor, did not come on the market until 1944 and was quickly followed by faster Kodak Ektachrome transparency film in 1946 and negative film in 1947. While Eastman Kodak’s domination of the camera market began to wane when Japanese companies moved into the camera industry in the 1950s, the company continued to be a leading force in film technology.

After the introduction of color film, the main improvements in photographic film lay in the development of faster films. Film speed is a measure of light-sensitivity. The first system for measuring film speed was developed in Britain in 1890, but from 1947, the American Standards Association (ASA) ratings became the industry norm. Eastman Kodak has continued to be a leader in film technology: for example, it launched a series of high-speed X films, starting with Tri-X black-and-white roll film in 1954. A significant advance in 1963 was the development by the Swiss company Ciba of the Cibachrome process, which allowed prints to be made from transparencies.

Competition between Gas and Electricity

By 1920, solid fuel ranges were out of general favor, except in rural areas where gas and electricity supplies were absent. They remained so thereafter, although the Aga stove, invented by the Swedish physicist Gustav Dalen in 1924 and marketed commercially from 1929, has sustained a small but devoted customer base. In Britain, the growing importance of gas as a cooking and heating fuel was confirmed by the 1920 Gas Regulation Act, which changed the basis for gas prices from illuminating value to calorific value. The situation was much the same in the United States, where consumption of gas for lighting fell from 75 percent in 1899 to 21 percent in 1919, when consumption as domestic fuel reached 54 percent. World War I had provided an opportunity to demonstrate the convenience of electric cookers, which were adopted for field canteens. In the intensifying competition between gas and electricity, the gas cooker manufacturers had the upper hand, in terms of both price and performance. In 1915, the American Stove Company of Cleveland, Ohio, had introduced the first thermostat for gas ovens, the Lorrain oven regulator. The British equivalent, the Regulo thermostat, was developed by Radiation Ltd. (John Wright & Company) in 1923 and fitted to the Davis Company’s New World gas cooker, which also featured a slag wool lagging for better insulation and a base flue. Previously, oven controls, like boiling ring controls, had settings that simply expressed the rate of gas flow, with no reference to the temperature produced. Similarly, electric cookers were fitted with mercury current regulators, and this remained so until the early 1930s. A thermometer attached to the oven door showed the effect of the regulator setting. In Britain, the first automatic temperature controller for electric ovens was the Credastat regulator, introduced in 1931.

Gas boiling rings were also much more efficient than electric ones because the electric elements were slow to heat up, compared to the instant heat of gas. The flat electric plates only provided good heat transmission to pans with similarly flat bottoms that maximized surface contact. Electric boiling rings began to improve in the mid-1920s, when enamel-coated, metal-sheathed elements appeared. This design of boiling ring meant that the pan was in close contact with the heating source without an intervening plate. In the early 1930s, the U.S. company General Electric developed a new type of faster-heating radiant ring, the Calrod strip element, which consisted of resistance coils set in magnesium oxide and sheathed with chromium iron. Combined with bimetallic controls, akin to the automatic oven regulators, the new boiling rings were much more comparable in performance with gas burners.

One of the few inherent advantages of electric cookers at this time was variety of size. Plumbing in a gas outlet was more space-consuming and obtrusive than the electrical equivalent, so gas cookers were invariably full-size cookers. People living alone or families in houses or apartments with small kitchens constituted a ready market for smaller cookers. The British company Belling made particular efforts to exploit this market. In 1919, it introduced the Modernette cooker, a compact, lightweight floor-standing cooker, and in 1929 it launched the Baby Belling, a tabletop cooker.

In Britain, the price differential between gas and electric cookers was largely a result of the non-standardization of electricity supply. This meant that manufacturers needed to produce electric cookers specified to meet the range of voltages in use. The construction of the national grid from 1926 eventually removed this disadvantage. Moreover, in 1930, a group of British electric cooker manufacturers agreed to a common standard that reduced the number of options, thus consolidating production. The electricity utilities introduced cheap rental schemes to overcome the purchase disincentives. An indication of the success of these schemes is that rental of electric cookers was more common than buying until 1938. In the United States, with its standardized electricity supply, electric cookers were much cheaper, but the combined advantages of gas cookers gave them a dominant market position in both Britain and the United States. In Britain, about 75 percent of homes had gas cookers in 1939, compared with about 8 percent of homes that had electric cookers. However, as electric cookers accounted for about a quarter of total cooker production, the balance was shifting in favor of electric cookers. In the United States, gas was less dominant because the larger and more dispersed rural population created a continuing demand for solid fuel and oil stoves. By 1930, gas cookers were the most popular type and were found in 48 percent of homes, while electric cookers were found in just 6 percent of homes.

Frozen Foods

The pioneer of frozen foods was Clarence Birdseye, who based his freezing process on the natural freezing of meat and fish that he had observed in the Arctic zone. He noted that naturally frozen meat and fish seemed fresh when cooked and eaten months later. After returning to the United States, he formed Birdseye Seafoods in 1922 and initially concentrated on chilling fish fillets at a plant in New York. By 1924, he had developed a method of “flash-freezing” by placing cartons of food between metal plates under pressure. He formed the General Seafood Corporation to exploit the flash-freezing technique. In 1929, he sold his company to the Postum Company for $22 million, on the condition that his surname was used as two words, hence the Birds Eye brand name. The expanded company was renamed as the General Foods Corporation. In 1929, cartons of Birds Eye frozen vegetables went on sale in the United States. They were intended to be eaten on the day of purchase, as refrigerators, which were found in only a minority of homes, were only suitable for short-term storage of frozen foods. In 1930, twenty-six varieties of Birds Eye Frosted Foods were test-marketed in Springfield, Massachusetts. The line that was introduced across the United States in 1931 consisted of fish, meat, peas, spinach, loganberries, raspberries, and cherries.

By 1933, 516 stores were stocking Birds Eye Frosted Foods. In 1939, Birds Eye introduced precooked frozen dishes based on chicken, beef, and turkey. As consumption of frozen foods began to increase rapidly in the 1940s, the first specialist self-service frozen-food centers appeared, initially in the New York area in 1945. In Britain, frozen foods became available for the first time in 1946, after a Birds Eye plant was set up in Great Yarmouth. The U.S. company Sara Lee Kitchens produced the first frozen baked foods for the mass market in 1953. A year later, the complete frozen meal appeared when C. A. Swanson & Sons of Omaha, Nebraska, launched TV dinners. In 1957, a new method of cooking frozen foods emerged when the U.S. company Seabrook Farms launched Miracle Pack Prepared Foods, the first boil-in-the-bag frozen foods. The first frozen food to make a major impact in Britain was Birds Eye Fish Fingers, introduced in 1955. These cod sticks coated in breadcrumbs became a favorite children’s food.

In the energy-conscious 1980s, a new competitor to frozen foods appeared—chilled foods. The chilling process involves keeping cooked foods at constant temperatures of 0° to 4°C (32°F–40°F), the recommended temperature range for refrigerators. Although chilled foods have a shorter storage life than frozen foods, they are also quicker to cook and therefore save energy.

Frozen foods have had a profound effect on both the food industry and consumer behavior. For growers of food crops, selling produce to frozen-food companies meant reducing wastage and loss of income through natural decomposition. Some farmers may therefore prefer to sell their whole crop to the frozen food industry. One consequence of this has been that some types of fruit and vegetables are less widely available as fresh produce. The convenience of stocking up on food less frequently is another factor that has reduced the role of fresh food in the diet. An advantage of frozen foods for consumers, however, is that foods are available out of season, thus providing a more varied diet all year round. Calorie-counted, nutritionally balanced frozen or chilled meals may be a boon to the busy consumer, but traditional cooking skills have suffered as a result. Today, for many people, traditional cooking has become a hobby rather than a necessity.

Early Electric Cookers

The first electric oven was installed in the Hotel Bernina, near St. Moritz in Switzerland, in 1889. Electricity was supplied by a hydroelectric power generator.

In Britain and the United States, electric cookers began to feature in public demonstrations and model electrical kitchen displays at major exhibitions in the early 1890s, including the 1891 Crystal Palace Exhibition in London and the 1893 Columbian Exposition in Chicago. The companies that pioneered the commercial production of electric cookers included Crompton & Company in Britain and the Carpenter Company in the United States. The heating elements in these early electric cookers took the form of resistor wires embedded in enameled panels. This heating technology was improved in 1893 by the English electrical engineer H. J. Dowsing, who sandwiched the steel heating wires between two panels, creating a safer and more practical design. Crompton & Company began to manufacture and market cookers to Dowsing’s design in 1894. The heating panels were at first placed on the oven sides and later at the top and bottom. The performance of electric cookers benefited from the improvement in heating technology created by the invention of Nichrome (or nickel and chrome) wire by the American Albert L. Marsh in 1905. The boiling plates on the cooktop took the form of radiant coils on fireclay supports, topped by perforated or solid metal plates.

The main problem for electric cooker manufacturers was that there were few electrified homes to sell their products to. Moreover, even fewer homes had a power circuit as well as a lighting circuit. Electric cookers were, and still are, the electric appliances with the highest power rating and, as such, require a dedicated power supply and fuse box. The investment in wiring an electric cooker and the high costs of the heavy electricity consumption were a major disincentive at a time when electric cookers had nothing extra to offer in terms of functionality. Up until World War I, both gas and electric cookers were modeled on the rival solid fuel range. This meant box-shaped ovens with safe-like doors, made of cast iron with a black lead finish. Given the persistence of fears about the safety of gas and electricity, gas and electric cooker manufacturers may have felt that a familiar design would provide a sense of reassurance. Not surprisingly, with such limited sales potential for full-size cookers, manufacturers concentrated their marketing efforts on small, tabletop cooking appliances, such as electric frying pans and chafing dishes. These appliances had the advantage that they could be used in the dining room as well as the kitchen and had no nonelectric rivals.

The Home Computer Arrives

In 1975, Micro Instrumentation and Telemetry Systems (MITS), a small firm based in Albuquerque, New Mexico, introduced the world’s first microcomputer, the Altair 8800. Lacking its own monitor and keyboard, the Altair 8800 was intended for the serious home enthusiast. Bill Gates (William Henry Gates III) and Paul Allen developed a modified version of the BASIC programming language for the Altair. They registered the Microsoft trade name in November 1976 to market the new language as MS-BASIC. Steven Jobs and Stephen Wozniak, two computer enthusiasts based in Silicon Valley, the heart of the semiconductor industry, were inspired by the example of the Altair. Using a cheaper 8-bit microprocessor, the MOS Technology 6502, they built their own microcomputer. Encouraged by the response of fellow enthusiasts, they began small-scale production of the Apple I computer in 1976. Snubbed by the companies to which they offered the commercial rights but convinced of the commercial potential of the microcomputer, Jobs and Wozniak raised venture finance and set up Apple Computer in 1977. The Apple II computer, the world’s first commercial microcomputer, had generated $2.5 million in sales revenue by the end of the year.

The immediate success of the Apple II energized the computer industry. Other companies, particularly calculator manufacturers, were quick to see the potential of the standalone, desktop computer and began to develop rival products. Like Apple, they hoped to appeal simultaneously to the potential home user and the small business. The U.S. company Commodore Business Machines, founded by Jack Tramiel in 1958, introduced the PET 2001 only two months after the launch of the Apple II. By 1980, a number of U.S. companies were producing microcomputers (all of which were mutually incompatible) and companies such as Epson were selling compact, cheap printers to complement microcomputers. In Britain, Clive Sinclair, developer of the first pocket calculator, introduced the Sinclair ZX80 home computer in 1980. The ZX80 became the cheapest microcomputer on the market. It was designed to use a television set as a display screen rather than a dedicated monitor.

The fall in the price of microcomputers was largely due to the astonishing decrease in the costs of microchip manufacture. No other industry has matched the semiconductor industry for sustained reduction in costs coupled with faster performance. While U.S. companies such as Intel and Motorola dominated the microprocessor market, Japanese companies such as Fujitsu and NEC (Nippon Electric Company) began to make major inroads into the memory-chip market. In 1970, Intel’s first RAM (random access memory) chip had a mere 1K (kilobit) capacity. Over the next decade, the capacity of RAM chips rose to 4K in 1973, 16K in 1976, and 64K in 1979. Japanese manufacturers were able to rapidly penetrate the memory-chip market by taking an approach different from that of the U.S. memory-chip companies. Instead of investing time trying to get more memory on the same size of chip, they opted for the simpler approach of making bigger chips. They also championed the CMOS chip design, which consumed less power than the NMOS chip and was more resilient.

Breakfast Cereals

Prior to the 1860s, breakfast cereal came in one variety—oatmeal porridge. This was not a quick breakfast dish, as it needed to be cooked for a long time. The solution was to cook a large batch and then reheat it daily. In 1877, prepacked rolled oats under the American Quaker brand, which had a much faster cooking time than oatmeal, were introduced. The first ready-to-eat breakfast cereal, Granula, was invented by James Caleb Jackson of Dansville, New York, in 1863. Jackson found that small cooked granules of graham cracker dough made a suitable cold breakfast cereal, served with cold milk. However, it was not until the 1890s that the idea of ready-to-eat breakfast cereal really took off. John Harvey Kellogg had become director of the Battle Creek Sanitarium and, with his brother, Will, had begun to develop easily digestible foods for invalids. They developed a baked wheat flake cereal that was marketed in 1895 as Granose, the first flaked breakfast cereal. Soon after, a second breakfast cereal enterprise came into being in Battle Creek, when C. W. Post, founder of the Postum Company, developed Grape-Nuts in 1897.

In 1898, Will Kellogg developed Cornflakes, the cereal that became most closely associated with the Kellogg name. Kellogg’s became the company name in 1922, replacing the Sanitas Nut Food Company (1898) and the Battle Creek Toasted Flake Company (1906). The first ready-to-eat breakfast cereal to reach the British market was Force Flakes, made in Canada, in 1902.

Although early breakfast cereals followed very healthy formulas, with only small amounts of malt and sugar added for extra flavor, as time went on, sugar content increased dramatically and fiber content fell accordingly. Kellogg’s Sugar Smacks, introduced in 1953, had a 56 percent sugar content. In the more health-conscious society of the late 1950s, Kellogg’s did introduce healthier cereals, such as Special K in 1955, but the company no longer had a healthy whole-foods image. Muesli, a favorite Swiss breakfast food that contains nuts and dried fruit, has become the epitome of the healthy breakfast cereal.

The Evolution of the Kitchen Range

For centuries, cooking arrangements in Europe were based on the system developed by the Romans and diffused throughout Europe in the wake of the military conquests. At its simplest, this involved a raised brick hearth to hold an open fire, set within a wide chimney base. As smoke and hot air rose, they were drawn up the chimney. Different methods of cooking could be achieved by adding devices such as spits, supports for pots and pans, and brick-oven compartments. Cooking on an open fire was slow and inefficient because a lot of heat was absorbed by the chimney walls and by the air in the room. In the mid-eighteenth century, the American statesman and scientist Benjamin Franklin invented a ventilated cast-iron wood-burning stove, through which the hot combustion gases circulated before escaping.

This idea for concentrating the heat source and retaining heat was developed further by Benjamin Thompson, Count von Rumford, in the 1790s. Rumford was born in the United States, in Massachusetts, but his early career as a spy for the British led to his forced departure to Europe. During his employment by the elector of Bavaria in various senior ministerial roles, he developed the solid-fuel range for use in a variety of large-scale catering contexts, including workhouses, army canteens, and hospitals. Perhaps the most innovative feature of Rumford’s ranges was the sunken chambers for pans in the range top. The pans were heated by the combustion gases rising up the surrounding flues. Although Rumford produced scaled-down versions of his basic range design, it was another American inventor, Philo Penfield Stewart, who developed the prototype of the nineteenth-century household range. Stewart patented his first range design in 1834 and later moved from Ohio to Troy, New York, where he established himself as a manufacturer.

Bread

Flour milling and baking became industrialized in the United States in the nineteenth century. Increasingly, bread was not baked at home but purchased from shops and bakeries. One sign of the changing nature of food production and distribution was the formation in the United States in 1898 of the National Biscuit Company (later shortened to Nabisco), which was an amalgamation of 114 bakeries, representing 90 percent of American commercial biscuit production. In Britain, the dominance of national bakery chains is a much more recent phenomenon, with no more than 40 percent of all bread consumed produced in large plant bakeries as late as 1953.

The automation of bread-making began with the introduction of roller milling of flour in the 1870s. Roller mills could produce much finer and whiter flour of a more consistent quality than grindstones. This had two major implications for bread-making: the finer flour could absorb more water, producing a lighter and more malleable dough, and the natural oils in the wheat berry were extracted at an early stage, leaving a flour with a longer life. In the 1920s, the factory bread-making process was accelerated when high-speed dough mixers became available.

The phrase “the best thing since sliced bread” appeared in the United States in the 1930s following the introduction of presliced Wonder Bread. In 1928, after sixteen years of development work, Otto Frederick Rohwedder launched the first practical bread-slicing and wrapping machine in Battle Creek, Michigan. In the same year, the Continental Bakery in New York introduced Wonder Bread, the first nationally distributed wrapped loaf of bread. Two years later, using Rohwedder’s machines, it introduced presliced Wonder Bread. Wrapped, presliced bread also appeared in the United Kingdom in 1930. By 1933, 80 percent of the bread sold in the United States was presliced and wrapped.

Sliced bread was convenient and of a standard thickness. Its introduction no doubt helped the sales of electric toasters throughout the 1930s. However, health-food campaigners argued that the convenience of the presliced white loaf came at the expense of its nutritional value. By the 1940s and 1950s, white bread was routinely enriched by the addition of vitamins and minerals. Stoneground whole wheat flour and unwrapped loaves enjoyed a revival from the late 1950s as a result of the growth of the health foods movement. For example, the American health food guru Gayelord Hauser was a strong advocate of the benefits of wheat germ.

Microelectronics

While the minicomputer widened the market for computers, these machines were still too expensive and complex for small businesses, let alone individuals. For computers to be brought within the reach of a mass market, they needed to become still smaller, cheaper, and easier to use. The next advance in fundamental electronic technology after the transistor was the integrated circuit. While it had taken the research resources of the world’s largest company, AT&T, to invent the transistor, within ten years transistor manufacture was dominated by new specialist semiconductor companies. The first integrated circuit was created in 1958 by Jack Kilby of Texas Instruments. It consisted of five components on a single germanium chip. A year later, Fairchild Semiconductor, cofounded in 1957 by Robert Noyce, produced the first planar transistor. The planar process involved oxidizing a silicon wafer, coating it with a photosensitive material, photographing a pattern onto it and etching the pattern into the oxide, washing off the coating, and selectively introducing impurities. It was a repeatable process that enabled complex circuits to be built on a silicon wafer. By 1970, the price of an integrated circuit, also known as a silicon chip, had fallen from about $30 to $1, and an integrated circuit might contain up to 100 components.

The use of integrated circuits meant that the printed circuit boards of devices such as calculators became much more compact. Integrated circuits began to be used in computers in the late 1960s, but the central processing unit of a computer required thousands of integrated circuits. In 1968, Robert Noyce cofounded Intel, which began to develop large-scale integrated circuits. While Noyce had predicted that the way forward would be to fit the whole central processing unit onto a single chip, it was one of his employees, Ted Hoff, who actually achieved that. Hoff developed the Intel 4004 chip, the world’s first microprocessor, which made the pocket calculator possible. In terms of mathematical processing power, the Intel 4004 chip was virtually the equivalent of ENIAC. However, its limitation was that, as a 4-bit chip (meaning that it could handle four binary digits simultaneously), it could not process alphabetical characters, because four bits can define only 16 distinct values. The IBM 7030 computer of 1961 had established the 8-bit character, or byte, as the standard for general computing. Intel launched its first 8-bit microprocessor, the 8008 chip, in 1972, followed by the improved 8080 chip in 1974, paving the way for the first generation of home computers. An 8-bit chip can define 256 different characters.
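
The arithmetic behind these limits is simply powers of two: an n-bit word can represent 2^n distinct codes, so four bits give 16 possibilities and eight bits give 256, enough for a full set of letters, digits, and symbols. The short Python sketch below is purely illustrative and is not tied to any particular Intel part.

```python
# An n-bit word can encode 2**n distinct values: 16 for the 4-bit words of
# early microprocessors, 256 for the 8-bit byte that became the standard
# character size.
for bits in (4, 8, 16):
    print(f"{bits}-bit word: {2 ** bits} distinct values")

# Expected output:
# 4-bit word: 16 distinct values
# 8-bit word: 256 distinct values
# 16-bit word: 65536 distinct values
```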

Miniaturization of the Precision Camera

The Kodak Brownie was technically very basic. The cheap fixed-focus lens was adequate for snapshots of places and people, but was incapable of close-up photography. Meanwhile, a number of technical advances benefited professional photographers. German camera manufacturers led the way in this sector of the market. The first anastigmatic lens, the Protar f7.5, was developed in 1889 by the German physicist Paul Rudolph for Carl Zeiss, manufacturers of optical equipment based in Jena, eastern Germany. An anastigmatic lens guarantees that all points of the image are accurately aligned in both the vertical and horizontal planes. The Zeiss Tessar lens of 1902 increased the maximum aperture to f4.5, which permitted shorter exposures. In 1898, the American inventor William F. Folmer developed the Graflex camera, whose focal-plane shutter made high-speed photography possible. The first compact precision camera, the 35 mm Leica, was manufactured by the German company Leitz, based in Wetzlar. Although Oskar Barnack developed the prototype in 1914, the production model was introduced at the Leipzig trade fair only in 1925. With its rangefinder optical viewing system, interchangeable lenses, and range of accessories, the Leica became an industry standard, and the 35 mm format is the dominant format today. The Zeiss Ikon, marketed from 1932, was another popular professional 35 mm camera.

The portability of the Leica also encouraged the growth of photojournalism, although many publishers insisted on contact printing from large-format negatives. A number of magazines with high photographic content, including Life in the United States and Picture Post in Britain, were launched in the 1930s. Reflex cameras were invented in Britain in the nineteenth century but became a standard camera type only in the 1930s. The key advantage of the reflex camera is that the photographer sees the same image that the lens “sees,” enabling accurate focusing. This is achieved through an arrangement of angled mirrors and prisms that reflects the image entering the lens to the viewfinder. The single-lens reflex (SLR) camera, invented in 1861, was followed by the twin-lens reflex (TLR) camera in 1880, which had a separate lens supplying the viewable image above the main lens. In 1929, the German company Rolleiflex produced a TLR camera with a large viewing screen in the top panel, which became a popular professional model.

SLR cameras, which were compact and more suitable for amateurs, were available from the mid-1930s but did not become common until the 1960s, when Japanese camera makers brought out more affordable models. They were easier for hand-held, rather than tripod, use because they had a conventional front-facing viewfinder, which received the image from a hinged mirror that swung back out of the path of the lens when the shutter release was activated. A subminiature precision camera, the Minox, was produced from 1937 by the Latvian company V.E.F.

Alessi

Giovanni Alessi Anghini established a plate-turning workshop at Bagnella, Omegna, Italy, and founded the Alessi Company in 1921. It initially worked in nickel silver and brass and later electroplated with nickel, chrome, and silver. The first articles produced were coffeepots, trays, and table accessories. In 1928 the company moved to Crusinallo in order to utilize hydroelectric power and began to shift from the traditional turned products to pressed ones in stainless steel.

Alessi always produced stylish products, including Carlo Alessi Anghini’s Bombe coffee set of 1945. Ettore Alessi, the technical director, opened the company up to collaboration with external designers in 1955. Working with architects, the company produced objects such as the stainless-steel-wire Citrus basket that is still in production. When Alberto Alessi took over the running of the company, he began to use star designers, and the company increased its reputation for stylish objects during the 1970s, working with Ettore Sottsass from 1972. This trend continued into the 1980s and 1990s with commissions for kettles, coffee sets, and table accessories from well-known designers and architects, including Michael Graves, Philippe Starck, Aldo Rossi, Richard Sapper, Robert Venturi, and Frank Gehry. Although not always the most functional of objects, these designs were seen as status symbols for the style conscious. Produced for the top end of the market, these and in-house Alessi designs have influenced the look of many mainstream products. In the 1990s Alessi collaborated with Philips on kettles and toasters.

Alessi is an unusual company with a mission to act as a patron for designers; according to Alberto Alessi, it is “not a normal factory—it is closer to being an applied art research laboratory.” It produces three ranges, Alessi (mass-produced stainless steels and plastics), Officina Alessi (small or middle-series production, including reproductions of outstanding late-nineteenth- and early-twentieth-century designs by the likes of Christopher Dresser and Marianne Brandt), and Tendentse (porcelain). Still a family-owned company, it continues to celebrate

Cookers

The character of cooking in the home underwent a dramatic transformation during the twentieth century, partly as a result of technological developments, but also as a result of social changes. In 1900, most households had coal-fired ranges with solid hotplates above small ovens and consumed relatively little preprocessed food. On the whole, processed foods were valued more for their longer shelf lives than for time savings in preparation and cooking. A hundred years later, most households had freestanding or built-in gas or electric cookers (stoves in American parlance) and consumed a wide range of processed foods.

Beverages

Even simple tasks like brewing tea and coffee could be simplified by industrial processing. The flavor in coffee beans is a volatile essence, which begins to dissipate when the roasted bean is ground. Hence, traditionally, coffee beans would only be ground immediately before use. In 1878, Chase & Sanborn of Boston, Massachusetts, packaged ground, roasted coffee in sealed cans to preserve its flavor. In 1901, a Japanese-American chemist, Satori Kato, produced the first soluble instant coffee for the Pan-American Exposition in Buffalo. Eight years later, George Constant Louis Washington of New York produced a soluble coffee powder, which he sold under the George Washington brand name. However, instant coffee was not mass-produced until the late 1930s. The Swiss food company Nestlé developed a mass-production method for instant coffee in order to exploit the surplus of Brazilian coffee beans and mass-marketed the product as Nescafé from 1938. The American food giant General Foods produced an instant coffee in 1942 specifically for supply to the United States Army. It was marketed to the public as Maxwell House instant coffee after World War II. However, the American public tended to shun instant coffee, whereas in Britain and Japan, it made up about 90 percent of coffee sales. The standard drying technique involves spraying brewed coffee into a rising column of heated air, which removes the water as steam, leaving a powder residue. Freeze-drying technology improved in the 1950s and was applied to instant coffee in the mid-1960s. Freeze-dried coffee retained more flavor because more of the volatile oils remained.

Although coffee is the dominant hot beverage in the United States, the British public has always preferred tea. This may explain why the idea of the tea bag originated in the United States, where consumers needed more persuasion to drink tea. In 1904, a New York tea and coffee merchant, Thomas Sullivan, decided to send customers tea samples in muslin pouches. It was in this form that tea bags were first commercially produced in the United States in 1919. At first, manufacturers saw the catering industry, rather than private consumers, as the main market for tea bags, but by the mid-1930s, Tetley, of New York, was mass-marketing tea bags. In Britain, the public at first shunned the tea bag as an inferior product. This was justified insofar as tea bag manufacturers were able to use the fine “sweepings,” previously treated as a waste product. These sweepings would have leaked out of the paper cartons used to package loose-leaf tea. Improvements in tea bag technology, giving improved infusion, helped to sell the concept of the tea bag. By 1993, over 80 percent of tea sold in Britain was in the form of tea bags.

Early Computing

The origins of the computer lie in the development of large mechanical calculating machines from the early nineteenth century. The English mathematician Charles Babbage is considered to have invented the concept of the programmable computer when he devised his Analytical Engine, which was never completed. Babbage’s idea of storing instructions on punched cards was adopted with commercial success by the American inventor Herman Hollerith. Hollerith’s first punch-card data-processing machine was developed specifically for tabulating U.S. census returns in 1890. From 1896, Hollerith’s company, the Tabulating Machine Company (which later became part of the International Business Machines Corporation), built similar machines for a range of uses.

While punched-card calculating machines proved an effective means of speeding up lengthy tabulations, they were not suitable for carrying out more complex mathematical tasks, such as differential equations. In 1876, the Irish physicist Sir William Thomson (later Lord Kelvin) put forward the concept of a mechanical differential analyzer for solving differential equations. However, as with Babbage, Thomson’s ideas were too advanced for contemporary engineering capabilities. The idea of the differential analyzer resurfaced in the 1930s. In the mid-1920s, the American scientist and electrical engineer Vannevar Bush began work on a mechanical-electrical differential analyzer, which he called the product integraph. As the product integraph could only solve the simplest differential equations, in 1930 Bush began to develop a more complex differential analyzer that could handle eighteen independent variables.

The leading European computing pioneers of the 1930s included Douglas Hartree, who constructed the first British differential analyzer, and Konrad Zuse, a German engineer who built the first binary calculator, fed by a punched-tape reader, in 1938. The significance of Zuse’s work is that it laid the foundations for digital computing. Earlier mechanical and electromechanical calculating machines were analogue computers, meaning that each of their components yielded a range of values, which combined to produce a result. Zuse’s binary calculator was based on the binary algebraic method of the nineteenth-century English mathematician George Boole, who demonstrated that equations could be reduced to a series of true or false propositions. This is known as Boolean logic, and in binary code the values of 0 and 1 are used to represent false and true. The advantages of the binary system became more apparent when electronic computers were developed in the late 1940s. The binary system lends itself perfectly to circuits where the state at any point depends on the presence or absence of a pulse of current or the low or high voltage of a component. A long series of bivalue electronic transactions is much simpler to engineer reliably and much more flexible in terms of program routines than fewer transactions with many possible values. In 1939, John V. Atanasoff and Clifford Berry of Iowa State University built the world’s first electronic calculator, which stored binary data on rotating drums.
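
Boole’s reduction of arithmetic to true-or-false propositions can be made concrete with a short sketch. The Python fragment below (a minimal illustration, not drawn from Zuse’s actual designs) builds a one-bit “half adder” from the logical operations XOR and AND, showing how the binary digits 0 and 1 carry both logical and arithmetic meaning.

```python
# Boolean logic applied to binary arithmetic: a one-bit "half adder" built
# from two logical operations. This is an illustrative sketch only.

def half_adder(a, b):
    """Add two single bits, returning (sum_bit, carry_bit)."""
    sum_bit = a ^ b  # XOR: true when exactly one input is true
    carry = a & b    # AND: true only when both inputs are true
    return sum_bit, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
# 1 + 1 gives sum 0 with carry 1, i.e. binary 10 (decimal 2).
```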

Intelligence and Interoperability

In 1981, the Japanese government announced the launch of the Fifth-Generation Computer Project. While the preceding four generations were defined by their core electronic characteristics—valves, transistors, integrated circuits, and microprocessors—the fifth generation was a more holistic concept. The objective of the project was to develop computers with artificial intelligence over a ten-year period. The idea of artificial intelligence was not a new one. The term came into use in the mid-1950s, and long-term research had been undertaken in the United States at universities including Stanford and Carnegie-Mellon. However, the focus had been much narrower and progress had been limited. While the algorithmic logic advocated by von Neumann was expressed in a sequential data-processing architecture, artificial intelligence required a parallel processing architecture, which was more akin to the heuristic reasoning patterns of the human brain. Heuristic reasoning is the process whereby we draw on our knowledge of many things to infer solutions to problems. The success of the solutions depends on our expertise, in terms of the quality and range of our knowledge. By 1980, only the most powerful computers, known as supercomputers, used parallel processing. In 1965, the U.S. company Control Data Corporation introduced the first supercomputer, the CDC 6600, designed by Seymour Cray, who went on to set up Cray Research, which became the leading producer of supercomputers. These computers were designed for complex tasks such as weather and aerodynamic modeling.
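
To make the contrast between sequential and parallel processing concrete, the sketch below divides one workload among several worker processes using Python’s standard library. It is a loose modern analogue only: the workload (summing squares over ranges of integers) is invented for illustration and has no connection to the Fifth-Generation Project’s actual software.

```python
# Sequential versus parallel processing of the same workload.
# The task itself (summing squares of integer ranges) is invented purely
# for illustration.
from concurrent.futures import ProcessPoolExecutor


def sum_of_squares(chunk):
    """Process one chunk of the workload."""
    return sum(n * n for n in chunk)


def main():
    chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]

    # Sequential: a single processor works through the chunks in order.
    sequential_total = sum(sum_of_squares(c) for c in chunks)

    # Parallel: the chunks are handed to several worker processes at once.
    with ProcessPoolExecutor() as pool:
        parallel_total = sum(pool.map(sum_of_squares, chunks))

    # Both routes give the same answer; the parallel route can use several
    # processors simultaneously.
    assert sequential_total == parallel_total
    print(sequential_total)


if __name__ == "__main__":
    main()
```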

Aside from artificial intelligence, the other significant strategic trend of the 1980s and 1990s has concerned the development of software that allows greater interoperability of computers. While hardware compatibility is one way of achieving interoperability, it was evident that total hardware compatibility was extremely unlikely to occur. UNIX, the pioneering hardware-independent operating system, had provided a means of establishing a common platform across a network of nonmatched computers. In the early 1990s, another operating system on similar lines, Linux, named after its Finnish inventor Linus Torvalds, was released as “open source,” a term for nonproprietary software whose source code is made freely available. Further developments in open systems were stimulated by the introduction of the World Wide Web in 1993. As the whole concept of the Web is that it is accessible to all computers irrespective of platform (hardware and operating systems), it fostered new languages and applications. It has become accepted for Web applications, such as browsers and document readers, to be made available as freeware. One of the first of these applications was Adobe’s Acrobat Reader, which allows documents to be downloaded onto any platform. The leading web browsers, Netscape Navigator and Microsoft’s Internet Explorer, were introduced as freeware, in 1994 and 1995, respectively. Released in 1995, Sun Microsystems’s Java programming language has become the main language for Web applications.

Baird Television Company

John Logie Baird, the first person to transmit television pictures, was born in Helensburgh, Dumbartonshire, Scotland, in 1888. He studied electrical engineering at the Royal Technical College, Glasgow, and began a degree at Glasgow University that was suspended by the outbreak of World War I. Ill health, which was a recurrent feature of his life, ruled him out of military service. Instead, he became superintendent engineer of the Clyde Valley Electrical Power Company. After the war, Baird did not resume his degree. He set up a successful business, marketing a range of goods including soap and patent socks.

In 1922, Baird suffered a serious physical and nervous breakdown, which made him unable to continue working. He began to experiment with television after moving to Hastings on the English south coast. Baird developed a mechanical scanning system, based on a design patented by the German engineer Paul Nipkow in 1884. At this stage, Baird’s experiments were a hobby with no immediate business prospects, so he was forced to improvise by using cheap or waste materials, such as biscuit (cookie) tins and bicycle lamp lenses. In early 1924, he succeeded in transmitting a still image of a Maltese cross to a receiver in the same room. Convinced of the potential of his invention, he moved to London and was hired to give television demonstrations in Selfridge’s department store. With family financial backing, he set up Television Ltd. and refined his basic technology to improve the quality of the picture. By October 1925, he was able to transmit the live image of a person. He repeated this demonstration for members of the Royal Institution in January 1926. Baird then applied for a license to transmit television signals and began trials over a distance of 10 miles. In 1927, he made the first long-distance telecast from London to Glasgow. The next milestone for Baird came in 1928 with the first transatlantic television broadcast from London to a radio station in Hartsdale, New York.

With new financial backing, Baird formed the Baird Television Development Company in 1927 and set up a studio near the Crystal Palace headquarters of the British Broadcasting Corporation (BBC) in 1928. He negotiated a contract with the BBC to provide trial television broadcasts, initially twice weekly for half an hour, using its Crystal Palace transmitter. Baird’s receivers, known as “televisors,” cost the equivalent of three months’ average wages. Not surprisingly, fewer than a thousand homes in London invested in this novelty. In 1932, the BBC decided to take control of Baird’s broadcasts.

More ominous for the long-term prospects of the Baird system was the launch of a powerful television consortium. EMI and Marconi, aware of American experiments with electronic television that promised picture delivery superior to Baird’s 30-line picture at 12.5 frames per second, had been conducting their own research and development. In 1934, they formed the Marconi-EMI Television Company. A parliamentary committee, the Selsdon Committee, was set up in 1934 to investigate the existing systems and recommend standards of service. Baird decided to improve the performance of his system by making a licensing agreement with the American inventor, Philo Taylor Farnsworth, for use of his Image Dissector. In 1935, the BBC was given responsibility for television broadcasting and invited the Baird Television Company and Marconi-EMI to carry out trial broadcasts at “high definition” picture quality, defined as at least 240 lines. After four months of trials, in February 1937, the BBC decided in favor of the Marconi-EMI 405-line system.

The Baird system was rendered redundant, but Baird himself received some consolation when his pioneering work was rewarded with the Gold Medal of the International Faculty of Science, which had never previously been awarded to a Briton. Baird’s company continued to manufacture televisions that met the Marconi-EMI standard, while Baird himself pursued a new challenge—color television. In 1928, he had demonstrated color television using mechanical scanning, and he now returned to the development of color television. He experimented with a mixture of electronic and mechanical techniques that yielded 600-line color television pictures by late 1940. Earlier in 1940, the Rank Organisation had taken control of the Baird Television Company, which became Rank Cintel Ltd., leaving Baird free to pursue his color television interests. By 1944, he had developed the Telechrome tube, a two-color system that used two electron guns whose beams converged on a translucent screen that was coated on one side with blue-green phosphors and on the other side with red-orange phosphors. He found another financial backer in British music hall star and actor Jack Buchanan and set up John Logie Baird Ltd. Unfortunately, this new venture proved to be short-lived, as Baird died in 1946.

The Mainframe and Mini Computers

In 1948, IBM decided not to manufacture computers commercially, believing, based on market research, that expense and size were prohibitive factors. Howard Aiken, who had built the Harvard Mark I in collaboration with IBM engineers, remarked in 1950 that he could never see the need for more than six computers in the world. However, other scientists who had built prototype computers thought otherwise and assisted in the development of commercial models. The Manchester University team collaborated with the Manchester-based electrical engineering and electronics company Ferranti to create the first commercial computer, the Ferranti Mark I, launched in 1951. Eckert and Mauchly set up a commercial partnership in 1947 but sold their business to Remington Rand three years later. They developed the first commercial American computer, UNIVAC, for Remington Rand in 1951. The original UNIVAC model, supplied to the U.S. Bureau of the Census, was the first computer to use magnetic tape for storage. More unusually, the Cambridge University team entered into collaboration with the catering company J. Lyons, which operated a chain of tea shops, to develop the LEO (Lyons Electronic Office) computer for processing business data.

IBM soon reassessed its position and in 1952 introduced its first commercial computer, also its first electronic computer, the model 701. IBM quickly acquired a reputation for innovation in computing and overtook Remington Rand's early lead in the U.S. computer market. It recognized that the high power consumption (up to 100 kilowatts) and heat output of valve computers were disadvantageous, causing valves to burn out too frequently and creating uncomfortable working conditions. The alternative to the valve was the smaller and more resilient solid-state transistor, invented in December 1947 by John Bardeen, Walter Brattain, and William Shockley at Bell Laboratories of AT&T. At first, commercial transistor production had a high failure rate, so transistors were expensive. As an intermediate measure, IBM began to manufacture hybrid computers incorporating valves and transistors, which brought some reductions in size and power consumption. The experimental hybrid model 604 computer, built in 1954, led to the commercial model 608 computer of 1957. IBM contracted Texas Instruments, a company that began as a manufacturer of geophysical instruments in the 1930s and moved into the semiconductor industry in the 1950s, as its transistor supplier. Two years after the model 608 appeared, the IBM model 7090 computer was fully transistorized. Reductions in size were not only beneficial to customers in terms of space savings, but also increased the speed of data processing because the electric impulses had less distance to travel.

Computer storage capacity also improved during the 1950s. In 1953, Jay Forrester of the Massachusetts Institute of Technology installed the first magnetic core memory in the Whirlwind computer, which had been developed specifically for the U.S. Navy in the 1940s. IBM's contract to develop a successor to the Whirlwind, the SAGE computer of 1956, provided the opportunity to work on magnetic core memory and magnetic drum storage. The magnetic drum evolved into the magnetic disk. In 1957, IBM's model 305 Random Access Method of Accounting and Control (RAMAC) introduced the world's first commercial computer disk storage system. In the 1950s, there was no concept of generic software, as each computer was programmed to perform the specific tasks required by the individual client. The programming process was simplified by the development of high-level computer languages that were designed for particular programming purposes. The high-level languages were supported by interpreter or compiler programs, which translated the language into binary machine code. The first of these languages, introduced by IBM in 1956, was FORTRAN (FORmula TRANslation), which was intended for scientific and mathematical programs. For business applications, COBOL (COmmon Business Oriented Language) was introduced in 1959.
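To make the idea of a high-level language concrete, the short sketch below expresses a simple scientific calculation as a formula and leaves the translation into machine code to a compiler, which is exactly the division of labor FORTRAN introduced. It is written in modern C++ purely for illustration, and the projectile formula and figures are an invented example rather than anything taken from a period program.

// Illustrative only: a modern C++ rendering of the kind of formula-oriented
// calculation that early high-level languages such as FORTRAN were designed
// to express. The compiler, not the programmer, produces the machine code.
#include <cmath>
#include <cstdio>

int main() {
    const double pi = 3.14159265358979;
    const double g = 9.81;                   // gravitational acceleration, m/s^2
    const double v0 = 300.0;                 // assumed launch speed, m/s
    const double angle = 45.0 * pi / 180.0;  // launch angle in radians

    // Idealized projectile range: R = v0^2 * sin(2 * angle) / g
    const double range = v0 * v0 * std::sin(2.0 * angle) / g;
    std::printf("idealized range: %.1f metres\n", range);
    return 0;
}

A FORTRAN version would differ in syntax, but the principle is the same: the programmer writes the formula, and the compiler generates the low-level instructions.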

These large computers running specialized programs became known as mainframe computers. IBM had sold 1,800 mainframe computers by 1960 and 12,000 by 1964. IBM's sales philosophy placed great emphasis on a continuing close relationship with customers. However, by the early 1960s, it became clear that smaller customers might favor a more generic approach. In 1963, the American company Digital Equipment Corporation (DEC) introduced the PDP-8, the world's first minicomputer. The launch of the more generalist minicomputer was closely followed in 1964 by the development of BASIC (Beginner's All-purpose Symbolic Instruction Code), a general-purpose language designed for beginners. BASIC was written by John Kemeny and Thomas Kurtz at Dartmouth College. IBM did not immediately embrace the change in business strategy that the minicomputer represented, as it had too much invested in mainframes. However, it did respond by developing a more flexible type of mainframe architecture. In 1964, IBM launched the System/360 computer, which was conceived of as a "family" of mainframe equipment. System/360 was modular rather than highly tailored and offered a choice of processors, peripherals, and complementary software packages, allowing upgrading or expansion over time. It was a commercial success, and total sales of IBM computers rose to 35,000 by 1970.

The long-term future of the mainframe was threatened by developments that made it possible to link up, or network, separate computers. AT&T's core business gave it a vested interest in computer systems that were interoperable and could accommodate multiple users, as telephone exchanges must. In 1969, Bell Laboratories developed the UNIX operating system, which became widely used for networking computers. Bell researchers developed a high-level, general-purpose computer language, C, which made UNIX compatible with virtually any of the existing minicomputers. When C became too restrictive for more demanding computer applications, it was modified by a Bell Laboratories researcher, Bjarne Stroustrup, to become C++, introduced in 1983. C++ incorporates object-oriented programming, a more flexible way of modeling data relationships, and has become one of the most widely used programming languages.
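As a rough illustration of what object-oriented programming adds, the C++ sketch below defines a small class hierarchy in which data and the operations on it travel together and related types share a common interface. The class names and the scenario are invented for this example and are not drawn from UNIX or any Bell Laboratories code.

// Illustrative C++ only: object-oriented modeling bundles data with behavior.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// A common interface for machines on a network (hypothetical example).
class Node {
public:
    explicit Node(std::string name) : name_(std::move(name)) {}
    virtual ~Node() = default;
    virtual void describe() const { std::cout << "node " << name_ << '\n'; }
protected:
    std::string name_;
};

// A derived type reuses the base class and overrides its behavior.
class Terminal : public Node {
public:
    using Node::Node;
    void describe() const override { std::cout << "terminal " << name_ << '\n'; }
};

int main() {
    std::vector<std::unique_ptr<Node>> machines;
    machines.push_back(std::make_unique<Node>("mainframe"));
    machines.push_back(std::make_unique<Terminal>("tty1"));
    for (const auto& m : machines) m->describe();  // dynamic dispatch picks the right version
    return 0;
}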

Convenience Foods

Convenience food is very much a twentieth-century concept. In the nineteenth century, the main reason for processing food before sale was to increase its shelf life. This was a matter of increasing concern, given that the growth of the urban population meant that food had to be transported from further and further afield to the place of consumption in order to meet rising demand. Bottling, canning, and drying were methods that assisted food preservation and were amenable to mass-production and distribution. The archetypal canned food is Heinz baked beans, made by the U.S. H. J. Heinz Company, which is now sold all over the world. The disadvantage of canned foods was that the high temperatures at which the food was cooked, in order to kill enzymes and bacteria, also destroyed some vitamins. Canned foods also have a high content of sugar and salt, which are used as flavor enhancers.

Increasing production of preserved foods containing additives led governments to impose legal standards. In Britain, the Sale of Food and Drugs Act of 1875 imposed much stricter guidelines and penalties than earlier legislation. In the United States, the Pure Food and Drug Act was passed in 1906. A number of minor religious sects stressed the importance of a healthy diet. Notable amongst these were the Seventh-day Adventists, whose headquarters were in the small town of Battle Creek, Michigan. The name of Battle Creek became familiar internationally owing to its emergence as the center of breakfast cereal production. The Adventists championed breakfast cereals because of their nutritional value, but the cereals became popular in the twentieth century because of their convenience. It was the convenience factor that spurred the development of new preservation techniques, including deep-freezing, irradiation, and freeze-drying. These techniques not only extended the life of food, making fewer shopping trips necessary, but also shortened cooking times, an increasingly important factor as more women went out to work. The convenience of bulk buying led to a shift in food retailing from the local store offering personal service to the self-service supermarket. By 1959, supermarkets accounted for 69 percent of American food sales. In Britain, supermarkets were slower to take hold, but were dominant by the 1970s.

The First True Computers

World War II stimulated computer development, as military advantages could be gained by designing weapons according to more sophisticated ballistic calculations and by deciphering the encoded communications of the opposing side. In the early 1930s, the U.S. Navy Board of Ordnance sponsored the American mathematician Howard Aiken, and in 1939, in collaboration with engineers at the International Business Machines Corporation (IBM), he was contracted to develop a machine for ballistic calculations. Aiken's electromechanical Automatic Sequence Controlled Calculator, also known as the Harvard Mark I, was completed in 1944 at a cost of $500,000. It was operated by a punched-tape program and weighed 5 tons.

In Britain, computer research efforts were concentrated on code breaking. Alan Turing, the British mathematician who in 1936 had formulated his vision of a "universal computing machine," was a key member of the Bletchley Park code-breaking team, which cracked the supposedly impregnable German Enigma code and later built the Colossus code-breaking machine, but, for obvious reasons, the work was kept top secret. The most influential of the computers developed in the course of military research was not completed until 1946. This was the Electronic Numerical Integrator and Calculator (ENIAC), commissioned by the U.S. Army Ordnance Department. ENIAC was built by a team at the University of Pennsylvania, led by John Presper Eckert and John William Mauchly. ENIAC, which drew on Atanasoff and Berry's earlier design, was the world's first electronic computer. Weighing 30 tons and occupying 160 square meters (1,600 square feet) of floor space, it contained 19,000 thermionic valves, which acted as gates controlling the flow of electric current. Each calculation was programmed by operators feeding in punched cards, and the results were also presented on punched cards.

Feeding in punched cards was a slow and laborious process, so university scientists elsewhere began working on methods of internal program storage. In 1945, the eminent Hungarian-born American mathematician John von Neumann outlined his theory of a stored-program computer with a central unit to control and process operations in sequence and with read-write random access memory. In Britain, teams at the Universities of Manchester and Cambridge were also addressing this issue. The Manchester team was led by Freddie Williams and Tom Kilburn and assisted by Alan Turing. In 1948, the Manchester electronic computer, known as the Small Scale Experimental Machine (SSEM) and nicknamed the Baby, ran the world's first stored program, which was held on cathode ray tubes. Von Neumann's ideas first came to fruition in the Electronic Delay Storage Automatic Calculator (EDSAC), built at Cambridge University and operational from 1949. EDSAC used mercury delay line storage, a technology developed at the Massachusetts Institute of Technology. EDSAC was completed in advance of the von Neumann computers developed in the United States, namely the Electronic Discrete Variable Automatic Computer (EDVAC) at the University of Pennsylvania and the MANIAC-1 computer at the Institute for Advanced Study at Princeton.
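The stored-program principle lends itself to a toy demonstration. The C++ sketch below uses an invented three-instruction machine, not the actual SSEM or EDSAC instruction sets, but it captures the essential idea: program and data sit in the same read-write memory, and a central unit fetches, decodes, and executes instructions in sequence.

// Toy stored-program machine: program and data share one memory, and a
// central loop fetches, decodes, and executes instructions in sequence.
#include <array>
#include <iostream>

enum Op { HALT = 0, LOAD = 1, ADD = 2 };  // invented mini instruction set

int main() {
    // Memory holds instruction pairs (operation, operand address) and data alike.
    std::array<int, 16> mem = {
        LOAD, 8,   // acc = mem[8]
        ADD,  9,   // acc = acc + mem[9]
        HALT, 0,
        0, 0,
        5, 7,      // data values at addresses 8 and 9
        0, 0, 0, 0, 0, 0
    };

    int pc = 0;   // program counter
    int acc = 0;  // accumulator
    for (bool running = true; running; ) {
        const int op = mem[pc];
        const int addr = mem[pc + 1];  // fetch
        pc += 2;
        switch (op) {                  // decode and execute
            case LOAD: acc = mem[addr]; break;
            case ADD:  acc += mem[addr]; break;
            case HALT: running = false; break;
        }
    }
    std::cout << "result: " << acc << '\n';  // prints 12
    return 0;
}

Because the program itself lives in ordinary read-write memory, it can be changed without rewiring the machine, which is the advance the SSEM and EDSAC demonstrated.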

Consumers

The growth of industrialization in the nineteenth century was stimulated by, and linked to, a rising population that created bigger markets. Modern capitalism became established in association with many of these developments. Innovations in technology and science were driven not only by "pure" experimentation but also by the desire to develop the results commercially. The resulting culture of mass consumption was already advanced in Europe, Canada, and the United States at the beginning of the twentieth century and was initially enjoyed by the middle classes. The post-1945 increase in prosperity allowed more and more working people to purchase consumer durables.

Designers and manufacturers of the earlier twentieth-century domestic appliances were certainly aware of their potential markets insofar as they wanted their products to sell. Nevertheless, what market research was carried out was largely unscientific and anecdotal. Initially, they relied on the nineteenth-century premise that there were "natural" preexisting markets for a product. The role of promotion and advertising was to make sure that potential customers were attracted to their particular product. Branding, the process of giving a product an identity, was beginning to develop and was accelerated during the Depression years of the 1930s. Economists and politicians looked to increased consumption as a way out of economic slumps. The late 1920s and 1930s saw the introduction of the marketing methods and psychological selling techniques familiar today. There was a change from "getting commodities to consumers" to "getting consumers to commodities."

This was achieved by advertising techniques that, in the case of domestic appliances, were aimed specifically at women. Advertisements prompted purchase through a combination of guilt and desire. In the United Kingdom and the United States, advertisements began to illustrate the housewife, not the servant, using the appliances and exploited rising standards of cleanliness and fears about "household germs." The increasing use of labor-saving appliances may have saved time in some areas, but social and cultural pressures led to rising standards and more time spent on other areas of housework. The desire to consume was stimulated by aspirational advertisements and planned obsolescence of products.

As Americans were encouraged to become patriotic consumers many of them felt that they needed to make informed choices about the increasing range of products. In 1926 Frederick Schlink, an engineer from White Plains, New York, organized a consumer club that distributed lists of products that were seen as good value and also those “one might well avoid, whether on account of inferior quality, unreasonable price, or of false and misleading advertising.” Schlink used these lists to produce a book, Your Money’s Worth, which led to the founding of Consumers’ Research and the Consumers’ Research Bulletin in 1928.

The Consumers Union was a splinter group from Consumers’ Research and was established in 1936, following acrimonious labor relations. Its founding group of professors, labor leaders, journalists, and engineers had a mission to “maintain decent living standards for ultimate consumers,” a rhetoric born of the Depression and the strike-breaking tactics of Schlink. It remains independent of both government and industry and depends on membership subscriptions. It first published its magazine Consumer Reports in the same year, establishing a tradition of testing and rating products and services. The initial circulation was around 4,000. Appliances were and continue to be tested for performance, energy efficiency, noise, convenience, and safety. Subscriptions had risen to 100,000 by 1946 and continued to grow, even during the McCarthy era when Consumer Reports was listed as a subversive magazine. The Consumers Union now has over 4.6 million subscribers, a children’s magazine (launched in 1980 as Penny Power, now known as Zillions) and a web site.

In the United Kingdom, the Good Housekeeping Magazine was established in 1922, largely aimed at the servantless middle-class woman. It founded the Good Housekeeping Institute in 1924 to test recipes and "submit all domestic appliances to exhaustive tests and bring those approved to the notice of all housewives," which it continues to do today. The UK Consumers Association, based on the U.S. Consumers Union, was founded in 1956 and first published Which?, a quarterly magazine of tests and reports, in 1957. Which? became a monthly magazine in 1959. The UK Consumers Association currently has over a million members. The International Organization of Consumers Unions was established in 1960 and includes consumer associations from the United States, the Netherlands, Belgium, and Australia.

The marketing trends of the 1930s continued after 1945, and in-depth market research developed throughout corporate America in the 1950s. The British Market Research Association was established in 1957, the same year that Vance Packard's critical study of advertising, The Hidden Persuaders, was published in the United States. The following quotations from Packard's book illustrate how the advertising industry continued to use the twin themes of guilt and desire in the postwar boom years.

The cosmetic manufacturers are not selling lanolin, they are selling hope. . . . We no longer buy oranges, we buy vitality, we do not buy just an auto, we buy prestige.

If you tell the housewife that by using your washing machine, drier or dishwasher she can be free to play bridge, you’re dead! She is already feeling guilty about the fact that she is not working as hard as her mother. You are just rubbing her up the wrong way when you offer her more freedom. Instead you should emphasize that the appliances free her to have more time with her children and to be a better mother.

Advertisements of the period support this. A Hotpoint ad from Good Housekeeping of June 1951 carries the copy “Save 8 Hours Every Week with a Hotpoint All-Electric Kitchen—Gain Extra Time for All Your Extra Duties.” The time saved, the advertisement suggests, is “for your family as well as the many added duties you’re called on to shoulder these days.” Needless to say, the “you” in question was female.

These quotes reflect a set of cultural values that were already in the process of being challenged by the feminist, civil rights, and youth movements of the 1950s and 1960s. Unsafe at Any Speed, by the American lawyer and consumer advocate Ralph Nader, was published in 1965 and exposed the lack of safety in the General Motors Corvair automobile. Nader joined the Consumers Union in 1967. Congress passed twenty-five pieces of consumer legislation between 1966 and 1973.

The advertisers and manufacturers varied in their ability to respond to these social and cultural changes. The rise of the affluent teenager created a new market, one that clothing, publishing, and cosmetics companies responded to with vigor. The domestic appliance companies also had to change. By the late 1970s, the impact of feminism had been such that the second comment quoted from Packard was no longer tenable as an advertising concept, even though the situation it described was still a reality for many women. A mid-1960s ad for a Nevastik Teflon-coated frying pan from the UK Good Housekeeping Magazine had the copy, "Even a Man Can't Go Wrong with Nevastik Pans."

Market research had become more sophisticated, and markets were increasingly divided into socioeconomic groups that could become target markets. This analysis became more sophisticated during the 1980s and 1990s as markets were segmented by postal areas and lifestyles.

It has been assumed that manufacturers and consumers stood in opposition to each other, with the consumer organizations acting as monitors and protectors of the latter's interests. Indeed, the efforts of consumer organizations have led to legislation to improve safety standards and consumers' rights after a purchase has been made. But it would be wrong to believe that consumers have been passive recipients of what the producers have given them and that a docile and uncritical public leads to low standards of design. It has been argued that consumers' desires and needs have been created by the producers and, with the aid of their advertisers, have been satisfied by those producers. This implies that consumption is a less authentic and satisfying activity than, for example, working. It also seems to imply that popular forms of culture and material culture are superficial. Given the sophisticated nature of advanced capitalist societies, this attitude can be contested: needs are often no longer natural, but cultural, informed by the many connections and discontinuities within those societies. Many modern objects do not simply—or, indeed, primarily—have "use or exchange" value but more importantly have "identity" value. This can clearly be seen in some of the more fashionable domestic appliances of the 1980s and 1990s. A Dyson vacuum cleaner or a Sony Walkman is a successful piece of technology, but each has also become a purchase that reinforces its own brand identity and defines the identity of the consumer. The same can be said of older products such as the Aga cooker or the more self-knowing products from the Alessi stable.

The late twentieth century has produced a society where manufacturers, designers, and consumers are linked, knowingly or not. Companies continue to conduct market research but are also quicker to respond to and appropriate ideas that often bubble up from within popular or mass culture. This "circuit of culture" links the identity, production, consumption, regulation, and representation of a commodity within a circular relationship. This model has applied increasingly to domestic appliances over the last twenty years. Many domestic products that were once almost culturally invisible are now recognized as having a meaning. Consumers are now generally more sophisticated and are able to "read" the intended meanings of the manufacturers and to construct or appropriate their own, which will in turn influence the manufacturers and affect how a product is marketed or modified. Nevertheless, the findings of the 1960 UK Molony Report on consumer protection remain valid.

The business of making and selling is highly organized, often in large units, and calls to its aid at every step complex and highly expert skills. The business of buying is conducted by the smallest unit, the individual consumer, relying on the guidance afforded by experience, if he possesses it, and if not, on instinctive but not always rational thought processes.

The Graphical User Interface

By the mid-1980s, personal computers were becoming common in the workplace, but they were still rare in the home. Expense was not the only factor; the skills required to operate a computer and its limited usefulness in the home also held back ownership. While the microcomputer was domestic in scale, it made few concessions to the casual user in terms of usability. Personal computers were marketed as "user-friendly," but many people were intimidated by disk operating systems that offered only an enigmatic prompt, signifying the active disk drive, on the opening display screen. Apple again demonstrated its inventiveness when it introduced the Lisa in 1983. The Lisa introduced the graphical user interface (GUI), a screen display that showed program options as graphic icons, pull-down menus from menu bars, and "windows," screens that could be overlaid and resized. It also offered a pointing device called a mouse as an alternative to the keyboard for navigating and activating menu commands. The computer mouse had been developed in the 1960s at the Stanford Research Institute by Douglas Engelbart, who obtained a patent in 1970. It was commercially developed by the Xerox Corporation in the 1970s, but only became a standard computer device when GUI displays arrived.
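At its core, a GUI turns mouse and keyboard input into events and routes each event to whichever on-screen object, such as a window, icon, or menu, lies under the pointer. The C++ sketch below is a purely hypothetical, console-only illustration of that dispatch idea; it is not based on the Lisa, Macintosh, or Windows source code, and the widget names are invented.

// Hypothetical sketch of GUI-style event dispatch: a simulated mouse click is
// routed to the topmost on-screen element whose bounds contain the pointer.
#include <iostream>
#include <string>
#include <vector>

struct ClickEvent { int x, y; };

class Widget {
public:
    Widget(std::string name, int x, int y, int w, int h)
        : name_(std::move(name)), x_(x), y_(y), w_(w), h_(h) {}
    bool contains(const ClickEvent& e) const {
        return e.x >= x_ && e.x < x_ + w_ && e.y >= y_ && e.y < y_ + h_;
    }
    void activate() const { std::cout << name_ << " activated\n"; }
private:
    std::string name_;
    int x_, y_, w_, h_;
};

int main() {
    // Two overlapping screen elements; the last one drawn sits on top.
    std::vector<Widget> screen = {
        {"document window", 0, 0, 640, 400},
        {"File menu", 0, 0, 60, 20},
    };

    ClickEvent click{10, 10};  // simulated mouse click
    for (auto it = screen.rbegin(); it != screen.rend(); ++it) {
        if (it->contains(click)) { it->activate(); break; }  // dispatch to topmost hit
    }
    return 0;
}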

Although the Lisa was too expensive to have a major impact on the microcomputer market, the launch of its cheaper sibling, the Apple Macintosh, in 1984 established the GUI as the truly user-friendly face of computing. The Macintosh, familiarly known as the Mac, became particularly popular with graphic designers as it ran the first commercial desktop publishing (DTP) package, Adobe PageMaker. With its streamlined shell, the Mac was also the first microcomputer to be hailed as a design icon. While purist DOS users disparaged the Mac as a WIMP (windows, icons, menus, pointing device), Microsoft was quick to recognize the mass-market appeal of the GUI. As the developer of the Word and Excel applications for the Mac, Microsoft had privileged access to the Apple GUI program code, which became a bone of contention when Microsoft began to develop its own GUI operating system, Windows, for PCs. A legal judgment imposed restrictions on the design of the first version (1.0) of Windows, launched in 1985, but the restrictions ceased to apply thereafter. Nevertheless, it was only with the release of version 3.0 in 1990 that Windows achieved user-friendliness comparable to that of the Mac interface. The later versions, Windows 95 and 98, improved the multitasking performance of the interface, allowing separate applications to run at the same time.

Microsoft's monopoly of the PC operating system gave it a clear advantage in the development of PC applications, as its applications programmers had first access to new code. Microsoft's first PC application was the PC version of the Excel spreadsheet, introduced in 1987. Since then, its Office and Office Pro suites of business applications have become the PC market leaders.

Bang & Olufsen

The Danish company Bang & Olufsen is a leading manufacturer of top quality audio equipment and televisions. Bang & Olufsen products are renowned for their blend of high-tech performance and elegant, minimalist styling, summed up in the slogan it registered in 1931, “B&O—the Danish Quality Brand.”

Two young Danish engineers, Peter Bang and Sven Olufsen, who had met while studying at the School of Engineering in Århus, founded the company in 1925. They were fortunate to have families wealthy enough to back them financially, and their first workshop was in the attic of the Olufsen family’s country manor near Struer. Their first product was a mains radio receiver, that is, one that was powered by a wired electricity supply—unusual at a time when most radios were battery-powered. However, the company’s first commercial success was not the mains radio itself, but its eliminator, the device that rectified the incoming alternating current to produce direct current. B&O began to manufacture the eliminator as a separate device that enabled any battery-powered radio to be run off mains electricity. Expanding production led Bang & Olufsen to open its first factory in 1927 in the town of Gimsing. In 1929, the company returned to producing mains radios with the launch of a five-valve radio that delivered high output.

In the 1930s, Bang & Olufsen diversified into the production of a range of audio equipment, including gramophones, amplifiers, and loudspeakers. The company's products and advertising graphics were heavily influenced by the design aesthetics of the Bauhaus school. The key design characteristics were simple, geometric lines, detailing that emphasized the function of the product, and an absence of ornament for purely decorative effect. B&O was a pioneer of the radiogram, a radio receiver and record player combined in one cabinet. The first B&O radiogram, the Hyperbo, was launched in 1934. The tubular steel frame of the Hyperbo was influenced by the chair designs of the German Bauhaus designer Marcel Breuer. Bang & Olufsen's first radio with a Bakelite cabinet, the Beolit, was introduced in 1939; from the mid-1960s, the prefix "Beo" was incorporated in all B&O model names. Also in 1939, B&O's Master de Luxe radiogram incorporated a feature that became very popular—push-button radio-station selection. The radio was pretuned to 16 radio stations.

Bang & Olufsen went through a quiescent period during World War II because it refused to cooperate with the occupying German forces. Worse still, after liberation from German occupation in 1945, the factory was bombed by Danish Nazi sympathizers. After the rebuilding of the factory, Bang & Olufsen entered the field of television manufacture. In the 1950s, B&O commissioned a number of Danish architects, including Poul Henningsen and Ib Fabiansen, to design the cabinets for its audio and television equipment. It was keen to produce cabinets that were lighter and easier to move around. In 1962, B&O introduced the Horizon TV, its first television to be mounted on a four-wheeled metal stand.

The transistorization of audio equipment and televisions paved the way for compact, modern product designs. The Beomaster 900K, designed by the Danish architect Henning Moldenhawer, was the world’s first low-line radio cabinet, a forerunner of the stereo receivers that formed part of the popular modular hi-fi systems of the late 1960s and 1970s. The designer who did most to establish a distinctive B&O style of audio equipment was Jakob Jensen. His designs, beginning with the Beolab 5000 music system of 1965, were expressive of the technical sophistication of B&O’s products. This system introduced user-friendly sliding controls. The Beolab system was accompanied by cube stereo loudspeakers, with the angular speaker cone mounted on thin stems with a circular base. However, Jensen’s most famous design for B&O was the Beogram 4000 stereo turntable of 1972, because this introduced the world’s first tangential pickup arm. The straight double tone arm was electronically controlled by a spot of light, and its tangential path eliminated the wandering in the groove that curved arms were prone to.

Recognizing that its products were never going to achieve the mass-market penetration of rival Japanese electronics products because high quality meant high prices, B&O concentrated on lifestyle marketing and design. It targeted a wealthy international clientele for whom style and quality were the paramount product characteristics. B&O's continuing commitment to functionality and ease of use was exemplified in the controls of the 1976 Beomaster 1900 receiver. The most frequently used controls were mounted visibly at the front for easy access, while the secondary controls were behind, concealed beneath a hinged lid. Similarly concealed controls became standard on televisions in the 1980s. The other innovative feature of the Beomaster 1900 was that its buttons were touch-sensitive electronic controls rather than mechanical push buttons. The Beosystem 5000 modular hi-fi system of 1983 eliminated controls from the hi-fi units in favor of a unified remote-control panel. This concept was taken a step further in 1984 with the introduction of the Beolink 1000 remote-control unit, which incorporated television as well as audio controls.

In the 1990s, B&O broke away from stacking, modular hi-fi design in order to distinguish its products from those intended for the mainstream mass market. The Beosystem 2500 of 1991 was an integrated unit with the decks mounted vertically and therefore more visibly. The Beosystem 2500 and its successor, the BeoSound Century, also echoed the slim verticality of B&O’s televisions. Introduced in 1984, the BeoVision MX 2000 television was the first of B&O’s slim televisions. Its shallow cabinet and the minimal frame around the screen emphasized the picture, the core function. Audio and television were brought together in the BeoCenter AV5 of 1997, a complete home-entertainment system. As the twentieth century ended, Bang & Olufsen’s final contribution to user convenience was the development of the BeoVision 1 television, which incorporates an intelligent automatic program selection function, whereby the user selects the preferred types of program and the television matches the selection to the programs available.
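The automatic program selection described above is, in essence, a matching problem: the available programs are filtered against the viewer's preferred categories. The C++ fragment below is a speculative sketch of that idea only; the program titles and categories are invented, and it does not represent Bang & Olufsen's actual software.

// Speculative illustration of preference-based program selection: keep only
// the available programs whose category the viewer has chosen.
#include <iostream>
#include <set>
#include <string>
#include <vector>

struct Program { std::string title; std::string category; };

int main() {
    const std::set<std::string> preferred = {"news", "sport"};  // viewer's chosen types
    const std::vector<Program> available = {                    // invented listings
        {"Evening Bulletin", "news"},
        {"Garden Makeover", "lifestyle"},
        {"Cup Final", "sport"},
    };

    for (const auto& p : available)
        if (preferred.count(p.category) > 0)
            std::cout << "suggest: " << p.title << '\n';
    return 0;
}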

Computers

Since the creation of the first electronic computer in 1946, computer technology has evolved with unparalleled speed. Conceived as a machine to automate and accelerate the calculation of complex sums, the computer became the universal machine for business and personal use because of its ability to process verbal as well as numerical data. Ownership of computers in the home became feasible in the late 1970s when computers of desktop size were developed. As the price of personal computers plummeted and the functionality of the computer became more diverse, home ownership rose. In 1995, the United States led the rankings with computers in 37 percent of homes, while sixth-ranked Britain had computers in 25 percent of homes. Today, with appropriate software and peripheral devices, the home computer can provide many services, including processing of household financial accounts, word-processing, electronic mail, entertainment, and information.