The first commercial hand-cranked dishwasher intended for home use was shown by the Walker Company of Syracuse, New York, at the New York State Fair in 1910. Walker produced an electric version in 1918, with a small electric motor at the base to drive the agitator. Hand-cranked models continued to be sold in the 1920s. A British example was the Polliwashup machine, effusively advertised as the “greatest household labour-saver of all Time.” General Electric purchased the Walker Company in 1930 and began to remodel the dishwasher. It brought out the first square-tub model in 1932. The first electric dishwasher sold in Britain, in 1937, was a U.S. product, a Thor model made by the Hurley Machine Company.
In the late 1930s and again ten years later, a combined washing machine and dishwasher appeared on the market in the United States. The two functions were served by having a long agitator for clothes washing that was interchangeable with a short agitator plus dish rack. This model was a top loader, but from the late 1940s, as with automatic washing machines, manufacturers began to favor the front-loading design. The price of dishwashers in Britain fell when Hoover began to manufacture them there. The introduction of plastic sink-top models by companies such as Electrolux was another way of making dishwashers more affordable and at the same time addressing the problem of space constraints in the typical British kitchen.
The performance of dishwashers benefited from improvements in soap and detergent technology in the 1950s. By the mid-1960s, special dishwashing powders and rinse aids were available. Considering the very limited commercial success of dishwashers even in the United States, it is surprising that Kelvinator demonstrated a high-tech concept of dishwashing in Seattle in 1962: a water-free, detergent-free ultrasonic dishwasher. The concept never got beyond the prototype stage, so its practicality was never put to the test.
Dishwashers were still a luxury item in the United States until the late 1960s when annual sales reached 2 million units. In Britain, only 2 percent of homes had dishwashers by 1973. This slow pattern of growth has baffled those historians of domestic technology who have argued that dishwashing, as a frequent and tedious chore, is an obvious candidate for automation. Recently, manufacturers have applied the same technical innovations to dishwashers as to washing machines and for the same reasons. The ability of the fuzzy logic chip to optimize water and detergent use allows manufacturers to promote dishwashers as environmentally friendly, setting aside the issue of electricity consumption.
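How a fuzzy logic chip might trade water use off against soil level can be sketched in a few lines. The membership sets, rule outputs, and sensor scale below are invented purely for illustration; actual manufacturers' control programs are proprietary and differ in detail:

```python
# Illustrative sketch (not any manufacturer's algorithm): a tiny fuzzy-logic
# controller that maps a turbidity sensor reading (0 = clean rinse water,
# 1 = heavily soiled) to a water volume for the next wash cycle.

def triangular(x, left, peak, right):
    """Degree of membership (0..1) in a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def water_litres(turbidity):
    """Defuzzify by taking the weighted average of each rule's output."""
    rules = [
        (triangular(turbidity, -0.5, 0.0, 0.5), 8.0),   # soil LOW  -> little water
        (triangular(turbidity,  0.0, 0.5, 1.0), 12.0),  # soil MED  -> moderate water
        (triangular(turbidity,  0.5, 1.0, 1.5), 18.0),  # soil HIGH -> maximum water
    ]
    total_weight = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total_weight
```

A reading of 0.25 blends the LOW and MEDIUM rules to give 10 litres; this smooth interpolation between rules, rather than hard thresholds, is what underlies the claim of optimized water and detergent use.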
Soap is made by mixing animal or vegetable fats with sodium hydroxide (caustic soda) or potassium hydroxide (caustic potash). This causes the fatty acids to form sodium (or potassium) salts. In its normal bar (tablet) form, soap is not ideal for washing large loads of clothes. For centuries, the usual practice was to presoak clothes in water to which lye, an alkali derived from wood or plant ash, or urine had been added. This had a mild bleaching effect and loosened dirt, which could then be removed by rubbing with soap. In 1791, a French chemist, Nicolas Leblanc, developed a process for making washing soda (sodium carbonate), which performed the same function as lye or urine. In Britain, most people used soap very sparingly until 1853, when the government repealed a soap tax imposed in 1712. The first powdered soap, Babbitt’s Best Soap, went on sale in New York in 1843. The first British soap powder was Hudson’s Soap Extract, introduced in 1863. By 1900, soap was also available in flake form. In 1918, the British company Lever Brothers (now Unilever) introduced Rinso, the world’s first granulated laundry soap.
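The saponification chemistry described above can be summarized in a single equation. As an illustration, using tristearin (a typical animal fat) and caustic soda; other fats and alkalis follow the same pattern:

```latex
% Saponification: fat + caustic soda -> soap + glycerol
\[
\underbrace{\mathrm{C_3H_5(OOC\,C_{17}H_{35})_3}}_{\text{tristearin (fat)}}
+ 3\,\mathrm{NaOH}
\longrightarrow
\underbrace{3\,\mathrm{C_{17}H_{35}COONa}}_{\text{sodium stearate (soap)}}
+ \underbrace{\mathrm{C_3H_5(OH)_3}}_{\text{glycerol}}
\]
```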
The German company Henkel took the first step toward modern detergent technology with its Persil washing powder, introduced in 1907. The name was derived from two of its constituents, perborate and silicate. Dissolved in water, perborate releases oxygen, which becomes available to act on stains. Persil was an improved soap powder, rather than a modern detergent, but its self-activation process was the key to the successful laundry detergents that followed. The U.S. company Procter & Gamble developed an equivalent product, Oxydol. After electric washing machines became available in 1907, the disadvantages of soap-based laundry products became more evident. When soap is used in water containing magnesium and calcium (hard water), insoluble salts are created, which form scum on the surface. The first nonsoap detergents were developed in Germany in the late nineteenth century, with coal tar as the base and sulfuric acid as the reagent. Nekal, introduced in Germany in 1907, was the first detergent to be marketed commercially.
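The hard-water scum reaction described above can likewise be written out. Taking sodium stearate as a representative soap, calcium ions displace sodium to give an insoluble calcium salt, the familiar scum:

```latex
% Hard water: calcium ions precipitate soap as insoluble scum
\[
2\,\mathrm{C_{17}H_{35}COO^{-}} + \mathrm{Ca^{2+}}
\longrightarrow
\mathrm{(C_{17}H_{35}COO)_2Ca\downarrow}
\]
```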
However, although early detergents eradicated the scum problem, they were not as effective at washing clothes as the improved soap powders. In 1933, Procter & Gamble marketed its first detergent, Dreft, as a cleanser for washing dishes in hard water. It launched the first synthetic laundry detergent, Tide, in 1946 as the “washing miracle.” Tide became the leading American laundry detergent in the 1950s. The British equivalent of Tide was Surf, introduced by Unilever in 1949. These new detergents contained optical brighteners, which enhanced the appearance of white clothes. Previously, a similar effect had been achieved by adding a blue powder to rinsing water. From the 1950s, powder detergents were based on alkyl benzene, a light, clear oil derived from coal tar or petroleum. In 1968, the first biological detergents appeared. These contain enzymes, which are natural catalysts with specific properties. The digestive enzymes used in detergents break down biological stains: lipase acts on fats, protease on proteins, and amylase on starches. Another advantage of enzymes is that they are effective at low temperatures. The increasing popularity of front-loading automatic washing machines created a need for low-lather detergents. Silicone is one of the ingredients that can be added to reduce foaming. In the mid-1980s, liquid detergents were developed.
While advances in detergent technology improved their washing performance, it became apparent that a number of the key ingredients had damaging environmental consequences. Phosphates and surfactants, the surface-active agents that improve wetting (the penetration of water into fabrics), evade the biological processes used to purify sewage. Phosphates, which may also be found
Founded in 1944 as the British Council of Industrial Design, the government-funded Design Council is responsible for encouraging and promoting high standards of industrial design. It was not the first British organization to undertake this mission. In 1914, the Arts and Crafts Exhibition Society, which promoted handcrafted products, turned its attention to industrial design and set up an offshoot, the Design and Industries Association. The Arts and Crafts influence was reflected in its motto, “Fitness for purpose.” The government first took an interest in industrial design in 1931, when the Board of Trade proposed to set up the Council for Art and Industry, whose main purpose from its launch in 1934 was to mount exhibitions showing examples of good design. Ironically, it was during World War II, a period of material shortages and production constraints, that the government began to take a more active role in shaping the design environment. In 1942, the Ministry of Information recruited a number of leading British designers, including Herbert Read, Milner Gray, and Misha Black, to head a new Design Research Unit. The goal was to create an advisory and consultancy network to educate manufacturers to embrace the notion of “total design.”
The formation of the British Council of Industrial Design and its Scottish Committee was announced in December 1944 by the president of the Board of Trade. The stated objective of the council was “to promote by all practicable means the improvement of design in the products of British industry.” Its motto, “Good design, good business,” reflected the government’s belief that design was a way of boosting product sales. The council’s staff grew from 10 in 1945 to over 1,000 in 1946. With the backing of the Board of Trade, the council mounted its first exhibition, “Britain Can Make It,” at the Victoria and Albert Museum in 1946. Featuring 5,000 exhibits produced by 1,300 companies, the exhibition was visited by 1.4 million people. The products were carefully selected to convey the council’s notions of good taste in design to both consumers and manufacturers.
Gordon Russell, an influential British furniture designer whose career spanned the Arts and Crafts Movement and Modernism, took over as director in 1947. The council began to promote its ideas through publications (such as Design magazine, which was introduced in 1949), film strips, and displays. In 1951, the council undertook a national survey of British design, which generated the Design Index, a stock list of approved products, and a pictorial reference library. The most high-profile venture of the council in its early years was its contribution to the 1951 Festival of Britain. The festival was the brainchild of the Royal Society of Arts as a means of celebrating the centenary of the Great Exhibition. The role of the Council of Industrial Design was to choose 10,000 products of British manufacture for exhibition. Eight and a half million visitors attended the festival, which had its main site on the South Bank of the Thames in London.
In spite of the success of the Festival of Britain, the council felt that it was not getting its message across to the public. The pattern of sales of consumer products suggested that the majority of the public were attracted by the “cheap and cheerful” rather than the quiet good taste that the council espoused. It concluded that it needed a permanent showcase for good design. The result was the Design Centre for British Industries, a “shopping guide to well-designed British goods,” which opened in London in 1956. The Scottish Design Centre opened in Glasgow in 1957, which was also the year that the council introduced an annual design award. The Design Centre Awards, later renamed the Design Council Awards and then the British Design Awards, were originally restricted to consumer goods. They were later extended to capital goods, such as engineering products and car components, in 1974, to medical products in 1975, and to computer software in 1986. In order to bring good design to the attention of a wider audience, the council introduced a black and white triangular label as a symbol to identify goods that met its design criteria. The label scheme was discontinued in 1988.
The council’s name was shortened to the Design Council in 1972. The 1970s was a period of consolidation and regional expansion. New offices were opened in Cardiff in 1974 and Belfast in 1978. In the 1980s, the council began to adapt to a changing political and business environment. In 1983, it introduced the Funded Consultancy Scheme, whereby eligible companies could claim free design consultancy services. New facilities were opened at the London Design Centre, including an Innovation Centre in 1985, a Materials Information Centre in 1988, and a Young Designers Centre in 1989.
Further rethinks in the 1990s brought several changes of premises and a slimming down of the council’s functions. In 1990, the Design Council Scotland moved to a different location in Glasgow. A year later, the Design Council Northern Ireland also relocated to new offices in Belfast, while a new regional English office was opened in Leeds. The last of these relocations saw the council move its London headquarters from Haymarket to Covent Garden in 1998. In 1993, the government’s industry minister announced a comprehensive review of the Design Council in consultation with manufacturers, designers, and educators. The outcome of the review was that the council ceased to provide direct design consultancy services, transferring its consultancy database to the Chartered Society of Designers.
The end of the second millennium provided the opportunity for the council to relaunch itself. The Millennium Products initiative, managed by the Design Council, was launched in September 1997 by the prime minister. After three rounds of applications, a total of just over 1,000 Millennium Products were selected by the end of 1999. These design achievements were celebrated in the Spiral of Innovation artwork commissioned by the Design Council and installed in the Millennium Dome at Greenwich in London.
The electrical appliance manufacturers spotted the opening for an appliance to assist this process. During the 1950s U.S. companies produced a line of electric defrosters that could be plugged in and then placed in the freezer. Most were small electric heaters encased in aluminum bodies with wooden handles, such as those manufactured by Howell & Company and the Shane Manufacturing Company. Some models had metal and plastic casings. The Osrow Products Company of New York produced an infrared version in the early 1960s.
The growth of industrialization in the nineteenth century was stimulated by, and linked to, a rising population that created bigger markets. The establishment of modern capitalism grew in association with many of these developments. The innovations within technology and science were not driven only by “pure” experimentation but also by the desire to commercially develop the results. This culture of mass consumption was already advanced in Europe, Canada, and the United States at the beginning of the twentieth century and was initially enjoyed by the middle classes. The post-1945 increase in prosperity allowed more and more working people to purchase consumer durables.
Designers and manufacturers of the earlier twentieth-century domestic appliances were certainly aware of their potential markets insofar as they wanted their products to sell. Nevertheless, what market research was carried out was largely unscientific and anecdotal. Initially they relied on the nineteenth-century premise that there were “natural” preexisting markets for a product. The role of promotion and advertising was to make sure that potential customers were attracted to a manufacturer’s particular product. Branding, the process of giving a product an identity, was beginning to develop and was accelerated during the Depression years of the 1930s. Economists and politicians looked to increased consumption as a way out of economic slumps. The late 1920s and 1930s saw the introduction of the marketing methods and psychological selling techniques familiar today. There was a change from “getting commodities to consumers” to “getting consumers to commodities.”
This was achieved by advertising techniques that, in the case of domestic appliances, were aimed specifically at women. Advertisements prompted purchase through a combination of guilt and desire. In the United Kingdom and the United States advertisements began to illustrate the housewife, not the servant, using the appliances and exploited rising standards of cleanliness and fears about “household germs.” The increasing use of labor-saving appliances may have saved time in some areas, but social and cultural pressures led to increasing standards and more time spent on other areas of housework. The desire to consume was stimulated by aspirational advertisements and planned obsolescence of products.
As Americans were encouraged to become patriotic consumers many of them felt that they needed to make informed choices about the increasing range of products. In 1926 Frederick Schlink, an engineer from White Plains, New York, organized a consumer club that distributed lists of products that were seen as good value and also those “one might well avoid, whether on account of inferior quality, unreasonable price, or of false and misleading advertising.” Schlink used these lists to produce a book, Your Money’s Worth, which led to the founding of Consumers’ Research and the Consumers’ Research Bulletin in 1928.
The Consumers Union was a splinter group from Consumers’ Research and was established in 1936, following acrimonious labor relations. Its founding group of professors, labor leaders, journalists, and engineers had a mission to “maintain decent living standards for ultimate consumers,” a rhetoric born of the Depression and the strike-breaking tactics of Schlink. It remains independent of both government and industry and depends on membership subscriptions. It first published its magazine Consumer Reports in the same year, establishing a tradition of testing and rating products and services. The initial circulation was around 4,000. Appliances were and continue to be tested for performance, energy efficiency, noise, convenience, and safety. Subscriptions had risen to 100,000 by 1946 and continued to grow, even during the McCarthy era when Consumer Reports was listed as a subversive magazine. The Consumers Union now has over 4.6 million subscribers, a children’s magazine (launched in 1980 as Penny Power, now known as Zillions) and a web site.
In the United Kingdom, the Good Housekeeping Magazine was established in 1922, largely aimed at the servantless middle-class woman. It founded the Good Housekeeping Institute in 1924 to test recipes and “submit all domestic appliances to exhaustive tests and bring those approved to the notice of all housewives,” which it continues to do today. The UK Consumers Association, based on the U.S. Consumers Union, was founded in 1956 and first published Which?, a quarterly magazine of tests and reports, in 1957. Which? became a monthly magazine in 1959. The UK Consumers Association currently has over a million members. The International Organization of Consumers Unions was established in 1960 and includes consumer associations from the United States, the Netherlands, Belgium, and Australia.
The marketing trends of the 1930s continued after 1945 and in-depth market research developed throughout corporate America in the 1950s. The British Market Research Association was established in 1957, the same year as Vance Packard’s critical study of advertising, The Hidden Persuaders, was published in the United States. The following quotation from Packard’s book illustrates how the advertising industry continued to use the twin themes of guilt and desire in the postwar boom years.
The cosmetic manufacturers are not selling lanolin, they are selling hope. . . . We no longer buy oranges, we buy vitality, we do not buy just an auto, we buy prestige.
If you tell the housewife that by using your washing machine, drier or dishwasher she can be free to play bridge, you’re dead! She is already feeling guilty about the fact that she is not working as hard as her mother. You are just rubbing her up the wrong way when you offer her more freedom. Instead you should emphasize that the appliances free her to have more time with her children and to be a better mother.
Advertisements of the period support this. A Hotpoint ad from Good Housekeeping of June 1951 carries the copy “Save 8 Hours Every Week with a Hotpoint All-Electric Kitchen—Gain Extra Time for All Your Extra Duties.” The time saved, the advertisement suggests, is “for your family as well as the many added duties you’re called on to shoulder these days.” Needless to say, the “you” in question was female.
These quotes reflect a set of cultural values that were already in the process of being challenged by the feminist, civil rights, and youth movements of the 1950s and 1960s. Unsafe at Any Speed, by the American lawyer and consumer advocate Ralph Nader, was published in 1965 and exposed the lack of safety in the General Motors Corvair automobile. Nader joined the Consumers Union in 1967. Congress passed twenty-five pieces of consumer legislation between 1966 and 1973.
The advertisers and manufacturers varied in their ability to respond to these social and cultural changes. The rise of the affluent teenager created a new market, one that clothing, publishing, and cosmetics companies responded to with vigor. The domestic appliance companies also had to change. By the late 1970s the impact of feminism had been such that the latter comment quoted in Packard was no longer tenable as an advertising concept, even though it was still a reality for many women. A mid-1960s ad for a Nevastik Teflon-coated frying pan from the UK Good Housekeeping Magazine had the copy, “Even a Man Can’t Go Wrong with Nevastik Pans.”
Market research had become more sophisticated, and markets were increasingly divided into socioeconomic groups that could become target markets. This analysis became more sophisticated during the 1980s and 1990s as markets were segmented by postal areas and lifestyles.
It has been assumed that manufacturers and consumers stood in opposition to each other, with the consumer organizations acting as monitors and protectors of the latter’s interests. Indeed, the efforts of consumer organizations have led to legislation to improve safety standards and consumers’ rights after a purchase has been made. But it would be wrong to believe that consumers have been passive recipients of what the producers have given them and that a docile and uncritical public leads to low standards of design. It has been argued that consumers’ desires and needs have been created by the producers and, with the aid of their advertisers, have been satisfied by those producers. This implies that consumption is a less authentic and satisfying activity than, for example, working. It also seems to imply that popular forms of culture and material culture are superficial. Given the sophisticated nature of advanced capitalist societies, this attitude can be contested: needs are often no longer natural, but cultural, informed by the many connections and discontinuities within those societies. Many modern objects do not simply—or, indeed, primarily—have “use or exchange” value but more importantly have “identity” value. This can clearly be seen in some of the more fashionable domestic appliances of the 1980s and 1990s. A Dyson vacuum cleaner or a Sony Walkman is a successful piece of technology, but each equally has become a purchase that reinforces its own brand identity and defines the identity of the consumer. The same can be said of older products such as the Aga cooker or the more self-knowing products from the Alessi stable.
The late twentieth century has produced a society where manufacturers, designers, and consumers are linked, knowingly or not. Companies continue to conduct market research but also are quicker to respond to and appropriate ideas that often bubble up from within popular or mass culture. This “circuit of culture” links the identity, production, consumption, regulation, and representation of a commodity within a circular relationship. This model has increasingly applied to domestic appliances over the last twenty years. Many domestic products that were once almost culturally invisible are now recognized as having a meaning. Consumers are now largely more sophisticated and are able to “read” the intended meanings of the manufacturers and to construct or appropriate their own, which will in turn influence the manufacturers and affect how that product is marketed or modified. Nevertheless, the findings of the 1960 UK Molony Report on consumer protection remain valid.
The business of making and selling is highly organized, often in large units, and calls to its aid at every step complex and highly expert skills. The business of buying is conducted by the smallest unit, the individual consumer, relying on the guidance afforded by experience, if he possesses it, and if not, on instinctive but not always rational thought processes.
The main disadvantage with these models was that, left unattended, an overheated pan full of fat could cause a dangerous fire. Electric fryers were introduced into the U.S. market in the late 1940s. By the early 1950s rectangular models included Dormeyer’s Fri-well and Dulane’s Fryryte. Sunbeam produced a circular model that also doubled as a roaster and a casserole. Most were thermostatically controlled so that the oil remained at a constant temperature. Cheaper models did not have this feature and relied on the experience of the cook or a fat thermometer. They could also be used without the frying baskets as electric casseroles and soup cookers.
As with other appliances, the fundamentals of the earlier electric models remain, but plastics replaced steel or aluminum as the outer casings in the 1970s, and additional refinements and safety features have been added. Models in the 1990s featured locking lids, vertical oil drainage, “coolwalls” similar to toasters, replaceable or washable filters in the lids to absorb grease and odors, and controls to raise and lower the baskets without lifting the lids. The main manufacturers are De Longhi, Tefal, Moulinex, and Morphy Richards.
These appliances continue to compete with the traditional fryer, and both now have to meet the challenge of the even-more-convenient ready-cut frozen “oven” or “microwave” fries, which can simply be heated up.
The sharp rise in home computer ownership during the 1990s and the associated growth in influence of the Internet provoked concerns about the social consequences of computerization. To some, the World Wide Web is a positive force for good worldwide as a virtual expression of the global village; to others, it is culturally imperialistic and a corrupting force. In many ways, the debate has paralleled earlier arguments about the role of television in society. On the one hand, computer technology was applauded as empowering and democratizing; on the other hand, it was denigrated as socially exclusive and escapist. As with television, concern centered on the potential negative effects on children of the home-computer generation. The image of the socially gauche computer nerd, more comfortable with the synthetic relationships of the Internet chat room than face-to-face interaction, has become familiar through movies such as Weird Science. However, it is the ready availability of pornographic material on the World Wide Web that has generated the greatest outrage. Like most technologies, computer technology is open to abuse, but only a minority of people would currently contend that the negatives of computerization outweigh the positives.
One of the largest manufacturers is the Japanese Tomy Corporation, which was founded in Tokyo in 1927. Although its main business is toys for young children it also produces a range of baby monitors.
Such appliances reflect changing social attitudes about caring for small children. Many parents no longer think their children should cry themselves to sleep. Also, recent publicity given to “crib death” (SIDS) and asthma has made such devices almost essential for concerned parents. These appliances take advantage of the developments in communications technology; the Tomy Baby Watch can transmit live images of the sleeping infant onto the family television screen.
In 1921, the Washburn Crosby Company, a forerunner of General Mills, introduced the fictitious “Betty Crocker” as a signature for the advice and information produced by its Home Service Department. This idealized American housewife was the result of the thousands of baking and cooking inquiries that the company received after organizing a competition. “Betty” sounded friendly and homely, while “Crocker” was the surname of a recently retired company executive.
By 1940, Betty Crocker had become a household name, so it was not surprising that General Mills’ mechanical engineering division used the name when planning to diversify into domestic electrical goods in 1945. The Betty Crocker cake mixes followed in 1947. Betty Crocker’s Picture Cook Book was first published in 1950. Written for the growing number of suburban kitchens, it was the first cookbook to have photographs and became a best-seller.
General Mills produced the Tru-Heat electric iron, a toaster, a food mixer, an automatic fry-cooker, a waffle iron, a coffeemaker, and a steam ironer under the name Betty Crocker. Ironically this move to peacetime production was limited by the demands of the Korean War, and General Mills sold the business to McGraw Electric Company in 1954.
The Betty Crocker food brand remains strong, and the name is still used for recipe books and nutritional information. As for the fictional Betty, a series of models and actresses has ensured that her clothes and image keep up to date.
The Chicago couple Gordon and Carole Segal established Crate&Barrel in 1962. Like Terence Conran in the United Kingdom, they realized that others would enjoy the design and quality of kitchen and home products that they had found on journeys within the United States and abroad. The idea came to them while doing the washing up.
They renovated a 1,700-square-foot former elevator factory in Chicago’s Old Town district. The decor was, by necessity, very cheap; the walls were lined with crating timber and the products displayed in packing crates and barrels. The first store employed three people and offered gourmet cookware and other contemporary housewares in greater variety and at better prices than elsewhere in Chicago.
The first Crate&Barrel mail-order catalogue was produced in 1967 and the first store outside Chicago opened in 1977. Crate&Barrel opened in San Francisco in 1985 and in New York in 1995. In 1998, it entered a partnership with the world’s largest mail-order company, Otto Versand of Hamburg, Germany.
Crate&Barrel has grown into one of the most influential retailers in the United States, with over eighty stores. Its flagship store opened on North Michigan Avenue, Chicago, in 1990. Although the store has been described as resembling a giant food processor, its main forms are a cube with a cylindrical attachment, literally a crate and barrel.
Like Habitat, Crate&Barrel helped create “lifestyle” shopping and influenced both consumers and other retailers. Apart from furniture and linens it sells a wide range of stylish appliances from manufacturers such as KitchenAid and Dualit. The company has a strong philanthropic policy and passes unsold goods on to local charities. It also has financially supported AIDS-related causes.
The Japanese company Asahi Optical Corporation produces cameras under the Pentax brand name. It came to international prominence as a camera manufacturer in the 1950s, when Japanese companies began to challenge the dominance of European manufacturers in the 35mm rangefinder and single-lens reflex (SLR) camera sectors. Asahi developed a reputation for innovation and was responsible for a number of camera industry firsts. As well as producing Pentax cameras, lenses, and accessories, Asahi also manufactures eyeglass lenses, binoculars, printers, scanners, and endoscopes.
The company was formed in 1919 as the Asahi Optical Joint Stock Company and began to manufacture lenses for use in cameras, binoculars, and other optical instruments. In 1939, the company moved into the manufacture of aerial cameras for military use. After World War II, Asahi began to develop cameras for the consumer market. In 1952, Asahi launched the Asahiflex I camera, the first Japanese 35mm SLR camera. At this time, most SLR cameras suffered from a problem known as “mirror blackout”: the angled mirror behind the lens, which reflects the image to the viewfinder, was slow to retract when the shutter opened, thus blocking the path between the lens and the film. In 1954, Asahi solved this problem by fitting the Asahiflex II with an instant-return mirror. The first camera produced under the Pentax brand name appeared in 1957. The name was a blend of the words “pentaprism,” or five-sided prism, and “reflex,” because the innovative feature of the Pentax camera was the incorporation of the pentaprism in the viewfinder, which allowed the photographer to view the scene at eye level rather than look down into a waist-level finder, providing a more natural viewing position.
Mass production of Pentax cameras began in 1959, reflecting the growing popularity of Japanese SLR cameras. The next major advance came with the introduction of the Pentax Spotmatic camera in 1964. This featured a photoelectric cell positioned behind the lens, which meant that the light reading was as accurate as possible. This arrangement, known as “through the lens” (TTL) metering, became standard in SLR cameras. TTL metering was taken a step further in the Pentax ES SLR camera of 1971, which introduced automatic exposure control by incorporating an electronic shutter that was programmed to select the exposure time according to the light reading. Accelerating sales meant that by 1971, Asahi had sold a total of three million SLR cameras since 1952, with a third of total sales coming in the last two years.
Asahi turned its attention to developing smaller, lightweight cameras. The Pentax MX SLR camera of 1976 was the world’s smallest and lightest SLR camera to date. Another variant of the MX model, the ME, was the first camera that operated wholly by automatic exposure, with no manual override. Even at its most compact, the 35mm SLR camera was still heavy and bulky in comparison to the pocket-size Instamatic cameras, pioneered by Eastman Kodak in the 1960s. Asahi’s solution was to create an SLR camera that used the same compact 110 film cartridges as the Instamatics. The Pentax System 10 SLR camera, launched in 1978, was compact and convenient, but also had the superior functionality provided by interchangeable lenses and numerous accessories. Meanwhile other companies were working on another lightweight format—the non-SLR 35mm compact camera. Asahi entered this field in 1982, when it introduced the Pentax Sport 35 camera. This camera also featured the innovative auto-focus lens technology introduced in the Pentax ME-F SLR camera of 1980. Asahi also became the first camera company to achieve total sales of 10 million SLR cameras in 1980.
Asahi continued to be a pioneer in the automation of camera functions. The Pentax Super Program SLR camera, introduced in 1983, offered the user a choice of six types of exposure control, including the use of auto flash. Two years later, the Pentax A3000 SLR camera provided fully automatic operation, with the addition of automatic film loading and winding and film-speed sensing. In 1986, Asahi improved the flexibility of the fixed-lens 35mm compact camera by marketing the world’s first compact 35mm camera with a zoom lens, the Pentax IQZoom camera. Since then, Asahi has extended the range of the compact zoom lens, culminating in 1998 with the launch of the Pentax IQZoom 200 camera, with a 48 mm to 200 mm zoom lens, which is still the longest zoom lens on a compact camera. Meanwhile, the first digital Pentax camera, the EI-C90, came on the market in 1996, followed in 1997 by the first Pentax APS (Advanced Photographic System) camera, the efina. As Asahi was not a partner in the consortium that developed APS, it played to its strengths by concentrating on applying its compact zoom lens technology to APS.
During the nineteenth century the majority of cooking pots had been made of cast iron. A major United Kingdom company manufacturing cooking pots was Kenricks of Birmingham. Lighter wares were of sheet tin. Enameled cast iron was developed in the 1850s. It was easier to clean and more attractive, despite a limited color range. A mottled blue was one of the commonest colors. Germany, Austria, and France were all significant producers. The 1870s saw innovations in the metal industry. Pressed mild steel appeared as a result of the Bessemer and Siemens processes. Aluminum wares were produced in the United States, France, and the United Kingdom from the 1880s. The American West Bend Company of West Bend, Wisconsin, began production of aluminum cookware in 1911. Its main customer was the mail-order company Sears, Roebuck.
The new cooking methods, gas and later electricity, had an effect on cookware. Enameled iron saucepans heated unpredictably, and milk pans boiled over, because of the uneven conductivity of iron and enamel. American manufacturers continued with cast iron, offering bright enameled colors on the outside, while British manufacturers moved to steel and aluminum.
The cookware industries of Europe and America went into wartime production between 1914 and 1918 and emerged with improved technologies. The most popular British brands were Tower, Diamond Brand, Swan, and Goat. Steel cookware was often enameled in either green or beige with contrasting rims in a darker shade. In postwar Germany, the once mighty BMW company had to cease producing aircraft engines and move into pots and pans to survive.
Aluminum wares were well suited to electric stoves. In 1934 the Wear Ever Company produced a range with heat-resistant plastic handles. Stainless steel, an alloy of steel, nickel, and chromium developed in the 1920s, offered better resistance to rust and acidic corrosion. West Bend introduced their Waterless Cooker in 1921. Based on the suggestion of one of its salesmen, the lid of the cooker was fitted with clamps that prevented the escape of steam during cooking, making the addition of water unnecessary. The cookers sold well and led to the introduction of a range of waterless wares known as the Flavo-Seal line. The U.S. Revere Copper and Brass Company developed a range of stainless steel pans with copper-plated bottoms. Marketed in 1939 as Revere Ware, they gave better heat distribution.
In addition to stainless steel, chrome-plated steel and heatproof glass such as Pyrex were becoming popular. Colored aluminum was also fashionable during the late 1940s and 1950s. Anodized aluminum wares, which have a satin-gloss finish, were introduced in 1946 by the Aluminum Utensil Company. West Bend had introduced anodized wares with colored dyes by 1950. These materials improved the aesthetics of cookware during the 1950s, when the look of a kitchen became more important to manufacturers and consumers. This trend continued throughout the late twentieth century.
The first nonstick pan was introduced in 1956 by a Frenchman, Marc Gregoire, and his wife, Colette. Gregoire had experimented with the low-friction substance PTFE (polytetrafluoroethylene) to produce a smoother-running fishing reel. His wife suggested that it would have more commercial success if applied to cookware. The resulting Tefal company became a brand leader as nonstick Teflon-coated pans became popular during the 1960s and 1970s.
Traditional manufacturers also received a boost during the 1970s as a result of public interest in antiques and the positive reassessment of much Victorian taste. A refurbished range or new Aga in a suburban home required the “right” pans. The French firm of Le Creuset benefited enormously. Founded in 1925 at Fresnoy-Le Grand at St. Quentin, its heavy, hand-finished enameled cast-iron pots and pans were just right.
Cultural and social trends also influenced cookware. The success of Asian and Eastern restaurants during the 1960s and 1970s led to a rising interest in cooking such dishes at home. In the United Kingdom, the Habitat stores led the way, selling woks, rice steamers, and chicken bricks. Another trend was toward professional cookware as the Western media began to promote “lifestyle” eating and drinking in the 1980s. For those inspired by celebrity chefs, popular choices were the Elysee Line of 1981 by Cuisinox, a French company established in the 1930s, or Calphalon of 1978, a range of commercial, hard anodized aluminum wares made in Ohio. Another serious choice was Le Pentole designed in 1979 for Industrie Casalinghi Mori in Italy by Nika Sala. This is a stylish modern reworking of a stacking steamer with five pans.
A more recent innovation has been the specialized pan designed for use away from the stove burner (hob). The Tefal Le Saucier is a nonstick saucepan with an integral mixing paddle powered by an electric spindle through its base. It sits on an individual hot-plate with electronic controls for heating, timing, and stirring. Equally French is the Tibos electric crepe maker, with a nonstick griddle and spreading device.
Cookware has largely retained its traditional forms throughout the twentieth century; the main advances have been in the better performing and lighter materials used and the aesthetic choices available. Most cookware performs well, dependent on price, and both manufacturers and consumers are equally influenced by popular fashions and tastes.
The desktop personal computer has become the dominant computer type for business and home use. Hardware has grown in size in order to accommodate more devices, provide more storage capacity, and generally enhance performance. In the mid-1980s, the typical PC consisted of a 12-inch (30 cm) monitor, a keyboard, and a central processing unit (CPU) that accommodated a 20-MB (megabyte) hard-disk drive and a 5.25-inch (13 cm) floppy-disk drive, and had a number of ports (connection sockets) for optional peripheral devices. The printer was the most common peripheral device. By the late 1990s, 14- and 17-inch (36 cm and 43 cm) monitors were standard, in order to provide improved display of pictorial content. The CPU, usually in a tower format, typically accommodated a several-gigabyte hard-disk drive, a 3.5-inch (9 cm) floppy-disk drive, a CD-ROM (compact disc read-only memory) or DVD (digital versatile disc) drive, and a modem. It had sufficient ports to take a range of peripheral devices, including scanners, digital cameras, CD writers, loudspeakers, and extra storage drives, such as the Zip or Jaz drives. The CD-ROM, introduced in 1984, has become the standard format for applications software, games programs, and educational software such as multimedia encyclopedias. Recordable CDs (CD-Rs), requiring a CD writer, became available in 1990. Introduced in 1995, the DVD can hold a full-length motion picture.
However, the key determinants of performance, the microprocessor and the RAM chip, have grown in processing power, speed, and capacity without growing in size. Since its introduction in 1993, the 32-bit Intel Pentium chip, the dominant PC microprocessor chip, has evolved from running at a speed of 60 MHz (megahertz) to 600 MHz. A 64-MB complement of RAM is now regarded as no more than average. As a result, while desktop models dominate, portable computers are now available in sizes ranging from the palmtop to the notebook. In the 1970s, portability was a relative concept. The first portable computer, the Baby suitcase computer of 1976, was a CPU without a monitor, like the Altair desktop. Even by 1981, when the Osborne I portable computer was introduced, portable computers were still suitcase-size and referred to as luggables rather than portables. Compaq, which introduced a portable PC in 1982, was the first company to really focus on the portable computer market. By 1986, the portable computer had shrunk from the luggable to the briefcase-size laptop, and by 1989, from the laptop to the thinner notebook. The notebook is the smallest type of PC that retains full functionality; it can accommodate hard-disk, floppy-disk, and CD-ROM drives as well as an internal modem. Subnotebooks and hand-held palmtops or personal organizers economize on size by having limited data storage facilities and small keyboards, but can transfer data to desktop or notebook computers by wired or infrared linkages. Some palmtop computers, such as the Apple Newton, introduced in 1993, omit the keyboard entirely and instead allow input to be written onto an LCD “notepad” using a stylus, interpreted by built-in handwriting-recognition software.
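The clock-speed figures quoted above imply a striking rate of growth. As a rough, illustrative sketch, the implied compound annual growth rate can be computed from those figures; the 60 MHz and 600 MHz values come from the text, while the six-year span (1993 to 1999) is an assumption made here for the purpose of the calculation.

```python
# Illustrative only: implied annual growth in Pentium clock speed.
# Figures from the text: 60 MHz at introduction (1993), 600 MHz later.
# The six-year span is an assumption, not a figure from the text.
start_mhz = 60
end_mhz = 600
years = 6

# Compound annual growth rate: (end/start)^(1/years) - 1
cagr = (end_mhz / start_mhz) ** (1 / years) - 1
print(f"Implied annual clock-speed growth: {cagr:.0%}")  # roughly 47% per year
```

On these assumptions, a tenfold increase over six years works out to nearly 50 percent growth per year, which helps explain why desktop performance kept improving without any increase in chip size.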
Prices have continued to fall, thanks to economies of scale, increases in production efficiency, and competitive market forces. While big American manufacturers, including IBM, Dell, and Compaq, still have sizeable market shares, the nature of computer retailing has allowed small companies to prosper. Purchasing direct from the manufacturer or from computer warehouses via mail order or electronic commerce has become a significant feature of the personal computer trade. Although personal computers may be ostensibly made by European, Canadian, or U.S. companies, in this context “making” means assembling, and the majority of the manufacturing process takes place in the Far East. Japanese companies, such as Toshiba, Sony, and Fujitsu, have been particularly successful in the portable computer market. Portable computers continue to be significantly more expensive than desktop models offering equivalent performance. This largely reflects the relatively high cost of flat liquid crystal display (LCD) screens in comparison with conventional cathode ray tube-based monitors.
While feminists and household economists of the rational school had long espoused the concept of the fitted kitchen, few homes had fitted kitchens until after World War II. The United States was well ahead of Britain in this respect, and U.S. companies began to use the desirability of the fitted kitchen as a marketing vehicle for a range of appliances in the late 1940s. In terms of cookers, this brought an emphasis on ergonomic design and materials. Surplus wartime stocks meant that aluminum became an affordable lightweight option for some cooker parts. Features such as glass doors, introduced in the 1930s, and eye-level grills were heralded as aids to efficiency and economy of movement. The 1950s fitted kitchen also prompted the revival of the split-level cooker with a waist-level or eye-level oven. The term “split level” is used to signify that the cooking units are not integrated vertically, but dispersed horizontally. Split-level electric cookers first appeared in the early years of electric cookers and were initially the more common design in the United States. However, their double width meant that they were too large to fit comfortably in smaller kitchens. In the 1960s, in search of new selling points, manufacturers developed features that extended the potential for producing meals requiring different types of cooking. One option was the cooker with two ovens, allowing simultaneous cooking at different temperatures. Oven fans helped to distribute the heat more evenly, facilitating the use of the whole oven, while attachments, such as rotisserie spits, tailored the oven for specialized cooking.
The standard cooktop held four fast radiant rings, and boiling ring technology changed little from the 1930s to 1966, when the ceramic electric hob, or cooktop, appeared. The ceramic hob was the commercial result of an accidental discovery at the Corning Glass Works in the United States in 1952. A malfunctioning furnace produced an opalescent, tough glass with distinctive thermal properties. Heat from bare electric elements placed beneath the glass, and demarcated by patterns on the upper surface, is conducted vertically, but not horizontally. Not only is the ceramic hob extremely efficient, but with its flat surface, it is easy to clean and available as a work surface when not in use for cooking. Manufacturers also gave much attention to the cleanability of ovens. One “self-cleaning” method, introduced in 1969, was the application of a grease-resistant coating to the oven interior. This is known as catalytic cleaning. Another method, introduced in 1978, was pyrolytic cleaning, whereby a short burst of maximum heat after cooking prevents the build-up of hardened grease. Cleanliness was also the motive for the development of the electric cooker hood, which is placed directly above the hob to absorb greasy vapors and cooking smells. Such hoods contain an extractor or exhaust fan and filters. Depending on the hood design, the extracted air may either be recirculated in the kitchen after filtration or vented outdoors.
Since the 1970s, when the fitted kitchen approached its peak of popularity, manufacturers have designed kitchen appliances to fit in with the standard sizes of kitchen units. This also prompted the evolution of the split-level cooker into the modular cooker, whereby the oven, hob, and grill might be completely separate, self-contained units. Modularity has allowed consumers to mix and match gas and electric cooking units to suit their individual needs or preferences. The German manufacturer Neff has been particularly noted for its modular cookers. The Italian company Zanussi has focused more on offering a range of colors, finishes, and style details. For example, the Zanussi ID cooker (1999) could be tailored in terms of types of doors, handles, and knobs, as well as color and finish, to achieve a customized specification. Another trend, associated with a revived interest in cooking as an art rather than a necessity, has created a consumer market for the cooker built to professional catering standards and usually high-tech in design. The latest development in hob technology is the induction hob, which dispenses with heating elements in favor of magnetic heat induction coils. While a British company, the Falkirk Iron Company, experimented with induction cooking in the 1920s, the idea lay dormant until the 1990s. Price, familiarity, and availability of types of energy were the prevailing influences on choice of cooker in 1900, but today the equivalent factors are more likely to be price, performance, convenience, and design. These factors mean that gas and electric cookers are likely to co-exist on more or less equal terms for the foreseeable future.