Compact Disc Players

Launched in 1983, the compact disc player (or CD player) is the digital recording equivalent of the gramophone (or record player). It is a device for playing back sound recorded on a small optical disc. The term “compact disc” was used because the first commercially produced optical discs were 30 cm (12 in) videodiscs, whereas audio optical discs, which had less information to record, were only 12 cm (4.75 in) in diameter.

Optical disc technology uses a laser both to embed the recording and to decode it for playback. During recording, a laser beam removes tiny dots from the etch-resistant chemical coating of the glass master disc. The dots vary in length according to the digital sound input and form a spiral track up to 5 km (3.3 miles) long as the disc rotates. The master disc is then placed in a bath of hydrofluoric acid, which etches pits in the glass where no coating remains. In the mass production process, the master disc is replicated as plastic discs with a thin aluminum coating. The disc provides up to 80 minutes of sound. Inside the CD player, the disc rotates on a turntable and is scanned by a laser beam that detects reflection from the nonpitted surface and its absence from the pitted track. The reflected light reaches a photodiode as a series of pulses, which the photodiode converts to electrical pulses for transmission to an amplifier and loudspeakers.
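
To make the readout step concrete, the following is a minimal Python sketch of how a photodiode's reflection signal can be decoded back into bits. The sample data is invented, and the real system's eight-to-fourteen modulation (EFM) and error correction are omitted; the sketch shows only the underlying convention that an edge between pit and land is read as a 1, while an unchanged stretch is read as 0s.

# Minimal sketch of pit/land readout; not the real CD channel code.
# reflectivity: 1 = light reflected (land), 0 = light scattered (pit).
reflectivity = [1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1]

bits = []
for prev, curr in zip(reflectivity, reflectivity[1:]):
    # An edge between pit and land decodes as a 1;
    # a continued pit or land decodes as a 0.
    bits.append(1 if curr != prev else 0)

print("".join(map(str, bits)))  # -> 001000101001000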

The compact disc was developed through a joint venture between the Dutch company Philips, the pioneer of videodisc technology, and the Japanese company Sony. When the joint venture was agreed upon in 1979, both companies had reason to pool their resources rather than go ahead independently. Both had recently lost out to Matsushita, the world’s leading electronics producer, when the VHS format outsold their separate videocassette formats. Moreover, Philips was then involved in a costly videodisc rivalry with JVC (Japanese Victor Company) and the U.S. company RCA. By 1980, the two companies had agreed on the standard for audio CD and began to develop their products independently. In 1982, Sony launched the CDP-101 CD player in Japan. It was designed to fit in with existing hi-fi stacking systems.

The CD player became the fastest-selling machine in the history of consumer electronics up to that time, although it has recently been surpassed by the digital versatile disc (DVD) player. In the United States, sales of CD players grew from 35,000 in 1983 to 700,000 in 1985, while CD sales grew from 800,000 to 15 million. The introduction of portable CD players increased the popularity of the CD format. Sony launched its first portable CD player with headphones, the D-5, in 1984. The Sony D-88 Pocket Discman, a slimmer model based on the successful Walkman personal cassette player, arrived in 1988. CD players were also incorporated in ghetto blasters, or boom boxes. Two new variants of the audio compact disc format were introduced in 1999: the DVD-Audio format, developed by the Japanese company Matsushita, and the Super Audio CD format, developed by Sony and Philips. Both offer enhanced sound quality by increasing the digital sampling rate.

The CD player was marketed as a major advance in the quality of sound reproduction on several grounds: greater dynamic range (essentially loudness), the inherent superiority of digital copying, which permits the master recording to be exactly reproduced, and the absence of wear and surface noise compared to the gramophone (phonograph) or tape recorder. Other factors favoring the CD player include the convenience of operation by remote control and programmed track selection. Although studies have shown that the majority of people, including trained musicians, cannot reliably distinguish between analog recordings (LPs) and digital recordings (CDs), by 1988 CDs were outselling LPs. Today, the CD player has supplanted the record player in the majority of homes.

Compasso d’Oro

The Golden Compass Awards (Il Compasso d’Oro) are a series of industrial design awards that originated in Italy in 1954, when Aldo Borletti of the Milan department store La Rinascente founded them as a one-off event. The competition was an immediate success, attracting 5,700 entries. The Compasso d’Oro was important in promoting good design in everyday things at a time when Italian industry was reestablishing itself after World War II. It also encouraged Italian designers to consider such mundane objects as being worthy of their attention, as well as encouraging manufacturers to invest in designers. The 1954 winners set the pattern for future competitions; they included a sewing machine, an electric fan, a typewriter, and kitchen components.

The awards continued in 1955, 1956, 1957, and 1959; for organizational or economic reasons, they have not been given every year. Awards were given four times in the 1960s, only twice in the 1970s, and four times in the 1980s. The panel of judges is small, usually consisting of around six people drawn from the relevant industries; judges have included designers such as Marco Zanuso, Vico Magistretti, and Philippe Starck. Winners have included plastic buckets, sewing machines, lemon squeezers, collapsible dish-racks, washing machines, lamps, gas cookers (stoves), and telephones, as well as cars and furniture.

Il Compasso d’Oro is now run by the Associazione Design Industriale (ADI), an association of 750 manufacturers, architects, and designers working in Italy.

Coffeemakers

The application of technology to assist the art of making a good cup of coffee began in the nineteenth century with the invention of the percolator by the American-born Count Rumford in Germany in 1806, with at least one aim being to discourage the heavy drinking of Munich workmen. His invention improved upon the traditional Turkish method of heating both ground beans and water in the same container. Water trickled down through a central cylinder that contained the coffee and filter and then up into the outer body of the vessel. The other long-favored method was the tinplate or enamel “drip-pot” that simply filtered the hot water from an upper vessel, through the ground coffee and into a lower one.

By the mid-1830s the first true coffee machines, large alcohol-heated drip machines, had been developed, primarily for cafés. Steam pressure was popular for smaller domestic models, especially in Italy. Simple steam-pressure machines featured a water container with a filter for the ground coffee. A metal tube dipped into the water, and as it heated, the pressure of the resulting steam forced the water through the coffee and out of the tube. Italian companies like Pavoni and Snider produced a variety of these models in the early twentieth century. Later models were electrically heated.

Coffee was a popular drink in Europe and America, and it was there that the major developments took place. The earliest electric appliances were percolators (the first introduced in 1908 by Landers, Frary & Clark under their Universal trade name) with a heating element attached to the base. Two types of vacuum coffeemakers were developed in Britain. The Siphon percolator of the 1850s used the principle of the vacuum siphon patented by Robert Napier in 1830. The apparatus consisted of two flasks linked by a pipe. Boiling water was poured onto ground coffee in a glass flask. The steam generated by hot water in another flask, usually of china, created a vacuum that drew the liquid coffee through. It was then served from a tap on the side. Another nonelectric solution was the Cona vacuum system developed by Alfred Cohn in London in 1910. It consisted of two glass vessels. The bottom one held the water and was connected to the top one, which held the ground coffee. Once heated, the water rose into the top to infuse the coffee, while the cooling lower half created a partial vacuum, which drew the liquid coffee back down. The Cona remains popular today. The Danish Bodum company introduced its version, the Santos, designed by the architect Kaas Klaeson, in 1958, and it is also still in production.

The United States and Germany, both coffee-drinking nations, continued to develop electric models that were effectively percolators with electric heating elements in the base. AEG produced an electric siphon model during the 1920s. In 1922, West Bend developed the Flavo-Drip coffeemaker, which did not require a filter; its popularity led to a stove-top percolator called the Flavo-Perk. The popular American Silex of the 1930s was a glass, two-bowl drip model that sat on a separate electric burner. Like kettles of the period, few were automatic. In 1937, S.W. Farber introduced the Coffee Robot, which was claimed to “do about everything but buy the coffee.” It was a vacuum type with an automatic shut-off and a thermostat to keep the coffee warm. Its success tempted other American appliance manufacturers into the market.

The postwar trend in the United States was for sleeker, all-in-one automatic electric coffeemakers. Glass was replaced by chrome bodies with Bakelite handles. Many had simple engraved patterns on their sides. Popular models included the Sunbeam vacuum Coffeemaster and the Universal Coffeematic percolator.

Meanwhile, Italy produced two more important developments, both in 1933. Alfonso Bialetti designed and produced the Moka Express, a two-part machine that forced the heated water up through the coffee into the upper vessel. Made of cast aluminum, it is still popular today, and it still carries the distinctive trademark of the cartoon caricature of its inventor. If the Moka was uncomplicated, the cafetiere designed by his fellow Italian Calimani was simplicity itself. Its now familiar form is that of a glass vessel with a plunge filter that is pushed down through the infusing coffee. It began to be used in French cafés after 1945 and became popular in the 1950s. The cafetiere is now ubiquitous on both sides of the Atlantic.

Italy was also the birthplace of espresso, a coffee produced through pressurized machines based on the 1901 patent of the Milanese engineer Luigi Bezzera. The main drawback of these machines was that the steam was forced through the coffee at a relatively slow rate, resulting in a bitter flavor. A Milanese man, Cremonesi, experimented with a piston mechanism to increase the pressure. He fitted it to the machine in Achille Gaggia’s bar in Milan. The piston method of forcing water through a bed of coffee at high pressure resulted in a fresher cup of coffee with a creamy head, or crema. Cremonesi died during World War II, and Gaggia went on to develop the idea, with the Gaggia machine going into production in 1948. This machine was synonymous with the rise of coffee bars in Europe and America during the postwar period and stimulated the desire for authentic espresso at home. Gaggia produced the first domestic electric espresso machine in 1952. It was named Gilda, after the film starring Rita Hayworth. A further improvement was the pump system developed by the Faema Company of Milan in the 1950s. A pump forced the water directly through the coffee at a constant temperature of 200°F (93°C). This method produced espresso very quickly and was adopted as the preferred method for domestic machines.

During the 1960s and 1970s, these European methods began to make headway in the United States and Britain. Fresh filtered coffee was simple to make, and there was less chance of overheating it, which could happen with percolators. Manufacturers like Braun, Philips, and Rowenta produced well-designed automatic filter coffeemakers with plastic cases. The cafetiere was also successfully marketed by Bodum, which introduced its Bistro cafetiere in 1974, beginning its successful Presso line. Coffee was now one of the world’s favorite beverages, although it must be remembered that the majority of sales were of the instant granulated variety. Instant coffee, the result of eight years’ research by the Swiss company Nestlé, was introduced in 1938. The coffee was dried in a way that eliminated the water but left the oils that gave the taste. By the mid-1990s, it accounted for 90 percent of all coffee drunk in the United Kingdom, over 70 million cups a day.

Nevertheless, the 1980s saw manufacturers respond to an increasingly sophisticated market. The domestic espresso machine came of age with sleek matte black miniatures from the likes of Braun, Bosch, Gaggia, Krups, and Siemens, fully equipped with steam pipes to froth up milk for cappuccinos. Initially expensive, these models forced the water through the coffee with either an electric pump or a centrifugal system that spins the water at high speed. In the early 1990s, Russell Hobbs, Tefal, and Krups produced combination machines featuring an espresso maker, milk frother, and filter coffeemaker.

As the kitchen has become both a stylish room and a workspace, the coffeemaker, like the kettle, has not escaped the attentions of contemporary designers, especially those working for Alessi. Aldo Rossi produced an espresso maker and a cafetiere, Richard Sapper an espresso maker, and Michael Graves a cafetiere.

Coffee remains popular throughout the world and the public taste for distinctive coffee has been stimulated by the growth of specialist coffee shops and cafés. Such is the market that brands like Starbucks are becoming global. In this environment appliances that replicate the coffee shop taste remain in demand.

Carpet Sweepers

The carpet sweeper is manually operated. Its rotating brushes pick up dust, which is then deposited in the pan above. In the late 1850s, many patents for carpet sweepers were lodged in the United States. They were based on the same principles as the first street-sweeping machine patents granted to the British engineer Joseph Whitworth in 1840 and 1842. These early patents did not result in commercial production.

In 1876, Melville Reuben Bissell, of Grand Rapids, Michigan, patented an improved design of carpet sweeper. He began production, and the Bissell carpet sweeper became the first commercially successful model. It consisted of a long pivoted handle, a wooden dust box on wheels, and a set of rotating brushes. Bissell’s innovation was the central bearing brush, which allowed the sweeping brushes to self-adjust to suit different surfaces. By 1906, annual production of Bissell carpet sweepers had exceeded the one million mark. In Britain, similar carpet sweepers appeared in the 1880s. Carpet sweepers have become even more portable since being made of plastics and lightweight metals, but their design has changed little, except for minor details such as the addition of corner brushes. They have retained a market niche because of their convenience for small cleaning jobs.

The Popularization of Photography

A number of American and European inventors developed coated paper films, but the big breakthrough was the invention in 1889 of celluloid roll film by the American chemist Henry Reichenbach. George Eastman, former bank clerk and founder of the Eastman Kodak Company, was Reichenbach’s employer. Based in Rochester, New York, Eastman Kodak revolutionized the camera industry by concentrating on the mass market. Before 1888, when Eastman launched his first camera, camera developments were geared to the needs of the professional user. Studio cameras had large, heavy wooden bodies, while field cameras, with folding bellows, were more portable, but expensive. The first Kodak camera, loaded with a 100-frame paper roll film, was advertised with the slogan, “You press the button, we do the rest.” Once the film was completed, the owner returned the camera to Eastman Kodak for processing. The camera plus film cost $25, and the cost of the prints and a new film was $10.

Eastman’s next step was the introduction of the Pocket Kodak box camera, the first truly portable camera, in 1895. By the time the Folding Pocket Kodak bellows camera followed in 1898, Eastman Kodak had sold one and a half million cameras. By 1900, about 10 percent of the population in both the United States and Britain owned a camera. A small, cheap box camera, the Kodak No. 1 Brownie, was introduced in that year and sold for just $1, plus 15 cents for a six-frame film. Designed by Frank Brownell and made from wood and cardboard, the No. 1 Brownie brought photography within the means of the average person.

By the 1920s, the amateur camera market was more competitive, so makers began to use design to increase the desirability of their products. In 1927, Eastman Kodak employed the American designer Walter Dorwin Teague to redesign the Box Brownie. Teague’s Beau Brownie design was launched in 1930 and featured a two-color Art Deco geometric front panel. Teague also designed the 1928 Vanity Kodak bellows camera, which came in a range of colors and with a matching case. The availability of Bakelite and other new plastics made it possible to produce cheaper cameras in a variety of shapes and colors. The Kodak Baby Brownie of 1933, again designed by Teague, had a squat Bakelite shell with rounded edges and a distinctive ribbed lens panel. A particularly innovative use of plastics was made by the American designer Raymond Loewy, in his 1937 Purma Special camera design for the British company R. F. Hunter. The streamlined black Bakelite shell incorporated an integral viewfinder and wind-on mechanism, while the lens was not glass, but Perspex, thus reducing the costs. Fun cameras are exemplified by the Coronet and Corvette Midgets, made by Britain’s Coronet Camera Company. These miniature cameras had rounded Bakelite bodies with domed tops housing the viewfinder and came in a range of striking, mottled colors.

Apple Computer, Inc.

Steven Jobs and Stephen Wozniak founded Apple in 1976. In 1975, Jobs and Wozniak were both working in Silicon Valley, California, for Atari and Hewlett-Packard, respectively. They were also members of the Homebrew Computer Club, a group of computer enthusiasts. Inspired by recent microprocessor developments, they built their own microcomputer, the Apple I, in Jobs’s garage. They received orders from a local computer shop and began small-scale production. Encouraged by this, they looked for financial backing. Mike Markkula became the third partner, taking over the financial and administrative side. The company was incorporated in 1977.

The Apple II, one of the first fully assembled personal computers to be sold commercially, was displayed at a San Francisco computer fair in 1977 and was an immediate success. Wozniak was injured in a plane crash in 1981, and although he returned briefly after recovering, he subsequently quit Apple. Apple became the first personal computer company to achieve annual sales of $1 billion and the fastest-growing American corporation ever, but by 1983 it needed to counter the challenge of the fast-selling IBM PC. Its response was the Apple Lisa, a computer featuring an innovative, user-friendly interface. Program options were displayed in the form of graphic icons, pull-down menus, and windows, with easy navigation via a mouse. However, the Lisa was too expensive for the home computer market. Its basic features were incorporated in the Apple Macintosh, styled by the Californian branch of the German design consultancy Frogdesign. The Mac received a high-profile launch in January 1984 with the showing of an unusually long (60-second) TV commercial, directed by acclaimed British film director Ridley Scott. In addition to the graphical user interface (GUI), the Apple Mac had the advantage of compactness. With the monitor, processor unit, and disk drive contained in one streamlined shell, the Mac became a style icon.

Disagreements with John Sculley, the company’s president and chief executive officer, led Jobs to resign in May 1985, and Apple faltered as IBM-compatible personal computers began to flood the market. A dispute with Microsoft over the alleged infringement of Apple’s design rights by Windows 1.0 was settled in a way that left Microsoft free to mimic the Apple GUI in later versions of Windows. In 1986, Apple bounced back by making desktop publishing affordable with the launch of PageMaker software for the Mac and the LaserWriter printer. A portable Apple Mac was introduced in 1989. The introduction of Microsoft Windows 3.0 in 1990 made Apple increasingly vulnerable to competition from PC manufacturers and allied software companies. Apple initially sustained its market share thanks to its loyal existing customer base and then bolstered its position with the introduction in 1991 of the Macintosh PowerBook, a notebook computer with networking and multimedia capability.

A joint venture with the Japanese company Sharp led in 1993 to the Apple Newton. This hand-held computer featured built-in handwriting recognition software, allowing input to be written onto the liquid crystal display with a plastic stylus. In 1994, the technical performance of the Mac was boosted by the adoption of the PowerPC chip, a fast microprocessor. In spite of such innovations, Apple’s market share declined steeply, prompting major corporate changes. The decision to license the Mac operating system came too late to be effective. In December 1996, Steven Jobs returned when he agreed to Apple’s acquisition of his NeXT software company. Jobs soon regained control of Apple as acting chief executive officer and in 1997 negotiated a five-year software development deal with Microsoft whereby Microsoft also invested $150 million in Apple.

While Apple continued to improve technical performance through the G3 (and later G4) versions of the PowerMac and PowerBook, it also developed a more competitively priced computer, the iMac. Like the first Mac, the 1998 iMac combined monitor, processor, and disk drive in a single shell. Its distinctiveness lay in its rounded lines and its availability in a range of five strong colors combined with translucent white. Back on a profitable footing, Apple consolidated its position in 1999 with the introduction of the iBook notebook computer.

Computer Printers

Information stored on a computer can be read off the screen, but many people find that reading from a computer screen is more physically wearing, in terms of eye strain and postural fatigue, than reading from the printed page. In the days of mainframe computers, printing was a batch job and required large, durable machines. The development of the first personal computers in the mid-1970s led to the corresponding development of desktop printers.

In the period 1976 to 1979, several types of printer became available. The best print quality was delivered by printers that used the same printing technology as contemporary typewriters. Indeed, some models were converted typewriters that retained the keyboard for dual-purpose use. The print head was a daisy wheel, a disk with spokes and raised characters around the circumference, and the printing medium was a carbon tape. Daisy-wheel printers were comparatively slow and noisy. An acoustic hood could be placed over the printer to deaden the noise, but this made the printer more bulky.

The alternatives to the daisy-wheel printer were cheaper, faster, and quieter, but delivered much lower print quality. In the late 1950s, dot-matrix printers were developed for use with mainframe computers. Dot-matrix printers use carbon tape, but the print head consists of tiny pins that are selectively fired, as instructed by the built-in microprocessor, to form characters. The print quality of desktop models improved somewhat in the early 1980s, when 24-pin heads superseded the original 9-pin heads. The cheapness of dot-matrix printers made them a popular choice where print quality was not the main consideration. Thermal and electro-sensitive printers were quieter still because they were nonimpact printers. Instead of a print head, they used a stylus; the printing medium, carbon, was impregnated in the paper. The carbon was released in response to electric current flowing through the stylus. The print was fainter and less crisp than that produced by a daisy wheel. Low print quality together with the high cost of the special paper limited the sales of these printers. However, thermal printing is still used in many fax machines.
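
Because each character is built up from a grid of dots, the principle is easy to demonstrate. Below is a minimal Python sketch using an invented 7x5 bitmap for the letter “A”; an actual print head fires its pins column by column as it sweeps across the paper, but the dot-grid idea is the same.

# Minimal sketch of dot-matrix character formation (invented 7x5 bitmap).
# Each string is one row; '1' means a pin fires and leaves a dot.
LETTER_A = [
    "01110",
    "10001",
    "10001",
    "11111",
    "10001",
    "10001",
    "10001",
]

for row in LETTER_A:
    # Render fired pins as '#' and silent pins as spaces.
    print("".join("#" if dot == "1" else " " for dot in row))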

Two methods of nonimpact printing have proved very successful: the ink-jet printer and the laser printer. The ink-jet printer first appeared on the market in the early 1980s. The Japanese optical and electronics company Canon pioneered “bubble jet” ink printers in 1981. In 1984, the American electronics company Hewlett-Packard introduced the first of its ThinkJet series of ink-jet printers. Ink, supplied in cartridges, is sprayed through a matrix of fine nozzles in the print head. As with the dot-matrix printer, each character is a composite of dots. In the early days of ink-jet printers, there was a tendency for the ink to “bleed,” creating a fuzzy effect. Bleed-resistant papers were created, but these, predictably, were more expensive than ordinary computer paper. By the late 1980s, improvements made to reduce the bleed problem and a steep drop in prices established the ink-jet printer as the favored budget purchase, in place of the dot-matrix printer. Ink-jet printers also have the advantage of compactness. When the notebook generation of portable computers emerged in the late 1980s, complementary portable models of ink-jet printers followed.

The only printer to match the daisy-wheel printer in terms of quality is the laser printer, which has more in common with the photocopier than the typewriter. The world’s first laser printer, the IBM 3800, was introduced by the U.S. office machine giant in 1976, but it took another ten years for the price to fall sufficiently for laser printers to become commercially competitive. Laser printers use powdered ink known as toner and contain a light-sensitive drum, a laser, and a rotating mirror. Where light from the laser beam falls on the electrostatically charged drum, the charge is dissipated; where no light falls, the charge remains and toner is attracted. The toner is transferred to paper and fused in place by heating. While characters and images are formed as patterns of dots, as with dot-matrix and ink-jet printers, the laser printer dots are so small and closely spaced that lines appear to be continuous.
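
The charge-and-toner logic described above can be shown as a toy Python sketch (a one-bit “page” with invented data, following the charged-area scheme the paragraph describes; many laser printers instead attract toner to the discharged areas):

# Toy sketch of the drum logic: True means the laser lit that spot.
exposed = [
    [True,  True,  False, True,  True],
    [True,  False, False, False, True],
    [True,  True,  False, True,  True],
]

# Where light fell, the charge dissipated; where it did not, charge
# remains and attracts toner, so unexposed spots print black.
page = [["#" if not spot else "." for spot in row] for row in exposed]

for row in page:
    print("".join(row))  # '#' = toner fused to the paper, '.' = blank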

Both ink-jet and laser printing technologies brought another advance—color printing. Canon introduced its first color ink-jet printer in 1982, only a year after its first monochrome model. It is now standard for ink-jet printers to operate as dual monochrome or color printers. Color printing is slightly more expensive because a tri-color ink cartridge has to be replaced more often than a black ink cartridge. Ink-jet models have a huge price advantage over color laser models. Hewlett-Packard, the company that has set the standards in laser printer technology since the launch of its first LaserJet printer in 1984, did not introduce a color model until 1994.

As printer technology changed, different manufacturers became involved. Makers of daisy-wheel printers included major typewriter manufacturers such as IBM and Olivetti as well as Xerox and Tandy/Radio Shack. Since then, Japanese companies have taken over much of the printer market. Epson became a leading maker of dot-matrix printers, and Canon, with a pedigree in the unrelated field of camera manufacture, is a leading maker of ink-jet printers. In the laser printer field, Japanese companies such as Canon and Panasonic dominate the lower end of the market, but Hewlett-Packard of the United States is still a major supplier of top-quality models.

Mrs. Sharp's Traditions...

Ok so this time of year, the blog I found and my own need to gather my ducklings at the moment reminded me of this book... which I already own... traditions sometimes get a bad rap in our world today, especially among Christians, as being too worldly or unnecessary. Yet Christ's admonition in Mark 7 wasn't to tell us to do away with any and all traditions not specifically outlined in Scripture (Thanksgiving anyone -- ok so that one is based on the feast of Succot but you get my drift); rather, it was that they were throwing away the scripture for their traditions and missing the point of the feasts, holidays and observances, which was to point them to God... but I digress,
This is a visually beautiful book and a great way to start adding small and large traditions to your own celebrations; it outlines ideas for each month. These are the things children remember when they leave home... (I don't like her other books quite as much because they veer off into paganistic stuff in my opinion, but glean the good and leave the rest!) http://www.amazon.com/gp/product/074321076X/ref=s9_simz_gw_s0_p14_t1?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-2&pf_rd_r=1K09DKJCNVK7P7HJJQDP&pf_rd_t=101&pf_rd_p=470938631&pf_rd_i=507846

Hearts for Home....


Found this by linking from one of my followers' sites and I love it! Beautiful blog with sweet photos, etc. Similar idea to the day book but a good reminder to challenge myself (ourselves?) to be mindful of our homes and what we are doing...

In today’s busy and aggressive society mothers at home are not given much encouragement to be committed to their role in being home. In my weekly 'Hearts for Home' post I am encouraging myself and others to commit to thinking about what 4-6 things we can do each week to bring our thoughts, prayers and actions to keeping our 'Hearts for Home'. I am praying that a spirit of gentleness, generosity in thoughts and time along with loveliness and grace would lead us to be all that God wants us to be.


So my list for today (Nov 24th) is:
1. Set up my prayer journal (again! :))
2. Make the chalkboard I've been wanting for awhile to display scripture for thought each week/month etc.
3. Decorate joyfully for the holidays with my kiddos and keep our Thanksgiving traditions, revisit my resources in this area
4. Family Prayer time
5. Focus on our home and what needs doing - do the next thing not everything
6. Encourage my husband physically and verbally
the list could go on and on, couldn't it? But I need encouragement at times not to get caught up in the daily stuff that is urgent and small but to see the larger picture of what God has called me to do, don't you?! (you can link to this blog by clicking on the hearts for home picture in my sidebar or the named link in my other blogs listing)

For Today.... Nov 23rd 2009

Outside my window... Grayish Nov day, trees reaching their bare arms towards heaven like dancers in a play - some still wear gowns of brown, green or yellow but soon they'll be dressed all in gray or black too.

I am thankful for... the freedom to home school, my children, quiet

I am wearing... Jeans, hubby's 299th BSB t-shirt (navy), blue snowflake socks and a turquoise bandanna

I am remembering... almost 20 years ago I was getting ready to have dd#1 and anxiously awaiting hubby's arrival from AIT; all our soldiers overseas and their families - OIF, OEF, Korea, etc....

I am going... to the library in a bit with ds #2 for school stuff

I am reading... Just Breathe, From Jerusalem to Irian Jaya, The Scarlet Letter,

On my mind... dd #1's car fender bender, Thanksgiving Plans, God's provision

From the learning rooms... Starting our Adventures in Knighthood unit and lapbook; dd #2 just told me she used her knowledge from America Rocks and Dr Quinn to answer a college test question! Gotta love homeschooling resources.... ;)

Noticing that... it's a bit chilly in the house and the sun is peeking out!

Pondering these words... see my post a couple down by A W Pink

From the kitchen... scrambled eggs, chorizo, fresh guacamole and tortillas for lunch; bean soup, corn muffins and salad for dinner

Around the house... our Christmas "tree"/bush was delivered today. We live on "Christmas Tree Lane" in Abilene -- two of the main streets in town get small trees from the city to decorate and light from now until Dec 31st

One of my favorite things... campfires and smores!

From my picture journal...

Sara in the Grand Tetons 2007 - absolutely breathtaking!





Clothes Dryers

There are two types of mechanical aid to drying washed clothes: appliances for extracting water by pressure and appliances for producing evaporation through heat. In 1900, with the exception of very large households, the only equipment available for drying clothes was the mangle or wringer, where wet clothing was passed between heavy rollers to squeeze out water. The production of new electrical appliances for drying clothes began in the 1920s but only reached the mass market from the 1950s onward.

The first widespread improvement of the twentieth century was the fitting of powered wringers to electric washing machines. The wringer was connected to the electric motor at the base of the washing machine by a vertical driveshaft. The user still had to lift out the wet clothing and guide it through the rollers. Washing machines with wringers were the most common type of washer from just before World War I until the late 1950s.

Electric spin dryers were introduced in the 1920s but did not find an immediate market as domestic appliances, although larger models for public laundries were more commercially successful. The spin dryer is based on the principle of centrifugal force: wet clothing is placed in a perforated drum that rotates about a vertical axis, forcing the clothing against the walls and pressing out the water, which drains downward naturally or can be pumped out upward to a sink. In the late 1940s, the relaunch of the automatic washing machine with its integrated spin-drying function provided new impetus. By the mid-1950s, manufacturers had begun to exploit the domestic potential of the spin dryer both as a separate appliance and in combination with the washer. In Britain, where ownership of automatic washing machines with integral spin drying grew slowly, the first spin dryer aimed at the mass market was introduced by Creda in 1956. The spin dryer has changed little in essence except in being made lighter (by replacing steel casing with plastic) and more compact.
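
As a rough worked example of the centrifugal principle (a sketch with assumed figures, not a manufacturer's specification), a drum of 0.2 m radius spinning at 2,800 rpm presses the load outward at well over a thousand times the force of gravity:

import math

# Assumed figures: drum radius 0.20 m, spin speed 2,800 rpm.
radius_m = 0.20
rpm = 2800

omega = 2 * math.pi * rpm / 60        # angular speed in rad/s
acceleration = omega ** 2 * radius_m  # centripetal acceleration, m/s^2
g_force = acceleration / 9.81         # in multiples of gravity

print(f"{acceleration:.0f} m/s^2, about {g_force:.0f} g")  # -> 17193 m/s^2, about 1753 g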

The drying of clothes by applying heat has evolved from the practice of placing damp clothes on racks in front of a fire or in airing cupboards near a hot water tank. Some versions of the lamp radiator type of electric room heater (circa 1905–1915) had rails at the top for hanging towels or clothes on. This was the only type of room heater that was safe for placing in direct proximity to damp clothes. Electric fans could also assist in the drying of clothes, but these were scarcer than room heaters. The next step was separate heated towel rails and drying cabinets, which appeared in the 1920s. The standard design

"The nature of Christ's salvation is woefully misrepresented by the present-day evangelist. He announces a Savior from hell rather than a Savior from sin. And that is why so many are fatally deceived, for there are multitudes who wish to escape the Lake of fire who have no desire to be delivered from their carnality and worldliness." - A. W. Pink


http://www.eternallifeministries.org/awp_books.htm -- I hadn't read anything by Pink but I love the quote and found this.

Blessings!

Ceramics

The use of ceramic vessels in the home is, of course, age-old, and most of the major developments in production techniques had been achieved before the twentieth century. Nevertheless, there were a number of processes that helped to further democratize the range and style of products available.

Photolithographic images for ceramics were developed during the late 1930s. This process allowed exact copies of original artwork to be reproduced on a piece via a transfer. It was first successfully exploited in the United States in the 1950s. A further improvement was the Murray-Curvex offset litho process that became available during the mid-1950s. This process transferred the still wet print onto the ceramic article via a gelatin pad or “bomb,” allowing the print to cover the sides of bowls and tureens, producing an “all-over” pattern.

These techniques were exploited by American and European designers and manufacturers and gave rise to a new wave of brightly patterned wares, often influenced by current artistic movements such as abstract expressionism. Some products were criticized as simply having new surface decoration applied to older shapes, but others were genuinely new combinations of exciting shapes and patterns, available at affordable prices.

The 1960s and 1970s saw less creativity in design, but technological development continued with tougher glazes able to withstand electric dishwashing.

Casco Products Corporation

Casco Products, an industrial-products company, began producing domestic appliances in 1949 with a successful electric iron. Soon after the company was acquired by Standard Kollsman Industries in 1960, it introduced the Lady Casco range of appliances. Pots and pans had long been sold as matching sets that could also be displayed, especially if enameled or colored; the Lady Casco range took this concept one step further. It attempted to “theme” appliances to match the American “dream kitchen,” an inducement to replace existing appliances with a new set all by the same manufacturer. The line of ten matching appliances centered on the Chef Mate, a motor-driven base with a range of attachments, including a mixer and a blender. The rest of the Lady Casco set consisted of a toaster, coffeepot, iron, and frying pan.

If this approach was novel, so was the method of marketing the products. The line was to be offered to stores on a franchise arrangement with the added gimmick that each set carried an exclusive five-year guarantee backed by Lloyd’s of London! By the close of 1961 over 2,000 stores had signed up to the franchise deal and sales were encouraging. Then in 1962 the parent company decided to abandon the project and the Lady Casco program was discontinued. In 1963 the appliance section of the business was acquired by Hamilton Beach.

Most domestic appliances follow a “house style,” and smaller pieces such as toasters and kettles have been themed to complement each other, even if sold separately. A recent trend has been the marketing of “double” or “triple packs” of products by manufacturers like Hinari, Breville, and Morphy Richards. These usually consist of a toaster and a kettle, the third element being either a sandwich toaster or a coffeemaker.

Can Openers

Tinplate canisters or “cans” for food were developed by the Frenchman Nicolas Appert and the Englishman Peter Durand in the late eighteenth and early nineteenth centuries. It was thanks to Louis Pasteur’s work on bacteria and improvements in sterilization that by 1860 canning had become commercially viable. It established the emerging food-processing industry, delivered more reliable food supplies, and led to new forms of food retailing. The earliest can openers were of the “spike and blade” variety. A spike was driven into the top of the can and the lid cut off with the blade. Although still on sale until the late 1930s, they were superseded by the “blade and cogwheel” type with “butterfly” handles, which cut off the lid far more cleanly. They were usually made either of all steel or with wooden handles and are still on sale today, usually with plastic bodies and steel blades and cogs.

The 1950s and 1960s saw the variety of canned goods increase. Wall-mounted can openers were introduced with gear-driven cutting wheels operated by a handle. Most featured a magnet to hold the lid once it had become separated from the body of the can. They were available in a range of colors to match the increasingly brighter kitchen units of the time. In 1968 Sunbeam produced an electric combination can opener/knife sharpener in avocado green.

Electric openers became available in the 1960s, either as wall-mounted or free-standing appliances. The can is placed against the cutting wheel and held in place by a lever. The motor then rotates the can against the blade, and the opener switches itself off automatically once the can is open. Black & Decker currently produce seven types, some with built-in knife sharpeners and bottle openers.

Early Photography

“Camera” is the Latin word for vault or chamber. It was adopted as the name for the photographic device because early demonstrations of the basic photographic principle involved a dark chamber, hence “camera obscura,” with a pinhole in one wall, letting in light and forming an image of an external scene on the opposite wall. The Italian artist and inventor Leonardo da Vinci described the camera obscura in 1515. In the form of a small wooden box with a simple lens instead of a mere pinhole, the camera obscura became an optical toy. The addition of a prism to reflect the image downward onto paper created the camera lucida, which could be used for tracing a scene or subject. In the eighteenth century, chemical experimenters discovered that certain salts, such as silver nitrate, reacted to light. The next discovery was that an image could be retained by placing an item such as a leaf on a surface coated with an emulsion of silver salts and exposing it to light. The exposed surface darkened, and the covered surface remained unchanged. Images produced by this contact process were called photograms.
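
The pinhole geometry itself is simple similar-triangles arithmetic. As a short worked example in Python with assumed figures (a sketch, not a historical measurement), a 10 m building seen from 50 m away, projected inside a chamber 2 m deep, forms an inverted image 0.4 m tall:

# Pinhole (camera obscura) geometry by similar triangles; figures assumed.
object_height_m = 10.0    # height of the scene outside
object_distance_m = 50.0  # distance from the scene to the pinhole
chamber_depth_m = 2.0     # distance from the pinhole to the far wall

image_height_m = object_height_m * chamber_depth_m / object_distance_m
print(f"image is {image_height_m:.2f} m tall, inverted")  # -> 0.40 m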

In the early nineteenth century, pioneers in England and France invented true photography. In the early 1820s, the French inventor Joseph-Nicéphore Niépce developed his “heliographic” process, whereby a pewter plate coated with bitumen was placed in a camera obscura and exposed to light for eight hours. Another Frenchman, Louis-Jacques-Mandé Daguerre, developed the first commercially successful photographic process in 1838. He captured the image on a metal plate coated with silver iodide and used mercury vapor to “fix” the image. However, daguerreotypes were delicate and needed careful handling. They were typically mounted in hinged cases to prevent fading and protect the surface. In 1839, Alphonse Giroux of Paris developed an improved camera for Daguerre, and exposure times began to shorten.

The major disadvantage of the daguerreotype was that the image could not be reproduced. The one-step process was soon superseded by the calotype process, patented by the Englishman William Henry Fox Talbot in 1841. Fox Talbot’s process produced a negative image, with the black and white portions reversed, which was developed and fixed.

The image was then reversed to produce one or more positive images by placing the negative in contact with photosensitive paper and re-exposing it to light. In 1843, Fox Talbot invented an enlarger, which created a positive photographic print larger than the negative. However, his calotype paper was soon supplanted by more durable and fast-exposure glass-plate negatives. The wet-plate or wet collodion process, developed by the Englishman Frederick Scott Archer in 1851, required the photographer to coat the glass plates immediately before use and develop them straight afterward, a rather messy procedure that tended to discourage amateur interest. The introduction of dry gelatin plates, consisting of glass with a coating of silver bromide, in the 1870s made outdoor photography easier and stimulated amateur photography. It was the invention of roll film in the 1880s, however, that made photography attractive to a mass audience.

Broilers

Broiling, the cooking of meat on a fire or on a grid over it, is one of the most ancient forms of cooking. The first electric broilers appeared in 1916. The first table model was the Broil King, manufactured by the International Appliance Company in 1937. Table broilers were usually cylinders with hinged or removable lids. The meat sat on a perforated metal tray while the cooking element was housed in the lid. Farberware introduced the Open Hearth broiler in 1962. This featured a heating element placed below the food.

A variant was the Rotissimat of 1946, produced by the Rotissimat Corporation, which, as its name implied, featured a rotisserie for poultry. The product was promoted in supermarkets by using the machines to roast chickens in-store, and it has been suggested that this stimulated the introduction of shop-roasted chicken. Rotissimat went into liquidation in 1954, but the name survives as a generic term.

Manufacturers in the United States also produced open gas broilers, or indoor barbecues, in the 1940s and 1950s as companions to gas or electric stoves. Broilers were not so popular in the United Kingdom and Europe, where the home rotisserie became combined with a grill; Moulinex produced such table models in the 1970s. The 1960s saw “top-of-the-range” electric ovens with rotisserie attachments within their “eye-level” grills. Table broilers are no longer such popular items due to the increasing speed and sophistication of ovens, grills, and microwave ovens.

Paradoxically, the classic method of broiling food on a gridiron has become more popular with the outdoor barbecue. Here traditional charcoal, whether ignited by fire-lighters or gas jets, continues to be the popular fuel.

Household Tips is about to launch!

Check Back Soon!

Divestiture

Despite periodic government restraints, AT&T survived intact until 1984. In 1969, the recently licensed Microwave Communications Incorporated (MCI) obtained FCC approval to connect its microwave long-distance service to the local Bell networks. With major changes in the offing, including the expansion of mobile telephone services, the U.S. Department of Justice challenged AT&T’s monopoly in a landmark 1974 antitrust suit, with the initial intention of separating its manufacturing and service functions. After prolonged pretrial hearings, the case finally reached the trial stage in 1981. Only a year later, AT&T surprisingly agreed to a negotiated settlement whereby the company was dismantled to create eight separate companies. The settlement was approved by the courts in 1983 and took effect in January 1984.

Prior to divestiture, AT&T was the world’s largest private company by such a large margin that the downsized AT&T was still in the world’s top three. AT&T retained its long-distance telephone network, its international telephone services, its manufacturing function (Western Electric), and its research and development function (Bell Laboratories). It also gained the right to branch out into data communications, an activity previously prohibited by the FCC. AT&T lost the twenty-two Bell System local networks, which were restructured to form seven regional operating companies, nicknamed the “Baby Bells”—Nynex, Bell Atlantic, Bell South, Southwestern Bell, Ameritech, U.S. West, and Pacific Telesis. Deregulation did not end there, as the Baby Bells began to test the regulatory limits by applying to expand both geographically and functionally. Moreover, in 1995, AT&T announced a voluntary demerger, whereby it would split into three independent companies. In October 1996, AT&T’s manufacturing and research business was reconstituted as Lucent Technologies. Two months later, its computer business, the NCR Corporation, followed. (NCR had been acquired after the original divestiture.) This left AT&T, now redesignated the AT&T Corporation, with the long-distance telephone network, cellular phone services, a business-communications consultancy, a credit facility, and Internet services.

Camcorders

The camcorder, or video camera, captures moving images and sound on videotape. Camcorders targeted at the amateur user came on the market in the 1980s within a few years of the first professional models. Although camcorders were initially a luxury item, reductions in their price and size boosted ownership, particularly in their birthplace, Japan.

The camcorder’s predecessor, the motion picture (cine) camera, was never found in more than a small minority of homes. In the late nineteenth century, Thomas Alva Edison in the United States and the Lumière brothers in France pioneered the development of equipment for recording and playing moving pictures for public entertainment. Early motion picture cameras were hand-cranked, which required skill and was therefore a deterrent for amateur users. To create the illusion of continuous motion, the cameras had to capture 24 images per second, each of which was shown twice (i.e., at 48 frames per second) when the film was projected for viewing. Any change in the rate of hand-cranking would ruin the illusion.

The motorization of the motion picture camera made it more user-friendly, so cheaper models designed for amateur use were marketed in the 1920s by makers such as Kodak and Pathé. Although 8 mm motion picture film was available from 1932, 16 mm remained the standard for amateur cine cameras until the 1950s when more compact 8 mm models appeared, mainly produced by Japanese companies such as Canon. Motion picture cameras still had distinct disadvantages for leisure use. One disadvantage was the need for a projector and screen to show the films, and another was the absence of sound. Although professional motion picture film incorporating a sound track was developed in the 1920s, the equipment was not economically feasible for the amateur market.

The camcorder followed in the wake of the videocassette recorder, which gave the television set a new role as a playback device rather than just a broadcast receiver. As a sophisticated piece of technology, the camcorder was initially expensive and designed as a portable tool to meet professional broadcast standards. The conventional television camera owed much of its bulk to the size and shape of the orthicon electron tube. The first generation of camcorders contained a vidicon tube, which was much shorter and slimmer than the orthicon tube. Inside the camcorder, light entering the lens strikes the faceplate of the vidicon tube. The faceplate’s photoconductive lead-oxide coating converts light to an electric charge, which is picked up by the scanning electron beam and delivered as an output signal to the video recording head. The video track is recorded diagonally across magnetic tape, whereas the sound track, recorded simultaneously through a microphone, is placed along one edge. A small screen allows the user to preview shots and to play back the recording.

As Japanese companies had become dominant in the motion-picture-camera market in the 1950s, predictably they have also dominated camcorder production. In 1982, Sony released the Betacam professional camcorder, which used half-inch tape. Sony recognized that this Betamax-derived format, which had already lost out to VHS in the videocassette recorder market, was too large to be successful for consumer camcorders. In 1982, a group of electronics manufacturers, including Sony and the Dutch company Philips, agreed to work on developing a standard miniature format, Video8, based on an 8-mm tape cassette. The Japanese company JVC (Japanese Victor Company), developer of the VHS format, soon pulled out of the Video8 consortium to concentrate on a compact version of VHS. In 1984, JVC launched the world’s first compact VHS camcorder, the GR-C1. The Compact VHS cassette (VHS-C) had a running time of one hour and was only a third of the size of a standard VHS cassette, but could be placed in a special adaptor shell for playback on VHS videocassette recorders. The Video8 specification was agreed upon in 1983, and the first Video8 camcorders appeared in 1985. In the United States, Kodak launched the KodaVision 8 mm camcorder, which was manufactured for Kodak by Panasonic, a subsidiary of the Japanese company Matsushita. Sony’s Handycam solid-state camcorder was more compact, weighing only 1 kg (2.2 lb), and the running time of the 8 mm cassettes was ninety minutes. In the case of camcorders, absolute standardization of tape format proved to be less critical than it had been in the case of videocassette recorders, largely because there was no prerecording issue and no need to use a videocassette recorder for playback.

Camcorders had far greater inherent consumer appeal than motion picture cameras, partly because of the convenience of playback via the television set. Other advantages were that videotape entails no external processing costs and is reusable. Recordings can be viewed immediately and then shot again if the results are not satisfactory. Since the mid-1980s, camcorders have evolved rapidly. In 1989, Sony brought out Hi8, a higher-resolution 8 mm tape. The replacement of the vidicon tube with solid-state imaging devices not only reduced the size and weight of camcorders, but also improved the video quality and reduced power consumption. There are two types of solid-state video pickups—the metal-oxide semiconductor (MOS) sensor and the more popular charge-coupled device (CCD). Both consist of an array of tiny photodiodes that convert light to electrical energy, but CCDs employ a scanning method that produces a higher output signal. The CCD was invented at American Telephone and Telegraph’s Bell Laboratories in 1969 by George Smith and Willard Boyle. It was first used in Sony’s Handycam camcorder.
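
The row-by-row scanning that distinguishes the CCD can be illustrated with a toy Python sketch (invented charge values; real devices also handle color filtering, amplification, and precise clocking, all omitted here):

# Toy sketch of CCD-style readout: each photosite holds a charge
# proportional to the light that fell on it; charges are then shifted
# out row by row, bucket-brigade fashion, as one serial signal.
frame = [
    [12, 80, 33],
    [ 5, 90, 41],
    [ 7, 60, 22],
]

signal = []
while frame:
    row = frame.pop(0)             # shift the top row into the readout register
    while row:
        signal.append(row.pop(0))  # clock each charge packet out serially

print(signal)  # -> [12, 80, 33, 5, 90, 41, 7, 60, 22]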

Hot Cocoa & Marshmallows...




Ok so it's officially cold in KS now! (39 on my computer at the moment and expecting snow this week!) During the colder months our family tends to go through hot cocoa mix a lot -- as in a cup or two every night (x5 or 6!). So hubby and I were looking at buying it at the store today and I decided I'm sure I can find a recipe online with ingredients I can pronounce! (every one at the store had hydrogenated something...) Now some of you may not consider this healthy at all but I figure at least I know what it is so it's got to be better than the others! Plus I'm sure if you REALLY wanted to you could make it sans sugar and add stevia to your taste and with dutch process cocoa etc (which would of course change the price). I've made homemade mallows before and am including that recipe for you too. (De Etta, if you have one that's better please let me know.) Oh by the way our local cocoa connoisseur (aka Jessica) has pronounced it fine. There are several out there that include non-dairy creamer which I didn't want, so just look through them!... one caveat is that this is still not as good as the kind made with sugar, milk and cocoa on the stove but it is just as good as the powdered stuff.




Cocoa


This recipe is off the Food Channel and I found it just by entering hot cocoa mix recipe into my search engine! (the measurements in parentheses are the ones I used, which = 4 times the recipe)


3/4 cup powdered milk (7 C)

1/2 teaspoon ground cinnamon (1 Tb)
3/4 cup sugar (6C)
1/2 cup cocoa powder (4C)
4 ounces bittersweet chocolate (2 packages of mint chocolate chips -- opt)


This came out to $10.00 for 115 oz (7 lbs of mix / 1 gallon jar), which is 18c/cup (at 2 oz, or 1/4 cup, of mix per cup)

without the chocolate chips it would be $6.00, or 9.5c/cup


to give you an idea, our local store brand was $3.00/30 oz -- 20c/cup

and Swiss Miss at our store was $5.00/28 oz -- which is 32c/cup


So it is cheaper and better for you to make your own; with organic ingredients it would probably cost more but still be better in the long run!


Here's the marshmallows:

(I have made these before and they are yummy; make a BIG batch - one is more than enough for a cup of cocoa! Again some may not find these truly healthy but they are still better than store bought, unless you can find/have your own marshmallow plant! This recipe is from Mary Jane's Ideabook*Cookbook*Lifebook but you can find them on the Food Channel site as well)


9 packets of Chill Over Powder (I used regular gelatin and like 8 packs)

1 1/4 C water
2 1/2 C sugar
1 C light corn syrup
1/2 tsp salt
2 Tb vanilla
2 egg whites
1/4 tsp cream of tartar


In a small saucepan, dissolve gelatin in 3/4 C of cold water (sprinkle over water and whisk). Place over medium heat, stirring constantly for 3 minutes (DO NOT go longer!). Set aside -- it will be a thick paste.

In a 3 qt saucepan, blend together 1/2 C water with sugar, corn syrup and salt. Bring to a slow boil. Continue to cook unstirred until candy thermometer reaches 240 F -- don't overcook or they'll be tough. Remove from heat, whisk in gelatin paste until dissolved and stir in vanilla.

In a large mixer, beat egg whites until frothy and add cream of tartar. Continue beating until soft peaks form. On low speed, add sugar mixture in a steady stream. After all the syrup is added, turn mixer to high and beat for 15 minutes (yes, that's right, 15!)

Spread into a 9x13 pan that's been heavily dusted with powdered sugar. Dust top. Let dry overnight, or for 8 hours. Loosen mixture from pan by lifting around edges with a knife (I ran mine under hot water). Turn out onto a powdered-sugar-dusted cutting board. Dust bottom with powdered sugar again, cut into squares with a knife dipped in powdered sugar. Dust edges completely so they don't stick together. (It's not as much powdered sugar as it seems)... great with smores or cocoa!



Enjoy!



Basket Buddies

I miss mine!
Susan Miller was our speaker at Fall Worship & Study in '04, and she made the statement that "a basket buddy is someone who will climb in the basket with you when you're a basket case!"....


Life?

Ok so some friends have told me "that's just life" about a status I had on Facebook stating, "I just wonder why everything has to happen at the same time: reintegration, retirement, dating issues, college stuff for the girls, kiddos leaving the nest (soon!), perimenopause, etc..." (I didn't even throw in our finances, extended family issues, etc.). Maybe they're right, but it sure seems like everyone else gets to space those things out a little, or only have two or three instead of ALL of them at the same time! I mean, come on, Lord....

However: feeling overwhelmed, saying "I thank God for Tim being home but this transition sucks!", "Lord, I know you're good and in all this, but I'm not seeing it", or "retirement is scary and a hard transition for civilians, much less military" -- is not SIN! It's reality, and while it may not be tragic or life-ending, it is life-changing and not easy! Saying "that's life" is NOT helpful.... Saying "I'm right there with ya," "I'll be praying for you," "make a list of praises and challenges," "how can I help/come alongside?" or something along those lines IS.

And by the way, I'm sorry, but I still want to know what God was thinking by making hormone changes, kids leaving home, & husbands talking about retirement happen all at the same time?!!

Favorite Vacation...




Ok so I'm a couple days late, but here are some of our vacation memories! Bear in mind, though, that many of our "vacations" take place at the same time as a PCS move to another place!


I don't know if I have one that's a favorite though --


Hubby and I went to Paris in '03 for 4 days sans kiddos: lots of coffee, French food, walking everywhere!, sightseeing, chocolate, wine, and hot sticky nights (August with no air conditioning!).

Our Cross Country Trip in 2004 would be one as well... we picked up our new van in VA (AAFES) and drove to NY to the I Love Lucy Museum for dd, then over to Ohio to see fil, on to Chicago (American Girl Place for both dd's -- payment for babysitting while Mom was the FRG leader during deployment!), down Illinois to see friends, Thanksgiving at Cracker Barrel, Red River Army Depot in Texarkana, N. Arkansas to the Cook Homeplace, St. Louis, the Precious Moments Chapel, the GW Carver monument, friends in OK, mil in Waco, my family in AZ, and on to Fort Irwin, CA!

Then there was this move, which was fun from CA to KS, as we took Route 66 all the way and stopped at great little out-of-the-way diners (Kingman has awesome root beer!), sang songs from Cars, etc.
We visited the Gila Cliff Dwellings with my Mom and Dad and reserved a B&B near the dwellings based on what we saw online! Well, on the way there my Mom started having a panic attack and wasn't dealing with the twisty, windy roads (ear issues), so when we finally got there she was all nervous and anxious. The B&B was an old schoolhouse run by a bunch of hippies, and while the accommodations were fine, it wasn't EXACTLY what we were thinking! A bit shabby around the edges, and very much a communal breakfast with liberal ideas floating around! Needless to say, Dad was ready to GO almost as soon as he got up! :) We did manage to get Mom a glass of wine and a dip in the (small) natural hot tub, then visited the dwellings the next day and went on to Silver City.

So many great memories along the way as well -- unexpected hospital visits in NM, traveling with two dogs and a rabbit, the Grand Canyon (where hubby and I met), White Sands (once a stopover every summer on the way to Ark, now our family's general stop on the way to AZ), flights overseas, a whirlwind tour of DC and Arlington when it was FREEZING cold, Noah freaking out because he realized he was stepping on live critters at Cabrillo Natl Monument during an extreme minus tide....

Just a few of our adventures!


Air-Conditioning

Air-conditioning is an integrated, automated system for controlling the temperature, humidity, and cleanliness of air in a building. The concept arose from the known sensitivity of certain industrial processes to air temperature and humidity. In nineteenth-century textile mills, the use of water sprays to cool and humidify the air was a primitive attempt at air-conditioning. True air-conditioning was the invention of an American mechanical engineer, Willis Haviland Carrier. In 1902, only a year after graduating from Cornell University, Carrier installed a system for controlling air temperature and humidity in a printing plant in Brooklyn, New York. He patented his “apparatus for treating air” in 1906. Early customers for Carrier’s system included textile mills, where dry air could cause fibers to become unmanageable owing to the effects of static electricity. Interest was not restricted to cotton mills in the American South; the first foreign customer was a silk mill in Yokohama, Japan. In 1911, Carrier made public the formulae for his air-conditioning calculations, which still form the basis of air-conditioning technology today.

In order to capitalize on his invention, Carrier formed the Carrier Engineering Corporation with six partners in 1915 and began manufacturing air-conditioning units in 1922. By then, Carrier had invented a new machine—the centrifugal chiller. This refrigeration device provided the first practical solution to the problem of cooling very large spaces. As air-conditioning became well established in the industrial context, operators of other types of large buildings, such as theaters and hotels, became aware of the more general benefits of air-conditioning as a means of improving human comfort levels. The first public building to feature a Carrier centrifugal chiller was the J. L. Hudson Department Store in Detroit, Michigan, where three chillers were installed in 1924. Four years later, the Carrier company developed the “Weathermaker,” a small air-conditioning unit specifically designed for household use.

Mean Mom....

yep that's me! I so want one of these shirts...

http://www.meanmom.com/ then click on Our Products

http://www.meanmomuniversity.com/index.html


and I added this blog to my site as well.

http://themeanestmom.blogspot.com/2008/01/about.html

Gotta laugh!

Calculators

The term “calculator” may be applied to any device that assists the process of calculation. However, in practice it has become shorthand for one such device, the electronic calculator, which was the first to achieve widespread ownership beyond the workplace.

The ancestors of the electronic calculator were the desktop mechanical calculating machines developed in the late nineteenth century. These were based on principles established in the seventeenth century by the French mathematical philosopher Blaise Pascal and the German Gottfried Wilhelm Leibniz and were commonly known as adding machines. By 1900, two main types had emerged: machines operated by setting levers, and machines with numerical keyboards, of which the first was the Comptometer of 1886, invented by the American Dorr E. Felt. The American inventor William S. Burroughs developed an adding and listing machine in 1892, with a built-in printing facility for producing a paper record. Adding machines soon became a standard piece of office equipment.

The purely mechanical adding machine evolved into the more compact, electrically powered version, which became typical in the 1950s. The first commercial electronic calculator was a transistorized desktop model introduced by the British Bell Punch Company in 1963. Texas Instruments produced a hand-held electronic calculator in 1967. Early American and Japanese hand-held calculators were still large by today’s standards. Ownership in the home remained low because people’s needs outside the workplace could be met more cheaply and conveniently by the use of “ready reckoner” tables and slide rules, or simply by mental effort.

The image and role of the calculator changed only when the advent of microelectronics enabled the production of small, cheap calculators. The world’s first true pocket calculator was the Sinclair Executive calculator, designed by the British inventor Clive Sinclair and launched in 1972. It featured an LED (light-emitting diode) display. In the same year, Hewlett-Packard pocket calculators became available in the United States. In 1973, the Japanese company Sharp introduced the first electronic calculator with a liquid crystal display. Within five years, the price of pocket calculators had fallen dramatically. In 1979, the pocket calculator became the card-size calculator when Sharp developed a super-thin model.

The pocket calculator is an example of a product that created demand where it did not previously exist. Today, the sophistication of the pocket calculator has reached such a level that even cheap models incorporate a range of scientific functions well beyond the needs of the average user. More expensive models have larger displays so that results can be presented graphically. The leading manufacturers are Japanese companies such as Casio and Sharp. In environmental terms, the pocket calculator has another distinction: it is the only commonplace device available in a solar-powered form. The solar unit in a calculator is a semiconducting photoelectric cell, which converts light energy into electric energy, thus removing the need for batteries. Pocket calculators may be wholly solar-powered or dual-powered, with back-up battery power to compensate for low light levels.

Atari, Inc.

Atari Inc., the world’s first electronic games company, was founded by the American engineer and entrepreneur Nolan Bushnell in 1972. Bushnell was previously employed by the Ampex Corporation, manufacturers of audio and videotape recorders. The name Atari describes a move in the Japanese game “Go.” The company was based in Sunnyvale, California. Bushnell’s first electronic game was Pong, an electronic version of table tennis, originally developed as a coin-operated machine for use in bars and amusement arcades. Atari introduced a home version of Pong two years later, but by then another American company, Magnavox, had launched its Odyssey electronic ball game. While Atari was doing good business with its coin-operated games machines, it lacked the capital to compete with the larger companies that were entering the home-entertainment market. Therefore, in 1976, Bushnell decided to sell Atari to Warner Communications, although he continued to work for Atari until 1978.

In 1977, Atari introduced a games console called the Video Computer System (VCS) that took interchangeable game cartridges. By 1979, there were more than twenty VCS games available, including Space Invaders, which had been developed as an arcade game by the Japanese company Taito. The popularity of the home version of Space Invaders persuaded Atari to develop VCS versions of its own arcade games, which, by 1982, included Asteroids, Battle Zone, Missile Command, and Pac-Man.

Meanwhile, the company had also turned its attention to the growing home-computer market. Steve Jobs, cofounder of Apple, was employed by Atari in 1976 when he and Stephen Wozniak developed the Apple I microcomputer. While Atari rejected an offer to acquire the rights to the Apple I at a time when the market was untested, two years later it introduced the Atari 400 and 800 home computers. The Atari home computers were technically sound and were well supported with peripherals, but their commercial success suffered because of Atari’s divided commitments. By the early 1980s, Atari was facing stiff competition on all fronts. The launch of the IBM PC in 1981 was followed by a savage price war in 1982, as companies such as Commodore and Texas Instruments cut prices in an effort to maintain sales. While other companies were continuously updating their home computers, Atari was tied up in the rather lengthy development of a new line of home computers, the XL series. The introduction of the XL computers in 1983 did little to offset the financial problems caused by the slump in Atari’s games sales.

In 1984, Warner Communications was glad to offload Atari by selling it to Jack Tramiel, who had founded Commodore in 1958. Tramiel streamlined Atari by reducing the workforce and suspending existing projects, and he also developed new products. Atari’s prospects began to look healthier when the launch of the 520ST home computer was favorably received. The 520ST was nicknamed the “Jackintosh” because of its similarity to the Apple Macintosh. Aimed at and priced for the home market, the 520ST’s disadvantage was a limited range of software. To capitalize on the burgeoning range of PC software, Atari launched its first PC-compatible computer, the 8088-based PC-1, in 1987. Its other strategy was to compete for Apple’s share of the non-PC business-computer market by developing a more powerful successor to the 520ST. Like Apple, Atari used Motorola microprocessors. The Atari TT computer, based on the Motorola 68030 chip, was launched in Europe in late 1989, a year before its American launch. Europe, where Apple had a lower market share than in the United States, had proved a more receptive market for the predecessor ST computer. The Atari TT was cheaper than its Apple equivalent, but software was again a stumbling block.

Meanwhile, Atari had not entirely given up on the games market, but it was under increasing pressure from the Japanese companies Nintendo and Sega. While Atari was largely relying on its back catalogue of games, Nintendo and Sega were developing new, inventive games. The Atari 7800, 2600Jr., and XEGS games consoles were technologically primitive compared to the rival Japanese systems, but Atari showed that it was still capable of innovation in the games field when it unveiled its Lynx portable games console, with a full-color LCD display, in 1989. However, Nintendo’s cheaper Game Boy, also launched in 1989, was more commercially successful.

With its computer sales stagnating, Atari pinned its hopes on overtaking its Japanese rivals in the games market by developing an advanced console. The Atari Jaguar console, launched in December 1993, was the world’s first 64-bit games console. It featured high-quality sound and color rendering. A contract securing IBM manufacture of the hardware and the commitment of numerous software developers to produce games for the Jaguar were promising signs. However, Atari made the fatal mistake of underestimating the time required for software development after the release of the programming code. Consumers lost interest in the superior Jaguar hardware when a suite of games was slow to appear. Atari was unable to recoup sufficient lost ground before the arrival of the Sony PlayStation and Nintendo 64 consoles in 1996.

The commercial failure of the Jaguar was the last straw for Atari. No longer viable as an independent company, it merged with JTS, a disk-drive manufacturer. This proved to be a short-lived reprieve as JTS folded in 1998. The American toys and games company Hasbro purchased the rights to all Atari’s games. This was just one of a series of strategic acquisitions in the 1980s and 1990s that broadened Hasbro’s product range. In 1995, Hasbro had set up an “interactive” division to develop games on CD-ROM for the personal computer and the Sony PlayStation. Old favorite Atari games, such as Centipede and Frogger, are now available in CD-ROM format. In Europe, Atari’s main computer market, computers are still being made to the specifications of Atari’s architecture and operating systems.

New Posts Coming Very Soon Now!

 I am working behind the scenes now to get this blog up and running!

Please check back soon!!




FOR TODAY: November 10th, 2009

  • Outside my window... grey and chilly-looking, but it's supposed to be 60 today; I wish the leaves would've stayed on the trees longer!
  • I am thinking... that I need to finish this and move on, and that our rat terrier is spoiled.
  • I am thankful for... Coffee from my hubby this morning (he's on leave)
  • From the learning rooms... HAVE to get the school records caught up and college transcripts for the girls done (AAAH!)
  • From the kitchen... hmmm, cereal for breakfast, chicken chili for lunch; need to make a grocery and menu list and make bread.
  • I am wearing... jammies (blush) - In-N-Out AZ tee and flannel pants with eldest dd's slippers!
  • I am creating... this post at the moment! :)
  • I am going... to call Julie today and schedule Jessica's senior photos!
  • I am reading... Be Still and Know That I Am God by Amy & Judge Reinhold, which Shellie Kelly gave me at PWOC Conference this week; Inkheart; and In Every Pew There's a Broken Heart. I'm also learning about Lectio Divina -- I always have more than one book going.
  • I am hoping... to get those transcripts done today! Or at least most of them...
  • I am hearing.... my mantel clock chiming, the dogs' toenails on our wood floor, my son taking out the trash, and the computer chill pad at the moment.
  • Around the house... lots to catch up on after conference!
  • One of my favorite things... morning snuggles with my 9 yo
  • A few plans for the rest of the week: visiting Lebanon KS so Noah can say he too has been to the center of the contiguous United States, grocery shopping, Cowboy Joe party tonight for kiddos, AWANA, babysitting gigs for the girls, scrapbooking??


  • Here is a picture for thought that I am sharing...
Thanks, Mandy, for the link!

British Telecom

British Telecommunications plc, more familiarly known as British Telecom or BT, was formed by the privatization of Britain’s national telephone system in 1984. As a private-sector company, it not only retained its core business of supplying local, long-distance, and international telecommunications services and equipment in Britain, albeit in a new competitive environment, but also gained new international business opportunities. British Telecom now operates joint ventures in thirty other countries worldwide, including Spain, India, and South Africa.

The industry had been run as a state monopoly since 1912. Telephone services were introduced in Britain in the late 1870s and 1880s by privately owned companies, including the United Telephone Company, which was jointly owned by America’s National Bell and Edison. In 1880, the government awarded licensing control over telephone services to the state-owned Post Office, which already had a monopoly of telegraph services. In 1889, when a number of competing private companies merged to form the National Telephone Company, the Post Office took over the operation of long-distance lines. The Post Office assumed full control of telephone services following the nationalization of the industry in 1912, although a few local telephone services continued to be owned and operated by local authorities.

In the late 1960s, a combination of factors prompted recognition of the need for institutional change within the British telephone industry. The introduction of automatic distance dialing, known as subscriber trunk dialing (STD), in 1959 and international subscriber dialing in 1963 put pressure on telephone exchanges, many of which needed upgrading. As a government department, the Post Office was subject to Treasury constraints on investment. In 1969, the Post Office gained a greater degree of financial autonomy when it became a nationalized industry rather than a government department operating on a budget allocated by the Treasury. Consequently, it was able to commission a consortium of three British companies, GEC, Plessey, and STC, to develop a computer-controlled digital telephone exchange system, named System X. After prototype testing in 1978, it was introduced in London in 1980 and gradually extended.

Privatization came as a result of the 1979 election of a Conservative government on a free-enterprise platform. Under the ensuing British Telecommunications Act of 1980, the Post Office lost its monopoly of telephone services. In preparation for privatization, it was restructured in 1981 into two independent divisions, mail and telecommunications. A second telephone service supplier, Mercury Communications, was granted a license in 1982. In 1984, the newly privatized British Telecom opened its first digital international exchange, installed by Thorn-Ericsson Telecommunications Ltd.

Privatization and deregulation coincided with the introduction of cellular phone services. In 1982, the British government decided to grant two nationwide licenses for cellular phone services. One license was awarded to the Cellnet consortium, led by British Telecom in partnership with the security company Securicor, and the other went to the Vodafone consortium led by Racal Electronics. Both cellular phone services became operational in 1985. A second phase of telecommunications deregulation followed the release of the government’s 1990 discussion paper “Competition and Choice: Telecommunications Policy for the 1990s.” The duopolies in both the fixed telephone and cellular phone systems were discontinued, opening the market to new operators.

British Telecom’s response to increased competition was to strengthen its commitment to customer care by launching a new BT mission in 1991 that promised to put customers first. While British Telecom has retained its overall leadership in British telephone services according to market share, in terms of financial success it has been overtaken by the mobile phone company Vodafone.

Brillo Pads

Brillo pads are steel wool pads impregnated with a special soap containing jeweler’s rouge. They were introduced by the Brillo Company of Brooklyn, New York, in 1930.

The company was the result of what would now be called a “market-led” approach. A Mr. Brady, a New York door-to-door salesman, was selling aluminum pots and pans and noted that his customers complained about how difficult they could be to keep clean. Brady consulted his brother-in-law, Mr. Ludwig, a costume jeweler. It was Ludwig who struck upon the idea of combining soap with jeweler’s rouge to produce the required shine. Brady then found that his soap was beginning to outsell the pans. Brady and Ludwig approached a lawyer, Milton B. Loeb, for advice on establishing a company to begin commercial production. Loeb must have seen the potential, as he joined them and provided the brand name, Brillo, after the Latin beryllus (shine). Loeb went on to become treasurer and president of the company. The Brillo soap was patented and registered as a trademark in 1913.

Brillo’s main product was the soap, which was sold with pads of steel wool. Initially sold by door-to-door salesmen, the pads were soon taken up by grocery and hardware stores and by chains such as Woolworth’s. The steel-wool pads impregnated with the soap were introduced in 1930. Brillo remains one of the world’s best-selling pan cleaners, along with its main rival, SOS. Both survived the arrival of motorized scouring pads in the 1960s; the Kent Kordless of 1962 was one such product but was deemed not worth its cost by Consumer Reports.

Thanks to Andy Warhol’s oversize replicas, the Brillo pad’s bright and simple packaging, along with the Campbell’s Soup can, became an icon of 1960s pop art. The company is now a part of Church & Dwight Co., Inc.

Barbecues

Barbecues, or open-air meals, date back to large social events such as ox or hog roastings. Such events were communal affairs; today the barbecue is seen as a more private affair conducted in a suburban garden (yard). They still maintain their social functions, as they often double as parties. Barbecues became popular in the United States in the 1960s and spread to Northern Europe in the 1970s. The large barbecues offered as part of Mediterranean package holidays were another stimulus.

The garden barbecue grills or spit-roasts food, with the heat supplied by hot charcoal or compressed hardwood briquettes. There are many different shapes and sizes, but they are all used in much the same way.

The simplest type is based on the Japanese hibachi, or fire bowl, a simple rectangular container with a grilling rack. Larger models stand on legs and can also be circular. They usually have a windshield with slots to accommodate different grilling positions and spits. Rounded kettle barbecues are more sophisticated, with domed hoods and vents. When the hood is closed, the barbecue acts more like an oven, capable of broiling joints or fowl.

The most difficult part of barbecue cooking is getting the charcoal to light. This has been made easier by solid and liquid firelighters. An easier way still is to light the fuel with liquefied petroleum gas; more sophisticated models also have small gas rings. Funnel barbecues use lightly folded newspaper that is set alight to deliver rapid, intense heat to the fuel.

The appeal of barbecue cooking is that it can make simple sausages and burgers taste better. One interesting sociological factor is that, although women still do most of the cooking, the control of the barbecue is often a male preserve.

Prov 31:27...

"She looks well to how things go in her household, and the bread of idleness (gossip, discontent, and self-pity) she will not eat."... Amplified Bible.

This verse has been coming back to me repeatedly of late. What exactly does looking well to how things go in your household mean? Well, it doesn't necessarily mean your household looks like mine - remember, it says HER household - we all have different schedules, jobs, etc. to manage.
Hubby's schedule and needs as a military man probably look very different from those of a pastor, or of someone with a set schedule every day, or of someone who doesn't have a husband, etc. That's not bad; it just is.

What about the kiddos? How many you have, and their ages, genders, and tastes, will determine differences in this area as well.

I think looking well to your household means being on top of things, keeping things running smoothly, and doing so cheerfully - and yet it is more than that. Oh, there's the budgeting, etc., but I mean the spiritual side as well: praying for said husband and children if you have them, meeting physical, spiritual, and emotional needs, and NOT comparing myself to Mary Jane down the road or a friend who has other needs. Too often as women, and particularly Christian women, we tend to think that if we aren't quiet, always happy, never moody, and don't do things exactly the same as others, then we are failing. The Church sometimes reinforces this idea with books on how to be the perfect Christian woman, or with people who say things like "if you..." or who give ideas and scripture as if they were God, well-meaning though they may be. Where's the grace for each other? To be sure, we need to be Titus 2 women who teach and help younger women, but not in a way that makes them feel inadequate or that they somehow don't measure up.

What does the Lord require of you? That will look different for each of us, and in each season! Using store-bought bread for a season, if that's what you need, will not kill your family, nor will eating out occasionally (notice I said occasionally :)), to use two random examples.

Ask the Lord to show you what looking well to your home's ways means for YOU.

Breadmakers

The baking of bread at home declined during the twentieth century due to the rise of industrialized baking and retailing. By 1950 most people bought their bread from small local bakeries, which in turn were overtaken by the large supermarket chains.

In the United Kingdom, the late 1970s saw a reaction, led by food writers, against the rather bland industrially produced bread, and a demand for greater choice. This feeling was amplified as more people took holidays in France, where the tradition of the small local bakery has remained intact, even in large cities. Supermarkets responded with “in-store” bakeries and a much wider variety of breads inspired by French and Italian recipes.

Bread could be baked in the home in a gas or electric oven, but during the early 1990s manufacturers developed countertop breadmakers designed to give good results every time. West Bend produced the first American model in 1993. The company moved very quickly to enter this niche market, completing the project in only thirty-five weeks, from concept to shipping. A breadmaker is essentially a mixer, proofing oven, and mini-oven in one. They have plastic “cool wall” cases, usually with viewing windows. They automatically knead, proof, and bake, and they can take up to three sizes of loaf tin. The ingredients are placed in a nonstick baking tin, a cycle is selected, and the machine does the rest. A paddle in the bottom of the bread pan kneads the dough, stopping two or three times to let it rise. Settings are usually for overnight baking, but some models offer high-speed programs that deliver a loaf in less than two hours. Most models have midcycle indicators to allow extra ingredients such as fruits and nuts to be added. They come in a variety of sizes, which usually relate to the size of the loaf, from one to two and a half pounds.
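To make that sequence concrete, here is a minimal sketch, in Python, of the kind of timed program such a machine steps through. The phase names and timings are invented for illustration and do not reflect any particular manufacturer’s firmware; real machines vary and may repeat the knead/rise phases.

    # Illustrative only: a toy model of breadmaker programs, with
    # made-up timings. Not any real machine's control logic.

    BASIC_WHITE = [
        ("knead", 15),        # paddle in the nonstick tin works the dough
        ("rise", 40),
        ("knead", 3),         # paddle restarts briefly between rises
        ("add-ins beep", 0),  # midcycle indicator: add fruit and nuts now
        ("rise", 50),
        ("bake", 60),
    ]

    RAPID = [("knead", 12), ("rise", 30), ("bake", 55)]  # under two hours

    def describe(name, program):
        total = sum(minutes for _, minutes in program)
        print(f"{name} ({total} min total)")
        for phase, minutes in program:
            print(f"  {phase}: {minutes} min")

    describe("Basic white", BASIC_WHITE)
    describe("Rapid", RAPID)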

Popular manufacturers include Black & Decker, Breville, Oster, Panasonic, Prima, Sunbeam, Toastmaster, and West Bend.

Braun

The Braun company was founded in Frankfurt in 1921 by Max Braun (1890–1951), an engineer from East Prussia. It originally produced connectors for machine belts, moving into components for radios and gramophones in 1923. By 1925 the company was producing many of its own plastic components, and by 1929 it had begun to make complete radio sets. Braun became one of Germany’s largest radio manufacturers. It began to innovate during the 1930s, introducing a combined radio and phonograph in 1932 and a battery-powered portable radio in 1936. By 1938 its modern Frankfurt factory employed 1,000 people.

During the postwar reconstruction it added domestic appliances and electric razors to its range of products. The Braun S50 shaver and the Multimix appeared in 1950. In 1954 Braun struck a deal with the Ronson Company, which was licensed to manufacture Braun shavers in the United States.

Max Braun was succeeded in the early 1950s by his sons Artur and Erwin, who were interested in design and brought in a range of talented designers to work on their products. Dieter Rams joined the company in 1955, along with Hans Gugelot, Otl Aicher, and Gerd Alfred Müller. The following year it set up its own design department, which Rams headed from 1960.

The result of this corporate approach was a unified range of products that possessed a sculptural simplicity. The electronics of razors, food mixers, and heaters were enveloped in white metal or plastic covers with minimal, easy-to-use controls. The KM 321 Kitchen Machine of 1957, a food mixer, exemplified this approach. This “neofunctionalist” approach could also be seen in audio products such as the Phonosuper of 1956, nicknamed “Snow White’s Coffin” because of its rectangular shape, white body, and clear Perspex lid. Braun set a standard that influenced other companies to take design more seriously. Its products were selected by New York’s Museum of Modern Art and praised at the 1958 Brussels World Fair as “outstanding examples of German manufacturing.”

The aesthetic merit of Rams’s designs was reflected in the work of U.K. “Pop” artist Richard Hamilton in his Toaster screen print and collage of 1967. He stated, “My admiration for the work of Dieter Rams is intense and I have for years been uniquely attracted towards his design sensibility; so much so that his consumer products have come to occupy a place in my heart and consciousness that the Mont Sainte-Victoire did in Cézanne’s.”

The controlling interest in Braun was bought by the U.S. Gillette Company in 1967. Since then its style has become a little diluted, but the ET22 calculator and the Micron shaver have ensured that Braun products remain distinctive. Braun has, probably more than any other company, managed to successfully marry modernist principles to industrial production. The results have largely been just what Erwin Braun wished them to be: “honest, unobtrusive, and practical devices.”