Microelectronics

While the minicomputer widened the market for computers, computers were still too expensive and complex for small businesses, let alone individuals. To come within reach of a mass market, they needed to become still smaller, cheaper, and easier to use. The next advance in fundamental electronic technology after the transistor was the integrated circuit. Although it had taken the research resources of the world's largest company, AT&T, to invent the transistor, within ten years transistor manufacture was dominated by new specialist semiconductor companies. The first integrated circuit was created in 1958 by Jack Kilby of Texas Instruments; it consisted of five components on a single germanium chip. A year later at Fairchild Semiconductor, founded in 1957, Jean Hoerni devised the planar process, which his colleague Robert Noyce used to design the first planar integrated circuit. The planar process involved oxidizing a silicon wafer, coating it with a photosensitive material, photographing a pattern onto it, etching the pattern into the oxide, washing off the coating, and selectively introducing impurities. It was a repeatable process that enabled complex circuits to be built up on a silicon wafer. By 1970, the price of an integrated circuit, also known as a silicon chip, had fallen from about $30 to $1, and a single chip might contain up to 100 components.

The use of integrated circuits made the printed circuit boards of devices such as calculators much more compact. Integrated circuits began to appear in computers in the late 1960s, but a central processing unit still required thousands of them. In 1968, Robert Noyce cofounded Intel, which began to develop large-scale integrated circuits. Noyce had predicted that the way forward was to fit an entire central processing unit onto a single chip, but it was one of his employees, Ted Hoff, who actually achieved it. Hoff designed the Intel 4004, the world's first microprocessor, originally developed for a calculator built by the Japanese firm Busicom. In terms of raw mathematical processing power, the Intel 4004 was roughly the equivalent of ENIAC. Its limitation was that, as a 4-bit chip (meaning it could handle only four binary digits at a time), it could not process alphabetical characters: four bits can define only 2^4 = 16 distinct codes, far too few for an alphabet. The IBM 7030 computer of 1961 had introduced the 8-bit character, or byte, which became the standard for general computing. Intel launched its first 8-bit microprocessor, the 8008, in 1972, followed by the improved 8080 in 1974, paving the way for the first generation of home computers. An 8-bit chip can define 2^8 = 256 different characters.
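To make the arithmetic behind these word-size limits concrete, the short C sketch below (a purely illustrative modern program, not anything that ran on these chips) prints how many distinct values a 4-bit and an 8-bit word can encode.

```c
#include <stdio.h>

/* An n-bit word can take 2^n distinct bit patterns, which is why a
 * 4-bit processor such as the 4004 can distinguish only 16 codes,
 * while an 8-bit processor such as the 8008 distinguishes 256. */
int main(void) {
    unsigned widths[] = {4, 8};
    for (int i = 0; i < 2; i++) {
        unsigned long codes = 1UL << widths[i]; /* 2^n */
        printf("%u-bit word: %lu distinct codes\n", widths[i], codes);
    }
    return 0;
}
```

With only 16 codes, a 4-bit word cannot even cover the 26 letters of the alphabet, whereas 256 codes comfortably accommodate letters, digits, and punctuation.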
