Intelligence and Interoperability

In 1981, the Japanese government announced the launch of the Fifth-Generation Computer Project. While the preceding four generations were defined by their core electronic characteristics (valves, transistors, integrated circuits, and microprocessors), the fifth generation was a more holistic concept. The objective of the project was to develop computers with artificial intelligence over a ten-year period. The idea of artificial intelligence was not a new one. The term came into use in the mid-1950s, and long-term research had been undertaken in the United States at universities including Stanford and Carnegie Mellon. However, the focus had been much narrower and progress had been limited. While the algorithmic logic advocated by von Neumann was expressed in a sequential data-processing architecture, artificial intelligence required a parallel-processing architecture, more akin to the heuristic reasoning patterns of the human brain. Heuristic reasoning is the process whereby we draw on our knowledge of many things to infer solutions to problems; the success of the solutions depends on our expertise, in terms of the quality and range of our knowledge.

By 1980, only the most powerful computers, known as supercomputers, used parallel processing. In 1964, the U.S. company Control Data Corporation introduced the first supercomputer, the CDC 6600, designed by Seymour Cray, who went on to set up Cray Research, which became the leading producer of supercomputers. These computers were designed for complex tasks such as weather and aerodynamic modeling.
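The distinction between the two models can be seen in a minimal modern sketch, written here in Java (a language discussed later in this section; neither the language nor the example is drawn from the 1980s systems themselves). The loop mirrors the von Neumann style of a single sequential instruction stream, while the parallel stream partitions the same work across processor cores:

```java
import java.util.stream.LongStream;

// Illustrative sketch only: a sequential, von Neumann-style loop versus
// a data-parallel computation of the same result spread across cores.
public class SequentialVsParallel {
    public static void main(String[] args) {
        long n = 100_000_000L;

        // Sequential: one instruction stream processes each value in turn.
        long sequentialSum = 0;
        for (long i = 1; i <= n; i++) {
            sequentialSum += i;
        }

        // Parallel: the same work is partitioned across multiple processors.
        long parallelSum = LongStream.rangeClosed(1, n).parallel().sum();

        System.out.println(sequentialSum == parallelSum); // prints: true
    }
}
```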

Aside from artificial intelligence, the other significant strategic trend of the 1980s and 1990s concerned the development of software that allows greater interoperability of computers. While hardware compatibility is one way of achieving interoperability, it was evident that total hardware compatibility was extremely unlikely to occur. UNIX, the pioneering hardware-independent operating system, had provided a means of establishing a common platform across a network of nonmatched computers. In the early 1990s, another operating system on similar lines, Linux, named after its Finnish inventor Linus Torvalds, was released as “open source,” a term for nonproprietary software whose source code is made freely available. Further developments in open systems were stimulated by the introduction of the World Wide Web in the early 1990s. As the whole concept of the Web is that it is accessible to all computers irrespective of platform (hardware and operating system), it fostered new languages and applications. It has become accepted for Web applications, such as browsers and document readers, to be made available as freeware. One of the first of these applications was Adobe’s Acrobat Reader, which allows documents to be viewed identically on any platform. The leading web browsers, Netscape Navigator and Microsoft’s Internet Explorer, were introduced as freeware in 1994 and 1995, respectively. Released in 1995, Sun Microsystems’ Java programming language has become the main language for Web applications.
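Java’s contribution to interoperability can be illustrated with a minimal sketch (an illustrative example, not one drawn from the source): the same source file compiles to platform-neutral bytecode that any Java virtual machine can execute, regardless of the underlying hardware or operating system.

```java
// Illustrative sketch of Java's "write once, run anywhere" model: this
// source compiles to platform-neutral bytecode, and the resulting class
// file runs unchanged on any platform that provides a JVM.
public class Portable {
    public static void main(String[] args) {
        // The program itself can still query which platform it landed on.
        System.out.println("Running on: " + System.getProperty("os.name"));
    }
}
```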
