Sunday, October 18, 2009

Computing - Machine Evolution

The subsequent 60-year diffusion of the computer within society is a long story that must be told elsewhere. Perhaps the single most remarkable development was that the computer—originally designed for mathematical calculations—turned out to be infinitely adaptable to different uses, from business data processing to personal computing to the construction of a global information network. We can think of computer development as having taken place along three vectors—hardware, software and architecture. The improvements in hardware over the past 60 years are legendary. Bulky electronic tubes gave way in the late 1950s to “discrete” transistors—that is, single transistors individually soldered into place. In the mid-1960s microcircuits contained several transistors—then hundreds of transistors, then thousands of transistors—on a silicon “chip.” The microprocessor, developed in the early 1970s, held a complete computer processing unit on a chip. The microprocessor gave rise to the PC and now controls devices ranging from sprinkler systems to ballistic missiles.

The challenges of software were more subtle. In 1947 and 1948 von Neumann and Goldstine produced a series of reports called Planning and Coding Problems for an Electronic Computing Instrument. In these reports they set down dozens of routines for mathematical computation with the expectation that some lowly “coder” would be able to convert them into working programs. It was not to be. The process of writing programs and getting them to work was excruciatingly difficult. The first to make this discovery was Maurice Wilkes, the University of Cambridge computer scientist who had created EDSAC, the first practical stored-program computer. In his Memoirs, Wilkes ruefully recalled the moment in 1949 when “the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding errors in my own programs.”

He and others at Cambridge developed a method of writing computer instructions in a symbolic form that made the whole job easier and less error prone. The computer would take this symbolic language and then convert it into binary. IBM introduced the programming language Fortran in 1957, which greatly simplified the writing of scientific and mathematical programs. At Dartmouth College in 1964, educator John G. Kemeny and computer scientist Thomas E. Kurtz invented Basic, a simple but mighty programming language intended to democratize computing and bring it to the entire undergraduate population. With Basic even schoolkids—the young Bill Gates among them—could begin to write their own programs.
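
To make the symbolic-to-binary idea concrete, here is a minimal sketch of a toy assembler, written in Python purely for illustration; the three-instruction vocabulary (LOAD, ADD, STORE) and the 8-bit word format are invented for this example, not those of EDSAC or any real machine:

# A toy assembler: translates symbolic instructions into binary words.
# The instruction set and word format are invented for illustration.

OPCODES = {"LOAD": 0b00, "ADD": 0b01, "STORE": 0b10}

def assemble(lines):
    """Turn lines like 'ADD 5' into 8-bit words: 2-bit opcode, 6-bit address."""
    words = []
    for line in lines:
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic] << 6) | int(operand))
    return words

program = ["LOAD 4", "ADD 5", "STORE 6"]
for word in assemble(program):
    print(f"{word:08b}")   # e.g. "LOAD 4" becomes 00000100

The programmer writes mnemonics; the machine does the tedious, error-prone translation into bit patterns. That division of labor is what made programming tractable.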

In contrast, computer architecture—that is, the logical arrangement of subsystems that make up a computer—has barely evolved. Nearly every machine in use today shares its basic architecture with the stored-program computer of 1945. The situation mirrors that of the gasoline-powered automobile—the years have seen many technical refinements and efficiency improvements in both, but the basic design is largely the same. And although it might be possible to design a radically better device, both have achieved what historians of technology call “closure.” Investments over the decades have produced such excellent gains that no one has had a compelling reason to invest in an alternative.
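
To see what has stayed the same since 1945, here is a minimal sketch of a stored-program machine in Python; the four-instruction set is hypothetical, but the fetch-decode-execute loop over a single memory holding both program and data is the essence of the design:

# A minimal stored-program machine: instructions and data share one memory,
# and the processor repeatedly fetches, decodes, and executes.
# The instruction set (LOAD/ADD/STORE/HALT) is invented for illustration.

def run(memory):
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, addr = memory[pc]       # fetch the next instruction
        pc += 1
        if op == "LOAD":            # decode and execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; data lives in cells 4-6 of the same memory.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory)[6])   # prints 5: the machine computed 2 + 3

Because the program sits in the same memory as the data, a program can be loaded, replaced, or even modified like any other data—the key insight of the 1945 architecture.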

Yet there are multiple possibilities for radical evolution. In the 1980s interest ran high in so-called massively parallel machines, which contained thousands of computing elements operating simultaneously. This basic architecture is still used for computationally intensive tasks such as weather forecasting and atomic weapons research. Computer scientists have also looked to the human brain for inspiration. We now know that the brain contains specialized processing centers for different tasks, such as face recognition or speech understanding. Scientists are harnessing some of these ideas in “neural networks” for applications such as license plate identification and iris recognition.
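
As a taste of how a neural network's processing elements work, here is a minimal sketch of a single artificial neuron (a perceptron) in Python, trained on the logical AND function; this toy setup is for illustration only—real recognition systems wire together many thousands of such units:

# A single artificial neuron (perceptron), the building block of neural
# networks. Trained here on AND; the data and learning rate are illustrative.

def step(x):
    return 1 if x >= 0 else 0

weights, bias, lr = [0.0, 0.0], 0.0, 0.1
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

for _ in range(20):                       # repeated passes over the examples
    for (x1, x2), target in data:
        out = step(weights[0] * x1 + weights[1] * x2 + bias)
        error = target - out              # nudge weights toward the target
        weights[0] += lr * error * x1
        weights[1] += lr * error * x2
        bias += lr * error

for (x1, x2), _ in data:
    print((x1, x2), step(weights[0] * x1 + weights[1] * x2 + bias))

Rather than being programmed with explicit rules, the neuron adjusts its weights from examples—the same principle, scaled up enormously, behind systems that recognize license plates or irises.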

More blue-sky research is focused on building computers from living matter such as DNA and on computers that harness the weirdness of the quantum world. No one knows what the computers of 50 years hence will look like. Perhaps their abilities will surpass even the powers of the minds that created them.


THE FUTURE OF COMPUTER ARCHITECTURE
The stored-program computer has formed the basis of computing technology since the 1950s.

What may come next?
Quantum: The much-touted quantum computer exploits the ability of a particle to be in many states at once. Quantum computations operate on all these states simultaneously (a minimal simulation sketch follows this list).

Neural Net: These systems are formed from many simple processing nodes that connect to one another in unique ways. The system as a whole exhibits complex global behavior.

Living: Computers based on strands of DNA or RNA process data encoded in genetic material.
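
The quantum sketch promised above: a minimal Python simulation of a single qubit, assuming the standard state-vector formalism. A Hadamard gate puts the qubit into an equal superposition of 0 and 1:

# Simulating one qubit as a two-entry state vector (a sketch on a classical
# machine, not real quantum hardware). Measurement probability is the
# square of each amplitude.

import math

state = [1.0, 0.0]                   # qubit starts definitely in state |0>

h = 1 / math.sqrt(2)                 # Hadamard gate as a 2x2 matrix
H = [[h, h],
     [h, -h]]

state = [H[0][0] * state[0] + H[0][1] * state[1],
         H[1][0] * state[0] + H[1][1] * state[1]]

for basis, amplitude in enumerate(state):
    print(f"P(measure {basis}) = {amplitude ** 2:.2f}")   # 0.50 each

A register of n qubits is described by 2^n such amplitudes, which is why a quantum computation can, in a sense, operate on all those states at once.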

Source of information: Scientific American, September 2009
