Computers play a key role in how individuals work and how they live. Even the smallest organizations have computers to help them operate more efficiently, and many individuals use computers at home for educational, entertainment, and business purposes.
Nearly 5,000 years ago the abacus emerged in Asia Minor. The abacus may be considered the first computer. This device allowed its users to make computations using a system of sliding beads arranged on a rack. Early shopkeepers used the abacus to keep track of transactions. As the use of pencil and paper spread, however, the abacus lost its importance. Nearly twelve centuries passed before the next important advance in computing devices emerged.
In 1642, Blaise Pascal, the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. The Pascaline, a brass rectangular box, used eight movable dials to add sums up to eight figures long, working in base ten. The disadvantage of the Pascaline, of course, was its limitation to addition. In 1694, Gottfried Wilhelm von Leibniz, a German mathematician and philosopher, improved on the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials.
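To see what those eight base-ten dials were doing, here is a minimal sketch of the carry mechanism: each dial holds one digit, and when a dial passes nine, the excess rotation advances the next dial. The representation and the function name are invented purely for illustration; they are not a description of Pascal's actual gearing.

```python
# Illustrative sketch only: models the Pascaline's eight base-10 dials as a
# list of digits (least significant dial first) and shows how a carry ripples
# from one dial to the next.

def pascaline_add(dials, amount):
    """Add `amount` at the lowest dial of an 8-digit register."""
    digits = list(dials)              # e.g. [7, 9, 9, 0, 0, 0, 0, 0] = 997
    carry = amount
    for i in range(len(digits)):
        total = digits[i] + carry
        digits[i] = total % 10        # a dial can only show 0-9
        carry = total // 10           # excess rotation advances the next dial
    return digits                     # any leftover carry simply overflows

print(pascaline_add([7, 9, 9, 0, 0, 0, 0, 0], 5))
# 997 + 5 -> [2, 0, 0, 1, 0, 0, 0, 0], i.e. 1002
```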
It wasn't until 1820, however, that mechanical calculators gained widespread use. A Frenchman, Charles Xavier Thomas de Colmar, invented a machine that could perform the four basic arithmetic functions. The arithmometer presented a more systematic approach to computing because it could add, subtract, multiply, and divide. With its enhanced versatility, the arithmometer was widely used up until World War I.
The real beginnings of computers as we know them today, however, lay with an English mathematics professor, Charles Babbage. Babbage's steam-powered Analytical Engine outlined the basic elements of a modern general-purpose computer and was a breakthrough concept. The Analytical Engine consisted of over 50,000 components. Its basic design included input devices in the form of perforated cards containing operating instructions, and a "store" with room for 1,000 numbers of up to 50 decimal digits each.
In 1889, an American inventor, Herman Hollerith, created a machine that used punched cards to store data, which were fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Hollerith brought his punch card reader into the business world, founding the Tabulating Machine Company in 1896, which became International Business Machines (IBM) in 1924 after a series of mergers. Other companies also manufactured punch card readers for business use, and both business and government used punch cards for data processing until the 1960's.
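The column-per-character idea is easy to sketch in code. The mapping below follows the later standard IBM card code (one punch for a digit, a zone punch plus a digit punch for a letter); it is illustrative only, not Hollerith's 1890 layout, and the function names are invented for this example.

```python
# Toy illustration of column-oriented punched-card storage: each of the 80
# columns holds one character, encoded as the set of row positions punched.

def encode_column(ch):
    """Return the punched row positions for one character."""
    if ch.isdigit():
        return (int(ch),)                       # one punch: the digit row
    if 'A' <= ch <= 'I':
        return (12, ord(ch) - ord('A') + 1)     # zone 12 + digits 1-9
    if 'J' <= ch <= 'R':
        return (11, ord(ch) - ord('J') + 1)     # zone 11 + digits 1-9
    if 'S' <= ch <= 'Z':
        return (0, ord(ch) - ord('S') + 2)      # zone 0  + digits 2-9
    return ()                                   # blank column otherwise

def encode_card(text):
    """Encode up to 80 characters, one per column, padding the rest blank."""
    columns = [encode_column(ch) for ch in text.upper()[:80]]
    return columns + [()] * (80 - len(columns))

print(encode_card("CENSUS 1890")[:11])
# -> [(12, 3), (12, 5), (11, 5), (0, 2), (0, 4), (0, 2), (), (1,), (8,), (9,), (0,)]
```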
When World War II began, governments sought to develop computers to exploit their potential strategic importance. This increased funding for computer development projects and hastened technical progress. By 1941, the German engineer Konrad Zuse had developed a computer to help design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers, and American efforts produced the broader achievements. In 1944, Howard H. Aiken, a Harvard engineer working with IBM, succeeded in producing a large electromechanical calculator whose purpose was to create ballistic charts for the U.S. Navy. It was about half as long as a football field, contained about 500 miles of wiring, and used electromagnetic signals to move mechanical parts. The machine was slow, taking 3 to 5 seconds per calculation, and inflexible, in that sequences of calculations could not change; but it could perform basic arithmetic as well as more complex equations.
Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC). Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power. Developed by John Presper Eckert and John W. Mauchly, ENIAC was a general-purpose computer.
In 1945, John von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) with a memory to hold both a stored program and data. This stored-program technique, together with "conditional control transfer," which allowed the computer to be stopped at any point and then resumed, allowed for greater versatility in computer programming. The key element of the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances. These first computers were characterized by the fact that operating instructions were made-to-order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. This made the computers difficult to program and limited their versatility and speed. Other distinctive features of these first computers were the use of vacuum tubes and magnetic drums for data storage.
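The heart of the stored-program idea is that instructions and data share one memory, and a single control unit fetches and executes instructions from it. The sketch below is a deliberately tiny, invented three-instruction machine meant only to illustrate that fetch-decode-execute loop; it is not EDVAC's or UNIVAC's actual instruction set.

```python
# Minimal sketch of the stored-program concept: program and data live in the
# same memory, and one loop fetches, decodes, and executes instructions.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch the instruction at pc, decode it
        pc += 1
        if op == "LOAD":                # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":               # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":             # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold the program, cells 4-6 hold data; the program computes
# memory[6] = memory[4] + memory[5].
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 7, 35, 0]
print(run(memory)[6])   # -> 42
```

Because the program itself sits in ordinary memory cells, it can be replaced, or in principle modified, without rebuilding the machine; that is the flexibility the paragraph above describes.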
The invention of the transistor in 1948 greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube and was at work in computers by 1956. Throughout the early 1960's, there were a number of commercially successful computers used in business, universities, and government, from companies such as Burroughs, Honeywell, IBM, and others. These computers contained transistors in place of vacuum tubes, and they also contained all the components we associate with the modern-day computer: printers, disk storage, memory, tape storage, operating systems, and stored programs.
By 1965, most large businesses routinely processed financial information using computers. It was the stored program and the programming language that gave computers the flexibility to finally be cost-effective and productive for business use. Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc, which was made from quartz. Scientists later managed to fit even more components onto a single semiconductor chip.
By the 1980's, very large scale integration squeezed hundreds of thousands of components onto a chip, and ultra-large scale integration increased that number into the millions. The ability to fit so much onto an area about half the size of a dime helped diminish the size and price of computers, and it also increased their power, efficiency, and reliability. By the mid-1970's, computer manufacturers had begun to bring computers to general consumers. These microcomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs.
In 1981, IBM introduced its personal computer (PC) for use in the home, office, and schools. The 1980's saw an expansion of computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982, and ten years later, 65 million PCs were in use. As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software, and information, and to communicate with each other. Computers continue to grow smaller and more powerful.