"Computers: Not the Greatest Discovery of the Twentieth Century"



Nothing epitomizes modern life better than the computer. For better or worse, computers have infiltrated every aspect of our society. Today, computers do much more than simply compute. Supermarket scanners calculate our grocery bill while keeping store inventory, computerized telephone switching centers play traffic cop to millions of calls and keep lines of communication untangled, and automatic teller machines let us conduct banking transactions from virtually anywhere in the world. But where did all this technology come from, and where is it heading? To fully understand and appreciate the impact computers have on our lives and the promises they hold for the future, it is important to understand their evolution.

The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use today, may be considered the first computer. This device allows users to make computations using a system of sliding beads arranged on a rack. Early merchants used the abacus to record trading transactions. But as the use of paper and pencil spread, particularly in Europe, the abacus lost its importance. It took nearly 12 centuries, however, for the next significant advance in computing devices to emerge. In 1642, Blaise Pascal, the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. This brass rectangular box, also called a Pascaline, used eight movable dials to add sums up to eight figures long. Pascal's device used a base of ten to accomplish this. For example, as one dial moved ten notches, or one complete revolution, it moved the next dial - which represented the tens column - one place. When the tens dial moved one revolution, the dial representing the hundreds place moved one notch, and so on. The drawback to the Pascaline, of course, was its limitation to addition.
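To make the carry mechanism concrete, here is a minimal Python sketch of the principle (an illustration only, not the brass machinery itself): each dial holds a digit from 0 to 9, and every full revolution of one dial advances its neighbor by one place.

    # A minimal sketch of the Pascaline's carry idea: eight base-10 dials,
    # where a full revolution of one dial advances the next dial one notch.
    def pascaline_add(dials, addend):
        """Add an integer to an 8-dial register, carrying between dials.

        dials[0] is the ones column, dials[1] the tens column, and so on.
        """
        carry = addend
        for i in range(len(dials)):
            total = dials[i] + carry
            dials[i] = total % 10    # the dial's new resting position
            carry = total // 10      # full revolutions push the next dial
        return dials

    register = [0] * 8               # eight dials, all set to zero
    pascaline_add(register, 996)
    pascaline_add(register, 7)       # forces a cascade of carries
    print(register)                  # [3, 0, 0, 1, 0, 0, 0, 0], i.e. 1003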

In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz, improved the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials. Partly by studying Pascal's original notes and drawings, Leibniz was able to refine his machine. The centerpiece of the machine was its stepped-drum gear design, which offered an elongated version of the simple flat gear. It wasn't until 1820, however, that mechanical calculators gained widespread use. Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, presented a more practical approach to computing because it could add, subtract, multiply and divide. With its enhanced versatility, the arithmometer was widely used up until the First World War. Although later inventors refined Colmar's calculator, he, together with Pascal and Leibniz, helped define the age of mechanical computation.

The real beginnings of computers as we know them today, however, lay with an English mathematics professor, Charles Babbage. Frustrated at the many errors he found while examining calculations for the Royal Astronomical Society, Babbage declared, "I wish to God these calculations had been performed by steam!" With those words, the automation of computers had begun. By 1812, Babbage had noticed a natural harmony between machines and mathematics: machines were best at performing tasks repeatedly without mistake, while mathematics, particularly the production of mathematical tables, often required the simple repetition of steps. The problem centered on applying the ability of machines to the needs of mathematics. Babbage's first attempt at solving this problem came in 1822, when he proposed a machine to compute mathematical tables by the method of finite differences, which he called a Difference Engine. Powered by steam and as large as a locomotive, the machine would have a stored program and could perform calculations and print the results automatically. After working on the Difference Engine for 10 years, Babbage was suddenly inspired to begin work on the first general-purpose computer, which he called the Analytical Engine. Babbage's assistant, Augusta Ada King, Countess of Lovelace and daughter of the English poet Lord Byron, was instrumental in the machine's design. One of the few people who understood the Engine's design as well as Babbage did, she helped revise plans, secure funding from the British government, and communicate the specifics of the Analytical Engine to the public. Lady Lovelace's fine understanding of the machine also allowed her to create the instruction routines to be fed into the computer, making her the first female computer programmer. In the 1980s, the U.S. Defense Department named the Ada programming language in her honor.

Babbage's steam-powered Engine, although ultimately never constructed, may seem primitive by today's standards. However, it outlined the basic elements of a modern general-purpose computer and was a breakthrough concept. Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions and a "store" for a memory of 1,000 numbers of up to 50 decimal digits each. It also contained a "mill" with a control unit that allowed processing instructions in any sequence, and output devices to produce printed results. Babbage borrowed the idea of punch cards to encode the machine's instructions from the Jacquard loom. The loom, produced in 1804 and named after its inventor, Joseph-Marie Jacquard, used punched boards that controlled the patterns to be woven.

In 1889, an American inventor, Herman Hollerith, also applied the Jacquard loom concept to computing. His first task was to find a faster way to tabulate the U.S. Census. The previous census, in 1880, had taken nearly seven years to count, and with an expanding population the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data, which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled their results in just six weeks with Hollerith's machine. In addition to their speed, the punch cards served as a storage method for data, and they helped reduce computational errors. Hollerith brought his punch card reader into the business world, founding the Tabulating Machine Company in 1896; after a series of mergers, it became International Business Machines (IBM) in 1924. Other companies, such as Remington Rand and Burroughs, also manufactured punch readers for business use. Both business and government used punch cards for data processing until the 1960s.
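This encoding can be illustrated with a short, simplified Python sketch. The punch positions and letter combinations below are hypothetical stand-ins rather than the exact historical card code; the point is simply that a digit needs one punch while a letter needs a combination of two.

    # A simplified sketch of punch-card encoding (illustrative only):
    # each card column holds one character, a digit as a single punch
    # and a letter as a combination of a zone punch and a digit punch.
    DIGITS = {str(d): {d} for d in range(10)}      # one punch per digit
    LETTERS = {                                    # two punches per letter
        "A": {12, 1}, "B": {12, 2}, "C": {12, 3},  # (first few letters only)
    }

    def encode(text):
        """Turn text into a list of per-column punch sets."""
        columns = [DIGITS.get(ch) or LETTERS[ch] for ch in text.upper()]
        assert len(columns) <= 80, "a single card holds at most 80 columns"
        return columns

    print(encode("AB3"))                           # [{1, 12}, {2, 12}, {3}]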

In the ensuing years, several engineers made other significant advances. Vannevar Bush developed a calculator for solving differential equations in 1931. The machine could solve complex differential equations that had long left scientists and mathematicians baffled, but it was cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other. To eliminate this bulkiness, John V. Atanasoff, a professor at Iowa State College, and his graduate student, Clifford Berry, envisioned an all-electronic computer that applied Boolean algebra to computer circuitry. This approach was based on the mid-19th-century work of George Boole, who codified the binary system of algebra, in which any logical proposition can be stated as either true or false. By extending this concept to electronic circuits in the form of on or off, Atanasoff and Berry had developed the first all-electronic computer by 1940. Their project, however, lost its funding, and their work was overshadowed by similar developments by other scientists.
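A brief Python sketch shows the principle at work. Treating each circuit value as simply on (True) or off (False), even arithmetic reduces to combinations of logical operations, as in this one-bit half adder:

    # A one-bit half adder built from logic alone:
    # the sum bit is an exclusive-or, the carry bit an and.
    def half_adder(a, b):
        return (a != b), (a and b)    # (sum, carry)

    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} = carry {int(c)}, sum {int(s)}")
    # 1 + 1 yields carry 1, sum 0 -- binary addition from true/false logic.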

With the onset of the Second World War, governments sought to develop computers to exploit their potential strategic importance, and this increased funding for computer development projects hastened technical progress. By 1941, German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers. In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was designed only to decode secret messages. Second, the existence of the machine was kept secret until decades after the war.

American efforts produced a broader achievement. Howard H. Aiken, a Harvard engineer working with IBM, succeeded in producing an electromechanical calculator by 1944. The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electromechanical relay computer: it used electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and inflexible (in that sequences of calculations could not change), but it could perform basic arithmetic as well as more complex equations.

Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of Philadelphia. Developed by John Presper Eckert and John W. Mauchly, ENIAC, unlike the Colossus and Mark I, was a general-purpose computer that computed at speeds 1,000 times faster than Mark I.

In the mid-1940s, John von Neumann joined the University of Pennsylvania team, initiating concepts in computer design that remained central to computer engineering for the next 40 years. Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both a stored program and data. This stored-program technique, together with "conditional control transfer," which allowed the computer to be stopped at any point and then resumed, permitted far greater versatility in computer programming. The key element of the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances. Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive early achievements was predicting the winner of the 1952 presidential election, Dwight D. Eisenhower.
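A toy Python sketch can illustrate the idea (a deliberately simplified model with an invented instruction vocabulary, not any real machine's instruction set): a single memory holds both the program and its data, and a central loop fetches and executes one instruction at a time, with a conditional jump providing the "conditional control transfer" described above.

    # A toy stored-program machine: program and data share one memory.
    memory = [
        ("LOAD", 9),        # 0: acc = memory[9]         (the program...)
        ("ADD", 10),        # 1: acc += memory[10]
        ("STORE", 9),       # 2: memory[9] = acc
        ("JUMPLT", 11, 0),  # 3: if acc < memory[11], resume at address 0
        ("HALT",),          # 4: stop
        None, None, None, None,
        0,                  # 9: running total           (...and the data)
        3,                  # 10: increment
        12,                 # 11: limit
    ]

    acc, pc = 0, 0                        # accumulator and program counter
    while True:
        op, *args = memory[pc]            # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[args[0]]
        elif op == "ADD":
            acc += memory[args[0]]
        elif op == "STORE":
            memory[args[0]] = acc
        elif op == "JUMPLT" and acc < memory[args[0]]:
            pc = args[1]                  # conditional control transfer
        elif op == "HALT":
            break

    print(memory[9])                      # 12: the loop ran to the limit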

First-generation computers were characterized by the fact that operating instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. This made the computer difficult to program and limited its versatility and speed. Other distinctive features of first generation ...
