Computers in some form are in almost everything these days. From
toasters to televisions, just about every electronic device has some form of
processor in it. This is a very large change from the way things used to be, when
a computer that took up an entire room and weighed tons had about the
same amount of power as a scientific calculator. The changes that computers
have undergone in the last 40 years have been colossal. Yet for all that has
changed from the ENIAC, which had very little power, broke down once every 15
minutes, and took another 15 minutes to repair, to our Pentium Pro 200s and the
powerful Silicon Graphics workstations, the core of the machine has stayed
basically the same. The only thing that has really changed in the processor is
the speed at which it translates commands from 1s and 0s into data that actually
means something to a normal computer user. Just in the last few years,
computers have undergone major changes. PC users moved from MS-DOS and
Windows 3.1 to Windows 95, a whole new operating system. Computer speeds have
increased enormously as well: in 1995 a normal computer was a 486
running at 33 MHz, while by 1997 a blazing fast Pentium (also known as the 586)
ran at 200 MHz and up. The next generation of processors is slated to come
out this year as well: the next CPU from Intel, code-named Merced, running
at 233 MHz and up. Another major innovation has been the Internet, a
massive change not only to the computer world but to the entire world.
The Internet has many different facets: newsgroups, where you can
choose almost any topic to discuss with a range of other people, from
university professors, to professionals in the field of your choice, to the
average person; IRC, where you can chat in real time with other people around
the world; and the World Wide Web, a mass of information networked from
places around the world. Nowadays, no matter where you look, computers are
somewhere, doing something.
Changes in computer hardware and software have taken great leaps
since the first video games and word processors. Video games started out
with a game called Pong. It was monochrome (two colors, typically amber and black, or
green and black), you had two controller paddles, and the game resembled a slow
version of air hockey. The first word processors had their roots in MS-DOS;
these were not very sophisticated, and not much better than a good typewriter at the
time. About the only benefits were the editing tools that came with the word
processors. But since these first two dinosaurs of software, both have gone
through some major changes. Video games are now set in fully 3-D
environments, and word processors now have the ability to check your grammar and
spelling.
Hardware has also undergone some fairly major changes. When computers
entered their fourth generation with the 8088 processor, the machine was just a base
computer: a physically large processor with little power, running at 3-4 MHz, and
there was no sound to speak of, other than blips and bleeps from an internal
speaker. Graphics cards were limited to two colors (monochrome), and RAM was
limited to 640K or less. By this time, though, computers had already undergone
massive changes. The first computers were massive beasts that weighed
thousands of pounds. The first computer was known as the ENIAC; it was the size
of a room, used punched cards as input, and didn't have much more power than a
calculator. The reason it was so large is that it used vacuum tubes to
process data. It also broke down very often, to the tune of once every fifteen
minutes, and then it would take another fifteen minutes to locate the problem and fix it.
This beast also used massive amounts of power, and people used to joke that the
lights would dim in its home city whenever the computer was used.
The Early Days of Computers
The very first computer, in the roughest sense of the term, was the
abacus. Consisting of beads strung on wires, the abacus was the very first
desktop calculator. The first actual mechanical computer came from an
individual named Blaise Pascal, who built an adding machine based on gears and
wheels. This invention was not improved upon significantly until a person
named Charles Babbage came along, who made a machine called the difference
engine. It is for this that Babbage is known as the "Father of the Computer."
Born in England in 1791, Babbage was a mathematician and an inventor.
He decided a machine could be built to solve polynomial equations more easily
and accurately by calculating the differences between them. The model of this
machine was named the Difference Engine. The model was so well received that he began
to build a full-scale working version, with money that he received from the
British Government as a grant.
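The principle the Difference Engine mechanized can be sketched in a few lines of modern code. The idea is that once a polynomial's starting value and its finite differences are known, every later value can be produced by repeated addition alone, with no multiplication needed while tabulating. This is an illustrative sketch, not a description of the engine's actual mechanism; the function names and the example cubic are invented here.

```python
# Method of differences, as a software sketch: seed with a polynomial's
# value and its finite differences, then generate a whole table of
# values using only addition -- the operation Babbage's gears performed.

def difference_table(poly, degree, start, steps):
    """Tabulate poly(x) for x = start, start+1, ... using only addition."""
    # Seed: the first degree+1 values, computed once up front.
    values = [poly(start + i) for i in range(degree + 1)]
    layers = [values]
    for _ in range(degree):
        prev = layers[-1]
        layers.append([b - a for a, b in zip(prev, prev[1:])])
    state = [layer[0] for layer in layers]  # current value plus its differences

    table = []
    for _ in range(steps):
        table.append(state[0])
        # "Turn the crank": each entry absorbs the difference below it.
        for i in range(degree):
            state[i] += state[i + 1]
    return table

cubic = lambda x: x**3 + 2 * x + 1
print(difference_table(cubic, 3, 0, 6))  # [1, 4, 13, 34, 73, 136]
```

Note that inside the loop only addition appears, which is exactly why a machine of gears and wheels could carry the method out.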
Babbage soon found that even the tightest design specifications could not
produce an accurate machine. The smallest imperfection was enough to throw the
tons of mechanical rods and gears out of alignment and put the entire machine out of whack.
After he had spent 17,000 pounds, the British Government withdrew its financial support.
Even though this was a major setback, Babbage was not discouraged. He came up
with another machine of wheels and cogs, which he would call the analytical
engine, and which he hoped would carry out many different kinds of calculations.
This was also never built, at least by Babbage (although a model was later put
together by his son), but the important thing is that its design manifested
five key concepts of modern computers:
- Input device
- Processor, or number calculator
- Storage unit to hold numbers waiting to be processed
- Control unit to direct the task to be performed and the sequence of calculations
- Output device
Parts of Babbage's inventions were similar to an invention built by
Joseph Jacquard. Jacquard, noting the repetitive tasks of weavers working on
looms, came up with a stiff card with a series of holes in it that let certain
threads into the loom to complete the weave while blocking others.
Babbage saw that the punched-card system could be used to control the
calculations of the analytical engine, and brought it into his machine.
Ada Lovelace is known as the first computer programmer. The daughter of the
English poet Lord Byron, she went to work with Babbage and helped develop
instructions for doing calculations on the analytical engine. Lovelace's
contributions were very great: her interest gave Babbage encouragement, she was
able to see that his approach was workable, and she published a series of notes
that led others to complete what he had envisioned.
Since 1790, the US Congress has required that a census of the population be
taken every ten years. For the census of 1880, counting took seven and a half
years because all counting had to be done by hand. There was also considerable
apprehension in official circles as to whether the counting of the next census
could be completed before the next century.
A competition was held to find some way to speed the counting process.
In the final test, involving a count of the population of St. Louis, Herman
Hollerith's tabulating machine completed the count in only five and a half hours. As a
result of his system's adoption, an unofficial count of the 1890 population was
announced only six weeks after the census was taken. Like the cards that
Jacquard used for the loom, Hollerith's punched cards were stiff paper with
holes punched at certain points. In his tabulating machine, rods passed
through the holes to complete a circuit, which caused a counter to advance one
unit. This capability pointed up the principal difference between the
analytical engine and the tabulating machine: Hollerith was able to use
electrical power rather than mechanical power to drive the device.
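The tabulating logic itself is simple enough to sketch in code. Here each card is modeled as a set of punched positions, and wherever a rod would find a hole, the matching counter advances one unit. The card fields below are invented for illustration; real Hollerith cards encoded census categories in fixed column positions.

```python
# A toy simulation of Hollerith-style tabulation. A card is a set of
# punched positions; each hole lets a rod close a circuit, advancing
# that category's counter by one unit.

from collections import Counter

def tabulate(cards):
    counters = Counter()
    for card in cards:
        for hole in card:        # a rod drops through each punched hole...
            counters[hole] += 1  # ...advancing the matching dial by one
    return counters

# Three hypothetical census cards.
cards = [
    {"male", "age_20_29", "st_louis"},
    {"female", "age_20_29", "st_louis"},
    {"male", "age_30_39", "st_louis"},
]
counts = tabulate(cards)
print(counts["st_louis"], counts["male"])  # 3 2
```

The point of the sketch is that counting becomes a purely mechanical pass over the cards, which is why the 1890 count could be finished in weeks rather than years.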
Hollerith, who had been a statistician with the Census Bureau, realized
that punched-card processing had high sales potential. In 1896, he
started the Tabulating Machine Company, which was very successful in selling
machines to railroads and other clients. In 1924, this company merged with two
other companies to form the International Business Machines Corporation, still
well known today as IBM.
IBM, Aiken & Watson
For over 30 years, from 1924 to 1956, Thomas Watson, Sr., ruled IBM with
an iron grip. Before becoming the head of IBM, Watson had worked for the
Tabulating Machine Company. While there, he had a running battle with Hollerith,
whose business talent did not match his technical abilities. Under Watson's
leadership, IBM became a force to be reckoned with in the business machine market,
first as a purveyor of calculators, then as a developer of computers.
IBM's entry into computers was started by a young man named Howard
Aiken. In 1936, after reading Babbage's and Lovelace's notes, Aiken thought
that a modern analytical engine could be built. The important difference was
that this new version of the analytical engine would be electromechanical.
Because IBM was such a power in the market, with plenty of money and resources,
Aiken worked out a proposal and approached Thomas Watson. Watson approved the
deal and gave him one million dollars with which to build this new machine, which
would later be called the Harvard Mark I, and which began the modern era of
computing.
Nothing close to the Mark I had ever been built previously. It was 55
feet long and 8 feet high, and when it processed information, it made a clicking
sound, equivalent (according to one person) to a room full of individuals
knitting with ...