History of Computing – Informatics and Technology

The volume and use of computers in the world is so great that they have become difficult to ignore. Computers appear to us in so many ways that we often fail to see them for what they really are. People interacted with a computer when they bought their morning coffee from the vending machine. As they drove to work, the traffic lights that so often got in the way were controlled by computers in an attempt to speed up the commute. Whether we accept it or not, the computer has invaded our lives.

The origins and roots of computers began as have many other inventions and technologies in the past: they evolved from a relatively simple idea or plan designed to help perform functions more easily and quickly. The first basic computers were designed to do just that: calculate. They performed basic mathematical functions such as multiplication and division and displayed the results in a variety of ways. Some computers displayed their output as a binary representation on electronic lamps. Binary uses only ones and zeros, so a lit lamp represented a one and an unlit lamp represented a zero. The irony of this is that someone then had to perform another mathematical operation to translate the binary into decimal so that the user could read it.
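
The extra translation step mentioned above is easy to make concrete. The short Python sketch below is purely illustrative (no early machine ran anything like it); it reads a row of lamps as ones and zeros and converts the pattern into the decimal value an operator actually wanted:

```python
# Illustrative only: reading a row of binary lamps as a decimal number.
# A lit lamp counts as 1, an unlit lamp as 0, most significant lamp first.

def lamps_to_decimal(lamps):
    """Convert a sequence of lamp states (1 = lit, 0 = unlit) to a decimal integer."""
    value = 0
    for state in lamps:
        value = value * 2 + state  # shift one binary place left, then add the new bit
    return value

# Example: lamps showing lit, unlit, lit, lit (1 0 1 1) represent decimal 11.
print(lamps_to_decimal([1, 0, 1, 1]))  # -> 11
```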

One of the first computers was called ENIAC. It was an enormous, monstrous machine, nearly the size of a standard railway carriage. It contained electronic tubes, heavy-gauge wiring, angle irons, and knife switches, to name just a few of its components. It is hard to believe that such machines evolved into the suitcase-sized microcomputers of the 1990s.

Computers finally became less archaic-looking devices near the end of the 1960s. They had shrunk to the size of a small car and processed chunks of information at a faster rate than older models. Most computers at this time were called “mainframe computers” because many machines were connected together to perform a given function. The main users of these computers were military agencies and large corporations such as Bell, AT&T, General Electric, and Boeing; organizations like these had the funds to afford such technology. However, operating these computers required enormous amounts of expertise and manpower. The average person could not have imagined trying to operate and use these multi-million-dollar processors.

The United States was credited with the title of computer pioneer. It wasn’t until the early 1970s that nations such as Japan and the United Kingdom began to use their own technology for computer development. This resulted in newer components and smaller computers. The use and operation of computers had become something people of average intelligence could manage without much trouble. When the economies of other nations began to compete with the United States, the computer industry expanded rapidly. Prices fell dramatically, and computers became more affordable for the average household.

Like the invention of the wheel, the computer is here to stay. Almost everything that is useful in society requires some form of training or education. Many people say that the predecessor of the computer was the typewriter. The typewriter definitely required training and experience in order to operate it at a usable and efficient level. Children are taught basic computer skills in the classroom to prepare them for the future evolution of the computer age.

The history of computers begins about 2,000 years ago with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved according to programming rules memorized by the user, all ordinary arithmetic problems can be solved. Another important invention of roughly the same era was the astrolabe, used for navigation.

Blaise Pascal is generally credited with building the first digital computer, in 1642. It added numbers entered with dials and was built to help his father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a computer that was built in 1694. It could add and, after a few changes, multiply. Leibniz invented a special stepped-gear mechanism for entering the digits of the addends, a mechanism that is still in use.

The prototypes built by Pascal and Leibniz saw little use and were considered rarities until a little over a century later, when Thomas de Colmar (also known as Charles Xavier Thomas) created the first commercially successful mechanical calculator that could add, subtract, multiply, and divide. Many improved desktop calculators by many inventors followed, so that by around 1890 the range of improvements included accumulation of partial results, automatic storage and re-entry of previous results (a memory function), and printing of results. Each of these still required manual setup. These improvements were made primarily for commercial users, not for the needs of science.

While Thomas de Colmar was developing the desktop calculator, Charles Babbage (after whom the “Babbage’s” computer store is named), a professor of mathematics, initiated a series of very interesting developments in computers in Cambridge, England. In 1812, Babbage realized that many long calculations, especially those required to produce mathematical tables, were actually a series of predictable actions that were constantly repeated. From this he suspected that it should be possible to perform them automatically. He began to design an automatic mechanical calculating machine, which he called a Difference Engine. By 1822 he had a working model to demonstrate. Financial help from the British government was obtained, and Babbage began manufacture of a Difference Engine in 1823. It was intended to be steam-driven and fully automatic, including printing of the resulting tables, and commanded by a fixed instruction program.
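
The repetition Babbage noticed is what the method of finite differences exploits: for a polynomial, the differences eventually become constant, so every further table entry can be produced by additions alone. The Python sketch below is a modern illustration of that idea with assumed starting values; it is not drawn from Babbage’s own designs:

```python
# Illustrative sketch of the method of differences that a difference engine mechanizes.
# For a polynomial of degree n, the n-th differences are constant, so every new
# table value can be produced with nothing but repeated addition.

def difference_table(first_row, steps):
    """first_row: the initial table value followed by its successive differences.
    Returns the table values produced by repeated addition alone."""
    row = list(first_row)
    values = [row[0]]
    for _ in range(steps):
        # Each column absorbs the (still old) column to its right, left to right.
        for i in range(len(row) - 1):
            row[i] += row[i + 1]
        values.append(row[0])
    return values

# Example: tabulating f(x) = x**2 for x = 0, 1, 2, ...
# Starting values: f(0) = 0, first difference = 1, second (constant) difference = 2.
print(difference_table([0, 1, 2], 6))  # -> [0, 1, 4, 9, 16, 25, 36]
```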

The Difference Engine, though of limited adaptability and applicability, was truly a breakthrough. Babbage continued to work on it for the next ten years, but in 1833 he lost interest because he thought he had a better idea: the construction of what would now be called a fully program-controlled, general-purpose, automatic mechanical digital computer. Babbage called this idea an Analytical Engine. The ideas in this design showed a great deal of foresight, although this could not be appreciated until a full century later.

Plans for this engine called for a decimal computer that would operate on numbers of 50 decimal digits (or words) and would have a storage capacity (memory) of 1,000 such digits. Its built-in operations were supposed to include everything a modern general-purpose computer would need, including the all-important conditional control transfer capability, which would allow commands to be executed in any order, not just the order in which they were programmed.
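
To make the idea of conditional control transfer concrete, here is a small, purely hypothetical Python sketch: a toy interpreter whose “jump if zero” instruction lets execution leave the written order of its commands, which is exactly the capability described above. The instruction names are invented for the example and do not reflect Babbage’s notation.

```python
# Toy illustration of conditional control transfer. The instruction set is
# invented for this example only.

def run(program):
    acc = 0   # a single accumulator register
    pc = 0    # program counter: index of the next instruction to execute
    while pc < len(program):
        op, arg = program[pc]
        if op == "add":
            acc += arg
            pc += 1
        elif op == "sub":
            acc -= arg
            pc += 1
        elif op == "jump":               # unconditional transfer of control
            pc = arg
        elif op == "jump_if_zero":       # conditional transfer of control
            pc = arg if acc == 0 else pc + 1
        elif op == "halt":
            break
    return acc

# Count 3 down to 0 by looping; the instructions are not executed strictly
# in the order in which they were written.
program = [
    ("add", 3),            # 0: acc = 3
    ("jump_if_zero", 4),   # 1: when acc reaches 0, jump ahead to halt
    ("sub", 1),            # 2: acc -= 1
    ("jump", 1),           # 3: go back and test again
    ("halt", 0),           # 4: stop
]
print(run(program))  # -> 0
```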

As people can see, it took a great deal of intelligence and effort to arrive at the style and use of computers seen in the 1990s. People have assumed that computers are a natural development in society and take them for granted. Just as people have had to learn to drive a car, it also takes skill and learning to use a computer.

Computers in society have become difficult to understand. Exactly what they consisted of and what actions they performed depended largely on the type of computer. Saying that a person had a typical computer did not necessarily pin down the capabilities of that computer. The styles and types of computers covered so many different functions and actions that it was difficult to name them all. The purpose of the original computers of the 1940s was easy to define when they were first invented: mainly, they performed mathematical functions many times faster than any person could have calculated. The later evolution of the computer, however, created many styles and types that were highly dependent on a well-defined purpose.

Computers of the 1990s fell roughly into three groups: mainframes, networking units, and personal computers. Mainframe computers were extremely large and could process and store massive amounts of data in the form of numbers and words. Mainframes were the first types of computers developed, in the 1940s. Users of these computers included banking firms, large corporations, and government agencies. They were usually very expensive, but they were designed to last at least five to ten years. They also required a well-educated and experienced staff to operate and maintain them. Harry Wulforst, in his book Breakthrough to the Computer Age, describes the old mainframe computers of the 1940s in comparison with those of the 1990s by evoking “…the contrast to the sputtering engine sound that powered early flights of the Wright brothers at Kitty Hawk and the roar of powerful engines on a Cape Canaveral launch pad.” End of the first part.

Works Cited

Wulforst, Harry. Breakthrough to the Computer Age. New York: Charles Scribner’s Sons, 1982.

Palfreman, Jon, and Doron Swade. The Dream Machine: Exploring the Computer Age. London: BBC Books, 1991.

Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996.
