History of Computers from Generation to Generation

Computers today are devices that cannot be separated from everyday life. But do you know how the history of computers began? In its early days, the computer was made not to help you complete various tasks, but as a calculating machine. This goes back to the term “computing”, which means counting; a computer was originally the person or tool used to count.

With this understanding, the history of computers stretches very far back, even to the days before Christ. The seeds of the computer can be found in early human civilizations. The abacus in Europe and the suanpan in China, for example, are simple computing devices that humans used to count faster.

But a computer design with its own programming only emerged in the early 19th century. The originator was a British polymath and mechanical engineer, Charles Babbage, who conceived his computing machine in 1822. To trace the complete history of computers, from generation to generation, read the following overview.

The History of First Generation Computers
The history of modern computers begins with Charles Babbage in the early 19th century. Babbage, an engineer and mathematician, came up with the idea of an analytical engine that could automatically calculate groups of numbers. Although Babbage succeeded with the concept, the practice did not go smoothly. The problem lay in the machine’s mechanism, which had to be made by hand even though it consisted of thousands of parts.

Although it failed, Babbage’s project with the British government succeeded in showing that an automatic calculating tool was not impossible. The findings from England went on to encourage scientists, mathematicians, and mechanical engineers to create computing machines.

The emergence of the analog generation
After Babbage’s failure, many computing-machine technologies emerged. Unfortunately, many of these inventions deviated from Babbage’s concept of a computer that is programmable and usable for various computing purposes. Computers of that era are known as analog computers.

The first modern analog computer appeared in 1872. The machine, invented by Sir William Thomson, was a device for predicting ocean tides. A few years later, in 1876, James Thomson invented a modern analog computer able to solve differential equations using a mechanism of rotating wheels and discs.

The pinnacle of analog computer creation came when H. L. Hazen and Vannevar Bush built a more complex analog computer by combining Thomson’s wheel-and-disc mechanism with the torque amplifier invented by H. W. Nieman. Such devices even lasted into the 1950s for special purposes such as education and aviation, before being displaced by digital computers.

The first generation of modern computers
The history of computers then turned toward digital mechanisms. In 1938, the United States Navy succeeded in creating an electromechanical analog-based computer. This was a breakthrough because the resulting device was quite compact (it could be carried aboard a submarine) and did not require as much operating power as the early generation of analog computers.

The findings from America later gave birth to several computers using electromechanical programs, one of which was the Z3, created by the German engineer Konrad Zuse. This computer came to be called the first digital computer. The Z3 was built using the binary number system, which is simpler and more practical than the decimal system Babbage used.
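To get a feel for why binary suited machines like the Z3, whose relays have only two states (open or closed), here is a minimal Python sketch. It is purely illustrative and not based on the Z3’s actual circuitry:

```python
# The same quantity can be written in decimal or binary; binary uses
# only the symbols 0 and 1, which map directly onto two-state hardware
# such as relays (open/closed) or, later, transistors (off/on).
for n in [5, 13, 200]:
    print(f"decimal {n:>3} = binary {n:b}")

# Binary addition needs only a handful of per-bit rules
# (0+0=0, 0+1=1, 1+1=10 with a carry), which makes binary
# arithmetic circuits far simpler to build than decimal ones.
a, b = 13, 5
print(f"{a:b} + {b:b} = {a + b:b}  (decimal {a + b})")
```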

From the Z3, the story moves to Colossus, built at Bletchley Park for Max Newman’s codebreaking team by the engineer Tommy Flowers. Colossus is still recorded as the first electronically programmable digital computer. After Colossus, computer history shifted to ENIAC (Electronic Numerical Integrator and Computer). This machine was similar to Colossus, but faster and more flexible.

To operate ENIAC, users had to enter programs into the machine manually using special switches. The machine itself was giant-sized, weighing up to 30 tons and consuming about 150 kW of electricity.

The second generation, getting closer to modern computers
The concept of the modern computer was conceived by the British mathematician Alan Turing. In his paper “On Computable Numbers” (1936), Turing came up with the idea of a device he called a “universal computing machine”. Turing argued that such a machine could compute anything that is indeed computable, given a special instruction set, or program.

Turing’s idea then led to the concept of the stored-program computer, in which a special “space” in memory holds the program (the instructions for running the machine). The world’s first stored-program computer was the Manchester Baby, built by Frederic C. Williams, Tom Kilburn, and Geoff Tootill.
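To make the stored-program idea concrete, here is a minimal sketch of such a machine in Python. The three-instruction set and memory layout are invented for illustration; they are not the Manchester Baby’s actual design:

```python
# Minimal sketch of a stored-program machine: instructions and data
# share one memory, and a fetch-decode-execute loop walks through it.
memory = [
    ("LOAD", 5),   # address 0: acc = memory[5]
    ("ADD", 6),    # address 1: acc += memory[6]
    ("STORE", 7),  # address 2: memory[7] = acc
    ("HALT", 0),   # address 3: stop
    None,          # address 4: unused
    20,            # address 5: data
    22,            # address 6: data
    0,             # address 7: the result goes here
]

acc = 0  # accumulator register
pc = 0   # program counter: address of the next instruction
while True:
    op, arg = memory[pc]  # fetch and decode
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[7])  # prints 42: the program computed 20 + 22
```

The point the sketch illustrates is that the program (addresses 0 to 3) and its data (addresses 5 to 7) sit in the same memory, so the machine can be given a new task simply by loading different contents, with no rewiring required.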

After the era of stored programs came the bipolar transistor, invented by William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories. Transistors replaced the vacuum tubes used in early generation digital computers: they were lighter, more practical, required only a little electricity, and were more durable. The first use of transistors in computer history was recorded in Tom Kilburn’s project at the University of Manchester.

Computer history then moved to the use of integrated circuits (ICs). An integrated circuit is a device made of semiconductor material with all of its electrical components integrated into it. The concept was coined by Geoffrey W. A. Dummer, and it was implemented by several parties, most famously Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor).

The third and fourth generations
With the integrated circuit, computer development became increasingly well-defined. It was also more directed, namely toward computers of a smaller size. The trick was to shrink the circuits and electrical components inside the computer, which was realized through Large Scale Integration (LSI).

In 1964, Douglas Engelbart presented a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This idea made the public begin to realize that computers could also be used by people other than the scientists and mathematicians previously imagined. An example of an early portable computer is the IBM 5100, which weighed 23 kg.

In the mid-1970s, computer assembly plants began to produce computers for the general public, which came to be called minicomputers. These devices came with simple software; at the time, the most popular programs were word processors and spreadsheets. Moving into the 1980s, such computers began to be equipped with video games for entertainment. One of the most popular was the Atari 2600.

In 1981, IBM began introducing the personal computer (PC) for independent use in offices and homes. By 1991, PC sales from IBM had reached 65 million units. Along with the PC’s growing popularity, innovation continued: fourth generation computers kept shrinking in size while the capability of their software kept increasing.

Today, computers have begun to enter the fifth generation, marked by the use of artificial intelligence on computers. This system cannot yet be fully implemented, because many shortcomings remain. Even so, that does not mean fifth generation computers will not become a reality in the near future. So, it turns out the device in front of you has a long history. Hopefully the history of computers above provides you with some new knowledge.