What is a Computer?
A computer is an electronic machine used to input, store, process, and output data.
History of Computers
Necessity is the mother of invention. The saying holds true for computers as well, because computers were invented as a result of man's search for fast and accurate calculating devices.
The earliest device that qualifies as a digital computer is the "abacus", also known as the "soroban". This device permits the user to represent numbers by the position of beads on a rack. Simple addition and subtraction can be carried out rapidly and efficiently by positioning the beads appropriately.
Babbage Engine 1842
Charles Babbage, a nineteenth-century professor at Cambridge University, is considered to be the father of modern digital computers. In 1842 Babbage came out with his idea of an analytical engine that was intended to be completely automatic. It was to be capable of performing the basic arithmetic functions for any mathematical problem, and it was to do so at an average speed of 60 additions per minute.
Mark I Computers 1937-44
The first fully automatic calculating machine was designed by Howard A. Aiken of Harvard University in collaboration with IBM (International Business Machines) Corporation. Although this machine proved to be extremely reliable, it was very complex in design and huge in size. It used over 3,000 electrically actuated switches to control its operations.
The Atanasoff- Berry Computers 1939-42
This machine was developed by Dr. John Atanasoff to solve certain mathematical equations. It was also called ABC. It used 45 vacuum tubes for internal logic and capacitors for storage.
The ENIAC 1943-46
The Electronic Numerical Integrator and Calculator (ENIAC) was the first all-electronic computer. It was constructed at the Moore School of Engineering of the University of Pennsylvania, USA, by a design team led by J. Presper Eckert and John Mauchly.
ENIAC was developed as a result of military need. It took up the wall space of a 20 × 40 foot room and used 18,000 vacuum tubes. The addition of two numbers was achieved in 200 microseconds, and multiplication in 2,000 microseconds.
The EDVAC 1946-52
The operation of ENIAC was seriously handicapped by its wiring board. This problem was later overcome by the new concept of the "stored program" developed by Dr. John von Neumann. The basic idea behind the stored-program concept is that a sequence of instructions, as well as data, can be stored in the memory of the computer for the purpose of automatically directing the flow of operations. The stored-program feature considerably influenced the development of modern digital computers, and because of this feature we often refer to modern digital computers as stored-program digital computers.
The Electronic Discrete Variable Automatic Computer (EDVAC) was designed on the stored-program concept.
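The stored-program idea above can be illustrated with a minimal sketch: instructions and data share one memory, and a control loop fetches and executes instructions in sequence. The tiny instruction set here (LOAD, ADD, HALT) is hypothetical, invented purely for illustration and not modeled on EDVAC's actual instruction set.

```python
def run(memory):
    """Fetch-decode-execute loop over a single shared memory."""
    pc = 0  # program counter: address of the next instruction
    while True:
        op, a, b = memory[pc]
        pc += 1
        if op == "LOAD":      # memory[a] <- constant b
            memory[a] = b
        elif op == "ADD":     # memory[a] <- memory[a] + memory[b]
            memory[a] = memory[a] + memory[b]
        elif op == "HALT":
            return memory

# Program and data occupy one memory: cells 0-3 hold instructions,
# cells 4-5 hold data once the LOAD instructions have run.
program = [
    ("LOAD", 4, 2),   # store 2 at address 4
    ("LOAD", 5, 3),   # store 3 at address 5
    ("ADD",  4, 5),   # memory[4] = 2 + 3
    ("HALT", 0, 0),
    None, None,       # data cells
]
result = run(program)
print(result[4])  # → 5
```

Because the program itself sits in memory, the machine can move from one operation to the next automatically, with no manual rewiring between problems; this is exactly the advance over ENIAC described above.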
The EDSAC 1947-49
Almost simultaneously with the EDVAC in the USA, a group of British scientists at the University of Cambridge developed the Electronic Delay Storage Automatic Calculator (EDSAC). The machine executed its first program in May 1949. In this machine, addition was accomplished in 1,500 microseconds, and multiplication in 4,000 microseconds.
Manchester Mark I 1948
This computer was a small experimental machine based on the stored-program concept. It was designed at Manchester University by a group of scientists. Its storage capacity was only 32 words, each of 31 binary digits. This was too limited to store data and instructions. Hence the Manchester Mark I was hardly of any practical use.
The UNIVAC 1951
The Universal Automatic Computer (UNIVAC) was the first digital computer which was not “one of a kind”. Many UNIVAC machines were produced, the first of which was installed in the Census Bureau in 1951 and was used continuously for 10 years. The first business use of a computer UNIVAC I was by General Electric Corporation in 1954.
In 1952 IBM introduced computers of its own and eventually sold over 1,000 of them.
The Computer Generations
“Generation” in computer talk is a step in technology. It provides a framework for the growth of the computer industry. Originally, the term “generation” was used to distinguish between varying hardware technologies. But nowadays, it has been extended to include both the hardware and the software which together make up an entire computer system.
First Generation Computers 1942-55
Some of the early computers, such as ENIAC, EDVAC, and EDSAC, belong to the first generation.
These machines and others of their time were made possible by the invention of the "vacuum tube", a fragile glass device that could control and amplify electronic signals. These vacuum-tube computers are referred to as first generation computers.
Second Generation Computers 1955-64
The transistor, a smaller and more reliable successor to the vacuum tube, was invented in 1947. However, computers that used transistors were not produced in quantity until over a decade later. The second generation emerged with transistors serving as the brain of the computer.
Third Generation Computers 1964-75
Advances in electronics technology continued, and the advent of "microelectronic" technology made it possible to integrate large numbers of circuit elements onto a very small surface of silicon known as a "chip". This new technology was called "integrated circuits" (ICs). The third generation was based on IC technology, and the computers designed with the use of integrated circuits were called third generation computers.
Fourth Generation Computers 1975-Onwards
Initially, integrated circuits contained only about ten to twenty components. This technology was named small scale integration (SSI). Later, with advancements in the technology for manufacturing ICs, it became possible to integrate up to a hundred components on a single chip. This technology came to be known as medium scale integration (MSI). Then came the era of large scale integration (LSI), when it was possible to integrate over 30,000 components onto a single chip. Efforts at further miniaturization continue, and it is expected that more than one million components will be integrated on a single chip, known as very large scale integration (VLSI).