History of Computer Development: the complete history of computer development from generation to generation, before and after 1940

Assalamualaikum warohmatullahi wabarokatu,


HISTORY OF COMPUTER DEVELOPMENT

History of computer development before 1940

In the era before 1940, counting tools were still very simple and manual. One early aid for tallying trade transactions was the abacus. About twelve centuries later, Blaise Pascal invented a numerical wheel calculator, which Gottfried Wilhelm von Leibniz later improved, and after that Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, offered a more practical approach to calculation because it could do addition, subtraction, multiplication, and division.


History of computer development after 1940

FIRST GENERATION COMPUTER (1940 - 1959)

First generation computers used vacuum tubes to process and store data. The tubes ran hot and burned out quickly, and thousands of them were needed to run an entire computer. These machines also consumed a great deal of electrical power, causing electrical interference in the surrounding area. First generation computers were fully electronic and helped experts solve calculation problems quickly and precisely.

SECOND GENERATION COMPUTER (1959 - 1964)

In 1948, the invention of the transistor greatly influenced the development of computers. Transistors replaced vacuum tubes in televisions, radios, and computers, shrinking the size of electronic machinery. Transistors were first used in computers in 1956. Another discovery, magnetic-core memory, helped make second generation computers smaller, faster, more reliable, and more energy efficient than their predecessors. The first machines to take advantage of this new technology were supercomputers.

IBM made a supercomputer named Stretch, and Sperry-Rand made a computer called LARC. These computers, developed for atomic energy laboratories, could handle large amounts of data. The machines were very expensive and tended to be too complex for business computing needs, which limited their popularity. Only two LARCs were ever installed and used: one at Lawrence Radiation Labs in Livermore, California, and the other at the US Navy Research and Development Center in Washington, D.C.

Second generation computers replaced machine language with assembly language, which uses short abbreviations (mnemonics) in place of binary code. In the early 1960s, successful second generation computers began to appear in business, in universities, and in government. These were computers built entirely with transistors.
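
To make that difference concrete, here is a minimal toy assembler sketched in Python; the three-instruction set (LOAD, ADD, STORE) and its 4-bit opcodes are invented for illustration and do not correspond to any real machine:

    # Toy assembler: translate mnemonic abbreviations into binary words.
    # The instruction set below is hypothetical, purely for illustration.
    OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

    def assemble(lines):
        """Turn 'MNEMONIC operand' lines into 8-bit binary machine words."""
        words = []
        for line in lines:
            mnemonic, operand = line.split()
            # 4-bit opcode followed by a 4-bit operand address
            words.append(OPCODES[mnemonic] + format(int(operand), "04b"))
        return words

    # 'ADD 3' is far easier for a person to write than '00100011'.
    print(assemble(["LOAD 2", "ADD 3", "STORE 4"]))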

THIRD GENERATION COMPUTER (1964 - EARLY 1980s)

Although transistors outperformed vacuum tubes in many ways, they still generated considerable heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. An IC combined three electronic components on a small disc made of quartz.

Scientists were later able to fit more and more components onto a single chip, called a semiconductor, and computers became smaller as their components were integrated onto the chip. Another third generation advance was the operating system, which allowed machines to run many different programs simultaneously, with a main program monitoring and coordinating the computer's memory.
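
As a rough illustration of that coordinating role, the sketch below simulates round-robin multiprogramming in Python; the jobs and time-slice scheme are invented for illustration and do not model any real operating system:

    # Toy round-robin scheduler: one coordinating loop gives each
    # program a small time slice, so several jobs appear to run at once.
    from collections import deque

    def run_round_robin(jobs, slice_units=2):
        """jobs: dict mapping a job name to the work units it still needs."""
        queue = deque(jobs.items())
        while queue:
            name, remaining = queue.popleft()
            done = min(slice_units, remaining)
            print(f"{name}: ran {done} unit(s), {remaining - done} left")
            if remaining > done:
                queue.append((name, remaining - done))  # unfinished: requeue

    run_round_robin({"payroll": 3, "inventory": 5, "report": 2})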

FOURTH GENERATION COMPUTER (EARLY 1980s - ???)

After the IC, the development goal became clearer: reducing the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components on a chip. By the 1980s, Very Large Scale Integration (VLSI) held thousands of components on a single chip, and Ultra Large Scale Integration (ULSI) pushed the number into the millions. The ability to fit so many components onto a chip half the size of a coin drove down both the price and the size of computers.

It also increased the capability, efficiency, and reliability of computers. The Intel 4004 chip, made in 1971, took the IC a step further by placing all the components of a computer (central processing unit, memory, and input/output control) on a single very small chip. Previously, ICs were made to perform one specific task. Now a microprocessor could be manufactured and then programmed to meet any desired need. Soon afterwards, everyday household devices such as microwave ovens, televisions, and cars with electronic fuel injection were equipped with microprocessors.

FIFTH GENERATION COMPUTER (FUTURE)

Many advances in computer design and technology are making the creation of fifth generation computers increasingly possible. Two major engineering advances stand out. The first is parallel processing, which will replace the von Neumann model: the single-processor design gives way to a system able to coordinate many CPUs working simultaneously. The second is superconductor technology, which allows electricity to flow with no resistance and can therefore greatly accelerate the flow of information. Japan is famous for popularizing fifth generation jargon, and the ICOT institute was formed there to realize its fifth generation computer project. Many say the project failed, but many others reported that its success would change the world. We are still waiting to see which information is valid.
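
As a small taste of the parallel processing idea, here is a minimal sketch in Python that coordinates several worker processes on one task; the workload (summing chunks of numbers across four workers) is invented purely for illustration:

    # Toy parallel processing: split one job across several CPUs and
    # combine the partial results in a coordinating process.
    from multiprocessing import Pool

    def partial_sum(chunk):
        return sum(chunk)

    if __name__ == "__main__":
        numbers = list(range(1_000_000))
        # Four interleaved chunks, each handed to a separate worker.
        chunks = [numbers[i::4] for i in range(4)]
        with Pool(processes=4) as pool:
            total = sum(pool.map(partial_sum, chunks))
        print(total)  # same answer as sum(numbers), computed in parallel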

SUMMARY
Before 1940, computing and calculating equipment was still simple and manual. After 1940, computers moved from vacuum tubes to transistors and then integrated circuits as their main components, steadily improving computer performance.

THANK YOU FOR VISITING. I HOPE THIS SIMPLE ARTICLE IS USEFUL FOR ALL OF YOU, AAMIIN !! (SHARING IS BEAUTIFUL)

WASSALAM