Friday, May 27, 2011

Principles of Computer Technology.


In 1670 the German philosopher and mathematician Gottfried Wilhelm Leibniz improved on Blaise Pascal's earlier adding machine and invented one that could also multiply.
The French inventor Joseph Marie Jacquard, in designing an automatic loom, used thin perforated wooden plates to control the weaving of complex designs. During the 1880s the American statistician Herman Hollerith conceived the idea of using punched cards, similar to Jacquard's plates, to process data. Hollerith managed to compile statistical information for the 1890 United States census by means of a system of punched cards that passed over electrical contacts.
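As a rough illustration of the tabulating idea, here is a minimal sketch in Python; it is a conceptual model only, and the hole positions and field names below are invented for illustration, not Hollerith's actual card format.

```python
from collections import Counter

# Conceptual model of punched-card tabulation (invented layout, not
# Hollerith's real card format): each card is the set of positions
# where a hole is punched; a hole lets a contact close, which
# advances the counter assigned to that position.
cards = [
    {"male", "age_20_29"},
    {"female", "age_30_39"},
    {"male", "age_30_39"},
]

counters = Counter()
for card in cards:
    for hole in card:        # each hole closes one electrical contact...
        counters[hole] += 1  # ...which increments the matching counter

print(counters["age_30_39"])  # 2 cards were punched in this position
```

The point of the mechanism was that counting happened automatically as cards passed over the contacts, instead of clerks tallying census returns by hand.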
Also in the nineteenth century, the British mathematician and inventor Charles Babbage worked out the principles of the modern digital computer. He invented a series of machines, such as the Difference Engine, designed to solve complex mathematical problems. Many historians consider Babbage and his associate, the British mathematician Augusta Ada Byron (1815-1852), daughter of the English poet Lord Byron, the true inventors of the modern digital computer. The technology of their time was not capable of putting their sound concepts into practice, but one of his inventions, the Analytical Engine, already had many of the features of a modern computer. It included an input stream in the form of a deck of punched cards, a memory for storing data, a processor for mathematical operations, and a printer for producing a permanent record.
Analog computers began to be built in the twentieth century. The first models performed calculations by means of rotating shafts and gears. With these machines, numerical approximations could be evaluated for equations too difficult to solve by any other method. During the two world wars, analog computing systems, first mechanical and later electrical, were used to predict the trajectory of torpedoes in submarines and for the remote control of bombs in aviation.
During the Second World War (1939-1945), a team of scientists and mathematicians working at Bletchley Park, north of London, created what was considered the first fully electronic digital computer: the Colossus. By December 1943 the Colossus, which incorporated 1,500 valves or vacuum tubes, was already operational. It was used by the team led by Alan Turing to decode encrypted German radio messages. In 1939, independently of this project, John Atanasoff and Clifford Berry had already built a prototype electronic machine at Iowa State College (USA). This prototype and the subsequent research were carried out in obscurity and were later eclipsed by the development of the Electronic Numerical Integrator and Computer (ENIAC) in 1946.
The ENIAC, which was shown to be based largely on the Atanasoff-Berry Computer (ABC), was granted a patent that lapsed in 1973, several decades later.
The ENIAC contained 18,000 vacuum tubes and achieved a speed of several hundred multiplications per minute, but its program was wired into the processor and had to be changed manually.
A successor to the ENIAC was built with a stored program, based on the concepts of the Hungarian-American mathematician John von Neumann. The instructions were stored in a so-called memory, which freed the computer from the speed limitations of the paper-tape reader during execution and made it possible to solve new problems without rewiring the computer.
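To make the stored-program idea concrete, here is a minimal sketch in Python of a toy machine in which instructions and data share the same memory; the instruction format and opcodes are invented for illustration and do not correspond to any historical machine.

```python
# A toy stored-program machine: instructions and data share one memory,
# so changing the program means writing new memory values, not rewiring.
# The three-field instruction format and opcodes are invented for this sketch.

memory = [
    ("LOAD", 0, 10),   # reg0 <- memory[10]
    ("LOAD", 1, 11),   # reg1 <- memory[11]
    ("ADD", 0, 1),     # reg0 <- reg0 + reg1
    ("STORE", 0, 12),  # memory[12] <- reg0
    ("HALT", 0, 0),
    None, None, None, None, None,  # unused cells
    6, 7,              # data: two operands at addresses 10 and 11
    0,                 # address 12 will hold the result
]

registers = [0, 0]
pc = 0  # program counter: address of the next instruction

while True:
    op, a, b = memory[pc]  # fetch and decode from memory
    pc += 1
    if op == "LOAD":
        registers[a] = memory[b]
    elif op == "STORE":
        memory[b] = registers[a]
    elif op == "ADD":
        registers[a] += registers[b]
    elif op == "HALT":
        break

print(memory[12])  # prints 13, the stored result of 6 + 7
```

Because the program is just another pattern of values in memory, changing what the machine does means writing new values rather than rewiring hardware, which is precisely what the stored-program design made possible.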
In the late 1950s the use of the transistor in computers marked the advent of logical elements that were smaller, faster, and more versatile than valve-based machines allowed. Because transistors use much less energy and have a longer useful life, their development made possible more refined machines, which came to be called second-generation computers. The components, and the spaces between them, became smaller, so that manufacturing the systems became cheaper.
In the late 1960s the integrated circuit (IC) appeared, making it possible to fabricate multiple transistors on a single silicon substrate to which the interconnecting wires were soldered. The integrated circuit brought further reductions in price, size, and error rates.
The microprocessor became a reality in the mid-1970s with the introduction of the large-scale integration circuit (LSI, for Large Scale Integrated) and, later, the very-large-scale integration circuit (VLSI, for Very Large Scale Integrated), with several thousand interconnected transistors soldered onto a single silicon substrate.
