Computer evolution


Among the most important technologies in human history, the computer is perhaps the most recent invention. The need for calculation, however, is ancient: already the first agricultural civilizations of Mesopotamia faced volumes of numbers and records too large to hold and tally in the head. It was then that the first forerunner of the computer appeared - the abacus.

Take the adding machine created by Pascal back in the 17th century: it was, in essence, a mechanized abacus. Gears with a fixed 1:10 ratio, each making a certain number of revolutions, made it possible to add numbers five to eight digits long. A little later the Leibniz mechanism appeared, capable of performing all four basic arithmetic operations.

It was arranged more intricately: each digit was represented by a stepped cylinder which, having completed ten revolutions, returned to its original position and passed one revolution on to the next cylinder - the same way odometers work in modern cars. The same movements performed in reverse order allowed subtraction, while additional mechanisms that automated repeated addition and subtraction provided multiplication and division.
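
As a rough sketch (in Python, purely to illustrate the mechanism), the odometer-style carry works like this: each decimal digit is a wheel, and a wheel that completes a full turn advances its neighbour by one step.

```python
# A minimal sketch of the odometer-style carry that the stepped drums
# implemented mechanically: each decimal digit is a wheel, and a wheel
# passing from 9 back to 0 advances the next wheel by one step.

def add_on_wheels(digits, addend):
    """Add a small number to a register of decimal wheels.

    `digits` holds decimal digits, least significant first,
    e.g. [7, 4, 1] represents 147.
    """
    carry = addend
    for i in range(len(digits)):
        total = digits[i] + carry
        digits[i] = total % 10   # the wheel's new position
        carry = total // 10      # completed revolutions move the next wheel
    return digits

register = [9, 9, 0]             # the number 99 on three wheels
add_on_wheels(register, 5)       # 99 + 5
print(register)                  # [4, 0, 1], i.e. 104
```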

Leibniz himself said that simple counting "is not worth the attention and time of a worthy person, since any peasant is capable of performing the same work with the same precision if he uses a machine." Yet the first documented use of the word "computer" (Richard Braithwaite, 1613) referred not to a machine but to a profession. In those years, the real "computers" were people skilled in arithmetic - and this remained true until the middle of the 19th century, when mechanisms began, little by little, to replace them. In the 1890s the word "computer" entered the Oxford English Dictionary - now as a mechanical device.

However, almost all adding machines of that time were merely more advanced, cheaper and more reliable versions of Leibniz's machine: they still did not free people from manual labor in computation. Most practical tasks - be it calculating the ballistic trajectory of a projectile or the load on a railway bridge support - require entering, processing and reading out tens, hundreds or thousands of numbers. Computing consumed enormous effort and resources, and to truly free "worthy people" from the humiliating work of a human "computer", a machine was needed that could perform arbitrary calculations and had memory, input and output devices.

The first to conceive of such a universal mechanism was Charles Babbage, who in the 1820s-1840s worked on a difference engine for tabulating functions by means of polynomials. That hugely complex system of tens of thousands of parts was never fully built in his lifetime; only for the 200th anniversary of Babbage's birth was it assembled in Great Britain (proving the correctness of the engineer's calculations), along with the primitive printer he had designed.

The idea of Babbage's universal machine - the Analytical Engine, unbuildable with the technology of the day - made a deep impression on contemporaries. Already in the middle of the 19th century, Countess Ada Lovelace described how such a mechanism would work, introduced the concepts of the algorithm and the loop, and became the first programmer of a computer that did not yet exist. It would not, however, be a long wait.

Electromechanical

Towards the end of that same 19th century, the US government faced rapid population growth, driven mainly by the influx of migrants from Europe. The law prescribed a census every 10 years, but already in 1880 so many questionnaires were collected that processing them by hand took seven years. Meticulous statisticians calculated that the 1890 census would take more than 10 years to process - the volume was growing like a snowball. It was to handle this processing that the engineer Herman Hollerith created a tabulating machine that used punched cards. Holes corresponding to the answers in a questionnaire let thin, flexible wires pass through the card and touch conducting cells below - electrodes of liquid mercury. Closing a contact caused a small motor to turn the corresponding counter wheel one step, registering the result.

By wiring the electrodes into combined circuits, it was possible to perform addition and combinatorial tallies: for example, finding the total number of married men. It was a big step forward - no longer a mechanical but an electromechanical computer. Hollerith's tabulators made it possible to process data an order of magnitude faster; they were purchased even by the government of Tsarist Russia, which used them for the 1897 census. The company Hollerith founded, Computing-Tabulating-Recording (CTR), developed and produced ever more complex tabulators, and in 1924 it adopted a new, now familiar name: International Business Machines, or simply IBM.
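
In modern terms, a tabulator run can be sketched in a few lines of code: each card is the set of positions punched in it, every closed circuit advances a counter, and wiring two contacts in series tallies a conjunction. This is only an illustration, and the field names are invented.

```python
# An illustrative sketch of electromechanical tabulation: each census
# card is the set of its punched positions; a closed circuit advances a
# counter, and two contacts wired in series count a conjunction.

from collections import Counter

cards = [                          # hypothetical questionnaire data
    {"male", "married"},
    {"female", "married"},
    {"male", "single"},
    {"male", "married"},
]

counters = Counter()
for card in cards:
    for hole in card:                    # one counter per punched position
        counters[hole] += 1
    if {"male", "married"} <= card:      # series circuit: both contacts closed
        counters["married men"] += 1

print(counters["male"], counters["married men"])   # 3 2
```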

The company's products were hugely successful, but their capabilities soon proved insufficient. Industrialization and the First World War, the rapid growth of factories and cities, of science and transport, demanded ever more computing power. Electromechanical systems grew and became more complex: the Mark I machine, built by that same IBM for the US Navy and completed in 1944, filled a large room and was exceedingly difficult to manage and maintain.

It used dozens of punched tapes and millions of possible connections, but the main innovation was the electromechanical relay. A relay is essentially a switch that blocks or passes current (turning that same counter wheel) depending on whether current flows in a second, control circuit. The stage was set for logic.

Electrical

By combining such switches, you can build logic gates that perform computations. Say we need to add five and six. In binary this means summing 0101 and 0110 bit by bit, according to the rules 0 + 0 = 0, 0 + 1 = 1 + 0 = 1, 1 + 1 = 10. Two logic gates suffice: the first supplies current (1) when exactly one of the two bits being added is 1, and in our case yields 0011; the second fires only on 1 and 1, and its output, shifted one place to the left as a carry, corresponds to 1000. The two circuits operating together give 1011 - or 11 in decimal.
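
Here is the same 5 + 6 calculation as a small sketch in code, with the two gates the paragraph describes (XOR for the bitwise sum, AND for the carry) and the carry fed back in until it disappears:

```python
# The two "circuits" from the text as Boolean operations: XOR fires when
# exactly one input bit is 1, AND fires only on 1 and 1; the AND output,
# shifted one place left, is the carry.

def add_bits(a, b):
    partial = a ^ b                    # first gate: 0101 ^ 0110 -> 0011
    carry = (a & b) << 1               # second gate, shifted: -> 1000
    if carry == 0:
        return partial
    return add_bits(partial, carry)    # a carry can itself produce a carry

result = add_bits(0b0101, 0b0110)      # 5 + 6
print(f"{result:04b} = {result}")      # 1011 = 11
```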

From an everyday point of view this is not very convenient, but for a computer it is exactly what is needed. Punched cards, magnetic tapes or memory cells can serve as the carriers of zeros and ones, and the "switches" as the logic elements. And by the point in the story we have now reached, those switches had evolved to become fully electric.

Indeed, all 3,500 mechanical relays of the Mark I required physical switching, with contacts closing and opening again. As a result they had limited endurance and needed replacement after roughly 50,000 switching cycles. This also limited speed: the machine could perform only three additions or subtractions per second. Finally, a mechanical solution is extremely unreliable: an ordinary insect that crept into the system could disrupt its work - which happened every now and then, helping to popularize the modern word "bug". Unsurprisingly, engineers soon turned their attention to another way of obtaining a controllable switch - the vacuum tube - which turned electromechanical systems into fully electrical ones.

Such devices had been created back in the 1900s: a vacuum tube contains electrodes, one of which, heated by an applied current, begins to emit electrons, which rush toward the oppositely charged electrode. A third electrode installed between them can control this flow: a negative voltage applied to it blocks the movement of electrons, while a positive one assists it.

Vacuum tubes were much more reliable and faster than mechanical relays; they could switch hundreds or thousands of times per second and lasted longer. They were widely used in audio amplifiers, where a weak current in the control circuit switched a more powerful working circuit, thereby amplifying the signal. But where a household amplifier needed one tube, a computer needed hundreds - fragile, expensive, power-hungry, and requiring regular replacement.

At the same time, even the first tube computers - such as Colossus, which broke the ciphers of German military teleprinter traffic during the Second World War - quickly crossed the threshold of a thousand tubes. And to carry out each new calculation, the system had to be rewired from scratch, combining the tube logic gates in a new way.

This process was streamlined only in the next machine - ENIAC, completed by 1945 and used in the development of thermonuclear weapons. It was the first general-purpose electronic computer, capable of performing about 5,000 addition operations per second. Nevertheless, it became obvious that a fundamentally different way of building switches was needed: the age of the transistor was approaching.

Electronic

Credit for the creation of the semiconductor transistor belongs to William Shockley and his colleagues at Bell Laboratories. These are, in essence, the same switches - distant descendants of the mechanical and tube systems - but operating on a far smaller scale.

To understand how they work, we must descend once more to the atomic scale. Silicon - one of the most abundant elements in the Earth's crust - forms a crystal lattice with the properties of a semiconductor. In pure silicon, all four electrons in each atom's outer shell are shared with the neighboring atoms of the lattice.

Bound in place, these electrons cannot move, so a flawless silicon crystal conducts no current. However, introducing even small amounts of impurities (doping) from elements with a different number of outer electrons (for example, boron) creates free charge carriers in the lattice - extra electrons, or vacancies (holes) that electrons tend to occupy. The result is a material with electron (N-type) or hole (P-type) conductivity.

Miniature

Now imagine that by careful doping we have turned a small piece of pure silicon into an N-type semiconductor with a thin strip of P-type conductivity dividing it in half. Excess electrons from the N-regions occupy the nearest holes in the P-region, creating a zone of excess negative charge. That zone prevents the further movement of electrons, blocking the flow of current - like the third, control electrode in a vacuum tube. But if a positive voltage is applied to the P-region, it draws off the excess electrons and lets the current flow.

We have obtained the same switch again, but one that is incredibly compact and fast, energy-efficient and essentially wear-free. By combining silicon NPN or PNP transistors, you can build any logic circuit for ultra-fast calculation, packing billions of transistors and the contacts between them into a tiny volume. All that remains is to manufacture them.
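
A rough sketch of that universality: if a transistor is modeled as a switch that conducts only when its control input is high, two switches in series give a NAND gate, and NAND alone is enough to compose every other gate.

```python
# A toy model of transistor logic: NAND is two N-type switches in
# series (output pulled low only when both inputs conduct), and every
# other gate can be composed from NAND alone.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return nand(nand(a, b), nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))   # XOR truth table: 0 1 1 0
```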

Modern technologies for producing semiconductor microcircuits are more precise than a jeweler's and demand more than surgical cleanliness. Temperatures, brought at some stages to 1,500 °C, are controlled to tenths of a degree, and dust in the air of the huge production halls must not exceed five particles per liter of volume. Only in this way can sufficient precision be achieved and ever more transistors be placed on a chip - from 2,300 on the revolutionary Intel 4004 microprocessor of 1971, to 3.1 million on the 1993 Intel Pentium, to hundreds of millions in each of the ten cores of a modern Xeon processor.

High purity is also required of the main raw material - quartz sand, which is calcined in the presence of magnesium for additional purification and removal of oxygen. The resulting silicon is melted, and a seed - a tiny crystal - is dipped into the melt and slowly drawn upward, growing new atomic layers until a single crystal of sufficient size is obtained. Slicing it yields thin wafers of pure semiconductor - less than a millimeter thick - which, after polishing and further processing, become the blanks into which a whole system of transistors and interconnects, the microcircuit of the future processor, will be "cut".

To do this, the silicon (a semiconductor) is coated with a layer of silicon oxide (an insulator) and a photoresist. Under ultraviolet light the photoresist hardens, and in the remaining areas it is later washed away, allowing the insulating oxide layer beneath to be removed. The process resembles old engraving techniques, in which ink was retained only in the grooves scratched into a metal plate, forming a ready pattern for a print. It is called photolithography, although here the "grooves" form a nanometer-scale drawing of the finest circuitry.

Pre-prepared stencils (photomasks) are used for this, transmitting ultraviolet light in some areas and blocking it in others. In the same way, other layers are applied that contain boron or other dopants to form the NPN junctions, and copper or other metals for the future contacts.

The stencils are much larger than the future processor, so the beams of radiation passing through them are focused onto a tiny area by special optics. Already in the 1980s the precision of such systems had been brought to micrometers, and modern technology shrinks the stencil's picture many times over as it is transferred onto the silicon crystal, down to features of about 10 nanometers.

The progress is impressive and still roughly follows the rule that Gordon Moore, one of the founders of Intel, formulated at the dawn of silicon electronics: the performance of microcircuits doubles about every 18 months (in Moore's original formulation, the number of transistors on a chip doubles roughly every two years). Other fields of technology have long and fruitlessly envied this pace (imagine if the speed of transportation grew at a similar rate!), but it will probably come to an end soon.
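
A quick back-of-the-envelope check of that pace, using the transistor counts quoted above (2,300 in 1971 and 3.1 million in 1993):

```python
# How fast did transistor counts actually double between the Intel 4004
# (2,300 transistors, 1971) and the first Pentium (3.1 million, 1993)?

import math

doublings = math.log2(3_100_000 / 2_300)        # about 10.4 doublings
months = (1993 - 1971) * 12 / doublings
print(f"{doublings:.1f} doublings, one every {months:.0f} months")
# -> roughly one doubling every 25 months, close to the two-year pace
```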

Future

Indeed, less well known is Moore's other observation: the cost of each next step in miniaturizing silicon chips grows almost as fast as their performance. In recent years this has caused chip development to lag behind the accustomed pace, and engineers have come very close to the physical limit. Transistors only hundreds or even tens of atoms across already behave as quantum systems, producing random effects that corrupt the accuracy of calculations - so the acceleration of supercomputers increasingly relies not on the power of individual chips, but on large numbers of them working together.

However, even the decline of the silicon transistor will not mean that we have reached the limit of computing as such. Some experts see great promise in DNA computation, which pairs nucleotides in nucleic acid chains: in theory, it offers enormous performance on the many problems that require parallel computing. Even more hope rests on quantum computers, which can harness the very same random, strange effects of quantum mechanics from which over-miniaturized silicon chips suffer. The first of them have already begun to work - but they deserve a story of their own.
