Comparative characteristics of computer generations

| Comparison parameter | First generation | Second generation | Third generation | Fourth generation |
| --- | --- | --- | --- | --- |
| Time period | | | | |
| Element base (of the CU and ALU) | Electronic (vacuum) tubes | Semiconductors (transistors) | Integrated circuits | Large-scale integrated circuits (LSI) |
| Main computer type | | Small (mini) | | |
| Basic input devices | Control console, punched-card and punched-tape input | | Alphanumeric display, keyboard | Color graphic display, scanner, keyboard |
| Main output devices | Alphanumeric printing unit (ATsPU), punched-tape output | | Plotter, printer | |
| External memory | Magnetic tapes, drums, punched tapes, punched cards | | Punched tape, magnetic disk | Magnetic and optical disks |
| Key software solutions | Universal programming languages, translators | Batch operating systems, optimizing translators | Interactive operating systems, structured programming languages | User-friendly software, network operating systems |
| Computer operating mode | Single-program | Batch | Time-sharing | Personal work and network processing |
| Purpose of using the computer | Scientific and technical calculations | Technical and economic calculations | Management and economic calculations | Telecommunications, information services |


Over the course of 50 years, several generations of computers have appeared, each replacing the previous one. The rapid development of computing technology throughout the world has been driven by advances in the element base and in architectural solutions.
Since a computer is a system comprising hardware and software, a generation is naturally understood as a set of computer models characterized by the same technological and software solutions (element base, logical architecture, software). At the same time, in a number of cases it proves very difficult to classify computing equipment by generation, because the line between generations becomes ever more blurred.
First generation.
Element base: vacuum tubes and relays; main memory was built on flip-flops, later on ferrite cores. Reliability was low, a cooling system was required, and the machines were very large. Performance: 5-30 thousand arithmetic operations per second. Programming was done in machine code; later, autocodes and assemblers appeared. Programming was practiced by a narrow circle of mathematicians, physicists, and electronics engineers. First-generation computers were used mainly for scientific and technical calculations.

Second generation.
Semiconductor element base. Reliability and performance increased significantly, while size and power consumption decreased. Input/output facilities and external memory developed. A number of progressive architectural solutions and further advances in programming technology appeared: the time-sharing mode and the multiprogramming mode (overlapping the central processor's data processing with the work of the input/output channels, and parallelizing the fetching of commands and data from memory).
Within the second generation, the differentiation of computers into small, medium, and large became clearly apparent. The range of problems solved by computers expanded considerably: planning and economic tasks, control of production processes, and so on.
Automated control systems (ACS) were created for enterprises, entire industries, and technological processes (APCS). The end of the 1950s saw the emergence of a number of high-level, problem-oriented programming languages (HLLs): FORTRAN, ALGOL-60, and others. Software development continued with libraries of standard programs in various programming languages and for various purposes, and with monitors and dispatchers for controlling computer operating modes and planning resources; these laid the conceptual groundwork for the operating systems of the next generation.

Third generation.
Element base: integrated circuits (ICs). Series of computer models appeared that were software-compatible from the bottom up and had growing capabilities from model to model. The logical architecture of computers and their peripheral equipment became more complex, which significantly expanded their functional and computing capabilities. Operating systems (OS) became part of the computer; many tasks of managing memory, input/output devices, and other resources were taken over by the OS or directly by the computer's hardware. Software became more powerful: database management systems (DBMS) and computer-aided design (CAD) systems for various purposes appeared, and automated control systems and process control systems were improved. Much attention was paid to creating application program packages (APPs) for various purposes.
Programming languages and systems continued to develop. Examples: the IBM/360 series of models (USA, serial production since 1964) and the ES computers (USSR and CMEA countries, since 1972).
Fourth generation.
Large-scale (LSI) and very-large-scale (VLSI) integrated circuits became the element base. Computers were now designed for the efficient use of software (for example, UNIX-like computers, best immersed in the UNIX software environment, and Prolog machines oriented toward artificial intelligence tasks) and of modern high-level languages. Telecommunication-based information processing gained momentum thanks to improved communication channels, including satellite links. National and transnational information and computing networks were created, allowing us to speak of the beginning of the computerization of human society as a whole.
Further intellectualization of computing technology is determined by the creation of more advanced human-computer interfaces, knowledge bases, expert systems, parallel programming systems, and so on.
The element base made it possible to achieve great success in miniaturization and in increasing the reliability and performance of computers. Micro- and minicomputers appeared that surpassed the capabilities of medium and large computers of the previous generation at a much lower cost. VLSI-based processor manufacturing technology accelerated the pace of computer production and brought computers to the broad masses of society. With the advent of a universal processor on a single chip (the Intel 4004 microprocessor, 1971), the era of the PC began.
The Altair-8800, based on the Intel 8080 and created in 1974 by E. Roberts, can be considered the first PC. P. Allen and W. Gates created a translator for the popular Basic language for it, significantly increasing the intelligence of the first PC (they later founded the famous Microsoft Inc.). The face of the fourth generation is largely defined by the creation of supercomputers, characterized by high performance (average speed of 50-130 megaflops; 1 megaflops = 1 million floating-point operations per second) and non-traditional architecture (parallelization based on pipelined command processing). Supercomputers are used for problems of mathematical physics, cosmology and astronomy, the modeling of complex systems, and so on. Since powerful computers play, and will continue to play, an important switching role in networks, network problems are often discussed together with supercomputer questions. Domestic supercomputers include the Elbrus series machines and the PS-2000 and PS-3000 computing systems, which contain up to 64 processors controlled by a common instruction stream; on a number of tasks their speed reached about 200 megaflops. At the same time, given the complexity of developing and implementing modern supercomputer projects, which require intensive fundamental research in computer science and electronic technology, a high production culture, and serious financial outlays, it seems very unlikely that domestic supercomputers matching the main characteristics of the best foreign models will be created in the foreseeable future.
It should be noted that with the transition to IC technology in computer production, the defining emphasis of a generation shifts increasingly from the element base to other indicators: logical architecture, software, the user interface, application areas, and so on.
Fifth generation.
It originates in the depths of the fourth generation and is largely determined by the results of the work of the Japanese Committee for Scientific Research in the field of computers, published in 1981. According to this project, fifth-generation computers and computing systems, in addition to high performance and reliability at lower cost (fully provided by VLSI and other state-of-the-art technologies), must satisfy the following qualitatively new functional requirements:

· ensure ease of use of computers by implementing voice input/output systems, interactive processing of information in natural languages, and capabilities for learning, associative constructions, and logical inference;

· simplify the creation of software by automating the synthesis of programs from specifications of the initial requirements stated in natural languages;

· improve the main characteristics and operational qualities of computing technology to meet various social tasks; improve the cost-benefit ratio and the speed, lightness, and compactness of computers; ensure their diversity, high adaptability to applications, and operational reliability.

Given the complexity of the tasks set for the fifth generation, it is quite possible to break it down into more tangible and better-defined stages, the first of which has been largely realized within the present fourth generation.

The textbook consists of two sections: theoretical and practical. The theoretical part outlines the foundations of modern informatics as a complex scientific and technical discipline, including the structure and general properties of information and information processes, the general principles of constructing computing devices, the organization and functioning of information and computer networks, and computer security, and presents the key concepts of algorithmization and programming, databases, and DBMSs. Self-examination questions and tests are offered to check the theoretical knowledge gained. The practical part covers the basic procedures for working with the Microsoft Word word processor, the Microsoft Excel spreadsheet editor, the Microsoft PowerPoint presentation program, archiving programs, and anti-virus programs. To consolidate the practical course, independent work is proposed at the end of each section.


In accordance with the element base and the level of software development, four real generations of computers are distinguished, a brief description of which is given in table 1.

Table 1



The computers of the first generation had a low speed of several tens of thousands of operations per second. Ferrite cores were used as internal memory.

The main disadvantage of these computers was the mismatch between the speed of the internal memory and that of the ALU and control unit, caused by their different element bases. Overall performance was determined by the slower component, the internal memory, which reduced the overall effect. Already in first-generation computers, attempts were made to eliminate this drawback by making devices operate asynchronously and introducing output buffering, in which the information being transmitted is "dumped" into a buffer, freeing the device for further work (the principle of autonomy). The I/O devices thus used memory of their own.
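
To make the buffering idea concrete, here is a minimal Python sketch; the language and all names are ours, not the era's. The "processor" drops each result into a buffer queue and immediately continues computing, while a slow output device drains the buffer at its own pace.

```python
import queue
import threading
import time

buffer = queue.Queue()          # the output buffer with its own memory

def output_device():
    """Slow peripheral: drains the buffer independently of the CPU."""
    while True:
        item = buffer.get()
        if item is None:        # sentinel: no more output
            break
        time.sleep(0.1)         # simulate a slow electromechanical printer
        print("printed:", item)

printer = threading.Thread(target=output_device)
printer.start()

for i in range(5):
    result = i * i              # "computation" continues at full speed
    buffer.put(result)          # dump the result into the buffer and move on

buffer.put(None)                # signal end of output
printer.join()
```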

A significant functional limitation of first-generation computers was their focus on arithmetic operations. When attempts were made to adapt them to analysis tasks, they proved ineffective.

There were no programming languages as such; programmers coded their algorithms in machine instructions or assembler, which complicated and slowed down the programming process. By the end of the 1950s, programming tools underwent fundamental change: a transition was made to the automation of programming using universal languages and libraries of standard programs. The use of universal languages led to the emergence of translators.

Programs were executed task by task: the operator had to monitor the progress of the current task and, when it finished, initiate the execution of the next one himself.

The modern era of computer use in our country began in 1950, when the first domestic computer, MESM (Small Electronic Calculating Machine), was created at the Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR under the leadership of S.A. Lebedev. During the first stage of the development of computer technology in our country, a number of computers were created: BESM, Strela, Ural, M-2.

The second generation of computers brought the transition to a transistor element base and the appearance of the first minicomputers.

The principle of autonomy was developed further: it was now implemented at the level of individual devices, expressed in their modular structure. I/O devices received their own control units (called controllers), freeing the central control unit from managing I/O operations.

As computers improved and became cheaper, the share of computer time and computing resources in the total cost of an automated data-processing solution fell, while the cost of program development (programming) hardly decreased and in some cases even tended to grow. A trend toward efficient programming thus emerged, which began to be realized in the second generation of computers and continues to develop to the present day.

Development began on integrated systems built from libraries of standard programs possessing portability, i.e., able to run on computers of different makes. The most frequently used software tools were organized into application program packages (APPs) for solving problems of a particular class.

The technology of executing programs on a computer is being improved: special software tools are being created - system software.

The goal of system software is to make the processor's transition from one task to the next easier and faster. The first batch processing systems appeared; they simply automated the launch of one program after another, thereby increasing processor utilization. Batch processing systems were the prototype of modern operating systems and became the first system programs designed to control the computing process. In the course of implementing batch processing systems, a formalized job control language was developed, with which the programmer told the system and the operator what work he wanted performed on the computer. A set of several tasks, usually in the form of a deck of punched cards, is called a job package. This element is still alive today: MS-DOS batch files are nothing more than job packages (the .bat extension in their name abbreviates the English word batch).
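
The job-package idea can be illustrated with a small Python sketch; the job names and bodies below are invented for illustration, and a real batch monitor of that era would of course read punched cards rather than call functions.

```python
# A minimal sketch of batch processing: a "job package" is a sequence
# of jobs that the monitor starts one after another, so the operator
# no longer launches each program by hand.

def compile_payroll():
    print("compiling payroll program")

def run_payroll():
    print("running payroll calculation")

def print_report():
    print("printing report")

job_package = [compile_payroll, run_payroll, print_report]

def batch_monitor(jobs):
    """Prototype of an OS: runs each job in order, logging start and end."""
    for number, job in enumerate(jobs, start=1):
        print(f"--- job {number} started ---")
        job()
        print(f"--- job {number} finished ---")

batch_monitor(job_package)
```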

Domestic computers of the second generation include "Promin", "Minsk", "Razdan", "Mir".

In the 1970s, third-generation computers appeared and developed. In our country these were the ES computers, ASVT, and SM computers. This stage involved the transition to an integrated element base and the creation of multi-machine systems, since a significant increase in performance could no longer be achieved with a single computer. Computers of this generation were therefore created on the basis of the principle of unification, which made it possible to assemble diverse computing systems for various fields of activity.

The expansion of the functionality of computers has increased the scope of their application, which caused an increase in the volume of processed information and set the task of storing data in special databases and maintaining them. This is how the first database management systems - DBMS - appeared.

The forms of computer use changed: the introduction of remote terminals (displays) made it possible to apply the time-sharing mode widely and effectively, thereby bringing the computer closer to the user and expanding the range of problems being solved.

A new kind of operating system, supporting multiprogramming, made time-sharing possible. Multiprogramming is a way of organizing a computing process in which several programs execute alternately on one processor. While one program performs an I/O operation, the processor is not idle, as it was when programs were executed sequentially (single-program mode), but executes another program (multiprogram mode). Each program is loaded into its own section of internal memory, called a partition. Multiprogramming aims to create for each individual user the illusion of sole use of the computer, so such operating systems were interactive in nature: the user solved his tasks in a dialogue with the computer.
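
The following toy Python sketch, with invented program names and delays, conveys the essence: while one "program" is blocked on (simulated) I/O, another gets to run, so the machine is never idle.

```python
import threading
import time

# A toy illustration (not an OS): two "programs" share one machine.
# time.sleep() stands in for an I/O wait; while program A is blocked
# on "I/O", the interpreter is free to execute program B, which is
# the essence of multiprogramming.

def program(name, io_delay):
    for step in range(3):
        print(f"{name}: computing step {step}")
        time.sleep(io_delay)    # "I/O operation": the CPU is released

a = threading.Thread(target=program, args=("program A", 0.2))
b = threading.Thread(target=program, args=("program B", 0.1))
a.start(); b.start()
a.join(); b.join()              # output from A and B interleaves
```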

The stored-program principle

This principle is implemented by the presence of RAM. It is a fundamentally important decision: initially, automatic computing devices were designed so that commands either came from an input device or were hard-wired directly into the electrical circuits, and solving a new problem required re-wiring the circuits. Even Charles Babbage proposed that only numbers be stored in the "warehouse" (memory), with commands entered on punched cards. The decision to store commands and data in memory on an equal footing was implemented in the first electronic computers.

Principle of program control

This principle is implemented by the presence of the CU. The principle of program control is that the computer operates according to a program stored in memory. The program consists of commands.
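
Both principles can be illustrated with a toy stored-program machine in Python; the three-instruction set (LOAD, ADD, PRINT) is invented for the sketch. Commands and data sit in the same memory, and a program counter in the role of the CU executes them one after another.

```python
# A toy stored-program machine: the program lives in memory as data,
# and the control unit (the program counter loop) executes its
# commands sequentially.

memory = [
    ("LOAD", 7),      # put the constant 7 into the accumulator
    ("ADD", 5),       # add the constant 5
    ("PRINT", None),  # output the accumulator
]

accumulator = 0
program_counter = 0              # address of the next command

while program_counter < len(memory):
    opcode, operand = memory[program_counter]
    if opcode == "LOAD":
        accumulator = operand
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "PRINT":
        print(accumulator)       # prints 12
    program_counter += 1         # sequential execution of operations
```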

Sequential execution of operations

Sequential execution of operations means that commands are executed one after another: execution of a new command begins only after the previous one completes. In modern computers, sequential processing is supplemented by the parallel processing of several processes, which greatly speeds up the work and expands the computer's capabilities; the first designs, however, had no such capability.

Binary coding

Information in a computer is stored and processed in encoded form, using the binary number system. This is due to the convenience of the technical implementation of the binary digits 0 and 1, which are represented by electrical signals of high and low voltage, and to the simplicity of operations on binary numbers. It should be noted that this principle was not implemented in all early computers: the first-born of American computer technology, the Mark-1, performed calculations in the decimal system, but the technical implementation of decimal encoding proved very complicated and was later abandoned.
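
A short Python illustration of the same idea: the built-in bin() and int() functions convert between decimal numbers and their binary notation of 0s and 1s.

```python
# Binary coding in miniature: a decimal number is represented with
# only the two digits 0 and 1, which map naturally onto high and low
# voltage levels.

for number in [5, 12, 255]:
    print(number, "->", bin(number))   # e.g. 12 -> 0b1100

# The reverse direction: interpreting a string of binary digits.
print(int("1100", 2))                  # -> 12
```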

Use of electronic elements and electrical circuits

The use of electronic elements and electrical circuits provides far greater reliability than the electromechanical relays used in the first designs of computing devices.

Generations of computers and prospects for the development of computer technology

In the history of the development of computing facilities, three historical stages can be distinguished, the time frames of which are presented in Table 1.

Table 1 Stages of development of computing facilities

Comparing these time periods, we can say that the time during which humanity made the colossal leap from the first computers to modern supercomputers is a moment "between the past and the future."

The development of computer technology from 1945 to the mid-1990s is usually divided into four stages, characterized by qualitative changes in hardware and software. These stages are called generations. The main characteristics of each generation are presented in Table 2. It should be noted, however, that the boundaries between generations are not sharply defined: in the course of development, computer models appeared that already bore features of the next generation.

Table 2. Generations of computers

| Characteristic | First generation | Second generation | Third generation | Fourth generation |
| --- | --- | --- | --- | --- |
| Chronological boundaries | mid-1940s to mid-1950s | mid-1950s to mid-1960s | mid-1960s to the 1970s | 1970s to the mid-1990s |
| Element base | Vacuum tubes (up to 20 thousand tubes in one machine) | Semiconductor transistors; the circuits are mounted on separate boards | Microcircuits: electronic circuits of several thousand elements each implementing a specific function (up to 0.3-0.5 cm² in size) | Microprocessors: highly integrated circuits performing the functions of a control unit and an ALU |
| Reliability | Frequent overheating, difficult troubleshooting, replacement of about 2,000 tubes per month | Overheating eliminated; in the event of a malfunction the whole board is replaced; much greater reliability and durability | Still greater reliability and durability than the previous generation | |
| Speed (operations per second) | 10-20 thousand | Up to a million | Several million | Tens of millions |
| RAM capacity | | | | |
| Production | Single copies | Serial production | Systems of compatible machines | Mass production |
| Dimensions | Bulky cabinets occupying a large machine room | Large racks of a uniform type, taller than a person, occupying a machine room | The machine consists of two racks and does not require a special room | The main achievement: personal computers that fit on a desktop |
| Programming | Machine codes; requires high professionalism and knowledge of the computer's structure | Algorithmic languages | Further development and diversification of programming languages | Languages for specialized management tasks, databases, text editors |
| Examples | ENIAC, EDSAC (USA); MESM (USSR) | BESM and Minsk series machines (USSR) | ES (unified system): ES-1060; SM (series of small computers): SM-22, ... | IBM PC with 8088 and 80286 processors (USA); Iskra-1030, Neuron (USSR) |

After the creation of the EDSAC machine in England in 1949, a powerful impetus was given to the development of universal computers, stimulating the appearance in a number of countries of the computer models that made up the first generation. Over more than 40 years of the development of computer technology (CT), several generations of computers have appeared, replacing one another.

First-generation computers used vacuum tubes and relays as their element base; main memory was built on flip-flops, later on ferrite cores. Performance was typically 5-30 thousand arithmetic operations per second; the machines were unreliable, required cooling systems, and were very large. Programming demanded considerable skill and a good knowledge of the computer's architecture and its software capabilities. At the beginning of this stage programming was done in machine code; then autocodes and assemblers appeared. As a rule, first-generation computers were used for scientific and technical calculations, and programming itself was closer to an art, practiced by a very narrow circle of mathematicians, electrical engineers, and physicists.

EDSAC computer, 1949

Computer of the 2nd generation

The creation of the first transistor in the USA on July 1, 1948 did not yet herald a new stage in the development of computing and was associated primarily with radio engineering. At first it was more a prototype of a new electronic device, requiring serious research and refinement. In 1951, William Shockley demonstrated the first reliable transistor. Transistors remained quite expensive, however (up to $8 apiece), and only after the development of silicon technology did their price drop sharply, accelerating miniaturization in electronics, a process that also swept up computing.

It is generally accepted that the second generation begins with the RCA-501 computer, which appeared in the USA in 1959 and was built on a semiconductor element base. Meanwhile, back in 1955 an on-board transistor computer had been created for the ATLAS intercontinental ballistic missile. The new element technology dramatically increased the reliability of computers, reduced their size and power consumption, and significantly increased performance. This made it possible to build computers with greater logical capability and performance, which expanded the scope of computing to problems of planning and economics, the control of production processes, and more. Within the second generation, the differentiation of computers into small, medium, and large became ever clearer. The end of the 1950s marked the beginning of programming automation, which led to the emergence of the programming languages Fortran (1957), Algol-60, and others.

Computer of the 3rd generation

The third generation is associated with computers whose element base was the integrated circuit (IC). In January 1959, Jack Kilby created the first IC, a thin germanium wafer 1 cm long. To demonstrate the capabilities of integrated technology, Texas Instruments created an on-board computer for the US Air Force containing 587 ICs, with a volume (40 cm³) 150 times smaller than that of a comparable computer of the old type. But Kilby's IC had a number of significant shortcomings, which were eliminated with the advent of Robert Noyce's planar ICs in the same year. From that moment, IC technology began its triumphant march, capturing ever newer areas of modern electronics, first and foremost computer technology.

The software ensuring the operation of computers in various modes became much more powerful. Sophisticated database management systems (DBMS) and computer-aided design (CAD) systems appeared; much attention was paid to creating application program packages (APPs) for various purposes. New languages and programming systems continued to appear, and existing ones continued to develop.

4th generation computer

Large-scale (LSI) and very-large-scale (VLSI) integrated circuits, created in the 1970s and 1980s respectively, became the design and technological basis of fourth-generation computing. Such ICs contain tens or hundreds of thousands, even millions, of transistors on a single chip. At the same time, LSI technology had already been used partially in projects of the previous generation (IBM/360, ES EVM Ryad-2, etc.). Conceptually, the most important criterion separating fourth-generation computers from the third generation is that the former were designed from the outset for the efficient use of modern high-level languages and for simplifying the programming process for the applications programmer. In hardware terms, they are characterized by the widespread use of IC technology and high-speed memory devices. The best-known series of fourth-generation computers is the IBM/370, which, unlike the equally well-known third-generation IBM/360 series, has a more developed instruction set and makes wider use of microprogramming. In the high-end models of the 370 series, virtual memory was implemented, giving the user the appearance of unlimited RAM resources.
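
The address translation behind virtual memory can be sketched in a few lines of Python; the page size and page-table contents below are invented, and a real system would also handle page faults and swapping.

```python
# A toy model of virtual memory: the program uses "virtual" addresses,
# and a page table maps each virtual page to a physical frame, so the
# program sees a larger, contiguous memory than physically exists.

PAGE_SIZE = 4096
page_table = {0: 7, 1: 3, 2: 9}    # virtual page -> physical frame

def translate(virtual_address):
    page, offset = divmod(virtual_address, PAGE_SIZE)
    frame = page_table[page]        # a real OS would handle page faults here
    return frame * PAGE_SIZE + offset

print(hex(translate(5000)))         # virtual 5000 -> page 1, offset 904 -> 0x3388
```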

The personal computer (PC) phenomenon traces back to the creation in 1965 of the first minicomputer, the PDP-8, which grew out of the universalization of a specialized processor for controlling a nuclear reactor. The machine quickly won popularity and became the first mass-produced computer of its class; by the early 1970s the number of machines exceeded 100 thousand. The next important step was the transition from mini- to microcomputers: this new structural level of computing technology began to take shape at the turn of the 1970s, when the advent of LSI made it possible to put a universal processor on a single chip. The first microprocessor, the Intel 4004, was created in 1971 and contained 2,250 elements; the first universal microprocessor, the Intel 8080, created in 1974 and the standard of microcomputer technology, already contained 4,500 elements and served as the basis for the first PCs. In 1979, one of the most powerful and versatile 16-bit microprocessors, the Motorola 68000 with 70,000 elements, was released, and in 1981 Hewlett-Packard released the first 32-bit microprocessor, with 450,000 elements.

PC Altair-8800

The Altair-8800, created in 1974 by Edward Roberts on the basis of the Intel 8080 microprocessor, can be considered the first PC. The computer was sold by mail order, cost only $397, and could be expanded with peripherals (it had only 256 bytes of RAM!). For the Altair-8800, Paul Allen and Bill Gates created a translator for the popular Basic language, significantly increasing the intelligence of the first PC (they later founded the now-famous Microsoft Inc.). Equipping the PC with a color monitor led to the creation of the competing Z-2 PC model; within a year of the appearance of the first Altair-8800, more than 20 different companies and firms were producing PCs, and a PC industry began to form (PC production proper, sales, periodicals and books, exhibitions, conferences, and so on). In 1977, three PC models went into mass production: the Apple-2 (Apple Computers), the TRS-80 (Tandy Radio Shack), and the PET (Commodore); Apple, initially lagging in the competition, soon became the leader in PC manufacturing (its Apple-2 model was a huge success). By 1980, Apple had gone to Wall Street with the largest share capital and annual revenue of $117 million.

In 1981, IBM, to avoid losing the mass market, began producing its now well-known PC series, the IBM PC/XT/AT and PS/2, opening a new era of personal computing. The entry of the giant IBM into the PC arena put PC production on an industrial footing, which resolved a number of issues important to the user (standardization, unification, advanced software, and so on), to which the company had already paid great attention in producing the IBM/360 and IBM/370 series. It can reasonably be argued that in the short period from the debut of the Altair-8800 to the IBM PC, more people joined computing than in the entire long period from Babbage's Analytical Engine to the invention of the first ICs.

The Amdahl 470V/6 model, created in 1975 and compatible with the IBM series, can be considered the first computer to open the class of supercomputers proper. The machine used an efficient parallelization principle based on pipelined command processing, and its element base used LSI technology. Currently, the supercomputer class includes models with an average speed of at least 20 megaflops (1 megaflops = 1 million floating-point operations per second). The first model with such performance was the largely unique ILLIAC-IV computer, created in 1975 in the USA, with a maximum speed of about 50 megaflops. This model had a huge influence on the subsequent development of supercomputers with a matrix architecture. A bright page in supercomputer history belongs to the Cray series of S. Cray, whose first model, the Cray-1, was created in 1976 with a peak performance of 130 megaflops. The model's architecture was based on the pipelined principle of vector and scalar data processing, with a VLSI element base. It was this model that laid the foundation of the class of modern supercomputers. It should be noted that despite a number of interesting architectural solutions, the model's success was achieved mainly through successful technological solutions. The subsequent models Cray-2, Cray X-MP, Cray-3, and Cray-4 brought the performance of the series to about 10 thousand megaflops, and the Cray MP model, using a new 64-processor architecture and an element base of new silicon microcircuits, had a peak performance of about 50 gigaflops.
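
Taking the definition above (1 megaflops = 10^6 floating-point operations per second), a few lines of Python show how such figures translate into running time; the billion-operation workload is an invented example.

```python
# Quick arithmetic with the megaflops definition from the text.
MEGAFLOPS = 1_000_000

def seconds_for(operations, rate_megaflops):
    """Time to finish a workload at a given sustained speed."""
    return operations / (rate_megaflops * MEGAFLOPS)

workload = 1_000_000_000           # a billion floating-point operations
print(seconds_for(workload, 50))   # at ILLIAC-IV-class 50 Mflops: 20.0 s
print(seconds_for(workload, 130))  # at Cray-1 peak, 130 Mflops: ~7.7 s
```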

Concluding this excursion into the history of modern computing, with its varying detail across individual stages, several significant remarks should be made. First of all, the transition from one generation of computers to another has become ever smoother, with the ideas of the new generation maturing, and even being implemented, to one degree or another in the previous generation. This is especially noticeable in the transition to IC production technology, when the defining emphasis of a generation shifts increasingly from the element base to other indicators: logical architecture, software, user interface, application areas, and so on. The most diverse computing devices are appearing whose characteristics do not fit the traditional classification framework; one gets the impression that we are at the beginning of a kind of universalization of computing, with all its classes striving to level out their computing capabilities. Many elements of the fifth generation are, in one way or another, already characteristic of our day.

The era of electronic computers began in the 1940s and is associated with the work of such theoreticians and practitioners of computer technology as Alan Turing (Great Britain), Konrad Zuse (Germany), Claude Shannon, John Atanasoff, Howard Aiken, Presper Eckert, John von Neumann (USA), and other scientists and engineers.

In 1943, by order of the US Navy and with the financial and technical support of IBM, the first universal digital computer, Mark 1, was created under the leadership of H. Aiken. It was 17 m long and more than 2.5 m high. Electromechanical relays served as its switching devices, and data were entered on punched tape in the decimal number system. This machine could add and subtract 23-digit numbers in 0.3 seconds and multiply two numbers in 3 seconds, and it was used to calculate the trajectories of artillery shells.

Two years earlier, in Germany, the electromechanical computer Z-3, based on the binary number system, had been created under the leadership of K. Zuse. This machine was significantly smaller than Aiken's and much cheaper to manufacture. It was used for calculations related to aircraft and rocket design, but its further development (in particular, the idea of moving to vacuum tubes) did not receive the support of the German government.

In the UK, at the end of 1943, the Colossus computer went into operation, which instead of electromechanical relays contained about 2000 electron tubes. The mathematician A. Turing took an active part in its development with his ideas on formalizing the description of computational problems. But this machine had a highly specialized character: it was designed to decipher German codes by sorting through various options. The processing speed reached 5000 characters per second.

ENIAC (Electronic Numerical Integrator and Computer), created in 1946 by order of the US Department of Defense under the leadership of P. Eckert, is considered the first universal vacuum-tube digital computer. It contained over 17,000 vacuum tubes and worked with decimal arithmetic. In size (about 6 m high and 26 m long) the machine was more than twice as large as the Mark-1, but its speed was far higher: up to 300 multiplication operations per second. Calculations carried out on this computer confirmed the fundamental possibility of creating a hydrogen bomb.

The next model (1945-1951) of the same developers, the EDVAC (Electronic Discrete Variable Automatic Computer) machine, had a more capacious internal memory, into which it was possible to write not only data, but also a program. The coding system was already binary, which greatly reduced the number of vacuum tubes.

The talented mathematician J. von Neumann took part in this development as a consultant. In 1945 he published the "Preliminary Report on the EDVAC Machine," in which he described not only the specific machine but also outlined the formal, logical organization of a computer, singling out and describing in detail the key components of what is now called the "von Neumann architecture" (Fig. 1).

The starting point of the history of our domestic computer technology is considered to be 1948, when Isaac Brook and Bashir Rameev, employees of the Energy Institute of the USSR Academy of Sciences, received a copyright certificate for the invention "Automatic digital computer". In the same 1948, at the Institute of Electrical Engineering of the Academy of Sciences of the Ukrainian SSR, under the leadership of Academician Sergei Lebedev, work began on a project to create a MESM - a small electronic calculating machine.

Between 1948 and 1952, prototypes and single copies of computers were created, which, as in the United States, were used both for especially important (often classified) calculations and for debugging design and technological solutions.
Fig. 1 - The architecture of the "von Neumann machine"

Subsequently, work on computer development proceeded in several directions.

Consider, for example, the projects of S.A. Lebedev. MESM, commissioned in December 1951, became the first operating computer in the USSR. In 1953, S.A. Lebedev became director of the Moscow Institute of Precision Mechanics and Computer Engineering (ITMiVT) and led the development of the famous BESM series of large electronic calculating machines, from BESM-1 to BESM-6. Each machine in this series was, at the time of its creation, the best in the mainframe class.

BESM-1 (1953) had 5,000 vacuum tubes and performed 8-10 thousand operations per second. Its distinctive feature was floating-point arithmetic with a large range of representable numbers. On BESM-1, three types of RAM, each with a capacity of 1,024 39-bit words, were tested in real operation:

  1. on electroacoustic mercury tubes (delay lines); this type of memory was used in EDSAC and EDVAC;
  2. on cathode-ray tubes (potentialoscopes);
  3. on ferrite magnetic cores.

External memory was made on magnetic drums and magnetic tapes.

A special place in the history of domestic computer technology belongs to BESM-6, mass-produced for 17 years beginning in 1967. Its architecture implemented the principle of parallelizing computing processes, and its performance of 1 million operations per second was a record for the mid-1960s. The first full-fledged operating systems, powerful translators, and a most valuable library of standard subroutines implementing numerical methods for various problems appeared on BESM-6, all domestically produced.

By the end of the 1960s, about 20 types of general-purpose computers were in production in our country: the BESM series (Moscow, S.A. Lebedev), Ural (Penza, B.I. Rameev), Dnepr and Mir (Kyiv, V.M. Glushkov), Minsk (Minsk, V. Przhiyalkovsky), and others, as well as specialized machines, mainly for the defense sector. Incidentally, in contrast to the West, where the "engines of progress" in computing were not only the military but also the business world, in the USSR they were only the military. But gradually scientists, managers, and officials began to realize the role of computers in the country's economy and the urgent need to develop machines of a new generation.

The question arose of transitioning to a computer industry. In December 1969, it was decided at the government level to adopt the IBM S/360 series as the industry standard for the universal computers of a unified series (the ES computers). The first machine of this series, the ES-1020, was released in 1971.
The production of ES computers was established jointly with other socialist countries within the framework of the CMEA (Council for Mutual Economic Assistance). Many scientists opposed copying IBM systems but could not offer an alternative to serve as a unified standard.
Of course, the ideal option would have been to implement IBM's architectural principles in cooperation with the company itself, and not with a family that was already almost five years old but with its most modern models, combined with comprehensive support for our own developments. But the state did not have enough funds for everything, and the simpler option was chosen. Thus began the decline of the domestic computer industry.
Note that the lag behind the West was not at all caused by the decision to copy IBM machines. The technological base for producing the elements from which computers were built began to fall behind the world level at an alarming rate. The more funds that needed to be invested in the development of microelectronics, the more difficult it was to maintain the required level. The backwardness of the element base, the sluggishness of the centralized economy, the lack of competition, and the dependence of developers and manufacturers on the officials of the State Planning Committee made it impossible to repeat the computer revolution that took place in the West during the years when the ES was being created.

If we take its element base as the main characteristic of a computer, then four generations can be distinguished in the history of their development (table).
Table - The main characteristics of computers of various generations


| Generation | 1 | 2 | 3 | 4 |
| --- | --- | --- | --- | --- |
| Period, years | 1946-1960 | 1955-1970 | 1965-1980 | 1980 to the present |
| Element base | Vacuum tubes | Semiconductor diodes and transistors | Integrated circuits | Ultra-large-scale integrated circuits |
| Architecture | Von Neumann architecture | Multiprogram mode | Local computer networks, computing systems for collective use | Multiprocessor systems, personal computers, global networks |
| Performance | 10-20 thousand op/s | 100-500 thousand op/s | About 1 million op/s | Tens and hundreds of millions of op/s |
| Software | Machine languages | Operating systems, algorithmic languages | Operating systems, dialog systems, computer graphics systems | Application packages, databases and knowledge bases, browsers |
| External devices | Punched-tape and punched-card input devices | Alphanumeric printers (ATsPU), teletypes, magnetic tape and magnetic drum drives | Video terminals, hard disk drives | Floppy disk drives, modems, scanners, laser printers |
| Application | Computational tasks | Engineering, scientific, and economic tasks | ACS, CAD, scientific and technical tasks | Management tasks, communications, workstations, word processing, multimedia |
| Examples | ENIAC, UNIVAC (USA); BESM-1, BESM-2, M-1, M-20 (USSR) | IBM 701/709 (USA); BESM-4, M-220, Minsk, BESM-6 (USSR) | IBM 360/370, PDP-11/20, Cray-1 (USA); ES-1050, ES-1066, Elbrus-1, Elbrus-2 (USSR) | Cray T3E, SGI (USA); PCs, servers, workstations from various manufacturers |

What shall we call fifth generation computers?
Currently, several fundamentally different directions are being explored:

  1. an optical computer in which all components will be replaced by their optical counterparts (optical repeaters, fiber optic communication lines, memory based on holography principles);
  2. a molecular computer, the principle of which will be based on the ability of some molecules to be in different states;
  3. a quantum computer made up of subatomic-sized components and operating on the principles of quantum mechanics.
The fundamental possibility of creating such computers has been confirmed both by theoretical works and by the operating components of memory and logic circuits.