Information technology: definition, history of development, and current state

Abbreviations used:

IT - information technology

IR - information resources

1) The concept of "information technology"

Books are known to be stores of data. They are designed so that information is retrieved from them by reading. But if you examine different books by touch or taste, you can also obtain information: such methods make it possible to distinguish books bound in leather, cardboard or paper. These are, of course, not the methods the authors of the books had in mind, but they too provide information, although incomplete.

Information is one of the most valuable resources of society along with such traditional material types of resources as oil, gas, minerals, etc. Therefore, the process of information processing, by analogy with the process of processing material resources, can be perceived as a technology.

Information resources are a set of data that are valuable for an enterprise (organization) and act in the same way as material resources. Information resources include texts, knowledge, data files, etc.

Information technology is a set of methods, production processes, software and hardware, united in a technological chain, which provide the collection, storage, processing, output and dissemination of information in order to reduce the labor intensity of using information resources and to increase their reliability and efficiency.

In accordance with the definition adopted by UNESCO, information technology is a set of interrelated scientific, technological and engineering disciplines that study methods of efficiently organizing the work of people involved in processing and storing information, as well as computing technology and the methods of organizing its interaction with people and production equipment.

There are three classes of information technologies that make it possible to work with various kinds of subject areas:

1) Global information technologies, which include models, methods and tools that formalize and allow the use of the information resources of society as a whole;

2) Basic information technologies, which are intended for a specific area of application;

3) Specific information technologies, which implement the processing of specific data when solving specific functional tasks of the user (for example, tasks of planning, accounting, analysis, etc.).

The main purpose of information technology is the production and processing of information for its subsequent analysis by a person and, on the basis of that analysis, the making of an optimal decision about performing an action.

2) Information technology development history

I. Until the second half of the 19th century, the basis of information technology was the pen, the inkwell and the ledger. Communication was carried out by sending dispatches. The productivity of information processing was extremely low: each letter was copied separately by hand, and apart from accounts summed up manually there was no other information for making decisions.

Beginning of the 16th century - Leonardo da Vinci sketched a thirteen-digit adding device with ten-tooth wheels.

1723 - the German scientist Christian Ludwig Gersten created an arithmetic machine.

1751 - the Frenchman Perera invented a more compact arithmetic machine.

1820 - the first industrial production of digital calculating machines (adding machines) began.

1822 - the English mathematician Charles Babbage created a program-controlled calculating machine.

II. At the end of the 19th century, "manual" information technology was replaced by "mechanical" technology. The invention of the typewriter, the telephone and the voice recorder, together with the modernization of the public mail system, served as the basis for fundamental changes in information processing technology and, as a result, in labor productivity. Essentially, "mechanical" technology paved the way for the organizational structure of existing institutions.

Beginning of the 20th century - adding machines with keys for entering numbers appeared.

III. The 1940s-1960s are characterized by the emergence of "electric" technology based on the use of electric typewriters with removable elements, copiers using plain paper, and portable voice recorders. They improved office operations by raising the quality, quantity and speed of document processing.

1937-1943 - the Mark 1, a computer based on electromagnetic relays, was built.

1947 - the Mark 2.

1943-1945 - under the leadership of John Mauchly and J. Presper Eckert, with the participation of the mathematician John von Neumann, a vacuum-tube computing machine was created.

1948 - the transistor was invented.

1955 - production of transistor-based computers began.

1958 - the first integrated circuit was invented.

1959 - design solutions that paved the way for the microprocessor were developed.

IV. The appearance in the second half of the 1960s of large, productive computers on the periphery of office activities (in computing centers) made it possible to shift the emphasis in information technology from the form to the content of information. This was the beginning of "electronic", or "computer", technology. As is well known, information management technology must contain at least three essential components of information processing: accounting, analysis and decision-making. These components are realized in a "viscous" environment - a paper "sea" of documents that grows more immense every year.

1964 - third-generation computers using integrated circuits were developed.

The concepts of automated control systems (ACS) developed in the 1960s did not always and did not fully meet the task of improving management and optimally implementing the components of information technology (accounting, analysis, decision-making). Methodologically, these concepts often rested on the idea of the unlimited possibilities of "push-button" information technology, with a continuous increase in the computing power of ACS and the use of the most general simulation models, which in some cases were far from the real mechanism of operational control.

The name "automated control system" does not quite correctly reflect the functions that such systems perform; "automation systems for management" would be more accurate, because in existing ACS the concept of "system" does not include the decisive link of control - the user. Ignoring this fundamental circumstance apparently led to a situation in which the expansion of the ACS network and the growth of their computing power, thanks to large arrays of primary data, improved mainly the accounting functions of management (reference, statistical, tracking). However, accounting functions reflect only the past state of the controlled object and do not allow its development prospects to be assessed; that is, they have low dynamism. In the other components of management technology, the increase in ACS power did not yield a tangible effect.

The lack of developed communication links between user workstations and the central computer, the batch mode of data processing typical of most ACS, and the low level of dialogue support in practice did not provide a high quality of user analysis of statistical reporting data or an interactive level of analytical work. Thus, the effectiveness of ACS at the lower rungs of the management ladder - exactly where information flows are formed - falls significantly because of the considerable redundancy of incoming information in the absence of data aggregation tools. It is for this reason that, despite the introduction of ACS, the number of employees engaged in accounting functions grows every year: today, one sixth of all employees of the management apparatus are accounting personnel.

V. 1975 - the Altair, the first mass-produced PC, was created on the basis of the Intel 8080 processor.

Since the 1970s, a tendency has formed to shift the center of gravity of ACS development toward the fundamental components of information technology (especially analytical work) with maximum use of man-machine procedures. But, as before, all this work was carried out on powerful computers located centrally in computing centers. The construction of such ACS rests on the hypothesis that the problems of analysis and decision-making belong to the class of formalizable problems amenable to mathematical modeling. It was assumed that such ACS would improve the quality, completeness, authenticity and timeliness of information support for decision-makers, whose effectiveness would increase owing to the larger number of tasks analyzed.

But the introduction of such systems yielded very sobering results. It turned out that the applied economic and mathematical models had limited opportunities for practical use: analytical work and the decision-making process took place in isolation from the real situation and were not supported by the process of information formation. For each new task a new model was required, and since the models were created by specialists in economic and mathematical methods rather than by the user, the decision-making process took place as if not in real time, and the creative contribution of the user himself was lost, especially when solving atypical management problems. At the same time, the computational potential of management, concentrated in computing centers, was separated from the other means and technologies of information processing because of the ineffective operation of the lower stages and the need for continuous conversions of information. This also reduced the effectiveness of information technology in solving problems at the upper rungs of the management ladder. In addition, the organizational structure of technical means that developed in ACS was characterized by a low utilization coefficient, long (and not always completed) design of automated systems, and low profitability due to the weak impact of automation results on management efficiency.

VI. August 1981 - the IBM PC appeared.

With the advent of personal computers on the "crest of the microprocessor revolution", the idea of ACS was fundamentally modernized: from computing centers and centralized control to distributed computing potential, greater homogeneity of information processing technology, and decentralized control. This approach found its embodiment in decision support systems (DSS) and expert systems (ES), which in essence characterize a new stage in the computerization of organizational management technology - the stage of personalization of ACS. The main feature of DSS is a systemic approach, together with the recognition that even the most powerful computer cannot replace a person. In this case we are dealing with a structural human-machine control unit, optimized in its work processes: the capabilities of the computer are expanded by the user structuring the tasks to be solved and replenishing its knowledge base, while the capabilities of the user are expanded by the automation of tasks that it was previously inexpedient to transfer to the computer for economic or technical reasons. It becomes possible to analyze the consequences of different decisions and get answers to questions of the kind "what will happen if ...?" without wasting time on the laborious process of programming.

The most important aspect of introducing DSS and ES is the rationalization of the daily activities of management workers. As a result of their introduction at the lower levels of management, the entire foundation of management is significantly strengthened, and the load on centralized computing systems and the upper levels of management is reduced, which makes it possible to concentrate there the solution of large long-term strategic tasks. Naturally, DSS computer technology should use not only personal computers but also other modern means of information processing.

The DSS concept requires a revision of existing approaches to managing work processes in an institution. In essence, on the basis of a DSS, a new man-machine labor unit is formed, with its own labor qualification, work norms and pay. It combines the knowledge and skills of a specific person (the DSS user) with the integrated knowledge and skills embedded in the PC.

1990 - a database system for the Internet was created.

There are several points of view on the development of information technology using computers, determined by various classification criteria.

Common to all the approaches outlined below is that with the advent of the personal computer a new stage in the development of information technology began. Its main goal is to satisfy the personal information needs of a person, both in the professional sphere and in everyday life.

The main criteria for classifying information technologies are presented in Figure 1.

It is necessary to distinguish the history of IT from the history of computing technology.

3) Modern types of information technologies

Let us turn to the general definition of technology: a set of methods and ways of acting on raw materials, materials, etc. with the appropriate instruments of production in the process of creating material and spiritual values. The "raw material" in the case of information technology is undoubtedly information, and the methods and means by which we process, store and transmit information are quite diverse.

There are different definitions of the concept of "information technology". New information technologies (NIT) are understood as the whole set of methods and means of automating information activities in the scientific, social, industrial, educational and household spheres, in organizational management and in record keeping. According to J. Wellington, "Information technologies are systems created for the production, transmission, selection, transformation and use of information in the form of sound, text, graphic images and digital information. These systems are based on computer and telecommunication technologies (based on microelectronics), which, in turn, can be used in conjunction with other types of technologies to enhance the final effect."

An information-literate person should be able to recognize when information is needed, be able to find, evaluate and effectively use the information obtained, and be able to interact with both traditional and automated means of storing it.

Modern material production and other spheres of activity are increasingly in need of information services, processing of a huge amount of information. A universal technical means of processing any information is a computer, which plays the role of an amplifier of the intellectual capabilities of a person and society as a whole, and communication means using computers are used to communicate and transfer information. The emergence and development of computers is a necessary component of the process of informatization of society.

Informatization of society is one of the laws of modern social progress. This term is more and more persistently replacing the term "computerization of society", widely used until recently. Despite the external similarity of these concepts, they have a significant difference.

In the computerization of society, the main attention is paid to the development and implementation of the technical base of computers, which ensure the prompt receipt of the results of information processing and its accumulation.

In the informatization of society, the main attention is paid to a set of measures aimed at ensuring the full use of reliable, comprehensive and timely knowledge in all types of human activity.

Thus, "informatization of society" is a broader concept than "computerization of society" and is aimed at providing people with information to satisfy their needs as quickly as possible. In the concept of "informatization of society", the emphasis should be placed not so much on technical means as on the essence and goals of socio-technical progress. Computers are the basic technical component of the process of informatization of society.

Informatization based on the introduction of computer and telecommunication technologies is society's response to the need for a significant increase in labor productivity in the information sector of social production, where more than half of the able-bodied population is concentrated. For example, more than 60% of the working-age population of the USA is employed in the information sphere, and about 40% in the CIS.

Let's consider some types of modern information technologies: telephone, television, cinema, personal computer.

From a modern point of view, the use of the telephone in the early years of its existence looks rather odd. A supervisor dictated a message to his secretary, who then sent it from the telephone room. The call was received in a similar room of another company, where the text was written down on paper and delivered to the addressee (Figure 2).

Telephone communications

It took a long time before the telephone became such a widespread and familiar means of communication that we could begin to use it the way we do today: we ourselves call the right place, and with the advent of cell phones, a specific person.

Nowadays, computers are mainly used as a means of creating and analyzing information, which is then transferred to familiar media (for example, paper). The emergence of the Internet is eliminating this need (tax authorities, for instance, accept reports in electronic form). Thanks to the widespread use of computers and the creation of the Internet, for the first time you can use your computer to communicate with other people through their computers. The need to use printed data for transmission to colleagues is disappearing, just as paper disappeared from telephone conversations. Today's use of the Web can be compared to the time when people stopped writing down the text of telephone messages: computers (and their communication with each other via the Internet) are already so widespread and familiar that we are beginning to use them in fundamentally new ways. The WWW is the beginning of a journey in which computers will truly become communication tools.

The Internet provides an unprecedented way of obtaining information. Everyone with access to the WWW can get all the information available on it, as well as powerful means of searching for it. The opportunities for education, business and the growth of mutual understanding between people are enormous. Moreover, Web technology allows information to be spread everywhere; the simplicity of this method is unparalleled in history. To make your views, products or services known to others, you no longer need to buy space in a newspaper or magazine or pay for time on television and radio. The Web makes the rules of the game the same for governments and individuals, for small and large firms, for producers and consumers, for charities and political organizations. The World Wide Web (WWW) is the most democratic medium of information: with its help anyone can say and hear what has been said without intermediate interpretation, distortion or censorship, guided by a certain framework of decency. The Internet provides unique freedom of expression for individuals and for information.

Just as a company's internal telephones are used both to communicate among employees and with the outside world, the Web is used both for communication within an organization and between organizations and their consumers, customers and partners. The same Web technology that allows small firms to make their mark on the Internet can be used by a large company to communicate the current status of a project over an internal intranet, allowing its employees to always be better informed and therefore more responsive than small, agile competitors. Using an intranet within an organization to make information more accessible to its members is also a step forward from the past. Now that documents can be stored in a searchable computer archive, it has become possible (under the control of security tools) to easily search and describe documents, link to them and compile indexes. Thanks to Web technology, business, as well as management, becomes more efficient.

Information technology for data processing

Information technology for data processing is designed to solve well-structured problems for which the necessary input data are available and the algorithms and other standard procedures for their processing are known. This technology is applied at the level of operational (executive) activities of low-skilled personnel in order to automate some routine, constantly repeated operations of managerial labor. Therefore, the introduction of information technologies and systems at this level will significantly increase the productivity of personnel, free them from routine operations, and possibly even lead to the need to reduce the number of employees.

At the level of operational activities, the following tasks are solved:

· processing of data on operations performed by the company;

· creation of periodic control reports on the state of affairs in the company;

· receiving answers to all kinds of current inquiries and preparing them in the form of paper documents or reports.

An example would be a bank's daily report on cash receipts and disbursements, generated in order to control the cash balance, or a query to the personnel database that returns data on the requirements for candidates for a particular position.
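Such a daily control report can be sketched as a query over a transactions database. This is a minimal illustration only: the table, column names and figures are invented, not taken from any real banking system.

```python
import sqlite3

# Hypothetical schema: table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cash_ops (day TEXT, kind TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO cash_ops VALUES (?, ?, ?)",
    [("2024-03-01", "receipt", 1500.0),
     ("2024-03-01", "disbursement", 400.0),
     ("2024-03-01", "receipt", 250.0)],
)

# Daily control report: total receipts, disbursements and the resulting balance.
receipts, disbursements = conn.execute(
    "SELECT "
    "SUM(CASE WHEN kind = 'receipt' THEN amount ELSE 0 END), "
    "SUM(CASE WHEN kind = 'disbursement' THEN amount ELSE 0 END) "
    "FROM cash_ops WHERE day = ?",
    ("2024-03-01",),
).fetchone()
print(f"Receipts: {receipts}, disbursements: {disbursements}, "
      f"balance: {receipts - disbursements}")
```

The same pattern (standard query, standard aggregation, fixed output form) is what makes this a well-structured task suitable for full automation.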

There are several processing-related features that differentiate this technology from all others:

· fulfillment of data processing tasks required by the company. Every firm is required by law to have and store data about its activities, which can be used as a means of ensuring and maintaining control over the firm. Therefore, any company must have an information processing system and develop the appropriate information technology;

· solving only well-structured problems for which an algorithm can be developed;

· execution of standard processing procedures. Existing standards define typical data processing procedures and prescribe them to be followed by organizations of all types;

· execution of the bulk of work in automatic mode with minimal human participation;

· use of detailed data. Firm records are detailed (detailed) and can be audited. During the audit, the activities of the firm are checked chronologically from the beginning of the period to its end and from the end to the beginning;

· emphasis on the chronology of events;

· the requirement of minimal assistance in solving problems from specialists at other levels.

Data Retention: Much of the data at the operational level needs to be stored for later use, either here or at another level. Databases are created to store them.

Creation of reports (documents): in information technology of data processing, it is necessary to create documents for the management and employees of the company, as well as for external partners. In this case, documents can be created both upon request or in connection with an operation carried out by the company, and periodically at the end of each month, quarter or year.

Information technology for management

The purpose of information technology for management is to satisfy the information needs of all employees of the company, without exception, who deal with decision-making. It can be useful at any level of management.

This technology is oriented toward working in the environment of a management information system and is used when the tasks being solved are less well structured than those solved by information technology for data processing.

Information technology for management is ideally suited to satisfying the similar information needs of employees of different functional subsystems (departments) or levels of firm management. The information it provides describes the past, present and probable future of the company and takes the form of regular or ad hoc management reports.

To make decisions at the level of management control, information should be presented in an aggregated form, so that trends in data change, the reasons for deviations and possible solutions are visible. At this stage, the following data processing tasks are solved:

· assessment of the planned state of the control object;

· assessment of deviations from the planned state;

· identifying the causes of deviations;

· analysis of possible solutions and actions.

Information management technology aims to create different types of reports. Regular reports are generated according to a set schedule that determines when they are generated, for example, a monthly analysis of a company's sales.

Special reports are created at the request of managers or when something unplanned happened in the company. Both those and other types of reports can take the form of summarizing, comparative and extraordinary reports.

In summarizing reports, data are combined into separate groups, sorted and presented as subtotals and final totals for individual fields.

Comparative reports contain data obtained from various sources or classified according to various characteristics and used for comparison purposes.

Extraordinary reports contain data of an exceptional (extraordinary) nature.
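A summarizing report, as described above, groups data, sorts it and produces subtotals and a final total. The sketch below uses invented sales records; the field names and figures are assumptions for illustration only.

```python
from collections import defaultdict

# Illustrative sales records; field names and values are invented.
records = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 50.0},
]

# Combine data into separate groups and compute subtotals per group.
subtotals = defaultdict(float)
for rec in records:
    subtotals[rec["region"]] += rec["amount"]

# Present sorted subtotals and the final total.
for region in sorted(subtotals):
    print(f"{region}: {subtotals[region]}")
print(f"Total: {sum(subtotals.values())}")
```

A comparative report would follow the same shape but place two such groupings (for example, two periods or two sources) side by side.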

The use of reports to support control is particularly effective when implementing so-called management by deviations. Management by deviations assumes that the main content of the data received by the manager should be the deviations of the state of the company's economic activities from established standards (for example, from its planned state). When the principles of management by deviations are used in a company, the following requirements are imposed on the reports generated:

· the report should only be generated when a deviation has occurred;

· information in the report should be sorted by the value of the critical indicator for a given deviation;

· it is desirable to show all deviations together so that the manager can grasp the existing connection between them;

· the report must show the quantitative deviation from the norm.
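The four requirements above can be sketched in a few lines: the report is produced only when a deviation exists, shows the quantitative deviation, and is sorted by the relative size of the deviation. The indicators, plan figures and 2% threshold are all invented for illustration.

```python
# Hypothetical planned vs. actual figures per indicator.
plan =   {"sales": 1000.0, "costs": 600.0, "output": 500.0}
actual = {"sales":  950.0, "costs": 700.0, "output": 500.0}

THRESHOLD = 0.02  # deviations under 2% of plan are not reported

# Keep only real deviations, sorted by relative magnitude (largest first).
deviations = sorted(
    ((name, actual[name] - plan[name]) for name in plan
     if abs(actual[name] - plan[name]) / plan[name] > THRESHOLD),
    key=lambda item: abs(item[1]) / plan[item[0]],
    reverse=True,
)

# The report is generated only when at least one deviation occurred.
if deviations:
    for name, delta in deviations:
        print(f"{name}: deviation {delta:+.1f} from plan {plan[name]}")
```

Showing all deviations together in one sorted list also satisfies the requirement that the manager can grasp the connections between them.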

Main components: input information comes from systems at the operational level. Output information is generated in the form of management reports in a form convenient for decision-making. The content of the database is transformed, with the help of appropriate software, into periodic and ad hoc reports that are sent to the specialists involved in decision-making in the organization. The database used to obtain this information must consist of two elements:

1) data accumulated on the basis of the evaluation of the operations carried out by the firm;

2) plans, standards, budgets and other regulatory documents that determine the planned state of the object of management (division of the company).

Information technology for decision support

The efficiency and flexibility of information technology for decision support largely depend on the characteristics of its interface. The interface determines: the user's language; the computer's message language, which organizes the dialogue on the display screen; and the user's knowledge.

The user language consists of the actions the user performs on the system using the keyboard, electronic pencils writing on the screen, a joystick, a mouse, voice commands, and the like. The simplest form of user language is the creation of input and output forms. Having received an input form (document), the user fills it in with the necessary data and enters it into the computer. The decision support system performs the necessary analysis and issues the results in the form of an output document of the established form.

The message language is what the user sees on the display screen (symbols, graphics, color), data received on the printer, sound outputs, etc. An important measure of the effectiveness of an interface is the chosen form of dialogue between the user and the system. Currently, the most common forms of dialogue are: the request-response mode, the command mode, the menu mode, and the mode of filling in blanks in expressions suggested by the computer. Each form, depending on the type of task, the characteristics of the user and the decision being made, has its own advantages and disadvantages. For a long time, the only implementation of the message language was a printed or displayed report or message. Now there is a new possibility of presenting output data: computer graphics, which make it possible to create color graphic images, including three-dimensional ones, on the screen and on paper. Computer graphics, which significantly increase the visibility and interpretability of output data, are becoming more and more popular in information technology for decision support.

User knowledge is what the user should know while working with the system. This includes not only the action plan in the user's head, but also textbooks, instructions and reference data issued by the computer.

The improvement of a decision support system's interface is determined by progress in the development of each of the three components listed. The interface must be able to:

· manipulate various forms of dialogue, changing them in the process of making a decision at the user's choice;

· transfer data to the system in various ways;

· receive data from various system devices in various formats;

· flexibly support the user's knowledge (provide assistance upon request, prompt).
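One of the dialogue forms mentioned above, menu mode, can be sketched as a mapping from menu keys to actions. The menu items and messages here are invented for illustration and do not come from any particular system.

```python
# Minimal sketch of a menu-mode dialogue; menu items are illustrative only.
MENU = {
    "1": "Enter data",
    "2": "Generate report",
    "3": "Exit",
}

def render_menu() -> str:
    """Build the message-language text the user sees on the screen."""
    return "\n".join(f"{key}. {label}" for key, label in MENU.items())

def dispatch(choice: str) -> str:
    """Map the user's selection (user language) to the system's response."""
    return MENU.get(choice, "Unknown command, please try again")

print(render_menu())
print(dispatch("2"))
```

A request-response or fill-in-the-blanks dialogue would replace the fixed menu with free-form prompts, but the same interface loop applies.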

Information technology of expert systems

The greatest progress among computer information systems has been noted in the development of expert systems. Expert systems enable a manager or a specialist to receive expert advice on any problems about which these systems have accumulated knowledge.

Solving special problems requires special knowledge. However, not every company can afford to keep experts on all issues related to its work on its staff, or even to invite them every time a problem arises. The main idea of the technology of expert systems is to obtain knowledge from an expert and, by loading it into computer memory, to use it whenever the need arises. This makes it possible to use expert systems as advisory systems.

The similarity between the information technologies used in expert systems and in decision support systems is that both provide a high level of decision support. However, there are three significant differences.

The first is that the solution of a problem within the framework of a decision support system reflects the level of its understanding by the user and his ability to obtain and comprehend the solution. The technology of expert systems, on the contrary, offers the user a solution that exceeds his own capabilities.

The second difference between these technologies is expressed in the ability of expert systems to explain their reasoning in the process of obtaining a solution. Very often these explanations are more important to the user than the solution itself.

The third difference is associated with the use of a new component of information technology - knowledge.

The main components of information technology used in the expert system are: user interface, knowledge base, interpreter, system creation module.

The manager (specialist) uses the interface to enter information and commands into the expert system and to receive output information from it. Commands include parameters that guide the process of knowledge processing. Information is usually given in the form of values assigned to specific variables.

The technology of expert systems provides the ability to receive as output information not only the solution, but also the necessary explanations.

There are two types of explanations:

· on-demand explanations: the user can at any time demand that the expert system explain its actions;

· explanation of the obtained solution: after receiving a decision, the user may request an explanation of how it was obtained. The system must explain each step of the reasoning that led to the solution. Although the technology behind an expert system is not simple, the user interface of these systems is friendly and usually causes no difficulty in dialogue.

The knowledge base contains facts describing the problem area, as well as the logical relationships between these facts. Rules are central to the knowledge base. A rule determines what should be done in a given situation and consists of two parts: a condition, which may or may not be met, and an action to be performed if the condition is met.

All the rules used in the expert system form a system of rules, which, even for a relatively simple system, can contain several thousand rules.

An interpreter is the part of an expert system that processes the knowledge in the knowledge base in a certain order (its "reasoning"). The interpreter's operation reduces to sequentially considering the set of rules, rule by rule. If the condition contained in a rule is met, a certain action is taken, and the user is presented with an option for solving his problem.
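
The rule-by-rule cycle described above can be sketched in a few lines of Python. This is a minimal illustration only: the facts and rules (a toy car-diagnosis domain) are invented here, and a real expert system would hold thousands of rules rather than two.

```python
# A minimal sketch of a forward-chaining rule interpreter.
# Facts are strings; each rule is (condition over the facts, fact to add).

def run_rules(facts, rules):
    """Repeatedly scan the rules; fire any rule whose condition holds."""
    changed = True
    while changed:
        changed = False
        for condition, new_fact in rules:        # rule-by-rule, as described
            if condition(facts) and new_fact not in facts:
                facts.add(new_fact)              # the rule's action adds a fact
                changed = True
    return facts

# Hypothetical knowledge base for illustration.
rules = [
    (lambda f: "engine cranks" in f and "engine does not start" in f,
     "suspect fuel system"),
    (lambda f: "suspect fuel system" in f and "fuel gauge reads empty" in f,
     "advice: refuel the car"),
]

facts = {"engine cranks", "engine does not start", "fuel gauge reads empty"}
result = run_rules(set(facts), rules)
print("advice: refuel the car" in result)  # True
```

Note how firing the first rule makes the second rule's condition true on the next scan; this chaining is what lets a system of simple rules produce non-obvious advice.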

In addition, many expert systems include extra blocks: a database, a calculation block, and a data entry and correction block. The calculation block is needed in situations involving management decisions; here an important role is played by the database, which contains planned, actual, calculated, reporting and other constant or operational indicators. The data entry and correction block is used to reflect current changes in the database promptly.

The system creation module serves to create the set (hierarchy) of rules. Two approaches can underlie it: using algorithmic programming languages, or using expert-system shells.

An expert-system shell is a ready-made software environment that can be adapted to a specific problem by creating an appropriate knowledge base. In most cases, using a shell makes building an expert system faster and easier than programming one from scratch.

Introduction

This paper is devoted to the topic "Information technology: origins and stages of development, purpose, means and methods."

The relevance of the topic is explained by the fact that, in the course of economic activity, information has become critical for actors in the world and national economies. In modern conditions, information is also a powerful factor accelerating the radical restructuring of production, influencing not individual links but the entire process of material production as a whole. In the Russian Federation, the organizational, material and legal prerequisites for information support of management in all sectors of the national economy are now being created: the legislative base is being formed, the sphere of information services is developing, the technical support of the information environment is improving (including through domestic production), and the information component of organizations throughout society is growing. As a result, the process of "initial accumulation" of the information market's resources has intensified, and the next stage should be the establishment of civilized "rules of the game" on that market. Hence the need to develop highly efficient, functional information technology (hereinafter IT).

The purpose of this work is therefore a concise systematization of information about information technologies at the present stage of their development, as tools for regulating the information market.

This purpose gives rise to the following tasks:

Give a definition of the concept of information technology and consider the history of their formation;

Describe the goals of development and functioning of information technologies;

Give examples of means and methods of information technology.

The concept of information technology and the history of its formation

Information technologies have long since entered our everyday life and taken root in it; the concept itself, however, remains broad and ill-defined. Technology has traditionally been understood as the process of creation and production, in both art and craft, where the process itself involves a series of consistent efforts to achieve a set goal.

This human-controlled process includes not only goals but also particular means, methods and strategies. In material production technologies, for example, the process covers everything from the collection and processing of raw materials to the manufacture of a final product with a given set of characteristics and qualities.

Accordingly, applying different technologies to the same material yields different products, since technology changes the initial state of the raw material to obtain entirely new objects of production.

Since information is one of the most valuable resources of society, no less important than traditional material resources such as oil, gas and minerals, working with information resources can be compared with conventional production processes and likewise called a technology. The following definition is then fair: information technology is a process, or a set of processes, of information processing. Information technology (IT) can be represented in the form of a diagram (Fig. 1). (Konopleva I.A., Khokhlova O.A., Denisov A.V. Information Technology. Moscow: Prospect, 2013. 328 p.)

Because the input and output of IT are neither matter nor energy but information, information technology can also be defined as a set of processes that use means and methods of accumulating, processing and transmitting primary information to obtain information of a new quality about the state of an object, process or phenomenon.

This information of a new quality is called an information product. Schematically, the process of converting information into an information product, and later into a software product, can be illustrated as follows (Fig. 2). Threats here are understood as a combination of factors that endanger valuable information, namely the possibility of its unauthorized access and/or distribution. (Yudina I.G. Complex information product: characteristics and definition // Bibliosphere. 2012. No. 5. Pp. 43-46.)

Figure 1

Information technology diagram

If material products are produced to meet the needs of people and their communities, the goal of information technology is to obtain an information product for analysis by a person, who then makes decisions and takes action on its basis. As in material production, applying different technologies to the same incoming information yields different information products.

The legal literature has not yet settled the concept of an "information product"; in particular, it is absent from the Law of the Russian Federation "On Information, Informatization and Information Protection". One can only consider the definition given in the Law of the Russian Federation "On Participation in International Information Exchange", which, however, is no longer in force: an information product is documented information prepared in accordance with the needs of users and intended or used to meet those needs. (Sinatorov S.V. Information Technology. Moscow: Dashkov and Co, 2010. 456 p.)

Figure 2

Place of information and software product in the information circulation system


Consequently, the final purpose of an information product, like that of information technology, is the satisfaction of human needs. We will discuss the goals of information technology in more detail later.

The era of information technology (IT) can be considered to have begun when humans started to distinguish themselves from the surrounding world: language, the oral reproduction of information, and its transmission by means of signs and sounds can all be called the first stage in the development of information technology.

The emergence of writing marks the second stage in the development of information technology. The ability to reproduce information on material carriers (wooden, wax-coated or clay tablets, papyrus, leather) led to the first repositories of information, libraries. Mass dissemination of information, however, began with printing (Table 1). (Aloshti H.R. A philosophical view of information and information technology // Scientific and Technical Information. Series 2: Information Processes and Systems. 2012. No. 4. Pp. 1-12.)

The third stage in the development of information technology can be called the period of the emergence and rapid introduction of mechanical means of processing, storing and transmitting information, such as the typewriter or the adding machine.

Discoveries in the field of electricity revolutionized information technologies and led to the transition to the fourth stage of their development: it became possible to transfer significant amounts of information over long distances at considerable speed (telephone, teletype) and to store it on magnetic media.

Table 1

Stages of IT development

Stage (period) | Tools | Tasks to be solved | Type of IT
First (150,000 BC - 3,000 BC) | Primitive tools for drawing symbols on household items | Cohesion of individual tribes into clan communities; formation of the first societies | Non-mechanized
Second (3,000 BC - 5th century AD) | Writing instruments, the first printing presses | Maintaining power and order in the first states; organization of labor | Primitively mechanized
Third (5th century AD - 19th century AD) | Printing presses and calculating keyboard devices | Mechanization of control systems | Mechanized
Fourth (early 20th century - 1940s) | Remote communication complexes | Global automation of management processes | Automated
Fifth (1940s - present) | Electronic computers | Management of the global economy in the information market | Electronic, digital: a combination of computing technology and communications

The beginning of the fifth stage in the development of information technologies is associated with the appearance of the first electronic computers and the transition to electronic information technologies.

Compared with analog sources, the main advantages of electronic sources of information are their speed and ever-growing mass character (information on the Internet is a vivid example). The rapid development of computer technology gives rise to new forms and methods of processing, storing and transmitting information.

Separate stages of the development of computer information technologies can also be distinguished:

Machine resources stage (computer implementation, programming in machine codes);

Programming stage (programming languages, batch processing);

The stage of new information technologies, characterized by the emergence of personal computers (PCs), computer networks, automated workstations (AWPs), databases, OLAP technologies (dynamic data analysis), Internet technologies, etc.

The main tasks of modern IT are:

Achieving the universality of communication methods;

Support for multimedia systems;

Maximum simplification of the means of communication in the "man-PC" system.

In addition, IT as a system has the following properties:

Expediency;

The presence of components and structure;

Interaction with the external environment;

Integrity;

Development in time. (Pastukhov V.A. Information Technology Management // Oil Refining and Petrochemistry. Scientific and Technical Achievements and Best Practices. 2011. No. 5. Pp. 59-61.)

The history of information technology dates back long before the emergence of computer science as a modern discipline in the 20th century. Information technology (IT) is associated with the study of methods and means of collecting, processing and transmitting data in order to obtain information of a new quality about the state of an object, process or phenomenon.

As humanity's need to process ever larger amounts of data has grown, the means of obtaining information have improved, from the earliest mechanical inventions to modern computers. Related mathematical theories have developed alongside information technology and now form its modern concepts.

Information technologies activate and make effective use of the information resources of society (scientific knowledge, discoveries, inventions, technologies, advanced experience), making it possible to achieve significant savings in other types of resources: raw materials, energy, minerals, materials and equipment, human resources, and social time. To date, IT has gone through several evolutionary stages, the succession of which has mainly been determined by scientific and technological progress and the emergence of new technical means of information processing. The main technical means of information processing is the personal computer, which has significantly influenced both the concept of building and using technological processes and the quality of the information obtained after processing.

Early history

The earliest mention of the use of computing devices dates to 2700-2300 BC, when the abacus was widespread in ancient Sumer. It consisted of a board with drawn lines that delimited the sequence of orders of the number system. The Sumerian abacus was originally used by tracing lines in sand and placing pebbles; modified abaci were later used much like modern calculators.

Mechanical analog computing devices appeared hundreds of years later in the medieval Islamic world. Examples of devices from this period are the equatorium of the inventor Az-Zarqali, the geared astrolabe of Abu Rayhan al-Biruni, and the torquetum of Jabir ibn Aflah. Muslim engineers built a number of machines, including musical automata that could be "programmed" to play a variety of compositions; such devices were developed by the Banu Musa brothers and Al-Jazari. Muslim mathematicians also made important advances in cryptography and cryptanalysis, such as Al-Kindi's frequency analysis.
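
Al-Kindi's frequency analysis can be illustrated with a short sketch: count letter frequencies in a ciphertext and assume the most frequent letter stands for the most common plaintext letter. The sample text and the Caesar-style cipher below are invented for illustration; Al-Kindi of course worked with Arabic texts by hand.

```python
# A sketch of frequency analysis against a simple shift cipher.
from collections import Counter

def most_common_letter(text):
    counts = Counter(c for c in text.lower() if c.isalpha())
    return counts.most_common(1)[0][0]

def guess_caesar_shift(ciphertext, expected="e"):
    """Assume the most frequent ciphertext letter stands for `expected`,
    the most common letter in English plaintext."""
    top = most_common_letter(ciphertext)
    return (ord(top) - ord(expected)) % 26

plain = "the quick brown fox jumps over the lazy dog and meets the other dog"
shift = 3
cipher = "".join(
    chr((ord(c) - 97 + shift) % 26 + 97) if c.isalpha() else c for c in plain
)
print(guess_caesar_shift(cipher))  # 3
```

The attack works whenever the sample is long enough for letter statistics to dominate, which is exactly the observation Al-Kindi recorded.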

After John Napier introduced logarithms for computational purposes in the early 17th century, there followed a period of significant progress among inventors and scientists in creating computing tools. In 1623, Wilhelm Schickard designed a calculating machine but abandoned the project when the prototype he had begun to build was destroyed by fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, built the first mechanical adding device, whose description drew on the ideas of the Greek mathematician Heron. Then, in 1672, Gottfried Wilhelm Leibniz invented the stepped reckoner, which he completed in 1694.

To be able to create the first modern computer, a significant development of the theory of mathematics and electronics was still required.

Binary logic

By this time, the first mechanical devices controlled by a binary scheme had been invented. The Industrial Revolution spurred the mechanization of many tasks, including weaving. Punched cards controlled the looms of Joseph Marie Jacquard, where a punched hole in the card represented a binary one and an unpunched spot a binary zero. Thanks to punched cards, the looms could reproduce the most intricate patterns. The Jacquard loom was far from being a computer, but it showed that a binary system could be used to control machinery.
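
The punched-card principle can be shown in miniature: each card row is a binary pattern, and the "loom" simply renders whatever the cards encode. The pattern below is invented for illustration.

```python
# A sketch of the Jacquard idea: rows of bits drive a mechanical pattern.
card_rows = [
    0b10011001,
    0b01100110,
    0b10011001,
    0b01100110,
]

for row in card_rows:
    # a punched hole (1) raises a thread, an unpunched spot (0) does not
    print("".join("#" if (row >> bit) & 1 else "." for bit in range(7, -1, -1)))
```

Changing the cards changes the woven pattern without touching the machine itself, which is precisely what made the loom a forerunner of stored programs.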

Formation of discipline

Computer pioneers

Before the 1920s, "computers" (in the original sense of computing machines) were clerks who performed calculations. Many thousands of such computers were employed in commerce, government and research institutions. Most of these "computers" were women with special training; some performed astronomical calculations for calendars.

The mathematical foundations of modern computer science were laid by Kurt Gödel with his incompleteness theorem (1931), in which he showed that there are limits to what can be proven and disproven within a formal system. This led Gödel and others to define and describe formal systems, including such concepts as the μ-recursive functions and the λ-definable functions.

1936 was a key year for computer science: Alan Turing and Alonzo Church independently presented formalizations of the notion of algorithm, defining the limits of what can be computed and giving a "purely mechanical" model of computation.

Alan Turing and the Turing machine

After the 1920s, the expression "calculating machine" referred to any machine that performed the work of a human computer, especially machines developed in accordance with the effective methods of the Church-Turing thesis. The thesis states: "Any algorithm can be specified in the form of a corresponding Turing machine or a partially recursive definition, and the class of computable functions coincides with the class of partially recursive functions and with the class of functions computable on Turing machines." In other words, the Church-Turing thesis is a hypothesis about the nature of mechanical computing devices, such as electronic computers: any calculation that is possible at all can be done on a computer, provided there is sufficient time and storage space.

Mechanisms that computed over continuous quantities became known as the analog type. Values in such mechanisms were represented by continuous physical quantities, for example the angle of rotation of a shaft or a difference in electrical potential.

Unlike analog machines, digital machines represented the state of a numerical value by storing each digit separately. Digital machines used various processors or relays before the invention of random-access memory.

From the 1940s, the name "calculating machine" began to be supplanted by the term "computer". Those computers could perform the calculations formerly done by clerks. Since values were no longer tied to physical characteristics (as in analog machines), a logical computer based on digital hardware could do anything that could be described as a purely mechanical system.

Turing machines were designed to define formally and mathematically what can be computed, given constraints on computational power. If a Turing machine can complete a task, the task is considered Turing-computable. Turing focused mainly on designing a machine that could determine what can be calculated; he concluded that as long as there is a Turing machine that can compute an approximation of a number, that value is computable. In addition, a Turing machine can interpret logical operators such as AND, OR, XOR, NOT and If-Then-Else to determine whether a function is computable.
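
The formal model is small enough to simulate directly. The sketch below is a minimal single-tape Turing machine simulator; the example machine (its states, symbols and transition table are invented here) flips every bit of a binary string and halts on the first blank.

```python
# A minimal Turing-machine simulator: a sparse tape, a head position,
# a current state, and a transition table.

def run_turing_machine(tape, transitions, state="start", blank="_"):
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, blank)
        state, tape[pos], move = transitions[(state, symbol)]
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape) if tape[i] != blank)

# Transition table: (state, read symbol) -> (next state, write symbol, move)
transitions = {
    ("start", "0"): ("start", "1", "R"),   # flip 0 -> 1, move right
    ("start", "1"): ("start", "0", "R"),   # flip 1 -> 0, move right
    ("start", "_"): ("halt", "_", "R"),    # blank: done
}

print(run_turing_machine("1011", transitions))  # 0100
```

Everything the machine "knows" lives in the transition table, which is why the table itself is the formal object Turing reasoned about.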

At a symposium on large-scale digital technology in Cambridge, Turing said, "We are trying to build a machine to do various things simply by programming, not by adding additional hardware."

Shannon and information theory

Before and during the 1930s, electrical engineers could build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc way, without theoretical rigor. That changed with the publication of Claude Elwood Shannon's 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits". Influenced by Boole's work, Shannon recognized that Boolean algebra could be used to organize electromechanical relays to solve logic problems (it then came to be used in telephone switches). This concept, exploiting the properties of electrical switches, lay at the heart of all electronic digital computers.

Shannon went on to found a new branch of science, information theory. In 1948 he published the article "A Mathematical Theory of Communication". Its ideas apply probability theory to the problem of how best to encode the information a sender wants to convey. This work forms one of the theoretical foundations for many areas of research, including data compression and cryptography.
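
The central quantity of that paper, entropy, can be computed in a few lines. This sketch uses the standard formula H = -Σ p·log2(p), estimating symbol probabilities from counts in a sample message (the messages below are invented for illustration).

```python
# Shannon entropy in bits per symbol: a lower bound on how compactly
# a source with these symbol frequencies can be encoded.
import math
from collections import Counter

def entropy_bits_per_symbol(message):
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform 4-symbol source needs exactly 2 bits per symbol...
print(entropy_bits_per_symbol("abcd"))            # 2.0
# ...while a skewed source can, in principle, be encoded more compactly.
print(round(entropy_bits_per_symbol("aaab"), 3))  # 0.811
```

This is exactly why compression works: real text is highly skewed, so its entropy is far below the naive bits-per-character cost.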

Wiener and Cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy aircraft, Norbert Wiener coined the term cybernetics, from the ancient Greek κυβερνητική, "the art of management". He published "Cybernetics" in 1948, which influenced the emergence of artificial intelligence. Wiener also compared computation, memory devices and other cognitively similar concepts to a kind of analysis of brain waves.

John von Neumann and von Neumann architecture

In 1946, a model of computer architecture was created that became known as the von Neumann architecture. Since 1950, the von Neumann model has ensured consistency in the designs of subsequent computers. The architecture was considered innovative because von Neumann introduced a representation in which machine instructions and data share allocated memory regions. The von Neumann model consists of three main parts: the arithmetic logic unit (ALU), memory, and a control unit.
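
The defining feature, instructions and data living in one memory driven by a fetch-execute loop, can be shown with a toy machine. The four-instruction set below is invented purely for illustration and is not any historical machine's instruction set.

```python
# A toy von Neumann machine: one memory holds both code and data,
# and a control loop fetches, decodes and executes instructions.

def run(memory):
    acc, pc = 0, 0  # accumulator and program counter
    while True:
        op, arg = memory[pc]          # fetch
        pc += 1
        if op == "LOAD":              # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program (addresses 0-3) and data (addresses 10-12) in the same memory.
memory = {
    0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
    10: 2, 11: 3, 12: 0,
}
print(run(memory)[12])  # 5
```

Because the program is just more memory contents, a program can in principle read or rewrite itself, which is the property that made the stored-program idea so powerful.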

Hardware development

First and second generations of computers

In 1950, the National Physical Laboratory (UK) completed the Pilot ACE, a small-scale programmable computer based on the Turing machine model.

Among other significant developments: on September 13, 1956, IBM introduced the first hard disk drive, the RAMAC, with a capacity of 5 megabytes; and on September 12, 1958, the first integrated circuit was demonstrated at Texas Instruments (Jack Kilby and, independently, Robert Noyce, one of the founders of Intel, are considered the inventors of the microcircuit).

Third and later generations of computers

Under the leadership of S. A. Lebedev, in 1948-1951 the first domestic computer, the MESM (small electronic calculating machine), a first-generation machine, was created (1951). The MESM's architecture and construction principles were similar to those used earlier in ENIAC, although Lebedev was not familiar with von Neumann's architecture. In parallel with his work in Kiev, Lebedev led the development of the large electronic calculating machine BESM at ITMiVT. From 1953, the first BESM model had reduced performance, about 2,000 operations per second. Seven copies of the BESM-2 were built at the Kazan plant of calculating and analytical machines. The BESM-4 version was built on a semiconductor element base (chief designer O. P. Vasiliev, scientific supervisor S. A. Lebedev).

The M-20 (chief designer S. A. Lebedev) was one of the best first-generation machines (1958). The M-40, a computer created in 1960, is considered the first vacuum-tube "Elbrus" (chief designer S. A. Lebedev, deputy V. S. Burtsev). In 1961, during testing, an anti-missile guided by the M-40 computer successfully destroyed an intercontinental ballistic missile capable of carrying nuclear weapons.

The pinnacle of Lebedev's scientific and engineering achievements was the BESM-6, whose first model was built in 1967. It implemented such new principles and solutions as parallel processing of several instructions, ultra-fast register memory, interleaving and dynamic allocation of random-access memory, multiprogram operation, and an advanced interrupt system. The BESM-6 was a second-generation supercomputer.

From 1958, the control computer "Dnepr" was developed (chief designer B. N. Malinovsky, scientific supervisor V. M. Glushkov), and from 1961 these machines began to be introduced at the country's factories. They appeared simultaneously with control machines in the United States and were produced for a whole decade (a computer usually becomes obsolete in five or six years).

In 1962, on the initiative of V. M. Glushkov, the SKB (special design bureau) of computers was created, and in 1963 ... After Dnepr, the main direction of the team's work under Glushkov's leadership was the creation of intelligent computers that simplify engineering calculations.

Formation of programming in the USSR

The starting point for the emergence of domestic programming should be considered 1950, when the prototype of the first Soviet computer, the MESM (and the first computer in continental Europe), appeared.

The main and generally recognized achievement of D. A. Pospelov is the creation, in the late 1960s, of a set of new methods for constructing control systems based on semiotic models for representing control objects and describing control procedures. He created the apparatus of tier-parallel forms, which made it possible to pose and solve many problems in organizing parallel computations in computer complexes and networks. On this basis, in the 1970s, such problems were solved as the synchronous and asynchronous distribution of programs among the computers of a computing system, optimal program segmentation, and the optimization of information exchanges.

Software development

OS

Mobile operating systems are also gaining popularity. These are the operating systems that run on smartphones, tablets, PDAs and other digital mobile devices. Modern mobile operating systems combine the features of a personal computer operating system with features such as a touch screen, cellular communication, Bluetooth, Wi-Fi, GPS navigation, a camera and camcorder, speech recognition, a voice recorder, an MP3 player, NFC and infrared.

Mobile devices with telephony capabilities (e.g. smartphones) contain two operating systems: the software platform available to the user is complemented by a second, low-level, proprietary real-time operating system that runs the radio and other hardware. The most common mobile operating systems are Android, Asha, BlackBerry, iOS, Windows Phone, Firefox OS, Sailfish OS, Tizen and Ubuntu Touch OS.

Development of networks

One of the first attempts to create a means of communication using electricity dates back to the second half of the 18th century, when Lesage built an electrostatic telegraph in Geneva in 1774. In 1798, the Spanish inventor Francisco de Salva created his own design for an electrostatic telegraph. Later, in 1809, the German scientist Samuel Thomas von Soemmerring built and tested an electrochemical telegraph.

The telegraph's further development was the telephone. Alexander Graham Bell made the first telephone calls over telegraph wires on October 9, 1876. Bell's handset served in turn both to transmit and to receive human speech. The telephone, patented in the USA in 1876 by Bell, was called the "talking telegraph"; the subscriber was called through the handset by means of a whistle, and the range of the line did not exceed 500 meters.

The phone's later history includes the electric microphone (which eventually replaced the carbon one), the speakerphone, tone dialing and digital sound compression, as well as new technologies: IP telephony, ISDN, DSL, cellular communication and DECT.

Later, the need arose for data transmission networks (computer networks), communication systems between computers or computing equipment. In 1957, the US Department of Defense decided that, in case of war, the American army needed reliable communication and information systems. Paul Baran developed a distributed network design; the resulting project was named ARPANET (Advanced Research Projects Agency Network). Because it is very difficult to transmit an analog signal over long distances without distortion, he proposed transmitting digital data in packets.
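
The packet idea can be sketched simply: split a message into numbered packets so they can travel independently over different routes and be reassembled in order at the destination. The message and packet size below are invented for illustration; real protocols also add headers, checksums and retransmission.

```python
# A sketch of packet switching: number the chunks, tolerate reordering.
import random

def to_packets(message, size=8):
    """Split a message into (sequence number, chunk) packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Restore the message regardless of arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))

message = "digital data survives long links better than analog"
packets = to_packets(message)
random.shuffle(packets)        # packets may arrive out of order
print(reassemble(packets) == message)  # True
```

Because each packet carries its own sequence number, the network is free to route packets independently, which is what gives a distributed network its resilience.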

In December 1969, an experimental network was created, connecting four nodes:

  • University of California Los Angeles (UCLA)
  • University of California at Santa Barbara (UCSB)
  • Stanford Research Institute (SRI)
  • University of Utah

Over the years, the network has gradually expanded to the entire United States.

In 1965, Donald Davies, a scientist at the National Physical Laboratory in England, proposed creating a packet-switched computer network in England. The idea was not supported, but by 1970 he had built such a network to serve the needs of his multidisciplinary laboratory and to prove the technology in practice. By 1976, the network comprised 12 computers and 75 terminal devices.

In 1971, the first program for sending e-mail over the network was developed (by Ray Tomlinson), and it immediately became very popular among users. In 1973, the first foreign organizations, from Great Britain and Norway, were connected to the network via a transatlantic telephone cable, and the computer network became international.

In 1983, the ARPANET switched to the TCP/IP protocol, and the term "Internet" became attached to the network. The Ethernet specification had been published in September 1980. On November 12, 1990, computer scientist Tim Berners-Lee published a proposal for a hypertext system that came to be called the World Wide Web. In the 1990s, the Internet consolidated most of the then existing networks (although some, like Fidonet, remained separate). The merger looked attractive because of the absence of a single governing body and because of the openness of the Internet's technical standards, which made networks independent of particular businesses and companies.


Lecture INFORMATION TECHNOLOGIES

Lecture plan

3.1. Definition of information technology

3.2. The history of the emergence of information technology

3.3. Stages of development of automated information technologies

3.4. The role and significance of information technology

Definition of information technology

The creation and functioning of information systems is closely related to the development of information technologies, which form their main part. Technology, translated from Greek, means art, craft, skill: something directly related to processes, that is, to certain sets of actions aimed at achieving a goal. A process is determined by the chosen strategy and is implemented by a combination of various means and methods. Technology changes the quality or the original state of matter in order to obtain a material product.

Information is one of society's most valuable resources, alongside traditional material resources such as oil, gas and minerals. This means that the process of its processing, by analogy with the processing of material resources, can be called a technology (Fig. 3.1).

Information processes, under the legislation of the Russian Federation, are the processes of collecting, processing, accumulating, storing, searching for and disseminating information. Information technology is an information process that uses a set of means and methods for collecting, processing and transmitting data (primary information) to obtain information of a new quality about the state of an object, process or phenomenon (an information product) (Fig. 3.1).

The purpose of material production technology is to release products that satisfy the needs of a person or a system. The purpose of information technology is to produce information for analysis by a person, so that, on its basis, a decision to perform an action can be made.

Information technology in management is a set of methods for processing disparate initial data into reliable and timely information for the decision-making mechanism, using hardware and software tools, in order to achieve optimal market parameters of the controlled object. Automated information technology is a system-organized set of methods and means for collecting, registering, transferring, accumulating, searching, processing and protecting information, based on advanced software, computer technology and communications, as well as on the methods by which information is offered to clients.

An information technology toolkit is one or more interrelated software products for a certain type of computer whose technology of work allows the goal set by the user to be achieved. The following tools are used: word processors (editors), desktop publishing systems, spreadsheets, database management systems, electronic notebooks, electronic calendars, functional information systems (financial, accounting, marketing, etc.), expert systems, and so on.

Information technology is closely related to information systems, which are its main environment. Information technology is a process of clearly regulated rules for performing operations on primary data, whose main purpose is to obtain the necessary information. An information system is an environment whose constituent elements are computers, computer networks, software products, databases, people, and various kinds of technical and software communications; that is, it is a human-computer information processing system whose main purpose is to organize the storage and transmission of information. The functions of an information system cannot be implemented without knowledge of the information technology oriented towards it, whereas information technology can exist outside the sphere of an information system.

The technological process does not have to include all the levels shown in Fig. 3.2. It can start at any level and need not include, for example, stages or operations, but may consist only of actions.


Various software environments can be used to implement the stages of the technological process. Information technology, like any other technology, should provide a high degree of division of the entire information processing process into stages (phases), operations and actions, and should include the entire set of elements necessary to achieve the goal.

The history of the emergence of information technology

The term "information technology" appeared in the late 1970s and came to mean information processing technology. Computers have changed the way we work with information and have increased the responsiveness and efficiency of management, but at the same time the computer revolution has created serious social problems of information vulnerability.
In business, using a computer consists in identifying task situations, classifying them and solving them with hardware and software. The corresponding rules of action, which apply a common set of means to a whole class of tasks or task situations, are called technologies.

The use of computer technology enables a company to achieve a competitive advantage in the market by applying basic computing concepts:

· To increase the effectiveness and efficiency of work through the use of technological, electronic, instrumental and communication means;

· To maximize individual efficiency by accumulating information and using means of access to databases;

· To increase the reliability and speed of information processing by means of information technologies;

· To have a technological basis for specialized teamwork.

The information age began in the 1950s, when UNIVAC, the first general-purpose computer for commercial use, was introduced to the market; it performed calculations in milliseconds. The search for a mechanism for computing, however, began many centuries earlier. The abacus, one of the first mechanical calculating devices, was invented some five thousand years ago, independently and almost simultaneously, in Ancient Greece, Ancient Rome, China, Japan and Russia. The abacus is the ancestor of digital devices.

Historically, computing and computing technology developed along two directions: analog and digital. The analog direction is based on calculating an unknown physical object (process) by analogy with the model of a known object (process). The founder of the analog direction was the Scottish baron John Napier, who theoretically substantiated logarithmic functions and compiled a practical table of logarithms, which simplified the operations of multiplication and division. A little later, the Englishman Henry Briggs compiled a table of decimal logarithms.
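The idea behind the tables of Napier and Briggs, reducing multiplication to the addition of logarithms, can be sketched in modern Python (an illustration added here, not part of the original lecture):

```python
import math

def multiply_via_logs(a, b):
    """Multiply two positive numbers the way a table of decimal
    logarithms was used: look up the logs, add them, then take
    the antilogarithm (10 raised to the sum)."""
    log_sum = math.log10(a) + math.log10(b)  # log10(a*b) = log10(a) + log10(b)
    return 10 ** log_sum

# A user of a log table or slide rule would compute 37 * 54 like this:
print(multiply_via_logs(37, 54))  # approximately 1998
```

The result is approximate only because of floating-point rounding; for a 17th-century calculator the accuracy was limited by the number of digits in the printed table, which is also why the slide rule described below works on the same principle.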

In 1623 William Oughtred invented the rectangular slide rule, and in 1630 Richard Delamaine invented the circular slide rule; in 1775 John Robertson added a slider to the ruler, and in 1851-1854 the Frenchman Amédée Mannheim gave the slide rule an almost modern design. In the middle of the 19th century a number of devices were created: the planimeter (for calculating the area of flat figures), the curvimeter (for determining the length of curves), the differentiator, the integrator, the integraph (for obtaining graphical results of integration) and others.

The digital direction in the development of computing technology turned out to be more promising. At the beginning of the 16th century Leonardo da Vinci created a sketch of a 13-digit adder with ten-toothed wheels (a working prototype was built only in the 20th century).
In 1623 Professor Wilhelm Schickard described the design of a calculating machine. In 1642 the French mathematician and philosopher Blaise Pascal (1623-1662) developed and built a calculating device, the "Pascaline", to help his father, a tax collector. This counting-wheel design was used in all mechanical calculators until the 1960s, when they fell out of use with the advent of electronic calculators.

In 1673 the German philosopher and mathematician Gottfried Wilhelm Leibniz invented a mechanical calculator capable of performing the basic arithmetic operations; he also described the binary number system. In 1727 Jacob Leupold created a calculating machine based on the Leibniz binary system. In 1723 a German mathematician and astronomer created an arithmetic machine that determined the quotient and the number of successive addition operations when multiplying numbers, and monitored the correctness of data entry.
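The binary notation that Leibniz described, and on which all the later machines in this history rest, can be illustrated with a short Python sketch (a modern addition for illustration, not part of the lecture):

```python
def to_binary(n):
    """Represent a non-negative integer in binary positional
    notation: every digit is either 0 or 1."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder gives the lowest binary digit
        n //= 2
    return "".join(reversed(digits))

# With only two digit values, addition and carrying follow one simple
# rule, which is why binary suits mechanical relays and, later,
# electronic switching elements.
print(to_binary(6))       # 110
print(to_binary(6 + 7))   # 1101
```

Both the Zuse Z3 and the Atanasoff machine mentioned below used exactly this positional principle, replacing the ten states of a decimal counting wheel with the two states of a relay or tube.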

In 1896 Herman Hollerith founded the Tabulating Machine Company to manufacture tabulating calculating machines; it merged with several other companies in 1911, and in 1924 general manager Thomas Watson changed its name to International Business Machines Corporation (IBM). The beginning of modern computer history is marked by the invention in 1941 of the Z3 computer (built on electrical relays and controlled by a program) by the German engineer Konrad Zuse, and by the invention of a simple electronic computer by John V. Atanasoff, a professor at Iowa State College. Both systems used the principles of modern computers and were based on the binary number system.

The main components of first-generation computers were vacuum tubes; memory systems were built on mercury delay lines, magnetic drums and Williams cathode-ray tubes. Data were entered using punched tape, punched cards and magnetic tape with stored programs, and printing devices were used for output. The performance of first-generation computers did not exceed 20 thousand operations per second. Vacuum-tube machines were produced on an industrial scale until the mid-1950s.

In 1948, in the USA, Walter Brattain and John Bardeen invented the transistor, and in 1954 Gordon Teal used silicon to manufacture transistors. From 1955 computers were built on transistors. In 1958 Jack Kilby invented the integrated circuit, and industrial integrated circuits (chips) followed. In 1968 Robert Noyce co-founded Intel (Integrated Electronics). Computers on integrated circuits began to be produced in the 1960s. Second-generation computers became compact, reliable and fast (up to 500 thousand operations per second); devices for working with magnetic tape and memory on magnetic disks were improved.

In 1964 third-generation computers appeared, built on electronic circuits of small- and medium-scale integration (up to 1000 components per chip). Examples: the IBM 360 (USA, IBM) and the ES-1030 and ES-1060 (USSR). In the late 1960s minicomputers appeared,
and in 1971 the microprocessor. In 1972 Intel released the first widely known microprocessor, the Intel 8008, and in 1974 the second-generation microprocessor Intel 8080.

Since the mid-1970s, fourth-generation computers have been developed. They were based on large- and very-large-scale integrated circuits (up to a million components per chip) and on fast memory systems with a capacity of several megabytes. Such a machine boot-strapped itself when switched on and transferred the contents of RAM to disk when switched off. Computer performance reached hundreds of millions of operations per second. The first such computers were produced by Amdahl Corporation.

In the mid-1970s the first industrial personal computers appeared. The first of them, the Altair, based on the Intel 8080 microprocessor, was created in 1975. In August 1981 IBM released the IBM PC, based on the Intel 8088 microprocessor, which quickly gained popularity.

Since 1982, fifth-generation computers, oriented towards knowledge processing, have been under development. In 1984 Microsoft presented the first versions of the Windows operating system. In March 1989 Tim Berners-Lee, an employee of the European Organization for Nuclear Research (CERN), proposed the idea of a distributed information system, the World Wide Web; the project was adopted in 1990.

Like hardware, software development is also divided into generations. First-generation software consisted of machine-oriented programming languages that only computer specialists knew. Second-generation software is characterized by the development of problem-oriented languages such as Fortran, Cobol and Algol-60.

The use of interactive operating systems, database management systems and structured programming languages such as Pascal belongs to third-generation software. Fourth-generation software includes distributed systems: local and global networks of computer systems, advanced graphical user interfaces, and integrated programming environments. Fifth-generation software is characterized by knowledge processing and parallel programming.

The use of computers and information systems, an industry that began in the 1950s, is the main means of increasing competitiveness through the following main advantages:

· Improving and expanding customer service;

· Increasing the level of efficiency by saving time;

· Increasing the load and bandwidth;

· Improving the accuracy of information and reducing losses caused by errors;

· Raising the prestige of the organization;

· Increase in business profits;

· Ensuring the possibility of obtaining reliable information in real time using the interactive mode and organized queries;

· The use of reliable information by the manager for planning, management and decision-making.

Lecture 1. The concept of information technology

(Topic No. 1, Lesson No. 1. Educational-methodological development (lecture) of the Department of Industrial and Environmental Safety for the discipline "Information Technologies in Risk Management")

In the early stages of history, people needed coded communication signals to synchronize their actions. The human brain solved this problem without artificially created tools: human speech developed. Speech was also the first carrier of knowledge, which was accumulated and passed from generation to generation in the form of oral stories. Man's natural capacity to accumulate and transfer knowledge received its first technological support with the creation of writing. The improvement of information carriers continues to this day: stone, bone, clay, papyrus, silk, paper, magnetic and optical media, silicon, ...

Writing became the first historical stage of information technology. The second stage was the emergence of book printing, which stimulated the development of the sciences and accelerated the accumulation of professional knowledge. The cycle knowledge - science - social production - knowledge was closed, and the spiral of technological civilization began to unwind at a breakneck pace. Printing created the informational prerequisites for the growth of productive forces.

The information revolution proper, however, is associated with the creation of computers at the end of the 1940s; from that time the era of information technology development begins. A very important property of information technology is that information serves it not only as a product but also as raw material: electronic modeling of the real world on a computer requires processing a significantly larger amount of information than the final result contains.

Several stages can be distinguished in the development of information technology, each characterized by a specific feature.

1. At the initial stage of the development of information technologies (1950s-1960s), interaction between man and computer was based on machine languages. Computers were available only to professionals.

2. At the next stage (1960s-1970s) operating systems were created, allowing several tasks formulated by different users to be processed; the main goal was to maximize the utilization of machine resources.

3. The third stage (1970s-1980s) is characterized by a change in the criterion of data-processing efficiency: human resources for the development and maintenance of software became the main cost. This stage saw the spread of minicomputers and the introduction of an interactive mode of operation for several users.

4. The fourth stage (1980s-1990s) brought a new qualitative leap in software development technology. The center of gravity of technological solutions shifted to creating means of interaction between users and computers in the course of building a software product. The key link in the new information technology became the representation and processing of knowledge, together with the universal spread of personal computers. Note that the generations of computers have evolved at a constant rate of about ten years per generation, and forecasts assumed that this pace would continue until the beginning of the 21st century. Each generational change of information technology requires retraining and a radical restructuring of the thinking of specialists and users, a change of equipment and the creation of more mass-produced computing equipment. Information technology, as an advanced field of science and technology, sets the rhythm of technical development for society as a whole. Investment in infrastructure and Internet services caused the rapid growth of the IT industry in the late 1990s.