The World's First Processor. The History of Processor Creation. How Intel Became the Leader Among Processor Manufacturers

The first four-bit microprocessors consisted of a single chip (die).

The first microprocessors were based on p-MOS circuits. Modern microprocessors are built on n-MOS circuits offering low cost and medium speed, on extremely low-power CMOS circuits, and on high-speed TTL circuits.

The first microprocessors (MPs) appeared in the early 70s as a result of the joint efforts of systems engineers, who worked out the architectural organization of computing hardware, and circuit engineers, who handled the design and production technology of radio-electronic equipment.

The first microprocessor, the 4-bit Intel 4004, entered an unprepared market in 1971. Designed to meet the needs of calculator manufacturers, the 4004 MP appeared before the world as the herald of a new era in integrated electronics.

The earliest microprocessors used a memory-management scheme in which programs worked directly with physical machine memory.

It is worth recalling that the first microprocessors imported to Japan in 1971 cost about a thousand dollars.

For more than 30 years since the appearance of the first microprocessors, certain rules of data exchange have been developed that developers of new microprocessor systems follow. These rules are not especially complicated, but successful work requires knowing them firmly and following them strictly.

Operating systems are created for each type of microprocessor on the basis of the instruction set defined for that microprocessor during its development. The first microprocessor was created by Intel, the leading chip manufacturer.

Can any technical achievement of the computer age rival the microprocessor in importance? The first microprocessors, whose short history began just a decade ago, rested mainly on the achievements of microelectronics, a technology that arose much later than computers themselves and largely independently of them. From the outset, microprocessor designers and manufacturers won strong acclaim as they demonstrated that each new development brought the microprocessor another small step closer in structure to a modern medium or large computing machine. Observers easily concluded that if packing density, speed, and automated design capabilities continued to rise as expected, microprocessors would soon be on par in power and logic with large minicomputers, and possibly large computers.

In 1970 another important step was taken on the road to the personal computer: Marcian Edward Hoff of Intel designed an integrated circuit similar in function to the central processing unit of a large computer. This is how the first microprocessor, the Intel 4004 (see the picture on the right), appeared; it went on sale in 1971. It was a real breakthrough: the Intel 4004, less than 3 cm in size, was more productive than the giant ENIAC machine. True, its capabilities were much more modest than those of the central processors of the large computers of that time: it worked much more slowly and could process only 4 bits of information at once (processors of large computers handled 16 or 32 bits simultaneously), but it also cost tens of thousands of times less.

The creation of an operating system like PC-DOS is neither a matter of chance nor the result of purely technocratic planning. Economic competition led to the emergence of operating systems for mainframes long before the first microprocessors appeared.

It is a single microcircuit that controls everything that happens in the PC. This microcircuit runs at a certain clock frequency, measured in megahertz. By today's standards the first microprocessors (the 8088 or 80286) were terribly slow and could not cope with modern programs.

Redesigning a large integrated circuit whenever a company wants to update its product range, which happens very often, is truly a colossal job. The microprocessor was born thanks to an idea put forward by specialists from Busicom: design an integrated circuit that could easily be adapted to any new product the company mastered. Alas, Japan was then still too weak in R&D, so the United States was able to seize the opportunity and create the first microprocessor.

However, Intel continued to adhere to the prototype on which the development funds had already been spent. Thus the well-known Intel 8008 MP became the first 8-bit microprocessor on the world market.

Who invented the world's first microprocessor, and when

Every Intel employee knows who invented the microprocessor. In 1969, Japanese developers who had previously designed calculators came to this then little-known firm to work on a project. The engineers had used twelve integrated circuits to build an ordinary desktop calculator, and Masatoshi Shima played the main role in that project. At the time, Ted Hoff managed one of Intel's departments. As the future creator of the microprocessor, he realized that instead of a calculator with programming ability, it would be better to make a computer that could be programmed to do the work of a calculator.

The creation of the world's first processor began with the development of its architecture. In 1969 an Intel employee suggested calling the first series of microprocessors the 4000 family. Each chip in the family had sixteen pins. This helps to understand what the first microprocessor consisted of. The 4001 model was a 2-Kbit (256-byte) read-only memory, and the 4002 its companion RAM chip. The 4003 was a ten-bit shift-register expander for connecting a keyboard and various indicators. And the 4004 was the four-bit processor itself; many consider it the very first microprocessor. The 4004 contained two thousand three hundred transistors and operated at a frequency of 108 kHz.

Today you can find different opinions about when the first processor was created; however, most agree that November 15, 1971 is the date of creation of the first microprocessor in the world. Initially the development was bought by the Japanese company Busicom for sixty thousand dollars, but Intel later returned the money to remain the sole rights holder of the invention.

The first processor was used in traffic control systems, in particular in traffic lights, and in blood analyzers. A little later the 4004 found a place in the Pioneer 10 space probe, launched in 1972.

The first domestic microprocessor was created in the early seventies at the Special Computing Center under the leadership of D.I. Yuditsky.

Thus, in the 70s microprocessors began to penetrate various areas of human activity. All processors were later divided into microprocessors proper and microcontrollers. The former are used in personal computers, while microcontrollers found application in controlling various systems. They have a weaker computing core but many additional peripheral units. Microcontrollers are sometimes called single-chip microcomputers, since all their units and modules are located directly on the chip.

Intel launches its first microprocessor, the 4004

Intel released the world's first microprocessor available to commercial organizations and ordinary people. A year earlier the military had developed the F14 CADC microprocessor, which remained classified as top secret until 1998.

The Japanese company Busicom Corp (formerly Nippon Calculating Machine, Ltd) produced calculators, while the microcircuits required for their operation were developed by Intel. Busicom ordered 12 chips for its new calculator. Each microcircuit had a minimal set of functions and could perform only a fixed list of operations; whenever a new operation was needed, an additional microcircuit had to be developed. Intel employees considered this economically and practically unjustified: it would be better to replace all the existing microcircuits with one central processor that would perform all the necessary tasks.

Both companies supported the idea. From 1969, Ted Hoff, a designer on the project at Intel, and Stanley Mazor, who had previously worked on general chip design, took up the design of the processor. Development began by reducing the number of microcircuits to four: a 4-bit central processing unit, read-only memory for storing fixed information, random-access memory for storing user information, and an I/O expander chip.

When the Italian physicist Federico Faggin came to work at Intel, development of the microprocessor moved to a new stage. He would later be named chief designer of the MCS-4 family of microprocessors. Faggin had developed similar circuits before: in 1961, at Olivetti, he was involved in the logical design of computers, and in 1968 he developed a commercial silicon-gate chip for Fairchild, the Fairchild 3708. This experience helped him put the CPU onto a single chip. Faggin made a huge contribution to the design and refinement of the microcircuit. His collaboration with Masatoshi Shima, an engineer at Busicom Corp, led to the first microprocessor, the 4004, introduced to the world on November 15, 1971. The microprocessor cost $200.

Why was the microprocessor named 4004? The first digit indicated the product category, for every Intel product was numbered. Memory chips (PMOS) carried the first number, NMOS chips the second, and bipolar microcircuits the third. Accordingly, 4-bit microprocessors received the fourth number. CMOS microcircuits were produced under the fifth number, magnetic-domain memories under the seventh, and 8-bit microprocessors and microcontrollers under the eighth. The sixth and ninth numbers were not used.

HISTORY OF THE CREATION AND DEVELOPMENT
OF MICROPROCESSOR AUTOMATION

Modern solutions in the field of automation, robotics, and electric drives cannot be imagined without microprocessor tools and systems. The famous American company Intel, founded in 1968, made a significant contribution to the development of semiconductor microcircuits. This was the time when new technologies emerged that made it possible to create miniature semiconductor devices: microcircuits. Their application opened new prospects in all areas of technology, including automation. The era of digital machine information processing began. The first computer, ENIAC, created in 1946, weighed about 30 tons and occupied a large room. By 1968 there were already 30 thousand computers in the world, predominantly large mainframes and cabinet-sized minicomputers. An unpleasant feature of these computers was frequent failure due to overheating vacuum tubes and the large number of connectors. The emergence of integrated electronics was therefore driven by objective needs.


Fig. 1. ENIAC (Electronic Numerical Integrator and Computer), the first general-purpose electronic digital computer


Intel was founded by the talented scientists and inventors Robert Noyce, Gordon Moore, and Andrew Grove. It was Robert Noyce who invented the integrated circuit in 1959. In the mid-60s Noyce worked as a manager at the American company Fairchild Semiconductor, known for its developments in electronic technology. Gordon Moore led research and development at Fairchild Semiconductor and was one of its eight founders. Andy Grove, a native of Hungary, was a process engineer; he joined Fairchild Semiconductor after earning his Ph.D. in chemical engineering from the University of California, Berkeley.

In the late 1960s many talented engineers left Fairchild Semiconductor and started their own firms. Robert Noyce and Gordon Moore founded Intel and became its first employees; in time Andy Grove joined them. The start-up capital ($2.5 million) was provided by the San Francisco financier Arthur Rock.

Intel initially specialized in the manufacture of semiconductor memory devices. Its first serial device was the 3101, a 64-bit Schottky-bipolar static random-access memory. But the special place Intel occupies in the world of electronics is associated with other devices: microprocessors. They became the technical base of the current computer-driven scientific and technological revolution.

The impetus for the creation of the microprocessor was a contract with the Japanese firm Busicom, which specialized in the production of calculators. Busicom commissioned Intel to develop twelve specialized microcircuits, but Intel did not have the human, financial, and manufacturing resources to fulfill such a large order. The talented engineer Ted Hoff then suggested creating, instead of twelve specialized microcircuits, one universal microcircuit that could replace them all. R. Noyce and G. Moore appreciated the elegance of T. Hoff's solution, and the idea also satisfied Busicom, which financed the work. Intel thus began developing a universal microcircuit that could be programmed to execute particular instructions. For the first time there was no need for a hardware implementation of a device's operating algorithm: all processing of numerical data was now carried out according to a program, which promised savings of time and money. A group of Intel engineers and designers headed by Federico Faggin worked on implementing T. Hoff's plans. After 9 months of hard work, the world's first microprocessor, the 4004, appeared. It contained 2,300 semiconductor transistors yet fit comfortably in the palm of a hand. In performance the new processor was not inferior to the ENIAC computer, which occupied 85 cubic meters and consisted of 18,000 vacuum tubes. Ted Hoff designed the architecture of the first processor, Stan Mazor its instruction set, and Federico Faggin the processor die.

Having appreciated the advantages of microprocessors, Intel's management entered negotiations with Busicom, as a result of which Intel acquired all rights to the 4004 processor for 60 thousand dollars (Busicom, it should be noted, soon went bankrupt). A wide advertising campaign then began, aimed at showing the engineering community the great potential of programmable devices in fields ranging from traffic control to the automation of complex production processes. Intel organized engineering seminars and published promotional materials and microprocessor reference guides. In some weeks the firm sold more reference manuals than microprocessors. Within a fairly short time, microprocessors became very widespread.

Thus the 4004 chip became the first microprocessor. About six months later several more companies announced similar devices. These microprocessors, made with p-MOS technology, were four-bit, meaning they could process only 4 bits of information at a time. Program length and the instruction set were limited, and the first processors lacked many of the functions required of modern microprocessors. In 1972 Intel released the 8008 processor, which inherited the basic features of the 4004. It was the first 8-bit processor and is now classified as a first-generation processor. It already had an accumulator, six general-purpose registers, a stack pointer, eight address registers, and special I/O instructions, but it did not see wide commercial use either.

In late 1973 Intel developed a new 8-bit microprocessor, the 8080. Its architecture and instruction set proved so successful that they are considered classic to this day.

The widespread use of microprocessors in technology began precisely with the advent of the 8080 chip, a second-generation processor, but it was not the only successful 8-bit processor. Six months later the 6800 microprocessor from the American company Motorola appeared and offered the Intel processor stiff competition. Like the 8080, the 6800 was made with n-MOS technology, required an external clock generator, and had a three-bus structure with a 16-bit address bus, a well-developed architecture, and a rich instruction system. Its main advantages were an interrupt system more powerful than the 8080's and a single supply voltage (rather than three, as in the 8080). The internal architecture of the 6800 also differed significantly from the 8080's, above all in the absence of general-purpose registers that, depending on the task, could hold either address information or numeric data. Instead, the processor gained a second, equivalent accumulator for data processing and specialized 16-bit registers that held only address information; data was fetched from external memory for processing and returned there afterward. The 6800's memory instructions were simpler and shorter, but transferring a byte to memory took longer than an exchange between the 8080's internal registers. Neither architecture had a decisive advantage, and each became the ancestor of a large family of microprocessors, Intel's and Motorola's, whose descendants compete to this day.

In 1978 Intel manufactured the first 16-bit microprocessor, the 8086, which International Business Machines (IBM) used (in its 8088 variant) to create personal computers, while the 16-bit Motorola 68000 chip was used in the famous Atari and Apple computers. As for "home" computers, they became widespread with the advent of the ZX Spectrum (based on the Z80 processor) from the English company Sinclair Research Ltd, founded by the talented engineer Sir Clive Sinclair. The idea of using a TV set instead of an expensive monitor and a household tape recorder for storing programs and data significantly reduced the cost of a home computer and made it affordable for the average buyer.

Intel 4004 - a 4-bit microprocessor developed by Intel Corporation and released on November 15, 1971.

This microcircuit is considered the world's first commercially available single-chip microprocessor.


Intel 8080 - an 8-bit microprocessor introduced in 1974. It provided a tenfold increase in computing performance over its predecessor.

This is the device that helped the engineering community embrace the idea of microprocessors. This chip sparked the boom in personal computers.


Intel 8048 - the world's first microcontroller, released in the late 70s.

This device became widespread thanks to its use in personal computer keyboards and game consoles.


Intel 8051 - a second-generation microcontroller, released in 1980.
Thanks to its successful architecture and instruction set, it became a de facto industry standard and is still produced by well-known corporations in America, Korea, and Japan.

Modern multi-core processor

The computational performance of modern microprocessors, according to the results of various tests, is approximately tens of thousands of times higher than the performance of the first processor.

Fig. 2. Lineup of key microprocessor and microcontroller models


A year after the creation of the 8080 microprocessor, several Intel engineers moved to Zilog and began working on a new processor that built on their previous designs. As a result, in 1976 the Z80 microprocessor appeared and became the finest representative of the 8-bit processors. Compared with the 8080, it required only one supply voltage and had a more powerful and flexible interrupt system, three times the clock speed, two accumulators, and a doubled set of general-purpose registers. The Z80 instruction set contained all 78 instructions of the 8080 plus almost as many additional ones, so programs written for the 8080 ran on the Z80 without any changes.

Later, in the mid-70s, another trend emerged in microprocessor development, directly related to automation: processors for embedded solutions. It was started by the Intel 8085. At first the chip was conceived simply as a continuation of the 8080, but soon the Z80 and Motorola's new 6809 appeared, both significantly outperforming the 8085, which prompted Intel to take on the development of its first 16-bit microprocessor, the 8086. With the release of the 8156 and 8755 peripheral chips, however, the 8085 gained new prospects. The first chip contained a 256-byte static RAM (random-access memory), two 8-bit, bit-configurable I/O ports, and a programmable timer-counter. The second combined three multi-bit I/O ports with 2 Kbytes of UV-erasable read-only memory (ROM). By suitably connecting the pins of these three microcircuits, designers of electronic equipment obtained a functionally complete module, a microcontroller, which could be built into any device: a voltmeter, a frequency meter, various amplifiers or converters. Several firms released power-saving CMOS versions of this family, which made it possible to create microprocessor devices with self-contained battery power. Finally, in the late 70s Intel combined these three microcircuits into one chip and created the 8048 single-chip microcomputer (microcontroller), which included RAM and ROM, an arithmetic logic unit, a built-in clock generator, a timer-counter, and I/O ports. The similar 8035 and 8748 microcontrollers followed. The instruction system of these single-chip microcontrollers was considerably weaker than the 8085's, and the amounts of RAM, ROM, and I/O ports were smaller than in the three-chip module described above, but everything fit on one chip, which greatly simplified the development and production of new devices. The idea of universal hardware configured by software for specific tasks, the idea that gave rise to microprocessors, found its fullest realization precisely in single-chip microcontrollers.

In the early 80s Intel released the more powerful 8051 microcontroller, soon followed by its modifications, the 8031 and 8751. The microcomputer core of this series became a classic for microcontrollers. Technologically, the 8051 was a very complex device for its time. The MCS-51 family is the undisputed leader in the number of variants and of companies producing them: today there are more than 200 modifications of MCS-51 microcontrollers, produced by nearly 20 leading manufacturers of electronic components (Atmel, Infineon Technologies, Philips, Hyundai, Dallas Semiconductor, Temic, TDK, Oki, AMD, MHS, LG, Winbond, Silicon Labs, etc.). Original-architecture microcontrollers from Motorola, Zilog, Analog Devices, Microchip, Scenix, and Holtek have also found their niche.

Bob Noyce

Known for his pioneering views on the development of semiconductor technology. It was Robert Noyce who invented the integrated circuit in 1959. In the mid-1960s Noyce was a manager at the influential firm Fairchild Semiconductor; later, one of the founders of Intel.

Gordon Moore

A talented and hardworking engineer, held in great respect by his colleagues. One of the founders of Intel.
"We are real revolutionaries. After all, these latest advances in electronics are changing the world much faster than any political events."

Andy Grove

The energetic and enterprising Andrew Grove worked as a process engineer at Fairchild Semiconductor, which he joined after earning his Ph.D. in chemical engineering from the University of California, Berkeley. One of the founders of Intel.

Ted Hoff

Ted Hoff is one of the inventors of the microprocessor. It was he who proposed the concept of a universal microcircuit and developed the architecture of the first processor.
"What impresses me most personally is that, thanks to microprocessors, computers have become a mass-market product."

Fig. 3. Prominent scientist-inventors, revolutionaries in the field of microelectronics


The creation of the microprocessor is recognized as one of the outstanding achievements of the twentieth century. Hundreds of millions of microprocessors and billions of microcontrollers are sold worldwide every year. According to the magazine World of Computer Automation, the average American deals with microcontrollers about 300 times (!) a day; they are embedded literally everywhere, from washing machines, elevators, and telephones to traffic lights, automobiles, and industrial machines.

The Semiconductor Industry and Business Survey reckons that if the automotive and aviation industries had developed over 30 years at the same pace as semiconductor manufacturing, a Rolls-Royce would cost $2.75 and could travel almost one and a half thousand kilometers on a single liter of gasoline, while a Boeing 767 would cost $500 and could fly around the globe in 20 minutes on one can of kerosene. In 1996 the names of the creators of the microprocessor, Dr. Ted Hoff, Dr. Federico Faggin, and Stan Mazor, were inducted into the United States National Inventors Hall of Fame (Akron, Ohio) and ranked alongside the names of Thomas Edison, the Wright brothers, and Alexander Bell.

Another direction in the development of microprocessor systems originated in 1969, driven by the need to replace the complex, cumbersome, and unreliable relay-contactor automatic control circuits used at industrial enterprises. That year, General Motors prepared a tender request for the development of a universal microprocessor device for the needs of industrial production.

The tender was won by Bedford Associates of Massachusetts, headed at the time by Richard Morley. They developed a microprocessor device (controller) that made it possible to switch the signal wires connected to it in different combinations. These combinations were set by a control program that was written on a computer and then loaded into the controller's memory. Thus, with a single microprocessor device and a program loaded into it, one could implement a control system whose construction had previously required wiring together dozens or even hundreds of electromechanical components such as relays, timers, counters, and regulators. Moreover, the same controller could be used to control different machines and mechanisms simply by changing the program loaded into it. This is how the world's first programmable logic controller (PLC) appeared, which Bedford Associates dubbed "Project 084".

The company took up the production of industrial controllers and was later renamed Modicon (short for "Modular Digital Controller"). In 1977 the Modicon brand was sold to Gould Electronics and later bought by the well-known German company AEG. Eventually the Modicon brand passed into the ownership of the French company Schneider Electric, which owns it to this day. It should be noted that Schneider Electric is one of the world leaders in the development, production, and implementation of technical means for power supply, electric drives, and automation.

Another company that still ranks among the leading manufacturers of automation components also took part in the General Motors tender: Allen-Bradley. Although it lost the tender, work in this direction continued. Allen-Bradley acquired a majority stake in Information Instruments and Bunker-Ramo Corporation, which had already developed the PDQ II controller (short for Program Data Quantizer). That model proved too cumbersome and difficult to program. Allen-Bradley persisted, however, and in 1970 developed the PMC (Programmable Matrix Controller) based on the PDQ II. This model, too, failed to meet customers' requirements for controlling technological units. After further revision, a model called the PLC 1 ("Programmable Logic Controller") was born. It is this name and the abbreviation PLC that took root in the field of automation and are used by specialists to refer to this class of devices.


In the mid-70s of the last century, the market for programmable logic controllers began to grow rapidly, and Modicon and Allen-Bradley gained a number of competitors, notably General Electric, Siemens, Square D, and Industrial Solid State Controls.

A significant step toward simplifying the use of programmable logic controllers was the introduction of the international standard IEC 61131-3, which defines the programming languages for PLCs. Thanks to it, an engineer of any profile (technologist, electrician, chemist, etc.) can easily create programs for controlling technological installations without knowing the intricacies of programming. The designated languages are also universal across PLCs from different manufacturers.

Intel 4004 microprocessor

History

Why 4004?
The point is that each product category was assigned its own number. The first Intel products were memory chips (PMOS), numbered in the 1xxx series. NMOS chips were developed in the 2xxx series, and bipolar microcircuits were assigned to the 3xxx series. 4-bit microprocessors were designated 4xxx. CMOS microcircuits received the designation 5xxx, magnetic-domain memory 7xxx, and 8-bit (and wider) microprocessors and microcontrollers belonged to the 8xxx series. The 6xxx and 9xxx series were not used.

The second digit denoted the type of product: 0 - processors, 1 - RAM chips, 2 - controllers, 3 - ROM chips, 4 - shift registers, 5 - EPLD chips, 6 - PROM chips, 7 - EPROM chips, 8 - supervisory and clock-generator chips, 9 - telecommunication chips.

The third and fourth digits corresponded to the serial number of the product, and since the first processor required three more specialized microcircuits (a ROM, a RAM, and an I/O expander) that were released before it, the microprocessor received the name 4004.

The 4004 microprocessor was housed in a 16-pin DIP package with a die area of less than 1 sq. cm. The processor could execute 60,000 instructions per second. (For comparison, one of the first fully electronic computers, the American ENIAC, executed only 5,000 instructions per second while occupying an area of 278.7 sq. m and weighing 30 tons.) Intel foresaw the crucial importance of microprocessors for the miniaturization of computers, and therefore bought the rights to the 4004 microprocessor and its improved versions from Busicom for $60 thousand.

However, in 1971 the processor did not become a sales hit. Intel's strategy was to market the 4004 in order to expand the market for its much more popular 1101/1103 memory chips. It was the 8080, the electronic "great-grandson" of the 4004, that went on to enjoy well-deserved popularity.

Specialized microcircuits of the 4xxx series

The 4004 chip shipped with three specialized companion microcircuits: a ROM, a RAM, and an I/O expander. Although these microcircuits had their own designation series (1xxx, 2xxx, and 3xxx), they received second names in the 4xxx category, which came to be marked alongside their usual numbering.

  • Collecting

The Intel 4004 is naturally one of the most popular collectible chips. The most highly valued are the white-and-gold Intel 4004 chips with visible gray traces on the white part (the original package). Thus, in 2004, such a microcircuit, on

You are using a computer or mobile device to read this article right now, and it uses a microprocessor to do the work. The microprocessor is the heart of any such device, server, or laptop. There are many brands of microprocessors from a wide variety of manufacturers, but they all do much the same thing in much the same way.
A microprocessor, also known as a processor or central processing unit (CPU), is a computing engine built on a single chip. The first microprocessor was the Intel 4004, which appeared in 1971 and was not very powerful: all it could do was add and subtract, and only 4 bits at a time. The processor was amazing nonetheless because it was implemented on a single chip. Why does that matter? Because at the time engineers built processors either from multiple chips or from discrete components (transistors in individual packages).

If you have ever wondered what the microprocessor in a computer does, what it looks like, or how it differs from other types of microprocessors, read on: all the most interesting details are below.

Microprocessor Progress: Intel

The first microprocessor to become the heart of a simple home computer was the Intel 8080, a complete 8-bit computer on a single chip introduced in 1974. It caused a real surge in the market. Then, in 1979, a new model was released, the Intel 8088. If you are familiar with the PC market and its history, you know that it moved from the Intel 8088 to the 80286, from that to the 80386 and the 80486, and then on to the Pentium, Pentium II, Pentium III, and Pentium 4. All of these microprocessors were made by Intel, and all are improvements on the basic design of the 8088. The Pentium 4 can execute any code that ran on the original 8088, but it does so roughly 5,000 times faster.

In 2004 Intel introduced microprocessors with multiple cores and millions of transistors, yet even these microprocessors follow the same general rules as earlier chips. Additional information is given in the table columns below:

  • Date: the year the processor was first introduced. Many processors were re-released at higher clock speeds for many years after their original release date.
  • Transistors: the number of transistors on the chip. You can see that the number of transistors per die has risen steadily over the years.
  • Micron: the width, in microns, of the smallest wire on the chip. For comparison, a human hair is about 100 microns thick. As this dimension shrank, the number of transistors grew.
  • Clock frequency: the maximum speed at which the chip can be clocked. More on clock speed a little later.
  • Data (bus) width: the width of the ALU (arithmetic logic unit). An 8-bit ALU can add, subtract, multiply, and so on two 8-bit numbers. In many cases the data bus is the same width as the ALU, but not always: the Intel 8088 had a 16-bit ALU and an 8-bit bus, while modern Pentium models are 64-bit.
  • MIPS: millions of instructions per second, a rough unit of measure for microprocessor performance. Modern processors do so many different things that MIPS ratings lose much of their meaning, but the column gives a feel for the relative power of the microprocessors of those times.
From this table you can see that, in general, there is a relationship between clock speed and MIPS. The maximum clock speed is a function of the manufacturing process. There is also a relationship between the number of transistors and the number of operations per second. For example, the Intel 8088 clocked at 5 MHz (modern chips run at 2.5-3 GHz) executed only 0.33 MIPS (about one instruction every 15 clock cycles). Modern processors can often execute two instructions per clock cycle. That increase is directly related to the number of transistors on the chip, as discussed below.
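A quick worked check using the figures above: MIPS is roughly the clock frequency divided by the average number of clock cycles per instruction. For the 8088, 5,000,000 cycles per second / 15 cycles per instruction ≈ 330,000 instructions per second, i.e. about 0.33 MIPS; by the same crude measure, a 3 GHz core retiring two instructions per cycle reaches about 6,000 MIPS.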

What is a chip?


A chip is also called an integrated circuit. Usually it is a small, thin piece of silicon onto which the transistors making up the microprocessor have been etched. A chip can be as small as an inch on a side yet contain tens of millions of transistors. Simpler processors can consist of a few thousand transistors etched onto a chip just a few square millimeters in size.

How it works



Intel Pentium 4

To understand how a microprocessor works, it helps to look inside and learn about its internals. Along the way you can also learn about assembly language, the native language of the microprocessor, and about many of the things engineers do to increase processor speed.

The microprocessor executes a collection of machine instructions that tell the processor what to do. Based on the instructions, the microprocessor does three main things:

  • Using its ALU (arithmetic logic unit), a microprocessor can perform mathematical operations such as addition, subtraction, multiplication, and division; modern microprocessors are capable of extremely complex operations
  • A microprocessor can move data from one memory location to another
  • A microprocessor can make decisions and jump to a new set of instructions based on those decisions


Simply put, a microprocessor may do sophisticated things, but the three activities above are its core. The following diagram shows a very simple microprocessor capable of doing these three things. This simple microprocessor has:

  • An address bus (8, 16, or 32 bits wide) that sends an address to memory
  • A data bus (8, 16, or 32 bits wide) that sends data to memory or receives data from memory
  • RD (read) and WR (write) lines that tell the memory whether to set or retrieve the addressed location
  • A clock line that feeds the clock pulse sequence to the processor
  • A reset line that resets the program counter to zero and restarts execution

Microprocessor memory

Earlier we talked about the address and data buses, as well as the RD and WR lines. These connect either to RAM (random-access memory) or to ROM (read-only memory), and usually to both. In our example microprocessor we have an 8-bit-wide address bus and an equally wide data bus, also 8 bits. This means the microprocessor can address 2^8 = 256 bytes of memory and can read or write 8 bits of memory at a time. Let's assume this simple microprocessor has 128 bytes of ROM starting at address 0 and 128 bytes of RAM starting at address 128.
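As a minimal sketch in C (my illustration, not from the original article) of that 256-byte address space, the low half of the address selects ROM and the high half RAM:

#include <stdint.h>
#include <stdio.h>

static const uint8_t rom[128] = {0};  /* preset program bytes would go here */
static uint8_t ram[128];

/* Addresses 0..127 hit ROM; 128..255 hit RAM. */
static uint8_t mem_read(uint8_t addr) {
    return (addr < 128) ? rom[addr] : ram[addr - 128];
}

/* Only the RAM half responds to writes; writes aimed at ROM are ignored. */
static void mem_write(uint8_t addr, uint8_t value) {
    if (addr >= 128)
        ram[addr - 128] = value;
}

int main(void) {
    mem_write(128, 42);                             /* store into RAM         */
    mem_write(0, 99);                               /* silently ignored (ROM) */
    printf("%d %d\n", mem_read(128), mem_read(0));  /* prints "42 0"          */
    return 0;
}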

ROM stands for read-only memory. A ROM chip is programmed with a permanent, preset collection of bytes. The address bus tells the ROM chip which byte to fetch and place on the data bus; when the RD line changes state, the ROM chip presents the selected byte onto the data bus.

RAM stands for random-access memory. RAM contains bytes of information that the microprocessor can read or write, depending on whether the read or the write line is signaled. One problem with today's RAM chips is that they forget everything as soon as the power goes off; that is why a computer also needs ROM.



RAM chip or read-only memory (ROM) chip

By the way, almost all computers contain some amount of ROM. On a personal computer the ROM is called the BIOS (Basic Input/Output System). At startup the microprocessor begins executing the instructions it finds in the BIOS. The BIOS instructions do their part: they test the hardware, and then the BIOS fetches the boot sector from the hard disk. The boot sector is one small program, which the BIOS stores in RAM after reading it from the disk. The microprocessor then begins executing the boot sector's instructions from RAM. The boot sector program tells the microprocessor what else to load from the hard disk into RAM, executes it, and so on. This is how the microprocessor loads and runs the entire operating system.

Microprocessor instructions

Even the incredibly simple microprocessor just described has a fairly large set of instructions it can execute. The set of instructions is implemented as bit patterns, each of which has a different meaning when loaded into the instruction register. People are not particularly good at remembering bit patterns, so a set of short words is used to represent them. This set of short words is called the processor's assembly language. An assembler can translate the words into bit patterns very easily, and the assembler's output is placed into memory for the microprocessor to execute.

Here is a set of assembly language instructions:

  • LOADA mem - load register A from memory address mem
  • LOADB mem - load register B from memory address mem
  • CONB con - load the constant value con into register B
  • SAVEB mem - save register B to memory address mem
  • SAVEC mem - save register C to memory address mem
  • ADD - add A and B and store the result in C
  • SUB - subtract B from A and store the result in C
  • MUL - multiply A and B and store the result in C
  • DIV - divide A by B and store the result in C
  • COM - compare A and B and store the result in the test register
  • JUMP addr - jump to an address
  • JEQ addr - jump, if equal, to the address
  • JNEQ addr - jump, if not equal, to the address
  • JG addr - jump, if greater, to the address
  • JGE addr - jump, if greater or equal, to the address
  • JL addr - jump, if less, to the address
  • JLE addr - jump, if less or equal, to the address
  • STOP - stop execution
Assembly language
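The C fragment being compiled is not reproduced in the text; judging from the assembly listing below, it is a factorial-of-five loop along these lines (the variable names a and f are assumptions taken from the listing's comments, with declarations omitted as in the seven-line count mentioned later):

a = 1;
f = 1;
while (a <= 5)
{
    f = f * a;
    a = a + 1;
}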
The C compiler translates this C code into assembly language. Assuming that RAM starts at address 128 in this processor and that the read-only memory (which contains the assembly-language program) starts at address 0, the assembly for our simple microprocessor might look like this:

// Assume a is at address 128
// Assume F is at address 129
0   CONB 1      // a = 1;
1   SAVEB 128
2   CONB 1      // f = 1;
3   SAVEB 129
4   LOADA 128   // if a > 5 then jump to 17
5   CONB 5
6   COM
7   JG 17
8   LOADA 129   // f = f * a;
9   LOADB 128
10  MUL
11  SAVEC 129
12  LOADA 128   // a = a + 1;
13  CONB 1
14  ADD
15  SAVEC 128
16  JUMP 4      // loop back to if
17  STOP

Read-only memory (ROM)
So the question now is, "How do all of these instructions get into read-only memory?" Each of these assembly-language instructions must be represented as a binary number. For simplicity, let's assume each assembly-language instruction is assigned a unique number, like this:

  • LOADA - 1
  • LOADB - 2
  • CONB - 3
  • SAVEB - 4
  • SAVEC - 5
  • ADD - 6
  • SUB - 7
  • MUL - 8
  • DIV - 9
  • COM - 10
  • JUMP addr - 11
  • JEQ addr - 12
  • JNEQ addr - 13
  • JG addr - 14
  • JGE addr - 15
  • JL addr - 16
  • JLE addr - 17
  • STOP - 18
These numbers are known as opcodes. In read-only memory, our little program looks like this:

// Assume a is at address 128
// Assume F is at address 129
Addr  opcode/value
0     3     // CONB 1
1     1
2     4     // SAVEB 128
3     128
4     3     // CONB 1
5     1
6     4     // SAVEB 129
7     129
8     1     // LOADA 128
9     128
10    3     // CONB 5
11    5
12    10    // COM
13    14    // JG 17
14    31
15    1     // LOADA 129
16    129
17    2     // LOADB 128
18    128
19    8     // MUL
20    5     // SAVEC 129
21    129
22    1     // LOADA 128
23    128
24    3     // CONB 1
25    1
26    6     // ADD
27    5     // SAVEC 128
28    128
29    11    // JUMP 4
30    8
31    18    // STOP

You can see that 7 lines of C code became 18 lines of assembly, which in turn became 32 bytes in read-only memory.

Decoding
The instruction decoder must turn each opcode into a set of signals that drive the various components inside the microprocessor. Let's take the ADD instruction as an example and see what it has to do:

  • 1. In the first clock cycle the instruction itself must be loaded, so the decoder needs to: activate the tri-state buffer for the program counter, activate the RD line, and activate the data-in tri-state buffer so the instruction is latched into the instruction register
  • 2. In the second clock cycle the ADD instruction is decoded. There is very little to do here: set the operation of the arithmetic logic unit (ALU) to addition and latch the result into register C
  • 3. During the third clock cycle the program counter is incremented (in theory this can overlap with the second cycle)
Each instruction can be broken down into a set of sequenced operations like these, which manipulate the components of the microprocessor in the correct order. Some instructions, such as ADD, may take two or three clock cycles. Others may take five or six.
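To make the fetch-decode-execute cycle concrete, here is a minimal C sketch (my illustration, not part of the original article) that interprets the 32-byte ROM image above, using the opcode numbers from the table; each pass through the loop fetches the opcode at the program counter, decodes it in the switch, and executes it. Running it leaves f = 120, that is 5!, at address 129:

#include <stdint.h>
#include <stdio.h>

/* Opcode numbers exactly as in the table above (LOADA = 1 ... STOP = 18). */
enum { LOADA = 1, LOADB, CONB, SAVEB, SAVEC, ADD, SUB, MUL, DIV,
       COM, JUMP, JEQ, JNEQ, JG, JGE, JL, JLE, STOP };

int main(void) {
    /* One flat 256-byte memory: the 32-byte ROM image at address 0,
       variables a and f at addresses 128 and 129. */
    uint8_t mem[256] = {
        3,1, 4,128, 3,1, 4,129, 1,128, 3,5, 10, 14,31,
        1,129, 2,128, 8, 5,129, 1,128, 3,1, 6, 5,128, 11,8, 18
    };
    uint8_t a = 0, b = 0, c = 0, pc = 0;
    int test = 0;                               /* result of COM: A minus B */

    for (;;) {
        uint8_t op = mem[pc++];                 /* fetch */
        switch (op) {                           /* decode and execute */
        case LOADA: a = mem[mem[pc++]]; break;
        case LOADB: b = mem[mem[pc++]]; break;
        case CONB:  b = mem[pc++];      break;
        case SAVEB: mem[mem[pc++]] = b; break;
        case SAVEC: mem[mem[pc++]] = c; break;
        case ADD:   c = a + b;          break;
        case SUB:   c = a - b;          break;
        case MUL:   c = a * b;          break;
        case DIV:   c = a / b;          break;
        case COM:   test = (int)a - (int)b;              break;
        case JUMP:  pc = mem[pc];                        break;
        case JEQ:   pc = (test == 0) ? mem[pc] : pc + 1; break;
        case JNEQ:  pc = (test != 0) ? mem[pc] : pc + 1; break;
        case JG:    pc = (test >  0) ? mem[pc] : pc + 1; break;
        case JGE:   pc = (test >= 0) ? mem[pc] : pc + 1; break;
        case JL:    pc = (test <  0) ? mem[pc] : pc + 1; break;
        case JLE:   pc = (test <= 0) ? mem[pc] : pc + 1; break;
        case STOP:  printf("f = %d\n", mem[129]);   /* prints f = 120 */
                    return 0;
        default:    return 1;                   /* unknown opcode */
        }
    }
}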

Wrapping up


The number of transistors has a huge impact on processor performance. As noted above, a typical instruction on the 8088 took 15 clock cycles to complete. The more transistors, the higher the performance: it is that simple. A large number of transistors also makes possible a technique called pipelining.

In a pipelined architecture, instruction execution overlaps. It may take five cycles to execute one instruction, but five instructions can be at different stages of execution at the same time, so it looks as if one instruction completes every clock cycle.
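A quick worked illustration with idealized numbers (my example, not the article's): without a pipeline, 100 instructions at 5 cycles each take 500 cycles; with a 5-stage pipeline, the first instruction still takes 5 cycles, but one instruction completes every cycle thereafter, so 100 instructions take about 5 + 99 = 104 cycles, nearly a fivefold gain in throughput.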

All of these trends have allowed the number of transistors to grow, resulting in the multi-million-transistor heavyweights available today. Such processors can perform about a billion operations per second: just imagine. By the way, many manufacturers have now turned to 64-bit mobile processors, and obviously another wave is coming, this time with 64-bit architecture as the king of fashion. Maybe I will get to that topic in the near future and tell you how it actually works. That is all for today. I hope you found it interesting and learned a lot.

It is hard to imagine human life without modern electronics. Of course, there are still many places in the world where people have not yet even heard of modern technologies, let alone used them. But the overwhelming majority of the world's population is in one way or another connected with electronics, which has become an integral part of our life and work.

Since ancient times, people have used various devices to make production processes more efficient or their own existence more comfortable. The real breakthrough came in the late 1940s, when the transistor was invented. The first were bipolar transistors, which are still used today. They were followed by MOSFETs (metal-oxide-semiconductor field-effect transistors).

The first transistors of this type were more expensive and less reliable than their bipolar cousins. But starting in 1964, electronics began to use integrated circuits based on MOS transistors. This subsequently made it possible to reduce the production cost of electronic devices and to significantly shrink gadgets and systems while cutting their power consumption. Over time, microcircuits became more complex and sophisticated, replacing large blocks of transistors and opening the way to ever smaller electronic devices.

By the end of the 60s, microcircuits with a fairly large (for that time) number of logic gates, 100 and more, began to spread. This made it possible to use the new elements to build computers. Computer developers quickly recognized that increasing the density of transistors in a microcircuit would ultimately allow a computer processor to be created in the form of a single chip. Initially, integrated circuits with MOS transistors were used to build terminals and calculators; developers then began to use them in the onboard systems of civilian and military vehicles.

Key moment

Today most electronics specialists agree that a qualitatively new stage in the development of electronics began in 1971, when the 4-bit 4004 processor from Intel appeared, later followed by the 8-bit 8008 chip. A small Japanese company named Nippon Calculating Machine, Ltd. (later Busicom Corp.) ordered just 12 chips from Intel. The company needed these microcircuits for its calculators, and the logical design of the chips was developed by an employee of the customer company. At that time, a new set of highly specialized microcircuits had to be developed for every device.

While working on the order, Marcian Edward Hoff proposed reducing the number of microcircuits in the Japanese company's new device by introducing a central processor. According to the engineer's idea, it would become the data-processing center and perform the arithmetic and logical functions, replacing several microcircuits at once. The management of both companies approved the idea. In the fall of 1969, Hoff, with the help of Stanley Mazor, proposed a new architecture in which the number of microcircuits was reduced to just four. Among the proposed elements were a 4-bit central processor, ROM, and RAM.

The processor itself was developed by Federico Faggin, an Italian physicist who became chief designer of the MCS-4 family at Intel. It was he who, thanks to his knowledge of MOS technology, was able to create the processor and implement Hoff's idea. Incidentally, the world's first commercial microcircuit using silicon-gate technology was also his work: it was called the Fairchild 3708.

As an Intel employee, Faggin created a new method for designing chips of arbitrary logic. He was assisted in this work by Masatoshi Shima, then an engineer at Busicom. Faggin and Shima subsequently developed the Zilog Z80 microprocessor, which, incidentally, is still being produced.


Intel 4004 processor architecture

But the main event took place on November 15, 1971: the first Intel microprocessor, the 4004 chip, appeared. Its price at the time was $200. Almost all the functions of a mainframe processor were implemented in this single chip. It was announced in November 1971 in Electronic News magazine.

Processor specifications:


  • Date of appearance: November 15, 1971
  • Number of transistors: 2300
  • Die area: 12 mm²
  • Process technology: 10 μm (P-channel silicon-gate MOS technology)
  • Clock frequency: 740 kHz maximum (from 500 to 740.74 kHz, corresponding to a clock period of 2 to 1.35 μs; the resulting instruction rate is about 92.6 kHz)
  • Width of registers: 4 bits
  • Number of registers: 16 four-bit (usable in pairs as 8 eight-bit)
  • Number of ports: 16 four-bit input and 16 four-bit output
  • Data bus width: 4 bits
  • Width of the address bus: 12 bits
  • Harvard architecture
  • Stack: internal 3-tier
  • Program memory (ROM): 4 kilobytes (32,768 bits)
  • Addressable data memory (RAM): 640 bytes (5,120 bits)
  • Number of instructions: 46 (of which 41 are 8-bit and 5 are 16-bit)
  • Instruction cycle: 10.8 microseconds
  • Supply voltage: −15 V (pMOS)
  • Operating temperature: 0 to +70 °C
  • Storage temperature: -40 to +85 °C
  • Mounting: DIP16 (the microcircuit was soldered directly to the printed circuit board or installed in a socket)
  • Package: 16-pin DIP, one plastic and three ceramic variants, e.g. C4004 (white ceramic with gray traces), C4004 (white ceramic), D4004 (black-and-gray ceramic), P4004 (black plastic)
  • Delivery type: separately and in MCS-4 sets (ROM, RAM, I / O, CPU)
This processor executed 60,000 to 93,000 instructions per second. For comparison, ENIAC, one of the first electronic computers, could execute only 5,000 instructions per second while occupying 280 square meters, weighing 27 tons, and consuming 174 kW of power.
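These figures follow from the specifications above: an instruction cycle of 10.8 μs (8 clock periods of 1.35 μs at the maximum 740.74 kHz clock) gives 1 / 10.8 μs ≈ 92,600 single-cycle instructions per second, the top of the quoted range; slower clocking and multi-cycle instructions account for the lower 60,000 figure.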

The 4004 processor itself never became hugely popular. It was the 8080 chip, which can be called the "great-grandson" of the 4004, that came into use everywhere.

Calculators and computers

In 1971 Intel had competitors. For example, Mostek, a company that developed semiconductor devices and products based on them, created the world's first "calculator on a chip," the MK6010.

In June 1971 Texas Instruments launched a media campaign highlighting the benefits of its processor. At the time, the TMX 1795-based Datapoint 2200 was described as "a powerful computer superior to the original," meaning that its capabilities significantly exceeded those of the original Datapoint 2200 built on bipolar chips. But CTC, after testing the new chip, rejected it and continued to use bipolar chips. Intel, meanwhile, was still working on its own processor.

After some time TI, convinced that there was no demand for the TMX 1795 (later the TMC 1795), ended the media campaign and stopped production of the system. Yet this particular chip went down in history as the first 8-bit processor.

In 1971 CTC lost interest in a single processor for its systems, leaving all rights to the new chip with Intel. Intel did not waste the opportunity and continued to develop the 8008, successfully offering it to a number of other companies; from April 1972 it went on to ship hundreds of thousands of these processors. Two years later the 8008 was replaced by the new 8080, after which came the 8086 and the era of systems on the x86 architecture began. Now, when working on a powerful PC or laptop, it is worth remembering that the architecture of such systems was originally developed many years ago for the Datapoint 2200 terminal.

Intel was using more advanced technology at the time, which gave its processors an advantage: they were fast and relatively energy efficient. Intel's chips also had a higher transistor density than the TI chip, which made the processors smaller. Marketing played an important role as well; here too Intel made a number of successful moves that ensured the popularity of its designs.

Be that as it may, the question of who led the development of the first processors is far less clear-cut than is commonly believed. There were several pioneers at once, but only one of their designs later became popular, and it is with the modernized "descendants" of that technology that we all deal today, in the 21st century.
