THE IMPACT OF ICs ON COMPUTER TECHNOLOGY
Integrated circuit technology and computers have become increasingly interdependent despite their relatively independent origins. Progress in one discipline today often precipitates advances in the other. For example, the computer, as a major component of IC design systems that aid and even automate portions of the VLSI design process, has permitted major advances in IC complexity. On the other hand, advances in IC technology have made high-capacity semiconductor memories readily available and have enabled us to design new classes of computing engines, such as systolic arrays, that, via massive parallelism and local communication, offer substantial performance gains.
Improvements in IC processing due to enhanced capability of process equipment and an improved manufacturing environment have made possible a steady reduction in device feature sizes. Once again, the interaction between the two disciplines is evident, primarily in the widespread use of computers for process modeling, process control, scheduling, and inventory control.
The five generations of ICs are not congruent with the five generations of computers but rather have been derived from the progress of device technology over the last several decades. Small-, medium-, large-, very-large-, and ultra-large-scale integration, or SSI, MSI, LSI, VLSI, and ULSI, are conveniently represented by ranges of chip complexity such that the upper limit on each range is 32 times the lower limit:
SSI ......... 2-64
MSI ......... 64-2,000
LSI ......... 2,000-64,000
VLSI ........ 64,000-2,000,000
ULSI ........ 2,000,000-64,000,000
Today's technology is in the VLSI range, although devices of lesser complexities, including discrete, small-signal transistors, are still manufactured and used.
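These complexity bands can be put to work in a few lines of code. The sketch below is a minimal Python illustration written for this text, not taken from it; it classifies a chip by component count using the ranges listed above, and the half-open treatment of the boundaries is an assumption made only for the example.

```python
# Classify an IC by component count using the ranges listed above.
# Treating each range as a half-open interval is an assumption for illustration;
# the text gives only the nominal (rounded) figures, each band spanning roughly
# a factor of 32.

RANGES = [
    ("SSI", 2, 64),
    ("MSI", 64, 2_000),
    ("LSI", 2_000, 64_000),
    ("VLSI", 64_000, 2_000_000),
    ("ULSI", 2_000_000, 64_000_000),
]

def classify(components: int) -> str:
    for name, low, high in RANGES:
        if low <= components < high:
            return name
    return "outside the defined ranges"

if __name__ == "__main__":
    print(classify(50))        # -> SSI
    print(classify(8_000))     # -> LSI
    print(classify(500_000))   # -> VLSI
```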
Shortly after the germanium transistor was demonstrated in the late 1940's, efforts were initiated to make a transistor in silicon. Of the two, the metallurgy of germanium caused it to be available first, but the wider energy band gap and lower heat dissipation of silicon made it more desirable for electronic devices. In any case, by the mid-1950's silicon transistors had become available, and the technology of this material was rapidly advancing. By 1960, photolithography, oxide masking, and impurity diffusion techniques were being applied to produce high-performance transistors. Practical IC application followed quickly, with different organizations developing the required technical elements.
Since its invention in 1959, the IC has undergone rapid growth in which chip complexity figured prominently. For example, after 1959, IC chip complexity doubled every year. In 1973, complexity had reached nearly 8000 components per chip. Complexity has since doubled every 1.5 to 2 years. This progression is known as Moore's Law.
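These growth figures can be turned into a rough projection. The sketch below is illustrative only: the starting point of one component per chip in 1960 and the flat two-year doubling period after 1973 are assumptions chosen here so that yearly doubling lands near the "nearly 8000 components" cited for 1973; they are not data from the text.

```python
# A rough numerical restatement of the growth described above: complexity
# doubles every year from an assumed single component in 1960 until 1973,
# and every two years (the slower end of the quoted range) thereafter.

def projected_complexity(year: int,
                         base_year: int = 1960,        # assumed starting point
                         knee_year: int = 1973,
                         late_doubling_years: float = 2.0) -> float:
    """Components per chip under the doubling schedule described in the text."""
    if year <= knee_year:
        return 2.0 ** (year - base_year)
    at_knee = 2.0 ** (knee_year - base_year)
    return at_knee * 2.0 ** ((year - knee_year) / late_doubling_years)

if __name__ == "__main__":
    print(round(projected_complexity(1973)))   # 8192, close to the "nearly 8000" cited
    print(round(projected_complexity(1985)))   # about 524,000, i.e. well into the VLSI range
```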
Complexity growth is not due completely to feature size reduction but depends also on increases in chip size. In fact, chip area has increased by an order of magnitude in the last two decades as silicon substrate quality and process controls have advanced to permit economically acceptable yields at these chip sizes.
Many other parameters, such as wafer size, number of chips per wafer, chip costs, gate density, and production levels, may be used to trace the rapid progression of IC technology over the last 25 years. Of these, the progression is especially impressive when viewed in terms of cost decreases.
As integration levels have increased, so too have the performance capabilities of the devices on the chip; switching energies were in the vicinity of one picojoule.
FIVE GENERATIONS OF COMPUTER SYSTEMS
Each generation of computers has resulted from an advance in component technology, beginning with the replacement of the vacuum tube by the discrete transistor. Shortly after the commercialization of the transistor, the ENIAC and Whirlwind main frame computers of the early 1950's were replaced by the IBM 709 in the early 1960's. The PDP-11, a third-generation main frame computer, was a result of the introduction of the IC in the late 1960's. The IBM 370 and Amdahl 470 architectures are the result of large-scale integration, and the most recent main frame computers based on very large scale integration became available in the mid-1980's. Software methodologies have paralleled the advances in hardware, beginning with the generation of binary code, extending to the use of assembly language instructions, and culminating in the use of higher level languages coupled with operating systems. Furthermore, natural-language interface software will accompany the introduction of fifth-generation computer hardware to culminate in the realization of "artificial intelligence" systems.
The advancement in central main frame computers has been paralleled by the introduction of minicomputers, microcomputers, and controllers. When IC technology achieved a performance/cost ratio that permitted distributed hardware computer systems to be manufactured cost-effectively with improved performance, the benefits of merely expanding time-sharing central-computer resources diminished. As a result, complete computer systems manufacturers could expand their product portfolios with a compatible set of products to serve a broader market at minimal increase in cost to the market.
For the high-performance main frame computer segment of the market, the thrust has been to increase performance while maintaining price, while for the low-performance, low-price end of the product spectrum, the thrust has been to maintain performance while reducing cost. In this manner, as higher levels of integration are employed in the circuit fabrication, the increased performance concepts first introduced for high-performance main frame use are incorporated in minicomputer and then microcomputer and controller systems.
WHAT IS COMING WITH THE FIFTH GENERATION
Fifth generation computer systems will be required to have an extremely wide variety of sophisticated functions to solve the numerous problems of today's computers and to meet the social needs of the 1990's, a decade during which computerization is expected to find many more applications than it does today.
As a whole, functions required of fifth generation computer systems will be as follows:
1. Increased intelligence and ease of use so that they will be better able to assist man.
2. Functions which enable inputting and outputting of information via speech or voice, graphics, images and documents.
Enhancement of input/output functions, which serve as the interface between man and computer, is of prime importance in making computers easier to use.
3. The ability to process information conversationally using everyday language.
As computers penetrate further and further into every field of our society, there will be more opportunities for laymen to operate them and thus gain direct access to needed information.
4. The ability to put stored knowledge to practical use. In order to utilize computers more effectively for solving problems, they will have to be equipped with specialized knowledge, i.e., knowledge bases, related to the fields in which they are employed. Then, by putting these knowledge bases to practical use, computers will be better able to lessen the burden on their human operators as well as serve as consultant systems for all mankind.
5. The functions of learning, associating and inferring.
So that computers can hold knowledge and use it effectively for a desired purpose, they should be given, in one form or another, abilities of learning, associating and inferring like our own. With such abilities, computers would be able to clarify even vague requests given by man and, using their vast capacity to store information, achieve new judgement facilities of their own which will help expand the capabilities of us humans as well.
6. Increased diversification and adaptability. Up to now general-purpose computers with fixed hardware have been in the mainstream, but computer systems in the 1990's will be required to have much wider diversification and purpose-oriented adaptability and flexibility.  Hardware and software both should have their basic components modularized for free system adaptability and re-arrangeability to suit various purposes.
7. The fifth generation computer systems will be knowledge information processing systems having problem-solving functions of a very high level.
As advances in ICs are exploited, more powerful special-purpose computer hardware circuits become realizable, thus permitting more flexibility in IC circuit design. Therefore, as in the past, advances in circuit technology have prompted the design of computers and computer systems (including communications); indeed, the future advances in circuit and device technology, which usefully employ new device phenomena, will determine the architecture of future computers, computer networks, and communication systems. Software capabilities will evolve to provide intelligent systems as hardware performance continues to improve. Costs will decline to the point that they are not a significant factor in determining the number of machine cycles or parallel implementations per program statement and memory capacity required to implement a given function. Additionally, recent advances in the use of heterogeneous materials and associated processing techniques in the fabrication process are adding capabilities like integrated optical communications and devices based on quantum domain principles to further increase the performance/cost ratio of computer hardware equipment.
Research efforts are being planned to explore "beyond silicon shrink" directions. These efforts will be based primarily on quantum effect principles and, if successfully brought to fruition, could provide yet another order-of-magnitude increase in performance. These advances would then provide the next generation of computers.
In today's IC technology, the feature size is about two micrometers. Before fundamental thermodynamic limits are reached by shrinking, it is entirely reasonable to project that 16 M-bit dynamic memories and a one-femtojoule logic switch can be fabricated with a 10X reduction in today's minimum feature size.
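A back-of-the-envelope calculation shows why a 10X shrink corresponds to these projections. The sketch below assumes ideal constant-field (Dennard) scaling, under which packing density grows as the square of the shrink factor and switching energy falls as its cube; both the scaling rules and the present-day starting values (a 256 K-bit memory chip and a one-picojoule switch) are assumptions introduced here for illustration, not figures stated in the text.

```python
# Consequences of shrinking the minimum feature size by a factor k, assuming
# ideal constant-field (Dennard) scaling: density ~ k**2, switching energy ~ 1/k**3.
# The starting values below are assumed, era-typical figures.

def scale(k: float, density_bits: float, switching_energy_j: float):
    """Return (new_density, new_energy) after a k-fold linear shrink."""
    return density_bits * k ** 2, switching_energy_j / k ** 3

if __name__ == "__main__":
    k = 10.0
    density = 256_000      # bits per memory chip today (assumed)
    energy = 1e-12         # one picojoule per logic switching event (assumed)
    new_density, new_energy = scale(k, density, energy)
    print(f"{new_density / 1e6:.1f} Mbit per chip")   # ~25.6 Mbit, the 16 M-bit class projected above
    print(f"{new_energy:.0e} J per switch")           # 1e-15 J, the one-femtojoule switch projected above
```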
However, switching speeds of active elements are already approaching the propagation delay between elements, resulting in timing skew. This is a problem common to most present-day computer architectures, which require synchronization to ensure the simultaneous occurrence of timed events. Consequently, the distribution of clocking signals requires careful design. The magnitude of the capacitance used for charge storage in dynamic RAM, when impressed with the maximum voltage possible without incurring dielectric breakdown, results in a stored charge comparable with that produced by the passage of an alpha particle through the IC memory array, and thus the loss of memory through alpha-event "soft errors".
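The comparison between the stored charge and the charge released by an alpha particle can be checked with a rough calculation. In the sketch below the storage capacitance, cell voltage, and alpha-particle energy are assumed, era-typical values, and full charge collection is assumed; the 3.6 eV needed to create one electron-hole pair in silicon is a standard physical figure.

```python
# Rough check of the charge comparison made above. Capacitance, voltage, and
# alpha energy are assumed values; full charge collection is assumed, which
# overstates the charge actually collected in a real cell.

E_CHARGE = 1.602e-19       # coulombs per electron
EV_PER_PAIR = 3.6          # energy to create one electron-hole pair in silicon (eV)

storage_cap = 50e-15       # 50 fF DRAM storage capacitor (assumed)
cell_voltage = 5.0         # volts, near the dielectric-limited maximum (assumed)
alpha_energy_ev = 5e6      # a 5 MeV alpha particle (assumed)

stored_charge = storage_cap * cell_voltage                     # Q = C * V
alpha_charge = (alpha_energy_ev / EV_PER_PAIR) * E_CHARGE      # pairs generated * e

print(f"stored charge from the cell capacitor : {stored_charge * 1e15:.0f} fC")  # ~250 fC
print(f"charge liberated by the alpha particle: {alpha_charge * 1e15:.0f} fC")   # ~220 fC, comparable
```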
Current densities in interconnects can exceed the threshold for electromigration, producing components of limited endurance, as mentioned earlier. To achieve the advances forecast by a 10-fold direct scaling of the device technology to the next level of integration, the pragmatic barriers associated with device isolation, heat generation, interconnection, packaging, reliability, and wear-out must be overcome in the fabrication of the ICs.
CHOOSING THE RIGHT TECHNOLOGY
An electronic designer, faced with the problem of engineering a system today, is confronted with a wide choice of implementation methods. Even problems which are basically analogue in nature can now be solved effectively using digital or computer-based techniques.
Analogue design techniques are well established and understood and therefore need little comment here. Suffice it to say that modern analogue components are in general easy to use, and the designer can implement a system using highly stable operational amplifiers and other module-like integrated circuits such as multiplexers. The main problems, however, are still in the areas of noise, capacitive cross-coupling and calibration. These areas do not usually cause problems in digital implementations, though of course there are others. It is now becoming economical to process analogue signals digitally by digitizing the signal, carrying out the required function digitally and then converting back with a digital-to-analogue converter. Applications of this technique include digital speech transmission using pulse code modulation and digital processing of seismic waveforms.
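As an illustration of the digitize, process, convert-back approach just described, the sketch below samples a noisy waveform, smooths it with a simple moving-average digital filter, and rescales the result as a stand-in for the digital-to-analogue conversion step. The signal, sampling rate, quantization, and filter length are arbitrary values chosen for this example; real systems such as pulse-code-modulation telephony use carefully specified parameters.

```python
# Digitize an analogue signal, process it digitally, and "convert back".
# All numerical choices here (sample rate, tone frequency, noise level, 12-bit
# scaling, 8-tap filter) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 8_000                                      # sampling rate in Hz (assumed)
t = np.arange(0, 0.02, 1 / fs)                  # 20 ms of signal
analogue = np.sin(2 * np.pi * 300 * t)          # wanted 300 Hz component
noisy = analogue + 0.3 * rng.standard_normal(t.size)   # noise picked up before conversion

samples = np.round(noisy * 2047)                # scale to 12-bit converter steps (clipping ignored)

kernel = np.ones(8) / 8                         # 8-tap moving-average low-pass filter
filtered = np.convolve(samples, kernel, mode="same")

reconstructed = filtered / 2047.0               # rescale, standing in for the D-to-A converter
print(f"RMS error before filtering: {np.std(noisy - analogue):.3f}")
print(f"RMS error after filtering : {np.std(reconstructed - analogue):.3f}")   # noticeably smaller
```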
Digital techniques fall into two main groups, namely serial and parallel. Computer implementations are essentially serial in that they implement the various parts of the total process sequentially. The concept of a parallel processor refers to the fact that a processor deals with a group of bits, i.e., a word at once or in parallel.
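The serial/parallel distinction can be made concrete with a small example: the same 8-bit addition carried out bit by bit, as a bit-serial machine would, and in a single word-wide operation, as a parallel processor would. The word width and operand values below are arbitrary choices for illustration.

```python
# The same addition performed bit-serially (one bit position per step, with a
# carry passed between steps) and word-parallel (the whole word at once).
WIDTH = 8
MASK = (1 << WIDTH) - 1

def add_bit_serial(a: int, b: int) -> int:
    result, carry = 0, 0
    for i in range(WIDTH):                               # one bit position per "clock"
        abit, bbit = (a >> i) & 1, (b >> i) & 1
        result |= (abit ^ bbit ^ carry) << i             # sum bit for this position
        carry = (abit & bbit) | (carry & (abit ^ bbit))  # carry into the next position
    return result & MASK

def add_parallel(a: int, b: int) -> int:
    return (a + b) & MASK                                # whole word handled in one operation

if __name__ == "__main__":
    print(add_bit_serial(100, 57), add_parallel(100, 57))    # 157 157
    print(add_bit_serial(200, 100), add_parallel(200, 100))  # 44 44 (both wrap within 8 bits)
```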
The user of microprocessors needs at least a familiarity with the major characteristics of different semiconductor technologies if he is to be able to select suitable devices to meet his system requirements. He also needs to understand when to use the various implementation techniques, such as hard wired logic, custom LSI and minicomputers, etc., in preference to microprocessors.
Microprocessors do not of course always provide the best way to implement a particular system, and it is therefore worth considering other possibilities. Diagram 4 shows the basic family of techniques that can be used. As can be seen, the digital area with which we are concerned splits into two distinct sections: one covering techniques where the function is fixed by a program, e.g., microprocessors, the other covering techniques where the function is fixed by the physical layout or interconnection of devices. This second category of course covers custom LSI and conventional hard wired logic systems.
The design of a microprocessor-based system can be broken down into two parts, namely the hardware and the software. The design of hardware is becoming very straightforward, since modern microprocessors and their associated peripheral and memory components can be virtually plugged together without the need for detailed logic design, which was previously the norm. There is still a choice, however, to be made between hardware designed specifically for the particular application and a general-purpose microcomputer card which requires the minimum of special logic.
The term hard wired logic refers to the traditional method of implementing a digital function using "and" gates and flip-flops, etc. The most common form of this is standard TTL (transistor-transistor logic). This is now being superseded by low-power Schottky TTL. Earlier forms of logic such as RTL (resistor-transistor logic) and DTL (diode-transistor logic) are no longer used in new designs, except for discrete versions of DTL, which are compatible with TTL and are often used for interfacing to higher-power circuits such as lamp drivers.
When higher integration densities or lower power consumption is required, one of the many forms of MOS logic (metal oxide semiconductor) is usually used. Most microprocessors and many of their peripherals are implemented in this technology.
Since the various parts of the total function can be implemented as separate sub-units which can all work simultaneously, i.e., a parallel realization, hard wired logic is potentially very fast. It is anticipated that hard wired logic will to a large extent be superseded by microprocessor and custom LSI implementations, which use less power and occupy less space. Hard wired logic, however, cannot be beaten when high speed is required. This is unlikely to change in the near future unless great improvements are made in the custom LSI area.
Custom LSI is similar to hard wired logic with the essential difference that the interconnections between the various logic elements are made by the semiconductor manufacturer at chip level. In other words, the design is performed in much the same way as conventional hard wired logic, and then this design is implemented on a single integrated circuit. Obviously the logic elements are different and the design constraints are very different. The design process requires a much more basic knowledge of circuit design and of course a deep understanding of semiconductor technology. Unlike hard wired logic, it is not possible to make minor alterations to the circuit at breadboard level; it either works first time or it does not. The design of custom LSI must therefore be done very carefully using highly skilled engineers and is therefore very costly, but once done, it leads to very low unit costs. It is therefore ideal for very high volume applications. The amount of logic that can be implemented on a single chip is of course limited, and so custom LSI is not suitable for very complex functions.
The idea behind an uncommitted logic array (ULA) is similar to that of custom LSI, in that both are tailor-made integrated circuits. In custom LSI, the circuit is designed from scratch, possibly using modules such as flip-flops, etc., whereas in a ULA design, as much of the general circuit design as possible is carried out in a general fashion, the tailoring being left to the last minute. A ULA consists of a matrix of components on a single chip, but with no interconnections. To implement a specific function, aluminium interconnections are put onto the ULA in a single masking operation. This means that the design is relatively simple compared with custom LSI, although it is not so flexible. A typical ULA consists of a couple of thousand components arranged in a matrix of about two hundred cells.
AUTOMATIC CONTROL
Automatic control is the process of maintaining a satisfactory relationship between the input and the output of a system without human intervention. Automatic control is used in preference to simple manual control to relieve man of tedious or difficult tasks, to permit control in an environment hostile to man, to obtain amplification of signals, to produce a desired effect at a remote location, and for many other reasons. Thus, automatic control systems are now widely used in the home (temperature control), in industry (automatic machine tools; petroleum refineries), in space vehicles (automatic control of attitude and trajectory), and in military applications (aiming and firing guns).
Development. One of the first applications of automatic control was James Watt's use of the fly ball governor in 1787 to control the speed of a steam engine. An even earlier example was the use of a small pilot windmill (fantail) to keep a large windmill faced into the wind so that it would provide maximum power. Whenever the wind direction changed, the fantail would turn the large windmill back into the wind. This type of windmill was invented by Edmund Lee in England in 1745.
Following the invention of the vacuum tube early in the 20th century, the development of electronics laid the technological foundations for great advances in the design of automatic control systems. About the beginning of World War II, interest in such systems increased greatly because of the need to improve the speed and accuracy with which searchlights, guns, and radar antennas could be aimed at moving targets.
Operation. A simple form of automatic control system is represented in the illustration, which shows the use of feedback as a means of automatic control. The system output, the variable being controlled, is subject to undesirable variations because of disturbing influences. The system input, in this case the desired value of the variable being controlled, is the reference quantity. The system output is fed back and compared with the input in a comparator, or error detector. In this device, the controlled variable, c, is automatically subtracted from the reference input, r. The difference thus obtained is the error, e, in the variable being controlled (e = r - c). If the error is positive (r > c), the system output is too small, and the controller acts on the controlled process so that the value of the controlled variable is increased. If the error is negative (c > r), the system output is too large, and the controller acts on the controlled process so that the value of the controlled variable is decreased.
In many cases the characteristics of the particular process being controlled cause such a simple control system to exhibit a "hunting" behavior, which is very undesirable. When a control system persistently overcorrects for error, the error alternately becomes positive and negative. The system then appears to be "hunting" continually for the desired value of the controlled variable. In such cases, to improve the response, the controller is designed to respond to the error signal in a more intricate manner, usually by anticipating future error by measuring the rate at which the present error is changing.
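The behaviour described in the last two paragraphs can be reproduced with a very small simulation. In the sketch below the comparator forms e = r - c and a proportional controller nudges the controlled variable toward the reference; the plant model (the controlled variable simply accumulates each correction) and the gain values are assumptions chosen for illustration. With a moderate gain the output settles smoothly; with an excessive gain the loop persistently overcorrects, so the error alternates in sign, which is exactly the hunting behaviour described above.

```python
# A minimal feedback loop: comparator (e = r - c) plus a proportional controller.
# The plant model and gains are illustrative assumptions, not from the text.

def run_loop(gain: float, steps: int = 12, r: float = 1.0) -> list[float]:
    c = 0.0                       # controlled variable starts away from the reference
    history = []
    for _ in range(steps):
        e = r - c                 # comparator / error detector
        c += gain * e             # controller raises c when e > 0, lowers it when e < 0
        history.append(round(c, 3))
    return history

if __name__ == "__main__":
    print("moderate gain :", run_loop(0.5))   # settles smoothly toward r = 1
    print("excessive gain:", run_loop(1.8))   # swings above and below r: "hunting"
```

The remedy the passage describes, anticipating future error from the rate at which the present error is changing, corresponds to adding a derivative (rate) term to such a controller; tuning that term is omitted here to keep the sketch short.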
Types of Systems. There is a large class of automatic control systems in which the input may change from one level to another occasionally but is constant at all other times. The primary function of such systems, termed regulators, is to compensate for the effects of unwanted disturbances acting on the controlled process. Familiar regulators include thermostatic temperature-control devices for ovens, air conditioners, and heating systems. The thermostat serves as the comparator in a regulator that maintains a desired constant temperature in spite of disturbances caused by the opening of doors and changes in outdoor temperature.
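A regulator of the thermostat kind can likewise be sketched in a few lines. The thermal model, set point, and switching band below are illustrative assumptions; the point is only that the on/off comparator holds the temperature near the set point despite a steady heat loss acting as the disturbance.

```python
# An on/off (thermostatic) regulator holding a room near a set point in spite
# of heat loss to the outdoors. All numerical values are assumed for illustration.

def simulate_thermostat(setpoint=20.0, band=0.5, outdoor=5.0, steps=60):
    temp, heater_on = 18.0, False
    trace = []
    for _ in range(steps):
        # thermostat as comparator: switch on below the band, off above it
        if temp < setpoint - band:
            heater_on = True
        elif temp > setpoint + band:
            heater_on = False
        heat_in = 1.0 if heater_on else 0.0        # furnace output per step
        heat_loss = 0.05 * (temp - outdoor)        # disturbance: loss to the outdoors
        temp += heat_in - heat_loss
        trace.append(round(temp, 2))
    return trace

if __name__ == "__main__":
    print(simulate_thermostat()[-10:])   # cycles within roughly one degree of the set point
```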
There is another large class of control systems, termed servomechanisms, in which the output is  a mechanical displacement of some object. In these systems the input may vary rapidly over a wide range, unwanted disturbances may act on the controlled process, and the system output is often at a location that is remote from the location of the input.
For most automatic control systems, feedback is essential to compensate for outside disturbances and unpredictable variations in the process being controlled. If no unwanted disturbances act on the controlled process and if the characteristics of the controlled process are known and unchanging, feedback is not required for automatic control. The controller can then be designed so that the system output is approximately proportional to the input with no use of feedback and hence no error detection.
AUTOMATION
A simple example of automation is the thermostatically controlled heating system in a home. The furnace provides the heat, but the thermostat automatically turns the furnace on and off to keep the temperature of the home constant. One machine starts and stops another. A more elaborate example of automation is the computer complex that controls an automobile production line or prepares a company payroll.
Automation may be defined as any continuous, integrated operation of a production system that uses electronic computers or related equipment to regulate and coordinate the quantity and quality of what is produced. Automatic control of production is achieved in factories by transfer machines, which move a product from place to place for successive operations.
Computers, transfer machines, and related equipment use the principle of 'feedback', a concept of control in which the input of machines is regulated by the machines' own output. Although the use of machines dates back to the steam engine of the 18th century and to the assembly line of the early 20th century, feedback is a new development truly unique to automation. (Under this definition, a farm cannot be called automated merely because of the hugeness of its tractor, since the principle of feedback is lacking.)
Automation covers the output both of physical products and of services. It may be used to administer work in any large organization, as in manufacturing, to produce automobiles, or in the insurance industry, to process data on vast numbers of policies. Automation may be used even by labor unions, churches, and other organizations that are large enough to need and afford the equipment. It has been reliably estimated that most of the recording activities of the New York Stock Exchange could be handled by one electronic computer and two operators.
Technology, of which automation is a component, is the application of science to practical uses. Man lived hundreds of thousands of years without it, until the Industrial Revolution in the 18th century, but only about 10 percent of the people were able to live above minimum subsistence, and they usually did this by enslaving the rest. Since the first Industrial Revolution, and during the present-day "automation revolution", the number of people living in poverty in industrialized countries has fallen to about 20 percent.
Nearly everybody knows that technology can solve a multitude of problems. Spectacular economic growth has been due in great part to advances in technology. Untold millions of people, especially in the underdeveloped parts of the world, fully expect science and technology to solve all of their most pressing problems.
Too few persons recognize that although technology solves countless old problems it also creates many new ones. Not all technological improvement is a net gain. In the first place, some new technology is necessary just to cure the ills of previous technology; for example, if afterburners are perfected for automobile exhaust, then the air will merely be as clean as it was before the automobile contaminated it. Secondly, some new technology is workable but not yet economical, as in the case of solar energy. Thirdly, nearly all forms of technology have enormous potential for human betterment but, if they are not clearly understood, technological advances can do more harm than good. This is especially true of automation.
Mass-production techniques, however, have produced a mental and physical dependency on machines. The complete effects of this dependence are not yet fully recognized. Although living standards in the industrialized world are the highest in history, much of industry has become dependent on automated machinery, and as a consequence people generally have become dependent on automation's products, such as washers, dryers, and automobiles. Reliance on these machines often tends to make society measure culture not in terms of intellectual or artistic accomplishment but in terms of such new concepts as automobile horse-power, cigarette mildness, and deodorant durability.
Automation as a Benefactor. Wernher von Braun, the rocket engineer, once said, "We can lick gravity but the paperwork is overwhelming", referring to the fact that the most powerful computers are needed to do the millions of calculations required to guide rockets as they take off into space. The staggering amount of arithmetic required for each space mission could never be done by human beings alone in time for the results to be useful.
The memory drum of a computer at a medical college holds millions of pieces of evidence regarding the results of certain types of treatment based on particular symptoms. Patients benefit because doctors no longer must rely on their own memories of a few similar cases. Other computers benefit science and education by holding Russian-English dictionaries in their memory drums and translating automatically (although not by any means perfectly, because single words have so many different meanings).
Some jobs cannot be done without automation. Because of their speed of operation, automatic transfer machines, electronic computers, and other automation equipment perform tasks that otherwise could not be accomplished, no matter how much power was used or how well the work was organized and managed. Manipulating an atomic pile or controlling a rapid chemical reaction could not be done without automation. Some new products, such as polyethylene, a soft but strong plastic used for making thousands of consumer products, could not have been produced without automation. Color television would not be possible without automatic control machinery because human beings by themselves cannot put the hundreds of thousands of colored dots in their right places in the tubes. In addition, automatic sensing devices operate under conditions that would be
deadly to man: in intense heat, in bitter cold, in poisonous gases, and in areas of atomic radiation.
In the pottery industry, silica dust has long been a hazard, but closed silos and automatic conveyors now handle all dust-producing materials. In a major automotive stamping plant, scrap steel formerly was collected at individual scrap collection areas, then was baled and moved on open conveyors to the central collection area. Workmen were exposed to physical dangers, and there were frequent injuries. Automatic equipment now enables the scrap to be put into balers, and closed conveyors move it to the collection area, where the scrap is loaded automatically.
Automation generally has resulted in greater efficiency and over-all cost savings for companies, and consumers have benefited from lower prices on a wide variety of products.
Automation as an Employer. Although many jobs have been eliminated by automation, others have been created. Before one of the Bell System telephone companies installed several electronic computers for office work, the processing of information to go into the company's operating plan required the accounting department to employ all available workers for nights, weekends, and holidays, and additional persons had to be borrowed from other departments. Everyone worked under great pressure until the plan was processed. After computers were installed, the company reported that the plan was handled in stride and with no strain. A deck of cards, automatically punched, replaced a mountain of laboriously typed reports. Nevertheless, more clerical workers had to be added. The explanation is that prior to automation most of the employees had not had time to do what they were supposed to do. Important analysis and planning had been postponed because of the burden of routine. Thus, the efficiency and comprehensiveness of the clerical work was greatly increased by means of automation.
How many new jobs may be created permanently in the manufacturing, selling, and servicing of computers is not yet known, because the industry itself is undergoing rapid change. There is some indication that the manufacture of automation equipment itself may become automated. In the "instrument production" industry, which manufactures automation equipment, employment declined sharply beginning in 1955. Employment in the production of nonconsumer machinery had risen by an average of about 12,000 jobs a year from 1900 to 1930 and by about 80,000 a year from 1939 to 1947, but then remained relatively constant until 1958 (while investment in machinery hit new records), and has fallen steadily ever since. Regardless of whether these trends continue, the jobs created by automation will be far different from the jobs destroyed and will require workers to obtain more education and training.
EFFECTS ON WORKERS AND JOBS
Advantages to Workers. Automation has many advantages for workers. It improves working conditions in several ways. Safety is improved by means of mechanized materials handling, elimination of the most hazardous jobs, and the reduction of the number of persons in direct production areas through the use of remote controls. For instance, dangerous operations are monitored with an electric eye or television equipment. As a result, hernia, eye troubles, and foot accidents have been greatly reduced in many automated plants.
In general, automation improves working conditions by permitting plants to be cleaner, neater, and more pleasant. Automated grain mills have eliminated all dust. Some foundry workers never touch molding sand except out of curiosity, and there are oil refinery workers who could wear dinner jackets and white gloves on the job and never get them soiled. Automation thus has certain esthetic advantages.
Job Losses. Mushrooming technological changes have had a much more serious effect on factory jobs than on office jobs. Although automation has increased factory output enormously, the total number of production workers has declined. The number of workers in factory production in the United States decreased from 40 percent to 30 percent of the labor force from 1950 to 1960, while over-all production increased about 40 percent and the population rose 20 percent.
Displacement of labor takes several forms. First, a worker may be laid off permanently with loss of seniority and other job rights. A second direct form of displacement involves transfer of the displaced worker to another department of the same firm. Several case studies have found departmental transfer to be a common occurrence. The decline in employment of production workers in the automobile industry has included both of these types.