
IEEE TRANSACTIONS ON EDUCATION, VOL. E-22, NO. 2, MAY 1979

Keeping up with the Computer Revolution

The revolution that started in the 1970's with the introduction of the first microprocessor continues. Engineers are now talking about 400 000 transistors on a logic chip by 1985; by that time, a single silicon chip sufficient for a mainframe central processor will be available. As general-purpose microcomputers become as cheap and plentiful as today's op amps or memory chips, there will be a significant change in the products of industry. All companies are going to make computer-applications products, whether they are now in computers, electronics, or even nonelectronics. They will do so for reasons of economics: every company will want to manufacture more expensive, profitable products by combining computers with other products. There is going to be real competition for engineers skilled at doing this. Another important factor is the rapid emergence of what is often called "silicon software": read-only memories in which software is hard-burned into chips for inclusion in the computer system. Clearly, this is a technology that must be understood by those who will do design.

The EE of the 80's will be required to understand a wide spectrum of technology. Besides the usual topics, he or she must understand computer system design, LSI fabrication, fiber optics, digital communications, and analog circuits. In analog circuits there is a comparable revolution, with a myriad of cheap modules becoming available to the designer, a phase-locked loop being an example. While depth is required so that the student has a "marketable skill," breadth is also desirable in view of the broad spectrum of applications. How can we meet this challenge?

The wrong question is being asked. If a school is not already up to speed, it ought to be asking, "How do we get on board? Do we have enough intestinal fortitude to break with certain traditions so as to include digital logic and computers as an integral part of the curriculum for every electrical engineering student?" If a school wishes to retain a viable program, the answer to the last question had better be "Yes." The 1978 General Electric Conference, for example, clearly demonstrated that the ubiquitous digital circuit is invading everything: locomotive control, nuclear power plants, local process control, the radio and television station, the microwave range, instrumentation, etc.

A few years ago, education in the computer area at Brigham Young University was conducted at the graduate level, i.e., as a specialty after receiving the first degree. It became apparent that digital concepts would shortly pervade the domain of the electrical engineer. If our Electrical Engineering Department was to attract a proper share of students and educate them in such a manner as to meet the challenge of a changing technology, a curriculum change seemed in order. Accordingly, that which was graduate work one day became freshman work the next day.

If properly introduced, the average freshman has no difficulty in understanding basic concepts of digital logic and computing systems and subsystems. Students entering the Electrical Engineering Department at Brigham Young University at the freshman level enroll in Introduction to Computer Engineering. At this stage, the concept of a volt, ampere, charge, magnetic phenomenon, etc., is generally not "resident." Two or three lectures on the laws of electricity and magnetism serve to eradicate false notions and to establish a proper vocabulary and a quantitative ability with simple series circuits, including the transistor.

The instructor goes to considerable effort to make the freshman feel comfortable in his first professional contact, under the byline "Freshmen: the highest we count in this class is to One." Simple single-concept ideas are formulated; these are reinforced with class feedback and frequent very short 2-7 minute quizzes. Because the requirement for knowledge in the digital field is becoming so widespread, students from such diverse areas as Psychology, Law Enforcement, English, Computer Science, Technology, Mathematics, Physics, Chemistry, other engineering fields, etc., enter and successfully complete this beginning course despite their nonelectrical engineering backgrounds.

Throughout the freshman and sophomore years, IC's are treated as terminal devices. This is most important! There is no need to present an unpalatable dose of solid-state physics at this level. The student does know that certain families exist and that he may need to select on the basis of speed, heat, noise immunity, etc.

The most distinctive feature of the Brigham Young University computer offering is the Digital Logic and Computer Systems Laboratory. This facility has been described elsewhere and, therefore, will only briefly be outlined here [1], [2]. A flexible and modular Digital Trainer has been developed which serves as a patchable panel to interconnect PC boards taken from an inventory ranging from a simple 7400 logic gate to a rather complex memory or microprocessor. This facility provides the student with rather immediate response to his design, a most desirable pedagogical technique.

The consequence of our program, with its emphasis on starting early in the freshman year, is that:

1. Some students move into the microprocessor course in the second semester of the freshman year. Emphasis here is not on high-level languages but rather on bit banging, data flow, and programming in hex, i.e., the hardware level.
2. Sophomore students build, among other projects, a miniprocessor controller at the end of the first semester.
3. At the end of the sophomore year, a minicomputer is built on a project basis.
4. A rather complex project is completed in the third year.

It is recognized that while the logical problems are simple, the real problems of delay, sync, drive level, etc., require an electrical engineering circuits background. All of our electrical engineering students are educated to pass the EIT examination; i.e., even computer-option students receive a good dose of circuits and electronics. In effect, the computer-option student has developed, very early in his academic program, sufficient expertise that we cannot meet the demand for his services at the local level. The parallel demands of:

1. local research and development firms,
2. laboratory assistantships,
3. Campus Computer Services, and
4. private consulting

0018-9359/79/0500-0039$00.75 © 1979 IEEE


provide employment opportunities such that essentially each graduate will have had considerable experience prior to graduation. We effectively have a built-in co-op program without the attendant burden. The electronics and power option students (25% of the total in each) not only take the freshman course, but a sequential logic course and a microprocessor course are also recommended for them. The early education provides well-motivated juniors and seniors who can move effectively to incorporate technological change.

And now, back to the original question: keeping up in the sort of atmosphere described is not too difficult. The major hurdle is the first-step jump (and for some schools, major curricular surgery is indicated). After the initial transient, it is comforting to know that no one has really invented a substantially new architecture recently; the basics, well taught, readily lead the engineer into new technology.

The result of the early-start concept has not gone unnoticed by the electronics and power option faculties. A complete curricular revision has taken place in the areas of circuits and electronics; in fact, these concepts are integrated into the same course so that laboratory experiments can start at an early date. A new text is being written to accommodate this teaching pattern. The student no longer needs to complete circuits before starting electronics or power applications.

Our success is credited to a highly motivated student body.

This is brought about by:

1. getting started early at the freshman level,
2. having an open digital laboratory which allows rapid and meaningful component and/or system synthesis,
3. changing the curriculum to fit technological changes,
4. avoiding excessive unneeded abstractions at the undergraduate level (Turing machines, information theory, etc.),
5. offering practical employment parallel to the academic program, made possible by their realistic laboratory experience, thereby facing the student with the real world,
6. using as an underlying motivation the objective of graduating an individual qualified to practice as a professional engineer, and
7. a dedicated, innovative faculty.

REFERENCES

[1] P. M. Hansen and R. Ohran, "Hands-On Approach to Digital Systems Instruction," presentation to the Model Curriculum Committee, Illinois State Univ., June 1976.

[2] P. M. Hansen and R. S. Ohran, "Innovative Approach to Digital Systems Instruction," presented at the IEEE 1975 Region Six Conference, Salt Lake City, UT, May 1975.

JENS J. JONSSON
Chairman
Electrical Engineering Department
Brigham Young University
Provo, UT 84602

My major concern is with the question of the real integrity of the apparent discipline of computers. The amount of basic material to be understood is really quite small. There is a general lack of basic unifying concepts. Everything is quite special and evolving. To proceed without regard to these facts in teaching everything from a digital point of view is conceptually equivalent (admittedly with some exaggeration) to the creation, a few decades ago, of a course on 10K resistors and yet another on 15K resistors, or possibly, as the age of enlightenment dawned, one on carbon resistors and another on metal-film devices.

There is yet another very real danger in rushing willy-nilly to chase the computer wraith. It is the potential lack of recognition that the computer is merely a tool and that a tool is needful of a task. To train our students in the sharpening of axes without exposing them to the marvels of trees and forests is to abandon the integrity of the process. For it is in the pursuit of trees with axes that one conceives of saws, and from saws the possibility of solutions to other problems, the creation of cabinets for example.

Computing provides only a few basic lessons; the number of these is not large, nor is their profundity great. It encourages orderly and algorithmic processes appropriate to all kinds of engineering activities, including design. Computers taught at an early stage can motivate these ideas without a ponderous collection of obscure basic facts being needed. Another basic attribute of computing, fostered by its intrinsic simplicity, is its potential to teach two important lessons: 1) that much can be done with little if properly organized, and 2) that one should seek everywhere for as elegant, basic, and canonic a solution to all problems as the computer provides for the many for which it is suited.

Since the computer's major strength is its lack of intrinsic profundity, I agree that its formalism can be, and possibly should be, taught early, not so much for its own sake but as an easily motivated basis for some engineering principles of greater applicability in areas in which computers can play no immediate part.

Though importance is attached by some to "silicon software," or read-only memories, I believe that there is nothing particularly profound or fascinating about this. It simply implies that one can make fairly complex computer-based standardized modules at very low cost. It is not that one can do that sort of thing, it is what one should do with that sort of thing that is important. The latter is indeed an interesting question but has little or nothing to do with computers per se. It may, however, have a lot to do with signal processing, etc.

Correspondingly, there is no intrinsic need for an engineer in the modern world in which microprocessors proliferate to know anything particularly complete about computer system design or LSI fabrication, particularly the latter. What is most important is only that, in courses in some root discipline such as signal processing, explicit digital manipulation or even computation can be used economically and at reasonable, though not very high, speed.

Indeed, the EE of the 80's will be required to understand a wide spectrum of technology. To think that computing is it is too simple-minded. Indeed, to understand such things as fiber optics, digital communications, and analog circuits appears to me to be more basic and long-lasting. On the question of meeting the challenge of the 80's, we must be very careful to avoid running in all directions or, alternatively, running only in one. There is really very little hope of succeeding by chasing the latest innovation. One could argue that computing is not a very recent development, and I would agree, but I would also argue that the amount of basic material intrinsic to the last three decades of development of computers is relatively sparse in really durable ideas. Education in the 80's must deal with principles, with a sampling of breadth and with a judicious sampling of depth in isolated areas and in areas which are related. It is by the latter means that one would educate in knowing what it is to know something well, and train the ability to extrapolate deep knowledge in one area into immediately useful knowledge in another.

K. C. SMITH
University of Toronto
Toronto, Ont., Canada


Electrical Engineering Educators, perhaps more than any other group in academia, deal with a technology that appears to be exploding in all directions. The computer revolution has either been the detonator or a contributing force in the various explosive areas. How do we keep pace with this technological revolution caused by the computer? While there is no substitute for a good solid scientific base, we must not stop there. We must be aggressive and imaginative and think well beyond the basic concepts. I believe that it is extremely important to expose our students to as large a variety of our technology as possible. While it is virtually impossible to expose them to everything, many of the elements within the technology spectrum are well within our economic grasp.

I believe that one of the most efficient mechanisms for teaching, or at least exposing, our students to some of the facets and ramifications of new technology is the laboratory. Much of the new technology is not prohibitively expensive; in fact, it is cheap. Today the innovative professor can accomplish with one hundred dollars experiments which would have cost thousands of dollars in years gone by. Cheap components coupled with carefully designed experiments can now provide breadth and depth of material coverage to solidify the theory via its application in a variety of areas.

For example, it is imperative that our students be well versed in the fundamentals of digital computer organization, programming, and interfacing. But this alone is not enough. The students should understand the new microcomputer and the enormous potential it has in systems of all types. Nowhere can this be done better than in the laboratory, where the students are forced to apply and extend their knowledge.

Although some of my contemporaries believe that only a minimum laboratory experience is sufficient, I feel that many laboratories judiciously situated throughout the curriculum are necessary and viable mechanisms for broadening the students' technology while strengthening their understanding of the basic underlying theory.

J. DAVID IRWIN
Head
Department of Electrical Engineering
Auburn University
Auburn, AL 36830

The principal problem which confronts the electrical engineering educator in "keeping up with the computer revolution" is no different from that which has continuously confronted him in "keeping up with the electrical engineering revolution"; namely, how to modify the curriculum so that the education of today's graduate is consistent with the current technical content of the profession.

Regarding computer education in the electrical engineering curriculum, I firmly believe that today's graduate at the baccalaureate level should have devoted 14 to 15 credit hours of study to digital computer subjects. These should include higher-level language and assembly language programming and an introduction to systems programming. Of the more hardware-oriented subjects, the graduate should have studied the elements of digital system design and the architecture of microcomputers. Finally, the graduate should have had significant "hands on" laboratory experience concerned with both the hardware and software aspects of using microcomputers as a system element.

I further believe that this same graduate should have devoted 24 to 25 credit hours of study to the theories of electric networks, electronic circuits, transforms, electromagnetics, random processes, and solid-state devices. At the systems and design level, he should have had twelve credit hours of introductory courses in control systems, communication systems, electronic systems, and power systems. Concurrently, he should have had about six credit hours of "hands on" laboratory experience which develop laboratory technique and provide confidence and capability in using the experimental method as a problem-solving tool. Finally, the graduate must have been permitted some time in the curriculum for additional study in areas of specialization which are of particular interest.

To successfully complete this kind of program, the graduate must have studied about 30 credit hours of mathematics and physics. He should have spent about six credit hours in the further development of oral and written communication skills and about six credit hours in developing an appreciation and knowledge of the elements of the economic system within which he will be expected to function. Moreover, the graduate still is required to have participated in 15 credit hours of study in the humanities and social sciences.

In my view, it is essential that the entire program require only four years to complete. This means the total credit-hour requirement must be between 120 and 130 hours.

For such a curriculum to be practical, certain constraints must prevail. First, the faculty must resist the temptation to REQUIRE more study in an area than is absolutely necessary to give the student a basic understanding of the subject matter. (All too often we require more courses than necessary in an area because of intensive faculty interest.) Furthermore, the faculty must place more importance on requiring a broad education over electrical engineering than over all of engineering. This means that the study of subjects such as mechanics, thermodynamics, graphics, heat transfer, etc., should not be required. Rather, these should enter a student's plan of study as electives when they are needed to complement his area of specialization.

In summary, I contend that electrical engineering education should have no difficulty in keeping up with the computer revolution. In fact, I contend that, with discipline, it can continuously provide a four-year program of study which is consistent with the current technical content of the profession.

C. L. COATES
Head
School of Electrical Engineering
Purdue University
West Lafayette, IN 47907

THE IMPACT OF THE COMPUTER REVOLUTION ON COMMUNICATIONS AND SIGNAL PROCESSING

This statement addresses the impact of the computer revolution on Communications and Signal Processing. In recent years the availability of high-speed digital computers has encouraged the development of increasingly complex and sophisticated signal processing algorithms. Large-scale integration (LSI) and grand-scale integration (GSI) technologies, as well as the availability of fast microprocessors, have resulted in sophisticated and efficient special-purpose signal processors. Furthermore, there is a growing emphasis on analog sampled-data devices.

To cope with these advances, an increased emphasis has been placed in our undergraduate curriculum on the areas of digital filters, digital signal processing, and computer simulation. Since the mathematical and systems background required to comprehend analog and digital signal processing is similar, and since analog and digital signal processing techniques will complement each other in years to come, a marriage of the fundamental techniques is required, rather than the introduction of additional courses.

Instruction in communications and signal processing can take place at two levels, the conceptual and the applied.

At the conceptual level, the study of deterministic concepts of signal and system analysis can be done in a unified way. Continuous Fourier and discrete Fourier analysis can be considered as special cases of orthonormal decompositions. Laplace and z-transforms can be taught in parallel. Courses in network theory can deal with the background required for both analog and digital filter design, such as approximation theory. Mapping techniques make possible transferring from the analog to the digital domain and vice versa. Probabilistic concepts should also be used as a unifying factor for both areas. Random signals (discrete and continuous) may be taught and the different characteristics of noise arising in analog and digital systems pointed out.

At the applied level, systems and signals may be implemented either in hardware or in software. In hardware implementations, recent advances in microelectronics and microprocessors should be emphasized. In software implementations, improved algorithms and programming techniques should be introduced to aid simulation.

The Communications Group in the Department of Electrical Engineering, University of Toronto, has implemented some of the above ideas in the 3rd- and 4th-year courses, and the student response has been favorable. Specifically, in the 3rd-year Probability course, the idea of computer simulation is introduced by means of pseudo-random number generators and simple problems in queuing analysis. In the 3rd-year Signals and Systems courses, in addition to the usual analog, time-domain, and frequency-domain techniques, topics such as z-transforms, discrete Fourier transforms, fast Fourier transforms (FFT), and digital filters are introduced. In the 4th-year Communication courses, extensive use is made of the ideas introduced in the previous years. Examples are computer problems dealing with the use of digital filters and FFT techniques in the problem of power spectral estimation, and the performance evaluation of a communication system by means of computer simulation of signals and Gaussian noise.

In summary, we feel that the breadth of knowledge required of the present-day student to become "marketable" in the computer-based industries can be easily incorporated within the framework of the present-day Electrical Engineering curriculum by small, but significant, changes at different levels. The marriage of analog and digital techniques, and experiments involving hardware as well as computer simulation, help us in meeting this challenge posed by the computer revolution in the fields of Communications and Signal Processing.

S. PASUPATHY
A. N. VENETSANOPOULOS
Department of Electrical Engineering
University of Toronto
Toronto, Ont., Canada

Before addressing the question of how to keep up with the computer revolution, we must first understand the computer revolution itself. The computer revolution is the product of the availability of cheap high-level functions in small boxes.

Currently, complete processors are available on single chips. In the future, even larger processors and complete computers will be placed on single chips. This has drastically affected the economics of computers, making them available for use in a wide range of products. However, basic computer architectures have not been revolutionized by the computer revolution. Instead, they have continued a steady evolution. Similarly, other aspects of computing theory have been affected only insofar as experiments which were prohibitively expensive in the past are now possible. The revolution of the computer revolution is the revolution of computer applications, not computer theory.

With this understanding, the question becomes not how to keep up, but what fundamental principles and techniques of computing and computer hardware should be taught. If educators try to keep up with every manufacturer's latest hardware and software product, they will be swamped with data that are out of date by the time they have been incorporated into a course. Instead, if courses are oriented toward the fundamental principles and techniques, which evolve more slowly, and use recent products as examples, they will produce graduates who can quickly evaluate and adapt to the rapid product changes of the computer revolution. However, if courses are oriented toward case studies of the current product lines, graduates may miss principles and techniques, and this will prevent them from rapidly adapting to new products and developments.

The computer revolution is not a revolution of theory but of application, and the question of keeping up must be analyzed in that light.

E. B. WAGSTAFF
School of Electrical Engineering
Georgia Institute of Technology
Atlanta, GA 30332

At UCLA, the Computer Science Department is separate from the Department of Electrical Sciences and Engineering. This causes some difficulties in coordinating the teaching of microcomputers. In the present arrangement, the Computer Science Department offers courses in the architecture of digital computers, including microcomputers, and also laboratory courses which allow hands-on experience with up-to-date hardware. A new graduate course on very-large-scale integration (VLSI), offered by our Department, discusses microprocessors from the viewpoint of the integrated-circuit designer. It covers the technology, hardware/software design, and architecture of VLSI systems.

An important new course, still in the planning stage, will discuss the structure, components, programming, and application of microcomputers at the level required by a casual user. This will be included in the undergraduate course sequence offered by our Department in Applied Electronics.

It should be pointed out that most of the important featuresof microprocessors are common with other, larger, computers.Except for a few special aspects (such as bit slicing), existingcourses taught on the process-controlling application of on-line computers remain applicable to microcomputers.

GABOR C. TEMESDepartment of Electrical EngineeringUniversity of California, Los AngelesLos Angeles, CA 90024

42

IEEE TRANSACTIONS ON EDUCATION, VOL. E-22, NO. 2, MAY 1979

The computer revolution is most often graphically depictedby the increased density in integrated circuits. It is indeedspectacular that the number of transistors per chip has in-creased by an order of magnitude every 5 years and may reach1,000,000 by 1985. This has brought the dramatic reductionin size and price. Computers have been applied to new activi-ties and greater function has been added to computers. Theso-called minicomputers have increased the typical mainmemory capacity more than an order of magnitude every 5years for the past 10 years.The trend is clear that, whether the computer is imbedded

in another product or used as a computer, the amount ofsoftware is increasing rapidly. To be sure, electrical engineerswill need to know much more about software, and testing/checkout/diagnostics for both hardware and software. But,the vast amounts of logical functions available to the electricalengineer make his/her world take on new enlarged dimensions.First, the engineer should be looking at a broader, total solu-tion to problems. It is now possible to have a significantimpact in improving the way people do things. Second, dueto the size and complexity of tasks that can be built intocomputers, groups of people must coordinate their designsand simplify interconnections. Third, the interface to theuser must be greatly improved. People should be able to solvetheir problems without facing the barriers of the computersystem.Hence, it is more important to learn about problem formula-

tion and solutions than too much focus on current buildingblocks or technology.

E. DAVID CROCKETTHewlett-Packard CompanyGeneral Systems DivisionSanta Clara, CA 95050

THE IMPACT OF VLSI ON COMPUTERSCIENCE EDUCATION

The new world of computation done in the LSI medium ischaracterized by some good news and some very bad news.The good news is that now logic and memory are made of

the same stuff; implemented in a uniform technology. Vastamounts of concurrent computation can be achieved by mix-ing computational elements in with memory elements, notseparating them artificially as is done in Von Neumann ma-chines. This approach to computation has the potential fororders of magnitude increase in performance and decrease inthe energy, area, and time required for computation.The bad news is that Computer Science is going to have to

completely change its way of doing business to take advantageof the properties of the stuff out of which the brave new worldof computers will be constructed. In the new medium, movingdata around consumes most of the time, area and energy in asystem. Logical operations when done locally are virtuallyfree. This property assures that topology and topologicalproperties of computations will never be hidden from theuser of computation, even at the highest level. No longer isthere a clear distinction between hardware and software.From now on, programming will consist of the act of mappinga computation with certain topological properties on to acomputational structure with a different set of topologicalproperties.We are no longer blessed (or cursed) with artificial levels of

packaging to provide de facto levels of organization and inter-connect. Topological constraints are present at all levels ofdesign. No memory is truly random access. Some parts arealways closer, and therefore faster, and requiring less energyper access than others. The Von Neumann style no longerprovides us a guide. We are on our own. While it may bedifficult for machine designers to cope with the new worldof VLSI, it will be much more difficult for conventionalprogrammers to make this transition. The theory of analysisof algorithms and complexity of computation will requiremajor revisions. Rather than a de facto mapping into a singlecomputation stream, complexity theory is faced with a topo-logical mapping between the properties of the algorithm andthe properties of the computing structure. In this mapping,the preservation of geometric locality is richly rewarded.However, today we have no good mathematical measure ofwhat locality means or how it can be expressed or measured.The enormous degree of concurrency which is there for theasking can only be used by algorithms of a fundamentallydifferent sort than have been developed historically. Manyalgorithms for sequential machines destroy the locality whichis inherent in the problem in order to save total computationalsteps. The "fast" Fourier transform and quicksort algorithmsachieve a small number of operations at the expense of shippingdata globally on a massive scale. Traditional computer languagesnot only have the awkward problems so eloquently describedby Backus,I they are unable to represent computations whichinvolve a very large number of things happening concurrently.The optimizing compiler of the future must be able to de-compose problems into a vast number of independent pro-cesses and map these processes on to some computing structurewith a given topological configuration.Arising out of this brave new world are three new disciplines

fundamentally different from those which are attacked in cur-rent computer science curricula.

(1) The design of machines, algorithms and notations forvery large degrees of concurrency.

(2) A complexity theory for algorithms, computing struc-tures, and the mapping between them.

(3) An amorphous area which I call the physics of computa-tion.

The third area involves lower bounds or the time, area andenergy required for any given computation. It includes primi-tives from information theory, computational complexity andthermodynamics. The combination of the last two areasshould, in the not too distant future, be able to express alower limit on the complexity of a computation by the un-avoidable energy which is necessary for its switching and com-munications requirements.All major academic disciplines are faced with periodic major

revolutions in their content. Physics has been through severalby now and there may be another in the wings. ComputerScience has been able to exist for nearly half a century with-out a major revolution in its approach or the contents of itscurricula. We should welcome the challenge of a major revolu-tion in this area posed by the new technology.

REFERENCES1Backus, J. Can programming be liberated from the von Neumannstyle? A functional style and its algebra of programs. Communicationsof the ACM (August 1978); Volume 21, Number 8.

CARVER A. MEADDepartment of Computer ScienceCalifornia Institute of TechnologyPasadena, CA 91109

43