GUEST EDITORIAL

Entering the non-von-Neumann era

It is very interesting to contrast the development of digital computers with the history of guided-weapon development during the 1950s. From the outset, guided-weapon design showed great diversity of approach, with air frame, engines and guidance using widely differing techniques, depending on the particular application. For instance, the first successful missile probably used beam-riding electronics, but this had obvious deficiencies in some situations, and semiactive radar, active radar and infrared homing systems were brought in. The air frames could be cruciform, canard or twist and steer, whereas propulsion was by solid- or liquid-fuel rockets or ramjets. There was even a missile system in which a kamikaze pigeon steered the missile towards its target.

The latter reminds me of an accountant I was showing around the latest hardware for the ICL 1900. He said to me, 'You know, I don't care if you have little men turning handles in there, as long as I get the pay-roll out on Thursday'. It is true that computers have many users with a technical interest in the devices themselves, but the vast majority just want their computer load carried out to schedule with the minimum of trouble and expense. We must never forget that computer architecture, which seems so important and interesting to the designer, is, to the vast majority of users, only the rather abstract background of a routine tool. Returning to the comparison with guided weapons, it cannot be denied that the type of computer system which has been designed during the last 30 years looks unbelievably narrow compared with the rich technical alternatives in the guided-weapon field.

The first computers were designed to suit the purpose of war-time ballistic calculations and decryption, and their hardware, seen from today's standpoint, was unbelievably primitive. These machines are linked with the names of von Neumann and Turing, and their immediate post-war successors were slow serial machines which followed the same principles, and so it has continued. In some respects, the very early machines had attractive features which later disappeared; for example, all the early machines could address every word in memory directly, which meant, for instance, that the Zebra computer could perform a fast serial associative search using one microprogram bit. Another example is the early Ferranti machines which had a random-number generator built into the order code.

The narrowness of design for computer structures is becoming increasingly obvious to research workers, and the search for alternatives is now on. It might be argued that all computers should be von Neumann computers, because it is theoretically possible for such a machine to model any other processor. There are three elements which militate against this, the first being time. If a novel architecture can do something a hundred or a thousand times quicker than the von Neumann design, then, l.s.i. notwithstanding, it ought to be done that way, given that the demand exists. The second aspect of the current scene which must be considered unsatisfactory is large-scale programming. Many brilliant people in many countries have applied their skills to the programming of large computer systems for multiple users, and the results are nearly always comparable, and somehow unsatisfying. An alternative architecture might well lead to simpler programming. At present, structured programming is considered to be a panacea, and it has achieved useful results, but, just as good design of mechanical structures needs a thorough knowledge of the structural components used, so good program design needs a thorough knowledge of the appropriate computer structures, and this is all too often lacking. The third unsatisfactory aspect of the von Neumann machine is that, as developed with file methods and operating systems, it does not seem to allow us to design a general multiprocessor system which facilitates buying processors rather like one buys butter: by the pound or half pound.

A new architecture, or set of new architectures, could well lead to significant advances in all the above areas, and provide exciting work for designers as well as giving users a friendly work horse. Many of the necessary new architectures and ideas which solve a class of problem and are simple to program have been around for years, but have had a frosty reception, both academically and industrially. What is needed is a considerable act of synthesis.

We are now at the turning point. Gordon Moore, President of INTEL, at a Caltech v.l.s.i. conference, said that it was going to be possible to put many more components on a silicon chip, but that if people asked him what he could think of to put on such a chip he could at present only say bigger processors and more memory. Although he did not say so explicitly, he was almost certainly referring to von Neumann processors. Subsequently, at the 3-day conference, many speakers stressed that what they needed was a new architecture, but hardly anybody talked specifically about new architectural possibilities.

We need to examine very carefully the question of which route computer science and engineering can use to take us into the new design space, because at present there would seem to be very little room for manoeuvre. There are now excellent microcomputers, which will soon have the power of minicomputers, and the price of a complete system is staggeringly low. At the other end of the spectrum, Amdahl and Cray are nuzzling up to the velocity of light. Until three weeks ago,* the mainframe companies were faced with a unique problem: 'how much do you ask for a $1000 processor from a customer with $1 000 000 to spend?' The IBM 4300 series announcement seems to say that you sell it for about one-fifth of the machine it replaces, which means that mainframe companies, in the long term, have to find a way of selling much more hardware to their customers.

With the possibility of a million components on a chip quite soon, and with subnanosecond logic speeds, what can we do? More important, what should we do? In my view, computer science and engineering of systems has fragmented into separate areas, in each of which very talented people 'beaver away' assuming that all the other areas are going to stay almost as they are. The situation makes the gulf of the 1960s between hardware and software look trivial. It is worth identifying each of these areas with some general remarks.

(a) Main line hardware design
The work in this area has largely retired into silicon chip companies. At subnanosecond speed there is very little that a system designer can do without a firm knowledge of chip manufacturing technology. There are some non-von-Neumann machines in various stages of proposal and design, but some of them look as though the only intention is to mimic a large von Neumann processor with a non-von-Neumann architecture.

* At the time of writing.



It has been recognised for many years that the easiest way to access files by name is to use content-addressable memory, and the same technique was shown by Brooker1 in the 1960s to be needed for a large percentage of the operations required within a compiler. The amazing thing is that, while World War II was still being fought, Vannevar Bush, the American scientist and controller of the war-time US scientific effort, wrote a paper showing how a computer with content-addressable memory would be far superior to a numerical one, since the machine operations exactly model the way that we think. The article, which was published in Atlantic Monthly in July 1945, is entitled 'As We May Think', and seems to be known only to a very few people.
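
The contrast between addressing by location and addressing by content can be suggested in a few lines of present-day code. The sketch below is only a toy software model of an associative store, with invented names throughout (AssociativeStore, lookup); a real content-addressable memory would perform the match against every stored word in parallel in hardware.

```python
# Toy model of a content-addressable (associative) store.
# A hardware CAM would compare the key against every entry in parallel;
# here the parallel match is only simulated with a list comprehension.

class AssociativeStore:
    def __init__(self):
        self.entries = []          # each entry is a (key, value) pair

    def write(self, key, value):
        self.entries.append((key, value))

    def lookup(self, key):
        """Return every value whose stored key matches the query."""
        return [v for k, v in self.entries if k == key]

# Location-addressed access: the caller must already know *where* the
# record lives.
files_by_block = ["PAYROLL.DAT", "LEDGER.DAT", "STOCK.DAT"]
print(files_by_block[1])                 # -> LEDGER.DAT

# Content-addressed access: the caller supplies the *name* and the store
# finds the matching record itself, as in a compiler's symbol table or a
# file directory searched by name.
store = AssociativeStore()
store.write("PAYROLL", "block 0")
store.write("LEDGER", "block 1")
store.write("STOCK", "block 2")
print(store.lookup("LEDGER"))            # -> ['block 1']
```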

Again, multiprocessor traffic hardware simulators, capable of dealing with all types of operational research problems, were developed at the University of Manchester Institute of Science and Technology and at Cambridge University during the early 1960s. These worked very well and, within their own problem area, gave typically a thousand times Atlas power. Theses, papers and books were written, but no commercial developments have taken place. Many similar ideas can be cited, and one generalisation for non-von-Neumann computers is to regard them as special-purpose electronic peripherals,2 interfaced to a normal computer system.

(b) Programming
It is surprising how languages with radically different approaches compared to Fortran etc. seem to get very little attention. The use of APL, whether for science or business, immediately gives the user a totally different feel for his relationship to the machine and to his way of solving problems. Similarly, Snobol gives a very different feel and appears to be a natural language for use with associative addressing. An important paper by John Backus3 (which needs a long vacation to be read properly) discusses non-von-Neumann programming. In it, he shows that there are permanent disadvantages in programming for a von Neumann machine and proposes functional programming as a liberating alternative. One can see links with APL and directed graphs in this work, but no suggestions about hardware implementation.
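
Backus's contrast can be suggested, though only loosely, in present-day code. The fragment below sets a conventional word-at-a-time inner-product loop against the same computation written as a composition of whole-structure operations, in the spirit of his functional style; it is a paraphrase in Python, not Backus's FP notation, and the helper names (compose, times_each, insert_plus) are invented for the illustration.

```python
from functools import reduce

# Von Neumann style: an inner product built word-at-a-time, the programmer
# shuttling values through an explicit accumulator and index.
def inner_product_loop(xs, ys):
    total = 0
    for i in range(len(xs)):
        total = total + xs[i] * ys[i]
    return total

# Functional style (loosely after Backus): the same program expressed as a
# composition of whole-structure operations -- transpose, apply-to-all
# multiply, then insert (reduce) addition -- with no assignment statements.
def compose(f, g):
    return lambda x: f(g(x))

transpose   = lambda pair: list(zip(*pair))
times_each  = lambda pairs: [a * b for a, b in pairs]
insert_plus = lambda values: reduce(lambda a, b: a + b, values, 0)

inner_product = compose(insert_plus, compose(times_each, transpose))

print(inner_product_loop([1, 2, 3], [4, 5, 6]))   # -> 32
print(inner_product(([1, 2, 3], [4, 5, 6])))      # -> 32
```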

(c) Computation theory
This is another subject where the average reader needs a vacation to consider the fundamental papers, such as the one by Karp and Miller.4 However, if one wishes to design a computer structure (either hardware or software) to be logically sound, even with many parallel data paths, then obedience to Karp and Miller's basic structural rules will yield a product which can be algorithmically tested for logical soundness. This major benefit has been almost entirely ignored by designers at large, in spite of the fact that logical soundness is the best possible short cut for simulation. (The Karp and Miller mathematics supports an infinite hierarchy of functions.)
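
As a much-simplified illustration of what 'algorithmically tested' means here, the sketch below represents a parallel computation as a directed graph of operations and mechanically checks one structural property: absence of cycles, so that some execution order exists at all. This is a toy stand-in and not Karp and Miller's actual determinacy conditions for program schemata; the graphs and function name are invented for the example.

```python
# Toy structural check on a parallel computation graph: each key names an
# operation, each value lists the operations whose results it consumes.
# A cycle would mean no execution order exists; a genuine analysis in the
# Karp-Miller style proves much stronger properties (e.g. determinacy).

def has_valid_schedule(deps):
    """Return True if the dependence graph is acyclic (Kahn's algorithm)."""
    remaining = {op: set(pre) for op, pre in deps.items()}
    while remaining:
        ready = [op for op, pre in remaining.items() if not pre]
        if not ready:                 # every remaining op waits on another
            return False              # -> cycle, structurally unsound
        for op in ready:
            del remaining[op]
            for pre in remaining.values():
                pre.discard(op)
    return True

sound   = {"read": [], "scale": ["read"], "sum": ["read"], "out": ["scale", "sum"]}
unsound = {"a": ["b"], "b": ["a"]}

print(has_valid_schedule(sound))      # -> True
print(has_valid_schedule(unsound))    # -> False
```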

(d) Computer-aided design of systems
This area seems to be very separate from the rest of computing. At the c.a.d. conference last March, there was no-one from ICL or IBM on the preliminary list of delegates. There are several approaches to c.a.d. for systems, some graphical and some using register transfer languages; in fact, from an analytic point of view, the two approaches are virtually identical and one can up-compile and down-compile between them. Furthermore, trivial attention to formality allows the use of Karp and Miller's work, although only one group seems to have implemented it. During the 1960s I was very struck by the detailed resemblance between the c.a.d. programs that were evolving in ICL and the structure of typical compilers. In the future it could well be that we shall see compilers modelling themselves on some of the methods that have been developed for computer-aided design of systems.

Conclusion

If we are to move ahead then we must first of all persuade people from all the above groups to work together far more closely than in the past. Secondly, we must examine the whole user-computer-program relationship much more closely. After reading Vannevar Bush's article one is tempted to say that we have had 30 years of computing numbers on von Neumann machines and now perhaps we should have at least 10 years of manipulating words on Vannevar Bush machines. Certainly nonnumeric processing can profitably absorb a lot of our time and effort, and the vital statistics of language are little known in computer design teams.5

A study of office procedures by the Xerox Corporation shows that a typical office organisation can be represented as a directed graph, and there is no reason why such studies in different user areas should not be linked much more directly into the computer system. The other task we have in the future is to apply very much more hardware (in quantity, not cost) within today's processors to help people use computers to the best advantage. The most promising area for this is the man-machine interface, which ought to be designed using all the well-known facts of psychology and ergonomics coupled with 3D colour graphic displays. It is within our power, and very necessary, to give the computer a friendly aspect even to the nonskilled user.

Perhaps I can close by saying that it is possible to find the basis of many alternative system architectures in the brief remarks above. At least two methods of using computing machinery without a conventional compiler are fairly evident, and the detection of these is left as an exercise for the reader.

F.G. HEATH

Heriot-Watt University

References

1 BROOKER, A.: IEE Computer Technology Conference, 1967
2 HEATH, F.G.: 'Electronic peripherals: autonomous hardware functions', Infotech State of the Art Report, 1971, 5, p. 405
3 BACKUS, J.: 'Can programming be liberated from the von Neumann style? A functional style and its algebra of programs', Commun. ACM, 1978, 21
4 KARP, R.M., and MILLER, R.E.: 'Parallel program schemata', J. Comput. & Syst. Sci., 1969, 3
5 KUCERA, H., and NELSON FRANCIS, W.: 'Computational analysis of present-day American English' (Brown University Press, Providence, Rhode Island, 1966)

The editors invite correspondence on this, and future, guest editorials for possible publication.

COMPUTERS AND DIGITAL TECHNIQUES, APRIL 1979, Vol. 2, No. 2