Software Renewal: A Case Study

Harry M. Sneed, Software Engineering Service

Error-free software in large applications may be possible only by respecifying the original design, and may be affordable only when automatic tools become available.

The data processing community needs to apply software engineering techniques and tools to real projects to determine their practical usefulness. Such an opportunity was provided by the Bertelsmann Publishing Corporation of Gütersloh, West Germany, during a two-year period from 1981 to 1983. This article reports the results of that project and the experience gained from it.

The application

In 1981 Bertelsmann, the world's second largest publishing company, completed a commercial application system for distributing books and other publications throughout the world. The system, which includes ordering, billing, packing, and distributing, runs on line during the day and in batch mode during the night. It is made up of eight subsystems, with more than 1000 modules and 300,000 PL/I source statements. The system was developed with the help of IBM analysts using the HIPO method for design1 and a decision table generator, Dectab, for coding.2 The database, which contains some 5000 data items, was constructed using the Adabas database system.3

Testing and documenting such a large system proved to be a problem. Due to application size and complexity, the modules needed to be tested individually, but at the time of development no adequate test tools were available. The original documentation, in the form of HIPO charts, was soon outdated as the programs were altered during testing. So great was the difference between the original design documentation, which was never fully completed, and the final programs that the design had to be discarded.

This left the system without a baseline documentation. The only true description of the application was the programs themselves, but these were in a constant state of fluctuation, as there were numerous error reports, at the rate of one error per 175 source lines, and user change requests, at the rate of one change per two and a half modules, during the installation phase.

At this critical point, after one year of operation, Bertelsmann decided to try to reconstruct a requirements specification from the programs themselves, and to systematically test the modules with the aid of automated tools. One subsystem of the total system was chosen: the mailing system, with 232 modules and some 24,000 lines of PL/I source code. By respecifying, redocumenting, and reverifying this one subsystem, we hoped to determine whether or not the entire system could be reconstructed in accordance with the principles of software engineering. Like urban renewal, we wanted to find out whether software systems could be economically renovated.

Project strategy

The strategy of the project was to proceed in four stages. In the first stage, the modules were to be statically analyzed and redocumented with the aid of an automated static analyzer. In the second stage, the programs and data structures were to be formally specified using an automated specification tool based on the structured analysis method4 and the documents produced by the static analyzer. In the third stage, the module test cases were to be written in a test specification language based on the assertion method5 and executed by a module test system with the objective of achieving 90-percent branch coverage.6 In the fourth and final stage, the test specification, in the form of assertion procedures, was to be merged with the functional specification of the programs and their data structures to create a production environment for maintenance and further development7 (Figure 1).

0740-7459/84/0700/0056$01.00 © 1984 IEEE (IEEE Software, p. 56)

The final result was to be a formal

specification of the processes, data objects, and their relationships, as well as a set of test procedures for testing new module versions against the evolving specification.

The process specification was to consist of

* a process description,
* a function tree,
* an input/output diagram for each function,
* a decision table for each group of conditional functions,
* a function tree for each process,
* a description for each function, and
* the assertions on the pre- and postconditions of each elementary function.
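The pre- and postcondition assertions on an elementary function can be pictured with a small sketch, in Python rather than the project's own assertion language; the function, its domains, and its conditions are invented for illustration:

```python
# Hypothetical sketch of pre/postcondition assertions on an elementary
# function. The domains (1..999 for quantity, non-negative price) are
# invented; they stand in for the domains a specifier would record.
def order_total(qty, price):
    # Precondition assertions: the input domains must hold on entry.
    assert 1 <= qty <= 999, "qty outside its specified domain"
    assert price >= 0.0, "price must be non-negative"
    total = qty * price
    # Postcondition assertion: the output must stand in the specified
    # relation to the inputs.
    assert total >= price, "total may not be less than one unit price"
    return total
```

A violated precondition stops the test case immediately, which is the behavior the assertion method relies on.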

The object specification was to consist of

* an object description,
* a data tree,
* a data usage diagram for each data item,
* a description of each data item, and
* the assertions on the input/output domains of each elementary data item.

In addition, the relationships between the processes, between the objects, and between the objects and processes were to be specified in accordance with the entity/relationship model.8 Finally, there was to be a data test procedure for each of the 78 data interfaces, that is, data capsules, and a module test procedure for each of the 232 PL/I procedures, that is, modules.

The project was planned to last a year and to involve two testers and two specifiers besides the personnel developing the automated tools. As it turned out, this proved to be a gross underestimation, mainly because of the effort required to refine the tools. However, without tools the task of renovation was deemed impossible.

The rationale for employing automated tools was one of sheer economic necessity. A human being can only understand and document a few hundred lines of code per week. With the aid of a tool, this can be increased to several thousand. The ratio is at least one to ten. The same applies to testing. Without the help of a tool, it is not possible to trace the test paths and to document the test coverage. Besides, the tool is what generates the test data and verifies the test results. So it was clear from the beginning that the

Figure 1. Renovation project strategy. [The figure shows the source library of 232 PL/I procedures (24,000 PL/I lines of code) feeding the tools Softdoc, Sofspec, and Softest; their outputs form the production environment for maintenance and enhancement, where environment = specification + test + programs.]

July 1984 57


postdocumentation and the testing of the programs had to be done automatically.

The least help from tools was expected in respecifying what the programs should be doing. But even here a tool was found to be indispensable. The business analysts, who generally are not trained to make precise specifications, were encouraged by the specification tool to be precise and formal. If for no other reason, a specification tool can be justified on this basis. The resulting specification would have been valueless if the discipline enforced by the tool had been missing. In fact, an argument might be made that the lack of discipline on the part of the original developers was a major reason for having to renovate the Bertelsmann system.

Static analysis

The first stage of the project involved the static analysis of the selected programs. This stage proceeded bottom up in three steps: (1) module analysis, (2) program analysis, and (3) system analysis.

The tool for this purpose was Softdoc, a static analyzer for Cobol, PL/I, and assembler programs.9 Softdoc processed the source code in order to produce four tables for each module: a data description table, a data flow table, a control flow table, and an interface table.

The data description table described the attributes and usage of each data item referenced by the module in question. The data flow table depicted in which statements each data item was used and how it was used, either as a predicate in a condition, an argument, or a result. The control flow table encompassed each PL/I control command, such as PROC, DO-WHILE, SELECT, and IF, including their conditions, as well as all comments contained in the procedure. The interface table consisted of all entries, calls, and I/O operations, with their respective parameters (Figure 2).
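As a rough illustration of what the data flow table records, here is a toy analyzer in Python; it is a hypothetical stand-in (the real Softdoc parsed PL/I, Cobol, and assembler), classifying each variable reference in a statement list as a predicate, an argument, or a result:

```python
import re

# Toy sketch of a Softdoc-style data flow table. Each statement is either
# a condition ("IF ...") or an assignment ("X = ..."); every variable
# reference is recorded as (statement number, usage kind).
def data_flow_table(statements):
    table = {}  # variable -> list of (statement_no, usage)
    for no, stmt in enumerate(statements, start=1):
        if stmt.startswith("IF "):
            # Variables tested in a condition are predicates.
            for var in re.findall(r"[A-Z]\w*", stmt[3:]):
                table.setdefault(var, []).append((no, "predicate"))
        elif "=" in stmt:
            # Left-hand side is a result; right-hand side variables
            # are arguments.
            result, expr = stmt.split("=", 1)
            table.setdefault(result.strip(), []).append((no, "result"))
            for var in re.findall(r"[A-Z]\w*", expr):
                table.setdefault(var, []).append((no, "argument"))
    return table

flows = data_flow_table(["TOTAL = PRICE * QTY", "IF TOTAL > LIMIT"])
```

For the two statements above, TOTAL is recorded once as a result and once as a predicate, which is exactly the kind of usage profile the table described.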

With the aid of these tables, it was possible to automatically generate a series of documents from the programs themselves, at three different levels of aggregation. At the module level, the following documents were generated:

* a tree of internal procedures and begin blocks,
* an input/output diagram for each internal procedure and begin block,
* a data description list,
* a pseudocode listing,
* a control graph,
* a path analysis report, and
* an intramodular data flow table.
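The control graph and path analysis report can be sketched together: given a successor map over a module's control nodes (node names invented here), the entry-to-exit paths are enumerated, ignoring repeated loop traversals:

```python
# Sketch of path analysis over a control graph (assumed representation:
# node -> list of successor nodes). Paths are enumerated from entry to
# exit; a node already on the current path is not revisited, so loops
# are traversed at most once.
def enumerate_paths(graph, entry, exit_node, path=None):
    path = (path or []) + [entry]
    if entry == exit_node:
        return [path]
    paths = []
    for succ in graph.get(entry, []):
        if succ not in path:
            paths.extend(enumerate_paths(graph, succ, exit_node, path))
    return paths

# A PROC with a single IF: the condition node branches to a THEN and an
# ELSE part, and both rejoin at the end of the module.
graph = {"PROC": ["IF1"], "IF1": ["THEN1", "ELSE1"],
         "THEN1": ["END"], "ELSE1": ["END"]}
paths = enumerate_paths(graph, "PROC", "END")
```

The two resulting paths, one per branch of the IF, are what a tester would have to cover.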

At the program level, the modules of each of the eight programs were aggregated to produce:

* a module tree,
* a calling hierarchy list,
* an input/output diagram for each module,
* an interface description list, and
* an intermodular data flow table.

At the system level, the program tables were aggregated to produce a document of the interprogram relationships. In all, five documents were produced at this level:

* a table of module references,
* an input/output diagram for each program,
* a file reference table,
* an interprogram data flow table, and
* a system data dictionary.

The whole analysis took no more than one week to complete. Altogether, some 1624 module documents, 40 program documents, and five system documents were produced, giving approximately a one to one and a half ratio of code to program documentation. Thus, it was possible to analyze 5000 lines of code per day. Without a tool, it would have taken at least three man-years to document the programs at the same level of detail.

This showed that for the postdocumentation of programs, automated tools are indispensable. There can be no adequate and reliable description of the programs without them. However, it should be noted that the documentation of the programs is no substitute for a requirements specification. This work remained to be done.

Requirement specification

Following the production of the program documentation, it was possible to commence with the respecification of the application. In contrast to the program analysis, the specification proceeded top down. In all, 10 steps were involved (Figure 3):

(1) describing the data objects,
(2) describing the processes,
(3) depicting the object/process relationships,
(4) depicting the user interfaces,
(5) depicting the data trees,
(6) depicting the function trees,
(7) depicting the decision logic,
(8) depicting the data flow,
(9) describing the functions, and
(10) describing the data.

The tool used was Sofspec, a system for the interactive submission of an application using 12 different CRT forms and their storage in a specification database based on the entity/relationship model.10 Sofspec allows the user to submit the specification in interactive mode under the IBM TSO/SPF monitor in a tabular format. At any time during a terminal session it can be asked to produce a certain specification document, or it can be requested to verify a certain aspect of the specification.

The specification work began by examining the files, databases, and data communication interfaces documented by the static analysis. Data objects could be derived from the file reference table. Each object was defined in terms of its meaning, occurrence, periodicity, space requirements, and description. Processes could be taken from the interprogram data flow table. Each process was also defined in terms of its meaning, occurrence, periodicity, time requirements, and description.
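A minimal sketch of such a specification database, in Python (the class and its interface are invented; the real Sofspec worked through CRT forms under TSO/SPF): entities are recorded with their kind, and a relationship is accepted only when both of its ends are already defined, which is the sort of consistency check the tool could be asked to perform:

```python
# Hypothetical sketch of an entity/relationship specification store in
# the spirit of Sofspec. Entities carry a kind ("object" or "process");
# relationships are typed triples between defined entities.
class SpecDB:
    def __init__(self):
        self.entities = {}   # name -> kind
        self.relations = []  # (source, relationship type, target)

    def add_entity(self, name, kind):
        self.entities[name] = kind

    def relate(self, source, rel_type, target):
        # Verification step: both ends of a relationship must exist.
        if source not in self.entities or target not in self.entities:
            raise ValueError("relationship references an undefined entity")
        self.relations.append((source, rel_type, target))

db = SpecDB()
db.add_entity("ORDER", "object")
db.add_entity("BILLING", "process")
db.relate("BILLING", "input", "ORDER")
```

Rejecting a relationship whose end points are undefined is the database-level analogue of the discipline the tool imposed on the analysts.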



Following the description of the objects and processes, the relationships were defined. The process/object relationships (input, output, I/O) could be taken from the input/output diagrams produced by the static analysis. The Adabas search keys for each access had, however, to be taken from the database design. The object/object relationships (1:1, 1:N, M:1, M:N) were obtained by inverting the process/object relationships. The process/process relationships (predecessor, successor, parallel task, invoker, invokee), that is, the process control flow, could be derived from the system data flow table.
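The inversion step can be sketched as follows (the data shapes and the process and object names are assumptions for illustration): two objects become related whenever some process consumes one and produces the other:

```python
# Hedged sketch of deriving object/object relationships by inverting the
# process/object relationships. process_object maps each process to its
# input and output objects; the result relates each input object to each
# output object through the mediating process.
def invert(process_object):
    pairs = set()
    for proc, io in process_object.items():
        for a in io.get("input", []):
            for b in io.get("output", []):
                pairs.add((a, b, proc))
    return pairs

rels = invert({"BILLING": {"input": ["ORDER"], "output": ["INVOICE"]}})
```

Here a single billing process relates the ORDER object to the INVOICE object; cardinalities (1:1, 1:N, and so on) would be attached in a further step.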

Thus, it proved possible to recreate an abstract conceptual model of the system represented by objects, processes, and their relationships, but only partially from the information obtained from the existing programs. A great deal of information had to be collected from the original designers. This was due to the large semantic gap between the view of the system as a whole, in terms of objects and processes, and the view of the individual programs. The specification process proceeded top down. The fourth and fifth steps were to document the data structures. These were found to be of two types: user interfaces, that is, forms, and data sets. The user interfaces were described in terms of sample forms with references to the variables contained in them. The data sets were defined as data trees in accordance with the Jackson notation, using sequence, selection, repetition, grouping, and search key identification.11 Sofspec supported both notations. The information for the data structuring was taken from the actual listings and CRT panels, as well as from the data dictionary generated through the static analysis of the programs by Softdoc.

In the following two steps, the function structures and decision logic tables were specified from the module tree diagrams and the pseudocode generated by Softdoc. The function trees in Sofspec were also defined in terms of the Jackson methodology. For every repetitive and selective function it was necessary to define the condition. This was done using either a terminal-supported decision table or a decision tree. Later in the documentation the function trees and decision trees, that is, decision tables, were merged to produce a single specification document, which also included the detailed data flow.

Figure 2. Static analysis with Softdoc. [The figure shows module analysis producing module documents (module tree, HIPO diagram, interface listing, data dictionary, pseudocode, flow graph, test paths); program analysis producing program documents (program tree, HIPO diagram, calling hierarchy, data dictionary, data flow diagram); and system analysis producing system documents (module cross-reference, HIPO diagrams, SADT diagrams, data flow chart, data dictionary).]

Figure 3. Specification with Sofspec. [The figure shows the specification database with entities (objects, processes, data, functions) and relationships (object/object, process/process, process/object, object/data, data/data, process/function, function/function, function/data), together with form templates, decision logic, and usage and precedence relationships.]

The detailed data flow description was the eighth step. Here the module input/output diagrams derived from the code were used to specify the inputs and outputs of each function. The new data flow diagrams, at the functional level, were then generated by the specification tool.

The ninth step was to describe each

function based on the comment block contained in the code. Where the comment blocks were missing, it was necessary to examine the code itself in order to describe the algorithm in natural language. A function was defined to correspond to a PL/I procedure. These procedures ranged from 50 to a maximum of 500 executable PL/I statements.

The tenth and final step in the

respecification phase was actually the first step of the testing phase. Each data element in the system had to be defined in terms of its attributes. These were taken from the Softdoc data dictionary. In addition, the domain of each data item had to be defined. But since this was related to the program test, it was postponed until after the test phase.

The task of respecifying the application programs took 17 man-months to complete. This amounted to one man-month of specification per 1400 lines of code. The ratio of program code to specification documentation was on average three to one; that is, for every three pages of code there was one page of specification documentation.

When the specification documentation was finished, it was possible to discuss it with the user, who up to this point was not sure what the system was doing. As the user became familiar with the formal specification techniques, this discussion proved very fruitful. As a result of the discussion, it became clear that there were several deviations between the assumptions of the user and the actual construction of the programs. Because of that, it became necessary to revise the specification to accommodate the user's view. This took another four man-months.

Program testing

The final stage of the project turned out to be the most difficult and time-consuming. It entailed testing the programs against the newly constructed specification. Twenty-two man-months were required to specify all of the data, and another six man-months were necessary to conduct the tests themselves. Each module had to be tested independently until at least 90 percent of its branches were covered.

The basis of the test was the program documentation produced by the static analyzer. Among the documents produced were a diagram of the module inputs and outputs and a list of the paths through the module, together with the conditional operands, that is, predicates, associated with each path. The person responsible for the test had to analyze each module to determine what inputs would lead to the execution of what paths through the module. They also had to predict which results would occur from each path. A test case table was used to record the inputs and outputs of each path as well as the branches traversed by that path (Figure 4).

Unfortunately, the modules had originally been generated by a decision table generator and then expanded by hand. Thus, they were very badly structured. GOTO instructions were the rule instead of the exception. This made it difficult to trace the paths and predict the results. A module of 300 statements took two man-days to analyze. A module of 600 statements took seven to eight days to analyze. The largest module, with some 760 statements, took 11 days to analyze. We began to see that the effort involved in defining a program-based test increased exponentially with the number of instructions, the number of decision nodes, and the number of arguments, that is, inputs.
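The connection between decision nodes and test effort is essentially what McCabe's cyclomatic complexity measures (reference 13): V(G) = E - N + 2 over the control-flow graph, which grows with every decision node. A small sketch:

```python
# McCabe's cyclomatic complexity V(G) = E - N + 2 computed over a
# control-flow graph given as node -> list of successor nodes. V(G) is
# also the number of linearly independent paths through the module.
def cyclomatic_complexity(graph):
    nodes = set(graph)
    edges = 0
    for succs in graph.values():
        nodes.update(succs)
        edges += len(succs)
    return edges - len(nodes) + 2

# One IF with two branches that rejoin: 4 edges, 4 nodes, so V(G) = 2,
# i.e., two independent paths to cover.
graph = {"start": ["then", "else"], "then": ["end"], "else": ["end"]}
v = cyclomatic_complexity(graph)
```

Straight-line code stays at V(G) = 1 no matter how long it is, which matches the observation that linear code was far cheaper to analyze than nested conditions.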

Linear code with sequences of IF and WHILE statements was easier to handle than highly nested code with conditions within conditions and loops within loops. As might be expected, the worst case was that of overlapping GOTOs. Modules with few arguments, especially those with few conditional arguments or predicates, were easier to test than those with many such arguments. All of this further demonstrated that there was a definite relationship between test effort and program complexity, in terms of program length,12 decision nodes,13 and data usage.14

Having designed a test case table for a module, the inputs and outputs were then transformed into an assertion procedure. The assertion language consisted of an IF statement along with four types of assertions: set assertions, range assertions, function assertions, and relational assertions.15

Alphanumeric and coded values were usually defined by means of a set assertion. Numeric values were defined as ranges. Address and length information was defined by function. Relationships between input and output values were defined by the relational assertion. The writing of assertions proved to be a simple and easily learned task. The problem was in knowing what assertions to write.

The assertion procedures written were of two types, driver procedures and stub procedures. Driver procedures initialized the preconditions of a module under test and initiated the test cases. After each test case they verified the postconditions of the module. Stub procedures simulated either submodules or files. They verified the outputs and generated inputs to the module under test wherever a call or



I/O operation was performed. In all, over 300 assertion procedures were written, with a total of some 7000 assertions (Figure 5).

The actual execution of the test lasted less than three months. With two testers working in parallel it was possible to test 20 modules per week; that is, each tester tested approximately two modules a day. This was due to the use of the tool Softest, which compiled the test procedures into test tables, which were then interpreted to test the modules.16 The testbed was automatically generated. In the testbed all procedure calls and I/O operations were simulated. For coupling the module under test with the test procedures, the module symbol table was used. In this way, the input variables were assigned and the output variables verified. Concurrently, the paths traversed were traced, the branch coverage registered, and the data flows followed.

After each test a postprocessor produced a series of reports: a test path report, a data usage report, and a branch coverage report. The branch coverage report indicated the coverage ratio and the branches not tested. Each test was continued until at least 90-percent branch coverage was attained, as prescribed in the contract (Table 1).
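The branch coverage bookkeeping can be sketched as follows (a hypothetical interface, not Softest's actual one): each traversed branch is registered, and the report is the coverage ratio plus the branches never hit, mirroring the figures of Table 1 for module V665A:

```python
# Sketch of branch coverage bookkeeping. Branches are numbered 1..total;
# register() records a traversal, ratio() gives the coverage, and
# untested() lists the branches no test case has reached.
class CoverageTracker:
    def __init__(self, total_branches):
        self.total = total_branches
        self.hit = set()

    def register(self, branch_no):
        self.hit.add(branch_no)

    def ratio(self):
        return len(self.hit) / self.total

    def untested(self):
        return sorted(set(range(1, self.total + 1)) - self.hit)

# 16 of 17 branches executed (which branch was missed is illustrative):
# a 94-percent coverage ratio, as in Table 1.
cov = CoverageTracker(total_branches=17)
for branch in range(1, 17):
    cov.register(branch)
```

A test run would be repeated, with assertions adjusted, until ratio() reached the contractual 0.90.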

In most cases where 90-percent coverage was not reached, it was due to erroneous or incomplete assertion procedures. Modules with more than 500 statements and highly nested logic proved to be the most difficult to cover. The test had to be repeated as many as 10 times before adequate coverage was reached. Following each test, the assertions had to be adjusted or enhanced to invoke different paths. An average of five to eight test cases was needed to test the smaller modules, but as many as 26 test cases were necessary to test larger modules.

Minor errors were discovered in 55 of the 232 modules. These had to be corrected by the responsible programmers before the test could continue. Most of these errors were related and were the result of design decisions. The low level of errors found was due to the fact that the programs had already been in operation for a year before they were submitted to this test, so most of the critical errors had already been removed. In addition, since the modules had, to a great extent, been generated from decision tables, all the errors were of a common nature and the patterns could be easily recognized. Over 75 percent of the errors were discovered while analyzing the code; the test only documented their existence. The other 25 percent were exposed at execution time by either not being able to reach

INPUTS
DATA      1       2       3
FIELD1    'XXX'   'YYY'   'ZZZ'
FIELD2    100     200     300
FIELD3    -5      0       5
FIELD4    X'FF'   X'F1'   X'F0'
FIELD5    11      10      9
FIELD6    B'0'    B'0'    B'1'

OUTPUTS
DATA      1       2       3
FIELD11   'XXX'   'YYY'   'ZZZ'
FIELD12   95      200     305
FIELD13   16      10      4
FIELD14   X'FF'   X'F1'   X'F0'
FIELD15   B'1'    B'1'    B'0'

PATHS (module V665A): the branch sequence traversed by each of the three test cases [not legible in the scan].

Figure 4. Test case tables.
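A test case table of this kind is just data, and the comparison of actual against expected outputs can be mechanized. The sketch below uses Python; the module body is a stand-in reconstructed from the arithmetic implied by the Figure 5 assertions, and it runs the three cases of Figure 4:

```python
# The test case table as plain data: per case, the input fields and the
# expected output fields (values taken from Figure 4, as far as legible).
test_cases = [
    {"inputs":   {"FIELD2": 100, "FIELD3": -5, "FIELD5": 11},
     "expected": {"FIELD12": 95, "FIELD13": 16}},
    {"inputs":   {"FIELD2": 200, "FIELD3": 0, "FIELD5": 10},
     "expected": {"FIELD12": 200, "FIELD13": 10}},
    {"inputs":   {"FIELD2": 300, "FIELD3": 5, "FIELD5": 9},
     "expected": {"FIELD12": 305, "FIELD13": 4}},
]

def module_under_test(inputs):
    # Stand-in for module V665A's arithmetic, per Figure 5:
    # FIELD12 = FIELD2 + FIELD3 and FIELD13 = FIELD5 - FIELD3.
    return {"FIELD12": inputs["FIELD2"] + inputs["FIELD3"],
            "FIELD13": inputs["FIELD5"] - inputs["FIELD3"]}

results = [module_under_test(tc["inputs"]) == tc["expected"]
           for tc in test_cases]
```

A driver procedure did essentially this loop: feed in the inputs of each case, then compare the module's outputs against the expected column.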

MODULE: V665A, STAT
ASSERT PRE FIELD1 = SET ('XXX', 'YYY', 'ZZZ');
ASSERT PRE FIELD2 = RANGE (100 : 300);
ASSERT PRE FIELD3 = RANGE (-5 : 5);
ASSERT PRE FIELD4 = SET (X'FF', X'F1', X'F0');
ASSERT PRE FIELD5 = RANGE (9 : 11);
ASSERT PRE FIELD6 = SET (B'0', B'1');

ASSERT POST FIELD11 = FIELD1;
ASSERT POST FIELD12 = FIELD2 + FIELD3;
ASSERT POST FIELD13 = FIELD5 - FIELD3;
ASSERT POST FIELD14 = FIELD4;
ASSERT POST FIELD15 = NOT (FIELD6);
END V665A;

Figure 5. Assertion procedure.
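Three of the four assertion types can be rendered as ordinary predicates (a Python sketch; function assertions on address and length information are omitted, since they depend on the runtime environment). The checks mirror the V665A procedure for one test case:

```python
# Sketches of the assertion types of the test specification language.
def set_assert(value, allowed):           # set assertion
    return value in allowed

def range_assert(value, low, high):       # range assertion
    return low <= value <= high

def relational_assert(actual, expected):  # relational assertion
    return actual == expected

# Pre- and postconditions of module V665A, checked for test case 2
# (FIELD values as in Figure 4; bit values modeled as 0/1 integers).
pre = {"FIELD2": 200, "FIELD3": 0, "FIELD5": 10, "FIELD6": 0}
post = {"FIELD12": 200, "FIELD13": 10, "FIELD15": 1}
ok = (range_assert(pre["FIELD2"], 100, 300)
      and set_assert(pre["FIELD6"], {0, 1})
      and relational_assert(post["FIELD12"], pre["FIELD2"] + pre["FIELD3"])
      and relational_assert(post["FIELD13"], pre["FIELD5"] - pre["FIELD3"])
      and relational_assert(post["FIELD15"], int(not pre["FIELD6"])))
```

The relational assertions are the interesting ones: they tie each output to the inputs, so a test case's expected column never has to be computed by hand twice.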



certain branches or by the control flow following paths other than those predicted. Only three errors were found by violating output assertions. This was probably because the programs had already been exposed to extensive system testing. The errors found were almost all in exceptional functions that had not yet been used in production. This only underlines the fact that programs do not have to be error free to be useful. It all depends on how they are used.

The total effort of 22 months to test 24,000 lines of PL/I code to find 55 minor errors and three major ones could certainly not be economically justified. The main value of the testing project was in establishing a testbed for future testing. It did, however, demonstrate that systematic module testing on a large scale was feasible, and that with new, untested modules it could even be economical, but only if adequate tools are available. The test effort amounted to an average of one man-day per 54 instructions. Considering that this was the first experience with a new test specification language, it should be possible in the future to increase this productivity.

Table 1. Test coverage measurement.
Module V665A (total branches, 17; executed branches, 16; coverage ratio, 94 percent)

BRANCH:     1   2   3   4   5   6   7   8   9  10  11  12  13  14  15  16  17
STATEMENT:  1  50  54  58  62  64  68  72  76  80  84  88  92  96 100 104 108

(The per-branch last-test and total-test counts are not legible in the scan; one branch was not executed.)

Although we were successful, the move from conventional development and maintenance practices to systematic software engineering was expensive. The effort cost two thirds of the original cost and, without the support of highly sophisticated tools, was not economically justified.

In the long run, to reduce costs and make systematic maintenance more attractive, companies will have to invest in developing adequate tools and ways to automatically bridge the gap between programs and their specification. Our work at Bertelsmann convinced us that automation is the only true solution to the maintenance problem.

References

1. "HIPO: A Design Aid and Documentation Technique," IBM Corp., Manual No. GC-20-1851, White Plains, N.Y., 1974.
2. "IBM Decision Table Translator, User Guide," IBM Form No. 79974, Stuttgart, Germany, 1973.
3. "ADABAS Introduction," Software AG of North America, Reston, Va., 1976.
4. T. DeMarco, Structured Analysis and System Specification, Yourdon Press, New York, 1978.
5. C. V. Ramamoorthy, S. F. Ho, and W. T. Chen, "On the Automated Generation of Program Test Data," IEEE Trans. Software Eng., Vol. SE-2, No. 4, 1976, pp. 293-300.
6. J. C. Huang, "An Approach to Program Testing," ACM Computing Surveys, Sept. 1975.
7. H. M. Sneed and A. Merey, "Automated Software Quality Assurance," Proc. Compsac 82, Computer Society Press, Los Alamitos, Calif., 1982, pp. 239-247.
8. P. Chen, "The Entity-Relationship Model: A Basis for the Enterprise View of Data," AFIPS Conf. Proc., Vol. 46, 1977 NCC, Dallas, Tex., 1977, pp. 77-84.
9. G. Jandrasics, "SOFTDOC: A System for Automated Software Analysis and Documentation," Proc. ACM Workshop Software Quality Assurance, Gaithersburg, Md., Apr. 1981.
10. Nyary and H. M. Sneed, "SOFSPEC: A Pragmatic Approach to Automated Specification Verification," Proc. Entity/Relationship Conf., Anaheim, Calif., Oct. 1983.
11. M. Jackson, Principles of Program Design, Academic Press, London, 1975.
12. M. Halstead, Elements of Software Science, Elsevier Computer Science Library, New York, 1977.
13. T. McCabe, "A Complexity Measure," IEEE Trans. Software Eng., Vol. SE-2, No. 4, Dec. 1976, pp. 308-320.
14. S. Henry and D. Kafura, "Software Structure Metrics Based on Information Flow," IEEE Trans. Software Eng., Vol. SE-7, No. 5, Sept. 1981, pp. 510-518.
15. M. Majoros and H. M. Sneed, "Testing Programs Against a Formal Specification," Proc. Compsac 83, Computer Society Press, Los Alamitos, Calif., 1983, pp. 512-520.
16. M. Majoros, "SOFTEST: A System for the Automated Verification of PL/I and Cobol Programs," J. Systems & Software, New York, Dec. 1982.

Harry M. Sneed is the technical manager at Software Engineering Service, a Munich, West Germany, software house. Before joining SES in 1978, he was a systems programmer for Siemens, and before that he was with the Volkswagen Foundation. From 1967 to 1970 he worked as a programmer/analyst for the US Navy Department.

Sneed received his BA and MS degrees from the University of Maryland in 1967 and 1969. He is a member of the ACM and the IEEE.

His address is Software Engineering Service GmbH, Pappelstrasse 6, 8014 Neubiberg, West Germany.
