
0360-1315/82/030311-10$03.00/0 Pergamon Press Ltd

EDUSERVICE-A COMPUTER ASSISTED EVALUATION SYSTEM

SOME DESIGN CONSIDERATIONS

E. J. W. M. VAN HEES

Educational Research Centre, University of Tilburg, The Netherlands

Abstract: Computer Managed Learning (CML) is often regarded as only an interim approach to the adoption of Computer Assisted Learning (CAL). In this paper CML is discussed both as part of an innovative strategy and as a means to support the classroom teacher in its own right. It is argued that, from a centralised point of view, introduction of the computer in support of education can best be started from the bottom, i.e. in rather simple administrative and managerial tasks. One such introductory development is described in detail.

COMPUTER USES IN SUPPORT OF TEACHING AND LEARNING

At CAL'79* De Witte[1] gave an overview of CMI-II, the second version of a Dutch large scale CML-system. Van Hees[2,3] has given a detailed account of the original design characteristics of the first version. The system was set into full operation in 1974, and is now in use in 8 of the 13 Dutch universities, 1 polytechnic and 1 institute for secondary education. This system, which resembles the British CAMOL-system (e.g. Rymell[4]) but is more interactively oriented, represents a particular view of computer support of education, i.e. as a manager of individualised courses like IPI (Individually Prescribed Instruction) or PSI (Personalised System of Instruction). Recently Van Hees[5] has argued that the computer can give support to the classroom teacher in many more ways than the particular one implemented in this system. By abstracting from the well-known realisations like CML-systems for IPI, itembanking-systems, on-line testing and CAL, twelve elementary functions can be distinguished, a number of which are included in every separate system. These twelve functions are listed in Fig. 1. At CAL'81† Leiblum[6] presented an evaluation of different CML-systems on the basis of this list. If the trivial function of record keeping is added, the elements and the relations between them can be ordered into the functional scheme of computer support of education presented in Fig. 2. In this figure a last core function "presentation" has been included in order to cover CAL applications as well‡.

The logical entrance to the scheme is in the upper left corner. The other corners are logical exits where unique materials are generated: items, feedback and advice. The two intermediate blocks "test-generating" and "scheduling" are possible exits with reproducible output. The other blocks only function within a cycle, and possibly for reference and administration.

The operational systems at the present time can all be characterised by a particular selection of these functions. For instance:

CMI-II and CAMOL consist of the assignment of items for individual performance tests derived from an itembank of some sort, test-marking, assignment of study tasks, and several forms of reporting and evaluation based on the course structure represented by a data file, which in fact is a concise coded form of the course objectives. Tests are administered off-line and the study tasks are not comprised in the CML-system. These characteristics are pictured in Fig. 3, derived from the overall scheme.

TICCIT, the well-known CAL-system designed by Merrill[7,8], is basically a delivery system for individualised audio-visual instruction and testing. The most salient feature is that the courses are completely learner-controlled, although it is possible to conform control to the advice of the program. This means that presentations do not follow assignments, but are called by the student. To enable the students to keep track of the course, course-schedules called "maps" can be displayed at any time. The student's progress is indicated on the map. These characteristics are pictured in Fig. 4.
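Purely as an illustration of this way of characterising systems, the sketch below (in modern Python, not part of any of the systems described) represents each operational system simply as the subset of elementary functions from Fig. 1 that it implements. The particular selections shown are an illustrative reading of Figs 3 and 4, not an authoritative inventory of either system.

```python
# Hypothetical sketch: operational systems viewed as selections of elementary functions.
# Function names follow Fig. 1; the selections are illustrative readings of Figs 3 and 4.
SYSTEMS = {
    "CMI-II/CAMOL": {"item banking", "test generation", "test marking",
                     "reporting", "evaluation", "record keeping", "scheduling"},
    "TICCIT":       {"material banking", "presentation", "test marking",
                     "reporting", "record keeping", "counseling"},
}

def characterise(name: str) -> str:
    """Describe a system by the elementary functions it selects."""
    return name + ": " + ", ".join(sorted(SYSTEMS[name]))

if __name__ == "__main__":
    for system in SYSTEMS:
        print(characterise(system))
```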

* CAL'79 Symposium on Computer Assisted Learning, University of Exeter, U.K., April 1979.
† CAL'81 Symposium on Computer Assisted Learning, University of Leeds, U.K., April 1981.
‡ The presentation function is a rather complex one. It varies from display of information (learning material) in simple drills to the rather diffuse "problem management" in simulation and problem solving.



1. Objective Banking: Collection and maintenance of a structured set of objectives related to a lesson, task, course, discipline or curriculum.
2. Learning Resources Banking and Library Information Retrieval: Collection and maintenance of structured lists of educational facilities, resources, packages and/or printed materials.
3. Learning Material Banking: Collection and maintenance of those instructional materials that can be stored in computer hardware/software or peripheral units.
4. Question Banking: (a) Collection of a structured set of test-items or evaluation questions; (b) On-line editing of questions; (c) Maintenance of test-item statistics; (d) Maintenance of bank usage statistics.
5. Item Generation: (a) Generation of test-items via a framework or macro facility, e.g. random number generation; (b) Generation of items through randomly selecting/substituting existing parts of the item (stem or answer choices); (c) Generation of items based on a model/grammar representing the structures of the learning materials.
6. Test Generation: (a) Allocating items for individual study tests; (b) Generating parallel forms of a specific test; (c) Generating standardised final examinations from an item bank.
7. Test-marking: (a) Scoring of individual tests; (b) Scoring of final examinations.
8. Reporting: (a) Reporting of individual test results and lists of test results for groups; (b) Reporting of individual and group study progress.
9. Evaluation: (a) Providing test analysis; (b) Providing educational product analysis (formative evaluation); (c) Providing educational process analysis, e.g. information about study time, intervals between attempts, number of attempts, questionnaire processing, etc.
10. Counseling: Providing individual advice relating to study or career goals.
11. Scheduling: Creating and maintaining schedules of educational facilities.

Fig. 1. Elements of Computer Managed Instruction.

INNOVATIVE STRATEGIES

Lippey[9] has developed a taxonomy of computer uses in education that reflects an increasing difficulty of development and acceptance, which is inversely related to the distance from the instructional process. In ascending order of difficulty he distinguishes three major groups of computer applications: research tools, student programs and operational support. Like Lippey we are mainly interested in the third category. This group contains all computer uses which support the operations of the educational institute, including the teaching-learning process. This is broken down into five sub-categories.

The first sub-category includes all applications which are also found in business and other organisations, e.g. pay-roll and inventory control.



Fig. 2. Functional scheme of computer support of education.

Education administration

Administrative applications which are unique to the educational environment, but do not support the teaching process. Examples are student-registration and class-scheduling.

Instruction-related logistics

Clerical support of the educational process, which does not contain educational decisions, i.e. diagnosis or prescriptions. Examples are test-construction, test-marking and assistance in selecting learning materials. Sometimes these computer uses are regarded as CML-applications.

Macro-guidance

Here the computer does make pedagogical decisions but does not actually teach. The decision periods are relatively long, hours to years. Turn-around time may often be overnight. Most of the applications which are regarded as CML and counseling fall under this heading.

Fig. 3. Functional scheme of CMI-II and CAMOL.



Fig. 4. Functional scheme of TICCIT.

Micro-guidance

This includes all applications which are usually called CAI or CAL. Guidance is very short range, i.e. seconds. The student interacts directly with the program by means of a computer terminal.

In Fig. 2 CAL (CAI) and CML (CMI) are presented according to this description, as micro-guidance and macro-guidance.

Lippey argues that computer applications at the top of the list will grow more easily than those at the bottom. Therefore computer applications should be carried through in that order. Some of his arguments are: the further we proceed in the list, the closer we come to the educational process and the student, and theoretical knowledge and experience in other fields of application give us less aid. Furthermore there is an increase in the magnitude of the changes in every-day practice which result from the introduction of the computer, together with an increase in the necessary means, while on the other hand there are fewer visible profits due to elimination of traditional cost factors. Therefore gradual introduction is advised, starting from the top of the list and remaining as close as possible to existing ways of operation and the changes that emerge there, by which process the applications established in preceding stages act as natural bridges to newer ones.

A rigid interpretation of this philosophy leads to the standpoint that firstly CML has to be put into practice, and only after adoption of that can CAL be put forward with success. As an innovative strategy this is not new: for instance Systems Development Corporation developed a CML-system in the 1960s as "an interim approach towards wide-spread adoption of Computer Based Instruction"[10]. However, such a point of view will meet with great resistance, and justly so. Many CAL packages are developed by individual teachers or user groups where no preceding applications were existent, and it is well known from CAL-literature that exactly this individual approach bears most promise for success. Obviously the reason for this is that by emerging from such a base, acceptance by teachers and students is more likely because of their greater involvement in the innovation, while the integration of computer uses in the educational program is also better ensured. On the other hand, it is often found to be difficult to let these teacher-made packages play more than a minor adjunct part. The main reason for that might be that accompanying CML-provisions are absent, so that keeping track of student activities and guiding the students on the basis of their progress is difficult to accomplish. Secondly, CAL will probably not be widely adopted if most of the teachers do not already use the computer to support their educational tasks. Finally, in these isolated individual applications the necessary investments in hardware and software for each extension are very large, which endangers continuity.

From our experience with CAL/CML-developments over one decade we have drawn the conclusion that the two viewpoints need differentiation. In a centralised form of developmental support, where the necessary means are centrally provided, gradual introduction of computer uses, starting from administrative support and closely following the wishes and needs of the teachers, is to be recommended. In the case of more individual development of CAL-modules to solve certain educational problems by one or more teachers, possibly with the assistance of a central facility, the opposite approach is indicated. That means that, departing from the practical problems in implementing the self-contained CAL-applications, attempts are made to develop facilities that are of more general use and that can support the teacher in tasks that are more and more distant from actual teaching.

EDUSERVICE: A CASE

The CML-systems mentioned previously have met with some success. However, the growth in usage has lagged behind our expectations. At the University of Tilburg the interest of faculty in this application has even been reduced over recent years. The CML-system has been developed by the educational R&D centres of the Universities of Tilburg and Eindhoven, in close cooperation with a group of teachers who called upon them to support their self-paced courses. However, their successors and the other teachers in their own and other departments cannot easily be involved, and the whole aim of the system does not find much recognition and appreciation throughout the universities. It has become apparent that the introduction of the computer in education, particularly in contact with the students, is too large a step for most teachers, even in the rather remote manner of macro-guidance. This is consistent with Lippey's findings.

After the completion of the first system, the Educational Research Centre of the University of Tilburg decided to focus more on teacher aids, Lippey's category "Instruction-related logistics", in order to bridge the gap between the more accepted business and administrative applications and the uses of the computer in the educational process as in the CML-application. As a start we developed a number of ad hoc programs for processing teacher-made group tests and educational data, e.g. from student questionnaires and practice assignments. The past 5 years saw a growing demand for facilities like the above-mentioned, for product and process evaluation in the classroom ("test-marking", "reporting" and "evaluation").

In consultation with several groups of teachers we have sought, for some time, a set of functions to be performed by a computerised classroom support system. Occasions for these consultations were found in our teacher training courses, in minor evaluation projects that were done at the request of the teachers, in counseling in test development and in assistance in the use of the existing facilities. This resulted in the following list of functions that an operational service system should supply:

Processing of multiple choice tests for groups of students (pre-tests, formative tests and post-tests). A large number of options is needed, like scrambled tests, score-reports that include the names of the students derived from ad hoc files or from student registration files, grade reporting, adding scores or grades from essay tests or other work assignments, sub-reports on scores or items from groups of participants, reports on sub-tests, and finally verbal advice on test development based on item analysis. It must be possible to store the responses and the results on disk for subsequent processing with resident statistical packages (e.g. SPSS) or to punch the data onto cards for processing by another computer if no direct link is available.
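A minimal sketch of this kind of group test processing is given below, in modern Python rather than the Burroughs Extended Algol of the actual system; the answer key, the response records and the single difficulty statistic are illustrative assumptions, not EDUSERVICE's real formats or reports.

```python
# Hypothetical sketch of group test processing: scoring, a score report and a simple item analysis.
KEY = "ABCDACBDAB"                      # answer key for a 10-item multiple choice test (illustrative)
RESPONSES = {                           # student name -> string of marked choices (illustrative)
    "Jansen":   "ABCDACBDAB",
    "De Vries": "ABCDABBDCB",
    "Peeters":  "ACCDACBDAA",
}

def score(answers: str) -> int:
    """Number of answers that agree with the key."""
    return sum(a == k for a, k in zip(answers, KEY))

def item_difficulty(item: int) -> float:
    """Proportion of the group answering the given item correctly."""
    correct = sum(RESPONSES[s][item] == KEY[item] for s in RESPONSES)
    return correct / len(RESPONSES)

if __name__ == "__main__":
    print("Score report")
    for student, answers in sorted(RESPONSES.items()):
        print(f"  {student:10s} {score(answers):2d}/{len(KEY)}")
    print("Item analysis (proportion correct per item)")
    for i in range(len(KEY)):
        print(f"  item {i + 1:2d}: {item_difficulty(i):.2f}")
```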

Input of other kinds of data and storage on disk or output on punched cards. Output options should include variable output formats, sorting of records and automatic printing or punching of the information that is needed for subsequent processing (in SPSS-terms the control cards FILENAME, N OF CASES, SUBFILES and INPUT FORMAT).
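To make that hand-over concrete, the sketch below writes scored records in fixed columns and prints accompanying control lines of the kind named above. It is a Python illustration only; the exact card syntax of the SPSS version in use at the time is not reproduced here, so the emitted lines should be read as placeholders rather than as valid SPSS input.

```python
# Hypothetical sketch: write scored data in fixed columns and emit the accompanying
# control information (file name, number of cases, input format) as plain text lines.
records = [("0001", 17), ("0002", 21), ("0003", 14)]   # (student number, test score), illustrative

with open("testdata.dat", "w") as data_file:
    for student, score in records:
        data_file.write(f"{student}{score:3d}\n")        # columns 1-4: id, columns 5-7: score

control_lines = [
    "FILENAME       TESTDATA",
    f"N OF CASES     {len(records)}",
    "INPUT FORMAT   FIXED (A4, F3.0)",                   # illustrative, not authoritative syntax
]
print("\n".join(control_lines))
```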

Complete processing of student questionnaires with different types of questions (true/false, multiple choice, grouped items and rating items with 3, 5 and 7-point scales). Output options are frequency tables and diagrams, cross-tabulations and some statistics. The output must be paginated and it must be possible to add descriptions to the items, choices and scales, and to add comments and introductory texts. In this way a full report that is ready for reproduction can be produced at the terminal.
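The core of such questionnaire processing can be sketched as follows, with invented data and in modern Python; the report layout, pagination and descriptive texts of the real system are not reproduced, only a frequency table for one rating item and a cross-tabulation against a background variable.

```python
# Hypothetical sketch of questionnaire processing: a frequency table and a cross-tabulation.
from collections import Counter

# Each row: (year of study, rating of the course on a 5-point scale) -- illustrative data.
answers = [(1, 4), (1, 5), (2, 3), (2, 4), (3, 2), (3, 4), (1, 4)]

# Frequency table for the rating item.
ratings = Counter(rating for _, rating in answers)
print("Rating  n")
for value in range(1, 6):
    print(f"{value:6d}  {ratings.get(value, 0)}")

# Cross-tabulation of year of study against rating.
crosstab = Counter(answers)
years = sorted({year for year, _ in answers})
print("\nYear \\ rating:", *range(1, 6))
for year in years:
    row = [crosstab.get((year, rating), 0) for rating in range(1, 6)]
    print(f"{year:13d} ", *row)
```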

Banking of all kinds of choice-type test-items. This comprises storage of the item-text, comments, item-statistics and bank-statistics. The bank should contain an on-line editor for correction and updating. Automatic updating of item-statistics with data from test-processing must be possible on request.
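A minimal sketch of such a banked item, with its statistics updated automatically from test processing, might look as follows; the field names and the single proportion-correct statistic are assumptions made for illustration, not the actual record layout of the bank.

```python
# Hypothetical sketch of an item-bank record whose statistics are fed by test processing.
from dataclasses import dataclass

@dataclass
class BankedItem:
    text: str
    choices: list
    key: str
    comment: str = ""
    times_used: int = 0            # bank-usage statistic
    times_answered: int = 0        # item statistics accumulated from processed tests
    times_correct: int = 0

    def update_from_test(self, responses: list) -> None:
        """Fold the responses to this item from one processed test into the item statistics."""
        self.times_used += 1
        self.times_answered += len(responses)
        self.times_correct += sum(r == self.key for r in responses)

    @property
    def proportion_correct(self) -> float:
        """Observed proportion correct over all processed tests."""
        return self.times_correct / self.times_answered if self.times_answered else 0.0

item = BankedItem("2 + 2 = ?", ["3", "4", "5"], key="B")
item.update_from_test(["B", "B", "A", "B"])
print(item.proportion_correct)   # 0.75
```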

Banking of essay-type, fill-in and evaluation questions. These questions are not accompanied by item-statistics; all other support functions are as mentioned above. A "standard" pool of evaluation questions has already been constructed.

Generating tests and questionnaires from item-pools that are stored in the computer. It must be possible to print scrambled test-forms. The information that is needed for processing the answers can be collected and stored automatically, in order to be able to start test-processing directly by identifying the test and inputting answers.
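The sketch below illustrates that idea: items are drawn from a pool, a scrambled form is printed, and the key needed to process the answers is kept alongside it. The pool contents and the seeding scheme are invented for the example, and the real system stores its processing information in its own formats.

```python
# Hypothetical sketch: print a scrambled test form from an item pool and keep the matching key.
import random

pool = {
    "Q01": ("What is the capital of the Netherlands?", ["Amsterdam", "Rotterdam", "The Hague"], "A"),
    "Q02": ("2 + 2 = ?", ["3", "4", "5"], "B"),
    "Q03": ("Water boils at ... degrees C", ["90", "100", "110"], "B"),
}

def generate_form(item_ids, seed):
    """Return the scrambled item order and the matching answer key for one test form."""
    rng = random.Random(seed)
    order = list(item_ids)
    rng.shuffle(order)
    key = "".join(pool[item_id][2] for item_id in order)
    return order, key

order, key = generate_form(pool.keys(), seed=1)
for number, item_id in enumerate(order, start=1):
    stem, choices, _ = pool[item_id]
    print(f"{number}. {stem}  ({' / '.join(choices)})")
print("stored key for later processing:", key)
```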

Data collecting and reporting on call, for counseling and for signaling students that are in danger of falling behind.

A number of these functions were already available in several self-contained programs. This could have been expanded with a number of small programs to meet new demands.



Fig. 5. Functional scheme of EDUSERVICE.

An important advantage of this approach is that new wishes can be satisfied easily. However, important disadvantages are that it takes much assistance and supervision, extensive documentation and considerable user knowledge of the structure of data files and of the computer system (i.e. work flow language and file maintenance, or the control language of the time sharing system). No manpower is available for supervision and assistance, and moreover it might be expected that in future no manpower will be available anymore for new developments. However, a more important argument against this approach is our starting point, that the whole system should be aimed at the individual teacher (or subject group). The teacher himself is totally responsible for the data processing, he himself must take the necessary decisions and he alone has access to the student data that are gathered by him and that he makes use of to support his teaching effort. It is possible to aggregate data to some extent, but again only by the individual user who has access to these data for the guidance of his own students. Therefore it was finally decided to integrate the separate functions in one system, with special attention to "ease of use". A functional scheme of this system, which was given the name EDUSERVICE, is presented in Fig. 5.

DESIGN CONSIDERATIONS

The concept "ease of use" is generally not well defined. What we are most concerned with is that the user does not have to burden himself with technicalities and that he is not bothered by the machine. This leads to the following considerations. Most users are not familiar with the ins and outs of computing and will use the system under design so infrequently that it is too much trouble to have to study comprehensive documentation first. The necessary preliminary and post-handling must be minimised and the system may not hinder the students. The system must be very reliable and the data must be highly secure. Turn-around, including manual handling, must be competitive with non-computerised operation, even if those procedures were much less adequate. It must be possible to confine necessary attendance to a minimum. And last but not least, the users will not be able to state their needs for years in advance, so they will go on demanding new facilities that must be provided for, or else they will be frustrated very soon and turn their backs on the system.

These considerations from a human rather than a machine standpoint have resulted in miscellaneous criteria for the design that can be split up into ten specific topics.

Operation of the apparatus, a screen, a printer and an optical mark reader, must be very simple and comparable with an electric typewriter. The total system is started by a simple push on the button titled "HERE IS.." and the typed message RUN EDUSERVICE.



When the system is started all options are available to the user. The user is not confronted with the structure of the program. Each command that makes sense can be given at any moment, and the program does not break off due to user errors. Therefore the manual can be restricted to theoretical topics, some introduction to the possibilities of the system and an index of commands for reference.

The system uses a natural language instead of the usual elaborate and boring prompting sequences. We have opted for a simple grammar of Dutch single and double terms as usual in normal conversation on the subject. Possible single terms are: yes, no, none and stop. The commands, or rather requests to the program, are essentially double terms, consisting of a verb and a noun. For instance "Change the marks please" would be abbreviated to "Change marks". In those cases where one term implies the other it is sufficient to give only that single term, as for instance in "punch (data)". Only Dutch terms common to the subject are used, spelling differences are usually ignored and many equivalent terms will be recognised.
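The flavour of this verb-and-noun command style can be sketched as below. The sketch is in modern Python with English terms, whereas the actual system is written in Burroughs Extended Algol and recognises Dutch terms; the command table and synonym list are invented for the example. It also reflects the point made under "Modularity" below, that the commands and the command logic can live in data tables rather than in program structure.

```python
# Hypothetical sketch of the verb + noun command style with synonym and filler-word handling.
SYNONYMS = {"alter": "change", "modify": "change", "grades": "marks", "scores": "marks"}
SINGLE_TERMS = {"yes", "no", "none", "stop"}
IMPLIED_NOUN = {"punch": "data"}          # one term implies the other, as in "punch (data)"
FILLER = {"the", "please", "a"}

COMMANDS = {("change", "marks"): "editing the marks ...",
            ("punch", "data"):   "punching the data ..."}

def interpret(line: str) -> str:
    words = [SYNONYMS.get(w, w) for w in line.lower().split() if w not in FILLER]
    if len(words) == 1 and words[0] in SINGLE_TERMS:
        return "acknowledged: " + words[0]
    if len(words) == 1 and words[0] in IMPLIED_NOUN:
        words.append(IMPLIED_NOUN[words[0]])
    if len(words) >= 2 and (words[0], words[1]) in COMMANDS:
        return COMMANDS[(words[0], words[1])]
    return "request not recognised; ask for HELP to see a menu"

print(interpret("Change the marks please"))   # editing the marks ...
print(interpret("modify grades"))             # synonym handling: editing the marks ...
print(interpret("punch"))                     # implied noun: punching the data ...
```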

4. No computer literacy required

All file handling and work flow commands are provided by the system.

5. Help on request

A help sequence is available at any point in execution where the system will accept input. The help sequence usually diagnoses why the user does not obtain the information he requested or gives a “menu” if a command is expected.

The necessary attendance is almost totally restricted to built-in supervisor routines. These routines register all use of the system, all created files, names and passwords, and print reports on usage on request. For every user group (usercode) a functionary must be appointed who can handle some simple supervisory tasks by means of the SUPERVISOR-function in the system.

7. Ownership and privacy secured

Files can only be used and destroyed by their owner. Each file can only be accessed via a unique password that is assigned by the user himself on creation of the file. Data that are of interest to the students are coded.
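A minimal sketch of that per-file ownership rule is given below; the class, the hashing of the password and the example data are illustrative assumptions (the original system may simply compare the stored password directly), not a description of the actual implementation.

```python
# Hypothetical sketch: a file is only readable when its owner supplies the password chosen at creation.
import hashlib

class OwnedFile:
    def __init__(self, owner: str, password: str, contents: str):
        self.owner = owner
        self._digest = hashlib.sha256(password.encode()).hexdigest()
        self._contents = contents

    def read(self, user: str, password: str) -> str:
        """Return the contents only for the owner with the correct password."""
        if user != self.owner:
            raise PermissionError("only the owner may use this file")
        if hashlib.sha256(password.encode()).hexdigest() != self._digest:
            raise PermissionError("wrong password")
        return self._contents

results = OwnedFile(owner="teacher_a", password="geheim", contents="coded student data")
print(results.read("teacher_a", "geheim"))
```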

8. Little preliminary and subsequent treatment

Simple pencil mark cards designed specially for test answers are filled in by the students. Each card can contain up to 60 answers. We do not use machine-readable sheets but only cards, because the latter are cheaper, more reliable, stronger and easier to handle, and can be input automatically much faster with moderately priced readers in cooperation with a terminal. We found that students in higher education have little difficulty with filling in well-designed mark cards. In secondary education the results are even better. All print-outs are in standard A4-format in camera-ready high quality printing, with paging, page numbers and relevant headings that are produced automatically.
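The decoding step can be sketched as follows: each of the up to 60 answers on a card is read as a set of marked positions and turned into a response string, with blank or double-marked answers flagged for checking. The five-choice assumption and the flagging convention are illustrative, not the system's actual card format.

```python
# Hypothetical sketch: turn one 60-answer pencil-mark card into a response string.
CHOICES = "ABCDE"

def decode_card(marks):
    """marks[i] is the set of marked positions (0-4) for answer i on one card."""
    answers = []
    for positions in marks:
        if len(positions) == 1:
            answers.append(CHOICES[next(iter(positions))])
        else:
            answers.append("?")          # blank or double-marked answers flagged for checking
    return "".join(answers)

card = [{0}, {1}, {3}, set(), {2, 3}] + [{0} for _ in range(55)]   # 60 answers in total
print(decode_card(card))   # ABD?? followed by 55 times A; the ? marks doubtful answers
```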

9. Short response time and "turn-around"

The computer always responds within a few seconds. As data are gathered while reading the answer-cards, reports can be generated almost instantaneously. When some more processing time is required, for instance for sorting, the user is notified. The "turn-around" time for processing usual tests for 200-300 students, i.e. the time from the start of EDUSERVICE to finishing the final report, is less than 1 h.

10. Modularity

Modularity is of course essential, in order to obtain some of the characteristics already mentioned. However, modularity is carried through to such an extent that it is quite simple to add new functions. The command language is also easily expandable, because the commands themselves and the command logic are contained in data arrays. The system consists of approx. 150 small procedures at this moment and grows by some 30 procedures per year.

The system is implemented on a large general purpose computer (Burroughs B7700) with hundreds of terminals throughout the country. It is programmed in Burroughs Extended Algol. The described system is still partially under development. Test processing, data input and output, and creation and editing of itempools are completed and are used rather extensively (6 pools of 350-1000 items, 100 tests processed in 1980). Test-generating procedures will be developed this year. A new facility that is under development at this moment is readability analysis of longer test items (e.g. cases in business, law, social sciences and history) and of essay-type answers. Grading English essays is already possible. The analysis is developed in cooperation with Pensacola College, U.S.A.[11].

The system has been released for use by all members of PGO, the user association of the CML-systems.

FUTURE DEVELOPMENTS

We are now considering the transfer of several modules to microcomputers, also for use in secondary education. We have purchased a cheap Apple microcomputer with a diskette drive to explore the possibilities. At the University of Nijmegen a group of teachers is in the process of implementing CMI-II on a Sorcerer micro. The programs are written in Pascal, with some assembler routines to control the reader and the printer. A first outline of a micro classroom support system is given in Fig. 6. Only the day-to-day operation is shown. Course construction and creation of an itempool have to take place by means of separate programs.

IPI-course X: accept student n's choice of course module; generate an individual test for student n (print); score the test of student n and report results, next choices and restudy advices; daily reports on course X (print).

Exam Y: input answer cards; assign grades; report scores/grades (print); test analysis (print).

Pool Z: input item; edit item; select items for a test; print test.

Fig. 6. Outline of a micro classroom support system.

The computer system will probably be as follows. The three sets of modules are contained on separate diskettes. On switching the micro on, the "menu" is loaded. On selection of a module the module is loaded, and a possible former module is removed to save workspace. The necessary data file is contained or built on a second diskette. If an exam is not too large it is not necessary to use a second diskette. In that case the test can be processed "in core" and after finishing it can be destroyed or saved on a cassette.
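The flavour of that menu-and-module organisation can be sketched as below, in Python rather than the Pascal of the actual micro programs; the module names follow Fig. 6, and loading code from diskette is only simulated.

```python
# Hypothetical sketch of the micro "menu": one module at a time occupies the workspace.
MODULES = {
    "1": ("IPI-course", lambda: print("guide student n through course modules")),
    "2": ("Exam processing", lambda: print("input answer cards, assign grades, report")),
    "3": ("Item pool", lambda: print("input, edit and select items, print a test")),
}

def run(selections):
    current = None                      # the module currently loaded in the workspace
    for choice in selections:
        if choice == "stop":
            break
        name, module = MODULES[choice]
        current = module                # loading a module drops the former one to save workspace
        print("loaded:", name)
        current()

run(["2", "3", "stop"])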

The module and the in-core data need 32 K of RAM; each extra 16 K of RAM allows more tests to be held in core. One 5 in. diskette can hold the attributes of a large number of items, but if the texts of the items must be stored as well only 100-300 items can be contained on one diskette. This is enough for most block tests, but not for pools that cover a whole course. We have not yet found a solution for that.

More than in the past we have now become aware of the need for a "CML-language". By this we do not mean a kind of author language as in specialised CAL-systems, but a formal definition of a minimal set of variables and options that must be available to the user of a CML-system. Some attempts have been made in the past, see for instance Behr[12] and Baker[13].

PERSPECTIVES

At our university most teachers are not acquainted with computer applications except for research and student programs. Recently the University's Council requested the author to investigate the need to establish central facilities.



All 66 "subject groups" (groups of faculty who teach in the same subject area) in the 6 departments have been questioned in detail. Response was almost 50%. The most wanted on-line facilities, in order of preference, were (3% corresponds to 1 responding group):

1. Library retrieval 49%
2. Administration of study results 43%
3. Study progress reporting 40%
4. Student administration 36%
5. Computation (prefixed programs) 30%
6. Computer programming by students
7. Test-marking, analysis and reporting 24%
8. Questionnaire processing 21%
9. Learning resources banking 21%
10. Games 21%
11. Simulation 18%
12. Computation and display in classrooms 15%
13. Itembanking 15%
14. CAL-practice 9%
15. CAL-drills 6%
16. CAL-tutorials 6%
17. On-line testing 3%
18. CML for IPI 3%

There is a remarkable overlap in CAL and CML-like applications, which might indicate that Lippey's categories Macro-guidance and Micro-guidance are mixed up as regards faculty appreciation. One obscuring factor might be that the popularity of simulation and gaming at this moment may have caused a superficial familiarity, which is definitely not the case with, for instance, tailored testing. The six first and the last (!) applications were already available to some extent. In general the results indicate that EDUSERVICE is in fact a good starting point for further introduction of the computer in support of education.

As Rushby et al.[14] have pointed out, our CML-efforts are totally self-supported. The two CML-systems and EDUSERVICE have been developed entirely at the participants' expense, and in the case of EDUSERVICE there has been only one participant until now, the Educational Research Centre of the University of Tilburg. In Holland there never has been and probably never will be a national development program in CAL as in the UK. Grants from government or from the foundation for educational research (SVO) for this kind of educational technology are quite out of the question. Therefore the development proceeded only when the author had time to spend on it. There is a great need for a micro-based classroom support system. In order to make these applications available in time we will have to seek some kind of external funding for the first time.


REFERENCES

1. De Witte P. C. F., A software package for computer managed instruction. Comput. Educ. 3, 325-330 (1979).
2. Van Hees E. J. W. M., Some preliminary remarks on the Eindhoven CML-system. Paper presented at the IEEE Eurocon 74 Conference, Amsterdam (1974).
3. Van Hees E. J. W. M., Computer Managed Learning at the university level in the Netherlands. Educ. Technol. 16 (1976).
4. Rymell A. D., System outline of Computer Assisted Management of Learning CAMOL. CET/NDPCAL, London (1974).
5. Van Hees E. J. W. M., Computertoepassingen in het onderwijs 3: De computer als onderwijsmanager. Intermediair 16 (1980); Computertoepassingen in het onderwijs 4: Computerondersteuning in totaalbeeld. Intermediair 16 (1980).
6. Leiblum M., A CAI service group considers Computer Managed Instruction and the Interactive Instructional System (IIS). In Selected Proceedings of CAL'81 (Edited by P. R. Smith). Pergamon Press, Oxford (1981).
7. Merrill M. D., Learner control: beyond aptitude treatment interactions. AVCR 23, 217-226 (1975).
8. Merrill M. D., Learner control in computer based learning. Comput. Educ. 4, 77-95 (1980).
9. Lippey G., Computer Managed Instruction: some strategic considerations. Educ. Technol. 15, 9-13 (1975).
10. Coulson J. E., An instructional management system for the public school. Technical Memorandum, SDC, Santa Monica (1967).
11. Walker N. and Boillot M., A computerized reading level analysis. Educ. Technol. 19, 47-49 (1979).



12. Behr G. E., Software structures for instructional management systems. Ph.D. Thesis, University of Wisconsin, Madison (1976).

14. Rushby N., James E. B. and Anderson J. S. A., A three-dimensional view of computer-based learning in Continental Europe. PLET 15, 152-161 (1978).