
Journal of Vocational Education and Training, Volume 55, Number 1, 2003


Integrating Developmental Assessment with Student-directed Instruction: a case in vocational education in the Netherlands

HARM H. TILLEMA Leiden University, Netherlands

ABSTRACT In this article, it is argued that performance assessment is a valuable instructional tool in higher education courses, not only for evaluating learning outcomes or diagnosing prior learning (APL), but also for scaffolding and monitoring learning progress in the further development of competencies. Especially in the attainment of competencies in vocational education, this developmental type of assessment, as it relates to work and practice, serves as a strong device for providing functional feedback to the learner. A developmental assessment programme aimed at integrating assessment with instruction, called EDAS, was studied in a course programme in higher vocational education in the Netherlands. The programme entails a student-directed and self-regulated format for assessment. The construction and implementation of the programme are described as a case, from the perspective of the teaching staff. The description focuses on the experiences that were gathered and the ‘lessons’ that were drawn by the staff.

A few years ago, the EDAS system was launched at a Dutch Institute of Higher Vocational Education for Small Business and Retail Management. EDAS stands for Educational Development and Assessment System. The goal of EDAS is to establish a link between the curriculum and its assessment in order to achieve a better match between student learning and the attainment of vocational competencies. The innovative aim of EDAS is to reverse the traditional emphasis in higher education courses on the measurement of learning objectives and outcomes (Broadfoot, 1996) by focusing on the monitoring of student learning progress and on formative assessment. As a project, EDAS is meant to concretise and instrumentalise the close connection that exists in the curriculum between evaluation/assessment, learning and instruction (Shepard, 2000). The term development in the acronym is meant to strengthen the following mission: ‘the curriculum is governed by the perspective that educational programmes are above all a supporting and scaffolding element in the growth of the student’s self-directed learning towards acquiring the competencies needed in the labour market. It is the student that needs to govern his or her own learning course’ (EDAS project, 1997; Zimmerman & Schunk, 2001).

These goals and educational perspectives are pursued through a competence-based and learner-orientated programme as the central focus of the curriculum. Instruction in such a context needs to be aligned and coupled with an appropriate evaluation approach that gives credit to the continuous monitoring of the student as he or she progresses through the curriculum (Torrance, 1994; Topping, 1998). A central feature of assessment in the EDAS system is the high value placed on the student’s own responsibility for collecting and presenting evidence of growth in his or her learning, i.e. competence development.

No doubt, EDAS implies a major change at the level of instructional interaction and the distribution of responsibilities between students and their teachers, as well as in the actual delivery of the course programme (Swanson et al, 1995). For instance, the teacher’s role becomes more that of a coach of student learning (Boekaerts, 1999). In appraising, monitoring and evaluating learning progress, the teaching staff needs suitable instruction-related assessment methods to fall back on in their teaching; approaches therefore have to be constructed that prepare teachers for their new assessment role and equip them with the necessary tools (Broadfoot, 1996). EDAS tries to fulfil these requirements.

Conceptualising the Assessment Notions Behind EDAS

Assessment for Development

To effectively support and scaffold students as self-directed learners (Zimmerman & Schunk, 2001) means, as far as assessment is concerned, moving beyond the measurement of the outcomes of what has been learned towards assessment approaches that can anticipate competence levels and monitor the learner’s progress during the course of competence development (Torrance, 1994). Assessment viewed as supporting learners means providing opportunities for insight into one’s current or actual level of performance, as well as into the learner’s potential to achieve targeted performance (Haertel, 1990). Utilising suitable competence-based assessment instruments can contribute greatly to performance enhancement by effectively providing functional and valid feedback, assessing the learning process as well as its products (Butler & Winne, 1995).


Competence-based and Self-regulated Learning

It is the learner who creates value-added solutions to his or her competence profile (Boekaerts, 1999). In this view, it is only natural that a student receives responsibility for his or her own development. In self-regulated learning, students are in charge of the goals and strategies of their own learning (Olson, 1991; Fischer & King, 1995). Appraising the strengths and weaknesses in existing competencies and defining a student’s learning needs have often been subsumed under the teacher’s control (Shepard, 2000). In a self-directed view of learning (Zimmerman & Schunk, 2001), education and training are looked at through the eyes of the student, who receives timely support and functional feedback (Butler & Winne, 1995) while in the process of developing competence. In this sense, training and development become embedded in careful monitoring and assessment of performance (Peterson, 1995).

Establishing an Integrative Approach to Assessing Competencies

Given the above-mentioned notions on learner-orientated assessment and evaluation, there is a strong need for appropriate instruments that can integrate assessment with interventions for performance improvement (Gipps, 1994; Broadfoot, 1996). As is evidenced in many insights on competence-based assessment (Wiggins, 1989; Haertel, 1990; Herman & Winters, 1994; Peterson, 1995), evaluative information should preferably be extracted from direct and authentic (i.e. work-related) activities, and at the same time provide cues for further specific training or development activities, i.e. offer a learning plan (Smith & Tillema, 2001). An integrated approach to assessment means utilising several instruments for monitoring different aspects of performance relative to the student’s actual profile, ensuring that the student receives feedback relative to the course profile. In this sense, assessment scaffolds the course track. Such an integrative approach, derived from the work of Tillema (1996, 1998), was used in EDAS (see Figure 1 for an outline of the assessment instruments aligned with the curriculum).


[Figure 1. Assessment instruments as aligned with the course programme. Over one curriculum year (September, February, June), an intake is followed by alternating blocks of curriculum and assessment moments (SA, P&C and DC), with the portfolio maintained throughout. SA, self-assessment coupled with feedback from peers; P&C, presentation of portfolio and counselling session; DC, development or assessment centre consisting of performance simulations.]
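Read as a schedule, Figure 1 amounts to a sequence of assessment events spread over the curriculum year, with the portfolio running continuously alongside. The following Python sketch is only an illustrative rendering of that schedule; the type names and the exact ordering of events are assumptions, not part of the EDAS materials.

    from dataclasses import dataclass
    from enum import Enum

    class Instrument(Enum):
        SA = "self-assessment coupled with feedback from peers"
        PC = "presentation of portfolio and counselling session"
        DC = "development centre with performance simulations"

    @dataclass
    class AssessmentEvent:
        month: str              # marker on the Sept-Jun timeline of Figure 1
        instrument: Instrument

    # One curriculum year, roughly following Figure 1 (ordering illustrative);
    # the development centre is offered twice a year, and the portfolio is
    # maintained continuously between these events.
    year_plan = [
        AssessmentEvent("September", Instrument.SA),
        AssessmentEvent("February", Instrument.PC),
        AssessmentEvent("February", Instrument.DC),
        AssessmentEvent("June", Instrument.SA),
        AssessmentEvent("June", Instrument.DC),
        AssessmentEvent("June", Instrument.PC),
    ]

    for event in year_plan:
        print(f"{event.month}: {event.instrument.value}")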

Assessment as related to development, in this view, reflects four main conceptions about the instrumentation and ways of assessment (Wiggins, 1989):

• It helps the person to monitor his or her own development; he or she then receives feedback on a continuous basis (Butler & Winne, 1995). A learning or development plan can thus be constructed.

• It reveals and utilises discrepancies between self-perceptions or self-assessments and external sources of information about a person’s competencies as a way to inform the student about further learning needs (Haertel, 1990).

• It is the student who should profit primarily from this information (self-directedness) and be able to utilise it for increased awareness of development in competencies (Redman, 1994).

• It must reflect the competencies acquired, i.e. evidence the performance itself, in such a way that processes as well as products of learning are documented (Smith & Tillema, 2001).

To materialise these notions, the EDAS system encompasses the following combination of instruments, used throughout the course programme:

• the portfolio;

• the development centre;

• self- and peer assessment.

A portfolio is a purposeful collection of examples of learning compiled over a period of time (Smith, 1997), and gives visible and detailed evidence of a person’s attainment of competencies. It serves primarily as a tool to highlight progression in competence development, since it is under the control and responsibility of the student, who presents the evidence.

A development centre (Jones & Whitmore, 1995) is a deliberately constructed assessment setting in which several simulated, work-related exercises and assignments are given to test specific competence attainment at a more detailed level. The assessment centre method (Byham, 2001) is typically used to construct practice assignments to test the competencies.

The discrepancies between self-assessment and peer assessment serve as a reflective tool for students to elaborate on their own strengths and weaknesses as mirrored by the perceptions of others, thus establishing a baseline of reference points for the student.

The collected assessment information (i.e. in the portfolio and development centre) will be related to further competence development through coaching activities by the teaching staff.

The issue addressed in this article is how EDAS as a project has implemented its assessment notions and what has been learned from its implementation by the teaching staff for the further construction of the assessment approach advocated.

Method

The implementation of EDAS is described as an exemplary case (Lundeberg et al, 1999), recognising its uniqueness as tied to the setting in which EDAS was implemented while, at the same time, treating it as a source of assistance offering ‘lessons’ to be learned for other settings. The EDAS case comprises the evaluation statements collected from the teaching staff during the full implementation period of four years at an institute of higher education in the Netherlands. During that period, regular monthly meetings with the staff were held to monitor and evaluate the implementation progress, as well as to make decisions about further steps to be taken in the implementation process. With respect to the gathered evaluative comments, a distinction was made between the process and the product or outcomes (Hargreaves & Evans, 1997) of the implementation of EDAS. The evaluative statements and decisions made by the staff are put together as retrospective summaries (Hitchcock & Hughes, 1995); first, with respect to the decisions made during the construction and implementation process itself, and secondly, with respect to the ‘lessons’ drawn by the teaching staff for future development. The ‘lessons’ drawn were viewed as testimonies by the teaching staff.


The collected statements were categorised under appropriate headings below.

Results

Introducing EDAS

From the outset, there were worries about, and a lack of belief in, the success of a new way of measuring learning: the teaching staff acknowledged that students work for, and are primarily interested in, ‘making the grade’. At the same time, however, it was believed that a major impetus for the creation of a new assessment system is to involve students in the assessment of their learning. In this way the notion of assessment integrated with instruction highlights the difference with what is encountered in most (other) institutions for Higher Vocational Education in the Netherlands. EDAS, therefore, in the eyes of the teaching staff, should focus on the successful later performance of students by delivering relevant, i.e. competence-related, feedback. The teaching staff also recognised and accepted giving the student more responsibility for his or her own competence development. Education, i.e. the curriculum, was thus acknowledged to be instrumental in attaining these competencies. Putting competence development in the hands of students actually means giving them the opportunity to monitor themselves and highlight their own performance more concretely. These guiding principles remained almost undisputed throughout the process of implementation.

To meet these principles, EDAS as a system incorporates the following elements:

• student-directed learning and acquisition of competencies remain at the centre of the curricular programme, leading to flexible course tracks;

• assessment monitors instructional/learning progress continuously through a combination of assessment instruments (see Figure 1);

• competencies to be covered in the curriculum are derived from qualification profiles obtained from outside the educational institution, so as to legitimise the curriculum;

• teachers act as facilitators of instruction and monitor development by adopting the role of a coach.

Self-management of learning, according to the teaching staff, is a central asset of the course programme, and requires the interest of each student in life-long learning and development. This means a call by teachers (as a collective) on student competencies such as problem-solving, self-regulation and reflective awareness. EDAS fits in with this requirement by helping the student to collect insightful information on the learning progress made towards the attainment of competencies, assessing multi-faceted learning experiences (not only knowledge and skills, but also perspectives, orientations and experiential learning) through a number of different assessment instruments.

Building Blocks of the EDAS System

Defining competencies. One of the essentials of an assessment approach like the EDAS system is the way in which relevant competencies are selected and defined. In order to arrive at a balanced set of competencies, a so-called ‘Wisdom of Practice’ study (Mroseck, 1996) was conducted in which, through a Delphi process of information collection, relevant actors in the competence domain were consulted several times in order to define qualifications for the curriculum. These qualifications were rephrased as curriculum content through frequent exchanges with the teaching staff. A careful selection of relevant competence domains and criteria of attainment was the outcome of this process (described in more detail in EDAS, 1997). It is important to note that the Wisdom of Practice study also provided the competence-related standards, which identify the proficient expert in each competence domain. The experts in the domain supplied the necessary information on the standards by which the assessment of competencies can be established, and levels of performance were identified for each competence domain. Based on this information, it became possible to describe each competence in performance-related terms together with its assessment standards or qualification level. Each competence is eventually described in terms of three elements (a minimal illustrative sketch follows the list):

• a setting or situation in which the relevant performance is to be demonstrated;

• the actual performance to be shown, described in concrete and behavioural terms;

• the standards by which it can be assessed, as indicated by attainment levels.
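As a minimal sketch of this three-part format (the field names and the example content below are invented for illustration and do not come from the EDAS documentation), a competence description could be captured as a small record:

    from dataclasses import dataclass

    @dataclass
    class Competence:
        """One competence in the three-part EDAS format (illustrative fields)."""
        setting: str                # situation in which the performance is demonstrated
        performance: str            # the performance itself, in concrete behavioural terms
        standards: dict[str, str]   # attainment level -> standard for assessing it

    # Hypothetical example for a retail-management competence.
    complaint_handling = Competence(
        setting="a sales conversation with a dissatisfied customer in a retail shop",
        performance="resolves the complaint while retaining the customer relationship",
        standards={
            "basic": "follows a given complaint-handling procedure",
            "proficient": "adapts the procedure to the customer and the situation",
        },
    )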

The ‘Wisdom of Practice’ study was a major route towards the teaching staff’s acceptance and acknowledgement of assessment as a competence- and performance-orientated way of evaluation.

Constructing assessments as student assignments. Students and teachers primarily experience assessment around the portfolio, an instrument which, in the hands of the student, crystallises and demonstrates accomplishments. Following the format outlined by Tillema (1998), the portfolio is considered a reflective learning tool in which evidence of attained performance (i.e. a piece of work, a product made or an artefact of a process the student was involved in) is commented upon and further learning needs are indicated. The reflection process is triggered by an initial self-assessment in which the student indicates the outcomes he or she hopes to attain and the level of proficiency to be achieved. Relative to his or her own perceptions of accomplishments, peers are invited to appraise the evidence collected in the portfolio (peer assessment). This collection of evaluative information is taken as input for the student’s conversation with the teacher as a coach (see also Figure 1 for an outline of the procedure). An outcome of this counselling meeting might be a decision to collect further assessment information in the so-called ‘development centre’, an additional assessment intervention that, in the EDAS system, is offered twice a year.
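Procedurally, the cycle described above runs: initial self-assessment, collection of evidence, peer appraisal, then a counselling conversation whose outcome may be a referral to the development centre. The sketch below is one hypothetical rendering of that decision flow; all names and the decision rule are assumptions, not EDAS specifications.

    from dataclasses import dataclass

    @dataclass
    class PortfolioEntry:
        evidence: str            # a piece of work, product, or artefact of a process
        self_assessment: str     # outcomes the student hopes to attain
        peers_confirm: bool      # whether peer appraisals match the self-assessment

    def counselling_outcome(entry: PortfolioEntry) -> str:
        """Possible outcome of the student-coach conversation. Illustrative rule:
        a discrepancy between self- and peer assessment prompts further
        assessment in the development centre (offered twice a year)."""
        if entry.peers_confirm:
            return "update the learning plan and continue the curriculum"
        return "collect further assessment information in the development centre"

    entry = PortfolioEntry(
        evidence="marketing plan written during work placement",
        self_assessment="aims at proficient level in market analysis",
        peers_confirm=False,
    )
    print(counselling_outcome(entry))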

In the development centre, typically, competencies are assessed that have special relevance to the student’s learning progress, given his or her previous accomplishments as evidenced in the portfolio. The EDAS team, together with the teaching staff, is responsible for the selection and construction of relevant exercises and assignments in the development centre. All assessment assignments are derived from work-related problems or situations that have been collected over time by the teaching staff.

Managing the implementation process. The institute of higher education that adopted EDAS abandoned the traditional curricular divisions and separate staff sections in favour of a cohort-like organisation of the teaching staff (just as students were organised in cohorts). Teacher cohort teams were responsible for delivering the curriculum to their respective student cohort group. For these teams, the EDAS project organised monthly lunch or ‘sandwich’ meetings to brief teacher cohorts about developments and to discuss the implementation of the assessment approach with them. These sandwich meetings proved to be a vital connection between the EDAS goals and the teachers’ actual implementation practices. They provided space for alterations and adaptations of the originally proposed EDAS guidelines in order to tailor the assessment approach to the actual instructional process in which the teachers and students of a specific cohort engaged. These adaptations were discussed intensively with the teachers and amended accordingly. Given the view that teachers, as well as their students, are the prime owners of their instructional process as self-regulated learners, this adaptation of the EDAS system was found acceptable and necessary. As such, the teaching staff has prime responsibility for decisions with regard to the implementation and successful adoption of the assessment notions put forward by EDAS.


‘Lessons’ and Conclusions Drawn from Implementing EDAS

Construction and implementation of a far-reaching system for assessment and evaluation, alongside the renewal of a curriculum at an institute for higher education, proved to be a time-consuming and cumbersome process of negotiation and fine-tuning, in which success depended highly on situational constraints and a supportive infrastructure. Several persistent problems and some fruitful experiences were identified by the teaching staff as lessons to be carried forward from the process:

1. The establishment of relevant competencies for development and learning in dialogue with teachers proved to be a crucial step, i.e. identifying competencies that are of direct concern for teaching and can constitute legitimate targets for an educational programme. The competencies, legitimised by external agents, acted as cornerstones for teachers as well as students to direct learning and assessment. The teachers’ involvement in the process of constructing these competencies only positioned the competencies more strongly. However, competence-related performance proved difficult to rephrase as teaching content. A continuous discussion was needed between the competence domain, i.e. the organisations and corporations constituting that domain, and educational practitioners, such as teachers, in order to find common ground with respect to the content and details of performance. The ‘Wisdom of Practice’ study constituted a sensitive bottom-up approach that bridged the ‘voices’ of practice while balancing teachers’ expertise with external expertise about the attainment and feasibility of competencies.

2. The levels and standards of performance function not only to comply with external requirements (exams, inspectorate), but also to express the internal (personal) targets of excellence and quality for teachers and students. By setting standards at certain levels, a programme may distinguish itself from competing programmes. Teachers were very keen on expressing their aspired standard levels as an internal yardstick for appraising their own efforts. This calls, on the one hand, for mutual agreement between staff members on what the actual attainment levels are, while on the other hand it requires a certain amount of differentiation in programme requirements to make differences in attainment possible. The process of defining competence profiles was quite new to the teachers, but since it is connected to assessment they acknowledged its relevance. An educational management team capable of leading the staff and of clarifying its standards of success can strongly facilitate the endeavour of setting standards and clarifying boundaries (Preedy et al, 1997).


3. The collection of relevant information for assessment is an interpretative process. The work-related experiences of students and the competence development they produce put new demands on what is to be assessed and on the way in which it is assessed. Questions arose such as: what counts as relevant evidence, who decides about the inclusion of essential performance, how do we integrate different pieces of performance evidence (i.e. in portfolio or development centre) and, as teachers put it more outspokenly, whose perceptions count most in arriving at a meaningful and coherent picture of development and growth? All these questions were felt to be crucial and called for constant discussion in the teaching teams. The construction of cooperating teams of teachers in cohorts proved to be an important condition for reaching agreement, although it also gave rise to divergent solutions between teams: some favoured a strong student influence, others were opposed to it.

4. The need for embedding assessment in a judgemental context is clearly felt by teachers. Assessment is not just information; it is only complete after a process of deliberation and reflection. This process of judgemental reasoning and decision-making between students, who present and defend their evidence, and their teachers, who appraise and weigh the evidence, may be severely hampered if it is regarded merely from the perspective of information collection. In a developmentally orientated assessment, the viewpoint of the student as a self-regulated learner has to be reconciled with the presence of a qualifying ‘system’ that keeps a keen eye on compliance with external requirements or standards. In this respect teachers considered the formulated standards essential in governing the judgemental process.

5. Sustainment and maintenance of an integrated approach to assessment calls for a balanced system with multiple instruments, capable of giving a detailed, multi-perspective picture of growth in competencies. No single assessment instrument can do the job alone; teachers were well aware that an interconnected set of measurement instruments would be needed, capable of addressing different perspectives on competence attainment. Self- and peer assessment stress the individual commitment and strong involvement of students in their development. Portfolios satisfy the need for direct monitoring of performance-related evidence, while development centres open the possibility of specific simulations of performance. This ‘balanced’ system, which typifies EDAS, can only be maintained with sufficient attention to procedures, clear guidelines and well-established rules of conduct, as became apparent to all teachers throughout the process of implementation.


6. Active participation of students in their own assessment for development is not self-evident and cannot be taken for granted. Fear of underachieving, possible hyperinflation in providing evidence and ignorance of the actual standards can all unduly lead students to present a collection of inauthentic and invalid information merely to comply with what is required. Portfolio construction in itself is a lengthy process and highly dependent on what the learning environment offers. It was found, therefore, that students need to be motivated and instructed to use the assessment instruments in a productive way, and need guidance to engage in self-reflective activities about their learning (i.e. ‘learning to learn’ as a condition). It became all too clear to their teachers that students are not necessarily convinced of their self-regulative role in learning.

7. The internal organisation of a course has to support the time and effort needed for collecting sufficient assessment information, as well as for transforming it into developmental plans for further learning. The assessment information brought forward needs to be evaluated and properly discussed in order to point out profitable roads for further development, resulting in adaptive learning tracks and thus placing a demand for flexibility on curriculum delivery. Teachers and students together need to be prepared to debate the evaluation of progress and its consequences for learning. For teachers as mentors and coaches, this means not falling back on routine solutions and existing options, but being prepared to support their learners in an adaptive, flexible way. For students it means taking a stance as a learner with high responsibility for self-regulation. These conditions are difficult to meet, as the teaching staff found out.

8. Maintaining a coherent programme and staff involvement. Every innovation calls for breaking down existing strategies and ‘epistemologies’ of practice (Pintrich & Hofer, 2002). Some teachers seem to flourish in an open, unstructured environment that brings out the best in them, but a lack of programme coherence and structure can backfire on the implemented changes at some later date. The acquisition of new practices, especially in teams of teachers, needs vehicles for change and criteria for success in order to ‘route’ the innovation. This involves, for instance, making explicit and recognising teachers’ tacit theories-in-use (Pintrich & Hofer, 2002). The EDAS sandwich meetings formed a crucial element in explicating this knowledge of practice and promoted team learning.

The EDAS project, now completed, clearly showed the high interconnectivity of evaluation and assessment with the curriculum. Teachers play a crucial role in this link because of their ‘content’ and ‘delivery’ expertise. However, teachers may also endanger a successful link, depending on their ability to frame competencies in the format of instruction; this requires special expertise of the teacher in the assessment of student learning. At the same time, it opens new prospects for teachers and staff to incorporate the world of work into their teaching. Teachers can be helped in this task by being provided with the necessary tools that go along with the assessment of competencies in the curriculum. Competence-related assessment tools therefore open up, as well as require, new approaches to the teacher’s own work.

Correspondence

Dr H.H. Tillema, Department of Education, Leiden University, PO Box 9555, NL-2300 RB Leiden, Netherlands ([email protected]).

References

Boekaerts, M. (1999) Self-regulated Learning: where we are today, International Journal of Educational Research, 31, pp. 445-457.

Broadfoot, P.M. (1996) Education, Assessment and Society. Buckingham: Open University Press.

Butler, D.L. & Winne, P.H. (1995) Feedback and Self-regulated Learning: a theoretical synthesis, Review of Educational Research, 65, pp. 245-281.

Byham, W.C. (2001) What is an Assessment Center? The Assessment Center Method. Development Dimensions International Inc.

EDAS (1997) Competence Profiles in Small Business and Retail Management. Enschede: Saxion. [In Dutch]

Fischer, C.F. & King, R.M. (1995) Authentic Assessment, a Guide to Implementation. Thousand Oaks: Corwin.

Gipps, C. (1994) Beyond Testing, Towards a Theory of Educational Assessment. London: Falmer Press.

Hargreaves, A. & Evans, R. (1997) Beyond Educational Reform, Bringing Teachers Back In. Buckingham: Open University Press.

Haertel, E.H. (1990) Performance Tests, Simulations and Other Methods, in: J. Millman & L. Darling-Hammond (Eds) The New Handbook of Teacher Evaluation. Newbury Park: Sage.

Herman, J.L. & Winters, L. (1994) Portfolio Research, a Slim Collection, Educational Leadership, 52(2), pp. 48-55.

Hitchcock, G. & Hughes, D. (1995) Research and the Teacher, a Qualitative Introduction to School Based Research. London: Routledge.

Jones, R.G. & Whitmore, M.D. (1995) Evaluating Developmental Assessment Centers as Interventions, Personnel Psychology, 48, pp. 377-388.

Lundeberg, M., Levin, B. & Harrington, H.L. (1999) Who Learns What from Cases and How? Mahwah: Lawrence Erlbaum Associates.


Mroseck, S. (1996) Wisdom of practice onderzoek: het bepalen van competenties, in: H. H. Tillema (Ed.) Development Centers, Development of Competencies in Organisations. Deventer: Kluwer. [In Dutch]

Olson, M.W. (1991) Portfolios: education tools, Reading Psychology, an International Quarterly, 12, pp. 73-80.

Peterson, K.D. (1995) Teacher Evaluation. Thousand Oaks: Corwin Press.

Pintrich, P. & Hofer, B. (2002) Personal Epistemology, the Psychology of Beliefs about Knowledge and Knowing. Mahwah: Lawrence Erlbaum Associates.

Preedy, M., Glatter, R. & Levacic, R. (1997) Educational Management, Strategy, Quality and Resources. Buckingham: Open University Press.

Redman, W. (1994) Portfolios for Development, a Guide for Trainers and Managers. London: Kogan Page.

Shepard, L. (2000) The Role of Assessment in a Learning Culture, Educational Researcher, 29(7), pp. 4-15.

Smith, K. (1997) School Principals’ Experiential Learning with and about Portfolios, paper presented at the Annual Meeting of the American Educational Research Association, Chicago, March 24-28.

Smith, K. & Tillema, H.H. (2001) Long term influences of portfolios on professional development, Scandinavian Journal of Educational Research, 45, pp. 183-203.

Swanson, D.B., Norman, G. & Linn, R.L. (1995) Performance Based Assessment: lessons from the health professions, Educational Researcher, 24(5), pp. 5-11.

Tillema, H.H. (1996) Development Centers, Development of Competencies in Organisations. Deventer: Kluwer. [In Dutch]

Tillema, H.H. (1998) Design and Validity of a Portfolio Instrument for Professional Training, Studies in Educational Evaluation, 24, pp. 263-278.

Torrance, H. (1994) Evaluating Authentic Assessment, Problems and Possibilities in New Approaches to Assessment. Buckingham: Open University Press.

Topping, K. (1998) Peer Assessment Between Students in Colleges and Universities, Review of Educational Research, 68, pp. 249-276.

Wiggins, G. (1989) Teaching to the (Authentic) Test, Educational Leadership, 46, pp. 41-47.

Zimmerman, B.J. & Schunk, D.H. (2001) Self-regulated Learning and Academic Achievement. Mahwah: Lawrence Erlbaum Associates.
