TRANSCRIPT
CELBAN™ Ten-Year Retrospective:
Impact and Washback
TESL Canada Presentation October 2015
By: Catherine Lewis and Blanche Kingdon
Overview of Presentation
• Background
• Development
• National Administration and Operations
• Growth and Change
• Uniqueness of CELBAN
• Observations of Test Impact and Washback
• Challenges
• Lessons Learned
• Suggestions for Further Research
What is CELBAN?
• an occupation-specific English language assessment tool
• used to assess the threshold English language proficiency of internationally-educated nurses (IENs) on the path to licensure to practice nursing in Canada
• assesses communicative language ability as four discrete skills: speaking, listening, reading, writing
Background
Why was CELBAN developed?
English proficiency tests previously used to assess the language proficiency of IENs (e.g., TOEFL, IELTS, TOEIC, CAEL, MELAB) were often inadequate. Why?
• the tests were not based on an analysis of the language demands of the nursing profession [Target Language Use (TLU) analysis];
• the tests were not validated with the target population (IENs);
• test-takers did not receive feedback on strengths and weaknesses in the productive skills (speaking & writing) to inform gap-filling efforts.
Development of CELBAN
History:
2000: Centre for Canadian Language Benchmarks (CCLB) conducted a Feasibility Study.
2002: CCLB awarded a contract to researchers from Red River College (RFP process) to conduct Phase I: Analysis of the Language Demands of Nursing.
Funding for research was secured by CCLB from: Provincial Governments (Ministry of Training, Colleges and Universities): Alberta and Ontario
Development…
History cont’d.:
CCLB awarded contracts to researchers/test developers from Red River College (through an RFP process):
2003: to develop CELBAN - Phase II: The Development of CELBAN, A Nursing-Specific Language Assessment Tool
2003-2004: to implement CELBAN at 3 pilot sites in Canada - Phase III: The Implementation of CELBAN
2004: to write/publish a resource: Developing an Occupation-Specific Language Assessment Tool Using the Canadian Language Benchmarks: A Guide for Trades and Professional Organizations
2004: CCLB developed the CELBAN website: www.CELBAN.org
Funding for these CELBAN projects was secured by CCLB from: Provincial Governments (Labour and Immigration/CIC): British Columbia, Alberta, Manitoba, and Ontario
Development…
History cont’d.:
2004: The Canadian English Language Assessment Services (CELAS) Centre at Red River College in Winnipeg was established as the National CELBAN Administrative Services Centre.
2005: CCLB contracted with researchers/test developers from Red River College (RFP process) to develop the CELBAN Readiness Self-Assessment (CRSA):
• online resource, available free on the CELBAN website: www.celban.org
• offline version in kit format, available for purchase from the CELAS Centre
Funding for development of CRSA was secured by CCLB from: Labour & Immigration Governments of B.C., Alberta, Manitoba, Ontario, and the Ontario Region of Citizenship Immigration Canada
Development…
History cont’d.:
2005: CCLB contracted with researchers/test developers from Red River College (RFP process) to develop, pilot and implement two more official versions, CELBAN Versions 2 and 3.
Funding for CELBAN Versions 2 and 3 was secured by CCLB from: Provincial Governments (Labour and Immigration/CIC): British Columbia, Alberta, Manitoba, and Ontario
Institutional CELBAN Development
2006: CCLB contracted with researchers/test developers from Red River College (RFP process) to develop the Institutional CELBAN (I-CELBAN) (Version One and Version Two)
- for use by institutions with nursing language bridging programs (for entry into, exit from, or diagnostic purposes within)
2007: I-CELBAN in kit format was available for eligible institutions to purchase from the CELAS Centre. [Note: 22 colleges and 4 universities have purchased I-CELBAN Kits as of Jan. 2014]
Funding for development of I-CELBAN was secured by CCLB from: Ontario Region of Citizenship and Immigration Canada
Why were Red River College’s researchers chosen?
Researchers from RRC’s Language Training Centre were known nationally as pioneers in CLB-based applied research:
1. “Benchmarked” more than 35 college programs at RRC & 1 university program at U. of M. (2000-2013).
2. Trained applied researchers (‘How to benchmark’ college programs) at 2 colleges in B.C. and 6 in Ontario (CITTE project 2006-07).
3. “Benchmarked” 4 health professions AND 10 trades (national or provincial projects) (2002-2012).
4. Published a “How To” Guide for Developing an Occupation-Specific Language Assessment Tool (based on the CELBAN model) (2004).
5. “Benchmarked” 10 Red Seal Trades Interprovincial Examinations (2010).
Facts about CELBAN
• For 10 years (2004-2014), the CELAS Centre at Red River College was the National CELBAN Administrative Services Centre for all operations, products and services related to CELBAN. (July 2014: pivotal change)
• Nursing licensing bodies across Canada approved CELBAN as one of only two options for IENs to demonstrate English language proficiency (as of 2012) (CELBAN or IELTS).
• Eight official CELBAN Administration Sites (in 5 provinces) were established and licensed by CELAS over the decade to administer CELBAN monthly or bi-weekly; 6 sites were fully operational and there were also numerous itinerant administrations as of June 2014.
Scope of Activities re CELBAN
[Pie chart: National CELBAN™ Administrative Services at the CELAS Centre. Segments: National CELBAN Administration for all sites in Canada; Winnipeg CELBAN Administration Site; Itinerant CELBAN Administration; Establishment of New CELBAN Administration Site and Training; Additional Training; Reporting; Consulting. Percentage shares: 60%, 16%, 8%, 6%, 4%, 3%, 2%.]
Numbers of Tests
Tests administered per year:
2005: 104
2006: 327
2007: 516
2008: 575
2009: 722
2010: 544
2011: 574
2012: 822
2013: 1104
Uniqueness of CELBAN
CELBAN is unique when compared to many academic language proficiency assessments (e.g., IELTS, TOEFL, CAEL, MELAB, TOEIC) previously used by nursing regulators:
• TLU analysis of the nursing profession
• content informed and validated by the target population, nursing stakeholders and consultants
• occupation-specific, criterion-referenced language proficiency assessment, which identifies the threshold levels of language proficiency required to practice, based on TLU
• score reports include a CLB level in each of the 4 test components, as well as individualized feedback on strengths and weaknesses in the productive skills: speaking and writing
Uniqueness of CELBAN

TEST PURPOSE
IELTS: An English language proficiency test designed to test the communicative English abilities of non-native speakers. The Academic version is used for higher education and professional certification. (The General Training version is used for work or migration purposes.)
CELBAN: An English language proficiency test designed to test the communicative English abilities of non-native speakers who are internationally-educated nurses (IENs).

HISTORY
IELTS: Implemented in 1989 (25-year history).
CELBAN: Implemented in 2004 (10-year history).

NUMBERS OF TESTS ADMINISTERED ANNUALLY AND NUMBERS OF TEST CENTRES
IELTS: Over 2 million IELTS tests were administered in 2013. There are over 900 IELTS testing centres and locations (130 countries) around the world (with 57 in Canada).
CELBAN: Over 1100 CELBAN tests were administered in 2013. There are 7 official CELBAN testing centres in 5 provinces in 1 country (Canada); itinerant CELBAN testing is conducted in specific targeted areas within Canada as needed.

DELIVERY FORMAT
IELTS: A paper-and-pencil test with 4 sections - Listening, Reading, Writing and Speaking. The Speaking test includes a live, face-to-face interview with one trained and certified ESL professional.
CELBAN: A paper-and-pencil test with 4 sections - Listening, Reading, Writing and Speaking. The Speaking test includes a live, face-to-face interview and role-plays with two trained and certified assessors who are ESL professionals.
Uniqueness of CELBAN

TEST CONSTRUCT
IELTS: A test of English as an “international” language that includes all standard varieties of English - American, Australian and British.
CELBAN: A test of English for nursing purposes that uses the variety of English which is prevalent in Canada. (There are no British or American accents in any of the texts on the Listening component.)

TEST SCORES
IELTS: All results are reported on a 9-band scale from 1 (non-user) to 9 (expert user), in whole and half bands (e.g., 7.0, 8.5). All components are scored by one trained scorer.
CELBAN: All results are reported in Canadian Language Benchmark (CLB) levels from 5 (Initial Intermediate Ability) to 10 (Developing Advanced Ability). (Note: CELBAN is a criterion-referenced test and focuses on the specific range of levels pertinent to the context of nursing in Canada - CLB 8-10.) In addition to scores, test-takers receive individualized feedback on strengths and weaknesses in the productive skills (speaking and writing). All components are scored by two trained scorers/assessors.

OWNERSHIP
IELTS: Jointly owned by a global partnership of education and language experts - British Council, IDP: IELTS Australia and Cambridge English Language Assessment.
CELBAN: Owned by the Centre for Canadian Language Benchmarks (CCLB), Ottawa, ON.
Observations of Test Impact and Washback
Test Impact
What is “test impact”?
Test impact includes the wider effects that tests have on:
• individuals (e.g., test-takers and test administrators)
• educational systems (e.g., test results used to make decisions about curriculum planning, funding allocation, admission, etc.)
• society at large (e.g., immigration policy, licensing for professionals, etc.)
(Taylor, 2005; Bachman and Palmer, 1996)
Observations of Test Impact
1. Social consequences of testing indicate a consequential aspect of validity (Messick, 1996).
• the use of test scores leads to career and life changes for individual test-takers, due to judgments made by decision-makers (positive or negative impact of CELBAN scores)
Observations of Test Impact
2. Relevant stakeholders showed interest during the decade in exploring the possibility of either expanding the target population of CELBAN test-takers (IENs) to include other health professions (physicians, physiotherapists/occupational therapists, pharmacists, midwives) OR adapting the CELBAN model for the development of new assessment tools for each of the other health professions.
Note: Interest was also shown by non-health professions regarding utilization of the CELBAN model for test development.
Observations of Test Impact
3. Deeper scrutiny of the use of language assessment tools by nursing and other regulators was prompted by the existence of CELBAN and its increasing popularity during the period of 2005-2013.
• In 2010, a standardization exercise was conducted by a trained consultant with The National Fluency Working Group, a sub-committee of the CNO.
• In 2013, an applied research project was initiated by the Office of the Manitoba Fairness Commissioner in response to inquiries from regulators regarding alignment of test scores.
Washback
What is ‘washback’?
“Washback has been discussed in language assessment largely as the direct impact of testing on individuals” (Bachman & Palmer, 2010, p. 109). Washback can be positive or negative, determined by its beneficial or harmful impact on educational practice or educational/political systems.
Observations of Washback
Washback related to official CELBAN:
During piloting of the official CELBAN, educational institutions expressed interest in using the tool for pre-admission, or for diagnostic purposes within programs.
• The washback effect was the development of an institutional version: Institutional CELBAN (I-CELBAN)
(positive washback)
Observations of Washback
Washback related to Institutional CELBAN:
Users of I-CELBAN had decisions to make, with resulting consequences.
1. Training (optional)
2. Determining institution’s purpose for using I-CELBAN (pre-admission, diagnostic within a program, and/or exit from bridging/entry into clinical placement)
3. Setting cut scores relative to the purpose for which I-CELBAN was used
(positive OR negative washback)
Observations of Washback
More washback related to CELBAN:
• Emergence of ‘CELBAN-Prep’ offerings:
1. ‘Orientation to CELBAN’ workshops
2. ‘CELBAN prep’ included in nursing language bridging programs
3. Independent ‘CELBAN prep’ workshops/courses
(positive OR negative)
Observations of Washback
Examples of positive washback from anecdotal information collected from Institutional CELBAN (I-CELBAN) users:
• Program planners determined appropriate entrance CLB Levels for nursing language bridging programs.
• Instructors
• were informed about gaps to address throughout nursing language bridging programs from specific diagnostic feedback.
• designed individualized (or group) learning plans for a variety of learners based on patterns that emerged from gap analysis of I-CELBAN results.
• developed new courses in which students practiced tasks modeled on CELBAN-type tasks.
• were provided with an appropriate means for conducting exit assessments.
Observations of Washback
More examples….
• Instructors reported an increase in confidence in their ability to:
a) score and provide feedback using I-CELBAN answer keys, exemplars, and assessor protocols (after having been trained)
b) generate new tasks (modeled after I-CELBAN) which were more valid and reliable than others previously developed
• I-CELBAN test-takers confirmed that receiving specific individual feedback regarding strengths and weaknesses in speaking and writing was useful to prepare them for clinical practice and to take the official CELBAN.
Challenges
1. There was no precedent for developing, implementing and operationalizing a high-stakes occupation-specific language assessment tool of this scope nationally in Canada prior to CELBAN.
Challenges cont’d.
[Chart: Numbers of Tests / CELAS Centre staff, per year]
2005: 104 tests / 2 staff
2006: 327 tests / 2.5 staff
2007: 516 tests / 3 staff
2008: 575 tests / 3.5 staff
2009: 722 tests / 2.9 staff
2010: 544 tests / 2.9 staff
2011: 574 tests / 2.9 staff
2012: 822 tests / 2.15 staff
2013: 1104 tests / 3.5 staff
2. Demand exceeded capacity (2012 on).
Staffing for all CELBAN™ Operations, Services and Products
Approximately 90 qualified and trained individuals on a national scale were involved in CELBAN operations, services & products (as of July 2014).
RRC staff:
• 1 CELBAN National Coordinator (& Trainer) (FT)
• 1 Administrative Assistant (FT)
• 1.5 Administrative Clerk/Scorer (PT)
• 1 Site Administrator (PT)
• 2 Invigilators (PT)
• 6 Trained Speaking Assessors (PT)
• 1 Researcher/Test Developer (& Trainer) (PT)
Other contracted CELBAN team members:
• 5 Site Administrators (PT)
• 43 Trained Speaking Assessors (PT)
• 14 Invigilators (PT)
• 10 Trained Writing Scorers (PT)
• 1 Psychometrician (casual PT)
Challenges cont’d.
3. Securing external funding for expansion of infrastructure, ongoing validation and further test development
4. Establishing reliable, sustainable CELBAN testing sites across Canada
Challenges cont’d.
5. Formal business plan needed to replace a Three-Year Business Plan that expired in 2008
6. Model of governance of CELBAN
Lessons Learned
1. There are advantages and disadvantages of a post-secondary institution in the role of a national administrative centre for a high-stakes test. Due diligence must be in place at the outset, as well as at strategic intervals, to ensure that the disadvantages do not outweigh the advantages.
Lessons Learned…
2. A business management model is needed.
• A sound business plan - covering every stage of operations and development
• A Service Level Agreement (SLA) for the partnership agreement - manageable, realistic targets and the type of support available from the owner
3. A solid governance structure is needed.
• representation from a range of appropriate stakeholders
Lessons Learned…
4. Challenges are part of ‘running the business’ of CELBAN at multiple sites in many regions of Canada.
• Test site sustainability and meeting the demand will be an ongoing challenge.
5. Communication is crucial with all CELBAN stakeholder groups (partners, funders, ‘team members’, and IENs).
Suggestions for Future Research
• A formal impact study of CELBAN and I-CELBAN
Conclusion
“No endeavor that is worthwhile is simple in prospect; if it is right, it will be simple in retrospect.” - Edward Teller
• Valuable lessons have been learned by the CELBAN team at the CELAS Centre during this past decade.
• Research to inform the future development and direction of CELBAN (e.g., a formal impact study) and consultation with all relevant stakeholders are important for both the caretaker (Touchstone Institute) and the owner of CELBAN (CCLB).
Acknowledgements
• Funders
• Committee members
• Test Administration teams in 5 provinces
• Nursing regulators
• Educators (nursing and language)
• Internationally-educated nurses (IENs)
Special Note: Lucy Epp, pioneer in ‘benchmarking’ initiatives in Manitoba; started in 1996; continued until retirement in 2008.
40
References
Bachman, L., & Palmer, A. (1996). Language Testing in Practice. Oxford University Press.
Bachman, L., & Palmer, A. (2010). Language Assessment in Practice: Developing Language Assessments and Justifying Their Use in the Real World. Oxford University Press.
Messick, S. (1996). Validity and washback in language testing. Language Testing, 13(3), 241-256.
Phase I: An Analysis of the English Language Demands of the Nursing Profession Across Canada. (2002). Centre for Canadian Language Benchmarks, Ottawa, ON. www.language.ca
Phase II: The Development of CELBAN: A Nursing-Specific Language Assessment Tool. (2003). Centre for Canadian Language Benchmarks, Ottawa, ON. www.language.ca
Phase III: The Implementation of CELBAN. (2004). Centre for Canadian Language Benchmarks, Ottawa, ON. www.language.ca
Taylor, L. (2005). Key concepts in ELT: Washback and impact. ELT Journal, 59(2), 154-155. Retrieved from http://eltj.oxfordjournals.org/content/59/2/154.full.pdf