Assessment for learning with the Global Scale of English


1. https://creativesystemsthinking.wordpress.com/2015/02/21/noam-chomsky-on-the-dangers-of-standardized-testing/

How much is too much? Judging by this year’s Manchester IATEFL programme, the question foremost in the minds of the ELT profession is how much to test. A whole host of presentations tackled the theme, from The Potential and Pitfalls of Assessment and What can you learn from a test? to Why teachers should love testing, not to mention this year’s ELT Journal debate: Language testing does more harm than good.

Everyone, it seems, has a view on testing, good or bad. Noam Chomsky, the eminent linguist, has entered the debate on standardised testing, claiming that testing is being turned into something “extremely harmful”: “It’s turning us into individuals who devote our lives to achieving a rank. Not into doing things that are valuable and important.”1 But can testing itself be inherently good or bad? Even if we acknowledge that some tests or systems are broken, do the naysayers truly believe that a world without any assessment would be a better place to learn? Is it not possible to identify the positive impact of testing and create an assessment system that ensures teachers and learners are “doing things that are valuable and important”?

All testing is not the same

We can all dredge up examples from our past of tests and exams that have scarred us for life. Mine was the practical Art paper, aged 16. Topic: canals. I still can’t look at a lock gate without feeling sick! And many of us have failed exams in subjects that we have later gone on to master. I failed French at school, and went on to obtain a degree in the subject. So how valid were those tests?

Although they were both a snapshot of my proficiency at that moment, the way the results were dealt with had vastly different impacts. The Art exam went off to be marked and I got a grade. End of story. I never saw the artistic output of my efforts again, had no feedback on my performance (other than the grade) and, honestly, never gave it another thought until I started writing this article! The French test, on the other hand, was graded by my form teacher and handed back to me with annotations and feedback. We went through it together (I think he was almost as shocked as I was) and used it as the basis for follow-up work. This was in the 1970s and no one was yet speaking about Assessment for Learning. But, as with all great practices, it was something that teachers were doing even before the terminology was coined.

To take another example, what about health tests? They let us know how we are doing, physically, and enable us to modify behaviour if the results are not what we – or the doctor – want to see. We don’t get the results of a medical test and then simply throw them away; we work through them with the medical professionals to see what can be done.

Why should educational test results be any different? The test shouldn’t necessarily be the end of the process – it could be the means to a more “valuable and important” outcome; the start of an informed discussion about what the learner should do next.

Helping English language learners answer the question: am I making progress?

At Pearson, we have set ourselves the goal of demonstrating the “efficacy” of our products and services – identifying the positive impact that they have on the lives of language learners. Of course, for English language learners “positive impact” depends very much on learner goals. It may mean the ability to communicate with friends or colleagues around the world, the possibility of studying at a university or college of higher education in an English-speaking country – or simply the confidence to express themselves in English. Even if the positive impact that learners desire varies, what learners have in common is the need to remain motivated – to see their progression and understand what they need to do next to meet their longer-term goals.

Research shows that lack of time and motivation are the biggest barriers to becoming fluent in English. This knowledge, and our desire to show that our products and services have a positive impact on learners, led to the creation of the Global Scale of English.

What started as a research initiative to look at the measurement of language proficiency – and how this can be used to inform and motivate, rather than just “test” – has blossomed into a completely new learning and teaching ecosystem. The Global Scale of English ecosystem is made up of four parts: the scale itself; a set of Learning Objectives, or ‘can do’ statements, that describe exactly what a learner can do at each point on the scale; course materials; and assessment tools. Unlike some other frameworks that measure English proficiency in broad bands, the Global Scale of English identifies what a learner can do at each point on a scale from 10 to 90, across each of the four skills: listening, reading, speaking and writing. It has been psychometrically aligned to the CEFR, so teachers familiar with the CEFR system will find it easy to navigate.

The Learning Objectives are central to the ecosystem in that they provide context for teachers and learners, describing exactly what it means to be at a level of proficiency in English in terms of what a learner can do. For example,

• At 15 in reading, a learner ‘can read and understand simple prices.’ (CEFR <A1)

• At 26 in listening, a learner ‘can understand basic questions about people’s likes and dislikes.’ (CEFR A1)

• At 37 in speaking, a learner ‘can make simple, direct comparisons between two people or things using common adjectives.’ (CEFR A2+)

• At 62 in writing, a learner ‘can systematically develop an argument giving the reasons for or against a point of view.’ (CEFR B2)

They form the starting point for the creation of all our new learning and assessment materials – ensuring a direct link between what is taught and what is assessed. In short, the Global Scale of English and these learning objectives form the backbone that connects every Pearson English product or service.

Having a granular scale means that proficiency can be measured more accurately, progress can be demonstrated more regularly and formative assessment can be used to plan future learning. Rather like those discussions with my French teacher, the Global Scale of English learning objectives facilitate meaningful discussion.

Filling in the gaps in the CEFR

The CEFR uses a six-level classification of learner proficiency, from A1 (low basic) to C2 (fully proficient). The amount of instruction needed to progress from one level to the next varies according to several factors, including age, ability and native language, making it difficult to be precise. However, it has been observed that most people studying for three to four hours a week may take two or more years to move from one CEFR level to the next. This can leave both learners and teachers feeling frustrated with the amount of progress that is seemingly (not) being made. How motivating is it to take an end-of-year exam and be told, year after year, that you are still B1? No wonder the discussion at IATEFL was so roundly anti-testing if this is how we are assessing our learners.

The work to develop the Global Scale of English (GSE) builds upon the research carried out by Brian North and the Council of Europe in creating the Common European Framework of Reference for Languages (CEFR). In developing the scale, we have created new learning objectives which extend the existing CEFR Can Do statements in both number and range, providing information to support a far more granular definition of language proficiency across all four skills. When proficiency is measured on a more granular scale, it is easier to demonstrate small amounts of progress. Yes, a learner may still be B1 for several years (according to the CEFR), but on the GSE they have 16 points of incremental proficiency to be measured against (43–58), so progress within a CEFR level can be demonstrated.
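Since progress within a single broad CEFR band is the key argument for a granular scale, the arithmetic can be sketched in a few lines. This is an illustrative sketch, not Pearson’s implementation: only the below-A1 (10–21) and B1 (43–58) boundaries come from this article; every other band boundary below is a hypothetical placeholder.

```python
# Illustrative only: how a granular 10-90 scale can show progress even
# when the broad CEFR band has not changed. Boundaries marked
# "hypothetical" are placeholders, not Pearson's published values.
CEFR_BANDS = [
    ("<A1", 10, 21),  # stated in the article
    ("A1",  22, 29),  # hypothetical boundary
    ("A2",  30, 35),  # hypothetical boundary
    ("A2+", 36, 42),  # hypothetical (the article places GSE 37 at A2+)
    ("B1",  43, 58),  # stated in the article (16 points wide)
    ("B2",  59, 74),  # hypothetical (the article places GSE 62 at B2)
    ("C1",  75, 84),  # hypothetical boundary
    ("C2",  85, 90),  # hypothetical boundary
]

def cefr_band(gse_score):
    """Map a GSE score (10-90) to its broad CEFR band."""
    for band, lo, hi in CEFR_BANDS:
        if lo <= gse_score <= hi:
            return band
    raise ValueError("GSE scores run from 10 to 90")

def describe_progress(start, end):
    """Report progress even when the CEFR band has not changed."""
    moved = end - start
    if cefr_band(start) == cefr_band(end):
        return f"Still {cefr_band(start)} on the CEFR, but +{moved} GSE points"
    return f"Moved from {cefr_band(start)} to {cefr_band(end)} (+{moved} GSE points)"

print(describe_progress(45, 53))  # both scores fall in B1, yet progress is visible
```

A learner moving from 45 to 53 would stay “B1” on an end-of-year certificate, but the granular report shows eight points of measurable growth.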

The CEFR has good coverage of Can Do statements for speaking, but less so for the other three skills. And almost two thirds of the Can Do statements cover A2–B2, with little at the lower and higher levels. The new GSE Learning Objectives serve to fill those gaps in the CEFR, enabling progress to be measured equally across the four skills. The GSE also identifies a level below A1 (10–21), meaning that it is now possible to assess beginners using the GSE Learning Objectives. Readers who are interested in learning more about this research and would like to download and use the latest set of GSE Learning Objectives should go to www.english.com/gse.

The real power of this research project, however, has been the creation of the full ‘GSE Ecosystem’: a combination of learning objectives, course materials and assessment tools, all aligned to the same proficiency scale. In this way, we can ensure that assessment products (measuring learner outputs) truly reflect the teaching and learning that has gone before (learning inputs). The newest assessment offering from Pearson English is Progress – a package of three tests taken at the beginning, middle and end of a course of study to measure progress. At each stage, a test report indicates a learner’s score on the Global Scale of English – and identifies specific GSE Learning Objectives that require further work. Once learning gaps have been identified, appropriate remediation can be offered that targets the same specific learning objective. In a way, this puts a new spin on “teaching to the test”: if a functionally based course is assessed using a tool that measures functional competence, then maybe teaching to the test is not all bad.
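The reporting loop described above – a score on the scale, plus flagged learning objectives, plus remediation targeting those same objectives – can be sketched as a simple data structure. All class, field and activity names here are invented for illustration; the article does not specify the actual format of a Progress test report.

```python
# Hypothetical sketch of the kind of report the Progress tests produce:
# a GSE score plus the learning objectives flagged for further work.
from dataclasses import dataclass, field

@dataclass
class ProgressReport:
    stage: str                             # "beginning", "middle" or "end" of course
    gse_score: int                         # overall score on the 10-90 scale
    objectives_to_improve: list = field(default_factory=list)

def remediation_plan(report):
    """Pair each flagged objective with follow-up work targeting it."""
    return [(obj, f"practice activities for: {obj}")
            for obj in report.objectives_to_improve]

mid_course = ProgressReport(
    stage="middle",
    gse_score=48,
    objectives_to_improve=[
        "Can make simple, direct comparisons between two people or things",
    ],
)
for objective, action in remediation_plan(mid_course):
    print(objective, "->", action)
```

The point of the design is the shared key: because the test and the course materials reference the same learning objective, the remediation can target exactly what the assessment flagged.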

[Figure: Scale + Learning Objectives + Course Material + Testing = Personalisation of Progress – which can inform teachers, motivate learners and support our efficacy vision. Over time all products and services across the Pearson English portfolio will be aligned to the Global Scale of English.]

Anyone who’s tried to learn another language will know that one of the most difficult challenges is staying on track. Seeing real progress in the skills you’ve been working on, step by step, is hugely motivating, whether the goal is a high-stakes test or becoming more confident in English for social reasons. Our research indicates that learners find it empowering to see their progress as it happens, and that assessment across all four skills facilitates a more informed discussion with their teachers. For teachers of English, the GSE ecosystem offers the detail required to create learner-focused syllabuses and courses that reflect learner needs and expectations, and that can be measured in a meaningful way.

Participants at the ELT Journal debate were pretty vocal in their objections to testing – but the focus was on testing as we know it today. Testing might be harmful in some contexts – but this doesn’t mean that the process of assessing is necessarily harmful. In his IATEFL presentation on why teachers should love testing, Jeremy Harmer ended with a rallying cry to teachers to do something about the current state of assessment. “Testing is crucial to what we do. Even if you don’t like it, it’s not going away.” His conclusion? “It’s up to you!” To which I would add: “You are not alone! Some of us working in assessment genuinely have the learner’s best interests at heart.”

Our development of Learning Objectives for different groups of learners (Adults, Young Learners, learners of Professional and Academic English) is ongoing. Pearson English would love to hear from experienced teachers who are interested in getting involved in this exciting project. Please contact us for more information at: [email protected]

This article appeared in the July edition of Modern English Teacher www.modernenglishteacher.com

Mike Mayor is Director of Global Scale of English within Pearson English. In this role, Mike heads up the team developing learning objectives that describe what learners can do at each point on the Global Scale of English. The team also works with Content teams to ensure that the Global Scale of English underpins all new products and services. On leaving university, Mike worked as a teacher of English in France before embarking on a career change and joining the world of publishing as a lexicographer. Mike joined the Longman Dictionary division of Pearson in 2002 and headed the list until his move to the Global Scale of English in 2013.

[Figure: “See progress, make progress” – the GSE scale from 10 to 90 aligned with the CEFR levels <A1, A1, A2, B1, B2, C1 and C2.]