Terry Crooks
Lester Flockton
Technology
Assessment Results
1996
EARU
NATIONAL EDUCATION MONITORING REPORT 5
Technology
Assessment Results
1996
Terry Crooks
Lester Flockton
with extensive assistance from other members of the EARU team:
Lee Baker
Nicole Brown
Robyn Caygill
Linda Doubleday
Liz Eley
Janice McDrury
Miriam Richardson
This report was prepared and published by The Educational Assessment Research Unit, University of Otago, New Zealand
under contract to the Ministry of Education, New Zealand.
©1997 Ministry of Education, New Zealand
ISSN 1174–0000
ISBN 1–877182–06–0
Printed and published in New Zealand by:
Educational Assessment Research Unit
University of Otago
Box 56, Dunedin
New Zealand
Fax (64 3) 479 7550
Email [email protected]
CONTENTS
PREFACE 4
ACKNOWLEDGEMENTS 4
CHAPTER 1 KEY FEATURES OF THE NATIONAL EDUCATION MONITORING PROJECT 5
CHAPTER 2 ASSESSING TECHNOLOGY 9
CHAPTER 3 KNOWLEDGE AND UNDERSTANDINGS 13
Tool time 14
Useful machines 15
Crane 17
Stapler 18
Link tasks 1 & 2 19
CHAPTER 4 DESIGN 20
Gift soap 21
Flag 22
Sports bag 23
Planning a class event 24
Link tasks 3, 4 & 5 25
CHAPTER 5 EVALUATING DESIGN AND DESIGN IDEAS 26
Space game 27
Coloured sheep 28
Green sheep 29
10-in-1 gardening tool 30
Link tasks 6 & 7 31
CHAPTER 6 USING COMPUTERS 32
Mac moves 33
Treasure find 34
Form fill 35
Link task 8 36
CHAPTER 7 TECHNOLOGY SURVEY 37
CHAPTER 8 PERFORMANCE OF SUBGROUPS 40
APPENDIX 1 A DESCRIPTION OF THE NATIONAL EDUCATION MONITORING PROJECT 44
APPENDIX 2 THE SAMPLE OF SCHOOLS AND STUDENTS IN 1996 49
PREFACE

New Zealand’s National Education Monitoring Project commenced in 1993, with the task of assessing and reporting on the achievement of New Zealand primary school children in all areas of the school curriculum. Children are assessed at two class levels: Year 4 (halfway through primary education) and Year 8 (at the end of primary education). Different curriculum areas and skills are assessed each year, over a four year cycle. The main goal of national monitoring is to provide detailed information about what children can do so that patterns of performance can be recognised, successes celebrated, and desirable changes to educational practices and resources identified and implemented.

Each year, small random samples of children are selected nationally, then assessed in their own schools by teachers specially seconded and trained for this work. Task instructions are given orally by teachers, through video presentations, or in writing. Many of the assessment tasks involve the children in the use of equipment and supplies. Their responses are presented orally, by demonstration, in writing, or through submission of other physical products. Many of the responses are recorded on videotape for subsequent analysis.

In 1996, the second year that national monitoring was implemented, four areas were assessed: music, reading, speaking, and aspects of technology. This report presents details and results of the assessments relating to aspects of technology. Because of the breadth of the technology curriculum, and the time-consuming nature of tasks which would be needed to assess some aspects of it, we explicitly limited our focus to selected aspects.

The assessments revealed wide variations in performance on technology tasks. Tasks involving explanation, design, or evaluation of design were generally not very well handled, but students had greater success with tasks involving observation, description and factual knowledge. Most students were able to use computers to carry out simple tasks. Year 8 students performed better than year 4 students on most aspects of the tasks which were common to both levels.

Responses to the Technology Survey revealed generally positive attitudes towards technology as an activity at school. Not surprisingly, the responses reflected various interpretations of what technology is, not all of which match the emphases in the new technology curriculum. Both year 4 and year 8 students reported that the most common technology activity in school involved using computers. Making or designing was the next most common activity reported at both levels. Many year 8 students also mentioned participation in workshop or manual subjects. Despite the emphasis on computer usage in school, regular use of computers was more common at home than at school, for both year 4 and year 8 students. Nevertheless, 15 to 20 percent of students at both levels only used computers when at school.

Girls and boys performed similarly on most tasks at both levels. Students in schools with high proportions of Māori students and/or with low socio-economic decile ratings had lower levels of performance than other students on about 40 percent of the tasks. However, there were fewer tasks on which Māori students performed less well than non-Māori students, suggesting that the differences may be more due to socio-economic factors affecting schools than to ethnicity.
ACKNOWLEDGEMENTS
The Project directors acknowledge the vital support and contributions of many people to this report, including:

• the very dedicated staff of the Educational Assessment Research Unit
• Dr Hans Wagemaker and Mr James Irving, Ministry of Education
• members of the Project’s National Advisory Committee
• members of the Project’s Technology Advisory Panel
• technical consultants, Professor Warwick Elley and Dr Alison Gilmore
• principals, staff, and children of the schools where tasks were trialed
• principals, staff, and Board of Trustees members of the 265 schools included in the 1996 sample
• the 2868 children in the 1996 sample, and their parents
• the 95 teachers who administered the assessments to the children
• the 22 senior tertiary students who assisted with the marking process
• the 150 teachers who assisted with the marking of tasks early in 1997
CHAPTER 1
KEY FEATURES OF THE NATIONAL EDUCATION MONITORING PROJECT
This chapter presents a concise outline of the rationale and operating procedures for national monitoring, together with some information about the responses of the participants in the 1996 assessments. A more extended description of national monitoring, including information about areas to be assessed and task development procedures, is available in Appendix 1 (p44). Detailed information about the sample of students and schools for the 1996 assessments is available in Appendix 2 (p49).
Purpose of National Monitoring
The main purposes for national monitoring in New Zealand are:

➢ to meet public accountability and information requirements by identifying and reporting patterns and trends in educational performance

➢ to provide high quality, detailed information which policymakers, curriculum planners and educators can use to debate and review educational practices and resourcing.

National monitoring fills a gap in the information available about New Zealand primary education. The need for systematic national information on educational outcomes has been highlighted by various government-appointed commissions and working parties over at least the past 35 years.
Monitoring at Two Class Levels
National monitoring provides a “snapshot” of what children can do at two levels in primary and intermediate schools: year 4 (ages 8–9) and year 8 (ages 12–13).
National Sample of Students
National monitoring information is gathered using carefully selected random samples of students, rather than all year 4 and year 8 students. The national samples of 1440 year 4 children and 1440 year 8 children represent about three percent of the children at those levels in New Zealand schools.
Three Sets of Tasks at Each Level
So that a considerable amount of information can be gathered without placing too many demands on individual students, different students attempt different tasks. The 1440 students selected at each level are divided into three groups of 480 students, comprising four students from each of 120 schools.
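The sampling arithmetic above can be sketched as follows. This is a minimal illustration, not part of the Project's own tooling; the counts are the figures quoted in this chapter, and the function names are ours:

```python
# Sketch of the national monitoring sample structure described above.
# Figures from the report: three task groups per year level, each made up
# of 4 students from each of 120 schools.
TASK_GROUPS = 3
SCHOOLS_PER_GROUP = 120
STUDENTS_PER_SCHOOL = 4

def students_per_group() -> int:
    """Students attempting any one set of tasks: 120 schools x 4 students."""
    return SCHOOLS_PER_GROUP * STUDENTS_PER_SCHOOL  # 480

def total_sample_per_level() -> int:
    """Full national sample at one year level: 3 groups of 480."""
    return TASK_GROUPS * students_per_group()  # 1440

print(students_per_group(), total_sample_per_level())  # 480 1440
```

Since the report states the 1440 students are about three percent of each year level, this design implies a national cohort of very roughly 48,000 students per year level.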
Timing of Assessments
The assessments take place in the second half of the school year, between August and November. The year 8 assessments occur first, over a five week period. The year 4 assessments follow, over a similar period. Each student participates in about four hours of assessment activities spread over one week.
Specially Trained Teacher Administrators
The assessments are conducted by experienced teachers, usually working in their own region of New Zealand. They are selected from a national pool of applicants, attend a week of specialist training in Wellington led by senior Project staff, and then work in pairs to conduct assessments of 60 children over five weeks. Their employing school is fully funded by the Project to employ a relief teacher during their secondment.
6 NEMP Report 5: Technology 1996
Four Year Assessment Cycle
Each year, the assessments cover about one quarter of the national curriculum for primary schools. The first four year cycle of assessments began in 1995 and will be completed in 1998. Similar cycles of assessment are expected to be repeated in subsequent four year periods, with about one third of the tasks kept constant from one cycle to the next. This re-use of tasks allows trends in achievement across a four year interval to be observed and reported.

Important Learning Outcomes Assessed
The assessment tasks emphasize aspects of the curriculum which are particularly important to life in our community, and which are likely to be of enduring importance to students. Care is taken to achieve balanced coverage of important skills, knowledge and understandings within the various curriculum strands, but without attempting to slavishly follow the finer details of current curriculum statements. Such details change from time to time, whereas national monitoring needs to take a long-term perspective if it is to achieve its goals.

Wide Range of Task Difficulty
National monitoring aims to show what students know and can do. Because children at any particular class level vary greatly in educational development, tasks spanning multiple levels of the curriculum need to be included if all children are to enjoy some success and all children are to experience some challenge. Many tasks include several aspects, progressing from aspects which most children can handle well to aspects which are less straightforward.

Engaging Task Approaches
Special care is taken to use tasks and approaches which interest students and stimulate them to do their best. Students’ individual efforts are not reported and have no obvious consequences for them. This means that worthwhile and engaging tasks are needed to ensure that students’ results represent their capabilities rather than their level of motivation. One helpful factor is that extensive use is made of equipment and supplies which allow students to be involved in “hands-on” activities. Presenting some of the tasks on video also allows the use of richer stimulus material, and standardizes the presentation of those tasks.

Positive Student Reactions to Tasks
At the conclusion of each assessment session, students complete evaluation forms in which they identify tasks which they particularly enjoyed and tasks which did not appeal. Averaged across all tasks in the 1996 assessments, 72 percent of year 8 students indicated that they particularly enjoyed the tasks, and only 16 percent indicated that they did not enjoy the tasks. Year 4 students were even more positive about the tasks. On average, 75 percent of them indicated that they particularly enjoyed the tasks, and only 12 percent indicated that they did not enjoy the tasks. Reactions to the tasks and assessment approaches from the students’ parents and teachers were also very positive. The most popular individual task was enjoyed by 96 percent of year 4 students. Only one task (involving individual singing for year 8 students) had more students disliking it than liking it.

Appropriate Support for Students
A key goal in Project planning is to minimise the extent to which student strengths or weaknesses in one area of the curriculum might unduly influence their assessed performance in other areas. For instance, skills in reading and writing often play a key role in success or failure in paper-and-pencil tests in areas such as science, social studies, or even mathematics. In national monitoring, a majority of tasks are presented orally by teachers or on videotape, and most answers are given orally or by demonstration rather than in writing.
Where reading or writing skills are required to perform tasks in areas other than reading and writing, teachers are happy to help students to understand these tasks or to communicate their responses. Teachers are working with no more than four students, so are readily available to help individuals.

To further free teachers to concentrate on providing appropriate guidance and help to students, so that the students achieve their best efforts, teachers are not asked to make notes or judgements on the work the students are doing. All marking and analysis is done later, when the students’ work has reached the Project office in Dunedin. Some of the work comes on paper, but much of it arrives recorded on videotape. In 1996, three quarters of the students’ work came in that form, on a total of 4200 videotapes. The video recordings give a detailed picture of what both the student and teacher did and said, allowing rich analysis of both process and task achievement.
Three Task Approaches Used
In 1996 three task approaches were used. Each student was expected to spend about an hour working in each of the first two formats and two one hour sessions working in the third format. The three approaches were:

➢ One-to-one interview. Each student worked individually with a teacher, with the whole session recorded on videotape.

➢ Stations. Four students, working independently, moved around a series of stations where tasks had been set up. This session was not videotaped.

➢ Team. Four students worked collaboratively, supervised by a teacher, with the whole session recorded on videotape.
Professional Development Benefits for Teacher Administrators
The teacher administrators reported that they found their training and assessment work very stimulating and professionally enriching. Working so closely with interesting tasks administered to 60 children in at least five schools offered valuable insights. Several teachers have reported major changes in their teaching and assessment practices as a result of their experiences working with the Project. Given that 95 teachers served as teacher administrators in 1996, or about half a percent of all primary teachers, the Project is making a major contribution to the professional development of teachers in assessment knowledge and skills. This contribution will steadily grow, since preference for appointment each year is given to teachers who have not previously served as teacher administrators. The total after two years is 187 teachers.

Marking Arrangements
The marking and analysis of the students’ work occurs in Dunedin. Most of the tasks which can be marked objectively or with modest amounts of professional experience are marked by senior tertiary students, most of whom have completed three or more years of preservice preparation for primary school teaching. The student markers for the 1996 tasks were employed five hours per day for periods ranging between 5 weeks and 9 weeks.

The tasks which required higher levels of professional judgement are marked by teachers, selected from throughout New Zealand. In 1996 approximately two thirds of the teachers who applied were appointed: a total of 150. Most teachers worked either mornings or afternoons for one week. Their ratings of the experience were overwhelmingly positive, with 87 percent stating emphatically that the experience was “professionally satisfying and interesting”.
SIZE OF SCHOOL CATEGORIES
Year 4 schools: less than 20 year 4 students; 20–35 year 4 students; more than 35 year 4 students.
Year 8 schools: less than 35 year 8 students; 35–150 year 8 students; more than 150 year 8 students.
Analysis of Results
The results are analysed and reported task by task. Although the emphasis is on the overall national picture, some attention is also given to possible differences in performance patterns for different demographic groups and categories of school. The variables considered are:
➢ Student gender: male, female.
➢ Student ethnicity: Māori, non-Māori.
➢ Geographical zone: Greater Auckland, other North Island, South Island.

➢ Size of community: urban area over 100,000, community of 10,000 to 100,000, rural area or town of less than 10,000.

➢ Socio-economic index for the school: bottom three deciles, middle four deciles, highest three deciles.

➢ Percent of Māori children in the school: less than 10 percent, 10 to 30 percent, more than 30 percent.

➢ Percent of Pacific Island children in the school: less than 5 percent, 5 percent or more.
➢ Size of school: three size categories, defined separately for year 4 and year 8 schools (see the categories listed above).
➢ Type of school (for year 8 sample only): full primary school, intermediate school (some students were in other types of schools, but too few to allow separate analysis).
Each of the categories listed above included at least 16 percent of the children. Categories containing fewer children, such as Asian students or female Māori students, were not used because the resulting statistics would be based on the performance of fewer than 75 children, and would therefore be too unreliable.
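The 16 percent threshold can be checked against the group sizes given earlier in this chapter. The sketch below is ours, not the Project's; the helper name is hypothetical, but the figures are the report's: 16 percent of a 480-student task group is about 77 children, just above the stated floor of 75.

```python
# Check of the subgroup-size rule described above: each reported category
# must include at least 16% of the children, so that per-task statistics
# rest on roughly 75 or more students (figures from the report).
GROUP_SIZE = 480      # students attempting any one set of tasks
MIN_FRACTION = 0.16   # smallest category proportion that is reported
MIN_CHILDREN = 75     # reliability floor stated in the report

def category_size(fraction: float, group: int = GROUP_SIZE) -> int:
    """Approximate number of children a category of the given proportion contributes."""
    return round(fraction * group)

# 16% of 480 is about 77 children, which clears the floor of 75.
assert category_size(MIN_FRACTION) >= MIN_CHILDREN
```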
Funding Arrangements
National monitoring is funded by the Ministry of Education, and organised by the Educational Assessment Research Unit at the University of Otago, under the direction of Dr Terry Crooks and Lester Flockton. The current contract runs until 1999. The cost is just under $2 million per year, less than one tenth of a percent of the budget allocation for primary and secondary education. Half of the funding is used to pay for the time and expenses of the primary school teachers who assist with the assessments as task developers, teacher administrators or markers.
Review by International Scholars
In June 1996, three scholars from the United States and England, with distinguished international reputations in the field of educational assessment, accepted an invitation from the Project directors to visit the Project. They conducted a thorough review of the progress of the Project, with particular attention to the procedures and tasks used in 1995 and the results emerging. At the end of their review, they prepared a report which concluded as follows:

The National Education Monitoring Project is well conceived and admirably implemented. Decisions about design, task development, scoring, and reporting have been made thoughtfully. The work is of exceptionally high quality and displays considerable originality. We believe that the Project has considerable potential for advancing the understanding of and public debate about the educational achievement of New Zealand students. It may also serve as a model for national and/or state monitoring in other countries.
(Professors Paul Black, Michael Kane & Robert Linn, 1996)
CHAPTER 2
ASSESSING TECHNOLOGY
Technology is a universal and age-old human activity. ... The technologies used today have built on the ingenuity, traditions, observation, and knowledge of people who, throughout history, have sought to improve their lives, solve problems, and satisfy their needs and wants.
Technology in the New Zealand Curriculum
Technology in the New Zealand Curriculum
Technology became a learning area in its own right with the formulation of the New Zealand Curriculum Framework (1993). The new curriculum statement for technology introduced a much wider domain of learning than the 1986 Workshop Craft Syllabus it replaced. It also embraced a much broader perspective of the subject than typically understood or recognised by many in the community. The new technology curriculum is for all students at all levels of the school, junior primary through to senior secondary.
Aim of Technology Education
The three-fold aim of technology education in the national curriculum is to enable students to achieve technological literacy through the development of:

➢ technological knowledge and understanding;

➢ technological capability;

➢ understanding and awareness of the relationship between technology and society.

The three parts of the aim are interrelated; the intention is that they be treated holistically rather than taught as three separate entities. For National Monitoring purposes, the three parts provide a useful basis for an assessment framework.
Technological Knowledge, Understandings and Skills
Technology education is broad in its scope, yet is quite focussed in the ways that knowledge, understandings and skills are acquired and used by students.

A technology education in the New Zealand curriculum is specifically about:

➢ investigating, using and understanding technologies;

➢ gaining knowledge of technological principles and processes;

➢ exploring needs and opportunities that could benefit from creative and scientific technological activity;

➢ creating, designing, planning, trying and evaluating ideas to improve or modify existing products and processes;

➢ using materials, tools and equipment skilfully and safely;

➢ recognising the connections between technology and society in time and place.
10 NEMP Report 5: Technology 1996
Areas of Technology
The areas of technology within which students develop their knowledge, understandings and skills embrace a great deal of personal, cultural, environmental and economic activity. Biotechnology, for example, involves the use of living systems and organisms; materials technology includes the investigation, use and development of materials such as wood, textiles, metals and fuels; information and communication technology covers a complex range of processes, equipment and devices that enable the management and use of numerous forms of data and information.

Design, including the processes of specification, development and testing of ideas, is central to all areas of technology. In technology education students plan, make, modify, maintain, use, evaluate and improve products, systems and environments.
Aspects of Technology: Assessment
Technology is a multi-disciplinary activity. Its extensive cross-curricular possibilities reflect its pervasiveness throughout the world in which we live as individuals, groups and societies. To attempt to represent all or even most of the areas, meanings and applications of technology within the National Monitoring assessment programme would be unrealistic.

After careful examination of the construct of the technology curriculum, it was decided to assess some key aspects, with a particular focus on the knowledge, understandings and skills listed above. Selected areas of content (e.g. electronics, materials, information and communication technology) and the broad overlapping contexts (e.g. personal, home, school, community) have been chosen as a means of investigating the processes students use and the ideas they have, rather than as ends in themselves. For National Monitoring purposes, it is neither necessary nor desirable to cover every possible area of content or explore every major context.

For the purposes of this report, the tasks have been grouped into four sections: Knowledge and Understanding; Design; Evaluating Designs and Design Ideas; Using Computers.
Framework for National Monitoring Assessment
National monitoring task frameworks are developed by the project’s curriculum advisory panels. These frameworks have two key purposes. They provide a valuable guideline structure for the development or selection of tasks, and they bring into focus those important dimensions of the learning domain that are arguably the basis for valid analyses of students’ skills, knowledge and understandings.

The frameworks are organising tools which interrelate content with strategies, skills and processes. They are intended to be flexible and broad enough to encourage and enable the development of tasks that lead to meaningful descriptions of what students know and can do. They also help to ensure a balanced representation of important learning outcomes.

The technology framework has a central organising theme supported by interrelated aspects.

The theme, “Knowing about technology in society and using opportunities to solve problems and meet needs in contexts appropriate to the student’s world of experience”, is consistent with New Zealand’s new technology curriculum and sets the broad context for tasks.
Chapter 2: Assessing Technology 11
Examples of Contexts suggest that technological activities are carried out in a variety of broad, overlapping situations which range from personal to public experience.
The aspects titled Knowledge and Understandings and Abilities and Skills highlight the learning that students could be expected to demonstrate while engaged with Areas of Content and Contexts. The knowledge, understandings and skills are highly interrelated both within each aspect and across the total framework.

The Motivation and involvement aspect of the framework directs attention to the importance of having information about students’ technological interests, attitudes, confidence and involvement, both within and beyond the school setting. Educational research and practice confirm the impact of student motivation and attitudes on achievement and learning outcomes.
FRAMEWORK FOR ASPECTS OF TECHNOLOGY
Knowing about technology in society and using opportunities to solve problems and meet needs in contexts appropriate to the student’s world of experience.
CONTEXTS
• PERSONAL: food, clothing, toys, transportation, communication
• HOME: storage, security, heating
• SCHOOL: playground, library, foodshop
• RECREATION: sports, games, technology, clubs
• COMMUNITY: traffic, waste management, fire service
• INDUSTRY AND BUSINESS: buying, selling, marketing, manufacturing
KNOWLEDGE AND UNDERSTANDINGS
• Technological principles
• Properties of substances
• Tools and devices
• Systems
• Environments
• Interactions between technology and society
ABILITIES AND SKILLS
• Recognising needs and opportunities for technological solutions.
• Generating possible solutions and related strategies. Selecting, developing, and adapting solutions.
• Managing time, human and physical resources to produce technological models, products, systems or environments.
• Communicating decisions, strategies and outcomes.
• Evaluating decisions, strategies and outcomes.
MOTIVATION
• Enthusiasm for knowing about and exploring technology.
• Voluntary engagement in technology activities.
• Confidence and willingness to try new ideas.
• Perceptions about appropriate and inappropriate uses of technology.

AREAS OF CONTENT
• Biotechnology: Using living systems or organisms to manipulate chemical or natural biological processes in order to develop products.
• Materials Technology: Using and developing materials, understanding their properties and characteristics, and knowing their appropriate uses.
• Electronics and Control (Automation): Knowledge and use of electronic systems and devices, as well as their design, construction and production.
• Production and Process (Systems): Chemical, industrial and transportation systems and processes.
• Structures and Mechanisms: Simple mechanical devices to large complex structures.
• Information Technology: Systems for the collection, structuring, manipulation, retrieval, and communication of information.
• Food Technology: Producing, preparing, and storing food; packaging and marketing.
The Choice of Technology Tasks for National Monitoring
The choice of technology tasks for national monitoring is guided by a number of educational and practical considerations. Uppermost in any decisions relating to the choice or administration of a task is the central consideration of validity and the effect that a whole range of decisions can have on this prime attribute. Tasks are chosen because they provide a good representation of important dimensions of a technology education, but also because they meet a number of requirements to do with their administration and presentation. These requirements are discussed in Appendix 1 (p44).
National Monitoring Technology Assessment Tasks
Twenty-two technology tasks were administered, using three different approaches. Eight were administered in one-to-one interview settings, where students used materials and visual information. Five tasks were presented in team situations involving small groups of students working together. Nine tasks were attempted in a stations arrangement, where students worked independently on tasks which involved reading information, manipulating equipment and supplies, and recording responses on paper or computer.

Eleven of the twenty-two tasks were the same for both year 4 and year 8, seven were unique to either year 4 or year 8, and there were also two pairs of tasks which were very similar for year 4 and year 8 students.

Link Tasks
Fourteen tasks are released through this report. The remaining eight tasks will be used again in the second cycle of assessments, in the year 2000. These link tasks will provide a basis for comparison of performance over time. Although first cycle results for link tasks are given in this report, more detailed descriptions are not given, to help avoid biasing the results in 2000.
National Monitoring Technology Survey
In addition to the assessment tasks, a section of the total sample of students completed an interview questionnaire which investigated their interests, attitudes and involvement in technological activities.
Marking Methods
The individual and team responses produced by the students were assessed using specially designed marking procedures. Responses requiring high levels of professional judgement were marked by experienced teachers, who worked in pairs when marking team tasks. Tasks that required marker judgement and were common to year 4 and year 8 were intermingled during marking sessions, with the goal of ensuring that the same scoring standards and procedures were used for both. The criteria used in the marking had been developed in advance by Project staff, but were sometimes modified as a result of issues raised during the marking.

When the marking for each task commenced, all markers gathered to be introduced to the task and the marking criteria. They then collectively marked two or three performances, discussing discrepancies between the marks awarded. In this way, the meaning of the criteria and the standards to be applied were determined collectively by the 20 markers and the project staff member leading the session. Once good consistency had been achieved, the markers marked performances individually or in pairs, periodically being brought back together to collectively mark a few performances and discuss any discrepancies which were apparent. This process provided assurance that adequately consistent marking was being achieved.

Task by Task Reporting
National monitoring assessment is reported task by task so that results can be understood in relation to what the students were asked to do.
CHAPTER 3
KNOWLEDGE AND UNDERSTANDINGS
The assessments included six tasks which allowed students to demonstrate their knowledge and understanding about the make-up and operation of technological devices. One task also asked students to look at historical changes in technology and their effects on lifestyle.
Four of the tasks were the same for year 4 and year 8 students. One taskwas given only to year 4 students and one task only to year 8 students.
Two of the tasks are link tasks (tasks which will be used again in the year 2000 and therefore are not described in detail here). One of these was attempted only by year 8 students, the other was attempted by both year 4 and year 8 students. The remaining four tasks are released tasks, described fully here.
This chapter presents the tasks and results in the following order:
➢ three released tasks attempted by both year 4 and year 8 students;
➢ a released task attempted only by year 4 students;
➢ the two link tasks.
Averaged across all task components which were common to year 4 and year 8 students and which asked students to explain technological features, 20 percent of year 4 and 35 percent of year 8 students gave responses which were rated as strong. It is not surprising to note that students had greater success reporting observations and factual knowledge than giving good explanations for design features and processes.
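The "averaged across all task components" figures in summaries like the one above are simple arithmetic means of the per-component percentages. A minimal sketch of that calculation; the function name and the component values below are illustrative inventions, not the report's data:

```python
# Sketch of the averaging used in the chapter summaries: headline figures
# are arithmetic means of per-component percentages.
# The component values are made up for illustration, NOT the report's data.

def mean_percent(component_percentages):
    """Arithmetic mean of per-component percentages, rounded to a whole percent."""
    return round(sum(component_percentages) / len(component_percentages))

# hypothetical "strong" ratings on three task components common to both years
year4_strong = [10, 18, 26]
year8_strong = [30, 34, 44]

print(mean_percent(year4_strong))  # 18
print(mean_percent(year8_strong))  # 36
```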
NEMP Report 5: Technology 1996

Tool Time
Approach: One to one. Level: Year 4 and year 8.
Resources: Metal can-opener with twisted handles; small can of spaghetti.
(Results are % responses, year 4 then year 8.)
In this activity I want you to try to explain how something works.
Give student the can opener. Have a look at this piece of equipment.
1. What is this used for?
2. Do you have one like this at home?
3. Have you used one of these?
If the student doesn’t know, tell them it is a can opener. Give the student the can.
4. Here is a can. Can you explain to me how the can opener works without opening the can?
5. Why do you think the flat piece of metal has been twisted?
6. Have you any ideas on how this can opener could be improved so that when people see one in shops they would really like to buy one?
Knew use 95 100
Yes 73 76
Have used one 77 89
Explanation of use
full and clear 15 35
partial 52 50
inadequate 33 15
Reason for twisted handles 74 88
Good ideas for improvement
operational 39 59
decorative 47 60
Useful Machines
Approach: One to one. Level: Year 4 and year 8.
Resources: 2 photographs of washing machines.
Show photograph of old washing machine.
1. Do you know what this is? Tell me how you think the washing is done in this washing machine.
Prompt if necessary: This is a washing machine that was used more than 50 years ago. The water was heated in here by lighting a fire underneath. Soap was put in the water and the clothes were added later. A large stick was used to stir the clothes. The clothes were lifted out of here with the stick and put in cold water in here. Then the clothes were put through the wringer to squeeze some of the water out of them.
Show photograph of new washing machine.
2. Do you know what this is? Tell me how you think the washing is done in this washing machine.
Prompt if necessary: This is a washing machine that is used today in some people’s homes. The clothes are put in the top, some soap is put in the machine, several buttons are pressed and the machine washes the clothes. It also spins the clothes around to take some of the water out of them.
Let’s look at both pictures together.
3. What is different about how the washing is done in these 2 machines?
Prompt if necessary: What is different about the machines? What is different about how they’re used?
4. Washing machines have changed many times since people used this old washing machine. How have these changes in machines changed people’s lives?
(Results are % responses, year 4 then year 8.)
Identified old machine 41 57
Identified new machine 99 100
Ability to describe differences in washing procedures: strong 14 35, moderate 64 54, low 22 11
Ability to describe effects on people’s lives: strong 7 27, moderate 45 51, low 48 22
Crane
Approach: One to one. Level: Year 4 and year 8.
Resources: Board with cogs, handles, and drum for string with hook; weight to be lifted on hook.
(Results are % responses, year 4 then year 8.)
In this activity you will be looking at a model and telling me how it works.
On this board there are two black handles.
Show student the two handles. Without touching the board, tell me which handle you would turn to make the string move up more quickly, and which handle you would turn to make the string move up more slowly.
Correctly predicted faster handle 44 48
Now try turning the handles and see which one makes the string move up more quickly.
Teacher may need to hold the board steady for the student at the edge of the desk so that it doesn’t tip.
Correctly observed faster handle 64 72
Tell me why you think that turning the big cog makes the string move up more quickly than turning the little cog.
Point to particular cogs. Now we have a heavy weight to lift on the hook. Which handle do you think it would be best to turn to make it easiest to lift the weight? Tell me without trying it.
Why do you say that?
Now have a go at trying to lift the weight by turning the handles.
Have you changed your mind about what you think?
Explanation for faster handle
clear 6 24
vague 41 37
no idea 53 39
Commentary
Difficulty was experienced with the equipment for this task. Friction and wobbling cogs led to unreliable results for the second part of the experiment (related to lifting the weight). Results for that part are not reported.
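For reference, the relationship the two questions probe can be written down; this is an editorial sketch, not part of the task materials, and the tooth counts $N$ and turn counts $n$ are generic symbols rather than measurements from the equipment. For two meshed cogs, turns are inversely proportional to tooth counts:

```latex
n_{\mathrm{drum}} \, N_{\mathrm{drum}} = n_{\mathrm{handle}} \, N_{\mathrm{handle}}
\quad\Longrightarrow\quad
n_{\mathrm{drum}} = n_{\mathrm{handle}} \cdot \frac{N_{\mathrm{handle}}}{N_{\mathrm{drum}}}
```

A handle on the larger cog (greater $N_{\mathrm{handle}}$) therefore winds the drum further per turn and raises the string faster, while the turning force required grows by the same ratio, which is why the smaller cog makes a heavy weight easier to lift.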
Stapler
Approach: Station. Level: Year 4 only.
Resources: Stapler (opened to show mechanism); response sheet.
Look at this stapler and see how it works. You can open it up and you can make it work.
The arrows on this answer sheet point to parts of the stapler.
In each box, write down how that part helps the stapler work.
Adequacy of explanation of function of:
staple pusher: strong 17, moderate 37, weak 46
top arm: strong 22, moderate 28, weak 50
spring: strong 10, moderate 17, weak 73
staple outlet: strong 44, moderate 11, weak 45
stapler plate: strong 21, moderate 9, weak 70
Overall rating: very high 1, quite high 9, moderate 39, low 51
Link Tasks

LINK TASK 1
Approach: One to one. Level: Year 4 and year 8.
Resources: Assembled object.
Task: To explain how the object had been made.

Results (% responses, year 4 then year 8):
Explanation of planning stage: strong 4 17, moderate 29 38, weak 67 45
Explanation about selection of materials: strong 5 12, moderate 49 49, weak 46 39
Explanation about cutting/shaping components: strong 2 9, moderate 27 41, weak 71 50
Explanation about assembly of components: strong 3 16, moderate 52 55, weak 45 29
Overall rating: very high 0 2, quite high 6 18, moderate 46 47, low 48 31

LINK TASK 2
Approach: Station. Level: Year 8 only.
Resources: Circuit board with wires and electrical components.
Task: To wire the circuit board to achieve three requested functional arrangements.

Results (% responses, year 8):
Achieved arrangement 1: 48
Achieved arrangement 2: 8
Achieved arrangement 3: 37
CHAPTER 4: DESIGN
The assessments included seven tasks which allowed students to demonstrate their capabilities in technological design.

Two of these tasks were identical for year 4 and year 8 students, one was attempted only by year 4 students, and two were attempted only by year 8 students. The final two tasks were similar in nature and marked to the same criteria, but there were significant differences in the stimulus materials for the year 4 and year 8 students.

One task which was identical for year 4 and year 8 students, and the pair of tasks which were similar for year 4 and year 8 students, are link tasks (to be used again in the year 2000), and so are not described in detail here. The other four tasks are released tasks for which full details are given.
This chapter presents the assessment tasks and results in the following order:
➢ the released task attempted by both year 4 and year 8 students;
➢ the released tasks attempted by either year 4 or year 8 students;
➢ the three link tasks.
Averaged across all task components attempted by both year 4 and year 8 students, including the four components for the very similar Link tasks 4 and 5, only 6 percent of year 4 students achieved “strong” performance ratings, compared to 21 percent of year 8 students. Using a less demanding criterion, on average 44 percent of year 4 students achieved “strong” or “moderate” performance ratings, compared to 68 percent of year 8 students. These results reveal substantial growth in design skills between year 4 and year 8.
Gift Soap
Approach: Station. Level: Year 4 and year 8.
Resources: 4 round cakes of soap, scissors, ruler, Blu-Tack, A3 white cardboard, response sheet.
(Results are % responses, year 4 then year 8.)
In this activity you will be designing shapes for packaging 4 soaps together for presents.
We could draw 2 soaps in a box like this: It would look like this from the side view:
1. Sketch 3 different shaped packages that would hold the 4 soaps.
Draw the packages here with the 4 soaps in them like the first picture.
The 4 soaps should be touching each other and the sides of the box so they don’t move around.
Number of packages drawn 3 or more 65 79
2 13 9
1 20 8
none 1 4
Appropriateness of designs strong 13 37
moderate 42 41
weak 45 22
We could draw the shape we will cut out of the cardboard to make our box like this:
2. Draw the shapes here that you would cut out of the cardboard to make your 3 packages. You don’t need to draw a lid or a top for your package.
Appropriateness of nets strong 5 28
moderate 39 41
weak 56 31
3. Now make up one of the shapes you drew in question 2 with the cardboard.
Make it big enough to hold the 4 soaps.
Appropriateness of model strong 10 38
moderate 30 38
weak 60 24
Flag
Approach: Station. Level: Year 4 only.
Resources: Stick, piece of material, scissors, cellotape, stapler, drawing pins, glue stick, string, response sheet.
You are going to make a flag that is strong. You can use all the supplies. You have 15 minutes to do this task.
1. Make your flag look like this:
Shape matched picture 82
2. Now test your flag to see if it is strong.
First, wave your flag.
Second, pull the flag hard. Pull it like this:
Was your flag strong? yes / no
Student said flag was strong 93

4. Did it come apart? yes / a little bit / no
Student said flag did not come apart at all 83
Marker quality rating for construction: very high 0, quite high 5, moderate 40, poor 55

5. What could you do to make your flag stronger?
Merit of ideas for making flag stronger: very high 0, quite high 1, moderate 53, poor 46

6. What other things could you do to make the flag better?
Merit of other ideas for improving the flag: very high 0, quite high 1, moderate 46, poor 53

Commentary
Students had little success in producing a flag which was strong and also looked appropriate. Few students had very good ideas for improving their flags.
Sports Bag
Approach: Station. Level: Year 8 only.
Resources: Video showing equipment to be carried in sports bag; pencil; answer sheet; bag of material samples marked A–G (polar fleece; parka nylon; canvas; cotton rugby knit; polyester; Oxford nylon; microfibre suede crepe).
In this activity you will be designing a really useful sports bag that will hold these things: swimming togs and towel; running shoes; shorts and t-shirt; a softball and glove; a bike helmet.
Play the video. In the clip the students are posed with the problem. A child has a range of sports equipment which she is carrying around in a plastic bag. She needs a sports bag which will hold the sports gear in a suitable way. Each item to be contained in the sports bag is shown. The students are asked to do the things outlined on their answer sheet.
1. Write down all of the things you would need to think about when designing the sports bag.
Merit of list: strong 22, moderate 49, weak 29
2. Draw a design for the sports bag in the box. Write words to label the different parts or features.
Merit of design drawn: strong 24, moderate 52, weak 24
3. Look at the materials you can choose from. Choose what you think would be the best materials for the bag. Write the letters of the materials you would choose, and write down why you chose those materials.
Merit of reasons for choice of material: strong 20, moderate 50, weak 30
4. How would you check that your sports bag is going to be a good one?
Merit of ideas for checking: strong 10, moderate 47, weak 43
Planning a Class Event
Approach: Team. Level: Year 8.
Resources: Prompt sheet, A3 answer sheet, marker pens.
Imagine that next week a class is visiting your school from another school, and you are going to make lunch for them. You will need to make enough food for 60 students. You will have to think about what you will choose to make them for lunch and how you will organise a group of people to help you. You will also need to think about how you will make the lunch look and taste interesting. There is a cooking room you can use, but the teachers say you will have to do everything yourself, and that you’re not to buy ready-made food. You won’t have lots of money so try to use it wisely when you choose the food.

Your team’s job is to work out a plan that will show everything you will need to do.

Here is a card that lists the things you will need to do. I’ll read it through with you, and then you can start. There is paper and a felt pen for making your plan.
Read the card titled “Planning a Class Event” with the group.
Planning a Class Event
Work together as a team to plan the special lunch.
1. Talk about the things you will need to plan.
Things to think about:
• The food you will have.
• How much food to get.
• Where you will get the food from.
• How the food will be prepared.
• How you will present the food so that it is safe and interesting.
• Who will be involved and what they will do.
2. Make a plan to show what you will do. Use the answer sheet to write down your plan.
Choice of food: strong 40, moderate 47, weak 13
Quality of food: strong 40, moderate 31, weak 29
Supplier of food: strong 19, moderate 52, weak 29
Food preparation: strong 20, moderate 43, weak 37
Food presentation: strong 21, moderate 43, weak 34
Personnel planning: strong 15, moderate 46, weak 39
Overall: very high 7, quite high 29, moderate 43, low 21
Link Tasks

LINK TASK 3
Approach: One to one. Level: Year 4 and year 8.
Resources: Patches of material, various fasteners, photograph, answer sheet.
Task: To select appropriate material and fastening method, and explain the choices.

Results (% responses, year 4 then year 8):
Explanation 1: strong 8 18, moderate 59 70, weak 33 12
Explanation 2: strong 9 14, moderate 43 62, weak 48 24

LINK TASK 4
Approach: Station. Level: Year 4 only.
Resources: Video cassette; answer sheet; pencil.
Task: To design a device to achieve a particular purpose, identifying and taking into account key design considerations.

Results (% responses, year 4):
Identify problems with initial design: strong or moderate 13
Identify elements needed in new design: strong or moderate 40
Merit of design elements included in drawing: strong or moderate 46
Identify “hard to make” items in design: strong or moderate 37

LINK TASK 5
Approach: Station. Level: Year 8 only.
Resources: Video cassette; answer sheet; pencil.
Task: To design a device to achieve a particular purpose, identifying and taking into account key design considerations.

Results (% responses, year 8):
Identify problems with initial design: strong or moderate 41
Identify elements needed in new design: strong or moderate 65
Merit of design elements included in drawing: strong or moderate 72
Identify “hard to make” items in design: strong or moderate 51

Commentary: This task was very similar to Link task 4, and was marked using the same criteria.
CHAPTER 5: EVALUATING DESIGNS AND DESIGN IDEAS
The assessments included six tasks which offered the students opportunities to show their skill in evaluating designs and design ideas.

Two of the six tasks were identical for year 4 and year 8 students. Two more tasks (one for year 4 students and the other for year 8 students) were very similar and were marked using the same criteria and standards. The remaining two tasks were used at one level only: one with year 4 students and the other with year 8 students.

Two of the tasks are link tasks (to be used again in the year 2000), and therefore are not described in detail here. One of these tasks was used at both class levels, the other only at year 4. The other four tasks are released tasks for which full details are given.
This chapter presents the assessment results in the following order:
➢ the released task attempted by both year 4 and year 8 students;
➢ the pair of similar released tasks, one attempted by year 4 students and the other by year 8 students;
➢ the two link tasks.
Student performance in evaluating designs and design ideas was not strong. On many task components, less than a third of year 8 students performed at levels variously rated as “strong”, “very high”, “quite high”, “full” or “quite full”. Averaged across all task components which were identical or very similar for year 4 and year 8 students, 23 percent of year 8 students performed at high or quite high levels, compared to 7 percent of year 4 students. At the other end of the scale, on these same task components an average of 30 percent of year 8 students performed at low levels, compared to an average of 50 percent of year 4 students. Substantial progress between year 4 and year 8 is evident from these figures.
Space Game
Approach: Team. Level: Year 4 and year 8.
Resources: Game board; set of Red Alert cards (red); set of Direction cards (blue); video.
(Results are % responses, year 4 then year 8.)
In this activity you will be playing a game and thinking about ways to improve the game. You will also be thinking up questions to ask other people what they thought about the game.
Watch the video which explains how to play the game.
Imagine that the makers of the game want you to tell them how they can make it better. They want the game to be fun for people who are 6 years or older. They also want children to learn about directions of north, south, east and west. Take 5 minutes to play the game. As you are playing, think about ideas you have to improve the game.
Give students 5 minutes to play the game.
1. Did you enjoy playing the game?
Discuss your ideas for making the game more fun. When you have finished your discussion, I will get you to choose your four best ideas. You will have 5 minutes to do this.
Give the students 5 minutes for discussion. Then ask them to choose their four best ideas.
Quality of ideas for making the game more fun Strong 6 22
Moderate 55 67
Weak 39 11
4. Imagine that you have been asked by the makers of the game to find out whether other children think the game is fun. Discuss how you would find out if they like the game. Also discuss how you could find out if other children think the game needs any improvements. You have 5 minutes to discuss this as a group.
Give the students 5 minutes for discussion. Then ask them to report on their ideas.
Quality of plan to find out if other children think the game is fun, and to collect ideas for improvement Very high 1 6
Quite high 13 29
Moderate 38 50
Low 48 15
Coloured Sheep
Approach: Team. Level: Year 4 only.
Resources: Video showing coloured sheep moving around; photograph showing coloured sheep; two laminated sheets with discussion ideas; A3 answer sheets; marker pens.
Commentary
The discussion of Idea 1 was intended only to allow practice of the “good point/bad point” technique. All marking was based on the discussion of Idea 2. This task is very similar to the year 8 task Green Sheep, which was marked using the same criteria and standards.
In this activity your team is going to think up the good points, the bad points, and interesting points about an idea. I want you to discuss them together and write the ideas on a chart.
The first idea is that children should only go to school in the morning, not the afternoon.
Give children the sheet with idea 1 written on it. First of all, think of the GOOD points of this idea. When your team has finished discussing the good points write them down.
Allow time.
Now discuss the BAD points about the idea that children should only go to school in the morning. When your team has finished discussing the bad points write them down.
Allow time.
Now think of the INTERESTING points of the idea (ideas that aren’t good or bad). When your team has finished discussing the interesting points write them down.
Allow time.
Now we’ll do a second idea. Here is the idea.
New Zealand earns a lot of money by selling wool to other countries. Before the wool is used for making carpets, clothes and fabrics, it is usually dyed different colours. If sheep were born with coloured wool it wouldn’t need to be dyed, and we would save money.
We will watch a short video about this.
Play the video. The video shows a flock of sheep. Each sheep has wool of a different colour: yellow, green, red, blue. At the beginning of the clip they are seen in a paddock on a farm. As the clip progresses they are brought into a pen and are on display in a township.
Give students the activity sheet idea 2 and the picture.
Think about the GOOD points of this idea. When your team has finished discussing the good points write them down.
Allow time.
Merit of good points: strong 6, moderate 42, weak 52
Now think of the BAD points of the idea that sheep should be bred so that they are born with coloured wool. When your team has finished discussing the bad points write them down.
Allow time.
Merit of bad points: strong 6, moderate 44, weak 50
Now think of the INTERESTING points of the idea (ideas that aren’t good or bad). When your team has finished discussing the interesting points write them down.
Allow time.
As a group, you have talked about the good, bad and interesting points. Now talk about this as a group and decide whether you agree with this idea.
Allow time.
Overall rating: very high 1, quite high 9, moderate 51, low 39
Green Sheep
Approach: Team. Level: Year 8 only.
Resources: Video showing coloured sheep moving around; photograph showing coloured sheep; two prompt sheets with discussion ideas; A3 answer sheet; marker pens.
In this activity your team is going to think up the good points, the bad points, and interesting points about an idea. I want you to discuss them together and write the ideas on a chart.
The first idea is that children should be paid for going to school.
Give children the sheet with idea 1 written on it. First of all, think of the GOOD points of this idea. When your team has finished discussing the good points write them down.
Allow time.
Now discuss the BAD points about the idea that children should be paid for going to school. When your team has finished discussing the bad points write them down.
Allow time.
Now think of the INTERESTING points of the idea (ideas that aren’t good or bad). When your team has finished discussing the interesting points write them down.
Allow time.
Now we’ll do a second idea. Here is the idea.
New Zealand earns a lot of money by selling wool to other countries. Before the wool is used for making carpets, clothes and fabrics, it is usually dyed different colours. If sheep were born with coloured wool it wouldn’t need to be dyed, and we would save money.
We will watch a short video about this.
Play the video. The video shows a flock of sheep. Each sheep has wool of a different colour: yellow, green, red, blue. At the beginning of the clip they are seen in a paddock on a farm. As the clip progresses they are brought into a pen and are on display in a township.
Give students the activity sheet idea 2 and the picture.
Think about the GOOD points of this idea. When your team has finished discussing the good points write them down.
Allow time.
Merit of good points: strong 22, moderate 53, weak 25
Now think of the BAD points of the idea that sheep should be bred so that they are born with coloured wool. When your team has finished discussing the bad points write them down.
Allow time.
Merit of bad points: strong 24, moderate 42, weak 34
Now think of the INTERESTING points of the idea (ideas that aren’t good or bad). When your team has finished discussing the interesting points write them down.
Allow time.
As a group, you have talked about the good, bad and interesting points. Now talk about this as a group and decide whether you agree with this idea.
Allow time.
Overall rating: very high 6, quite high 27, moderate 45, low 22
Commentary
This task is very similar to the year 4 task Coloured Sheep, which was marked using the same criteria and standards.
The 10-in-1 Gardening Tool
Approach: One to one. Level: Year 8 only.
Resources: Picture of tool.
In this activity you will be looking for the good points and bad points in a new product.
Here is a picture of a new invention called the 10-in-1 gardening tool.
Why do you think this tool was made?
Tell me as many good points about this tool as you can think of.
Now tell me as many bad points about this tool as you can think of.
Why the tool was made: mentioned multipurpose use and convenience 95
Good points: mentioned convenience/availability of tools 92
Bad points: mentioned impracticality, lack of strength, weight or other usability factors 89
Link Tasks

LINK TASK 6
Approach: One to one. Level: Year 4 and year 8.
Resources: Kitchen tools designed to achieve the same task; food items to use the equipment with.
Task: To demonstrate use of both kitchen tools, evaluate their usefulness, explain differences between them, and evaluate their relative merits.

Results (% responses, year 4 then year 8):
Evaluation tasks
1. full or quite full response 2 11, limited response 28 35, no clear ideas 70 54
2. full or quite full response 2 14, limited response 34 45, no clear ideas 64 41
3. full or quite full response 3 16, limited response 34 44, no clear ideas 63 40
Explanation tasks
1. clear explanation 17 40, some explanation 72 54, no clear explanation 11 6
2. full or quite full response 1 10, limited response 31 40, no clear ideas 68 50

LINK TASK 7
Approach: Team. Level: Year 4 only.
Resources: Two electronic circuit boards.
Task: To make minor changes in the wiring of the circuit board and evaluate the merits of two different signalling devices.

Results (% responses, year 4):
Task 1 completed without help: 91
Task 2 completed without help: 93
Ability to justify choice between two devices: very high 3, quite high 16, moderate 64, low 17
CHAPTER 6: USING COMPUTERS
The assessments included three tasks which related to the use of computers. These tasks sample some of the skills involved in using computers. Other computer skills will be sampled each year as part of the national monitoring programme, in a wide variety of curriculum areas. For instance, computer use will be included in the 1997 assessments of mathematics, information skills and social studies.

All three tasks were administered to both year 4 and year 8 students. The first two tasks presented here are released tasks, for which full details are given. The third task is a link task (to be used again in the year 2000), and therefore is not described in detail here.

The results show that most year 4 and year 8 students were able to use the computers effectively to complete simple tasks. It should be noted that this result was obtained after students completed an exercise which allowed them to practise standard procedures required to use the Macintosh Powerbook computers. The success rate was consistently higher for year 8 students than for year 4 students, but not by a large margin.
Mac Moves
Approach: Station. Level: Year 4 and year 8.
Resources: Macintosh Powerbook 5300CS laptop computer with program HyperCard® Player; headphones; mouse; HyperCard® document Mac Moves.
Three skills were considered essential for carrying out the assessment tasks using computers. The Mac Moves exercises were designed to ensure the students had the opportunity to acquire or practise those skills before beginning the assessed tasks. For each exercise the student heard the instructions through the headphones. At any time they could use the buttons at the bottom of the screen:
Tell me…  Hear the instructions repeated.
Home  Return to the main computer screen for choosing tasks.
Again  Have another go.
Next  Go to the next exercise.
1. Pointing and clicking an object.
Point and click on each frog to make it jump into the water.
Each frog made a noise as it appeared to jump into the water.

2. Clicking into a text field and typing.
Type in some words for the frog to say. Click on the line then type with the keys on the keyboard.
Once the spoken instructions finished, the words visible in the field on the screen disappeared. If the student clicked on the frog, the student heard a disgruntled noise and words appeared in the field saying ‘don’t’ in various ways. If the student typed without clicking anywhere they heard a beep for each key typed and the verbal instruction “click on the line before typing any letters”. Once they had typed in something, if they clicked OK then they heard the frog ‘speak’ whatever they had typed (using the Macintosh® speech technology).

3. Pointing with the fingertip part of the hand-shaped HyperCard® cursor.
Click on the dark oval in the centre and then on each ring from the fat rings in the centre to the narrow rings on the outside. Use the pointed finger on the little hand on the screen to point before you click.
As they clicked on each ring the colours (shades of blue and red) changed, and when they clicked on the smallest ring the screen flashed and they heard “Well done! You were able to click even the smallest red oval!”
Treasure Find
Approach: Station. Level: Year 4 and year 8.
Resources: Macintosh Powerbook 5300CS laptop computer with program HyperCard® Player; headphones; mouse; HyperCard® documents Mac Moves (introductory exercises) and Treasure Find.
(Results are % responses, year 4 then year 8.)
Commentary
The results show that all students were eventually able to enter the required series of responses on the computer. The only errors were errors in selecting among options involving word knowledge rather than computer usage.
Treasure Find opened with an introductory screen. The student heard the instructions through the headphones:
Welcome to Find the Treasure. Remember, if you want to choose a word, move the hand over the word and push down the button on the top of the mouse. Start by choosing the word Start.
The student was given a series of tasks, and ended at a final brightly coloured reward screen, which varied depending on which words they had chosen.
The End button returned the student to the main (Home) screen for choosing tasks, and was available at any time.
Answered all questions 60 79
Completed answering questions on computer, but made one or more word knowledge errors 40 21
Did not complete the sequence of responses required 0 0
Form Fill
Approach: Station. Level: Year 4 and year 8.
Resources: Macintosh Powerbook 5300CS laptop computer with program HyperCard® Player; headphones; mouse; introductory HyperCard® document Mac Moves and HyperCard® document Form Fill.
(Results are % responses, year 4 then year 8.)
Responded appropriately to all task components 57 70
Responded appropriately to most task components, but made one error 33 23
Incomplete response, or more than one error. 10 6
The form the students were to fill in was spread over 3 screens. There were no verbal instructions for this task.
The student clicked the brightly coloured button to go to the next screen; the button on the final page returned the student to the main (Home) screen for choosing tasks.
If the student typed without first clicking in a field they heard beeps for each key typed and a cue field appeared.
Approach:
Level:
Resources:
36 NEMP Report 5: Technology 1996
% responsesy4 y8
Task:
Link tasks

LINK TASK 8

Approach: One to one
Level: Year 4 and year 8
Resources: Macintosh Powerbook 5300CS laptop computer with program Hypercard® Player; headphones; mouse; introductory Hypercard® document Mac Moves; text programme SimpleText®; instruction booklet for task; recording sheet for teacher to use.

Task: Complete introductory programme Mac Moves, then complete a series of task components with the help of written instructions and diagrams. Teachers to note where help was required to complete any task component.

                              % responses
                            year 4   year 8
Component 1                   84       94
Component 2                   82       92
Component 3                   49       77
Component 4                   66       87
Component 5                   89       97
Component 6                   89       95
Component 7                   77       92
All 7 components              35       62
5 or 6 components             42       32
Fewer than 5 components       23        6
CHAPTER 7: TECHNOLOGY SURVEY
Attitudes and Motivation

The national monitoring assessment programme recognises the impact of attitudinal and motivational factors on student achievement in individual assessment tasks. Students' attitudes, interests and liking for a subject have a strong bearing on progress and learning outcomes. Students are influenced and shaped by the quality and style of curriculum delivery, the choice of content and the suitability of resources. Other important factors influencing students' achievements are the expectations and support of significant people in their lives, the opportunities and experiences they have in and out of school, and the extent to which they have feelings of personal success and capability.
Technology Survey

The national monitoring technology survey sought information from students about their curriculum preferences and their perceptions of their achievement and potential in technology. Students were also asked about their involvement in technology-related activities within school and beyond. The survey was administered to both year 4 and year 8 students in a one-to-one interview setting, with most questions requiring short written answers and others a written or spoken response. There are numerous research questions that could be asked when investigating student attitudes and engagement. In national monitoring it has been necessary to focus on a few key questions that give an overall impression of how students regard technology in relation to themselves.
Responses of Year 4 Students to the Technology Survey (% response)

1. How much do you like doing technology at school?               57   38    4    1
2. How good do you think you are at technology
   compared to other subjects?                                    22   56   18    4

                                                   lots   quite often   sometimes   never
3. How often do you use a computer at school?       10        20           62         8
4. How often do you use a computer
   when not at school?                              27        18           30        25
Year 4 students were generally positive about doing technology at school. Fifty-seven percent chose the highest rating for the first question (how much they liked to do technology at school), and only five percent chose negative ratings. Students' perceptions of their expertise in technology compared to other subjects (question 2) were also quite positive.
Only 30 percent of year 4 students reported that they used a computer at school "lots" or "quite often", but 45 percent reported that they used a computer "lots" or "quite often" when not at school. One quarter of the students said they never used a computer when not at school.
In response to a question not listed in the table above, year 4 students described what they thought technology was. Their responses were placed into eight categories. The three most popular categories were:
➢ making and designing (16 percent of students)
➢ using tools, computers or other equipment (14 percent of students)
➢ modifying or finding out how things work (14 percent of students).
A further open-ended question asked year 4 students to describe the technology their class did at school. The two most popular response categories were:
➢ using computers (37 percent of students)

➢ making and designing (23 percent of students).
Chapter 7: Technology survey 39
Responses of Year 8 Students to the Technology Survey (% response)

1. How much do you like doing technology at school?               45   48    6    1
2. How good do you think you are at technology
   compared to other subjects?                                    18   62   17    3

                                                   lots   quite often   sometimes   never
3. How often do you use a computer at school?        9        31           57         3
4. How often do you use a computer
   when not at school?                              26        22           34        18
Year 8 students were slightly less positive than year 4 students about doing technology at school. Our experience to date in national monitoring suggests that similar declines in enthusiasm occur in most curriculum areas. Forty-five percent of the year 8 students chose the highest rating for the first question (how much they liked to do technology at school), and only seven percent chose negative ratings. Students' perceptions of their expertise in technology compared to other subjects (question 2) were quite positive, with very similar percentages to those recorded for year 4 students.
Only 40 percent of year 8 students reported that they used a computer at school "lots" or "quite often," while 48 percent reported that they used a computer "lots" or "quite often" when not at school. The percentage who said that they never used a computer when not at school was a little lower than for year 4 students, at 18 percent.
Like year 4 students, year 8 students described what they thought technology was, and their responses were placed into eight categories. The most common responses were:
➢ using tools, computers or other equipment (31 percent of students)
➢ making and designing (29 percent of students)
➢ modifying or finding out how things work (27 percent of students).
Year 8 students were also asked to describe in writing the technology their class did at school. The most popular response categories used were:
➢ using computers (41 percent of students)
➢ making and designing (40 percent of students)
➢ workshop/manual subjects (36 percent of students).
The responses to the two open-ended questions show that year 8 students have more definite ideas about what technology is than year 4 students do. This does not mean, however, that the responses are consistent with each other or with emphases of the new technology curriculum. The responses seem to include three main threads: the use of computers, workshop and manual activities, and making and designing activities.
CHAPTER 8: PERFORMANCE OF SUBGROUPS
Although national monitoring has been designed primarily to present an overall national picture of student achievement, there is some provision for reporting on performance differences among subgroups of the sample. Nine demographic variables are available for creating subgroups, with students divided into two or three subgroups on each variable, as detailed in Chapter 1 (p5).
The analyses of the relative performance of subgroups used an overall score for each task, created by adding scores for the most important components of the task.
Where only two subgroups were compared, differences in task performance between the two subgroups were checked for statistical significance using t-tests. Where three subgroups were compared, one-way analysis of variance was used to check for statistically significant differences among the three subgroups.
Because the number of students included in each analysis was quite large (approximately 450), the statistical tests were quite sensitive to small differences. To reduce the likelihood of attention being drawn to unimportant differences, the critical level for statistical significance was set at p = .01 (so that differences this large or larger among the subgroups would not be expected by chance in more than one percent of cases). The critical level was adjusted to p = .05 for the six tasks where differences in team performance among 120 teams were being examined.
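The two-subgroup comparison described above can be sketched in a few lines. This is illustrative only, not the Project's actual analysis code: the task scores are hypothetical, and the sketch uses the classic pooled two-sample t statistic with a two-tailed critical value for p = .01 taken from standard t tables.

```python
# Illustrative sketch of a two-subgroup comparison (hypothetical data;
# not the Project's actual analysis code).
import math

def two_sample_t(a, b):
    """Pooled two-sample t statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # pooled variance under the equal-variance assumption
    ssa = sum((x - ma) ** 2 for x in a)
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical overall task scores for two subgroups of students
group1 = [4, 5, 3, 6, 5, 4, 5, 6]
group2 = [6, 7, 5, 8, 7, 6, 7, 8]
t = two_sample_t(group1, group2)
# df = 14 here; the two-tailed critical value at p = .01 is about 2.977
# (from t tables), so |t| > 2.977 counts as significant at that level.
significant = abs(t) > 2.977
```

With the real samples of roughly 450 students per analysis, the same statistic is far more sensitive, which is why the report raises the bar to p = .01.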
For the first four of the nine demographic variables, few statistically significant differences among the subgroups were found. For the remaining five variables, statistically significant differences were found on substantial numbers of tasks. Details are presented below.
Zone

Results achieved by students from Auckland, the rest of the North Island, and the South Island were compared.
For year 8 students, there were no statistically significant differences among the three subgroups on any of the 17 tasks. There was a statistically significant difference among the three subgroups on one question of the Technology survey (p39). In response to question 4, Auckland students reported the highest level of use of computers when not at school and students from elsewhere in the North Island reported the lowest use.
For year 4 students, there was a statistically significant difference among the three subgroups on only one of the 15 tasks. Students from Auckland scored highest and students from elsewhere in the North Island scored lowest on Gift soap (p21). There were no statistically significant differences among the three subgroups on questions of the Technology survey.
Community Size

Results were compared for students living in communities containing over 100,000 people (main centres), communities containing 10,000 to 100,000 people (provincial cities), and communities containing less than 10,000 people (rural areas).
For year 8 students, there were no statistically significant differences among the three subgroups on any of the 17 tasks, but there was a difference on one question in the Technology survey (p39). Students from main centres reported the highest level of computer use outside of school (question 4), while students from rural areas reported the lowest level of use.
For year 4 students, there were no statistically significant differences among the three subgroups on any of the assessment tasks, or on questions of the Technology survey.
Chapter 8: Performance of subgroups 41
School Size

Results were compared for students in larger, medium sized, and small schools (exact definitions were given in Chapter 1, p8).
For year 8 students, there was a statistically significant difference among the three subgroups on only one of the 17 tasks. Students from small schools scored lowest on Planning a Class Event (p24). There was also one statistically significant difference among the three subgroups on a question of the Technology survey (p39). Students from small schools reported less use of computers outside of school (question 4).
For year 4 students, there were no statistically significant differences among the three subgroups on the assessment tasks, or on questions of the Technology survey.
School Type

Results were compared for year 8 students attending full primary schools and year 8 students attending intermediate schools. A statistically significant difference was found on only one of the 17 technology tasks. Students from intermediate schools scored lower than students from full primary schools on Link task 6 (p31). There was also a statistically significant difference on question 4 of the Technology survey. Students from intermediate schools reported greater use of computers outside of school.
Gender

Results achieved by male and female students were compared.
For year 8 students, there were statistically significant differences between boys and girls on three tasks. Girls scored higher than boys on two design tasks: Sports Bag (p23) and Link task 5 (p25). However, girls scored lower than boys on a task involving electrical circuits: Link task 2 (p19). On the Technology survey (p39), there were statistically significant differences between boys and girls on three of the four questions, with boys higher in each case. Boys expressed greater liking for doing technology at school (question 1), judged themselves to be performing better in technology (question 2), and reported a greater level of computer usage outside of school (question 4).
For year 4 students, there were statistically significant differences between boys and girls on three tasks. Girls scored higher than boys on one design task (Gift soap, p21), but lower than boys on another design and making task (Flag, p22). Girls also scored lower than boys on a task involving understanding how a technological device works (Stapler, p18). There were also statistically significant differences on two questions of the Technology survey (p38). Compared to girls, boys reported greater usage of computers both at school (question 3) and outside of school (question 4).
Socio-Economic Index

Schools are categorised by the Ministry of Education based on census data for the census mesh blocks where children attending the schools live. The SES index takes into account household income levels, categories of employment, and the ethnic mix in the census mesh blocks. The SES index used ten subdivisions, each containing ten percent of schools (deciles 1 to 10). For our purposes, the bottom three deciles (1-3) formed the low SES group, the middle four deciles (4-7) formed the medium SES group, and the top three deciles (8-10) formed the high SES group. Results were compared for students attending schools in each of these three SES groups.
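The decile-to-group mapping described above is simple enough to state exactly. The sketch below is illustrative only; the function name is our own, not the Ministry's.

```python
# Sketch of the decile-to-SES-group mapping described in the text:
# deciles 1-3 form the low group, 4-7 medium, 8-10 high.
def ses_group(decile):
    """Map a school's SES decile (1-10) to its analysis group."""
    if not 1 <= decile <= 10:
        raise ValueError("decile must be between 1 and 10")
    if decile <= 3:
        return "low"
    if decile <= 7:
        return "medium"
    return "high"
```

Note that the grouping is deliberately coarse: each analysis then compares three subgroups of students rather than ten.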
For year 8 students, there were statistically significant differences among the three subgroups on seven of the 17 tasks. In each case, performance was lowest for students in the low SES group. Students in the high SES group generally performed better than students in the medium SES group, but in some cases these differences were small. Because of the number of tasks, the specific tasks will not be listed here, but it should be noted that they included tasks in three of the four strands assessed (Chapters 3 to 5). On the Technology survey (p39), there was a statistically significant difference on one question: students from low SES schools reported lower levels of use of computers outside of school (question 4).
For year 4 students, there were statistically significant differences among the three subgroups on five of the 13 tasks: Useful Machines (p15), Stapler (p18), Link task 1 (p19), Link task 3 (p25), and Space game (p27). In each case, performance was lowest for students in the low SES group and highest for students in the high SES group. There were no statistically significant differences among the three subgroups on questions of the Technology survey.
Student Ethnicity

Results achieved by Māori and non-Māori students were compared.
For year 8 students, there were statistically significant differences of performance between Māori and non-Māori students on three tasks. In each case, non-Māori students scored higher than Māori students. These three tasks were Link task 1 (p19), Gift soap (p21), and Link task 6 (p31). There were no statistically significant differences between Māori and non-Māori students on questions of the Technology survey.
For year 4 students, there were statistically significant differences in performance on two tasks. Māori students scored lower than non-Māori students on both tasks: Link task 1 (p19) and Link task 3 (p25). There were no statistically significant differences between Māori and non-Māori students on questions of the Technology survey.
Proportion of Māori Students in Schools

Schools were categorised into three subgroups: schools with less than 10 percent Māori students, schools with 10 to 30 percent Māori students, and schools with more than 30 percent Māori students. Results were compared for students attending schools in these three categories.
For year 8 students, statistically significant differences in performance among the three subgroups were found on seven of the 17 tasks. On each of these tasks, students attending schools with less than 10 percent Māori students scored highest. Because of the number of tasks involved, they will not be listed here, but it is worth noting that all seven tasks involved individual students rather than teams of students. On the Technology survey (p39), there was a statistically significant difference among the three subgroups on one question. Students from schools with less than 10 percent of Māori students reported higher levels of use of computers outside of school (question 4).
For year 4 students, statistically significant differences in performance among the three subgroups were found on four of the 15 tasks, all involving individual performance. In each case, students attending schools with more than 30 percent Māori students scored lowest, with generally smaller differences between the other two subgroups. The four tasks were: Tool time (p14), Stapler (p18), Link task 1 (p19), and Link task 3 (p25). There were no statistically significant differences on questions of the Technology survey.
Proportion of Pacific Island Students in Schools

Because most of the Pacific Island students are concentrated into relatively few schools, it was difficult to create sensible subgroups for schools with higher or lower percentages of Pacific Island students. Two subgroups were formed: students attending schools with up to 5 percent Pacific Island students, and students attending schools with more than 5 percent Pacific Island students. Results were compared for students in these two subgroups.
For year 8 students, statistically significant differences in performance between the two subgroups were found on five of the 20 tasks: Tool time (p14), Gift soap (p21), Link task 3 (p25), Link task 5 (p25), and Green sheep (p29). For each of these tasks, average performance levels were lower in the schools with more than 5 percent Pacific Island students. There were no statistically significant differences on questions of the Technology survey.
For year 4 students, statistically significant differences in performance between the two subgroups were found on two individual tasks (Stapler, p18, and Link task 1, p19) and two team tasks (Coloured sheep, p28, and Link task 7, p31). For each of these tasks, average performance levels were lower in the schools with more than 5 percent Pacific Island students. There were no statistically significant differences on questions of the Technology survey.
APPENDIX 1: A DESCRIPTION OF THE NATIONAL EDUCATION MONITORING PROJECT

Introduction

The task of the National Education Monitoring Project (NEMP) is to obtain a detailed national picture of the educational achievements and attitudes of New Zealand primary and intermediate school children. Its main goal is to help to improve the education which children receive. National monitoring provides, for the first time in New Zealand, a national "snapshot" of children's knowledge, skills and motivation, and a way to identify which aspects are improving, staying constant, or declining. This information will allow successes to be celebrated and priorities for curriculum change and teacher development to be debated more effectively. As the New Zealand Curriculum Framework (1993, p26) puts it, the purpose of national monitoring is to provide information on how well overall national standards are being maintained, and where improvements might be needed.
The need for national monitoring in New Zealand has been discussed for many years. The Currie Commission (1962), the Educational Development Conference (1974), the Royal Commission on Social Policy (1988), and the Ministerial Working Party on Assessment for Better Learning (1990) all recommended that New Zealand should establish a system for obtaining a national picture of student achievement. They noted that New Zealand had ways for monitoring the resources available to the education system (its teachers, students, curriculum materials, buildings, finance and community support) and the processes used by teachers and schools in helping students learn, but that sound ways for monitoring the achievements of the students were lacking. Since 1969, useful information on national patterns of student achievement has been available from New Zealand's participation in international surveys conducted by the International Association for the Evaluation of Educational Achievement, but these surveys are necessarily constrained in content and approach and are not a substitute for comprehensive national monitoring.
The approach which has been selected for national monitoring in New Zealand uses carefully selected national samples of schools and students, assessed by experienced teachers who are given special training for this work. Assessment procedures and tasks are selected to provide a rich picture of what children can do and to optimise value to the educational community. The result is a quite detailed national picture of student achievement. It is neither feasible nor appropriate, given the purpose and the approach used, to release information about individual students or schools.
The first four years of national monitoring assessments are being conducted by the Educational Assessment Research Unit, University of Otago, under contract to the Ministry of Education.
How the Assessments are Carried Out

Sampling

Each year, about 2880 children from 260 schools are selected to participate in national monitoring. Half are in year 4 (ages 8–9), half in year 8 (ages 12–13). These levels have been chosen so that a picture is obtained of educational achievement at the middle and end of primary schooling.
At each of these two levels, 120 schools are selected randomly from national lists of state, integrated and private schools teaching at that level, with a probability of selection proportional to the number of students enrolled in the level. Schools with fewer than four students enrolled are excluded, because some of the tasks are designed for groups of four children working collaboratively. Also excluded at present are special schools and Kura Kaupapa schools (by mutual agreement, Kura Kaupapa schools will be included in national monitoring from 1999).
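Selection with probability proportional to enrolment can be illustrated with the standard systematic PPS method. This is a minimal sketch under stated assumptions: the school names and enrolments below are hypothetical, and the Project's actual sampling procedure may differ in detail.

```python
# Minimal sketch of systematic probability-proportional-to-size (PPS)
# school selection (hypothetical data; not the Project's actual procedure).
import random

def pps_sample(schools, n):
    """Select n schools with probability proportional to enrolment.

    schools: list of (name, enrolment) pairs with enrolment > 0.
    Uses one random start and a fixed interval along the cumulative
    enrolment totals, so larger schools are more likely to be chosen.
    """
    total = sum(size for _, size in schools)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + i * interval for i in range(n)]
    chosen, cumulative, idx = [], 0, 0
    for name, size in schools:
        cumulative += size
        while idx < n and points[idx] < cumulative:
            chosen.append(name)
            idx += 1
    return chosen

schools = [("A", 120), ("B", 45), ("C", 300), ("D", 60), ("E", 75)]
picked = pps_sample(schools, 2)
```

A school with 300 enrolled students here is twice as likely to be picked as one with 150, which is exactly the "probability of selection proportional to the number of students enrolled" described above.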
Appendix 1: National Education Monitoring Project 45
Once schools have agreed to participate, twelve students to be assessed are randomly chosen from year 4 or year 8 students at each school. Because five to ten percent of the chosen schools will have less than twelve students in the selected class level (reflecting the large proportion of small schools in New Zealand), each such small school is paired with a nearby small school to provide the required twelve students (four from one school, eight from the other). The twelve students selected are randomly divided into three groups of four. Each group is given a different set of tasks.
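The random division of the twelve selected students into three groups of four can be sketched directly. Illustrative only: the student names are placeholders, and the shuffle-then-slice approach is one obvious way to implement the random division described above.

```python
# Sketch of randomly dividing twelve selected students into three groups
# of four (placeholder names; one plausible implementation of the text).
import random

def divide_into_groups(students, group_size=4):
    """Shuffle the students, then slice them into consecutive groups."""
    if len(students) % group_size != 0:
        raise ValueError("student count must be a multiple of the group size")
    shuffled = list(students)
    random.shuffle(shuffled)
    return [shuffled[i:i + group_size]
            for i in range(0, len(shuffled), group_size)]

students = [f"student{i}" for i in range(1, 13)]  # twelve selected students
groups = divide_into_groups(students)             # three groups of four
```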
Comprehensive information about the sampling process and resulting sample for the 1996 assessments is given in Appendix 2 (p49).
In the schools

Monitoring takes place in the second half of the school year. The students are assessed by 96 trained teachers, experienced at working with year 4 or year 8 students, who are released from their normal positions for six weeks (one week of training, five weeks of assessments). The teachers are selected nationally, and usually assess in schools in their home region. Important criteria in selecting the teachers are skills in quickly establishing good relationships with students, and ability to encourage and carefully probe student achievement without providing answers. The professional development of the teachers over their six weeks' engagement is a very important outcome of the Project. The teachers who conducted the 1996 assessment were most enthusiastic about the value of the experience for their professional development.
Teachers work in pairs, each pair spending a week visiting each school (or pair of small schools). During the week they aim to meet with each of the twelve students five times, for about an hour each time. The first session does not involve any formal assessment, but rather is an opportunity for the students to meet the visiting teachers and become comfortable with them and with the schedule of activities for the rest of the week. The remaining sessions involve individual students or groups of four children working on a wide variety of assessment tasks.
Student experience

Each student has one session working individually with a visiting teacher. This one-to-one approach allows the teacher to help ensure that the students understand what the tasks require, and to ask questions about the processes the students use. Most of the students' responses are presented orally or through active demonstration. Compared to paper-and-pencil testing, this approach gives students whose reading or writing skills are weak a better opportunity to show their skills. To ensure that maximum information is obtained for later analysis, the entire session is recorded on videotape (the camera sits in the background throughout, with no operator to cause distractions).
Each student also has a session in which they are asked to work collaboratively with three other students, supervised by one of the visiting teachers. Tasks in this hour involve group activities, such as discussion, team decision making, drama, or some students acting as presenters and others as audience. This session is also recorded on videotape for later analysis.
A third session for each student uses a "stations" format. Four students participate, working individually, supervised by a teacher. This session is not videotaped. Students tackle a series of tasks set up at stations around the room. Some are strictly paper-and-pencil tasks, others involve responding to video material, and others involve hands-on activities with equipment. The teacher is available to help students with reading or writing difficulties (except where the purpose of the task is to assess reading or writing abilities).
The fourth session for each student will vary in nature from year to year, to suit the particular curriculum areas being assessed that year. In the 1996 assessments, this fourth session involved further tasks in which four students worked collaboratively, supervised by a teacher. Many of the tasks were marked to record separately the achievements of the individual students.
YEAR        NEW ZEALAND CURRICULUM FRAMEWORK
1   1995    Science; Art; Information Skills (graphs, tables, maps, charts and diagrams)
2   1996    Language: Reading and Speaking; Aspects of Technology; Music
3   1997    Mathematics (numeracy skills); Social Sciences; Information Skills (library, research)
4   1998    Language: Writing, Listening, Viewing; Health and Physical Well-being

Assessed every year: Communication skills; Problem-solving skills; Self-management and competitive skills; Social and co-operative skills; Work and study skills; Attitudes†
Minimum intrusion

Every effort is made to minimise disruption and extra demands for schools. The use of outside teachers to conduct the assessments, central provision of all equipment and materials, and flexibility in the choice of an assessment week and the programme within that week all served to make schools more willing to participate. These factors, combined with the extensive information provided about the project, were received very positively in 1996. Schools, parents and children generally seemed happy about their experience with national monitoring. Highly positive comments were made by many of them, and despite the availability of the Project's 0800 number to all schools and parents involved, no concerns were raised with the Project directors during the 10 weeks of assessments.
What is Monitored?

The New Zealand Curriculum Framework is the blueprint for the school curriculum. It places emphasis on seven essential learning areas, eight essential skills, and a variety of attitudes and values. National monitoring aims to touch upon all of these areas, rather than restrict itself to preselected priority areas.
In order to cover such a broad range, yet do so in adequate depth, assessment of the range of areas is to be spread across four years, and then the same cycle will be repeated in subsequent four year periods. Some of the skills and attitudes will be assessed every year, but other skills and the subject areas will be allocated to specific years. The planned schedule is set out in the table above.
†Attitudes and values assessed include: motivation and confidence in different learning areas, involvement in related development activities outside school, and developing attitudes and values relevant to the school curriculum.
How are Tasks Selected and Developed?

Many of the assessment tasks used in national monitoring are developed specifically for the Project, over a period of about 18 months before the assessments take place. The remainder are adapted from tasks already in use in New Zealand or elsewhere.
Advisory panels

For each major curriculum area to be assessed, a small (five or six member) national advisory panel is formed. A panel normally includes three or four curriculum experts from colleges of education or universities, at least one primary or intermediate school teacher and at least one person who can effectively present a Māori perspective. In a one day meeting with Project staff, the panel develops a one page framework for task development and selection. This framework identifies the main areas to be covered by national monitoring, such as the main strands of the subject, and the most important skills and attitudes. It usually links quite closely to the corresponding national curriculum document, but has some significant differences of wording and emphasis to fit with the goals of national monitoring.
Subcontractors

Subcontractors are then appointed to work on task development and informal trialing. A condition of their contract is that they will include teachers in their team. It is our experience that most tasks submitted by subcontractors will require substantial further development before they are ready for use in national monitoring. It is very clear, however, that the ideas provided make a vital contribution towards a rich final blend of assessment tasks.
When subcontractors have submitted their reports, the national advisory panel meets for two or three days with Project staff to review all available tasks (from subcontractors, from the Project's library, and from development work undertaken by Project staff). The goals of this meeting are to identify tasks worthy of further development work and/or trialing, to make suggestions for improvement of tasks or development of new tasks, and to look for areas of the framework which may be inadequately represented in the available tasks.
Further development

Following this meeting, the project staff will continue to work on the more promising tasks, revising them as suggested by the national panel, developing resource materials needed for national use of the tasks, and conducting field trials. Project staff then conduct a final review, in consultation with members of the national advisory panel, and select tasks for use in that year's national monitoring.
Field trial

About three months before national assessment is due to take place, the selected tasks are given a final field trial. Any necessary modifications to individual tasks or to their administration are then made, and final task materials are produced in sufficient quantities for the full national sample of schools and students.
What Criteria are Used for Task Selection and Development?

Seven major criteria are considered when tasks are being developed and selected for inclusion in national monitoring. The central focus is on obtaining the richest possible picture of the knowledge, skills and attitudes which the students are developing to become good contributors to and participants in society. The seven criteria are listed below.
Wide range

1: Tasks assess a wide range of significant learning outcomes (cognitive, affective and physical), and emphasise knowledge and skills which are valuable in further learning and in life outside school. Outcomes are not excluded merely because they are difficult to assess. The importance and validity of the resulting information are the dominant considerations.
Meaningful context

2: Tasks are set in meaningful contexts related to students' experiences. Full use is made of media and materials to enrich tasks, increase authenticity, and communicate task requirements effectively.
3. What students can do
Tasks are designed to assess what students are able to do, and are not restricted to gauging their abilities against stated curriculum goals for their class level.
4. Span the range
Tasks span the expected range of capabilities of year 4 and year 8 students. Some tasks allow very able students to demonstrate the extent of their abilities; others allow the least able students to show the skills and knowledge they do have. Task administration procedures allow all students to have a positive assessment experience.
NEMP Report 5: Technology 1996
5. Avoid bias
Individual tasks avoid unnecessary bias associated with gender, culture or social background. However, tasks which may reflect the experiences, interests and skills of particular categories of students are included, and care is taken to achieve a good balance of tasks across those categories.
6. Variety of formats
Tasks use a wide variety of formats chosen to suit the outcomes being assessed. These include interviews, observations, questionnaires, and measures of performance on physical tasks, as well as the commonly used pencil-and-paper formats. Many of the tasks allow processes to be analysed, in addition to outcomes.
7. Time required
Each task can be administered satisfactorily in school settings, in less than an hour, by a visiting teacher trained in assessment procedures and working with one or four students. Sufficient time is allowed for each student to make a good attempt at completing each task.
How are the Student Performances Analysed and Reported?
About 20 senior education students, most of whom are at or near the end of their training as primary school teachers, together with about 150 teachers, assist the permanent staff with the analysis and marking of the students’ performances each year. The marking process includes discussion of initial examples and periodic checks on consistency of marking by different markers. Teacher professional development through participation in the marking process is another substantial benefit from national monitoring. In evaluations of their experiences on a four-point scale (“dissatisfied” to “highly satisfied”), 80 to 95 percent of the teachers who marked student work in 1996 chose “highly satisfied” in response to questions about:
➢ the extent to which the marking experience was professionally satisfying and interesting
➢ the contribution to their professional development in the area of assessment
➢ whether they would recommend NEMP marking work to colleagues
➢ whether they would be happy to do NEMP marking again.

Written reports will be produced for each curriculum area covered, with periodic reports about cross-curricular skills. The use of video reports is being considered, because of their strong communicative power, but this will require explicit permission from the parents of students who appear in the video reports.
Each report will fully describe about two thirds of the tasks used that year, thus giving teachers details of a substantial number of carefully developed assessment tasks which can be used as models or prototypes to enrich their own assessment practices. Because the remaining one third of tasks will be used again four years later, so that trends in performance can be reported, it is not desirable that this subgroup of tasks be described in detail or made widely available.
APPENDIX 2: THE SAMPLE OF SCHOOLS AND STUDENTS IN 1996
Sampling procedures
In 1996, 2868 children from 265 schools were in the final samples to participate in national monitoring. About half were in year 4, the other half in year 8. At each level, 120 schools were selected randomly from national lists of state, integrated and private schools teaching at that level, with their probability of selection proportional to the number of students enrolled in the level. The process used ensured that each region was fairly represented. Schools with fewer than four students enrolled at the given level were excluded, as were special schools and Kura Kaupapa schools (by mutual agreement, the latter will be included from 1999 onwards).
Late in April 1996, the Ministry of Education provided computer files containing lists of eligible schools with year 4 and year 8 students, organised by region and district, including year 4 and year 8 roll numbers drawn from school statistical returns based on enrolments at 1 March 1996.
From these lists, we randomly selected 120 schools with year 4 students and 120 schools with year 8 students. Schools with four students in year 4 or 8 had about a one percent chance of being selected, while some of the largest intermediate (year 7 and 8) schools had a more than 90 percent chance of inclusion. In the one case where the same school was chosen at both year 4 and year 8 level, a replacement year 4 school of similar type and size was chosen from the same region and district.
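Probability-proportional-to-size selection of this kind can be sketched in code. The sketch below uses systematic PPS sampling over cumulative roll totals; the function name and data are illustrative assumptions only, not the Project’s actual procedure.

```python
import random

def pps_sample(schools, n):
    """Select n schools with probability of selection proportional to
    roll size, using systematic sampling over cumulative roll totals.
    'schools' is a list of (name, roll) pairs."""
    total = sum(roll for _, roll in schools)
    step = total / n                     # sampling interval
    start = random.uniform(0, step)      # random start in the first interval
    chosen, cumulative, i = [], 0, 0
    for name, roll in schools:
        cumulative += roll
        # a school is picked once for every selection point that falls
        # within the span of its roll on the cumulative scale
        while i < n and start + i * step <= cumulative:
            chosen.append(name)
            i += 1
    return chosen
```

Under this scheme a school with a tiny roll covers a small slice of the cumulative scale and is rarely hit, while the largest schools cover a wide slice and approach certain inclusion, consistent with the one percent and 90-plus percent figures quoted above.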
Pairing small schools
At the year 8 level, 13 of the 120 chosen schools had fewer than 12 year 8 students. For each of these schools, we identified the nearest small school which met our criteria to pair with it. Wherever possible, schools with 8 to 11 students were paired with schools with 4 to 7 students, and vice versa. However, the travelling distances between the schools were also taken into account. Similar pairing procedures were followed at the year 4 level, creating 12 pairs of schools at this level.
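One direction of the pairing rule can be illustrated with a small greedy sketch. The dictionary fields and the `distance` function are hypothetical stand-ins, since the report does not specify how travelling distance was computed or how conflicts were resolved.

```python
def pair_small_schools(schools, distance):
    """Greedily pair each school with 8-11 students to the nearest
    unpaired school with 4-7 students. 'schools' is a list of dicts
    with 'name' and 'roll' keys; 'distance' returns the travelling
    distance between two schools."""
    upper = [s for s in schools if 8 <= s["roll"] <= 11]
    lower = [s for s in schools if 4 <= s["roll"] <= 7]
    pairs, used = [], set()
    for a in upper:
        candidates = [b for b in lower if b["name"] not in used]
        if not candidates:
            break
        b = min(candidates, key=lambda s: distance(a, s))
        used.add(b["name"])
        pairs.append((a["name"], b["name"]))
    return pairs
```

A greedy nearest-neighbour match like this keeps travelling distances short while respecting the roll-size bands; the actual 1996 pairing was done by hand with the same considerations.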
Contacting schools
During the second week of May, we attempted to telephone the principals or acting principals of all schools in the year 8 sample. We made contact with all schools during that period, where necessary leaving messages for the principal to return our call on the Project’s 0800 number. Discussions with the last few principals were not completed until the following week.
In our telephone calls with the principals, we briefly explained the purpose of national monitoring, the safeguards for schools and students, and the practical demands participation would make on schools and students. We informed the principals about the materials which would be arriving in the school (a copy of a 15-minute NEMP videotape plus copies for all staff and trustees of the NEMP brochure and detailed booklet for sample schools). We asked the principals to consult with their staff and Board of Trustees and confirm their participation by the end of June.
A similar procedure was followed in mid-June with the principals of the schools selected in the year 4 sample, and they were asked to respond to the invitation by the end of July.
Response from schools
Of the 265 schools invited to participate, 260 agreed. Two schools declined because of major disruption caused by building work, two because they were Rudolf Steiner schools reluctant to participate in any “research” (another Rudolf Steiner school chose to participate), and the fifth was a small private school which had been included in 1995 and wanted a break from participation. A further school in the original sample was replaced when it became apparent that its year 4 roll had become too small. All six schools were replaced in the same way: we selected the next school of similar size in the same district from our sampling list.
Sampling of students
With their confirmation of participation, each school sent a list of the names of all year 4 or year 8 students on their roll. Using computer-generated random numbers, we randomly selected the required number of students (12, or 4 plus 8 in a pair of small schools), at the same time clustering them into random groups of four students. The schools were then sent a list of their selected students and invited to inform us if special care would be needed in assessing any of those children (e.g. children with disabilities or limited skills in English).
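A minimal sketch of that selection-and-clustering step, assuming the roll is a simple list of names (the helper name is illustrative):

```python
import random

def select_and_cluster(roll, n_students=12, group_size=4):
    """Randomly select n_students from a school roll and split the
    selection into random groups of group_size."""
    chosen = random.sample(roll, n_students)  # random order, no repeats
    return [chosen[i:i + group_size]
            for i in range(0, n_students, group_size)]
```

For a standard school this yields three random groups of four; for a pair of small schools the same helper could be called with 4 and 8 students respectively.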
At the year 8 level, we received about 88 comments from schools about particular students. In 42 cases, we randomly selected replacement students because the children initially selected had left the school between the time the roll was provided and the start of the assessment programme in the school, or were expected to be away throughout the assessment week. The remaining 46 comments concerned children with special needs. Each such child was discussed with the school and a decision agreed. Three students were replaced because they were very recent immigrants (within six months) who had extremely limited English language skills. Six students were replaced because they had disabilities serious enough that it was agreed participation would place them at emotional risk. Participation was agreed upon for the remaining 37 students, but a special note was prepared to give additional guidance to the teachers who would assess them.
In the corresponding operation at year 4 level, we received 126 comments from schools about particular students. In part, the larger number arose because there was a longer time gap between our receipt of the class rolls and the assessment weeks. This meant that 61 children originally selected needed to be replaced because they had left the school. Nineteen students were mentioned because of their ESOL status; of these, 10 very recent immigrants were replaced. Fourteen students were mentioned because they were participants in total immersion Māori language programmes. Assessment in Māori was arranged for the 10 immersion students at one school, and two immersion students were replaced. Two students were replaced because they had been reclassified as year 3. Other special needs were mentioned for 40 children, and 14 of these children were replaced (1 because of very severe physical disabilities, and 13 because of concerns about their ability to cope with the assessment situation). Special notes for the assessing teachers were made about 37 children retained in the sample.
Communication with parents
Following these discussions with the school, Project staff prepared letters to all of the parents, including a copy of the NEMP brochure, and asked the schools to address the letters and mail them. Parents were told they could obtain further information from Project staff (using an 0800 number) or their school principal, and advised that they had the right to ask that their child be excluded from the assessment.
Our 0800 number was monitored in the evenings, as well as during the day, for two weeks following each mailing of letters to parents.
At the year 8 level, we received about 15 phone calls, including several from students wanting more information about what would be involved. The main issues raised by parents were our reasons for selection of their child, a wish for fuller details or reiteration of what would be involved, concerns about the use of video equipment, or reluctance of the child to take part. Five children were replaced as a result of these contacts: one at the child’s request, and four at the parents’ request (two were Exclusive Brethren and did not allow video viewing, one did not want her child video recorded, and the fourth gave no reason).
At the year 4 level we received about 20 phone calls from parents. Some wanted details confirmed or explained (notably about reasons for selection). One child chose to withdraw even though her parents were happy for her to participate. Nine children were replaced at parents’ request: three because the family was Exclusive Brethren, and six because parents were concerned about additional stress for their children.
Practical arrangements with schools
On the basis of preferences expressed by the schools, we then allocated each school to one of the five assessment weeks available and gave them contact information for the two teachers who would come to the school for a week to conduct the assessments. We also provided information about the assessment schedule and the space and furniture requirements, offering to pay for hire of a nearby facility if the school was too crowded to accommodate the assessment programme.
Results of the sampling process
As a result of the considerable care taken, and the attractiveness of the assessment arrangements to schools and children, the attrition from the initial sample was low. Less than two percent of selected schools did not participate, and less than three percent of the originally sampled children had to be replaced for reasons other than their transfer to another school. The sample can be regarded as very representative of the population from which it was chosen (all children in New Zealand schools at the two class levels except the one to two percent in special schools, Kura Kaupapa schools, or schools with fewer than four year 4 or year 8 children).
Of course, not all the children in the sample were actually able to be assessed. Some were absent from school for some or all of their assessment sessions, and a small percentage of performances were lost because of malfunctions in the video recording process. For many tasks, over 95 percent of the sample were assessed. No task had less than 90 percent of the sample assessed. Given the complexity of the Project, this was a very acceptable success rate.
Demography
Composition of the sample
Because of the sampling approach used, regions were fairly represented in the sample, in approximate proportion to the number of school children in the regions.
Percentages of children from each region

Region                            % of year 4 sample    % of year 8 sample
Northland                                4.9                   4.2
Auckland                                29.3                  28.4
Waikato                                  9.9                  10.0
Bay of Plenty/Poverty Bay                8.3                   8.4
Hawkes Bay                               4.2                   5.0
Taranaki                                 4.2                   2.5
Wanganui/Manawatu                        5.8                   6.6
Wellington/Wairarapa                    10.9                  10.9
Nelson/Marlborough/West Coast            4.2                   4.1
Canterbury                              11.6                  12.6
Otago                                    4.2                   4.0
Southland                                2.5                   3.3
Percentages of children in each category of the demographic variables

Variable                  Category              % year 4 sample   % year 8 sample
Gender                    Male                         52               53
                          Female                       48               47
Ethnicity                 Non-Māori                    81               78
                          Māori                        19               22
Geographic Zone           Greater Auckland             29               28
                          Other North Island           48               48
                          South Island                 23               24
Community Size            > 100,000                    55               56
                          10,000–100,000               29               26
                          < 10,000                     16               18
School SES Index          Bottom 30 percent            36               33
                          Middle 40 percent            30               40
                          Top 30 percent               34               27
School % Māori            < 10%                        39               32
                          10–30%                       42               43
                          > 30%                        19               25
School % Pacific Island   Up to 5%                     76               76
                          > 5%                         24               24
Size of School            < 20 y4 students             21                –
                          20–35 y4 students            17                –
                          > 35 y4 students             62                –
                          < 35 y8 students              –               29
                          35–150 y8 students            –               30
                          > 150 y8 students             –               41
Type of School            Full Primary                  –               39
                          Intermediate                  –               51
University of Otago
The Educational Assessment Research Unit
National monitoring provides a “snapshot” of what New Zealand children can do at two levels in primary and intermediate schools: ages 8–9 and ages 12–13.
The main purposes for national monitoring are:
• to meet public accountability and information requirements by identifying and reporting patterns and trends in educational performance
• to provide high quality, detailed information which policy makers, curriculum planners and educators can use to debate and review educational practices and resourcing.
The areas of technology within which students develop their knowledge, understandings and skills embrace a great deal of personal, cultural, environmental and economic activity.
Biotechnology, for example, involves the use of living systems and organisms; materials technology includes the investigation, use and development of materials such as wood, textiles, metals and fuels; information and communication technology covers a complex range of processes, equipment and devices that enable the management and use of numerous forms of data and information. Design, including the processes of specification, development and testing of ideas, is central to all areas of technology and to its products, systems and environments.
ISSN 1174–0000
ISBN 1–877182–06–0