
Technology enhanced formative assessment: Participant experiences, relative efficiencies and tutor learning curves

Dr W. Rod Cullen

Manchester Metropolitan University, [email protected]

Susan Gregory

Manchester Metropolitan University, [email protected]

Dr Neil Ringan

Manchester Metropolitan University, [email protected]

Mark Roche

Manchester Metropolitan University, [email protected]

ABSTRACT

Provision of timely, relevant and constructive feedback is an essential component of student learning. It can however be challenging to provide feedback that is useful to students within an appropriate timescale. In addition, getting students to utilise feedback they receive can be problematic especially when assessment is formative rather than summative. These issues are becoming increasingly difficult as academics are faced with larger class sizes, less face-to-face contact with students and additional demands on their time.

Over a period of several years we have explored three main mechanisms for providing feedback on formatively assessed activities on units we teach on the Post-graduate Certificate/Masters in Academic Practice: written proformas, audio recordings and video screen captures. Over this period, we have explored both the participant experience of these different feedback formats and the timescales involved in producing them. This has involved gathering feedback from participants and keeping detailed logs of our activity as tutors teaching on the programme.

Findings suggest that participants find video and audio formats more personal and engaging than written proformas. Some participants find the visual impact of video screen captures particularly engaging while others find that the ease with which audio recordings can be paused and replayed encourages a more reflective approach to learning. Analysis of tutor activity logs suggests that audio feedback can be produced more efficiently than via written proformas and video screen captures, although there is variation based on the experience and preferences of individual tutors relating to different technologies.

This paper will: emphasise the need to design feedback provision into the delivery of teaching, learning and assessment; demonstrate different technological approaches to providing feedback; present findings on students' perceptions and use of feedback in different formats; and assess the relative efficiencies of the different feedback technologies from the staff perspective.

KEYWORDS

Formative, Assessment, Feedback, Technology, Efficiencies, Student Experience

Introduction

Enhancing Learning Teaching and Assessment with Technology (ELTAT) is a 30-credit optional unit on the Post-graduate Certificate and Masters in Academic Practice (PGC/MA AP) Programme taught at Manchester Metropolitan University. The unit has evolved over several years, building iteratively on tutor and participant experiences. The unit title, credit rating, duration and number of tutors have changed over the years, as summarised in Table 1.

Table 1 - Evolution of the Enhancing Learning Teaching and Assessment unit, 2008-2015

Date     Unit Title                                                Credits  Weeks  Tutor(s)
2007-09  Designing Courses for VLEs (DCVLEs)                       10       4      RC
2009-10  Designing Courses for VLEs (DCVLEs)                       10       4      RC, SG
2011-13  Designing Effective Online and Blended Learning (DEBOL)   10       4      RC, SG, with guest sessions/workshops from NR and MR
2014     Enhancing Learning Teaching and Assessment with Technology (ELTAT)  30  12  RC, SG, NR, MR

Tutor key: RC = Rod Cullen; SG = Susan Gregory; NR = Neil Ringan; MR = Mark Roche

Participants who take the unit are primarily new academic staff, but also include experienced academic colleagues, academic support staff and, increasingly, technical support staff. Typical cohorts have between 15 and 25 participants. Although the developments in Table 1 represent significant changes to the unit, the aims, learning outcomes, assessment strategy and delivery model have remained relatively consistent. The overall aim of the unit has always been to enable participants to explore the opportunities offered by the wide range of technologies available to them at MMU to support learning, teaching and assessment.

The delivery model that has been used throughout is summarised in Figure 1.


Figure 1 - ELTAT delivery model

Each week participants are required to undertake at least one preparatory task that is set online via the unit area in the institutional VLE (currently Moodle). The outputs of these tasks are integral to the in-class session for that week. These are highly interactive sessions involving group and whole-class activities designed to share experiences and promote critical dialogue. Weekly formative tasks are set as follow-up to the in-class sessions; participants receive individual, personalised feedback on these tasks, which contributes to their online preparation for the next class session. This cycle repeats throughout the duration of the unit.

Although the finer details of the assessment strategy have changed slightly over the years, the overall approach has been the same. The weekly formatively assessed activities build into a mini (e)Portfolio that reviews participants' current practice, identifies strengths and weaknesses, and reflects on experiences with technologies and on opportunities to use appropriate technologies in the context of their own practice. By the end of the unit, participants have a formative evidence base in their mini (e)Portfolio upon which to draw, and cross-reference, in producing a summatively assessed plan to implement (or enhance) an element of technology enhanced learning, teaching and/or assessment in their practice, along with a personal reflection on the learning experience of the unit.

There are some key principles worth highlighting in this approach:

Participants receive regular (weekly), rapid (before the following classroom session), personalised, developmental feedback. The aim is to give “value” to the feedback in that those who don’t engage with it find that they cannot participate and/or contribute in the class based activities and discussions as effectively as those who have. Participants are asked to “buy in” to the participatory nature of the unit right from the start.

The course exposes participants to a student experience rich in technology giving them a first-hand experience that enables them to make better-informed decisions about the potential role of technology in their own practice. In particular, participants are given feedback in a range of technological formats.

The products of the formative and summative assessments ultimately provide a set of useful working documents that participants can use, in their practice, to implement the plans they develop on the unit.

The specific technologies used to provide formative feedback have changed over the years, but they fit into three main categories:

Written text: For example, student submissions (MS Word format) are annotated (using the review comments tool) and further written feedback is provided using feedback proformas (e.g. MS Word format). Annotations and feedback proformas are returned to the student.

Audio recordings: For example, student submissions (MS Word format) are annotated (using the review comments tool) and further feedback is provided using short audio recordings (e.g. mp3 format produced using Audacity). Annotated work and audio recordings are returned to the students.

Video screencast recordings: For example, student submissions (MS Word format) are annotated (using the review comments tool) and further feedback is provided using short video screen captures (e.g. wmv format produced using BB Flashback) of the student's annotated work.

Purpose of this paper

This paper provides an evidence-based narrative account of the teaching team's and participants' experiences of technology enhanced formative assessment throughout the evolution of ELTAT from the ancestral units shown in Table 1.

Methods

This paper is the result of opportunistic action research undertaken for each iteration of the units in Table 1 throughout the period 2008 to 2014. Similar data were collected during each iteration, as follows.

Teaching team experience

To examine the time invested by tutors in producing formative feedback, all tutors teaching on the unit kept a detailed feedback activity log. This recorded the total amount of time spent producing feedback for each formative task (all versions of the unit from 2008-14); for the ELTAT version of the unit (2014), the time spent on specific technical tasks associated with the different feedback methods was also recorded.
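For illustration, the sketch below shows, in Python, one plausible shape for an entry in such a log, with per-step timings in minutes. The field names, steps and figures are illustrative assumptions, not the actual log format used on the unit.

```python
# Illustrative sketch only: a plausible record structure for a tutor
# feedback-activity log. Field names, steps and timings are assumptions,
# not the actual log format used on the unit. All times are in minutes.

from dataclasses import dataclass

@dataclass
class FeedbackLogEntry:
    tutor: str           # e.g. "RC", "SG", "NR" or "MR"
    task: str            # formative task identifier
    method: str          # "written", "audio", "screencast" or "turnitin"
    download_min: float  # retrieving the submission from the VLE
    annotate_min: float  # reading the work and annotating it
    record_min: float    # writing the proforma or recording audio/video
    publish_min: float   # rendering video to a shareable format (0 if n/a)
    upload_min: float    # returning the feedback via the VLE (0 for Turnitin)

    def total_minutes(self) -> float:
        """Total time spent producing feedback for one submission."""
        return (self.download_min + self.annotate_min + self.record_min
                + self.publish_min + self.upload_min)

# A hypothetical audio-feedback entry for one submission
entry = FeedbackLogEntry("RC", "week3-task", "audio", 2, 30, 15, 0, 4)
print(entry.total_minutes())  # 51
```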

The opportunistic nature of the research has resulted in a complex overall data set. For this reason, the statistics presented in this paper are descriptive and formal statistical comparison techniques have not been used.

The participant experience

A range of data collection methods have been used to explore the participant experience and perceptions of the feedback received. There has been some variation in the methods used between 2008 and 2014. The main methods used include:

- One-to-one participant interviews about experiences
- End of unit survey (contains free text questions on experience)
- Formative assignment submission data (tracked via VLE and ePortfolio)
- Participant attendance at face-to-face sessions
- Informal anecdotal evidence from in-class discussion of activities

This has resulted in a complex multi-source data set. The text-based data sources (interview transcripts, free text questions in surveys, etc.) have been treated holistically and thematically analysed (Creswell, 2003) using an adapted framework approach (Ritchie and Spencer, 1994). Attendance data has been subject to descriptive statistical analysis.

Early experiences

Producing formative feedback

When the unit was first delivered as DCVLEs, it was taught by a single tutor, Rod Cullen. Over the first few iterations of the unit, Rod gained considerable experience of providing the rapid, personalised formative feedback that the unit required. He had undertaken and published action research on this work (Cullen, 2010; Cullen, 2011), which demonstrated that he was able to produce equivalent amounts of feedback in written and video screen capture formats in roughly the same time, but could produce equivalent amounts of audio feedback in about half the time. As the DCVLEs unit became more popular with students on the PGC/MA AP programme and the number of students wanting to take the unit grew, it was decided to bring another tutor, Susan Gregory, on board to co-teach. This also seemed like a good opportunity to extend the scope of the action research to see if the findings in relation to the feedback efficiencies were repeatable. Consequently, when the unit ran again for the 2009/10 academic session, both Rod and Susan kept detailed logs (see Methods) of the time they spent producing formative feedback.

The results came as something of a surprise. Although the overall pattern was roughly similar for Susan, in that it took longer to produce text and screen capture feedback than audio feedback, it was taking her almost twice as long as Rod to produce feedback for each submission (Figure 2).

Figure 2 - Comparison of feedback production formats, DCVLEs 2009-10 (mean minutes/submission)

Upon reflection, the reasons for this difference were quite subtle and complex.


There were some issues to do with the technology being used. At this time, Pebblepad was being used as the ePortfolio tool; Susan was learning how to use it and upload the feedback she had produced for the first time (time for this was included in the activity logs), while Rod had been using it for several years. When producing video screen captures for the first time, she encountered some problems getting the videos into a shareable format, due to codec issues and rendering video files that were simply too big to upload to Pebblepad. It took her some time to fine-tune production processes that worked for her.

There were also some issues related to the formative tasks. The formative tasks are not easy and Susan has pointed out that it took her some time to get to grips with them and consequently to work out what feedback was required. Having delivered the unit several times previously, Rod was very familiar with the tasks, knew what was expected of the students and had previous experience of providing feedback to draw upon.

Susan was also initially uncomfortable with the recording process and found it difficult to listen to recordings of her own voice. This led her to re-record some of her early video and audio recordings, while Rod did all of his recordings in one take. Despite this, Susan felt that she was able to get up to speed with the audio format more quickly than the other methods and this is clearly borne out in Figure 2.

This was the first time that we really began to think about the extent of the “learning curve” associated with adopting new technologies into aspects of teaching, learning and assessment. Both Rod and Susan are experienced Senior Lecturers in Learning and Teaching Technologies and we had not anticipated how much difference there would be. We will return to this issue later in the paper.

Participant attendance and engagement with formative tasks

There is a great deal of published literature concerning the challenges of student engagement with feedback (e.g. Orsmond et al., 2005; Bloxham and Boyd, 2007; Jollands et al., 2009) and many readers will have academic colleagues who tell them:

“I can’t get my students to do work that doesn’t get a mark”

and/or

“My students don’t use the feedback I give them – sometimes they don’t even pick it up”

It was to address these commonly held beliefs that the delivery model in Figure 1 has been core to the delivery of all of the units that evolved into ELTAT.


Attendance at in-class sessions and submission rates for the formative tasks were recorded for each iteration of DCVLEs between 2007 and 2010. These data are summarised in Figure 3. There are several key things to note.

Figure 3 - Mean % attendance and mean % formative task submission, 2007-10 (DCVLEs)

Attendance for the in-class sessions was generally high (80-90%). Submission of the formative tasks was also reasonably high; for the first three weeks, it was consistently over 70%. Given that participants receive no marks (only feedback) on the formative tasks, this is particularly pleasing. It is also worth highlighting that debriefing and discussion of the preceding week's formative tasks is embedded in the in-class sessions, so that all participants, including those who did not complete the formative tasks, are engaged with the task and the feedback to some extent (helped by the high attendance).

Throughout the 2008-10 period of data collection, there was a characteristic drop in the number of submissions for the final task, which is set at the end of the final in-class session. The feeling of the team is that this is because there is no subsequent in-class session in which the feedback will be required; in other words, the "value" of, or motivation to receive, the feedback is missing. It is worth pointing out that not all students enrolled on the unit with the intention of completing the summative assessment for accreditation purposes; some were simply participating for personal continuing professional development (CPD). A closer inspection of the submission data showed that those students who were taking the summative assessment were the ones who submitted the final (week 4) formative task. We suggest that having a complete ePortfolio of formative tasks as the evidence base for their summative assignment adds the value that motivates these students to complete the task.

Overall, we are confident that this delivery model gives valuable feedback that motivates students to undertake formative tasks so that they are able to participate fully in interactive discussion based class sessions. Additional value is added to feedback through the tight underpinning of summative assessment with relevant developmental formative assessment.

Ongoing developments

In 2011, following a programme review, the unit title was changed to Designing Effective Online and Blended Learning (DEBOL). Although there were some small changes to the content of the unit, the delivery model, duration and the main formative tasks remained the same. Two other tutors, Neil Ringan and Mark Roche, were brought on board, primarily to spread the teaching load. Rod and Susan continued as the primary tutors providing feedback on the formative activities, although Neil also made one or two contributions at busy periods.

In 2013, the PGC/MA AP programme underwent a major review and, as a result, the unit evolved into its current form as Enhancing Learning Teaching and Assessment with Technology (ELTAT). ELTAT became a 30-credit unit delivered over a much longer period (12 weeks), giving more time to complete tasks and provide feedback. The four-person team-teaching approach continued, but was extended so that all four tutors contributed to the provision of feedback for the formative tasks (each tutor was allocated a group of 3-4 participants to work with throughout the course). We also took the opportunity to slightly redesign some of the formative tasks, with one or two becoming more substantial. We continued to use the written, audio and video screencast methods to provide feedback, but we also introduced the use of Turnitin Grademark (combining annotation, general text comments and an audio recording) for one task.

Recent experiences

Having redesigned ELTAT and expanded the teaching team, it once again seemed like a good opportunity to extend the scope of the ongoing action research and further investigate the feedback efficiencies. Accordingly, when ELTAT ran for the first time in 2014, Rod, Susan, Neil and Mark all kept detailed logs (see Methods) of the time they spent producing formative feedback.

Producing formative feedback for ELTAT

By this time, Rod and Susan had several years' experience of both the formative tasks and the technologies used to provide feedback. In contrast to their earlier experiences on DCVLEs (Figure 2), their mean times for producing feedback per submission for the first running of ELTAT are very similar (Figure 4). This time it was evident that both Neil and Mark were going through experiences similar to Susan's when she was initially getting to grips with both the requirements of the formative tasks and the technical and practical aspects of producing the feedback. This is particularly evident in relation to the tasks that received video screen capture and audio feedback: the more experienced tutors (Rod and Susan) were able to produce feedback for individual submissions in approximately half the time taken by the tutors who were providing feedback on the tasks for essentially the first time (Neil and Mark). Again, on reflection, this appears to be a result of the time required to figure out an efficient technical procedure for the feedback method; to get to grips with the learning requirements of the formative task so that appropriate feedback can be given; and to become comfortable with the recording process (both Neil and Mark reflected upon instances where they had not been satisfied with the clarity of the feedback they had recorded and decided to re-record it, which takes considerable time).

Figure 4 - Comparison of feedback production formats, ELTAT 2014 (mean minutes/submission)

Interestingly, in terms of producing written feedback, all members of the team produced the feedback in a similar time frame (≈60 minutes). Following the redesign of the unit, this was the first time that a written feedback proforma had been used for this particular task, so it is possible that all of the team were at the start of a learning curve with this task. In comparison with the written feedback that Rod and Susan produced for a different task on DCVLEs (Figure 2), it took Rod much longer to produce this written feedback and Susan slightly longer. This suggests to us that the learning curve is as much about getting to grips with the requirements of the tasks as it is about the technicalities of producing the feedback in a given technological format.

The use of Turnitin has increased massively in recent years at MMU, particularly in terms of the Grademark tools it provides for multi-layered feedback that includes text annotations on students' work (Quickmarks), general text comments and audio comments. The fact that this can be done using a single tool is powerful and attractive to colleagues. As can be seen in Figure 4, for each member of the teaching team Turnitin compared well with written feedback in terms of the time taken per submission, but it should be remembered that with Turnitin both written and audio feedback were provided.

Using data combined for the team as a whole, the mean time taken to produce feedback using the four technological approaches has been broken down into component steps (Figure 5). For the team as a whole, the mean overall time taken to produce feedback per submission is relatively consistent (≈50-60 minutes), but there is quite a lot of difference in what this time is spent on.

Figure 5 - Comparison of feedback times (team mean) per submission, broken down into component steps (ELTAT 2014)

For example, audio feedback took an average of just over 50 minutes for the whole process. This includes downloading the submission from the VLE, annotating the work with written comments, recording the audio feedback and then uploading the feedback to the VLE for the participant to access. For screencasts, the process includes the additional time-consuming activity of rendering (publishing) the video into a format suitable for sharing with the student. For all of the methods, a relatively small amount of time is taken up with the technical processes of downloading and uploading work to the VLE, although it needs to be recognised that this takes 2-3 minutes longer per submission for audio and video screen capture feedback than for written feedback, and for Turnitin there is no upload or download requirement at all. What emerges quite clearly is that, for all four methods, reading through the submissions and annotating them is the most demanding step. This ties in closely with our perception that the learning curves we are seeing are influenced as much by the need to get to grips with the learning requirements of the tasks, so that useful feedback can be given, as by learning the technological process.
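To make the aggregation concrete, the sketch below shows how logged step times could be combined into the kind of per-method team means plotted in Figure 5. The rows and numbers are invented for illustration and do not reproduce the actual data.

```python
# Illustrative sketch: aggregating per-step feedback times into team means
# per method, as plotted in Figure 5. All rows below are invented examples.

from collections import defaultdict
from statistics import mean

logs = [  # one dict per submission per tutor
    {"method": "audio", "download": 2, "annotate": 30, "record": 15,
     "publish": 0, "upload": 4},
    {"method": "screencast", "download": 2, "annotate": 28, "record": 14,
     "publish": 10, "upload": 5},
    {"method": "written", "download": 1, "annotate": 35, "record": 22,
     "publish": 0, "upload": 2},
]

steps = ["download", "annotate", "record", "publish", "upload"]
rows_by_method = defaultdict(list)
for row in logs:
    rows_by_method[row["method"]].append(row)

for method, rows in rows_by_method.items():
    step_means = {step: mean(row[step] for row in rows) for step in steps}
    total = sum(step_means.values())
    print(f"{method}: {step_means} (total {total:.0f} min/submission)")
```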

One of the advantages of collecting these data was that the team were able to feed their experiences back to the participants. This enabled in-class discussion around the participant experience of receiving feedback using the different methods to be broadened out to include informed discussion around using them in practice. These proved to be interesting and insightful discussions for both participants and the teaching team. During at least one of these discussions, the idea was raised that the different methods might provide different "quantities" of feedback; in simple terms, maybe people speak more quickly than they write.

In an attempt to explore this idea, the data and the feedback needed to be looked at again. We wanted to try to get a count for the number of words (written or spoken) in the feedback for each submission with each feedback method.

Written feedback: we simply used the word count tool in MS Word.

Screen capture and audio feedback: for each tutor, one minute of their recorded feedback was played and the number of words spoken was counted to estimate their speaking rate. The length of each audio and video recording had already been recorded, and this was simply multiplied by the tutor's speaking rate to give an estimate of the total number of words in each piece of feedback.

Turnitin: to get a word count here, we copied and pasted the general text comments from Turnitin into a Word document and used the word count tool. For the Turnitin audio feedback, we multiplied the length of the recording by the tutor's estimated speaking rate. We then added the two word counts together.

Unfortunately, we were unable to easily count the number of words in annotations (comments in MS Word and Quickmarks in Turnitin) for any of the feedback methods, so these are not included. This adds to the roughness of the estimates we have made.

Once we had a rough word count for all of the methods, this was divided by the total time taken to produce the feedback (from downloading the submission to uploading the feedback back into the VLE), giving us a rough measure of the relative efficiency of each method in terms of feedback words per minute. The results are shown in Figure 6.
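By way of illustration, the sketch below shows this calculation in Python. The speaking rates, recording lengths and production times are invented assumptions, not the actual data behind Figure 6.

```python
# Illustrative sketch of the rough efficiency measure described above:
# feedback words per minute of production time. All numbers are invented.

def spoken_words(recording_minutes: float, speaking_rate_wpm: float) -> float:
    """Estimate the words in a recording from its length and the tutor's
    speaking rate (words counted in one sampled minute of their feedback)."""
    return recording_minutes * speaking_rate_wpm

def efficiency(total_words: float, total_minutes: float) -> float:
    """Feedback words per minute of total production time
    (from download through to upload)."""
    return total_words / total_minutes

# Audio: a 6-minute recording by a tutor speaking ~140 words per minute,
# produced in 51 minutes end to end
audio_words = spoken_words(6, 140)      # ~840 words
print(efficiency(audio_words, 51))      # ~16.5 words/minute

# Turnitin: typed general comments plus an audio comment
tii_words = 250 + spoken_words(2, 140)  # 250 typed + ~280 spoken
print(efficiency(tii_words, 55))        # ~9.6 words/minute
```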

For video screen capture and audio feedback, these results add support to the notion of an experiential learning curve, with Susan and Rod producing more feedback per minute than Neil and Mark, although this is less clear between Susan and Neil for video screen captures.

Figure 6 - Comparison of relative efficiencies (feedback words per minute) for the ELTAT teaching team

For written feedback, preference as well as experience may have a role to play. Neil (the most efficient for written feedback) is very experienced at academic writing and has published widely. He has also expressed a personal preference for writing, as has Susan. Rod much prefers oral to written communication and is conscious that he needs to take great care to ensure he says what he actually means in writing, which often leads him to rework written feedback. In terms of teaching on an academic programme such as the PGC/MA AP, Mark is the least experienced member of the team and feels that he has needed to really take his time and provide a considered response on all of the tasks.

From Figure 6, the least efficient method of producing feedback across all of the teaching team would appear to be Turnitin. On reflection, we are very cautious about this result. Discussion within the team indicates that the use of Quickmarks (similar to annotations using comments in MS Word) to provide feedback was more extensive than with the other techniques. Consequently, it is highly likely that the amount of feedback provided via Turnitin has been underestimated.

Participant attendance and engagement with formative tasks

Unfortunately, our attendance data is at present incomplete for the face-to-face sessions of the 2014 and 2015 iterations of ELTAT. However, we can report that attendance was between 80 and 100% for each week of the course.

ELTAT now runs over 10 weeks with a greater number of formative tasks. It is very pleasing that the delivery model continues to facilitate engagement (60-90% submission rates) with the purely formative tasks (Figure 7).

Figure 7 - Mean % formative task submission, ELTAT 2014 and 2015

We once again see a slight drop in submission rates towards the end of the face-to-face delivery (dropping below 70%). Again, closer inspection shows that these submissions are made by those participants who are undertaking formal summative assessment; those who are taking the unit for personal CPD are less likely to submit.

Overall student perspectives

Over the eight years in which DCVLEs and DEBOL evolved into ELTAT, the teaching team have continually evaluated the formative feedback provision, and the following consistent themes have emerged.


1. Participants express a clear preference for audio and video over written feedback formats.

2. The annotations provided alongside the audio and video formats are highly valued as they provide additional context and signpost the main points made in the feedback.

3. The participants consider audio and video feedback more personal and engaging and frequently describe the spoken word as more understandable than written feedback.

4. Most of the participants were receiving feedback in audio and video format for the first time and this newness/novelty may have an influence on perceptions.

5. There is consistently a 50:50 split in preference for audio or video feedback.

6. There is some evidence to suggest that video and audio feedback are used differently:

a. Some participants describe the "high visual impact" of video screen captures, which they tend to watch in a single sitting like a TV programme.

b. Others describe using audio recordings in a more reflective approach where they start, stop, rewind and replay recordings in short sections while reading their original (annotated) piece of work.

7. The experience of receiving feedback in different formats as a student and receiving insights into the tutor experience of producing it enables participants to make informed decisions about the potential use of these methods in their own practice.


Conclusions

The action research and reflective practice undertaken have revealed complex and subtle multifaceted learning curves associated with the adoption of new technologies and the formative tasks for which they are used to provide feedback. These learning curves include: developing technical skills and processes; overcoming feelings of self-consciousness when making recordings; and, above all, developing an understanding of the learning requirements of formative tasks that allows appropriate feedback to be given.

Regardless of the technology being used to provide feedback, the team's experience indicates that the technical aspects of providing feedback are not as time consuming as the cognitive aspects of reading, understanding and determining what feedback is required.

In terms of efficiencies, it is reasonably clear that, given time to work through the required learning curve, audio feedback was produced more efficiently than written and video screen capture feedback. Despite this, and the fact that participants indicated a preference for audio and video screen captures, the team is of the opinion that the choice of feedback method should be matched to the task. For example, if the learning task focuses on students' written skills, feedback in written format would be more appropriate, while tasks that have a visual/design element are better suited to screen capture feedback.

High attendance and submission rates for the formative tasks suggest that the delivery model used for the units in this research has a positive influence on engagement by giving a clear "value" to the feedback provided on the formative tasks; this is strengthened by the suite of formative assessment tasks underpinning, and providing an evidence base for, the summative assessment of the unit.

Take home messages

To embed formative assessment into a curriculum:

1. Design it into the delivery model and make the value of the feedback clear

2. Align it with summative assessment and signpost this to students

3. Select feedback methods/technologies that best match the type of learning tasks

4. Be prepared to work through the learning curve

REFERENCES

Bloxham, S. and Boyd, P. (2007) Developing Effective Assessment in Higher Education: A Practical Guide. Berkshire, UK: Open University Press.

Creswell, J.W. (2003) Research Design: Qualitative, Quantitative, and Mixed Method Approaches. 2nd ed. California: Sage Publications.


Cullen, W.R. (2011) A multi-technology formative assessment strategy. In: Media-Enhanced Feedback Case Studies and Methods: Proceedings of the Media-Enhanced Feedback Event, Sheffield, 27 October 2010, pp. 28-33.

Cullen, W.R. (2010) Formative assessment: can technology help us to provide effective, timely feedback? Higher Education Academy Annual Conference, University of Hertfordshire, 22-23 June 2010.

Jollands, M., McCallum, N. and Bondy, J. (2009) If students want feedback why don't they collect their assignments? In: Proceedings of the 20th Australasian Association for Engineering Education Conference, University of Adelaide, pp. 735-740.

Orsmond, P., Merry, S. and Reiling, K. (2005) Biology students' utilization of tutors' formative feedback: a qualitative interview study. Assessment and Evaluation in Higher Education, 30, 369-386.

Ritchie, J. and Spencer, L. (1994) Qualitative data analysis for applied policy research. In: Bryman, A. and Burgess, R.G. (eds) Analyzing Qualitative Data. London: Routledge, Chapter 9, pp. 173-194.
