
School of Computer Science

Final Year Project Report

April 2013

An E-learning Resource for a Computer Science Topic

A final year project report submitted to the School of Computer Science at the University of Manchester

Author: Shyam Gorasia, BSc (Hons) Computer Science and Mathematics
Supervisor: Bijan Parsia


Abstract

The Unified Modelling Language (UML) is a core component of the second year software engineering module taken by computer science students at the University of Manchester. UML is not as easy as it may appear: analysis of coursework marks and interviews with students showed that many struggle with it. Although several UML e-learning resources already exist, such as websites, many are insufficient, mainly because they are text-heavy, which can be off-putting to students. Existing resources also tend to present only the theory of UML rather than explaining how the diagrams are constructed. UML is widely used in software engineering to help people understand software systems through different diagrammatic representations. As most students use the internet frequently, it is important that learning and revision tools are available online to assist with exam preparation, anytime and anywhere. This project therefore consisted of developing a UML e-learning resource aimed at second year computer science students taking the software engineering module, to serve as a supplementary revision aid. The resource is similar in spirit to Blackboard and consists of short video lectures, quizzes and exercises for students to complete as practice. This report describes the stages of the project, with visual aids where necessary.


Acknowledgements

Firstly, I would like to thank my supervisor, Bijan Parsia, for his constant support and advice throughout the course of this project. I would like to thank Kung-Kiu Lau for his support on UML diagrams. I would also like to thank all the students who agreed to take part in my questionnaires, face-to-face interviews and tests. Finally, I would like to take this opportunity to thank my family for having faith in me and encouraging me.


Contents

Chapter 1: Introduction
Chapter 2: Background Literature & Research
2.1 What is e-learning?
2.1.1 Types of e-learning
2.1.2 Pros and limitations of e-learning
2.1.3 Pedagogy of e-learning
2.1.4 Evaluating e-learning
2.2 What is the Unified Modelling Language?
2.3 Existing UML learning resources
2.4 Software Engineering course research
Chapter 3: Design & Requirements Analysis
3.1 Development technologies
3.1.1 Pathwright
3.1.2 Powerpoint
3.1.3 AT&T Natural Voices Text-to-speech
3.1.4 Cool Edit Pro 2
3.1.5 Xilisoft Powerpoint to Video Converter
3.1.6 Vidmeup
3.2 Design
3.2.1 Prototype Pathwright course path
3.2.2 Final Pathwright course path
3.3 Requirements specification
3.3.1 Functional requirements
3.3.2 Non-functional requirements
Chapter 4: Implementation
4.1 Implementation overview and approach
4.2 Prototype Implementation
4.2.1 Pathwright course path
4.2.2 Video lectures
4.3 Final Implementation
4.3.1 Pathwright course path
4.3.2 Video lectures
4.4 Problems encountered
Chapter 5: Results
5.1 Prototype implementation features
5.1 Final implementation features
5.1.1 Video lectures
5.1.2 Quizzes
5.1.3 Exercises
5.1.4 Links to other sources
5.1.5 Other features
Chapter 6: Testing and Evaluation
6.1 Testing of prototype implementation
6.1.1 Results
6.2 Testing of final implementation
6.2.1 Pre-test
6.2.2 Post-test
6.2.3 Results
6.2 Evaluation
Chapter 7: Conclusion
7.1 Project review
7.1.1 Achievements
7.1.2 Future work
Bibliography


Chapter 1: Introduction

The core aim of this project was to develop, and evaluate for pedagogic efficacy, an e-learning resource for a computer science topic. The project was motivated by the lack of existing e-learning resources for learning UML aimed at second year computer science students at the University of Manchester. UML is a highly significant area of the software engineering course, because the exams and coursework are based primarily on UML. This gap in support resources for UML at the University of Manchester was examined previously, in 2011, by a student, Monique Cadman, who addressed it with an online tool [1]. As this report will explain, the approach that student took does not, on its own, address the identifiable needs of the student body. In order to develop a valid and useful resource, the following methodology was carried out:

1. investigating whether there is a learning need for this topic, by sending out questionnaires to students, interviewing students, analysing course material and, where possible, coursework marks

2. conducting background research on the topic, by analysing currently existing e-learning resources

3. developing an e-learning resource which addresses this learning need

4. evaluating the effect of the e-learning resource on learning outcomes (grades and performance).

This report explains all of the stages completed throughout the course of the project: the background research and literature, the design stage, the implementation phase and the results achieved, followed by the testing, the evaluation and a conclusion.


Chapter 2: Background Literature & Research

This chapter will describe the theory of e-learning and also illustrate what UML is. There will also be concrete examples of existing UML e-learning resources.

2.1 What is e-learning?

A basic definition of e-learning is "technology-enhanced teaching and learning". Other definitions include:

“E-learning is the use of technology to enable people to learn anytime and anywhere” [2].

E-learning is another way of teaching and learning. Many similar terms are in use, such as technology-enhanced learning, web based learning and distributed learning [3]. E-learning is widely used around the world today and has permeated all aspects of learning due to the advancement of computers and the internet. Students at the University of Victoria in Canada provided responses on their views on using online web-based learning tools. The students particularly liked using web-based tools mainly because they were:

“convenient
accessible 24 hours a day, 7 days a week
flexible in terms of accessing information from different locations
and supportive of their learning” [4].

The students also commented that the tools were most usable when they were:

“well-designed, easy to learn, easy to use
simple to navigate and have a well-designed layout
compatible with other platforms and programs
accessible from all places outside of the university
transparent (tool does not hinder, frustrate the user)
used as up to date support for the course, not as a replacement of lectures
and relevant to the course and tied into the specific course structure and content” [5].

This provides a clear indication that students favour e-learning as a support for their studies, but also that they will only be satisfied if the tool itself meets requirements such as those above. These requirements can therefore be treated as characteristics of a good e-learning resource.


2.1.1 Types of e-learning

There are two main types of e-learning: synchronous and asynchronous.

“Asynchronous e-learning occurs when students begin and complete a training course at different times, according to their schedule” [6].

Some of the features of asynchronous e-learning include message boards, discussion groups and self-paced courses. Asynchronous e-learning essentially lets people learn at any time.

“Synchronous e-learning occurs when remote students enrol in a class that is paced at particular intervals that must be attended/completed according to a specific schedule” [7].

Some of the features of synchronous e-learning include shared whiteboards, virtual classrooms and scheduled online examinations. Synchronous e-learning essentially lets people learn from anywhere, but at scheduled times.

2.1.2 Pros and limitations of e-learning

E-learning can be compared with traditional classroom learning. The advantages and limitations of each are summarised below.

Traditional classroom learning
Advantages: feedback is instant, as students can ask instructors for assistance [8]; there is more face-to-face interaction, which students may prefer to learning on their own.
Limitations: it is more instructor-centred and students are unable to learn on their own [9]; it is more expensive [10].

E-learning
Advantages: students can study anytime and anywhere, so it is more flexible, with minimal interruption of the student’s working routine [11]; it is more learner-centred [12], as students can learn on their own; assessment activities measure the learning outcomes, providing confidence that students have learnt the subject well [13]; it is easier to update and manage.
Limitations: there is a lack of instant feedback [14], as there is not much face-to-face interaction; the time it takes for the instructor to prepare may be longer [15].

Although e-learning has its advantages, it also has several other limitations. For instance, student motivation may always be a problem regardless of how well the material is presented [16], although motivation can be a problem in classroom learning as well. There is also the case where students have many other tasks and pieces of work to do, so they may be likely to put e-learning activities or coursework assignments (if any) at the bottom of their to-do list [17].

There may be times and situations where classroom learning is more beneficial than e-learning, and vice versa.

2.1.3 Pedagogy of e-learning

Pedagogy refers to the method and practice of teaching. Pedagogic principles are hypotheses that govern the practice of teaching [18]. Some pedagogical usability criteria for e-learning resources include: time, where it is possible for students to learn a subject within a short but reasonable duration; interactivity, where students can enjoy user-friendly interfaces to access subject information and materials; autonomy, where students are able to work on their own without depending entirely on the teacher; and collaboration, where students can work with each other to complete tasks [19]. These criteria were taken into consideration when the e-learning resource was under development.

2.1.4 Evaluating e-learning

There are many evaluation strategies and approaches, which can be applied depending on which method is most effective.

Expert review
This type of evaluation involves ‘experts’ (possibly instructors or lecturers) evaluating the e-learning resource to see what their views are and whether they can (or are likely to) incorporate the resource into their teaching. Although this method can be helpful in some ways, it is biased and uncontrolled, as it relies on subjective impressions. There is also the problem of estimating how likely students are to use the resource, so it fails to show whether the resource actually helps.

Pre-test and post-test evaluation
This strategy involves conducting a more controlled experiment by evaluating learning with and without the e-learning resource. First, a pre-test (without using the e-learning resource), for example some form of online test, is carried out by students to identify the level at which they are currently performing. Then a post-test (after using the e-learning resource) is carried out to see whether or not there has been an improvement in performance. This method is more useful, as it indicates whether or not performance has changed after using the resource. It has an advantage over expert review because it does not rely on subjective judgement.
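To make the pre-test/post-test idea concrete, the sketch below is an illustrative addition (not from this project); the marks and variable names are hypothetical paired scores for the same students, before and after using a resource.

```python
from statistics import mean

# Hypothetical paired scores (out of 20) for the same six students.
pre_scores = [11, 14, 9, 16, 12, 13]
post_scores = [14, 15, 13, 17, 15, 16]

# Per-student improvement: post-test mark minus pre-test mark.
improvements = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Mean pre-test mark:  {mean(pre_scores):.1f}/20")
print(f"Mean post-test mark: {mean(post_scores):.1f}/20")
print(f"Mean improvement:    {mean(improvements):+.1f} marks")
print(f"Students who improved: {sum(i > 0 for i in improvements)}/{len(improvements)}")
```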


2.2 What is the Unified Modelling Language (UML)?

The Unified Modelling Language (UML) “is a graphical language for visualizing, specifying, constructing, and documenting the artefacts of a software-intensive system. The UML offers a standard way to write a system's blueprints, including conceptual things such as business processes and system functions as well as concrete things such as programming language statements, database schemas, and reusable software components” [20]. The modelling language was created by Grady Booch, together with Ivar Jacobson and James Rumbaugh, at Rational Software, a company which provided tools for software engineering practices in the 1990s [21]. UML was then adopted in 1997 by the Object Management Group (OMG), a computer standards consortium formed in 1989. UML is a critical component of software engineering and should be a familiar concept to all computer scientists. It helps to visualise a software system using diagrams, which makes the system easier to understand. It is heavily used in industry today and is fast becoming a highly sought-after skill for anyone involved in a software project.

Types of modelling

There are three types of modelling in software engineering: functional, structural and behavioural. The functional model typically captures the functionality of the system; this type of model consists of diagrams such as activity diagrams and use case diagrams. The structural model typically captures the structure of a system through class diagrams. The behavioural model typically captures the behaviour of a system through interaction diagrams such as sequence diagrams, communication diagrams and state machine diagrams. These different types of diagrams are explained below.


Types of UML diagrams

Activity Diagram

Activity diagrams can be defined as:

“graphical representations of workflows of stepwise activities and actions with support for choice, iteration and concurrency” [22].

They typically represent a step-by-step flow of the activities undertaken in a system and show the overall flow of control. Activities appear in sequence, so each one happens in a defined order. The diagram below is an example of an activity diagram for examinations.

Figure 2.2.1: Activity Diagram (exam process: the lecturer sets the exam paper; the student sits the exam paper; the lecturer receives and marks the exam script; the student receives the exam mark and, on a fail, resits the exam)

Notation used in the diagram:
Initial state: the starting state; all activity diagrams start with an initial node.
Activity: an activity or action performed.
Final state: the ending state, representing the end of the activity diagram.
Decision state: used when a choice has to be made in the system, leading to alternate paths.
Flow: shows the sequence in which the actions/activities are performed or executed.
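Since an activity diagram is essentially a picture of control flow, the following rough sketch (an illustrative addition, not part of the report) expresses the same exam workflow as ordinary Python control flow; the function names and the pass mark are hypothetical.

```python
import random

def set_exam_paper():            # activity: lecturer sets exam paper
    return "exam paper"

def sit_exam(paper):             # activity: student sits exam paper
    return "exam script"

def mark_script(script):         # activity: lecturer marks exam script (stubbed with a random mark)
    return random.randint(0, 100)

def run_exam_process(pass_mark=40):
    """Control flow mirroring the activity diagram: activities in sequence,
    a decision node on the mark, and a loop back on a fail."""
    while True:                          # initial node, then flows between activities
        paper = set_exam_paper()
        script = sit_exam(paper)
        mark = mark_script(script)       # student receives exam mark
        if mark >= pass_mark:            # decision node: pass / fail
            return mark                  # final node (pass)
        # fail branch: the student resits, so the flow returns to the start

if __name__ == "__main__":
    print("Final mark:", run_exam_process())
```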


Use Case Diagram

Use case diagrams are defined as follows:

“Use case diagrams overview the usage requirements for a system” [23].

They typically represent the functionalities of a system with its corresponding users. The diagram below is an example of a use case diagram for examinations.

Figure 2.2.2: Use Case Diagram (examinations system with its actors and use cases)

Notation used in the diagram:
System boundary: a labelled box that defines what the system is (in this case, examinations) and surrounds the use cases; actors are placed outside the system boundary.
Use cases: functions of the system, represented as ovals and labelled with verb-direct object phrases.
Relationships: lines that show the roles users have in the system, i.e. how and which actors interact with the system.
Actors: the users of the system; they can be human or an external entity such as a database, and are labelled in terms of their role in the system.
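As a loose analogy (an illustrative addition, not from the report), the actors and use cases inside the examinations system boundary could be listed as plain data; all names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UseCase:
    name: str        # labelled with a verb-direct object phrase
    actors: tuple    # actors that interact with this use case

# Hypothetical contents of the "Examinations" system boundary.
EXAMINATIONS_SYSTEM = [
    UseCase("Set exam paper", ("Lecturer",)),
    UseCase("Sit exam", ("Student",)),
    UseCase("Mark exam script", ("Lecturer",)),
    UseCase("Receive exam mark", ("Student",)),
]

for uc in EXAMINATIONS_SYSTEM:
    print(f"{uc.name}: performed with {', '.join(uc.actors)}")
```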


Class Diagrams

A class diagram is

“a UML structure diagram that shows classes with their attributes and operations, together with the associations between classes” [24].

The diagram below is an example of a class diagram for a driving school company.

Figure 2.2.3: Class Diagram (driving school company)

Notation used in the diagram:
A class consists of three sections: the first names the class, the middle section lists its attributes and the final section lists its operations. A class is something which the system will require.
Attributes: properties that describe the state of an object; they are placed in the middle section of a class.
Operations: the functions that the class can perform; they are placed in the final section of a class.
Multiplicities: the numbers on an association, which specify how many objects of one class relate to objects of another (for example, that one customer has one shopping cart).
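To relate these class diagram sections to code (an informal mapping added for illustration; the driving school classes below are hypothetical), attributes become instance fields, operations become methods, and an association with a 1..* multiplicity can be modelled as a non-empty list.

```python
class Instructor:
    """A class from a driving school model: name box, attributes, operations."""

    def __init__(self, name: str):
        # Attributes (middle section of the class box) describe object state.
        self.name = name
        self.lessons_given = 0

    # Operations (final section) are the functions the class can perform.
    def give_lesson(self) -> None:
        self.lessons_given += 1


class DrivingSchool:
    """Associated with Instructor; the list models a 1..* multiplicity,
    i.e. one school employs one or more instructors."""

    def __init__(self, instructors: list[Instructor]):
        assert len(instructors) >= 1, "multiplicity 1..* requires at least one"
        self.instructors = instructors
```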


Types of relationships in class diagrams

Association: represents a relationship between two or more classes. It is usually labelled to illustrate how the classes are related.
Generalisation: represents an ‘is a type of’ relationship between classes. For example, a student is a type of person.
Composition: represents a ‘contains’ relationship between classes. For example, a class contains students.

Figure 2.2.4: Table of relationships

Multiplicities

1: exactly one
0..1: zero or one
0..*: zero or many
1..*: one or many

Figure 2.2.5: Table of multiplicities


Examples

Figure 2.2.6: A Composition relationship (one school contains one or more students)

Figure 2.2.7: A Generalisation relationship (a student is a type of person)

Figure 2.2.8: An Association relationship (one manager manages one or more employees)
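To connect these relationship types to code (an illustrative addition, not part of the report), generalisation roughly corresponds to inheritance, and composition to one object creating and owning its parts; the class names below are hypothetical.

```python
class Person:
    def __init__(self, name: str):
        self.name = name


class Student(Person):
    """Generalisation: a student is a type of person (inheritance)."""


class School:
    """Composition: a school contains one or more students; the students
    are created by, and belong to, the school that owns them."""

    def __init__(self):
        self._students: list[Student] = []   # becomes 1..* once populated

    def enrol(self, name: str) -> Student:
        student = Student(name)
        self._students.append(student)
        return student
```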

Sequence Diagrams

A sequence diagram

“shows an interaction between objects arranged in a time sequence” [25].

The diagram below is an example of a sequence diagram showing a customer checking their balance at an ATM.

Notation used in the diagram:
The boxes at the top represent the objects participating in the sequence; they can also represent actors, which must be identified as such (as shown for the customer).
Lifelines are the vertical lines that show the life of an object during the sequence.
The narrow boxes on a lifeline represent execution occurrences (or activations); they denote when an object is receiving and sending messages.
Messages are information passed from one object to another. They are represented by solid arrows labelled with the message; return messages are represented by labelled dashed arrows.

Figure 2.2.9: A Sequence Diagram

Communication Diagrams

A communication diagram

“shows an interaction between lifelines (e.g. objects) and the context of the interaction in terms of the links between the lifelines” [26].

The diagram below is an example of a communication diagram for a student taking out a book from the university library.

Notation used in the diagram:
The boxes are the objects involved in the interaction.
The labelled arrows are the messages exchanged between the objects; they must be numbered to illustrate the order in which the interactions occur, and the arrows show the direction of interaction.
The actors involved in the interaction appear as in use case diagrams and sequence diagrams.
Plain lines show the association between the objects and actors.

Figure 2.2.10: A Communication Diagram
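As a rough illustration of the interaction the sequence diagram describes (this code is an addition; the account data and method names are hypothetical), each method call below plays the role of a message and each return value plays the role of a return message.

```python
class Bank:
    def __init__(self):
        self._balances = {"12345678": 250.0}   # hypothetical account data

    def get_balance(self, account_no: str) -> float:
        # Return message back to the ATM.
        return self._balances[account_no]


class ATM:
    def __init__(self, bank: Bank):
        self._bank = bank

    def check_balance(self, account_no: str) -> float:
        # The ATM forwards the customer's request as a message to the bank.
        return self._bank.get_balance(account_no)


# The customer (an actor) sends the first message to the ATM object.
atm = ATM(Bank())
print("Balance:", atm.check_balance("12345678"))
```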


State Machine Diagrams

A state machine diagram

“shows the different states of an entity also how entity responds to various events by changing from one state to another” [27].

The diagram below is an example of a state machine diagram showing the process that an applicant goes through for a job application.

Figure 2.2.11: A State Machine Diagram (the states an applicant goes through during a job application)

Notation used in the diagram:
Frame: labelled with the context of the state machine (in this case, applicant); it surrounds the whole state machine diagram.
State: a point during the life of an object during which it performs some activity, waits for an event to occur, or waits for some external event.
Transitions: the arrows, which indicate the relationship between one state and another.
Event: an occurrence that triggers a change in state; an event is usually used to label a transition.
Initial state and final state: as explained earlier for activity diagrams.
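To make states, events and transitions concrete, here is a small illustrative sketch (an addition to the report; the applicant states and events are hypothetical, since the original figure is not reproduced here) driven by a transition table.

```python
# Transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("Applied", "invited to interview"): "Interviewing",
    ("Applied", "application rejected"): "Rejected",        # final state
    ("Interviewing", "offer made"): "Offer received",
    ("Interviewing", "interview failed"): "Rejected",        # final state
    ("Offer received", "offer accepted"): "Hired",           # final state
}


def handle_event(state: str, event: str) -> str:
    """Apply a transition if one is defined for this state/event pair."""
    return TRANSITIONS.get((state, event), state)


state = "Applied"                                  # initial state
for event in ["invited to interview", "offer made", "offer accepted"]:
    state = handle_event(state, event)             # each event triggers a transition
    print(f"after '{event}': {state}")
```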


Although there are other UML diagrams, these are the only ones covered in the software engineering module at the University of Manchester.

2.3 Existing UML e-learning resources

A variety of different UML e-learning resources already exists. Below are some examples that are currently available.

Agile Modeling (www.agilemodeling.com/artifacts)

Figure 2.3.1: Agile Modeling website

Figure 2.3.2: Agile Modeling website


Sparx Systems (http://www.sparxsystems.com/resources/uml2_tutorial/)

Figure 2.3.3: Agile Modeling website

Figure 2.3.4: Sparx Systems website


Microsoft (http://msdn.microsoft.com/en-us/library/vstudio/dd409436.aspx)

Figure 2.3.5: Sparx Systems website

Figure 2.3.6: Microsoft website


Smart Draw (http://www.smartdraw.com/resources/tutorials/uml-diagrams/)

Figure 2.3.7: Microsoft website

Figure 2.3.8: Smart Draw website


Visual Paradigm (http://www.visual-paradigm.com/VPGallery/diagrams/index.html)

Figure 2.3.9: Visual Paradigm website

Figure 2.3.10: Visual Paradigm website


Prior student's tool (http://www2.cs.man.ac.uk/~cadmanm8/Home.html)

This was a similar project completed in 2011 by a previous student.

Figure 2.3.11: Prior student’s tool

Figure 2.3.12: Prior student’s tool


Figure 2.3.13: Prior student’s tool

Figure 2.3.14: Prior student’s tool


Figure 2.3.15: Prior student’s tool

Figure 2.3.16: Prior student’s tool


Evaluating the existing e-learning resources

The table below evaluates each e-learning resource.

Agile Modeling
Strengths: covers all the types of diagrams mentioned in the software engineering course at the University of Manchester; gives examples of the diagrams; explains components of diagrams.
Weaknesses: very text heavy; unappealing; no interactivity; does not explain how the diagrams can be constructed.

Sparx Systems
Strengths: covers all the types of diagrams mentioned in the software engineering course at the University of Manchester; explains components with examples.
Weaknesses: very text heavy; unappealing; no interactivity; does not explain how the diagrams can be constructed; no proper examples.

Microsoft
Strengths: explains components; gives examples.
Weaknesses: very text heavy; unappealing; no interactivity; does not explain how the diagrams can be constructed; does not cover all diagrams mentioned in software engineering at the University of Manchester.

Smart Draw
Strengths: slightly less text heavy; explains components of diagrams; covers all the types of diagrams mentioned in software engineering at the University of Manchester.
Weaknesses: does not explain how the diagrams can be constructed; no interactivity.

Visual Paradigm
Strengths: covers all the types of diagrams mentioned in software engineering at the University of Manchester; gives examples of diagrams; gives definitions; explains components.
Weaknesses: text heavy; no interactivity; does not explain how the diagrams can be constructed.

Prior student's tool (Monique Cadman)
Strengths: covers most of the diagrams mentioned in software engineering at the University of Manchester; highly interactive, with quizzes, crosswords, spot-the-error and match-the-definitions activities; not text heavy.
Weaknesses: does not cover communication diagrams; does not explain how the diagrams can be constructed; no reference points if students get stuck on a particular activity.

Figure 2.3.17: Table of comparisons


Most of these resources cover the UML diagrams taught in the software engineering module at the University of Manchester, so they are supportive and relevant to the module. However, there is no interactivity in any of them except the final one. E-learning resources that appear unattractive, with too much text, are likely to discourage students, so it is essential to have a resource which students are actually likely to use. The prior student's tool is very interactive, which makes learning fun and interesting. However, although it is targeted as a revision tool, it consists only of quizzes and activities, with no proper learning content to refer to. This could be a problem for students, because if they get stuck on a particular activity there is little on the tool, such as notes or lectures, to which they can refer.


2.4 Software Engineering course research

The idea was to identify a topic that was proven to be difficult for students and determine whether there was good evidence to support this. In order to do this, two things were assessed: the significance of UML, and the efficacy of a possible solution that would address this problem. Significance refers to how important the topic is; efficacy, in this context, refers to the effect that the e-learning resource would have, for instance on students' performance and grades. Ideally, a solution that improves performance and grades for a topic of high priority is most beneficial. Essentially, the topic must be of high significance, with the solution being of high efficacy.

UML at the University of Manchester

Assessing the significance of UML involved analysing current course materials. Software engineering is a compulsory module for all computer science students at the University of Manchester. The module is taught over two semesters and has three pieces of coursework based on UML in semester one, and end-of-semester exams in both semesters. The bulk of the first semester is based primarily on UML.

Coursework: the coursework is split into three exercises, which count for 15% of the course:

Exercise 1 (Requirements gathering and functional modelling), 20 marks: drawing activity and use case diagrams, with use case descriptions, for a given scenario.
Exercise 2 (Structural modelling), 20 marks: drawing class diagrams.
Exercise 3 (Behavioural modelling), 20 marks: drawing state machine, communication and sequence diagrams.

Figure 2.4.1: Coursework exercises

Exam: the exams count for 60% of the course, split approximately equally over the two semesters. The first semester exam tests students on the UML taught over that semester.

The first semester exam consists of two sections, A and B. Currently, the exam rubric is to answer all of section A and one question from section B. Section A consists of 40 multiple choice questions, and section B consists of two written questions, of which students answer one. The exam is out of 60.


The multiple choice section is not reproduced here because it was restricted. The screenshot above shows the first question in section B from the 2012 semester one exam. It can be seen that all the questions were based on UML. There were 20 marks available for this question out of the possible 60, i.e. a third of the exam marks.

Figure 2.4.2: Software Engineering exam 2012


The screenshot above shows the second question in section B from the 2012 semester one exam. There were 15 marks available, and the questions were based on UML. The screenshots illustrate that almost all of section B of the exam is based on UML, so a large proportion of the possible 60 marks depends on it. In semester one, the course consists of weekly one-hour lectures and two-hour labs for completing the coursework exercises and getting them marked. Almost all of the lectures presented for this course in semester one, from the first week onwards, are based on UML, with the last few lectures introducing different software engineering processes and user interface design. It is clear that a large amount of time is spent on this topic in semester one, which is a key indicator that the topic is very significant in the software engineering module. Now that the significance of the topic had been identified, the next step was to see whether there was a learning need, i.e. to determine whether UML was proven to be difficult for students. To do this, some primary research was conducted to gather subjective information about the topic. Acquiring the marks for the coursework exercises which students did in the first semester of the software engineering module would have been very helpful, as with this data it would be possible to analyse whether UML was actually a problem for students. However, after asking some of the software engineering lecturers and instructors at the University of Manchester, it was not possible to acquire this data, due to anonymisation issues and lecturers being busy.

Figure 2.4.3: Software Engineering exam 2012


The only other option was to set up an online survey using Survey Monkey (http://www.surveymonkey.com/), where students could submit their marks for each exercise anonymously and leave comments if necessary. This was advertised by email and on the Manchester computer science Facebook pages. There were 436 second and third year computer science students in total. Below is an image of the link students received to complete the survey.

Students could select what mark they achieved. Each exercise was out of 20 marks. 56 responses were collected out of the 436 second and third year computer science students. The figures below show the mark distributions.

Figure 2.4.4: Survey screenshot


Results

[Column chart: COMP23420 Semester 1 Exercise 1 marks; x-axis: marks (0 to 20), y-axis: number of students.]

No students achieved below 10 marks for this exercise. The mean mark was 14.9, which is approximately 75%. The median was 14.8, which is approximately 74%. The modes were 11.5 and 12, which are approximately 58% and 60% respectively. 5% of the students achieved a mark as low as 10, whilst 7% of students achieved full marks for this exercise. Some of the comments students made include:

“Missed out a few action states and used decision states incorrectly.”
“I thought this coursework was very easy, but tended to lose marks on missing out actors and use cases in the use case diagram.”
“I combined two actors in to one. The placement actor and arrangement actor. I assumed they could be one person which is technically correct however they may have different roles at different stages of the system.”
“I found this exercise the most enjoyable out of the three.”
“Used too many actors and use cases in the use case diagram.”

Figure 2.4.5: Survey results for Exercise 1
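As an aside (not part of the original analysis), descriptive statistics like the ones quoted above can be computed with Python's statistics module; the marks list below is hypothetical, standing in for the survey responses.

```python
from statistics import mean, median, multimode

# Hypothetical self-reported marks out of 20 for one exercise.
marks = [10, 11.5, 11.5, 12, 12, 13, 14, 14.5, 15, 16, 17, 18, 20, 20]
full_marks = 20

print(f"Mean:   {mean(marks):.1f}  ({mean(marks) / full_marks:.0%})")
print(f"Median: {median(marks):.1f}  ({median(marks) / full_marks:.0%})")
print(f"Mode(s): {multimode(marks)}")
print(f"Achieved full marks: {marks.count(full_marks) / len(marks):.0%} of respondents")
```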


[Column chart: COMP23420 Semester 1 Exercise 2 marks; x-axis: marks (0 to 20), y-axis: number of students.]

The mean mark was 14.0, which is 70%. The median was 13.5, which is approximately 68%. The mode was 10.5, which is approximately 53%. 2% of students achieved the lowest mark of 9, whilst 7% of students achieved full marks. Some of the comments students made about this exercise include:

“I thought this was the hardest. I failed to understand the types of relationship which lost me marks, also some multiplicity errors.”

“This was very difficult, and I'd say I failed to understand class diagrams.”

“Coursework was really useful when it came to revising for exams.”

“This was more challenging than exercise 1 but again I found it a case of following the logic presented in order to arrive at a correct solution.”

Figure 2.4.6: Survey results for Exercise 2


[Column chart: COMP23420 Semester 1 Exercise 3 marks; x-axis: marks (0 to 20), y-axis: number of students.]

The graph shows that, as in the first exercise, no students scored below 10. The mean mark was 14.6, which is approximately 73%. The median was 14.0, which is 70%. 4% of students achieved a mark as low as 10, whilst 9% of students achieved full marks. Some of the comments students made about this exercise include:

“Coursework was really useful when revising for exams.”

“I went into more detail than necessary for the marks in my state machine diagram, encapsulating far too much detail.”

“The coursework was easy. Some of my diagrams were not properly labelled and one had improperly named transitions.”

Figure 2.4.7: Survey results for Exercise 3


Overall, many students lost marks mainly because of problems with the syntax of UML. The column graphs show a wide spread of marks. Many of the students found the first exercise the easiest of the three, with the second being the hardest. After analysing the coursework marks, some further primary research was required to obtain subjective information and generate a hypothesis. This was done by conducting in-depth interviews with a small number of second and third year computer science students from the University of Manchester, who had already taken, or were currently taking, the second year software engineering module. The goal was to find out about their experiences with UML. A total of 5 students were interviewed. Most of them had similar views and comments. The results are shown below.

1. General views on UML: what was easy and what was hard? What were the problems, and were they understood? What were the major issues?

How did you find UML?

[Bar chart: responses Easy or Hard; x-axis: number of students (0 to 5).]

60% of the students found UML hard, whilst the other 40% found it easy. Three of the students gave similar responses:

“I mostly found understanding the diagrams difficult, and understanding the different components of the diagrams and what they represented.”
“Components of each diagram were not clearly explained in the lectures.”

The other two students had a similar response:

“UML was a good topic to learn about and was easy. There weren’t many problems.”

Figure 2.4.8: Interview results


2. Views on the learning tools on Moodle for UML. What were the major issues? What was good and bad about the learning tools on Moodle?

How helpful were the learning materials on Moodle?

[Bar chart: responses Very helpful, Helpful, OK, Not helpful, Not helpful at all; x-axis: number of students (0 to 5).]

60% of the students found the materials on Moodle unhelpful in some way or other, whilst the other 40% thought they were OK. Three of the students' responses were:

“The diagrams provided as supporting material were good but not enough. Needed more examples to do the coursework and understand the concepts.”
“Lectures were not really helpful, and didn’t explain how to construct the diagrams.”
“Components of each diagram were not clearly explained in the lectures.”
“The active learning sheets were good.”

The other two students’ responses were:

“UML was a good topic to learn about and was easy. There weren’t many problems.”
“The active learning sheets were really helpful.”

Figure 2.4.9: Interview results


3. Views on the coursework exercises. Which exercises were hardest and easiest? What were the problems and major issues?

How did you find Exercise 1?

[Bar chart: responses Easy or Hard; x-axis: number of students (0 to 5).]

40% of the students found this particular exercise difficult, whilst the other 60% thought it was easy. Many of the students found this exercise quite enjoyable compared to the other two. Three of the students' responses were:

“It wasn’t easy, but easiest out of 3. I lost marks for missing out states in the activity diagram.”

The other two students made no comments on this exercise because they found it relatively straightforward.

Figure 2.4.10: Survey results


How did you find Exercise 2?

[Bar chart: responses Easy or Hard; x-axis: number of students (0 to 5).]

80% of the students found this particular exercise difficult, whilst the other 20% (one student) thought it was easy. Four of the students' responses were:

“It was the hardest out of the three. I didn’t understand domain and system classes. It wasn’t very well explained, and there were not many examples.”
“It was hard. I failed to understand the difference between a class diagram for a domain model, and a class diagram for system classes.”
“I tended to lose marks for using the wrong type of relationship between classes i.e. a composition relationship rather than an association.”
“I lost marks for not labelling relationships with multiplicities.”

The other student found the exercise easy but lost a mark:

“I lost one mark for forgetting to label the multiplicity of a relationship.”

Figure 2.4.11: Interview results


How did you find Exercise 3?

[Bar chart: responses Easy or Hard; x-axis: number of students (0 to 5).]

40% of the students found this particular exercise difficult, whilst the other 60% thought it was relatively easy. Two of the students' responses were:

“I lost marks for syntax i.e. not labelling the transition with correct event name.”
“I lost marks for the sequence diagrams because I forgot that messages always had a corresponding return message.”

The other three students, who found it easy, made no comments.

4. What were their views on the exam questions? What were the major issues?

Some of the comments that the students made were:

“The exam questions were fair, but because I found the coursework difficult, I made the same mistakes in the exam.”
“I thought they were easy, because I understood the coursework exercises and did them correctly.”
“The hardest questions were on class diagrams.”
“The questions were similar to coursework exercises, but I tended to lose marks for syntax.”
“I found them easy, as I did well in the coursework.”

The first comment shows that the coursework did not achieve the desired learning outcome, as the student made the very same errors in the exam.

Figure 2.4.12: Survey results


5. What would be helpful in learning this topic? Would a revision tool be helpful in understanding this topic?

Do you think a UML revision tool would be useful?

[Bar chart: responses Yes or No; x-axis: number of students (0 to 5).]

20% of the students thought that some sort of UML revision tool would not be helpful, as the materials already available were sufficient for them. The other 80%, who were the students that had struggled, thought it would be helpful. The students made comments and possible suggestions, which were taken into consideration:

“Short tests, quizzes and modelling exercises to do in own time to consolidate understanding.”
“More examples of each diagram.”
“A set of revision lectures and test questions focused specifically on UML.”
“Typical exam style questions with solutions.”

Most of the students thought that a tool based purely on UML would be very helpful and would help with revision.

6. Other comments or feedback.

“Perhaps creating some sort of online tool like Blackboard with lectures and quizzes.”
“There should be plenty of examples of diagrams to help students understand.”
“The UML learning tool on Moodle was pretty good but not enough. Could perhaps improve on this, by creating something which would prepare you for exam questions.”
“Feedback given for coursework not very helpful, there were no model solutions.”
“Struggled on coursework exercises due to lack of material available on UML.”
“A set of lectures focused entirely on UML would be a good idea with examples of diagrams.”
“Lectures didn’t explain how the diagrams are constructed.”

Figure 2.4.13: Survey results


The interview results show that many students lost marks because they found the lectures and course material on Moodle unhelpful. This indicated that the current method of learning was not favoured by these students: UML was in fact very significant, but the current learning method was having a low effect on learning outcomes. Ultimately, the research conducted shows that there is in fact a learning need here, because the topic is highly important and the current method of learning has a low effect on learning outcomes.

At this point, a possible solution was required which would address this problem and improve performance and learning outcomes.


Chapter 3: Requirements Analysis & Design

This chapter illustrates the requirements and design of the e-learning resource which was implemented. It also justifies the methods and technologies which were used for the implementation.

3.1 Development technologies

Several methods were taken into consideration, but it was important to use the method that would be most effective. The following options were all considered: a website, a video, a self-assessment quiz, or a tool called Pathwright. Although a website would have been appropriate, it would have served the same purpose as the existing websites evaluated earlier, which were unappealing and very text heavy. There was also the option of creating an educational video which students could watch, or a self-assessment quiz which students could use to test themselves. However, Pathwright had all of these features built in, along with other capabilities, as described below.

3.1.1 Pathwright

Pathwright (www.pathwright.com) is a Blackboard-like tool which enables users to create online courses consisting of a step-by-step learning path (also known as a course path), with features such as embedded videos, audio, readings, assessments and live streams. Pathwright was the clear choice, because a more sophisticated, interactive learning tool could be developed with it. It was better than a website because it could incorporate a mixture of videos, audio and self-assessment quizzes, which would maximise the effectiveness of the course path as a revision aid.

3.1.2 Powerpoint

To create the educational videos, the best option was to use Microsoft Powerpoint. This was because it supports adding sound files to slides and timing the effects of animations and transitions.

3.1.3 AT&T Natural Voices Text-to-Speech

Three methods of delivery alongside the slides were considered: synthesised audio description, human audio description and no audio description. The following website would be used to record the synthesised audio to support the Powerpoint slides: http://www2.research.att.com/~ttsweb/tts/demo.php. It was essential to identify which method of delivery would be most suitable and preferred by students.


3.1.4 Cool Edit Pro 2

As the AT&T website limited how much text could be recorded at once, Cool Edit Pro (an audio editing package) had to be used to merge the resulting audio files. It was expected that recording the synthesised audio in pieces would cause problems with pausing correctly between sentences. Cool Edit Pro could generate silence to use as pauses between the sentences, so that the speech flowed smoothly. This software would also be used to record the human audio description for the slides.

3.1.5 Xilisoft Powerpoint-to-Video Converter

The Powerpoint files needed to be converted to a video format so that they could be embedded within Pathwright. This converter captured slide transitions, any audio incorporated in the Powerpoint file and all custom animation effects set up in the presentation, which was an advantage over other converters that lacked these features. Alternative methods of creating the video lectures were considered, such as Camtasia, which allows recording and capturing of the screen and microphone audio, and Screenflow. The problem with these was that the Powerpoint slides would still have had to be created in any case; in addition, Screenflow was only available for the Mac operating system. For this reason, the chosen method was to create the slides and record the audio separately, and then sync the audio with the Powerpoint slides, transitions and effects.

3.1.6 Vidmeup

Once the videos were made, they needed to be uploaded and embedded within Pathwright. Many video hosting websites are available, such as Vidmeup and Vimeo. Vidmeup was chosen because it had no video upload limit.
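To illustrate the audio-merging step described in Section 3.1.4 (this sketch is an addition and uses Python's standard wave module rather than Cool Edit Pro; the file names are hypothetical, and every clip is assumed to share the same channels, sample width and rate), short WAV clips could be concatenated with a fixed pause between them.

```python
import wave

def merge_with_pauses(clip_paths, out_path, pause_seconds=0.5):
    """Concatenate WAV clips, inserting silence between them.
    Assumes every clip has the same channels, sample width and rate."""
    with wave.open(clip_paths[0], "rb") as first:
        params = first.getparams()
    # One frame of silence is sampwidth * nchannels zero bytes.
    silence = b"\x00" * int(params.framerate * pause_seconds) * params.sampwidth * params.nchannels

    with wave.open(out_path, "wb") as out:
        out.setparams(params)
        for i, path in enumerate(clip_paths):
            with wave.open(path, "rb") as clip:
                out.writeframes(clip.readframes(clip.getnframes()))
            if i < len(clip_paths) - 1:
                out.writeframes(silence)   # pause between sentences

# Hypothetical usage: merge_with_pauses(["sentence1.wav", "sentence2.wav"], "lecture1.wav")
```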


3.2 Design

3.2.1 Prototype Pathwright course path

To observe what students thought of Pathwright, a prototype course path was designed first, so that any concerns, or a dislike of the tool, would be known beforehand. As the method of delivering the video lectures was uncertain at this point, three sets of three videos were to be created: one set with synthesised audio description, one set with human audio description and one set with no audio description. This way, when the prototype was tested, it would be possible to decide which kind of delivery students preferred. A written script was created for the synthesised and human audio descriptions, to be used for recording. The table below lists the activities that the prototype course path would feature.

A prototype course path on use case diagrams, with the following activities:
Watch a video on what a use case diagram is
Watch a video that gives examples of a use case diagram
Watch a video on how to construct a use case diagram
Complete a practice quiz and an assessed quiz based on use case diagrams
Complete a modelling exercise (draw a use case diagram)

Video lectures

Instead of having a single video for each type of UML diagram, it was decided to have three. One long video might have been boring for students, whereas three short videos kept the content focused, organised and brief:

Use case diagrams Video 1: explains what use case diagrams are and their components
Use case diagrams Video 2: gives two examples of use case diagrams
Use case diagrams Video 3: explains how to construct a use case diagram from a given scenario, and concludes with instructions on what to do next and key points to remember

Figure 3.2.1.1: Table of activities on prototype Pathwright course

Figure 3.2.1.1: Table of videos for prototype Pathwright course


The number of slides needed for each video in the prototype design is shown in the table below, along with the information included in each slide of the three use case diagram videos.

Video 1, Slide 1: a title slide, with audio giving a short introduction. Example slide text: “Use Case Diagrams”.
Video 1, Slide 2: an objectives slide explaining the goals achievable after watching the three videos. Example slide text: “Objectives: know what a use case diagram is; identify components of a use case diagram; know how to create a use case diagram”.
Video 1, Slide 3: a slide giving a definition of the type of diagram and explaining its different components. Example slide text: “What is a use case diagram? A diagram that represents a system's functions. Components: use case, actor, system boundary”.
Video 2, Slide 1: a slide giving an example of the diagram. Example slide text: “Examples of use case diagrams [example diagram 1]”.
Video 2, Slide 2: a slide giving another example of the diagram. Example slide text: “Examples of use case diagrams [example diagram 2]”.


Video 3, Slide 1: a slide giving a scenario for which a diagram will be constructed.
Video 3, Slide 2: a slide showing the construction of the diagram.
Video 3, Slide 3: a summary slide finishing off, with a few points to remember.
Video 3, Slide 4: a slide introducing the next type of diagram to be discussed.
The example slide text for Video 3 included: “Constructing a use case diagram [Scenario]”, “[identify actors and system boundary]”, “What is a use case diagram? [diagram construction step-by-step]”, “It’s time for you to try [advise students to complete exercises, and quiz] [key points to remember]” and “Next time… Activity Diagrams”.

Figure 3.2.1.2: Table of slides for each video

Quizzes

As Pathwright contains a built-in feature for creating online quizzes, a set of quiz questions to complete after watching the videos was sketched out. The built-in feature allowed the creation of multiple choice and free-text questions.


Exercises

As many of the students struggled with the coursework exercises, it was vital to include some modelling exercises. For the prototype design, coursework exercise 1 from the software engineering module was used for testing purposes.

3.2.2 Final Pathwright course path

For the final Pathwright design, the course was structured in terms of weeks. There were 7 weeks in total (including an introduction week), covering the six types of UML diagram taught in the software engineering module. This kept the course structured and organised. Each week consisted of multiple activities (or steps) to complete week by week:

Week 0 (Introduction and UML): an introductory week which explains what the course is about and introduces the Unified Modelling Language. Activities: watch a video introducing the course; watch a video introducing UML; complete a quiz on UML.

Weeks 1 to 6 (Use case diagrams, Activity diagrams, Class diagrams, Sequence diagrams, Communication diagrams, State machine diagrams): each week covers a different type of UML diagram and consists of 10 steps: watch a video on what the diagram is; watch a video that gives examples of the diagram; watch a video on how to construct the diagram; complete a quiz based on the diagram; complete modelling exercises 1, 2 and 3 based on the diagram; watch an optional video explaining the solutions to the exercises; and refer to optional links and optional videos for further reading.

Figure 3.2.1.1: Table of activities for final Pathwright course
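For concreteness, the week-and-step structure in the table above could be written down as plain data, as in the sketch below (purely illustrative; the course itself was authored directly in Pathwright, and the Step/Week classes are hypothetical).

```python
from dataclasses import dataclass

@dataclass
class Step:
    kind: str       # e.g. "video", "quiz", "exercise", "link"
    title: str
    optional: bool = False

@dataclass
class Week:
    number: int
    topic: str
    steps: list

# A hypothetical encoding of one week of the course path.
week1 = Week(1, "Use case diagrams", [
    Step("video", "What are use case diagrams?"),
    Step("video", "Examples of use case diagrams"),
    Step("video", "How to construct use case diagrams"),
    Step("quiz", "Use case diagram quiz"),
    Step("exercise", "Modelling exercise 1"),
    Step("exercise", "Modelling exercise 2"),
    Step("exercise", "Modelling exercise 3"),
    Step("video", "Solutions to the exercises", optional=True),
    Step("link", "Further reading", optional=True),
    Step("link", "Further videos", optional=True),
])

print(f"Week {week1.number} ({week1.topic}) has {len(week1.steps)} steps")
```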


Video lectures

The video lectures for the final Pathwright course path followed the same structure as described previously (see Figure 3.2.1.2), so there were another five UML diagram types to create videos for, as mentioned earlier.

Quizzes

For the final Pathwright course path, a full set of quiz questions for each week was sketched out, including a mixture of different theory questions. The number of questions varied for each type of diagram.

Exercises

A full set of exercise questions (three per week) was sketched out. The questions described different scenarios for which the students had to draw diagrams. The solutions were also sketched out, and Kung-Kiu Lau (a software engineering lecturer at the University of Manchester) vetted them for correctness. The original idea was to provide the solutions in the form of a document, but it became apparent that a video with a detailed audio explanation of how each diagram is constructed might be more useful for students. As the method of delivery (synthesised audio, human audio or no audio) was still uncertain at this point, as explained earlier, the method would be decided after receiving the prototype test results.

Links to optional sources

As students may prefer to do extra reading or use other sources, it would be an advantage to make such sources (websites and other videos) available.


3.3 Requirements specification

3.3.1 Functional requirements

The functional requirements of a system define the functions that the system is required to perform [28]. They describe what the system is supposed to accomplish.

Pathwright course path
FR.1 (required): The user must be able to log in to view the course.
FR.2 (required): The user must be able to view the videos on the course.
FR.3 (required): The user must be able to complete the course step by step.

Videos
FR.4 (required): Each video must move through the slides smoothly.

Pathwright quizzes
FR.5 (required): The user must be able to select only one option from a multiple choice question.
FR.6 (required): The quiz should inform the user whether the selected multiple choice answer is correct or incorrect.

Figure 3.3.1.1: Table of functional requirements for final implementation

3.3.2 Non-functional requirements

Non-functional requirements define various constraints, performance and verification requirements, standards, limitations and validation criteria [29].

Pathwright course path
NFR.1 (required): The course path must have 7 sections (an introduction and a section on each type of UML diagram covered in the software engineering course).
NFR.2 (required): Each section must have links to steps (i.e. watch videos, take the quiz and do the exercises) for students to complete in order.
NFR.3 (required): Each section must have a video introducing the diagram and explaining what it is.
NFR.4 (required): Each section must have a video which gives examples of the diagram.
NFR.5 (required): Each section must have a video which explains how to construct the diagram.
NFR.6 (required): Each section must have a video explaining the solutions to the exercises.
NFR.7 (optional): Each section may have optional links to other sources.


NFR.8 (required): The course path must provide progress statistics.

Videos
NFR.9 (required): There must be four videos in each section.
NFR.10 (required): The first video must explain what the diagram is and explain its components.
NFR.11 (required): The second video must give examples of the diagram.
NFR.12 (required): The third video must explain how the diagram is to be constructed.
NFR.13 (required): The fourth video must explain the solutions to the exercises.
NFR.14 (optional): Each video may have an audio description.

Pathwright quizzes
NFR.15 (required): The quiz questions must be short answer or multiple choice.
NFR.16 (optional): Each multiple choice question in a particular quiz must have 4-5 options.

Exercises
NFR.17 (required): Each exercise in a section must give a scenario for the user to read and construct a diagram for.

Figure 3.3.1.2: Table of non-functional requirements for final implementation


Chapter 4: Implementation

This chapter describes the implementation phase of the e-learning resource using the designs illustrated in the previous chapter.

4.1 Implementation overview and approach

The implementation consisted of two main phases: building the course path using Pathwright, and creating the Powerpoint slides. There were several other tasks, such as converting the Powerpoint files to video format, uploading the videos to Vidmeup, embedding the videos within Pathwright, and creating the quizzes and exercises on Pathwright.

4.2 Prototype Implementation

4.2.1 Pathwright course path

Pathwright required registration, so an account was created. The prototype Pathwright course path did not involve as much development effort as the final course path. Under each week, the activities (or steps) were created for students to complete, whether they involved watching a video or taking a quiz. The steps in the prototype implementation are shown in the table below, along with the number of quiz questions and exercises.

The prototype consisted of 6 activities (or steps): steps 1 to 3 involved watching videos on use case diagrams, steps 4 to 5 involved completing quizzes, and step 6 involved completing a drawing exercise.

Figure 4.2.1.1: Steps required for prototype implementation

Week 1 (Use case diagrams): 4 practice quiz questions, 11 assessed quiz questions, 1 exercise.

Figure 4.2.1.2: Number of exercises and quizzes for prototype implementation


4.2.2 Video lectures

Three sets of videos were created on use case diagrams: one set with synthesised audio description, one set with human audio description and one set with no audio description, as described in the previous chapter.

Audio

Using the script written in the design phase, the human audio was recorded with a microphone and the synthesised audio was generated automatically using the text-to-speech converter described earlier. The synthesised audio had to be edited to merge the audio files together and to insert pauses where necessary.

Figure 4.2.2.1: Screenshot of Cool Edit Pro audio recording and editing


4.3 Final Implementation

4.3.1 Pathwright course path

As development had already begun with the prototype, it was a case of continuing that development. As explained in the previous chapter, the course was structured into weeks so that students could work through the activities (or steps) for a different diagram each week (see Figure 3.2.1.1). The number of activities in the final Pathwright course is summarised in the table below.

| Week | Number of activities (or steps) | Description |
| Week 0 – Introduction to course and UML | 5 | Steps 1–2 involved watching videos; Step 3 involved completing a quiz; Steps 4–5 included links to optional videos and further reading |
| Week 1 – Use case diagrams | 10 | Steps 1–3 involved watching videos; Step 4 involved completing a quiz; Steps 5–7 involved completing drawing exercises; Step 8 involved watching a video of solutions to the exercises; Steps 9–10 included links to optional videos and further reading |
| Week 2 – Activity diagrams | 10 | |
| Week 3 – Class diagrams | 10 | |
| Week 4 – Sequence diagrams | 10 | |
| Week 5 – Communication diagrams | 10 | |
| Week 6 – State machine diagrams | 10 | |

Figure 4.3.1.1: Table of activities in each week
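As an illustration only (the step names below are shorthand, not Pathwright terminology), the weekly structure in Figure 4.3.1.1 can be summarised as a small data structure: Week 0 has five steps and each diagram week has ten.

```python
# Hypothetical summary of the weekly step structure described in Figure 4.3.1.1.
INTRO_WEEK = ["video"] * 2 + ["quiz"] + ["optional link"] * 2
DIAGRAM_WEEK = (["video"] * 3 + ["quiz"] + ["drawing exercise"] * 3
                + ["solutions video"] + ["optional link"] * 2)

COURSE_PATH = {
    "Week 0 - Introduction to course and UML": INTRO_WEEK,
    **{f"Week {n}": DIAGRAM_WEEK for n in range(1, 7)},  # use case ... state machine diagrams
}

for week, steps in COURSE_PATH.items():
    print(week, len(steps), "steps")  # 5 steps for Week 0, 10 for each diagram week
```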


The table below shows the number of videos, quizzes, exercises and optional links to other sources included in each week of the final implementation.

| Week | Number of videos | Number of quizzes | Number of exercises | Number of optional links |
| Week 0 – Introduction to course and UML | 2 | 1 (12 questions) | 0 | 2 video links, 2 website links |
| Week 1 – Use case diagrams | 3 | 1 (18 questions) | 3 | 2 video links, 2 website links |
| Week 2 – Activity diagrams | 3 | 1 (19 questions) | 3 | 2 video links, 2 website links |
| Week 3 – Class diagrams | 3 | 1 (24 questions) | 3 | 2 video links, 2 website links |
| Week 4 – Sequence diagrams | 3 | 1 (20 questions) | 3 | 2 video links, 2 website links |
| Week 5 – Communication diagrams | 3 | 1 (16 questions) | 3 | 2 video links, 2 website links |
| Week 6 – State machine diagrams | 3 | 1 (22 questions) | 3 | 2 video links, 2 website links |

Figure 4.3.1.2: Number of videos, quiz questions and exercises in each week

4.3.2 Video lectures
The method of delivery for the final Pathwright course path was synthesised audio, because the results from the prototype showed that this method of delivery was most favoured by students (see Figure 6.1.1.11). The lengths of the videos in each week are shown in the table below:

| Week | Video | Duration |
| Week 0 – Introduction to course and UML | Video 1: Introduction to course | 1 minute 9 seconds |
| | Video 2: What is UML? | 1 minute 39 seconds |
| Week 1 – Use case diagrams | Video 1: What are use case diagrams? | 2 minutes 8 seconds |
| | Video 2: Examples of use case diagrams | 1 minute 10 seconds |
| | Video 3: How to construct use case diagrams | 2 minutes 55 seconds |
| Week 2 – Activity diagrams | Video 1: What are activity diagrams? | 2 minutes 12 seconds |
| | Video 2: Examples of activity diagrams | 1 minute 46 seconds |
| | Video 3: How to construct activity diagrams | 4 minutes 48 seconds |
| Week 3 – Class diagrams | Video 1: What are class diagrams? | 3 minutes 39 seconds |
| | Video 2: Examples of class diagrams | 1 minute 17 seconds |
| | Video 3: How to construct class diagrams | 3 minutes 13 seconds |
| Week 4 – Sequence diagrams | Video 1: What are sequence diagrams? | 2 minutes 42 seconds |
| | Video 2: Examples of sequence diagrams | 1 minute 34 seconds |
| | Video 3: How to construct sequence diagrams | 3 minutes 40 seconds |
| Week 5 – Communication diagrams | Video 1: What are communication diagrams? | 2 minutes 27 seconds |
| | Video 2: Examples of communication diagrams | 1 minute 57 seconds |
| | Video 3: How to construct communication diagrams | 3 minutes 58 seconds |
| Week 6 – State machine diagrams | Video 1: What are state machine diagrams? | 2 minutes 21 seconds |
| | Video 2: Examples of state machine diagrams | 2 minutes 43 seconds |
| | Video 3: How to construct state machine diagrams | 4 minutes 19 seconds |

The solutions to the three exercises in each week were created as one video with synthesised audio description.

Figure 4.3.2.1: Table of video duration times
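The report does not state the total running time of the lectures; summing the durations in Figure 4.3.2.1, as in the short sketch below (an illustrative check only), gives roughly 52 minutes of video across the 20 lectures.

```python
# Total running time of the lecture videos, summed from Figure 4.3.2.1.
durations = [  # (minutes, seconds) for each of the 20 videos listed above
    (1, 9), (1, 39),             # Week 0
    (2, 8), (1, 10), (2, 55),    # Week 1
    (2, 12), (1, 46), (4, 48),   # Week 2
    (3, 39), (1, 17), (3, 13),   # Week 3
    (2, 42), (1, 34), (3, 40),   # Week 4
    (2, 27), (1, 57), (3, 58),   # Week 5
    (2, 21), (2, 43), (4, 19),   # Week 6
]

total_seconds = sum(m * 60 + s for m, s in durations)
minutes, seconds = divmod(total_seconds, 60)
print(f"{len(durations)} videos, {minutes} min {seconds} s in total")  # 20 videos, 51 min 37 s in total
```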


Audio

Once the synthesised audio had been generated, there were multiple files for each slide; these were merged together using Cool Edit Pro and inserted into the slides.
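For illustration, the merging step could equally be scripted. The sketch below assumes the pydub library and invented file names (the project itself used Cool Edit Pro), concatenating the per-slide files with a short pause between slides, as was done for the prototype audio in Section 4.2.2.

```python
# Minimal sketch, assuming pydub and hypothetical file names, of merging the
# per-slide narration files into one track with a short gap between slides.
from pydub import AudioSegment

slide_files = ["slide01.wav", "slide02.wav", "slide03.wav"]  # hypothetical inputs
pause = AudioSegment.silent(duration=700)  # 700 ms gap between slides

combined = AudioSegment.empty()
for path in slide_files:
    combined += AudioSegment.from_wav(path) + pause

combined.export("use_case_lecture_audio.wav", format="wav")  # merged narration track
```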

4.4 Problems encountered
In Pathwright, the multiple choice questions could be marked automatically, but the free-text questions could not. This meant that free-text questions did not give the instant feedback that the multiple choice questions did. As this e-learning resource is intended as a supplementary revision aid, students would expect instant feedback on their answers with as little interaction with the instructor as possible, and instructors, who are likely to be busy, may not want to mark work unnecessarily. The test results also showed that the students did not prefer free-text questions (see Figure 6.1.1.9), so free-text questions were not used. In addition, practice quizzes were used instead of assessed quizzes, because assessed quizzes only allowed students to take the test once, whereas practice quizzes could be attempted multiple times, which is better for revision purposes.
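The contrast between the two question types comes down to how easily an answer can be checked automatically. The sketch below is a generic illustration (not Pathwright's implementation; the question ids and answer key are invented) of why a multiple-choice answer can be marked instantly while a free-text answer cannot.

```python
# Generic illustration of instant multiple-choice marking against an answer key.
ANSWER_KEY = {  # hypothetical quiz: question id -> index of the correct option
    "q1": 2,
    "q2": 0,
}

def mark_choice(question_id: str, selected_option: int) -> str:
    """Return immediate feedback for a multiple-choice selection."""
    return "Correct" if ANSWER_KEY[question_id] == selected_option else "Incorrect"

print(mark_choice("q1", 2))  # Correct
print(mark_choice("q2", 3))  # Incorrect
```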


Chapter 5: Results
This chapter presents the results of the final implementation of the e-learning resource, outlining its features and functions.

5.1 Prototype implementation features
The prototype implementation offered the same features as the final implementation; the only difference was that the prototype had fewer activities to complete, because it was created for testing purposes. Every feature described below was therefore included in both the prototype and the final implementation.

5.2 Final implementation features
The main purpose of this Pathwright course is to aid students with the UML; it is primarily a self-paced revision tool. Students are able to work through a step-by-step course which covers all of the content taught at Manchester and which will help future students with their coursework and exam preparation.

Figure 5.2.1 shows the course path from a user's perspective. Once a step is completed, the student progresses to the next step until all steps have been completed. The course is structured into weeks, as described earlier in the design phase, with multiple steps in each week covering different activities: watching videos, completing quizzes and exercises, and optional further reading.

Figure 5.2.1: Screenshot of Pathwright course

5.1.1 Video Lectures
The Pathwright course path has embedded video lectures which explain the different types of diagram, focusing on what they are (and their components), examples of them, and how they are constructed, as shown in the screenshot below.

5.1.2 Quizzes
The screenshots below show the quizzes which students can attempt.

Figure 5.1.1.1: Screenshot of videos on Pathwright


The quizzes will automatically tell the user whether their submitted answer was correct or incorrect as shown in the screenshots below.

Figure 5.1.2.1: Screenshot of quiz page on Pathwright

Figure 5.1.2.2: Screenshot of quiz on Pathwright


5.1.3 Exercises
The course path also contains exercises for students to complete in their own time, along with solutions explained in a video. Although there is no facility to complete the exercises online, they give students practice at drawing the different diagrams and help them prepare for the kinds of questions that typically appear in exams.

Figure 5.1.2.3: Screenshot of quiz (correct answer)

Figure 5.1.2.4: Screenshot of quiz (incorrect answer)

Figure 5.1.3.1: Screenshot of exercise page


The video solutions to the exercises are available for students to check their solutions as seen below.

Figure 5.1.3.2: Screenshot of exercise video solutions

Figure 5.1.3.3: Screenshot of exercise video solutions


5.1.4 Links to other sources
The course also provides links to optional material outside of Pathwright, for example links to videos from other sources and web links for further reading.

Figure 5.1.4.1: Screenshot of optional video links

Figure 5.1.4.2: Screenshot of optional websites for further reading


5.1.6 Other features

Step completion feature
The screenshot below shows that when a student has completed a particular step, it is marked as completed.

Progress bar feature
As steps are completed in the course, the progress bar in the top right corner is updated, as seen below.

Figure 5.1.6.2: Progress bar Figure 5.1.6.3: Progress bar

Figure 5.1.6.1: Step completion


Report feature
Students can also track their progress through a report card summarising the steps completed, the points earned for completing them, the steps remaining and other relevant information, as seen below.

Discussion feature
As this tool is to be used as a revision aid, students may discuss and work together to complete the tasks, allowing them to collaborate rather than relying entirely on the instructor. The screenshots below show the discussion link and how students can post questions and answers.

Figure 5.1.6.4: Progress feature

Figure 5.1.6.5: Report card

Figure 5.1.6.6: Discussion link

Figure 5.1.6.7: Discussion topic


Invitation feature
The Pathwright course lets students invite other students to join the course by sending them an invitation by email, as shown in the screenshot below.

Figure 5.1.6.8: Invitation feature


Chapter 6: Testing and Evaluation

This chapter will detail the testing phase of the e-learning resource with justification of the type of testing that was conducted. The results of the testing will be outlined along with an evaluation and its summary.

6.1 Testing the prototype implementation

6.1.1 Results
The screenshot above shows the prototype Pathwright course. A total of 9 students (a mixture of friends and randomly selected students) were asked to test this prototype and give feedback. Anonymous accounts were registered on Pathwright for them to log in with, so that their activity and progress could be monitored using Pathwright's moderator facilities. The testing consisted of:

Watching three videos on use case diagrams

Completing a practice quiz with 4 questions (mixture of multiple choice and short-answer)

Completing an assessed quiz with 11 questions (mixture of multiple choice and short-answer)

Completing a short modelling exercise (a coursework exercise from the software engineering module) using a drawing program (e.g. Paint)

The students were then asked to complete a short questionnaire.

Results of quizzes

Figures 6.1.1.1 and 6.1.1.2 show the practice quiz and assessed quiz results of the 9 students. Guests 1 to 3 watched the videos with synthesised audio, guests 4 to 6 watched the videos with human audio, and guests 7 to 9 watched the videos with no audio, just slides.

Figure 6.1.1.1: Pathwright practice quiz results Figure 6.1.1.2: Pathwright assessed quiz results

The average mark for the practice quiz was 86%; no student scored below 50% (2 out of 4) and 5 of the 9 students achieved full marks. For the assessed quiz, the average mark was 83%; no student scored below 64% (7 out of 11) and 3 of the 9 students achieved full marks.

Exercise Results


Figure 6.1.1.3 shows the students' exercise results. The average mark for the exercise was 16 out of 20, and no student scored below 50%. Guests 1 and 5 achieved full marks. Guests 1 to 3, who watched the videos with synthesised audio, all scored 85% or above, while the marks of guests 7 to 9, who had no audio description, lay in the 55% to 65% range. The feedback questionnaire that the students completed is summarised below:

Figure 6.1.1.3: Pathwright exercise results


1. How satisfied were you by using the Pathwright tool?
(Response options: Very satisfied, Satisfied, Unsatisfied, Very unsatisfied; see Figure 6.1.1.4.)
All nine students were satisfied to some degree, which was an encouraging sign.

2. Would you say that you are definitely in a much better position to attempt the coursework from semester 1 now than you were before?
(Response options: Yes, No, Not sure; see Figure 6.1.1.5.)
All of the students responded positively, commenting that they did better in the exercise this time round than when they originally handed in the coursework.

Figure 6.1.1.4

Figure 6.1.1.5


3. How helpful would you say the video lectures were in helping to answer the questions?
(Response options: Very helpful, Helpful, Not helpful; see Figure 6.1.1.6.)
The students strongly favoured the video lectures in the course path and said that they were very helpful.

4. Were all the questions answerable after watching the video lecture?
(Response options: Yes, No; see Figure 6.1.1.7.)
All of the students felt that the quiz questions were answerable after watching the video lectures.

Figure 6.1.1.6

Figure 6.1.1.7


5. Did the video lectures explain things well?
(Response options: Yes, No; see Figure 6.1.1.8.)
Approximately 89% of the students (8 of 9) felt that the videos explained the material well. The one student who disagreed felt that the example use case diagram was simply repeated in the video on constructing a use case diagram, and that it failed to show how use cases can be identified. This comment was taken into consideration.

6. Would you prefer multiple choice questions or short-answer questions?
(Response options: Multiple choice, Short-answer, No preference; see Figure 6.1.1.9.)
89% of the students (8 of 9) preferred multiple choice questions and 11% (1 student) had no preference. None of the students preferred short-answer questions.

Figure 6.1.1.8

Figure 6.1.1.9


7. Did the video lectures appear to be a bit long and tend to cram too much in?
(Response options: Yes, No; see Figure 6.1.1.10.)
All of the students felt that the video lectures were not too long and did not cram too much information into a single video.

8. Would you prefer the video lectures to use synthesised audio, human voice audio, or text with no audio?
(Response options: Synthesised audio, Human voice, Text (no audio), No preference; see Figure 6.1.1.11.)
67% of the students preferred synthesised audio and 33% had no preference; none favoured the human audio. Not only was it difficult to record a polished human voiceover, but the students did not prefer it either.

Figure 6.1.1.10

Figure 6.1.1.11


9. Would you prefer the video lecture to be in parts or everything in one? (i.e. one video introducing use case diagrams, another showing how to create them, etc.)
(Response options: One video, Parts, No preference; see Figure 6.1.1.12.)
78% of the students preferred the videos in parts, as they were, and 22% had no preference. The original idea was to have one video, but that would have made the video much longer and potentially more tedious to sit through than three short, focused videos.

10. What were your views on the lecture layout?
(Response options: Very well structured, Well structured, Not structured well, Not structured well at all; see Figure 6.1.1.13.)
All of the students felt that the video lectures were organised and well structured, so no problems were identified here.

Figure 6.1.1.12

Figure 6.1.1.13


General comments
Some of the general comments that the students made include:

“The human voice was very deep and dull and it put me off”.
“Generally, a good tool for revision”.
“The multiple choice questions were very good”.
“Video lectures explained things well, slightly better than the ones on Moodle”.
“A much more sophisticated UML learning tool, rather than just coursework and face to face lectures”.
“Examples of the diagram were good and clearly explained the different parts”.
“A good video on how to create a use case diagram”.
“Video on constructing a use case diagram wasn’t very good because it didn’t tell us how to identify the use cases, it just stated what they were”.
“Did much better in the exercise and had a better understanding than I did before”.
“Quiz results were good. Got all the questions correct!”

Some of the suggestions were:

“Possibly have true and false questions”.
“Could have links to other sources for extra reading”.
“Could have more questions”.
“The slides could change a bit slower, the transitions were a bit too fast. There needed to be a short gap”.

Summary
Overall, the testing of the prototype went well and received positive feedback, although the students pointed out minor drawbacks such as the concerns about the human voice. In terms of the material presented in the video lectures, the response was positive, apart from the one student who stated that use case identification needed to be explained. This was addressed by altering the construction slides so that the use cases were identified from the given scenario and explained earlier on. As the prototype was generally well received, this indicated that the proposed solution could well be successful.


6.2 Testing the final implementation

As the prototype had been tested, a more controlled pre-test and post-test was conducted on the final implementation. The idea was to assess performance and learning outcomes before and after using the e-learning resource, to see whether there had been an improvement. Testing the final Pathwright course was expected to take longer than the prototype test, as there was more material and more activities to get through. This made supervised testing with an observer impractical, because it would have taken up a lot of the students' time; having the students test it at their own time and pace was a better option, as they could save their progress and continue later, which gave them more flexibility.

6.2.1 Pre-test
The pre-test involved completing the quiz and one exercise (marked out of 20) from each week without watching the videos. It measured how much the students already knew, so that their improvement could be judged against the post-test. 12 different students, who had not been selected previously, took part in the pre-test, which was then followed by a post-test.

6.2.2 Post-test
The post-test involved the same test as the pre-test, but the students watched the video lectures before attempting the quiz questions and exercises. There was also an opportunity for students to explore some of the tool's features and capabilities after the test.

6.2.3 Results
The students completed the pre-test and post-test separately. The videos on Pathwright were taken down during the pre-test to prevent cheating, and then restored for the post-test. The results of the pre-test and post-test quizzes and exercises are shown below.


Week 0 – Introduction to course and UML

There was a quiz but no exercise for the introductory week. The quiz during the pre-test had a mean mark of 7.92 (66%), a mode of 7 (58%) and a median of 7 (58%). The quiz during the post-test had a mean mark of 11.08 (92%), a mode of 12 (100%) and a median of 12 (100%). There is clear evidence that average student performance increased after watching the video lectures.

Figure 6.2.3.1: Quiz results (pre-test) Figure 6.2.3.2: Quiz results (post-test)
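The summary statistics quoted in this section follow directly from the raw marks listed later in Figure 6.2.3.27. As an illustrative check (not part of the original analysis), the sketch below reproduces the Week 0 figures using Python's statistics module.

```python
# Reproduce the Week 0 quiz summary statistics from the raw marks in Figure 6.2.3.27.
from statistics import mean, median, mode

MAX_MARK = 12  # the Week 0 quiz was marked out of 12

pre_test  = [8, 7, 7, 11, 7, 6, 6, 12, 8, 7, 6, 10]
post_test = [11, 12, 12, 12, 12, 12, 9, 12, 10, 9, 12, 10]

def summarise(label, marks):
    """Print mean, mode and median both as raw marks and as percentages of the maximum."""
    for name, value in (("mean", mean(marks)), ("mode", mode(marks)), ("median", median(marks))):
        print(f"{label} {name}: {value:.2f} ({value / MAX_MARK:.0%})")

summarise("pre-test", pre_test)    # mean 7.92 (66%), mode 7 (58%), median 7 (58%)
summarise("post-test", post_test)  # mean 11.08 (92%), mode 12 (100%), median 12 (100%)
```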


Week 1 – Use Case Diagrams

For weeks 1 to 6, there were both exercises and quizzes to complete. The quiz during the pre-test had a mean mark of 10.42 (58%), a mode of 14 (78%) and a median of 10 (56%). The quiz during the post-test had a mean mark of 16.67 (93%), a mode of 18 (100%) and a median of 17.5 (97%). The results show a boost in average student performance after watching the video lectures.

The exercise during the pre-test had a mean mark of 16.83 (84%), a mode of 15 (75%) and a median of 17 (85%), which was a decent result. The exercise during the post-test had a mean mark of 18.75 (94%), a mode of 20 (100%) and a median of 19.5 (98%). Although the exercise had a high success rate already, the video lectures improved performance even further.

Figure 6.2.3.3: Quiz results (pre-test) Figure 6.2.3.4: Quiz results (post-test)

Figure 6.2.3.5: Exercise results (pre-test) Figure 6.2.3.6: Exercise results (post-test)


Week 2 – Activity Diagrams

The quiz during the pre-test had a mean mark of 13.08 (69%), a mode of 12 (63%) and a median of 12.5 (66%) out of a possible mark of 19. The quiz during the post-test had a mean mark of 18.00 (95%), a mode of 19 (100%) and a median of 18.5 (97%).

The exercise during the pre-test had a mean mark of 13.92 (70%), a mode of 12 (60%) and a median of 12 (60%) out of a possible mark of 20. During the post-test, the exercise had a mean mark of 19.33 (97%), a mode of 20 (100%) and a median of 20 (100%).

Figure 6.2.3.7: Quiz results (pre-test) Figure 6.2.3.8: Quiz results (post-test)

Figure 6.2.3.9: Exercise results (pre-test) Figure 6.2.3.10: Exercise results (post-test)


Week 3 – Class Diagrams

The quiz during the pre-test had a mean mark of 13.33 (56%), a mode of 11 (46%) and a median of 12.5 (52%) out of a possible mark of 24. During the post-test, the quiz had a mean mark of 21.42 (89%), a mode of 22 (92%) and a median of 22 (92%).

The exercise during the pre-test had a mean mark of 11.58 (58%), a mode of 10 (50%) and a median of 11 (55%) out of a possible mark of 20. During the post-test, the exercise had a mean mark of 14.08 (70%), a mode of 13 (65%) and a median of 13.5 (68%).

Figure 6.2.3.11: Quiz results (pre-test) Figure 6.2.3.12: Quiz results (post-test)

Figure 6.2.3.13: Exercise results (pre-test) Figure 6.2.3.14: Exercise results (post-test)


Week 4 – Sequence Diagrams

The quiz during the pre-test had a mean mark of 12.83 (64%), a mode of 10 (50%) and a median of 12 (60%) out of a possible mark of 20. During the post-test, the quiz had a mean mark of 18.33 (92%), a mode of 18 (90%) and a median of 18 (90%).

The exercise during the pre-test had a mean mark of 12.58 (63%), a mode of 11 (55%) and a median of 12 (60%) out of a possible mark of 20. During the post-test, the exercise had a mean mark of 17.83 (89%), a mode of 18 (90%) and a median of 18 (90%).

Figure 6.2.3.15: Quiz results (pre-test) Figure 6.2.3.16: Quiz results (post-test)

Figure 6.2.3.17: Exercise results (pre-test) Figure 6.2.3.18: Exercise results (post-test)


Week 5 – Communication Diagrams

The quiz during the pre-test had a mean mark of 9.42 (59%), a mode of 9 (56%) and a median of 9 (56%) out of a possible mark of 16. During the post-test, the quiz had a mean mark of 14.17 (89%), a mode of 16 (100%) and a median of 14.5 (91%).

The exercise during the pre-test had a mean mark of 13.83 (69%), a mode of 16 (80%) and a median of 13.5 (68%) out of a possible mark of 20. During the post-test, the exercise had a mean mark of 16.83 (84%), a mode of 16 (80%) and a median of 16.5 (83%).

Figure 6.2.3.19: Quiz results (pre-test) Figure 6.2.3.20: Quiz results (post-test)

Figure 6.2.3.21: Exercise results (pre-test) Figure 6.2.3.22: Exercise results (post-test)


Week 6 – State Machine Diagrams

The quiz during the pre-test had a mean mark of 13.50 (61%), a mode of 11 (50%) and a median of 12 (55%) out of a possible mark of 22. During the post-test, the quiz had a mean mark of 20 (91%), a mode of 21 (95%) and a median of 20 (91%).

The exercise during the pre-test had a mean mark of 11.92 (60%), a mode of 11 (55%) and a median of 11 (55%) out of a possible mark of 20. During the post-test, the exercise had a mean mark of 16.33 (82%), a mode of 15 (75%) and a median of 15.5 (78%).

Figure 6.2.3.23: Quiz results (pre-test) Figure 6.2.3.24: Quiz results (post-test)

Figure 6.2.3.25: Exercise results (pre-test) Figure 6.2.3.26: Exercise results (post-test)


Summary
The results show significant improvements in performance in the exercises and quizzes for each week. The table below provides a summary of the marks achieved in the quizzes each week by each student during the pre-test and post-test.

Each cell shows the pre-test mark followed by the post-test mark.

| Student | Week 0 (/12) | Week 1 (/18) | Week 2 (/19) | Week 3 (/24) | Week 4 (/20) | Week 5 (/16) | Week 6 (/22) |
| Student 1 | 8 / 11 | 14 / 15 | 11 / 19 | 15 / 21 | 20 / 20 | 9 / 16 | 11 / 22 |
| Student 2 | 7 / 12 | 13 / 15 | 15 / 19 | 15 / 20 | 16 / 18 | 9 / 16 | 11 / 21 |
| Student 3 | 7 / 12 | 14 / 15 | 15 / 19 | 19 / 22 | 13 / 18 | 10 / 11 | 11 / 21 |
| Student 4 | 11 / 12 | 9 / 16 | 13 / 18 | 11 / 22 | 15 / 18 | 15 / 17 | 21 / 21 |
| Student 5 | 7 / 12 | 9 / 14 | 12 / 16 | 12 / 24 | 12 / 19 | 11 / 14 | 10 / 19 |
| Student 6 | 6 / 12 | 8 / 18 | 9 / 16 | 11 / 24 | 11 / 19 | 7 / 14 | 13 / 18 |
| Student 7 | 6 / 9 | 11 / 18 | 16 / 19 | 13 / 23 | 11 / 17 | 7 / 13 | 15 / 19 |
| Student 8 | 12 / 12 | 12 / 18 | 19 / 20 | 12 / 19 | 12 / 20 | 9 / 15 | 10 / 17 |
| Student 9 | 8 / 10 | 7 / 18 | 12 / 17 | 14 / 19 | 14 / 20 | 8 / 15 | 12 / 22 |
| Student 10 | 7 / 9 | 8 / 18 | 14 / 18 | 11 / 18 | 10 / 18 | 7 / 16 | 20 / 20 |
| Student 11 | 6 / 12 | 10 / 18 | 12 / 17 | 16 / 23 | 10 / 16 | 11 / 12 | 16 / 20 |
| Student 12 | 10 / 10 | 10 / 17 | 9 / 19 | 11 / 22 | 10 / 17 | 10 / 12 | 12 / 20 |

The table below shows each student's percentage gain between the pre-test and post-test for the quiz in each week.

| Student | Week 0 | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 |
| Student 1 | 25% | 5% | 42% | 25% | 0% | 44% | 50% |
| Student 2 | 42% | 11% | 21% | 20% | 10% | 44% | 45% |
| Student 3 | 42% | 5% | 21% | 13% | 25% | 6% | 45% |
| Student 4 | 8% | 39% | 27% | 46% | 15% | 6% | 0% |
| Student 5 | 42% | 28% | 21% | 50% | 35% | 19% | 41% |
| Student 6 | 50% | 56% | 37% | 54% | 40% | 44% | 23% |
| Student 7 | 25% | 39% | 16% | 42% | 30% | 37% | 18% |
| Student 8 | 0% | 33% | 0% | 29% | 40% | 38% | 32% |
| Student 9 | 16% | 61% | 26% | 21% | 30% | 44% | 45% |
| Student 10 | 8% | 56% | 21% | 29% | 40% | 56% | 0% |
| Student 11 | 50% | 44% | 26% | 29% | 30% | 6% | 18% |
| Student 12 | 0% | 38% | 53% | 46% | 35% | 12% | 36% |

Figure 6.2.3.27: Quiz results summary table

Figure 6.2.3.28: Percentage gain from pre-test quiz results
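The percentage gains in Figure 6.2.3.28 correspond to the difference between the post-test and pre-test marks taken as a proportion of the maximum mark for that quiz, expressed as a whole percentage. The sketch below (an illustrative check, not the original calculation) reproduces two entries from the table.

```python
# Illustrative reproduction of the percentage-gain figures in Figure 6.2.3.28:
# gain = (post-test mark - pre-test mark) / maximum mark, as a whole percentage.
def percentage_gain(pre: int, post: int, max_mark: int) -> int:
    """Improvement between pre-test and post-test as a percentage of the maximum mark."""
    return round(100 * (post - pre) / max_mark)

# Student 1, Week 0 quiz (marked out of 12): 8 before, 11 after.
print(percentage_gain(8, 11, 12))   # 25 (%)
# Student 6, Week 1 quiz (marked out of 18): 8 before, 18 after.
print(percentage_gain(8, 18, 18))   # 56 (%)
```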


The table below provides a summary of the marks achieved in the exercises each week by each student during the pre-test and post-test.

Each cell shows the pre-test mark followed by the post-test mark (all exercises were marked out of 20; there was no exercise in Week 0, marked X).

| Student | Week 0 | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 |
| Student 1 | X | 15 / 17 | 11 / 20 | 10 / 15 | 10 / 18 | 16 / 20 | 10 / 16 |
| Student 2 | X | 15 / 17 | 15 / 20 | 9 / 10 | 11 / 18 | 16 / 16 | 10 / 16 |
| Student 3 | X | 20 / 20 | 12 / 19 | 9 / 13 | 11 / 19 | 14 / 15 | 12 / 20 |
| Student 4 | X | 20 / 20 | 13 / 20 | 11 / 13 | 18 / 20 | 14 / 15 | 11 / 14 |
| Student 5 | X | 17 / 17 | 12 / 17 | 19 / 20 | 11 / 20 | 12 / 17 | 17 / 19 |
| Student 6 | X | 17 / 20 | 10 / 19 | 10 / 15 | 13 / 20 | 15 / 17 | 14 / 15 |
| Student 7 | X | 14 / 20 | 20 / 20 | 11 / 13 | 12 / 17 | 12 / 18 | 11 / 15 |
| Student 8 | X | 13 / 16 | 20 / 20 | 15 / 17 | 10 / 16 | 13 / 18 | 10 / 15 |
| Student 9 | X | 20 / 20 | 12 / 18 | 10 / 11 | 16 / 18 | 19 / 19 | 13 / 14 |
| Student 10 | X | 18 / 19 | 19 / 20 | 12 / 14 | 14 / 16 | 11 / 16 | 13 / 15 |
| Student 11 | X | 15 / 19 | 12 / 20 | 11 / 13 | 13 / 17 | 11 / 16 | 11 / 20 |
| Student 12 | X | 18 / 20 | 11 / 19 | 12 / 15 | 12 / 15 | 13 / 15 | 11 / 17 |

The table below shows each student's percentage gain between the pre-test and post-test for the exercise in each week.

| Student | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 |
| Student 1 | 10% | 45% | 25% | 40% | 20% | 30% |
| Student 2 | 10% | 25% | 5% | 35% | 0% | 30% |
| Student 3 | 0% | 35% | 20% | 40% | 5% | 40% |
| Student 4 | 0% | 35% | 10% | 10% | 5% | 15% |
| Student 5 | 0% | 25% | 5% | 45% | 25% | 10% |
| Student 6 | 15% | 45% | 25% | 35% | 10% | 5% |
| Student 7 | 30% | 0% | 10% | 25% | 30% | 20% |
| Student 8 | 15% | 0% | 10% | 30% | 25% | 25% |
| Student 9 | 0% | 30% | 5% | 10% | 0% | 5% |
| Student 10 | 5% | 5% | 10% | 10% | 25% | 10% |
| Student 11 | 20% | 40% | 10% | 20% | 25% | 45% |
| Student 12 | 10% | 40% | 15% | 15% | 10% | 30% |

The evidence in these tables shows a significant improvement in each student's performance during the post-test. The goal was to identify a topic that was significant and needed attention, and then to design a solution that addresses the problem and is highly effective on learning outcomes (high efficacy). The pre-test and post-test results show that both of these conditions have been met, as the tool led to a major improvement in student performance.

Figure 6.2.3.29: Exercise results summary table

Figure 6.2.3.30: Percentage gain from pre-test exercise results

6.3 Evaluation
After the testing was completed, the 12 students filled in a short questionnaire, set up online using SurveyMonkey, to gather some final feedback. The responses were as follows:

1. How satisfied were you with the Pathwright tool?
Very satisfied: 75%; Satisfied: 25%; Unsatisfied: 0%; Very unsatisfied: 0%.

2. Would you say that you are in a much better position to attempt the coursework from semester 1 now than you were before?
Yes: 100%; No: 0%.

3. How structured were the video lectures?
Very well structured: 67%; Well structured: 33%; Not structured well: 0%; Not structured well at all: 0%.

4. How helpful were the video lectures in helping to answer the quiz questions?
Very helpful: 100%; Helpful: 0%; Unhelpful: 0%; Very unhelpful: 0%.

5. Were the quiz questions a good mix? (e.g. identifying the actors, the definition of an actor)
Yes: 83%; No: 17%.

6. How helpful were the video lectures in helping to attempt the exercises?
Very helpful: 58%; Helpful: 42%; Unhelpful: 0%.

7. Were the exercise scenarios enjoyable?
Yes: 100%; No: 0%.

8. Was there an improvement in your marks for the exercises and quizzes after using the tool?
Yes: 100%; No: 0%.

9. What were your views on the navigation around the tool?
Very good: 41%; Good: 42%; Adequate: 17%; Poor: 0%; Very poor: 0%. (With 12 respondents, 41% and 42% each correspond to 5 students and 17% to 2 students.)

10. Was the tool exciting and enjoyable to use?
Yes: 100%; No: 0%.

11. What did you like most and least about the Pathwright tool?

“I liked the videos. They were very useful and very professional”.
“The multiple choice quizzes were a very good interactive element”.
“The exercises were very helpful. They would be good for exam preparation”.
“The ability to monitor your own progress, and saves the progress to be resumed later”.


12. Please include any other comments, feedback or suggestions that you may have.

“It would have been much better if there was an incorporated drawing tool within Pathwright to complete the exercises with, instead of completing them using Paint or some other software”.
“A page with all definitions that were mentioned in the videos would be helpful to practice and learn definitions”.
“An end of course mock exam would have been great, with different scenarios to practice more exam-style diagram questions”.
“The tool is well designed and looks professional”.
“An excellent tool for revision”.

Summary of evaluation
After analysing the questionnaire results, there was sufficient evidence to say that the Pathwright e-learning resource was well received and highly favoured, and many of the students thought it would be effective for revision purposes. The feedback and comments above also highlight ways in which the e-learning resource could be improved.


Chapter 7: Conclusion
This chapter reviews what has been achieved throughout the course of this project and whether the goals and requirements of the project have been met. There is also a short discussion of possible improvements that could make the tool better.

7.1 Project review
The methodology used to tackle this project (as set out at the very beginning of this report) was the following:

1. investigating whether there is a learning need for this topic, by sending out questionnaires to students, interviewing students, and analysing course material and, where possible, coursework marks

2. conducting background research on the topic, by analysing existing e-learning resources and identifying what works and what does not

3. developing an e-learning resource which addresses this learning need

4. evaluating the effect of the e-learning resource on learning outcomes (grades, performance).

Overall, this methodology has been followed closely and the requirements for the project have been met. The project has been both challenging and rewarding; it has involved substantial research, development and testing, and has required considerable time and effort. It has been a success: the thesis has been addressed through the development of an interactive e-learning resource, which has been evaluated for pedagogic efficacy. The tests carried out have shown that the tool is not only useful, but effective on learning outcomes.

7.1.1 Achievements
Throughout the course of this project, a variety of new skills have been gained and existing skills, such as time management, have been improved considerably. The plan produced at the very beginning of the project was very helpful for managing the time and effort put into the various phases of the project. Following this plan left enough time to develop and test a prototype before developing the final implementation, so that any problems encountered were solved beforehand.

7.1.2 Future work
There are several improvements that could make this e-learning resource more effective, such as incorporating a drawing tool within Pathwright (a suggestion made by one of the students) so that the exercises could be attempted online rather than with software such as Paint or on paper.
