
Evaluation of Instructional Module Development System

by

Vaishnavi Raj

A Thesis Presented in Partial Fulfillment of the Requirements for the Degree

Master of Science

Approved May 2018 by the

Graduate Supervisory Committee:

Srividya Bansal, Chair Ajay Bansal

Alexandra Mehlhase

ARIZONA STATE UNIVERSITY

August 2018


ABSTRACT

Academia is not what it used to be. In today’s fast-paced world, requirements

are constantly changing, and adapting to these changes in an academic curriculum

can be challenging. Given a specific aspect of a domain, students can achieve various
levels of proficiency. Considering this wide array of needs, diverse groups need
customized course curricula. The need for an archetype

to design a course focusing on the outcomes paved the way for Outcome-based

Education (OBE). OBE focuses on the outcomes as opposed to the traditional way of

following a process [23]. According to D. Clark, the major reason for the creation of
Bloom's taxonomy was to stimulate and inspire a higher quality of thinking in
academia – incorporating not just basic fact-learning and application, but also the
evaluation and analysis of those facts and their applications [7]. Instructional Module

Development System (IMODS) is the culmination of both these models – Bloom’s

Taxonomy and OBE. It is an open-source web-based software that has been

developed on the principles of OBE and Bloom’s Taxonomy. It guides an instructor,

step-by-step, through an outcomes-based process as they define the learning

objectives, the content to be covered and develop an instruction and assessment plan.

The tool also provides the user with a repository of techniques based on the choices

made by them regarding the level of learning while defining the objectives. This helps

in maintaining alignment among all the components of the course design. The tool

also generates documentation to support the course design and provides feedback

when the course is lacking in certain aspects.

It is not just enough to come up with a model that theoretically facilitates

effective result-oriented course design. There should be facts, experiments and proof

that any model succeeds in achieving what it aims to achieve. And thus, there are two

research objectives of this thesis: (i) design a feature for course design feedback and

evaluate its effectiveness; (ii) evaluate the usefulness of a tool like IMODS on various

aspects – (a) the effectiveness of the tool in educating instructors on OBE; (b) the


effectiveness of the tool in providing appropriate and efficient pedagogy and

assessment techniques; (c) the effectiveness of the tool in building the learning

objectives; (d) the effectiveness of the tool in document generation; (e) the usability of the

tool; (f) the effectiveness of OBE on course design and expected student outcomes.

The thesis presents a detailed algorithm for course design feedback, its pseudocode, a

description and proof of the correctness of the feature, methods used for evaluation

of the tool, experiments for evaluation and analysis of the obtained results.


DEDICATION

I would like to dedicate my thesis to my family. My parents, with their

unwavering love and support, have given me the strength and determination to
persevere through the difficult stages in life, including the challenges faced
throughout the duration of my Master's; and my brother, whose faith in me has

always pushed me to do better.


ACKNOWLEDGMENTS

I would like to take this opportunity to thank Dr. Srividya Bansal for her

incredible guidance and support for my thesis and in the other aspects of my career. I

would also like to thank Dr. Ajay Bansal and Dr. Alexandra Mehlhase for being on my

committee.


TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

CHAPTER

1. INTRODUCTION
1.1 Motivation
1.2 Research Statement

2. BACKGROUND
2.1 Bloom's Taxonomy
2.2 Outcome Based Education
2.3 Related Work
2.3.1 Theoretical Models
2.3.2 Software Tools
2.4 Instructional Module Development System (IMODS)
2.4.1 Theoretical Framework
2.4.2 Flow of IMODS Application
2.4.3 Implementation of IMODS
2.4.4 Evaluation Instruments

3. APPROACH AND IMPLEMENTATION FOR COURSE DESIGN FEEDBACK
3.1 Approach
3.2 Implementation
3.3 Evaluation

4. EXPERIMENTAL STUDY FOR EVALUATION OF IMODS
4.1 Instruments for evaluation
4.1.1 Pre/Post Tests
4.1.2 Interviews
4.1.3 Document comparison/analysis
4.1.4 Usability Test
4.1.5 User Testing
4.1.6 Webinars
4.2 Effectiveness of the tool in educating the instructors on OBE
4.3 Evaluation of the repository of techniques
4.4 Effectiveness of the tool in building learning objectives
4.5 Evaluation of documents generated by the tool
4.6 Effectiveness of OBE course design and achieving student outcomes
4.7 Evaluation of the tool's usability
4.8 Study set up

5. ANALYSIS AND RESULTS
5.1 Evaluation of feedback feature
5.2 Interview Results
5.3 Webinar Results
5.4 Evaluation of the tool
5.4.1 Educating instructors on OBE
5.4.2 Repository of Techniques
5.4.3 Building Learning Objectives
5.4.4 Document Comparison
5.4.5 Usability Testing

6. CONCLUSION AND FUTURE WORK
6.1 Findings
6.2 Future Work
6.2.1 Bugs and Action Items
6.2.2 Suggested Improvements
6.3 Personal Outcomes from the Thesis

7. REFERENCES


LIST OF TABLES

Table

1. Defining Constants and Course Overview Feedback
2. Learning Objective Feedback
3. Content Feedback
4. Assessment Feedback
5. Pedagogy Feedback
6. Feedback Feature Evaluation Table
7. Tool Helpfulness Table
8. Familiarization to OBE Table
9. Repository of Techniques Table
10. Student Outcomes Table
11. Learning Objectives/Bloom's Taxonomy Table
12. Effectiveness of Backward Approach Table
13. Support Documentation Table
14. Ease of LO Construction Table
15. Documentation Table
16. Domain Category Table
17. Edit Feature Usage Table
18. Participant Information Table
19. Scores of OBE Tests
20. Updated Scores of OBE Tests
21. Learning Objectives Evaluation Table
22. Strengths and Weaknesses of the Tool
23. Bugs Found by Usability Testing
24. Suggested Improvements


LIST OF FIGURES

Figure

1. PC3 Model
2. New IMOD - Initial Screen
3. New IMOD - Course Design Status
4. Course Overview Filled
5. Instructor Information Added
6. One Learning Objective Defined
7. One Content Topic Added
8. Two Learning Objectives and Four Content Topics Defined
9. Three Learning Objectives Added
10. Six Content Topics Added
11. Assessment Technique Added for One Objective
12. Pedagogy Technique Added for One Objective
13. Assessment Techniques Added for All Objectives
14. Pedagogy Techniques Added for All Objectives - Design Complete
15. Learning Objective Construction
16. Bar Chart for Descriptive Power of Technique Titles
17. Bar Chart for Selection of Techniques
18. Bar Chart for Description of Techniques
19. Usability Results
20. Radar Charts for Usability


CHAPTER 1

INTRODUCTION

1.1 Motivation

STEM education is continuously evolving, but in today’s world there is a rapid

increase in the rate of this evolution. To go hand in hand with the changing trends,

academia needs to change the traditional strategy of teaching and focus on a result-

oriented approach, that is, an effective mechanism to design a course is required.

According to Boice [8], 95% of new instructors take three to five years to come up
with an effective course plan; only about 5% manage it in one to two
years. Usually, instructors go about teaching the same way that they were taught,

which might prove to be ineffective. Thus, there was a need to come up with a

structured methodology and a design model that would guide instructors through the

process of developing a successful course design.

Various organizations such as NSF, ABET and NAS have focused on finding

methods to support STEM education that go beyond the existing traditional methods.

The reason, according to Fairweather [22], is that recent years have seen a decrease
in the number of students choosing a STEM major, in the number of students
graduating, and in the number of students enrolling in undergraduate STEM
courses. This has various social and economic consequences. According

to Seymour and Ferrare [24], a major reason is poor STEM teaching practices in
colleges. To address these issues, NSF has funded many educational research

projects. Many universities have also tried to put in efforts to improve teaching and

learning [25].

One of the most widely used versions of Outcome-based Education is by Spady.

According to Spady [9], for the instructors to be able to successfully help the student


achieve their goal, the objectives of the course pertaining to the outcomes expected

have to be very clear. The following are the major steps to set up an effective

curriculum: (i) Defining a course objective and the Intended Learning Outcomes

(ILO); (ii) Designing assessments such that proof can be shown that the student did

indeed learn what was intended; (iii) Designing student-centered teaching and

learning activities. One of the main reasons the use of OBE is rapidly increasing is

because of its focus on the process of learning and the learning environment [9].

The principles of OBE have already inspired a few models for course design such

as – Effective Course Model by Felder and Brent [4] and Integrated Course Design by

Fink [5]. Not only is it important to come up with a course design methodology, it is

also essential that we be able to prove the efficiency of the proposed approach.

1.2 Research Statement

IMODS is one such model, implemented with the intention of providing
significant awareness of Outcome-Based Education, which focuses on outcomes
rather than the process or input. One of the goals of IMODS is to familiarize
instructors with OBE and impart some knowledge of it. Results are what matter in

STEM education and the tool is designed to structure an instructor’s thoughts while

they are going through the process of designing the course. The following are the

major goals of the thesis:

1. Design a feature for effective feedback to the instructor based on their given

learning objectives and chosen assessment and pedagogical techniques. Evaluate this

feature for its correctness by conducting experiments and measuring its
accuracy.

2. Evaluate how effective the tool is in familiarizing and educating instructors on

Outcome-based Education as well as various other aspects such as the learning


objective creation, usability and the repository of techniques available for

instruction (pedagogy) and assessments.


CHAPTER 2

BACKGROUND

2.1 Bloom’s Taxonomy

The focus of Outcome-Based Education (OBE) is on the product and not on

the process. So, defining the goals and objectives clearly is very important as the

complete course design is dependent on them. Knowledge of Bloom’s Taxonomy

helps instructors compartmentalize the various aspects of the course objectives and

design each objective with specificity and precision. Learning is classified into three

categories – cognitive (mental skills), affective (emotional growth) and psychomotor

(physical skills) [3]. There are different levels of knowledge and excellence that can

be obtained for each of the learning domains and thus we have six domain categories

– remember, understand, apply, analyze, evaluate and create [10]. Along with having

domain categories, Bloom’s taxonomy also has a mechanism of labeling the actual

content – factual, conceptual, procedural, metacognitive. Any content can encompass

a learning domain category and any combination of the knowledge dimensions.

2.2 Outcome Based Education

Unlike the traditional methodologies adopted for teaching, new approaches are
being sought due to the rapid change in technology and the need to incorporate it
into course curricula. The focus of this method, as the name

suggests, is the outcomes [12]. The learning outcomes need to be well-defined and

specific. In traditional systems, the focus is on the process and not on whether the

students learn any of the material.

The focus of OBE though is on maximizing the student performance by

carefully designing the outcomes to match the level of learning expected from the


students. The main advantages of this method are clarity (having clear, well-defined

objectives), flexibility (ability to adopt methods of instruction based on the needs of

the student), comparison (which can range from within a class up to the level of institutions)

and student involvement.

2.3 Related Work

2.3.1 Theoretical Models

Unlike traditional teaching methods, OBE is an approach in which the
outcome defines the process. Decisions need to be made on how the content is organized,
what the strategies are, and how assessment is carried out. OBE addresses many of
these requirements and has been chosen for a variety of reasons:

1. It provides a solution to both students and educators, giving students a better
success rate in learning and educators a structured way to design courses.
2. It is an easy method to learn frameworks and then design learning environments.
3. In STEM fields, outcome-based learning has gained a lot of traction with
accreditation boards such as ABET [11].

Many models have been developed to demonstrate the application of OBE for the

design of effective courses.

1. Effective Course Model by Felder and Brent: This model develops a framework for

designing instructional development programs to equip engineering educators to

adapt to the growing trends of the modern world [4]. It stresses that many
existing programs vary significantly – geographically and academically. Making a
course effective requires the application of certain criteria to the design and

delivery of the program and the involvement of faculty in the design.

2. Integrated Course Design by Fink: This model, like the above, operates on a
backward design process [5]. The desired results are identified first, and
assessments are then designed to show that those results have been achieved. The integrated course


design operates by taking the situational factors into account and applying a backward
design approach over a closed loop of three factors, i.e., Learning Goals, Teaching and
Learning Activities, and Feedback and Assessment.

3. Understanding by Design Framework (UbD): It has a three-stage backward design
process for planning the curriculum for the course [13]. It provides a template and a
set of design tools for the process. One of the major concepts in the UbD framework is
alignment. All the stages must clearly align, both to standards and to one another.
Stage 1 content and understanding should be assessed in Stage 2 and taught in Stage
3. Stage 1 identifies the priorities and gives them structure: it considers the
goals, examines established standards, and reviews the curriculum expectations. Stage 2
identifies two types of assessment, performance tasks and other evidence. The
performance tasks ask the student to apply their learning to a new problem, through
which their understanding and their ability to transfer that understanding are
judged. In Stage 3, teachers plan their lessons and learning activities based on three

different types of goals identified in Stage 1.

4. Content Assessment Pedagogy Model: The framework presented is based on

outcome-based course design [14]. One of the key features of the paper is that the
authors argue for aligning content, assessment, and delivery. A comparison of this
model can be made to an engineering design approach. It also stresses that
teachers of the 21st century need to consider setting aside their roles as teachers and
take an active role as designers of learning experiences.

One common thing that has been observed through all the models mentioned

above is the alignment between various components of a course, which will then

provide relevant information for strategies. Another objective of the Content

Assessment Pedagogy Model is to provide a model for aligning course content with

assessment and delivery that practitioners can use to inform the design or re-design

of engineering courses.


2.3.2. Software Tools

Three major software applications are discussed below.

1. Electronic Performance Support System (EPSS): A system or a category of systems

that helps users in improving performance is called an EPSS [15]. It is an electronic

environment that helps users do their job with minimal help and support by others.

Thus, it provides a complete range of information, data, tools and guidance to

enable users to do their job on their own. Using this kind of system reduces training

costs and helps save the time of the experts who would previously have to train and

help the employees with their tasks. It is also helpful in scenarios where employees

would be required to have a certain base knowledge about their job before they start

working on it, by providing them with the relevant information. As one of the features

of IMODS is guiding the user through a complex process in a systematic and
organized manner, IMODS closely resembles an EPSS.

2. Learning Management System (LMS): Some systems act as online instructors, and

sometimes even replace classroom teachers. An LMS is such a system. According to

[21], LMS is used for administration, documentation and progress reporting of the

student. These systems can be used to provide subject matter and notes to the

student, answer queries the student might have about a topic, and even conduct and

grade online assessments. In contrast, IMODS is used to facilitate the instructor in

course planning according to the instructors’ learning objectives, instead of the tool’s.

3. Knowledge Management System (KMS): A KMS is a system that helps knowledge

sharing among people. It helps to organize knowledge by sharing experiences of the

user, and by recording past successes and failures. Along with recording all the

information, it can also be used for knowledge sharing. Information stored onto this

system can be shared amongst people so that everyone with access to this system can

make use of the information previously recorded. Thus, a KMS enables


organizational learning. Examples of KMS are Blackboard [16] and Moodle [17].

IMODS is quite different from KMS in that it focuses on institutional educational

learning, and not organizational. However, a way both systems are similar is that

both make use of ontologies.

2.4 Instructional Module Development System (IMODS)

2.4.1 Theoretical Framework

There are a few tools that have already been developed for course design that

are based on OBE. All of them have considered the main components of the course

design to be learning objectives, content, assessments and pedagogy [7]. Not only

should these elements be included while designing a course, it is also important to

maintain a level of alignment between all the components. Keeping the existing

useful elements from the other models, IMODS has added various other elements

that make it different. It dives deep into the first step – defining the learning objectives.

Learning objectives are obviously one of the most important parts of

designing a result-oriented course. And thus, Performance, Content, Criteria,

Condition (PC3) model was designed. It takes into consideration the performance
(expected from the course), content (things to be learnt), criteria (level of success to
be achieved) and condition. Interactions of these elements are used to integrate the
other components – Content, Pedagogy and Assessment. Performance and criteria
are used to define Assessment; content and performance are used to define
Pedagogy; and Content is defined based on the content and condition components. The

inclusion of additional components and the focus on the alignment among all aspects

of course design makes IMODS highly effective.
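To make these relationships concrete, the sketch below models a PC3-style learning objective and the alignment mapping described above as simple Python data structures. The class and field names are illustrative assumptions for this write-up, not the actual IMODS (Grails/Groovy) implementation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningObjective:
    """A PC3-style learning objective: Performance, Content, Criteria, Condition."""
    performance: str                     # what the learner is expected to do
    content: List[str]                   # topics to be learnt
    criteria: str                        # level of success to be achieved
    condition: str                       # circumstances under which the criteria are met
    assessments: List[str] = field(default_factory=list)
    pedagogy: List[str] = field(default_factory=list)

def derive_alignment(lo: LearningObjective) -> dict:
    """Show which PC3 elements feed each downstream course-design component."""
    return {
        "assessment_inputs": (lo.performance, lo.criteria),  # performance + criteria drive assessment
        "pedagogy_inputs": (lo.content, lo.performance),     # content + performance drive pedagogy
        "content_inputs": (lo.content, lo.condition),        # content + condition drive course content
    }

Keeping these mappings explicit is what allows the tool to check that every objective stays aligned with its assessment and pedagogy choices.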


2.4.2 Flow of IMODS Application

IMODS is a web application that assists instructors in designing curriculum

based on OBE. For users not familiar with OBE, the application acts as a tutorial. As

OBE is result-oriented, IMODS first prompts the user to have a set of well-defined

learning objectives (based on learning domains and domain categories from Bloom’s

taxonomy). After learning objectives are defined based on a unique technique (PC3

model), assessment and pedagogy techniques need to be chosen. There is already a

vast database of techniques for assessments and pedagogy that are suitable for a large

portion of STEM-related courses. Each learning objective is tied to one or more

assessment and pedagogy techniques. A user can also create new techniques that
uniquely satisfy their needs. These can be saved in the database for future use as
well.

Figure 1: PC3 Model


IMODS also provides a calendar scheduling feature with which all the activities
can be organized and maintained. It generates a syllabus document with course

information and the assessment and pedagogical techniques used. The tool also

provides control to the instructors over sharing of any course related information

with the students. In the end, IMODS intends to have given the user a little more

insight into OBE and how it can be used and be influential in getting results in a

class.

2.4.3 Implementation of IMODS

The tool is meant to have a step-by-step process as mentioned in section 2.4.1.

It is an open source web application. The system architecture is the Model-View-

Controller (MVC) model. Several technologies were considered, and the best suited

technologies were chosen for implementation [20]. The framework for MVC was

chosen to be Grails (Groovy on Rails). PostgreSQL was chosen as the relational database,

along with Jenkins for continuous integration and Git for version control.

The whole system adheres to the PC3 model, making sure that alignment is

maintained between all the components – objectives, assessment and pedagogy.

2.4.4. Evaluation Instruments

New instruments need to be designed for evaluating the tool. Several different

criteria need to be taken into consideration. The following factors have been

considered with respect to the criteria for evaluation:

1. The usability of the tool [28].

2. The tool should be effective in achieving its main objective – guide the

instructor step-by-step to design a course [29].


3. The tool should generate effective documentation [1].

4. The tool should educate the instructor on OBE [1].

In order to find instruments that can help in doing the above, I read several

research papers and books on education techniques and research as well as on
web tools. For the tool's usability, I have used a questionnaire-based evaluation

[26]. A usability questionnaire contains statements regarding the tool’s intuitiveness

and usage and asks the participant to rate their experience on a Likert scale.

The tool also generates a syllabus document. This needs to be evaluated as well.

So, I collected the syllabi documents of the courses designed using the tool - before

and after. A comparison between these documents gave a clear picture of the things

lacking in the old document and the faults in the auto-generated document as well.

Based on [22], the idea of using pre- and post- tests for evaluating the

instructor’s knowledge of OBE is chosen. This method basically consists of two sets

(pre and post) of tests – with the same question set in both. The pre-test is

administered before the use of the tool and the post-test will be taken by the

participants after they have used the tool.

Apart from the above-mentioned techniques, the method of interviewing [27]

has also been chosen as a way to get more insight into the instructor’s experience

using the tool and what they felt to be good and lacking. According to Alshenqeeti

[27], interviews are a way of getting the data that cannot be uncovered with

techniques like questionnaires and observations.

Thus, the final set of techniques chosen are – questionnaires, interviews and

pre- and post-tests.


CHAPTER 3

APPROACH AND IMPLEMENTATION FOR COURSE DESIGN FEEDBACK

While the tool is intended to be intuitive, it is also necessary to provide the

user with the feedback required so that they can design the course as well as possible. So,

along with the progress bar which indicates the level of their progress on the course, a

feedback feature has been implemented. If the progress bar just indicates the level of

progress and gives no indication of how exactly to achieve a hundred percent

complete course structure, it is not very helpful. And so, in order to guide the

instructors designing the course on where exactly their course might be lacking, a

feedback feature is implemented.

3.1 Approach

The major components to have a complete course design are:

i. Course Overview

ii. Instructor Information

iii. Learning Objectives

iv. Content

v. Assessment Techniques

vi. Pedagogy Techniques

Counts are considered to determine the lacking components in the course. The

criteria for each are as follows:

i. Course Overview – The required fields should be filled.

ii. Instructor Information – At least one instructor needs to be added.

iii. Learning Objectives – At least three learning objectives have to be defined.

iv. Content – Six content topics should be defined.


v. Assessment Techniques – At least one technique has to be selected for each

learning objective.

vi. Pedagogy Techniques – At least one technique has to be selected for each

learning objective.

The previous algorithm for progress calculation took into consideration the count of

assessment and pedagogy techniques over the whole course. I have changed it so
that each objective must now have at least one assessment and one pedagogy technique.

3. 2 Implementation

The application has been designed in such a way that it requires the basic

information of the course to be filled before moving forward in the process. It means

that basically the Course Overview information is required (Course Title, Course

Number, Start Date, End Date, Start time, End time and Subject Area). This takes

care of the first requirement. Once the user fills out this information, the Instructor

field, along with the Learning Objective tab, Content tab, Assessment Tab and

Pedagogy Tab are displayed to the user.

Since the user does not get access to all the tabs in the first step, it doesn’t

make sense to give them feedback about those tabs. So, the first message says, “Please

fill the Course Overview section to see the minimum requirements for completing the

course design.” Then, once the first section is filled, information regarding the next

steps is given. The algorithm is shown in the tables below, for each component of the

course design.

// Thresholds for a complete course design
int minLo = 3;
int minContent = 6;
int minInstr = 1;

// Feedback messages
String initial = 'Please fill the course overview to see the minimum requirements to complete the course design.';
String instStatus = 'Add instructor information </br>';
String loStatus = 'At least three learning objectives need to be defined - ';
String contentStatus = 'At least six content topics need to be defined - ';
String asstStatus = 'At least one assessment technique needed for each objective. </br>';
String pedStatus = 'At least one pedagogy technique needed for each objective. </br>';

info = 'IMOD Info';

if (ImodID == "new"):
    info += initial;
else if (CourseOverviewCompleted):
    if !(InstructorInfoCompleted):
        info += instStatus;

Table 1: Defining Constants and Course Overview Feedback

int loCount = currentImod.learningObjectives.length;

// Fewer than the required three learning objectives
if (loCount < minLo):
    info += loStatus + loCount + ' defined. </br>';

Table 2: Learning Objective Feedback

int ContentCount = currentImod.contents.length;

// Fewer than the required six content topics
if (ContentCount < minContent):
    info += contentStatus + ContentCount + ' defined. </br>';

Table 3: Content Feedback

int AsstTechCount = currentImod.learningObjectives.Asst.count;

// Every learning objective needs at least one assessment technique
if (AsstTechCount < loCount):
    info += asstStatus;
    asstPercent = 100 - (loCount - AsstTechCount) / loCount * 100;
else:
    asstPercent = 100;

Table 4: Assessment Feedback

int PedTechCount = currentImod.learningObjectives.Ped.count;

// Every learning objective needs at least one pedagogy technique
if (PedTechCount < loCount):
    info += pedStatus;
    pedPercent = 100 - (loCount - PedTechCount) / loCount * 100;
else:
    pedPercent = 100;

Table 5: Pedagogy Feedback
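For readers who prefer a single executable view of Tables 1-5, the sketch below reimplements the same feedback rules in plain Python. The dictionary keys (learning_objectives, contents, instructors, and so on) are illustrative placeholders rather than the actual IMODS (Grails) data model; the per-objective counting of techniques reflects the change described in Section 3.1.

MIN_LO, MIN_CONTENT, MIN_INSTRUCTORS = 3, 6, 1

def course_design_feedback(imod):
    """Return feedback messages and per-component completion percentages for
    one IMOD, following the rules in Tables 1-5 (illustrative field names)."""
    if imod.get("is_new") or not imod.get("course_overview_complete"):
        return ["Please fill the course overview to see the minimum requirements "
                "to complete the course design."], {}
    messages = []
    if len(imod.get("instructors", [])) < MIN_INSTRUCTORS:
        messages.append("Add instructor information.")
    objectives = imod.get("learning_objectives", [])
    lo_count = len(objectives)
    if lo_count < MIN_LO:
        messages.append("At least three learning objectives need to be defined - "
                        f"{lo_count} defined.")
    content_count = len(imod.get("contents", []))
    if content_count < MIN_CONTENT:
        messages.append("At least six content topics need to be defined - "
                        f"{content_count} defined.")
    # Every learning objective needs at least one assessment and one pedagogy technique.
    assessed = sum(1 for lo in objectives if lo.get("assessments"))
    taught = sum(1 for lo in objectives if lo.get("pedagogy"))
    if assessed < lo_count:
        messages.append("At least one assessment technique needed for each objective.")
    if taught < lo_count:
        messages.append("At least one pedagogy technique needed for each objective.")
    percentages = {
        "assessment": 100 if lo_count == 0 else 100 - (lo_count - assessed) / lo_count * 100,
        "pedagogy": 100 if lo_count == 0 else 100 - (lo_count - taught) / lo_count * 100,
    }
    if not messages:
        messages = ["You have met the minimum requirements of an IMOD."]
    return messages, percentages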

The screenshots below show the step-by-step process with all the feedback provided at

each stage.

Figure 2: New IMOD - Initial Screen

Figure 3: New IMOD - Course Design Status


Figure 4: Course Overview Filled

Figure 5: Instructor Information Added

Figure 6: One Learning Objective defined

Figure 7: One content topic added

Figure 8: Two Learning Objectives and 4 content topics defined


Figure 9: Three Learning Objectives added

Figure 10: Six content topics added

Figure 11: Assessment technique added for one objective

Figure 12: Pedagogy technique added for one objective

Figure 13: Assessment techniques added for all objectives

Figure 14: Pedagogy techniques added for all objectives - design complete


3.3 Evaluation

The evaluation of this feature is presented in Chapter 5. The methodology

adopted is explained here. All the major components required for the course design

are taken into consideration. Each of those represents one column in the table. Each

row is a new IMODS. A combination of various components is used to build each of

these IMODS. The expected feedback result based on the conditions mentioned in the

previous section and the actual feedback result obtained are compared. If both the

expected outcome and the actual outcome match, then the condition for that part of

the system is considered to be correct.

Ten sample IMODS have been created for testing purposes. The table and

results are presented in Chapter 5.
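The comparison itself can be expressed as a small test harness. The sketch below assumes the course_design_feedback function from the earlier sketch and hand-written expected messages for two hypothetical sample IMODs; it is an illustration of the methodology, not the scripts used in the study.

def evaluate_feedback_feature(sample_imods, expected_feedback):
    """Compare the actual feedback messages with the expected ones for each sample IMOD."""
    results = {}
    for imod_id, imod in sample_imods.items():
        actual, _ = course_design_feedback(imod)
        results[imod_id] = (actual == expected_feedback[imod_id])
    return results

# Hypothetical usage: sample 1 is a brand-new IMOD, sample 10 is fully designed.
samples = {
    1: {"is_new": True},
    10: {"is_new": False, "course_overview_complete": True,
         "instructors": ["A"], "contents": ["t1", "t2", "t3", "t4", "t5", "t6"],
         "learning_objectives": [{"assessments": ["quiz"], "pedagogy": ["lecture"]}] * 3},
}
expected = {
    1: ["Please fill the course overview to see the minimum requirements "
        "to complete the course design."],
    10: ["You have met the minimum requirements of an IMOD."],
}
print(evaluate_feedback_feature(samples, expected))   # {1: True, 10: True}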


CHAPTER 4

EXPERIMENTAL STUDY FOR EVALUATION OF IMODS

In this section, the research questions will be further broken down into sub-

questions, the set up for the study will be explained in detail, the approaches to

answer these questions will be described and the results will be discussed.

To evaluate the tool’s performance, various aspects of it that needed to be

tested had to be listed out. The following are the sub-parts in measuring the tool’s

effectiveness in achieving its purpose.

1) How effective is the tool in imparting knowledge on OBE?

2) How effective is the tool in providing a wide selection of appropriate pedagogy and

assessment techniques?

3) How effective is the tool in the construction of learning objectives?

4) How effective is the tool with regards to the course documentation it generates?

5) How effective is OBE in course design and achieving student outcomes?

6) Is the tool user-friendly?

In the following sections, I will discuss further details about each of the above

questions. But before that, another important aspect to discuss would be the

approaches for the evaluation.

4.1 Instruments for evaluation

A specific set of methods needed to be defined for evaluation. I have employed

the following techniques for conducting my study.

1) Pre/Post Tests

2) Interviews

3) Document Comparison and Analysis


4) Usability Survey

5) User Testing

6) Webinars

4.1.1 Pre/Post Tests

The pre-test and post-test, both have the same set of questions. This

questionnaire consists of questions on OBE and Bloom's taxonomy in general. The

participants are supposed to take the pre-test before they start using the tool and the

post-test after they have finished course design. There needs to be a significant

amount of time between both the tests as recall has to be avoided as best as possible.

These tests are useful in determining if there has been any increase in the knowledge

of the participants with respect to outcome-based education. The questions were

objective questions as follows:

a. What are the three domains of learning as specified by Bloom’s Taxonomy?

b. What are the different learning categories under the Cognitive Domain?

c. Which domain involves the recall or recognition of specific facts, procedural

patterns, and concepts that serve in the development of intellectual abilities

and skills?

d. Which domain includes the manner in which we deal with things emotionally,

such as feelings, values, appreciation, enthusiasms, motivations, and

attitudes?

e. Which domain includes physical movement, coordination, and use of the

motor-skill areas?

f. What are the four types of knowledge that learners acquire?

g. List the two different kinds of Assessments.


h. Outcome-based education is a theory that is ___________

• Process-based

• Product-based

i. Which of the following is NOT an outcome?

• Solve dynamic programming problems

• Design a neural network algorithm

• Attend four workshops

• None of the above

j. Which of the following is related to Outcome-based Education (OBE)?

• Exit outcomes are a critical factor

• Input based education

• Result oriented thinking

• Emphasis is on the educational process
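Because the pre- and post-test share the same questions, the knowledge gain can be summarized per participant as a simple score difference. The sketch below is a generic illustration of that analysis with made-up scores, not data or code from the study.

def score_gains(pre_scores, post_scores):
    """Per-participant gain and average gain for identical pre/post tests (illustrative)."""
    gains = {p: post_scores[p] - pre_scores[p] for p in pre_scores}
    average = sum(gains.values()) / len(gains)
    return gains, average

# Example with hypothetical scores out of 10:
pre = {"P1": 3, "P2": 5, "P3": 4}
post = {"P1": 8, "P2": 7, "P3": 9}
print(score_gains(pre, post))   # ({'P1': 5, 'P2': 2, 'P3': 5}, 4.0)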

4.1.2 Interviews

A one-on-one interview is an excellent method for getting a detailed account of
the participants' thoughts and opinions. The focus of the interviews was to get as much

information regarding the process of course design as a whole, the problems they

faced while developing the course, their perspective of the tool’s role in education,

their opinion on all aspects of the process – with special focus on learning objectives

and the selection of techniques for assessments and pedagogy. Information
regarding the participants' teaching experience was also collected. The general question

list was as follows with further probing as needed:

a. What was the course that was designed? Is it new or a redesign?

b. Was the tool helpful in the course design process? How?


c. What are some of the strengths and weaknesses of the tool?

d. Can you elaborate on your understanding of OBE?

e. What is your understanding of the knowledge dimensions of the topics?

f. Do you think the choice of assessment/pedagogy techniques presented for

your course are appropriate?

g. Did the Learning Objective feature force you to think about the level of

learning for students? Elaborate.

h. Can you provide a reflection on how the course designed would help in

achieving expected student outcomes?

i. Was it useful to have the Learning Objective feature connected to Bloom’s

Taxonomy?

j. Did you find this backward/reverse approach of designing a course was

effective?

k. What do you think of the provided support and help documentation?

l. How many years of teaching experience do you have?

m. How many courses have you taught? How many of those did you build from

scratch?

n. What is the average class size in the courses you have taught?

o. Additional feedback?

The questions k-o are focused on getting the background information of the

instructor’s experience.

4.1.3 Document comparison/ analysis

The syllabus is one of the important documents in course design. It contains

all the required information that students need regarding the subject – including the


expected outcomes, the topics to be covered, the rules etc. This document is what

gives the student a firsthand idea as to what the course entails. And thus it is very

important for the document to be clear, concise and consistent. And this is what I aim

to compare – how helpful the tool is in covering all three aspects and also reducing

the instructor’s effort in writing the document from scratch.

4.1.4 Usability Test

For a software tool, no matter how amazing it is, to be truly successful, it has

to be user-friendly. This questionnaire asks the participant
several questions on the usability of the tool. Having played a role in either coming

up with the idea for the tool or brainstorming or even implementing certain features

for the tool, our perspective tends to be biased. And thus, even if we assume the tool

is user-friendly, it is important to get the user’s perspective on the matter. The

questions are designed so that they can rate the tool on a Likert scale – Strongly

agree, agree, neutral, disagree, strongly disagree. The questions in the survey are as

follows:

a. The organization of information on the screen for IMODS was clear.

b. The IMODS application gave error messages that told me how to fix

problems.

c. The titles for assessment and pedagogy techniques were self-descriptive.

d. The description of the assessment and pedagogy techniques was clear.

e. The documentation produced (assessment plan and instruction plan) for

assessments and pedagogy is satisfactory.

f. The selection available for the assessment and pedagogy techniques is

satisfactory.


g. It is easy to define custom assessment and pedagogy techniques.

h. The application doesn't need a supporting document to use.

i. The application was easy to navigate.

j. The font size and style are easy to read.

k. The application is intuitive and easy to use.

l. The application looks aesthetically nice.

m. The overall satisfaction with the application is high.

n. I would recommend this application to my colleagues.
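One straightforward way to summarize the Likert responses gathered with this questionnaire (and later visualized as bar and radar charts in Chapter 5) is to map the five levels to numbers and average them per question. The sketch below illustrates the idea with made-up responses; it is not the analysis code used in the study.

LIKERT = {"Strongly agree": 5, "Agree": 4, "Neutral": 3, "Disagree": 2, "Strongly disagree": 1}

def mean_rating(responses):
    """Average numeric rating for one usability question (illustrative)."""
    values = [LIKERT[r] for r in responses]
    return sum(values) / len(values)

# Hypothetical responses for question (i) "The application was easy to navigate."
navigation = ["Agree", "Strongly agree", "Neutral", "Agree"]
print(mean_rating(navigation))  # 4.0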

4.1.5 User Testing

Given a set of instructions for a software tool, it should be easy to follow. The

user interface has to be intuitive enough for a naive user to navigate using the

instruction set. In order to evaluate the tool on this front, a user test was conducted
with a class of students. Never having had any teaching or course design experience,
they simply had to follow the given instructions to create a complete course design.

Around an hour was given, and at the end the students completed a usability survey
with the same questions mentioned in the previous section.

4.1.6 Webinars

The final methodology adopted for data collection was webinars. A series of webinars

were conducted for professors participating from India. There were around 20

participants who joined the webinar. The plan was to have two sessions each of an

hour. The first session focused on introducing the tool, its background, help material

for the tool and providing the participants with the link to the tool. The participants

were expected to give the pre-test, go through the tool and design a course in a week’s

time. The second session was conducted exactly after a week and was meant as more


of a discussion – a way for them to share their experience with the tool and follow up

with any questions they might have. At the end of the second session, the post-test

and the usability questionnaires were shared.

4.2 Effectiveness of the tool in educating the instructors on OBE

One of the major goals the tool tries to achieve is to promote knowledge of

OBE among instructors. Outcome based education is a methodology in which the

product defines the process. The goals that need to be achieved are defined first and

then the process to reach those goals is mapped out. IMODS is based on the

principles of Outcome-based education, which has been shown to have a higher student
success rate and is growing in popularity.

The study indicates that newly appointed instructors take about five years to

perfect the process of effective course design through trial and error [8]. The students

are most affected during this time. Coming up with an efficient way to help the

instructors have a lower margin for error is of utmost importance. And thus,

educating the instructors on OBE is one of the goals of the IMODS tool.

The pre/post-tests, as well as interviews are used to measure this particular

aspect.

4.3 Evaluation of the repository of techniques

It is important to employ appropriate assessment and pedagogical techniques

that align with the level of learning that is expected of a target audience. For
example, a student cannot be expected to create or evaluate a course-specific
subject when the expected level of learning is simply to understand that particular
topic.


Using the learning domains, domain category and knowledge dimensions, it is

important to check if the techniques offered by the tool really do match up with the

level of expertise chosen by the instructor [19]. Thus, it is important to evaluate the

selection of techniques presented to the user.

The usability questionnaire contains questions regarding the repository of

techniques. Interviews have also been used as a mechanism to get the opinion of the

participants regarding the techniques.

4.4 Effectiveness of the tool in building learning objectives

The tool uses the principles of Bloom’s taxonomy for the construction of the

learning objectives. In order to understand this part, I will just summarize how

exactly the learning objective gets constructed.

The first step is to choose the learning domain – Cognitive, Affective or

Psychomotor. Based on the selection of the learning domain, the user will be

presented with a dropdown box, using which a selection for the domain category has

to be made. If, say, the learning domain is cognitive, the domain categories presented

will be Remember, Understand, Apply, Analyze, Evaluate and Create. This is basically

equivalent to choosing the level of learning to be expected. Further details and

terminology regarding the level can be chosen using action word category and action

word selection [18].


In the next step, content with respect to the learning objective is created

and/or chosen. Each content topic is associated with a knowledge dimension –

conceptual, factual, procedural or metacognitive. Then, the criteria, which can be

defined as the level of competence, has to be decided, followed by the conditions
under which the criteria should be met. This basically constructs the rough outline
for the learning objective. The whole process is outlined in Figure 15.

Figure 15: Learning Objective Construction

The user is given the option of making changes and refining the resultant
learning objective. This part of the study is aimed at figuring out how good the
tool-generated learning objective is without the use of this customization option,
as simply writing the whole learning objective by hand defeats the purpose
of the tool.

Interviews are the main source for this information to be collected.
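As a concrete illustration of the construction sequence in Figure 15, the sketch below assembles a rough learning-objective outline from the PC3 selections. The template wording and option lists are simplified assumptions, not the exact phrasing IMODS generates.

# Cognitive-domain categories from Bloom's taxonomy (affective and psychomotor omitted for brevity)
COGNITIVE_CATEGORIES = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]
KNOWLEDGE_DIMENSIONS = ["factual", "conceptual", "procedural", "metacognitive"]

def build_learning_objective(action_word, content_topic, criteria, condition):
    """Assemble a rough learning-objective outline from PC3 selections (simplified template)."""
    return (f"Students will be able to {action_word} {content_topic} "
            f"{criteria}, {condition}.")

# Example: learning domain = Cognitive, domain category = Apply, action word = "apply"
print(build_learning_objective(
    action_word="apply",
    content_topic="dynamic programming",
    criteria="with at least 80% accuracy",
    condition="given a new optimization problem"))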

4.5 Evaluation of documents generated by the tool

After all the steps of course design are completed, a complete syllabus is

generated by the tool based on the various selections made by the user. This

document is ready for direct distribution among the students. An option of hiding

certain aspects of the syllabus is provided if the instructor wishes to do so.

Evaluating the auto-generated documents is yet another way of establishing

that the tool is useful and reduces extra effort from the instructor's point of view.

Interviews are used to evaluate this aspect, along with an actual comparison of
the syllabus generated using the tool and the existing syllabus.

4.6 Effectiveness of OBE course design and achieving student outcomes

One way of measuring this particular aspect would be to analyze two

consecutive offerings of the same course – the first being designed traditionally,

without the tool and the next using the tool and comparing the student feedback for

the course as well as the achieved student results. But in an ideal scenario it would also

require the same set of students, along with the same level of knowledge while

entering the class. This is, however, not possible. So instead, a discussion with the

participants on their thoughts about the student performance based on their

experience is used for evaluating this particular aspect.

Interviews are the method used for collecting data about perceived student
performance and the influence of OBE on it.


4.7 Evaluation of the tool’s usability

Usability is one of the core qualities expected from a software tool. Along
with being functionally effective, a tool also needs to be easy to use; otherwise,
not many will be inclined to use a tool that requires extra effort just to navigate

through it. Not only should it be aesthetically pleasing, it should also be intuitive. The

user shouldn’t have to read through a ton of documentation to understand its

working. Thus, usability was chosen as one of the aspects to evaluate the tool on.

4.8 Study set up

The study has been conducted with diverse groups of participants using a

different process for each group.

1) Students – The tool was given to a class of students with a set of very

specific instructions and a sample syllabus. The goal of this was for the students to be

able to follow simple instructions and be able to design a course that was 100%

complete.

2) Instructors – A group of six instructors were recruited. The first step for

this group was to take a pre-test prior to any exposure to the tool. The pre-test
consisted of a set of questions focusing on Outcome-based education. After the
completion of this step, the participants were given ample time to explore
the tool, go through the documentation and help videos, and seek any further help
required from the research team in order to build either a previously taught course or
a brand new one. An interview was conducted after the successful completion of course

design, with a focus on gathering information on their experience, their prior


expectations, acquired knowledge about OBE, their perceptions on expected student

outcomes, any challenges faced during the process and feedback for

further improving the tool. Then, a post-test was conducted followed by a usability

survey.


CHAPTER 5

ANALYSIS AND RESULTS

5.1 Evaluation of feedback feature

The feedback feature was designed using the conditions described in Chapter

3. But the UI of the feature was not intuitive. There was a horizontal blue bar that

said "IMOD Info." The user had to hover over this text for a gray-colored bar to appear
along with the required information for course design completion. During the
interviews, it came to my attention that half of the participants did not even
notice the feature because it was not obvious. Based on the
feedback received, I removed the hover behavior, which had turned out to be a major
design flaw, and made the feature more obvious and central.

The headers for Table 6 are too long, and thus I have assigned the following

codes for better visibility:

1. IMOD ID – ID

2. Course Overview – CO

3. Instructor Information – II

4. Learning Objectives – LO

5. Assessments – A

6. Pedagogy – P

7. Expected Feedback – No code

8. Actual Feedback - AF

ID CO II LO C A P Expected Feedback AF

1 Please fill the course overview to see the minimum requirements to complete the course design

2 Add instructor information. At least three learning objectives needed – 0 defined. At least six content topics need to be added – 0 defined.

3 At least three learning objectives needed – 0 defined. At least six content topics need to be added – 0 defined.

4 At least three learning objectives needed – 1 defined. At least six content topics need to be added – 0 defined. At least one assessment technique needs to be added for each objective. At least one pedagogy technique needs to be added for each objective.

5 At least three learning objectives needed – 1 defined. At least six content topics need to be added – 1 defined. At least one assessment technique needs to be added for each objective. At least one pedagogy technique needs to be added for each objective.

6 At least three learning objectives needed – 2 defined. At least six content topics need to be added – 4 defined. At least one assessment technique needs to be added for each objective. At least one pedagogy technique needs to be added for each objective.

7 At least six content topics need to be added – 4 defined. At least one assessment technique needs to be added for each objective. At least one pedagogy technique needs to be added for each objective.

8 At least one assessment technique needs to be added for each objective. At least one pedagogy technique needs to be added for each objective.

9 At least one pedagogy technique needs to be added for each objective.

10 You have met the minimum requirements of an IMOD.

Table 6: Feedback Feature Evaluation Table

The expected results and the actual results match in all ten cases. The feedback feature works as expected, which confirms its correctness. One observation was that, for a learning objective to count toward completion, adding just the action word was sufficient – once the action word is added, the system does not consider whether the condition, content, and criteria parts of the learning objective were also added. To fix this, there should be some mechanism that takes the components of the learning objectives into consideration as well.
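To make this logic concrete, the following is a minimal Python sketch of the kind of completeness check that could produce the messages in Table 6, extended with the suggested verification that each objective also carries its condition, content, and criteria components. The data structure and field names are illustrative assumptions and do not correspond to the actual IMODS implementation.

# Minimal sketch of the course-design completeness check behind Table 6.
# Field names and thresholds mirror the feedback messages; treat this as
# illustrative only, not the real IMODS code.
def completeness_feedback(imod):
    """Return the list of feedback messages for an IMOD dictionary."""
    messages = []

    if not imod.get("course_overview"):
        # Until the overview exists, only the first message is shown.
        return ["Please fill the course overview to see the minimum "
                "requirements to complete the course design"]

    if not imod.get("instructor_info"):
        messages.append("Add instructor information.")

    objectives = imod.get("learning_objectives", [])
    if len(objectives) < 3:
        messages.append(f"At least three learning objectives needed - "
                        f"{len(objectives)} defined.")

    topics = imod.get("content_topics", [])
    if len(topics) < 6:
        messages.append(f"At least six content topics need to be added - "
                        f"{len(topics)} defined.")

    if objectives and any(not lo.get("assessments") for lo in objectives):
        messages.append("At least one assessment technique needs to be "
                        "added for each objective.")
    if objectives and any(not lo.get("pedagogy") for lo in objectives):
        messages.append("At least one pedagogy technique needs to be "
                        "added for each objective.")

    # Suggested fix: an objective should only count once all of its
    # components (action word, condition, content, criteria) are present.
    incomplete = [lo for lo in objectives
                  if not all(lo.get(part) for part in
                             ("action_word", "condition", "content", "criteria"))]
    if incomplete:
        messages.append(f"{len(incomplete)} learning objective(s) are missing "
                        "condition, content, or criteria components.")

    return messages or ["You have met the minimum requirements of an IMOD."]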

5.2 Interview Results

In this section, I present a consolidated view of the results obtained from the interviews, using the questions listed in Section 4.1.2.

a. Was the tool helpful in the course design process?

Participant Yes No

1

2

3

4

5

6

Table 7: Tool Helpfulness Table

All the participants agreed that the tool was helpful in the course design

process.

b. Did the tool familiarize you with OBE?

Participant Yes No Already knows

1

2

3

4

5

6

Table 8: Familiarization to OBE Table


Most of the participants did not have any prior knowledge of the concept of OBE, but the tool was successful in familiarizing them with it or increasing their existing knowledge. A check mark in both the "Yes" and "Already knows" columns means that the participant knew a little about OBE and the tool increased their knowledge as well.

c. Do you think the choice of assessment/pedagogy techniques presented for your

course are appropriate?

Participant   Yes   No   Sparked new ideas

1

2

3

4

5

6

Table 9: Repository of Techniques Table

Almost all the participants wanted a generic assessment technique – Assignment – to be available. As a result, 33% of the participants were not satisfied with the choice of techniques presented to them. One major flaw turned out to be that the repository did not contain any techniques for the CREATE level domain category.

d. Do you think the course designed using the tool would help in achieving expected

student outcomes?

Participant   Yes   No   Maybe   Cannot make an educated guess

1

2

3

4

5


6

Table 10: Student Outcomes Table

To answer this question, the experiment was supposed to have an extra step: the designed course was supposed to be used in a live session to record student performance. Due to time constraints, this was not possible. Instead, I asked the participants to provide an educated guess on the matter considering all the facts. Only 50% of the participants believed that a focused course design would indeed have an impact on student performance and outcomes.

e. Was it useful to have the Learning Objective feature connected to Bloom’s

Taxonomy (BT)?

Participant Yes No No idea about BT

1

2

3

4

5

6

Table 11: Learning Objectives/Bloom’s Taxonomy Table

f. Did you find this backward/reverse approach to designing a course effective?

Participant Yes No

1

2

3

4

5

6

Table 12: Effectiveness of Backward Approach Table


g. What do you think of the provided support and help documentation? Which of the

help features did you/would you use?

Participant Video Information Tab User Manual

1

2

3

4

5

6

Table 13: Support Documentation Table

Based on the study and the participants' feedback, it was clear that they preferred the videos (short clips of about two minutes) to reading through textual information.

h. Were you able to design the Learning Objectives without using any of the help

material?

Participant Yes No

1

2

3

4

5

6

Table 14: Ease of LO Construction Table

The learning objective is the central component of the IMODS system, so it is important for this feature to be as intuitive as possible. Only 50% of the participants found the tool intuitive enough to use without any help, but everyone was able to use it after watching the help video.


i. Were you satisfied by the documentation generated by the tool (syllabus

document)?

Participant   Satisfied   Not satisfied   Satisfied but felt it was restricted   Consistent

1

2

3

4

5

6

Table 15: Documentation Table

One piece of feedback common to all the participants was that the generated syllabus document helps maintain consistency across courses, which was one of the best things about the tool-generated document. However, 33% of the participants also felt that the document was restrictive and wanted a more flexible structure.

j. Did the LO feature force you to think about the level of learning?

Participant   Yes   No   Yes, but it was still overwhelming

1

2

3

4

5

6

Table 16: Domain Category Table


This is one of the main goals of OBE, and all the participants agreed that the tool met this major goal by forcing them to think about their audience and the level of learning to be set for them.

k. Would you be inclined to write your own objectives using the edit feature of the

learning objectives without using the actual building structure?

Participant No Yes

1

2

3

4

5

6

Table 17: Edit Feature Usage Table

Writing the learning objectives using the edit functionality defeats the whole purpose of the tool, and one of the participants said that they would be inclined to use the edit function rather than the IMODS structure for constructing complete learning objectives when they already have pre-defined objectives. There should be some way to discourage users from doing this.

m. How many courses have you taught? How many of those did you build from

scratch? What is the average class size in the courses you have taught?

This question provides insight into the background of the participants. The results

are tabulated in the table below.

Participant   Teaching Experience (in years)   Total Courses   New Courses/Major Redesign   Average Class Size

1 8 9 4 70

2 2 6 0 40-50

3 4.5 4 3 60

4 1.5 1 0 70


5 7 73 73 30-40

6 1.5 5 1 150

Table 18: Participant Information Table

5.3 Webinar Results

The audience for the first session consisted of more than 25 participants, but only 10 participants completed the pre-test before the start of the second session a week later. Many of the participants were first-timers for the second session, which defeated its purpose, so the second webinar had to be conducted the same way as the first because of the presence of new participants. A few participants had attended the previous session as well. All the questionnaires were handed out at the end of the session. One participant resubmitted the pre-test they had taken the week before along with the post-test, and only one response was collected for the post-test. The webinars did not turn out to be useful in terms of data collection for tool evaluation or feedback.

5.4 Evaluation of the tool

For evaluating the Instructional Module Development System, various aspects have been considered and different criteria have been chosen for each of them. The results for each aspect are discussed below.

5.4.1 Educating instructors on OBE

Each question was given a score of 10. For questions with multiple correct answers, the points were divided equally; for example, if a multiple-answer question had two correct options, each carried partial credit of 5 points. The questions that all the participants answered correctly in both the pre- and post-test were disregarded. There were three such questions, each testing the participant's ability to identify the learning domain given its description.

Participant Pre-test score Post-test score

1 47.5 69.17

2 47.5 72.5

3 50 98.33

4 75 75

5 69.17 75

6 70 85

Table 19: Scores for OBE Tests

A combination of pre- and post-tests was used for this particular evaluation, together with in-person interviews. There were six participants in the study, and their results (scores out of 100) are shown in the table above. There is an overall gain of 24.38% in the knowledge of the instructors after using the tool.

The gain formula is as follows:

Gain = (Sum of Post – Sum of Pre)*100/Sum of Post
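As a quick sanity check, the reported gain can be reproduced directly from the Table 19 scores; the short Python fragment below simply applies the formula above to the pre- and post-test sums.

# Reproducing the knowledge-gain figure from the Table 19 scores.
pre = [47.5, 47.5, 50, 75, 69.17, 70]       # pre-test scores
post = [69.17, 72.5, 98.33, 75, 75, 85]     # post-test scores

gain = (sum(post) - sum(pre)) * 100 / sum(post)
print(f"Gain = {gain:.2f}%")  # about 24.4%, consistent with the 24.38% reported above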

All the participants answered 3 of the 10 questions correctly in both the pre- and post-tests: questions c, d, and e from Section 4.1.1, which ask the participant to recognize the learning domain given its definition. Excluding those questions, the recomputed scores are shown in the table below. The overall increase in knowledge based on this table is 39.2%.

Participant Pre-test score Post-test score

1 17.5 39.17

2 17.5 42.5


3 20 68.33

4 45 45

5 39.17 45

6 40 55

Table 20: Updated Scores of OBE Tests

5.4.2 Repository of Techniques

From the data collected from 54 participants this year, 72.3% agreed that the selection of techniques available for both pedagogy and assessment was satisfactory (Figure 17), and 61.1% agreed that the titles used for the techniques are self-explanatory (Figure 16). However, only 57.4% of the participants agreed that the descriptions of the techniques were clear (Figure 18).

Figure 16: Bar Chart for descriptive power of Technique titles

Figure 17: Bar Chart for Selection of Techniques

Figure 18: Bar Chart for description of techniques


5.4.3 Building Learning Objectives

Three sample learning objectives are compared below, taken from the courses designed with and without the tool during the study.

i. Sample Learning Objective 1

With IMODS:

(LO1) Given an algorithm, apply Analytical Analysis and Empirical Analysis.

• Assessments – Midterm Test



• Instructional Techniques – Lecture

Without IMODS:

(LO2) Students can analyze existing algorithms and use these techniques in designing

algorithms.

ii. Sample Learning Objective 2

With IMODS:

(LO3) After completing the course, the student will be able to analyze Number

Representation with 95% accuracy.

• Assessments – Assignment

• Instructional Techniques – Lecture

Without IMODS:

(LO4) Convert between common number systems (including: Binary, Decimal, and

Hexadecimal), in support of program outcome Technical Competence.

iii. Sample Learning Objective 3

With IMODS:

(LO5) After completing the course, the student will be able to apply user interaction

models and prototypes with accuracy.

• Assessments – Assignments, Final, Semester project.

• Instructional Techniques – Critical Debate, thinking aloud pair problem

solving

Without IMODS:

(LO6) After successfully completing SER315, the student will, construct user

interaction models and prototype,

• in support of SER student outcome Technical Competence

• in support of SER student outcome Design

The above objectives are part of the results obtained from the study. Prior to the use of the tool, the learning objectives defined were either vague or specific to the content topics, and there was no mention of either the instructional or assessment techniques that would be employed to achieve or measure the outcome. The process of defining the learning objectives using the tool forces the instructor to put thought into the assessments and the instruction of content, ensuring that alignment among these components is maintained. The table below shows a clearer picture.

Learning Objectives   Performance   Content   Condition   Criteria   Alignment with Pedagogy and Assessment

LO1

LO2

LO3

LO4

LO5

LO6

Table 21: Learning Objectives Evaluation Table

From the above table, it is clear that the alignment with assessment and instructional techniques was missing in all the objectives designed without IMODS. All the components of PC3 (Performance, Content, Condition, and Criteria) are included in the objectives built using IMODS.
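To illustrate how the structured builder yields these PC3 components, the sketch below assembles LO3 from its parts. The dictionary fields and the render function are hypothetical illustrations of the idea, not the actual IMODS data model.

# Hypothetical sketch: assembling a learning objective from its PC3
# components (Performance, Content, Condition, Criteria), using LO3 above.
lo3 = {
    "condition": "After completing the course",
    "action_word": "analyze",             # performance (Bloom's ANALYZE level)
    "content": "Number Representation",
    "criteria": "with 95% accuracy",
    "assessments": ["Assignment"],
    "pedagogy": ["Lecture"],
}

def render(lo):
    """Compose the objective sentence from its components."""
    return (f"{lo['condition']}, the student will be able to "
            f"{lo['action_word']} {lo['content']} {lo['criteria']}.")

print(render(lo3))
# After completing the course, the student will be able to analyze
# Number Representation with 95% accuracy.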

5.4.4 Document Comparison

The syllabus generated by the tool follows a specific format. If the tool is employed at all levels, the consistency provided by these documents will be very high. Referring to a prerequisite course while designing an advanced course becomes significantly more insightful, giving the instructor a sense of the general level of knowledge of the incoming class. This helps the instructor establish the baseline knowledge the class will possess, which in turn helps in appropriately setting the level of learning expected from the class. That is essentially what IMODS strives to achieve.

The syllabus generated by the tool also maintains the alignment in the learning objectives, giving students a clearer picture of the course and of the instructor's expectations. In addition, a feature called the time ratio shows students the ratio of time they are expected to spend in class to time spent outside of class, helping them understand the time commitment the course requires and make better decisions.
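As a simple illustration of the time-ratio idea, the fragment below computes the ratio from hypothetical weekly hours; the document does not specify how IMODS derives these numbers, so both values are assumptions.

# Hypothetical illustration of the time ratio: scheduled in-class hours
# versus expected out-of-class work per week.
in_class_hours = 3.0       # e.g., a course meeting three hours per week
out_of_class_hours = 6.0   # e.g., homework, reading, and projects

ratio = out_of_class_hours / in_class_hours
print(f"Plan on about {ratio:.0f} hours outside class for every hour in class.")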

5.4.5 Usability Testing

The questions for the usability survey were listed in the previous section. The survey was taken both by the students as part of user testing and by the instructors who built their own courses using the tool.

Bar charts (shown in Figure 19) have been used to represent the results. The blue bars represent the students, the red bars represent the instructors, and the yellow bars represent the total (both groups combined). The horizontal axis shows the Likert scale and the vertical axis shows the percentage of participants. Data collected over the last few years was compared to look for improvement as user feedback was incorporated. Figure 19 shows this progression.


Radar charts are used to show the continuous feedback received throughout the years 2015-2018. For both chart types, percentages are used rather than raw counts, since they represent the data better given that the number of participants varies from year to year. The rate of disagreement has decreased over the years for almost all the usability questions.
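Because the number of respondents differs from year to year, the comparison normalizes counts to percentages before plotting. The fragment below shows this normalization on hypothetical Likert counts; the actual survey data is not reproduced here.

# Hypothetical Likert counts per year; converting to percentages makes
# years with different numbers of participants directly comparable.
counts_by_year = {
    2017: {"Strongly disagree": 2, "Disagree": 3, "Neutral": 5,
           "Agree": 20, "Strongly agree": 10},
    2018: {"Strongly disagree": 1, "Disagree": 2, "Neutral": 8,
           "Agree": 30, "Strongly agree": 13},
}

for year, counts in counts_by_year.items():
    total = sum(counts.values())
    percentages = {level: round(100 * n / total, 1) for level, n in counts.items()}
    print(year, percentages)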

Figure 19: Usability Results


Figure 20: Radar Charts for Usability


CHAPTER 6

CONCLUSION AND FUTURE WORK

The results of the study demonstrate that, for the most part, IMODS achieves the goals it has aimed for since its inception and has improved over the years. The study also helped identify a list of improvements to the system that would go a long way toward increasing its effectiveness. The evaluation has shed light on some issues that had escaped the development team's notice.

6.1 Findings

The system offers a limited number of sections for the syllabus. The repository of assessment and pedagogy techniques needs to grow, and techniques for a variety of subject areas must be added. Making the software more flexible is one of the future goals for improvement. The study was conducted with a small group of participants; repeating it with an improved questionnaire and a larger group could yield more conclusive results. One of the interview questions that helped uncover more about this was: "What are the strengths and weaknesses of the tool?" The results are tabulated in the table below.

Strengths • Forced to have a structure

• Step-by-step process

• Provides scaffolding for the best way to design a course.

• Action words are helpful.

• The tool keeps you honest; calls attention to stuff and makes you think about things deeply.

• The criteria part is a nice thing to have.

• Assessment and Pedagogy are nice to have listed out – makes you think about them.

• References in the techniques are good.


• One of the participants said, "Alignment is what you are stuck with and what the tool holds you to." This is one of the best parts of the tool.

• According to one of the interviewees, "Part of teaching is knowing what your students can and can't do." The tool forces the user to think about this.

Weaknesses • Doesn’t explain the aspects of OBE and BT – problem for a new user.

• Access to other tabs (Content, Assessment, Pedagogy) before finishing the current one (Learning Objectives) is confusing.

• Sample IMOD is too simple.

• The repository doesn’t have techniques for higher levels of domain categories.

Table 22: Strengths and Weaknesses of the tool

6.2 Future Work

6.2.1 Bugs and Action Items

During the usability testing, multiple bugs in the application were discovered.

Each of them can be considered an action item to be fixed in the future. All the major bugs are tabulated in the table below.

No. Bugs

1 Server-side scripting happens for the Learning objectives

2 Grading Policy – Changed to competency based but doesn’t stick even though it shows competency-based in the syllabus.

3 Techniques – A few combinations did not have ideal matches. Selections did not stick, and the page sometimes needed to be refreshed for changes to take effect. The slow performance of the progress bar appears to cause this.

4 Topics appear in random order while creating learning objectives. Although an edit option is available, it would be better if topics in learning objectives appeared in the same order as they are listed in the content.

5 An undescriptive error occurred when attempting to add a new pedagogy


6 If registration is incomplete, the proper error is not displayed to the user when trying to log in before authentication.

7 The end date can be unreasonable, e.g., an 8-year class.

8 Couldn't select more than one technique without refreshing the page.

9 When adding content, the generic response, even when selected, sometimes would not be saved to the learning objective.

10 When clicking save on some pop-up add-topic boxes, a message appeared saying that information might be lost; on saving, multiple instances of the topic were created.

11 The credit hours field allowed text to be entered.

12 If you try to delete a sub-topic, the parent topic is deleted instead of the one you selected. For example, if you have Topic 1 with Sub-topic 1 and try to delete Sub-topic 1, then Topic 1 is deleted and you are left with Sub-topic 1.

13 IMOD Info shouldn’t be something that you hover over.

Table 23: Bugs found by Usability testing

6.2.2 Suggested improvements

During the evaluation process, feedback obtained from the participants – both students and instructors – provided insight into the features and items that need to be added to the tool. The table below shows the consolidated list.

No. Items

1 Instructor – No role for lecturer

2 Techniques – A few combinations did not have ideal matches.

3 The tool could be a bit more step-by-step visually

4 The videos could give a little background (OBE and Bloom’s Taxonomy) or links on the concept behind each feature.

5 Pre-requisites to be included in Syllabus

6 More flexible Course Overview section to provide a more complete syllabus

7 Graph representation for dependencies between objectives – display and feedback. This means that the learning objectives should be represented as a network and used to provide feedback. For example, if a person is trying to incorporate an objective that requires prior knowledge connected to another objective, the system should warn the user that the other objective is not yet completed (see the sketch after Table 24).

8 The progress bar feature is very slow and sometimes leads users to assume there are application errors when they cannot select techniques until the progress bar has completely loaded. Its performance needs significant improvement.

9 A feature to preload basic information – such as Course Overview, Instructor Information, and Course Policies – would remove redundancy from the course design process.

Table 24: Suggested Improvements
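Item 7 in Table 24 suggests modeling the dependencies between learning objectives as a graph and warning the user when a prerequisite objective is incomplete. The sketch below shows one way such a check might look; the objective names and the prerequisite relation are hypothetical, since IMODS does not currently implement this feature.

# Hypothetical sketch of the suggested dependency check between objectives.
# prerequisites maps each objective to the objectives it builds on.
prerequisites = {
    "LO-advanced": ["LO-basic"],
    "LO-basic": [],
}
completed = {"LO-basic": False, "LO-advanced": False}

def warnings_for(objective):
    """Warn if an objective is added before its prerequisites are complete."""
    return [f"Objective '{dep}' is not yet completed."
            for dep in prerequisites.get(objective, [])
            if not completed.get(dep, False)]

print(warnings_for("LO-advanced"))
# ["Objective 'LO-basic' is not yet completed."]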

6.3 Personal Outcomes from the Thesis

In this thesis, I worked on designing the feedback feature and evaluating the tool. I had previously worked with software tools only in development mode; during this thesis, I learned more about the importance of the software engineering process. Development is not the only thing that matters; evaluation also plays a vital role. Evaluation of any software product is imperative if the tool intends to keep serving its users well in a rapidly changing world with new technologies and offerings appearing every day.

One major piece of feedback received was on the UI of the tool. If this evaluation had not been conducted, we would not have learned that the application looks outdated and needs to be updated to match modern web design.

I also learned more about the importance of designing an easy-to-use UI. The feedback feature I designed required the user to hover over the text "IMOD Info" to see the feedback, which made it far too easy to miss. Based on the feedback, I removed the hover interaction, which was completely unnecessary. Design flaws such as these are easy for a developer to overlook, and the evaluation process, as part of the software engineering cycle, helps identify and fix them, resulting in a better application.

