LEARNING OBJECT ORIENTED PROGRAMMING THROUGH SERIOUS GAMES
A thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy
By
Suhni Abbasi
Department of Computer Science,
Faculty of Engineering, Science and Technology,
ISRA University, Hyderabad
October, 2020
Names of Supervisor and Co-Supervisors
Prof. Dr. Hameedullah Kazi (Supervisor) Professor (Computer Science)
Dr. Ahmed Waliullah Kazi (Co-Supervisor) Associate Professor (Computer Science)
Dr. Kamran Khawaja (Co-Supervisor) Associate Professor (Computer Science)
ACKNOWLEDGMENTS
First, I am highly thankful to Almighty ALLAH (SWT), who bestowed upon me education, skills, capabilities, and, most importantly, the strength to undertake such significant PhD research work. I would not have been able to achieve this goal without His blessings and divine guidance.
I would like to express my deep gratitude and high respect to my supervisor, Prof. Dr. Hameedullah Kazi. He has always been an important source of inspiration, guidance, and encouragement, and his constructive criticism, wisdom, valuable comments, and inspirational feedback at all stages of the work kept me passionate about completing this research. Without his crucial input, the research could not have been done at all!
I am deeply indebted to my co-supervisor, Dr. Ahmed Waliullah Kazi,
and sincerely grateful to Dr. Kamran Khowaja for providing useful insights,
always supporting and encouraging at every step of this research work. Their
kind reviews, hard questions, relevant feedback, and valuable time contributed
to the completion of the research.
I am very grateful for Sir Arshad Shaikh's guidance at the initial stage of prototype design and development. He generously offered support whenever I was stuck at any point during the prototype development phase. I would not have been in a position to complete my PhD without his meaningful contribution. I am also grateful to Prof. Dr. Ahsanullah Baloch, who has always discussed research with me and encouraged my research work.
I am intensely grateful to my parents, who are an ocean of prayers, my siblings, and all my family members for their lasting wishes, prayers, unconditional love, concern, and, most importantly, their tremendous support throughout this research work and throughout life. Special thanks to my husband for his continuous support, prayers, and encouragement.
I offer special thanks to my friends Mehwish and Mahjabeen Leghari, Taskeen Buriro, and Saima Tunio, and to my students for supporting the development and evaluation phases of my work. Their discussions and interactions made my PhD an amazing exposure and learning experience.
I take this opportunity to thank all the faculty of ITC, SAU, for their unconditional moral support in keeping me encouraged and motivated to accomplish this PhD research work.
Sincere thanks to all the faculty of Isra University, especially Prof. Dr. Ahsanullah Baloch, Dr. Mutee-ur-Rehman, Sir Asghar Mughal, Mam Sehrish Abrejo, Mam Amber Baig, and Mam Shadia, who dedicated their time to providing valuable suggestions and efficient support, making the research work easy and achievable. Special thanks also to those well-wishers who were not in sight but remembered me in their prayers. Their support will not be forgotten at any stage of my life.
ABSTRACT

In today's world, where the promising use of serious games has made STEM (Science, Technology, Engineering and Mathematics) learning comfortable and productive, many serious games for learning OOP (Object Oriented Programming) can be found. However, after many years of considerable effort, empirical support for serious games is still lacking; in particular, the student difficulties and misconceptions that serious games are intended to overcome have rarely been incorporated into their design. The existing literature also reveals a paucity of learning theory and instructional design in the design and development of serious games for OOP aimed at good learning results. Moreover, whether the motivation provided by a serious game affects students' learning outcomes remains debatable.
The OO paradigm models objects drawn from real life, so the OO domain is considered a natural domain in which to work. Nonetheless, mapping those real-life objects onto basic OOP concepts is challenging for students to understand. Thus, the goal of this thesis is to design and develop a serious game prototype that resolves students' challenges and misconceptions in learning OOP and achieves positive learning outcomes. This study was carried out in five phases. In the first phase, the difficulties and misconceptions students face while learning OOP were identified. The existing studies showed paradigm shift issues, i.e., moving from procedural programming to OOP, understanding program flow, memory management, problem decomposition, comprehension of OOP concepts,
debugging, and motivational issues. The results of the students' completed tasks indicated obstacles in the comprehension of basic OOP concepts. Moreover, the students' feedback showed difficulty in understanding the difference between procedural and OOP languages, a lack of comprehension, and difficulty applying basic OOP concepts. These difficulties become the main reason for students' lack of interest in learning OOP. In the second phase of the research,
an extensive review was conducted to study the available serious games facilitating OOP learning, the various ways they were incorporated into the learning environment, the instructional contents they covered, the game attributes contributing to learning and motivation, the methods of instructional delivery, the learning theories applied, and the effects observed using those serious games. The third phase of the research is dedicated to designing a serious game model for learning OOP. The formulated model contains a detailed description of the instructional contents and the difficulties intended to be overcome, mapping the instructional steps onto the game attributes. Since the model is intended to improve learning, learning theories are also considered in its formulation. A serious game prototype is built in the fourth phase to show a logical view of the proposed model. The developed serious game incorporates all the components and the flow of activities involved in designing the serious game model. In the fifth and final phase, an experimental evaluation was performed to compare the performance of the experimental group of students, who interacted with the developed game, with that of the control group of students, who were taught using conventional teaching methods.
The results of the experimental evaluation indicate that the experimental group performed better than the control group. The experimental group's normalized learning gain is considerably higher than the control group's (p<0.05, paired t-test). The effect size for the experimental group is substantially greater (d=3.40) than for the control group. The perceived motivation analysis showed a significant effect of the motivation provided by the serious game prototype on the students' learning outcomes (r=0.530, p<0.001). The results of the evaluation study show that the OOsg's perceived motivation on the IMMS 5-point Likert scale yielded the highest mean score for the attention subcategory (3.87), followed by relevance (3.66). There was a significant difference between the perceived feedback obtained using the OOsg and that received using traditional instruction (p<0.05, Wilcoxon Signed Ranks test). This study provides evidence that the overall game experience was effective and that the developed serious game has the potential to improve overall understanding of OOP concepts. In the usability evaluation, the students reported satisfactory results for the usefulness and the quality of the information provided by the serious game prototype.
This study concludes that the serious game prototype created is an effective educational tool that can enhance learning outcomes and has the potential to motivate students to learn OOP.
ABBREVIATIONS & SYMBOLS
TABLE OF CONTENTS

Acknowledgments
Abstract
Abbreviations & Symbols
Table of Contents
List of Tables
List of Figures
List of Appendices

CHAPTER – I: INTRODUCTION
1. OVERVIEW
2. PROBLEM STATEMENT
3. RESEARCH QUESTIONS
4. OBJECTIVES
5. ORGANIZATION OF THE THESIS

CHAPTER – II: LITERATURE REVIEW
1. INSTRUCTIONAL CONTENTS OF OOP
   1.1 Abstraction
   1.2 Class
   1.3 Encapsulation
   1.4 Inheritance
   1.5 Object
   1.6 Message Passing
   1.7 Method
   1.8 Polymorphism
2. IDENTIFICATION OF DIFFICULTIES AND MISCONCEPTIONS IN LEARNING OOP FROM THE EXISTING STUDIES
   2.1 Discussion
3. POSSIBLE CAUSES OF DIFFICULTIES IN LEARNING OOP
   3.1 Improper Feedback for High Cognition Topics
   3.2 Lack of Comprehension
   3.3 The Shift from Understanding to Implementation
4. COMPETENCIES REQUIRED FOR MASTERING OOP
   4.1 Havenga Competency Model (2008)
   4.2 COMMOOP Structural Model (2016)
       4.2.1 OOP Knowledge and Skills
       4.2.2 Mastering Representation
       4.2.3 Cognitive Processes
       4.2.4 Metacognitive Processes
   4.3 Simona Hierarchy-based Competency Structure Model (2019)
5. APPROACHES FOR TEACHING OOP
   5.1 Objects-First Approach
   5.2 Imperative-First Approach
   5.3 Concept-First Approach
   5.4 GUI-First Approach
   5.5 Game-First Approach
6. SERIOUS GAMES (SGs)
7. AVAILABLE TOOLS FOR LEARNING OOP
   7.1 MUPPETS
   7.2 Alice
   7.3 Scratch
   7.4 Jeliot
   7.5 Greenfoot
   7.6 BlueJ
   7.7 Ztech de
   7.8 POO Serious Games
   7.9 Discussion
8. GAME ATTRIBUTES CONTRIBUTING TO LEARNING AND MOTIVATION FOR OOP
   8.1 Game Attributes by Garris et al. (2002)
   8.2 Game Attributes by Wilson et al. (2009)
   8.3 Game Attributes by Yusoff (2010)
   8.4 Game Attributes by Bedwell et al. (2012)
   8.5 Game Attributes by dos Santos (2019)
   8.6 Discussion
9. LEARNING THEORIES AND SGs
   9.1 Behaviourism Learning Theory
       9.1.1 Behaviourism instantiation within SGs
   9.2 Cognitivism Learning Theory
       9.2.1 Cognitivism instantiation within SGs
   9.3 Constructivism Learning Theory
       9.3.1 Constructivism instantiation within SGs
   9.4 Experimentalism Theory
       9.4.1 Experimentalism instantiation within SGs
   9.5 Discussion
10. METHODS OF INSTRUCTIONAL CONTENTS DELIVERY
   10.1 The Gagné-Briggs Model
   10.2 Dick-Carey Model
   10.3 Kemp Design Model
   10.4 Gerlach-Ely Model
   10.5 The ARCS Model of Motivational Design
11. SUMMARY

CHAPTER – III: METHODS AND TOOLS
1. INVESTIGATION ON STUDENTS' PERFORMANCE
   1.1 Experimental Design
       1.1.1 Demographical details
       1.1.2 Activity details
       1.1.3 Procedure
       1.1.4 Results
2. STUDENTS' FEEDBACK FOR BASIC UNDERSTANDING OF THE OOP CONCEPTS
3. THE PROPOSED COMPETENCY STRUCTURE MODEL
4. PURPOSE OF THE PROPOSED MODEL
5. DESIGN OF THE MODEL
   5.1 Identification of Attributes for the Serious Games Model
       5.1.1 Identification of the instructional contents and difficulties to be incorporated as input to the model
       5.1.2 Identification of the instructional design for the delivery of the contents
       5.1.3 Identification of the game attributes which contribute to learning and motivation
       5.1.4 Determining the expected learning outcomes to be achieved
   5.2 Structure of the Model
       5.2.1 Three phases of the model
       5.2.2 Linking the game attributes with instructional design
6. DEVELOPMENT OF THE OOsg
   6.1 Tools Used
       6.1.1 Java Standard Edition (Java SE)
       6.1.2 JavaFX Scene Builder v10.0
       6.1.3 JSON Library
   6.2 Description of the OOsg
       6.2.1 Navigation between OOsg screens
       6.2.2 Game mechanics and dynamics
       6.2.3 The UI of OOsg
       6.2.4 Log file generated as a result of playing OOsg

CHAPTER – IV: RESULTS
1. EXPERIMENTAL DESIGN
   1.1 Experiment #1: Pre-Test Session
       1.1.1 Personal information
       1.1.2 Institutional information
       1.1.3 Background in computer programming
       1.1.4 Background in computer gaming
       1.1.5 Pre-Test learning activity session
   1.2 Division of the Control and Experimental Groups
   1.3 Rubric for Students' Assessment
   1.4 Teaching and Practice Session
   1.5 Experiment #2: Post-Test Session
       1.5.1 Post-Test learning activity session
       1.5.2 Parameters of the evaluation study
           1.5.2.1 Perceived motivation
           1.5.2.2 Perceived feedback
           1.5.2.3 Game experience
           1.5.2.4 System usability
2. RESULTS AND ANALYSIS
   2.1 Results of Experimental Analysis
       2.1.1 Study-1: The difference in students' performance for learning OOP with and without the intervention of the prototype
       2.1.2 Study-2: The difference in students' normalized learning gain for learning OOP with and without the intervention of the prototype
       2.1.3 Study-3: The effect size of the serious game prototype on learning outcomes
       2.1.4 Study-4: The effect of perceived motivation on the learning outcomes of the students
   2.2 Results of Evaluation
       2.2.1 Results of perceived motivation
       2.2.2 Results of perceived feedback
       2.2.3 Results of game experience
       2.2.4 Results of system usability

CHAPTER – V: DISCUSSION

CHAPTER – VI: CONCLUSIONS AND FUTURE WORK

REFERENCES

APPENDICES
LIST OF TABLES

Table	Description

II-1	Comprehensive difficulties identified from existing work
II-2	Summary of the existing studies that used serious games for learning OOP
II-3	Summary of game attributes identified from the existing studies
II-4	Summarized learning theories, their major theorists, related learning methods, game genres, and example games
II-5	Gagné's events of instruction and their associated actions (Gagné and Medsker, 1996)
III-1	Demographical details of the participants
III-2	Responses collected from the participants while solving OOP activities
III-3	Overall mean score, frequency, and percentage for the category procedural programming vs OOP
III-4	Overall mean score, frequency, and percentage for the category understanding OOP concepts
III-5	Overall mean score, frequency, and percentage for the category motivation for learning OOP
III-6	Example game activities incorporated as Gagné instructional events
III-7	Game attributes and possible outcomes provided by Garris (2002)
III-8	Placement of the components in the serious game model's phases
III-9	Linking game attributes with the instructional design for the enhancement of the perceived motivation of the player
III-10	Game mechanics and dynamics of OOsg
IV-1	Details about the teaching and practice sessions with the control and experimental groups
IV-2	The difference in students' performance for learning OOP with and without the intervention of the prototype
IV-3	The difference in students' normalized learning gain for learning OOP with and without the intervention of the prototype
IV-4	The effect size of the OOsg on the learning outcomes
IV-5	Pearson correlation showing the effect of perceived motivation on the learning outcomes (control group)
IV-6	Pearson correlation showing the effect of perceived motivation on the learning outcomes (experimental group)
IV-7	Average and standard deviation for the IMMS subcategory attention
IV-8	Average and standard deviation for the IMMS subcategory relevance
IV-9	Average and standard deviation for the IMMS subcategory confidence
IV-10	Average and standard deviation for the IMMS subcategory satisfaction
IV-11	Overall mean score, frequency, standard deviation, and percentage for the learner's perception of motivation towards learning OOP using OOsg
IV-12	The difference in students' perception of feedback received while learning OOP with and without the intervention of the prototype
IV-13	Average and standard deviation for the perceived feedback
IV-14	Correlation analysis between levels played/reached and the post-test scores
IV-15	Participants' comments about the OOsg
IV-16	Average and standard deviation for the usability subcategories
LIST OF FIGURES

Figure	Description

II-1	Organizational quark of OOP given by Armstrong (2006)
II-2	Havenga's competency model (2008)
II-3	Simona's hierarchy-based competency structure model (2019)
II-4	Kolb's experimentalism learning model (2014)
II-5	Three phases of Gagné-Briggs instructional activities (Gagné and Briggs, 1974)
II-6	Dick-Carey model (Dick, et al., 2005)
II-7	Kemp design model (Kemp and Rodriguez, 1992)
II-8	Gerlach-Ely instructional design model (Gerlach and Ely, 1980)
II-9	Components of Keller's ARCS model (Keller, 1983 and Keller, 2010)
III-1	Alignment of the activities with Bloom's learning outcomes model
III-2	Summarized responses while solving the OOP activities
III-3	Proposed competency model for mastering OOP skills
III-4	Summarized difficulties from (A) the existing studies, (B) the investigation of students' activities, and (C) the feedback provided by the students
III-5	Proposed serious game model for learning OOP
III-6	Linking game attributes with the instructional model
III-7	Navigation between OOsg screens
III-8	Screen 2 – Personal and game control information
III-9	Screen 3 – Selection of game scenario
III-10	Screen 4 – Warm-up session
III-11	Screen 5 – Selection of game level
III-12	Screen 6 – Learning about the basic concepts of OOP
III-13	Screen 7 – Goals and rules of games
III-14	Screen 8 – Screen layout of the game levels
III-15	Screen 9a) Level-1 main screen
III-16	Screen 9b) Level-1 correct attempt made by the player
III-17	Screen 9c) Level-1 wrong attempt made by the player
III-18	Screen 9d) Level-1 achievement of game level's goals
III-19	Screen 9e) Level-1 statistics/evaluation as a result of completing the level
III-20	Screen 10a) Level-2 main screen
III-21	Screen 10b) Level-2 correct attempt made by the player
III-22	Screen 10c) Level-2 adding details of the attributes
III-23	Screen 10d) Level-2 wrong attempt made by the player
III-24	Screen 11a) Level-3 main screen
III-25	Screen 11b) Level-3 correct attempt made by the player
III-26	Screen 11c) Level-3 adding details of the methods
III-27	Screen 11d) Level-3 wrong attempt made by the player
III-28	Screen 12a) Level-4 main screen
III-29	Screen 12b) Level-4 correct attempt made by the player
III-30	Screen 12c) Level-4 number of objects created for a class
III-31	Screen 12d) Level-4 effect of object destruction on class
III-32	Screen 13a) Level-5 population of classes in combo-box for creating a relationship between classes
III-33	Screen 13b) Level-5 correct attempt made by the player
III-34	Screen 13c) Level-5 multi-level hierarchy identified by the player
III-35	Screen 13d) Level-5 effect of selecting base-class as sub-class
III-36	Screen 13e) Level-5 effect of selecting the same class as both child and parent class
III-37	Screen 13f) Level-5 effect of a missing class in a hierarchy attempted by the player
III-38	Sample log file generated as a result of interaction with Level-1
IV-1	Sequence of steps carried out for conducting the experimental study
IV-2	Gender of the participants
IV-3	Age group of the participants
IV-4	Institute/University of the participants
IV-5	Program the participants were enrolled in
IV-6	Prior experience in programming languages
IV-7	Experience in types of programming languages
IV-8	Prior experience of playing games
IV-9	Types of games played
IV-10	Frequency of playing games by the participants
IV-11	Perceived impact of educational games by the participants
IV-12	Perceived impact of educational games on participants' learning of computer programming
IV-13	Grouping of students for the experimental study
IV-14	Average normalized learning gain for control and experimental groups
IV-15	Correlation analysis for perceived motivation and learning outcomes (control group)
IV-16	Correlation analysis for perceived motivation and learning outcomes (experimental group)
IV-17	Evaluation framework used in the study
IV-18	Reliability measure for IMMS
IV-19	Comparison between control and experimental groups' perceived motivational level
IV-20	Responses to close-ended questions in the game experience survey
IV-21	Rating of overall game experience
IV-22	Reliability measure for PSSUQ
LIST OF APPENDICES

Appendix	Description

A.	Problem scenario for the identification of difficulties in learning OO
B.	Details about the learning objectives, designated activities, and related OO concepts
C.	Questionnaire for the identification of difficulties in learning Object-Orientation
D.	JSON Solution Model
E.	Quick Reference to OOsg
F.	Game Story
G.	Pre-Test Scenario and Associated Questions
H.	Rubric for Assessment
I.	Post-Test Scenario and Associated Questions
J.	Questionnaire for Perceived Motivation
K.	Questionnaire for Perceived Feedback
L.	Questionnaire for Game Experience
M.	Questionnaire for System Usability
N.	Pre- and post-test raw scores for control and experimental groups
O.	Result of the normality test and paired t-test for Study-1
P.	Result of the normality test and paired t-test for Study-2
Q.	Result of the normality test and Wilcoxon Signed Rank test for perceived feedback
CHAPTER I
INTRODUCTION
1. OVERVIEW
Computer programming involves algorithm design, code writing, debugging, testing, and implementation. It is closely tied to the abstract principles of problem-solving and requires logical reasoning to solve various real-world problems in different situations. For beginners, programming (mainly its basic knowledge) is challenging to learn (Alyami and Alagab, 2013 and Muratte, et al., 2009), because it cannot be mastered by rote learning. Basic-level students cannot solve complex problems; instead, they need to acquire basic concepts to build a higher cognitive understanding of advanced programming concepts. Transitioning between programming paradigms, such as from procedural programming to object-oriented programming (OOP), is also a challenging task for students (Bennedsen and Schulte, 2007).
The object-oriented paradigm (OO paradigm) is a more natural domain because OOP's problem domain is composed of objects related to real-life objects (Livovský and Porubän, 2014). However, mapping those real-life scenarios onto OOP's basic concepts (for example, class, object, attribute, method, message passing, inheritance, polymorphism, and encapsulation) is tough for students to comprehend (Yan, 2009).
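As an illustration of the mapping described above, the following minimal Java sketch (not taken from the thesis or the OOsg prototype; the class and method names are hypothetical) shows how a real-life car can be expressed through the basic OOP concepts just listed:

```java
// Illustrative sketch: mapping a real-life "car" onto basic OOP concepts.
class Car {                                  // class: a blueprint for real-life cars
    private String model;                    // attribute, hidden via encapsulation
    private int speed = 0;

    Car(String model) { this.model = model; }

    void accelerate(int delta) {             // method: a behavior of the object
        speed += delta;
    }

    int getSpeed() { return speed; }         // message passing: callers ask the object
    String describe() { return model + " at " + speed + " km/h"; }
}

class SportsCar extends Car {                // inheritance: a specialized kind of Car
    SportsCar(String model) { super(model); }

    @Override
    String describe() {                      // polymorphism: overridden behavior
        return "Sports car: " + super.describe();
    }
}

public class OopMappingDemo {
    public static void main(String[] args) {
        Car car = new SportsCar("Falcon");   // object: a concrete instance
        car.accelerate(60);
        System.out.println(car.describe());  // prints "Sports car: Falcon at 60 km/h"
    }
}
```

The conceptual leap for novices lies exactly here: recognizing that one blueprint (the class) yields many concrete objects, and that a call such as `car.describe()` is a message whose effect depends on the object's actual type.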
The OOP approach has many benefits, such as reducing overall software development time and providing code reusability and flexibility in code organization. However, learning OOP requires high-level software development skills, and it is difficult for students to understand its underlying concepts clearly at the beginning. The difficulties in learning OOP need to be addressed at the earliest level to reduce the frustration of grasping the advanced topics of OOP and to avoid poor learning outcomes.
If students are presented with complex and technical problems at the foundation level, the result may compromise their grades and cause them to lose interest in, or become demotivated towards, the subject. Helping students master computer programming therefore creates the need to identify diverse ways in which programming problems can be presented and solved. Choosing the correct programming method is important, because boredom and students' current knowledge of programming concepts may also affect their learning outcomes (Sanders and Mueller, 2000). To provide an excellent learning experience, one should start by conceptualizing OOP's basic features and then move towards the language's technical details.
However, it remains a challenge for researchers to provide an environment in which the whole learning process is followed in a fun and engaging way while maximum learning outcomes are achieved. A learning environment based on playing games can motivate students to learn in an entertaining way. Incorporating games into the curriculum means converting instructional content, such as a law of physics, a chemistry equation, or the law of supply and demand, into game mechanics operating in independent systems based on choices and results (Perrotta, et al., 2013).
Besides, games help transform the learner's abstract thinking into cognitive thinking and further improve advanced thinking skills (Carbonaro, et al., 2008). Therefore, not only can students learn happily due to the attractiveness, immersion, and interactive characteristics of computer games, but if teachers apply digital games to their curriculum properly, positive learning outcomes can be achieved (Smith and Gotel, 2007 and Shaffer and Clinton, 2005 and Gee, et al., 2008). Other educational advantages of using computer games for learning include enhancing problem solving, motivation, and retention (Livovský and Porubän, 2014), advancing learners' ability to adopt new skill levels, and supporting alternative learning styles (Yan, 2009).
The term serious games refers to a category of games with a clear and specific educational or learning purpose that is not designed and developed primarily for entertainment (Abt, 1987). Serious games may also serve other purposes, such as government or company training, or achieve goals in education, health, public policy, and strategic communication (Perrotta, et al., 2013). Some serious games are used to learn social, physical, and psychological skills, such as the basic rules of dealing with emotions, learning healthy behavior, and modeling adult behavior (Gee, et al., 2008).
Serious games are more learner-centered, which makes the learning process more comfortable, fun, and effective, and the whole learning process can be carried out naturally by playing. In games, entertainment and fun are the main attributes that attract people to participate in the learning experience, and positive learning outcomes can be obtained by adequately mapping the course content onto the game elements (Shaffer and Clinton, 2005 and Bennedsen and Schulte, 2007).
Researchers have made efforts to identify the effect of incorporating serious games as an instructional medium or as support for learning outcomes. As educational games have mainly been used to achieve learning outcomes (Connolly, et al., 2012), there has been little discussion of the implications of using games for OOP learning (Xu, 2009, Al-Linjawi and Al-Nuaim, 2010, Yulia and Adipranata, 2010, Rais and Syed-Mohamad, 2011, Depradine, 2011, Livovský and Porubän, 2014, Poolsawas, et al., 2016, Seng and Yatim, 2014, Seng, et al., 2015, Seng, et al., 2016, Seng and Yatim, 2018 and Seng, et al., 2018) and of the various ways of incorporating games into the learning environment (Abbasi, 2017).
Earlier studies provide valuable knowledge regarding learning theory issues, which lay the foundations for the development of serious games. Smith (Smith, 1999) describes five categories of learning theories: behaviorism, cognitivism, constructivism, experimentalism, and socio-contextual theory. These theories are invoked to strengthen the design of educational computer games.
To achieve positive learning outcomes from serious games, this research study presents the development of a serious game prototype named OOsg for learning OOP. There are many reasons why OOP was chosen as the primary focus area of this research. Most students encounter severe difficulties in understanding even simple OOP concepts (Sheetz, et al., 1997, Milne and Rowe, 2002, Or-bach and Lavy, 2004, Ragonis and Ben-Ari, 2005, Thomasson, et al., 2006, Butler and Morgan, 2007, Kölling, 2010, Liberman, et al., 2011, Bashiru and Joseph, 2015).
This study attempts first to explore the challenges and misconceptions
of students, and then to offer a solution that resolves these difficulties in a fun
and engaging way. Moreover, if the potential difficulties and misconceptions
are known in advance and the serious game prototype's design is based on
these obstacles, the learning experience and outcomes can be significantly
enhanced. These findings were drawn from the literature review and from an
investigation conducted with students currently learning OOP. Consequently,
identifying the obstacles led to incorporating learning theories and an
instructional model into the design and development of a serious game.
2. PROBLEM STATEMENT
Learning programming is considered a challenging task, and there has
been a drop in the number of students majoring in Information Technology
(IT) and Computer Science, as programming courses tend to have a high
failure rate. The OO paradigm is considered a more natural domain to work with
because OOP's problem domain comprises objects related to real life.
Nonetheless, it is very challenging for students to understand and map OOP's
basic concepts (e.g., classes, objects, attributes, methods, message passing,
inheritance, polymorphism, and encapsulation) onto real-life scenarios and
vice versa. In today's world, the promising use of serious games has made
STEM learning comfortable and productive. This research study, about serious
games for learning OOP, attempted to set out answers to the following
problems:
Several studies have assessed students' difficulties and
misunderstandings in OOP learning, and it is imperative to know what
barriers make learning programming difficult and how students could learn
correctly and efficiently; yet students' difficulties and misconceptions are
rarely taken into account when developing serious games intended to achieve
positive learning outcomes. The current literature also reveals a paucity of
learning theories and instructional designs used in creating serious game
prototypes for learning OOP. Due to the high level of complexity involved in
OOP subjects, student boredom and dropout rates may increase. Serious
games are therefore used to attract students towards subjects like
programming and similar domains. Despite some encouraging findings, the
current literature does not establish the presumed relation between the
motivational appeal of serious games and actual learning outcomes. It remains
debatable which specific aspects of a game lead to learning, and whether
motivation affects learning outcomes at all.
3. RESEARCH QUESTIONS
This research study addresses a research problem that gives rise to
the need to develop a serious game prototype based on critical game
elements and on instructional and learning theories; therefore, the following
research questions are formulated:
Q1. What are the competencies required to master OOP skills?
Q2. What are the difficulties faced by students in learning OOP?
Q3. What are the different programming approaches used for teaching
OOP?
Q4. What are the different serious games available to facilitate OOP
learning?
Q4.1 Which learning theories are integrated into serious games,
and how?
Q4.1.1 Which game elements are essential to incorporate
to facilitate learning?
Q4.2 How is content delivered in the available serious games?
Q4.2.1 What are the different ways of instruction delivery?
Q5. How can motivation be sustained while playing the serious game
prototype?
Q5.1 Which game elements are essential to incorporate to
motivate the player?
Q6. Which learning outcomes could be achieved using a serious
game?
Q7. Is there any significant difference in student performance with and
without using the prototype?
Q7.1 What is the difference in students' performance in learning
OOP with and without the prototype's intervention?
Q7.2 What is the difference in students' normalized learning gain
in learning OOP with and without the prototype's
intervention?
Q7.3 What is the difference in effectiveness between the
interventions of instructional techniques?
Q7.4 Is there any effect of perceived motivation on students'
learning outcomes?
4. OBJECTIVES
The objectives of this research study are:
1. To investigate the difficulties faced by students while learning OOP
2. To formulate a model for fostering learning outcomes and improving the
learning performance of OOP students
3. To design and develop a serious game prototype that supports the
model for learning OOP
4. To conduct an experimental study to analyze students' performance
in learning OOP with and without using the prototype
5. ORGANIZATION OF THE THESIS
The remainder of the thesis is organized as follows. Chapter II presents the
literature review, which addresses OOP and its instructional contents, the
learning difficulties in OOP reported in previous studies, the approaches used
for teaching OOP, and the competencies required to master OOP. The literature
review also discusses the various serious games available to support OOP
learning. The existing studies on learning OOP via serious games are examined
to understand how learning theories and instructional design are incorporated
into serious game development. The different elements leading to motivation
and learning are also discussed. Chapter III covers methods and tools: first,
an investigation is conducted with students currently learning OOP, and
feedback is collected from them. The investigation results informed the
instructional contents integrated into the game story of OOsg. In the later
part of Chapter III, the design of the serious game model and the development of
the serious game prototype are discussed as results obtained from Chapter II
and the first part of Chapter III. The model emphasizes how students'
difficulties and the instructional contents of OOP are presented to achieve the
intended competencies and learning outcomes. The model also focuses on
the design of the practice session for improving performance.
The development of the serious game prototype named OOsg is then described:
the activities in OOsg from the user's perspective and the steps
required when interacting with it. Chapter IV discusses OOsg's experimental
study and evaluation results. The discussion is given in Chapter V, and Chapter VI
addresses the conclusion and possible future work, followed by references and
appendices.
CHAPTER II
LITERATURE REVIEW
The primary aim of this chapter is to discuss object-oriented programming
(OOP) learning in general, the instructional contents of OOP, students'
difficulties and misconceptions while learning OOP, the possible causes of
those difficulties reported in existing studies, the core competencies
required for mastering OOP, and the various approaches adopted for
teaching OOP instructional contents. This chapter also includes a summary
of the related literature on the concept of serious games and their associated
terminology. This includes a review of existing serious games that facilitate
OOP learning, various attributes of serious games that contribute to students'
learning and motivation, a review of learning theories, and instructional
delivery methods.
The last part of the chapter is dedicated to an extensive review of
studies on the identification of the various programming approaches
applied to teach OOP concepts or to overcome OOP difficulties using available
serious games, how the intervention of serious games is carried out, which
learning theories or instructional designs are integrated into available serious
games, and the learning outcomes achieved using these games.
1. INSTRUCTIONAL CONTENTS OF OOP
In software development, the OOP approach is commonly used
(Sommerville and Prechelt, 2004), and learning to develop
high-quality OOP software is a core theme of computer science, software
engineering, and information technology curricula. This is because many benefits
can be achieved by applying the OOP approach, including a reduction in
overall software development time, reusability of code, and a flexible way to
organize code. Despite its benefits, expert-level software
development skills are required to master OOP, and students find it
complex to comprehend its underlying concepts clearly.
There is debate about what instructional content should be
taught during an OOP course. The basic quarks include classes with their
associated attributes and methods, message passing, abstraction, inheritance,
encapsulation, polymorphism, and relationships as key concepts in OOP
(Morris, et al., 1999). On the other hand, Rosson and Alpert (1990)
suggested that abstraction, classes, objects, message passing, encapsulation,
information hiding, inheritance, the object model, and polymorphism
should be taught in OOP courses.
Armstrong (2006) provides a detailed list of the eight fundamental OOP
concepts, called OOP quarks, and presents a taxonomy organizing these
quarks into two primary constructs named structure and behavior. The
organizational quarks are shown in Figure II-1, and their descriptions are
discussed below:
Figure II-1 Organizational quarks of OOP given by Armstrong (2006)
1.1 Abstraction
Abstraction is a fundamental principle of OOP that is used to handle
or control complexity by hiding unnecessary information. It enables users
to apply complex logic on top of a given abstraction without comprehending,
or even worrying about, any of the system's hidden complexities. Creating a
class exploits the inherent distinctions in the problem to simplify the relevant
aspects of reality. Data abstraction is the standard term, broadly defined as a
mechanism that allows us to handle complex details through simplified models,
thereby suppressing insignificant details to enhance understanding
(Henderson-Sellers, 1996; Ledgard, 1995; Yourdon, et al., 1995). Abstraction
can also be used to find the more common behaviors among objects by
eliminating specific behaviors (Morris, et al., 1999).
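As an illustration of this principle, the following minimal Java sketch (class names such as Shape, Circle, and Square are illustrative and not drawn from OOsg) shows callers working with an abstract type while each concrete shape hides how its area is computed:

```java
// A minimal sketch of abstraction: callers depend only on the abstract
// Shape type and never see how each concrete shape computes its area.
abstract class Shape {
    abstract double area();   // essential behavior, no implementation detail
}

class Circle extends Shape {
    private final double radius;
    Circle(double radius) { this.radius = radius; }
    @Override double area() { return Math.PI * radius * radius; }
}

class Square extends Shape {
    private final double side;
    Square(double side) { this.side = side; }
    @Override double area() { return side * side; }
}

public class AbstractionDemo {
    public static void main(String[] args) {
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape s : shapes) {
            // The caller never touches radius or side directly.
            System.out.printf("area = %.2f%n", s.area());
        }
    }
}
```

The insignificant details (radius, side) are suppressed behind the simplified model `Shape`, which is exactly the mechanism the cited definitions describe.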
1.2 Class
A class can be defined as a blueprint or prototype having the attributes and
methods common to all its objects. A class can also be described as a set of
objects described by the same declaration (Robson, 1981; Rosson and
Alpert, 1990) and is considered the essential OOP modeling element. A class
can further be defined as a set of objects that share a common structure and
behavior (Booch, et al., 2008). A class performs several roles:
it describes how objects respond to messages; in the development
process, it provides programmers with an interface to interact with the object
definition; and in the running system, it is the source of new objects
(Robson, 1981).
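A short Java sketch of the blueprint idea (the Student class and its members are illustrative examples, not taken from the thesis material) shows one declaration serving as the source of several distinct objects:

```java
// A class as a blueprint: Student declares the attributes and
// methods common to every Student object.
class Student {
    private final String name;   // attribute
    private final int score;     // attribute

    Student(String name, int score) {
        this.name = name;
        this.score = score;
    }

    boolean hasPassed() {        // method shared by all instances
        return score >= 50;
    }

    String getName() { return name; }
}

public class ClassDemo {
    public static void main(String[] args) {
        // Two distinct objects created from the same declaration.
        Student a = new Student("Amir", 72);
        Student b = new Student("Sara", 44);
        System.out.println(a.getName() + " passed: " + a.hasPassed());
        System.out.println(b.getName() + " passed: " + b.hasPassed());
    }
}
```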
1.3 Encapsulation
Encapsulation can be defined as a technique for designing classes and
objects that restricts access to data and behaviors by defining a limited set of
messages that objects of a class can receive. Some researchers hold that the
concept of encapsulation existed before OOP was introduced (Page-Jones
and Weiss, 1989), while others assert that it was a new concept initially
introduced in the Simula programming language (Booch et al., 2008).
Encapsulation has also been described as a new term for the existing concept
of information hiding (Henderson-Sellers, 1996; Yourdon et al., 1995).
Three main conceptions of encapsulation exist in the literature. First, it
is a combination of data and the methods that act on that data (Wirfs-Brock and
Johnson, 1990). Second, objects can only be accessed via their defined
external behavior, as the object's implementation details are hidden (Wirfs-
Brock and Johnson, 1990; Booch et al., 2008). The third conception combines
the first two: the information about an object and the means of processing that
information are kept strictly together and separated from all other
information (Morris et al., 1999).
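These three conceptions can be sketched in Java as follows (the BankAccount class is a hypothetical example chosen for illustration): the data and the methods acting on it are kept together, and the object is reachable only through a limited set of messages.

```java
// Encapsulation sketch: the balance is hidden; the object can only be
// manipulated through the limited set of messages it accepts.
class BankAccount {
    private double balance;          // hidden state

    void deposit(double amount) {
        if (amount > 0) balance += amount;
    }

    boolean withdraw(double amount) {
        if (amount > 0 && amount <= balance) {
            balance -= amount;
            return true;
        }
        return false;                // invalid requests are rejected
    }

    double getBalance() { return balance; }
}

public class EncapsulationDemo {
    public static void main(String[] args) {
        BankAccount acc = new BankAccount();
        acc.deposit(100.0);
        acc.withdraw(30.0);
        // acc.balance = 1_000_000;  // would not compile: balance is private
        System.out.println(acc.getBalance());
    }
}
```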
14
1.4 Inheritance
Inheritance is defined as the data and behavior of one class being
shared or used by another class (Wirfs-Brock and Johnson, 1990).
Inheritance provides a mechanism by which a class can inherit properties
(attributes and methods) from another class. The class that requests the
sharing of properties is known as the child or subclass, and the class whose
properties are shared is known as the parent or superclass.
Subclasses at lower levels of the hierarchy contain
more specific or concrete instances of the abstract concepts at the top of the
hierarchy (Morris, et al., 1999).
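A minimal Java sketch of this parent/child relationship (Vehicle and Car are illustrative names, not part of the thesis material) shows the subclass reusing and specializing the superclass's properties:

```java
// Inheritance sketch: Car (subclass) reuses the attribute and method
// declared in Vehicle (superclass) and adds its own detail.
class Vehicle {
    protected final int wheels;
    Vehicle(int wheels) { this.wheels = wheels; }
    String describe() { return "vehicle with " + wheels + " wheels"; }
}

class Car extends Vehicle {
    private final String model;
    Car(String model) {
        super(4);                // a car is a more concrete vehicle
        this.model = model;
    }
    @Override
    String describe() { return model + ": " + super.describe(); }
}

public class InheritanceDemo {
    public static void main(String[] args) {
        Car c = new Car("Sedan");
        System.out.println(c.describe());
    }
}
```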
1.5 Object
An object is an instance of a class (Booch, et al., 2008), where
objects can be a combination of variables, functions, and data structures. It is
a separate, identifiable item, whether real or abstract, containing data about
itself and a description of the operations on that data. An object has a
well-defined role in the problem domain (Morris, et al., 1999), and it is both a
data carrier and an actor that performs actions (Robson, 1981). Booch et al.
(2008) define an object simply as something with state, behavior, and identity;
most commonly, an object maintains its state in variables and
implements its behavior with methods.
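The state/behavior/identity triad can be sketched in Java as follows (the Counter class is a hypothetical example): two objects share one class but keep separate state and remain distinct, identifiable items.

```java
// Object sketch: two Counter objects share one class but have
// separate state (count) and separate identity.
class Counter {
    private int count;             // state held in a variable
    void increment() { count++; }  // behavior implemented as a method
    int value() { return count; }
}

public class ObjectDemo {
    public static void main(String[] args) {
        Counter a = new Counter();
        Counter b = new Counter();
        a.increment();
        a.increment();
        b.increment();
        System.out.println(a.value() + " " + b.value());  // independent state
        System.out.println(a == b);                       // distinct identity
    }
}
```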
1.6 Message Passing
Message passing refers to the way two objects communicate. It is the
process by which objects send and receive data, or by which one object
requests another to invoke a method. Two groups of researchers view
message passing differently. The first group sees message passing as
objects making requests of each other to perform actions and passing
information to one another (Morris, et al., 1999). The other group views a
message as a signal from one object to another that requests the receiving
object to perform one of its methods (Byard, 1990).
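In Java, both views reduce to a method call on the receiving object, as the following sketch shows (Sender and Printer are illustrative names introduced only for this example):

```java
// Message-passing sketch: a Sender object requests a Printer object
// to perform one of its methods by "sending" it a message.
class Printer {
    String render(String text) {          // the method the message invokes
        return ">> " + text;
    }
}

class Sender {
    private final Printer receiver;
    Sender(Printer receiver) { this.receiver = receiver; }

    String send(String text) {
        // In Java, message passing takes the concrete form of a
        // method call on the receiving object.
        return receiver.render(text);
    }
}

public class MessageDemo {
    public static void main(String[] args) {
        Sender s = new Sender(new Printer());
        System.out.println(s.send("hello"));
    }
}
```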
1.7 Method
The method was previously known as a procedure, operation, or function.
Methods are core elements of an object-oriented program, and the concept of
a method as an inseparable part of an object emerged from the Smalltalk
programming language (Booch, et al., 2008). A method is a way to access,
set, or manipulate an object's information (Robson, 1981).
In most cases, the discussion of methods is intertwined with the concept
of messages: a message sent to an object typically invokes one of the
receiving object's methods (Rosson and Alpert, 1990).
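Robson's three method roles (access, set, manipulate) can be sketched with a small hypothetical class (the Temperature class and its members are illustrative):

```java
// Method sketch: methods are the way to access, set, or manipulate
// the information an object holds.
class Temperature {
    private double celsius;

    void set(double celsius) { this.celsius = celsius; }  // set
    double get() { return celsius; }                      // access
    double toFahrenheit() {                               // manipulate/derive
        return celsius * 9.0 / 5.0 + 32.0;
    }
}

public class MethodDemo {
    public static void main(String[] args) {
        Temperature t = new Temperature();
        t.set(100.0);
        System.out.println(t.toFahrenheit());
    }
}
```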
1.8 Polymorphism
The concept of polymorphism was in use in software development
before the introduction of OOP. Polymorphism is the ability to hide different
implementations behind a standard interface (Yourdon, et al., 1995), while
some conceptualize polymorphism as the ability of different objects to
respond to the same message with different responses (Stefik and
Bobrow, 1985). Polymorphism is also described as a class's ability to have
different method signatures, which allows a single operation to be performed
in different ways (Morris, et al., 1999; Ledgard, 1995).
There are two types of polymorphism: compile-time (static) and runtime
(dynamic). Compile-time polymorphism is achieved by method or operator
overloading, whereas runtime polymorphism is achieved by method overriding.
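Both types can be sketched in one small Java example (the Calculator, Animal, and Dog classes are hypothetical names chosen for illustration): overloading is resolved at compile time from the argument types, while overriding is resolved at runtime from the actual object.

```java
// Polymorphism sketch: compile-time (overloading) and runtime
// (overriding) polymorphism side by side.
class Calculator {
    // Compile-time polymorphism: same name, different signatures.
    int add(int a, int b) { return a + b; }
    double add(double a, double b) { return a + b; }
}

class Animal {
    String speak() { return "..."; }
}

class Dog extends Animal {
    // Runtime polymorphism: the overriding method is chosen at runtime.
    @Override
    String speak() { return "Woof"; }
}

public class PolymorphismDemo {
    public static void main(String[] args) {
        Calculator c = new Calculator();
        System.out.println(c.add(1, 2));      // int overload selected
        System.out.println(c.add(1.5, 2.5));  // double overload selected

        Animal a = new Dog();                 // same message ...
        System.out.println(a.speak());        // ... Dog's response
    }
}
```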
2. IDENTIFICATION OF DIFFICULTIES AND MISCONCEPTIONS IN LEARNING OOP FROM THE
EXISTING STUDIES
The primary purpose of this literature review of existing studies is to
identify the difficulties students face while learning OOP. Topics covered
include the tools used for identifying the difficulties, the OOP
concepts or contents covered in the studies, the analysis techniques used, the
specific OOP problems addressed, and the issues identified.
Programming involves algorithm design, code writing, debugging,
testing, and implementation, and students need to master these stages to
become good programmers. Computer programming is fundamentally about
problem-solving, but students are not expected to solve complex problems at
the outset; first, they need to learn basic concepts to build higher cognitive
understanding and, eventually, a better grasp of advanced programming
concepts. Mastering programming also involves recognizing the different ways
programming problems can be presented and solved. The transition from one
programming paradigm to another (from procedural programming to OOP) is
also difficult because it involves many overlapping concepts.
There are diverse dynamics related to the difficulties students
experience while learning OOP. Thomasson et al. (2006) designed a
two-phase experiment on the difficulties students face in the
software development process. The topics covered in the study related
to classes, objects, and reference classes. In the first phase, students were
asked to design a UML diagram for model classes, identifying the required
classes and their relevant attributes and methods. In the later phase, the same
students were presented with a different problem in which they were asked to
design classes for a car rental company with multiple warehouses, cars, and
customers. Finally, to support the final system, students needed to create a
class diagram.
The most common difficulty observed in the experiment was the
non-referenced class fault, meaning students were unable to integrate classes
properly into the design. Other observed challenges related to references to
non-existent classes, attribute identification problems, and cohesion issues
such as single-attribute misrepresentation faults, multiple-attribute
misrepresentation faults, and multiple-object misrepresentation faults.
Or-Bach and Lavy (2004) explored the cognitive difficulties of third-year
students studying OOP and object-oriented design (OOD). The testing
instruments included questions related to basic OOP concepts, such as
classes, inheritance, and polymorphism. Different items were presented to the
students, and the solutions were not bound to any specific formal notation, so
that the cognitive difficulties could be identified without depending on how well
students remembered and could apply a notation they had learned some time
before. The students' solutions indicated that only attributes were included in
the abstract class; students failed to include methods of any type. Some
students also included extra classes (classes not related to the solution, or
classes that could be integrated into existing classes' methods or attributes).
They missed adding necessary class details, placed insignificant attributes
within classes, and reduced cohesion.
The study by Sheetz et al. (1997) focused on identifying OOP
difficulties among undergraduate and graduate information systems students.
The specific problems presented to the students were to classify the
categories of object-oriented concepts, rank their importance, and determine
the relationships between categories. The results show that learning basic
object concepts is the most difficult for students, followed by design problems
and programming techniques. It is also difficult for students to distinguish the
features of a programming language from those of an OOP language, and to
use or reuse class libraries.
Ragonis and Ben-Ari (2005) collected observations and field notes,
audio and video recordings, and artefacts such as homework assignments,
classwork, examinations, and final projects over two semesters to identify
precisely which concepts students understood and which were problematic.
The most common problems that arose during the analysis were difficulties
with the general picture of program execution, state changes during execution,
the sequence of method invocations involved in solving the problem, method
invocation itself, where the values of parameters come from and where the
return value of a method goes, the need for input instructions, and the
connections between constructor declaration, invocation, and execution.
Topics related to memory concepts, such as copy constructors and
virtual functions, pointers, the execution of programs in terms of memory, how
objects are stored in memory, and how they relate to each other in memory,
are a considerable struggle for students, and many misconceptions about
memory operation persist among them. Precisely constructing a mental model
of what happens inside the machine and its memory while the program is
running, when one object attempts to communicate with another, and how this
solves the problem, is difficult for students to understand (Bashiru and
Joseph, 2015).
Students face more difficulty understanding and implementing high-level
concepts such as algorithm design, methods, program design, and
OOP concepts than topics with low-level conceptual difficulty such as
understanding the syntax of a language (Bashiru and Joseph, 2015). Even
though current teaching methods segment complex topics into easily
understandable pieces, it is still hard for novices to leap from understanding
the concepts to applying them (Butler and Morgan, 2007; Lahtinen, et al.,
2005; Winslow, 1996). This is because a novice is limited to surface
knowledge of the subject, whereas experts have an in-depth understanding
that is hierarchical and many-layered.
Liberman et al. (2011) addressed students' difficulties
and misconceptions in topics related to interfaces, inheritance, and
polymorphism. Not only does the difficulty lie in understanding these topics;
implementing them is also a challenge for students.
A study by Jenkins (2002) shows that demographic factors and
programming experience are not an essential basis for programming success.
To understand the difficulty of learning programming, we can therefore turn to
the cognitive perspective. Jenkins reports two cognitive factors, motivation
and learning style, that can make learning to program difficult. He claims that
students with the right learning style are motivated to learn and master
programming skills quickly. It is important to understand students' cognitive
styles, learning styles, motivations, and other possible factors in order to
alleviate learning problems and difficulties. Gomes and Mendes (2007)
describe similar ideas; they propose a multimedia environment that includes
several types of problem-solving activities to attract students and develop the
motivation to learn to program.
2.1 Discussion
In this section, the results of the literature review are summarized with
a view to achieving Objective 1 and addressing Research Questions 1 and 2.
The summaries of the reviewed studies on students' difficulties are
comprehensively presented in Table II-1. The reviews indicate issues in the
transition from procedural programming to OOP, and in program flow and
execution; memory issues, such as what happens during the execution of a
program, how operations are performed in memory, and the concepts of
pointers and virtual functions; and algorithmic difficulties, such as how to
decompose a given problem and how to analyse, design, and build a solution
for it. Since many libraries are available to perform specific functions, the
studies also report that students have difficulty importing those libraries.
Regarding OOP concepts, the reviews include difficulties in the
comprehension of all the basic quarks of OOP, such as classes, objects,
methods, attributes, abstraction, encapsulation, polymorphism, events,
debugging, and message passing. In addition, the studies included in the
literature review conclude that obstacles arise from motivational problems,
which may be due to an uninteresting approach to teaching OOP concepts
(such as starting with technical language details), the complexity of the
learning and practice environment, improper or incomplete feedback on
students' activities, and fear of failure during the course. As this thesis is
concerned with the field of object-oriented programming, the difficulties and
misconceptions associated with OOP are addressed by providing an
environment in which students' motivation is sustained by introducing the
basic concepts in an entertaining and engaging way.
Table II - 1 Comprehensive difficulties identified from existing work
Topics Covered in Literature
Authors & year: Sheetz, et al., 1997; Milne and Rowe, 2002; Or-bach and Lavy, 2004; Ragonis and Ben-Ari, 2005; Thomasson, et al., 2006; Butler and Morgan, 2007; Kölling, 2010; Liberman, et al., 2011; Bashiru and Joseph, 2015
• Switching from Procedural programming to OOP ✓ Program Flow ✓ Program Execution ✓ Memory Management
• During the execution of the program
• Misunderstand of memory operation
• Pointers
• Virtual function • Object relates to memory
✓ ✓
Problem decomposition • Analysis of program
• Design of program • Problem solution building
✓ ✓ ✓
Importing from Library ✓ Understanding OOP Concepts
• Classification
• Applying
• utilizing
• implementing
✓ ✓ ✓ ✓ ✓ ✓ ✓
Classes • Adding/creating classes • Assigning properties and methods to
class/abstract class
• Identify classes
✓ ✓ ✓
Objects • Multiple object representation
• Relationship between classes ✓
Methods • Creating methods
• Assigning methods to classes
• Method Invocation
• Order of method invocation
• Categorization
• Return values • Parameter • Procedural function vs. OOP methods
✓ ✓ ✓ ✓
Attributes • Assigning properties to classes • Multiple attribute Misrepresentation
• Single attributes misrepresentation
✓ ✓
Abstraction • Execution
• The desirable level of abstraction
Inheritance • Understanding the concept • Inheritance of function
• Understand hierarchies
✓ ✓
Encapsulation Polymorphism
• Method Overriding ✓ ✓
Event • State change during execution ✓
Debugging • Testing
✓
Message passing ✓ Motivational issues ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
3. POSSIBLE CAUSES OF DIFFICULTIES IN LEARNING OOP
Many difficulties in learning OOP have come to notice through the
literature review; however, it is also a significant concern to find the possible
causes of those difficulties. Some of the reasons identified from the review are
discussed below:
3.1 Improper Feedback for High Cognition Topics
Feedback is a significant factor in successful and active learning.
Students can receive feedback from the instructor, from the
development environment, or from a testing or debugging tool (Butler and
Morgan, 2007). However, while feedback is provided on topics that need a
low level of conceptualization, such as syntax, variables, and data types,
feedback on highly conceptual issues such as program design, memory
management, and OO principles is either neglected or not properly given,
which results in deficient performance in the implementation of OOP concepts.
An instructional method that can scaffold students' learning and guide them
through a process such as program design can be valuable in reducing the
perceived difficulty of OOP's advanced topics.
3.2 Lack of Comprehension
In most instances, students cannot produce the right answers because
they cannot grasp what is going on in memory while a program is running;
they have not developed an accurate paradigm of thought in their minds. If the
basic model built in students' minds is not correctly constructed, the whole
model becomes a wreck (Milne and Rowe, 2002; Bashiru and Joseph,
2015). Misunderstanding of memory operations and misconceptions about
classes and objects are the source of several difficulties that occur at later
stages of learning (Bashiru and Joseph, 2015).
3.3 The Shift from Understanding to Implementation
Students who struggled to comprehend OOP concepts were also less
effective at using and applying them, because they had trouble recognizing
the concepts in practice (Rajashekharaiah, et al., 2017; Butler and Morgan,
2007). Understanding the syntax of programming languages was the only
aspect that did not exhibit this shift.
4. COMPETENCIES REQUIRED FOR MASTERING OOP
Competence refers to the cognitive abilities and skills that an individual
has, or may learn, to solve a specific problem (Weinert, 2001). Regarding
measurement, Klieme (2004) stated that competencies are the range
of situations or tasks that one needs to master, and that those
competencies might be assessed by challenging the student with a
sample of such (possibly simulated) situations. Determining such tasks or
situations for a course is essential for designing a competency model. The
competency model resulting from such identification describes and
measures the primary competencies an individual must
master in a specific topic. Klieme (2004) describes three types of
competency models:
• Competency structure models, usually composed of dimensions
(such as competence domains or competence characteristics), describing the
learner's cognitive dispositions needed to solve tasks and problems in the
required content domains
• Competency level models, providing detailed information or profiles
of the described competencies, and
• Competency development models, representing how competencies
will evolve.
Usually, a level or development model is based on a structural model.
In this section, the various competency models available for mastering OOP
are discussed, and at the end, the competency model proposed and
implemented by the researcher is presented.
The following sections discuss the various OOP competency models
identified in the literature:
4.1 Havenga Competency Model (2008)
Havenga (2008) discusses how high-performing student
programmers facilitate the development of successful computer programs
through thinking processes and strategies. The proposed model is intended to
assess improvements in students' skills in initial programming courses;
scores in the model would be assigned to a student's work in the same
way that teachers assign a score for a semester test. Figure II-2 shows
the dimensions of the proposed model. The research aims to find the
difference between successful and unsuccessful programmers, and
showed that a framework is needed to support novice programmers. However,
to the best of our knowledge, no evidence is available of measurement results
based on this model.
Figure II-2 Havenga’s competency model (2008)
4.2 COMMOOP Structural Model (2016)
Kramer et al. (2016) proposed a competency structure model and
evaluation instruments for OOP. The proposed model includes two
major components: 1) a set of candidates for (potentially measurable)
competencies, and 2) a category system that supplies a structure for these
competencies. The model has the following four dimensions and sub-
dimensions for measuring students' competencies in OOP:
4.2.1 OOP Knowledge and skills
This dimension includes the competencies required for acquiring core
programming knowledge and skills. Its sub-dimensions consist
of competencies for data structure, class and object structure, algorithmic
structure, the notional machine, representation structure, and execution
structure.
4.2.2 Mastering representation
The competencies required to understand a system's formal
description, such as the syntax or semantics of a programming language,
are covered in this dimension. One who masters this competency dimension
is considered a competent programmer. Its sub-dimensions include
languages, syntax, and semantics.
4.2.3 Cognitive process
This dimension of the COMMOOP structural model includes the
competencies related to the problem-solving stages, such as understanding
the problem, determining how to solve it, translating it into a computer
program, and testing and debugging that program. Programming is about
producing a piece of software that serves a specific predefined purpose, and
the process is usually regarded as an exercise in problem solving. The
dimension is further divided into problem-solving step, cognitive process
type, interpretation, and production of software applications.
4.2.4 Metacognitive processes
Weinert (Weinert, 2001) notes that metacognitive factors like
motivation, self-efficacy, perceived understanding, or theoretical value belief
may play a significant role in defining and measuring competencies. This
dimension exists in many competency models, but the authors place less
emphasis on this aspect.
4.3 Simona Hierarchy-based Competency Structure Model (2019)
Ramanauskaitė (Ramanauskaite and Slotkiene, 2019) proposed a
level-based competence structure for object-oriented courses and evaluated
students by having them design and implement selected data structures in
the Java programming language. The competency tree structure has been
transferred to a test competency tree; however, the lowest level has been
eliminated. The level of competence required to conduct the test, or the level
of competence evaluated in the previous test, is marked as the pre-required
competence, whereas the higher-level competence constitutes the level of
competencies being assessed.
Ramanauskaitė's model aims to evaluate students' skills, and the model
can be used to perform semantically expressed evaluation methods for both
manual and electronic evaluation processes. The authors emphasized that
the competencies in the model should be mapped appropriately to the
competencies required for a given study program, i.e., the levels in the
competency model could be used as guidelines for student evaluation.
Advanced students can skip some low-level tasks according to their
expertise and begin with higher-level tasks. The example competency model
proposed by the authors is shown in Figure II-3.
Figure II-3 Simona’s hierarchy-based competency structure model (2019)
The findings obtained from the proposed model indicate that the
distribution of scores and the weighting of the proposed e-evaluation process
are closer to teacher evaluation than to conventional e-assessment tasks.
Instead of displaying a single summary score across all skills, the results of
the individual tasks are shown. The authors have nevertheless not
implemented the model in any current study framework to demonstrate it in
practice.
5. APPROACHES FOR TEACHING OOP
Besides the ongoing debate about which instructional contents should
be taught in OOP courses, how those contents are taught is equally
important. This section describes the various teaching approaches for OOP
observed in the existing literature. Each of these is discussed in the following
subsections.
5.1 Objects-First Approach
Objects-first is a widely used approach for teaching the OOP
paradigm (Cooper, et al., 2003, Bennedsen and Schulte, 2007, Proulx, et al.,
2002, Mohammed, et al., 2018, Krishnamurthi and Fisler, 2019 and Kunkle
and Allen, 2016). It focuses on establishing the OOP concepts, i.e., objects,
classes, inheritance, and encapsulation, before introducing any procedural
programming concepts, i.e., assignments, loops, operators or conditional
statements, in the context of OOP (Cooper, et al., 2003). More simply, it
means focusing on usage before implementation (Livovský and Porubän,
2014).
In the classical approach to introductory programming, students begin
with a basic program and progress towards complex programming, which
allows them time to master each concept and build on it step by step. By
comparison, objects-first is considered a more challenging approach:
students have to conceptualize the OOP terms and learn to write the
corresponding syntax (Cooper, et al., 2003). The complexity of learning OOP
through objects-first also arises because students have to grasp multiple
things simultaneously, such as various concepts, ideas and skills, which is
mentally challenging (Proulx, et al., 2002).
Various strategies are used in the literature to reduce the challenges of
the objects-first approach, such as a design-first strategy using UML
(Unified Modelling Language) (Burton and Bruhn, 2003) to develop a visual
and intuitive model of objects and their relationships, or object-visualization
tools such as BlueJ (Kölling, et al., 2003), Java Power Tools (Proulx, et al.,
2002), Karel J. Robot (Bergin, et al., 1997), Alice 3D (Cooper, et al., 2003),
Jeliot 3 (Levy, et al., 2003), and various graphics libraries.
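As an illustration of the objects-first ordering, the following minimal Java sketch (class and method names are illustrative, not taken from any of the cited tools) defines a class and uses its objects before any loop or conditional appears:

```java
// Objects-first sketch: a class is defined and used before any
// procedural construct (loop, conditional) is introduced.
// Class and method names are illustrative only.
public class ObjectsFirst {

    // A small class modelling a real-world entity.
    static class Dog {
        private final String name;            // encapsulated state
        Dog(String name) { this.name = name; }
        String bark() { return name + " says Woof!"; }
    }

    public static void main(String[] args) {
        Dog rex = new Dog("Rex");             // instantiation: usage first
        System.out.println(rex.bark());       // prints "Rex says Woof!"
    }
}
```

The student's first contact is thus with a class, an object, and a method call; assignments and loops are deferred to later lessons.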
5.2 Imperative-First Approach
The imperative-first approach is also known as fundamentals-first,
procedural-first, or objects-later. It focuses on establishing the procedural
aspects of an OO language, such as assignments, control and conditional
structures, logical expressions, or functions, before introducing the OOP
quarks (Smolarski, 2003 and Kunkle and Allen, 2016). Writing source code is
also practised while learning the procedural or OOP aspects of the
programming language being taught.
One advantage of this approach is that students acquire application
basics that enable them to switch to new programming languages and
paradigms, if necessary, as these basics are "completely based on language
independence." This approach is used by professional educators, who first
teach basic principles and then guide students' progress in designing
solutions and solving problems (Johnson, et al., 2000).
Burton (Burton and Bruhn, 2003) points out that students with procedural
programming language experience will be more capable and competent in
learning OOP, because the programming paradigm includes two main
components: algorithmic thinking and structured programming. Algorithmic
thinking is itself considered a paradigm, because algorithms may contain
elements such as selection and repetition. Despite its many advantages, it is
still arguable whether understanding algorithms and structured programming
before learning an OO paradigm benefits students, because the OOP
paradigm involves modeling structures and relationships that differ from
those in procedural programming languages.
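The imperative-first ordering can be sketched in Java as well: assignments, a conditional, and a loop are exercised inside a single static method, with no user-defined object model beyond the obligatory enclosing class (names are illustrative):

```java
// Imperative-first sketch: assignments, conditionals, and loops are
// taught before any user-defined object model is introduced.
public class ImperativeFirst {

    // Sum the even numbers from 0 up to and including limit.
    static int sumOfEvens(int limit) {
        int sum = 0;                        // assignment
        for (int i = 0; i <= limit; i++) {  // loop
            if (i % 2 == 0) {               // conditional
                sum += i;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(sumOfEvens(10)); // prints 30
    }
}
```

Only after such procedural fluency is established would classes, objects, and the other OOP quarks be introduced.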
5.3 Concept-First Approach
This approach aims to provide a clearer picture of the problem area and
to establish the context in which the OOP concepts occur before giving
detailed descriptions of the individual OOP quarks; in other words, the
problem domain of OOP is explained as a whole at the beginning of this
approach (Livovský and Porubän, 2014). This approach primarily targets
users without programming-related experience (Haendler, 2019), as many
computer science students face problems with abstract thinking, especially
when abstracting real-world issues in OOP (Sien and Carrington, 2007).
Livovský (Livovský and Porubän, 2014) further provides the steps
required to implement the concept-first approach. The first step is knowing
the domain, which is considered a prerequisite rather than a
recommendation: participants must be well aware of the domain used as an
example or project, so that students can focus on the important things
instead of the process model. The second step is using complex examples,
emphasizing the concepts in a broader, related context rather than
explaining them in isolation, e.g., explaining multiple classes and the
communication, relationships and responsibilities between them. The third
step is to conceal programming language code and principles the students
do not yet know. The fourth step is to provide an immediate response; this
step focuses on giving feedback with minimum delay, so that students can
change their solution without extra effort, and for this reason UML diagrams
are not suitable, as they are not executable. In the fifth and final step of this
approach, computer games or other interactive environments are used, so
that students can observe the game world, engage with it, learn by watching
the effects of their actions, and receive sufficient and timely feedback.
5.4 GUI-First Approach
This approach illustrates the common attributes of all classes by using
applets or other Graphical User Interfaces (GUI) (Decker and Hirshfield, 1999).
Students first need to understand the classes' functions and their
components, followed by the OOP concepts, by developing GUI programs.
This approach may lead students to think that their curriculum contains more
practical programming than purely abstract theory (Gibbons, 2002). If
students can see their running programs displayed in a GUI rather than as a
static textual alternative, they may be more motivated and satisfied.
However, this approach has not been adopted by many academicians and
practitioners, due to its lack of emphasis on algorithmic thinking, structured
programming, and OOP design; some believe that students must look at
these concepts from an OOP perspective from the start, before they do any
hands-on programming (Hadjerrouit, 1998).
5.5 Game-First Approach
Yan (Yan, 2009) proposed a method of teaching OOP through games.
He used an interactive game-like environment named Greenfoot to show the
interaction of objects in a wombat scene (Kölling, 2010). In this approach,
students are exposed to OOP concepts, including objects, classes,
observing behavior, calling methods, and instantiating objects, using
Greenfoot. Another similar environment, BlueJ (Kölling, et al., 2003), has
also been used in the literature for teaching students to observe behaviors,
invoke methods, and instantiate objects using a picture game.
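Greenfoot's interaction model can be mimicked in plain Java without the Greenfoot library: a world repeatedly calls each actor's act() method once per simulation tick, which is roughly how Greenfoot drives a scene. The sketch below uses no Greenfoot classes; the names follow its conventions for illustration only:

```java
import java.util.ArrayList;
import java.util.List;

// A plain-Java mimic of the Greenfoot interaction model: the world
// repeatedly invokes act() on every actor each simulation tick.
// No Greenfoot classes are used; names are illustrative only.
public class MiniWorld {

    interface Actor { void act(); }

    // A wombat that moves one cell to the right each tick.
    static class Wombat implements Actor {
        int x = 0;
        public void act() { x += 1; }
    }

    public static void main(String[] args) {
        List<Actor> actors = new ArrayList<>();
        Wombat w = new Wombat();
        actors.add(w);
        for (int tick = 0; tick < 5; tick++) {  // run five ticks
            for (Actor a : actors) a.act();
        }
        System.out.println(w.x);                // prints 5
    }
}
```

The pedagogical point is that students see objects, instantiation, and method invocation concretely: the wombat is an object whose behavior emerges from repeated calls to one method.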
Al-Linjawi (Al-Linjawi and Al-Nuaim, 2010) suggested that Alice, a
game-like 3D environment, helps beginners and novices learn OOP's
fundamental quarks. Many other researchers (Mohammad, 2019 and Sideris
and Xinogalos, 2019 and Seng, et al., 2018 and Clark, et al., 2018 and Corral,
et al., 2014 and Livovský and Porubän, 2014) suggested that specific game-
like tools support the entire educational process, helping students without
programming-related experience to better understand OOP concepts and
develop programming skills. Additionally, these tools help students gain
motivation at the start of the course.
6. SERIOUS GAMES (SGs)
Game is a general term that applies to all organized play, consisting of
rules, objectives and challenges, undertaken for diversion or amusement.
Video games are ideal for learning because they allow players to practice
their learning skills and to control their learning according to their own style
(Papert, 1998). The category of games with an explicit and distinct aim of
learning, rather than mere amusement, is referred to as serious games
(SGs) (Abt, 1987). Serious games are student-centered, making the learning
process more relaxed, enjoyable and effective. The whole learning process
is carried out by playing games; entertainment and fun are the important
qualities that draw people into the learning experience.
Using games in curricular terms means converting the subject (such
as the laws of physics, or the law of supply and demand) into a game
mechanism that runs in a separate system based on choices and results
(Perrotta, et al., 2013). Other educational benefits of computer game
learning include improvements in problem-solving ability, motivation and
retention (Prensky and Thiagarajan, 2007), and it may provide an excellent
opportunity to stimulate students' abstract thinking, cognitive development
and further advanced thinking skills (Carbonaro, et al., 2008). Several
authors (Carroll, 1982, Gee, et al., 2008 and Shaffer and Clinton, 2005)
concluded that learning through digital games, e.g., simulation, video, or
computer games, has great potential as an instructional method in many
subject disciplines. The promising use of serious games has made STEM
learning comfortable and productive (Boyle, et al., 2016). Therefore, if
teachers apply digital games to the teaching of any subject area, then
students will not only achieve better learning results, but will also learn
happily through the interactivity and immersion of these games (Smith and
Gotel, 2007 and Gee, et al., 2008 and Shaffer and Clinton, 2005 and Carroll,
1982).
7. AVAILABLE TOOLS FOR LEARNING OOP
Computer programming is closely linked to problem-solving and to
building a higher cognitive understanding of advanced concepts. To solve
complex problems, students should first acquire the basic concepts at the
grassroots level; as Muratet (Muratet, et al., 2009) explains, basic
programming concepts are difficult for inexperienced students to
understand, and rote learning is almost impossible in the programming
context. Beginners learning programming languages want to see their
programs' immediate outcomes, yet learning a programming language also
requires a great deal of comprehension and work to produce useful results
(Anderson and McLoughlin, 2007). Immediate feedback on code during
programming allows students to revise iteratively more easily and to see the
effects of their code changes immediately (Phelps, et al., 2005). However,
offering such frequent feedback is demanding, and programmers will
inevitably make errors often. Researchers in the area use simulation
techniques to visualize the outcomes of students' programs and to
familiarize them with the syntax and examples of programming processes
(Jiau, et al., 2009).
The object-oriented paradigm (OO paradigm) is considered a more
natural domain to work with, as OOP's problem domain comprises objects
that correspond to real-life objects (Livovský and Porubän, 2014).
Nevertheless, understanding OOP's fundamental concepts and mapping
them to real-life scenarios becomes challenging for students at the
foundation level (Yan, 2009). The shift in programming paradigm, such as
from procedural to OOP, is another challenging task, as it contains many
overlapping concepts (Bennedsen and Schulte, 2007).
To provide a good learning experience, one should start with the
conceptualization of OOP's basic features, such as object, class, inheritance,
and polymorphism, and then move towards the technical details of the
language. Initial exposure to OOP concepts should be attractive and fun for
students, and fun can be part of the learning environment that keeps
students focused on learning. Serious games can address this problem by
providing close-to-real-life scenes in an immersive and engaging mode of
interaction, without worrying about failure.
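The conceptual ordering suggested above, quarks before language detail, can be summarized in one small Java sketch covering object, class, inheritance, and polymorphism (the class names are illustrative):

```java
// One compact sketch of the OOP "quarks" named above: class,
// object, inheritance, and polymorphism. Names are illustrative.
public class OopQuarks {

    static abstract class Shape {                  // class (abstraction)
        abstract double area();                    // polymorphic behaviour
    }

    static class Square extends Shape {            // inheritance
        private final double side;                 // encapsulated state
        Square(double side) { this.side = side; }
        @Override double area() { return side * side; }
    }

    static class Circle extends Shape {            // inheritance
        private final double r;
        Circle(double r) { this.r = r; }
        @Override double area() { return Math.PI * r * r; }
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Square(2), new Circle(1) };  // objects
        for (Shape s : shapes) {
            System.out.println(s.area());          // dynamic dispatch
        }
    }
}
```

Each concept maps to one real-life idea (shapes and their areas) before any further technical detail of the language is needed, which is precisely the mapping to real-life scenarios that novices find difficult.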
In the past decade, the use of serious games to learn OOP and
other programming-related topics has increased. Many researchers stress
the potential of serious games as initial exposure for learning the basic
concepts of OOP interactively and engagingly. Mainly, there are two primary
purposes for using serious games for learning OOP: first, to promote
learning; second, to motivate and engage the students. Examples of
serious games developed for learning OOP can be found in the following
sections:
7.1 MUPPETS
MUPPETS (Multiuser Programming Pedagogy for Enhancing
Traditional Study) (Phelps, et al., 2005) is an immersive OOP learning
framework. The framework itself is not a game, but it has an essential
mechanism that can provide prompt feedback, interaction, and community
integration like modern games.
In the MUPPETS virtual environment, students work on a domain that
simulates real-world programming work. Students learn by creating objects
and avatars and sharing the created objects with peers and upper-division
students for improvement, which contributes to students' success. The
senior students act as role models and helpers to the novices. The authors'
findings from their student group research praise the strengths of the system.
MUPPETS intends to improve affective and cognitive learning outcomes; it
engages students through immediate feedback integrated into the
environment and enhances peer-to-peer collaboration.
7.2 Alice
Alice (Cooper, et al., 2000) is an immersive programming environment
for 3-D graphics, designed to draw beginners in through enticing 3-D
environments and to explore a new learning medium. Alice has an OOP
flavor: users learn by writing simple scripts that control the appearance and
actions of objects. When the script executes, objects react to user input via
mouse and keyboard, and each action is smoothly animated over a certain
duration.
Alice is a learner-centric environment that facilitates OOP by allowing
objects to be manipulated directly using a limited set of simple commands. In
Alice, a simple story can be built by selecting objects in the world (such
as skaters) and calling one of their methods (such as Skate). Students can
use the graphical interface to incorporate method calls by dragging the name
of the method from the object list and dropping it at a valid location in the
calling method. This textual representation is intended to allow a line of code
to be read as a sentence describing the object (Cooper, et al., 2000).
Many authors have validated the potential of the environment through
statistical tests: Al-Linjawi and Al-Nuaim (2010) showed that students'
knowledge of inheritance increased, and others have shown improvement in
computational thinking among middle school students (Qualls and Sherrell,
2010, Lee, et al., 2011 and Razak, et al., 2016).
7.3 Scratch
Scratch (Scratch, 2010) is a syntax-free visual programming
environment developed by the MIT Media Lab in collaboration with UCLA. It
provides a point-and-click interface for creating media-rich games, stories,
interactive animations, music and art, and other applications. Scratch is
suitable for teaching basic programming concepts such as variables, arrays,
logic, and user interfaces. Scratch works similarly to Alice (Alice, 2008),
allowing programs to be built by placing sprites or objects on a screen and
manipulating them through drag-and-drop programming; besides basic
programming concepts, it familiarizes students with object-oriented concepts
by having them visually manipulate objects to implement simple games.
Such visual programming environments are often perceived as ideal
because they allow students to create solutions quickly, without a
background in game programming or the need for excessive program code.
7.4 Jeliot
Jeliot is an interactive visualization platform designed to help beginners
learn procedural and OOP languages (Moreno and Robles, 2015). Jeliot
supports the objects-first and fundamentals-first approaches for initial
programming courses (Myller and Nuutinen, 2006). However, Jeliot is
primarily intended for Java OOP, and the need to transition between various
IDEs can leave weaker students feeling confused (Mutua, et al., 2012).
Furthermore, even though Jeliot can visualize control flow, the effects of
variables cannot be visualized (Bednarik and Tukiainen, 2006).
7.5 Greenfoot
Greenfoot (Kölling, 2010) is a comprehensive educational software
development environment designed for teaching young novices. Greenfoot
eases the use of the standard Java programming language by offering a
customized environment that minimizes much of the complexity of OOP. At
the same time, it makes creating graphics, animations, and sounds easy, so
that compelling examples can be handled as early as possible. Young
learners can create simple games and simulations with Greenfoot while
learning basic programming principles, and user feedback seems to indicate
that these examples successfully attract learners. Greenfoot can serve either
as a first programming environment or as a follow-up environment,
depending on the learner's age.
7.6 BlueJ
BlueJ is a Java framework optimized for teaching basic OOP concepts
(Kölling, et al., 2003). It aims to provide an easy-to-use teaching
environment for the Java language. However, its source code editor is not
technically equivalent to industrial IDEs, and integrating BlueJ with a
full-featured IDE is ineffective.
7.7 ZTECH
ZTECH (Seng and Yatim, 2018) is a 2D role-playing game designed to
inspire students to learn OOP in a relaxed, interactive atmosphere. In the
gameplay, the character Ztech travels around the map using a navigation
system, battling enemies to gain experience and win gold points. As players
enjoy the game, they learn OO expertise. The gaming part aims to increase
users' interest in learning the material. The game presents users with basic
OO concepts such as encapsulation, inheritance and polymorphism, as well
as other basic programming concepts, and feedback is delivered through
appropriate dialogue. The authors' findings from testing their game on
inexperienced learners include an improvement in learner confidence,
courage, and determination to learn and understand OOP concepts.
7.8 POO Serious Games
POO (Mohammad, 2019) is a 2D mobile-based serious game
developed to teach the basics of OOP concepts to beginners in software
development in a fun and engaging way. The gameplay is inspired by the
environment of a zoo. The game's primary purpose is to construct and
identify animals and understand various processes, e.g., "voices, acts,
attitudes, etc.," by using the OO paradigm. An in-game assessment
technique is applied to evaluate learners' knowledge of the field at each level
of POO, and the levels cover the "class, object, inheritance, and
polymorphism" OOP concepts. However, the author did not statistically
validate the developed game.
7.9 Discussion
OOP has been taught through both the playing and the creation of
various SGs. The effects of other game-related tools, such as simulators,
microworlds and game engines, have also been evaluated in addition to the
SGs. Alice and Jeliot are used as immersive learning environments in which
code must be written to define objects. Greenfoot, though, is an educational
software-creation platform for OOP learning scenarios, and learners are
expected to understand the technical specifics of the code. BlueJ and
MUPPETS are optimized learning systems and again require sound coding
skills to learn with. The effect on students was observed in terms of the
results these tools produced, and almost all of the reviewed studies reported
results that affect student learning directly or indirectly. Learning with the
other related instruments, however, requires working directly with
programming code that lacks full game features, which can bring
demotivation, boredom, irritation and lack of interest in the topics.
In comparison, ZTECH and POO are SGs for learning OOP, and their
integration has a positive influence. The studies reported cognitive and
affective outcomes, learning improvement, problem-solving skills, motivation,
student engagement, improved grading, reinforced learning, knowledge
acquisition, and student satisfaction, but they still lack statistical evidence
for improvement in students' learning outcomes. The studies do not
demonstrate the purported link between game motivation and the actual
learning results to be obtained by including SGs. They also fail to consider
the difficulties and misconceptions of students that impede mastering OOP
skills. Other problems include the lack of use of learning theories and
instructional design for prototyping SGs. Thus, this research addresses all of
these constraints, namely the students' difficulties and misconceptions and
the incorporation of learning theories and instructional design into the design
and development of the SG prototype. Statistical evidence for the
effectiveness of the developed prototype is also closely analyzed, and the
relationship between motivation and actual learning outcomes is provided.
Table II-2 presents a detailed review of existing studies dedicated
to available serious games for learning OOP. The reviewed studies were
examined for the instructional contents covered in the available SGs, the
programming approaches applied, the learning theories or instructional
design applied, and the effect observed from the SG intervention, if any.
The instructional contents covered in the available serious games
include core OOP concepts such as classes, objects, inheritance,
encapsulation, message passing, polymorphism, abstraction,
modularization, and generalization. Most of the studies incorporated the
object and class concepts by implementing avatars, game scenes, enemy
characters, balls, paddles, etc. In one of the studies (Corral, et al., 2014), all
the basic concepts were represented in a tangible and visible way through a
Tangible User Interface (TUI) with sensing, wireless communication, and
user-directed operation of several small physical devices. The review
showed that existing studies did not consider the students' difficulties as
instructional content to be overcome and improved.
The studies under investigation involve various programming
approaches applied for OOP learning. The results show that four studies
(Phelps, et al., 2005, Chen and Cheng, 2007, Yan, 2009 and Livovský and
Porubän, 2014) applied objects-first; seven applied game-first (Xu, 2009,
Seng and Yatim, 2014, Seng, et al., 2015, Seng, et al., 2016, Seng and
Yatim, 2018, Seng, et al., 2018 and Poolsawas, et al., 2016); two applied
GUI-first (Yulia and Adipranata, 2010 and Al-Linjawi and Al-Nuaim, 2010);
two applied code-first (Depradine, 2011 and Corral, et al., 2014); and one
used concept-first (Rais and Syed-Mohamad, 2011). Thus, most of the
studies applied game-first as the programming approach, followed by the
objects-first, GUI-first, code-first, and concept-first approaches.
One study (Chen and Cheng, 2007) used a mixture of imperative
programming and objects-first as its programming approach, and we treated
it as an objects-first method. Five studies from a single researcher (Seng and
Yatim, 2014, Seng, et al., 2015, Seng, et al., 2016, Seng and Yatim, 2018
and Seng, et al., 2018) claim game-first as the programming approach by
considering the games as supporting educational tools.
The current literature also revealed a paucity in the use of learning
theories and instructional designs in the available serious game prototypes
for learning OOP (Xu, 2009, Al-Linjawi and Al-Nuaim, 2010, Yulia and
Adipranata, 2010, Rais and Syed-Mohamad, 2011, Depradine, 2011,
Livovský and Porubän, 2014, Poolsawas, et al., 2016, Seng and Yatim, 2014,
Seng, et al., 2015, Seng, et al., 2016, Seng and Yatim, 2018, and Seng, et
al., 2018). Despite some promising results, the current literature does not
show the presumed link between the motivation provided by the games and
the actual learning outcomes supposed to be achieved by incorporating SGs
for learning OOP (Phelps, et al., 2005).
The effect on students was observed in terms of the outcomes produced
through SGs. Almost all the studies under investigation presented outcomes
that directly or indirectly impact student learning. However, the majority of
the studies lack statistical evidence for improvement in students' learning
outcomes under the various modes of SG intervention. The results
presented in Table II-2 reveal that most of the studies reported cognitive and
affective outcomes, learning improvement, problem-solving skills, motivation,
student engagement, improved grading, reinforced learning, knowledge
acquisition, and student satisfaction.
Table II-2: Summary of the existing studies used serious games for learning OOP
Author & year Programming Approach
OOP CONCEPTS
Participants Serious games
intervention
Game Contents/ Difficulties
Incorporated
Learning Theory/Instructio
nal Design
Games/Tools Incorporated
Outcomes
Pla
yin
g
Cre
atin
g
Rela
ted
tools
Phelps, et al., 2005
Object-First • Encapsulation • Inheritance • Polymorphism
N= 2 or 4 Final Year Project
✓ ACM/IEEE Computing Curriculum 2001 Computer Science
Active and Constructivist Learning
Name: MUPPETS Game Genre: Not Available Details: Virtual Environment in which Programming is learned by creating objects and avatars. Created objects can be shared with peers and upper-division students for further improvements.
Cognitive & affective level outcomes and peer-to-peer collaboration improved in students. Due to the small number of participants, the results were not validated.
Chen and Cheng, 2007
Mix(imperative and objects-first)
• OOP principles and Design patterns
N=38 (19 Teams) Fall 2005 OOP Laboratory Undergraduates students
✓ ✓ ACM/IEEE Computing Curriculum 2001 Computer Science
Instruction Design: teaming, theme selection, theme confirmation, one-to-one instruction, project monitoring, support from TA, midterm milestones, final
Name: Game Framework Game Genre: Not Applicable Details: Game Framework was used in the laboratory course to create Games
Student deeply understands OOP concepts to develop Games. For sixteen teams, results were significant but insignificant for two teams. One team
47
evaluation, and grading.
let fall out at the start, of the course.
Xu, 2009 Game-First • Basic OOP concepts
• Graphics • Animation • JavaBeans • vent handling,
- Special Topics (Game Programming)
✓ - - Name: WormChase, Breakout, Othello Game Genre: Puzzle Details: Game’s static scenes, dynamic object animation, collision detections, player controls, a score calculation system, and a level system were analyzed and synthesized to understand OOP and COP concepts.
Cognitive & affective level outcomes and improvement in Problem-solving skills. Limited participant’s information led to the invalidation of results.
Yan, 2009 Object-First • Class • Object
- 1st year CS/IT
✓ - Experiential learning
Name: GreenFoot and BlueJ Game Genre: Not Applicable Details: Wombat game scenario was implemented in Greenfoot to understand Objects, instantiation, class, observing behaviors, and invoking methods. BlueJ was used to create Picture Game by applying the concepts learned using Greenfoot.
The student successfully learns about Objects, Classes, instantiation, method invoking, and Observing behaviors. Limited participant’s information led to the invalidation of results.
48
Al-Linjawi and Al-Nuaim, 2010
GUI-First • Inheritance
N=45( Treatment Gr: 24 Control Gr: 21) 3rd-year female students
✓ - - Name: Alice Game Genre: Not Applicable Details: Bunny family Tree was created for understanding inheritance
Student’s Knowledge of inheritance increased in pre-and post-tests between the control and treatment groups. The results were statistically significant.
Yulia and Adipranata, 2010
GUI-First • Class • Object • Inheritance
N=125 It is divided into five heterogeneous groups. 2ndsemester
✓ ✓ ✓ - - Name: GameMaker, MinimUML, and WARCRAFT Game Genre: Not Applicable Details: GameMaker was used for Object concepts by creating Games, MinimUML for Classes, and WARCRAFT for learning Inheritance concepts
Improvement in student’s grading and passing ratio. The results were significant when comparing results with previous semester results.
Rais and Syed-Mohamad, 2011
Concept-First • Classes • Objects • Encapsulation • Inheritance
N=10 Treatment Gr. (GAPS and Alice) and Control Gr. (Traditional learning) 2nd Year CS
✓ ✓ Contents of: Principles of Programming, Programming Methodology and Data Structures, Database Organization and Design, Software Development Workshop
- Name: Traditional learning, GAPS 1.0 and Alice 2.0 Game Genre: Not Applicable Details: Students learn about basic programming concepts, classes, and objects by playing the GAPS 1.0 2D game, and about objects, classes, methods, repetition, and control structures by dragging and dropping graphic tiles in the Alice 3D environment.
Student learning improved, with higher percentage scores and better grades, although the result for learning about objects using Alice was insignificant. Overall, the results were statistically significant.
Depradine, 2011
GUI-First • Classes and objects • Encapsulation
• Class Design (e.g., inheritance versus composition relationships)
• Design by Contract (e.g., exception handling)
• Advanced Concepts (e.g., abstract classes)
• Applications using Class Libraries
N=54 Advanced level II (second year)
✓ ✓ - - Name: SIMOO Game Genre: Arcade Details: SIMOO is a paddle, balls, and blocks game. A paddle-and-single-ball version was presented to the students, to be completed by applying the programming concepts.
Students' ability to solve programming problems improved. The course and final examination results were satisfactory compared with prior years. The statistical significance of the results was limited because of the preliminary nature of the study.
Livovský and Porubän, 2014
Object-First • Object • Class • Attribute • Operation • Encapsulation • inheritance
N=14 1st Phase: teacher and Ph.D. Student 2nd Phase: 2nd-year students 3rd Phase: 14 students
✓ - - Name: Amiga action-adventure game Game Genre: RPG Details: A 2D game in which the player has to shoot enemies to advance to the next level.
Five students showed significant results for the MCQ test but insignificant results for the explanation-type questions. Three students showed insignificant results for MCQs but significant results for the explanation type; two students ignored the explanation-type question, and four students did not submit their responses. The results for learning performance were significant.
Corral, et al., 2014
GUI-first • Basic OOP concepts and Events
N=30 (Experimental Group: 15, Control Group: 15) 1st-year students (Industrial Technology Engineering)
✓ - Experiential learning
Name: TUI Sifteo cubes Game Genre: Not Applicable Details: OOP concepts were learned by programming the visible and tangible Sifteo Cubes
Student learning improved; the experimental group achieved higher marks in the acquisition of basic OOP concepts, spent less time on task completion, and reported less difficulty. The results were statistically significant.
Seng and Yatim, 2014; Seng, et al., 2015; Seng, et al., 2016; Seng and Yatim, 2018; Seng, et al., 2018
Game-First
• Basic OOP concepts,
• Control and Repetition Statements,
• Array
N=40 BS(CS) and BS(SE), 1st year, 3rd semester; N=60, 1st year BS(GD)
✓ - - Name: Ztech de Game Genre: RPG Details: The player character Ztech moves around the map using the navigation system and advances to the next level by defeating the boss.
Students showed significant improvement in their final examination grades. The results were significant.
Poolsawas, et al., 2016
Game-First • Object • Classes • Inheritance
N=40 3rd year, majoring IS and ID&GD
✓ - Problem-Based Learning
Name: Unity 3D Game Genre: Not Applicable Details: OOP concepts were learned from the tool’s objects such as game scenes, Enemy Characters, logics, and items.
Students understood the OOP concepts using the GDP, with improved scores from pre-test to post-test. The findings were statistically significant.
MUPPETS: Multi-User Programming Pedagogy for Enhancing Traditional Study; SCRUMBAM: the combination of the agile methodologies Scrum and Kanban; SIMOO: Simulator for Teaching Object-Oriented Programming; RDGC: Rapid Digital Game Creation; IS: Information System; ID & GD: Interactive Design and Game Development; GDP: Game Development Platform; RPG: Role-Playing Game; TUI: Tangible User Interface; GAPS: Game-Based Approach to Support Programming Skill
8. GAME ATTRIBUTES CONTRIBUTING TO LEARNING AND MOTIVATION FOR OOP
Game elements are the components that compose a game (Bedwell, et al., 2012). Game elements, sometimes known as game attributes (Bedwell, et al., 2012), can promote learning and boost students' interest if used adequately during game design. Serious game attributes are those aspects of a game that support learning and engagement (Yusoff, 2010). To ensure that serious games provide the same teaching practices as traditional teaching methods, certain successfully proven aspects of traditional teaching are important to consider in their design and development. These successful game design elements are called game attributes. The widely accepted view is that games themselves are not inherently appropriate for learning, but game elements can be embedded within an instructional framework to improve the learning process (Garris, et al., 2002). Ricci (Ricci, et al., 1996) suggested that instruction integrating game features increases student motivation, resulting in greater exposure to learning programs and better retention. Game attributes support learning, encouragement, and interaction, and encourage players to think critically while playing (Khowaja, 2017).
Aligning learning outcomes with game attributes, or conversely selecting game attributes to produce expected learning outcomes, is difficult; on the other hand, tying game attributes to particular learning outcomes to achieve desired learning objectives would be advantageous to instructors, trainers, practitioners, and even the learners themselves. Several studies have focused on identifying the game attributes dedicated to effective learning and to user motivation and engagement; some of them are discussed in the following sections:
8.1 Game Attributes by Garris et al., (2002)
Based on a review of games used for motivation and learning, the study concluded that every game has six core elements: (1) fantasy, (2) rules/goals, (3) sensory stimuli, (4) challenge, (5) mystery, and (6) control. The description of these attributes, and of the outcomes they produce when applied in games, is given in Table II-3. The authors concluded that the motivational characteristics of games encourage participation, which ultimately leads to learning (Garris, et al., 2002).
8.2 Game Attributes by Wilson et al., (2009)
The researchers extended Garris's work (Garris, et al., 2002) by determining which game attributes contribute to learning outcomes. The authors identified 18 potential game attributes: (1) Adaptation, (2) Assessment, (3) Challenge, (4) Conflict, (5) Control, (6) Fantasy, (7) Interaction (equipment), (8) Interaction (interpersonal), (9) Interaction (social), (10) Language/communication, (11) Location, (12) Mystery, (13) Pieces or players, (14) Progress and surprise, (15) Representation, (16) Rules/goals, (17) Safety, and (18) Sensory stimuli. The description of these attributes, and of the outcomes they produce when applied in games, is given in Table II-3. Wilson's attributes may be used as a template for evaluating the possible learning outcomes of playing.
54
8.3 Game Attributes by Yusoff, (2010)
Twelve further serious game attributes were identified to support effective learning (Yusoff, 2010). The serious game attributes suggested by the author are: (1) incremental learning, (2) linearity, (3) attention span, (4) scaffolding, (5) transfer of learned skills, (6) interaction, (7) learner control, (8) practice and drill, (9) intermittent feedback, (10) rewards, (11) situated and authentic learning, and (12) accommodating the learner's styles. Table II-3 describes these attributes and the outcomes they produce when applied in games.
8.4 Game Attributes by Bedwell et al., (2012)
In this study, an extensive review linking game attributes with learning was conducted, and as a result the authors identified many overlaps between different game attributes (Bedwell, et al., 2012). Using the card-sort technique, subject matter experts (SMEs) grouped the attributes identified by Wilson (Wilson, 2009) into nine non-overlapping categories; the attribute progress/surprise originally proposed by Wilson was split by the SMEs into the two individual attributes of progress and surprise because of their inherent disparity, yielding a list of nineteen attributes. The description of the attributes, and of the outcomes they produce when applied in games, is given in Table II-3.
8.5 Game Attributes by dos SANTOS, (2019)
A mapping study to identify the game elements used in serious games for learning programming was conducted, and the authors identified 27 attributes used in 43 serious games (dos Santos, et al., 2019). The authors classify the identified attributes into three categories: dynamics, components, and mechanics. There were four attributes under dynamics: (1) fantasy, (2) meaning, (3) constraint, and (4) progression; among them, fantasy is the most used element in the games. Under components, nine attributes were classified: (1) level, (2) quest, (3) avatar, (4) virtual good, (5) boss fight, (6) hint, (7) leaderboard, (8) combat, and (9) card; among them, level, quest, and avatar were the most used attributes in the identified games. There were 14 elements under mechanics: (1) goal, (2) point system, (3) reminder, (4) time pressure, (5) change difficulty, (6) progressive disclosure, (7) competition, (8) achievement, (9) win state, (10) cooperation, (11) resource acquisition, (12) badge, (13) loss aversion, and (14) turns; the most used attributes from this category were goal and point system.

Overall, the six attributes most commonly used in the identified games are fantasy, level, quest, avatar, goal, and point system. However, the authors avoid discussing which attributes are essential for serious games for learning programming, owing to the lack of evaluation of the game elements. Moreover, it has been observed that the authors overlap the meanings of many attributes.
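Purely as an illustration of how the two most used mechanics, goal and point system, might be combined in an object-oriented design, consider the following minimal sketch. All class and method names here are hypothetical and are not taken from any of the reviewed games:

```python
class PointSystem:
    """Tracks the score representing the quality of a player's achievements."""
    def __init__(self):
        self.score = 0

    def award(self, points):
        # Positive feedback: accumulate points for completed activities.
        self.score += points


class Goal:
    """A goal is met once the accumulated score reaches its target."""
    def __init__(self, description, target_score):
        self.description = description
        self.target_score = target_score

    def is_met(self, point_system):
        # Reaching the target score corresponds to the "win state" mechanic.
        return point_system.score >= self.target_score


# A level could bundle a goal with the point system measuring progress toward it.
points = PointSystem()
level_goal = Goal("Collect 100 coins", target_score=100)

points.award(60)
print(level_goal.is_met(points))   # goal not yet met
points.award(40)
print(level_goal.is_met(points))   # goal reached: win state
```

The design choice mirrors the mapping study's separation of mechanics: the point system only accumulates score, while the goal only evaluates it, so each element can be varied independently.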
8.6 Discussion
The descriptions of the various game attributes, and of the possible outcomes produced using those attributes, from the existing studies are summarized in Table II-3.

Matching the desired learning outcomes with the game attributes, or conversely selecting the game attributes to produce a desired outcome, is a difficult task. On the other hand, tying the game attributes to an explicit instructional design to achieve intended learning outcomes would help trainers, instructors, practitioners, and learners. To ensure that serious games offer the same degree of teaching experience as traditional teaching methods, it is important to integrate successfully established traditional teaching elements into the design and development of the serious games prototype.
The game attributes most widely used by the authors of the reviewed studies were selected for incorporation into the design of the SGs model. The selection is also based on the type of outcomes produced by the game attributes. Conclusively, the game attributes defined by Garris (Garris, et al., 2002), i.e. (1) fantasy, (2) rules/goals, (3) sensory stimuli, (4) challenge, (5) mystery, and (6) control, plus the attribute "progress", are used by the majority of the authors. However, we limit ourselves to the six game attributes defined by Garris (Garris, et al., 2002), as the outcomes produced by the "progress" attribute have not yet been well defined by the authors. Notably, to make learning activities intrinsically motivating, Malone (Malone, 1981) proposed the challenge, curiosity, and fantasy attributes as particularly effective.
Table II-3: Summary of game attributes identified from the existing studies
Attributes / Description / Output / Studies (the check marks in each row indicate use, in order, by: Garris, et al., 2002; Wilson, 2009; Yusoff, 2010; Bedwell, et al., 2012; Dos Santos, et al., 2019)
Fantasy
The game's fantasy element represents something different from real life and evokes a mental image that does not exist (Garris et al., 2002; Malone and Lepper, 1987).
Increase learning and interest (Cordova and Lepper, 1996; Parker and Lepper, 1992).
Increase the Automaticity
Increase the focus of attention and self-absorption (Driskell and Dwyer, 1984)
✓ ✓ ✓ ✓
Rules/Goals
The rules make up the game's goals and are the well-defined standards defining how to win a game. Serious games require specific, well-defined rules and guidelines, together with feedback on progress towards achieving goals. There are three types of rules: (1) system rules (i.e., functional parameters inherent in the game), (2) procedural rules (i.e., in-game actions that regulate behavior), and (3) imported rules (i.e., rules originating from the real world).
Enhance performance and motivation (Locke and Latham, 1990)
✓ ✓ ✓ ✓
Sensory stimuli Visual or auditory stimuli help to change perception and imply temporary acceptance of an alternative reality.
Enhance the motivational appeal of instructional activities
Helps in grabbing attention (Malone and Lepper, 1987)
✓ ✓ ✓
Challenge
Challenge refers to the difficulty of achieving the goal. A challenging game has multiple specified goals, progressive complexity, and information ambiguity. The challenge also increases fun and competition by creating obstacles between the current state and the target state.
Improve personal competencies, Engaging competitive or cooperative motivations,
Improves the learner’s retention of knowledge
✓ ✓ ✓
Mystery
Mystery refers to the gap between known and unknown information. It is the product of differences or inconsistencies in knowledge. Inconsistent information, complexity, novelty, surprises and violations of expectations, incompatible ideas, and the inability to make predictions enhance the mystery in a game.
Drives learning; evokes curiosity to complete the next levels or quests; enhances motivation
✓
✓ ✓
Control
It is the degree to which learners in the game can direct their learning activities, providing self-learning and self-exploration abilities that fit their pace and experience.
Enhance motivation, learning, and affect skill-based learning.
✓ ✓ ✓ ✓
Adaptation
Adaptation refers to the adjustment of the difficulty of game levels according to the player's skills by matching challenges and possible solutions
Increase learner’s cognitive strategies
✓ ✓
Assessment
It is a measure of achievement by continuously monitoring players' activities (e.g., points or score).
Enhance motivation and attitude
✓ ✓
Conflict
The way the problem is solved in the game usually drives the game's plot or action, providing interaction. Conflict in the game creates a competition with other players or with an outside force. There are four types of conflict: (1) direct conflict, (2) indirect conflict, (3) violent conflict, and (4) non-violent conflict.
Enhance positive cognitive learning
✓ ✓
Interaction (equipment)
It refers to the adaptability and operability of the game: when the player takes any action, a corresponding change occurs in the game.
Improve skill-based learning
✓ ✓
Interaction (Interpersonal)
Face-to-face interaction, the relationship between players in real space and time, provides opportunities for others to recognize achievements, and challenges become meaningful, thereby stimulating participation.
Not-provided ✓ ✓
Interaction (Social)
Interpersonal activities using technology as a medium encourage recreational gatherings by creating a sense of belonging.
Not-provided ✓ ✓
Language/ Communication
The specific communication rules of the game may be an important part of the game. The two communication methods are verbal and written.
Not-provided ✓ ✓
Location
The physical or virtual world where the game is located. It affects rules, expectations, and solution parameters. The location may be real, or fantasy, and space may be bound, unbound, or augmented.
Not-provided ✓ ✓
Pieces or players
Objects or people such as proxy items, avatars or human participants, contained in the game narrative or game scenario.
Not-provided ✓ ✓
Progress It refers to the value that indicates how the player progresses towards the game goal
Not-provided ✓ ✓ ✓
Surprise The random elements of the game. Not-provided ✓ ✓
Representation
The player's perception of the reality of the game. It is a subjective function that makes the game psychologically real.
Increase the psychomotor skill’s learning
Enhance motivation
✓ ✓
Safety
Actions are separated from consequences (i.e., a safe way to experience reality). The only cost of losing is a loss of dignity; the outcomes are not as demanding as those of the situation being modeled.
Not-provided ✓ ✓
Incremental learning
Provide learning materials and gradually introduce learning activities
Helps to understand/adopt new learning, keep the learner engaged
✓
Linearity
The degree to which the game ranks learning activities (and is suitable for a continuous learning style), and the degree to which active learners may construct their sequence.
Not-provided ✓
Attention span
It involves the cognitive processing and short-term memory load imposed by the game on the player. These loads need to be carefully calibrated for the target learner.
Not-provided ✓
Scaffolding The support and help given by the game in learning activities.
Not-provided ✓
Transfer of learned skills
Support provided by the game to apply previously learned knowledge to other game levels.
Not-provided ✓
Interaction The extent to which game activities require learners to respond and participate
Not-provided ✓
Practice and drill
Providing repetitive learning activities in which the task becomes progressively more demanding better achieves the expected learning outcomes.
Not-provided ✓
Intermittent feedback
The degree of feedback received during each game interaction, or whether the frequency of feedback is low.
Not-provided ✓
Rewards The arrangement in the game aims to encourage learners and maintain their motivation.
Not-provided ✓
Situated and authentic learning
It involves providing a game environment or world where learners can relate their learning to the outside world's needs and interests.
Not-provided ✓
Accommodating to the learner's styles
It refers to the game's ability to adapt and influence different learner styles by providing multiple gameplay methods.
Not-provided ✓
Constraint
Constraints are rules that apply to the entire game mode. They provide boundaries to the players that enable or restrict the game in some way.
Not-provided ✓
Level
Levels are parts of the game that contain one or more complete challenges. Subsequent levels are usually built on previous foundations by adding new or more complicated problems, new abilities, exploring new areas, or adding new opponents. Levels range from simple to complex or from easy to hard.
Not-provided ✓
Quest
Quests provide the players with a fixed goal and usually consist of a series of interconnected challenges that heighten the sense of accomplishment
Not-provided ✓
Avatar It is a picture or character used as a graphical representation of the game player.
Not-provided ✓
Virtual good Virtual goods are items purchased or traded in games, such as coins or money.
Not-provided ✓
Boss fight
These are challenges with the primary opponent (usually physical conflicts) and typically mark the entire game level's final challenge. Many games require players to achieve specific achievements or score levels to win the opportunity to participate in boss challenges.
Not-provided ✓
Leaderboard The leaderboard in games is used to show players' ranks relative to their peers in the game.
Not-provided ✓
Hint An indirect suggestion, in the form of text, audio, or animation, provided to help solve or achieve something.
Not-provided ✓
Combat It is a fight or contest between individuals or groups.
Not-provided ✓
Card A piece of paper or similar item used to provide information in the game.
Not-provided ✓
Point system A system to identify players by evaluating their activities based on scores representing the quality of achievement.
Not-provided ✓
Reminder It is alert for times or lives remaining in the game.
Not-provided ✓
Time pressure It is part of the challenge and subjective state related to the players' perception about allocated time to complete a task.
Not-provided ✓
Progressive disclosure
It is a concept of managing information that indicates that everything in the game should evolve naturally, from simple to complex.
Not-provided ✓
Competition The competition allows players to compete with others. This may be a way to win rewards.
Not-provided ✓
Achievement It is the form of feedback provided to the players when meeting the game levels' goals.
Not-provided ✓
Win state It is the state where the player achieves the goals in specified rules.
Not-provided ✓
Cooperation The Player has to work in a group to complete a goal. It is helpful in team building, communication, and trust.
Not-provided ✓
Resource acquisition
It is a situation in the game where the player has to acquire resources to complete their tasks.
Not-provided ✓
Badge Badges and achievements are a form of feedback: rewards to the player who has earned them.
Not-provided ✓
Loss aversion
Fear of losing status, friends, points, achievements, property, progress, etc. may be a strong reason for the player to perform actions in the game.
Not-provided ✓
Turns The game flow is partitioned into defined parts, called turns, moves, or plays.
Not-provided ✓
9. LEARNING THEORIES AND SGs
Learning is the process of acquiring new knowledge or modifying existing knowledge, abilities, or behavior. It can also be characterized as a relatively permanent change in behavior or knowledge resulting from experience or training. Learning involves integrating many types of knowledge over time and is an ability of humans, animals, and even machines (Nicolescu and Mataric, 2003). Learning may occur as part of education, self-education, or some other particular conscious or unconscious process (Prince and Felder, 2006; Leahey and Harris, 1989).
Merrill (Merrill, 2002) reports that it is highly desirable to present course content through various knowledge representations, so that knowledge can be applied in the virtual world to support and facilitate the learning process. The use of computer
games for this purpose includes several aspects, such as encouraging and
attracting learners, allowing learners to choose or making decisions at specific
points, and checking their learning outcomes based on their decisions and
actions while playing the game. They can also improve their social skills while
working with other learners. However, the effectiveness of overall learning
outcomes can be accomplished by employing other aspects that can be
investigated by considering the learning theories (De Gloria, et al., 2014).
An overview of some learning theories integrated with serious games is given in the following sections:
9.1 Behaviourism Learning Theory
According to behaviourist theories, what happens inside the mind cannot be observed directly, so learning must be studied through observable behaviour (Roblyer and Doering, 2014). The key behaviourist theorists include Thorndike (1923), Pavlov (1927), and Skinner (2011); all of them assumed that learning is a modification or change in observable behaviour brought about by external stimuli from the environment. In simple words, learning is any change that happens in behaviour. Learning can be interpreted, explained, and predicted on the basis of observable events, namely the actions of the learner and the context and consequences of the environment.
According to behaviourist theory, the learner's efficiency and regularity increase with positive reinforcement or reward. Likewise, the learner's efficiency and consistency decrease with negative reinforcement or punishment; in response, the learner is discouraged and less likely to repeat such tasks.
Reeve (Reeve, 2009) suggests that behaviourism characterizes many serious games through their (story) construction and their arrangement of rewards.
Reeve (Reeve, 2009) lists the following kinds of rewards as positive reinforcers in games:
• Increasing extras
• Increasing bonus points
• Increasing power-ups
• Unlocking the next levels
Likewise, negative reinforcers comprise:
• Failure to beat a high score
• An increase in difficulty or enemies
• A weakening in strength
• Loss of lifelines
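These reinforcement rules can be pictured as simple state updates in game code. The following sketch is illustrative only; the class and method names are invented and not drawn from Reeve or any game discussed here:

```python
class Player:
    """Minimal player state affected by reinforcement events."""
    def __init__(self):
        self.points = 0
        self.lives = 3
        self.level = 1

    def reinforce_positively(self, bonus):
        # Reward: bonus points encourage the player to repeat the behaviour.
        self.points += bonus

    def unlock_next_level(self):
        # Reward: progression to the next level.
        self.level += 1

    def reinforce_negatively(self):
        # Punishment: losing a lifeline discourages the failed behaviour.
        self.lives -= 1


p = Player()
p.reinforce_positively(50)   # task completed: award bonus points
p.unlock_next_level()        # milestone reached: unlock the next level
p.reinforce_negatively()     # task failed: lose a lifeline
print(p.points, p.lives, p.level)
```

The point is that each behaviourist reinforcer maps onto a single observable state change, which is exactly the level at which behaviourism models learning.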
9.1.1 Behaviourism Instantiation within SGs
From the perspective of game rules, the player essentially needs to learn the reward and praise rules to understand the game mechanics. Players are rewarded for completing specific tasks through promotion to successive levels, or are provided with tools that will help them advance in the game. "Punishments" include losing lives, restarting the game at different entry points to re-learn or practice the material, or directly losing to a peer.
Behaviourist learning plays an important part in almost all game genres; for example, to move forward, the player must press button A, and to perform another action, button B. Behavioural learning is particularly useful in explaining the autonomic responses triggered in certain situations: players need to train their reaction processes to understand their goals and achieve them, as in Tetris. Finally, from the point of view of "game narratives," behaviourists see players as machines to be filled with information, expected to absorb narratives passively. This is usually done through non-player character (NPC) scenes or textual information (Ang, et al., 2008).
9.2 Cognitivism Learning Theory
According to the cognitivism theory, learning originates from mental
activities, such as motivation, memory, reflection, and thinking. Cognitivism
considers learning to be a fundamental procedure that relies upon the learner's capacity, determination, and motivation (Craik and Lockhart, 1972; Craik and Jacoby, 1975). Cognitivist theory is based on exploring the mind (mental processes) while observing outward behaviour (Gagné and Driscoll, 1988).
In cognitivism, knowledge is transferred from an expert to a learner who does not yet know the subject. The learner's mechanism works as follows: knowledge is received, taken on board, stored, correlated with existing ideas and information, and indexed (much like an archiving system). Later, if the information is required, it is retrieved from memory. This integrates chunks of information in a more precise and memorable manner. Besides the comprehensive description of the shared destination in cognitive philosophy, goals and outcomes serve as common-sense guidelines for the learning journey (Protopsaltis, et al., 2010).
Cognitivism pays more attention to the process than the product.
Therefore, the game's cognition is more important than improving the
response, promoting critical thinking, or helping people learn using various
communication modes (Reeve, 2012). The student becomes the focus of
attention and gains knowledge in various ways (for example, text, pictures, and
sound), enabling players to identify and analyze problems and use past
learning experiences. Players are immersed in a world where they can
integrate with society, and players can interact with other participants and
acquire and use the acquired knowledge in a virtual environment (Protopsaltis,
et al., 2010).
9.2.1 Cognitivism instantiation within SGs
From the perspective of game rules, cognitivism stresses the context-
related essence of knowledge and facilitates learning to complete tasks by
scaffolding. Also, player/learner control is an important part of any game, and
players can play the game according to their mood. Some games have a
"warm-up" scene, which helps players to understand how the game is played
and how it works. The player can observe, reflect, and infer the underlying rules by
interacting with the game. In terms of "game story," environments can be
complicated and include interpersonal tensions between game characters.
Players do not need to develop behavioural patterns; they need to grasp the significance of scenes, events, characters, and so on. In the early stages, players will continue to use strategies from games played before, trying to apply their old experience to the new world (Wu, et al., 2012).
9.3 Constructivism Learning Theory
Learning is not a stimulus-response phenomenon defined in behavioral
science, but involves self-regulation and the creation of a conceptual
framework through reflection and abstraction (von Glasersfeld, 2012). The
constructivist approach means that, in addition to receiving the information
from others who know best, the learner has an active part in building their
understanding. Conferring to constructivists, learners view new information
from their unique personal experience of previous understanding. Leaner learn
from their perception, processing and interpretation: personalization of
information through experience (Cooper, 1993 and Wilson, 1997). Situated
learning theory is the lighting point of constructivist theory since it
acknowledges the cognitivist approach to learning. Ernest (Ernest, 1995)
describes that, in contextual learning, the facts are placed in a relevant
situation, taking a version of the trainee's opinions and ideas on the details.
Constructivism often recognizes what happens in the real world by
communicating, sharing, and generating new ideas, while stressing the
importance of socio-cultural experiences. Six conventions about constructivist learning coined by Dimock and Boethel (1999) include:
• It is an adaptive action
• It is situated in a situation where it occurs
• The learner constructs knowledge
• Experience and prior understanding play a role in learning
• There is resistance to change
• Social interaction plays a role in learning
Papert (Papert, 1998) believes that computer programming is a tool for constructivism. His ideas have appeared in non-professional development environments, such as Kodu for Xbox, Mission Maker, and GameStar Mechanic. Creating games allows users to express their understanding in new ways while considering how best to communicate the underlying principles. Essentially, this is an opportunity for ordinary game developers to make games in their own words. Constructivists regard the learner as a constructor of knowledge, building personal representations of objective reality (Bednar, et al., 1991).
9.3.1 Constructivism Instantiation within SGs
From a gaming perspective, constructivism sees learning as a social
mechanism. Constructivism's "playing laws" underscore the relationship
between players and socially created games. Most of the new data can relate
to prior knowledge; thus, mental representations are subjective (Resnick, 1987; Brown, et al., 1989).
In constructivist games, the game is the primary source of knowledge, essential elements, and raw data for players to experiment with and manipulate. Simulation, role-playing, and strategy games are good examples of constructivist environments, since they allow participants to manage resources, sustain and improve those resources, involve mathematical calculations, and manage a complex inventory of human, material, and financial resources (Frasca, 2003). From the "game story" viewpoint, players' personal views are constructed as players communicate with one another; in this situation, the world is founded on the perception of the culture's interaction as a whole.
9.4 Experimentalism Theory
Experiential learning focuses on student-centered and personalized learning; teachers or educators act only as mediators or facilitators. The purpose of this theory is to create a self-actualizing, supportive environment by recognizing the cognitive and emotional needs of individuals.
Experiential learning does not require any expert or teacher and encourages learners' direct experience through the meaning-making process. Kolb (Kolb, 2014) describes knowledge as being continuously processed, obtained through environmental and individual experiences.
Kolb (Kolb, 2014) suggested that the user learns from their experience.
He defined the learning cycle consists of the following stages, also shown in
Figure II-4:
1. Concrete Experience
2. Observation and Reflection
3. Abstract Conceptualisation
4. Testing concepts in new situations
Figure II-4 Kolb experimentalism learning model (2014)
In simple terms, the learner does something, thinks about the process of
doing it, understands and notes the main points, and applies this knowledge
to future work. The first half of the cycle is perceptual: the user senses and
absorbs the information originating from concrete experience and reflects on
its significance. The second half is a processing period, in which the user
constructs cognitive models that can be tested in practice.
Kolb (2014) claimed that the user can enter the learning cycle at any
point, and the cycle is then repeated through the remaining phases. In this
cycle, the feedback gained through experience becomes significant in
enhancing performance and the learner's ability to relate information to new
situations.
Compared with behaviorism or constructivism, the experiential view of
learning is more complex, since it represents the learner's perspective in more
detail. However, like constructivism, experiential learning
often focuses on the personal experience of learners. The facilitator's task is
to enable learners to work through all stages of the learning cycle; the
practitioner's goal is not to teach specific knowledge or train fixed habits,
but to help learners find what is useful to them (Reeve, 2012).
Facilitation means developing and providing learners with space
to try new things, focus on their experiences, draw new conclusions,
and apply the outcomes to their work and lives. In this way, the
learner learns with the facilitator's help instead of having learning done to or
for them.
9.4.1 Experimentalism instantiation within SGs
Experimentalism emphasizes "game rules" under which learners have
direct experience and focus on reflection. The ludus and paidia rules are more
learner-centered, meaning players can set their own standards for winning or
losing games (Wu, et al., 2012). The learner-centric approach is the most
important component of 'gameplay': players can play the game at
their own pace and in their own mood (Wu, et al., 2012).
Typical experiential games include task-based simulations or role-
playing games, where players have a set or chosen goal and must "play"
the role consistently to achieve it. The benefit of these "open
sandboxes" is that players can experiment and "fail gently" (Reeve, 2012).
9.5 Discussion
The learning theories, their prominent theorists, related learning methods,
game genres, and example games from the above literature review are
summarized in Table II-4.
Table II-4: Summarized learning theories, their major theorists, related learning methods, game genres, and example games

Behaviourism
  Major Theorists: B.F. Skinner, Ivan Pavlov, Edward Thorndike, John B. Watson
  Learning Methods: Learning from mistakes, Learning by doing, Practice and feedback
  Game Genres: Simulation, Adventure, Strategy, Drill & Practice, Trial and Error, and Puzzle games
  Example Games: Myst, Logical Journey of the Zoombinis, Math for the Real World, Math Blaster

Cognitivism
  Major Theorists: Jean Piaget, Robert Gagné, Lev Vygotsky
  Learning Methods: Guidance and instructions, Practice and feedback
  Game Genres: Simulation, Strategy, Drill & Practice, Role-Playing, and Puzzle games
  Example Games: Age of Empires, Professor Layton, Zadarh, The Fifth Empire

Constructivism
  Major Theorists: John Dewey, Jerome Bruner, Merrill, Lev Vygotsky, Seymour Papert
  Learning Methods: Practice and feedback, Task-based learning
  Game Genres: Simulation, Strategy, and Puzzle games
  Example Games: SimSchool, The Incredible Machine, Pandemic 2, War on Paper 2

Experimentalism
  Major Theorist: David A. Kolb
  Learning Methods: Task-based, Problem-solving
  Game Genres: Simulation, Strategy, Role-Playing games
  Example Games: Pediatric Board Game, Global Conflicts: Palestine, Europa Universalis II
10. METHODS OF INSTRUCTIONAL CONTENTS DELIVERY
Instructional design is a systematic process through which programs for
education and training can be developed and constructed to considerably
increase learning quality (Reiser, 2007). An instructional design model (IDM)
plays a vital role in effectively delivering teaching content to learners. It
provides a program framework for the systematic production and delivery of
instruction by analysing teachers' and learners' learning needs and goals.
Some examples of IDMs from the reviewed literature are discussed in the
following sections:
10.1 The Gagné-Briggs-model
Robert Gagné is the foremost contributor to the systematic approach to
instructional design and training. The Gagné-Briggs model (Gagné and Briggs,
1974) aims to answer three essential questions:
1. Which knowledge about human learning is relevant to instructional
design?
2. Does this knowledge about human learning apply to specific
learning situations?
3. What methods and procedures can be used to apply knowledge
about human learning effectively to instructional design?
The Gagné-Briggs model thus describes how instructional and teaching
content can be created to meet the basic needs of learning. The model
classifies the nine Gagné instructional events in Table II-5 into three phases:
(1) preparation for learning, (2) acquisition and performance, and (3) transfer
of learning. The model can be expressed as a closed circuit describing an
iterative design, trial-and-error, and revision process, shown in Figure II-5.
Before introducing these nine events, Gagné recommended setting course
goals and learning objectives to ensure that learners understand or can do
something after instruction that they did not know or could not do before. In
short, instruction is "stimulation" and learning is "response".
Table II-5. Gagné's events of instruction and their associated actions (Gagné and Medsker, 1996)

Instructional Events: Actions
1. Gain attention: Introduce a stimulus to elicit curiosity
2. Inform learner of the objectives: Describe the expected performance
3. Stimulate prerequisite recall: Recall concepts and rules
4. Present learning material: Present examples of the concepts/rules
5. Guide learning: Use verbal cues, illustrations, etc.
6. Elicit performance: Let the learners apply the concepts/rules
7. Provide feedback: Confirm correctness of performance
8. Assess performance: Test the application of the concept/rule
9. Enhance retention and transfer: Provide a variety of other applications
Figure II-5 Three phases of Gagné-Briggs instructional activities (Gagné and Briggs, 1974)
10.2 Dick-Carey Model
The Dick-Carey model (Dick, et al., 2005) uses a systematic, procedural
approach to designing instruction. It is a widely accepted standard instructional
design process model, and other models are compared against it to meet
the standard requirements for instructional design and development (Gustafson
and Branch, 2002). Figure II-6 depicts the design events of this model,
which must be executed in sequence to achieve the purpose of instruction
evaluation. The events are detailed here:
1. Identify instructional goals: describe the skills, knowledge, or
attitudes the learner is expected to obtain.
2. Conduct instructional analysis: specify the learner's prior
knowledge or abilities required to accomplish the task.
3. Analyze the learners and context: determine the target audience's
general characteristics, including previous skills, experience, and
demographic information; identify characteristics directly related to
the skills taught; analyze the performance and learning settings.
4. Write performance goals: goals include descriptions of
behaviors, conditions, and standards; the components of a goal
describe the criteria that will be used to judge learner performance.
5. Develop assessment tools: to test initial behavior, and to conduct pre-
testing, post-testing, and practical intervention.
6. Specify the instructional strategy: pre-class activities, introduction of
content, learner participation, and evaluation.
7. Develop and select instructional materials.
8. Design and conduct a formative evaluation of instruction: identify
areas of the teaching materials that must be improved.
9. Revise instruction: identify areas in which students perform poorly
and the relevant instructional events.
10. Design and conduct a summative evaluation.
Figure II-6 Dick-Carey model (Dick, et al., 2005)
10.3 Kemp Design Model
This model is appropriate for all subject areas and is systematic yet
non-linear in use (McGriff, 2001). Although the model is useful for
developing instructional plans that integrate technology and pedagogy, it does
not address learners' mobility and attention. The Kemp model, depicted in Figure II-
7, considers all the factors in the learning environment, including topic analysis,
learner characteristics, objectives, teaching activities, resources to be
used, support service needs, and assessment. The Kemp design model (Kemp
and Rodriguez, 1992) uses six questions to emphasize the learner's point of
view:
1. What level of preparation does each student need to achieve?
2. Which teaching strategy is most appropriate in terms of the goals and
student characteristics?
3. What resources and media are suitable for the delivery of the content?
4. What support is necessary for successful learning?
5. In what ways can achievement of the goals be determined?
6. If the trial version of the program does not meet expectations, what
revisions are required?
Figure II-7 Kemp design model (Kemp and Rodriguez, 1992)
10.4 Gerlach-Ely Model
Gerlach and Ely designed an illustrative instructional model, shown in Figure
II-8, similar to the Dick-Carey model. The model has been tested and
applied at both the K-12 and higher-education levels (Gerlach and
Ely, 1980). It is notable for its emphasis on content selection and media
selection within instruction, and it is also well suited to resource allocation
(Gerlach and Ely, 1980). The selection and inclusion of media or other material
within the instruction is part of this model. It focuses on resource allocation
and behavior enhancement while ignoring other factors, such as learner
characteristics (e.g., attention).
81
Figure II-8 Gerlach-Ely instructional design model (Gerlach and Ely, 1980).
10.5 The ARCS Model of Motivational Design
Keller's motivational design approach (Keller, 1983; Keller, 2010)
focuses on learning motivation and refers specifically to the strategies,
principles, and processes that make instruction attractive. Keller believes that
learning proceeds more effectively when students are involved in the entire
learning process and proper guidance ensures that engagement continues
until the learning task is complete. Keller's model is based upon Gagné's
instructional events. As a central part of a teaching plan, motivational design
refers to the process of organizing teaching resources and procedures to
improve learning motivation. The Keller model aims to (1) identify students'
motivations and needs; (2) analyze the student characteristics that indicate
the motivational requirements of the instructional system to be designed;
(3) diagnose whether the materials stimulate students' motivation; (4) choose
appropriate strategies to maintain motivation; and (5) apply and evaluate
those strategies. Figure II-9 depicts the model's four major components,
which provide a theoretical explanation of motivation.
Figure II-9 Components of Keller's ARCS model (Keller, 1983; Keller, 2010): Attention (stimulate the learner's curiosity and interest), Relevance (relate to the learner's experiences and needs), Confidence (scaffold the learner's success at meaningful tasks), and Satisfaction (build the learner's sense of reinforcement and achievement)
Keller (2010) suggests that the ARCS model is designed to be used
effectively in classrooms and professional learning environments. The ARCS
model has become the most popular motivational model applied in education
and is considered a minimum requirement for designing a learning
environment. From Keller's perspective, the motivational design model is
a necessary supplement to Gagné's traditional ID model.
11. SUMMARY
In summary, this chapter discussed the basic concepts of object-oriented
programming (OOP) and the related difficulties and misconceptions
reported in existing studies. The core competencies required for mastering
OOP and the various approaches adopted for teaching OOP instructional
content were examined in detail. Sections 1 to 5 and their subsections
contributed to setting the learning objectives and expected learning outcomes
of the proposed study. The results of these sections will also help in creating
the competency model for the SG prototype.
The related literature on the concept of serious games and their
associated terminology was also summarized. This included a review of
existing serious games that facilitate OOP learning and a discussion of the
various attributes of serious games that contribute to students' learning and
motivation. Sections 6 to 10 and their subsections provide guidelines for the
design and development of the SG model to enhance the learning outcomes,
or achieve the objectives, discussed in Sections 1 to 5.
CHAPTER III
METHODS AND TOOLS
This chapter investigates students' difficulties and misconceptions
when performing OOP tasks. The investigation was carried out to get a better
picture of the types of difficulties and misconceptions students exhibit while
solving OOP-related tasks. Feedback was also gathered on students' opinions
about the learning experience and OOP learning issues. The author's
competency model is to be incorporated into the design and development of
the SG model to overcome the identified difficulties and misconceptions.
To design a serious game model, the purpose of the proposed model
is first introduced, followed by an explanation of its design. The formulated
model contains a detailed description of the instructional contents, i.e., the
difficulties intended to be overcome using the model. The mapping of
instructional steps onto game attributes is also a focus of the formulated
model. Since the model to be developed concerns learning, the influence of
learning theory on its formulation is also considered. The ARCS model of
perceived motivation is incorporated into the model's development to address
difficulties arising from lack of motivation.
A serious game prototype was developed to illustrate the proposed
model's logical view. The developed OOP-learning serious game is named
OOsg. OOsg integrates all the elements and tasks involved in developing the
serious game model. Details of each stage are discussed in the sections
below:
1. INVESTIGATION ON STUDENTS PERFORMANCE
The purpose of this investigation of students' performance is to better
understand the types of difficulties and misconceptions encountered in
solving OO-related tasks and activities. The detailed objectives of this study
are:
• To find students' abilities to achieve the OOP competencies
presented through OOP-related tasks.
• To find out whether students' abilities differ in solving various
activities involving OOP concepts.
• To find out the types of mistakes and misconceptions students
make when given various activities involving OOP concepts.
1.1 Experimental Design
To achieve these objectives, a questionnaire was designed comprising
two parts: the first part collected the participants' demographic details; the
second part presented a problem scenario depicting a hospital management
system (available in Appendix-A). Based on the problem scenario, a model
test involving various activities related to OOP concepts was given to the
students. The demographic variables and activities are discussed in the next
sections.
1.1.1 Demographical details:
A total of 319 students from four separate universities participated in
this research. Two hundred thirty-one participants were from the Information
Technology Centre, Sindh Agriculture University, Tandojam; 53 were from the
University of Sindh, Mirpurkhas campus; 18 were from Isra University,
Hyderabad; and 11 were from Indus University, Karachi. All participants fell in
the same age category, i.e., the 18-25 age group. The volunteer participants
comprised 26% females and 74% males. The participants were enrolled in the
second term of their degree programs: 91% in B.S. (Information Technology),
5% in B.S. (Software Engineering), and 4% in B.S. (Computer Science).
Regarding prior programming experience, 61% (124) of participants had
previous programming experience, whereas 39% (195) did not. These details
are also presented in Table III-1.
Table III-1 Demographical details of the participants
No. of participants: 319
Gender: 74% (236) Male, 26% (83) Female
Age: 18-25 years
Level of education: Undergraduate students
Programs enrolled: BS(IT), BS(CS), BS(SE)
Institutes: Information Technology Centre, SAU, Tandojam; Isra University, Hyderabad; University of Sindh, Mirpurkhas campus; Indus University, Karachi
Prior programming experience: 61% (124) Yes, 39% (195) No
1.1.2 Activity details:
Learning activities are a series of activities designed to enable
participants to engage actively in problem-solving. The practical design of
these activities ensures that participants stay focused while solving them.
Before the activities were provided, a problem scenario depicting a hospital
management system (available in Appendix-A) was briefly explained to the
students. The problem scenario was followed by a set of eight cognitive
activities based on OOP concepts. All the activities were aligned with Bloom's
(Bloom, 1956) levels of learning outcomes, as shown in Figure III-1. Details
about the objectives, designated activities, and related OOP concepts are
provided in Appendix-B.
Figure III-1 Alignment of the activities with Bloom's learning outcomes model
Activity-1 (A1) concerned the identification of potential classes from the
given scenario. This activity aimed to develop cognition about the sensible
identification of various groups of entities, as happens with classes in OOP.
Activity-2 (A2) concerned identifying the properties and behaviors of
three different roles present in the hospital, i.e., doctor, nurse, and patient.
The objective of this activity was to decompose a problem domain into classes
of objects with related state (data members) and behavior (methods).
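The decomposition targeted by A2 can be sketched in code. The following minimal Python sketch is illustrative only; the class, attribute, and method names (Doctor, speciality, admit) are assumptions for exposition, not taken from the thesis instrument.

```python
# Illustrative sketch: decomposing part of the hospital scenario into a
# class with state (data members) and behavior (methods).
# All names here are hypothetical.

class Doctor:
    def __init__(self, name, speciality):
        # state / data members
        self.name = name
        self.speciality = speciality
        self.patients = []

    # behavior / methods
    def admit(self, patient_name):
        self.patients.append(patient_name)

    def patient_count(self):
        return len(self.patients)

d = Doctor("Dr. Ahmed", "Cardiology")
d.admit("Ali")
print(d.patient_count())  # 1
```

A correct answer to A2 would perform an analogous decomposition, on paper, for the doctor, nurse, and patient roles.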
Activity-3 (A3) asked participants to distinguish the most relevant and
irrelevant details a patient must provide when visiting the doctor. The
objective of this activity was to define a class at the proper degree of
abstraction so as to minimize complexity.
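The abstraction exercised in A3 can be shown with a small Python sketch. The class and the chosen relevant/irrelevant details are assumptions for illustration, not part of the thesis instrument.

```python
# Illustrative sketch of abstraction: a Patient class keeps only details
# relevant to a hospital visit and deliberately omits irrelevant ones
# (e.g. favourite colour, shoe size). Names are hypothetical.

class Patient:
    def __init__(self, name, symptoms):
        self.name = name          # relevant: identifies the patient
        self.symptoms = symptoms  # relevant: needed for diagnosis
        # intentionally NOT stored: favourite colour, shoe size, ...

p = Patient("Sara", ["fever", "cough"])
print(sorted(vars(p)))  # ['name', 'symptoms']
```

Choosing which details make it into the class is exactly the judgement A3 asks students to exercise.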
In Activity-4 (A4), participants were asked to consider a piece of
medical equipment, describe how to operate it, and reflect on how difficult it
would be to describe its internal components, such as the complete technical
details of how it works. The objective of this activity was to explain how the
class mechanism supports encapsulation and information hiding.
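The operate-versus-internals distinction in A4 maps directly onto encapsulation. The sketch below is a hypothetical illustration (the XRayMachine class and its attributes are invented for this example); it uses Python's leading-underscore convention to mark hidden internals.

```python
# Illustrative sketch of encapsulation: a user operates the equipment
# only through public methods, while internal details stay hidden behind
# a leading-underscore attribute. Names are hypothetical.

class XRayMachine:
    def __init__(self):
        self._tube_voltage_kv = 0   # internal detail, hidden from users

    def power_on(self):             # public interface
        self._tube_voltage_kv = 70

    def take_image(self):           # public interface
        if self._tube_voltage_kv == 0:
            return "error: machine is off"
        return "image captured"

m = XRayMachine()
m.power_on()
print(m.take_image())  # image captured
```

The operator needs only `power_on` and `take_image`; the tube voltage, like the equipment's technical internals in A4, is not their concern.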
In Activity-5 (A5), participants were asked to consider the people in the
hospital and to list common occupations they would expect to be employed
there, along with some staff categories. The objective of this activity was to
analyse and explain the concept of generalization.
Activity-6 (A6) concerned common properties that all doctors must
share; participants had to provide one statement that would be true for one
specific kind of doctor and one statement that would not be true for all other
types of doctors. The objective of this activity was to understand and
implement a class-hierarchy design for modelling.
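The kind of statement A6 asks for corresponds to a subclass-specific member in a class hierarchy. The following Python sketch is illustrative; the Doctor/Surgeon classes and their methods are assumptions chosen to match the hospital scenario, not the thesis's own code.

```python
# Illustrative sketch of inheritance: Surgeon inherits the common
# Doctor properties and adds one behavior that is true for surgeons
# but not for all doctors. Names are hypothetical.

class Doctor:
    def __init__(self, name):
        self.name = name            # shared by every kind of doctor

    def examine(self):              # true for all doctors
        return f"{self.name} examines a patient"

class Surgeon(Doctor):              # child class, never the parent
    def operate(self):              # true only for this kind of doctor
        return f"{self.name} performs surgery"

s = Surgeon("Dr. Kiran")
print(s.examine())                             # inherited behavior
print(s.operate())                             # surgeon-specific behavior
print(hasattr(Doctor("Dr. Adil"), "operate"))  # False
```

Note the direction of the arrow: Surgeon extends Doctor, not the other way round, which is precisely the parent/child confusion reported later in this chapter.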
Activity-7 (A7) asked for one statement about how a hospital
management employee interacts with hospital doctors of all kinds. The
objective of this activity was to explain the relationship between a class's
static structure and the dynamic structure of instances of the class, especially
how dynamic dispatch is involved in subtype polymorphism.
In Activity-8 (A8), participants were asked to think of any object, list its
current state, and describe what they can do with that object. The objective of
this activity was to understand the concept of object interaction.
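Object state and object interaction, the targets of A8, can be sketched together. The Ward and Receptionist classes below are hypothetical examples invented for illustration.

```python
# Illustrative sketch for object state and interaction: an object carries
# a current state and operations that change it, and objects interact by
# calling each other's methods. Names are hypothetical.

class Ward:
    def __init__(self, beds):
        self.free_beds = beds       # current state of the object

    def admit(self):                # operation that changes the state
        if self.free_beds > 0:
            self.free_beds -= 1
            return True
        return False

class Receptionist:
    def register(self, ward):
        # object interaction: one object sends a message to another
        return "admitted" if ward.admit() else "ward full"

w = Ward(beds=1)
r = Receptionist()
print(r.register(w))  # admitted
print(r.register(w))  # ward full
```

The second call returns a different result only because the ward object's state changed in between, which is the point A8 asks students to articulate.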
1.1.3 Procedure
The test was conducted in a single session. The participants were not
allowed to discuss their answers with each other. They were told to write down
all their answers in the ample space provided. No time limit was imposed on
the participants to answer all the problems; however, the session lasted two
hours, by which time all participants had handed in their work. Each student's
answers were assessed and classified as complete and correct, partially
correct, partly wrong, wrong, or no response for the given set of activities.
Moreover, the types of mistakes and misconceptions participants frequently
made while solving the activities were also analysed. The difficulties and
misconceptions were categorized based on each participant's responses;
however, this study focused only on the difficulties and misconceptions
related to classes, attributes, behaviours, inheritance, and object concepts.
1.1.4 Results
The results of the investigation of students' performance in solving
OOP-related activities show that students had difficulty with almost every
activity; however, each activity has different results, as presented in Table III-2
and Figure III-2. Students found Activity-1 the hardest, struggling to provide
complete, correct, or even partially correct answers; moreover, the results
show that the majority of students gave a wrong answer or no answer at all.
Furthermore, students performed better on Activity-5, which is about
"generalization," than on Activity-6, which is about "inheritance"; we may
therefore not include activities related to OOP's generalization concept.
Table III-2 shows the percentages of the responses students made
while solving the OOP activities. Figure III-2 represents the results of Table
III-2 graphically, but with responses given as counts instead of proportions.
Each activity's responses are classified as complete and correct, partially
correct, partly wrong, wrong, or no response.
Table III-2: Responses collected from the participants while solving OOP activities
(Response categories, in order: Complete and Correct, Partly Correct, Partly Wrong, Wrong, No Response)
A1 Identification of the classes: 3.13%, 7.84%, 13.79%, 59.56%, 15.67%
A2 Identification of properties and behavior: 8.46%, 20.69%, 28.84%, 23.82%, 18.18%
A3 Abstraction (relevant and irrelevant): 22.57%, 32.92%, 6.27%, 17.24%, 21.00%
A4 Encapsulation (example object with internal details): 16.61%, 43.57%, 0.00%, 8.78%, 31.03%
A5 Generalization (types and subtypes): 5.02%, 62.07%, 1.25%, 10.34%, 21.32%
A6 Inheritance (true statement about doctor and surgeon): 20.38%, 5.96%, 23.20%, 29.78%, 18.81%
A7 Polymorphism (interaction): 33.54%, 0.00%, 0.00%, 33.23%, 33.23%
A8 Object: 10.97%, 12.54%, 6.58%, 26.65%, 43.26%
Figure III-2 Summarized responses while solving the OOP activities
The investigation of students' performance reveals difficulties in
comprehending OOP concepts; moreover, the students' suggestions point to
certain other problems and misconceptions. The following are a few that can
be noticed:
a. Difficulties and Misconceptions in Understanding Classes:
From the solutions provided for Activity-1, most students mistakenly
referred to non-existing classes. The researcher observed that students did
not identify the classes by considering the given problem scenario. Another
difficulty observed is that students failed to identify the classes completely.
Besides these difficulties, students also conflated classes with attributes,
methods, or objects. Some students' solutions identified the same classes
repeatedly, since the same classes occur multiple times in the given problem
scenario.
b. Difficulties and Misconceptions in Understanding Objects:
From the solutions provided for Activity-8, the majority of students
showed wrong instantiation of the classes; in other words, students had
difficulty associating an object with a particular class. Besides these
difficulties, students also conflated objects with attributes or classes. Some
students' solutions identified the same objects repeatedly, since the same
objects occur multiple times in the given problem scenario.
c. Difficulties and Misconceptions in Understanding Attributes:
From the solutions provided for Activity-2, most students had difficulty
completely identifying the attributes or properties of a particular class. Another
difficulty observed is that students failed to identify the attributes of all the
classes they had previously identified. Besides these difficulties, students also
conflated attributes with classes, methods, or objects.
d. Difficulties and Misconceptions in Understanding Methods:
From the solutions provided for Activity-2, most students had difficulty
completely identifying the methods or behaviors of a particular class. Another
difficulty observed is that students failed to identify the methods of all the
classes they had previously identified. Besides these difficulties, students also
conflated methods with classes, attributes, or objects.
e. Difficulties and Misconceptions in Understanding Inheritance:
From the solutions provided for Activity-6, most students mistakenly
referred to non-existing hierarchies of classes. Another difficulty observed is
that students failed to provide complete class hierarchies. Besides these
difficulties, students also had trouble finding a class that may lie between
levels of the identified hierarchy, and they sometimes selected the same class
as both parent and child. Moreover, in many students' solutions it was
observed that they chose a child class as the parent class.
For the purposes of this research, the focus is only on the difficulties
and misconceptions in Activities 1, 2, 6, and 8, as most students made
mistakes when solving these activities.
2. STUDENTS' FEEDBACK ON BASIC UNDERSTANDING OF THE OOP CONCEPTS
After the investigation, feedback was collected from students on
various factors that hinder learning OOP. The questionnaire developed by the
researcher to obtain this feedback (provided in Appendix-C) consists of items
in the following categories:
• Procedural programming vs. OOP
• Understanding OOP concepts
• Motivation for learning OOP
The questionnaire included 48 items, each presented to participants on
a 5-point scale, ranging from 1 for "Strongly disagree" to 5 for "Strongly
agree." The questionnaire's results were not formally analysed but were
instead used to gauge the strengths and weaknesses of the various factors
that hinder learning OOP.
The category procedural programming vs. OOP includes five items,
concerning the transition from procedural programming to OOP, utilization of
procedural or OOP logic, the interestingness of OOP compared with
procedural programming, difficulty in solving a given task using OOP versus
procedural programming, and whether a prior programming language helps
in learning OOP. The overall mean score, frequency, and percentage of the
students' responses for this category are presented in Table III-3.
For "transition from procedural programming to the OOP paradigm,"
"utilization of procedural or OOP logic," and "interestingness of OOP
compared with procedural programming," the largest response groups,
112 (35.10%), 125 (39.18%), and 125 (39.18%) respectively, fall in the
3.1-4.00 score range. However, for the item "difficulty in solving a given task
using OOP rather than procedural programming," most students,
150 (47.02%), strongly disagreed. The majority of students answered "No" to
the item "prior programming experience helps in learning OOP." The
researcher assumed that the shift from procedural to OOP, or having prior
programming experience, was not helpful because the existing teaching
methodology makes OOP learning hard.
Table III-3 Overall mean score, frequency, and percentage for the category procedural programming vs OOP
(Response counts by score range: 1.00-2.00, 2.1-3.00, 3.1-4.00, 4.1-5.00; followed by mean score)
The transition from procedural programming to the OOP paradigm is easy? 81 (25.39%), 84 (26.33%), 112 (35.10%), 42 (13.16%); mean 3.30
Is it difficult to utilize procedural programming logic where required in the OOP structure? 90 (28.21%), 74 (23.19%), 125 (39.18%), 30 (9.40%); mean 3.14
Is programming in OOP more interesting than procedural programming? 70 (21.94%), 64 (20.06%), 125 (39.18%), 60 (18.80%); mean 3.49
Is it more difficult to solve a given task using the OOP approach than the procedural programming approach? 150 (47.02%), 42 (13.16%), 103 (32.28%), 24 (7.52%); mean 2.84
Is prior programming language experience helpful in learning OOP? Yes: 78 (24.45%), No: 241 (75.55%)
The category understanding of OOP concepts includes 23 items. The
items concern: the order in which class or object should be learned; difficulty
in identifying objects, attributes, and methods; categorizing class members;
applying attributes or methods to objects; analyzing interaction by message
passing; providing a hierarchy; integrating classes; remembering the structure
of classes; comprehending a given problem/task in terms of OOP;
implementing OOP concepts to create programs; remembering to import the
required package; applying a specific access specifier; optimizing code;
remembering naming conventions when declaring identifiers; and difficulty in
understanding which OOP structure is best used in a given situation/task. The
remaining items concern the necessity of an ordered sequence of topics in
learning OOP, the comparison between polymorphism and the if-else
condition, and whether to start learning OOP by writing code, seeing graphical
components, or being given real-life examples. The last item asks whether
starting by writing code makes the OOP concepts difficult to grasp.
The overall mean score, frequency, and percentage of students'
responses for the category understanding of OOP concepts are presented in
Table III-4. Results revealed that, for the order in which class or object should
be learned, the majority of students, 156 (48.90%), gave the highest score
range (4.1-5.00) to "class should be learned before the object," whereas for
"object should be learned before class" the majority, 222 (69.59%), gave the
lowest score range (1.00-2.00). Consistent with the students' performance in
the activities preceding the feedback form, the modal rating for all items
related to OOP difficulties fell in the 3.1-4.00 score range. For the item about
the necessity of an ordered sequence of topics in learning OOP, the majority
of students, 111 (34.79%), gave ratings in the 3.1-4.00 range, and for the
comparison between polymorphism and the if-else condition, the majority,
178 (55.79%), gave the lowest ratings (1.00-2.00). For the items about starting
to learn OOP by writing code, seeing graphical components, or being given a
real-life example, students most highly rated, 190 (59.56%), starting with a
real-life example, followed by seeing graphical components, 168 (52.66%).
Students gave ratings in the 3.1-4.00 range most often, 116 (36.36%), to the
item that starting OOP by writing code makes the concepts difficult to grasp.
Table III-4 Overall mean score, frequency, and percentage for category understanding OOP concepts
Items | Score 1.00-2.00 | Score 2.1-3.00 | Score 3.1-4.00 | Score 4.1-5.00 | Mean Score
Should the class be learned before the object?
77 (24.13%)
27 (8.46%)
59 (18.49%)
156 (48.90%)
3.58
Should an object be learned before class?
222 (69.59%)
38 (11.91%)
32 (10.03%)
27 (8.46%)
2.45
Is it difficult to identify/list the different objects to be used for the given task?
81 (25.39%)
41 (12.85%)
145 (45.45%)
52 (16.30%)
3.43
Is it difficult to grasp class attributes and methods?
58 (18.18%)
45 (14.10%)
168 (52.66%)
48 (15.04%)
3.53
Is it difficult to categorize the class members (static members), local variables, and parameters of the methods?
64 (20.06%)
51 (15.98%)
113 (35.42%)
91 (28.52%)
3.62
Is it difficult to apply attributes and methods to objects?
68 (21.31%)
43 (13.47%)
142 (44.51%)
66 (20.68%)
3.53
Is it difficult to analyse the interaction between the objects by message passing?
55 (17.24%)
71 (22.25%)
103 (32.28%)
90 (28.21%)
3.62
Is it difficult to provide the hierarchy of the classes?
74 (23.19%)
56 (17.55%)
132 (41.37%)
57 (17.86%)
3.48
Is it difficult to integrate the classes?
62 (19.43%)
81 (25.39%)
126 (39.49%)
50 (15.67%)
3.45
Is it difficult to remember the structure of the class?
68 (21.31%)
56 (17.55%)
144 (45.14%)
51 (15.98%)
3.53
Is it difficult to comprehend the given problem/programming task in OOP terms?
81 (25.39%)
62 (19.43%)
153 (47.96%)
23 (7.21%)
3.24
Is it difficult to implement the OOP concepts for creating the program?
115 (36.05%)
126 (39.49%)
41 (12.85%)
37 (11.59%)
2.98
Is it difficult to remember which package is required to import for a specific method?
74 (23.19%)
48 (15.04%)
44 (13.79%)
152 (47.64%)
3.91
Is it difficult to apply access specifiers depending on the given task?
79 (24.76%)
72 (22.57%)
114 (35.73%)
53 (16.61%)
3.35
Is it difficult to optimize the code in OOP?
93 (29.15%)
60 (18.80%)
113 (35.42%)
52 (16.30%)
3.29
Is it difficult to remember the naming conventions when declaring identifiers?
66 (20.68%)
45 (14.10%)
132 (41.37%)
75 (23.51%)
3.60
Is it difficult to understand which OOP structures are best used in the given situation?
63 (19.74%)
45 (14.10%)
172 (53.91%)
39 (12.22%)
3.55
Is it necessary to have an ordered sequence of topics learned in OOP?
53 (16.61%)
46 (14.42%)
111 (34.79%)
109 (34.16%)
3.80
Is polymorphism useless in OOP, since the same behaviour can be achieved using if-else statements?
178 (55.79%)
44 (13.79%)
39 (12.22%)
58 (18.18%)
2.76
Do you want to start learning programming by coding?
235 (73.66%)
41 (12.85%)
37 (11.59%)
6 (1.88%)
2.08
Is it good to start learning OOP by seeing graphical components?
37 (11.59%)
35 (10.97%)
168 (52.66%)
79 (24.76%)
3.87
Is it good to start learning OOP by providing a real-life example?
74 (23.19%)
30 (9.40%)
190 (59.56%)
25 (7.83%)
3.43
Starting OOP by writing code makes it difficult to grasp the concepts
70 (21.94%)
33 (10.34%)
116 (36.36%)
100 (31.34%)
3.71
The category "motivation for learning OOP" includes 20 items related to the satisfaction, sense of competence, enjoyment, interest, and attention associated with the activities provided before the feedback form; the majority of the students rated most of these items in the lowest band (1.00-2.00). The overall mean score, frequency, and percentage of the students' responses to this category are presented in Table III-5. For the items on learning OOP being boring in the classroom, putting in extra effort to understand OOP concepts, having trouble understanding OOP, thinking about quitting the currently enrolled program, and leaving because of programming, the highest-rated band was 3.1-4.00.
Table III-5 Overall mean score, frequency, and percentage for category motivation for learning OOP
Items | Score 1.00-2.00 | Score 2.1-3.00 | Score 3.1-4.00 | Score 4.1-5.00 | Mean Score
I am satisfied with my performance at this task.
153 (47.96%)
52 (16.30%)
40 (12.53%)
74 (23.19%)
3.01
After working at this activity for a while, I felt pretty competent.
198 (62.06%)
25 (7.83%)
61 (19.12%)
35 (10.97%)
2.67
I thought this activity was quite enjoyable.
187 (58.62%)
33 (10.34%)
72 (22.57%)
27 (8.46%)
2.73
I would describe this activity as very interesting
169 (52.97%)
61 (19.12%)
39 (12.22%)
50 (15.67%)
2.82
While I was doing this activity, I was thinking about how much I enjoyed it.
198 (62.06%)
31 (9.71%)
50 (15.67%)
40 (12.53%)
2.56
I am very interested in learning OOP
138 (43.26%)
47 (14.73%)
86 (26.95%)
48 (15.04%)
2.90
Solving OOP activities like this holds my attention.
174 (54.54%)
70 (21.94%)
40 (12.53%)
35 (10.97%)
2.71
I like the way I am learning about OOP in class
192 (60.18%)
60 (18.80%)
30 (9.40%)
37 (11.59%)
2.61
I enjoy learning new things in class
147 (46.08%)
50 (15.67%)
44 (13.79%)
78 (24.45%)
3.05
When I run into a difficult homework problem, I keep working at it until I think I’ve solved it.
168 (52.66%)
56 (17.55%)
59 (18.49%)
36 (11.28%)
2.81
Do your course grades accurately reflect how much you have learned?
168 (52.66%)
29 (9.09%)
65 (20.37%)
57 (17.86%)
2.97
I always enjoyed learning about OOP in the classroom.
183 (57.36%)
60 (18.80%)
34 (10.65%)
42 (13.16%)
2.67
I always enjoyed learning about OOP outside the classroom.
169 (52.97%)
40 (12.53%)
59 (18.49%)
51 (15.98%)
2.91
I am satisfied with the way the OOP subject is taught to me during this course
209 (65.51%)
60 (18.80%)
30 (9.40%)
20 (6.26%)
2.40
I have enjoyed the way the OOP subject is taught to me during this course
198 (62.06%)
66 (20.68%)
30 (9.40%)
25 (7.83%)
2.47
I think learning OOP is boring in the classroom environment
66 (20.68%)
35 (10.97%)
145 (45.45%)
73 (22.88%)
3.66
In class, I have to put in extra effort to understand OOP
50 (15.67%)
35 (10.97%)
122 (38.24%)
112 (35.10%)
3.89
When I have trouble understanding an OOP problem, I go over it again until I understand it.
49 (15.36%)
37 (11.59%)
162 (50.78%)
71 (22.25%)
3.77
Have you ever thought about quitting the program you are enrolled in?
72 (22.57%)
65 (20.37%)
146 (45.76%)
36 (11.28%)
3.43
If yes, is it because of the difficulty in learning programming languages?
81 (25.39%)
53 (16.61%)
153 (47.96%)
32 (10.03%)
3.37
3. THE PROPOSED COMPETENCY STRUCTURE MODEL
The various competency models are discussed in chapter II, section 4. However, certain limitations motivate the creation of a new competency structure model. The competency model proposed by Havenga (Havenga, 2008), discussed in section 4.1 of chapter II, lacks evidence for the measurement of the discussed competencies. The COMMOOP structural model (Kramer, et al., 2016), discussed in section 4.2 of chapter II, addresses many competencies besides those related to OOP, such as mastery of syntax and semantics, problem-solving, testing, debugging, and meta-cognitive competencies. That model also lacks refinement of the core competencies related to OOP, which widens the scope of the research. Simona's hierarchy-based competency model, discussed in section 4.3 of chapter II, fails in the implementation of the model within the given study framework.
Hence, it is observed that the available models fail to describe how the competencies should be evaluated, whether in a traditional learning system or using a game prototype; the researcher therefore designed the competency model herself. The competency model is also required to be implemented in the research study prototype, with the aim of evaluating students' core OOP competencies.
The proposed competency structure model is presented in Figure III-3.
Figure III-3 Proposed competency model for mastering OOP skills
Figure III-3 shows the competency model (CM) used to describe the student competencies that we want to assess, such as knowledge, skills, or other attributes. A CM is used primarily to support reasoning for specific purposes, such as providing scores for students' homework or assignments, certificates, diagnosis, or further guidance. Groups of knowledge and skills in a CM are called nodes. A more specific version of a CM is the student model, which describes competencies at a finer granularity, such as transcripts or progress reports.
This research study emphasizes the achievement of competencies related to "understand structures" and their underlying sub-competencies. The sub-competencies of "understand structures", namely "understanding inheritance", "understanding of classes", and "understanding of objects", and their own sub-competencies (that is, the shaded parts in Figure II-5), are the focus of this research study and are incorporated into the serious game model. The competency "understanding inheritance" has the sub-competency, or task, "sub- and super-classes". The node "identification of class" does not include any sub-competency; it is therefore referred to as a concrete task/activity. The competency "creation of class" has two sub-nodes, "define state" and "define behaviors"; furthermore, "define state" and "define behaviors" have other child nodes, considered concrete task/activity nodes.
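The shaded competency nodes can be illustrated with a short Java sketch. The class and member names below (Vehicle, Car, and so on) are hypothetical examples chosen only to mirror the model's nodes; they are not taken from the OOsg prototype itself.

```java
// Illustrative sketch of the competency nodes: creation of class,
// define state, define behaviors, and sub- and super-classes.
class Vehicle {                       // creation of class (super-class)
    private int speed;                // define state (attribute)

    void accelerate(int delta) {      // define behavior (method)
        speed += delta;
    }

    int getSpeed() {
        return speed;
    }
}

class Car extends Vehicle {           // sub- and super-classes (inheritance)
    private final String model;       // additional state in the sub-class

    Car(String model) {
        this.model = model;
    }

    String describe() {               // behavior specific to the sub-class
        return model + " at " + getSpeed() + " km/h";
    }
}

public class CompetencyDemo {
    public static void main(String[] args) {
        Car car = new Car("Sedan");   // identification/creation of an object
        car.accelerate(60);           // message passing to the object
        System.out.println(car.describe()); // prints "Sedan at 60 km/h"
    }
}
```

Here, Vehicle exercises "creation of class" with its "define state" and "define behaviors" child nodes, while Car exercises the "sub- and super-classes" task under "understanding inheritance".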
4. PURPOSE OF THE PROPOSED MODEL
The essential elements that make up the proposed model were identified from the investigation of students' difficulties and misconceptions through their performance, discussed in chapter III, section 1, and from the existing serious games available for learning and teaching OOP, discussed in chapter II, section 7. Based on the findings of chapter II, section 2, which identifies difficulties and misconceptions in learning OOP from the literature review, as well as chapter III, section 1, the investigation of students' performance, the proposed model could serve as:
1. A model to overcome the difficulties and avoid misconceptions in comprehending given OO problems, such as identifying the classes, their attributes and methods, understanding the impact of creating and destroying class objects, and establishing the correct hierarchy. The model also aims to foster the learning outcomes and improve the performance of the students.
Based on the findings of chapter II, section 7, a study of the existing serious games that support OOP, the purpose of the proposed model could also be viewed as:
2. A model to overcome students' obstacles and support the entire learning process in a fun and engaging way. The development of the model is therefore influenced by learning theories and by linking game attributes to instructional design models. For the difficulties that arise from a lack of motivation, a motivational model is incorporated into the model's design.
5. DESIGN OF THE MODEL
In order to design a model of a serious game, the following process
needs to be performed:
5.1 Identification of Attributes for Serious Games Model
To propose a new model for serious games, students' difficulties and misconceptions are identified as the instructional contents of the prototype. The instructional contents are then mapped to game attributes suitable either for achieving the expected learning outcomes or for sustaining motivation while learning, so that completing them offers challenge, curiosity, and fantasy for the students. The details of these attributes are provided in the sections below.
5.1.1 Identification of the instructional contents and difficulties to be incorporated as input to the model
There is great debate about the instructional content to be taught in an OOP course. The discussion of the instructional contents, or core OOP quarks, is available in chapter II, sections 1.1 through 1.8.
Many difficulties in learning OOP came to notice from reviewing the existing studies discussed in chapter II, section 2, and from the students' evaluation performed to investigate the possible causes of those difficulties. Moreover, it is concluded from the studies included in the literature review that obstacles arise from motivational issues, which may be due to an uninteresting approach to teaching OOP concepts (such as starting with the technical details of the language), the complexity of the learning and practicing environment, improper or incomplete feedback on students' activities, and fear of failure during the course.
The results of the investigation of the students' activities provided in chapter III indicate that students faced obstacles in comprehending basic OOP concepts: identifying classes, defining states and behaviors, associations between classes, the impact of creating or destroying objects of specific classes, defining the proper degree of abstraction, the mechanisms of encapsulation and information hiding, the concept of generalization, and the relationship between the static structure of a class and the dynamic structure of its instances; dynamic dispatch in a given problem scenario is an especially big challenge. Besides, the student feedback results indicate difficulty in understanding the difference between procedural and OOP languages, and a lack of comprehension and application of basic OOP concepts. The difficulties with existing teaching methods thus become the main reason for students' lack of interest in learning OOP.
For the design and development of the serious game model, this research focused on the instructional contents, or quarks, of OOP described by Armstrong (Armstrong, 2006), i.e., structural (abstraction, class, encapsulation, object, inheritance) and behavioural (message passing, method, polymorphism). In this study, three structural quarks, i.e., class, object, and inheritance, and one behavioural quark, i.e., method, are included as the model's instructional contents.
Subsequently, the difficulties found through the three methods mentioned above, i.e., (A) difficulties from the existing studies, (B) results of the investigation of students' activities, and (C) feedback provided, are summarized as shown in Figure III-4. The difficulties common to (A) & (B) and to (B) & (C) concern the comprehension of OOP concepts, while the difficulties common to (A) & (C) concern both the comprehension of OOP concepts and difficulties arising from motivational issues.
Hence, the following difficulties need to be incorporated as input content
in developing the Serious games model.
1. The results indicate that students have difficulties and misconceptions in comprehending almost all the quarks of OOP; however, incorporating all those difficulties into the design and development of a serious game prototype would go beyond the scope of this study. We therefore limit this study to the difficulties related to classes, attributes, behaviours, objects, and hierarchies, as discussed in chapter III, section 1.
2. Difficulties related to motivational issues are addressed through the design of the serious games model. These difficulties arise because the method of teaching OOP concepts is not interesting enough, because of the complexity of the learning and practice environment, and because of improper or incomplete feedback on student activities.
Figure III-4: Summarized difficulties from (A) the existing studies, (B)
investigation on student’s activities, and (C) feedback provided from the students
5.1.2 Identification of the instructional design for delivery of the contents
Various teaching models have so far been discussed in chapter II, section 10. Instructional design is the process of analyzing learning needs and goals and developing instructional content accordingly. The instructional
contents for this study are based on the Gagné-Briggs model (discussed in section 10.1 of chapter II). Since Gagné's pioneering work in instructional design, his followers have reached a consensus that teaching effectiveness depends on the learner's motivation to learn. To stimulate motivation, measures must first be taken to attract the learner's attention. However, analysis of the existing ID models shows that, after preliminary analysis and evaluation, most set aside the learner's motivation in favour of immediate learning effects, although they still support its internal processes. Another area of education critical to learning is content delivery. Content delivery, including learner feedback, followed the method proposed by Gagné (Gagné and Briggs, 1974), which involves nine events that support the internal processes related to learning in this study.
The most important aspect of these events is ensuring that the learner stays engaged within the environment and remains immersed without becoming bored. Activities should involve learning materials that are appropriate and challenging for a learner seeking competency at a level slightly above their current one. Applying the nine instructional events of Gagné-Briggs is considered an excellent way to ensure effective and systematic delivery of the instructional contents, because it provides a structure for the lesson plan and a holistic view of teaching.
Connecting game attributes to these instructional events serves as an effective way to benefit from designed serious games (Becker, 2007). Some example game activities incorporated as Gagné instructional events are presented in Table III-6.
Table III-6 Example game activities incorporated as Gagne instructional events
Gagne Instructional Events Example Game Activities
Gain attention Welcoming the player with their chosen avatar and name, or playing background music
Inform learner of the objectives Presenting the game objectives, outcomes, or "about" information
Stimulate prerequisite recall Providing warm-up scenarios or presenting information on related items in the game
Present learning material Presenting the game activities
Guide learning
Presenting information regarding the rules and goals of each level, how points/score increase and decrease, how to get hints, and win/lose strategies
Elicit performance Presenting the game levels to be played and the actions to be performed in the levels
Provide feedback Providing feedback on the player's actions taken in the game
Assess performance Providing the game score, correct and wrong attempts, remaining tasks, etc.
Enhance retention and transfer Providing the previous information in subsequent levels with increased challenge or complexity
Besides effective instructional content delivery to achieve the expected learning outcomes, a motivational design model should be considered a necessary complement to ID models in Gagné's tradition. Therefore, Keller's ARCS model, discussed in chapter II, section 10.5, is included in the serious games model design in this research. The ARCS model is chosen for its exceptional emphasis on aspects of the instructional design domain, such as the learner's motivation and attention, that other instructional models do not consider. The ARCS model has developed into the most popular motivational design method in education, and its components should be considered the minimum requirements for designing a learning environment.
5.1.3 Identification of the game attributes which contribute to learning and motivation
Bedwell (Bedwell, et al., 2012) suggested that game attributes, when adequately used during game design, help promote learning and boost students' interest. To ensure that serious games offer the same degree of teaching experience as traditional teaching methods, it is important to integrate traditional teaching elements that have been successfully established into the design and development of the serious game prototype. These successful aspects are known as game attributes in the field of game design. Ricci (Ricci, et al., 1996) proposed that instruction incorporating game features enhances student motivation, which leads to greater attention to the training content and greater retention.
Matching the desired learning outcomes with game attributes, or rather selecting the game attributes that produce the desired outcomes, is a difficult task. On the other hand, tying the game attributes to a suitable instructional design to achieve the intended learning outcomes would help trainers, instructors, practitioners, and learners. Several studies have been reviewed to identify the game attributes contributing to effective learning and motivation. A summary of game attributes and the outcomes they are supposed to achieve is presented in Table II-3 of chapter II. The game attributes used most widely by the authors of the reviewed studies are selected to design the serious games model in this research study.
Furthermore, the selection is also based on the type of outcomes produced by the game attributes. Conclusively, the game attributes defined by Garris (Garris, et al., 2002), i.e., (1) fantasy, (2) rules/goals, (3) sensory stimuli, (4) challenge, (5) mystery, and (6) control, plus one more attribute, "progress", are used by the majority of the authors. However, we limit ourselves to the six game attributes defined by Garris (Garris, et al., 2002), as the outcomes produced by the attribute "progress" are not yet well defined by the authors. Specifically, to make learning activities intrinsically motivating, Malone (Malone, 1981) proposed the challenge, curiosity, and fantasy attributes as particularly effective. The descriptions of Garris's (Garris, et al., 2002) game attributes and the outcomes they produce are summed up in Table III-7.
Table III-7 Game attributes and possible outcomes provided by Garris 2002
Attributes Description Possible Outcomes
Fantasy
The fantasy element in the game represents something different from real life and evokes a mental image that does not exist
Increases learning and interest (Cordova and Lepper, 1996; Parker and Lepper, 1992). Increases automaticity. Increases focus of attention and self-absorption (Driskell and Dwyer, 1984)
Rules/ Goals
The rules make up the game's goals and the well-defined standards for how to win. Specific, well-defined rules and guidelines are necessary for serious games, along with feedback on progress towards achieving the goals.
Enhance performance and motivation (Locke and Latham, 1990)
Sensory stimuli
Visual or auditory stimuli help to change perception and imply temporary acceptance of an alternative reality.
Enhances the motivational appeal of instructional activities. Helps in grabbing attention (Malone and Lepper, 1987)
Challenge
Challenge refers to the difficulty of achieving the goal. A challenging game has multiple specified goals, progressive complexity, and information ambiguity. Challenge also increases fun and competition by creating obstacles between the current state and the target state.
Improves personal competencies; engages competitive or cooperative motivations; improves the learner's retention of knowledge
Mystery
Mystery refers to the gap between known and unknown information. It is the product of differences or inconsistencies in knowledge. Conflicting information, complexity, novelty, surprises and violations of expectations, conflicting ideas, and the inability to make predictions all enhance a game's mystery.
Drives learning; evokes curiosity to complete the next levels or quests; enhances motivation
Control
It is the degree to which learners can guide their learning activities in the game, providing self-learning and self-exploration abilities suited to their pace and experience.
Increases motivation and learning; affects skill-based learning.
5.1.4 Determine the expected learning outcomes to be achieved
The overall game goals, followed by the game rules, need to be met to achieve the expected learning outcomes during play. The game goals are created by combining the specific functions implemented in the game with the related instructional contents. Each game activity included in the game design has at least one specified learning outcome that needs to be achieved. These outcomes are broken down across all game activities according to the overall teaching content that the player is meant to learn. The player is informed about what needs to be done and what they can learn from the learning materials included in specific activities. The intended learning outcomes to be achieved using the proposed serious games model are described in the form of OOP competencies. The competency model proposed by the researcher is described in chapter III, section 3 (Figure III-3). Besides these core competencies, this research also analyzes the learning outcomes by finding the difference in students' performance in learning OOP with and without the intervention of the prototype, the difference in students' normalized learning gain, and the effect of perceived motivation on the students' learning outcomes; finally, the effect size is measured to see how important the difference with and without the intervention of the prototype is.
5.2 Structure of the Model
Based on the details given in sections 5.1.1 to 5.1.4, the serious game model can be created. Figure III-5 shows a serious game model based on the described components, i.e., instructional contents or learning difficulties, game attributes, learning theories, required competencies, and motivational aspects, logically placed in the presentation, practice, and performance phases. The purpose of the placement of the components in these phases is presented in Table III-8.
Figure III-5: Proposed Serious Game model for learning OOP
Table III-8: Placement of the components in the serious games model’s phases
Phase Purpose Components
Presentation The input content presented to the user/player as learning material
Instructional contents, learning difficulties and misconceptions, intended learning outcomes, required competencies
Practice Learning and practicing of the user/player
Instructional Models, Learning Theories, Game attributes
Performance Assessment of the performance of the user/player
Perceived motivation, game statistics, and log files
5.2.1 Three phases of the model
The proposed serious game model is divided into three phases. The
following subsections describe all three phases and their related components.
i. Presentation
In the presentation phase, the instructional contents of OOP, together with the contents on which students faced difficulties and misconceptions, collected from the existing studies and from the students' investigation and feedback survey (discussed in section 2 of chapter III), are presented as game input to the user/player to learn from and to improve their performance in OOP skills.
ii. Practice
In the practice phase, the input from the presentation phase is presented to the user/player as learning activities to practice and to accomplish the intended learning outcomes. This stage helps trigger a cycle that includes a sense of achievement or response (such as perceived motivation or interest), a change in user behaviour (such as greater persistence or time on task), and further system feedback. The practice phase is the core of the serious games model, where the game activities are prudently designed by linking the game attributes with the instructional design model, and the whole learning process occurs under the rules of the adopted learning theories. Every game activity in the practice phase is meant to be played and to build mastery in one or more learning outcomes and some motivational aspects. This phase's components include the instructional design, learning theories, and game attributes, which are discussed in sections 5.1.2 and 5.1.3.
iii. Performance
Each activity played by the user/player in the practice phase is assessed to check their performance. Two types of assessment are done in this phase. The first is formative assessment, in which the player is informed about every correct and incorrect action performed while playing the activities in each game level; the player is not only informed of their right and wrong actions but is also provided with tailored feedback to improve their performance. A summative assessment evaluates the performance of each player and is usually done at the end of each level, whether or not the player completed it. In the summative assessment, the player is informed about their correct and incorrect attempts, the remaining actions required to solve the complete level, the total time taken in playing the level, the overall score, and their performance percentage (i.e., correct solutions attempted / total correct solutions x 100).
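The summative performance percentage defined above can be sketched as a small helper method. The class and method names here are illustrative assumptions, not the prototype's actual code.

```java
public class PerformanceScore {

    // Summative performance percentage, as defined above:
    // (correct solutions attempted / total correct solutions) x 100
    static double performancePercentage(int correctAttempted, int totalCorrect) {
        if (totalCorrect <= 0) {
            throw new IllegalArgumentException("totalCorrect must be positive");
        }
        return (correctAttempted / (double) totalCorrect) * 100.0;
    }

    public static void main(String[] args) {
        // e.g., a player who attempted 8 of the 10 correct solutions in a level
        System.out.println(performancePercentage(8, 10)); // prints 80.0
    }
}
```

The cast to double avoids integer division, so a partial score such as 8 of 10 yields 80.0 rather than 0.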
5.2.2 Linking the game attributes with instructional design
The design of serious games involves creating learning activities that can be used in the game environment or that entail game attributes which change students' learning experience (Lameras et al., 2016). It is therefore the link between instructional design and game attributes that must be found, to scaffold teachers' understanding of how to perpetuate learning in optimal ways while enhancing the in-game learning experience.
A limitation of the existing research is that it has focused little on which game attributes lead to learning outcomes, whether the relationship between games and learning is direct or indirect, and, if indirect, what the mediating variables may be. Therefore, to design the serious game model, we first established whether a single game attribute leads to learning or enhances perceived motivation, or whether a combination of multiple attributes within a game has a more substantial effect, and which game elements mainly help produce which learning outcomes; the result of this analysis of the existing studies is presented in chapter II, Table II-3. In addition, the identification of the game attributes required for the intended outcomes of the proposed model is discussed in section 5.1.3, and the outcomes produced by each of these game attributes are presented in Table III-9.
Figure III-6 illustrates the serious game model in which the linking of the game attributes with the instructional design is established. In this model, the cognitive processes of the Gagné model are presented as the instructional model. The model depicts which Gagné cognitive processes of instructional events, combined with game attributes, could enhance the players' motivation and produce the learning outcomes. It can be seen from the model that a single game attribute is not always sufficient; rather, a combination of game attributes helps to enhance motivation and learning outcomes.
Figure III-6 Linking game attributes with instructional model
As mentioned above, a game's learning outcomes can be viewed as
indicators of the assessment methods achieved through the player's in-game
experience. Linking the game attributes with instructional design encourages
recurring and self-motivated gameplay. The combination of instructional
events with game attributes, conceptualized for the serious game model, is
presented in detail in Table III-9.
Table III-9 Linking game attributes with the instructional design for the enhancement of the perceived motivation of the player

Gagne Event of Instruction       | Gagne Cognitive Process | Game Attributes                              | Keller's ARCS Model
Gain attention                   | Reception               | Control; Sensory stimuli                     | Attention
Inform learner of the objectives | Expectancy              |                                              |
Stimulate prerequisite recall    | Retrieval               |                                              |
Present learning material        | Selective perception    | Fantasy; Rules/Goals; Sensory stimuli        | Relevance
Guide learning                   | Semantic encoding       |                                              |
Elicit performance               | Responding              | Fantasy; Rules/Goals; Sensory stimuli        | Confidence
Provide feedback                 | Reinforcement           |                                              |
Assess performance               | Retrieval               | Fantasy; Sensory stimuli; Challenge; Mystery | Satisfaction
Enhance retention and transfer   | Generalization          |                                              |

(Game Attributes and ARCS entries span the grouped rows, reflecting the merged cells of the original table.)
6. DEVELOPMENT OF THE OOsg
The details of the development of OOsg are described in the following
sections:
6.1 Tools Used
The libraries and tools used for prototyping are discussed in the
sections below.
6.1.1 Java Standard Edition (Java SE)
Java SE was used to build a prototype that can be compiled,
deployed, and run in desktop, server, and embedded environments. The Java
library provides numerous functions needed in today's applications, such as a
rich user interface, performance, flexibility, portability, and security features.
6.1.2 JavaFX Scene Builder v10.0
JavaFX is now part of Java SE, allowing developers to build desktop
applications and rich Internet applications (RIA) that can run on different
platforms and devices. The library was designed to replace Swing as the
standard GUI library for Java SE. JavaFX Scene Builder was used to generate
all the screens used in the prototype development.
6.1.3 JSON library
The solution model used to validate the student's solution is stored in
JavaScript Object Notation (JSON) format (available in Appendix-D). JSON
was chosen because its data format provides the flexibility needed to map and
validate the student's solution model against the game solution model. The
student's result is also written out in JSON format. JSON.simple, a lightweight
Java-based toolkit, was used in this research to encode and decode JSON.
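As a hedged illustration, a solution model of this kind might look like the following fragment. The field names and values here are hypothetical and do not reproduce the actual Appendix-D file:

```json
{
  "scenario": "Hospital Management System",
  "classes": [
    {
      "name": "Patient",
      "attributes": [
        { "access": "private", "type": "String", "name": "patientName" }
      ],
      "methods": [
        { "access": "public", "returnType": "void", "name": "admit" }
      ],
      "parent": null
    },
    { "name": "Doctor", "attributes": [], "methods": [], "parent": null }
  ]
}
```

A structure of this shape would let the game compare each class, attribute, and method a student identifies against the stored model.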
6.2 Description of the OOsg
The developed 2D game is named OOsg (Object Oriented game). The
motivation behind the design and development of OOsg is to provide a
serious game environment in which novice programming students, or students
who have difficulty conceptualizing the fundamental quarks of OOP, can start
learning in an interesting and engaging way. Using the developed serious
game, students can improve their understanding of OOP concepts, especially
classes, attributes, methods, objects, and inheritance. A post-test was
performed to provide evidence of the validity of student outcomes achieved
using the OOsg.
6.2.1 Navigation between OOsg screens
In OOsg, navigation between screens is controlled by the player. Figure
III-7 shows the sequence of OOsg screens displayed to the player. Each
screen is represented by a rectangle, and an arrow's direction indicates that
navigation is possible between the screens. "Start" indicates the splash or
welcome screen of the game. After the "Start" screen, the "Provide basic and
game control information" screen is displayed, in which the player provides
basic information such as name and gender, and game-controlling information
or game dynamics such as player type (i.e., Beginner, Intermediate, or Expert)
and difficulty level (i.e., Easy, Medium, or Hard) for playing the game (details
are also provided in Table III-10).
After providing the information, the player may navigate either to the
"warm-up session" or to "select scenario" for playing the game. The double-
headed arrow from the "warm-up session" or "select scenario" screen
indicates that the player can navigate back to change the information provided
on the "Provide basic and game control information" screen. After "select
level", the player can learn about the level-related concept provided on the
"Learn concept" screen and then move to the "Level goals and rules" screen
to read the goals and rules for clearing the level. After that, the player can play
the level provided on the "Play Level" screen. The player can switch back and
forth to any level by navigating to the "Select Level" screen, or switch directly
to the next level by moving to the "Level goals and rules" screen. The player
can quit the game from any screen except the "Start" screen.
Figure III-7 Navigation between OOsg screens
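The navigation described above can be sketched as a small state model. This is an illustrative reconstruction with hypothetical enum and method names, not the OOsg source code:

```java
import java.util.EnumSet;
import java.util.Set;

// Illustrative sketch (hypothetical names, not the OOsg source): the screen
// flow of section 6.2.1, modelled as an enum with allowed transitions.
public class ScreenFlow {
    enum Screen { START, PLAYER_INFO, WARM_UP, SELECT_SCENARIO,
                  SELECT_LEVEL, LEARN_CONCEPT, GOALS_AND_RULES, PLAY_LEVEL }

    /** Navigation targets reachable from a given screen. */
    static Set<Screen> next(Screen s) {
        switch (s) {
            case START:           return EnumSet.of(Screen.PLAYER_INFO);
            case PLAYER_INFO:     return EnumSet.of(Screen.WARM_UP, Screen.SELECT_SCENARIO);
            case WARM_UP:         return EnumSet.of(Screen.PLAYER_INFO);   // back only
            case SELECT_SCENARIO: return EnumSet.of(Screen.PLAYER_INFO, Screen.SELECT_LEVEL);
            case SELECT_LEVEL:    return EnumSet.of(Screen.LEARN_CONCEPT);
            case LEARN_CONCEPT:   return EnumSet.of(Screen.GOALS_AND_RULES);
            case GOALS_AND_RULES: return EnumSet.of(Screen.PLAY_LEVEL);
            case PLAY_LEVEL:      return EnumSet.of(Screen.SELECT_LEVEL, Screen.GOALS_AND_RULES);
            default:              return EnumSet.noneOf(Screen.class);
        }
    }
}
```

Quitting (allowed from every screen except "Start") is omitted from the sketch for brevity.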
6.2.2 Game mechanics and dynamics
The set of actions, behaviours, and control mechanisms handed to the
player is known as game mechanics (Hunicke, R., et al., 2004), whereas game
dynamics are the changes that occur in the game environment when a game
mechanic is triggered. Game mechanics help define a rule-based system for
the game environment by explicitly stating all the objects available within the
game environment, how each object behaves, and how the user can interact
with them in the game world. The dynamics of the game provide more
entertainment and engagement for the player. For example, suppose the user
is playing a game in which they must identify the different classes from a given
story-based scenario. The story may also contain other non-playing elements
such as objects, attributes, or other non-referenced classes; if the player
identifies the wrong candidates as classes, i.e., attributes, methods, objects,
or non-existing classes, they may ultimately lose all their lives, and time will
also run out. The game mechanics and game dynamics used in the OOsg are
presented in Table III-10.
Table III-10 Game mechanics and dynamics of OOsg

Game Level: Level 1    Game Mechanics: Identify classes

Player Type  | Difficulty Level | Goal
Beginner     | Easy   | 10 correct class identifications in 20 minutes
Beginner     | Medium | 20 correct class identifications in 20 minutes
Beginner     | Hard   | 60 correct class identifications in 20 minutes
Intermediate | Easy   | 10 correct class identifications in 15 minutes
Intermediate | Medium | 20 correct class identifications in 15 minutes
Intermediate | Hard   | 60 correct class identifications in 15 minutes
Expert       | Easy   | 10 correct class identifications in 5 minutes
Expert       | Medium | 20 correct class identifications in 5 minutes
Expert       | Hard   | 60 correct class identifications in 5 minutes

Game Dynamics (the same for all player types and difficulty levels):

On the correct identification of a class:
• The name of the class is added to the class box
• The name of the class is loaded into the offline list
• Positive feedback is provided
• Increase-in-score music is played in the background
• The score is increased by 10 points
• The total number of classes remaining to be identified is decremented

On the incorrect identification of a class:
• Tailored feedback is provided, depending on the specific wrong identification
• Decrease-in-score music is played in the background
• The score is decreased by 10 points
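As a hedged illustration of how the Table III-10 settings might be wired up in Java (the prototype's implementation language), the sketch below derives the Level-1 goal and time limit from player type and difficulty and applies the ±10 scoring rule. The class and method names are hypothetical; this is not the OOsg source code:

```java
// Illustrative sketch (hypothetical names, not the OOsg source): Level-1
// goal settings per Table III-10, plus the +10/-10 scoring rule.
public class Level1Config {
    /** Required correct identifications: 10 (Easy), 20 (Medium), 60 (Hard). */
    static int requiredCorrect(String difficulty) {
        switch (difficulty) {
            case "Easy":   return 10;
            case "Medium": return 20;
            case "Hard":   return 60;
            default: throw new IllegalArgumentException(difficulty);
        }
    }

    /** Time limit in minutes: 20 (Beginner), 15 (Intermediate), 5 (Expert). */
    static int timeLimitMinutes(String playerType) {
        switch (playerType) {
            case "Beginner":     return 20;
            case "Intermediate": return 15;
            case "Expert":       return 5;
            default: throw new IllegalArgumentException(playerType);
        }
    }

    /** Score changes by +10 for a correct attempt and -10 for a wrong one. */
    static int updateScore(int score, boolean correct) {
        return correct ? score + 10 : score - 10;
    }
}
```

For example, an Expert player on Hard would need 60 correct identifications within 5 minutes.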
6.2.3 The UI of OOsg
The following sections provide brief details about the screens of OOsg.
Details about the game UI are also available in Appendix-E.
Screen 1 – Splash Screen
The splash screen, or "start" screen, is the welcoming screen of the
game, which provides information about the game's title and its developer.
Screen 2 – Personal and game control information
In screen-2, shown in Figure III-8, players provide some personal
information, such as name and gender, as well as information about game
controls or game dynamics, including player type (i.e., beginner, intermediate,
or expert) and difficulty level (i.e., easy, medium, or hard); details about game
dynamics are provided in Table III-10. This information is needed to create and
maintain log files for individual players and their interaction with the game. As
soon as the player presses the play or exit button provided on screen 2, a log
file is generated.
Screen 3– Select Scenario
In screen-3, shown in Figure III-9, players can select a scenario to be
played in the developed game. After the player chooses a particular scenario,
the activities are presented in the game environment accordingly. Currently,
the designed game implements only a scenario related to a hospital
management system. However, the game's architecture is flexible enough to
support any scenario by adding a solution JSON file, discussed in section
5.2.1.2.
Figure III-8: Screen 2 – Personal and game control information
Figure III-9: Screen 3 – Selection of game scenario
Screen 4– Warm-up Session
In screen-4, shown in Figure III-10, instead of selecting a scenario,
players can choose the warm-up session option available on Screen-3. The
warm-up session gives students an idea of the environment in which they are
supposed to play the actual game.
Screen 5– Select Level
In screen-5, shown in Figure III-11, the player is presented with three
different levels (Class, Object, and Inheritance) that the player can select to
play. The Class level has two sub-levels, on attributes and behavior, which are
provided once the player plays the Class level; details about these levels are
given in the next section.
Figure III-10: Screen 4 – Warm-up session
Figure III-11: Screen 5 – Selection of game level
Screen 6– Learn basic concepts
In screen-6, shown in Figure III-12, the player is presented with a brief
description of the OOP topic/concept they selected on screen 5. The screen
also relates the concepts to the game story (available in Appendix-F)
embedded in the game environment.
Screen 7– Level goals and rules
Once the player learns about the basic concept related to the level they
chose to play, screen 7 (level goals and rules), shown in Figure III-13, is
displayed. The level goals define the learning objectives that players have to
achieve by playing that particular level; the goals differ between the levels of
the game. The level rules are game dynamics configured based on the
information provided on Screen-2; the details are also available in Table III-10.
Screen-7 also includes information about the options available on the next
screen, such as reading the story and how to activate the hint option, if
required.
Figure III-12: Screen 6 – Learning about the basic concepts of OOP
Figure III-13: Screen 7 – Goals and rules of games
Screen 8– Level Screens
After careful reading of the goals and rules for playing the levels, the
game's levels are activated for the player. Screen 8, shown in Figure III-14,
presents the layout used as the background of all game level screens. The
layout is divided into five regions, i.e., top (T), bottom (B), left (L), right (R),
and center (C), shown as dotted rectangles in Figure III-14. The region "T" is
used as the scoreboard of the game, which includes the total number of
correct solutions attempted, the score, options for getting help or a hint if the
player is stuck at any point, the timer, information about the current level, and
an option to quit the game. The "T" region is almost the same for all levels,
except that the numbers of correct and remaining attempts may vary. An
option for showing or hiding the game story (available in Appendix-D) can also
be enabled from the "T" region. The region "B" is used as a game story drawer,
which can be shown or hidden by pressing the hamburger sign available in
region "T." The regions "L" and "R" are dedicated to navigating to the previous
or next screen/level. The region "C" is the main region used for actual
gameplay; its contents differ between the levels of the game.
The numbers in circles shown in Figure III-14 indicate different points
of information:
1) Hamburger symbol: pressing this button shows or hides the drawer for the
game story (available in Appendix-D) at the bottom of the screen; the story is
presented in chunks, and the player can read it and perform the level's activity
accordingly.
2) The number of solutions or tasks correctly made by the player out of the
total number of correct solutions available; for the scenario shown in Figure
III-14, the total number of classes the player has to identify is 60, and no class
has been identified yet.
3) Timer information: the timer is set depending on the game control
information provided on Screen-2 and shows the time elapsed in the level.
4) The level currently being played by the player.
5) Navigation to the previous level or screen of the game; for level-1, the
previous screen is the "select level" screen (Screen-5), while on other levels
it navigates to the previous level.
6) Navigation to the next level or screen of the game; on the last level of the
game, this button is disabled.
7) Score information: for every correct attempt, the score table is incremented
with animation and an incremental sound, and for every wrong attempt, it is
decremented with animation and a decremental sound. The increment of 10
points for every correct attempt and the decrement of 10 points for every
wrong attempt are also recorded in the player's log files.
8) This button shows or hides the context menu. The menu contains options
such as loading/unloading information from previous levels (if the player has
already played on the same device), seeking a hint if the player is stuck at any
point while playing, switching to the "select level" screen (Screen-5), and
muting/unmuting the background music of the game.
9) The player can quit the entire game using this button.
Figure III-14 Screen 8 – Screen layout of the game levels
Screen 9– Level-1 Screens
Level-1 and some of the actions that occur during play are shown in
Figure III-15 to Figure III-19. In Level-1, the game story is presented to the
player in chunks (shown at the bottom of Screen 9a), and the player has to
identify the correct candidates for classes according to the domain of the story.
The player drags a candidate class from the game story and drops it on the
empty box available to populate the class list, as shown in Figure III-15,
Screen 9a. If the player's identification is correct, the candidate class appears
in the class list available in the playing region of the screen; the player is also
notified with feedback on the attempt, an increment in the score table, an
increase in the number of solutions or tasks correctly made, and a decrement
in the total number of correct solutions remaining, as shown in Figure III-16,
Screen 9b. A wrong attempt leads to a decrement in the score table with
appropriate feedback, as shown in Figure III-17, Screen 9c. If the player
successfully achieves the game goals, which are set based on the game
control information, the player is greeted with congratulation messages to
enhance their motivation to play the game, as shown in Figure III-18, Screen
9d. Once the game control information is matched, the game goes into the
stop state, and all the actions taken and game statistics are recorded in the
log files. Some game level statistics/evaluation information is also presented
to the player, such as win/lose status, correct and incorrect attempts/solutions,
how many correct attempts/solutions were remaining, the performance on the
level played, total time played, and total score, as shown in Figure III-19,
Screen 9e. This screen is presented for all the levels if the player plays the
level until the game control information is matched; otherwise, this information,
with more details, is only recorded in the log files.
Figure III-15: Screen 9a) Level-1 main screen
Figure III-16: Screen 9b) Level-1 correct attempt made by the
player
Figure III-17: Screen 9c) Level-1 wrong attempt made by the player
Figure III-18: Screen 9d) Level-1 achievement of game level’s goals
Figure III-19: Screen 9e) Level-1 Statistics/evaluation as result of completing level
Screen 10– Level-2 Screens
Level-2 and some of its actions during play are shown in Figure III-20
to Figure III-23. In Level-2, the player is provided with the class list identified
in Level-1 and is equipped with the same game story as in Level-1. At this
level, the player has to identify, from the story, the attributes of the previously
identified classes. A class box for each identified class appears separately
with animation. If the player clicks on a class available in the class list, the box
is highlighted with the class name at the top of the newly opened class box,
and space for adding the attributes is enabled on each class box, as shown in
Figure III-20, Screen 10a. The player drags candidate attributes from the game
story and drops them on the relevant class box. For every correct
identification, the player receives appropriate feedback with further
instructions to complete the task, as shown in Figure III-21, Screen 10b. The
increment in the score table and attempts/solutions is only updated once the
player completes the details of the identified attribute, i.e., access specifier,
attribute type, and attribute name (the attribute name is written automatically
when the player clicks on the identified attribute), as shown in Figure III-22,
Screen 10c. Initially, the attribute-type combo box is populated with basic data
types and is incrementally populated with class names, as a class can also be
the type of an attribute. However, strict validation of these details is beyond
the scope of this research. The player receives appropriate feedback for a
wrong attempt, along with a decrement in the score table, as shown in Figure
III-23, Screen 10d. The achievement of the level goals, game statistics, and
recording in log files are handled in the same way as in Level-1.
Figure III-20: Screen 10a) Level-2 main screen
Figure III-21: Screen 10b) Level-2 correct attempt made by the player
Figure III-22: Screen 10c) Level-2 adding details of the attributes
Figure III-23: Screen 10d) Level-2 wrong attempt made by the player
Screen 11– Level-3 Screens
Level-3 and some of the actions that occur during play are shown in
Figure III-24 to Figure III-27. In Level-3, the player is provided with the class
list identified in Level-1 and with the information about any attributes identified
for the classes in Level-2, and is equipped with the same game story as in
Level-1. At this level, the player has to identify, from the game story, the
methods of the classes previously identified in Level-1. A class box for each
identified class appears separately with animation. If the player clicks on a
class name available in the class list, the box is highlighted with the class
name at the top of the newly opened class box, and space for adding methods
is enabled on each class box. However, the area dedicated to the attributes is
visible but inactive at this level, as shown in Figure III-24, Screen 11a. The
player drags a candidate method from the game story and drops it on the
relevant class box. For every correct identification, the player receives
appropriate feedback with further instructions to complete the task, as shown
in Figure III-25, Screen 11b. The increment in the score table and
attempts/solutions is only updated once the player completes the details of
the identified method, i.e., access specifier, return type, method name (written
automatically when the player clicks on the identified method), and parameter
names and types; the player can add as many parameters as they want, as
shown in Figure III-26, Screen 11c. Initially, the return-type and parameter-
type combo boxes are populated with basic data types and are incrementally
populated with class names, as a class can also be the return type of a
method; however, strict validation of these details is beyond the scope of this
research. The player receives appropriate feedback for a wrong attempt,
along with a decrement in the score table, as shown in Figure III-27, Screen
11d. The achievement of the level goals, game statistics, and recording in log
files are handled in the same way as in Level-1.
Figure III-24: Screen 11a) Level-3 main screen
Figure III-25: Screen 11b) Level-3 correct attempt made by the player
Figure III-26: Screen 11c) Level-3 adding details of the methods
Figure III-27: Screen 11d) Level-3 wrong attempt made by the player
Screen 12– Level-4 Screens
Level-4 and some of its actions during play are shown in Figure III-28
to Figure III-31. In Level-4, the player is provided with the class list identified
in Level-1 and the same game story as in Level-1. At this level, the player is
supposed to learn the concept of the creation and deletion of objects and their
effects on the class. The player has to identify the objects available in the
game story or create new objects. If the object for a class is available in the
game story, and the player correctly drags and drops the object on the relevant
class, the player is notified with appropriate feedback, an increment in the
score table, and an increase in the attempts/solutions made correctly, as
shown in Figure III-28, Screen 12a, and Figure III-29, Screen 12b. The player
can also create as many objects for a class as they want. However, they have
to keep the object name the same as the class name, followed by a digit; e.g.,
for the class "patient," the player can create objects patient1, patient2, and so
on, as shown in Figure III-30, Screen 12c. The number of created objects can
be checked either by clicking the drop-down arrow available for any class in
the class list, or by simply selecting any class name from the list; all the
objects, along with the box for the class itself, pop up in the screen's playing
region, as shown on Screen 12c. The player also learns that if an object of a
class is destroyed, the class itself is not affected; however, if a class is
destroyed, all its objects are automatically destroyed, as shown in Figure
III-31, Screen 12d. The achievement of the level goals, game statistics, and
recording in log files are handled in the same way as in Level-1.
Figure III-28: Screen 12a) Level-4 main screen
Figure III-29: Screen 12b) Level-4 correct attempt made by the player
Figure III-30: Screen 12c) Level-4 Number of objects created for a class
Figure III-31: Screen 12d) Level-4 effect of object destruction on class
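The Level-4 naming convention described above (an object name is its class name followed by a digit, e.g., patient1) can be sketched as a simple check. This is an illustrative reconstruction, not the OOsg source, and it assumes one or more trailing digits are allowed:

```java
import java.util.regex.Pattern;

// Illustrative sketch (not the OOsg source): validate the Level-4 rule that
// an object name is its class name followed by digits, e.g. "patient1".
public class ObjectNameRule {
    /** True if objectName is className followed by one or more digits. */
    static boolean isValidObjectName(String className, String objectName) {
        // Pattern.quote treats the class name literally; \d+ requires digits.
        return Pattern.matches(Pattern.quote(className) + "\\d+", objectName);
    }
}
```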
Screen 13– Level-5 Screens
Level-5 and some of the actions that occur during play are shown in
Figure III-32 to Figure III-37. In Level-5, the player is provided with the class
list identified in Level-1. The class list is also populated in the combo box
available at the top-right of the class box, as shown in Figure III-32, Screen
13a. At this level, the player is supposed to learn the concept of creating
relationships between the classes they have identified. The player can
activate any class from the class list and select its parent from the top-right
combo box. If the player correctly identifies the relationship, they are notified
with appropriate feedback, an increment in the score table, an increment in
the attempts/solutions made correctly, and a decrement in the remaining
relationships they are supposed to create. The hierarchical relationship
between the classes also appears in the screen's playing region, as shown in
Figure III-33, Screen 13b. The player can also inspect a multi-level hierarchy,
if there is any, by pressing the class name in the class list, as shown in Figure
III-34, Screen 13c. The player is also notified with appropriate feedback if they
make a logical mistake in creating a relationship between classes: if the player
selects a base class as a sub-class, as shown in Figure III-35, Screen 13d; if
they select the same class as both child and parent class, as shown in Figure
III-36, Screen 13e; or if a class is missing within the hierarchy the player
attempts to create, as shown in Figure III-37, Screen 13f. The achievement of
the level goals, game statistics, and recording in log files are handled in the
same way as in Level-1.
Figure III-32: Screen 13a) Level-5 population of classes in combo-box for
creating a relationship between classes
Figure III-33: Screen 13b) Level-5 correct attempt made by the player
Figure III-34: Screen 13c) Level-5 multi-level hierarchy identified by the
player
Figure III-35: Screen 13d) Level-5 effect of selecting base-class as sub-
class
Figure III-36: Screen 13e) Level-5 effect of selecting the same class as
both for child and parent-class
Figure III-37: Screen 13f) Level-5 effect of a missing class in the hierarchy
attempted by the player
6.2.4 Log file generated as a result of playing OOsg
The log files are generated once the user has created their profile on
screen-2. This file keeps all the interactions that the player made with the
game; the first row shows the variable names for which data is gathered from
OOsg, while the remaining rows show the values received for those variables.
An example log file for level-1 is shown in Figure III-38; in this example,
detailed information about the player's attempts is recorded, such as how
many times the player attempted to select an attribute, method, or object as
a class, the total number of classes the player is supposed to identify, and
how many classes remain. The log file records many other concrete details,
such as total wrong and correct attempts, score, performance (calculated as
correctAttempts / totalNoOfSolutions * 100), total time played, date and time
of playing, and the correctly attempted solutions. A log file is generated locally
in a separate file for each level the player played. The information recorded in
the log file helps evaluate the students who used OOsg to learn basic OOP
concepts.
Figure III-38: Sample log file generated as a result of interaction with level-1
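The performance figure recorded in the log file can be sketched directly from the stated formula. This is an illustrative reconstruction with hypothetical names, not the OOsg source code:

```java
// Illustrative sketch of the log-file performance measure:
// performance = correctAttempts / totalNoOfSolutions * 100.
public class Performance {
    static double performance(int correctAttempts, int totalNoOfSolutions) {
        // Floating-point division so partial completion yields a percentage.
        return (double) correctAttempts / totalNoOfSolutions * 100.0;
    }
}
```

A player with 30 correct attempts out of 60 available solutions would thus record a performance of 50.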
CHAPTER IV
RESULTS
This chapter addresses the fourth research objective of this study. It
introduces the experimental setup, the evaluation of the OOsg, and the results
of that evaluation. The two primary analyses performed during the study were:
1. The experimental analysis to determine the difference in student
performance and normalised learning gain with and without the intervention
of the serious game prototype, together with the effect size and the effect of
perceived motivation, through use of the developed serious game prototype,
on the students' learning outcomes.
2. The evaluation of the serious game prototype and the traditional teaching
method for perceived motivation, perceived feedback, game experience, and
system usability by the students of the control and/or experimental groups.
1. EXPERIMENTAL DESIGN
The experimental study was conducted at three local universities of
Pakistan: Sindh Agriculture University, Tandojam; University of Sindh,
Mirpurkhas Campus; and Indus University, Karachi. A total of eighty-three
students participated in the experimental study, all enrolled in various
computer science-related degree programs, such as Bachelor of Science
(Information Technology), Bachelor of Science (Computer Science), and
Bachelor of Science (Software Engineering). Students participated in the
study voluntarily. A pre-test and a post-test were implemented to address the
research questions that guided this study. The control and experimental
grouping was done according to pre-test scores to ensure that the students
were grouped according to their OOP performance.
After grouping the participants, the intervention session began with
introducing the environment to the control and experimental groups. The
control group was taught with the traditional teaching method to grasp the
OOP concepts in a classroom setting, whereas the experimental group was
presented with the OOsg to interact with and grasp the basic concepts of
OOP. After the intervention sessions, the post-test session started. In the
post-test session, the participants were presented with an OOP scenario and
questions that needed to be solved in the given time. The scenarios for the
pre- and post-test sessions were different; however, the questions to be
solved were kept the same to ensure the accuracy of the responses. The
sequence of the steps carried out in conducting the experimental study is
shown in Figure IV-1.
Figure IV-1 Sequence of steps carried out for conducting the experimental study
1.1 Experiment #1: Pre-Test Session
At the beginning of the pre-test, students were required to fill in their
personal information, institution information, computer programming
background, and computer game background. The purpose of this information
was to identify each participant and gather information about their experience
with programming languages and with playing computer games. Students
were guaranteed the confidentiality of their personal information. After the
personal information session, the pre-test learning activity session began, in
which all eighty-three participants were asked to read the learning scenario
and answer the questions related to OOP based on the given learning
scenario.
1.1.1 Personal information
The personal information section includes the student ID, which is not
necessarily a student's genuine ID but can be any number chosen by the
student and was also used in the post-test session, along with the gender and
age range of the students. The information about gender reveals that among
the 83 participants, 39 (47%) were male, whereas 44 (53%) were female
students; the distribution is shown in Figure IV-2. Fifty-nine (71%) students
belonged to the age group of 15-18 years, whereas 24 (29%) students' ages
ranged from 19 to 24 years, as shown in Figure IV-3.
1.1.2 Institutional information
The institutional information aims to investigate a) the
institute/university the students belong to, b) the program in which they are
enrolled, and c) the semester/term in which they are currently studying. The
information about the institute/university is shown in Figure IV-4, which shows
that most of the participants, i.e., 48 (58%), were from the Information
Technology Centre, Sindh Agriculture University, Tandojam, followed by the
participants from the University of Sindh, Mirpurkhas campus, and Indus
University, Karachi, at 21 (25%) and 14 (17%) students respectively. The
information about program enrolment, shown in Figure IV-5, shows that 69
(83%) participants were enrolled in the BS(IT)/BS(ITC) program, followed by
BS(SE) and BS(CS) with 10 (12%) and 4 (5%) participants respectively. All
the participants were studying in the second year of their degree program.
Figure IV-2 Gender of the participants
Figure IV-3 Age Group of the participants
Figure IV-4 Institute/University of the participants
Figure IV-5 Program enrolment of the participants
1.1.3 Background in computer programming
In this section, information about the participants' computer programming
experience, if any, was collected. It asks a) whether participants have any
earlier computer programming experience and, b) if yes, in which computer
programming language(s). The answers to these questions help identify any
potential threats to the validity of findings on the pre- and post-test
scores. The responses show that 61 (74%) participants had no prior
programming experience, whereas 22 (26%) had previous programming
experience, as shown in Figure IV-6. Among those 22 (26%), 20 participants
had experience in languages such as C, C++, or C#, 1 participant had
experience in web programming, and 2 had experience in other programming
languages. The details of the responses for the background in computer
programming languages are shown in Figure IV-7.
1.1.4 Background in computer gaming
This section collects information about the participants' computer gaming
experience, if any, and their perception of learning computer programming or
any other subject by playing games. The participants were first asked
a) whether they have any gaming experience and, if yes, b) which kinds of
games they play and c) how often they play. They were then asked about
d) their perception of whether playing educational games has a significant
impact on learning, if they have played any, and e) their perception of the
impact of educational games designed specifically for learning computer
programming, if they have played any. These questions help identify any
potential threats to the validity of findings on the experimental group's
post-test scores. The responses show that 37 (45%) participants had no prior
gaming experience, whereas 46 (55%) had previous gaming experience, as shown
in Figure IV-8. Among those 46 (55%) participants, only 12 (14%) had prior
experience playing educational games,
Figure IV-6 Prior experience in programming languages
Figure IV-7 Experience in types of programming language
whereas all the other participants had experience with other kinds of games.
The details of the responses are shown in Figure IV-9.
The frequency of play indicates that most of the participants, 33 (40%),
play games during their leisure time, followed by those who play once a week
or daily; the details of the responses are shown in Figure IV-10. Among the
12 participants with prior experience of playing educational games, 9 (11%)
perceive that educational games significantly impact their learning, as
shown in Figure IV-11. Among those same 12 participants, 3 (4%) think that
educational games explicitly designed for learning computer programming have
a significant effect on their learning; it may be assumed that the other
participants had not experienced educational games specifically designed for
learning computer programming. The details are shown in Figure IV-12.
Figure IV-8 Prior experience of playing games
Figure IV-9 Types of games played
Figure IV-10 Frequency of playing games by the participants
Figure IV-11 Perceived impact of educational games by the
participants
Figure IV-12 Perceived impact of educational games on participants' learning computer programming
1.1.5 Pre-Test learning activity session
In the pre-test learning activity session, all eighty-three participants
were asked to read a learning scenario related to OOP and answer the given
questions based on that scenario (sample pre-test scenarios and questions
are available in Appendix-G). The students were given two hours to answer
all the questions. Before the start of the pre-test, the researcher briefly
described the purpose of the study; however, students were not notified of
the existence of the experimental and control groups. The analysis of the
results of the pre-test session is discussed in section 2.6 of this chapter.
1.2 Division of the Control and Experimental Groups
The students were divided into two groups: the first group includes students
who secured more than 40% marks in the pre-test learning activity session,
and the other group consists of students who secured less than 40% marks.
The final grouping ensures that the sample sizes for the more-than-40% and
less-than-40% pre-test scores are almost the same in the experimental and
control groups. A total of eighty-three students participated in the
experimental study; forty-one students were assigned to the control group,
whereas forty-two students were assigned to the experimental group. This
division is shown in Figure IV-13.
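The stratified split described above can be sketched in code. This is an illustrative reconstruction under stated assumptions, not the study's actual procedure: the function name `split_groups`, the input layout (a mapping of student ID to pre-test percentage), and the random shuffling are assumptions of the sketch.

```python
import random

def split_groups(scores, threshold=40, seed=1):
    """Divide students into control and experimental groups so that the
    numbers of above- and below-threshold pre-test scorers are balanced.

    `scores` maps a student ID to a pre-test percentage. All names and
    the shuffling step are illustrative assumptions, not taken from the
    thesis materials.
    """
    rng = random.Random(seed)
    high = [s for s in scores if scores[s] > threshold]
    low = [s for s in scores if scores[s] <= threshold]
    control, experimental = [], []
    for stratum in (high, low):
        rng.shuffle(stratum)
        half = len(stratum) // 2          # control gets the smaller half
        control.extend(stratum[:half])
        experimental.extend(stratum[half:])
    return control, experimental
```

With an odd total, the experimental group receives the extra student, which matches the 41/42 division reported above.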
Figure IV-13 Grouping of students for experimental study
(From Figure IV-13: in the control group, 33 participants scored less than
40% and 8 scored more than 40% in the pre-test; in the experimental group,
33 scored less than 40% and 9 scored more than 40%.)
1.3 Rubric for Student’s Assessment
Scoring strategies for rubrics involve using a scale for interpreting
judgments of a student's tasks. Rubrics are not only used for grading
students; they can also serve as part of a formative assessment of work in
progress. The scoring system is based on a rubric that measures students'
achievement in mastering OOP skills through the competencies defined in
Chapter II. The main competencies assessed in the pre- and post-test
scenarios in this research study include class identification, structuring a
class, object creation and deletion, and establishing relationships between
classes. The scoring scheme uses a 4-point scale. Some of the competencies
have sub-competencies as well, giving 11 competencies or sub-competencies in
total. The maximum score for each completely correct and accurate answer is
4 points, and the minimum score is 0 for no attempt or a wrong response.
Thus, the maximum score a participant could gain is 44 points. The sample of
the rubric used in the experimental study is available in Appendix-H.
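The rubric arithmetic (11 competencies or sub-competencies, each scored 0-4, for a maximum of 44 points) can be expressed as a small helper; the function name `rubric_total` is an illustrative assumption, not part of the study's materials.

```python
def rubric_total(item_scores, n_items=11, max_points=4):
    """Total a rubric of 11 competencies/sub-competencies, each scored
    0-4, giving a maximum possible score of 11 * 4 = 44 points."""
    if len(item_scores) != n_items:
        raise ValueError(f"expected {n_items} item scores")
    if any(not (0 <= s <= max_points) for s in item_scores):
        raise ValueError(f"each item is scored 0-{max_points}")
    return sum(item_scores)
```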
1.4 Teaching and Practice Session
The teaching and practice session started with an introduction session, in
which the domain area of the scenarios to be solved and the procedure for
writing the solutions were introduced to the experimental and control
groups. This session lasted two and a half hours: one hour for the lecture,
one hour for exercises, and 30 minutes for a question-and-answer session.
The introduction session was delivered in two sittings, each with twenty to
twenty-two participants.
After the introduction session, the intervention session began, in which the
participants learned the essential features of OOP. The intervention session
is the practice session in which the control group students learned and
practised the OOP concepts in a traditional classroom environment, whereas
the experimental group students were introduced to the OOsg and learned and
practised by interacting with it. The researcher served as teacher and
facilitator for both groups in the different sessions. The intervention
session ran for four days spread over four weeks, with two hours per week.
At the end of the intervention session, a post-test session was conducted
to assess the students' learning. The post-test session lasted 2 hours for
the control and experimental groups separately.
After the completion of the post-test session, the evaluation session
started, in which survey forms were given to the students of both groups
separately. The evaluation session was intended to obtain evidence about the
impact of the OOsg or the traditional teaching method. No time limit was set
for the evaluation session. The details of all these sessions are presented
in Table IV-1.
Table IV-1: Details about the teaching and practice session with control and experimental Groups
Introduction Session
  Control Group — Purpose: familiarize students with the domain area of the
  scenarios to be solved and how to write solutions. Activities:
  1. Lectures (1 hour), 2. Exercises (1 hour), 3. Questions & Answers
  (30 minutes). Days: 1. Participants per session: 20-21.
  Experimental Group — Purpose: familiarize students with the domain area of
  the scenarios to be solved using the OOsg. Activities and duration: as
  above. Days: 1. Participants per session: 20-22.

Intervention Session
  Control Group — Purpose: to learn about the basic features of OOP.
  Activities: 1. Lectures. Duration: 2 hours/week. Days: 4. Participants per
  session: 20-21.
  Experimental Group — Purpose: as above. Activities: 2. Practice with the
  OOsg. Duration: 2 hours/week. Days: 4. Participants per session: 20-22.

Post-test Session
  Control Group — Purpose: to assess the learning of students. Activities:
  class test. Duration: 2 hours. Days: 1. Participants per session: 20-21.
  Experimental Group — Purpose and activities: as above. Participants per
  session: 20-22.

Evaluation Session
  Control and Experimental Groups — Purpose: to obtain sound evidence about
  the impact of the OOsg or the traditional teaching method. Activities:
  survey forms. Duration: no time limit. Days: 1. Participants: 41-42.
1.5 Experiment#2: Post-Test Session
For the post-test session, the students were required to fill in their
personal and institutional information. The purpose is to identify each
participant and match the results with their pre-test scores; the
participants were therefore requested to keep the same ID number they used
in the pre-test session. Students were guaranteed the confidentiality of
their personal information. The two primary analyses performed during the
post-test session for the control and experimental groups were:
1. The experimental analysis of the difference in students' performance
and normalized learning gain with or without the intervention of the serious
game prototype, and the effect of perceived motivation and the effect size
of the serious game prototype on the students' learning outcomes.
2. The evaluation of the serious game prototype and the traditional
teaching method for perceived motivation, perceived feedback, game
experience, and system usability by the students of the control and/or
experimental groups.
A learning activity followed by a question-response session was conducted
to investigate the students' achievement before and after, with or without,
the serious game prototype's intervention. The evaluation from both groups
was conducted to analyse the impact of the OOsg or the traditional teaching
method. The details of these sessions are discussed in the following
sections.
1.5.1 Post-Test learning activity session
In the post-test learning activity session, the control and experimental
students were provided with a learning scenario related to OOP and asked to
answer the given questions based on that scenario (sample post-test
scenarios and questions are available in Appendix-I). The students were
given two hours to answer all the questions. The analysis of the control and
experimental groups' responses is discussed in section 2.6 of this chapter.
1.5.2 Parameters of the evaluation study
The second part of the experiment concerns the evaluation of the developed
serious game prototype and the traditional teaching method by the students
of the control and/or experimental groups. Four evaluation parameters were
used: two of them, namely perceived motivation and perceived feedback, were
evaluated by both groups of students, whereas the other two, i.e., game
experience and system usability, were evaluated by the experimental group
students only. The details of these parameters are discussed in the
following sections.
1.5.2.1 Perceived motivation
According to Keller (Keller, 2010), four components affect motivation in
the learning process: Attention, Relevance, Confidence, and Satisfaction
(ARCS). All these components contribute to and sustain motivation throughout
learning, and the model also helps in diagnosing motivational problems. To
measure perceived motivation, the IMMS (Instructional Materials Motivation
Survey) instrument is used in this research study. The IMMS survey includes
36 items and four subscales: attention (12 items), relevance (9 items),
confidence (9 items), and satisfaction (6 items). There are ten reverse
items in the IMMS instrument; for these, the motivational score becomes
higher when the learner gives a lower score. The reverse-coded items are
scored by recoding the variables in SPSS software. Each item is presented to
participants on a 5-point scale, ranging from 1 for “Strongly disagree” to 5
for “Strongly Agree.” The items of the questionnaire are available in
Appendix-J.
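The recoding of reverse items described above (performed in SPSS in the study) can be illustrated in plain Python. The item positions passed in `reverse_items` are hypothetical, not the official IMMS numbering.

```python
def recode_reverse_items(responses, reverse_items, scale_max=5):
    """Recode reverse-tone Likert items on a 1..scale_max scale.

    A response r to a reverse item becomes (scale_max + 1 - r), so
    'strongly disagree' (1) with a negatively worded statement yields the
    highest motivation score (5). `reverse_items` holds 1-based item
    positions; these indices are illustrative assumptions.
    """
    return [
        (scale_max + 1 - r) if i in reverse_items else r
        for i, r in enumerate(responses, start=1)
    ]
```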
1.5.2.2 Perceived feedback
Feedback is a necessary condition for student goal setting (Erez, 1977).
Feedback is defined as information provided to students about their
performance (Rowe and Wood, 2008). It includes written comments on
assignments, verbal responses supplied in class or individually, postings on
WebCT (the online student learning system), and peer- and self-evaluation
forms of feedback. Feedback is an essential component, and it has the most
significant impact on student learning through formative assessments (Black
and Wiliam, 1998). Feedback identifies gaps between students' performance
levels and expectations (Shute, et al., 2008), gaps which hinder learning
(Alton-Lee, 2003). Perceived feedback concerns the importance of the
feedback students received while learning. Students were asked whether they
found the feedback on their tasks valuable or whether it only helps them
when they receive low grades, how they perceive positive or negative
feedback, and whether detailed feedback is encouraging or annoying to them.
The questionnaire for the perceived feedback includes the value of the
feedback in students’ points of view and their perception of the feedback
received while playing the OOsg. The questionnaire consists of 17 items; none
of them was scored inversely. Each item is presented to participants on a
5-point scale, ranging from 1 for “Strongly disagree” to 5 for “Strongly Agree.”
The items of the questionnaire are available in Appendix-K.
1.5.2.3 Game experience
The game experience is about what the game is and what it affects: the game
itself, its presentation, its style, the community's interactions, and the
nature of the community itself. Most of the game experience questionnaire
items concern the gameplay experience and are used for observational
purposes, or as supporting data to find correlations with students'
post-test scores in this research study. The questionnaire includes both
closed-ended and open-ended questions; each closed-ended item is presented
to participants on a 5-point scale, ranging from 1 for “Strongly disagree”
to 5 for “Strongly Agree,” while open-ended questions and a comments section
are provided for suggestions and for learning the students' opinions of the
gameplay experience. The items of the questionnaire are available in
Appendix-L. The reliability test result for the game experience
questionnaire shows a Cronbach's alpha value of .766.
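The Cronbach's alpha reliability reported above can be computed as in the following plain-Python sketch. The data layout, one list of scores per questionnaire item aligned across respondents, is an assumption of the example.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha: (k / (k-1)) * (1 - sum(item variances) / variance
    of respondents' total scores). `items` is a list of columns, one per
    item, each holding one score per respondent (illustrative layout)."""
    k = len(items)
    item_vars = sum(statistics.variance(col) for col in items)
    totals = [sum(vals) for vals in zip(*items)]   # per-respondent totals
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Perfectly consistent items yield an alpha of 1.0; the .766 reported above indicates acceptable internal consistency.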
1.5.2.4 System usability
Usability is a measure of the general quality of the appropriateness of a
system in a particular context (Brooke, 1996). Usability refers to assessing
the quality attributes that make a user interface easy to use (Jokela, et
al., 2003). Brooke (Brooke, 1996) suggests that usability measures cover the
effectiveness, efficiency, and satisfaction of a user's use of a system in a
specific target environment. The measurement of usability is complex because
usability is not a specific property of a person or thing; it cannot be
measured with a simple device such as a “usability” thermometer (Dumas,
2003; Hertzum, 2010; Hornbæk, 2006). Rather, it is an emergent property
dependent on interactions among users, products, tasks, and environments.
Many widely used standardized usability questionnaires are available for
assessing the perception of usability, such as the Questionnaire for User
Interaction Satisfaction (QUIS) (Chin et al., 1988), the Software Usability
Measurement Inventory (SUMI) (Kirakowski & Corbett, 1993; McSweeney, 1992),
the Post-Study System Usability Questionnaire (PSSUQ) and its non-lab
variant, the Computer Systems Usability Questionnaire (CSUQ) (Lewis, 1990,
1992, 1995, 2002), and the SUS (System Usability Scale) (Brooke, 1996;
Sauro, 2011).
Among the many metrics available for usability, QUIS and SUMI are not freely
available questionnaires, whereas SUS is the most widely used and freely
available standardized usability questionnaire (Sauro & Lewis, 2009);
however, SUS has a mixed tone, being composed of alternating positive- and
negative-tone items. On the other hand, the PSSUQ and CSUQ have a
consistently positive item tone. This research prefers the PSSUQ because
mixed-tone questionnaires create an undesirable structure in a metric, in
which positive items align with one factor and negative items align with the
other (Barnette, 2000; Davis, 1989; Pilotte & Gable, 1990; Schmitt & Stuits,
1985; Schriesheim & Hill, 1981; Stewart & Frye, 2004; Wong, Rindfleisch, &
Burroughs, 2003). Other issues with mixed-tone metrics include various types
of errors, such as misinterpretation (users might respond to items forced
into a negative tone in a way that is not the simple negative of the
positive version of the item), mistakes (users might not intend to respond
differently to mixed-tone items but might forget to reverse their score,
accidentally agreeing with a negative item when they meant to disagree), and
miscoding (to generate a composite overall score from mixed-tone items, it
is necessary to reverse the scoring of the negative-tone items before
combining them with the positive-tone items; failure to perform this step
results in incorrect composite values, with errors that are not necessarily
easy to detect) (Lewis, 2014). A further limitation of the SUS scale is that
respondents are expected to record their immediate response to each item,
rather than thinking about items for a long time (Brooke, 1996).
The PSSUQ is a 16-item questionnaire that measures users' perceived
satisfaction with a product or system. An overall satisfaction score is
obtained by averaging the three sub-scales of System Quality (items 1-6),
Information Quality (items 7-12), and Interface Quality (items 13-16). Each
item is presented to participants on a 5-point scale, ranging from 1 for
“Strongly disagree” to 5 for “Strongly Agree.” The items of the
questionnaire are available in Appendix-M.
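The PSSUQ scoring described above can be sketched as follows. This is an illustrative reading of the text: the item ranges follow the sub-scale boundaries given (items 1-6, 7-12, 13-16), and computing the overall score as the mean across all 16 items is an assumption of the sketch.

```python
def pssuq_scores(responses):
    """Average the PSSUQ sub-scales: System Quality (items 1-6),
    Information Quality (items 7-12), Interface Quality (items 13-16).
    `responses` is a list of 16 ratings on the 5-point scale used here;
    the overall score as the mean of all items is an assumption."""
    if len(responses) != 16:
        raise ValueError("PSSUQ has 16 items")
    def mean(seq):
        return sum(seq) / len(seq)
    return {
        "SysQual": mean(responses[0:6]),
        "InfoQual": mean(responses[6:12]),
        "IntQual": mean(responses[12:16]),
        "Overall": mean(responses),
    }
```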
2. RESULTS AND ANALYSIS
For the experimental research conducted in this study, two types of analyses
were performed. The primary analysis used diverse statistical methods to
answer the research questions that guided this study, testing whether there
is any significant improvement in the students in the experimental group
compared with the students in the control group. Four different studies were
performed for this type of analysis, and the test data used were the
pre-test and post-test scores and the normalized learning gain. The raw
pre-test and post-test scores and scatter plots for both the experimental
and control groups are shown in Appendix-N. The second type of evaluation
was conducted on the control and/or experimental students to evaluate the
serious game prototype and the traditional teaching method and to understand
the impact of these two methods. The results of the experimental analysis
and evaluation are discussed in the following sections.
2.1 Result of Experimental Analysis
In this analysis, four studies were performed to achieve objective-4 of this
research study. In this experimental evaluation, the pre-test scores of both
the experimental and control groups are compared with the post-test scores.
The main aim is to determine whether the difference between the means of the
two sets of scores (pre-test and post-test) is significant. The details of
the studies are discussed in the following sections.
2.1.1 Study-1: The difference in students’ performance for learning OOP with and without the intervention of prototype
In this analysis, the pre-test scores of both the experimental and control
groups are compared with the post-test scores. The main aim is to determine
whether the difference between means for the two sets of scores (pre-test and
post-test) is the same or different.
Before performing any statistical test, the normality of each group was
tested. The results revealed that the normality assumption of the t-test is
met for both the control and experimental groups (the results of the
normality tests are provided in Appendix-O(a)). Therefore, the paired t-test
was performed for each group separately.
The paired t-test results for the control and experimental groups are given
in Table IV-2. For the control group, t(40)=0.933, p>0.05, which indicates
that there is no significant difference between the average pre-test score
and the average post-test score. For the experimental group, t(41)=14.11,
p<0.05, indicating a significant difference between the average pre-test and
post-test scores. Appendix-O(b) provides the complete SPSS output for the
paired t-test results on the pre- and post-test scores of the control and
the experimental group.
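The paired t statistic reported in Table IV-2 (computed with SPSS in the study) has the standard form t = d̄ / (s_d / √n) with df = n − 1, where d̄ and s_d are the mean and standard deviation of the per-student score differences. A minimal sketch:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic on matched pre/post scores:
    t = mean(diffs) / (stdev(diffs) / sqrt(n)), df = n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1
```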
Table IV-2: The difference in students’ performance for learning OOP with and without the intervention of prototype
Test-Data     Mean    Statistical test   t-value   df   Sig. (2-tailed)
Control Group
  Post-Test   14.341  Paired t-test      .933      40   .356
  Pre-Test    13.829
Experimental Group
  Post-Test   33.285  Paired t-test      14.11     41   .000
  Pre-Test    15.190
Therefore, it can be concluded that there is a significant difference in
performance for learning OOP with and without the intervention of the OOsg;
the participants in the experimental group performed better in learning OOP
than the participants in the control group.
2.1.2 Study-2: The difference in students’ normalized learning gain for learning OOP with and without the intervention of prototype
The normalized learning gain is a rough measure of the prototype's
effectiveness in promoting conceptual understanding of the subject: the
amount students learned divided by the amount they could have learned
(Hake, 1997). The formula for calculating the normalized gain, proposed by
Hake (1997), is given below:

Normalized Learning Gain = (Post-test Score − Pre-test Score) / (100 − Pre-test Score) ……… (1)
For example, if a student scored 70 in the post-test and 30 in the pre-test,
then the normalized learning gain is (70 − 30) / (100 − 30) = 40 / 70 = 0.57.
Thus, the student gained 0.57 (or 57%) of the possible gain from the
pre-test to the post-test score.
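Formula (1) can be written as a one-line function; the `max_score` parameter is an assumption of the sketch, allowing scores on scales other than percentages.

```python
def normalized_gain(pre, post, max_score=100):
    """Hake's normalized learning gain, formula (1): the fraction of the
    available improvement (max_score - pre) that was actually achieved."""
    return (post - pre) / (max_score - pre)
```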
For this analysis, the normalized gain for each participant in the control
and experimental groups was first calculated using formula (1). The average
normalized learning gain shown in Figure IV-14 indicates a gain of 0.01 (1%)
for the control group, whereas the experimental group shows a gain of 0.21
(21%).
Figure IV-14: Average normalized learning gain for control and experimental groups
To perform the statistical test for the difference in students' normalized
learning gain with and without the intervention of the serious game
prototype, the normality of each group was first tested. The results
revealed that the normality assumption of the t-test is met for both the
control and experimental groups (the details of the normality tests are
provided in Appendix-P(a)).
The paired t-test result in Table IV-3 reveals that t(40)=12.499, p<0.05,
which indicates a significant difference between the average normalized
learning gains of the control and experimental groups. Appendix-P(b)
provides the complete SPSS output for the analysis of the control and
experimental groups' normalized learning gain.
Table IV-3: The difference in students’ normalized learning gain for learning OOP with and without the intervention of prototype
Test-Data                                       Mean    Statistical test   t-value   df   Sig. (2-tailed)
Experimental Group - Normalized Learning Gain   0.2105  Paired t-test      12.499    40   .000
Control Group - Normalized Learning Gain        0.0056
Therefore, it can be concluded that the normalized learning gain of the
participants in the experimental group is significantly higher than that of
the participants in the control group.
2.1.3 Study-3: The effect size of the serious game prototype on learning outcomes
The effect size (ES) is the magnitude of the difference between groups. The
absolute ES is the difference between the averages, or means, of two
different intervention groups (Cohen, 1988). The ES is a main finding of
quantitative research; the statistical p-value only indicates whether there
is a difference, but does not reveal the size of the effect (Sullivan and
Feinn, 2012). The data used for this analysis are the pre- and post-test
scores of both the control and experimental groups. The effect size is
calculated using Cohen's d, found by the following formula:
d = (<Post-test Score> − <Pre-test Score>) / SD_pooled ……… (2)

where <Post-test Score> is the average of the post-test scores, <Pre-test
Score> is the average of the pre-test scores, and SD_pooled is the pooled
standard deviation, calculated as given in formula (3):

SD_pooled = √((SD1² + SD2²) / 2) ……… (3)
where SD1 is the standard deviation of the pre-test scores and SD2 is the
standard deviation of the post-test scores of the control or experimental
group. Cohen suggested that effect sizes of 0.2-0.3 are small, 0.5 is
medium, and ≥0.8 is a large effect size.
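Formulas (2) and (3) can be combined into a short sketch; the function name `cohens_d` is an illustrative assumption.

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(pre_scores, post_scores):
    """Cohen's d per formulas (2) and (3): mean difference divided by the
    pooled standard deviation sqrt((SD1^2 + SD2^2) / 2)."""
    sd_pooled = sqrt((stdev(pre_scores) ** 2 + stdev(post_scores) ** 2) / 2)
    return (mean(post_scores) - mean(pre_scores)) / sd_pooled
```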
The pre- and post-test raw scores used for calculating the effect sizes for
both the control and experimental groups are available in Appendix-N. The
effect sizes for the control group, the experimental group, and the
post-test scores of both groups are presented in Table IV-4. The effect size
for the control group is 0.14, which, according to Cohen's suggestion, is
considered a small effect size, whereas the effect size for the experimental
group is 3.40, which is regarded as a large effect size; similarly, the
effect size for the post-test scores of both groups is 3.43, which is also
considered large.
Table IV-4: The effect size of the OOsg on the learning outcomes
Test-Data       Mean    SD_pooled   Values                   Effect Size
Control Group
  Post-Test     14.341  3.68        (14.341-13.829)/3.68     0.14
  Pre-Test      13.829
Experimental Group
  Post-Test     33.285  5.32        (33.285-15.190)/5.32     3.40
  Pre-Test      15.190
Control and Experimental Group
  Post-Test Experimental Group   33.285  5.52  (33.285-14.341)/5.52  3.43
  Post-Test Control Group        14.341
Therefore, it can be concluded that the effect size for the control group,
where the traditional teaching method was used, shows only a small effect of
the intervention. In contrast, the effect sizes for the experimental group
and for the post-test scores of the two groups show a much larger impact,
indicating a more effective intervention method.
2.1.4 Study-4: The effect of perceived motivation on the learning outcomes of the students
One of the research problems addressed in this research was to find whether
perceived motivation affects the learning outcomes of the students. The data
used in this study include the perceived motivation values, collected using
the IMMS scale for both the control and experimental groups after the
post-test sessions, and, as the measure of learning outcomes, the normalized
learning gain used in Study-2. The Pearson product-moment correlation
coefficient was calculated for both the control and the experimental groups.
The main aim of this analysis is to find the effect of perceived motivation
on the learning outcomes of the students.
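The Pearson product-moment correlation used here (computed with SPSS in the study) can be sketched in plain Python:

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired samples,
    e.g. perceived-motivation scores and normalized learning gains:
    covariance divided by the product of the standard deviations."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```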
The Pearson correlation for the control group students is shown as a
scatterplot in Figure IV-15 and in Table IV-5. The results reveal r=0.046,
which shows no effect of perceived motivation on the learning outcomes in
the control group (r=.046, n=41, p>0.05).
Figure IV-15 Correlation analysis for perceived motivation and learning outcomes (control group)
Table IV-5 Pearson correlation showing the effect of perceived motivation on the learning outcomes (control group)
                             Perceived Motivation   Normalized Learning Gain
Perceived Motivation
  Pearson Correlation        1                      .046
  Sig. (2-tailed)                                   .773
  N                          41                     41
Normalized Learning Gain
  Pearson Correlation        .046                   1
  Sig. (2-tailed)            .773
  N                          41                     41
The Pearson correlation result for the experimental group students is shown
as a scatterplot in Figure IV-16 and in Table IV-6. The results reveal
r=0.530, which shows a significant effect of perceived motivation on the
experimental group's learning outcomes (r=.530, n=42, p<0.05).
Figure IV-16 Correlation analysis for perceived motivation and learning outcomes (experimental group)
Table IV-6 Pearson correlation showing the effect of perceived motivation on the learning outcomes (experimental group)
                             Perceived Motivation   Normalized Learning Gain
Perceived Motivation
  Pearson Correlation        1                      .530**
  Sig. (2-tailed)                                   .000
  N                          42                     42
Normalized Learning Gain
  Pearson Correlation        .530**                 1
  Sig. (2-tailed)            .000
  N                          42                     42

** Correlation is significant at the 0.01 level (2-tailed)
Therefore, it can be concluded that a positive and moderately strong
correlation (r=.530, n=42, p<0.05) exists between perceived motivation and
learning outcomes for the experimental group, whereas there was no
significant correlation between the two variables for the control group
students.
2.2 Results of Evaluation
For the evaluation, the parameters discussed in sections 2.6.2.1 to 2.6.2.4
were used for the analysis. Two of them, namely perceived motivation and
perceived feedback, were evaluated by both groups of students, whereas the
other two, i.e., game experience and system usability, were evaluated by the
students of the experimental group only. Descriptive statistics were used
for each of these parameters; the parameters used in the evaluation are
shown in Figure IV-17. The details of each parameter's evaluation are
described in the following sections.
Figure IV-17: Evaluation framework used in the study
2.2.1 Results of perceived motivation
Learners' perception of their motivation refers to how they perceived their
motivation while playing the OOsg. Although players' motivation cannot be
measured directly by any scale, their perceived motivation can be estimated.
Students of both groups, i.e., the control and experimental group students,
evaluated the learners' motivation. The quantitative data were generated
using the IMMS (Instructional Materials Motivation Survey) scale, whose four
subcategories are attention, relevance, confidence, and satisfaction. The
Cronbach's alpha reliability of the overall scale was 0.92, whereas the
alpha reliability of the attention subscale was 0.862, relevance 0.815,
confidence 0.928, and satisfaction 0.871. The reliability results are also
shown in Figure IV-18. The quantitative analysis of these subcategories is
as follows:
Figure IV-18 Reliability measure for IMMS
In designing the serious game model, we first sought to understand whether a
single game attribute leads to learning or enhanced perceived motivation, or
whether a combination of multiple attributes within a game has a more
substantial effect, and which game elements mainly help to produce which
learning outcomes, together with the details of Gagné's instructional
events, the OOsg activities, the game attributes, and the type of
motivational aspect each is supposed to achieve.
The attention subcategory is about acquiring and continuously maintaining
focus on the learning environment (Green and Sulbaran, 2006). It is a measure
of how aware students are of the instructional design materials used. The game
attributes control and sensory stimuli, used in OOsg activities such as
welcoming the player by their chosen name, playing background music,
presenting the game objectives, outcomes, and about information, or providing
the warm-up scenarios, help achieve the attention subcategory.
In the case of the experimental group students, the goal was to determine how
the OOsg attracted students' attention, which contributed to students'
perceptions of their overall motivation during the learning or playing
process. Attention is gained in the developed game by applying three actions
(Gagne and Driscoll, 1998): varying the appearance or sound of the
instructional materials, using concrete examples, and introducing novelty and
incongruity.
The attention subcategory includes twelve items, five of which are reverse
scoring items. The average score and standard deviation of this subcategory
are given in Table IV-7. The attention category had an average score of 2.76
and a standard deviation of 1.21 for the control group students, whereas the
experimental group students had an average score of 3.87 and a standard
deviation of 1.22. It can be concluded that students believe the OOsg helped
to gain and maintain their attention during play more than the traditional way
of teaching did.
Table IV-7: Average and standard deviation for the IMMS subcategory attention
Item  Description  Control Group (x̄, s)  Experimental Group (x̄, s)
M1 Something was interesting at the beginning of this course\game that got my attention.
2.54 1.19 4.14 1.18
M2 These materials are eye-catching. 2.56 1.43 4.21 1.05
M3 The quality of the writing helped to hold my attention.
2.17 0.92 4.00 1.17
M4 This course\game is so abstract that it was hard to keep my attention. (Reverse)
2.44 0.90 4.12 1.02
M5 The pages of this course\levels, of course, look dry and unappealing. (Reverse)
2.71 1.19 3.67 1.46
M6 The way the information is arranged on the pages\level helped keep my attention.
2.44 1.23 3.67 1.41
M7 This course\game has things that stimulated my curiosity.
3.29 1.29 3.79 1.30
M8 The amount of repetition in this course\game caused me to get bored sometimes. (Reverse)
3.22 1.37 3.88 1.23
M9 I learned some things that were surprising or unexpected.
2.78 1.33 3.57 1.27
M10 The variety of reading passages, exercises, illustrations, etc., \levels helped keep my attention on the course\game.
2.66 1.09 3.64 1.39
M11 The style of writing is boring. (Reverse) 2.51 1.31 4.31 0.98
M12 There are so many words on each page\level that it is irritating. (Reverse)
3.80 1.31 3.43 1.13
Average 2.76 1.21 3.87 1.22
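Reverse-scored items such as M4 and M11 are recoded before the averages above are taken; on a 5-point scale a response r becomes 6 - r. A small illustrative sketch (the response list is hypothetical):

```python
import statistics

def reverse_score(response, max_point=5):
    # recode a reverse-scoring item on a 1..max_point Likert scale
    return (max_point + 1) - response

raw = [2, 4, 5, 1, 3]                      # hypothetical answers to one reverse item
recoded = [reverse_score(r) for r in raw]  # becomes [4, 2, 1, 5, 3]
mean = statistics.mean(recoded)
sd = statistics.stdev(recoded)             # sample standard deviation, as in Table IV-7
```

Recoding ensures that a high score consistently indicates high attention across both normal and reverse items before the subcategory mean is computed.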
Relevance aims at learning content that is not only accurate but also
consistent with the learning outcomes. The purpose of this category is to
determine whether students think that the instructional method, i.e., OOsg or
traditional teaching, is related to their existing knowledge, interests,
experience, and real life. It shows learners that their success is a direct
result of their efforts and can address personal needs and traits related to
relevance. Providing feedback to assess learners' efforts also helps increase
their sense of achievement. In the design of the OOsg, game attributes such as
rules, goals, fantasy, and sensory stimuli, used in activities such as
presenting the information regarding the rules and goals of each level,
increases and decreases in points/score, how to get hints, and win/lose
strategies, help develop relevance between the learning contents and the
learning outcomes.
The relevance subcategory is composed of nine items, one of which is a reverse
scoring item. The average scores and standard deviations for the relevance
category presented in Table IV-8 show that the average and standard deviation
for the control group were 2.69 and 1.25, respectively, whereas for the
experimental group students the average score was 3.66 and the standard
deviation 1.10. Students from the experimental group believe that OOsg is
often related to their existing knowledge, experience, and real life.
Table IV-8: Average and standard deviation for the IMMS subcategory relevance
Item  Description  Control Group (x̄, s)  Experimental Group (x̄, s)
M13 It is clear to me how the content of this material\game is related to things I already know.
2.56 1.21 3.93 1.16
M14 There were stories, pictures, or examples that showed me how this material could be important to some people.
3.07 1.23 3.55 1.19
M15 Completing this course\game successfully was important to me.
2.83 1.38 3.71 1.11
M16 The content of this material\levels of the game is relevant to my interests
2.56 1.18 3.86 0.93
M17 There are explanations or examples of how people use the knowledge in this course\game.
2.76 1.26 3.74 1.04
M18
The content and style of writing in this course\presentation of material in-game convey the impression that its content is worth knowing.
2.66 1.28 3.74 0.80
M19 This course\game was not relevant to my needs because I already knew most of it. (Reverse)
1.95 1.05 3.62 1.19
M20 I could relate the content of this course\game to things I have seen, done, or thought about in my own life.
2.98 1.44 3.45 1.33
M21 The content of this course\game will be useful to me.
2.80 1.25 3.36 1.16
Average 2.69 1.25 3.66 1.10
Confidence is about students' expectancy of success or failure in learning
(Bohlin, Milheim, and Viechnicki, 1990). Keller (Keller and Keller, 2010)
indicated that confidence is related to learners' feeling of personal control
and its influence on learning effort and performance. In the OOsg, the
attributes fantasy, rules/goals, and sensory stimuli, used in presenting the
game levels to be played, performing actions in the levels, and providing
feedback on the actions players take in the game, help develop confidence in
the learners.
Gagné (Gagne and Medsker, 1996) also suggests designing an environment in
which learners capable of reaching their potential are continually challenged.
This subcategory's goal was to determine how confident students felt about the
suitability of the instructional methods, which contributed to their overall
perceptions of their motivation. The learning situation should begin by
strengthening their confidence and then gradually change the level of
difficulty. The facilitator needs to establish a good rapport with the
learners and then raise their expectancy of success. Instructional strategies
expected to enhance learners' confidence include providing advance organizers
and clear learning objectives. Confidence can also develop through feedback
highlighting the relationship between learner effort and the results achieved.
The confidence scale consists of nine items, four of which are reverse scoring
items. The average scores and standard deviations for the confidence category
are presented in Table IV-9. The results show an average score of 2.59 and a
standard deviation of 1.16 for the control group students, whereas the
experimental group students had an average score of 3.45 and a standard
deviation of 1.22. The experimental group students think they can learn using
the OOsg, although some factors in the learning process remain difficult,
while the control group students are less confident in learning with the
traditional teaching method.
Table IV-9: Average and standard deviation for the IMMS subcategory confidence
Item  Description  Control Group (x̄, s)  Experimental Group (x̄, s)
M22 When I first looked at this course\game, I had the impression that it would be easy for me.
2.78 1.19 3.60 1.25
M23 This material was more difficult to understand than I would like for it to be. (Reverse)
2.24 0.97 3.36 1.19
M24 After reading the introductory information, I felt confident that I knew what I was supposed to learn from this course/game.
2.88 1.35 3.67 1.22
M25
Many of the pages had so much information that it was hard to pick out and remember the important points. (Reverse)
2.24 0.73 3.31 1.28
M26 As I worked on this course\game, I was confident that I could learn the content.
2.73 1.27 3.40 1.29
M27 The exercises in this course\game were too difficult. (Reverse)
2.27 1.27 3.50 1.06
M28 After working on this course\game for a while, I was confident that I would be able to pass a test on it.
2.56 1.25 3.33 1.20
M29 I could not understand quite a bit of the material in this course\game. (Reverse)
2.88 1.23 3.52 1.23
M30 The good organization of the content helped me be confident that I would learn this material.
2.71 1.21 3.38 1.25
Average 2.59 1.16 3.45 1.22
The last subcategory, satisfaction, is about accomplishment in learning.
Several factors can affect satisfaction, such as feedback. A comprehensive
feedback process, learning iterations, and experiences can support learner
self-confidence, maintaining the relationship between attention and the
learning activities. Establishing clear learning goals can avoid adverse
effects on learners; therefore, providing a clear and concise guide enables
learners to make a difference. The game attributes fantasy, sensory stimuli,
challenge, and mystery, provided in activities such as showing the game score,
information about correct and wrong attempts, and the remaining tasks, or
carrying previous information into the upcoming levels with increased
challenge or complexity, contribute to the learners' satisfaction.
The satisfaction scale consists of six items, none of which is reverse scored.
The average scores and standard deviations for the satisfaction subcategory
are presented in Table IV-10. The average score for the satisfaction category
was 2.83 with a standard deviation of 1.27 for the control group students,
whereas the average and standard deviation were 3.28 and 1.33, respectively,
for the students of the experimental group. Students agreed that, in general,
they were more satisfied with the OOsg than with the traditional teaching
method.
Table IV-10: Average and standard deviation for the IMMS subcategory satisfaction
Item  Description  Control Group (x̄, s)  Experimental Group (x̄, s)
M31 Completing the exercises\levels in this course gave me a satisfying feeling of accomplishment.
2.78 1.37 3.38 1.38
M32 I enjoyed this course\game so much that I would like to know more about this topic.
2.93 1.21 3.07 1.22
M33 I enjoyed studying this course/game 2.93 1.06 2.98 1.14
M34
The wording of feedback after the exercises, or of other comments in this course/game, helped me feel rewarded for my effort.
2.80 1.29 3.52 1.19
M35 I felt good to complete this course/game 2.80 1.36 3.07 1.63
M36 It was a pleasure to work on such a well-designed course/game.
2.76 1.34 3.64 1.43
Average 2.83 1.27 3.28 1.33
The comparison of the average scores for all the IMMS subcategories for both
the control and experimental groups is shown in Figure IV-19. The comparison
reveals that the experimental group students' motivation levels were positive
for all the subcategories of the IMMS.
Figure IV-19: Comparison between the control and experimental groups' perceived motivation levels
The overall mean score, frequency, standard deviation, and percentage of the
learners' perceived motivation for the experimental group are presented in
Table IV-11. For the attention subcategory, the highest percentage (47.61%) of
participants have mean scores between 4.01 and 5.00; likewise, the
satisfaction subcategory has its highest percentage (47.61%) for mean scores
between 3.1 and 4.0. The overall mean scores of the IMMS subcategories show
that participants scored highest on attention (3.87), followed by relevance
(3.66).
Table IV-11: Overall mean score, frequency, standard deviation, and percentage of the learners' perception of motivation towards learning OOP using OOsg

IMMS subcategory   1.00-2.00    2.1-3.00     3.1-4.00     4.1-5.00     Mean   SD
Attention          1 (2.38%)    4 (9.52%)    17 (40.47%)  20 (47.61%)  3.87   0.65
Relevance          2 (4.76%)    11 (26.19%)  13 (30.95%)  16 (38.10%)  3.66   0.75
Confidence         7 (16.66%)   6 (14.28%)   15 (35.71%)  14 (33.33%)  3.45   0.97
Satisfaction       8 (19.04%)   5 (11.90%)   20 (47.61%)  9 (21.42%)   3.28   1.07
(Each band column gives the frequency and percentage of participants whose mean score falls in that range.)
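The band frequencies in Table IV-11 can be obtained by binning each participant's subcategory mean. A sketch follows; the band edges mirror the table, while the sample means are hypothetical:

```python
def band_frequencies(means):
    """Count participants per mean-score band, as used in Table IV-11."""
    bands = {"1.00-2.00": 0, "2.1-3.00": 0, "3.1-4.00": 0, "4.1-5.00": 0}
    for m in means:
        if m <= 2.00:
            bands["1.00-2.00"] += 1
        elif m <= 3.00:
            bands["2.1-3.00"] += 1
        elif m <= 4.00:
            bands["3.1-4.00"] += 1
        else:
            bands["4.1-5.00"] += 1
    return bands

def band_percentages(means):
    # convert the counts to the percentages shown in parentheses in the table
    return {b: 100 * n / len(means) for b, n in band_frequencies(means).items()}

freq = band_frequencies([1.8, 2.5, 3.2, 3.9, 4.4, 4.6])   # hypothetical means
```

With 42 participants, a count of 20 in a band corresponds to the 47.61% reported for attention.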
2.2.2 Results of perceived feedback
Feedback is defined as providing students with information about their
performance, such as written reviews on assignments, oral responses given in
class or individually, and posts or comments from online learning systems;
feedback can also take the form of peer and self-assessment (Rowe and Wood,
2008). Feedback is probably the most essential component of formative
assessment and has the most significant impact on student learning (Black and
Wiliam, 1998). Effective feedback affects student progress and success.
Feedback identifies the gaps between students' performance levels and
expectations (Shute, et al., 2008) that hinder learning (Alton-Lee, 2003).
When students have a proper understanding of their progress, they can easily
determine their next steps (Education Review Office, 2012). This study aimed
to explore the value of feedback from the students' point of view and their
perception of the feedback received during the classroom sessions or while
playing the developed game.
The perceived feedback questionnaire includes 17 items, none of which is
reverse scored. A reliability test was conducted on the perceived feedback
items; the overall scale had a value of α = .769. To test for a difference in
students' opinions of the feedback they received through their particular
instructional method, normality was checked using the Shapiro-Wilk test
(p > 0.05) and visual inspection of the histograms (the details of the
normality tests are provided in Appendix-Q(a)). The results reveal that the
mean perceived feedback values for the control group are normally distributed,
whereas the values for the experimental group deviate from a normal
distribution; hence, the Wilcoxon signed-rank test was performed.
The result of the Wilcoxon test is presented in Table IV-12. The results show
that, at the 5% level of significance, z = -5.55, p < 0.05, which indicates a
significant difference between the control and experimental groups' perceived
feedback. Further, the mean value for the experimental group is higher than
that for the control group. Therefore, it can be concluded that the
experimental group participants reported better perceived feedback than the
control group participants. Appendix-Q(b) provides the complete SPSS output
for the analysis of the control and experimental groups' perceived feedback.
Table IV-12: The difference in students' perception of feedback received while learning OOP with and without the intervention of the prototype

Test-Data                                   Mean    Statistical test        z-value   Sig. (2-tailed)
Experimental Group - Perceived Feedback     3.657   Wilcoxon Signed Ranks   -5.55     .000
Control Group - Perceived Feedback          2.616
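The z statistic reported above comes from the normal approximation to the Wilcoxon signed-rank test. A stdlib-only sketch of that approximation follows; the paired sample values are hypothetical (the study used SPSS), and zero differences are simply dropped, with only the usual tie-averaging of ranks:

```python
import math

def wilcoxon_z(x, y):
    """Normal-approximation z for the Wilcoxon signed-rank test on paired data."""
    diffs = [a - b for a, b in zip(x, y) if a != b]    # drop zero differences
    n = len(diffs)
    ordered = sorted(abs(d) for d in diffs)
    # rank the absolute differences, averaging ranks over ties
    rank = {}
    i = 0
    while i < n:
        j = i
        while j < n and ordered[j] == ordered[i]:
            j += 1
        rank[ordered[i]] = (i + 1 + j) / 2             # mean of ranks i+1 .. j
        i = j
    w_plus = sum(rank[abs(d)] for d in diffs if d > 0) # sum of positive ranks
    mean_w = n * (n + 1) / 4
    sd_w = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (w_plus - mean_w) / sd_w

# hypothetical paired per-student feedback scores (experimental vs. control)
z = wilcoxon_z([3.9, 3.5, 3.7, 3.2], [2.5, 2.8, 2.4, 2.9])
```

A large positive z indicates that the first sample's scores systematically exceed the second's; the sign of the z reported by SPSS depends on which group is subtracted from which.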
For the descriptive statistics, the average score and standard deviation of
perceived feedback are provided in Table IV-13. The average rating for the
feedback value category was 2.62 with a standard deviation of 1.20 for the
control group, whereas the average and standard deviation for the experimental
group were 3.28 and 0.90, respectively. It is concluded that the average
perceived feedback values are higher for the experimental group students than
for the control group students.
Table IV-13 Average and standard deviation for the perceived feedback
Item  Description  Control Group (x̄, s)  Experimental Group (x̄, s)
F1 Feedback is important to me 2.54 1.19 3.71 0.51
F2 I always collect my assignments/task 2.56 1.43 4.14 0.75
F3 I always read the feedback on my assignments/tasks
2.17 0.92 3.74 0.89
F4 I use feedback to try and improve my results in future assignments
2.44 0.90 3.67 0.75
F5 Feedback is only useful when I receive a low grade
2.71 1.19 4.02 0.75
F6 General feedback provided in class/game helps me learn independently
2.56 1.21 3.62 0.88
F7 Teachers’/games written comments are often difficult to read and poorly explained
3.07 1.23 3.43 1.23
F8 Feedback is only useful when it is positive 2.83 1.38 3.48 0.89
F9 Individual feedback is better because I can clarify any issues with the teacher/game
2.56 1.18 3.64 1.12
F10 I like it when the teacher/game posts sample answers to assignments/tasks
2.76 1.26 3.81 0.86
F11 I feel encouraged when teacher/game provide general feedback in class/game
2.66 1.28 3.07 0.81
F12 An essential part of learning is being able to discuss the subject with my teacher/game
1.95 1.05 3.43 0.97
F13 I learn better when the teacher/game encourages me to think deeply about the subject matter
2.98 1.44 3.64 1.25
F14 Written feedback is easier to understand 2.80 1.25 3.79 0.95
F15 Specific feedback is better because it helps me to understand what I did right and wrong in an assignment/task
2.78 1.19 3.60 1.08
F16 I learn more when my teacher/game focuses on the questions I got wrong
2.24 0.97 3.79 0.90
F17 It is annoying when the teacher/game provide general feedback to the class/game
2.88 1.35 3.62 0.73
Average 2.62 1.20 3.28 0.90
2.2.3 Results of game experience
The game experience analysis was carried out to identify the effect of the
OOsg on the participants' experience during play. The questionnaire consists
of 15 questions: 10 closed-ended and 5 open-ended. The open-ended questions
covered the number of game levels played/reached by the participants, an
overall game rating scale, what the participants liked most in the game, what
they did not like, and, finally, suggestions for improving the prototype. The
alpha reliability of the closed-ended items was α = .766. The responses to the
closed-ended questions indicated that the developed game was easy to learn and
understand, that participants enjoyed playing it, that the game tasks were
clear and varied from easy to difficult, which improved engagement, that
participants were happy to complete the game tasks, and that the developed
game provides an excellent example for learning OOP with the potential to
improve overall understanding of OOP concepts. The results of the closed-ended
responses are presented in Figure IV-20.
For the question on the number of levels played/reached by the participants,
Pearson's product-moment correlation coefficient was used to identify the
strength and direction of the correlation with post-test scores. The result of
Pearson's analysis is presented in Table IV-14; it shows a strong positive
correlation between the levels reached by the player and the post-test scores
(r = 0.737, n = 42, p = 0.000).
Table IV-14 - Correlation analysis between levels played/reached and the post-test scores
                        Level Played   Post-test scores
Level Played
  Pearson Correlation   1              .737**
  Sig. (2-tailed)                      .000
  N                     42             42
Post-test scores
  Pearson Correlation   .737**         1
  Sig. (2-tailed)       .000
  N                     42             42
** Correlation is significant at the 0.01 level (2-tailed)
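Pearson's r in Table IV-14 is computed from the paired observations (levels reached, post-test score). A stdlib-only sketch with hypothetical data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical levels reached and post-test scores for a few players
levels = [1, 2, 2, 3, 4, 5]
scores = [8, 10, 12, 14, 18, 22]
r = pearson_r(levels, scores)   # close to +1: strong positive correlation
```

An r of 0.737, as reported, indicates that students who reached more levels tended to obtain higher post-test scores.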
Figure IV-20 Responses for close-ended questions for game experience survey
Another open-ended question asked for a rating of the developed serious game
on a scale ranging from 1 (worst) to 5 (best); the results for the overall
game rating are presented in Figure IV-21. The results show that none of the
participants rated the game as worst, and only 7 participants rated the game
below good. The other 35 students rated the game from good to best.
Figure IV-21: Rating of overall game experience
Participants were also given three optional open-ended questions to express
their preferences: what they liked in the developed serious game, what they
did not like, and suggestions for improving the game. This qualitative data
helps identify the strengths and weaknesses of the OOsg. Table IV-15 presents
some of the most useful and frequent comments from the participants.
Table IV-15: Participants' comments about the OOsg
What students liked in the developed Serious Game
Idea, content, theme, and time challenging warnings.
Everything, but the presentation of the game from beginning up to end, is fantastic.
The interface of this game was I most liked.
if anyone knows how to play, then it’s very fantastic to play and increase knowledge
I like all over the game
Presentation of the contents, animation, and sound
Clarity of the concept of class attributes methods, relations, and objects.
I like the presentation of relationships
The story of the game
It establishes the understanding of objects, class, relationship using hospital scenario, drag and drop feature, and especially the popup congratulations menu when aiming to achieve goals.
The feedback when I made any mistake
What students did not like in the developed Serious Game
It's a long and in-depth story.
When I am making a mistake, the sound is a little bit annoying
I can only find the few objects and classes
There were no pictures in the game; that's why it was boring for me.
In 3 level, my game hanged, and this was irritating for me
It’s hard to play this game for a non-game player like me
The level 2 was challenging
The story was too deep and required more time to read it, and therefore, I could not complete all the levels.
The object is being selected by right-clicking rather than left.
Students' comments for the improvement of the developed serious game
Add more attractive sounds
Include some bonus level and bonus lives
Add the pause button while reading the story in the game
Improve drag and drop mechanism
Resolve errors and fix bugs
There should be an enhanced warm-up session for easiness
You can more improve this game by providing different stories to choose according to their interested domains.
The level should be selective from the start menu not to skip from inside
Currently, making the selection is available via mouse only. Please provide a way so that it can be played through the keyboard
We need animated games with interesting stories that attract
I can suggest that just improve some things of OOsg like play in full-screen mood, sounds, show how to play levels before starting level.
Improve some bugs which are faced during play
In level 4, add a complete and empty hierarchy structure, so the user just drags and drops the parent or a child class in an empty box in the hierarchy.
2.2.4 Results of system usability
Usability measures are concerned with the user's satisfaction while using the
OOsg. All the participants selected as part of the experimental group in this
study evaluated the system's usability. The quantitative data were generated
from the Post-Study System Usability Questionnaire (PSSUQ) developed by Lewis
(1995), which is freely available. The PSSUQ subcategories include system
usability, information quality, and interface quality. The overall reliability
of the PSSUQ was .94; the reliability of each subscale is also shown in Figure
IV-22.
Figure IV-22 Reliability measure for PSSUQ
The quantitative analysis of the three subdomains of the PSSUQ is as follows.
The system usefulness (usability) domain refers to a system that is easy to
use and easy to learn, allows the user to complete tasks effectively, and lets
the user become productive quickly. The goal was to determine how easily
students could use the OOsg to accomplish their tasks and whether they were
satisfied with the developed game.
The system usability subdomain includes six items. The average score and
standard deviation of all the subdomains are presented in Table IV-16. The
system usability domain had an average rating of 3.77 and a standard deviation
of 1.20. The highest-rated item was U2, 'It was simple to use this game,' with
3.98 out of a possible 5. Overall, the students believe that the OOsg was easy
to use and learn, and they were satisfied learning with it.
The information quality domain refers to the system's feedback, such as error
messages or other information users receive when something is done wrong, and
how to fix problems. It may also include other help options such as hints,
on-screen messages, and the rules and goals of playing presented at the start
of the game. Moreover, it also measures whether the content included in the
game is easy to understand, effectively helps the user complete tasks, and is
well organized.
The information quality subdomain includes six items. Information quality had
an average score of 3.67 and a standard deviation of 1.07. Overall, the
students believe that the information provided in the OOsg for the learning
content, and the information provided when they made a mistake, was
effectively helpful and presented in an organized manner.
The interface quality domain concerns how pleasant the system is for its
users. It measures whether the user liked the system and whether the system
has all the functionality and capabilities s/he expected. The domain is also
concerned with the participant's overall satisfaction with using the
interface. The interface quality subdomain includes four items, one of which
is dedicated to the participant's satisfaction with using the developed game.
The interface quality had an average score of 3.60 and a standard deviation of
1.26.
Table IV-16: Average and standard deviation for the usability subcategories
Item  x̄  s
System Usability (x̄=3.77, s=1.20)
U1 Overall, I am satisfied with how easy it is to use this game. 3.83 1.25
U2 It was simple to use this game. 3.98 1.00
U3 I was able to complete the levels and scenarios quickly using this game.
3.88 0.99
U4 I felt comfortable using this game. 3.67 1.32
U5 It was easy to learn to use this game. 3.38 1.23
U6 I believe I could become productive quickly using this game. 3.86 1.39
Information Quality (x̄=3.67, s=1.07)
U7 The game gave error messages that told me how to fix problems. 3.69 1.30
U8 Whenever I made a mistake using the game, I could recover easily and quickly. 3.60 0.96
U9 The information (such as online help, on-screen messages, and other documentation) provided with this game was clear. 3.55 0.89
U10 It was easy to find the information I needed. 3.83 1.03
U11 The information was effective in helping me complete the tasks and scenarios. 3.86 1.16
U12 The organization of information on the game level screens was clear. 3.52 1.06
Interface Quality (x̄=3.60, s=1.26)
U13 The interface of this game was pleasant. 3.33 1.44
U14 I liked using the interface of this game. 3.90 1.25
U15 This game has all the functions and capabilities I expect it to have. 3.69 1.12
U16 Overall, I am satisfied with this game. 3.48 1.23
The highest-rated subcategory was system usefulness, followed by information
quality, with average scores of 3.77 and 3.67, respectively.
CHAPTER V
DISCUSSION
This chapter reflects on the work carried out throughout the research study.
First, the purpose and motivation of the research are revisited. Second, the
findings are related to the research objectives and research questions. The
main focus of this research is to design and develop a serious game prototype
to improve the performance of students learning OOP. Initially, students
regarded learning programming languages as a difficult task, which led to low
motivation for learning object-oriented programming and, in many cases, high
dropout rates in computer science and related courses. Therefore, the
prototype was designed and developed to provide an environment in which
students can learn and practice OOP concepts in a stimulating and engaging
way, enhancing the motivation to learn without the worry of failure.
To achieve objective 1 of the study, a comprehensive survey for identifying
potential difficulties in learning OOP was carried out through an extensive
review of existing studies, as well as an investigation of students currently
learning OOP. It was discovered that many misconceptions and difficulties
hinder students in learning OOP. Achieving this objective was necessary
because, if the potential obstacles are identified in advance, the serious
game prototype can be designed and developed accordingly. Chapter II, section
1 is dedicated to identifying students' potential difficulties and
misconceptions while learning OOP. The topics covered in the literature review
include: tools for identifying difficulties, the OOP concepts or instructional
contents covered, the content analysis techniques used, and the specific OOP
problems identified and solved. The results of the reviewed studies are
comprehensively presented in Table II-1 of chapter II. The results showed that
many studies concluded that programming is difficult for first-time learners,
regardless of the type of programming language they are learning (Bennedsen
and Schulte, 2007; Wei, et al., 2005; Xinogalos and Satratzemi, 2005; Bergin,
et al., 2012; Robins, et al., 2003; Bennedsen, et al., 2008). The results show
that students faced difficulties throughout the whole learning process, from
topics related to the transition from procedural programming to OOP, to issues
concerning perceived motivation within their existing learning environment.
After gaining an essential understanding of the problem area, students are
better placed to start solving problems. The potential difficulties identified
in the studies include difficulty in understanding the basic OOP concepts,
that is, the classification, application, utilization, implementation, and
debugging of OOP features. More concretely, the majority of the studies
stressed the cognitive difficulties related to the fundamental constructs of
OOP, such as adding or creating classes, assigning properties and methods to
classes, identifying classes, method creation, deletion, and invocation,
method return values and parameter passing, message passing, and so on. The
studies also showed that difficulties arise because of motivational issues in
the learning environment in which the existing instructional method is carried
out.
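To make these constructs concrete, class creation, attribute assignment, method definition with parameters and return values, method invocation, and a small inheritance hierarchy can all be shown in a few lines. The sketch below is purely illustrative; the class names loosely echo the game's hospital scenario and are not taken from the prototype:

```python
class Doctor:
    """Class creation: a blueprint with attributes and methods."""
    def __init__(self, name, ward):
        self.name = name              # assigning properties (attributes)
        self.ward = ward

    def diagnose(self, patient):      # method with a parameter
        return f"{self.name} diagnoses {patient}"   # method return value

class Surgeon(Doctor):                # hierarchy: Surgeon inherits from Doctor
    def operate(self, patient):
        return f"{self.name} operates on {patient}"

s = Surgeon("Dr. Ali", "Surgery")     # object instantiation
msg = s.diagnose("Patient-1")         # invocation of an inherited method
```

Each of the reported difficulty areas (classes, attributes, behaviors, objects, hierarchies) maps to one line or construct in this sketch, which is essentially what the game levels ask students to assemble interactively.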
Besides, the results of the student feedback provided in chapter III, section
2, indicate that students find it difficult to understand the difference
between a programming language and OOP, lack understanding of basic OOP
concepts, and are unable to apply them. Difficulties with the existing
teaching methods have led to a lack of interest in learning OOP.
In summary, the overall results indicate that students have difficulties and
misconceptions with almost all OOP constructs. However, incorporating all
these difficulties into the design and development of a serious game prototype
is beyond the scope of this research. Therefore, this study is limited to the
cognitive difficulties related to classes, attributes, behaviors, objects, and
hierarchies. Difficulties related to motivational issues are also considered
because the method of teaching OOP concepts was not attractive enough; the
complexity of the learning and practice environment was the primary cause of
students' lack of interest in learning.
Objective 2 of this study was aimed at formulating a model for a learning
environment that overcomes the students' difficulties in learning OOP
identified under objective 1. The designed model focuses on fostering learning
outcomes and improving learning performance: rather than starting with the
technical details of OOP, students are provided with an environment in which
the basic concepts of OOP are taught at the beginning of the course in an
entertaining and engaging way. Students can thus be motivated to learn
otherwise dull topics by enjoying the experience. To achieve objective 2, a
review of the existing instructional methods used for teaching OOP was first
carried out. Section 5 of Chapter II discusses the approaches available for
teaching OOP. Among the various available approaches, game-first was selected
as the instructional method for this research. This method was chosen because
game-like tools have played a vital role throughout the educational process,
helping to promote better understanding and learning, and because it helps to
gain students' motivation at the beginning of the course. After finalizing the
game-first approach, the existing games available for learning OOP were
reviewed in Chapter II, Section 7. The review covers the various game
attributes of serious games that contribute to learning and motivation, the
learning theories integrated, and the instructional delivery methods in the
existing serious games.
The review results in Table II-2 of Chapter II showed that the design and
development of existing serious games did not consider the students'
difficulties that need to be overcome. The current literature also revealed a
paucity in the use of learning theories and instructional designs in the design
and development of serious game prototypes for learning OOP (Xu, 2009, Al-
Linjawi and Al-Nuaim, 2010, Yulia and Adipranata, 2010, Rais and Syed-
Mohamad, 2011, Depradine, 2011, Livovský and Porubän, 2014, Poolsawas,
et al., 2016, Seng and Yatim, 2014, Seng, et al., 2015, Seng, et al., 2016, Seng
and Yatim, 2018 and Seng, et al., 2018). Despite some promising results, the
current literature does not demonstrate the presumed link between the
motivation provided by the games and the actual learning outcomes expected
from incorporating SGs for learning OOP (Phelps, et al., 2005).
Therefore, considering the limitations of the existing serious games, a
new model was designed. The developed model is presented and discussed
in Section 5.2 of Chapter III. In the third phase of the research, a serious game
model was designed for learning OOP. In the formulated model shown in
Figure III-5 of Chapter III, the components discussed in existing serious
games, i.e., instructional contents or learning difficulties, game attributes,
learning theories, required competencies, and motivational aspects, are
logically placed in the presentation, practice, and performance phases.
The presentation phase provides the input content to the user/player as
learning material; in this case, the basic OOP quarks and the objective-1 result
were used as instructional contents. The practice phase is about the logical
mapping of the instructional contents over game attributes and content delivery
instructions. The learning theories are also integrated into the design of the
model in this phase. The assessment and evaluation of the user/player as a
result of playing the game are provided in the performance phase.
Performance was evaluated based on the intended learning outcomes, i.e.,
the required OOP competencies, and the motivational effect of the developed
model.
Objective-3 was achieved by designing and developing a serious game
prototype that implements the model for learning OOP formulated as a result
of objective-2. The developed serious game for learning OOP is named
OOsg. The OOsg incorporates all the components and the flow of activities
involved in the design of the serious game model.
To achieve objective-4, an experimental evaluation was carried out to
analyze students' performance in learning OOP with and without the
developed serious game prototype. Four studies were performed to achieve
objective-4: the difference in students' performance, the normalized learning
gain, the effect size, and the effect of perceived motivation on the students'
learning outcomes when using the developed serious game prototype.
For the experimental evaluation of study-1, the pre-test scores of both
the experimental and control groups were compared with the post-test scores.
The study aimed to determine whether the means of the two sets of scores
(pre-test and post-test) differ. The results of experimental study-1, presented
in Table IV-2, showed that the experimental group's post-test scores were
significantly higher than its pre-test scores, whereas no significant difference
was found between the post-test and pre-test scores of the control group
(p<0.05, paired t-test).
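The paired t-test used here compares each student's post-test score with that same student's pre-test score. A minimal sketch of the statistic, in Python with purely illustrative scores (not the experiment's data):

```python
import math

def paired_t(pre, post):
    """Paired t statistic: the mean of the per-student differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Illustrative scores only -- not the thesis data.
pre  = [40, 55, 48, 60, 52]
post = [65, 70, 66, 78, 69]
print(round(paired_t(pre, post), 1))  # -> 11.0
```

The p-value is then obtained by comparing the t statistic against the t-distribution with n-1 degrees of freedom (statistical packages do this step automatically).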
Experimental study-2 aimed to find the difference in students'
normalized learning gain for learning OOP with and without the prototype's
intervention. The average normalized learning gain was estimated to measure
the effectiveness of the prototype in promoting conceptual understanding of
the subject. The mean scores for the average normalized learning gain, shown
in Figure IV-14, indicate that only a 1% gain was observed for the control
group, whereas the experimental group showed a 21% gain in the participants'
learning. The analysis results for study-2, presented in Table IV-3, showed
that the students' normalized learning gains were significantly higher for the
experimental group than for the control group (p<0.05, paired t-test).
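The normalized learning gain follows Hake's (1997) definition: the improvement actually achieved divided by the improvement that was still possible, g = (post - pre) / (max - pre). A sketch with illustrative scores (not the experiment's data):

```python
def normalized_gain(pre, post, max_score=100):
    """Hake's normalized gain: the fraction of the possible
    improvement actually achieved between pre- and post-test."""
    return (post - pre) / (max_score - pre)

def average_gain(pre_scores, post_scores, max_score=100):
    gains = [normalized_gain(a, b, max_score)
             for a, b in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Illustrative scores only -- not the thesis data.
pre  = [40, 55, 48, 60]
post = [65, 70, 66, 78]
print(round(average_gain(pre, post), 2))  # -> 0.39
```

Because the denominator is the room left for improvement, a student starting at 90/100 who reaches 95 gets the same gain (0.5) as one who moves from 50 to 75, which is what makes the measure comparable across groups with different pre-test scores.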
Experimental study-3 was conducted to find the effect size of the
serious game prototype on the students' learning outcomes. The effect size
was measured because the p-value determined in study-1 can only indicate
whether an intervention effect exists; it is not sufficient to determine the size
of that effect. The result of study-3, presented in Table IV-6, indicates a
significantly larger effect size for the experimental group (d=3.40) compared
to the control group (d=0.14).
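Cohen's d expresses the improvement in standard-deviation units; by convention, values around 0.2, 0.5, and 0.8 are read as small, medium, and large effects (Cohen, 1988). A sketch assuming the pooled-standard-deviation form of d, with illustrative scores (the thesis may use a different variant):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference of means over the pooled standard deviation."""
    def mean(xs):
        return sum(xs) / len(xs)
    def svar(xs):  # sample variance (n - 1 denominator)
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    na, nb = len(group_a), len(group_b)
    pooled = math.sqrt(((na - 1) * svar(group_a) + (nb - 1) * svar(group_b))
                       / (na + nb - 2))
    return (mean(group_b) - mean(group_a)) / pooled

# Illustrative scores only -- not the thesis data.
pre  = [40, 55, 48, 60, 52]
post = [65, 70, 66, 78, 69]
print(round(cohens_d(pre, post), 2))  # -> 2.88
```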
The experimental study-4 aimed to find the effect of perceived
motivation on the learning outcomes of the students. The experimental group
results presented in Table IV-4 showed a significant impact of motivation
provided by the OOsg on the students' learning outcomes (r=.530, n=42,
p<0.05). However, the control group results presented in Table IV-5 revealed
no significant effect of perceived motivation on the students' learning outcomes
(r= .046, n=41, p>0.05).
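The Pearson r reported above measures the linear association between two paired variables, here each student's perceived-motivation score and learning outcome. A sketch with purely illustrative values (not the survey data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative values only -- not the thesis data.
motivation = [3.1, 3.8, 2.9, 4.2, 3.5]   # IMMS-style ratings
post_test  = [60, 72, 55, 75, 62]        # outcome scores
print(round(pearson_r(motivation, post_test), 2))  # -> 0.97
```

An r near 1 (as in this toy example) indicates that higher motivation goes with higher scores, while an r near 0 (as reported for the control group) indicates no linear relationship.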
The evaluation of the prototype was performed by considering four
different parameters. Two of them, perceived motivation and perceived
feedback, were evaluated by both groups of students, whereas the other two,
game experience and system usability, were evaluated by the experimental
group students only.
The evaluation of perceived motivation was conducted using a 5-point
IMMS scale. The evaluation results, presented in Tables IV-7 to IV-10,
showed the highest mean score for the attention subcategory (x̄ = 3.87),
followed by relevance (x̄ = 3.66). The score percentages presented in Table
IV-11 indicate that the highest percentage, 47.61%, falls in the rating range
4.01 to 5.00 for the attention subcategory and 3.1 to 4.0 for the satisfaction
subcategory.
The results for the evaluation of perceived feedback, presented in
Table IV-12, showed a significant difference between students' perception of
the feedback received from the OOsg and that of the feedback obtained under
traditional instruction (p<0.05, Wilcoxon signed-ranks test).
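The Wilcoxon signed-ranks test is the non-parametric counterpart of the paired t-test, suited to ordinal ratings like these: it ranks the absolute paired differences and compares the rank sums of the positive and negative differences. A sketch of the W statistic with illustrative 5-point ratings (not the survey data); the p-value is then read from the W distribution or a normal approximation:

```python
def wilcoxon_w(xs, ys):
    """Wilcoxon signed-rank statistic W: the smaller of the positive
    and negative rank sums of the non-zero paired differences
    (tied absolute differences receive averaged ranks)."""
    diffs = [b - a for a, b in zip(xs, ys) if b - a != 0]
    ordered = sorted(diffs, key=abs)
    ranks = {}
    i = 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and abs(ordered[j]) == abs(ordered[i]):
            j += 1
        avg = (i + 1 + j) / 2          # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks.setdefault(abs(ordered[k]), avg)
        i = j
    w_plus  = sum(ranks[abs(d)] for d in diffs if d > 0)
    w_minus = sum(ranks[abs(d)] for d in diffs if d < 0)
    return min(w_plus, w_minus)

# Illustrative 5-point ratings only -- not the survey data.
traditional = [2, 3, 2, 4, 3, 2]
oosg        = [4, 4, 3, 4, 5, 4]
print(wilcoxon_w(traditional, oosg))  # -> 0 (every difference is positive)
```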
The evaluation of game experience, presented in Table IV-14, showed
a strong positive correlation between the number of levels played/reached
and the post-test scores for the participants of the experimental group
(r=0.737, n=42, p=0.000). The evaluation provided evidence that the
overall game experience was effective and the developed serious game had
the potential to improve understanding of the OO concepts.
For the usability evaluation, results presented in Table IV-16 indicate
that students showed satisfactory results for the usefulness of the OOsg with
an average score of 3.77, followed by the information quality subcategory with
an average score of 3.67.
CHAPTER VI
CONCLUSIONS AND FUTURE WORK
This chapter outlines the major contributions made in achieving the
research objectives, the limitations, and possible future work. The first major
contribution of this research is the design of the serious game model and its
implementation, named OOsg. The design and development of the OOsg
addresses gaps found in existing serious games, such as neglecting students'
difficulties and misconceptions in learning OOP, the lack of incorporation of
learning theories and instructional designs, and the improper mapping of
game and learning attributes. The second is a statistical contribution:
providing the previously missing rigorous statistical evidence comparing
students who used traditional teaching methods with those who used the
OOsg for learning OOP. The details of these contributions are discussed
below:
• Investigation of students' difficulties
The investigation of students' performance was carried out to better
understand the types of difficulties and misconceptions encountered in
solving OO-related tasks and activities. In the first phase, students' abilities to
achieve the OOP competencies were assessed through OOP-related tasks,
and the types of mistakes and misconceptions students make when given
various activities involving OOP concepts were identified. The findings on the
students' difficulties served as guidelines for incorporating the instructional
contents into the serious game model.
• Design of Competency Model
In this study, we addressed the issues involved in designing and developing
the serious game prototype for learning OOP. As a result, the competency
model was designed to determine the major competencies required for
mastering OOP. The competency model also serves as the set of expected
learning outcomes that the OOsg is intended to achieve.
• Design of Serious Game Model
The issues of the paucity of learning theories and instructional designs and
the improper mapping of game attributes in the available games are
addressed by designing a new model for serious games. The formulated
model is designed based on the extensive review conducted to identify
students' difficulties and the pitfalls of the existing serious game models.
• Development of Serious Game Prototype
A serious game prototype was created to demonstrate the
implementation of the designed model. The prototype allows students to
practice and learn the basic OOP concepts, such as classes, objects,
attributes, methods, and inheritance using different scenarios provided in the
prototype.
• Experimental study and evaluation of the OOsg
The experimental analysis results showed that the prototype proved
helpful for the students in improving their learning outcomes on the OOP
concepts in which they were facing difficulties. The effect size of the OOsg
intervention was also observed to be significantly large. This study also aimed
to provide the previously lacking empirical evidence that the motivation
provided by the game improves learning outcomes in OO learning; the
motivation afforded by the prototype proved to have a significant impact on
the students' learning outcomes. Through the strong results obtained from the
developed prototype, our work makes an important contribution in
encouraging the use of the OOsg to learn OO in a fun and engaging
environment. The flexible techniques and strategies used in developing the
prototype can pave the way for quickly building serious games for other
programming domains, helping to reduce development time and simplify the
integration of instructional content.
• Limitations and future recommendations
There are various suggestions for improving this study. Taken together,
the experimental study results and the limitations of this study mark several
directions that can be explored. This research concerned the design and
development of a serious game named OOsg, which is specifically intended
to help students improve their OOP learning performance. In the future,
improvements can be made by adding functions related to polymorphism,
encapsulation, and abstraction to the prototype, and by expanding the scope
of the prototype's evaluation.
In developing the existing serious game prototype, the incorporation of
instructional contents was based on an analysis of students' difficulties and
misconceptions in learning OOP. However, considering and overcoming all of
these difficulties is beyond the scope of this study, so the prototype can be
further enriched in the future to cover other encountered difficulties, such as
abstraction (relevant and irrelevant details), encapsulation (internal details of
the classes), generalization (types and subtypes), and polymorphism
(interaction). Besides, for practicing and learning with the OOsg, only one
game story, related to a hospital management system, is included. Since the
prototype is flexible enough to adapt to changes in instructional content or
game stories without changing its architecture, more stories can easily be
included in the future.
In the existing prototype, the player's assessment is performed from
the log files generated as a result of playing the game. The assessment
mechanism could be improved by providing real-time results and more
accurate, tailored feedback while the game is being played. Furthermore, the
students' comments given in the game experience surveys for the game's
improvement could also be considered in the future.
For the experimental evaluation, a limited number of participants from
local universities were selected; the scope of the evaluation may be
broadened by increasing the number of participants and by including
participants from across geographical boundaries. The experimental
evaluation of the OOsg was completed by students enrolled in the OOP
course. The prototype could be further evaluated by experts in the OO field to
collect their opinions on the effectiveness of the OOsg and to make further
improvements based on their suggestions.
REFERENCES
ABBASI, S., KAZI, H. and KHOWAJA, K., (2017), “A systematic review of learning object oriented programming through serious games and programming approaches”, In 2017 4th IEEE International Conference on Engineering Technologies and Applied Sciences (ICETAS), pp. 1-6. IEEE.
ABT, C.C., (1987), “Serious games”, University Press of America.
AL-LINJAWI, A. A. and AL-NUAIM, H. A., (2010), “Using Alice to teach novice programmers OOP concepts”, Journal of King Abdulaziz University: Science, 148(632), pp.1-20.
ALICE, (2008), Carnegie Mellon University, “Alice.org”, http://www.alice.org/. (Accessed June 2018).
ALTON-LEE, A., (2003), “Quality teaching for diverse students in schooling: Best evidence synthesis”, Wellington, New Zealand: Ministry of Education.
ALYAMI, S. M. and ALAGAB, A. M., (2013), “The Difference in Learning Strategies in Virtual Learning Environment and Their Effect on Academic Achievement and Learning Satisfaction for Distance Teaching and Training Program Students”, In Proceedings - 4th International Conference on e-Learning Best Practices in Management, Design and Development of e-Courses: Standards of Excellence and Creativity, ECONF, pp. 102–112.
ANDERSON, E. F. and MCLOUGHLIN, L., (2007), “Critters in the Classroom: A 3D Computer-Game-like Tool for Teaching Programming to Computer Animation Students”, In ACM SIGGRAPH 2007 Educators Program - SIGGRAPH ’07, 7. New York, New York, USA: ACM Press.
ANG, C. S., AVNI, E. and ZAPHIRIS, P., (2008), “Linking pedagogical theory of computer games to their usability”, International Journal on E-Learning, 7(3), pp.533-558.
ARMSTRONG, D.J., (2006), “The Quarks of Object-Oriented Development”, Communications of the ACM, 49(2), pp.123–128.
BLOOM, B. S., (1956), “Bloom’s taxonomy”.
BASHIRU, L. and JOSEPH, A. A., (2015), “Learning difficulties of Object Oriented Programming (OOP) in University of Ilorin-Nigeria: Students perspective”, In Eighth TheIIER-Science Plus International Conference, Dubai, UAE. cf. p. 45, 46, 50.
BECKER, K., (2007), “Oh, the thinks you can think: language barriers in serious game design”, In Proceedings of the 2007 Conference on Future Play, pp. 229-232.
BEDNARIK, R. and TUKIAINEN, M., (2006), “An eye-tracking methodology for characterizing program comprehension processes”, In Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, pp. 125-132.
BEDNAR, A. K., CUNNINGHAM, D., DUFFY, T. M. and PERRY, J. D., (1991), “Theory into practice: How do we think”, In C. J. Anglin (Ed.), Instructional technology: Past, present, and future, Englewood, CO: Libraries Unlimited, pp. 88-101.
BEDWELL, W. L., PAVLAS, D., HEYNE, K., LAZZARA, E. H. and SALAS, E., (2012), “Toward a taxonomy linking game attributes to learning: An empirical study”, Simulation & Gaming, 43(6), pp.729-760.
BENNEDSEN, J. and SCHULTE, C., (2007), “What does objects-first mean? An international study of teachers' perceptions of objects-first”, In Proceedings of the Seventh Baltic Sea Conference on Computing Education Research, Vol. 88, pp. 21-29.
BENNEDSEN, J., CASPERSEN, M.E. and KÖLLING, M. (Eds.), (2008), “Reflections on the teaching of programming: methods and implementations”, Vol. 4821, Springer.
BERGIN, J., STEHLIK, M., ROBERTS, J. and PATTIS, R., (1997), “Karel++: A gentle introduction to the art of object-oriented programming”, Vol. 1, New York: Wiley.
BERGIN, J., ECKSTEIN, J., VOLTER, M., SIPOS, M., WALLINGFORD, E., MARQUARDT, K., CHANDLER, J., SHARP, H. and MANNS, M.L., (2012), “Pedagogical patterns: advice for educators”, Joseph Bergin Software Tools.
BLACK, P. and WILIAM, D., (1998), “Assessment and classroom learning”, Assessment in Education: Principles, Policy & Practice, 5(1), pp.7-74.
BOHLIN, R. M., MILHEIM, W. D. and VIECHNICKI, K. J., (1990), “A Prescriptive Model for the Design of Motivating Instruction for Adults”, In Annual Meeting of the American Educational Research Association, Boston, MA.
BOOCH, G., MAKSIMCHUK, R. A., ENGLE, M. W., YOUNG, B. J., CONNALLEN, J. and HOUSTON, K. A., (2008), “Object-oriented analysis and design with applications”, ACM SIGSOFT Software Engineering Notes, 33(5), pp.29-29.
BOYLE, E. A., HAINEY, T., CONNOLLY, T. M., GRAY, G., EARP, J., OTT, M., LIM, T., NINAUS, M., RIBEIRO, C. and PEREIRA, J., (2016), “An update to the systematic literature review of empirical evidence of the impacts and outcomes of computer games and serious games”, Computers & Education, No. 94, pp.178-192.
BROWN, J. S., COLLINS, A. and DUGUID, P., (1989), “Situated cognition and the culture of learning”, Educational Researcher, 18(1), pp.32-42.
BURTON, P. J. and BRUHN, R. E., (2003), “Teaching programming in the OOP era”, ACM SIGCSE Bulletin, 35(2), pp.111-114.
BUTLER, M. and MORGAN, M., (2007), “Learning challenges faced by novice programming students studying high level and low feedback concepts”, Proceedings ascilite Singapore, pp.99-107.
BYARD, C., (1990), “Object-oriented technology a must for complex systems”, Computer Technology Review, 10(14), pp.15-20.
CARBONARO, M., KING, S., TAYLOR, E., SATZINGER, F., SNART, F. and DRUMMOND, J., (2008), “Integration of e-learning technologies in an interprofessional health science course”, Medical Teacher, 30(1), pp.25-33.
CORDOVA, D. I. and LEPPER, M. R., (1996), “Intrinsic motivation and the process of learning: Beneficial effects of contextualization, personalization, and choice”, Journal of Educational Psychology, 88(4), p.715.
CARROLL, J. M., (1982), “The adventure of getting to know a computer”, Computer, Vol. 11, pp.49-58.
CARTER, J. and FOWLER, A., (1998), “Object oriented students? (poster)”, ACM SIGCSE Bulletin, 30(3), p.271.
CHEN, W.K. and CHENG, Y.C., (2007), “Teaching object-oriented programming laboratory with computer game programming”, IEEE Transactions on Education, 50(3), pp.197-203.
CLARK, D. B., TANNER-SMITH, E., HOSTETLER, A., FRADKIN, A. and POLIKOV, V., (2018), “Substantial integration of typical educational games into extended curricula”, Journal of the Learning Sciences, 27(2), pp.265-318.
COHEN, J., (1988), “Statistical Power Analysis for the Social Sciences”.
CONNOLLY, T. M., BOYLE, E. A., MACARTHUR, E., HAINEY, T. and BOYLE, J. M., (2012), “A systematic literature review of empirical evidence on computer games and serious games”, Computers & Education, 59(2), pp.661-686.
COOPER, S., DANN, W., PAUSCH, R. and PAUSCH, R., (2000), “Alice: a 3-D tool for introductory programming concepts”, In Journal of Computing Sciences in Colleges, Consortium for Computing Sciences in Colleges, 15(5), pp. 107-116.
COOPER, S., DANN, W. and PAUSCH, R., (2003), “Teaching objects-first in introductory computer science”, ACM SIGCSE Bulletin, 35(1), pp.191-195.
COOPER, P. A., (1993), “Paradigm shifts in designing instruction: From behaviorism to cognitivism to constructivism”, Educational Technology, 33(5), pp. 12-19.
CORRAL, J. M. R., BALCELLS, A. C., ESTÉVEZ, A. M., MORENO, G. J. and RAMOS, M. J. F., (2014), “A game-based approach to the teaching of object-oriented programming languages”, Computers & Education, Vol. 73, pp.83-92.
CRAIK, F. I. and LOCKHART, R. S., (1972), “Levels of processing: A framework for memory research”, Journal of Verbal Learning and Verbal Behavior, 11(6), pp.671-684.
CRAIK, F. I. and JACOBY, L. L., (1975), “A process view of short-term retention”, Cognitive Theory, No. 1, pp.173-192.
DECKER, R. and HIRSHFIELD, S., (1999), “Programming Java: An Introduction to Programming Using Java”, Brooks.
DEPRADINE, C. A., (2011), “Using gaming to improve advanced programming skills”, The Caribbean Teaching Scholar, 1(2).
DICK, W., CAREY, L. and CAREY, J. O., (2005), “The systematic design of instruction”, Third Edition, Harper Collins.
DIMOCK, V. and BOETHEL, M., (1999), “Constructing Knowledge With Technology”, Southwest Educational Development Laboratory.
DIVJAK, B. and TOMIĆ, D., (2011), “The impact of game-based learning on the achievement of learning goals and motivation for learning mathematics - literature review”, Journal of Information and Organizational Sciences, 35(1), pp.15-30.
DOS SANTOS, A. L., SOUZA, M. R., DAYRELL, M. and FIGUEIREDO, E., (2018), “A Systematic Mapping Study on Game Elements and Serious Games for Learning Programming”, In International Conference on Computer Supported Education, pp. 328-356.
DRISKELL, J. E. and DWYER, D. J., (1984), “Microcomputer videogame based training”, Educational Technology, 24(2), pp.11-16.
EREZ, M., (1977), “Feedback: A necessary condition for the goal setting-performance relationship”, Journal of Applied Psychology, 62(5), p.624.
ERNEST, P., (1995), “The nature of mathematics and teaching”, PERSPECTIVES-EXETER-, No. 53, pp.29-41.
FRASCA, G., (2003), “Simulation versus narrative”, The Video Game Theory Reader, pp.221-235.
GAGNE, R. M. and BRIGGS, L. J., (1974), “Principles of instructional design”, Holt, Rinehart & Winston.
GAGNE, R. M. and DRISCOLL, M. P., (1988), “Essentials of learning for instruction”, Englewood Cliffs.
GAGNE, R. M. and MEDSKER, K. L., (1996), “The conditions of learning: Training applications”.
GARNER, S., HADEN, P. and ROBINS, A., (2005), “My program is correct but it doesn't run: a preliminary investigation of novice programmers' problems”, In Proceedings of the 7th Australasian Conference on Computing Education, Vol. 42, pp. 173-180.
GARRIS, R., AHLERS, R. and DRISKELL, J. E., (2002), “Games, motivation, and learning: A research and practice model”, Simulation & Gaming, 33(4), pp.441-467.
GEE, J. P., HAYES, E., TORRES, R. J., GAMES, I. A., SQUIRE, K. and SALEN, K., (2008), “Playing to learn game design skills in a game context”, In Proceedings of the 8th International Conference for the Learning Sciences, Vol. 3, pp.368-374. International Society of the Learning Sciences.
GERLACH, VERNON S., DONALD P. ELY, and ROB M., (1980), “Teaching & Media: A Systematic Approach”.
GIBBONS, T. E., (2002), “Using Graphics in the First Year of Programming with C++”, College of St. Scholastica.
GIRAFFA, L. M., MORAES, M. C. and UDEN, L., (2014), “Teaching object-oriented programming in first-year undergraduate courses supported by virtual classrooms”, In The 2nd International Workshop on Learning Technology for Education in Cloud, pp. 15-26. Springer, Dordrecht.
GUSTAFSON, K. L. and BRANCH, R. M., (2002), “What is instructional design”, Trends and Issues in Instructional Design and Technology, pp.16-25.
DE GLORIA, A., BELLOTTI, F. and BERTA, R., (2014), “Serious Games for education and training”, International Journal of Serious Games, 1(1).
GOMES, A. and MENDES, A. J., (2007), “Learning to program - difficulties and solutions”, In International Conference on Engineering Education - ICEE, Vol. 2007.
GREEN, M. and SULBARAN, T., (2006), “Motivation assessment instrument for virtual reality scheduling simulator”, In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, Association for the Advancement of Computing in Education (AACE), pp. 45-50.
HADJERROUIT, S., (1998), “A constructivist framework for integrating the Java paradigm into the undergraduate curriculum”, In Proceedings of the 6th Annual Conference on the Teaching of Computing and the 3rd Annual Conference on Integrating Technology into Computer Science Education: Changing the Delivery of Computer Science Education, pp. 105-107.
HAENDLER, T., (2019), “A Card Game for Learning Software-Refactoring Principles”, In Proceedings of the 3rd International Symposium on Gamification and Games for Learning (GamiLearn’19), Barcelona, Spain.
HAKE, R. R., (1997), “Evaluating conceptual gains in mechanics: A six thousand student survey of test data”, In AIP Conference Proceedings, 399(1), pp. 595-604. American Institute of Physics.
HAVENGA, M., MENTZ, E. and DE VILLIERS, R., (2008), “Knowledge, skills and strategies for successful object-oriented programming: a proposed learning repertoire”, South African Computer Journal, 12(Dec 2008), pp.1-8.
HENDERSON-SELLERS, B., (1996), “A book of object-oriented knowledge: an introduction to object-oriented software engineering”, Prentice-Hall, Inc.
HOLLAND, S., GRIFFITHS, R. and WOODMAN, M., (1997), “Avoiding object misconceptions”, In Proceedings of the Twenty-Eighth SIGCSE Technical Symposium on Computer Science Education, pp. 131-134.
HUNICKE, R., LEBLANC, M. and ZUBEK, R., (2004), “MDA: A formal approach to game design and game research”, In Proceedings of the AAAI Workshop on Challenges in Game AI, 4(1), p. 1722.
JENKINS, T., (2002), “On the difficulty of learning to program”, In Proceedings of the 3rd Annual Conference of the LTSN Centre for Information and Computer Sciences, pp. 53–58.
JIAU, H. C., CHEN, J. C. and SSU, K. F., (2009), “Enhancing self-motivation in learning programming using game-based simulation and metrics”, IEEE Transactions on Education, 52(4), pp.555-562.
JOHNSON, W. L., RICKEL, J. W. and LESTER, J. C., (2000), “Animated pedagogical agents: Face-to-face interaction in interactive learning environments”, International Journal of Artificial Intelligence in Education, 11(1), pp.47-78.
JOKELA, T., IIVARI, N., MATERO, J. and KARUKKA, M., (2003), “The standard of user-centered design and the standard definition of usability: analyzing ISO 13407 against ISO 9241-11”, In Proceedings of the Latin American Conference on Human-Computer Interaction, pp. 53-60.
KEBRITCHI, M., HIRUMI, A. and BAI, H., (2008), “The effects of modern math computer games on learners’ math achievement and math course motivation in a public high school setting”, British Journal of Educational Technology, 38(2), pp.49-259.
KELLER, J. M., (1983), “Motivational design of instruction”, Instructional Design Theories and Models: An Overview of Their Current Status, 1(1983), pp.383-434.
KELLER, J. M., (2010), “What is motivational design?”, In Motivational Design for Learning and Performance, Springer, Boston, MA, pp. 21-41.
KEMP, J. E. and RODRIGUEZ, L., (1992), “The basics of instructional design”, The Journal of Continuing Education in Nursing, 23(6), pp.282-284.
KHOWAJA, K., (2017), “A serious game design framework for vocabulary learning of children with autism”, Ph.D. dissertation, University of Malaya, Malaysia.
KLIEME, E., (2004), “The development of national educational standards: An expertise”, BMBF, Publ. and Website Division.
KÖLLING, M., (2010), “The Greenfoot Programming Environment”, ACM Transactions on Computing Education, 10(4), pp.1–21.
KÖLLING, M., QUIG, B., PATTERSON, A. and ROSENBERG, J., (2003), “The BlueJ system and its pedagogy”, Computer Science Education, 13(4), pp.249-268.
KOLB, D. A., (2014), “Experiential learning: Experience as the source of learning and development”, FT Press.
KOULOURI, T., LAURIA, S. and MACREDIE, R. D., (2014), “Teaching introductory programming: A quantitative evaluation of different approaches”, ACM Transactions on Computing Education (TOCE), 14(4), pp.1-28.
KRAMER, M., HUBWIESER, P. and BRINDA, T., (2016), “A competency structure model of object-oriented programming”, In 2016 International Conference on Learning and Teaching in Computing and Engineering (LaTICE), pp. 1-8.
KRISHNAMURTHI, S. and FISLER, K., (2019), “Programming Paradigms and Beyond”, The Cambridge Handbook of Computing Education Research, pp.377–413.
KUNKLE, W. M. and ALLEN, R. B., (2016), “The impact of different teaching approaches and languages on student learning of introductory programming concepts”, ACM Transactions on Computing Education (TOCE), 16(1), pp.1-26.
LAHTINEN, E., ALA-MUTKA, K. and JÄRVINEN, H. M., (2005), “A study of the difficulties of novice programmers”, ACM SIGCSE Bulletin, 37(3), pp.14-18.
LAMERAS, P., ARNAB, S., DUNWELL, I., STEWART, C., CLARKE, S. and PETRIDIS, P., (2017), “Essential features of serious games design in higher education: Linking learning attributes to game mechanics”, British Journal of Educational Technology, 48(4), pp.972-994.
LEAHEY, T. H. and HARRIS, R. J., (1989), “Human learning”, Prentice Hall.
LEDGARD, H. F., (1995), “The little book of object-oriented programming”, Prentice-Hall, Inc.
LEE, I., MARTIN, F., DENNER, J., COULTER, B., ALLAN, W., ERICKSON, J., MALYN-SMITH, J. and WERNER, L., (2011), “Computational thinking for youth in practice”, ACM Inroads, 2(1), pp.32-37.
LEWIS, J. R., (1995), “IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use”, International Journal of Human-Computer Interaction, 7(1), pp.57-78.
LEVY, R. B. B., BEN-ARI, M. and URONEN, P. A., (2003), “The Jeliot 2000 program animation system”, Computers & Education, 40(1), pp.1-15.
LIBERMAN, N., BEERI, C. and BEN-DAVID KOLIKANT, Y., (2011), “Difficulties in learning inheritance and polymorphism”, ACM Transactions on Computing Education (TOCE), 11(1), pp.1-23.
LIVOVSKÝ, J. and PORUBÄN, J., (2014), “Learning Object-Oriented Paradigm by Playing Computer Games: Concepts First Approach”, Central European Journal of Computer Science, 4(3), pp.171–182.
LOCKE, E. A. and LATHAM, G. P., (1990), “A theory of goal setting & task performance”, Prentice-Hall, Inc.
LOTFI, E. and MOHAMMED, B., (2018), “Teaching Object Oriented Programming Concepts Through a Mobile Serious Game”, In Proceedings of the 3rd International Conference on Smart City Applications, pp. 1-6.
MALONE, T. W., (1980), “What makes things fun to learn? Heuristics for designing instructional computer games”, In Proceedings of the 3rd ACM SIGSMALL Symposium and the First SIGPC Symposium on Small Systems, pp. 162-169.
MALONE, T. W. and LEPPER, M. R., (1987), “Aptitude, learning, and instruction”, Cognitive and Affective Process Analyses.
McGRIFF, S. J., (2001), “ISD knowledge base/instructional design & development/instructional systems design models”, No.28, pp:2005.
MERRILL, M. D., (2002), “First principles of instruction”, Educational Technology Research and Development, 50(3), pp.43-59.
MILNE, I. and ROWE, G., (2002), “Difficulties in Learning and Teaching Programming - Views of Students and Tutors”, Education and Information Technologies, 7(1), pp.55–66.
MOHAMMED, K. S., POTRUS, M. Y. and DAHAM, B. F. A., (2018), “Effect of Hybrid Teaching Methodology and Student Group Policy on Object Oriented Problem Solving”, ZANCO Journal of Pure and Applied Sciences, 30(5), pp.140-148.
MOHAMMED, B., (2019), “Towards a Mobile Serious Game for Learning Object Oriented Programming Paradigms”, In Innovations in Smart Cities Applications Edition 2: The Proceedings of the Third International Conference on Smart City Applications, pp. 450-462.
MORENO-LEÓN, J. and ROBLES, G., (2015), “Computer programming as an educational tool in the English classroom: a preliminary study”, In 2015 IEEE Global Engineering Education Conference (EDUCON), pp. 961-966.
MORRIS, M. G., SPEIER, C. and HOFFER, J. A., (1999), “An Examination of Procedural and Object-oriented Systems Analysis Methods: Does Prior Experience Help or Hinder Performance?”, Decision Sciences, 30(1), pp.107-136.
MURATET, M., TORGUET, P., JESSEL, J. P. and VIALLET, F., (2009), “Towards a Serious Game to Help Students Learn Computer Programming”, International Journal of Computer Games Technology, pp.1–12.
MUTUA, S., WABWOBA, F., OGAO, P., ANSELMO, P. and ABENGA, E., (2012), “Classifying program visualization tools to facilitate informed choices: teaching and learning computer programming”.
MUURO, M. E., OBOKO, R. O. and WAGACHA, W. P., (2016), “Evaluation of intelligent grouping based on learners’ collaboration competence level in online collaborative learning environment”, International Review of Research in Open and Distributed Learning, 17(2), pp.40-64.
MYLLER, N. and NUUTINEN, J., (2006), “JeCo: Combining program visualization and story weaving”, Informatics in Education, 5(2), pp. 267–276.
NICOLESCU, M. N. and MATARIC, M. J., (2003), “Natural Methods for Robot Task Learning: Instructive Demonstrations, Generalization and Practice”, In Proceedings of the Second International Joint Conference on Autonomous Agents and Multiagent Systems, pp. 241–48.
OR-BACH, R. and LAVY, I., (2004), “Cognitive activities of abstraction in
object orientation: an empirical study”, ACM SIGCSE Bulletin, 36(2), pp.82-86.
PAGE-JONES, M. and WEISS, S., (1989), “Synthesis: An object-oriented
analysis and design method”, American Programmer, 2(7-8), pp.64-67. PAPERT, S., (1998), “Does easy do it? Children, games, and learning. Game
Developer”, 5(6), p.88. PAPERT, S., (1998), “Technology in schools: To support the system or render
it obsolete”, Milken exchange on education technology. PARKER, L. E. and LEPPER, M. R., (1992), “Effects of fantasy contexts on
children's learning and motivation: Making learning more fun”, Journal of personality and social psychology, 62(4), p.625.
PROTOPSALTIS, A., PANNESE, L., HETZNER, S., PAPPA, D. and DE
FREITAS, S., (2010), “Creative learning with serious games”, International Journal of Emerging Technologies in Learning (iJET), 5.
PAVLOV, I., (1927), “Conditioned Reflexes”, Oxford University Press. PERROTTA, C., FEATHERSTONE, G., ASTON, H. and HOUGHTON, E.,
(2013), “Game-based learning: Latest evidence and future directions”, Slough: NFER.
PHELPS, A. M., EGERT, C. A. and BIERRE, K. J., (2005), “MUPPETS: multi-
user programming pedagogy for enhancing traditional study: an environment for both upper and lower division students”, In Proceedings Frontiers in Education 35th Annual Conference. pp. S2H-8.
POOLSAWAS, B., CHAIPORNKEAW, P. and SUPARKAWAT, S., (2016),
“Platform-based for Game Development to Improve The Object-oriented Programming Skills”, in 30th Natl. Conf. Educ. Technol. Naresuan Univ. Phitsanulok 65000 Thailand, Vol. 30, pp. 6.
215
PRENSKY, M. and THIAGARAJAN, S., (2007), “Digital Game-Based Learning”, Paragon House, St. Paul, MN, pp.17.
PRINCE, M. J. and FELDER, R. M., (2006), “Inductive teaching and learning
methods: Definitions, comparisons, and research bases”, Journal of engineering education, 95(2), pp.123-138.
PROULX, V. K., RAAB, J. and RASALA, R., (2002), “Objects from the
beginning-with GUIs”, In Proceedings of the 7th annual conference on Innovation and technology in computer science education. pp. 65-69.
QUALLS, J. A. and SHERRELL, L. B., (2010), “Why computational thinking
should be integrated into the curriculum”, Journal of Computing Sciences in Colleges, 25(5), pp.66-71.
RAGONIS, N. and BEN-ARI, M., (2005), “On Understanding the Statics and
Dynamics of Object-Oriented Programs”, Proceedings of the 36th SIGCSE Technical Symposium on Computer Science Education - SIGCSE ’05, 226.
RAIS, A. E., SULAIMAN, S. and SYED-MOHAMAD, S. M., (2011), “Game-
based approach and its feasibility to support the learning of object-oriented concepts and programming”, In 2011 IEEE Malaysian Conference in Software Engineering. pp. 307-312.
RAJASHEKHARAIAH, K. M. M., PAWAR, M., PATIL, M. S., KULENAVAR,
N. and JOSHI, G. H., (2016), “Design Thinking Framework to Enhance Object Oriented Design and Problem Analysis Skill in Java Programming Laboratory: An Experience”, In 2016 IEEE 4th International Conference on MOOCs, Innovation and Technology in Education (MITE). pp. 200-205.
RAMANAUSKAITĖ, S. and SLOTKIENĖ, A., (2019), “Hierarchy-Based
Competency Structure and Its Application in E-Evaluation”, Applied Sciences, 9(17), pp.3478.
RAZAK, S., GEDAWY, H., DANN, W. P. and SLATER, D. J., (2016), “Alice in
the Middle East: An Experience Report from the Formative Phase”, In Proceedings of the 47th ACM Technical Symposium on Computing Science Education, pp. 425-430.
REEVE, C., (2009), “Narrative-based serious games”, In Serious Games on
the Move. Springer, Vienna, pp. 73-89. REISER, R. A., (2007), “What field did you say you were in”, Trends and
issues in instructional design and technology, Vol. 3, 1-7. RENKL, A., (2005), “The worked-out-example principle in multimedia
learning”, The Cambridge handbook of multimedia learning, pp.229-245.
216
RESNICK, L., (1987), “Learning in school and out: Educational Researcher”. RICCI, K. E., SALAS, E. and CANNON-BOWERS, J. A., (1996), “Do
computer-based games facilitate knowledge acquisition and retention?”, Military psychology, 8(4), pp.295-307.
ROBINS, A., ROUNTREE, J. and ROUNTREE, N., (2003), “Learning and
teaching programming: A review and discussion”, Computer science education, 13(2), pp.137-172.
ROBLYER, M. D. and DOERING, A. H., (2014), “Integrating educational
technology into teaching”, Pearson new international edition. ROBSON, D., (1981), “Object-Oriented Software Systems”, Byte. ROWE, A. and WOOD, L., (2008), “What feedback do students want?”, In
Australian Association for Research in Education, Conference, pp. 1-9. ROSSON, M. B. and ALPERT, S. R., (1990), “The cognitive consequences of
object-oriented design”, Human-Computer Interaction, 5(4), pp.345-379.
SANDERS, I. and MUELLER, C., (2000), “A fundamentals-based curriculum for first year computer science”, ACM SIGCSE Bulletin, 32(1), pp.227-231.
SANDERS, K., BOUSTEDT, J., ECKERDAL, A., MCCARTNEY, R.,
MOSTRÖM, J. E., THOMAS, L. and ZANDER, C., (2008), “Student understanding of object-oriented programming as expressed in concept maps”, In Proceedings of the 39th SIGCSE technical symposium on Computer science education, pp. 332-336.
SANDERS, K. and THOMAS, L., (2007), “Checklists for grading object-
oriented CS1 programs: concepts and misconceptions”, ACM SIGCSE Bulletin, 39(3), pp.166-170.
SATRATZEMI, M., XINOGALOS, S. and DAGDILELIS, V., (2003), “An
environment for teaching object-oriented programming: ObjectKarel”, In Proceedings 3rd IEEE International Conference on Advanced Technologies, pp. 342-343.
SCRATCH., (2010), A programming language for everyone: Create interactive
stories, games, music and art – and share them online. http://scratch.mit.edu (Accessed: August 2018).
SENG, W. Y. and YATIM, M. H. M., (2014), “Computer game as learning and
teaching tool for object-oriented programming in higher education institution”, Procedia-Social and Behavioral Sciences, Vol. 123, pp.215-224.
217
SENG, W. Y., YATIM, M. H .M. and HOE, T. W., (2015), “Learning Object-Oriented Programming With Computer Games : A Game-Based Learning Approach”, In the Proceedings Of The European Conference On Information Management & Evaluation, pp. 729–38.
SENG, W. Y., YATIM, M. H .M. and HOE, T. W., (2014), “Use Computer Game
to Learn Object-Oriented Programming in Computer Science Courses”, IEEE Global Engineering Education Conference, EDUCON, pp: 9–16.
SENG, W. Y., YATIM, M. H .M. and HOE, T. W., (2016), “A Propriety Game-
Based Learning Game as Learning Tool to Learn Object-Oriented Programming Paradigm”, In Joint International Conference on Serious Games Springer International Publishing, pp. 42–54.
SENG, W. Y. and YATIM, M. H. M., (2018), “A Propriety Multiplatform Game-
Based Learning Game to Learn Object-Oriented Programming”, Proceedings - 2018 7th International Congress on Advanced Applied Informatics, IIAI-AAI 2018, pp. 278–83.
SENG, W. Y., YATIM, M. H .M. and HOE, T. W., (2018), “Learning Object-
Oriented Programming Paradigm Via Game-Based Learning Game – Pilot Study”, The International Journal of Multimedia & Its Applications. 10(6), pp. 181–97.
SHAFFER, D. W. and CLINTON, K. A., (2005), “Why All CSL Is CL:
Distributed Mind and the Future of Computer Supported Collaborative Learning”, In Proceedings of th 2005 conference on Computer support for collaborative learning: learning 2005: the next 10 years!, pages 592–601.
SHEETZ, S. D., IRWIN, G., TEGARDEN, D. P., NELSON, H. J. and
MONARCHI, D. E., (1997), “Exploring the difficulties of learning object-oriented techniques”, Journal of Management Information Systems, 14(2), pp.103-131.
SHUTE, V. J., HANSEN, E. G. and ALMOND, R. G., (2008), “You can't fatten
A hog by weighing It–Or can you? evaluating an assessment for learning system called ACED”, International Journal of Artificial Intelligence in Education, 18(4), pp.289-316.
SIDERIS, G., and XINOGALOS S., (2019), “PY-RATE ADVENTURES: A 2D
Platform Serious Game for Learning the Basic Concepts of Programming With Python”, Simulation and Gaming, 50(6), pp. 754–70.
SIEN, V. Y. and CARRINGTON, D., (2007), “A concepts-first approach to
object-oriented modelling”, In Proceedings of the Third Conference on IASTED International Conference: Advances in Computer Science and Technology, Phuket, Thailand, pp. 108–113.
218
SKINNER, B. F., (2011), “About behaviourism”, Vintage. SMITH, M. K., (1999), “Learning theory”, The encyclopedia of informal
education. SMITH, R. and GOTEL, O., (2007), “Using a game to introduce lightweight
requirements engineering”, In 15th IEEE International Requirements Engineering Conference (RE 2007), pp. 379-380.
SMOLARSKI, D. C., (2003), “A first course in computer science: Languages
and goals”, Teaching Mathematics and Computer Science, 1(1), pp.137-152.
SOMMERVILLE, I. and PRECHELT, l., (2004), “Object-Oriented Design”,
Software Engineering. STEFIK, M. and BOBROW, D. G., (1985), “Object-oriented programming:
Themes and variations”, AI magazine, 6(4), pp.40-40. SULLIVAN, G. M. and FEINN, R., (2012), “Using effect size-or why the P value
is not enough”, Journal of graduate medical education, 4(3), pp.279-282.
THOMASSON, B. J., (2005), “Identifying Faults and Misconceptions of Novice
Programmers Learning Object Oriented Design”, Ph.D. Dissertation, University of Wales, United Kingdom.
THOMASSON, B., RATCLIFFE, M. and THOMAS, L., (2006), “Identifying
novice difficulties in object oriented design”, ACM SIGCSE Bulletin, 38(3), pp.28-32.
THORNDIKE, E. L., (1923), “Educational Psychology: Mental work and fatique
and individual differences and their causes”, Vol. 3. Teachers college, Columbia university.
VON GLASERSFELD, E., (2012), “A constructivist approach to teaching”, In
Constructivism in education, Routledge. pp. 21-34 WEI, F., MORITZ, S. H., PARVEZ, S. M. and BLANK, G. D., (2005), “A
student model for object-oriented design and programming”, Journal of Computing Sciences in Colleges, 20(5), pp.260-273.
WEINERT, F. E., (2001), “Concept of competence: A conceptual clarification”. WILSON, K. A., BEDWELL, W. L., LAZZARA, E. H., SALAS, E., BURKE, C.
S., ESTOCK, J. L., ORVIS, K. L. and CONKEY, C., (2009), “Relationships between game attributes and learning outcomes: Review and research proposals”, Simulation & gaming, 40(2), pp.217-266.
219
WILSON, B. G., (1997), “Reflections on constructivism and instructional
design”, In C. R. Dills & A. J. Romiszowski (Eds.), Instructional
development paradigms, Educational Technology Publications,
Englewood Cliffs, New Jersey, pp. 63-80. WINSLOW, L. E., (1996), “Programming pedagogy-a psychological overview”,
ACM Sigcse Bulletin, 28(3), pp.17-22. WIRFS-BROCK, R. J. and JOHNSON, R. E. (1990), “Surveying current
research in object-oriented design”, Communications of the ACM, 33(9), pp.104-124.
WU, W. H., HSIAO, H. C., WU, P. L., LIN, C. H. and HUANG, S. H., (2012),
“Investigating the learning‐theory foundations of game‐based learning:
a meta‐analysis”, Journal of Computer Assisted Learning, 28(3), pp.265-279.
XINOGALOS, S. and SATRATZEMI, M., (2005), “Using hands-on activities
for motivating students with OOP concepts before they are asked to implement them”, In ITiCSE, pp. 380.
XINOGALOS, S., (2015), “Object-oriented design and programming: an
investigation of novices’ conceptions on objects and classes”, ACM Transactions on Computing Education (TOCE), 15(3), pp.1-21.
XU, C. W., (2009), “Teaching OOP and COP technologies via gaming”, In
Handbook of Research on Effective Electronic Gaming in Education, IGI Global. pp: 508-524.
YAN, L., (2009), “Teaching Object-Oriented Programming with Games”, ITNG
2009 - 6th International Conference on Information Technology: New Generations, pp. 969-974.
YOURDON, E., NEVERMANN, P., OPPEL, K., THOMANN, J. and
WHITEHEAD, K., (1995), “Mainstream Objects: An Analysis and Design Approach for Business”, Yourdon Press.
YULIA AND R. ADIPRANATA, (2010), “Teaching object oriented
programming course using cooperative learning method based on game design and visual object oriented environment,” in Proceedings of the 2nd International Conference on Education Technology and Computer (ICETC '10), pp. V2-355–V2-359,
YUSOFF, A., (2010), “A Conceptual Framework for Serious Games and Its
Validation”, Ph.D. dissertation, University of Southampton, United Kingdom.
220
APPENDICES
APPENDIX- A
Problem scenario for the identification of difficulties in learning OO:
APPENDIX- B
Details about the learning objectives, designated activities, and related OO concepts:
APPENDIX- C
Questionnaire for the identification of difficulties in learning Object-Orientation
APPENDIX- D
JSON Solution Model
APPENDIX- E
Quick reference to OOsg
Start-up of the game
Screen for the personal and game control information
The scenario selection screen appears after entering the personal and game control information
At the scenario selection screen, the user may select the warm-up session to get familiar with the game environment. Users may spend as much time in the warm-up session as they wish, or skip it and play the game
By skipping the warm-up session, the user is presented with the various levels related to the OOP concepts
After selecting a particular level related to an OOP concept, the user is shown an introduction to that topic
Once the user has read the introduction to the topic, the user is presented with the game goals and rules needed to play and clear the level
Once the user has read the game goals and rules, the game level screen is presented. The game level's screen layout is divided into five regions, i.e., top, bottom, left, right, and center. The top region serves as the scoreboard of the game; it shows the total number of correct solutions attempted, the score, options to get help or a hint if the player is stuck at any point, a timer, information about the current level, and an option to quit the game. The top region is almost the same across all levels, except that the information about the number of correct attempts and remaining attempts may vary. An option for showing or hiding the game story may also be available in the top region. The bottom region displays the game story in a drawer, which can be shown or hidden by pressing the hamburger icon in the top region. The left region is dedicated to navigating to the previous screen/level, and likewise the right region is used for navigating to the next screen/level. The center region is the actual play area; its contents differ between the levels available in the game.
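The five-region layout described above can be summarised as a simple data structure. A minimal sketch in Python follows; the region names and their contents are taken from the description, but the `level_layout` name and dictionary structure are purely illustrative, not part of the OOsg implementation:

```python
# Illustrative summary of the OOsg level-screen layout described above.
# The dictionary structure is a hypothetical sketch, not the game's code.
level_layout = {
    "top": [                      # scoreboard region
        "correct solutions attempted", "score", "help/hint options",
        "timer", "current level info", "quit option",
        "story show/hide toggle",
    ],
    "bottom": ["game story drawer (toggled via the hamburger icon)"],
    "left": ["navigation to the previous screen/level"],
    "right": ["navigation to the next screen/level"],
    "center": ["play area (contents vary per level)"],
}
```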
APPENDIX- F
Game Story
It was a great relief for Sara that her first-year undergraduate examinations were over, and she had done quite well in them, but that was not the only reason for her joy. Sara had long been waiting for the vacation so that she could finally tour her father's hospital. She had always wanted to run a hospital, not only because she found the profession noble, but also because her father was an inspiration to her, and she wanted to become like him.
So instead of taking a rest, she decided to spend the vacation learning about her father's hospital, because she was the only child in her family and was going to inherit his position. The next day she visited the hospital, where her father was already waiting for her. First, they exchanged greetings. Her father then walked her towards his office, where she saw her father's nameplate, Dr. Adnan Malik, M.B.B.S., F.C.P.S., on the door. Then they passed through a psychiatric ward, where two patients were talking. Her father said that one of them used to be a great doctor until he went insane. They overheard one patient speaking to the other.
Patient 1: Everything is unique; everything in this universe is an object with distinct features and qualities.
Patient 2 nodded, so he continued.
Patient 1: And an object can be anything, from a person to a small rock, and an object can be created or destroyed.
He then pointed to the tag attached to the uniform of the other patient and to a nurse who was passing by, and said: You see, every person has different things for their identification, such as an N.I.C. number, and also a name, father's name, gender, date of birth, address, phone, next-of-kin person, and next-of-kin relation, and they are all related to other persons as well; they are not alone. All the while, Sara and her father were listening to the patient talk. Her father said…
Sara, that was your first lesson.
They both then walked into the first wing of the hospital. Her father started talking, and Sara held a little notebook in her hand to note down every important thing her father told her.
Father: There are different groups of persons. One group is the hospital employees, which include doctors and nurses; they are responsible for the medical care and curative treatment of the patients. There are several other employee types, such as salaried and commission-based employees. He then gave Sara a temporary I.D. and said to her: Every hospital employee you see has an id, date of joining, a specified department, and duty timings.
We can get the id, get name, get the address, or get the phone of any employee at any time. We can also set id, set name, set address, or set phone of any newly appointed employee.
Then they walked back towards the offices of the doctors, where he said the doctors could give a prescription, give suggestions, view prescriptions, view suggestions, or view reports of the patient. Her father said: All the doctors have information about their degree and their specialization, and a commission list of commission information if they are commission-based employees.
The commission information covers the date and time on which the commission is arranged, the patient on whom the commission is to be earned, and also the total amount of the commission. Then they reached the patients' wing, where her father said:
Another group, who require medical treatment and cure, are the patients. See, Sara, patients also have an I.D., and for every patient, on the date of first registration, a new patient session is created in the patient session list. The session holds the details about the patient, such as the date and time of the patient's previous visits, the type of session, which doctors were assigned, the receipt list, the prescribed treatment list, the report list, and the remarks given by the doctor.
If any invoiced items are availed by the patient, then a receipt list is also maintained in the patient session. He then showed Sara a sample and said: Each receipt has a unique receipt number, date and time, mode of payment, and a description of the drug. Not only this, the patient history, the doctor assigned, and the ward, room, and bed details are all part of the patient session. When we convert O.P.D. to I.P.D., the admission of a patient depends on the availability of a room, bed, and ward. Then they reached a wing which had many wards. He continued: All of the wards, rooms, and beds have a ward number, ward name, room number, room name, and bed numbers. The representatives you see at the front table have information about the room list, room types, and bed list; if a room is already occupied, they also have information about the doctor and patient who occupy it. Then he showed Sara a hospital room and told her the room operations, which include allocate room, deallocate room, check availability, and clean room.
Then he also briefed her about the bed operations in the room; he said these include allocate bed, deallocate bed, and check availability. Lab tests, imaging tests, and vitals can be the investigations prescribed by the doctors.
They stopped at the diagnostic room, where he continued: Treatment given by the doctors may include surgery, nebulization, dialysis, or just simple medicines. Then he showed her a sample of lab tests and said: For every lab test, a report is generated, which becomes a part of the patient record. The reports can be an imaging report, a text report, a parametric report, or a chart report. The doctors may ask the patient for a blood pressure report, a temperature report, a sample lab report, a lab test report, or a lab parameter report. However, for some severe conditions, doctors can ask the patient for X-ray reports, an E.C.G. report, an ultrasound report, or a C.T. scan report. Then they reached a counter, where he said:
All the laboratory tests are invoiced items; the patients must pay for the items prescribed to them. There can be other invoiced items, such as surgeon fees for an operation, anaesthesia fees, O.T. recovery room charges, blood bank charges, nebulization fees, dialysis fees, lab test fees, imaging test fees, room charges, bed charges, and doctor fees. The prescribed treatment given to a patient includes details about the dose frequency: the dose can be given once, in the morning, at midday, at night, B.D., T.D.S., Q.I.D., hourly, four-hourly, six-hourly, eight-hourly, once a week, three times a week, PRN, STAT, before food, or after food; the duration and the date and time should also be included. He also showed a medicine sample to Sara and said: The medicines prescribed by the doctor have a medicine name, potency, dose frequency, and preparation instructions. Walking towards the waiting room, he said:
The patient session can be closed, which, in the case of I.P.D. patients, results in discharge. Once the patient session is closed, the patient must open a new one upon the next visit. Now they reached the office of the head nurse, where many nurses were discussing their tasks together. One was talking about viewing the suggestions given by the doctor and filing reports. Another nurse was talking about viewing reports and viewing the prescriptions given by the doctor. They both got tired of all the walking and talking. It was an excellent lesson for Sara about the systematic working of the hospital management system. So they both went back home. Sara was done with the first-day lesson, so she penned down all that her father had said, and started dreaming of establishing her own hospital.
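The hospital entities in the story implicitly define an object-oriented class model: objects with identification attributes, inheritance between person and employee types, and get/set operations. The following minimal sketch expresses that model in Python; the class and attribute names are taken from the narrative, but the code itself is only an illustration, not part of the OOsg game or its solution model:

```python
class Person:
    """The first lesson: every person is an object with identification
    attributes (the story also lists father's name, gender, D.O.B., etc.)."""
    def __init__(self, nic, name, address, phone):
        self.nic = nic
        self.name = name
        self.address = address
        self.phone = phone

    # "We can get the id, get name ... set id, set name" -- accessors.
    def get_name(self):
        return self.name

    def set_address(self, address):
        self.address = address


class Employee(Person):
    """Hospital employees inherit from Person and add an id,
    a specified department, and duty timings."""
    def __init__(self, nic, name, address, phone,
                 emp_id, department, duty_timings):
        super().__init__(nic, name, address, phone)
        self.emp_id = emp_id
        self.department = department
        self.duty_timings = duty_timings

    def get_id(self):
        return self.emp_id


class Doctor(Employee):
    """Doctors add a degree and specialization, and can give prescriptions."""
    def __init__(self, nic, name, address, phone, emp_id, department,
                 duty_timings, degree, specialization):
        super().__init__(nic, name, address, phone,
                         emp_id, department, duty_timings)
        self.degree = degree
        self.specialization = specialization

    def give_prescription(self, patient, medicine):
        # A prescribed medicine becomes part of the patient's treatment list.
        patient.prescribed_treatment_list.append(medicine)


class Patient(Person):
    """Patients have their own id; sessions and prescribed treatments
    are maintained per patient, as described in the story."""
    def __init__(self, nic, name, address, phone, patient_id):
        super().__init__(nic, name, address, phone)
        self.patient_id = patient_id
        self.patient_session_list = []
        self.prescribed_treatment_list = []
```

A `Doctor` object thus inherits `Person`'s accessors through `Employee` while adding behaviour of its own, mirroring how the story layers employee details on top of the general person attributes.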
APPENDIX- G
Pre-Test Scenario and associated Questions
APPENDIX- H
Rubric for Assessment
APPENDIX- I
Post-Test Scenario and associated Questions
APPENDIX- J
Questionnaire for Perceived Motivation
APPENDIX- K
Questionnaire for Perceived Feedback
APPENDIX- L
Questionnaire for Game Experience:
APPENDIX- M
Questionnaire for System Usability:
APPENDIX- N
Scatter plots for pre- and post-test scores of the Control and Experimental Groups
APPENDIX- O
Result of the normality test and paired t-test for study-1
(a)
Normality of Control Group
Normality of Experimental Group
(b)
Result of Paired Sample t-test of Control Group
Result of Paired Sample t-test of Experimental Group
APPENDIX- P
Result of the normality test and paired t-test for study-2
(a)
Normality of Control and Exp Groups' normalized learning gain
(b)
Paired Sample t-test of Control and Exp Groups' normalized learning gain
APPENDIX- Q
Result of the normality test and Wilcoxon Signed rank for perceived feedback
(a)
Normality of Control and Exp Groups’ perceived feedback
(b)
Wilcoxon Signed rank of Control and Exp Groups' perceived feedback