Paper ID #33773

Reflection and Transformational Learning in a Data Structures Course

Ms. Cheryl Lynn Resch, University of Florida

BS and MS in Mechanical Engineering, University of Maryland; MS in Computer Science, Johns Hopkins University.

2017-present: University of Florida, teaching core Computer Science courses and cybersecurity courses.

1988-2017: Johns Hopkins University Applied Physics Laboratory.

Mr. Amanpreet Kapoor, University of Florida

Amanpreet Kapoor is a lecturer in the Department of Engineering Education, and he teaches computing undergraduate courses in the Department of Computer & Information Science & Engineering (CISE). He received his M.S. in Computer Science from the University of Florida in 2016 and a B.Tech. in Computer Science & Engineering from Jaypee University of Engineering and Technology, India in 2015.

© American Society for Engineering Education, 2021


Reflection and Transformational Learning in a Data Structures and Algorithms Class

Abstract

Reflective practice is the process of using one's beliefs and prior experiences to analyze a problem; it is making meaning from experience. The process starts with noticing and naming the problem, continues to analyzing the problem, and finishes with forming new beliefs in order to solve the problem. Reflective practice is an important skill for computing students to master; responding to reflection prompts can aid students in developing problem-solving skills. However, there is limited empirical evidence on the effectiveness of reflective practice in Data Structures courses, in which computing students are honing problem-solving skills. To fill this gap, we evaluate the effectiveness of assigning guided reflection prompts with programming assignments in an undergraduate Data Structures course in encouraging students to articulate their problem-solving strategies. 219 students completed two programming assignments and were asked to respond to reflection prompts after each. Students' responses were (1) analyzed for word and sentence count as a measure of thoughtful engagement in the reflection assignment and (2) deductively coded using progressive levels of reflection derived from Dewey and Moon: Noticing, Making Sense, Making Meaning, and Transformative Learning. We found that most students gave thoughtful responses for each prompt, averaging 3.8 sentences and 83 words. 87% of students produced responses at the level of Making Meaning or Transformative Learning. The thoughtful responses indicate that students find reflection beneficial to their learning and will expend effort on the process. The number of students reaching the levels of Making Meaning and Transformative Learning indicates that students are successfully using the reflection prompts to make connections in their learning and to develop new ways of thinking and problem solving.

Introduction

Undergraduate Computer Science education entails more than imparting facts and learning to use tools. The ACM Computer Science Curricula 2013 states that "Graduates need to understand how to apply the knowledge they have gained to solve real problems, not just write code and move bits" [1]. Thus, computing educators must equip their students with skills for solving problems. Dewey [2] defines reflection as a way of thinking that results from examining beliefs when habitual thinking is not adequate to solve a problem. Dewey evokes the scientific method of observation, formulation, and testing of hypotheses, and suggests that reflective thinking must occur in order to hypothesize new ways to think about and solve a problem. Reflection prompts can guide students to articulate a problem and develop their problem-solving skills [3].

In a Data Structures and Algorithms course, we expect students to use their knowledge from introductory programming courses to implement data structures and algorithms for solving problems efficiently in terms of space and time complexity [4]. Assignments and projects in Data Structures and Algorithms require students to make decisions about which data structures to use and are challenging enough that jumping straight into coding is not an efficient way to solve the problem. Students often have a hard time choosing the most efficient data structure, which may be less familiar, over the data structures they are comfortable with [5]. We believe that responding to reflection prompts can help students contemplate the choices they made and make connections that they can use when solving future problems [6], [7]. In this paper, we explore the use of reflection in the context of problem solving in a Data Structures and Algorithms course. In particular, we measured students' (1) levels of reflection and (2) level of effort in answering reflection prompts in order to answer the following research questions:

● RQ1 - What levels of reflection do data structures and algorithms students show when prompted to reflect on their problem solving for a programming assignment?

● RQ2 - How do students' levels of reflection change from the first programming assignment to the second programming assignment?

● RQ3 - What level of effort do students exert in their responses to reflection prompts about programming projects?

Theoretical Background

In "The Reflective Practitioner," Schon [8] advocated for an expansion of the education of professionals beyond imparting a collection of facts and skills. Schon suggested that professionals must become familiar with the process of solving problems that they have not seen before, which requires them to engage in "reflection-in-action" [8]. In the context of learning and problem solving, when a learner has a problem that cannot be solved in the habitual way (e.g., by iterating over a linear data structure), reflective thinking must occur in order to hypothesize new ways to think about and solve the problem. "Demand for the solution of a perplexity is the steadying and guiding factor in the entire process of reflection" [2]. Dewey defined distinct steps in reflection:

● The occurrence of a difficulty
● Definition of the difficulty
● Suggestion of a possible solution
● Rational elaboration of an idea
● Corroboration of an idea and formation of a concluding belief

Moon [3] extends the concept of reflection from "difficulties," or solving specific problems in professional practice, to the process of learning in general in undergraduate education. Moon describes the stages of learning as a continuum from surface learning to deep learning that corresponds to Dewey's stages of reflection:

● Noticing - naming the problem and imparting facts
● Making Sense - describing the problem, but not in relation to previous understandings
● Making Meaning - the beginning of a holistic view
● Working with Meaning - integrating ideas
● Transformative Learning - new learning has transformed understanding; there is a restructuring of ideas

Requiring students to memorize facts only gets them to the first stage of learning. The ultimate goal is for students to update the models in their minds so that they can use the concepts they have learned to solve future problems. Thus, Moon expands the idea of the reflective process from solving a problem or responding to a "difficulty" to making connections and developing a holistic view so that students have a base of knowledge to draw on in new situations [3]. Instructors can guide learners through these stages of learning with reflection prompts [9], [10]. This paper examines responses to general open-ended reflection prompts to posit which stage of learning the students are in.


Rogers [9] reviewed the writings of Dewey and Schon and developed a set of requirements relating to the process of reflection for teaching and learning in higher education. Rogers points out that to enable learners to engage in reflection, they should be prompted to think about an unusual or perplexing situation. For a computer science undergraduate, this could be in the context of writing an algorithm to solve a problem or working with a group to figure out how to meet software engineering requirements. Rogers suggested several frameworks for guiding learners through reflection. One of these frameworks uses a set of structured questions about a particular experience, and it is the one we use in our study. The work of Schon, Dewey, Moon, and Rogers forms our theoretical foundation, and we used their ideas as a lens to interpret student reflections.

Prior Work

VanDeGrift et al. [11] describe an instrument used in an introductory Computer Science course with five programming assignments. After the second and third programming assignments, they prompted students to "Write down at least one SPECIFIC plan that you can implement to improve your software development process." They found that a majority of students could recognize what needs to be improved, but that students need specific guidance to create measurable plans for improvement. VanDeGrift et al. advocate reflection prompts as part of programming assignments because this requires students to reflect while they are solving problems.

Kakavouli and Metaxas [12] describe an instrument to facilitate reflection for students in a Data Structures class. Throughout a semester, students fill out questionnaires to document their progress, following Rogers' recommendation of guided prompts. Example questions are: "What did you learn?" and "What would you like to explore further?" They also make reflection prompts part of programming assignments, reinforcing the idea of VanDeGrift et al. that this is a good time for Computer Science students to engage in reflection-in-action. We build on this work by providing reflection prompts with programming assignments and then coding the responses to explore the level of reflection demonstrated by the students.

Chen et al. [13] describe a study of the responses of software engineering students to automatically generated reflection prompts based on a student's GitHub commits. The prompts were given during graded iterations of a semester-long programming project. They analyzed 34 responses and found that the average response was over 100 words, which is an encouraging result. While this is a rather small study, it reinforces two of the factors that can encourage reflection: specific, guided instructions, and timing of the reflection assignment during a problem-solving activity. Our paper investigates students' movement toward Transformative Learning via guided reflection prompts given after completing programming assignments.

Nylén and Isomöttönen [14] examined 864 reflections from 36 students in a Software Engineering class. Students were asked to report on "critical incidents," defined as times when they perceived that they learned something new or gained a better understanding of something. The instructions were vague and open to interpretation: "this was a deliberate choice since we were interested in seeing different ways in which learners could approach and benefit from the assignment" [14, p. 90]. The authors coded the reflections into categories that roughly correspond to those of Moon. 3% of the reflections were coded as Theoretical, which roughly corresponds to Transformative Learning. 65% were coded as Generic Language/How-to, which roughly corresponds to Making Meaning. 15% were coded as Diary Like, which corresponds to Making Sense. 17% were coded as What Am I Doing/Status Report, which corresponds to Noticing. The authors also reported that the average length of a reflection was 12 words, seemingly quite short for a report on a critical incident. The authors conclude that they should consider how to guide students toward "conceptual reflection." Our paper expands on this work by coding responses to more specific prompts into similar categories, in order to explore whether specific prompts produce more fruitful reflection.

In two papers, Coffey [15], [16] describes a study and a follow-on study of reflective activities learners perform as part of the design process in an advanced programming course. Learners produced an initial design for a programming assignment and then submitted a document describing the final design, what changed from the initial design, and why the changes were made. The goal of the work was to study how reflection on deficiencies in a preliminary design might help learners better anticipate what is needed in a design in subsequent projects. In the first study, 23 learner reflections were analyzed and coded [15]. The author found that learners did well at describing the changes that were made but did poorly at describing why the changes were necessary; half the reflections contained little to nothing about why changes were made. In the second study, learners were given more detailed and specific instructions. In this study, students provided richer reflections and were able to describe why changes were necessary. This result further indicates that detailed instructions are required to encourage reflection. We build on this work by also requiring students to submit an initial design and describe design changes.

Study Context

In this paper, we evaluate the use of reflection in the context of programming assignments in a Data Structures and Algorithms course. To answer the research questions, we collected student responses to reflection prompts for two programming assignments. The students were recruited from a Data Structures and Algorithms course at the University of Florida during the Fall 2019 semester. This course is taken by Computer Science, Computer Engineering, and Digital Arts and Sciences majors and Computer Science minors after completing two introductory courses: Computer Science 1 and Computer Science 2. The course objectives are the following: (1) students will be able to choose and implement data structures for solving problems based on their functions and situational appropriateness of application, and (2) students will be able to choose an algorithm for solving a problem based on its computational complexity and appropriateness of application.

Methods

Data were collected from 219 students. Table 1 provides demographics of the study participants.

Table 1 - Demographics of Participants

Gender: Male 163, Female 54, Unspecified 3
Major: Computer Science 155, Computer Engineering 37, Digital Arts and Sciences 2, Other (CS Minor) 25
Year: Sophomore 132, Junior 58, Senior 25, Post-graduate 4
Retaking the course: No 201, Yes 18


Data Collection

Students were required to complete two major programming assignments. The first assignment was due 5 weeks into the semester and required students to create a binary expression tree (BET). The second assignment was due 13 weeks into the semester and required students to implement two graph algorithms (GA). The assignments required the use of the C++ programming language. Table 2 gives details of the assignments. Each assignment was split into two parts: with part a, students were required to submit a plan for completing the assignment, and with part b, students were required to respond to reflection prompts. Each programming assignment was worth 15% of a student's final grade in the class, and the reflection portion was 20% of the grade on the assignment. Students were encouraged to write two to three sentences in response to each reflection prompt. Students were advised that responses would not be graded on content and that responses with at least two sentences per prompt would receive full credit. We chose our reflection prompts to ask students to reflect on their learning and on a difficulty [14], [17], [18], and, in the first assignment, to describe changes they made to their initial design [15], [16].

Table 2 - Programming Assignments and Reflection Prompts

Binary Expression Tree (BET)
Part a - Write a program that takes an infix expression as input and produces a postfix expression.
Part b - Extend the code from Part a to include methods to create an expression tree and evaluate the expression. Write a reflection that answers the following questions: What was the hardest part of this assignment? What did you learn from this assignment? What changes did you make from your initial design for part b and your implementation? Why? If you didn't make any changes, why not?

Graph Algorithms (GA)
Part a - Implement a graph and a method that uses Kruskal's algorithm to find the minimum spanning tree.
Part b - Extend the code from Part a to include an implementation of Dijkstra's algorithm. Write a reflection that answers the following questions: What was the hardest part of this assignment? What did you learn from this assignment?
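For context on the kind of work the BET assignment involves, the following is a minimal, hypothetical sketch of an expression-tree node with recursive evaluation, written in C++ because the assignments required C++. The node layout and function names are our illustration, not the course's actual starter code or grading interface.

```cpp
#include <iostream>
#include <string>

// Hypothetical sketch (not the course's starter code): an expression-tree node
// and a recursive, post-order evaluation of the kind the BET assignment asks for.
struct Node {
    std::string token;        // operator ("+", "-", "*", "/") or a numeric operand
    Node* left = nullptr;
    Node* right = nullptr;
};

// Evaluate the subtree rooted at n: evaluate both children first, then apply
// the operator stored at this node; a leaf holds a numeric operand.
double evaluate(const Node* n) {
    if (n->left == nullptr && n->right == nullptr) {
        return std::stod(n->token);
    }
    double lhs = evaluate(n->left);
    double rhs = evaluate(n->right);
    if (n->token == "+") return lhs + rhs;
    if (n->token == "-") return lhs - rhs;
    if (n->token == "*") return lhs * rhs;
    return lhs / rhs;         // "/" (error handling omitted in this sketch)
}

int main() {
    // Tree for (3 + 4) * 2, as it might be built from the postfix form "3 4 + 2 *".
    Node three{"3"}, four{"4"}, two{"2"};
    Node plus{"+", &three, &four};
    Node times{"*", &plus, &two};
    std::cout << evaluate(&times) << "\n";   // prints 14
}
```

The recursive structure of this evaluation is exactly the kind of design decision the reflection prompts ask students to articulate.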


Table 3 - Description of Qualitative Codes (examples are student responses to post-assignment reflection prompts)

Noticing (Level 1)
  Description from Moon: Naming the problem and imparting facts.
  Distinguished from the next level: no description.
  Example student response: "The hardest part of this assignment was understanding how recursion works when used to traverse a tree."

Making Sense (Level 2)
  Description from Moon: Describing the problem, but not in relation to previous understandings.
  Extended description: Descriptive words (e.g., fun, tricky, important, annoying, interesting); description of one part of the implementation.
  Distinguished from the next level: no comparison of two solutions, no before and after phases of problem solving.
  Example student response: "Finding the bugs required diligent use of the debugger, as well as careful attention to where in the tree the program was at any given time."

Making Meaning / Working with Meaning (Level 3)
  Description from Moon: Integrating ideas; the beginning of a holistic view.
  Extended description: Description of problem solving that covers more than one part of the implementation, describes before and after phases of solving the problem, or compares two solutions.
  Distinguished from the next level: no discussion of using the knowledge in the future.
  Example student response: "Initially, I wanted to do an adjacency list graph implementation; however, I had to convert to using an adjacency matrix in order to optimize the Dijkstra's algorithm."

Transformative Learning (Level 4)
  Description from Moon: New learning has transformed understanding; there is a restructuring of ideas.
  Extended description: Plans for the future, lessons to be used in the future, theoretical discussion.
  Example student response: "I learned that recursion is an effective tool for solving problems with trees. Prior to this assignment I knew of recursion and how it works, but never had known an effective way to use them or why I would ever want to use them. In this scenario recursion finally seems to be a useful tool that I can use to solve problems that I may come across in programming."

Data Analysis

Student responses to reflection prompts were deductively coded using progressive levels of reflection derived from Dewey and Moon: Noticing, Making Sense, Making Meaning, Working with Meaning, and Transformative Learning [2], [3]. Table 3 shows the levels developed by Moon [3], with further descriptions added by the authors to aid in coding and examples of student reflections from our course to contextualize the deductive codes. Each reflection response was analyzed and coded according to the highest level of reflection found in the response.

The first author coded 1074 responses to the five reflective prompts from 219 students into the four codes described in Table 3. To verify the reliability of the first author's coding scheme, the second author independently coded 200 random responses (18.6% of the dataset). Cohen's Kappa, a measure of inter-rater agreement, was 0.77 between the first and second author, which qualifies as Substantial agreement, and 0.59 between the first and third author, which is classified as Moderate agreement [19]. The authors discussed the codes on which there was disagreement until a consensus was reached about the code accuracy. This step allowed the authors to refine the code descriptions. The third author then coded 50 random responses based on the new descriptions, and the agreement was compared with the first author's codes. Kappa was 0.83, which is classified as Almost Perfect agreement [19]. The coding process was followed by a frequency analysis of responses within each code to answer RQ1. For students who answered all five prompts, we recorded the highest reflection level for each student across the five prompts.

To determine whether individual students' reflection levels were higher later in the semester than at the beginning of the semester and answer RQ2, we performed a paired samples t-test [20] on the reflection levels for each student for each repeated prompt, with the null hypothesis that a student's reflection level on the later assignment is not different from the reflection level on the earlier assignment. A paired samples t-test is appropriate because it compares each student's reflection level on the second assignment to their reflection level on the first assignment. Assumptions of normality (kurtosis and skew less than 2) and homogeneity of variance (ratio of variances less than 1:4) were found to be valid. Effect size was calculated using D = d / SD, where d is the mean of the differences between the two reflection levels and SD is the standard deviation of those differences [21]. Cohen proposed that D = 0.2 is a small effect size, D = 0.5 is a medium effect size, and D = 0.8 is a large effect size [21]. This means that if the effect size is below 0.2, the difference is trivial, even if it is statistically significant.

To quantify the level of effort students put into responding to reflection prompts and answer RQ3, we calculated the word and sentence count for each response. We then calculated descriptive statistics for word and sentence count per prompt and per reflection level.
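Concretely, writing x_i^(1) and x_i^(2) for student i's reflection levels (coded 1-4) on the first and second assignment, and d_i = x_i^(2) - x_i^(1) for the paired difference over n students, the test statistic and effect size described above are the standard paired-samples quantities (our notation, consistent with the D = d / SD definition in the text):

\[
\bar{d} = \frac{1}{n}\sum_{i=1}^{n} d_i, \qquad
s_d = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(d_i - \bar{d}\right)^2}, \qquad
t = \frac{\bar{d}}{s_d/\sqrt{n}}, \qquad
D = \frac{\bar{d}}{s_d}.
\]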


Findings

Students answered the prompts 'What did you learn?' and 'What was the hardest part?' in response to two assignments: BET (due in week five) and GA (due in week thirteen). For the BET assignment, students also answered the prompt 'What changes did you make from your initial design?'. There were a total of 1074 responses from 219 students. 207 students answered all five prompts; 12 students neglected to answer one or more of the prompts. Table 4 shows the overall number of responses coded at each level for the 1074 responses and the highest reflection level for the 207 students across the five prompts.

RQ1 - What levels of reflection do data structures and algorithms students show?

Overall, we found that 46% of the 1074 responses exhibited high levels of reflection, with the majority of these responses coded as Making Meaning (40.7%) and a small minority coded as Transformative Learning (5.3%). We also found that 64.3% of the 207 students' highest coded level of reflection (n = 133) was at the level of Making Meaning and 22.7% (n = 47) was at the level of Transformative Learning. Thus, 87% of the 207 students were capable of demonstrating Making Meaning or Transformative Learning, which are the targeted levels of reflection that can result in meaningful improvements in students' problem-solving skills. While 54% of the 1074 responses exhibited lower levels of reflection, Noticing (25%) and Making Sense (29%), we see a low percentage of students' highest reflection levels being coded in these categories. This means that many students who wrote a lower-level response had another response at a higher level. This suggests that while many data structures and algorithms students may write lower-level reflective responses for some prompts, they also demonstrate higher levels of reflection for other prompts or on later assignments.

Table 4 - Reflection Level Frequency per Prompt

Code: Number of responses, N = 1074 (%); Number of students' highest reflection level, N = 207 (%)
Noticing: 269 (25.0%); 3 (1.4%)
Making Sense: 311 (29.0%); 24 (11.6%)
Making Meaning: 437 (40.7%); 133 (64.3%)
Transformative Learning: 57 (5.3%); 47 (22.7%)

RQ2 - How do students' levels of reflection change from the first programming assignment to the second programming assignment?

For the prompt 'What was the hardest part?', 96 students (46%) wrote a response at a higher level on the second assignment (GA) than they did on the first. For the prompt 'What did you learn?', 85 students (41%) wrote a response at a higher level on the second assignment (GA) than they did on the first.

To determine the magnitude of the effect of increased reflection levels from the first to the second assignment, we performed a paired samples t-test on the reflection level for each student for each repeated prompt. For the prompt 'What was the hardest part?', the difference in reflection level between responses on the second programming assignment and the first programming assignment was statistically significant (D = 0.45, t(206) = 6.461, p < 0.001), where D is Cohen's effect size [21]. The effect size is moderate, so the difference is also practically significant. The mean difference in reflection levels for this prompt is 0.425 (SD = 0.947). For the prompt 'What did you learn?', the difference in reflection level between responses on the second programming assignment and the first programming assignment was statistically significant (D = 0.15, t(206) = 2.202, p = 0.029). The effect size is small, so the difference is not practically significant. The mean difference in reflection levels for this prompt is 0.193 (SD = 1.262). The reflection level is higher later in the semester for 'What was the hardest part?' but not for 'What did you learn?'.
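As a consistency check (our own arithmetic, not a separate table in the study), the reported effect sizes and t values follow from the reported mean differences and standard deviations with n = 207 paired responses:

\[
\text{'What was the hardest part?': } D = \frac{0.425}{0.947} \approx 0.45, \qquad t = \frac{0.425}{0.947/\sqrt{207}} \approx 6.46;
\]
\[
\text{'What did you learn?': } D = \frac{0.193}{1.262} \approx 0.15, \qquad t = \frac{0.193}{1.262/\sqrt{207}} \approx 2.20.
\]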

For context, we provide examples of reflections from two students to the prompt 'What was the hardest part?' for the two assignments. Each student achieved a higher level of reflection on the second assignment than on the first.

● Student 1 - Noticing (Level 1) to Making Meaning (Level 3)
  o Noticing – "The hardest part for me was implementing the algorithms for in order and post order traversal. I knew the algorithm for each but writing the recursive code gave me a little trouble."
  o Making Meaning – "The hardest part of this assignment for me was the printMST in part a. I originally did not understand what the purpose of disjoint sets was and did not use this class in my original solution. Because of this I ... failed at test 7 where I eventually realized that my graph was not fully connected. I then had to change my strategy, implement a new class, and add several checks for determining when my MST was complete..." (a minimal sketch of the disjoint-set structure this student refers to appears after these examples)

● Student 2 - Making Meaning (Level 3) to Transformative Learning (Level 4)
  o Making Meaning – "The hardest part of this assignment was checking for valid arguments with recursive-style functions. It was much easier to keep track of the validity of the argument when tree traversal was done with a stack or queue instead of through recursive calls."
  o Transformative Learning – "The hardest part of this assignment for me was a debugging session in which I didn't realize I was comparing the iterators for two completely different containers... I attribute this to poor naming conventions and will definitely do a better job of distinguishing my variable names, as well as limiting the use of variables for storage while not absolutely needed."
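For readers unfamiliar with the disjoint-set class the first student mentions, the following is a minimal, hypothetical union-find sketch in C++ of the kind Kruskal's algorithm relies on. It is our illustration, not the student's submission or the course's provided code.

```cpp
#include <iostream>
#include <numeric>
#include <vector>

// Hypothetical sketch: the disjoint-set (union-find) structure that Kruskal's
// algorithm uses to decide whether adding an edge would create a cycle, and to
// detect when the graph is not fully connected.
class DisjointSet {
public:
    explicit DisjointSet(int n) : parent(n) {
        std::iota(parent.begin(), parent.end(), 0);   // each vertex starts in its own set
    }
    int find(int v) {                                  // root of v's set, with path compression
        return parent[v] == v ? v : parent[v] = find(parent[v]);
    }
    bool unite(int a, int b) {                         // merge the sets; false if already joined
        a = find(a);
        b = find(b);
        if (a == b) return false;
        parent[a] = b;
        return true;
    }
private:
    std::vector<int> parent;
};

int main() {
    // Kruskal's algorithm scans edges in increasing weight order and keeps an edge
    // (u, v) only when unite(u, v) succeeds; if fewer than n - 1 edges are kept,
    // the graph is not fully connected and no spanning tree exists.
    DisjointSet ds(4);
    std::cout << ds.unite(0, 1) << ds.unite(1, 2) << ds.unite(0, 2) << "\n";  // prints 110
}
```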

RQ3 - What level of effort do students exert?

Students were advised to write 2 to 3 sentences per prompt. On average, students wrote responses of 3.8 sentences (SD = 2.3) and 83.6 words (SD = 51.9) per prompt. The statistics were consistent when disaggregating by prompt. Table 5 shows the mean word and sentence counts per reflection level. Word and sentence counts increase with reflection level, indicating that students reflecting at a higher level put in more effort. At the lowest level of reflection, on average, students put in the required amount of effort. At higher levels, students were willing to expend more than the required effort to respond to the prompt, with an average of 4.66 sentences for Making Meaning and 4.43 sentences for Transformative Learning.

Table 5 - Word and Sentence Counts per Level

Code: Word count, mean (SD); Sentence count, mean (SD)
Noticing: 54.16 (32.7); 2.75 (2.62)
Making Sense: 70.49 (38.9); 3.34 (1.72)
Making Meaning: 106.48 (52.8); 4.66 (2.43)
Transformative Learning: 102.05 (47.15); 4.43 (2.03)

Discussion

To answer the first research question, we found that 5.3% of reflections from our students are at the level of Transformative Learning and 40.7% are at the level of Making Meaning. Our results for Transformative Learning are consistent with Nylén and Isomöttönen, while our results for Making Meaning are a lower percentage than they reported [14]. Our larger sample size may account for the increased variability in the reflection levels. When tabulating the highest reflection level by individual students in their answers to the five prompts, we found that 87% of the students had at least one response at the level of Making Meaning or Transformative Learning. This suggests that accompanying programming assignments with guided reflection prompts can encourage students to reflect more deeply on their problem-solving process and identify lessons learned for future problem solving.

Students' reflection levels were found to be statistically significantly higher later in the semester. These findings suggest that guided reflection prompts with programming assignments can help students grow, over the semester, in their articulation of problem solving and in making connections.

In our analysis of the level of effort expended by students in answering the reflection prompts, we found that the average response length was 3.8 sentences and 83 words. Students writing at higher reflection levels went beyond the suggested length even though they were told that two sentences would earn full credit. This provides evidence that many students find the activity useful. The word count is longer than that reported by Nylén and Isomöttönen [14] and similar to that reported by Chen et al. [13]. Students reflecting at the lowest level generally met the two-sentence guideline, and those reflecting at higher levels went beyond it. This suggests that students will heed guidelines and will thoughtfully engage with reflection prompts about programming assignments.

Threats to Validity

We coded students' responses to reflection prompts according to the levels of learning defined by Moon [3], and we interpret each code as the level of learning that the student has reached. We did not advise the students that we were doing this, teach them about the levels of learning, or ask them to show Transformative Learning. It is possible that our interpretation of student responses does not accurately reflect where they are in their learning. Future work should include sharing the levels of learning with students, indicating where they are, and providing prompts that are adapted to encourage each student to reach the next level of learning.


Conclusion

This paper contributes empirical evidence of the usefulness of guided reflection prompts with programming assignments for helping students grow their problem-solving skills in a Data Structures and Algorithms course. Our findings suggest that reflection prompts as part of programming assignments are well received by students and that students thoughtfully engage when answering the prompts. Some students grew in their reflection level over the semester. Future research will explore whether prompts can be adapted so that all students are encouraged to move up a level in their reflection. This approach will make more apparent how students can grow in their reflective practice and move toward Making Meaning and Transformative Learning.

References

[1] ACM. [n.d.]. Curriculum Guidelines for Undergraduate Degree Programs in Computer Science 2013. Retrieved April 3, 2020 from https://www.acm.org/binaries/content/assets/education/cs2013_web_final.pdf
[2] John Dewey. 1933. How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process. D.C. Heath.
[3] Jenny Moon. 2001. PDP Working Paper 4: Reflection in Higher Education Learning. Technical Report. Higher Education Academy. 1-25 pages.
[4] Leo Porter, Daniel Zingaro, Cynthia Lee, Cynthia Taylor, Kevin C. Webb, and Michael Clancy. 2018. Developing Course-Level Learning Goals for Basic Data Structures in CS2. In Proceedings of the 49th ACM Technical Symposium on Computer Science Education (Baltimore, Maryland, USA) (SIGCSE '18). Association for Computing Machinery, New York, NY, USA, 858–863. https://doi.org/10.1145/3159450.3159457
[5] Daniel Zingaro, Cynthia Taylor, Leo Porter, Michael Clancy, Cynthia Lee, Soohyun Nam Liao, and Kevin C. Webb. 2018. Identifying Student Difficulties with Basic Data Structures. In Proceedings of the 2018 ACM Conference on International Computing Education Research (Espoo, Finland) (ICER '18). Association for Computing Machinery, New York, NY, USA, 169–177. https://doi.org/10.1145/3230977.3231005
[6] Cruz Izu and Brad Alexander. 2018. Using Unstructured Practice plus Reflection to Develop Programming/Problem-Solving Fluency. In Proceedings of the 20th Australasian Computing Education Conference (Brisbane, Queensland, Australia) (ACE '18). Association for Computing Machinery, New York, NY, USA, 25–34. https://doi.org/10.1145/3160489.3160496
[7] Dastyni Loksa, Andrew J. Ko, Will Jernigan, Alannah Oleson, Christopher J. Mendez, and Margaret M. Burnett. 2016. Programming, Problem Solving, and Self-Awareness: Effects of Explicit Guidance. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (San Jose, California, USA) (CHI '16). Association for Computing Machinery, New York, NY, USA, 1449–1461. https://doi.org/10.1145/2858036.2858252
[8] Donald A. Schon. 1984. The Reflective Practitioner: How Professionals Think in Action. Basic Books.
[9] Russell R. Rogers. 2001. Reflection in higher education: A concept analysis. Innovative Higher Education 26, 1 (2001), 37–57.
[10] Manuel G. Correia and Robert E. Bleicher. 2008. Making connections to teach reflection. Michigan Journal of Community Service Learning 14, 2 (2008), 41–49.


[11] Tammy VanDeGrift, Tamara Caruso, Natalie Hill, and Beth Simon. 2011. Experience Report: Getting Novice Programmers to THINK about Improving Their Software Development Process. In Proceedings of the 42nd ACM Technical Symposium on Computer Science Education (Dallas, TX, USA) (SIGCSE '11). Association for Computing Machinery, New York, NY, USA, 493–498. https://doi.org/10.1145/1953163.1953307
[12] Stella Kakavouli and Panagiotis Metaxas. 2012. Also "Your" Job to Learn! Helping Students to Reflect on Their Learning Progress. J. Comput. Sci. Coll. 27, 6 (June 2012), 113–120.
[13] Hui Chen, Agnieszka Ciborowska, and Kostadin Damevski. 2019. Using Automated Prompts for Student Reflection on Computer Security Concepts. In Proceedings of the 2019 ACM Conference on Innovation and Technology in Computer Science Education (Aberdeen, Scotland, UK) (ITiCSE '19). Association for Computing Machinery, New York, NY, USA, 506–512. https://doi.org/10.1145/3304221.3319731
[14] Aletta Nylén and Ville Isomöttönen. 2017. Exploring the Critical Incident Technique to Encourage Reflection during Project-Based Learning (Koli Calling '17). Association for Computing Machinery, New York, NY, USA, 88–97. https://doi.org/10.1145/3141880.3141899
[15] John W. Coffey. 2017. A Study of the Use of a Reflective Activity to Improve Students' Software Design Capabilities. In Proceedings of the 2017 ACM SIGCSE Technical Symposium on Computer Science Education (Seattle, Washington, USA) (SIGCSE '17). Association for Computing Machinery, New York, NY, USA, 129–134. https://doi.org/10.1145/3017680.3017770
[16] John W. Coffey and Bernd Owsnicki. 2016. Introducing a Reflective Activity into the Design Process in an Advanced Computer Programming Course. 31, 5 (May 2016), 29–37.
[17] Ng Sook Han, Ho Ket Li, Lee Choy Sin, and Keng Pei Sin. 2014. The evaluation of students' written reflection on the learning of general chemistry lab experiment. MOJES: Malaysian Online Journal of Educational Sciences 2, 4 (2014), 45–52.
[18] Julia Prior, Samuel Ferguson, and John Leaney. 2016. Reflection is Hard: Teaching and Learning Reflective Practice in a Software Studio. In Proceedings of the Australasian Computer Science Week Multiconference (Canberra, Australia) (ACSW '16). Association for Computing Machinery, New York, NY, USA, Article 7, 8 pages. https://doi.org/10.1145/2843043.2843346
[19] J. Richard Landis and Gary G. Koch. 1977. The measurement of observer agreement for categorical data. Biometrics (1977), 159–174.
[20] Richard G. Lomax and Debbie L. Hahs-Vaughn. 2013. An Introduction to Statistical Concepts. Routledge.
[21] Jacob Cohen. 2013. Statistical Power Analysis for the Behavioral Sciences. Academic Press.
[22] Mary Ryan. 2013. The pedagogical balancing act: teaching reflection in higher education. Teaching in Higher Education 18, 2 (2013), 144–155. https://doi.org/10.1080/13562517.2012.694104