
Teaching and Learning Computing

Matt Bower Macquarie University

[email protected]

Introduction

This literature review focuses on cognitive aspects of teaching and learning Computer Science. In order to limit the scope of the research, the report does not address assessment, gender, media, collaborative or affective aspects of Computer Science Education to any significant extent. Softer skills such as IT management, Requirements Analysis and Human Computer Interaction are also not addressed – emphasis is placed upon the teaching and learning of computer programming.

The first section of the report introduces some general educational theories in order to provide a framework for all subsequent discussion. In the next section, literature relating to the cognitive processes and models of Computer Science learners is summarised so as to provide an understanding of the domain specific educational theories. The next section turns attention to attributes of the learner, in particular identifying those characteristics that lead to success in Computing. Following this, literature relating to the difference between novice and expert practitioners is identified, providing a framework around which the progressive development of the Computer Scientist can be based.

In order for effective teaching and learning of computer science to occur, the areas in which students experience difficulty in learning computing must be understood. Accordingly a section is devoted to this topic, focusing especially on the introductory phases of learning to program. This section allows unnecessary impediments to the learning process to be pre-empted (by both teachers and learners), and acts as an antecedent to the next section on approaches to Instructional Design in Computer Science Education. The section on Instructional Design presents literature relating to many different aspects of teaching and learning computing, such as how to introduce new concepts, the efficacy of approaches such as scaffolding and behaviour modelling, and how to develop higher order critical thinking skills. This is a large section that integrates several components of teaching and learning computing. Finally, the Theory in Practice section outlines some exemplar learning activities that have been recommended in the literature and describes some approaches to synthesising pedagogical paradigms. This section provides evidence of how the theory and research relating to Computer Science Education can be applied to create effective learning experiences.


It should be noted that wherever possible this report draws on research based upon scientific methodologies rather than opinion pieces. Where relevant, the validity and reliability of these studies are critically evaluated. The preference for experimental research has been made to allow clearer comparison between approaches and more objective conclusions to be drawn.

Some General Education Theories

There are some general educational frameworks that pervade all teaching and learning in Computer Science. Three of these are described in this section: Constructivism, Information Processing and Scaffolding. A brief introduction to each is presented below to provide a context within which to frame all subsequent discussion.

Constructivism

Constructivist approaches to learning (Piaget, Vygotsky) propose that “learners must individually discover and transform complex information, checking new information against old rules and revising them when they no longer work” (Slavin, 1994, p. 199). Murnane and Warner (2001) go on to explain “constructivist classrooms are often viewed as problem-solving environments manifest through three C’s: context, construction and collaboration”. With constructivism the learning process becomes one of

i) assimilation – fitting new objects into existing schemes by which we view our world, and

ii) accommodation – adjusting existing schemes to explain the way that new objects relate to our world.

There are various educational techniques that can be used to create a constructivist curriculum. For assimilation tasks, Reigeluth’s (1980) idea of “epitomes”, which provide a simplified example of the concept being presented, can offer the learner a concrete building block from which to abstract. Carroll’s (1998) Minimalist Theory proposes the use of meaningful, self-contained activities in order to reinforce concepts constructed in the acquisition phase. Both of these techniques place emphasis on the student developing a ‘deep’ understanding as opposed to engaging in a superficial or ‘surface’ learning of facts.

On the other hand, accommodation can be encouraged by creating “cognitive disequilibrium” in the mind of the students. This is where an “imbalance between what is observed and what is understood” is prompted by the learning activity or other experience (Slavin, 1994, p. 330). This imbalance serves to catalyse student curiosity by challenging their existing world-view, which can in turn lead to the learner constructing new schemata and amending old ones, thus progressing them to a higher level of understanding. Strategies for facilitating accommodation are discussed in later sections.
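To make the idea of an epitome concrete, consider the following sketch (an invented illustration rather than one drawn from Reigeluth). When introducing classes in Java, an epitome might strip away constructors, access modifiers and methods, leaving only the simplest case from which richer cases can later be abstracted:

```java
// Epitome: the simplest form of a 'class' a learner might meet first.
// Richer machinery (constructors, access modifiers, inheritance) is
// deliberately omitted so the core idea -- a class groups related data --
// can be grasped before it is generalised.
class Student {
    String name;   // a piece of data each object remembers
    int mark;      // another piece of data
}

class EpitomeDemo {
    public static void main(String[] args) {
        Student s = new Student();   // create one concrete object
        s.name = "Alice";
        s.mark = 72;
        System.out.println(s.name + " scored " + s.mark);
    }
}
```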


Information Processing Theories

An appreciation of Information Processing Models can assist students in better managing their learning and also assist academics to construct more efficient and effective educational materials. The basic human cognitive system can be considered to comprise the following faculties:

• sensory perception – facilities such as sight, hearing, smell, for obtaining information from the environment

• short term memory – a temporary, limited capacity store for holding and manipulating information

• long-term memory – a permanent, organised and virtually unlimited store of knowledge.

Figure 1: Simplified model of the human cognitive system

New information enters the human cognitive system from the environment via the sensory perception faculties and must go through the following processes for meaningful learning to occur:

1. reception – the learner must pay attention to the input in order for it to enter short term memory

2. availability – the learner must possess prerequisite knowledge in memory in order to assimilate new concepts

3. activation – the learner must actively use the new information in conjunction with prerequisite knowledge in order to connect the new material to existing schema. (Mayer, 1989)

If any of these processes are not executed then meaningful learning cannot occur and the learner will be forced to rote learn information and commit each piece to memory. The implication for teaching and learning computing is that students must be engaged, material must be appropriately pitched and tasks must involve reformulation of the material if an understanding framework is to be developed.

[Figure 1 depicts the flow of information: a stimulus enters via Sensory Perception, passes to Short Term Memory, is exchanged with Long Term Memory, and ultimately produces a response.]


There are several related Information Processing theories that can be directly applied to the teaching and learning of computing, which have been summarised by Slavin (1994, p. 197-199):

• Levels-of-Processing Theory – the recall of a stimulus from memory is dependent upon the amount of mental processing that it receives

• Dual Code Theory of Memory – information that is presented both visually and verbally is recalled better than information represented using only one medium

• Transfer-Appropriate Processing – the strength and durability of memory depends on the similarity between the conditions under which the material was learned and those under which it is called for

• Parallel Distributed Processing – information is processed simultaneously with different parts of the memory system (sensory register, short term memory and long term memory) operating on the same information at the same time

• Limited Capacity Processors – optimal learning occurs when the limits of people’s information processing capabilities are not exceeded.

These theories can be used to refine the types of tasks prescribed and the way they are presented. For instance, teachers may realise that written instructions on how to compile a Java program may be more appropriately presented as an online multimedia demonstration (Dual Code Theory) accompanied by a practical reformulation exercise to test skill acquisition (Levels of Processing). Students may realise that trying to learn too many new concepts at once without consolidating previous ones can actually impede their progress (Limited Capacity Processors).

Information Processing Theories not only explain the mechanics of thinking, but also offer techniques for improving cognition. For instance, students can overcome the limits of cognitive capacity when dealing with complex programs by using ‘chunking’ (Rist, 1989). As small pieces of knowledge are combined into lines of code, lines of code into simple plans, and simple plans into more complex plans and programs, the chunks of knowledge constructed at each level of description hide the complexity of the level below. As students develop more expertise the task of writing programs becomes simpler because chunks of knowledge can be retrieved rather than created from scratch, and encouraging students to deliberately form chunks can improve their pace of development.

Just as it is important to understand how Information Processing Theories explain the ways in which thorough learning occurs, it is equally important to use these models to recognise how students come to construct a weakly formed or ill-founded understanding of a concept. ‘Interference’ is one such model that explains how sets of information can be disturbed by other concepts in memory. Two common forms of interference are:

• retroactive inhibition – when “previously learned information is lost because it is mixed up with new and somewhat similar information” (Slavin, 1994, p. 201). For instance in computing one type of loop should not be taught until the other has been thoroughly learnt


• proactive inhibition - when “learning one set of information interferes with learning of later information” (Slavin, 1994, p. 201). This is particularly relevant when computing students are learning a second programming language, especially if it operates on a conceptual paradigm distinct from their first.

These theories of interference remind teachers that pushing learners beyond their information processing capabilities when learning ‘hard’ skills can be detrimental to their progress. However it should be noted that there are often times when one learning experience supports another. For instance, the learning of a first programming language can greatly assist in the learning of a second programming language (proactive facilitation), while the learning of the second language can enrich the understanding of the first (retroactive facilitation). The key is providing adequate and well timed learning opportunities for students to compare and contrast concepts so that they understand the differences and appreciate the similarities of the ideas being presented – maximising facilitation and reducing the possibility of inhibition.
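For example (a sketch that assumes Java is the teaching language), presenting two loop constructs side by side once both have been introduced gives students exactly this kind of compare-and-contrast opportunity:

```java
public class LoopComparison {
    public static void main(String[] args) {
        // A 'while' loop tests its condition BEFORE each iteration,
        // so its body may execute zero times.
        int i = 0;
        while (i < 3) {
            System.out.println("while iteration " + i);
            i++;
        }

        // A 'do-while' loop tests its condition AFTER each iteration,
        // so its body always executes at least once.
        int j = 0;
        do {
            System.out.println("do-while iteration " + j);
            j++;
        } while (j < 3);
    }
}
```

Making the difference explicit (test-first versus test-last) is intended to maximise facilitation between the two constructs and reduce the interference described above.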

Scaffolding

Vygotsky (1978) identified the ‘Zone of Proximal Development’ as those tasks that a learner cannot complete unassisted but can complete with some support. The idea of ‘scaffolding’ is to provide a high level of support at the initial (more difficult) phases of learning and gradually remove that support as the learner grows in competence. Support can include cues, heuristics, hints, providing examples and so on. The web is an ideal medium for providing scaffolding on self directed tasks; simple measures such as providing an ‘explanation’ link at each component task of a learning sequence allow learners to enlist support when required, without the explanations slowing them down by appearing in the main flow of the text. In this way students can implement an individually differentiated form of Teles’ (1994) “fading approach to scaffolding”, whereby levels of learner support are initially high but reduce as the student gains confidence with the tasks and materials being presented.

Different forms of scaffolding may be appropriate in different circumstances. Landa’s (1976) Algo-Heuristic Theory identifies classes of problems where it is necessary to execute operations in a well structured, predefined sequence (algorithmic problems) and also classes of problems for which precise and unambiguous sets of instructions cannot be formulated (creative or heuristic problems). Landa (1976) also describes semi-algorithmic and semi-heuristic problems, processes and instructions, the point being that the level of prescription in the scaffolding needs to be appropriately matched to the problem. A key idea behind Landa’s theory (and scaffolding generally) is that students ought to be taught not only subject matter but the underlying algorithms and heuristics of experts as well. An extension of this is that they also have to be encouraged to create algorithms and heuristics on their own, which introduces them to the cognitive operations, algorithms and heuristics that make up general methods of thinking; a rich exercise in metacognition1.

Scaffolding caters to many of the principles of adult learning. Firstly, scaffolding works on the assumption that learners will want to process materials as quickly as possible, which is appropriate in an adult learning environment. Secondly, Knowles’ (1984) Theory of Andragogy implies that instruction for adults needs to focus more on the process and less on the content being taught (scaffolding in the form of algorithms and heuristics is process rather than content oriented). Thirdly, process scaffolding is particularly appropriate for activities such as case studies, role-plays, simulations and self-evaluation, which are utilised in the more ‘soft skill’ evaluative domains of adult learning, such as IT management.
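As a concrete (and invented) illustration of fading scaffolds in a programming context, an early exercise might supply a skeleton in which the algorithmic steps are spelled out as comments and most of the code is given, with later exercises progressively removing these supports:

```java
// A scaffolded exercise sketch: the structure and algorithmic steps are
// supplied and the learner completes only the marked step. In later
// exercises the comments and skeleton would be progressively 'faded'.
public class LargestValue {
    public static int largest(int[] values) {
        // Step 1: assume the first element is the largest so far.
        int largestSoFar = values[0];
        // Step 2: examine every remaining element.
        for (int i = 1; i < values.length; i++) {
            // Step 3 (learner completes): if values[i] is bigger than
            // largestSoFar, remember it.
            if (values[i] > largestSoFar) {
                largestSoFar = values[i];
            }
        }
        // Step 4: return the result.
        return largestSoFar;
    }

    public static void main(String[] args) {
        System.out.println(largest(new int[] {3, 9, 4}));  // prints 9
    }
}
```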

Cognitive Processes and Models in Computing Thinking

While Constructivism, Information Processing Theories and Scaffolding are pervasive theories that apply to learning in all subject areas, various investigations have been conducted into the types of cognitive models and processes involved in learning computer programming. These processes and models are important in assisting Computer Science educators to better understand how students learn computing so that they can construct learning activities that allow students to more rapidly and confidently acquire programming skills and techniques.

One such study into student cognition in Computer Science was conducted by Aharoni (2000), who used semi-structured phenomenological interviews to investigate the cognitive representations held by students in a Data Structures course. In his research Aharoni presents the cyclic Action-Process-Object model that cognitive scientists use to describe how people abstract concepts. Under this model people build cognitive frameworks by transforming processes into objects.

Figure 2: A simplified version of the Action-Process-Object model (Aharoni, 2000, p. 28)

1 Metacognition can be defined as “knowledge about one’s own learning, or knowing how to learn and monitoring one’s own learning behaviours to determine the degree of progress and strategies needed for accomplishing instructional goals.” (Slavin, 1994, p. 232)


As well as the multidisciplinary Action-Process-Object model, Aharoni (2000) also presents a computing specific model that describes the level of abstraction at which students think, from Programming-Language Oriented Thinking (low level abstraction, ‘action’ based), to Programming Oriented Thinking (‘process’ based, where reference to a programming language is required, but not necessarily a specific one), to Programming-Free Thinking (high level abstraction, ‘object’ based).

[Figure 3 depicts these thinking types as a continuum of abstraction, from Programming-Language Oriented Thinking (low) through Programming Oriented Thinking to Programming-Free Thinking (high), all situated within Programming Context Thinking.]

Figure 3: Abstraction Levels of Thinking Types Relating to Programming

Aharoni (2000) proposes developing students’ skills in posing ‘abstraction barriers’ (refer to the Action-Process-Object model) as an effective way to develop their Programming-Free Thinking abilities. He recommends hiding data structure implementations from students so that they can experiment with the data structure operations (in a concrete manner); this provides an abstraction barrier that allows students to form cognitive schema that can eventually facilitate the development of higher level abstraction.

In a different study, Hazzan (2003) presents a model of how students reduce the level of abstraction in Mathematics and Computer Science in order to cope with unfamiliarity. She defines levels of abstraction in three ways:

“1. abstraction level as the quality of the relationships between the object of thought and the thinking person (Wilensky, 1991); 2. abstraction level as reflection of the process-object duality (Dubinsky, 1991; Sfard, 1991); 3. abstraction level as the degree of complexity of the concept of thought.”

(p. 97)


Hazzan (2003) proposes that the quality of the relationship between the object of thought and the person is compromised as students unconsciously find ways to make an unfamiliar idea more familiar by applying the concept to objects that they already know. For instance, computer science students may describe an array data structure as “a space in computer memory that can contain two or more numbers” without regard for the fact that it can contain fewer than two elements, contain items other than numbers, or even that it does not have to be represented on a machine. They do this because they are familiar with the idea of lists of numbers being represented by a machine, and thus reduce the abstractness of the concept of “array” in order to provide themselves with a working object with which to address applications of the abstract idea. The danger of reducing abstraction in this way is that it can lead to misapplication of the underlying concept to unfamiliar domains. It is the responsibility of the teacher and student to continually expand students’ levels of abstraction by extending applications to unfamiliar domains so that the accuracy of students’ underlying mental models can be improved.

Like Aharoni (2000), Hazzan also refers to the process-object duality to describe how abstract concepts are formed. The key idea is that when concepts are forming in the learner’s mind they do so as a sequence of actions, or a “process”. However as the learner undergoes “reflective abstraction” the process becomes a notion captured as an “object”, a solid conceptual entity upon which other concepts can be built (Dubinsky, 1991, cited in Hazzan, 2003). Hazzan (2003) points out that students often unreflectively employ a canonical procedure (one that is more or less triggered by a particular problem) to arrive at solutions rather than engaging in reflective abstraction. An all too common phenomenon amongst Computer Science students is to apply familiar algorithms to solve problems rather than considering more abstract but efficient and concise solutions. In this way they avoid having to spend time exploring unfamiliar concepts and can remain within their comfort zone. However in doing so they compromise their ability to abstract concepts, which ultimately retards their progress. It is the responsibility of educators to draw this to students’ attention, and the responsibility of learners to engage in reflective abstraction rather than merely apply canonical procedures.

McGill and Volet (1997) create a conceptual framework for analysing computing thinking that integrates three distinct types of programming knowledge emerging from the educational computing literature (syntactic, conceptual, and strategic) and three distinct forms of knowledge proposed in the cognitive psychology literature (declarative, procedural, and conditional). A summary of the model they constructed is presented in the table below, after a brief sketch illustrating the array misconception described above.
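The following sketch (an invented illustration, not taken from Hazzan’s data) shows why the ‘two or more numbers’ description is too narrow: a Java array may hold fewer than two elements and need not hold numbers at all.

```java
public class ArrayMisconception {
    public static void main(String[] args) {
        // The description "a space in memory that can contain two or more
        // numbers" fails for all three of the following legal arrays.
        int[] empty = new int[0];              // zero elements
        int[] single = {42};                   // exactly one element
        String[] words = {"not", "numbers"};   // elements that are not numbers

        System.out.println(empty.length);   // 0
        System.out.println(single.length);  // 1
        System.out.println(words[0]);       // not
    }
}
```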


The framework crosses the syntactic and conceptual types of programming knowledge (rows) with the declarative and procedural forms of knowledge (columns), and adds a fifth, integrative category:

1. Declarative-Syntactic Knowledge: knowledge of syntactic facts related to a particular language, such as knowing that a semicolon is needed to end each statement in Pascal.

2. Procedural-Syntactic Knowledge: the ability to apply rules of syntax when programming, such as the ability to write a syntactically correct REPEAT statement in Pascal.

3. Declarative-Conceptual Knowledge: understanding of and the ability to explain the semantics of the actions that take place as a program executes, such as the ability to explain what a fragment of pseudocode does.

4. Procedural-Conceptual Knowledge: the ability to design solutions to programming problems, such as the ability to design a procedure to compute the mean of some data.

5. Strategic/Conditional Knowledge: the ability to design, code, and test a program to solve a novel problem.

Table 1: McGill & Volet’s (1997) Components of Programming Knowledge Framework
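As a hedged illustration of how the cells differ (using Java rather than the Pascal of the original examples), the procedural-conceptual knowledge of cell 4 is exercised when designing a routine such as the one below, while cells 1 and 2 cover merely knowing and applying the syntax in which it is written:

```java
// Designing this routine draws on procedural-conceptual knowledge (cell 4):
// deciding to accumulate a running total and divide by the number of items.
// Writing it without syntax errors draws on syntactic knowledge (cells 1-2).
public class MeanExample {
    public static double mean(double[] data) {
        double total = 0.0;
        for (double value : data) {
            total += value;          // accumulate the running total
        }
        return total / data.length;  // divide by the number of items
    }

    public static void main(String[] args) {
        System.out.println(mean(new double[] {2.0, 4.0, 9.0}));  // 5.0
    }
}
```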

This model has twofold utility. Firstly, it has educational potential for diagnosing deficiencies in the programming knowledge of novice programmers during a course of instruction. Secondly, the model serves as a basis for designing comprehensive instruction in introductory programming. To demonstrate these benefits, McGill and Volet (1997) apply the framework to a previous experimental study (Volet, 1991). Their application allowed them to identify that Volet’s (1991) approach of an explicit planning strategy for algorithm development, in conjunction with modelling, coaching, and collaborative-learning activities, had significantly positive effects on students’ procedural-conceptual, strategic/conditional and procedural-syntactic knowledge, but not upon students’ declarative-syntactic and declarative-conceptual knowledge2.

2 It was argued that students’ development of declarative-syntactic and declarative-conceptual knowledge was not significantly affected because these are skills that students can learn via independent study of written materials, and so they are not dependent on the educational approach.

Some sub-domains of computer science have led to specialised mental models of how students learn computing being developed. Recursion is one such area. For instance, in an ethnographic study involving 511 first year students at the University of the Witwatersrand, Gotschi, Sanders and Galpin (2003) conducted an extensive analysis of students’ mental models of recursion. What they found was that without a viable mental model of recursion that correctly represents active flow (when control is passed forward to new instantiations) and passive flow (when control flows back from the terminated instantiations), students cannot reliably construct recursive algorithm traces.

There are several advantages to such domain specific models. Firstly, they can inform educators’ decisions about the required approach to learning – in the case of recursion a constructivist approach is required in order for students to create a viable mental model adequate to apply design concepts and problem solve. Secondly, Gotschi, Sanders and Galpin (2003) point out that domain specific models assist lecturers by providing accurate mental models, such as Kahney’s ‘copies’ model of recursion, that have been demonstrated as successful at promoting understanding. Thirdly, such research explicitly exposes non-viable mental models that students may form (such as the looping, magic, and step models), allowing lecturers and pupils to pre-empt student errors.
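A minimal sketch (in Java, chosen here purely for illustration) of the two flows a viable model must capture: active flow as control passes forward into new instantiations, and passive flow as results return from terminated ones.

```java
public class RecursionFlow {
    // sum(n) returns 0 + 1 + 2 + ... + n.
    public static int sum(int n) {
        if (n == 0) {
            return 0;          // base case: passive flow begins here
        }
        // Active flow: control is passed forward to a new instantiation, sum(n - 1).
        int rest = sum(n - 1);
        // Passive flow: control has flowed back from the terminated instantiation,
        // and this instantiation now completes its own work.
        return n + rest;
    }

    public static void main(String[] args) {
        // Active flow:  sum(3) -> sum(2) -> sum(1) -> sum(0)
        // Passive flow: 0 -> 1 -> 3 -> 6
        System.out.println(sum(3));  // 6
    }
}
```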

One crucial mental model that has been identified in numerous studies (refer to Robins et al., 2003, p. 149) is that of an abstract version of the computer, often called a ‘notional machine’. The notional machine is “an idealised, conceptual computer whose properties are implied by the constructs in the programming language employed” (du Boulay et al., 1989, p. 431). Robins et al. (2003, p. 149) state: “That the notional machine is defined with respect to the language is an important point; the notional machine underlying Pascal is very different from the one underlying Prolog. The purpose of the notional machine is to provide a foundation for understanding the behaviour of running programs.”

Du Boulay et al. (1989) suggest that in order for novice programmers to overcome the comprehension problems caused by the hidden actions and side effects of visually unmarked processes, the notional machine needs to be simple and supported with some kind of concrete tool which allows the model to be observed. They suggest that the visibility component of such models be supported through ‘commentary’ – a teacher delivered or automated exposé of the state of the machine. The simplicity component of the machine, on the other hand, can be supported through:

1. functional simplicity (operations require minimal instructions to specify)

2. logical simplicity (problems posed to students are of contained scale)

3. syntactic simplicity (the rules for writing instructions are accessible and uniform).

They conclude that matching the visibility and simplicity components of notional machines to different populations of novice learners leads to improved educational outcomes. That the notional machine assists learning is not a hypothetical proposition: Mayer (1989) showed that students supplied with a notional machine model were better at solving some kinds of problems than students without the model. One would also suspect that without a notional machine cognitive model a student’s progress in computing would be severely restricted in the long term, and that the more sophisticated a student’s notional machine the more developed their problem solving abilities. Both of these conjectures represent potential areas for further research.

Another distinction made between the various cognitive representations in computer programming is that of ‘schema’ (static, ‘program as text’) versus ‘plan’ (action oriented, ‘programming as activity’) (Rogalski & Samurcay, 1990, cited in Robins et al., 2003, p. 141). Although models that describe program comprehension (schema based) are much more common than those that describe program creation (plan based), Rist (1995) has constructed an elaborate model of how programs are generated. In Rist’s (1995) model, knowledge is represented using nodes in internal memory (working, episodic, and semantic) or external memory (the program specification, notes, or the program itself). Nodes are used to represent chunks of knowledge or plans using a tuple of the form <role, goal, object>. A node may represent a piece of knowledge that ranges in size from a line of code, to routines of arbitrary size, or even an entire program. Nodes can therefore nest within one another to form tree structures. Nodes also have four ‘ports’ (:use, :make, :obey and :control), which allow them to be linked with respect to control and data flow.

Using Rist’s (1995) model, the process of building a program commences with a search cue such as <find, average, rainfall>, and the retrieval from memory of any matching node. If there are no matching nodes (schemas) then a top-down strategy would see the search cue being broken down into related cues, such as <find, sum, rainfall> and <find, count, rainfall>. From here the newly linked nodes are then expanded and linked in the same way. These linked systems of code represent ‘plans’, which are then stored by experts as schema-like knowledge structures. Such a model can be (and has been, see Rist, 1995) used to represent many aspects of learning. For instance, various design strategies (including procedural, functional, means-end and opportunistic) can be represented as a directed progression through the knowledge representations and actions3. Differences between the design strategies of novices and experts can then be identified using the framework provided by the model. Such a model offers educators a framework for analysing successful (and unsuccessful) student behaviour in the activity of computer programming.

3 Rist (1995) implements this approach using an Artificial Intelligence system called Zippy to demonstrate how a computer can use the various design techniques to solve problems.
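A hedged sketch of where such a decomposition might end up in code (the rainfall example is adapted freely, not reproduced from Rist): the cue <find, average, rainfall> has been broken into <find, sum, rainfall> and <find, count, rainfall>, and each sub-plan appears as a recognisable chunk within the finished routine.

```java
// The <find, average, rainfall> plan expressed as Java, once the sub-cues
// <find, sum, rainfall> and <find, count, rainfall> have been expanded and
// linked. Each sub-plan is a chunk an expert could retrieve ready-made.
public class RainfallAverage {
    public static double averageRainfall(double[] dailyRainfall) {
        double sum = 0.0;                      // <find, sum, rainfall>
        int count = 0;                         // <find, count, rainfall>
        for (double reading : dailyRainfall) {
            sum += reading;
            count++;
        }
        return sum / count;                    // link the sub-plans to form the average
    }

    public static void main(String[] args) {
        System.out.println(averageRainfall(new double[] {0.0, 12.5, 3.1}));
    }
}
```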

What types of students are more likely to succeed in CS?

While understanding underlying educational theories assists Computer Science teachers in the general construction of effective learning experiences, of equal importance is an understanding of student differences. Before engaging in educational processes, Instructional Design models (such as those outlined by Clarke, 1995) recommend performing a thorough user analysis to acquire an accurate understanding of learners. By appreciating the different attributes of Computer Science learners, educators can tailor instruction and courseware to more effectively meet the needs of all students and can assist learners in developing a greater understanding of the requirements for success in computing. As well, if the factors leading to achievement in computing are understood, appropriate interventions to improve the likelihood of student success can be made.

Numerous studies from a variety of perspectives have been conducted in an attempt to determine these attributes. A summary of research findings on student characteristics that lead to success is provided below. Where relevant, results are critiqued for reliability and validity.

One major study into the characteristics of successful students was that conducted by Wilson and Shrock (2001). They investigated the correlation between student attributes and success based upon surveying and testing of 105 students in an introductory Computer Science course. The twelve possible predictive factors they considered were:

1. math background
2. attribution of success/failure to luck
3. attribution of success/failure to effort
4. attribution of success/failure to task difficulty
5. attribution of success/failure to ability
6. domain specific self-efficacy
7. encouragement
8. comfort level in the course
9. work style preference
10. previous programming experience
11. previous non-programming experience, and
12. gender.

Their study found three statistically significant predictive factors of success in their course (listed in order of effect):

i) comfort level in the course
ii) math background, and
iii) attribution of success to luck.

Success in the course was positively correlated with comfort level and math background, whereas attribution to luck was negatively correlated with performance. Also, by deconstructing the previous programming and non-programming survey instruments into component questions, they found that a formal class in programming had a positive influence on grades while a history of playing computer games actually had a negative influence on grades.

This study was based upon a rigorous methodology (including a pilot implementation, content validity reviews by experienced Computer Science professors and reliability analysis based on Pearson’s Correlation Coefficient for all predictor variables). However, the survey instrument used to collect student attribute data was distributed just prior to the mid-term test upon which the ‘performance success’ variable was based. The researchers note that this methodology may have affected the validity of the data. For instance, the comfort level that students reported may not have been the cause of superior performance but rather an intervening effect of some other factor such as the amount of study completed.

However, experiments by other researchers also support the claim that students with prior programming experience attain significantly better performance than those with none. In a quantitative analysis of 364 first year computing students, Morrison and Newman (2001) found that a prior course in computing was a highly significant factor in determining final course grade, and that prior courses completed at university had a more positive impact than high school courses. Another study conducted by Hagan and Markham (2000) extends upon these results by providing evidence that the more languages with which a student has experience, the better their performance tends to be. As well, Hagan and Markham (2000) found that the more programming languages students had studied, the greater their confidence in passing the course, and the less difficult they perceived the pace and content of the course compared with other courses.


An interesting facet of Hagan and Markham’s (2000) research is that results were persistent throughout the semester long investigation. Data from the 121 participants was collected at 5 separate stages throughout the semester, and the effect of prior programming experience upon performance did not diminish as students with less prior experience had a chance to close the knowledge and skills gap.

Alan & Kolesar (1996) provide evidence that completing a precursory course targeted at developing problem solving skills and introducing students to the broad concepts of computer science using a skills based approach contributed more to success in an introductory programming course (CS1) than a previous formal course in programming. They implemented such a course (CS0) based on the belief that CS1 is not an appropriate first course in computing for most students, citing figures of a 22% pass rate for students with no previous experience in computing4. The study by Alan & Kolesar (1996) found that 53% of students who completed CS0 received an A grade in CS1 as compared to only 26% A grades for students who undertook a previous course in BASIC (which was the language used in their CS1 course). Developing students’ problem solving abilities5 and broad understanding of computing concepts, fostered through a skills-based laboratory approach, appears to have improved students’ likelihood of success in computing more than a prerequisite programming course.

4 Interestingly, possessing mere applications experience (spreadsheets, word processing) increased the likelihood of passing to over 50%. This indicates that for many students the learning curve of computer familiarity is too steep to be scaled in the space of one semester.

5 See Alan & Kolesar (1996) for examples of the problem solving and computer skills activities that were utilised. Note that the problem solving activities were designed to develop the nine ‘generic skills of good problem solvers’ they compiled from a literature review of articles spanning three decades.

That mathematical-logical ability is a predictor of success in computer science is also supported by other research (such as Byrne and Lyon, 2001 and Almstrum, 1996). Interestingly, Byrne and Lyon (2001) also found that gender, prior computing experience, and Kolb’s Learning Style preference (Converger, Accommodator, Diverger, Assimilator) were not predictive factors for success in an introductory Programming and Logical Methods course. The only other predictive factor that they found apart from Mathematics ability was Science ability – Leaving Certificate scores in English and foreign languages were not predictive.

Considering more basal cognitive attributes leading to success in computing, Mayer et al. (1989) deployed a battery of pre and post tests on 111 students enrolled in a semester long BASIC course and found a significant correlation between final examination performance and the following skills:

1. word problem translation (requiring a translation of word problems into equations)

2. word problem solutions (providing the correct numerical answer to word problems)

3. following procedures (predicting output for a procedure stated in English)

4. following directions (predicting the consequences of following one or more actions)

5. logical reasoning (general problem solving questions)

6. visual ability (interpreting results of paper folding tasks)



Of these, logical reasoning and visual ability were significant predictive factors. It is interesting to note that general verbal ability and arithmetic computation ability were not found to significantly correlate with examination performance.

Another student attribute that is frequently analysed for correlation with success in Computer Science is that of Field Dependence/Independence. Witkin et al. (1977) define field independence as "the extent to which a person perceives part of a field as discrete from the surrounding field as a whole, rather than embedded in the field; or … the extent to which the person perceives analytically" (p. 7). Chou (2001) cites several studies that conclude Field-Independent (FI) people are more likely to do well with numbers, science, and problem-solving tasks than Field Dependent thinkers. They tend to approach a problem analytically and perceive a particular and relevant item in a field of distracting items6. To analyse the effect of Field Dependence/Independence on success in computing, Chou (2001) conducted a study involving 84 students enrolled in a series of three 2-hour web publishing courses. She found that Field Independent learners7 had a significantly higher pre and post Computing Self-efficacy rating than Field Dependent learners. That is to say, Field Independent students at least perceive themselves as better at computing!

6 On the other hand, Field-Dependent people tend to have greater skills at recalling such social information as conversations and relationships. They tend to approach a problem in a more global way and are generally better at seeing the whole picture in a given situation.

7 Field Dependence/Independence was measured using the Group Embedded Figures Test (GEFT) designed by Witkin et al. (1971). The Computing Self-efficacy rating was gauged using the Computer Self-efficacy scale (CSE) developed by Murphy et al. (1989).

Gibbs (2000) cites work by Bishop-Clark that found Field Independence was more strongly correlated with performance in the design stage of computer programming than with coding. However, when Gibbs employed a constructivist approach (involving the explicit description of the purpose, structure, model cases and arguments of all semantic and syntactic programming elements being studied by students) to instruction in his own work, he found no correlation between Field Dependence/Independence and achievement scores on design or coding tasks. Nor was there any correlation between field dependence/independence and the quality and quantity of students’ programming element descriptions. That is to say, under a constructivist approach to CSE a field dependent/independent cognitive style was not shown to be a significant predictor of success in the performance components of computer science learning that were measured, perhaps indicating that constructivism caters to a broader spectrum of learners. However, when considering the validity of these results it needs to be remembered that on average students in this study scored below 50% in both the design and coding tests that were used to measure performance, perhaps indicating that the constructivist approach employed was not as effective in promoting learning overall as would be desired.

Two tangential but informative results from Gibbs’ (2000) work were these:

1. A pretest in coding ability was a significant predictor of design score



2. The quantity of distinct syntactic elements that a student identified in their programming element description portfolio was a predictor variable for coding performance.

The first result provides evidence that the more developed a student’s concrete understanding of an underlying language the greater their capacity to design in that language. The second result indicates that student’s capacity to code is affected by the extent of their syntactic repertoire in the language being used. Analysis of preferred learning style is another popular approach to investigating attributes that lead to success in Computer Science. For instance, Thomas et al. (2002) investigated the correlation between Felder-SilvermanTP

8PT learning style and

performance of 107 first year programming students at the University of Wales. The Felder-Silverman Learning Style Model classifies students according to their preferred learning style:

• Active or Reflective learners
• Sensing or Intuitive learners
• Visual or Verbal learners
• Sequential or Global learners
• Inductive or Deductive learners.

Thomas et al. (2002) found that reflective learners (those who learn by working on their own and thinking things through) scored higher in the final course than active learners (those who learn by working with others and trying things out). Also, verbal learners (those who prefer written or spoken explanations) scored higher than visual learners (those who prefer pictures, diagrams or flow charts). They also propose a potential interaction effect between dimensions (for instance, that ‘reflective/intuitive/verbal/global’ learners may perform higher than the sum of the component effects). However, further research would be needed to substantiate this claim as cell sizes were too small to provide convincing evidence for the conjecture.

In a separate study on the effect of learning styles on the academic performance of 974 computer science students, Ross et al. (2001) found that students with an ordering preference for ‘sequential’ learning outperformed those with a ‘random’ learning tendency9. Interestingly, no significant difference in performance was detected between students with a tendency for concrete perception as opposed to abstract perception, which might be expected for a discipline such as computing. An implication of this research is that teachers need to employ deliberate tactics to support random learners in the learning of computer programming. Ross et al. (2001) suggest employing a variety of delivery methods, offering instruction on how to structure thought sequentially, and providing group-work opportunities that pair students across ordering learning styles.

8 The Felder-Silverman Learning Style Model is based upon a synthesis of the Myers-Briggs Type Indicator (MBTI, which classifies people according to Jung’s theory of psychological types), Kolb’s Learning Style Model (which classifies learners on both concrete-abstract and active-reflective dimensions) and the Hermann Brain Dominance Indicator (HBDI, which identifies left or right and cerebral or limbic thinking dominance).

9 The Sequential-Random and Concrete-Abstract dimensions of learning style were measured using the Gregorc Style Delineator instrument.

On yet another front of analysis, Perkins et al. (1989) distinguished between two main kinds of novice programmers:

1. ‘stoppers’, who are frustrated by mistakes/errors and when confronted with a problem are likely to simply stop

2. ‘movers’, who use feedback about errors effectively and keep trying, experimenting, and modifying their code.

Encouraging students to approach feedback from errors as a positive opportunity to grow and learn is a simple tactic to encourage ‘moving’ rather than ‘stopping’ behaviour. Perkins et al. (1989) suggest that affecting this and other underlying ‘patterns of learning’ can not only improve students’ programming skills but also their ability to learn in other domains. They propose that teaching students to ‘close track’ code, break down problems and engage in directed ‘tinkering’ are effective approaches to help students move through programming problems. However, they also caution that students should not become extreme movers or ‘tinkerers’, who tend to make changes more or less at random and are unable to trace or track their program.
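As an invented illustration of ‘close tracking’, a teacher might ask a student to trace a small Java loop by hand, recording the state of each variable at every step before running the code to confirm the prediction:

```java
public class CloseTrackingExample {
    public static void main(String[] args) {
        // Hand trace ('close tracking') of the loop below:
        // iteration | i | total
        //   start   | - |   0
        //     1     | 1 |   1
        //     2     | 2 |   3
        //     3     | 3 |   6
        int total = 0;
        for (int i = 1; i <= 3; i++) {
            total += i;
        }
        System.out.println(total);  // 6, matching the final row of the trace
    }
}
```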

Novices versus Experts

With the pedagogical theories and dimensions of the learner addressed, attention is now turned to the aim of computer science education – assisting the development of expertise. Much work has examined the difference between the “novice” and the “expert”, both in computing and other scientific domains. Identifying these differences allows the learner and educator to have a clear view of the types of skills and capacities that are required for expert thinking in the area. Robins et al. (2003, p. 165) point out that “it may be productive, in an introductory programming course, to explicitly focus on trying to create and foster effective novices.” They have performed an extensive review of the literature, which provides a substantial contribution to the discourse in the remainder of this section. The most commonly cited breakdown of Novice to Expert development is that by Dreyfus and Dreyfus (1986) which identifies the following phases:

1. Novice
2. Advanced beginner
3. Competence
4. Proficiency
5. Expert

with the characteristics of the novice programmer being almost diametrically opposite to those of the expert. Winslow (1996) identified the following deficits in novice programmers:

1. surface, fragile and superficially organised knowledge
2. a lack of detailed mental models
3. a failure to apply relevant knowledge or appropriate problem solving strategies
4. approaching programming ‘line-by-line’ rather than using meaningful program chunks or structures
5. a lack of planning and testing skills.


In order to develop expertise, novice programmers need to abstract their knowledge beyond the local, concrete representations they hold. Robins et al. (2003, p. 140) comment that expertise “must rest on a foundation of knowledge about computers, a programming language or languages, programming tools and resources, and ideally theory and formal methods.” That is to say, the development of expertise is unavoidably a long term pursuit, especially because it involves the transformation of mental representations as well as extensive field experience with applications.

Part of developing expertise is the ability to apply reasoning from one area of application to another. Kurland et al. (1989) propose that the very reason novices experience difficulty transferring expertise across domains is the way they classify similarities and differences in various applications and structures. It is because novices classify tasks according to more superficial and concrete characteristics, such as language and context, rather than underlying conceptual structures that the locus of their transfer is reduced.

At the other end of the spectrum, a summary of studies by von Mayrhauser and Vans (1994) notes that experts:

1. have efficiently organised and specialised knowledge schemas

2. tend to organise their knowledge according to functional characteristics (such as the nature of underlying algorithms) rather than superficial details (such as language syntax)

3. can draw upon general problem solving strategies (such as divide and conquer) as well as specialised strategies

4. use specialist schemas and top-down, breadth-first approaches in order to efficiently decompose and understand programs

5. are flexible in their approach to comprehending programs and in their willingness to abandon questionable hypotheses. (cited in Robins et al., 2003, p. 139)

There is considerable overlap between the characteristics of expert programmers and the characteristics of experts in other deductive fields. “Experts are good at recognising, using and adapting patterns or schemas (and thus obviating the need for much explicit work or computation). They are faster, more accurate, and able to draw on a wide range of examples, sources of knowledge, and effective strategies” (Robins et al., 2003, p. 140). That is to say, the challenge for the educator presented with a class of novices is to work towards developing these sophisticated knowledge representations and problem solving strategies within their students.

Robins et al. (2003, p. 141) note that introductory programming courses focus on programming knowledge far more than they focus on programming strategies. Davies (1993, cited in Robins et al., 2003) distinguishes between ‘programming knowledge’ (knowledge of a declarative nature, for example, being able to state how a ‘for’ loop works) and ‘programming strategies’ (the way knowledge is used and applied, for example, using a ‘for’ loop appropriately in a program). This distinction reminds teachers of the importance of integrating tasks requiring strategic and evaluative skills into the curriculum in order to develop students’ expertise.

However, it is equally important to remember that novices need to develop a firm foundation of programming knowledge before embarking on the more sophisticated levels of practice (such as application and synthesis) required for expert thinking. Kurland et al. (1989, p. 83) note that “the thinking skills we hope to develop and transfer out of programming depend upon students attaining certain proficiencies in programming”. Unless students master language constructs, operations, control flow and the like, they lack the foundations required to problem solve. Work by Putnam et al. (1989) involving 96 students learning to program in BASIC supported this claim. They argue that a lack of knowledge relating to elementary features of the programming language was a major inhibitor to the development of the higher order cognitive skills that much programming instruction intends to foster.

Care needs to be taken about what exactly comprises expert thinking and the order in which such capacities should be developed. For instance, Fay and Mayer (1994) provide evidence for a "syntax-independent access theory" of learning computing: that planning (‘schematic’ and ‘strategic’) knowledge in programming can be learned independently of the ‘syntactic’ and ‘semantic’ knowledge that defines the programming language10. Their study of 20 college students with no prior programming experience found that pre-training in general design principles such as modularity and reusability led to improved performance on programming assignments. Design in itself need not be considered an expert behaviour that is deferred to later stages of learning to program; it is the integration of design and application skills that is more properly the domain of expertise.
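To make Davies’ distinction concrete (a sketch, not an example drawn from the cited studies): programming knowledge is being able to state that a ‘for’ loop initialises a counter, tests a condition and updates the counter; programming strategy is recognising the situations in which to deploy that construct, as below.

```java
public class ForLoopStrategy {
    // Programming knowledge: a 'for' loop initialises a counter, tests a
    // condition before each iteration, and updates the counter afterwards.
    // Programming strategy: recognising that visiting every element of an
    // array is exactly the situation in which to deploy that construct.
    public static int total(int[] values) {
        int sum = 0;
        for (int i = 0; i < values.length; i++) {
            sum += values[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(total(new int[] {1, 2, 3, 4}));  // 10
    }
}
```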

Identifying Student Difficulties In order for students to progress from novice to expert as efficiently as possible it is important to have an understanding of the difficulties they experience. This allows educators to provide scaffolding that helps learners to surmount these difficulties and allows the students themselves to pre-empt impediments to their learning by being aware of their potential before they arise11. Du Boulay (1989) describes five overlapping and inextricably linked domains and potential sources of difficulty that must be mastered when learning computer programming:

1. general orientation (what programs are for and what can be done with them)
2. the notional machine (a model of the computer as it relates to executing programs)
3. notation (the syntax and semantics of a particular programming language)
4. structures (schemas and plans)
5. pragmatics (the skills of planning, developing, testing, debugging and so on).

10 This is a point of contention in Computer Science Education circles. For instance, in their article “Design Early Considered Harmful”, Buck and Stucki (2000) advocate a hierarchical progression of skill sets and gradual learning through example. They offer the apt analogy that it makes little sense to teach students how to plan an essay before they can spell words, write sentences, and form paragraphs. They refer to Bloom’s Taxonomy of cognitive learning (Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation) as a suitable order for the development of student skills. Further research into the sequencing of instruction for the various levels of expertise is required.
11 Robins et al. (2003) is an excellent source of discussion relating to student difficulties in computer science education, many relevant components of which have been described in this section.


Du Boulay (1989) notes that much of the early difficulty in learning computing arises from the student’s attempt to deal with all of these different kinds of difficulties all at once. ‘Misapplication of analogy’, ‘overgeneralisation’ and ‘interaction of parts’ errors result. In the early stages teachers can assist the learning process by trying to address these domains separately (as far as possible) so as to reduce interference between them. Planning is one area of programming that poses many potential difficulties for students. Spohrer and Soloway (1989) outline a taxonomy of nine kinds of plan composition problems:

1. Summarisation problem - only the primary function of a plan is considered; implications and secondary aspects may be ignored.
2. Optimisation problem - optimisation may be attempted inappropriately.
3. Previous-experience problem - prior experience may be applied inappropriately.
4. Specialisation problem - abstract plans may not be adapted to specific situations.
5. Natural-language problem - inappropriate analogies may be drawn from natural language.
6. Interpretation problem - ‘implicit specifications’ can be left out, or ‘filled in’ only when appropriate plans can be easily retrieved.
7. Boundary problem - when adapting a plan to specific situations boundary points may be set inappropriately.
8. Unexpected cases problem - uncommon, unlikely, and boundary cases may not be considered.
9. Cognitive load problem - minor but significant parts of plans may be omitted, or plan interactions overlooked.

One implication of this taxonomy that is supported across the literature (for example, Winslow, 1996 and Rist, 1995) is that basic program planning, rather than specific language features12, presents the main source of difficulty to students. The taxonomy also highlights the pervasive extent to which fragile programming-language-specific knowledge and strategies can lead to student difficulties in novice learning.

Bonar and Soloway (1989) constructed a taxonomy of eleven types of errors that students can make when learning to program, but found that by far the most prevalent were those caused when Step-by-Step natural language Knowledge (SSK) confounded Programming Knowledge (PK). Novices and experts have similar SSK – people are used to providing algorithmic descriptions of how to solve problems in their everyday lives. However novices possess only fragmented PK, which leads them to create speculative solutions (or ‘patches’) to overcome stumbling blocks (or ‘impasses’). In their case study research of 34 programming students, Bonar and Soloway (1989) concluded that confounding between SSK and PK accounted for between 47% and 67% of errors, depending on the programming context.

12 Spohrer and Soloway (1989) identify only three language feature or ‘construct based’ difficulties that students encounter: ‘natural language’ problems, ‘human interpreter’ problems and ‘inconsistency’ problems.


A useful representation for understanding student errors is the "HCI conception" model proposed by Rath and Brown (1995). Their model identifies that operational errors (syntactic & algorithmic) are often based upon three types of underlying [mis]conceptions:

1. Natural Language Reasoning - where a student's natural language understanding of a term or concept is incorrectly substituted for a programming language construct

2. Independent Computer Reasoning - where students assume that the computer will automatically take actions or that it will understand what to do

3. Alternative and incomplete reasoning - where an alternative model of reasoning is applied to the programming context (such as algebraic reasoning or mechanical process reasoning from a physical domain) or a programming language concept is incompletely formed.

Figure 4: HCI conception model of student errors (Rath and Brown, 1995) – a diagram relating the three underlying conceptions (Natural Language Reasoning, Independent Computer Reasoning, and Alternative and Incomplete Reasoning) to syntactic and algorithmic operational errors.
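A common example of the first two conception types (a hypothetical illustration, not taken from Rath and Brown’s data) is the novice expectation that a ‘while’ loop continuously monitors its condition, as the everyday meaning of “while” suggests, rather than testing it only when control returns to the top of the loop.

```python
# Natural language / independent computer reasoning misconception:
# "while total is less than 10, keep going" is often read as though the
# computer watches the condition and halts the instant it becomes false.

count = 0
total = 0
while total < 10:
    total += 7       # on the second pass total becomes 14 here...
    count += 1       # ...yet this line still runs, because the condition is
                     # only re-tested when control returns to the loop header
print(total, count)  # prints "14 2"; the misconception predicts "14 1"
```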

Teachers can quite easily identify and correct operational errors, but from an educational point of view it is far more important to address the underlying [mis]conception. While this model is useful for considering the nature of student errors, no experimental research has been conducted to verify its validity.

A study of student difficulties that was founded on substantial scientific research is that by Vicki Almstrum (1996). Almstrum conducted an extensive study involving the results of more than 25,000 Advanced Placement Computer Science (APCS) examinations spanning eight years to examine the extent to which mathematical logic causes more student difficulty than other aspects of computer science. The research involved constructing a thorough taxonomy of the concepts in the logic space being considered, content analysis classification of multiple choice questions by 38 experienced computer science educators, and statistical analysis of the ‘difficulty distribution’ that was formed. Her results indicated that students did find questions requiring mathematical-logical reasoning significantly more difficult than those relating to other aspects of computer science. However, these results need to be considered in light of the possibility that the logical questions themselves may have been set at a level of difficulty beyond that of the other, less logic-related questions, which could have affected the reported results.


The conclusion that can be drawn from Almstrum’s (1996) study is that Computer Science students often struggle with the level of logical reasoning required to become proficient programmers.

Instructional Design

“Instructional design” is the term used to describe the process of planning and constructing educational resources to promote learning. Once a learner profile has been obtained, factors relating to success in the subject domain understood, the requirements for progression from novice to expert appreciated, and potential impediments to achieving that progression identified, educators have a firm basis upon which to commence the instructional design process. However, the learning objectives for Computer Science are multifaceted and intricately interconnected, and so too are the approaches to achieving those objectives. This section identifies some of the main objectives of instructional design in Computer Science and explains approaches, based on educational theory, for achieving those objectives. Several dimensions of instructional design are considered, including how best to introduce new concepts, approaches to scaffolding, the merits of behaviour modelling, teaching problem solving, developing higher order thinking, promoting deep learning, the rationale for a relevant curriculum, and the importance of sequencing. Wherever possible, scientific research evidencing the effectiveness of the various approaches is provided.

Introducing new concepts

The clear and effective introduction of new concepts accelerates students’ development of expertise and short-circuits the need for remediation. Porter and Calder (2003) point out that introducing programming concepts is inherently problematic because higher level cognitive activities such as analysis and synthesis are required early in the learning process, often before students have had the chance to develop the required knowledge and comprehension for the task or construct. This leads to instructors often pondering the most effective techniques for introducing Computer Science concepts. Literature in the area can provide some illuminating perspectives. Different instructional frameworks have been proposed in the general educational sphere. For concepts and practices where students have little or no background expertise, Gagne’s (1985) Conditions of Learning Model is an appropriate framework upon which instruction can be based. This model identifies nine instructional events and corresponding cognitive processes:

(1) gaining attention (reception)
(2) informing learners of the objective (expectancy)
(3) stimulating recall of prior learning (retrieval)
(4) presenting the stimulus (selective perception)
(5) providing learning guidance (semantic encoding)
(6) eliciting performance (responding)
(7) providing feedback (reinforcement)
(8) assessing performance (retrieval)
(9) enhancing retention and transfer (generalization).


However, for conceptual development in domains that are more familiar to learners a less explicit instructional process may be appropriate. Carroll presents this argument in his Minimalist Theory when he identifies the potential of “minimising the amount of reading and other passive forms of training by allowing users to fill in the gaps themselves” (Kearlsey, 2003). This approach can exploit the prior knowledge of the learner to effect more cognitively efficient learning. In some contexts it may be appropriate to offer students the choice between Gagne’s model and a Minimalist model. Providing learners with flexible pathways ensures that materials cater to a broad spectrum of learners.

One common choice in computer science education is whether to introduce concepts through practical activities (discovery learning) or whether to employ the more traditional lecture based approach (direct instruction). Haberman and Kolikant (2001) conducted research involving 413 tenth grade high school students to gauge the effect of introducing new concepts through laboratory activities (with subsequent follow up discussions in class) as opposed to the traditional technique of introducing new concepts in class (using ‘blackboard’ approaches and then requiring students to complete practical laboratory activities later on). The trial of the ‘laboratory first’ approach was built on the belief that one of the main goals of computer science education is to facilitate the personal construction of valid mental models for new concepts.

The results of Haberman and Kolikant’s (2001) research indicated that students who were first introduced to concepts using their tailored laboratory activities performed significantly better on a task involving the prediction of output for small sequences of Pascal code than those who first learnt the concepts in the classroom. They concluded that the ‘black box’ laboratory-first approach that they had employed (involving the use of a simple program followed by an analysis of code) improved the development of valid schemata by students. This research further supports the use of constructivist approaches in Computer Science education.
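Haberman and Kolikant’s materials used short Pascal fragments; the item below is a Python analogue of the kind of ‘predict the output, then run the program and compare’ activity involved (an illustrative sketch, not their actual instrument).

```python
# Students first write down what they expect this fragment to print,
# then run it and reconcile any differences with their mental model.
total = 0
for i in range(1, 4):
    total = total + i
    print(i, total)

# Expected output:
# 1 1
# 2 3
# 3 6
```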

Scaffolding

Scaffolding in instructional design is the support offered to learners to progress them through their Zone of Proximal Development. Scaffolding can take many forms, the quality of which determines the standard of educational provision. Research provides recommendations for supporting the learning of computer programming through scaffolding, but also offers a warning against assuming that any form of scaffolding will automatically support learning. There are many examples of scaffolding in the literature. This sub-section presents two examples of effective scaffolding that are applicable across many areas of learning computing, followed by an example of how scaffolding may not necessarily be effective.

One strategy for assisting students in program interpretation is providing diagrammatic navigational support for code. In an experiment involving 44 beginning computer science students, Hendrix et al. (2000) found that using Graphical Representation of Algorithms, Structures and Processes (GRASP) software to augment code with a Control Structure Diagram allowed students to respond more rapidly and correctly to test items relating to the fundamental control flow of the program.


Figure 5: An example of a Control Structure Diagram (Hendrix et al., 2000)

This is a simple scaffolding technique to implement using the GRASP software, and it can be faded as students become more familiar with reading and working with code. Another technique that supports beginning learners is the use of program skeletons. Applin (2001) conducted an experiment involving 42 first year students enrolled in an introductory programming course that investigated the effect of setting assignments that required students to write specialised modules within well-developed, prewritten programs rather than writing programs from scratch. The rationale for such a methodology is drawn from English and foreign language classrooms, which posit that ‘the more students read the better they will write’. The analogy for Computer Science Education is that being exposed to well-written programs should lead students to write higher quality programs by the end of the course. Applin’s (2001) study confirmed that providing this scaffolding significantly improved student grades in the final examination, even after accounting for potential student differences13. Applin (2001) also provides anecdotal evidence of improved quality of program documentation, modularisation and parameter passing by students from the treatment group in their subsequent courses as a result of using the program skeleton technique.

13 Student differences were measured using the Watson-Glaser Critical Thinking Appraisal (WGCTA) to measure inference ability, recognition of unstated assumptions, deduction ability, interpretation skills and argument evaluation skills.
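Applin does not publish the skeletons themselves, but the fragment below sketches the general technique under illustrative assumptions (the file format, function names and grading rule are invented for the example): most of a small, well-documented program is supplied, and students implement only a clearly delimited module.

```python
# A hypothetical program skeleton: the surrounding program is provided
# complete and well documented; students implement only the marked function.

def load_scores(filename):
    """Provided: read one integer score per line from a text file."""
    with open(filename) as f:
        return [int(line) for line in f if line.strip()]

def grade(score):
    """STUDENT MODULE: return 'pass' for scores of 50 or more, else 'fail'."""
    return "pass" if score >= 50 else "fail"   # a model solution

def report(scores):
    """Provided: print a formatted grade report."""
    for score in scores:
        print(f"{score:3d} -> {grade(score)}")

if __name__ == "__main__":
    # In the full assignment this would be report(load_scores("scores.txt"));
    # literal data is used here so the sketch runs on its own.
    report([38, 50, 74])
```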


However it is erroneous for educators to assume all scaffolding is necessarily beneficial to students. In an experiment involving 106 introductory programming students at the University of Wales, Thomas et al. (2004) found that providing students with object diagrams during a multiple choice test did not provide any significant improvement in performance. This was contrary to their expectations.

Figure 6: Partially completed Object Diagram and Question (Thomas et al., 2004, p. 251)

While beginners who were offered object diagram support did have higher average performance, and those who used object diagrams in follow-up tests did have a higher average score as a group compared to those who did not, the results were not significant at the 5% level. Thomas et al. (2004) also found that, in a follow-up assessment that did not contain object diagram support, students who entered the course with no formal programming background made less use of object diagrams, which may indicate a lack of perceived merit. In fact, students who did have a background in programming and who constructed their own object diagrams in this follow-up assessment actually recorded a lower average score than those who did not (though not significantly so). One possible reason that performance did not improve with object diagram support, and that students who received this support were no more likely to adopt the technique of their own accord, could be that to perform tasks involving program traces students must internally possess a mental model of the concepts being addressed. While presenting students with diagrams may be beneficial during the concept formation process, once students have constructed an accurate mental model there is no need for the diagrams, and for those students who do not possess a correct mental model, constructing one during a test using object diagram support may be a difficult undertaking. The moral of the story for educators is that they should not assume that all scaffolding is beneficial – cognitive support needs to be carefully designed and tested. Offering students the choice of scaffolding rather than forcing its use allows instruction to better cater to individual needs.

Behaviour Modelling

Observing experts in action is espoused as one of the most effective techniques for developing programming expertise in novice learners (Collins et al., 1991). “Modelling” offers Computer Science educators the capacity to impart attitudes, thought processes, problem solving techniques and a whole range of other underlying skills that are not made explicit, or at least not embedded in their context, when other methods of teaching are employed.


For instance, “The Pragmatic Programmer” by Andrew Hunt and David Thomas (2000) identifies that programmers who are skilled at their business need to be fast adapters, inquisitive, critical thinkers, realistic and in many respects jacks-of-all-trades. These are not skills that are easily taught. However, expert modelling offers students a “Cognitive Apprenticeship” (Collins et al., 1991) through which these behaviours can be portrayed and subsequently adopted by learners.

That is not to say that the behaviour modelling process need be devoid of explanation. For instance, Landa's (1976) Algo-heuristic theory can be deployed specifically to support expert modelling. Algo-heuristic theory deconstructs the conscious and especially the unconscious mental processes that underlie expert learning, thinking and performance in any domain. The theory presents a system of techniques for getting inside the minds of expert learners and performers to uncover the underlying processes involved. These are then decomposed into elementary components – mental operations and knowledge units – which can in turn be used to teach algorithmic and/or heuristic based tasks. Thus, by combining instructor modelling with Algo-heuristic support, students are exposed to both implicit and explicit expertise development mechanisms.

There is research evidence to support the use of expert modelling. In her study of learning styles and delivery methods in a web publishing course, Chou (2001) found that students exposed to a behaviour modelling instructional approach achieved significantly higher scores on assessment tasks and had significantly higher increases in Computing Self-efficacy ratings than students who were exposed to a direct instruction approach. That is to say, Chou (2001) found expert modelling was a more efficient approach to teaching than the traditional lecture style14.

Approaches to Teaching Problem Solving

Computer Science is a domain requiring substantial logical reasoning and problem solving. Research has shown that teaching problem solving skills can improve programming performance (for instance, Allan and Kolesar, 1996). The question then becomes how best to develop students’ problem solving ability, especially as applied to the computer science domain. Gagne (1985) identified a hierarchy of five discrete psychological capabilities upon which problem solving is based:

1. discrimination learning (distinguishing between various objects)
2. concept learning (objects matched to concept classes)
3. defined concepts (learning classification rules or definitions)
4. rule learning (responding to classes or relationships), and
5. higher order rule learning (combining rules in order to solve a problem).

Since these skills are hierarchically based, students must master lower level prerequisite skills before they can advance to the next higher level.

14 For interested readers Chou provides a comprehensive yet concise literature review of self-efficacy, training methods and cognitive style.


Grant (2003) draws a comparison between Gagne’s hierarchy and the process of solving a computer programming problem: problem identification, analysis of the problem statement, design of a solution, and coding. Just like Gagne’s (1985) staged model, solving computer programming problems is a hierarchical undertaking where success at earlier stages is crucial to achieving a valid solution.

Literature points to several strategies for developing students’ capacities at the various stages of the problem solving hierarchy. For instance, research by David Ginat (2001) provides case study demonstrations of how the tactic of ‘self-explanation’ can be used to develop metacognitive awareness and thus improve students’ control skills in the problem solving process. Ginat refers to Schonfield’s (1985) problem solving model in the mathematical domain, which specifies three fundamental aspects of problem solving:

1. Resources (such as axioms, formulae, and standard procedures)
2. Heuristics (strategies such as inductive search, analogy and backward reasoning)
3. Control (monitoring decisions taken by the problem solver during the problem solving process).

Ginat argues that requiring students to engage in ‘self-explanation’ as a compulsory activity during the problem solving process forces them to reflect (metacognitively) on their control decisions. This in turn allows students to critically evaluate the success of their own and other students’ problem solving tactics, thus improving their overall problem solving ability on future tasks15. In a subsequent article Ginat (2002) identifies an array of approaches that may be applied to problem solving in Computer Science:

1. ‘Top-down’ – where the problem is partitioned into sub-problems whose combined solution yields the necessary result. This is the most common approach employed in Computer Science.

2. ‘Range Decomposition’ – where the overall task is divided into subtasks of the same function, applied to different subsets of the data.

3. ‘Element Decomposition’ – where the elements being processed are deconstructed to allow more efficient processing.

4. ‘Inductive Decomposition’ – where the problem space can be reduced without affecting the final solution.

For instance, consider the problem of determining whether a particular number makes up the majority of a large set of numbers. An example for a small set of numbers would be that in the set {5, 3, 6, 5, 5, 9, 12, 5, 5} the number 5 makes up the majority. The problem is to design the best algorithm for determining this number, if it exists.

15 While Ginat espouses the utility of this approach in developing the skills of 23 talented students preparing for the Computer Science Olympiad, no formal experiment was conducted, and no conclusions were drawn about the effectiveness of such a methodology for improving the problem solving skills of average or below average students.


A ‘Top-down’ solution may eventuate in algorithms that count the frequency of each number (using a data structure to store values), calculate the maximum of those frequencies, count the number of elements in the list, and determine whether the maximum frequency is greater than half the number of elements. This is an inefficient solution. A ‘Range Decomposition’ approach would find the majority element of subsets of the data and then apply rules based on these results to determine the overall majority element, if it exists. An ‘Element Decomposition’ approach might be to convert the integers into binary numbers and determine the majority element for each binary digit. A second linear pass of the original integer set would then determine whether the number composed of the majority elements of the binary digits was in fact a majority element. An ‘Inductive Decomposition’ approach would be to note that removing two different elements from the set does not affect the presence of a majority element, so pairs of different numbers can be discarded until there is only one (repeated) number left, or there is a set of three different numbers.

The important point that Ginat’s (2002) work raises is that by identifying different problem solving techniques and making them explicit to students in the teaching process, learners have a greater repertoire of tactics to deploy when confronted with programming problems on their own.
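The ‘Inductive Decomposition’ tactic can be expressed as a short program (a minimal sketch; the function name and the final verification pass are illustrative assumptions rather than code given by Ginat): repeatedly discarding two different elements leaves a candidate that a single confirming pass can check.

```python
def majority_element(numbers):
    """Return the majority element of `numbers`, or None if there is none."""
    # Pass 1 (inductive decomposition): discarding two *different* elements
    # never changes the majority element, so keep a candidate and a count.
    candidate, count = None, 0
    for n in numbers:
        if count == 0:
            candidate, count = n, 1
        elif n == candidate:
            count += 1
        else:
            count -= 1   # implicitly discard one `n` and one `candidate`
    # Pass 2: the surviving candidate must still be verified against the set.
    if candidate is not None and numbers.count(candidate) > len(numbers) // 2:
        return candidate
    return None

print(majority_element([5, 3, 6, 5, 5, 9, 12, 5, 5]))   # 5
print(majority_element([1, 2, 3]))                      # None
```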

Muller, Haberman and Averbuch (2004) cite work by Rist which suggests that “programming expertise is partly represented by a knowledge base of pattern-like chunks, which consist of problems, their solutions and associated information. Given a problem, the expert programmer can retrieve an appropriate solution schema from his memory” (p. 103). As such they introduce a set of guidelines for teaching problem solving patterns in Computer Science. These may be applied to finding the solution to a programming problem, or used as a more abstract, general approach to solving problems. The guidelines involve the following components:

1. Providing a representative example of a type of problem
2. Providing a definition and description
3. Specifying a pattern name
4. Identifying similar patterns and similar problems
5. Comparing to other types of solutions
6. Identifying typical uses of the pattern
7. Highlighting common mistakes and difficulties
8. Pattern composing (for problems whose solutions may be composed of several problems)
9. Practicing the modification of pattern related solutions to solve alternative problems.

Muller, Haberman and Averbuch (2004) point out that the use of every stage is not required in all teaching situations that relate to problem solving; taken as a whole, however, the heuristic provides a thorough checklist for teachers.

Bloom’s Taxonomy of the Cognitive Domain (Bloom, 1971) is a well known educational hierarchy specifying levels of cognitive development. Porter and Calder (2004) used Bloom’s Taxonomy to decompose the task of learning to program:


Bloom’s categories      Learning to Program
Knowledge               Tools, constructs, syntax
Comprehension           Relating concepts
Application             Flow, semantics
Analysis                Understanding the problem
Synthesis               Create the solution
Evaluation              Assess other options

Table 2: Bloom’s Taxonomy applied to the computing domain (table taken from Porter and Calder, 2004)

In order to assist students through the various levels of problem solving, Porter and Calder (2004) have suggested a pattern based support mechanism.

Figure 7: Building a Pattern Sequence (Porter and Calder, 2003)


Their approach is intended to develop students’ capacity to choose problem solving patterns based on the pattern ‘language’ defined up to any given point in a course, and then to successfully sequence those patterns together. However, initial testing of the technique has been discouraging. In a pilot of an experiment to test the efficacy of their approach, Porter and Calder (2004) found that many students did not refer to the problem solving patterns when attempting to solve the prescribed tasks. This may be because students feel they have already developed beyond Porter and Calder’s pattern based approach to problem solving, or that the technique is overly simplistic and prescriptive for the complex task of programming. The lesson for educators may be that applying techniques that deconstruct the learning process beyond a certain point may not be effective in a fuzzy domain such as problem solving.

Rist (1989) identifies that only once a relevant problem solving schema has been developed can students perform forward problem solving, working from problem statement to goal solution. In the absence of such a forward moving plan, students must create a (simplified) plan in a backwards fashion, starting from the goal solution. Only once a programmer has developed the relevant problem solving schema for a situation can they retrieve the plan to efficiently solve other problems. This is an important point for Computer Science educators to recognise – when learners are solving unfamiliar problems they cannot be expected to adopt a forward (or ‘top-down’) approach to achieving a solution. Rather, students need time and support to form problem solving plans so that they will be able to use a retrieval based approach in subsequent similar problems.

Higher Order/Critical Thinking

Once a solid conceptual understanding of a learning domain has been formed it is important to challenge students to develop their higher order thinking skills. Critical thinking (often referred to as higher order thinking) has been defined as “disciplined, self-directed thinking which exemplifies the perfections of thinking appropriate to a particular mode or domain of thinking” (Paul, 1990, cited in Grant, 2003). The development of critical thinking skills is not only an objective of many Computer Science courses but also a necessary component of expert thinking in the field. Spiro, Feltovitch and Coulson’s (1988) Cognitive Flexibility Theory focuses on the nature of learning in complex and ill-structured domains. Spiro and Jehng (1990, p. 165, cited in Kearlsey, 2003) state:

"By cognitive flexibility, we mean the ability to spontaneously restructure one's knowledge, in many ways, in adaptive response to radically changing situational demands...This is a function of both the way knowledge is represented (e.g., along multiple rather single conceptual dimensions) and the processes that operate on those mental representations (e.g., processes of schema assembly rather than intact schema retrieval)."

The basic principles of CFT are:


1. Instruction should reflect the complexity that faces industry practitioners rather than treating domain problems as simple linear decision making processes. This emphasises inter-connectedness and avoids oversimplification.

2. Instruction should be based on multiple cases to allow learners to abstract between examples.

3. Cases should be context-dependent to allow learners to acquire a deeper appreciation of the subtleties affecting decision making and more easily apply expertise in practice.

The relevance for computer programmers (and thus computing educators) is self-evident. The theory emphasises the transfer of knowledge and skills beyond their initial learning situation through the presentation of information from multiple perspectives and the use of a diverse range of case studies. At this level oversimplification is to be avoided. In this way students can appreciate that learning is context-dependent, so that much of the instruction they have received needs to be either treated as a generalisation, or understood as a specific example that may not apply in other circumstances.

In order to develop cognitive flexibility, Spiro, Feltovitch and Coulson stress the importance of constructed knowledge. Learners need to be encouraged to reformulate subject matter into their own representations. This allows students to develop an appreciation of the interconnectedness of complex domains as opposed to trying to compartmentalise the subject matter.

Mendes (2003) has experimented with applying Cognitive Flexibility Theory to teaching Web Engineering. In his research he applied Cognitive Flexibility Theory to teach the Hypermedia and Multimedia half of a Web Engineering course over the years 2000 to 2003. He reports a high level of achievement as a result of applying these techniques to instruction. However, his research did not experimentally compare the cohort to a control group, which poses a potential area for further investigation.

As well as setting tasks requiring analysis and production based on different perspectives, Teles (1994) recommends the use of ‘exploratory’ approaches to develop reflective, analytical and creative thinking. These enquiry based tasks are not only effective in developing higher order thinking skills; their analogy, organisation, elaboration, and reformulation components can also serve to improve concept comprehension and assimilation as well as memorisation.

Promoting Deep Learning

The following quote by Glasser is well publicised for a reason:

"... we learn: 10% of what we read 20% of what we hear 30% of what we see 50% of what we both see and hear 70% of what we discuss 80% of what we experience 95% of what we teach someone else." (Glasser, 1990)


It arguably encapsulates the fundamental principle of education - the more actively involved a student, the deeper the learning experience. A “problem-centred” rather than “answer-centred” approach allows learners to develop a “relational” understanding (knowing both what to do and why) rather than merely an “instrumental” understanding (rules without reasons) (Skemp, 1976). Robins et al. (2003, p. 156) describe recent shifts in educational practices which are tending towards “a focus not on the instructor teaching, but on the student learning, and effective communication between teacher and student. The goal is to foster ‘deep’ learning of principles and skills, and to create independent, reflective, life-long learners. The methods involve clearly stated course goals and objectives, stimulating the students’ interest and involvement with the course, actively engaging students with the course material, and appropriate assessment and feedback.”

‘Andragogy’ is a general educational approach aimed at promoting deep learning. First proposed by Malcolm Knowles (1984), Andragogy is especially designed to cater for the needs of adult learners. The approach is formed on the basis that, due to their experience and context, adults are more independent, self-directed learners than children and as such require a modified learning experience. Knowles’ theory of Andragogy is based on the following five tenets:

1. Effective adult learning must be relevancy oriented.
2. Adults are self directed learners and need to be free to control their learning experience.
3. Experience is the main foundation for learning activities for adults.
4. Adults are practically oriented learners.
5. Adult learners are goal oriented.

Knowles’ (1984) Andragogic approach to teaching proposes two specific strategies for promoting deep learning:

1. designing instructional approaches that are task-oriented as opposed to involving direct memorisation, and

2. establishing a problem-centred rather than content-oriented focus to courseware.

There is scarce scientific evidence to verify or discredit the effectiveness of applying Andragogy to Computer Science learning. Ellis (2002) does provide anecdotal evidence that incorporating Andragogic approaches to instruction (facilitating rather than instructing, adjusting content to provide more relevant examples, and project based learning activities) improved student performance and satisfaction in a Web Technologies unit. Ellis also cites work by Dick et al. and Goodnight et al. that draws similar conclusions. However, formal research into the area would be a valuable contribution to the Computer Science Education field.

One crucial ingredient in promoting deep learning in Knowles’ Andragogy (and other approaches) is relevancy. Shuell (1986), in his Review of Educational Research article “Cognitive conceptions of learning”, states:


“If students are to learn desired outcomes in a reasonably effective manner then the teacher’s fundamental task is to get students to engage in learning activities that are likely to result in their achieving those outcomes” (p. 429).

It is important to remember that carefully designed and applicable tasks need to be formed if students are to engage to a level required for deep learning to occur. If problems are framed in a context that students are likely to confront then they will be more motivated to learn the material, will be more likely to spend time reviewing and acquiring concepts from the learning domain and be prepared to drill further down into the content area. Brown, Collins and Duguid (1989) substantiated this through their “Situated Cognition” approach.

J.M. Carroll’s (1998) Minimalist approach to instruction also advocates relevancy in course design. Minimalism is founded on the principle that optimal learning occurs when all learning tasks are meaningful activities. What is more, this framework for the design of learning activities has been verified with particular success in the area of training computer users.

Project based tasks are an obvious way for Computer Science educators to embed learning in meaningful activities. Carroll’s (1998) Minimalism concurs, suggesting the deployment of realistic projects as quickly as possible in the instructional cycle. Carroll points out that this form of learning increases opportunities for self-directed reasoning, divergent thought and improvisation by increasing the number of active learning instances. It should be noted that Knowles’ (1984) Theory of Andragogy and Carroll’s (1998) Minimalism also identify mistakes as providing the basis for rich learning activities. They both recommend the exploitation of errors by including error recognition and recovery activities in instructional design. The approach of providing non-examples can offer as much to promoting deep learning as providing examples.
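As a hypothetical illustration of such an error-recognition activity (the exercise and the planted defect are invented for this sketch, not drawn from Knowles or Carroll), students might be given a short function with a deliberately introduced fault and asked to locate, explain and repair it.

```python
# A possible error-recognition exercise. Students are told the function was
# "submitted" with a planted defect, and must find and repair it.

def count_vowels(text):
    """Count the vowels in a string (corrected version)."""
    total = 0
    for ch in text.lower():
        if ch in "aeiou":
            total += 1
    return total

# The defective version handed to students used `for ch in text.lower()[:-1]:`,
# which silently drops the last character, so count_vowels("idea") returned 2.

print(count_vowels("idea"))   # 3 once the off-by-one defect is repaired
```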

Scope, Sequencing and Spacing

The final area of instructional design that will be briefly addressed in this section is course structure. Research by Bower and Richards (2004) found that the majority of difficulties reported by students in an Introduction to Object Oriented Programming unit related to how the course was structured rather than the actual material covered or the lecturer’s capabilities. The way in which course content is sequenced and the timing of its delivery undoubtedly has a major impact on the quality of learning that takes place.

In education the ‘scope’ of a learning episode refers to the amount of novel conceptual material introduced. An important part of the instructional design process is to determine a scope for each learning episode that does not surpass the information processing capabilities of the target audience but is also expansive enough to maintain student interest. Once the curriculum has been deconstructed into episodes of appropriate scope, attention can be turned to the most appropriate form of sequencing. Sequencing refers to decisions about the order in which to deliver material.


Reigeluth (1980) refers to a topical approach to sequencing in his Simplifying Conditions Method of Instructional Design. Topical sequencing allows each learning episode to be compared and contrasted with previous units as new concepts are introduced. Because topical sequencing completes each learning episode before the next key idea is introduced, students can reach a complete understanding of each autonomous unit more immediately. Then, as new related concepts are addressed, the learner has a meaningful context within which subsequent ideas and skills can be assimilated. This is particularly useful in areas of computer science that require a constructivist approach to learning.

As opposed to topical sequencing, spiralling provides an alternative approach to ordering content delivery (Reigeluth, 1980). This can be implemented by addressing common elements from related examples simultaneously, rather than always proceeding from one example to the next. Spiralling is a powerful approach to facilitating the abstraction of computing concepts – students can compare and contrast elements of program features immediately. Generally speaking, a topical approach to sequencing requires less learner expertise in the subject area than a spiralling approach. Also, a topical approach to sequencing is more conducive to areas requiring logical-deductive reasoning, whereas spiralling is often applied to softer skill domains to allow comparison between case studies. Whether to choose a topical or a spiralling approach to sequencing therefore depends on the objectives of the instructor, the material being covered and the ability of the target audience.

Lastly, ‘spacing’ provides another structuring technique for improving the quality of learning. Spacing, the technique of spreading out instruction and activities in smaller chunks over a longer time frame, is far less utilised than topical or spiralling sequencing. Dempster (1998, p. 627) describes spacing as “one of the most dependable and replicable phenomena in experimental psychology … with considerable potential for improving classroom learning, yet there is no evidence of widespread application”. By simply spacing learning out over the duration of a course, concepts are more easily maintained, students retain the prerequisite knowledge they require for the formation of new concepts, and less time is lost to relearning.

Theory in Practice

With such a plethora of considerations to address in Computer Science education, the question becomes “which learning activities best achieve these multifaceted objectives, and how can techniques be synthesised to optimise learning?” This section provides some examples of recommended learning activities and approaches from the literature, as well as evidence of efficacy where available.

The University of Sydney Basser Department of Computer Science has experimented with a Problem Based Learning approach to teaching programming (Kay et al., 2000). In educational circles Problem Based Learning contains the following components:

1. open ended, authentic, substantial problems which drive learning
2. explicit teaching and assessment of generic and metacognitive skills, and
3. collaborative learning in groups.


In this approach students are presented with an authentic problem (for instance, implementing a supermarket checkout queue simulator) that is used both as a driving force to develop metacognitive skills (for instance, reflection upon the steps taken to solve the problem and the delegation of time) and as the subject of groupwork (externalising knowledge, developing collaborative skills). With their carefully planned implementation, Kay et al. (2000) found that over the two year period since the introduction of Problem Based Learning in their Introduction to Computer Science unit the mean examination mark improved from 63% to 91%. They also note some positive effects on student satisfaction with the unit, as measured through an open ended questionnaire they deployed.

Van Gorp and Grissom (2001) suggest the following activities to promote constructivist learning:

1. Code walkthroughs
2. Group code writing tasks
3. Group code debugging tasks
4. Lecture note reconstruction tasks (where students listen to a mini-lecture, attempt to construct a summary by memory and then meet in groups to improve their summaries)

They conducted a (loose) experiment that indicated more frequent exposure to their constructivist activities resulted in an improvement in final exam scores. Their approach places emphasis on peer interchange and analysis of programs. Research by Zeller (2000) suggests that making students read and review each other’s code improves the clarity of their programs, supporting this approach to learning.

Tesser et al. (2000) propose the use of real time data across a variety of sciences as a way to improve motivation and interest. For instance, using a pH probe to collect data on acidity levels during chemical reactions was used to introduce students to simple text-based I/O, indefinite loops and termination conditions in a meaningful and engaging context. They do not, however, provide any data to verify a significant improvement in learning outcomes as a result of their approach.
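Tesser et al. do not provide code; the sketch below assumes a hypothetical `read_ph()` helper standing in for a real probe driver, and shows how such data naturally motivates text-based output, an indefinite loop and an explicit termination condition.

```python
import random

def read_ph():
    """Hypothetical stand-in for a probe driver: return the current pH reading."""
    return round(random.uniform(4.0, 9.0), 2)

# Indefinite loop: the number of readings needed cannot be known in advance,
# so sampling continues until the termination condition is met.
readings = []
while True:
    ph = read_ph()
    readings.append(ph)
    print(f"reading {len(readings)}: pH = {ph}")
    if ph >= 7.0:    # termination condition: solution has been neutralised
        print("neutralised after", len(readings), "readings")
        break
```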


Pre-quizzes serve multiple functions in education. They can act as forms of “Advance Organisers” (Ausbubel, 1960), which are general statements given before instruction to “orient students to material they were about to learn and to help them recall related information that could be used to assist in incorporating the new information”. They can serve to refresh students’ memory of prerequisite knowledge, which often leads to improved acquisition of new concepts during the instructional phase. They can also be used to facilitate distributed practice (whereby items to be learned are repeated at intervals over a period of time), leading to longer term and more confident retention of concepts (Slavin, 1994, p. 203). Even if students have already mastered the skills being tested the time is not wasted; overlearning can lead to automaticity, a valuable attribute for any programmer. Finally, aggregate data collected from pre-quizzes can help instructional designers understand whether or not the difficulty level of their materials is appropriately pitched.

The literature provides examples of intelligent synthesis of teaching strategies into unified pedagogic approaches. For instance, Howard, Carver and Lane (1996) integrate Felder’s Learning Styles16, Bloom’s Taxonomy, the Kolb Learning Cycle, and Cooperative Learning into a Data Structures course by applying Bloom’s Taxonomy to the setting of lesson objectives, using the Kolb Learning Cycle and Felder’s Learning Styles to determine the instructional approach, and Cooperative Learning principles to drive group-work assignments. This method of applying different models to address different aspects of the educational process is a clever technique for harnessing the utility of several pedagogical models.

In a separate instance, Smith (1996) presents a “Process Education” approach to teaching Computer Science which has a primary focus upon students “learning to learn”. Using Smith’s (1996) technique, classes are conducted in closed computer laboratory sessions, team projects are utilised, journals are kept, reports are presented and presentations are made. Discovery learning is utilised whereby faculty pose critical thinking questions and students are then encouraged to discover the answers for themselves, engendering a deeper understanding of concepts and helping them become life-long learners. Students choose the weight they wish to assign to different aspects of the course (within ranges). The role of the teacher changes from one of lecturer to, at different times, one of leader, assessor, facilitator and evaluator. While no formal scientific comparison to standard approaches is made, Smith (1996) points out that “Employers seek individuals who excel as: i) quick learners, ii) critical thinkers, iii) problem solvers, iv) communicators, v) professionals knowledgeable in their field, vi) team players, vii) self starters and viii) creative thinkers” (p. 165). His approach addresses all of these qualities. A significant shift in teacher mindset and great care with the logistics of implementation would be required for a unit to be implemented using a Process Education approach, but the outcomes, if successful, appear fruitful.

Conclusion

Literature provides many dimensions by which teaching and learning Computer Science can be analysed. General educational theories shed light on how learning occurs, while domain specific theories explain how thinking processes and models apply to computing. Understanding the skills and behaviours that characterise expert programmers (as opposed to novices) provides a clear focus for the development of students. Research into the attributes that lead to success in computing and the reasons for student difficulties assists educators and students to improve the effectiveness of teaching and learning. And analysis of instructional design objectives, strategies and activities facilitates the deployment of informed teaching approaches.

16 Felder’s Learning Styles classifies learners and teaching approaches on four dimensions: Active-Reflective, Sensing-Intuitive, Visual-Verbal, and Sequential-Global. Howard, Carver and Lane (1996) cite work by Felder (1993) demonstrating that designing instructional approaches that cater to both ends of all four dimensions improves teaching effectiveness.


There are many implications for teaching and learning that can be drawn from this review. Rather than summarise them here, the process of drawing conclusions will be left to the reader (a task in higher order thinking!). Instead, this section will make three overarching recommendations: firstly, that teachers benefit from a research approach to Computer Science Education; secondly, that designing over-systematised approaches based on research findings alone can be sub-optimal; and finally, to “embrace the delight of learning”.

Aharoni (2000) points out that while there is a great deal of cognitive process research in the mathematical domain, there is still very little research into learner cognition within the Computer Science domain. Because Computer Science education is comparatively in its infancy, a great deal of research in the cognitive processes area is still required. It is important as educators that we take a scientific approach to our discipline. Lister (2003) cites Bruner as coining the term ‘folk pedagogy’ for educators espousing intuitive theories about how learning occurs without any scientific basis. Lister points out that a research approach to education allows us to “understand better why other people – our students and our colleagues – do not think the same way we do” (p. 15). Buck and Stucki (2000) similarly advise of the danger of teaching according to one’s own understanding, without regard to established models of cognitive development. They note that many teachers have “an inclination to recapitulate the systems development process because that is the order in which we have learned to apply our craft” (p. 76). Founding instructional design on theoretical underpinnings allows educators to approach teaching from a more informed point of view.

However, the incorporation of research findings should be motivated by the desire to improve the learning experience rather than by letting research itself be the driver. Without reflective analysis it is easy to over-systematise the design process and miss opportunities to integrate magical and entertaining learning experiences that transcend those captured by instructional design processes. Brent Wilson from the University of Colorado encapsulates this in his paper “The Postmodern Paradigm” (1998), where he recommends not throwing away the taxonomies entirely, but rather admitting the tentativeness of any conceptual scheme applied to content, and thus offering holistic, information-rich experiences that are flexible enough to integrate opportunities for mastery of un-analysed content. This is sensible advice.

Several theories regarding the teaching and learning of Computer Science have been presented in this paper. With such a myriad of data and recommendations from such an experienced body of Education professionals it is easy for lecturers to fall into the role of “information assimilator”. But let me challenge you all to be “accommodators”! All of the theories presented are merely tools and in no way replace the magic of inspiring students, of positive energy, and of exciting activities.


Rasala (2000) points out that the delight of construction, the satisfaction of expanding personal thinking abilities, and the possibility of creating beautiful graphics and powerful applications are the dimensions that excite Computer Science students and attract them to the field in the first place. While educational theories are useful for teachers to understand, it is important not to forget that, when distilled to its base elements, the primary goal of Computer Science Education is to participate in the joy of learning. With this as a driving force behind our motivation to educate, success is inevitable.

References

Aharoni, D. (2000) Cogito, Ergo sum! cognitive processes of students dealing with data structures. Proceedings of the thirty-first SIGCSE technical symposium on Computer science education, p. 26-30.
Allan, V. H. and Kolesar, M. V. (1996) Teaching Computer Science: A Problem Solving Approach that Works. In Call of the North, NECC '96. Proceedings of the Annual National Educational Computing Conference (17th, Minneapolis, Minnesota, June 11-13, 1996), p. 2-9.
Almstrum, V. L. (1996) Investigating Student Difficulties with Mathematical Logic. In Teaching and Learning Formal Methods, (Eds, Dean, C. N. and Hinchey, M. G.) London: Academic Press, p. 131-160.
Applin, A. G. (2001) Second language acquisition and CS1. In Proceedings of the thirty-second SIGCSE technical symposium on Computer Science Education, p. 174-178.
Ausbubel, D. P. (1960) The Use of Advanced Organisers in the Learning and Retention of Meaningful Verbal Material. Journal of Educational Psychology, 51, p. 267-272.
Bonar, J. and Soloway, E. (1989) Preprogramming Knowledge: A Major Source of Misconceptions in Novice Programmers. In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 325-353.
Bower, M. and Richards, D. (2004) Computing Learning Preferences Survey Results, Macquarie University.
Brown, J. S., Collins, A. and Duguid, P. (1989) Situated Cognition and the Culture of Learning. Educational Researcher, 18(1), p. 32-42.
Buck, D. and Stucki, D. J. (2000) Design early considered harmful: graduated exposure to complexity and structure based on levels of cognitive development. Proceedings of the thirty-first SIGCSE technical symposium on Computer science education, 32(1), p. 75-79.
Byrne, P. and Lyons, G. (2001) The effect of student attributes on success in programming. SIGCSE Bull., 33(3), p. 49-52.

Carroll, J. M. (1998) Minimalism beyond the Nurnberg Funnel, Cambridge, MA: MIT Press.
Chou, H. W. (2001) Influences of cognitive style and training method on training effectiveness. Computers & Education, 37(1).
Clark, D. (1995) Big Dog's Instructional Systems Design Page. Available at http://www.nwlink.com/~donclark/hrd/sat.html. [Last accessed 18 August 2004]
Collins, A., Brown, J. and Holum, A. (1991) Cognitive apprenticeship: Making thinking visible. American Educator, 6(11), p. 38-46.
Dempster, F. (1998) The Spacing Effect: A Case Study in the Failure to Apply the Results of Psychological Research. American Psychologist, 43, p. 627-634.
Dreyfus, H. and Dreyfus, S. (1986) Mind over Machine: The Power of Human Intuition and Expertise in the Era of the Computer, New York: Free Press.
du Boulay, B., O'Shea, T. and Monk, J. (1989) The black box inside the glass box: presenting computing concepts to novices. In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 431-446.
Ellis, H. J. C. (2002) Andragogy in a web technologies course. In Proceedings of the 33rd SIGCSE technical symposium on Computer science education, ACM Press, p. 206-210.
Fay, A. L. and Mayer, R. E. (1994) Benefits of Teaching Design Skills Before Teaching Logo Computer Programming: Evidence for Syntax Independent Learning. Journal of Educational Computing Research, 11(3), p. 187-210.
Gagne, R. (1985) The Conditions of Learning, New York: Holt, Rinehart & Winston.
Gibbs, D. C. (2000) The effect of a constructivist learning environment for field-dependent/independent students on achievement in introductory computer programming. In Proceedings of the thirty-first SIGCSE technical symposium on Computer Science Education, p. 207-211.
Ginat, D. (2001) Metacognitive awareness utilized for learning control elements in algorithmic problem solving. SIGCSE Bull., 33(3), p. 81-84.
Ginat, D. (2002) On varying perspectives of problem decomposition. In Proceedings of the 33rd SIGCSE technical symposium on Computer science education, p. 331-335.
Gotschi, T., Sanders, I. and Galpin, V. (2003) Mental models of recursion. In Proceedings of the 34th SIGCSE technical symposium on Computer science education, ACM Press, p. 346-350.
Grant, N. S. (2003) A study on critical thinking, cognitive learning style, and gender in various information science programming classes. In Proceedings of the 4th conference on Information technology curriculum, p. 96-99.

Haberman, B. and Kolikant, Y. (2001) Activating “black boxes” instead of opening “zipper” - a method of teaching novices basic CS concepts. SIGCSE Bull., 33(3), p. 41-44.
Hagan, D. and Markham, S. (2000) Does it help to have some programming experience before beginning a computing degree program? ACM SIGCSE Bulletin, 5th annual SIGCSE/SIGCUE conference on Innovation and technology in computer science education, 32, p. 25-28.
Hazzan, O. (2003) How Students Attempt to Reduce Abstraction in the Learning of Mathematics and in the Learning of Computer Science. Computer Science Education, 13(2), p. 95-123.
Hendrix, T. D., Cross, J. H., II, Maghsoodloo, S. and McKinney, M. L. (2000) Do visualizations improve program comprehensibility? Experiments with control structure diagrams for Java. In Proceedings of the thirty-first SIGCSE technical symposium on Computer science education, ACM Press, p. 382-386.
Howard, R. A., Carver, C. A. and Lane, W. D. (1996) Felder's learning styles, Bloom's taxonomy, and the Kolb learning cycle: tying it all together in the CS2 course. In Proceedings of the twenty-seventh SIGCSE technical symposium on Computer science education, p. 227-231.
Hunt, A. and Thomas, D. (2000) The Pragmatic Programmer: From Journeyman to Master, Addison-Wesley.
Kay, J., Barg, M., Fekete, A., Greening, T., Hollands, O., Kingston, J. H. and Crawford, K. (2000) Problem-Based Learning for Foundation Computer Science Courses. Computer Science Education, 10(2), p. 109-128.
Kearsley, G. (2003) Theory Into Practice Database. Available at http://tip.psychology.org/. [Last accessed 17th Oct, 2004]
Knowles, M. S. (1984) The Adult Learner: A Neglected Species, Houston: Gulf Publishing.
Kurland, D., Pea, R., Clement, C. and Mawby, R. (1989) A Study of the Development of Programming Ability and Thinking Skills in High School Students. In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 83-112.
Landa, L. (1976) Instructional Regulation and Control: Cybernetics, Algorithmization, and Heuristics in Education, Englewood Cliffs, NJ: Educational Technology Publications.
Lister, R. (2003) The five orders of teaching ignorance. SIGCSE Bull., 35(4), p. 16-17.
Mayer, R. E. (1989) The Psychology of How Novices Learn Computer Programming. In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 129-159.

McGill, T. J. and Volet, S. E. (1997) A Conceptual Framework for Analyzing Students' Knowledge of Programming. Journal of Research on Computing in Education, 29(3), p. 276-297.
Mendes, E. (2003) Applying the Cognitive Flexibility Theory to Teaching Web Engineering. In Fifth Australasian Computing Education Conference (ACE2003), 20, p. 113-117.
Morrison, M. and Newman, T. S. (2001) A study of the impact of student background and preparedness on outcomes in CS I. In Proceedings of the thirty-second SIGCSE technical symposium on Computer Science Education, ACM Press, p. 179-183.
Muller, O., Haberman, B. and Averbuch, H. (2004) (An almost) pedagogical pattern for pattern-based problem-solving instruction. In Proceedings of the 9th annual SIGCSE conference on Innovation and technology in computer science education, ACM Press, p. 102-106.
Murnane, J. S. and Warner, J. W. (2001) An Empirical Study of Junior Secondary Students' Expression of Algorithms in Natural Language. WCCE2001 Australian Topics: Selected Papers from the Seventh World Conference on Computers in Education, 8.
Perkins, D. N., Hancock, C., Hobbs, R., Martin, F. and Simmons, R. (1989) Conditions of Learning in Novice Programmers. In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 261-279.
Porter, R. and Calder, P. (2003) A Pattern-Based Problem-Solving Process for Novice Programmers. In Fifth Australasian Computing Education Conference (ACE2003), p. 231-238.
Porter, R. and Calder, P. (2004) Patterns in Learning to Program - An Experiment? In Proceedings of the Sixth Australasian Computing Education Conference (ACE2004), p. 193-199.
Putnam, R. T., Sleeman, D., Baxter, J. A. and Kuspa, L. K. (1989) A Summary of Misconceptions of High School BASIC Programmers. In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 301-314.
Rasala, R. (2000) Toolkits in first year computer science: a pedagogical imperative. In Proceedings of the thirty-first SIGCSE technical symposium on Computer science education, p. 185-191.
Rath, A. and Brown, D. E. (1995) Conceptions of Human-Computer Interaction: A Model for Understanding Student Errors. Journal of Educational Computing Research, 12(4), p. 395-409.
Reigeluth, C. M. (1980) The Elaboration Theory of Instruction: A Model for Sequencing and Synthesizing Instruction. Instructional Science, 9(3), p. 195-219.

Rist, R. S. (1989) Schema creation in programming. Cognitive Science, 13, p. 389-414.
Rist, R. S. (1995) Program Structure and Design. Cognitive Science, 19, p. 507-562.
Robins, A., Rountree, J. and Rountree, N. (2003) Learning and Teaching Programming: A Review and Discussion. Computer Science Education, 13(2), p. 137-172.
Ross, J. L., Drysdale, M. T. B. and Schultz, R. A. (2001) Cognitive Learning Styles and Academic Performance in Two Postsecondary Computer Application Courses. Journal of Research on Technology in Education, 33(4).
Schoenfeld, A. (1985) Mathematical Problem Solving, New York: Academic Press.
Shuell, T. J. (1986) Cognitive Conceptions of Learning. Review of Educational Research, 56(4), p. 411-436.
Skemp, R. (1976) Relational Understanding and Instrumental Understanding. Mathematics Teaching, 77, p. 20-26.
Slavin, R. E. (1994) Educational Psychology, Boston: Allyn and Bacon.
Smith, P. D. (1996) A Process Education Approach To Teaching Computer Science. In Association of Small Computer Users in Education (ASCUE) Summer Conference Proceedings (29th, North Myrtle Beach, SC, June 9-13, 1996), p. 11.
Spiro, R. J., Coulson, R. L., Feltovich, P. J. and Anderson, D. (1988) Cognitive flexibility theory: Advanced knowledge acquisition in ill-structured domains. In Proceedings of the 10th Annual Conference of the Cognitive Science Society.
Spohrer, J. C. and Soloway, E. (1989) Novice Mistakes: are the folk wisdoms correct? In Studying the Novice Programmer, (Eds, Soloway, E. and Spohrer, J. C.) Hillsdale, NJ: Lawrence Erlbaum, p. 401-416.
Teles, L. (1994) Cognitive apprenticeship on global networks. In Global Networks: Computers and International Communications, (Ed, Harasim, L.) Cambridge, MA: The MIT Press, p. 271-282.
Tesser, H., Al-Haddad, H. and Anderson, G. (2000) Instrumentation: a multi-science integrated sequence. SIGCSE Bull., 32(1), p. 232-236.
Thomas, L., Ratcliffe, M. and Thomasson, B. (2004) Scaffolding with object diagrams in first year programming classes: some unexpected results. In Proceedings of the 35th SIGCSE technical symposium on Computer science education, ACM Press, p. 250-254.

Thomas, L., Ratcliffe, M., Woodbury, J. and Jarman, E. (2002) Learning styles and performance in the introductory programming sequence. In Proceedings of the 33rd SIGCSE technical symposium on Computer science education, p. 33-37.
Van Gorp, M. J. and Grissom, S. (2001) An Empirical Evaluation of Using Constructive Classroom Activities to Teach Introductory Programming. Computer Science Education, 11(3), p. 247-260.
Vygotsky, L. S. (1978) Mind in Society, Cambridge, MA: Harvard University Press.
Wilson, B. C. and Shrock, S. (2001) Contributing to success in an introductory computer science course: a study of twelve factors. In Proceedings of the thirty-second SIGCSE technical symposium on Computer Science Education, p. 184-188.
Wilson, B. G. (1998) The postmodern paradigm. In Instructional Development Paradigms, (Eds, Dills, C. R. and Romiszowski, A. A.) Englewood Cliffs, NJ: Educational Technology Publications.
Winslow, L. E. (1996) Programming pedagogy - A psychological overview. SIGCSE Bull., 28, p. 17-22.
Witkin, H. A., Moore, C. A., Goodenough, D. R. and Cox, P. W. (1977) Field-dependent and field-independent cognitive styles and their educational implications. Review of Educational Research, 47, p. 1-64.
Zeller, A. (2000) Making students read and review code. SIGCSE Bull., 32(3), p. 89-92.
