
Testing the Effectiveness of eLearning: A Multidimensional Assessment of Blended Learning Experience in an EFL Writing Course in a Saudi University

Wael Hamed Alharbi
Yanbu University College (YUC)
Kingdom of Saudi Arabia


ABSTRACT

The effectiveness of e-learning across a wide range of disciplines is discussed more often than it is tested. Very few studies have empirically tested or tracked learning episodes in any given e-learning environment. First, this paper examines the processes through which learners interact with two types of online tools when they write in English as a foreign language (EFL). Second, it assesses the language-related episodes (LREs) that emerge from that interaction. Data were collected from an undergraduate EFL composition course in Saudi Arabia for one semester while students interacted with the online tools. Exploratory techniques were implemented that triangulate real-time screen recordings, online corpus and dictionary queries, and oral stimulated reflections. All of the students' interactions with the online tools were documented, including each learner's efforts to resolve language problems by retrieving, evaluating, and appropriating the tools' search results. The findings show that the learners' achievements depend both on their ability to interpret and exploit the search results and on the online tools' ability to respond to each learner's particular needs. Based on these findings, this paper suggests that computer-mediated communication research should focus on maximizing the learner's developmental potential by developing more advanced online corpus systems and enhancing the learner's textual awareness and analytical ability.

By working with the new tools, the L2 writers addressed their lexical, grammatical and lexicogrammatical challenges effectively and efficiently. The findings show that well-planned and gradually integrated blended education can facilitate the students' development in a way that is strikingly similar to that of an experienced tutor who helps students advance through the learning process. The findings also show that Learner-Tool-Interaction (LTI) analysis opens up an exciting new avenue for research on the next generation of computer-mediated pedagogy, one that recognizes, and thereby facilitates, language learning as a computer-mediated activity focused on the learner's development.

1. INTRODUCTION

Over the past two decades, corpora have been increasingly recognized as a powerful methodology in language learning and teaching (Conrad 2000; Granger et al. 2002; Hunston 2002; Sinclair 2004; Flowerdew 2005; Romer 2011). Drawing on the findings and recommendations of Tim Johns (1986, 1991) on using data-driven learning (DDL) as a guiding principle, much of the research on the pedagogical use of online corpora has concentrated on quantitative evaluation of learning outcomes (Cobb 1997; Gaskell and Cobb 2004) and/or reports of learners' attitudes toward and perceptions of using online corpora (Chambers and O'Sullivan 2004; Yoon and Hirvela 2004). Although these studies suggest that online corpora can generate positive learning outcomes, there is a lack of research that focuses on learners' real interactions with online corpora and on the learning process as it unfolds during these interactions.

This lack of empirical data about learners' behavior and development challenges the validity and usability of the findings presented in such studies, because it is difficult to attribute a learning outcome primarily to the use of online software without specifically testing how learners interact with it. The few studies that present data on how learners use corpora and online dictionaries (e.g. Chambers 2005; Yoon 2008) are methodologically inadequate, as (a) their analyses rely heavily on retrospective data, that is, learners' responses to questionnaires and/or their narrative accounts collected from reflective reports and interviews, and (b) they do not include data collected from first-hand observations of learners' composing behaviors or data that captures learners' texts as they are being constructed. To date, no study has examined learners' use of online corpora and online dictionaries in real time.

Because previous studies have focused exclusively on the use of concordances, little is known about how learners use, and may benefit from, online dictionaries and corpora at the same time. This paper offers a new layer of research data that attempts to evaluate the effectiveness of a blended learning experience in a higher education setting by providing evidence-based perspectives from the learners' online search queries and screen recordings. By defining a unit of analysis, we tracked the students' progress in order to establish whether or not their new online learning experience was successful.

2. REVIEW OF THE LITERATURE

2.1. Corpora as learning artifacts in language pedagogy

Throughout the history of corpus-assisted language pedagogy, a great number of studies have noted the potential benefits of using corpora to facilitate language learning (Stevens 1991; Gavioli 1997; Aston 1997, 2000; Gaskell and Cobb 2004). The basis of these studies is data-driven learning (DDL) (Johns 1991). According to DDL, directly confronting learners with real-language data, as opposed to invented examples, leads to effective learning. Ideally, by analyzing authentic data, learners will grasp the linguistic features and patterns of a language and how to use them (Flowerdew 1996; Tribble and Jones 1997; Wichmann et al. 1997; Conrad 2000; Hunston 2002; Sinclair 2004; Romer 2011). With regard to data-based learning, computers as learning artifacts have played a vital role in facilitating interactive multimedia-based instruction (Kennedy 2004) and the development of pragmatic competence (Belz 2007).

2.2. Corpora in second language writing pedagogy

Corpus technology holds considerable promise for writing pedagogy: its contribution is recognized mainly in research on textual analysis and the development of materials for language teaching (Tribble and Jones 1997; Thurstun and Candlin 1998; Hyland 2002), teaching through the use of large reference corpora (Bernardini 2000; Yoon 2008), small, specialized corpora (Aston 1997, 2002; Ghadessy et al. 2001), and, more recently, online corpora (Anderson and Corbett 2009). In the classroom, researchers have primarily focused on learners' evaluations of corpus-based activities and/or learners' actual use of a corpus. Research into learners' evaluations has centered on the corpus's role in fostering learner autonomy, awareness, and confidence (Chambers and O'Sullivan 2004; Yoon and Hirvela 2004; O'Sullivan and Chambers 2006; Yoon 2008). Research into learners' corpus use has examined learners' experiences in various contexts, including legal writing (Hafner and Candlin 2007), process-oriented instruction (O'Sullivan 2007), and creative writing (Kennedy and Miceli 2010).

2.3. Language-related episodes in learner–online artifact interactions

This paper draws on the construct of language-related episodes (LREs): segments of learner interaction in which learners either talk about or question their own or others' language use within the context of carrying out a given task in the L2 (Swain, 1998; Swain & Lapkin, 2001). During LREs, learners attempt to make connections between meaning, forms, and/or functions (Swain, 1998). More specifically, LREs include instances in which learners may (a) question the meaning of a linguistic item; (b) question the accuracy of a word's spelling or pronunciation; (c) question the accuracy of a grammatical form; or (d) implicitly or explicitly correct their own or another's usage of a word, form or structure (Swain, 1998). In addition, LREs may include the use of metalinguistic terminology or the articulation of a rule; in most cases, however, they do not (Swain, 1998).

Swain further explains that it is through LREs that learners notice erroneous utterances and formulate hypotheses while engaging in other language learning processes, such as comprehension. LREs have received considerable attention in focus-on-form research, since this kind of attention to form “may serve the function of helping students to understand the relationship between meaning, forms, and function in a highly context-sensitive situation” (Swain, 1998, p. 69). Furthermore, these episodes may represent language learning in progress (Swain, 1998; Swain & Lapkin, 2001).

When studying the interactions between learners and the online tools, the key challenge is twofold: first, to identify instances of LREs, and second, to determine how the learner's interactions with the tools influence their LREs in each instance. To address this, we seek to answer the following research questions:


1. What are the processes through which learners interact with online dictionaries and corpora?

2. How does the development of language-related episodes (LREs) emerge from learners' interactions in this context?

3. METHOD

3.1. Research context

This study is part of a larger project in which the researcher observed an intermediate EFL writing class at a large Saudi college over the course of one semester. The course aimed to prepare first-year Saudi students for university-level academic writing. Twenty-five undergraduate students were enrolled in the course, and this paper documents and analyzes the interactions between three of these students and the online tools. Participation in the study was voluntary and followed the Saudi Arabian Ministry of Higher Education's guidelines. The linguistic and educational backgrounds of these focal students were similar: all were male, Saudi Arabian, and spoke Arabic as their first language. In addition, they were all Management Information Systems (MIS) majors with seven years of English as a foreign language preparation in general education. The students are referred to by the pseudonyms Ali, Badr and Sultan.

These three participants had a similar level of proficiency in written English: their proficiency was intermediate as measured by their scores on the placement test they were required to pass as a prerequisite for enrolling in the preparatory year program. Although the students' oral proficiency was not measured as part of the study, their previous semester's formal proficiency exam scores were available. The formal proficiency exams were part of the final exams, which the students completed prior to advancing to their current level. Their scores ranged from 70 to 80 out of 100, with an average of 75 on the oral tests.

None of the students had any experience using concordances, corpora, or any computer-assisted learning software. However, they were familiar with online dictionaries and search engines, and used them on a daily basis. They were also familiar with using word processors in Arabic and English. Their familiarity with computers, word processors and search engines saved the researcher time and effort by limiting the necessary training to corpus use.

3.2. The online tools

The online tools referenced in this study are online dictionaries and corpora. A corpus is usually defined as a database of texts stored on a computer and available for linguistic analysis (Biber et al. 1998; Meyer 2002). The 'corpus systems' that this paper describes consist of two different kinds of corpora: (i) the freely searchable 425-million-word Corpus of Contemporary American English (COCA), the largest corpus of American English currently available, and (ii) a smaller corpus compiled by the researcher from simplified English texts (VOA). The dictionaries were the Al Mawrid English-Arabic-English Dictionary and the Longman English-English Dictionary.

3.3. The Tasks

The students were given six major essay assignments during the semester as part of their regular course work. They were asked to write six three-hundred-word expository essays on various topics, three of which were designed, assigned and assessed online. These six tasks yielded the study data: the students' interactions with the online tools were recorded as they engaged in authentic writing tasks.

3.4. Training

Since introducing these new tools to the class constituted an intervention, we needed to train the participants in their use. Although the results of the biodata questionnaire show that the participants were familiar with using online dictionaries when writing their English assignments, they had no experience with corpora, nor had they heard of them. The class met three times a week for two-hour sessions; two sessions were held in the classroom, and one took place online.

In addition to completing their regular course work, students attended hands-on training sessions on consulting corpora. The training program was introduced throughout the semester over three phases. In phase one, the researcher provided the students with a general overview of corpora and concordances, including online tasks that required guided use of corpora. In phase two, the students were trained to access and search the corpora for themselves in order to address lexical, grammatical, and lexicogrammatical issues. In phase three, each student had a one-to-one online session with the researcher, who offered feedback on their corrected drafts. During this phase, students had the chance to ask the researcher about the problems they faced when consulting the corpora and about ways of solving their language problems with them.

3.5. Data Collection

This study collected four types of data: (i) screen recordings, (ii) corpus and dictionary search queries, (iii) oral reflections, and (iv) the students' essays. The screen recordings are video clips created by screen-capture software that records the user's composing activity on the computer screen. Due to their ability to capture and subsequently replay the user's activity, screen recordings have been used to examine user behavior in computer-mediated learning activities (e.g. Lai and Zhao 2006; Smith 2008). The software program Camtasia was used to create screen recordings during the six in-class writing tasks, each of which lasted 45 minutes. Overall, six screen recordings were collected for each student, one video clip per student per in-class writing session.


3.6. Unit of Analysis

In research on language-related episodes (LREs), these episodes are identified based on the presence of two kinds of evidence: first, any language-learning activity that shows the learner's awareness of language use; and second, changes implemented by the learner that improve their language use. Simply put, an LRE instance consists of three key components: a linguistic need (i.e. a linguistic item that poses a challenge), awareness of that need (e.g. questions about, comments on, and discussions of an item), and some sign of linguistic improvement (e.g. correction and revision).

This paper proposes that corpus and online dictionary queries, along with supporting data from the stimulated recalls and screen recordings, can compensate for the absence of verbalized LREs. Triangulation, that is, the combined analysis of varied data collected from multiple sources (Watson-Gegeo 1988; Davis 1995; Lazaraton 1995), constitutes the primary analytical technique. Figure 1 compares the analytical units of LRE and LTI (Learner-Tool-Interaction).

Key aspects of LRE and LTI:

Linguistic need — in LRE analysis, verbalized in the stimulated recall interview; in LTI analysis, inferred from the query analysis and visualized in screen recordings.

Example (stimulated recall, LRE):

1. I wanted to use the word (خطبة) in a sentence, so I decided to look it up in the Arabic-English dictionary.

2. Here I wanted to know what collocates better with the word 'speech', so I looked for collocations in the COCA corpus. I found that both 'deliver' and 'give' are verb collocates for the noun 'speech', so I had to choose one of them.

3. For this reason, I conducted a frequency comparison between the two collocates and found that 'deliver a speech' is more frequent in academic English.

Example (corpus queries, LTI):

Query #1: 'speech'
Query #2: 'give a speech'
Query #3: 'deliver a speech'

Figure 1: Analytical units in LRE and LTI
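
To make the coding unit concrete, the sketch below shows one way an LTI record could be represented for analysis. It is a minimal, illustrative data structure only; the class and field names are not taken from the study, and the example is populated with the 'speech' episode from Figure 1.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Query:
    """A single search submitted to an online tool (corpus or dictionary)."""
    tool: str   # e.g. "COCA", "VOA corpus", "Al Mawrid", "Longman"
    text: str   # the query string, e.g. "deliver a speech"

@dataclass
class LTIRecord:
    """One Learner-Tool-Interaction instance, mirroring the three LRE components."""
    linguistic_need: str                 # inferred from query analysis / screen recording
    queries: List[Query] = field(default_factory=list)
    awareness: Optional[str] = None      # verbalized in the stimulated recall, if available
    improvement: Optional[str] = None    # revision observed in the draft, if any
    effect: str = "neutral"              # later coded as "positive", "negative", or "neutral"

# The 'speech' episode from Figure 1, expressed as one record
episode = LTIRecord(
    linguistic_need="verb collocate for the noun 'speech'",
    queries=[Query("COCA", "speech"),
             Query("COCA", "give a speech"),
             Query("COCA", "deliver a speech")],
    awareness="compared the frequency of the two collocates in academic English",
    improvement="chose 'deliver a speech' in the revised sentence",
    effect="positive",
)
```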

3.7. Data coding

Once identified, the search processes were coded for instances of LREs. To identify instances of LREs, the students' queries and essays were compared to determine whether each search process had a positive, negative, or neutral effect on essay quality, that is, whether it resulted in improved writing and correct revisions, in worsened writing and incorrect revisions, or in no change to the writing. All search processes were reviewed in combination with screen recordings and the learners' reflections whenever these were available. The researcher and a trained independent coder coded the data using a coding scheme, achieving an inter-rater reliability coefficient of 0.84.
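
The paper reports the reliability coefficient without naming the statistic. As one plausible reading, the sketch below assumes Cohen's kappa over the three-way coding (positive, negative, neutral); the toy labels are invented purely for illustration.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' labels over the same set of items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(freq_a) | set(freq_b)
    # Chance agreement expected from each coder's marginal label frequencies
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Invented toy labels using the study's three codes
a = ["positive", "positive", "neutral", "negative", "positive", "neutral"]
b = ["positive", "neutral",  "neutral", "negative", "positive", "neutral"]
print(round(cohens_kappa(a, b), 2))
```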


4. RESULTS AND DISCUSSION

4.1. Instances of Language Related Episodes

Instances of LREs in this study of LTI were identified by determining which search processes resulted in improved writing and revision. Table 1 shows that more than half of the search processes (58%) had positive effects on writing and revisions, whereas nine search processes negatively affected the quality of the writing (3%). For the remaining search processes (39%), no relationship between the use of the corpus and the learner's writing could be established.

Search processes                          Ali    Badr   Sultan   Sum
Positive effects                           18     22      72     112 (58%)
Negative effects                            0      3       6       9 (3%)
Non-effects                                19     32      40      93 (39%)
Sum                                        37     57     118     212
Total number of queries                    48     87     194     304
Average queries per search process        1.3    1.5     1.6

Table 1: Transactions and instances of LREs
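
The summary rows of Table 1 follow directly from the raw counts. The short sketch below reproduces that arithmetic (212 search processes, 304 queries, and the per-learner averages of 1.3, 1.5 and 1.6 queries per search process) from the figures in the table.

```python
# Raw counts from Table 1: search processes by coded effect, plus total queries
counts = {
    "Ali":    {"pos": 18, "neg": 0, "none": 19, "queries": 48},
    "Badr":   {"pos": 22, "neg": 3, "none": 32, "queries": 87},
    "Sultan": {"pos": 72, "neg": 6, "none": 40, "queries": 194},
}

total_processes = 0
total_queries = 0
for name, c in counts.items():
    processes = c["pos"] + c["neg"] + c["none"]
    total_processes += processes
    total_queries += c["queries"]
    # Bottom row of Table 1: average number of queries per search process
    print(f"{name}: {processes} search processes, "
          f"{c['queries'] / processes:.1f} queries per process")

print(f"All learners: {total_processes} search processes, {total_queries} queries")
```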

The search processes reflect the learners' efforts to address the issues that arose during their writing activity. Table 1 shows that there were 212 search processes (involving 304 queries), suggesting that, as a group, the learners encountered at least this number of challenges. The learners were able to resolve some issues through their search processes. In some search processes, however, the use of the online tools complicated the issues, resulting in erroneous revisions that led to a deterioration in writing quality. Moreover, in a total of 93 instances, no evidence was found as to whether the learners benefited from using the corpus.

Table 1 suggests that the learners benefited to differing degrees from their searches of the online tools. The following sections outline the structure of LTI and then explain these outcomes through an analysis of the interactions in which the learners used the online tools.

4.2. Sequences in LTI

A learner's interaction with the corpus system consists of sequenced procedures with transactions at the core. The learner initiates a procedure when he or she encounters a challenge in trying to complete a particular task and is motivated to engage in problem solving. The procedures are recursive, as a task can contain multiple challenges, each of which calls for the initiation of a procedure, that is, an effort to resolve the issue. The procedure is shown in Figure 2.

Figure 2: LTI sequences
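
As a rough illustration of the recursive procedure in Figure 2, the sketch below walks successive rounds of candidate phrasings through the querying and evaluation steps. The function, the frequency table and the candidate rounds are hypothetical stand-ins for the online tools, not part of the study.

```python
def lti_sequence(refinement_rounds, corpus_freq):
    """Schematic version of the LTI procedure in Figure 2:
    challenge analysis -> querying -> evaluation, refining until resolved."""
    for candidates in refinement_rounds:                  # one round per query refinement
        results = {c: corpus_freq.get(c, 0) for c in candidates}  # querying the tools
        best = max(results, key=results.get)                       # evaluating the results
        if results[best] > 0:
            return best                                   # evidence found: revise the draft
    return None                                           # challenge left unresolved

# Invented counts standing in for corpus search results
freq = {"deliver a speech": 3, "give a speech": 2}
rounds = [["give a speech", "deliver a speech"]]          # a single round, two candidates
print(lti_sequence(rounds, freq))                         # -> deliver a speech
```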

4.3. Analyzing linguistic challenges

A learner's search process with the online tools begins when he encounters an issue. To initiate a search process, a learner must first recognize the issue and then decide to solve it. In resolving the issue, the learner conducts a challenge analysis in order to define the challenge and decide on query strategies, that is, what to search for and how. The challenges are verbalized in the students' reflections during the stimulated recall interviews, as shown in Table 2.

Key aspects of the LTI instance:

Linguistic need — inferred from the query analysis and visualized in screen recordings: subject-verb agreement with the noun 'news'.
Query #08: 'the news were'
Query #09: 'the news was'

Awareness of the challenge — verbalized in the learner's oral retrospection: "I was not sure if I should use a singular verb with news or a plural one."

Sign of improvement — shown in the revised text.
Before revision (wrong verb): "... and I wanted to surprise him. He was happy to see me and the news were welcomed by him ..."
After revision (correct verb choice): "... and I wanted to surprise him. He was happy to see me and the news was welcomed by him ..."

Table 2: Challenge analysis and resolution in an LTI instance
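
The learner's strategy here is a simple frequency comparison between two competing phrasings. COCA itself is queried through its web interface, so the sketch below uses a small stand-in text and a regular-expression count to illustrate the same comparison; the sample sentences are invented.

```python
import re

def phrase_count(text, phrase):
    """Count whole-phrase occurrences of `phrase` in `text`, case-insensitive."""
    return len(re.findall(r"\b" + re.escape(phrase) + r"\b", text, flags=re.IGNORECASE))

# A few invented sentences standing in for concordance lines; in the study the
# learner ran the same comparison against COCA.
sample = ("The news was welcomed by everyone in the room. "
          "She said the news was hard to believe. "
          "He asked whether the news were true.")

for phrase in ["the news was", "the news were"]:
    print(phrase, "->", phrase_count(sample, phrase))
```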


4.4. Querying

The three focal learners performed a combined total of 304 queries during the semester. Sultan consulted the online tools most often (194 times), followed by Badr (87 times) and then Ali (48 times). In general, the learners' queries were relatively short, averaging about 2.7 words per query. Among these queries, 191 (58%) are accounted for by repetitive querying, that is, the process of querying the online tools with a series of related and refined searches.
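
The query-log statistics reported here (average query length and the share of repetitive querying) could be computed along the lines of the sketch below. The grouping rule, namely that consecutive queries by the same learner sharing a word form a repetitive chain, is an assumption made for illustration, and the example log is invented.

```python
def query_stats(query_log):
    """Average query length in words, plus the share of queries that belong to a
    repetitive chain (consecutive queries by the same learner that share a word)."""
    lengths = [len(q.split()) for _, q in query_log]
    repetitive = 0
    for (user_prev, prev), (user_cur, cur) in zip(query_log, query_log[1:]):
        if user_prev == user_cur and set(prev.lower().split()) & set(cur.lower().split()):
            repetitive += 1
    return sum(lengths) / len(lengths), repetitive / len(query_log)

# Invented query log: (learner, query string)
log = [("Sultan", "speech"), ("Sultan", "give a speech"), ("Sultan", "deliver a speech"),
       ("Badr", "the news was"), ("Badr", "the news were"), ("Ali", "commerce")]
avg_len, rep_share = query_stats(log)
print(f"average words per query: {avg_len:.1f}, repetitive share: {rep_share:.0%}")
```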

4.5. Evaluation of the search results and decision-making

Once the learners had queried the online tools and retrieved their search results, they evaluated the quality of the results in terms of the extent to which the results met their needs. The triangulated data show that the three focal learners differed in how they evaluated the search results.

CONCLUSION

This paper sought to answer the following questions through LTI analysis: What are the processes through which learners interact with an online corpus system? And, how does the development of Language Related Episodes emerge from learners’ interactions in this context?

By developing analytical procedures and analyzing triangulated data, this paper documented the interactions that make up each learner’s efforts to resolve language issues by retrieving, evaluating, and appropriating search results from the online tools. The study results show that learners benefit from LTI, as attested to by the 112 identified instances of LREs. In these instances, learners demonstrated both enhanced textual performance, that is improved writing, and enhanced awareness of their own lexical, grammatical and lexicogrammatical challenges and problem-solving processes.

This paper introduced the technique of recording the learner's screen in order to capture and analyze composing behavior as it unfolds. Based on an analysis of the real-time data, this study suggests that each instance of LRE emerges from a series of queries, or search processes, rather than from a quick, one-shot search and an unevaluated use of the search results. Structurally, each LTI typically consists of performing a challenge analysis, drawing on an understanding of the question at hand; querying the online tools, that is, engaging in a strategic activity to elicit information from them, either individually or in combination; and evaluating the results in order to determine if and how to use them. The evaluation may be based on the learner's analysis of a single query, or it may prove very complex and necessitate multiple queries. Such a series of successively refined searches typically produces results that are different, yet closely related.

LIMITATIONS

This study was limited in that it did not look into the lecturer's role in enabling positive learning outcomes. Although the learners referred to the lecturer's feedback as they used the online tools, it was not easy to trace the impact of this feedback with the methods employed here. This study also did not offer a complete account of the learners' attitudes toward and perceptions of using these online tools. Although an extensive description of these aspects is not the focus of this study, they would be worth investigating. Furthermore, the study was limited by its small number of participants, which raises questions about the generalizability of the results. Nevertheless, this limitation does not mean that the findings cannot support sound recommendations. Rather, they may inform the way in which we evaluate online learning experiences in the future.


REFERENCES

Aljaafreh, A. and J. Lantolf. 1994. ‘Negative feedback as regulation and second language learning in the zone of proximal development,’ The Modern Language Journal 78: 465–87.

Anderson, W. and J. B. Corbett. 2009. Exploring English with Online Corpora. Palgrave.

Aston, G. 1997. ‘Small and large corpora in language learning’ in B. Lewandowska-Tomaszczyk and J. P. Melia (eds): Practical Applications in Language Corpora. Lodz University Press.

Aston, G. 2000. ‘Corpora and language teaching’ in L. Burnard and T. McEnery (eds): Rethinking Language Pedagogy from a Corpus Perspective. Peter Lang.

Aston, G. 2002. ‘The learner as corpus designer’ in B. Kettemann and G. Marko (eds): Teaching and Learning by Doing Corpus Analysis. Rodopi.

Belz, J. A. 2004. ‘Learner corpus analysis and the development of foreign language proficiency,’ System 32: 577–91.

Belz, J. A. 2007. ‘The role of computer mediation in the instruction and development of L2 pragmatic competence,’ Annual Review of Applied Linguistics 27: 45–75.

Belz, J. A. and C. Kinginger. 2002. ‘The cross-linguistic development of address form use in telecollaborative language learning: Two case studies,’ The Canadian Modern Language Review 59/2: 189–214.

Belz, J. A. and N. Vyatkina. 2005. ‘Learner corpus analysis and the development of L2 pragmatic competence in networked inter-cultural language study: The case of German modal particles,’ Canadian Modern Language Review 62/1: 17–48.

Bennett, G. R. 2010. Using Corpora in the Language Learning Classroom: Corpus Linguistics for Teachers. University of Michigan Press.

Bernardini, S. 2000. ‘Systematising serendipity: Proposals for concordancing large corpora with language learners’ in L. Burnard and T. McEnery (eds): Rethinking Language Pedagogy from a Corpus Perspective. Peter Lang.

Biber, D., S. Conrad, and R. Reppen. 1998. Corpus Linguistics: Investigating Language Structure and Use. Cambridge University Press.

Breyer, Y. 2009. ‘Learning and teaching with corpora: reflections by student teachers,’ Computer Assisted Language Learning 22/2: 153–72.

Chambers, A. 2005. ‘Integrating corpus consultation in language studies,’ Language Learning & Technology 9/2: 111–25.

Chambers, A. and I. O’Sullivan. 2004. ‘Corpus consultation and advanced learners’ writing skills in French,’ ReCALL 16/1: 158–72.

Chapelle, C. 1998. ‘Research on the use of technology in TESOL: analysis of interaction sequences in CALL,’ TESOL Quarterly 32/4: 753–57.

Cobb, T. 1997. ‘Is there any measurable learning from hands-on concordancing?,’ System 25/3: 301–31.

Conrad, S. 2000. ‘Will corpus linguistics revolutionize grammar teaching in the 21st century?,’ TESOL Quarterly 34/3: 548–60.

Davis, A. 1995. ‘Qualitative theory and methods in applied linguistics research,’ TESOL Quarterly 29/3: 427–53.

Donato, R. 1994. ‘Collective scaffolding in second language acquisition’ in J. P. Lantolf and G. Appel (eds): Vygotskian Approaches to Second Language Research. Ablex.

Flowerdew, J. 1993. ‘Concordancing as a tool in course design,’ System 21/2: 231–44.

Flowerdew, J. 1996. ‘Concordancing in language learning’ in M. Pennington (ed.): The Power of CALL. Athelstan.

Flowerdew, L. 2005. ‘An integration of corpus-based and genre-based approaches to text analysis in EAP/ESP: countering criticisms against corpus-based methodologies,’ English for Specific Purposes 24/3: 321–32.

Gaskell, D. and T. Cobb. 2004. ‘Can learners use concordance feedback for writing errors?,’ System 32/3: 301–19.

Gavioli, L. 1997. ‘Exploring texts through the concordancer: Guiding the learner’ in A. Wichmann, S. Fligelstone, T. McEnery, and G. Knowles (eds): Teaching and Language Corpora. Longman.

Ghadessy, M., A. Henry, and R. Roseberry (eds). 2001. Small Corpus Studies and ELT. John Benjamins.

Granger, S., J. Hung, and S. Petch-Tyson (eds). 2002. Computer Learner Corpora, Second Language Acquisition and Foreign Language Teaching. John Benjamins.

Hafner, C. and C. Candlin. 2007. ‘Corpus tools as an affordance to learning in professional legal education,’ Journal of English for Academic Purposes 6/4: 303–18.

Hunston, S. 2002. Corpora in Applied Linguistics. Cambridge University Press.

Hyland, K. 2002. ‘Activity and evaluation: Reporting practices in academic writing’ in J. Flowerdew (ed.): Academic Discourse. Longman.

Johns, T. 1986. ‘Micro-concord: A language learner’s research tool,’ System 14/2: 151–62.

Johns, T. 1991. ‘Should you be persuaded: two samples of data-driven learning materials,’ English Language Research Journal 4: 1–16.

Kennedy, C. and T. Miceli. 2010. ‘Corpus-assisted creative writing: introducing intermediate Italian learners to a corpus as a reference resource,’ Language Learning & Technology 14/1: 28–44.

Kennedy, G. 2004. ‘Promoting cognition in multimedia interactivity research,’ Journal of Interactive Learning Research 15/1: 43–61.

Lai, C. and Y. Zhao. 2006. ‘Noticing and text-based chat,’ Language Learning & Technology 10/3: 102–20.

Lantolf, J. P. (ed.). 2000. Sociocultural Theory and Second Language Learning. Oxford University Press.

Lazaraton, A. 1995. ‘Qualitative research in applied linguistics: a progress report,’ TESOL Quarterly 29/3: 455–72.

Lee, D. and J. Swales. 2006. ‘A corpus-based EAP course for NNS doctoral students: moving from available specialized corpora to self-compiled corpora,’ English for Specific Purposes 25/1: 56–75.

Meyer, C. 2002. English Corpus Linguistics: An Introduction. Cambridge University Press.

O’Keeffe, A., M. J. McCarthy, and R. Carter. 2007. From Corpus to Classroom: Language Use and Language Teaching. Cambridge University Press.

O’Sullivan, Í. 2007. ‘Enhancing a process-oriented approach to literacy and language learning: the role of corpus consultation literacy,’ ReCALL 19/3: 269–86.

O’Sullivan, Í. and A. Chambers. 2006. ‘Learners’ writing skills in French: corpus consultation and learner evaluation,’ Journal of Second Language Writing 15/1: 49–68.

Park, K. 2012. ‘Learner-corpus interaction: A locus of microgenesis in corpus-assisted L2 writing,’ Applied Linguistics.

Park, K. and C. Kinginger. 2010. ‘Writing/thinking in real time: Digital video and corpus-query analysis,’ Language Learning & Technology 14/3: 30–49.

Romer, U. 2009. ‘Corpus research and practice: What help do teachers need and what can we offer?’ in K. Aijmer (ed.): Corpora and Language Teaching. John Benjamins.

Romer, U. 2011. ‘Corpus research applications in second language teaching,’ Annual Review of Applied Linguistics 31: 205–25.

Sinclair, J. 2003. Reading Concordances: An Introduction. Longman.

Sinclair, J. (ed.). 2004. How to Use Corpora in Language Teaching. John Benjamins.

Smith, B. 2008. ‘Methodological hurdles in capturing CMC data: the case of the missing self-repair,’ Language Learning & Technology 12/1: 85–103.

Stevens, V. 1991. ‘Concordance-based vocabulary exercises: a viable alternative to gap-filling,’ ELR Journal 4: 47–61.

Swain, M. and S. Lapkin. 2001. ‘Focus on form through collaborative dialogue: Exploring task effects’ in M. Bygate, P. Skehan, and M. Swain (eds): Researching Pedagogic Tasks: Second Language Learning, Teaching and Testing. Longman.

Thorne, S., J. Reinhardt, and P. Golombek. 2008. ‘Mediation as objectification in the development of professional academic discourse: a corpus-informed curricular innovation’ in J. Lantolf and M. Poehner (eds): Sociocultural Theory and the Teaching of Second Languages. Equinox.

Thurstun, J. and C. Candlin. 1998. ‘Concordancing and the teaching of the vocabulary of academic English,’ English for Specific Purposes 17/3: 267–80.

Tribble, C. and G. Jones. 1997. Concordances in the Classroom: A Resource Book for Teachers. Longman.

Vygotsky, L. S. 1978. Mind in Society: The Development of Higher Psychological Processes. Harvard University Press.

Vygotsky, L. S. 1986. Thought and Language. MIT Press.

Watson-Gegeo, K. A. 1988. ‘Ethnography in ESL: Defining the essentials,’ TESOL Quarterly 22/4: 575–92.

Wertsch, J. 1985. Vygotsky and the Social Formation of Mind. Harvard University Press.

Wichmann, A., S. Fligelstone, T. McEnery, and G. Knowles (eds). 1997. Teaching and Language Corpora. Longman.

Yoon, H. 2008. ‘More than a linguistic reference: the influence of corpus technology on L2 academic writing,’ Language Learning & Technology 12/2: 31–48.

Yoon, H. and A. Hirvela. 2004. ‘ESL student attitudes toward corpus use in L2 writing,’ Journal of Second Language Writing 13/4: 257–83.