
British Journal of Educational Technology doi:10.1111/bjet.12940

Vol 51 No 4 2020 1019–1038

© 2020 British Educational Research Association

Evaluation of a learning analytics tool for supporting teachers in the creation and evaluation of accessible and quality open educational resources

Cecilia Avila, Silvia Baldiris, Ramon Fabregat and Sabine Graf

Cecilia Avila is a professor and researcher at the Fundación Universitaria Konrad Lorenz in Bogotá (Colombia). She has a PhD in Technology from the University of Girona. Her research interests are semantic web, web accessibility, usability, learning analytics and open educational resources. Silvia Baldiris is a full professor and researcher at the International University of La Rioja, Logroño, Spain. She has a PhD in Technology from the University of Girona. She is interested in the field of technology enhanced learning and on topics such as authoring tools, open educational resources, universal design for learning and web accessibility. Ramon Fabregat is co-director of the BCDS research group in the University of Girona and is an associate  professor at the Institute of Informatics and Applications (IIiA). He has a PhD in computer science from the same university. His research interests are e-learning, adaptive systems, user modeling and inclusive learning. Sabine Graf is presently a full professor (tenured) at the School of Computing and Information Systems, Athabasca University, Canada. She has a PhD from Vienna University of Technology, Austria. Her research areas are: e-learning, personalization, learning analytics and academic analytics, user modeling and game-based learning. Address for correspondence: Cecilia Avila, Faculty of Mathematics and Engineering, Fundación Universitaria Konrad Lorenz, Bogotá, Colombia. Email: [email protected]

Abstract
The learning analytics (LA) field seeks to analyze data about students' interactions, and it has been applied in the development of tools for supporting both learning and teaching processes. Recent research has paid attention to how LA may benefit teachers in the creation of educational resources. However, most of the research on LA solutions is carried out to support teachers in analyzing students' behavior data collected as they interact with virtual learning environments. While solutions to support teachers in different virtual learning and teaching contexts have become important, to date little research has been done on how LA solutions can help teachers to create and evaluate Open Educational Resources (OERs). This study presents the evaluation of a LA tool for supporting teachers in the creation and evaluation of accessible and quality OERs, considering that both processes fall within the competences that teachers can acquire and strengthen by participating as authors (creation) and evaluators (evaluation) of OERs. The study was conducted with Colombian teachers and the results obtained highlight the positive effect the tool had on the teachers' acquisition of the competences and the positive attitude they had toward using the tool.

Introduction
The UNESCO guidelines for Open Educational Resources (OERs) in higher education promote investment in the systematic production, adaptation and use of OERs that address students' learning needs (Commonwealth of Learning, 2011). Thus, teachers are challenged with providing OERs that meet characteristics such as web accessibility and quality, two key features that make them suitable for all students (Almendro & Silveira, 2018; Chen, Sanderson, Kessel, & Królak, 2015; Marín, Orellana, & Peré, 2019; Moise, Vladoiu, & Constantinescu, 2014). Research on how to support teachers in this is limited. According to Iniesto, McAndrew, Minocha, and Coughlan (2016), "accessibility does not appear to have been considered in a consistent way when designing online learning resources". Some studies have also stated that evaluating digital contents demands technical or specialized knowledge (Restrepo, Amado-Salvatierra, Otón, & Hilera, 2018; Vlachogiannis, Darzentas, & Koutsabasis, 2010). Moreover, authoring tools should provide feedback about the learning contents that need to be improved at design time (Chen et al., 2015).

Solutions based on learning analytics (LA) have demonstrated to be suitable for supporting teachers in designing educational resources, a process in which decision-making and feedback tools (eg, dashboards) are crucial for the improvement and evaluation of learning contents and activities (Albó, Barria-Pineda, Brusilovsky, & Hernández-Leo, 2019; Hernández-Leo, Martinez-Maldonado, Pardo, Muñoz-Cristóbal, & Rodríguez-Triana, 2019; Mangaroska & Giannakos, 2017). However, most LA tools focus on informing teachers about students' interaction with learning activities or contents (Brooks, Erickson, Greer, & Gutwin, 2014; Fernandez-Delgado, Mucientes, Vazquez-Barreiros, & Lama, 2014; Leacock & Nesbit, 2007) and not on tracing the activities performed by teachers while creating OERs. Bodily, Nyland, and Wiley (2017) pointed out that LA may help identify improvements for OER contents and that continuous improvement drives high-quality contents.

The purpose of this paper, therefore, is to investigate the use of LA as an opportunity to support teachers in learning how to create and evaluate OERs by tracing their activities in these processes, all the while keeping web accessibility and quality in mind. Because this entails a learning process, we consider the creation and evaluation processes to form part of a teacher's digital competences. The study was conducted with 19 Colombian school and university teachers and it seeks to analyze: (1) whether the LA tool helps teachers acquire competences in the creation and evaluation of OERs, and (2) the teachers' perceptions about the tool.

The paper is structured as follows. The second section describes the theoretical background and related work on web accessibility and quality in OERs, competences and LA. The third section presents an overview of the LA tool. The fourth section describes the methodology of the evaluation of the LA tool. The fifth section presents the results and discussion, whereas the sixth section presents the threats to validity of this study. Finally, the seventh section presents the conclusions and future work.

Practitioner Notes
What is already known about this topic

• Most of the Learning Analytics (LA) solutions focus on tracing students’ interactions with learning environments and few focus on learning design processes.

• Teachers require specialized knowledge and skills to participate in the creation and evaluation of OERs that meet web accessibility and quality requirements.

What this paper adds

• The evaluation of a LA tool for supporting teachers in the creation and evaluation of OERs.

• Demonstrates that LA is useful for letting teachers know how well they create and evaluate accessible, quality OER contents.

Implications for practice and/or policy

• Current authoring tools for creating OERs are moving toward solutions that help teachers work without the need for expert guidance.

• Tools based on LA may contribute to the continuous improvement of OERs.



Theoretical background and related work
Web accessibility and quality in OERs
Creating OERs that address students' diverse learning needs is a challenge for teachers, who should consider characteristics such as web accessibility (to avoid barriers to content access) and quality (to make learning contents more appropriate for the learning context for which they are intended). On the one hand, quality is one of the most discussed features of OERs (Marín et al., 2019) and is recognized to contribute to better retrieval and sharing of OERs (Almendro & Silveira, 2018). Moreover, Moise et al. (2014) stated that the lack of quality assurance hinders the uptake and usage of OERs. On the other hand, accessible OERs contribute to flexible learning experiences for all (CAST, 2018), and this feature is also considered when evaluating quality in OERs. In their literature review about standards and trends for OERs, Cueva and Rodríguez (2010) pointed out three aspects for future research in the field of OERs, namely accessibility standards, social components and the semantic web. As for accessibility standards, they highlight the use of standards such as the Web Content Accessibility Guidelines (WCAG) specification.

Most of the research on OER web accessibility and quality has focused on evaluating the contents after their creation (Kumar & Owston, 2016; Kurilovas, Bireniene, & Serikoviene, 2010; Mishra & Kanwar, 2015). However, evaluating an OER while it is still being created contributes to a better adoption once the resource is published in a virtual learning environment or in a repository (Marín et al., 2019).

Competences
A competence is "the proven ability to use knowledge, skills and personal, social and/or methodological abilities, in work or study situations and in professional and personal development" (Ferguson et al., 2016). In the Information and Communication Technologies (ICT) context, digital competences are a "set of knowledge, skills, attitudes (thus including abilities, strategies, values and awareness) that are required when using ICT and digital media to perform tasks; solve problems; communicate; manage information; collaborate; create and share content; and build knowledge effectively, efficiently, appropriately, critically, creatively, autonomously, flexibly, ethically, reflectively for work, leisure, participation, learning, socialising, consuming, and empowerment" (Ferrari, 2012). In this research, the creation and evaluation of OERs are considered as part of teachers' digital competences.

Learning analytics and OERs
LA is a tool for measuring, collecting, analyzing and reporting data about learners and their context (Siemens et al., 2011). Adopting LA solutions to create and evaluate educational resources often entails providing teachers with feedback about the learning context (eg, student interaction traces), which teachers can, in turn, use to improve the learning content (Haya, Daems, Malzahn, Castellanos, & Hoppe, 2015; Holtham, Martin, Brown, Jawaheer, & Dove, 2012).

Some recent studies highlight the use of LA as a means for managing learning designs (Er et al., 2019; Hernández-Leo et al., 2019; Holmes, Nguyen, Zhang, Mavrikis, & Rienties, 2019). For instance, Dyckhoff, Zielke, Bültmann, Chatti and Schroeder (2012) present a LA toolkit for teachers (eLAT), a conceptual toolkit designed to support teachers in analyzing improvements for learning scenarios by using graphical information regarding content usage, student behavior and assessment results. Lockyer, Heathcote, and Dawson (2013) analyze how analytics can facilitate pedagogical actions for learning designs. Another study highlights that LA may provide insights on how to assess students based on the OERs' impact (Prasad, Totaram, & Usagawa, 2016). However, Persico and Pozzi (2015) argue that most of the research in learning analytics has concentrated on the contribution of LA during the enactment of learning experiences rather than during the design phase.

Some contributions reported in the literature about using analytics to support the design of educational resources (involving processes such as creation and evaluation) are the following: analytics can support teachers in decision making when completing a course design (Fritz & Whitmer, 2017); Cooper, Ferguson, and Wolff (2016) state that analytics is an approach that helps identify accessibility deficits in learning contents; and reports based on analytics (usually presented in the form of dashboards) provide objective information that may be used by teachers in the design of OERs in order to improve learning contents (Bodily et al., 2017).

Therefore, we hypothesize that LA may be used to provide teachers with feedback on how well they create and evaluate OERs, and how they can use that feedback to improve the learning content itself.

An overview of the ATCE tool
The Analytics Tool to trace the Creation and Evaluation of OERs (ATCE) was developed as a module in the ATutor LMS (ATutor, 2002). The ATCE tool was first introduced by Avila, Baldiris, Fabregat, and Graf (2017); this section describes some additional details.

ATCE differs from other tools that support teachers in OER creation and evaluation in that it allows teachers to identify what they need to improve in their roles as authors and evaluators of OERs. ATCE provides authors with visualizations to help them easily identify accessibility failures and quality items that need to be improved in each OER before its delivery in a real scenario with students, as well as feedback about their role as evaluators. Figure A1 (in supplementary material) depicts the flow diagram of the ATCE tool and its functionalities. The main functionalities of the ATCE tool are:

• Management options

The administrator of the tool assigns evaluators and experts and changes the states of an OER. Possible states are: edit (by an author), evaluate (by an evaluator) and verify (by an expert). These states are used to ensure that an OER is not created/edited, evaluated or verified at the same time. Figure 1 depicts the management options.

Figure 1: ATCE—management options [Colour figure can be viewed at wileyonlinelibrary.com]
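To make the mutual exclusion of these states concrete, the following sketch shows one way such a state guard could work. It is only an illustration: the state names come from the paper, but the transition rules and function names are assumptions, not ATCE's actual implementation.

```python
from enum import Enum

class OERState(Enum):
    EDIT = "edit"          # being created/edited by an author
    EVALUATE = "evaluate"  # under review by an evaluator
    VERIFY = "verify"      # being verified by an expert

# Illustrative transition graph the administrator could enforce; because an OER
# holds exactly one state at a time, it cannot be edited and evaluated simultaneously.
ALLOWED_TRANSITIONS = {
    OERState.EDIT: {OERState.EVALUATE},
    OERState.EVALUATE: {OERState.VERIFY, OERState.EDIT},
    OERState.VERIFY: {OERState.EDIT},
}

def change_state(current: OERState, target: OERState) -> OERState:
    """Return the new state, or raise if the transition is not allowed."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"Cannot move OER from {current.value} to {target.value}")
    return target

# Usage: an administrator moves an OER from editing to evaluation.
state = change_state(OERState.EDIT, OERState.EVALUATE)
print(state)  # OERState.EVALUATE
```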


• Storing HTML elements

The ATCE tool stores each HTML element included in the web pages of an OER as a separate record in the database. The LMS did not originally include this functionality, as it stored the complete HTML source code of each content page. Thus, the content editor of the LMS was modified so that it could store each HTML element in a separate record.
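As a rough illustration of this idea (not ATCE's actual implementation), the following Python sketch splits a page's HTML into per-element records; the set of tracked tags and the record fields are assumptions chosen for the example.

```python
from html.parser import HTMLParser

class ElementRecorder(HTMLParser):
    """Collects each tracked content element of an OER page as a separate record."""

    TRACKED = {"p", "h1", "h2", "h3", "img", "table", "ul", "ol", "a", "video"}

    def __init__(self):
        super().__init__()
        self.records = []

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            # One record per element: tag name plus its attributes (eg, alt text of images).
            self.records.append({"element": tag, "attributes": dict(attrs)})

page_html = '<p>Intro</p><img src="cell.png" alt="Diagram of a cell"><h2>Kingdoms</h2>'
recorder = ElementRecorder()
recorder.feed(page_html)
for record in recorder.records:
    print(record)  # each dict would be stored as a separate database record
```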

• Automatic accessibility evaluation

Once the HTML elements have been stored, the ATCE tool carries out an automatic accessibility evaluation. For this automatic evaluation, we integrated AChecker (2012), an automatic tool to check accessibility criteria in web contents that works in accordance with different accessibility standards, among which are the WCAG (W3C-WAI, 2018). We used the API provided by AChecker, which facilitates its integration in web environments. The workflow of this API is depicted in Figure 2.
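The sketch below illustrates the kind of client-side call this workflow implies: request an accessibility report for a page URI, receive XML and convert it to JSON for later use. The endpoint, parameter names and XML tag names here are placeholders for the example, not the documented AChecker API.

```python
import json
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder endpoint; the real URL, parameters and web-service ID come from the
# AChecker web service documentation.
CHECKER_ENDPOINT = "https://achecker.example/checkacc.php"

def check_accessibility(page_url: str, web_service_id: str, guide: str = "WCAG2-AA") -> str:
    """Send an HTTP URI request to the checker and return its report as JSON."""
    query = urllib.parse.urlencode(
        {"uri": page_url, "id": web_service_id, "guide": guide, "output": "rest"}
    )
    with urllib.request.urlopen(f"{CHECKER_ENDPOINT}?{query}") as response:
        xml_report = response.read().decode("utf-8")  # REST response in XML format

    # Convert the XML report to JSON; the <error> tag name is assumed for illustration.
    root = ET.fromstring(xml_report)
    errors = [
        {child.tag: (child.text or "").strip() for child in error}
        for error in root.iter("error")
    ]
    return json.dumps({"num_of_errors": len(errors), "errors": errors}, indent=2)
```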

• Manual evaluation

Teachers, as evaluators, use the ATCE evaluation module to evaluate the web accessibility and quality of each OER. The web accessibility is evaluated for each HTML element in a web page by using the questions presented in Table A1 and the quality is evaluated with the items presented in Table A2 (in supplementary material). Figure 3 depicts the options presented to the evaluator when answering one of the accessibility questions.

Figure 2: AChecker API—REST service [Colour figure can be viewed at wileyonlinelibrary.com]. The client (ATutor) sends an HTTP URI request to the server (AChecker), receives the REST response in XML format, converts the XML to JSON and gets the accessibility answer.

Figure 3: ATCE—manual evaluation [Colour figure can be viewed at wileyonlinelibrary.com]


• Dashboard

It provides teachers with feedback related to their role as both authors and evaluators of OERs. This information comes from the creation and evaluation processes. Figure 4 depicts the welcome interface of the dashboard (a) and the main view with information about the contents added to the OER (b). Figure 5 depicts the accessibility view with a general overview of the accessibility in the selected OER (a), the accessibility over time (b), the percentage of accessibility reached in each principle (c) and the detail of the accessibility failures identified in one of the principles (d). Figure 6 depicts the quality view with the general (a) and the detailed (b) results of the quality for the selected OER. Finally, Figure 7 depicts the evaluator view with information on the level reached by the teacher in the role of evaluator and comments on how to improve in this role for both the accessibility evaluation (a) and quality evaluation (b) competences. Table A3 (in supplementary material) presents a summary of the metrics and visualizations of the dashboard.

Figure 4: Dashboard—main interface [Colour figure can be viewed at wileyonlinelibrary.com]

Figure 5: Dashboard—accessibility view [Colour figure can be viewed at wileyonlinelibrary.com]
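For a sense of the kind of aggregation behind such views, the sketch below computes an accessibility percentage per WCAG principle and a rounded star average from illustrative evaluation records; the record structure and metric definitions are assumptions made for the example, not the tool's actual metrics.

```python
from collections import defaultdict
from statistics import mean

# Illustrative evaluation records: one entry per checked HTML element, with the
# WCAG principle it was checked against and whether it passed.
checks = [
    {"principle": "Perceivable", "passed": True},
    {"principle": "Perceivable", "passed": False},
    {"principle": "Operable", "passed": True},
    {"principle": "Understandable", "passed": True},
]
quality_item_stars = [4, 5, 4, 5, 3]  # stars given to each quality item of an OER

def accessibility_by_principle(records):
    """Percentage of passed checks per WCAG principle, one value per chart bar."""
    totals, passed = defaultdict(int), defaultdict(int)
    for record in records:
        totals[record["principle"]] += 1
        passed[record["principle"]] += record["passed"]
    return {p: 100 * passed[p] / totals[p] for p in totals}

print(accessibility_by_principle(checks))  # eg, {'Perceivable': 50.0, ...}
print(round(mean(quality_item_stars)))     # rounded star average for the quality view
```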

Figure 6: Dashboard—quality view [Colour figure can be viewed at wileyonlinelibrary.com]

Figure 7: Dashboard—evaluator view [Colour figure can be viewed at wileyonlinelibrary.com]

Evaluation
Method
This study was carried out as comparative experimental research using an experimental group within-subjects design, in which the same group of participants is measured at different points in time (Randolph, 2008). The purpose of this study is twofold: (1) to investigate whether the ATCE tool benefits teachers in the acquisition of competences in the creation and evaluation of OERs and (2) to identify the perceptions teachers had with regard to the use of the ATCE tool. These objectives were addressed by answering the following research questions (RQ):

• RQ1: Does the use of the ATCE tool benefit teachers in the acquisition of competences in the creation and evaluation of OERs?

• RQ2: What are the perceptions teachers have about the use of the ATCE tool?

The research study was divided into two parts. To answer RQ1, an observational process involving a training course, in which teachers learned how to create OERs in terms of web accessibility and quality, was defined to evaluate the ATCE tool. To answer RQ2, we analyzed the perceptions teachers had when using the ATCE tool.

Materials
The following materials, tools and systems were used for this study:

• Initial questionnaire: This questionnaire gathered data related to demographic information and the background teachers had in the use of ICT tools. The questionnaire consisted of four parts: (1) demographic information, (2) ICT experience and skills, (3) specific use of ICT and (4) ICT in the creation of educational resources as part of teaching practices. This questionnaire was created using Google forms.

• Training course about the creation of OERs: This virtual training course aimed at teaching teachers how to create OERs considering web accessibility and quality. The course included four units: three units with theoretical foundations related to OERs (Unit 1: Inclusive Learning, Unit 2: Universal Design for Learning and Unit 3: Open Educational Resources) and Unit 4 about web accessibility and quality. As part of Unit 3, teachers learned how to add an open license to their OERs in the ATutor LMS using a Creative Commons license.

• ATutor LMS and the ATCE tool: ATutor LMS was the platform used to deliver the content of the training course on creating OERs and to integrate the ATCE tool.

• Final questionnaire: The final questionnaire consisted of two parts: questions from the LAAM (Learning Analytics Acceptance Model) instrument (Ali, Asadi, Gašević, Jovanović, & Hatala, 2013) and some open questions to collect the opinions the teachers had about the ATCE tool. The authors of LAAM suggest that the questions can be adapted according to the LA tool being assessed. The open questions gathered additional feedback about the functionalities of the ATCE tool, namely the ATCE dashboard (comments about the dashboard), the ATCE web editor used for creating the web pages (comments about the web editor) and the ATCE evaluation module (comments about the evaluation module).

Participants
Initially, 35 Colombian university and school teachers volunteered to participate in this study, although only 19 teachers completed it. The reason for this experimental mortality was that the participants who dropped out of the study stated that they did not have enough time to finish the activities in the research study.

According to the answers given to the initial questionnaire, all the participants had teaching experience in fields such as Arts, Accounting, Biology, English, Ethics, Informatics, Literature, Management, Math, Social Sciences and Technology. In terms of ICT use and experience, the teachers had participated in courses related to: how to use a computer (74%), Microsoft Office tools (89%), management of virtual learning environments (32%), creation of web pages (58%), inclusion of ICT in education (21%) and web accessibility (16%). Besides that, all the participants reported that they had participated in at least one activity related to the creation of educational resources: they had asked experts for help (42%), organized the content by themselves (74%), created the content by themselves (53%), shared the content with others (53%), used a methodology to create the educational resources (5%), used a license to share educational resources (5%), published resources on the Internet (11%), evaluated web accessibility (5%) or evaluated content quality (26%).

The teachers participated as authors and evaluators of OERs, while the researchers of this study participated as experts by carrying out additional evaluations of the OERs created and verifying the evaluations carried out by teachers.

Procedure
Figure 8 depicts the research design to answer RQ1 and RQ2. The evaluation process encompassed five phases in which the teachers participated as OER authors and evaluators. The teachers took part in a training course on the creation of OERs during phases 2 and 3. In all the phases, the participants used the web editor and evaluation module of ATCE. In phase 5, both authors and evaluators used the ATCE dashboard. For measurements 1 and 2, an expert used the ATCE verification module to verify how well the teachers had created and evaluated the OERs (in phases 4 and 5).

The activities carried out in the evaluation process are described as follows:

• Initial questionnaire: At the beginning of the evaluation process, teachers were asked to answer the initial questionnaire.

• Creation phase 1: Before starting the training course, each teacher in their role of author was asked to create an OER (OER1i, where i identifies the author) as a virtual course in the ATutor LMS using the ATCE web editor.

• Evaluation phase 1: Each teacher, in their role as evaluator, was asked to evaluate the web accessibility and quality of an existing OER (evaluation_OER), entitled The Kingdoms of Nature, using the ATCE evaluation module.

• Training course part 1: The teachers were asked to read the first three units of the training course: (1) Unit 1: Inclusive Learning, (2) Unit 2: Universal Design for Learning and (3) Unit 3: Open Educational Resources.

• Creation phase 2: After reviewing the aforementioned units, each teacher in their role as author, was asked to create a new OER (OER2i, where i identifies the author) using the ATCE web editor.

• Evaluation phase 2: Each teacher, in their role as evaluator, was asked to once again evaluate the web accessibility and quality of the evaluation_OER. By this stage, the teachers possessed more knowledge on what criteria to consider when evaluating the web accessibility and quality of the OER.

• Training course part 2: The teachers were asked to read Unit 4: Web accessibility and quality.

• Creation phase 3: After reading Unit 4, teachers were asked to develop some exercises focused on creating accessible contents using the ATCE web editor. To do these practical exercises, each teacher created another OER (OER3i).

• Creation phase 4: Each teacher, in their role as author, was asked to continue editing the OER2i created in creation phase 2.

• Evaluation phase 4: Each teacher, in their role as evaluator, was asked to once again evaluate the web accessibility and quality of the evaluation_OER.

• Measurement 1: An expert used the ATCE evaluation module to evaluate the OERs edited by the teachers in creation phase 3 and the ATCE verification module to verify the evaluations made by teachers in evaluation phase 4.


Figure 8: Phases of the research design [Colour figure can be viewed at wileyonlinelibrary.com]


• Creation phase 5: The teachers were introduced to the ATCE dashboard. After that, each teacher in their role as author, was asked to continue editing OER2i but this time using the ATCE web editor and the information provided through the author’s view of the ATCE dashboard.

• Evaluation phase 5: The teachers were asked to participate in a peer-review process in which each teacher assumed the role of evaluator and evaluated an OER created by another teacher (OER5ij, where i identifies the evaluator and j the OER’s author) using the ATCE evaluation module and the feedback provided in the evaluator’s view of the ATCE dashboard.

• Measurement 2: An expert evaluated the OERs edited by the teachers in creation phase 4 and also verified the evaluations made by the teachers in evaluation phase 4.

• Final questionnaire: The teachers were asked to answer the final questionnaire with questions based on the LAAM instrument and some open questions.

Data analysis
The methods used to analyze the data to answer the two research questions are described as follows:

RQ1: Does the use of the ATCE dashboard benefit teachers in the acquisition of competences in the creation and evaluation of OERs?

The dependent variables for RQ1 are:

• Accessibility (creation): This variable shows how well authors can create accessible OERs in terms of their competence level (from 0 to 100).

• Quality (creation): This variable shows how well authors can create quality OERs in terms of their competence level (from 0 to 100).

• Accessibility (evaluation): This variable shows how well evaluators can evaluate the web acces-sibility of OERs in terms of their competence level (from 0 to 100).

• Quality (evaluation): This variable shows how well evaluators can evaluate the quality of OERs in terms of their competence level (from 0 to 100).

To answer RQ1, the results obtained in measurements 1 and 2 were compared, considering the dependent variables. A paired t test was applied to compare the variables with a normal distribution and the Wilcoxon signed-rank test for those with a non-normal distribution. Excel was used to organize the data collected, RStudio to carry out the statistical tests and G*Power to calculate the effect size and power.
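As a rough Python equivalent of this analysis pipeline (the study itself used RStudio), the sketch below runs a Shapiro–Wilk check on paired before/after scores and then applies either the paired t test or the Wilcoxon signed-rank test; the scores are made up for illustration and are not the study's data.

```python
import numpy as np
from scipy import stats

# Illustrative paired competence scores (0-100) before (measurement 1) and after
# (measurement 2) teachers used the ATCE dashboard; one value per teacher.
m1 = np.array([62, 70, 58, 66, 71, 64, 60, 68, 73, 65, 59, 67, 61, 69, 63, 72, 57, 66, 74])
m2 = np.array([80, 88, 79, 85, 90, 83, 82, 86, 91, 84, 78, 87, 81, 89, 80, 92, 77, 85, 93])

# Shapiro-Wilk normality test on each measurement.
_, p1 = stats.shapiro(m1)
_, p2 = stats.shapiro(m2)

if p1 > 0.05 and p2 > 0.05:
    # Paired (within-subjects) t test for normally distributed scores.
    result = stats.ttest_rel(m1, m2)
else:
    # Wilcoxon signed-rank test for paired, non-normally distributed scores.
    result = stats.wilcoxon(m1, m2)

print(result)
```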

RQ2: What are the perceptions teachers have about the use of the ATCE tool?

To answer RQ2, the answers given by the teachers to the final questionnaire were analyzed. Mean and standard deviation were calculated for the answers given to the questions of the LAAM instrument and a qualitative description was applied for the answers given to the open questions. Excel was used to organize the data collected and to calculate the descriptive statistics.

Results and discussion
This section presents, on the one hand, general results about the web accessibility and quality of the OERs created by teachers and, on the other hand, the results for RQ1 and RQ2.

Overview of web accessibility and quality
Figure 9 shows the results for the web accessibility of the OERs in the second expert evaluation. The level of web accessibility by type of HTML element varies between 79% and 98%.


Since web accessibility was addressed using the WCAG guidelines, the level of accessibility for each one of the three WCAG principles considered in the ATCE tool (Perceivable, Operable and Understandable) was also analyzed. The Robust principle was not included here because the criteria in this principle are related to the HTML source code of the web pages and the teachers did not work directly with the source code. Figure 10 shows the level of web accessibility reached for each principle. These results revealed that the web accessibility of the OERs still needs to be improved, but the level obtained in each principle was good (higher than 75% in all three principles).

Figure 11 shows the results for the OERs' level of quality. These results represent the rounded average value of stars obtained by each OER across all the quality items. According to these results, most OERs created in creation phase 4 reached between 4 and 5 stars. These results suggest that the level of quality in most of the OERs was good (3 to 5 stars), with none of them obtaining 1 or 2 stars.

At the end of the evaluation process, the ideal situation would be that, in the last measurement, all the OERs reached 100% in the level of accessibility and 5 stars in the quality rating for all OERs. That said, the results for the accessibility by type of HTML element and by WCAG principle revealed a good level of web accessibility (higher than 75%) and, in terms of quality, the OERs obtained a good level with 4 and 5 stars for almost all of them.

Figure 9: Level of web accessibility by type of HTML element [Colour figure can be viewed at wileyonlinelibrary.com]

Figure 10: Level of web accessibility by WCAG principle [Colour figure can be viewed at wileyonlinelibrary.com]

RQ1: Teacher Competences in the Creation and Evaluation of OERs
The results obtained from measurements 1 and 2 revealed that teachers' competences in OER creation and evaluation improved. The data used to answer RQ1 come from the measurements carried out in the evaluation process before teachers used the ATCE dashboard (measurement 1) and after they used it (measurement 2). The results for RQ1 were calculated based on statistical tests conducted for each of the four dependent variables defined and are described as follows:

• Accessibility (creation): Data collected about how well authors can create accessible OERs in measurement 1 (ACM1) and measurement 2 (ACM2) was normally distributed according to the Shapiro–Wilk test with p > 0.05 (ACM1 p = 0.08746; ACM2 p = 0.569). The standard parametric t test was used to identify if there was any difference when teachers did not use the ATCE dashboard (ACM1) and when they did use it (ACM2). Results from the t test revealed that the level of web accessibility in ACM2 (M = 85.46; SD = 4.19) was significantly higher than in ACM1 (M = 65.98; SD = 4.19), with t(18) = −6.0638, p < 0.05. The effect size was large, Cohen’s d = 1.43, and the Power was 1.

• Quality (creation): Data collected for how well authors can create quality OERs in measurement 1 (QCM1) and measurement 2 (QCM2) was not normally distributed according to the Shapiro–Wilk test with p < 0.05 (QCM1 p = 0.003226; QCM2 p = 0.02163). The Wilcoxon signed-rank test was used to identify if there was any difference when teachers did not use the ATCE dashboard (QCM1) and when they did (QCM2). Results from the Wilcoxon test revealed a significant difference between QCM1 and QCM2: V = 0, p < 0.001. The sum of ranks assigned to the differences with a positive sign was 0, and the sum of ranks with a negative sign was 210. The effect size was large, r = 3.65, and the power was 1.

• Accessibility (evaluation): Data collected for how well evaluators evaluated the web accessibility of the OERs in measurement 1 (AEM1) and measurement 2 (AEM2) was normally distributed according to the Shapiro–Wilk test with p > 0.05 (AEM1 p = 0.9362; AEM2 p = 0.2351). The standard parametric t test was used to identify if there was any difference when teachers did not use the ATCE dashboard (AEM1) and when they did use it (AEM2). Results from the t test revealed that the level of web accessibility in AEM2 (M = 79.99; SD = 10.16) was significantly higher than in AEM1 (M = 68.97; SD = 11.47), with t(18) = −5.1136, p < 0.05. The effect size was large, Cohen’s d = 1.11, and the Power was 1.

• Quality (evaluation): Data collected for how well evaluators evaluated the quality of the OERs in measurement 1 (QEM1) and measurement 2 (QEM2) was not normally distributed according to the Shapiro–Wilk test with p < 0.05 (QEM1 p = 0.001224; QEM2 p = 0.0002435). The Wilcoxon signed-rank test was used to identify if there was any difference when teachers did not use the ATCE dashboard (QEM1) and when they did (QEM2). Results from the Wilcoxon test revealed a significant difference between QEM1 and QEM2: p < 0.001. The sum of ranks assigned to the differences with a positive sign was 0 and the sum of ranks with a negative sign was 153. The effect size was large, r = 1.74, and the power was 1.

Figure 11: Level of quality [Colour figure can be viewed at wileyonlinelibrary.com]

Differences observed between the two measurements were positive and statistically significant. These results suggest that teachers might benefit from using the ATCE dashboard because their competences improved after they used the feedback provided through the dashboard. These results are consistent with the findings of other studies which found that feedback from dashboards guides users in their progress and performance in learning activities (Aljohani & Davis, 2013; Corrin & de Barba, 2014; Na & Tasir, 2017).

RQ2: Teacher Perceptions on the Use of the ATCE Tool
The answers given by the teachers (N = 19) to both parts of the final questionnaire (LAAM instrument and open questions) revealed the positive perceptions the teachers had toward the use of the ATCE tool.

Table 1 shows the descriptive statistics for the answers given to the LAAM instrument. The possible answers for each question range from 1 to 5, where 1 indicates very low agreement with the statement and 5 indicates very high agreement. It is important to note that Q8 is a reverse question, which means that it is a negatively worded question whose numerical scoring scale runs in the opposite way. For Q8, the mean score obtained (2.17) is equivalent to 3.84 once reverse-scored, so that it can be compared with the other questions.
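As a small illustration of this reverse scoring on a 1–5 scale, the snippet below maps a raw score x to 6 − x; applying it directly to the rounded mean 2.17 gives 3.83, so the reported 3.84 presumably reflects rounding of the underlying mean.

```python
def reverse_score(raw: float, scale_min: int = 1, scale_max: int = 5) -> float:
    """Reverse-score a negatively worded Likert item so it is comparable to the others."""
    return scale_min + scale_max - raw  # on a 1-5 scale this is simply 6 - raw

print(reverse_score(2.17))  # 3.83 when applied to the rounded mean
```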

The mean scores of all the questions in the questionnaire were greater than 3. Overall, this indicates that participants reported positive perceptions on the use of the tool. The highest score among the questions was 4.84 in Q1a. This result suggests that the information presented in the accessibility section of the Author's view of ATCE, which is related to the web accessibility of OERs, seems to be the most remarkable feature of the tool. The lowest scores were in Q8 (3.84) and Q13 (3.84). A potential explanation for this result might be that some of the teachers may have used other similar tools which are more intuitive in feedback provisioning and less overwhelming. This might suggest that some teachers did not fully understand the visualizations and information presented in the dashboard. However, in the comments they gave to the open questions, the teachers seemed to be satisfied with the information presented in the dashboard.

Qualitative feedback on the use of the ATCE tool was gathered through the comments provided for each of the open questions included in the final questionnaire. The main comments from teachers are described next, and a complete description of this qualitative feedback is presented in the supplementary material:

• Comments about the dashboard: The dashboard allowed teachers to "… observe the evolution of the creation process"; one comment stated that "the analytics dashboard is an ideal and excellent tool with which to obtain data that allow us to improve the quality of our resources".

• Comments about the evaluation module: The evaluation module served to help teachers improve as OER evaluators (eg, "at the beginning for me it was difficult to understand some questions or categories presented in the evaluation module but little by little I was able to understand them").

• Comments about the ATCE web editor: Teachers found the web editor to be an adequate support tool to improve the OERs' web accessibility (eg, "an excellent tool that allows us to improve the webpages of the OERs and also to improve their accessibility").


Table 1: Results for the LAAM instrument (Mean; Standard deviation)

Perceived value of visualizations presented in the ATCE dashboard
• Q1a: I use the information presented in the accessibility section of the Author's view of ATCE as feedback to support my process as an author of OERs (M = 4.84; SD = 0.37)
• Q1b: I use the information presented in the quality section of the Author's view of ATCE as feedback to support my process as an author of OERs (M = 4.58; SD = 0.51)
• Q1c: I use the information presented in the Evaluator's view of ATCE as feedback to support my process as an evaluator of OERs (M = 4.58; SD = 0.51)

Perceived usefulness (usage belief) of the tool for improving the OERs' web accessibility and quality
• Q2: ATCE enables me to gain insight into my process in the creation and evaluation of OERs (M = 4.74; SD = 0.45)
• Q3: The information ATCE provides helps me identify what needs to be improved in my OERs (M = 4.68; SD = 0.48)
• Q4: ATCE provides relevant information regarding the accessibility and quality of my OERs (M = 4.47; SD = 0.51)
• Q5: The information provided by ATCE helps me determine how to improve in my role of author and evaluator of OERs (M = 4.63; SD = 0.50)
• Q6: ATCE helps me identify the main accessibility and quality issues in my OERs (M = 4.68; SD = 0.48)

Perceived GUI (ease of use) of the tool
• Q7: ATCE's GUI (Graphical User Interface) is intuitive enough (M = 4.37; SD = 0.60)
• Q8: ATCE's GUI is overburdened with information (reverse question) (M = 2.17, equivalent to 3.84; SD = 0.76)
• Q9: ATCE's GUI has a good design (M = 4.37; SD = 0.50)

General perception of the tool
• Q10: All in all, I found ATCE a handy tool for feedback provision (M = 4.63; SD = 0.50)
• Q11: I would like to be able to use ATCE in my teaching practice (M = 4.74; SD = 0.56)
• Q12: ATCE provides me with more useful feedback than other similar tool(s) that I have used/tried (M = 4.21; SD = 0.71)
• Q13: ATCE is more intuitive than the other tools capable of feedback provision that I have used/tried (M = 3.84; SD = 0.83)


Our observations are in agreement with prior research on teachers' behavioral intentions when using a LA tool (Papamitsiou & Economides, 2015): the information provided through the tool was perceived by the users as useful and easy to use, and users declared their intention to use the tool in the future.

Threats to validity
There are some threats to the validity of the evaluation conducted. While learning about OERs and the use of the tool, the teachers participated in different moments of creation and evaluation of OERs, sometimes using the same OER. However, the measurements focused on what teachers did in phase 4 (measurement 1) and phase 5 (measurement 2). On the one hand, for the creation process in phase 4, teachers improved the OER created in phase 2 (OER2i), which was not modified in phase 3. The OER was improved based on what they practiced and learned in previous phases. They did the same in phase 5, but focusing on the feedback given through the dashboard, which was used by teachers only in phase 5. On the other hand, the fact that teachers evaluated the same OER in phases 1, 2 and 4 made them more focused on what they needed to improve each time they evaluated the OER, and this might have influenced the results obtained in measurement 1 for the competence of evaluating the accessibility and quality of OERs. However, in phase 5, teachers evaluated a different OER considering the recommendations given through the evaluator's view of the dashboard.

In relation to the LAAM instrument, the instrument was adapted in this study by considering the information given to teachers through the ATCE tool about their role as authors and evaluators of OERs. Since the evaluation was conducted in a Spanish-speaking country, the questions were translated into Spanish by a native speaker with English as a second language. He and one other person then discussed each translated question in order to verify its understandability. Afterwards, another person translated the questions back into English to compare them with the original version of the instrument and make the corresponding revisions.

Conclusions
This study sought to analyze whether the ATCE tool helped teachers in their acquisition of creation and evaluation competences in the context of OERs, and to analyze the perceptions they had about using the ATCE tool. The general results from the evaluation of the ATCE tool suggest that the OERs created reached a good level of accessibility and quality. Moreover, the use of the ATCE dashboard supported teachers in their acquisition of the competences, and they also had positive perceptions about using the ATCE tool. As authors, teachers were informed about how to improve the web accessibility and quality of their learning content and, as OER evaluators, they were informed about how to improve their evaluations in terms of web accessibility and quality.

As discussed by Navarrete and Luján-Mora (2017), people with disabilities face barriers when they try to access OERs. They also state that one way to deal with this issue is to create authoring tools and learning contents that cope with such barriers. In line with this need, support tools like the one evaluated here can make teachers more aware of these barriers and can contribute to the creation of OERs with learning content that fosters equal access for all students.

Based on the study conducted, some benefits of using LA in an authoring tool to create and evaluate OERs can be highlighted:

• Storing the detail of each element added or modified in the OERs allows for a better analysis of the traces of the activities teachers carry out when creating and evaluating OERs.


• Supporting the manual evaluation with an automatic evaluation of web accessibility and quality drives better analyses of the content elements that may present issues.

• Teachers are given feedback about the improvements needed for an OER (eg, in terms of web accessibility and quality) before it is delivered in real scenarios with students or published in a public repository.

• The feedback is shown through visualizations in a dashboard that is updated as teachers add or modify contents.

Future work may involve an evaluation focused on teachers integrating their OERs into real scenarios, so as to observe the effect that providing students with accessible and quality OERs has on their learning process. Another future research direction is to foster collaborative work among teachers and stakeholders in the creation and evaluation of OERs and to trace the actions coming from such collaborative work.

Acknowledgements
The authors thank all the teachers who took part in this process. Cecilia Avila and Ramon Fabregat acknowledge the support from the BCDS research group (ref. GRCT40), which is part of the DURSI consolidated research group Smart IT Engineering and Services (SITES) (ref. 2017 SGR-1551). Cecilia Avila had financial support under the grant BR2014 of the University of Girona. Silvia Baldiris acknowledges the support from the School of Engineering and Technology of the International University of La Rioja. Sabine Graf acknowledges the support from NSERC.

Statements on open data, ethics and conflict of interest
This research was approved by the Broadband Communications and Distributed Systems group from the University of Girona.

The participants gave informed consent for their participation in this study. Data can be accessed for research purposes upon request to the corresponding author.

No conflict of interest declared.

References
AChecker. (2012). IDI web accessibility checker: Web accessibility checker. Retrieved from https://achecker.ca/checker/index.php

Albó, L., Barria-Pineda, J., Brusilovsky, P., & Hernández-Leo, D. (2019). Concept-level design analytics for blended courses. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11722 LNCS, 541–554. https://doi.org/10.1007/978-3-030-29736-7_40

Ali, L., Asadi, M., Gašević, D., Jovanović, J., & Hatala, M. (2013). Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Computers & Education, 62, 130–148. https://doi.org/10.1016/j.compedu.2012.10.023

Aljohani, N. R., & Davis, H. C. (2013). Learning analytics and formative assessment to provide immediate detailed feedback using a student centered mobile dashboard. In 2013 Seventh International Conference on Next Generation Mobile Apps, Services and Technologies, 262–267. Prague, Czech Republic. https://doi.org/10.1109/NGMAST.2013.54

Almendro, D., & Silveira, I. F. (2018). Quality assurance for open educational resources: The OER trust framework. International Journal of Learning, Teaching and Educational Research, 17(3), 1–14. https://doi.org/10.26803/ijlter.17.3.1

ATutor. (2002). ATutor home. Retrieved from https://atutor.github.io/


Avila, C., Baldiris, S., Fabregat, R., & Graf, S. (2017). ATCE: An analytics tool to trace the creation and evaluation of inclusive and accessible open educational resources. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK '17), Vancouver, British Columbia, Canada (pp. 183–187). New York, NY: ACM. https://doi.org/10.1145/3027385.3027413

Bodily, R., Nyland, R., & Wiley, D. (2017). The RISE framework: Using learning analytics to automati-cally identify open educational resources for continuous improvement. International Review of Research in Open and Distance Learning, 18(2). Retrieved from http://www.irrodl.org/index.php/irrod l/artic le/view/2952/4085

Brooks, C., Erickson, G., Greer, J., & Gutwin, C. (2014). Modelling and quantifying the behaviours of students in lecture capture environments. Computers & Education, 75, 282–292. https://doi.org/10.1016/j.compedu.2014.03.002

CAST. (2018). UDL on campus: Accessibility and open educational resources. Retrieved from http://udloncampus.cast.org/page/media_oer

Chen, W., Sanderson, N. C., Kessel, S., & Królak, A. (2015). Heuristic evaluations of the accessibility of learning management systems (LMSs) as authoring tools for teachers. First Monday, 20(9). https://doi.org/10.5210/fm.v20i9.5430

Commonwealth of Learning. (2011). Guidelines for open educational resources (OER) in higher education. Vancouver: Commonwealth of Learning and UNESCO. Retrieved from http://unesdoc.unesco.org/images/0021/002136/213605e.pdf

Cooper, M., Ferguson, R., & Wolff, A. (2016). What can analytics contribute to accessibility in e-learning systems and to disabled students' learning? In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, United Kingdom (pp. 99–103). New York, NY: ACM. https://doi.org/10.1145/2883851.2883946

Corrin, L., & de Barba, P. (2014). Exploring students’ interpretation of feedback delivered through learning analytics dashboards. In 31st Annual Conference of the Australian Society for Computers in Tertiary Education (ASCILITE 2014), ‘Rhetoric and Reality: Critical perspectives on educational technology’ (pp. 23–26). Dunedin, New Zealand.

Cueva, S., & Rodríguez, G. (2010). OERs, standards and trends. RUSC. Universities and Knowledge Society Journal // International Journal of Educational Technology in Higher Education, 7(1), 1–8. http://dx.doi.org/10.7238/rusc.v7i1.361

Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Educational Technology and Society, 15(3), 58–76.

Er, E., Gómez-Sánchez, E., Dimitriadis, Y., Bote-Lorenzo, M. L., Asensio-Pérez, J. I., & Álvarez-Álvarez, S. (2019). Aligning learning design and learning analytics through instructor involvement: A MOOC case study. Interactive Learning Environments, 27(5–6), 685–698. https://doi.org/10.1080/10494820.2019.1610455

Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., … Castaño, J. (2016). Research evidence on the use of learning analytics: Implications for education policy. Joint Research Centre Science for Policy Report; EUR 28294 EN. https://doi.org/10.2791/955210

Fernandez-Delgado, M., Mucientes, M., Vazquez-Barreiros, B., & Lama, M. (2014). Learning analytics for the prediction of the educational objectives achievement. In 2014 IEEE Frontiers in Education Conference, 1–4. Madrid, Spain. https://doi.org/10.1109/FIE.2014.7044402

Ferrari, A. (2012). Digital competence in practice: An analysis of frameworks (No. JRC68116; JRC Technical Reports). Retrieved from http://ftp.jrc.es/EURdoc/JRC68116.pdf

Fritz, J., & Whitmer, J. (2017). Learning analytics research for LMS course design: Two studies. Retrieved from https://er.educause.edu/articles/2017/2/learning-analytics-research-for-lms-course-design-two-studies

Haya, P. A., Daems, O., Malzahn, N., Castellanos, J., & Hoppe, H. U. (2015). Analysing content and patterns of interaction for improving the learning design of networked learning environments. British Journal of Educational Technology, 46(2), 300–316. https://doi.org/10.1111/bjet.12264

Hernández-Leo, D., Martinez-Maldonado, R., Pardo, A., Muñoz-Cristóbal, J. A., & Rodríguez-Triana, M. J. (2019). Analytics for learning design: A layered framework and tools. British Journal of Educational Technology, 50(1), 139–152. https://doi.org/10.1111/bjet.12645

Holmes, W., Nguyen, Q., Zhang, J., Mavrikis, M., & Rienties, B. (2019). Learning analytics for learning design in online distance learning. Distance Education, 40(3), 309–329. https://doi.org/10.1080/01587919.2019.1637716

Holtham, C., Martin, R., Brown, A., Jawaheer, G., & Dove, A. (2012). Evaluating engagement to address undergraduate first year transition: A case study. CSEDU 2012—Proceedings of the 4th International Conference on Computer Supported Education, 1, 223–228. Retrieved from http://www.scopus.com/inward/record.url?eid=2-s2.0-84864871346&partnerID=tZOtx3y1

Iniesto, F., McAndrew, P., Minocha, S., & Coughlan, T. (2016). Accessibility of MOOCs: Understanding the provider perspective. Journal of Interactive Media in Education, 2016(1), 1-10. https://doi.org/10.5334/jime.430

Kumar, K. L., & Owston, R. (2016). Evaluating e-learning accessibility by automated and student-centered methods. Educational Technology Research and Development, 64(2), 263–283. https://doi.org/10.1007/s11423-015-9413-6

Kurilovas, E., Bireniene, V., & Serikoviene, S. (2010). Methodology for evaluating quality and reusability of learning objects. Electronic Journal of E-Learning, 9(1), 39–51.

Leacock, T. L., & Nesbit, J. C. (2007). A framework for evaluating the quality of multimedia learning resources. Journal of Educational Technology & Society, 10(2), 44–59.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367

Mangaroska, K., & Giannakos, M. (2017). Learning analytics for learning design: Towards evidence-driven decisions to enhance learning. In É. Lavoué, H. Drachsler, K. Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Data driven approaches in digital education (pp. 428–433). Tallinn, Estonia: Springer International Publishing.

Marín, V. I., Orellana, M. L., & Peré, N. (2019). Open educational resources for research training: Quality assurance through a collaborative evaluation. Research in Learning Technology, 27, 1–16. https://doi.org/10.25304/rlt.v27.2271

Mishra, S., & Kanwar, A. (2015). Quality assurance for open educational resources: What’s the difference? (Pre-print).

Moise, G., Vladoiu, M., & Constantinescu, Z. (2014). MASECO: A multi-agent system for evaluation and classification of OERs and OCW based on quality criteria. Studies in Computational Intelligence, 528, 185–227. https://doi.org/10.1007/978-3-642-41965-2_7

Na, K. S., & Tasir, Z. (2017). A systematic review of learning analytics intervention contributing to student success in online learning. In 2017 International Conference on Learning and Teaching in Computing and Engineering (LaTICE) (pp. 62–68). Hong Kong, China. https://doi.org/10.1109/LaTiCE.2017.18

Navarrete, R., & Luján-Mora, S. (2017). Bridging the accessibility gap in open educational resources. Universal Access in the Information Society, 17(4), 755–774. https://doi.org/10.1007/s10209-017-0529-9

Papamitsiou, Z., & Economides, A. A. (2015). Temporal learning analytics visualizations for increasing awareness during assessment. International Journal of Educational Technology in Higher Education, 12(3), 129–147. https://doi.org/10.7238/rusc.v12i3.2519.

Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46(2), 230–248. https://doi.org/10.1111/bjet.12207

Prasad, D., Totaram, R., & Usagawa, T. (2016). Development of open textbooks learning analytics system. The International Review of Research in Open and Distributed Learning, 17(5), 215–234. https://doi.org/10.19173/irrodl.v17i5.2541

Randolph, J. (2008). Multidisciplinary methods in educational technology research and development. HAMK, University of Applied Sciences. Retrieved from http://justusrandolph.net/articles/multidisciplinary_methods.pdf

Restrepo, F., Amado-Salvatierra, H., Otón, S., & Hilera, J. (2018). Training engineering educators on accessible and inclusive learning design. International Journal of Engineering Education, 34(5), 1538–1548.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open learning analytics: An integrated & modularized platform. Proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques. SoLAR (Society for Learning Analytics Research). Retrieved from https://solaresearch.org/wp-content/uploads/2011/12/OpenLearningAnalytics.pdf

Vlachogiannis, E., Darzentas, J. S., & Koutsabasis, P. (2010). Beyond specifications: Towards a practical methodology for evaluating web accessibility. Journal of Usability Studies, 5(4), 157–171.

W3C-WAI. (2018). Web content accessibility guidelines (WCAG) overview. Retrieved from https://www.w3.org/WAI/standards-guidelines/wcag/

Supporting Information

Additional supporting information may be found online in the Supporting Information section at the end of the article.