
  • 7/28/2019 2010_Use of Web-based Research Materials in Education_Is Uptake Occurring_26p


    Use of web-based research materials in education: Is uptake occurring?

    Original proposal title: The use of web-based research materials: Using web analytics in conjunction with survey data to better

    understand how online research materials are used

    Paper presented at CSSE, Montreal, 2010

    Amanda Cooper, Hilary Edelstein, Ben Levin, Joannie Leung

    Ontario Institute for Studies in Education

    University of Toronto

    Theory and Policy Studies in Education Department

    252 Bloor Street West

    Toronto, Ontario

    M5S 1V6

    Canada

E-mail: [email protected]

    [email protected]

    [email protected]

    [email protected]


    Use of web-based research materials in education: Is uptake occurring?

    Introduction

    The internet age

The rise of the internet offers new possibilities for research dissemination globally. The emergence of information technology has enabled people to access information with an ease and at a rate never before possible (Dede, 2000). With nearly 2 billion internet users

    worldwide, the level of online activity is staggering and increasing exponentially: between 2000 and 2009 alone, internet usage

    worldwide increased by 400%! (http://www.internetworldstats.com/stats.htm). Every sector has been transformed by the internet age,

    and the education sector is no exception.

Knowledge mobilization (KM) is about using research more to improve policy and practice in education. There is growing interest globally in understanding KM processes (Cooper, Levin & Campbell, 2009; Levin, 2010; Nutley, Walter & Davies, 2007).

    Researchers and educational organizations are increasingly using websites and the internet as the primary vehicle for the

    dissemination of research findings in the form of reports, knowledge sharing events, and the creation of interpersonal networks to

    support KM efforts (Hartley & Bendixen, 2001; Greenhow, Robelia & Hughes, 2009); consequently, investigating web-based

    dissemination and collaboration strategies (websites, social media, virtual networks, and so on) might better inform our understanding

    of KM in the current technological societal context.

    We have begun exploring online KM strategies as part of the Research Supporting Practice in Education (RSPE) program

(www.oise.utoronto.ca/rspe). RSPE, housed at OISE and funded through Dr. Ben Levin's Canada Research Chair, is a program of research and related activities that investigates KM across different areas of the education system, including the perspectives of research producers and research users, and in relation to the emerging technological landscape. The increasing importance of the internet draws

    our attention to use of web-based research materials as an important area for additional research.

Currently, we have two studies underway which attempt to explore and evaluate the internet's growing role in KM in education. One

    study is an analysis of KM strategies in educational and other organizations based on analysis of their websites (Qi & Levin, 2010). In

this work we have developed inductively a common metric for assessing KM work as revealed on websites in terms of strategies (products, events and networks) and indicators as they relate to strategies (different types, ease of use, accessibility, focus of audience

    and so on). This analysis helps us understand the range of KM strategies being employed by different kinds of organizations,

    including research producers, research users and intermediaries.

    Our data show that few organizations display a wide range of practices related to KM, and many organizations have virtually no KM

activity (Qi & Levin, 2010). Many organizations focus on posting reports and research-related products online, with far less attention to building interaction through events or networks. However, as we looked at the range of different research products in a variety of

    formats for a variety of audiences, we also wondered whether and how much people are actually using these web-based research

    resources, a subject on which there appears to be very little empirical evidence.

    While a great deal of effort goes into developing websites for sharing of research materials and resources, there is little or no empirical

    evidence on the value or impact of these strategies. In fact, we could find no studies of how people actually use web-based research

    material in education.

    This study was conceived to explore the use of online research dissemination.

    The research question guiding this work is:

    How much and by whom are web-based research findings and analyses being used?

The study uses two data sources to determine the extent and nature of use of web-based research products. First, it uses web analytics to track website use. Second, survey data extend the web analytics data by asking users directly questions about research use that cannot be answered from usage data alone.

    While our data are only beginning to accumulate from our various partner organizations, we hope this paper will stimulate discussion

about this research design or other ways of studying the use of web resources. We expect that data from this study will shed light on what forms of online dissemination strategies are effective.

    An overview of the paper

This paper is organized into four parts. First, we review related literature. Second, we use that review to develop a conceptual framework for studying the use of web-based research materials. Third, we describe the challenges and opportunities in studying research use online, in relation to our approach of using Google Analytics (GA) in conjunction with online surveys. Fourth, we provide some initial web metrics data from one of


    our partner organizations in order to begin a discussion about the possibilities and limitations of these data to gauge research use and

    to answer KM questions.

    Literature Review

    This literature review is organized into three sections. The first outlines some key findings about knowledge mobilization generally

    that set the context for this study. The second examines the sparse literature that discusses KM in relation to the internet. The third

    section provides some introduction to the literature on web analytics and web metrics related to the study of KM.

    What we know about KM

    Research use is a multifaceted, nonlinear process that takes place within and between diverse organizations in the education system

(Lemieux-Charles & Champagne, 2004; Levin, 2004; McLaughlin, 2008; Nutley et al., 2007). Factors affecting KM also arise at

    multiple levels including individual, organizational and structural, as well as environmental and contextual (Berta & Baker, 2004).

From a cross-disciplinary review of the literature, Levin (2004) outlines that KM is a function of the interaction among three main areas: research-producing contexts, research-using contexts, and the organizations and processes that mediate between these two contexts.

    All of this takes place over time within a larger societal context.

    Multiple iterations of use

    Understanding of KM has been growing in the past decade due to increasing interest in the topic as a way to improve public services

(Cooper et al., 2009; Davies, Nutley & Smith, 2000). However, many important issues remain unexplored, particularly in education (Cooper & Levin, 2010; Nutley et al., 2007). The empirical evidence suggests that research use remains modest across sectors, especially in education (Behrstock, Drill & Miller, 2009; Biddle & Saha, 2002; Cordingley, 2008; Hemsley-Brown, 2004; Hemsley-Brown & Sharp, 2003; Lemieux-Charles & Champagne, 2004; Levin, Sá, Cooper & Mascarenhas, 2009; Pfeffer & Sutton, 2000).

    The use and impact of research are difficult to measure (Amara, Ouimet & Landry, 2004). One reason is that research may inform

    our thinking in ways that are not overtly visible in behaviour, sometimes referred to as conceptual use. Research can be used in direct

and observable ways (instrumental use), though this is typically less frequent (Amara et al., 2004; Landry, Amara & Lamari, 2001). So

    it can be hard to know whether or to what extent research has actually informed the thinking or actions of people or organizations.

The discussion of multiple kinds of research use is at least 30 years old, and still relies on Weiss's (1979) foundational work on the

    many meanings of research utilization. Knott and Wildavsky (1980) also proposed seven levels of research utilization that remain

    relevant today:

Reception, Cognition, Reference, Effort, Adoption, Implementation, Impact

    These sequential stages attempt to trace the different components involved from the time that someone actually receives a research

    related product to the point of impact resulting from that use.

    The literature on KM indicates that research use happens over time. Incorporating research into policy and daily practice is not an

instantaneous process; rather, a multitude of factors - from the quality of the evidence, to the credibility of the messenger, to the effort it takes on the part of practice organizations to implement evidence-based changes - all affect how quickly (if at all) KM occurs (Levin, 2004; Nutley et al., 2007; McLaughlin, 2008).

    Timperley (2010) proposes that behaviour change takes at least three years to be fully incorporated. Her work involves intense and

    sustained interaction with teachers in order to have them use evidence (predominantly student assessment data disaggregated into

    different areas) to guide their practice. Others contend that incorporating research substantively takes much longer:

    Studies in healthcare show that it can take a decade or more before research evidence on the effectiveness of interventions

    percolates through the system to become part of established practice. The abandonment of ineffective treatments in the light

of damning evidence can be equally slow (Davies, Nutley & Smith, 2000, p. 10).

Many examples from the health and education sectors reinforce this point, such as the long road to increasing hand washing among health practitioners or the amount of time it took to end corporal punishment in schools.

These studies suggest that in order to understand how much research use is actually going on in the education system, studies need to attend to the issue longitudinally. This study includes a longitudinal element.

    Audience and the format of research matters

    Many studies have reported that tailoring research products for groups of stakeholders increases the likelihood of use (Cordingley,

2008; Biddle & Saha, 2002; Levin, Sá, Cooper & Mascarenhas, 2009). Our team found similar results in a study of research use by principals in school districts (Levin, Sá, Cooper & Mascarenhas, 2009). On the other hand, Belkhodja et al. (2007) found that interaction and

    contextual considerations of production and practice environments were much more influential than format of products.


Practitioners in the field have insisted time and time again that the format of the research influences whether or not they actually use it (Cordingley, 2008; Behrstock, Drill & Miller, 2009; Biddle & Saha, 2002; Levin, Sá, Cooper & Mascarenhas, 2009); however, this claim does not appear to have been tested. There is simply not enough empirical evidence yet to know whether adaptation of products or interaction and recognition of context are most important to research use. Our study will explore this issue to some extent by assessing which products are actually accessed and downloaded.

    The importance of active mobilization of research

A considerable amount of research suggests that passive dissemination of research products has limited effectiveness (Armstrong, Waters, Crockett & Keleher, 2007; Grimshaw et al., 2006; Lavis, Robertson, Woodside, McLeod & Abelson, 2003). If this is so, investing time and resources in passive online dissemination mechanisms also seems a doubtful strategy, yet one that is common. One cannot assume that research is being used just because it is freely available online. Research also provides growing evidence that successful dissemination efforts need to consider the audience and have dedicated staff and resources (Levin, 2008; Cooper et al., 2009).

    Dede (2000) similarly cautions that the internet, if utilized in the same way that traditional research dissemination has occurred (for

    example simply transferring large quantities of data to practice settings), will not yield different results. Hence, he suggests that

    reconceptualising the historic role of information technology in knowledge mobilization and use is central to its future effectiveness

    (p. 3).

    Linking KM and technological literature

    The literature on KM in relation to technology is sparse. Although many contend that the internet and various websites can facilitate

this work, we found only a few studies in the health sector that explicitly addressed the use of the internet to mobilize research knowledge.

    Ho et al. (2004), in a conceptual paper, explore the potential synergy of research knowledge transfer and information technology,

    which they refer to as technology-enabled knowledge translation (TEKT). They provide evaluation dimensions and methodologies for

    TEKT including structural, subjective, cognitive, behavioural and systemic elements in order to help researchers compare successful

models and characterize best practices of TEKT. However, they do not provide any empirical data on these practices or ideas.

    Dede (2000) discusses the role of emerging technologies explicitly in relation to knowledge mobilization, dissemination and use in

education. He elaborates on three ideas about using the internet to spread best practice across educational organizations. First, emerging information technologies enable a shift "from the transfer and assimilation of information to the creation, sharing and mastery of knowledge" (p. 2). Here, active collaboration among stakeholders, facilitated through the internet, is seen as a way to co-construct

knowledge in a more meaningful way, because it takes into account contextual factors and, as a result, increases uptake. Second, Dede highlights that dissemination efforts must include all the information necessary for successful implementation of an exemplary practice, "imparting a set of related innovations that mutually reinforce overall systemic change" (p. 2). He argues that interactive media can facilitate this process, but must include detailed plans along a number of important areas: leadership, professional development, and so on. Third, a major challenge in generalizing and scaling up an educational innovation is helping practitioners "unlearn the beliefs, values, assumptions, and culture underlying their organizations' standard operating practices" (p. 3). He argues that professional rituals are deeply entrenched and that changing practitioners' behaviours can be supported through virtual communities that provide social support for this difficult and sometimes threatening process.

Jadad (1999) argues that the internet provides opportunities for networking and partnerships in the health sector. But he also lists a

    number of conditions that are necessary in order for online KM to be effective: a better understanding of the way service users and

practitioners use the internet; systems that are easy to access and use; rapid transmission systems (bandwidth, he argues, is still too slow

    in many parts of the world); and information that is relevant and in a format that is ready to use. Different strategies are needed to

    integrate the large volumes of available information in a meaningful way; virtual interaction might still need to facilitate face-to-face

    meetings; and global access to technology is still needed to ensure global equity.

While these are interesting ideas, they provide little evidence on the actual use of web-based research materials.

Conceptual Framework

    For purposes of this study we conceptualize use of web-based research material in terms of the interaction between three elements

    (Figure 1):

    1) Research evidence: Various aspects of the research products influence use.

    Type of resource (idea, product, contact, link)

    Format (long or short print version, video, language)


    Relevance (how tailored to particular users)

    2) User:

    Role (parent, teacher, student, researcher, district administrator, journalist, interested citizen)

    Purpose of visit to website (work, study, personal reasons)

    3) Actual use over time: Comparing original intention to actual use.

    Use over time (no use, undetermined usefulness, immediately useful, intended future use, actually used)

    Sharing of materials (formally and informally; internally or externally to their workplace)

    Type of use (conceptual, symbolic, instrumental)

[Figure 1 comprises four linked elements: Research Evidence (idea, product, link or contact; format of resource; relevance, i.e. tailored for audience); Context (stakeholder; individual/organizational linkage; formal/informal and internal/external); Time (no use, undetermined, immediate use, intended future use, actual use); and Extent & Type of Use (intensity of use; conceptual, instrumental, symbolic).]
    Figure 1. Conceptual framework: Online research use as the interplay between evidence, audience and use over time.

    Method

    This study involves our team partnering with educational organizations in Canada and abroad to investigate use of web-based research

    in education. The organizations vary in form and function; for example, one partner is a unit within a school district, while others are

    intermediary research organizations or have websites designed to be databases of relevant research.

The study uses two data sources to assess the extent and nature of use of research products found on the websites of participating organizations. First, web analytics track website usage in various ways. Second, we developed two surveys, administered at two different points in time, that ask visitors directly about their use of these web-based resources.

Using web analytics

    Web analytics software provides useful data on the use of research materials from websites (Wikipedia,

    http://en.wikipedia.org/wiki/Web_analytics). Tracking and understanding web analytics allows us to understand the specific activity

on a website, translated into metrics (Ledford & Tyler, 2007). Types of metrics include: hits, page views, visits, unique visitors,

    conversion rate (Table 1). These data exist for each page on a website and for each product on the site, allowing comparisons over

    time and across sites.
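To make these metrics concrete, the sketch below derives several of them from raw per-visit records. It is purely illustrative: the record format, field names, and the ten-second bounce threshold are our own assumptions for the example, not the way Google Analytics computes or exports its data.

```python
# Illustrative sketch: deriving summary web metrics from hypothetical
# per-visit records. Field names and data are invented for the example.

visits = [
    # (visitor_id, pages_viewed, seconds_on_site)
    ("a", 1, 5),    # bounced: one page, a few seconds
    ("b", 4, 180),
    ("a", 2, 60),   # repeat visit by visitor "a"
    ("c", 1, 8),    # bounced
]

total_visits = len(visits)
page_views = sum(p for _, p, _ in visits)
unique_visitors = len({vid for vid, _, _ in visits})

# Treat a single-page visit of under ten seconds as a bounce (assumed threshold).
bounce_rate = sum(1 for _, p, s in visits if p == 1 and s < 10) / total_visits
avg_time_on_site = sum(s for _, _, s in visits) / total_visits
avg_page_views = page_views / total_visits

print(total_visits, page_views, unique_visitors)      # 4 8 3
print(bounce_rate, avg_time_on_site, avg_page_views)  # 0.5 63.25 2.0
```

The same derived quantities, computed per page rather than per site, underlie the page-level comparisons reported later in this paper.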

    For this study we chose Google Analytics (GA) software because it is widely used already, including by most of our partner

    organizations, and because it offers a range of useful tools to analyze the data it provides. GA also allows our partner organizations to

    give us access directly to their data, facilitating our analysis.


    Table 1

    Google Analytics web metrics (Clifton, 2008; Ledford & Tyler, 2007; Page, 2008)

Metric: Definition

Dashboard: The general entry point for all analytics information. Clicking any clickable element on the dashboard leads to the in-depth analytics section for that element.

Map overlay: A visual display of how many visitors have come from which countries. The overlay can also be broken down by region, province/state, and city, allowing specific parts of the world to be compared with each other.

Visitor overview: A segmentation of visitors: what language(s) they speak, where their network is located, and which browser/operating system they use. A pie chart shows the percentage of new versus returning visitors. This section is most helpful as a comparison tool: comparing the number of visitors between months, together with the visitor loyalty breakdown, can indicate whether new visitors in one month have become returning visitors.

Traffic source overview: Where visitors are being referred to the website from, such as a search engine link, another website, or direct traffic to the site.

Content overview: The specific information relating to the content of each page and/or document of the website, including how many people visited the page and its percentage of page views; often includes the unique views of each page.

Site usage: A summary of visits, page views, pages viewed per visit, bounce rate, average time on the site, and new visits. Clicking any one of the headers leads to a further analysis of that metric.

Visits: A line graph plotting visits per day, marked at weekly intervals. Hovering over any point displays the number of visitors for that day.

Visitor trending: The section within which all the analytics for visitors can be found.

Time on site: The average time each visitor spent on the site per day. From this number we can infer how many pages a visitor read, whether they downloaded something, or, by breaking the number down, how many visitors bounced in and out of the site within a few seconds.

Bounce rate: The percentage of visitors on a given day who bounced on and off the site within a few seconds of arriving. From this statistic we might infer that the visitor did not find what they were looking for, or that it was the wrong site.

Absolute unique visitors: The percentage (and, in brackets, the number) of completely new IP addresses tracked coming to the site per day.

Average page views: The approximate number of pages visitors viewed when coming to the site.

Unique page views: The number of unique page views, representing the number of individual visitors who have viewed each page.
    The first image that one sees on logging in to GA is the dashboard (Figure 2).


Figure 2. CEA Dashboard from Google Analytics, September 1, 2009 - April 26, 2010.

    From the dashboard, users can view different reports to understand what pages visitors view, where visitors come from, and what

    products visitors access. Table 1 describes the definitions of different web metrics. In this paper, we specifically report on nine

    metrics: Content, site usage, visits, time on site, bounce rate, absolute unique visitors, page views, average page views and unique

    page views.

There are, however, limits to what Google Analytics can tell us. While the analytics tell us about the frequency of downloads of different formats of products (for instance, full reports versus executive summaries), they do not provide information about who visits the site or, more importantly, about what people do with the research information after their visit. Since actual use of resources is our fundamental interest, we developed a two-survey model to use in conjunction with web analytics to deepen our understanding of the use of web-based materials.

    A two-part survey

We are using a two-part survey. When people visit one of our partner sites, they are invited to take part in a short survey (Appendix

    A) that asks them about whether they found useful information on this visit to the site and about their plans for using any such

    information. They are also invited to take part in a second survey (Appendix B), to be sent to them at a later date, that asks about their

    actual use of the materials or resources since their initial visit. The second survey is being circulated to those who volunteer either 30,

    60 or 90 days after their initial visit.
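The follow-up schedule described above can be sketched in a few lines. The sketch is illustrative only: the round-robin assignment of respondents to the 30-, 60-, and 90-day groups is our own assumption for the example, not necessarily the study's actual assignment procedure.

```python
# Sketch of the follow-up schedule: each volunteer receives the second survey
# 30, 60, or 90 days after the initial visit. Group assignment (round-robin
# by respondent index) is assumed for illustration.
from datetime import date, timedelta

FOLLOW_UP_DAYS = (30, 60, 90)

def follow_up_date(first_visit: date, respondent_index: int) -> date:
    """Return the date on which the second survey should be sent."""
    delay = FOLLOW_UP_DAYS[respondent_index % len(FOLLOW_UP_DAYS)]
    return first_visit + timedelta(days=delay)

print(follow_up_date(date(2010, 3, 1), 0))  # 2010-03-31 (30-day group)
print(follow_up_date(date(2010, 3, 1), 2))  # 2010-05-30 (90-day group)
```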

Both surveys (Table 2) focus on whether the research-related products or resources are used at all and, if so, whether this use is conceptual (informs thinking on future issues, and so on) or instrumental (affects the user's thinking on research, work, or practice; impacts how the user does work in their context), and whether or not the participant shares the information with others, formally or informally, inside or outside their organization.

Table 2

Survey questions in relation to type of use and time

(Intention/use over time: no use; undetermined use at this time; immediate usefulness; intended future use; actual use as determined by follow-up survey)

No use: Q8

Conceptual use: Q10, Q10

Instrumental use: Q7, Q11; Q9, Q13, Q14, Q15, Q16

Symbolic use: (none)

Level of impact: Q12

    Partner organizations

We currently have, or are about to have, eight partner organizations in place: two in Canada and six in England.

    Each partner organization is involved in attempting to share research information in education through making it available on their

website. We hope to recruit additional partners in the coming year; there is in principle no limit to how many organizations could take part. Partners have very little work to do: they provide us with access to their GA data, embed some tracking codes on particular pages and products, and embed our initial survey on their site.

    The benefit for the partner organizations is the data analysis and reporting we provide on the use of their web-based research related

    materials. We also provide each partner with comparative data on the other study participants (anonymously). This will allow

    organizations to see how the take-up of research resources on their site compares with other educational organizations and should help

    them improve their sharing of research-related products.

    Since each educational organization has different goals and, as a result, different content and layouts of their websites, we work with

each partner to identify and track some particular targets on their website. Targets can refer to a number of different things depending on the website: a particular web page, a product, an initiative that is linked to multiple products, and so on. Our analysis

    then focuses on these targets. We use ratios to compare different targets in order to gauge intensity of uptake of research materials in

    relation to other kinds of information within and between organizations (Figure 3).

    Figure 3. Metrics analysis framework examining research-based targets within and between educational organizations.
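The ratio analysis of Figure 3 can be illustrated with a small sketch. All organization names, target names, and page-view counts below are invented for the example; in practice the counts would come from each partner's GA reports, and the choice of baseline target would be negotiated with the partner.

```python
# Illustrative sketch of the target-ratio analysis: comparing uptake of a
# research target against a baseline target, within and between organizations.
# Names and counts are invented; real counts would come from GA reports.

def uptake_ratio(target_views: int, baseline_views: int) -> float:
    """Views of a research target relative to a comparison (baseline) target."""
    return target_views / baseline_views

org_a = {"research_report": 500, "homepage": 8_000}
org_b = {"research_report": 300, "homepage": 2_400}

# Within-organization intensity of research uptake...
ratio_a = uptake_ratio(org_a["research_report"], org_a["homepage"])  # 0.0625
ratio_b = uptake_ratio(org_b["research_report"], org_b["homepage"])  # 0.125

# ...and between-organization comparison: despite lower absolute traffic,
# organization B's visitors reach research content twice as intensely.
print(ratio_b / ratio_a)  # 2.0
```

Normalizing by a baseline in this way lets sites of very different overall traffic be compared on the intensity, rather than the volume, of research uptake.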


    Tracking different targets within a single website allows an organization to compare uptake of different initiatives or products.

    Tracking several sites over time provides the opportunity to compare them in terms of their power to generate visitors to and

    downloads of material related to research findings in education. Looking at these data across sites and times will allow us to

    understand more about how, in general, web-based products are used and which kinds of approaches seem to have the greatest impact.

    A preliminary example from piloting the project with CEA

    The research partner whose data we report in this paper is the Canadian Education Association (CEA). The mission statement of the

    CEA, an organization founded more than a century ago, is to initiate and sustain dialogue throughout Canada influencing public policy

    issues in education for the ongoing development of a robust, democratic society and a prosperous and sustainable economy. The CEA

    relies on good theory and research evidence as the foundation on which to build shared understanding and commitment with

organizations that share their values and purposes (http://cea-ace.ca/abo.cfm, 2009). Because it is a national organization with a small staff, it relies heavily on dissemination strategies, including its website.

    We focus our exploratory findings on the Google Analytics data for CEA on three targets:

a. Comparing CEA's research and policy page to other pages in regard to page views, average time on page and bounce rate

b. Comparing which products (PDFs) are accessed the most, with a focus on comparing full reports versus executive summaries

c. Comparing the uptake of two research-based initiatives: What Did You Do in School Today (WDYDIST) and the CEA's study of the Ontario Primary Class Size initiative.

    What Did You Do in School Today (WDYDIST) is a research project that gathers survey data from middle and secondary students in

    schools across Canada to explore their social, academic and intellectual engagement. We tracked five research-related products from

    this project:

    National report (52 pages)

    Summary report (4 pages)

Two supporting documents that included a report on student engagement (26 pages) and a teaching effectiveness framework and rubric (18 pages)

    FAQ document (5 pages).

    Each of these documents is available as a PDF in English and in French in several parts of the CEA website, including on the

    homepage, the main research page, and a specific WDYDIST page. The New & Noteworthy page also includes various

    announcements pertaining to the project in June 2009, August 2009, and September 2009 as the media picked up on the project.

The Class Size Reduction Initiative is a research project that evaluates the Ontario government's implementation of a class size reduction policy, which reduced class size to 20 or fewer students in 90% of Ontario primary classrooms as of 2008. We tracked six

    research-related products from this initiative:

    National Report (22 pages)

    Executive Summary (2 pages)

    Evaluation Report (140 pages)

    Question and Answer document (1 page)

    Literature review on class size reduction (36 pages)

A paper published in the CEA quarterly magazine in the fall of 2008 (4 pages).

As with WDYDIST, these documents are available as PDF files in both official languages in multiple locations on the CEA website, including the homepage, two locations on the main research page, and the New & Noteworthy page, with announcements pertaining to the release of the full report in February 2010.


We have analytics data for overall site usage from September 2009 through April 2010 (page views, unique page views, average time spent on page, and bounce rate), with a focus on comparing research-related pages to non-research pages. For the research initiatives, we report data for the product-specific targets from February 2010 (when the appropriate tracking code was inserted) through April 2010.

From these three targets we noticed:

1. Non-research pages and resources were viewed and used more than research-related pages and resources, but visitors spent more time on pages with research-related content

2. Visitors accessed longer versions of reports more than they did short versions where both were available

Visitors tend to spend the most time on pages that have research-related content but view non-research-related pages more

From September 2009 through April 2010 the CEA website was visited more than 200,000 times. The pages with the most views, shown in Table 3, are not research related. On these non-research pages, visitors spent an average of 30-50 seconds. In contrast, visitors spent the most time on average on pages with research-related content: an average of 2:33 on the WDYDIST page and 3:49 on the Focus on Literacy page. Although visitors spent more time on these pages with research-related content, the bounce rate (see Table 1) was also highest on these pages and lowest on the pages that had general information about what CEA does.
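The research/non-research comparison described here can be sketched as a small script over page-level metrics. The rows below reuse figures from Table 3; the grouping, field names and data layout are ours for illustration, not part of the actual Google Analytics export:

```python
# Sketch: grouping page-level metrics into research vs non-research pages.
# Rows reuse figures from Table 3; the layout itself is hypothetical.
from statistics import mean

# (page, page_views, avg_seconds_on_page, bounce_rate, is_research)
pages = [
    ("Home page", 25070, 64, 0.4160, False),
    ("About CEA", 6537, 38, 0.3946, False),
    ("Research and policy main page", 6089, 45, 0.4390, True),
    ("WDYDIST page", 5702, 153, 0.6868, True),
    ("Focus on Literacy page", 4467, 229, 0.7876, True),
]

def summarize(rows):
    """Average the three metrics over a group of pages."""
    return {
        "views": mean(r[1] for r in rows),
        "seconds_on_page": mean(r[2] for r in rows),
        "bounce_rate": mean(r[3] for r in rows),
    }

research = summarize([r for r in pages if r[4]])
non_research = summarize([r for r in pages if not r[4]])

# Research pages draw fewer views but hold visitors longer and bounce more.
print(research)
print(non_research)
```

On this sample the research group averages fewer views but more seconds on page and a higher bounce rate, mirroring the pattern in Table 3.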

Table 3
All page views September 2009-April 2010

Rank | Page | Page views | Unique page views | Average time on page (minutes) | Bounce rate
1 | Home page | 25,070 | 18,608 | 1:04 | 41.60%
2 | Education Canada publication page | 6,663 | 5,007 | 0:47 | 51.55%
3 | About CEA | 6,537 | 4,565 | 0:38 | 39.46%
4 | CEA publication page | 6,090 | 4,383 | 0:27 | 20.34%
5 | Research and policy main page | 6,089 | 4,169 | 0:45 | 43.90%
6 | WDYDIST page | 5,702 | 4,012 | 2:33 | 68.68%
7 | FAQ | 5,545 | 4,932 | 1:50 | 68.60%
8 | Focus on Literacy page | 4,467 | 3,752 | 3:49 | 78.76%
9 | Education Canada Spring 2010 page | 3,693 | 2,521 | 1:18 | 52.70%
10 | Focus On main page | 3,484 | 2,478 | 0:29 | 41.09%

    Not surprisingly, the CEA home page had substantially more views than the target research pages across the eight months of tracking.


    Figure 4. Comparison of home page views to research page targets for CEA.

    We also compared time spent on the target pages (Table 4).

Table 4
Comparison of average time spent on page (minutes), per month, for the home page, Research and Policy page and WDYDIST page

Month | Home page | Research and Policy | WDYDIST
September | 1:04 | 0:39 | 2:17
October | 1:01 | 0:48 | 3:02
November | 0:59 | 0:51 | 2:54
December | 0:58 | 0:54 | 3:07
January | 1:18 | 0:49 | 3:45
February | 1:06 | 0:44 | 2:24
March | 1:03 | 0:42 | 1:23
April | 0:56 | 0:34 | 1:49


Visitors spent the most time on the WDYDIST page; it should be noted that this page has a series of 2-3 minute videos embedded in it. While we cannot track access to the videos (because they are embedded on the page), time spent watching them might account for the additional time visitors spend on this page.

In another comparison of these data, Figure 5 shows views of the Research and Policy and WDYDIST pages as a percentage of homepage views.

    Figure 5. Comparison of CEA homepage, Research and policy page and WDYDIST page.

While access to both research page targets is low in comparison to the homepage, WDYDIST activity peaked in November 2009 with 1,046 page views that month. This peak corresponded to a media release and additional media attention surrounding the initiative.
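The normalization behind Figure 5 is simple: each target page's monthly views are divided by that month's homepage views. A minimal sketch, where the homepage count is a hypothetical figure of our own and only the 1,046 WDYDIST views for November 2009 come from the data above:

```python
# Sketch of the Figure 5 calculation: research-page views as a share of
# homepage views. The homepage figure below is hypothetical; the 1,046
# WDYDIST page views for November 2009 are reported in the text.
def share_of_homepage(page_views: int, homepage_views: int) -> float:
    """Return page views as a percentage of homepage views."""
    return 100 * page_views / homepage_views

november_wdydist = share_of_homepage(1046, 3200)  # 3,200 is an assumed figure
print(f"WDYDIST, Nov 2009: {november_wdydist:.1f}% of homepage views")
```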

We explored the ten most accessed products (PDFs) on the whole CEA site to see how many were research related (Table 5).

Table 5
Top 10 accessed PDFs from the CEA website February through April 2010

Rank | Page | Page views | Unique page views | Average time on page (minutes)
1 | 2009-2010 School Calendar | 777 | 686 | 2:22
2 | WDYDIST National Report | 284 | 270 | 3:35
3 | Public Education in Canada: Facts, trends and attitudes (2007) | 176 | 169 | 3:53
4 | Beyond doing school: From stressed-out to engaged in learning | 137 | 119 | 2:11
5 | WDYDIST Teaching effectiveness framework and rubric | 134 | 123 | 2:52
6 | Democracy at Risk article | 119 | 106 | 2:09
7 | WDYDIST Student engagement report | 119 | 106 | 2:22
8 | KI-ES-KI contact handbook order form | 98 | 86 | 2:04
9 | Class size National Report | 68 | 58 | 3:20
10 | A vision for early childhood education and care article | 62 | 61 | 2:08

The PDF with school calendar information from across Canada was by far the most frequently accessed document. Also among the top ten were the WDYDIST National Report, the WDYDIST teaching effectiveness framework, the WDYDIST student engagement report and the KI-ES-KI contact handbook order form.

In addition to being the top accessed product, the school calendar had an average view time of 2:22, which is similar to the view times of the research-related products. The KI-ES-KI order form had the shortest average time on page, at 2:04.

Comparing the uptake of two research-based initiatives

We were interested in comparing the uptake (measured as frequency of access to the research-related products) of the CEA target initiatives: WDYDIST and the Class Size Reduction project. We found that the WDYDIST initiative had greater uptake than the Class Size project (Tables 6 and 7).

Table 6
Top content: PDFs relating to the What Did You Do In School Today project February 2010-April 2010

Rank | Page | Page views | Unique page views | Average time (minutes)
5 | WDYDIST Teaching effectiveness report | 134 | 123 | 2:52
7 | WDYDIST Student engagement report | 119 | 106 | 2:09
19 | WDYDIST National Report Summary | 39 | 38 | 2:59
52 | WDYDIST FAQ document | 16 | 15 | 1:36
72 | WDYDIST National Report French | 11 | 9 | 3:41

Table 7
Top content: PDFs relating to the Class Size Reduction project February 2010-April 2010

Rank | Page | Page views | Unique page views | Average time (minutes)
9 | External view of the Class Size National Report | 68 | 58 | 3:20
15 | External view of the Class Size Evaluation Report | 51 | 44 | 2:03
18 | Literature review of Class Size Reduction | 41 | 35 | 1:29
27 | Class Size Executive Summary | 28 | 24 | 3:06
61 | Q and A document | 13 | 13 | 0:38
62 | Evaluation Report | 13 | 12 | 0:55
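One simple way to compare uptake across the two initiatives is to total the PDF page views in Tables 6 and 7. A sketch (the totals cover only the products listed in those two tables, and the product labels are shortened):

```python
# Sketch: totalling PDF page views per initiative from Tables 6 and 7.
wdydist_views = {
    "Teaching effectiveness report": 134,
    "Student engagement report": 119,
    "National Report Summary": 39,
    "FAQ document": 16,
    "National Report French": 11,
}
class_size_views = {
    "National Report (external)": 68,
    "Evaluation Report (external)": 51,
    "Literature review": 41,
    "Executive Summary": 28,
    "Q and A document": 13,
    "Evaluation Report": 13,
}

wdydist_total = sum(wdydist_views.values())        # 319 views
class_size_total = sum(class_size_views.values())  # 214 views
print(wdydist_total, class_size_total)
```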

WDYDIST has been up longer on the CEA website (launched May 2009). It also has its own webpage as well as more diverse products, such as videos and document downloads. Hence, WDYDIST applies more strategies in terms of both products and media attention, with frequent news releases at key times within the two-year research project. In contrast, the Class Size Reduction project was only released in February 2010 and, so far, there has not been as much space devoted to this initiative on the CEA website or attention by media (there has been only one news release on the project).

    Visitors accessed longer versions of reports more than they did short versions where both were available

We were interested in exploring the frequent claim that readers prefer to access shorter versions of research such as executive summaries. For both these initiatives on the CEA site, in fact, the longer reports were viewed more often than the shorter versions. The WDYDIST teaching effectiveness framework (18 pages) was viewed 134 times and the student engagement report (26 pages) 119 times, while the summary report was viewed only 39 times in the reported time frame. For the Class size project, visitors viewed the National Report (22 pages) 68 times and the Evaluation Report (140 pages) 51 times externally and 13 times internal to the organization's website, whereas the Executive Summary (2 pages) was viewed 28 times in the reported time frame. In addition, visitors spent more time on the longer reports than on the summary reports. In the case of WDYDIST, visitors spent an average of 2:59 on the summary version of the national report and 3:35 on the long version. For the Class size project, visitors spent an average of 3:06 on the summary but 3:20 on the National Report.
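The long-versus-short comparison can be expressed directly with the figures reported here for the WDYDIST national report and its summary (times converted from minutes:seconds):

```python
# Sketch: checking whether the full WDYDIST national report outperformed
# its summary on both views and average time, using the reported figures.
def mmss_to_seconds(t: str) -> int:
    """Convert a 'm:ss' time-on-page value to seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

full_report = {"views": 284, "seconds": mmss_to_seconds("3:35")}
summary = {"views": 39, "seconds": mmss_to_seconds("2:59")}

longer_preferred = (full_report["views"] > summary["views"]
                    and full_report["seconds"] > summary["seconds"])
print(longer_preferred)  # True: the full report was viewed more and longer
```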

    Survey findings

At the time of writing this paper, we do not have enough responses to our online surveys to report any data. Currently about 1% of visitors are responding to the first survey, while the second, follow-up survey is too new to be able to report uptake or results. As we add more partners we will have more data from both of these instruments.

    Conclusion

Dissemination of research materials through the internet is a ubiquitous practice that takes considerable resources, yet we have virtually no knowledge about its impact. This paper outlines a study currently underway that seeks to fill some of that gap, including outlining its conceptual basis and giving examples of the kind of data it will provide. Web analytics applied over time and across organizations will increase our understanding of the kinds of strategies and products that are most effective in attracting attention. Our two-part survey, if effective, will start to provide information on how, and how much, people actually use the materials they obtain from various websites. Both approaches will add to the base of empirical knowledge on effective mobilization of research in education.


    References

    Amara, N., Ouimet, M., & Landry, R. (2004). New evidence on instrumental, conceptual, and symbolic utilization of university

    research in government agencies. Science Communication, 26(1), 75-106.

Armstrong, R., Waters, E., Crockett, B., & Keleher, H. (2007). The nature of evidence resources and knowledge translation for health promotion practitioners. Health Promotion International, 22, 254-260.

Behrstock, E., Drill, K., & Miller, S. (2009). Is the supply in demand? Exploring how, when and why teachers use research. Learning Point Associates. Paper presented at the Annual Meeting of the American Educational Research Association, Denver, Colorado.

Berta, W. B., & Baker, R. (2004). Factors that impact the transfer and retention of best practices for reducing error in hospitals. Health Care Management Review, 29(2), 90-97.

Belkhodja, O., Amara, N., Landry, R., & Ouimet, M. (2007). The extent and organizational determinants of research utilization in Canadian health services organizations. Science Communication, 28(3), 377-417.

Biddle, B., & Saha, L. (2002). The untested accusation: Principals, research knowledge, and policy making in schools. Westport, CT: Ablex.

    Clifton, B. (2008). Advanced web metrics with Google analytics. Indianapolis: Wiley Publishing Inc.

    Cooper, A., Levin, B., & Campbell, C. (2009). The growing (but still limited) importance of evidence in education policy and

    practice. Journal of Educational Change, 10(2-3), 159-171.

Cooper, A., & Levin, B. (in press, accepted January 2010). Some Canadian contributions to understanding knowledge mobilization. Evidence and Policy.


    Cordingley, P. (2008). Research and evidence-informed practice: focusing on practice and practitioners. Cambridge Journal of

    Education, 38(1), 37-52.

Davies, H., Nutley, S., & Smith, P. (2000). What works? Evidence-based policy and practice in public services. Bristol: Policy Press.

Dede, C. (2000). The role of emerging technologies for knowledge mobilization, dissemination, and use in education. Paper commissioned by the Office of Educational Research and Improvement, US Department of Education. Retrieved February 2010 from http://www.virtual.gmu.edu/ss_pdf/knowlmob.pdf

Greenhow, C., Robelia, B., & Hughes, J. (2009). Web 2.0 and classroom research: What path should we take now? Educational Researcher, 38(4), 246-259.

Grimshaw, J., Eccles, M., Thomas, R., MacLennan, G., Ramsay, C., Fraser, C., & Vale, L. (2006). Toward evidence-based quality improvement: Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966-1998. Journal of General Internal Medicine, 21, S14-20.

Hartley, K., & Bendixen, L. (2001). Educational research in the internet age: Examining the role of individual characteristics. Educational Researcher, 30(9), 22-26.

Hemsley-Brown, J., & Sharp, C. (2003). The use of research to improve professional practice: A systematic review of the literature. Oxford Review of Education, 29(4), 449-470.

Hemsley-Brown, J. (2004). Facilitating research utilization: A cross-sector review of research evidence. The International Journal of Public Sector Management, 17(6), 534-552.

Ho, K., Bloch, R., Gondocz, T., Laprise, R., Perrier, L., Ryan, D., Thivierge, R., & Wenghofer, E. (2004). Technology-enabled knowledge translation: Frameworks to promote research and practice. Journal of Continuing Education in the Health Professions, 24, 90-99.

Jadad, A. (1999). Promoting partnerships: Challenges for the internet age. BMJ, 319, 761-764.

Knott, J., & Wildavsky, A. (1980). If dissemination is the solution, what is the problem? Knowledge: Creation, Diffusion, Utilization, 1(4), 537-578.

Landry, R., Amara, N., & Lamari, M. (2001). Utilization of social science research knowledge in Canada. Research Policy, 30, 333-349.

    Lavis, J., Robertson, D., Woodside, J. M., McLeod, C. B., & Abelson, J. (2003). How can research organizations more effectively

    transfer research knowledge to decision makers? The Milbank Quarterly, 81(2), 221-48.

    Ledford, J., and Tyler, M. (2007). Google analytics 2.0. Indianapolis: Wiley Publishing Inc.

    Lemieux-Charles, L., & Champagne, F. (2004). Using knowledge and evidence in health care: Multidisciplinary perspectives.

    Toronto: University of Toronto Press.

    Levin, B. (2004). Making research matter more. Education Policy Analysis Archives, 12(56). Retrieved November 15, 2008, from

    http://epaa.asu.edu/epaa/v12n56/

Levin, B. (2008, May). Thinking about knowledge mobilization. Paper prepared for an invitational symposium sponsored by the Canadian Council on Learning and the Social Sciences and Humanities Research Council of Canada, Vancouver.


Levin, B., Sá, C., Cooper, A., & Mascarenhas, S. (2009). Research use and its impact in secondary schools. CEA/OISE Collaborative Mixed Methods Research Project Interim Report.

Levin, B. (2010). Theory, research and practice in mobilizing research knowledge in education. Paper presented at the 39th Annual Canadian Society for the Study of Education Conference, Montreal, Quebec.

McLaughlin, M. (2008). Beyond misery research. In C. Sugrue (Ed.), The future of educational change: International perspectives (pp. 176-185). London and New York: Routledge.

    Nutley, S., Walter, I., & Davies, H. (2007). Using evidence: How research can inform public services. Bristol: Policy Press.

    Page, R. (2008). Web metrics 101: What do all these terms mean? Retrieved on November 3, 2009 from

    www.makeuseof.com/tag/web-metrics-101

Pfeffer, J., & Sutton, R. (2000). The knowing-doing gap: How smart companies turn knowledge into action. Boston: Harvard Business School Press.

Qi, J., & Levin, B. (2010). Strategies for mobilizing research knowledge: A conceptual model and its application. Paper presented at the 39th Annual Canadian Society for the Study of Education Conference, Montreal, Quebec.

    Timperley, H. (2010). Using evidence in the classroom for professional learning. Paper presented at the Ontario Education Research

    Symposium, Toronto, Ontario, Canada.

Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426-431.

    Appendix A: Initial CEA Survey


    Appendix B: Follow-up Survey
