
EU COMMUNITY

ICT-2013.5.4 ICT for Governance and Policy Modelling

EU COMMUNITY MERGES ICT AND SOCIAL MEDIA NETWORKING WITH ESTABLISHED ONLINE MEDIA AND STAKEHOLDER GROUPS TO CULTIVATE TRANSPARENCY, ENHANCE EFFICIENCY AND STIMULATE FRESH IDEAS FOR EU POLICY-MAKING

Deliverable D8.3
Final Evaluation Report

Editor(s): Francesco MUREDDU, David OSIMO
EU COMMUNITY: FUNDACIO PER A LA UNIVERSITAT OBERTA DE CATALUNYA
Status-Version: Final - v1.0
Date: 18.10.2016
EC Distribution: R


Project Number: 611964
Project Title: EU COMMUNITY
Title of Deliverable: Final Evaluation Report
Date of Delivery to the EC: 19/10/2016
Workpackage responsible for the Deliverable: WP8 – Evaluation
Editor(s): Francesco MUREDDU, David OSIMO
Contributor(s): Francesco MUREDDU
Reviewer(s): I-EUROPA, INTRA-BE
Approved by: All Partners

Abstract: Within the framework and scope of the EU Community project, the Final Evaluation Report presents the elaboration and synthesis of the results of the evaluations carried out throughout the project, in particular the four rounds of users' surveys and the analysis of the data stemming from the metrics embedded in the platform, in view of the final evaluation of EU Community. The four rounds of users' surveys were dedicated to the evaluation of EurActory and PolicyLine (two rounds each), with the aim of assessing to what extent the tools have reached their objectives and have met the needs of their actual and potential users. The users' surveys were complemented by the analysis of the data stemming from the metrics embedded in the platform, whose primary objective was to assess the extent of user engagement in the EU Community platform. Regarding the users' surveys, the evaluation shows a positive response concerning the usability and usefulness/effectiveness of both tools, especially in the second round of evaluation. The considerable feedback on content and on bugs present in the system, alongside suggestions for new content and functionalities, has been taken into account (and will continue to be taken into account) to improve both EurActory and PolicyLine.

The analysis of the data stemming from the metrics embedded in the platform and the metrics recorded by means of web analytics tools shows a considerable take-up of EurActory and a significant amount of content in the tool. At the same time, the analysis shows that the take-up of, and the content present in, PolicyLine are smaller in magnitude. This is due to the different degrees of maturity of the two tools and to the fact that, according to plan, PolicyLine was developed later than EurActory.

Keyword List: Intervention Logic; Evaluation Criteria; Evaluation Metrics


Document Description

Document Revision History

Version | Date | Modification Reason | Modified by
0.1 | 12.08.2016 | First skeleton of the deliverable and TOC | Francesco MUREDDU
0.2 | 30.09.2016 | Draft version sent to partners and internal reviewers | Francesco MUREDDU and David OSIMO
0.3 | 07.10.2016 | Comments by partners and internal reviewers |
0.4 | 17.10.2016 | Revised version provided to partners | Francesco MUREDDU and David OSIMO
1.0 | 19.10.2016 | Final version sent to the EC | Francesco MUREDDU and David OSIMO


Contents

EXECUTIVE SUMMARY
1. INTRODUCTION
   1.1 PURPOSE AND SCOPE
   1.2 APPROACH OF THE WORK PACKAGE AND RELATION TO OTHER WORK PACKAGES AND DELIVERABLES
   1.3 STRUCTURE OF THE DOCUMENT
2. IMPLEMENTATION AND RESULTS OF THE SURVEYS
   2.1 EVALUATION OF EURACTORY
      2.1.1 Comparison between the First and Second Evaluation
   2.2 EVALUATION OF POLICYLINE
      2.2.1 Comparison between the First and Second Evaluation
3. METRICS EMBEDDED IN THE PLATFORM
   3.1 METRICS RELATIVE TO EURACTORY
   3.2 METRICS RELATIVE TO POLICYLINE
   3.3 LESSONS LEARNT AND RECOMMENDATIONS FOR THE SUSTAINABILITY OF THE ENGAGEMENT
4. DISCUSSION OF RESULTS
REFERENCES
APPENDICES: EVALUATION FRAMEWORK
   I. APPENDIX A: THE EU COMMUNITY INTERVENTION LOGIC
   II. APPENDIX B: THE TWO DIMENSIONS OF EVALUATION
      Policy Impact Measurement Approach
      Technology Acceptance Model
   III. APPENDIX C: METHODOLOGICAL FRAMEWORK
   IV. APPENDIX D: DEVELOPMENT OF THE EVALUATION METRICS


List of Figures

Figure 1: Page Views of EurActory per Month
Figure 2: Page Views of EurActory over Time
Figure 3: Users of EurActory per Month
Figure 4: Users of EurActory over Time
Figure 5: Registered Users of EurActory per Month
Figure 6: Registered Users of EurActory over Time
Figure 7: Page Views of PolicyLine per Month
Figure 8: Page Views of PolicyLine over Time
Figure 9: Users of PolicyLine per Month
Figure 10: Users of PolicyLine over Time
Figure 11: EU Community Intervention Logic
Figure 12: Logical-Causal Relationship
Figure 13: Technology Acceptance Model
Figure 14: Methodological Framework
Figure 15: Logical-Causal Relationship Complete with Evaluation Criteria
Figure 16: Development of the Evaluation Metrics


List of Tables

Table 1: Definitions, Acronyms and Abbreviations
Table 2: Usage Scenarios
Table 3: Summary of Results of the Evaluation of EurActory
Table 4: Benefits Reported by Respondents
Table 5: Flaws of the Tool Reported by Respondents
Table 6: Suggestions Reported by Respondents
Table 7: Usage Scenarios
Table 8: Summary of Results of the Evaluation of PolicyLine
Table 9: Number and Percentage of Sessions per Device
Table 10: Experts by Type
Table 11: Activated Profiles by Type
Table 12: Number and Percentage of Sessions per Device
Table 13: Number of Contributions per Topic
Table 14: Number of Contributors per Topic and Type of Contribution
Table 15: Number of Contributors per Category of Contributor and Type of Contribution


Definitions, Acronyms and Abbreviations

Table 1: Definitions, Acronyms and Abbreviations

Acronym Title

EC European Commission

ICT Information and Communication Technologies

FITT Fit among individuals, tasks and technology

IDT Innovation Diffusion Theory

IT Information Technology

TAM Technology Acceptance Model

TPB Theory of Planned Behaviour

TRA Theory of Reasoned Action

UTAUT Unified Theory of Acceptance and Use of Technology


Executive Summary

The Final Evaluation Report presents the elaboration and synthesis of the results of the evaluations carried out throughout the project, in particular the four rounds of users' surveys and the analysis of the data stemming from the metrics embedded in the platform, in view of the final evaluation of EU Community. The four rounds of users' surveys were dedicated (two rounds each) to the evaluation of the tools EurActory and PolicyLine. The aim of the users' surveys was to assess to what extent the tools have reached their objectives and have met the needs of their actual and potential users. At the end of the project, the users' surveys were complemented by the analysis of the data stemming from the web analytics tools as well as from the metrics embedded in the platform. The metrics embedded in the platform allowed assessing the extent of user engagement in the EU Community platform. Regarding the users' surveys, the evaluation shows a positive response concerning the usability and usefulness/effectiveness of both tools, especially in the second round of evaluation. The considerable feedback on content and on bugs present in the system, alongside suggestions for new content and functionalities, has been taken into account (and will continue to be taken into account) to improve both EurActory and PolicyLine. The analysis of the data stemming from the metrics embedded in the platform and the metrics recorded by means of web analytics tools shows a considerable take-up of EurActory and a significant amount of content in the tool. At the same time, the analysis shows that the take-up of, and the content present in, PolicyLine are smaller in magnitude. This is due to the different degrees of maturity of the two tools.

The evaluation methodology, presented in the Appendices and developed in Deliverable D8.1, is built on the combination of two approaches: the policy impact measurement approach, which evaluates the EU Community tools in terms of relevance, efficiency, effectiveness, additionality and sustainability; and the technology acceptance approach, which evaluates the tools developed by EU Community according to the criteria of attitude towards use, perceived usefulness, perceived ease of use and behavioural intention to use.

Section 2 of the deliverable presents the results of the four rounds of users' surveys, while Section 3 presents the results of the analysis of the data stemming from the metrics embedded in the platform. Finally, Section 4 provides a discussion of the results.


1. Introduction

1.1 Purpose and Scope

The aim of Deliverable D8.3 - Final Evaluation Report is to present the elaboration and synthesis of the results of the evaluations carried out throughout the project, in particular the four rounds of users' surveys and the analysis of the data stemming from the metrics embedded in the platform, in view of the final evaluation of EU Community.

The four rounds of users' surveys were dedicated (two rounds each) to the evaluation of the tools EurActory and PolicyLine. These evaluation rounds assessed to what extent the tools have reached their objectives and have met the needs of their actual and potential users. Moreover, the feedback collected by means of the surveys has been used to improve and enhance the tools themselves.

The first evaluation of EurActory took place in an open stage in October/November 2015, by contacting the users of the tool subscribed on the platform, and in a closed stage in February 2016, with a group of selected respondents interpreting the role of potential users. The second evaluation of EurActory took place in an open stage in June 2016, again by contacting the users of the tool subscribed on the platform. Likewise, the first evaluation of PolicyLine took place in February 2016. It was carried out in two different instances: a first structured feedback provided by experts hired by the consortium, and a closed-stage evaluation with a group of selected respondents interpreting the role of potential users, carried out by means of an online survey. The second and final evaluation of PolicyLine took place in July 2016. It was carried out by means of an online survey administered in a closed stage to a set of individuals interpreting the role of potential users who took part in two events, both held in July 2016. The first event was the Summer School on Open and Collaborative Governance organised by the Department of Information and Communication Systems Engineering, University of the Aegean, at the beginning of July 2016. The summer school focused on topics relevant to PolicyLine, such as ICT-enabled governance, policy modelling, information management, social media in governance, open data, crowdsourcing, modelling and simulation tools, data mining and visualisation. During the summer school PolicyLine was presented and its main purposes and functionalities were explained. The second event was the PolicyLine user workshop organised in Brussels in mid-July 2016 by EurActiv. This dissemination event aimed at involving and attracting users to the tool, in particular by asking them to populate PolicyLine by creating new policy processes. In both events the attendees were professionals for whom policy-making is relevant, such as IT researchers, NGO employees, civil servants, and professionals involved in EU policy analysis.

The four rounds of users' surveys were complemented by the analysis of the data stemming from the metrics embedded in the platform, whose primary objective was to assess the extent of user engagement in the EU Community platform.

It should be noted that the objectives of the EU Community tools and the needs of the target users illustrated in this deliverable are excerpted from the project's documents (in particular the Description of Work and the article by Charalabidis et al. (2014)), and that the elaboration of the evaluation metrics has taken into account Deliverable D9.1 - Dissemination & Communication Plan as well as Deliverable D2.4 – Community Requirement and Specifications Research Strategy.

In the following paragraphs the relation between the evaluation activity and the other work packages is presented.

1.2 Approach of the Work Package and Relation to other Work Packages and Deliverables

The Final Evaluation Report refers to Task 8.5 – Final Evaluation. The objective of the work package is to evaluate the overall success level of the EU Community platform by measuring the level of community engagement, as well as to draw recommendations for future solution deployment and lessons learnt from the pilot phase. The evaluation has been undertaken throughout the project cycle by means of a set of metrics embedded in the platform, a long semi-qualitative questionnaire (the users' survey), a quick qualitative feedback form, and interview forms for experts' assessments. The activities of the Evaluation work package are related to Work Package 6 Platform Development, as a set of evaluation metrics had to be implemented in the platform, and to Work Package 9 Dissemination and Exploitation, as the survey of users' needs was made available on the platform but also sent out to stakeholders via email and disseminated on social media through the channels identified by Work Package 9.

1.3 Structure of the Document

This deliverable is structured as follows. Section 2 presents the results of the four rounds of users' surveys. Section 3 presents the results of the analysis of the data stemming from the metrics embedded in the platform, while Section 4 wraps up the results of the evaluation activities carried out in the project. Finally, the appendices present the evaluation methodology developed in Deliverable D8.1.


2. Implementation and results of the surveys

As already mentioned, the four rounds of users' surveys were dedicated (two rounds each) to the evaluation of the tools EurActory and PolicyLine. The evaluations of EurActory were carried out in November 2015 and in June 2016, while the evaluations of PolicyLine were carried out in February 2016 and in July 2016. The difference in timing is due to the different stages of development of the two tools and to the fact that two additional evaluations of PolicyLine took place in the framework of WP5, described in deliverables D5.2 'Evaluation of 1st Visualisation Prototype' and D5.4 'Evaluation of Final Visualisation Prototype'.

2.1 Evaluation of EurActory

The first evaluation of EurActory was carried out by means of both an open-stage and a closed-stage evaluation. In the open stage, all the individuals subscribed to EurActory were invited by email to take part in the evaluation survey. A first email was sent on 27 October 2015, followed by two reminders, and the survey was closed on 27 November 2015. The responses from the participants were collected through the online evaluation questionnaire. In total 22 individuals responded to the survey.

In addition, a closed evaluation session was organised with the participation of graduate students from the University of the Aegean interpreting the role of potential users. During this session the tool was presented to the audience, and its applications and results were explained. Subsequently the participants had the opportunity to interact with the tool by executing a set of predefined usage scenarios, listed in Table 2. These usage scenarios were conducted under the observation of the organisers, who supported the participants, recorded any comments or difficulties they faced, and collected feedback on possible improvements. Finally, the evaluation data from the participants in this session were collected. In total 13 individuals took part in the evaluation.

Table 2: Usage Scenarios

Usage scenarios:
- Create a user account on EurActory
- Claim an expert's profile with a claim link
- Search experts' profiles on a topic or a sub-topic
- Search for an expert, peer assess the expert and share the expert's profile
- Search for the experts of an organization

For both stages the questions of the survey were clustered according to the following categories:

- Background Questions: the respondents were asked to identify their domain of expertise/interest as well as their category (in the open stage);
- Relevance: this set of questions aims to evaluate whether the objectives of the tool under scrutiny are adequate to the needs of the beneficiaries;
- Perceived Ease of Use: the degree to which a person believes that using the system would be free of effort;
- Intention to Use: a person's perceived likelihood or subjective probability that he or she will use the tool in the future;
- Knowledge of similar initiatives: the respondent is asked to provide information on possible competitors of the tool;
- Strengths of the tool: the main benefits of the tool;
- Perceived Usefulness: the degree to which a person believes that using the system would enhance his or her job performance;
- Effectiveness of the tool: the capability of the tool to reach its intermediate and strategic objectives;
- Barriers and bottlenecks: issues preventing the tool from unleashing its full potential;
- Future Improvements: the respondents were asked to provide suggestions on future improvements of the tool.

The questionnaires for the two stages (open and closed) were very similar, although tailored to the different audiences. The tool used for carrying out the survey was Netquest Survey Manager (www.netquest.com), version 1.6.0.12.00. The tool used for elaborating the charts was the Microsoft Excel 2016 spreadsheet.

The second evaluation of EurActory was carried out in an open stage in which all the individuals subscribed to EurActory were invited by email, in June 2016, to take part in the evaluation survey. A questionnaire very similar to the one used in the first evaluation was used for the second evaluation of EurActory, and the responses from the participants were collected through the online evaluation questionnaire. In total 59 individuals responded to the survey.

2.1.1 Comparison between the First and Second Evaluation

In order to assess the progress of the tool in reaching its objectives and in complying with the needs of its actual and potential users, we compared the results of the first and the second evaluation. Moreover, we also compared the feedback collected, to get an idea of the technical improvements carried out. Table 3 presents a summary of the results of the evaluation of EurActory. In some questions the respondents were asked to provide their opinion on a Likert scale from 1 (completely disagree) to 5 (completely agree); in other cases they were simply asked to state their preference.

As is clear from the table, the results of the evaluation are generally positive, except for the categories "EurActory improves the quality of my work" and "EurActory allows me to be more productive" in the closed stage of the first evaluation. In any case, the most comparable evaluations are the open-stage ones carried out in the first and in the second round. The judgements are clearly more positive in the second evaluation, which can be taken as a signal of improvement of the tool over time.

More specifically, the tool appears to be more relevant for the respondents of the open stage of the first evaluation. Concerning ease of use, the results are positive in all the stages, with a slight edge for the closed stage of the first evaluation, except for the use of EurActory without assistance.

Table 3: Summary of Results of the Evaluation of EurActory

Criterion | First Evaluation (Open Stage) | First Evaluation (Closed Stage) | Second Evaluation (Open)
Relevance | | |
It is my main activity | 40,90% | NA | 25,42%
I occasionally contribute | 40,90% | NA | 38,98%
I rarely contribute | 13,63% | NA | 23,73%
I never contribute | 4,54% | NA | 11,86%
Perceived ease of use | | |
I can easily rate my peers | 3,05 | 3,75 | 3,63
I can access top listings | 3,10 | 4,31 | 3,83
Creating a profile is easy | 3,27 | 4,38 | 3,98
EurActory can be easily used without assistance | 3,27 | 3,38 | 4,05
Using EurActory has been a positive experience (agree or strongly agree vs disagree or strongly disagree) | 40,9% (22,27%) | 76,91% | 84,75% (3,39%)
Intention to use | | |
Willingness to use EurActory on a regular basis: every month (every week) | 81,81% (27,27%) | 53,83% (23,07%) | 69,49% (45,76%)
Willingness to advise other colleagues to use EurActory (agree or strongly agree vs disagree or strongly disagree) | 54,54% (13,63%) | 61,53% (7,69%) | 66,1% (6,77%)
Knowledge of similar initiatives | 13,63% | 0 | 8,47%
Strengths and effectiveness | | |
EurActory puts together information not found nor collected under one roof elsewhere (agree or strongly agree vs disagree or strongly disagree) | 36,36% (18,17%) | 23,07% (15,38%) | 57,42% (16,94%)
EurActory improves the quality of my work | 3 | 2,92 | 3,56
EurActory allows me to be more productive | 3,14 | 2,77 | 3,44
EurActory enables me to reinforce my expert positioning | 3,18 | 3,08 | 3,58
EurActory provides me with all the needed information on relevant experts | 3,05 | 3,08 | 3,54
EurActory assists me in identifying relevant experts | 3,32 | 3,69 | 4
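To make the two scoring conventions used in Table 3 (and later in Table 8) concrete, the following minimal sketch shows how the average Likert scores and the "agree or strongly agree" shares could be computed from raw answers. The answer values below are hypothetical; the actual figures were produced with Microsoft Excel 2016, as noted above.

    # Illustrative sketch only: the two scoring conventions used in Table 3
    # and Table 8. The answer values are hypothetical, not actual survey data.
    from statistics import mean

    # Hypothetical raw answers for one statement on a 1-5 Likert scale
    likert_answers = [4, 3, 5, 2, 4, 4, 3, 5]

    # Convention 1: average Likert score (e.g. "I can easily rate my peers")
    average_score = mean(likert_answers)

    # Convention 2: share of respondents who agree or strongly agree (4 or 5),
    # with the share who disagree or strongly disagree (1 or 2) in parentheses
    agree_share = sum(1 for a in likert_answers if a >= 4) / len(likert_answers)
    disagree_share = sum(1 for a in likert_answers if a <= 2) / len(likert_answers)

    print(f"average score: {average_score:.2f}")
    print(f"agree or strongly agree: {agree_share:.2%} ({disagree_share:.2%})")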


Regarding the intention to use, the majority of respondents in all stages are willing to use the tool at least every month, and in the second evaluation almost half are willing to use it every week. Likewise, the majority of respondents are willing to advise other colleagues to use the tool.

Concerning initiatives similar to EurActory, only in the open stage of the first evaluation and in the second evaluation did a minority of respondents report knowledge of similar services. When asked to clarify their answer, the respondents of the first evaluation mentioned tools such as LinkedIn, the official database of European Commission staff, as well as proprietary tools and databases of EU-focused public affairs consultancies. The respondents of the second evaluation likewise mentioned tools such as LinkedIn, Klout, the official database of European Commission staff, as well as proprietary tools and databases of EU-focused public affairs consultancies. In any case, all the respondents reporting knowledge of similar initiatives recognise that EurActory is better and that its range of functionalities is wider.

Finally, regarding the strengths and effectiveness of the tool, the respondents provided positive answers, especially in the second evaluation, except for the aforementioned categories "EurActory improves the quality of my work" and "EurActory allows me to be more productive" in the closed stage of the first evaluation. Moreover, there is a large share of undecided responses for the category "EurActory puts together information not found nor collected under one roof elsewhere", even though the respondents in the second evaluation have a positive judgement.

Considering the open-ended questions, Table 4 presents the benefits of EurActory reported by respondents in all the evaluation exercises. As can be seen, in general terms the benefits reported concern the quantity of content present in the tool, and especially the fact that it gathers a useful amount of information in a single place, as well as the possibility to identify the most influential individuals on every topic.

Table 4: Benefits Reported by Respondents

Benefit | Reported in: First Evaluation (Open Stage / Closed Stage), Second Evaluation (Open)
It allows networking and getting informed on the most pressing updates in the EU sphere by people living in and working for it | X X
It allows easily finding who is relevant in the topic of interest | X X
It is a one-stop shop for information | X X
It provides the possibility to access it from a smartphone | X X
It is a reliable application that provides concrete information | X X
I can find the information I need relative to relevant people | X
It brings several social media feeds together in one place | X
Ability to find experts by subject matter | X
Provides valuable information | X
Saves time and improves work performance | X
Allows finding the most influential people in every topic | X
It provides clear rankings of experts per topic | X
Gathers information hardly found online in a quick and organized fashion | X
Useful to find information from other countries, experts for different multinational projects and international conferences | X
It allows finding the information needed relative to relevant people | X
It filters and ranks information, thus facilitating information-finding and networking | X
It gives the possibility to increase one's own visibility | X
Combines institutional and other influencers/stakeholders in one place | X
It is friendly and easy to understand | X
It gives a clear picture of the "state of play" | X
It allows direct access to EU professionals and impartial information about their activities and projects | X

Likewise, Table 5 lists the flaws of the tool reported by respondents in all the evaluation exercises. The flaws reported clearly concern the presence of bugs, the speed of the system and the limited amount of information; these are all issues that have been mitigated during the project. It is also interesting to note that the majority of the flaws were reported in the first evaluation, showing that the consortium has used this valuable feedback to improve the tool.

Table 5: Flaws of the Tool Reported by Respondents

Current flaw | Reported in: First Evaluation (Open Stage / Closed Stage), Second Evaluation (Open)
Database connection to social networks makes the whole enterprise less professional and insecure as far as personal data are concerned | X
Industry groupings are too limited and institutionalized | X
The contacts list appears to be somewhat random and is slow loading | X
The system provides very few data and no added value | X
The system is slow and has some bugs | X X X
The information is not present for different profiles | X
The aggregation of content from LinkedIn seems to not work efficiently | X
The user interface is not very attractive and is visually somewhat outdated | X
Lack of visibility of civil society related activities | X
It is not available in all EU languages | X
There is over-representation of Brussels-based experts | X
The design of the user interface should be improved | X
Categories of people are too broad | X
The ranking mechanism is not clear | X
There is a limited amount of information (no CVs) | X

Finally, the suggestions for improving the system reported by respondents in all the evaluation exercises (Table 6) are clearly related to the aforementioned flaws, e.g. the need to increase the quantity and transparency of the information present in EurActory (more profiles and more information on each profile), to increase the speed of the system, and to add functionalities such as the possibility to search by category and to introduce interaction among the experts. All these suggestions have been considered by the consortium in order to improve the tool.

Table 6: Suggestions Reported by Respondents

Suggestion | Reported in: First Evaluation (Open Stage / Closed Stage), Second Evaluation (Open)
Increase the speed of the system | X X X
Provide a more attractive and modern user interface | X X
Provide the possibility of searching by category, e.g. Influencer, Institutional, Analyst | X X
Integrate real-time interactivity between participants | X X
Promote the product more | X X
Increase the number of profiles and the information on each profile | X X
Increase the number of domains | X X
Add Google results | X
Update consistently | X
Provide info on profiles not easily searchable online | X
Invite users to add a profile picture, for instance by slightly gamifying the experience (e.g. getting "rewards", a ranking, or a percentage of profile completion) | X
Capture informal networks of experts | X
Combine and minify the JavaScript and the CSS | X
Combine the background images with CSS sprites | X
Interact with users to improve the platform | X
Create a home page | X
Improve transparency in the ranking | X
Increase the number of profiles from the national level | X

2.2 Evaluation of PolicyLine

The first evaluation of PolicyLine was carried out by means of an experts' structured feedback and a closed-stage evaluation conducted through an online survey. For the experts' structured feedback, the participants were selected by the leader of WP7 "Pilot Operation", Maxime Sattonay from EurActiv, who provided a list of EU experts. From this list, 3 experts participated in the evaluation round. The test participants were invited via email and were provided with a link to an online evaluation system. In addition, a closed evaluation session was organised with the participation of graduate students from the University of the Aegean interpreting the role of potential users. During this session the tool was presented to the audience, and its applications and results were explained. The participants then had the opportunity to interact with the ICT platform by executing a set of predefined usage scenarios, listed in Table 7, under the observation of the organisers, who supported them, recorded any comments or difficulties, and collected feedback on possible improvements. Finally, the evaluation data from the participants in this session were collected. In total 16 individuals took part in the evaluation.

Table 7: Usage Scenarios

Usage scenarios:
- Log in to PolicyLine, view all topics, view the policy processes under a topic, find documents and view more information on a document
- Find the proposal documents of a policy process, view the proposal's options chart and its author's profile
- Create a new process, add a document
- Find the proposal document of a policy process, rate and share the document on social media

The questions of the survey were clustered according to the following categories:

- Background Questions: the respondents were asked to identify their domain of interest as well as their degree of interest in EU policy making;
- Relevance: this set of questions aims to evaluate whether the objectives of the tool under scrutiny are adequate to the needs of the beneficiaries;
- Perceived Ease of Use: the degree to which a person believes that using the system would be free of effort;
- Intention to Use: a person's perceived likelihood or subjective probability that he or she will use the tool in the future;
- Knowledge of similar initiatives: the respondent is asked to provide information on possible competitors of the tool;
- Strengths of the tool: the main benefits of the tool;
- Perceived Usefulness: the degree to which a person believes that using the system would enhance his or her job performance;
- Effectiveness of the tool: the capability of the tool to reach its intermediate and strategic objectives;
- Barriers and bottlenecks: issues preventing the tool from unleashing its full potential;
- Future Improvements: the respondents were asked to provide suggestions on future improvements of the tool.

The same questionnaire used in the closed stage of the first evaluation was used for the second evaluation of PolicyLine. The online survey was administered in a closed stage to a set of individuals interpreting the role of potential users who took part in two events, both held in July 2016: a user workshop on PolicyLine organised by EurActiv, held in Brussels, and the Summer School on Open and Collaborative Governance organised by the University of the Aegean in Samos. Given the similarities in background and interests between the audiences and the content of the two events, as well as the similarity of the results of the individual evaluations, we deemed it acceptable to aggregate the data. During the events the tool was presented to the audience, and its applications and results were explained. The participants were then shown a set of predefined usage scenarios, equal to those presented in the closed stage of the first evaluation. In total 14 individuals took part in the evaluation, 10 during the summer school and 4 in the workshop.

2.2.1 Comparison between the First and Second Evaluation

Table 8 displays a comparison between the first and second evaluation, both carried out by means of the same online questionnaire. As can be seen, in both evaluations the response is quite positive, and it improves in the second evaluation. This might be a signal of improvement of the tool over time.

Table 8: Summary of Results of the Evaluation of PolicyLine

Criterion | First Evaluation | Second Evaluation
Perceived ease of use | |
I can easily get an overview of the process | 3,67 | 4,07
I can easily rate/comment a document | 3,63 | 4,00
I can easily add a document | 3,81 | 4,29
I can easily create a process | 3,73 | 4,14
PolicyLine can be used without assistance | 3,64 | 3,79
Using PolicyLine has been a positive experience (agree or strongly agree) | 68,75% | 79%
Intention to use | |
Willingness to use PolicyLine on a regular basis (every month) | 62,50% | 86%
Willingness to advise other colleagues to use PolicyLine (agree or strongly agree) | 68,75% | 92%
Knowledge of similar initiatives | 0,00% | 57%
Strengths and effectiveness | |
PolicyLine puts together information not found nor collected under one roof elsewhere | 37,50% | 57%
PolicyLine improves the quality of my work | 3,38 | 3,79
PolicyLine allows me to be more productive | 3,38 | 3,57

In the second evaluation, respondents also reported knowledge of services similar to PolicyLine. When asked to clarify their answer, the respondents mentioned tools such as One Policy Place (http://onepolicyplace.com), which aims at displaying together the latest EU legislative and policy developments in climate change, energy and environment, and the project Publiaccess Eurlex (https://publications.europa.eu/en/web/public-access), aimed at facilitating online access to a wider range of unclassified documents held by EU institutions. It is clear that the range of functionalities of PolicyLine is wider than that of the suggested tools.

Concerning the strengths and effectiveness of PolicyLine, the respondents in the first evaluation identified as important the provision of valuable information and the possibility to save time and to monitor and evaluate policy processes. The respondents in the second evaluation were more specific: they identified as important the collection of all the useful information about European issues, the collection and tracking of documents over time, the provision of an accurate overview of recent policies, and the possibility to assess the credibility of a document by reviewing the author's profile and ranking.

Concerning future improvements of the system, the respondents of the first evaluation, given their relatively high IT knowledge, provided suggestions of a technical nature, such as making the policy timelines clearer to read and using bigger panels to display the relevant information.

The respondents of the second evaluation, on the other hand, provided suggestions more related to policy and usability, such as:

- Adapt to different browsers
- Provide more categories on topics
- Create drop-down menus under topics
- Make it clear on the website how the size of the two bubbles is determined
- Provide notifications on chosen policy processes
- Increase the responsiveness of the website on smartphones
- Provide data mining and machine learning algorithms yielding more automated results in the rating of results
- Involve civil society, think tanks and universities

The different responses might be due to the diverse background of the

respondents in the two rounds of evaluation.

The evaluation of PolicyLine was complemented by the aforementioned structured feedback from experts paid by the project, which focused on reporting issues related to the design and the technical details of the tool. Concerning the design issues, the most important feedback related to topics such as content, contributions from users, the documents displayed, navigation, the timeline, the overall impression, and the link to EurActory. Concerning the technical issues, the most important feedback related to topics such as the axes of the visualisation, the bounds displayed, content, contributions, the documents displayed, the general impression, navigation, social media, zoom and the timeline. In any case, all the feedback has been reported in Jira and taken into account for the improvement of the tool.


3. Metrics Embedded in the Platform

The following sections present the results stemming from the metrics embedded in the platform and the metrics recorded by means of web analytics tools, with the objective of investigating the level of take-up of the EU Community platform by the target groups. Regarding EurActory, the following data were used:

Data from web analytics tools:
- Number of users per month
- Number of returning users
- Number of page views per month
- Number of sessions by device

Data from metrics embedded in the platform:
- Number of registered users by type
- Number of activated profiles by type and rating
- Number of activated profiles

Regarding PolicyLine, the following data were used:

Data from web analytics tools:
- Number of users per month
- Number of returning users
- Number of page views per month
- Number of sessions by device

Data from metrics embedded in the platform:
- Number of contributions per topic and type of contribution
- Number of contributors per topic and type of contribution
- Number of contributors per type of contributor and type of contribution

The report first presents the results relative to EurActory, and then the results regarding PolicyLine.
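As a purely illustrative sketch of how figures of this kind can be derived from monthly web analytics data (the numbers and column names below are hypothetical, not the actual EurActory or PolicyLine exports), the period totals and the cumulative "over time" series shown in the figures of this section could be computed as follows:

    # Illustrative sketch only: relates monthly web analytics data to the
    # totals and the cumulative "over time" series discussed in this section.
    # All numbers below are hypothetical.
    import pandas as pd

    monthly = pd.DataFrame({
        "month": pd.period_range("2014-08", "2014-11", freq="M"),
        "users": [120, 340, 410, 500],
        "returning_users": [90, 270, 330, 410],
        "sessions": [150, 400, 480, 590],
        "page_views": [300, 900, 1100, 1400],
    })

    total_users = monthly["users"].sum()
    # Note: a real web analytics tool deduplicates users across months, so the
    # actual returning-user share is not a simple sum of monthly values; this
    # ratio is only a stand-in for the idea.
    returning_share = monthly["returning_users"].sum() / total_users

    # "Per month" figures plot the monthly values directly; the "over time"
    # figures plot the cumulative sum of the same series.
    monthly["page_views_over_time"] = monthly["page_views"].cumsum()
    monthly["users_over_time"] = monthly["users"].cumsum()

    print(f"{total_users} users ({returning_share:.0%} returning) in "
          f"{monthly['sessions'].sum()} sessions, "
          f"{monthly['page_views'].sum()} page views")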

3.1 Metrics Relative to EurActory

Starting with the metrics stemming from the web analytics tools, EurActory was used from August 2014 to September 2016 by 69270 users (81% returning users) in 81646 sessions. In the same period the website received 165302 page views. Page views per month are depicted in Figure 1. As can be seen, the views per month increase over the course of the project, peaking in February 2016 and in September 2016 due to dissemination campaigns carried out by the consortium.


Figure 1: Page Views of EurActory per Month

Similarly, Figure 2 shows the cumulative distribution of page views over time. The rate of growth of page views increases from August 2015 and remains constant (and high) until the end of the project.

Figure 2: Page Views of EurActory Over Time



In the same way, the number of users per month increases, peaking in June 2016 and in September 2016 (Figure 3).

Figure 3: Users of EurActory per Month

Similarly, Figure 4 displays the cumulative number of users of EurActory over time. As can be seen, the rate of growth of users is constant until August 2015, after which it increases and remains high for the rest of the project.

Figure 4: Users of EurActory Over Time

It is also interesting to look at the sessions per device in Table 9. Most of the use of EurActory took place on a desktop, which is not surprising as most people use EurActory while performing their job. The extent of use on mobile phones is nonetheless not negligible.



Table 9: Number and Percentage of Sessions per Device

Device | Number | Percentage
Desktop | 69290 | 84,87%
Mobile | 8944 | 10,95%
Tablet | 3412 | 4,18%
Total | 81646 | 100%
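The device shares in Table 9 (and in Table 12 for PolicyLine) are simple proportions of the total number of sessions. A minimal sketch of the computation, using the figures from Table 9:

    # Illustrative sketch: device shares as reported in Table 9.
    sessions_per_device = {"Desktop": 69290, "Mobile": 8944, "Tablet": 3412}
    total_sessions = sum(sessions_per_device.values())  # 81646

    for device, sessions in sessions_per_device.items():
        print(f"{device}: {sessions} sessions, {sessions / total_sessions:.2%}")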

Regarding the data stemming from the metrics embedded in the platform, EurActory has 807 registered users and 532 activated profiles.

Figure 5: Registered Users of EurActory per Month

Figure 5 depicts the distribution of registered users per month. As can be seen, the rate of growth of registered users is at its peak between December 2014 and February 2015, then slows down until August 2015, and then increases and remains high for the rest of the project. Figure 6 shows the corresponding cumulative number of registered users over time.



Figure 6: Registered Users of EurActory Over Time

Out of the activated profiles, 55 experts have peer-rated others, 90 experts have been peer-rated, and 260 experts have self-assessed.

Table 10 presents the experts by type. As can be seen, there are 34000 expert profiles in EurActory, almost 96% of which are institutional. This is not surprising, as the institutional experts' data are the easiest to crawl.

Table 10: Experts by Type

Type | Number | Percentage
Analysts | 499 | 1,47%
Influencers | 931 | 2,74%
Institutional | 32570 | 95,79%
Total experts in EurActory | 34000 | 100%

By contrast, Table 11 shows that the relative majority of activated profiles belong to influencers, while institutional experts account for only 23,5%. It is interesting to note that over fifty Members of the European Parliament and over twenty senior officers of the European Commission have activated profiles.

Table 11: Activated Profiles by Type

Type | Number | Percentage
Analysts | 187 | 35,15%
Influencers | 220 | 41,35%
Institutional | 125 | 23,50%
Total activated profiles in EurActory | 532 | 100%



3.2 Metrics Relative to PolicyLine

In the same way, considering first the metrics stemming from the Web Analytics Tools, PolicyLine has been used from August 2015 to September 2016 by 979 users (83% returning users) in 2240 sessions. Likewise, the website received 24279 page views in the same period. A depiction of page views per month is available in Figure 7. As can be observed, the views per month increase over the course of the project, peaking in February 2016, and then decrease before increasing again in July and September 2016.

Figure 7: Page Views of PolicyLine per Month

Similarly, in Figure 8 we can observe the distribution of Page Views over time: the rate of growth of Page Views is roughly constant until April 2016, after which it slows down.

Figure 8: Page Views of PolicyLine Over Time



Quite differently, the number of users per month increases, peaking in January-March 2016 and again in September 2016 (Figure 9).

Figure 9: Users of PolicyLine per Month

Likewise, Figure 10 displays the distribution of Users over time. The rate of growth of users is more or less constant until the very end of the project.

Figure 10: Users of PolicyLine Over Time

It is also interesting to look at the sessions per device in Table 12. Almost all use of PolicyLine has taken place on desktop devices, probably because the screens of mobile phones and tablets are not large enough to provide a good user experience.



Table 12: Number and Percentage of Sessions per Device

DEVICE    NUMBER   PERCENTAGE
Desktop   2160     96.43%
Mobile    47       2.10%
Tablet    33       1.47%
Total     2240     100%

Regarding the data stemming from the metrics embedded in the platform, Table 13 depicts the number of contributions per topic and per type of contribution, which amounts to 736 in total. Clearly, the topics with the most contributions are Innovation and Entrepreneurship and Future of EU, with 295 and 279 contributions respectively. On the other hand, the most used type of contribution is the addition of documents to a policy process, with 659 documents added to the platform. Much less used are process creation and document rating. In any case, 21 processes created is a positive number.

Table 13: Number of Contributions per Topic

TOPIC                           CREATE PROCESS   ADD DOCUMENT   RATE DOCUMENT   CONTRIBUTIONS
Energy Union                    3                85             49              137
Innovation & Entrepreneurship   3                289            3               295
Future of EU                    9                266            4               279
Evaluation Topic                6                19             -               25
Total                           21               659            56              736
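A breakdown such as Table 13 can be reproduced from raw contribution records by a simple cross-tabulation. The following is a minimal sketch assuming hypothetical records with topic and contribution-type fields; the field names and sample values are illustrative and not taken from the platform:

    from collections import defaultdict

    # Hypothetical contribution records; field names are illustrative only.
    contributions = [
        {"topic": "Energy Union", "type": "Add Document"},
        {"topic": "Energy Union", "type": "Rate Document"},
        {"topic": "Future of EU", "type": "Create Process"},
    ]

    # Cross-tabulate contributions per topic and per type, as in Table 13.
    table = defaultdict(lambda: defaultdict(int))
    for record in contributions:
        table[record["topic"]][record["type"]] += 1

    for topic, per_type in table.items():
        print(topic, dict(per_type), "total:", sum(per_type.values()))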

Likewise, Table 14 displays the number of contributors per topic and type of contribution. As can be seen, 53 users contributed to the platform, of whom 22 contributed to the topic Future of EU, 12 to Innovation and Entrepreneurship, 10 to the Evaluation Topic, and 9 to Energy Union.

Table 14: Number of Contributors per Topic and Type of Contribution

TOPIC                           CREATE PROCESS   ADD DOCUMENT   RATE DOCUMENT   CONTRIBUTORS
Energy Union                    1                5              3               9
Innovation & Entrepreneurship   3                8              1               12
Future of EU                    4                15             3               22
Evaluation Topic                5                5              -               10
Total                           13               33             7               53

Finally, Table 15 displays the number of contributors per category of contributor and type of contribution. As expected, most of the contributors are Analysts and Influencers.

Table 15: Number of Contributors per Category of Contributor and Type of Contribution

CATEGORY        CREATE PROCESS   ADD DOCUMENT   RATE DOCUMENT   CONTRIBUTORS
Analyst         5                13             4               22
Influencer      5                10             2               17
Institutional   1                1              -               2
Total           11               24             6               41

3.3 Lessons Learnt and Recommendations for the

Sustainability of the Engagement

The successful take-up of EurActory provides a set of lessons that can be transferred to similar projects. In general, EU Community has carried out a mixture of online engagement and live events, which are mutually supportive and should be closely interlinked. Online engagement helps improve live events by kick-starting discussions and setting the right expectations before the event, and by ensuring that its momentum and results are maintained as a follow-up. Online engagement is designed to complement communication activities and to build on them. While communication is typically one-way, online engagement aims to stimulate not only two-way communication, but also genuine many-to-many interaction. In other words, it is about moving from a set of stakeholders to a real community.

In EU Community, the stakeholder engagement aimed mainly at two objectives: to widen participation to new, relevant people, and to strengthen the intensity of participation of those who are already members. In this respect, a successful approach is founded on a limited set of principles:

A federated approach: rather than trying to create "the one and only" engagement platform, we should promote interaction with other platforms.

Proactive outreach on social media and third party platforms;

Snowball approach, starting from targeted desk-based research of key stakeholders, to gradually increase participation;

Integration of synchronous and asynchronous online engagement;

Systematic reporting

Online engagement can and should be measured. In this respect it is useful to set up and run a system for monitoring and evaluating the level and quality of engagement in the dissemination campaign. Monitoring and evaluation should take place on a regular basis so that the dissemination strategy can be amended if necessary. The evaluation should make use of participation metrics and indicators (e.g. embedded in the websites and platforms), as well as periodic surveys of the users of the online engagement channels and of the attendees of live events. In any case, a system allowing continuous feedback must be used.
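As a purely illustrative sketch (the project's own monitoring tooling is not described here), such a system could periodically recompute a small set of participation indicators from platform event logs, for example:

    from datetime import date

    # Hypothetical platform event log entries: (user_id, event_date, action).
    events = [
        ("u1", date(2016, 2, 3), "add_document"),
        ("u2", date(2016, 2, 10), "rate_document"),
        ("u1", date(2016, 3, 1), "create_process"),
    ]

    def participation_indicators(events, year, month):
        """Return simple monthly engagement indicators: active users and contributions."""
        monthly = [e for e in events if e[1].year == year and e[1].month == month]
        active_users = {user for user, _, _ in monthly}
        return {"active_users": len(active_users), "contributions": len(monthly)}

    print(participation_indicators(events, 2016, 2))  # {'active_users': 2, 'contributions': 2}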

Examples of combined online and offline engagement activities carried out by EU Community are the three online discussions, which covered the policy and technical aspects of the project and drove user engagement with its products, and the three workshops, which were aimed at gathering valuable feedback from the users and from the community of EU policy stakeholders as a whole.

More specifically, the first online discussion focused on the policy aspects of the project, and on the EurActory ranking in particular; the second online discussion drove user engagement with EU Community and EurActory on Twitter; and the final online discussion was held on Skype with EurActiv network partners from around Europe and focused primarily on the technical aspects of the project. Likewise, the first workshop was aimed at data specialists and took place during Belgium's biggest data journalism conference, Dataharvest; the second workshop was aimed at communication professionals working on EU policy and took place during an event called BucoproX; and the last workshop took place during a policy debate with an audience and a panel of Energy Efficiency experts.


4. Discussion of Results

This final deliverable of the project presents the elaboration and synthesis of the results of the evaluations carried out throughout the project, in particular the four rounds of users' surveys and the elaboration of the data stemming from the metrics embedded in the platform. The four rounds of users' surveys dealt with the evaluation of the tools EurActory and PolicyLine, with two evaluation rounds for each tool. In this regard, the aim of the users' surveys was to assess to what extent the tools have reached their objectives and have complied with the needs of their actual and potential users.

Furthermore, the users' surveys were complemented by the analysis of the data stemming from the web analytics tools as well as from the metrics embedded in the platform. The metrics embedded in the platform allowed assessing the level of users' engagement in the EU Community platform. Regarding the results of the users' surveys, the evaluation shows a positive response concerning the usability and the usefulness/effectiveness of both tools, especially in the second round of evaluation. More specifically, the tools appear to be relevant and useful for the users. Regarding technological usability, the response is also fairly positive, as it is for the effectiveness in helping users to be more productive, to increase the quality of their work, and to carry out the tasks related to their own profession. More importantly, the second evaluation provides more positive results than the first one. The considerable feedback regarding content and bugs present in the system, alongside suggestions concerning new content and functionalities, has been taken into account (and will continue to be taken into account) to improve both EurActory and PolicyLine.

The analysis of the data collected by means of the metrics embedded in the platform and the metrics recorded by the Web Analytics Tools shows a considerable take-up of EurActory, as well as a significant amount of content present in the tool. Likewise, the analysis shows that the take-up of PolicyLine and the content present in it are smaller in magnitude, which is certainly due to the different degrees of maturity of the two tools.


References

[1] Ajzen, I., Fishbein, M. (1980). "Understanding Attitudes and Predicting Social Behaviour", Prentice-Hall: Englewood Cliffs, NJ.

[2] Ammenwerth, E., Brender, J., Nykänen, P., Prokosch, H.U., Rigby, M., Talmon, J. (2004). "Visions and strategies to improve evaluation of health information systems - reflections and lessons based on the HIS-EVAL workshop in Innsbruck", Int J Med Inf, 73(6), pp. 479-491.

[3] Burton-Jones, A., Hubona, G.S. (2005). "Individual differences and usage behavior: revisiting a technology acceptance model assumption", The DATA BASE for Advances in Information Systems, 36(2), pp. 58-77.

[4] Charalabidis, Y., Loukis, E., Koulizakis, Y., Mekkaoui, D., Ramfos, A. (2014). "Leveraging European Union Policy Community Through Advanced Exploitation of Social Media", IFIP e Conference, 2nd September 2014, Trinity College, Dublin.

[5] Chau, P.Y.K., Hu, P.J.H. (2002). "Investigating healthcare professionals' decisions to accept telemedicine technology: an empirical test of competing theories", Information & Management, 39, pp. 297-311.

[6] Davis, F.D. (1989). "Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology", MIS Quarterly (13:3), pp. 319-339.

[7] Davis, F.D., Bagozzi, R.P., Warshaw, P.R. (1989). "User Acceptance of Computer Technology: A Comparison of Two Theoretical Models", Management Science (35:8), pp. 982-1002.

[8] Davis, F.D., Venkatesh, V. (2004). "Toward Preprototype User Acceptance Testing of New Information Systems: Implications for Software Project Management", IEEE Transactions on Engineering Management (51:1), pp. 31-46.

[9] Fishbein, M., Ajzen, I. (1975). "Belief, attitude, intention, and behavior", Reading, MA: Addison-Wesley.

[10] Kim, D., Chang, H. (2007). "Key functional characteristics in designing and operating health information web-sites for user satisfaction: An application of the extended technology acceptance model", International Journal of Medical Informatics, 76, pp. 790-800.

[11] Lee, F., Teich, J.M., Spurr, C.D., Bates, D.W. (1996). "Implementation of physician order entry: user satisfaction and self-reported usage patterns", Journal of the American Medical Informatics Association, 3, pp. 42-55.

[12] Rogers, E. (1995). "Diffusion of Innovations", Free Press, New York.

[13] Venkatesh, V. (1999). "Creating Favorable User Perceptions: Exploring the Role of Intrinsic Motivation", MIS Quarterly (23:2), pp. 239-260.

[14] Venkatesh, V., Morris, M.G., Davis, G.B., Davis, F.D. (2003). "User Acceptance of Information Technology: Toward a Unified View", MIS Quarterly (27:3), pp. 425-478.

[15] Venkatesh, V., Morris, M.G. (2000). "Why Don't Men Ever Stop to Ask for Directions? Gender, Social Influence, and Their Role in Technology Acceptance and Usage Behavior", MIS Quarterly (24:1), pp. 115-139.


[16] Yarbrough, A.K., Smith, T.B. (2007). “Technology acceptance among

physicians: a new take on TAM“, Medical Care Research and Review 64, pp.

650–672


APPENDICES: Evaluation Framework

This chapter begins with the illustration of the fundamental intervention logic of

the project. Subsequently the two dimensions of evaluation are presented,

namely the policy impact measurement approach and the technology acceptance

model, which are then integrated in the general methodological framework.

Finally, the process for the development of the evaluation metrics is described.

I. APPENDIX A: The EU Community Intervention Logic

The first step in setting up an evaluation framework is the definition of an

intervention logic for the project. In the specific case, the logic of the intervention

(Figure 11) entails the evaluation of the five stages of engagement:

Definition of the Context in terms of socio-political factors and of the

Needs of the users as well as of the objectives of the call

Evaluation of the Intervention in terms of technical design,

methodological design, and quality of moderation

Evaluation of Output/uptake: extent of participation, degree of diversity

in participation, content provided in the platform

Evaluation of Outcomes: quality of ideas, quality of actual decisions,

quality and availability of policy options

Evaluation of Impact: improved quality of policy making and increased

empowerment of actors

Figure 11: EU Community intervention logic

The steps outlined in the intervention logic are related to the objectives of the

project:

The operational objectives are related to output/uptake, and consist of:

o Providing a ready to use information base and a platform

containing a set of visual tools, focusing on the most relevant


documents as well as the most knowledgeable and credible people on each topic

o Involving the key actors of the EU policy debate

The intermediate objectives are related to outcome/results and consist in

improving the capacity to:

IO1. Map the position of stakeholders and institutions

IO2. Quickly gather the evidence available

IO3. Monitor the status of policy issues in the decision-making flow

IO4. Expand your visibility and influence the policy debate

IO5. Identify new experts

The strategic objectives are related to impact and consist in improving the quality and transparency of EU Policy Making:

o More evidence based policy making

o More consensus behind policy decisions

o More alignment with strategic priorities

o More capacity to quickly react to policy priorities

The problems and challenges faced by the targets of the project are also related to the steps of the intervention logic:

In relation to output/uptake

o The traditional policy discussion is sub-optimal in terms of speed,

clarity and use of evidence and the open discussions involving

important new players are often too general and crowded

o Too many unstructured contributions, which are not easily understandable and are of low quality

In relation to outcome/results

o Low quality and availability of policy options, low quality of

decisions and ideas

In relation to impact

o Low quality, transparency and efficiency of EU policy making, and low empowerment of stakeholders and citizens

There is an obvious symmetry between the analysis of the problems and the description of the objectives. Figure 12 illustrates the logical-causal relationship between the problems/challenges and the objectives outlined, by associating to each level of problems/challenges a corresponding level of objectives. The logical-causal relationship is complemented by systemic elements and feedback loops.

Figure 12: Logical-causal relationship

The target

One remarkable element of the project is the fact that it is not targeted at the population at large, but rather addresses the needs of relevant EU stakeholders. It goes beyond the paradigm of "open policy making" to provide high-quality services to expert groups, in order to improve the quality of policy-making. In fact, according to Charalabidis et al. (2014), the target of the EU Policy Community, which is the target of the project, is composed of:

Decision makers: The Commission, the European Parliament, the Council, the European Investment Bank, the European External Action Service and the decentralised agencies and bodies

Influencers: EU industry federations, Trade Unions, NGOs, multinational corporations

Experts and Policy Analysts: Examples include international media organisations (e.g. EurActiv.Com), as well as think tanks (e.g. Lisbon Council) and academic experts


Incidentally, these three categories reflect what Robert Madelin calls the Bermuda Triangle of policy.

Different objectives are relevant for each typology of stakeholders1:

Type IO1 IO2 IO3 IO4 IO5

Decision makers: XXX XXX X XXX

Influencers: XX X XXX XXX

Experts and Policy Analysts X XX XX X

As for the output produced, the aim of the project is to retrieve documents authored by experts (or by authorities, such as the European Commission) from various other sources (blogs, websites, etc.), and then process them to provide a structured information base. More particularly, the output of the EurActory component of the project platform will include a credibility ranking of pivotal EU actors, while the PolicyLine component will include relevant documents in visualised form, such as:

The mapping of topics and subtopics of the document with respect to the

steps of EU policy process (public debate, policy debate, draft, debate,

decision, implementation, review)

1 It should be noted that the higher the number of "X" marks, the more relevant the objective is for the stakeholder.


Official relevant documents from EU Institutions (e.g. white papers, green

papers, Commission drafts, amendments, etc.)

Links to various stakeholder position documents (e.g. from industry federations, NGOs, etc.) related to the relevant official documents

Media analysis documents (e.g. from EurActiv and other media), which are

related to the relevant official documents


II. APPENDIX B: The two Dimensions of Evaluation

In this subsection, the two approaches to the evaluation used throughout the project are described. One approach has been developed for assessing policy interventions, and is therefore dubbed the policy impact measurement approach, while the other approach relates to the acceptance of a technology.

Policy Impact Measurement Approach

Starting from the intervention logic depicted in Figure 11, it is possible to define a set of indicators, i.e. characteristics or attributes that can be measured to assess a project in terms of its outputs, outcomes or impacts, and that can be either quantitative or qualitative. Measurement indicators can be used to assess interventions according to key evaluation criteria, which measure to what extent a project has achieved its intended results. The evaluation criteria used in the project are relevance, efficiency, effectiveness and additionality. They are further described below:

The relevance criterion aims to evaluate whether the objective of the intervention under scrutiny is adequate to the needs of the beneficiaries. In this respect, we will analyse the profile of the participants in terms of needs, benefits and participation, as well as the methodological and technical design of the project.

Efficiency aims to evaluate whether the inputs provided by a project are adequate to reach a given result in terms of outputs and outcomes. In this sense, we will evaluate the extent of participation, its degree of diversity, as well as the capability of the project to obtain the same results with less expenditure.

Effectiveness: this criterion, which is the most important, refers to the capability of EU Community to reach its intermediate and strategic objectives, i.e. to improve and facilitate the daily activity of EU actors, and to improve the quality and transparency of EU policy making and the empowerment of actors. In this respect, we will evaluate the value of the services offered, the quality and quantity of the information provided in the platform, as well as the kinds of benefits gained by users.

Additionality refers to the capability of EU Community to achieve a set of results that would not have been reached in its absence. In particular, in our case it refers to the capability of the project to provide services that are unique or better than those of similar initiatives, as well as to reach users who are normally not reached by other services.

Technology Acceptance Model

A number of theories have been developed in order to explain why users decide to adopt technology applications. The Innovation Diffusion Theory (IDT) (Rogers 1995) explains the process that an innovation follows in order to move from the state of invention to widespread adoption. The theory classifies individuals according to their speed of uptake: innovators, early adopters, early majority, late majority and laggards. Moreover, the approach identifies a set of innovation characteristics affecting diffusion: compatibility, relative advantage, complexity, trialability and observability.

The Theory of Planned Behaviour (TPB) (Fishbein and Ajzen 1975) states that the determinants of individual behaviour are subjective norms (the individual's perception of the opinions of people who are important to him/her regarding the behaviour in question), attitudes toward the behaviour (feelings about performing the behaviour), and perceived behavioural control (the ease or difficulty of performing the behaviour).

According to the Unified Theory of Acceptance and Use of Technology (UTAUT) (Venkatesh et al. 2003), social influence, performance expectancy, effort expectancy and facilitating conditions are direct determinants of intention to use and of usage behaviour.

The FITT framework (Ammenwerth et al. 2006) holds that an essential element of the IT implementation process is the fit among individuals, tasks and technology. The theory states that the implementation of IT solutions in clinical practice depends upon the fit amongst individual attributes (e.g. motivation to use the IT solution), technological attributes (e.g. usability), and attributes of the clinical processes (e.g. organisational factors).

Finally, the Technology Acceptance Model (TAM) (Davis 1989, Davis et al. 1989) is the most frequently used theory, and it is the one adopted within the scope of the EU Community project. The Technology Acceptance Model builds on the attitude paradigm developed by Fishbein and Ajzen (1975), which illustrates how to measure the components of attitudes related to behaviour, distinguishes between beliefs and attitudes, and explains the mechanism by which external stimuli are connected to beliefs, attitudes and behaviour. The Technology Acceptance Model also builds on the Theory of Reasoned Action (TRA). This theory states that the performance of an individual is influenced by his/her attitude and subjective norms concerning the behaviour in question. Moreover, it states that the beliefs and motivations of individuals interact with existing behaviour (Ajzen and Fishbein 1980).

Figure 13: Technology Acceptance Model

Source: Davis (1989)


Let us now consider the criteria according to which a technology is accepted. Davis (1989) argues that user acceptance of any technology is determined by two factors:

Perceived usefulness, which is defined as “the degree to which a person

believes that using a particular system would enhance his or her job

performance”

Perceived ease of use, which is defined as “the degree to which a person

believes that using a particular system would be free from effort”

Perceived usefulness and perceived ease of use form the attitude towards use, which Davis (1989) explains as "the degree of evaluative affect that an individual associates with using a system in his or her job". Finally, according to the theory, behavioural intention to use, which is "a person's perceived likelihood or subjective probability that he or she will engage in a given behaviour" (Davis et al. 1989; Davis and Venkatesh 2004), determines the actual use of the application, and attitude toward the technology affects the intention.

As shown in Figure 13, perceived ease of use and perceived usefulness can be affected by various external variables, such as personal characteristics like the level of education (inter alia Burton-Jones and Hubona 2005) and gender (inter alia Venkatesh and Morris 2000), or organisational features like training in computer use (inter alia Venkatesh 1999).

The Technology Acceptance Model has been tested with several types of IT applications (inter alia Lee et al. 1996; Yarbrough and Smith 2007), and it has also been adopted for the identification of functional factors in designing health information websites for customers (Kim and Chang 2007). Moreover, several studies have demonstrated that the model can be used to assess actual IT use (Venkatesh and Morris 2000) as well as the variation in behavioural intention (Chau and Hu 2002).


III. APPENDIX C: Methodological Framework

Combining the intervention logic, the technology acceptance evaluation criteria

and the policy impact evaluation criteria we have defined an overall

methodological framework for the evaluation of EU Community (Figure 14). As

illustrated, the technology acceptance criteria are all indirectly related to the

production of output of the project, while in particular perceived ease of use is

related to the input/intervention and perceived usefulness is related to the

outcome/result of the project. As for the policy impact evaluation criteria,

additionality is related to the impact over the baseline (context), while relevance

regards the adequacy of the intervention with respect to the needs of the targets.

In the same way, efficiency concerns the adequacy of the output produced with respect to the input deployed, while effectiveness of the intervention concerns the outcome reached by means of the outputs produced.

Figure 14: Methodological Framework

After having defined the technology acceptance evaluation criteria and the policy

impact evaluation criteria it is possible to illustrate the logical-causal relationship

between problem/challenge and objectives in a more complete fashion (Figure

15).


Figure 15: Logical-causal relationship complete with evaluation criteria

Taking into account the objectives of the project, it follows that the criteria of efficiency and effectiveness refer respectively to the operational and strategic objectives. Moreover, the criteria of relevance, additionality and

sustainability are transversal to all the objectives. On the other hand, perceived

usefulness is related to operational objectives, while attitude, perceived ease of

use and behavioural intention to use are related to the intermediate objectives.


IV. APPENDIX D: Development of the Evaluation Metrics

Figure 16 depicts the process for the development of the evaluation metrics. The process starts with the definition of the objectives of the project (operational, intermediate and strategic), followed by the definition of the evaluation criteria, which assess to what extent a project has achieved its intended results and which, in the case at hand, relate both to the policy impact measurement approach and to the technology acceptance model. From the set of objectives and criteria stems a series of evaluation questions that are further refined. A set of indicators is then defined in order to answer the specific evaluation questions. The definition of the sources related to the indicators is the last step.

Figure 16: Development of the evaluation metrics
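For illustration only, the chain from objectives to sources can be represented as a simple nested mapping; the entries below are hypothetical placeholders rather than the project's actual metric definitions:

    # Hypothetical sketch of the objectives -> criteria -> questions -> indicators -> sources chain.
    evaluation_metrics = {
        "operational objective: involve key EU policy actors": {
            "criterion": "efficiency",
            "evaluation_question": "How many relevant users were reached and engaged?",
            "indicators": ["number of registered users", "number of activated profiles"],
            "sources": ["metrics embedded in the platform", "web analytics tools"],
        },
    }

    for objective, spec in evaluation_metrics.items():
        print(objective, "->", spec["indicators"], "from", spec["sources"])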