D18.1 - Dissemination and training strategy plan and preliminary materials
MD-Paedigree - FP7-ICT-2011-9 (600932)
Model Driven Paediatric European Digital Repository
Call identifier: FP7-ICT-2011-9 - Grant agreement no: 600932
Thematic Priority: ICT - ICT-2011.5.2: Virtual Physiological Human
Deliverable 18.1
Dissemination and training strategy plan
and preliminary materials
Due date of delivery: 28-02-2014
Actual submission date:
Start of the project: 1st March 2013
Ending Date: 28th February 2017
Partner responsible for this deliverable: LYNKEUS
Version: 1.4
Title Dissemination and training strategy plan and preliminary materials
Deliverable 18.1
Reporting Period 15th March 2013 – 1st March 2014
Authors LYNKEUS
Work Package 18 – Dissemination & Training
Security Public
Nature Report
Keyword(s) Dissemination, training, material
Document History
Name Remark Version Date
Vanessa Diaz Training planning draft 1.0 05-12-2013
Almerico Bartoli First complete draft 1.1 18-01-2014
Mirko De Maldè Second complete draft 1.2 25-02-2014
Mirko De Maldè Third complete draft 1.3 27-02-2014
List of Contributors
Name Affiliation
Vanessa Diaz UCL
Mirko De Maldè LYNKEUS
Almerico Bartoli LYNKEUS
List of reviewers
Name Affiliation
Bruno Dallapiccola OPBG
C. MacGregor LYNKEUS
Edwin Morley-Fletcher LYNKEUS
Abbreviations
DoW Description of Work
MD-PAEDIGREE MD-Paedigree
DOs Dissemination Objects
Table of Contents
1. Rationale
2. Dissemination strategy plan
2.1 Vision and key messages
2.2 The strategy
2.3 MD-Paedigree and the Big Data Challenge
2.4 The target audiences
3. Most significant dissemination activities and achievements in the first year
3.1 MD-Paedigree awarded as Best Exhibit at ICT ’13
3.2 Attended Conferences
3.3 Press coverage
3.4 Next steps
4. Preliminary dissemination materials and channels
4.1 MD-Paedigree Logo
4.2 MD-Paedigree in One Slide
4.3 The MD-Paedigree Kick-Off Meeting Poster and Biannual Meeting Poster
4.3 Newsletter
4.4 Project website
Public website
Private website
4.5 Social Networks
4.6 Project Posters
4.7 ID Card
4.8 Project’s “Business Card”
4.9 Publications
5. Training Strategy
5.1 Introduction
5.2 Training Event (1) in a nutshell: Training clinicians to upload and use the MD-Paedigree Repository
5.2 The organisation of both Training Events can be broken down into 6 General Tasks
5.3 Supporting materials
Index of Figures
Fig. 1 – ICT 2013 Best Exhibit award
Fig. 3 – Screenshot from http://youtu.be/b8qIZS2j2T8
Fig. 4 – The MD-Paedigree Poster for the EHealth Forum Gastein
Fig. 5 – Article about MD-Paedigree published in “La Repubblica”
Fig. 6 – MD-Paedigree News for SIEMENS GLOBALE
Fig. 7 – The UCL Institute of Child Health Entrance
Fig. 8 – One of the MD-Paedigree partners explaining the work performed
Fig. 9 – The Plenary Session of the Internal Review
Fig. 10 – Marco Bonvicini
Fig. 11 – Rolando Cimaz
Fig. 12 – Adam Shortland
Fig. 13 – Alberto Sanna
Fig. 14 – The MD-Paedigree Logo
Fig. 15 – The MD-Paedigree infographic
Fig. 16 – Kick-Off Meeting poster
Fig. 17 – Biannual Meeting poster
Fig. 18 – The MD-Paedigree Newsletter front page
Fig. 19 – The MD-Paedigree Website homepage
Fig. 20 – The MD-Paedigree Twitter account
Fig. 21 – The MD-Paedigree posters (one per disease area)
Fig. 22 – The MD-Paedigree ID Card
Fig. 23 – The MD-Paedigree “Business card”
Fig. 24 – Training event 1 in a nutshell (location still not confirmed)
1. Rationale

This document outlines the preliminary elements of a comprehensive and effective dissemination strategy for MD-Paedigree, drawn up as a four-year plan that will be updated yearly as the project evolves. The dissemination strategy should therefore be considered an evolving strategy: it will be implemented taking into account the most effective means of dissemination for the specific achievements reached during the project, and new activities may be added to the tasks already indicated in this document in order to disseminate the actions and results of MD-Paedigree appropriately. Indeed, dissemination channels that seem promising at present may later prove unattractive or ineffective, while new ones may deserve further exploitation. The first part of the document explains the dissemination strategy, followed by a brief outline of the activities already carried out in the first year of the project. The second part presents the different dissemination materials and channels implemented so far. The third and final part gives a preliminary description of the training activities that will be implemented during the project.
2. Dissemination strategy plan

2.1 Vision and key messages

MD-Paedigree has the overarching ambition to develop a strong brand image, recognised by both the biomedical community and IT experts, so as to become a quality label and a reference for the application of advanced technology in biomedical informatics. The central vision of MD-Paedigree is that the project outputs will result in a lasting and stable community of paediatric patients, parents and institutions working with and around the MD-Paedigree Infostructure, bringing together ever more knowledge, experience, data and technology to provide a clinical decision support system. Alongside this vision is the aim of providing a knowledge-sharing and discovery tool for paediatric research. Furthermore, MD-Paedigree looks toward the subsequent involvement of new subjects, which will make the models implemented in the project more reliable thanks to the quality, depth and volume of clinical data accruing in its federated digital repository, which will grow larger and be expanded to new pathologies and new clinical centres. Communicating these visions is the primary motivation of the MD-Paedigree dissemination strategy.
2.2 The strategy

In line with the vision described above, MD-Paedigree dissemination has been conceived from the very beginning of the project as bidirectional: it aims to disseminate the project’s results while at the same time bringing into the platform new knowledge, data and feedback from external organisations and experts, also in order to ensure greater compatibility of the project’s outcomes with other projects and existing technology.
To reach the end users (clinicians and patients) most effectively, part of the MD-Paedigree dissemination has focused on the clinical use cases for the four pathology areas initially addressed by the project. These four use cases were prepared by the clinical experts involved in the project. Finally, to ensure the success of the dissemination activities, every partner has at least one person-month of involvement in WP18.
2.3 MD-Paedigree and the Big Data Challenge

Furthermore, to reinforce the vision of MD-Paedigree (and of its future follow-up), it has been deemed strategic to position MD-Paedigree clearly within the Big Data challenge, and in particular within Big Data analytics applied to healthcare. Big data in healthcare is rapidly becoming a very dynamic field, attracting great interest from industry, research institutions and public bodies. Positioning MD-Paedigree as an early player in this growing field can therefore represent a strategic, long-term investment, capable of providing a great return both for the dissemination of the project’s results and for a more effective exploitation of its outcomes. To reach this strategic goal, a specific effort was made to organise a dedicated session, and to prepare a specific paper on this issue, at ICT 2013 in Vilnius (see Section 3.1 for more information).
2.4 The target audiences

MD-Paedigree’s dissemination strategy builds on the dissemination experience of previous EC projects (Health-e-Child and Sim-e-Child in particular) and will be further developed to achieve the best dissemination outcomes for different target populations. As already stated in the DoW, MD-Paedigree’s outcomes will have an impact on a variety of work and research communities, ranging from clinicians and caregivers to biomedical researchers, European industry and, finally, the general public at large. The dissemination strategy will therefore be implemented so as to reach each of these communities in the most effective way, adopting different languages and communication means as appropriate, tailoring dissemination messages to each audience, and organising specific dissemination activities and channels for the two main audience domains: the specialised audience and the general, non-expert public.

The specialised audience

For the specialised audience (researchers, clinicians, etc.), the main means of dissemination will be the publication of the most significant results achieved during the project. A (not yet exhaustive) list of relevant journals is provided in Section 4.9. Furthermore, MD-Paedigree’s clinical and technical partners will attend relevant conferences in the project’s field of interest (and any of its sub-domains) to disseminate the work performed in the project and its main results. A preliminary list of events in which the MD-Paedigree consortium is likely to participate is provided in Section 3.4. MD-Paedigree will also organise specific workshops, exhibitions and dedicated dissemination materials to make attendance at external events as effective as possible.
Moreover, for the specialised audience, training sessions will be organised (please refer to Section 5 for details). These will highlight the outcomes achieved both in disease modelling and in the building of the Infostructure, as well as the potential for change management and innovation in clinical workflows, to the medical/clinical and research communities interested in VPH technology.

The general public

For the general public, the work will be disseminated in everyday language rather than in academic or industrial language. One of the main means of dissemination is the project website (www.md-paedigree.eu), which has been online since the very beginning of the project and will be constantly updated with new materials over its course. The Project Newsletter will be another main tool for communicating the project’s main achievements in lay terms, providing information about project activities and attendance at relevant events, and offering the reader a focus on a specific part of the project, together with real use cases that translate the project results into everyday life experience (i.e. case studies in clinical practice). The ID Card and the press releases have been conceived with the same rationale. Finally, as soon as the project begins to achieve significant results, production will start of the so-called “Dissemination Objects” (DOs), to be made available on the project website and on the project YouTube channel (still to be created). Typically, DOs will be 3-5 minute multimedia clips built from consortium presentations, talks given by consortium members and specially recorded video clips presenting the highlights of the consortium’s work.
3. Most significant dissemination activities and achievements in the first year

3.1 MD-Paedigree awarded as Best Exhibit at ICT ’13
ICT2013 was the 20th and largest event in the ICT series, hosting over 200 exhibits and 5,500 participants from all over the world, and was held for the first time in Eastern Europe (Vilnius, Lithuania, 6-8 November 2013). It was one of the most significant events of Lithuania’s presidency of the Council of the European Union (EU), organised by the Ministry of Transport and Communications and the Ministry of Foreign Affairs together with the European Commission (EC). The most prominent EU ICT researchers, representatives of new and established companies, digital technology strategists, engineers, investors, high-ranking EC and EP officials, ministers and prime ministers from EU member countries, students and journalists attended the event.
At ICT2013, MD-Paedigree hosted a booth giving demonstrations of its innovative technology, conducted a packed networking session entitled “Big data and data analytics impact in healthcare” and distributed a discussion paper on Big Data Healthcare. During the networking session, MD-Paedigree brought together a wide range of stakeholders to discuss the challenges faced in improving Europe’s ability to manage and exploit its healthcare data assets, laying the foundation for a Big Data in Healthcare interest group.
On the 8th November 2013 MD-Paedigree was awarded the Best Exhibit Award at ICT2013.
Fig. 2 The MD-Paedigree Team receiving the award
See also: https://ec.europa.eu/digital-agenda/en/news/meet-winners-best-exhibitors-ict-2013
Fig. 1 – ICT 2013 Best Exhibit award
After the event, a press release was distributed, and the success at ICT2013 was also reported in the MD-Paedigree First Newsletter.
The EC also prepared an interview with the MD-Paedigree Project Manager, Edwin Morley-Fletcher (who is also in charge of the dissemination activities within the project), to explain the project and its link with the Big Data Challenge.
The video of the interview was released through the EC YouTube channel and was also shared on Twitter by the official EU-eHealth account.
Edwin Morley-Fletcher, “Big Data for health” – MD-Paedigree Project
Fig 3. – Screenshot from http://youtu.be/b8qIZS2j2T8
Furthermore, the paper Big Data Healthcare was distributed by the EC on the dedicated ICT2013 pages of the Digital Agenda website.
3.2 Attended Conferences

In its first year of activity, MD-Paedigree took part (with members of the consortium or with dissemination materials) in a series of events:
1. The 16th European Health Forum in Gastein, Austria. For this event, the MD-Paedigree consortium was asked to produce an A0 poster explaining the project.
Fig. 4 – The MD-Paedigree Poster for the EHealth Forum Gastein
2. The 22nd Annual Meeting of the European Society for Movement Analysis in Adults and Children (ESMAC2013) in Glasgow, Scotland. Several people involved in the project, and more specifically in the NND study, participated in this meeting. MD-Paedigree was also represented through a poster specifically dedicated to the NND part of the project.
3. The Pediatrics 2040 Conference in Anaheim, California: at this large event, hosted by the Children’s Hospital of Orange County (CHOC), MD-Paedigree was given a time slot for a presentation.
4. Big Data Challenges in the health sector, CTB – NESSI workshop, 11 February 2014, Madrid. During this workshop, also attended by Mr. Wolfgang Treinen, Policy and Project Officer of the European Commission (DG CONNECT), MD-Paedigree was presented; the IBM representative responded by suggesting possible cooperation with their Watson supercomputer, now being tested at the Sloan Kettering Hospital in New York.
3.3 Press coverage

MD-Paedigree appeared in several headlines and articles in national and international newspapers, websites and blogs. Some of the most prominent articles follow:
1 – La Repubblica (Newspaper, Italy) – 14-05-2013
“A Roma già si studia il baby cuore dilatato” (“In Rome, the dilated baby heart is already being studied”)
Italy, too, is among the protagonists of research on the “avatar patient”, at least in its child version. The Bambino Gesù Paediatric Hospital in Rome will coordinate MD-Paedigree, a four-year project funded with 15 million euros. It is the first time in the world that the coordination of an Information and Communication Technology (ICT) project has been entrusted to a paediatric hospital. “The most difficult thing for a doctor,” admits Giacomo Pongiglione, director of the Medical-Surgical Department of Paediatric Cardiology at Bambino Gesù and clinical coordinator of the MD-Paedigree project, “is deciding on the best treatment for the patient. All the more so when the patient is a child. At the moment we rely on epidemiology, that is, on the responses to treatments in groups of people more or less similar to our patient. Epidemiology, however, is a statistical guide: it does not allow us to predict whether a given treatment will work in our patient.” But other avenues are emerging, such as that of the avatar patient. “Using the patient’s data,” explains Pongiglione, “we can personalise the model of their pathology and then test treatments on the model before testing them on the real patient.” The patient’s hologram thus comes to life in the computer. In the case of the heart, for example, one can build the model of a dilated heart, of a heart that does not contract, of a valve that does not work, of a tetralogy of Fallot. All of this is described by a mathematical model, which becomes more precise the more information is fed into it. “This is the third EU-funded project on this topic in which we have taken part (the two previous ones were Health-e-Child, from 2005 to 2009, which made it possible to create the model of the right ventricle of the heart, and Sim-e-Child, from 2010 to 2012, which produced the model of the aorta). This time we are the project leaders (Bruno Dallapiccola, scientific director of Bambino Gesù, is the overall coordinator of MD-Paedigree), and in this new phase the project will work on modelling six paediatric pathologies: cardiomyopathies, cardiovascular disease risk in obese children, juvenile idiopathic arthritis, type 3 spinal muscular atrophy, hemiplegic cerebral palsy and Duchenne muscular dystrophy. The other objective of MD-Paedigree is the creation of ‘infostructures’ (HES-SO, the Haute Ecole Spécialisée de Suisse Occidentale, will take care of this), which will lead to a worldwide database of paediatric pathologies.”
Fig. 5 – Article about MD-Paedigree published in “La Repubblica”
2 – Digital Agenda for Europe (Website, EU) – 27.05.2013
“EU awards 12 million euros to personalise a healthier future for Europe’s children”
To help fight childhood obesity and other child diseases, the European Commission has awarded 12
million euros to a medical research project that will use mathematical models to improve the
treatment of children.
According to recent data from the World Health Organisation (WHO), the number of overweight
infants and children in Europe has steadily risen from 1990 to 2008. Childhood obesity is strongly
linked to risk factors for cardiovascular disease, type 2 diabetes, orthopaedic problems, mental
disorders, underachievement in school and lower self-esteem. To tackle this problem, the European
Commission has decided to fund the MD-Paedigree project. This project will provide decision
support to medical professionals when treating their young patients in four areas:
cardiomyopathies, obesity-related cardiovascular disease, juvenile idiopathic arthritis and
neurological & neuromuscular diseases.

Disease simulation to predict treatment outcome

Through disease simulations, requiring the availability of high-performance and supercomputer resources, MD-Paedigree will improve the diagnostic precision of paediatricians and offer child-specific treatment choices for Europe’s children. Using MD-Paedigree, doctors will be able to select highly personalised treatment options and receive on-the-spot support in predicting the likely outcome of such treatments based on each patient’s personal medical data. This will lead us into a future where child healthcare will become more effective, more personalised, and even more affordable.

Minimise animal testing

On top of that, there are strong ethical reasons to pursue
this revolutionary research: in the future, new drugs and procedures will be preliminarily tested
using computer simulations. In line with this vision, MD-Paedigree will open up the prospect of
exploring new treatments without or with only minimal animal testing. In the longer run, it is to be
expected that even first pre-clinical and phase one clinical trials may be substantially supported and
accelerated, thereby massively reducing the risk to patients involved in such endeavours. Presently,
very few new drugs are specifically tested for children – and here MD-Paedigree aims to open up a
whole new perspective for paediatric research. MD-Paedigree, which stands for Model-Driven
European Paediatric Digital Repository (www.md-paedigree.eu) is due to last 4 years, finishing in
2017. The project consortium of 21 partners from 10 countries (Belgium, France, Germany, Greece,
Italy, Romania, Switzerland, Netherlands, United Kingdom, USA) is coordinated by the Bambino
Gesù Hospital in Rome and includes five other clinical centres from across the EU (Amsterdam
University Hospital, Catholic University Hospital in Leuven, Giannina Gaslini Institute in Genoa,
Great Ormond Street Hospital in London, Wilhelmina Children’s Hospital in Utrecht). The 6 clinical
centres are scientifically and technologically supported by Siemens and other industrial partners,
research institutions and SMEs. MD-Paedigree follows two previous highly successful projects,
Health-e-Child (www.health-e-child.org) and Sim-e-Child (www.sim-e-child.org) making it possible
to apply innovative diagnostic tools to routine clinical data. It is meant to provide three
fundamental functionalities on top of those of an advanced Electronic Health Records registry:
1. Similarity search, allowing clinicians to access “patients like mine” (and finding decision support for optimal treatment also based on comparative outcome analysis), and allowing patients to get in touch with “patients exactly like me”
2. Model-based patient-specific simulation and prediction
3. Patient-specific clinical workflows.
Through these tools, MD-Paedigree cognitively updates the European ideal of granting universal access to clinical and patient-enhanced awareness of evidence-based best treatment.
3 – ICT2013 (Website, EU) – 08.11.2013
MD-Paedigree (Model-Driven Paediatric European Digital Repository) wins “ICT2013 Best Exhibit Award”

On 8th November 2013 MD-Paedigree (Model-Driven Paediatric European Digital Repository) was awarded the Best Exhibit Award at ICT2013, an event hosting over 200 exhibits and 5500 participants from all over the world (Vilnius, Lithuania, 6-8 November 2013). ICT2013 was the 20th and largest event in the ICT series and was hosted for the first time in Eastern Europe. It was one of the most significant events of Lithuania’s presidency of the European Union (EU) Council, organised by the Ministry of Transport and Communications and the Ministry of Foreign Affairs together with the European Commission (EC). The most prominent EU ICT researchers, representatives of new and established companies, strategists in digital technology, engineers, investors, high-ranking EC and EP officials, ministers and PMs from EU member countries, students and journalists were present at the event.

MD-Paedigree is a clinically-led ICT project aiming at developing a model-driven data- and workflow-based digital repository, leveraging an information processing and knowledge discovery evolutionary framework, for personalised and predictive treatment in paediatrics. It makes it possible to apply Big Data analytics to routine paediatric clinical data, bringing together all information and knowledge in a private Big Data Cloud which forms the basis of a Smart Analytics service meant to provide three fundamental functionalities: 1) search for clinicians to access “patients like mine” and for patients to find “patients exactly like me”; 2) model-based patient-specific simulation and prediction; 3) patient-specific clinical workflows.
At ICT2013, MD-Paedigree hosted a booth giving demonstrations of its innovative technology, conducted a packed networking session entitled Big data and data analytics impact in healthcare and distributed a discussion paper on Big Data Healthcare. During the networking session, MD-Paedigree brought together a wide range of stakeholders to come together and discuss the challenges faced in improving Europe’s ability to manage and exploit its healthcare data assets laying the foundation for a Big Data in Healthcare interest group. Due to the rapid growth in the availability of data, Big Data technologies define a market projected to grow annually by 40% over the coming years. However, only a small number of European companies appear in the most recent list of largest companies by Big Data revenue.
In modelling environments for predictive, 13ptimize13l13es13 healthcare, personalised solutions are key to improving patient safety and treatment efficacy. The focus of MD-Paedigree is the use of large bases of patient and health system data to drive healthcare innovation. Bringing together stakeholders from Big Data, data analytics and biomedical research, this project gathers emerging needs and service scenarios for storage, sharing, collaboration, multimodal similarity search, outcome analysis, risk stratification, 4P Medicine (Preventive, Personalized, Predictive and Participatory) and patient-specific clinical decision support.
"I am confident that this ground-breaking research in MD-Paedigree will open up a new perspective for paediatric research and lead to a future where child healthcare will be more effective, more personalised, and even more affordable," concludes Bruno Dallapiccola, MD-Paedigree Project Coordinator and Scientific Director of the Ospedale Pediatrico Bambino Gesù (Rome, Italy).
4 – Le Monde (Newspaper, France) – 20.01.2014
"Vers un big data européen de la santé" ("Towards a European big data for health")
Several initiatives supported by the European Commission rely on cloud computing and big data technologies to make the data that different hospitals hold on complex pathologies accessible, compatible and comparable. Explanations from David Manset, head of the company Gnúbila, which is behind the virtualised platforms.
The Sim-e-Child project puts the cloud at the service of paediatrics. It is a programme launched by the European Commission in January 2010, resting on a transatlantic platform, developed by Gnubila, which links 5 European and American hospitals.
This virtualised platform allows healthcare actors to validate new simulation models without hospitals having to install heavy and costly equipment. In particular, the cloud platform frees users from the constraints of hospital information systems and makes the data on complex cardiac pathologies compatible and comparable.
In time, this platform could become a promising decision-support tool. Thanks to it, cardiologists can already consult an enormous database at any moment, cross-reference the data, and obtain statistical renderings and other results of advanced data mining, making it easier to diagnose patients, consult reference cases, or identify similar cases across borders.
The Sim-e-Child project was developed for paediatric cardiology, and more particularly for cases of coarctation of the aorta (a narrowing of the aorta) in young patients. However, researchers are currently working on adapting this model to other pathologies (cardiovascular diseases, in particular those linked to obesity, rheumatological diseases, and neuromuscular and neurological diseases in children) through the MD-Paedigree project. Co-financed by the European Commission to the amount of 11.8 million euros, it brings together a total of 22 partners, including 7 hospitals in Italy, Belgium, the United Kingdom, the Netherlands and the United States.
The patient at the heart of the system
The ambition here is to develop a genuine European big data for health in the service of cross-disciplinary research. The objective is also to create a medical social network, allowing patients to (re)identify themselves in order to access their data and to define the associated access and exploitation rights, with a notion of "data donation" for science and of "toll/remuneration" for patients in the context of voluntary pharmaceutical studies.
This new platform paradigm places the patient at the heart of the system and lets them regain control over their information, whether in terms of data portability, the right to be forgotten, or the monetisation of access, thereby addressing part of the recommendations of the new European General Data Protection Regulation (GDPR), which aims to harmonise practices across the European Union.
5 – Euronews ‘Futuris’ (website, News – EU) – 23/09/2013
“Visions from the Heart”
In this video dedicated to the EC project Sim-e-Child (which ended in July 2012), MD-Paedigree is
quoted at the end: "Researchers are now adapting the same platform for use with
other paediatric health issues, including rheumatism, neurological and neuromuscular diseases, and
obesity… for the sake of other children all around Europe".
The video has been published both on the Euronews website
(http://it.euronews.com/2013/09/23/passi-avanti-nello-studio-delle-patologie-cardiache-infantili/)
and on the Euronews YouTube channel (https://www.youtube.com/watch?v=4d1nB9ZC910)
6 – SIEMENS GLOBALE NEWS
The Siemens group involved in the project has prepared a news article for the internal SIEMENS Globale News.
Figure 6 – MD-Paedigree news for SIEMENS GLOBALE News
7 – Other press coverage:
Garr News (website, health – Italy): La ricerca medica italiana premiata con il Best Exhibit Award di
ICT 2013
Peds 2040: Choc Children (website, health – Eng.): "MD-PAEDIGREE: the first EU-funded big data
project in paediatrics" (paper)
Wiki of future (wiki): Model-Driven European Paediatric Digital Repository
Techeconomy (website, technology – Italy): #ICT2013EU: big data, mobilità sostenibile e salute al
centro della manifestazione
Diplo News (website, News – Eng): ICT2013 has been the EU Presidency most visited event in
Lithuania
MedGIFT – Content-Based Medical Image Retrieval (website, health – Switzerland): Projects "MD-Paedigree"
HD TV one (Website, Tv – Italy): ICT 2013: Premiato a Vilnius (Lituania) l’Avatar pediatrico del
Bambino Gesù
HealthManagement (website, health – Eng.): EU Awards 12 million Euros to Supercompute a
Healthier Future for Europe’s Children
Romaonline (website, news – Italy): Scheda di approfondimento Progetto MD-PAEDEGREE
Universo Mamma (website, health – Italy): L’ospedale Bambin Gesù premiato per l’avatar
pediatrico
Superabile (website, health – Italy): ICT, PREMIATO L'OSPEDALE BAMBINO GESÙ PER "L'AVATAR
PEDIATRICO"
Insigneo ‘Institute for InSilico Medicine’ (website, health/technology – Eng.): INSIGNEO Research
Projects – Infrastructure & Frameworks
ASCA ‘Agenzia Stampa Quotidiana Nazionale’ (website, news – Italy): Salute: premiato l’avatar
pediatrico dell’ospedale Bambino Gesu’
Haute école spécialisée de Suisse occidentale (website, press release, Switzerland): Un projet HES-SO reconnu par l'Europe à Vilnius.
Radio Rottu Oberwallis, (website, press release, Switzerland): Vilnius/Siders: Internationale
Anerkennung für Hes-so.
1985.ch, (website, press release, Switzerland), Wirtschaftsinformatiker der HES-SO Wallis
ausgezeichnet.
VPH Institute, (website), New VPH project kicks off: MD-Paedigree.
Furthermore, Cloudsource magazine (SUCRE) has requested two articles based on MD-Paedigree from Athena and Gnubila for a special edition on 'Cloud Computing in the Healthcare Industry'.
8 – The Internal Review as a means of dissemination
On 20-22 February, the Internal Review of MD-Paedigree was held at the Institute of Child Health in London, in the presence of independent reviewers for each disease area addressed by the project.
Fig. 7 – The UCL Institute of Child Health Entrance
This meeting also served as an internal dissemination tool among the partners of the Consortium itself, and it has been useful for raising awareness of the overall progress of the project and of the main results achieved.
Fig.8 – One of the MD-Paedigree partners explaining the work performed
Furthermore, a press release on the outcomes of the Internal Review was published on the project website.
MD-Paedigree has made good progress in this
first year. In any project of such size, complexity,
and encompassing such a diverse range of
activities, there will be all manner of hurdles.
Nevertheless, we have gathered momentum
over year one and we are now truly hitting our
stride in time to tackle the middle straight of the
project and the great challenges it contains. If
we, as a project consortium, maintain the levels
of engagement and motivation shown so far we
will be well set to meet these challenges.
At the core of the project is the data repository and Infostructure, the platform on which the rest of the project stands. The progress made on the first prototype integration was able to show the power and potential of the MD-Paedigree vision, enabling us to win the Best Exhibit Award at ICT'13. Over the next twelve months we will see more clinical centre nodes, more functionality and more data types come online, and so the Infostructure will really begin to take on its role as the backbone of the project.
Fig.9 – The Plenary Session of the Internal Review
The specific activities in the clinical applications were started in close collaboration between
clinical and technical partners. Although some data collection took some time to begin in
earnest, there were subsequent mitigation efforts in finding alternatives, e.g. giving access
to legacy data to allow the technical activities to progress according to plan. Nevertheless,
the risks are understood and particular focus will be dedicated to data acquisition in the
second project year.
Since the clinical involvement implies a number of diagnostic modalities encompassing different clinical specialities, well-specified practical workflows are crucial for patient recruitment and effective data collection. Even though DHZB joined the project at a later point, such workflow preparation was achieved in close alignment with the other centres, and patient groups could successfully be identified. Cardiac MRI protocols have been adapted at DHZB, and lab infrastructure was agreed upon to optimize the pre-analytics for all clinical sites.
A continued open communication within and between work packages will be a key driver in
leveraging the effect of scale of this large project. In addition to the scientific and clinical
achievements, the enthusiasm and engagement of the consortium members are the best
prerequisite to make this project successful at international level.
As a demonstration of our commitment to transparency between collaborators, we plan to establish a publication mechanism that will include the MD-Paedigree consortium in the author list of every publication. Furthermore, every publication will be sent to the Coordinator for formal approval prior to submission, to ensure appropriateness of authorship and maintain high publication standards.
As a further extension of our focus on clinical leadership as a driver for healthcare innovation, we will be conducting a data privacy and risk stratification assessment founded upon clinical need, and will use this to inform our engagement with policy makers. In this way we hope to drive the privacy agenda so that Big Data can be applied more effectively, and securely, for the benefit of the healthcare of European citizens.
The four reviewers:
Fig.10 - Marco Bonvicini Fig. 11 - Rolando Cimaz Fig.12 - Adam Shortland Fig. 13 - Alberto Sanna
3.4 Next steps
The work performed during the first year will continue on a similar track, attending the various VPH Community meetings and other relevant events. Furthermore, in an approach which proved fruitful for Health-e-Child and Sim-e-Child, MD-Paedigree will seek to attend conferences in conjunction with similar projects, thereby maximising the opportunities to increase the frequency of conference attendance with booths or other dissemination means.
As already done during the first year, apart from booth-based conference attendance, MD-Paedigree will also seek to participate in events with "light" dissemination materials such as posters and project brochures.
Furthermore, as already stated in the DoW, the MD-Paedigree Consortium will also seek to engage Parent and Patient Associations, disseminating news of its work, expected results and potential future developments through these channels.
Finally, MD-Paedigree will organise a final conference, to be held at the end of the project,
targeting both internal and external clinical and research communities as well as patient
organisations and the interested media.
For the following months, MD-Paedigree’s Consortium is planning to attend a number of relevant
events.
The most important are:
1. The 2nd International Conference on Research Infrastructures (ICRI 2014), which will be held in Athens on 2-4 April. MD-Paedigree will apply for an exhibit and will attend with dissemination materials.
2. The IEEE-EMBS International Conferences on Biomedical and Health Informatics (BHI), which will
be held in Valencia on 1-4 June 2014. For this event, MD-Paedigree already submitted a proposal for
a special session on Big Data in Healthcare, which has been accepted.
3. The European Scientific Society for Clinical Gait and Movement Analysis Conference (ESMAC),
which will be held in Rome on 1-4 October 2014.
Besides these three events, a detailed list of the events which MD-Paedigree's partners are likely to attend during 2014 is the following:
Meeting Date | Name | Location | Full Conference Name
March 3-4, 2014 | NTRC2014 | London - UK | 7th Annual Neuromuscular Translational Research Conference (UCL)
March 3-6, 2014 | HEALTHINF 2014 | Eseo, Angers, Loire Valley - France | 7th International Conference on Health Informatics
March 5-8, 2014 | RE(ACT) | Basel - Switzerland | International Congress on Research of Rare Diseases
March 8-9, 2014 | SMA | Den Haag - Netherlands | SMA Europe Annual Meeting
March 19-20, 2014 | EDF 2014 | Athens - Greece | European Data Forum 2014
March 27-28, 2014 | IIHC 2014 | Berlin - Germany | 2014 Innovations & Investments in Healthcare
April 2-4, 2014 | WoHIT | Nice - France | World of Health IT 2014
April 1-4, 2014 | ICRI 2014 | Athens - Greece | 2nd International Conference on Research Infrastructure
April 9-11, 2014 | Med-e-Tel 2014 | Luxembourg | The International eHealth, Telemedicine and Health ICT forum for education, networking and business
April 28 - May 2, 2014 | ISBI 2014 | Beijing - China | 2014 IEEE International Symposium on Biomedical Imaging
April 29 - May 1, 2014 | Bio-IT 2014 | Boston, MA - USA | 2014 Bio-IT World Conference
May --, 2014 | eHealth 2014 | Athens - Greece | eHealth Forum 2014
May 6-8, 2014 | ConhIT 2014 | Berlin - Germany | Connecting Healthcare IT
May 8-10, 2014 | ECRD 2014 | Berlin - Germany | European Conference on Rare Diseases and Orphan Products
May 10-16, 2014 | ISMRM | Milan - Italy | International Society for Magnetic Resonance in Medicine
May 21-24, 2014 | AEPC | Helsinki - Finland | Association for European Paediatric and Congenital Cardiology
May 28-31, 2014 | ECO2014 | Sofia - Bulgaria | 21st European Congress on Obesity
June 1-4, 2014 | BHI | Valencia - Spain | IEEE-EMBS International Conferences on Biomedical and Health Informatics (BHI)
June 7-10, 2014 | PICS & AICS 2014 | Chicago, IL - USA | 2014 Pediatric & Adult Interventional Cardiac Symposium
June 17-18, 2014 | GHIT 2014 | Washington, DC - USA | Government Health IT Conference 2014
June 22-25, 2014 | ISMRM Workshop | Charleston, SC - USA | ISMRM Workshop on Functional MRI: Emerging Techniques & New Interpretations
July 3-5, 2014 | EACD 2014 | Vienna - Austria | 26th Annual Meeting of the European Academy of Childhood Disability
July 11-14, 2014 | ISMRM Workshop | Tromsø - Norway | ISMRM Workshop on Motion Correction in MRI
September 9-12, 2014 | VPH 2014 | Trondheim - Norway | Virtual Physiological Human 2014
September 14-18, 2014 | MICCAI 2014 | Boston (MIT), MA - USA | Medical Image Computing and Computer Assisted Intervention
October 1-3, 2014 | EHFG | Gastein - Austria | 17th European Health Forum Gastein
October 1-4, 2014 | ESMAC 2014 | Rome - Italy | European Scientific Society for Clinical Gait and Movement Analysis
October 27-30, 2014 | ICIP 2014 | Paris - France | 2014 IEEE International Conference on Image Processing
November 30, 2014 | RSNA 2014 | Chicago, IL - USA | Radiological Society of North America
4. Preliminary dissemination materials and channels
During the first year, a number of dissemination materials have been produced. Since the Project is still at an early stage, the materials produced are rather generic.
First of all, we produced the branding of the project (logo, template, graphics), then the public website, the ID Card and the Newsletter. Furthermore, a number of posters, each focused on one disease area of the project, have been produced. Finally, a first channel of communication besides the website has been opened, namely the MD-Paedigree Twitter account.
The MD-Paedigree Consortium expects to multiply the dissemination materials (both specific and
generic) and channels during its course, as the number of significant achievements starts to
increase.
In the following sections, details are provided for each of the dissemination materials produced so far.
4.1 MD-Paedigree Logo
The MD-Paedigree logo was presented to and agreed by the project partners at the project kick-off meeting.
4.2 MD-Paedigree in One Slide
A poster summarising the MD-Paedigree effort has been produced and is shown in the following figure:
Fig.14 – The MD-Paedigree Logo
Fig.15 – The MD-Paedigree infographic
4.3 The MD-Paedigree Kick-Off Meeting Poster and Biannual Meeting Poster
Fig.16 – Kick-Off Meeting poster
Fig.17 – Biannual Meeting poster
4.3 Newsletter
The first Newsletter was produced after 6 months of project activity. It was divided into various sections: an editorial interview with Project Coordinator Bruno Dallapiccola; a Project Overview; Highlights & Objectives; a section dedicated to the clinical applications of MDP in the various disease areas (Cardiomyopathies, with Gabriele Rinelli; Obesity-Related Cardiovascular Disease, with Andrew Taylor; Juvenile Idiopathic Arthritis, with Alberto Martini; and Neurological and Neuromuscular Diseases, with Jaap Harlaar); and a section dedicated to the Infostructure Challenges. The latest news of the project's achievements was presented, and the final section, 'Special Feature', included a position paper on 'Big Data Healthcare' (an overview of the challenges in data-intensive healthcare) by Edwin Morley-Fletcher.
As foreseen in the DoW, the newsletter has been published on the Project Website and distributed through other channels on the web.
Fig.18 – The MD-Paedigree Newsletter Frontpage
4.4 Project website
Public website
Fig.19 – The MD-Paedigree Website Homepage
The MD-Paedigree community is supported by a dedicated website, linked to all partners' websites and to all the other VPH and infrastructure projects' websites of interest to MD-Paedigree. The website has been designed to raise awareness and to promote content, discussion and suggestions. The project's website provides access to:
A public area, providing promotional information on public results as well as news, public deliverables, articles and material from participation at events;
A link to the EMDesk collaboration platform (see below);
A link to the MD-Paedigree Infostructure;
A link to the MD-Paedigree Twitter feed;
A link to the EC official RSS feed.
Updates to the website are reviewed on a regular basis, with new material being posted as it becomes available.
Private website
The MD-Paedigree Consortium adopted the EMDESK tool: EMDESK is a web-based collaboration and project management platform developed especially for European research projects. An integrated and secure tool, EMDESK supports the entire project life-cycle.
4.5 Social Networks
Since its inception, MD-Paedigree has been active on a number of social network platforms, from Twitter to YouTube to LinkedIn.
Fig.20 – The MD-Paedigree Twitter Account
4.6 Project Posters
Posters have been produced and displayed at events in which MD-Paedigree participated.
Fig.21 – The MD-Paedigree posters (one per disease area)
4.7 ID Card
An ID Card of the project has been produced, as shown in the following figure:
4.8 Project's "Business Card"
The MD-Paedigree Business Card has been conceived as a very light, general dissemination tool intended for wide distribution at events. It contains only the essential information about the project: logo, website and Twitter account.
Fig.22 – The MD-Paedigree ID Card
Fig.23 – The MD-Paedigree “Business card”
4.9 Publications
Publications up to February 2014
1. E. Morley-Fletcher, Big Data Healthcare, paper attached to the article "Big data: What is it and why is it important?", published on the EC Digital Agenda website (http://ec.europa.eu/digital-agenda/en/news/big-data-what-it-and-why-it-important).
2. E. Morley-Fletcher & B. Bressan, Innovation and Big Data, concluding chapter in the forthcoming Wiley book for the 60th anniversary of CERN, From Physics to daily life: Applications in Biology, Medicine and Healthcare: How the technology and knowledge transfers from fundamental research changed industrial process, human behaviour and society at large, edited by Beatrice Bressan.
Both these publications are attached in the Appendix.
Scientific journal and magazines for next publications
The following scientific journals have been identified as the most appropriate venues in which to publish the work performed within MD-Paedigree:
Computer Magazine, IEEE Press
Information Processing in Medical Imaging (IPMI), Springer
International Journal of Multiscale Computational Engineering, Begell House Publications
The Fraunhofer IESE Series on Software and Systems Engineering, Fraunhofer
Medical Image Analysis, MICCAI Society - Elsevier
IEEE Transactions on Biomedical Engineering, EMB | PubMed
IEEE Transactions on Medical Imaging, IEEE Publications
Biomechanics and Modeling in Mechanobiology, Springer
American Journal of Physiology, APS Publications
Annals of Biomedical Engineering, Springer
Journal of Applied Mechanics, ASME Journals
Interface Focus, Royal Society Publications
Journal of Cardiovascular Magnetic Resonance, SCMR Publications
Circulation, American Heart Association Publications
European Journal of Medical Research, BioMed Publications
Journal of Magnetic Resonance Imaging (ISMRM), Wiley-Blackwell
Journal of Medical Imaging, SPIE
Journal of the American Medical Informatics Association (JAMIA), BMJ Publishing Group
Studies in Health Technologies and Informatics, IOS Press
Rheumatology, Oxford Journals
Journal of Biomechanics, Elsevier
Arthritis & Rheumatology, Wiley-Blackwell
Gait & Posture, Elsevier
Information Processing & Management, Elsevier
Paediatric Critical Care Medicine, Lippincott Williams & Wilkins
IEEE Journal of Biomedical and Health Informatics (retitled from IEEE Transactions on Information Technology in Biomedicine - T-ITB), IEEE
Clinical Biomechanics, Elsevier
Medical & Biological Engineering & Computing, Springer
Developmental Medicine & Child Neurology, Wiley-Blackwell
Nature Genetics, Nature Publishing Group (npg)
Paediatric Radiology, Springer
Methods of Information in Medicine, Schattauer
Nature Reviews Neurology, Nature Publishing Group (npg)
Transactions on Database Systems (TODS), ACM
Very Large Databases (VLDB) Journal, Springer
EuroIntervention, EuroPCR | EAPCI, European Society of Cardiology
American Journal of Hematology, Wiley-Blackwell
Diagnostic Microbiology and Infectious Disease, Elsevier
Transactions on Case-Based Reasoning, IBAI Publishing
Proteomics - Clinical Applications, Wiley-Blackwell
5. Training Strategy
The training strategy of MD-Paedigree is embedded in a larger dissemination strategy that considers not only scientific dissemination but also a clear reach towards its target audience in terms of the concrete outputs of the project (i.e. the MD-Paedigree platform). This is also a very important part of the MD-Paedigree sustainability strategy.
5.1 Introduction
The training activities correspond to task 18.3 of the workplan. As mentioned in the DoW, anecdotal evidence has confirmed (via WP4 of the VPH NoE and via feedback from the DISCIPULUS ('Roadmap Towards the Digital Patient') meeting, 30/03/2012, Barcelona) that training is recognised as one of the most solid and long-lasting dissemination strategies available. This was also highlighted in the paper "Virtual Physiological Human: Training Challenges"1. The training activities within MD-Paedigree will consist of 2 'hands-on' workshops to be delivered during years 2 and 4 of the project (at an interval of approximately 1 to 1.5 years) in order to expose the outcomes achieved, both in disease modelling and in building the Infostructure, highlighting the potential for change management and innovation in clinical workflows to the medical/clinical and research community interested in VPH technology. The first workshop will also seek to provide feedback to the research and development activities, so as to refine the outcomes for the final workshop. The workshop participants will fill in a detailed feedback questionnaire that will be passed to the developers. The level of detail in this training strategy is intended to give sufficient guidance as to what the training will entail, what the target audience is, the format that will be used, and the expected (and desired) outcomes.
5.2 Training Event (1) in a nutshell: training clinicians to upload and use the MD-Paedigree Repository
The format of the training is based on a 'hands-on' workshop in which participants (mainly clinicians) will learn how to use the MD-Paedigree platform and, if they so desire, may bring a clinical case of their own (with prior notification to the MD-Paedigree team). Although the first training event will be on a "first-come, first-served" basis, the MD-Paedigree Consortium will also identify key clinicians to participate. Key technical participants:
1 facilitator from a clinical background involved in the development of the platform
1 facilitator from a technical background involved in the development of the platform
An initial group of around 20 individuals will be split into 4 sessions (one in the morning, one in the afternoon, over 2 days in total), where each application will be shown individually (risk of obesity in children, cardiomyopathies, juvenile idiopathic arthritis, neuromuscular and neurological disorders).
Typical example of 1 session (4 hours):
Introduction of the tool (1 hour)
Test case studies (including cases brought by participants*)
Exploration of the tool by individual participants (1.5 hours)
1 http://www.ncbi.nlm.nih.gov/pubmed/20478909
Real case presentation of use in a hospital setting – discussion about implementations and obstacles (45 minutes)
Feedback and discussion (45 minutes)
The organisation of a training session is complex and time consuming; if the training is to be a success, a team effort is needed. The following plan is a suggestion based on experience and lessons learned in organising different training events in the past. Although 2 training events are planned, a general plan applicable to both is provided below; however, for the purposes of this deliverable we will use Training Event (1) as an exemplar. This is because Training Event (2) is (a) still too far away in the future and (b) its concrete implementation will depend on the success of, and lessons learned in, Training Event (1). A summary of Training Event (1) is shown in Fig. 24.
Fig. 24 - Training event 1 in a nutshell (location still not confirmed)
5.3 The organisation of both Training Events can be broken down into 6 General Tasks
Task I: Set-up, including building the organising team, elaborating a plan for the training organisation, and budget considerations.
Task II: Handling logistics, for completion before the announcement of the training and before the application period starts.
Task III: Tasks to be performed during the dissemination and application period, related mainly to the preparation of the materials before the actual training takes place.
Task IV: Invitations and course preparation tasks.
Task V: Tasks during the training event.
Task VI: Tasks to be carried out after the training event.
These tasks do not necessarily occur in chronological order and are dependent on one another. The estimated duration of each phase and an overview plan are presented in section 2.3.
Task I: Set-up
An organising team is made up of the following roles:
Training organiser: The person/entity who has ultimate responsibility for the training and the organising team and may make decisions on organisation of the training.
Training secretariat: a person who will prepare the materials and coordinate its distribution (together with the WP leader in order to ensure a consistent alignment with the general dissemination strategy). The secretariat is also the main contact point for all the organising team members and for the participants.
Tutors: 2 people with advanced or senior level scientific profile in related fields of the selected case
studies, each with the responsibility to guide their team’s efforts during course preparation (before the training week starts) and over the training event.
Local clinicians to demonstrate a real use of the platform
The only role not requiring scientific knowledge of the training topics is the training secretariat; for the other roles, people with advanced/senior scientific knowledge and expertise should be assigned. The same person may take on several roles, thereby reducing the number of staff involved in the training organisation and execution.
Task II: Logistic tasks
The first training event will take place on the occasion of one of MD-Paedigree's periodic meetings. The preparation of the dissemination material is best started at the same time as the logistics arrangements, so that the application period can be opened as soon as the necessary confirmations are in place. It is often the case that the development of dedicated web pages and online applications, and the design of the dissemination material, takes longer than expected.
One meeting room with reliable and fast internet connection, ethernet cables (as back-up), sufficient power supply, projector, and optionally also a printer.
At least one extra working station with standard Office tools (Word, PowerPoint, etc.)
MD-Paedigree platform available and set up
The Participant Dossier will include information about the available resources provided by the organiser.
Task III: Dissemination and application period
The preparation of materials consists of developing the material necessary to disseminate the training information, and the application form. If everything goes smoothly, the development of the dissemination material should not take more than two months. Spending some time at the beginning of the planning thinking about the visual image of the Training Event is a good idea, since that will help with the
D18.1 - Dissemination and training strategy plan and preliminary materials
MD-Paedigree - FP7-ICT-2011-9 (600932)
32
preparation of all the relevant material and gives a uniform look to it. After the material is ready and the application period starts, the workload reduces significantly, and increases again after the deadline for applications has closed. Dissemination materials
Announcement of the training: The training event announcement should be supported by a website (or a dedicated section of a website) and by electronic material. Print material is not strictly necessary, but can be useful, for example, for dissemination at scientific conferences or meetings.
Web: Introduction of the Training Event: Complete, up-to-date information about the training event, key participants (if any) and the application process should be available online. For this purpose, a dedicated section will be built on the MD-Paedigree website. Dissemination can be reinforced by the use of banners, relevant mailing lists, and social media such as Twitter, Facebook and LinkedIn (this task will be handled by the project dissemination team). The summary should include:
o A short description of the training event (i.e. what is it?)
o The objectives of the training
o Definition of the target audience: to whom the training is aimed and who can apply (including any restrictions on applying)
o Information about important dates and deadlines
o Instructions on how to apply
Application
Application form: This will be created using Eventbrite or another online tool.
Application period: two months.
Task IV: Confirmation and course preparation
A quick check of the applicants’ credentials will be performed in order to ensure consistency and maximum impact. This will be done within five working days of registration.

Participant Dossier
Each accepted applicant will receive the Participant Dossier, a document including all the information about the training and detailed instructions regarding the next steps, i.e. how to confirm attendance at the training. Updates of the Participant Dossier may be re-sent to the participants during the preparation phase, so the last version of the Dossier should contain complete information about the training in one single document. Sent along with the confirmation of acceptance to the course, the first version of the Participant Dossier should include at least the following information:
- What the Training Event is, and how it works
- Calendar of important dates and deadlines
- Description of the venue(s)
- Tentative schedule
- Information about the resources provided by the organiser
- Information about the tutors, and their contact details
- Required actions from the participants (see ‘Case study preparations’ below)
- Information about expenses (what is covered by the organisation and what is not)
And in the final version, in addition to the previous information:
- Final schedule
- Information on logistics:
  - Contact details (secretariat)
  - Addresses and instructions on how to arrive
Case study preparations
Each applicant will have the opportunity to bring a case study to the training event. At least three weeks before the event, the participant must contact the MD-Paedigree organisers about their case in order to get it ready for the training session; the organisers will put them in touch with the tutors who will handle their sessions. Failure to comply with this rule will mean that the case is not considered. The tutors’ role is to make sure that all session participants understand the platform, that they are comfortable sharing their case studies (if any), that access to the necessary material and data is assured for the training, and that course preparation is taken seriously by all. It is important that the tutor takes an active role in leading the team. The team member whose case study is selected also has an important role in providing relevant readings and further information about the case study to the rest of the team, and the tutors should work intensively to involve and engage all participants at this point.
Task V: During the training
Tutors and experts should bear in mind that many participants will have little experience of working with such technologies and/or within a multi-disciplinary team. At the start of the session, the tutors give an introductory presentation on the platform. The final outcome of the training is to provide a clear idea of what the MD-Paedigree platform offers, to encourage future participation, generate discussion and collect feedback.
Task VI: After the training
Wrapping up the Training Event will take between four and five weeks. The immediate tasks relate to dissemination, such as publishing a press release and updating the website with information about the success of the training event. A satisfaction survey is handed out to the participants, either on paper during the last day of the training or in electronic format by email shortly after the training week. The organisers will prepare a report (a formal deliverable of the project), including a brief overview of the results obtained. The preparation of this report includes input from the teams as well, and possibly some observations from the tutors. Experience has shown that it takes considerable time to get these reports ready while keeping the teams involved in delivering the requested information.
5.3 Supporting materials
A number of supporting materials will have to be developed before the training event. They consist of the material used for dissemination, material related to the application process, material providing information to the participants, and documents used for the evaluation of the training. The supporting documents are:
Web/Print: A leaflet, to be produced prior to the meeting in conjunction with the WP leader. It will help disseminate the events via a section on the website and via printed material.
Application form: A simplified online application form will be produced. This form will ask whether or not the participant would like to bring his/her own material to test the MD-Paedigree Platform. If so, the workshop organiser will contact the applicant and put them in contact with the technical team so that the data are ready to be tested.
Participant Dossier: Agenda, background about the project, key contacts, information on test cases and why they were chosen.
Certificate of assistance
Satisfaction survey: the objective of the satisfaction survey is to gather feedback about the event and how it could be improved (for the planning of the 2nd event).
Brief overview of the work plan
The estimated duration and dependencies of the main organisational tasks are shown in the work plan below. The estimate is based on the assumption that the organisation is carried out as a part-time activity of the organising team. In any case, preparation of the training should preferably start at least six months before the training commences, since the application period alone is expected to last about two months.
The main tasks, scheduled across Months 1 to 6, are:
- Prepare dissemination material
- Admin: booking space (flexible, as it will depend on demand)
- Dissemination
- Application open (1)
- Identification of experts and invitations (2)
- Space confirmation
- Data processing, if applicants send their own data (3)
- Training
- Analysis of feedback

(1) Checking interest and re-arranging sessions if necessary; send material to participants. (2) Are the experts already on board, or do additional experts need to be identified?
Appendix
1. Big Data Healthcare, paper attached to the article “Big data: What is it and why is it important?”, published on the EC Digital Agenda website (http://ec.europa.eu/digital-agenda/en/news/big-data-what-it-and-why-it-important).
2. Innovation and Big Data, concluding chapter included in the forthcoming Wiley book for the 60th anniversary of CERN, From Physics to Daily Life: Applications in Biology, Medicine and Healthcare: How the Technology and Knowledge Transfers from Fundamental Research Changed Industrial Processes, Human Behaviour and Society at Large, edited by Beatrice Bressan.
Please address all proposals of edits and comments to the current draft to [email protected]
Big Data Healthcare
An overview of the challenges in data intensive healthcare¹
Edwin Morley-Fletcher

In order to bring about the revolution in healthcare that modern IT promises, there are legal, technical and cultural/societal barriers that must be overcome. In this draft paper we deal with the concept of Big Data as applied to healthcare, or Big Data Healthcare, and the developments it may bring. We then consider some of the current major hurdles to its acceptance in standard healthcare.
1. Big Data Healthcare
Big Data Healthcare is the drive to capitalise on growing patient and health system data availability to generate healthcare innovation. By making smart use of the ever-increasing amount of data available, we can find new insights by re-examining the data or combining it with other information. In healthcare this means not just mining patient records, medical images, biobanks, test results, etc., for insights, diagnoses and decision support advice, but also continuous analysis of the data streams produced for and by every patient in a hospital, in a doctor’s office, at home, and even while on the move via mobile devices. Current medical hardware, monitoring everything from vital signs to blood chemistry, is beginning to be networked and connected to electronic patient records, personal health records, and other healthcare systems.
¹ This document constitutes a preparatory draft for the Networking Session on “Big data and data analytics impact in healthcare” organised by the FP7 integrated project MD-Paedigree, partially funded by the European Commission, for November 7th, 2013, as part of the ICT’13 conference in Vilnius. Its author is Edwin Morley-Fletcher, who, though fully assuming all responsibilities for the current draft, acknowledges his intellectual debts to the fruitful discussions he had with various members of the Virtual Physiological Human Institute Public Affairs Working Group (among whom, particularly, Martina Contin, Ann Dalton, Liesbet Geris, Adriano Henney, James Kennedy and Marco Viceconti), as well as with the MD-Paedigree community (among whom, particularly, Bruno Dallapiccola, Ludovica Durst, Harry Dimitropoulos, Yannis Ioannidis, Alex Jones, Titus Kuehne, Callum MacGregor, David Manset, Omiros Metaxas, Henning Mueller, Giacomo Pongiglione, Patrich Ruch, Alberto Sanna, Constantin Suciu, Michael Suehling, Karl Stroetman, and Frans Van der Helm), the leaders of the p-medicine and VPH-Share projects (Norbert Graf and Rod Hose), and scholars met on recent occasions, such as the Teradata Big Analytics conference in London, 20 September 2013 (Yasmeen Ahmad, Mike Gualtieri), the Pediatrics 2040 conference in Anaheim, CA, 3-5 October 2013 (particularly Gregory Auner, Anthony Chang, Ho Chih-Ming, Wendy Swanson, Han Wang, Randall Wetzel), and the IEEE Big Data conference in Santa Clara, CA, 6-9 October 2013 (particularly Ingemar Cox, Benedikt Elser, Mike Franklin, Joseph Gonzalez, Irwin King, Pantelis Koutroumpis, Natasa Milic-Frayling).
The resulting data stream is monitored by healthcare professionals and healthcare software systems. This allows the former to care for more patients, or to intervene and guide patients earlier, before an exacerbation of their (chronic) diseases. At the same time, data are provided for biomedical and clinical researchers to mine for patterns and correlations, triggering a process of “data-intensive scientific discovery” that builds on the traditional uses of empirical description, theoretical computer-based models and simulations of complex phenomena.
Big Data has been characterised as raising five essentially independent challenges:
- volume,
- velocity,
- variety,
- veracity (lack thereof),
- value (hard to extract).
As elsewhere, in Big Data Healthcare the data volume is increasing, and so is data velocity as continuous monitoring technology becomes ever cheaper. With so many types of tests, and the existing wide range of medical hardware and personalised monitoring devices, healthcare data could not be more varied; yet data from this variety of sources must be combined for processing to reap the expected rewards. In healthcare, the veracity of data is of paramount importance, requiring careful data curation and standardisation efforts, but at the same time seeming to be in opposition to the enforcement of privacy rights².
Finally, extracting value out of big healthcare data for all its beneficiaries (clinicians, clinical researchers, pharmaceutical companies, healthcare policy-makers, etc.) demands significant innovations in data discovery, transparency and openness, explanation and provenance, summarisation and visualisation, and will constitute a major step towards the coveted democratisation of data analytics.
² The more information we remove from patient data that could be used to identify patients, the lower the data’s veracity and thus clinical value. A patient’s name may not be clinically significant, but age, gender, blood type and clinical event timings can be used to help identify patients while also having obvious clinical relevance.
2. Healthcare Data Access and Protection
Three political, academic and business discussions are currently at the core of the EU debate in this area: the need to ensure that EU citizens’ data are adequately protected, the need for open access to data for research purposes, and the need for Europe to develop a vibrant Big Data industry, capable of investing a growing amount of resources in the breakthrough innovations in healthcare that the appropriate utilisation of Big Data promises to deliver. The combined impact of the effective use of Big Data in research and clinical applications has the potential to significantly enhance the health and wellbeing of EU citizens.
These three debates are running largely in parallel with the following:
1) the loss of trust in data privacy promises, caused by recent disclosures highlighting massive intrusions by government agencies and globally acting commercial companies, as well as inappropriate data sharing by social media organisations;
2) the continuous growth of healthcare data, which awaits transformation into biomedical knowledge by adequate data crunching and data analytics applications that will bring about innovations and improvements in healthcare;
3) the open access debate, driven by the ever-increasing size of the scientific literature and the rising cost of access to journal articles, which is further complicated by recent discussions of treating datasets as first-class objects in scholarly communication and the research life-cycle, thereby bringing them into the open access sphere as well.
2.a. An ownership issue and the European gift relationship tradition
A further topic, often overlooked in reference to Big Data, is the rights of patients to their data, or “the data subject’s legitimate interests”. There are currently instances where others use these data in a supposedly anonymised form, usually without the patients’ knowledge or consent, to earn profits without letting patients participate in the wealth generated from adding their respective data to large repositories. For example, in the pharmaceutical industry there is already a growing market for data on individual prescriptions. Regulation aimed at letting patients participate in some way in the advantages derivable from their data is a complex topic in need of extended public debate, leading eventually to policy attention.
Richard Titmuss’ seminal 1970 study, The Gift Relationship: From Human Blood to Social Policy, has long been an implicit reference standard, due to its concern regarding “processes, institutions and structures which encourage or discourage the intensity and extensiveness of anonymous helpfulness in society” and its warning that “patients can be harmed, physically and psychologically, by giving themselves, willingly or unwillingly, knowingly or unknowingly, as teaching material” to scientific research and the medical profession. International agreements, such as the 1997 European Convention on Human Rights and Biomedicine (and its 2002 additional protocol), have stated that “the human body and its parts shall not, as such, give rise to financial gain or comparable advantage”.
Likewise, the UK House of Lords Science and Technology Committee’s 2001 report on human genetic databases concluded that “we do not regard ownership of biological samples as a particularly useful concept […] we prefer the notion of partnership between participants and researchers for medical advance and the benefit of others”, while the 2011 report on Human Bodies: Donation for Medicine and Research, issued by the Nuffield Council on Bioethics, is not opposed, in principle, to monetary payments for people donating human bodily materials, but insists on a subtle distinction between paying for the material purchase of a thing and paying people for their donation, rejecting the former.
Another point, raised by Anne Phillips’ Our Bodies, Whose Property? (2013), is the advocacy for introducing “a levy on the proceeds of research to be returned, either as assistance to the specific group suffering from the disease, or to the wider community”.
How these complex issues will be reflected in the Big Data debate is still entirely to be ascertained, even though it is clearly of paramount importance for the development of biomedical research.
2.b. The goal of a European Research Area and the Data Protection Regulation
Access to data is recognised by the European Commission as key to the completion of the European Research Area (ERA) and to ensuring a “single market of data”. The Commission Communication on “A Reinforced European Research Area Partnership for Excellence and Growth” establishes the fundamental priorities for completing ERA. Amongst them, it includes the optimal Union-wide circulation, access and transfer of scientific knowledge, including as embedded in data, in order to ensure optimal transnational co-operation in the frame of building ERA. In order to foster Europe’s research and innovation potential, Member States shall implement the Commission’s recommendation of improving research teams’ access, inter alia, to medical record data.
The Conclusions issued by the European Council on 24/25 October 2013, however, have been rather generic, stating that “Europe must boost digital, data-driven innovation across all sectors of the economy”, that “strategic technologies such as Big Data and Cloud computing are important enablers for productivity and better services”, that “EU action should provide the right framework conditions for a single market for Big Data and Cloud Computing”, that “the European Council calls for the establishment of a strong network of national digital coordinators which could play a strategic role in Cloud, Big Data and Open Data development”, and that “the commitment to complete the Digital Single Market by 2015 has to be delivered on”.
A precondition for all this to become a reality, and for Europe to assert a dynamic and independent role in this crucial area, is the speedy adoption of the Data Protection Regulation, which is still under discussion. Two basic principles in the current text are the “right to be forgotten” (when a data subject no longer wants their data to be processed, and there are no legitimate grounds for retaining it, the data must be deleted) and “informed consent” (consent should be given explicitly, by any appropriate method enabling a freely given, informed indication of the data subject’s wishes).
2.c. The right to be forgotten
The right of data subjects “to be forgotten” implies, de facto, some sort of ownership of data relating to them, and therefore the right eventually to donate or sell them. It is true, however, that when it comes to data relating to individual subjects, the preferred model for the advancement of scientific research seems to have been not to allow intellectual property (IP) rights deriving from raw datasets; IP rights should only become attached to the analytic work performed on the data, in the same way as current IP law covers arrangements of facts, but not the facts themselves.
Therefore, a distinction needs to be made between data and the products and services developed using these data. For a market in data analysis to be feasible, ownership of such products and services should rest with the developer, while basic ownership of the data on which they are based should rest with the data subject, with no direct IP rights derived from the raw data as such.
2.d. Data donation and data inheritance
What remains unclear is how the “right to be forgotten” affects these products and services. Suppose there is some product based on patient data (e.g. a disease model that has been developed by analysing and in some novel way aggregating a dataset). If some data subjects whose data make up part of the underlying dataset request the deletion of their data, must the effects of their data be removed from the developed product? If the product has been sold or licensed, or if the data were in fact totally anonymised (which in itself is difficult to prove), this may no longer even be possible.
The demands for informed consent and de-identification imply the implementation of appropriate counteracting measures to prevent deductive disclosure, i.e. the ability to re-identify data based on inferences drawn either by aggregating more data or by querying the available dataset.
A simple and effective approach consists of setting a minimal threshold on query engines, so that queries returning fewer than a minimum number of cases do not inform the user of the real count (k-anonymity). Another approach is the injection of noise into datasets to prevent re-identification. Nevertheless, with both of these approaches there is clearly a balance to be struck between protecting privacy and maintaining the utility of the datasets. As more and more diverse person-related datasets become available, data that were once regarded as 99.99% anonymised may no longer qualify for this score.
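The query-threshold and noise-injection approaches above can be sketched minimally as follows. This is an illustrative example only: the dataset, field names, K_MIN value and noise range are assumptions for the sketch, not part of any MD-Paedigree component.

```python
# Sketch of a k-anonymity-style query threshold: a query engine that
# refuses to report counts below a minimum group size, plus a variant
# that injects random noise into the returned counts. All record fields
# and thresholds here are illustrative assumptions.
import random

K_MIN = 5  # minimum group size a query is allowed to reveal

PATIENTS = [
    {"age_band": "40-49", "blood_type": "AB-", "diagnosis": "asthma"},
    {"age_band": "40-49", "blood_type": "O+", "diagnosis": "asthma"},
    # ... in practice, thousands of records
]

def count_query(dataset, **criteria):
    """Return the number of matching records, or None if the group is
    too small to disclose without risking re-identification."""
    n = sum(all(rec.get(k) == v for k, v in criteria.items()) for rec in dataset)
    return n if n >= K_MIN else None

def noisy_count_query(dataset, noise=2, **criteria):
    """Same query, but with random noise injected into the count so that
    exact group sizes cannot be reconstructed from repeated queries."""
    n = count_query(dataset, **criteria)
    if n is None:
        return None
    return max(K_MIN, n + random.randint(-noise, noise))
```

Both functions trade utility for privacy exactly as described: suppressed or perturbed counts are less useful to researchers, but harder to link back to individuals.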
From an ethical perspective, data subjects should be adequately informed of the current and future healthcare advantages that they may derive from making their data available for biomedical exploitation. Yet how can a data subject grant consent for as yet unknown purposes? There is a need, in this sense, for what could be defined as “enhanced privacy” or “enhanced consent”, based on spreading awareness of the personal and social significance of anonymised individual patient and personal data for preventative and predictive purposes in healthcare, and on promoting “data donation” mechanisms. This could be combined with “data inheritance” mechanisms, applied automatically after a certain period has elapsed from the data subject’s time of death, unless they have explicitly opted out prior to death. Certainly, a different treatment should be considered for those parts of the data whose public availability could have detrimental consequences for the relatives of the deceased, such as genetic information.
2.e. Anonymisation and enhanced privacy
“Enhanced privacy/enhanced consent” could also focus on the subject being able to define restrictions under which the consent becomes void (e.g. when the study involves the development of armaments), and should be coupled with the concept of “personal data portability”: an individual should be able to export or delete his or her data from the system at the end of a relationship with a particular service provider or researcher.
In an age of intense and pervasive innovation, the speed of technological development is likely to outpace the legislative process, so there is a need to constantly update legal frameworks to implement forward-looking policies and laws, allowing flexible regulation to keep step with technology.
It may be that the more pragmatic approach is to legislate against misuse of data, as opposed to prescribing allowable uses. Article 81 of the Data Protection Regulation states that Union law or Member State law dealing with the processing of personal data concerning health shall provide for suitable and specific measures to safeguard the data subject’s interests and fundamental rights, weighed against the necessity of processing for the purposes of preventive or occupational medicine, medical diagnosis, the provision of care or treatment, or the management of healthcare services, where those data are processed by a health professional subject to the obligation of professional secrecy, or by another person also subject to an equivalent obligation of confidentiality.
Article 83 states that, within the limits of the Regulation, personal data may be processed for historical, statistical or scientific research purposes only if the following hold:
(a) these purposes cannot be otherwise fulfilled by processing data that do not permit, or no longer permit, the identification of the data subjects;
(b) data enabling the attribution of information to an identified or identifiable data subject are kept separately from all other information under the highest technical standards, and all necessary measures are taken to prevent unwarranted re-identification of the data subjects.
This highlights the crucial importance and desirability of the de-identification of data, but recognises that it is not always possible to carry out the necessary research if the data are fully de-identified³.
It must also be noted that full anonymisation is in principle impossible, depending on the data recorded; e.g. both DNA (fingerprinting) and images (e.g. 3D) always, in theory, allow the identification of a specific patient. To address this, wherever possible, data “itemising” must be considered, so that there is no complete image/genotype record, making identification of the individual more difficult, although in principle not impossible.
The debate on data protection and open access should come to an ethically based consensus, allowing the views of minorities to be respected, if the right of citizens to appropriate data protection is to be adequately balanced against their right to further improved healthcare based on patient-data-facilitated clinical research. This balance is crucial if legislators wish to avoid overprotection of the rights of a minority becoming detrimental to the delivery of effective healthcare for the majority.
To this end, it must be ensured that it is not just the nature of the data that influences the level of protection afforded, but also the intended use of those data, and the potential risks implied by their usage. EU citizens have shown time and time again in surveys that their concerns over data security relate not to the use of their data per se, but to who will use it, and to who might have access to it in the future.
A UK survey conducted in 2009 revealed that an overwhelming number of UK citizens were willing to donate their health data where such data would be used solely for the advancement of medical knowledge. Concerns over the use of data were confined to fears that their data would be leaked to their employer or insurance company. The nuances of EU citizens’ concerns show that, in crafting the solution to the data protection vs. open access and Big Data analytics dichotomies, the specificities pertaining to each of these issues must be appropriately identified.
Denying health researchers access to data necessary for potentially life-saving research, the results of which EU citizens may someday need, is no guarantee that their data are any more secure. On the contrary, it will serve only to hinder the progress of that research, delaying, or even preventing, the development of new treatments in demand by EU citizens.
2.f. Striking a balance between privacy, security, and innovation
Today in Europe there are increasing calls for unambiguous data protection regulation that would ease new scientific research instead of hampering it.
Regulators should understand that where and when strong, consistently audited data security measures are applied, the benefits of medical research in all probability outweigh the putative risks associated with the use of patients’ data by health professionals.
A harms-based approach, focusing more on the regulation of possible misuse of data rather than on limitations of usage, is likely to be the most appropriate approach for striking a reasonable balance between privacy, security, and innovation.

³ Note that 100% anonymisation of patient data is impossible without rendering the data useless. Whether, for all practical purposes, data are anonymised with a probability of 99.9%, 99%, or 95% will depend on the availability of other data on individuals (which is hard to predict for the future), computing power and related factors. This seems to be a largely unexplored field. Of great interest, however, is the legal framework adopted by the p-medicine project, based on the following pillars:
- Pseudonymisation: in clinical care, pseudonymisation should be the norm; but when these data need to be used for research, or uploaded with semantic links to a data warehouse, a second pseudonymisation should take place, performed by a trusted third party (TTP), which in the case of p-medicine is the CDP (Center for Data Protection; http://www.privacypeople.org). This pseudonymisation is done using a tool called CAT (Custodix Anonymization Tool).
- Contracts between data providers and data users: these contracts legally bind users to use the data only for the research for which the data are requested. Fines are defined if somebody tries to re-identify a subject.
- National differences (which might disappear after the approval of the new Data Protection Regulation): today there are different laws regarding data protection in the different member states throughout Europe, with differing usage of data.
- Informed consent, seen as a fallback in this legal framework.
Data are regarded as de-facto anonymous if they fulfil the above-mentioned points.
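The two-stage pseudonymisation pillar of the p-medicine framework can be sketched roughly as follows. This is a loose illustration under stated assumptions: the key names, record layout and helper function are hypothetical, and the actual CAT/CDP tools are not reproduced here.

```python
# Illustrative sketch of two-stage pseudonymisation: a keyed hash replaces
# the direct identifier once inside clinical care, and a second time by a
# trusted third party before data reach a research warehouse. Keys and
# record fields are hypothetical examples.
import hmac
import hashlib

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash: stable, so records
    remain linkable, but not reversible without the key holder."""
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()[:16]

HOSPITAL_KEY = b"hospital-secret"   # first stage, inside clinical care
TTP_KEY = b"ttp-secret"             # second stage, held by the trusted third party

def export_for_research(record: dict) -> dict:
    """Apply both pseudonymisation stages before the record leaves the
    clinical environment for a research data warehouse."""
    first = pseudonymise(record["patient_id"], HOSPITAL_KEY)
    second = pseudonymise(first, TTP_KEY)
    out = dict(record)          # copy, leaving the clinical record untouched
    out["patient_id"] = second
    return out
```

Because each stage uses a key held by a different party, re-identification requires the cooperation of both the hospital and the TTP, which is precisely the safeguard the framework relies on.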
Article 70 of the new draft Data Protection Regulation suggests that the “indiscriminate general notification obligation should be abolished, and replaced by effective procedures and mechanisms which focus instead on those processing operations which are likely to present specific risks to the rights and freedoms of data subjects by virtue of their nature, their scope or their purposes. In such cases, a data protection impact assessment should be carried out by the controller or processor prior to the processing, which should include in particular the envisaged measures, safeguards and mechanisms for ensuring the protection of personal data, particularly as specified in the enhanced consent, and for demonstrating compliance with the Regulation”.
Article 71 significantly specifies that “this should in particular
apply to newly-established large scale filing systems, which
aim at processing a considerable amount of personal data at
regional, national or supranational level and which could affect
a large number of data subjects”.
This paper proposes the inclusion of a concept of enhanced consent where the data subject is able specifically to exclude certain data usage whilst allowing data utilisation for the benefit of, for example, healthcare research, alongside maintaining and ensuring that consent can be withdrawn and data completely deleted.
As described in the Regulation, all data management will be
under the auspices of a controller4 to ensure compliance and
thus the rights of the citizen are protected and the benefits of
the utilization of Big Data in research, innovation and business
development in healthcare are retained and developed.
3. Dealing with the Data Glut
With the ever-increasing volumes of data being produced outstripping5, and arguably being driven by, the computing power available to analyse them (quantum computing aside, possibly), we are already faced with the reality that we have too much data. As of 2011, this "data glut" was estimated to be 150 exabytes (150 billion gigabytes) for healthcare globally.
To make sense of so much data, where sense is to be found, will require innovative analytical techniques that can make it possible to efficiently search, process and analyse these massive datasets. Some handle may be gained over the torrent of data by reducing the dimensionality of a dataset.
Feature selection methods, whether selecting features on the basis of existing medical knowledge or of statistical techniques, can be used to map a dataset with many feature dimensions to one with significantly fewer, thus creating a manageable search space. This simpler search space is then used for querying and analysis, with the full dataset only referred to when necessary, allowing for the application of goal-oriented search techniques such as Model-Driven Analysis. Fractional Factorial Design uses this sort of approach to concentrate search efforts in areas of a multi-dimensional dataset that have been selected by searching a lower-dimensional one to which it has been mapped.
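The mapping described above can be illustrated with a toy sketch (not drawn from the paper): a high-dimensional cohort is reduced to a handful of statistically selected features that define the manageable search space. The dataset, the correlation-based scorer, and the informative columns 3 and 42 are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: 1,000 patients by 500 measured features.
X = rng.normal(size=(1000, 500))
# The outcome depends only on two of the 500 features (3 and 42).
y = (X[:, 3] + 0.5 * X[:, 42] + rng.normal(scale=0.1, size=1000) > 0).astype(float)

def select_top_features(X, y, k):
    """Score each feature by its absolute correlation with the outcome
    and keep the k strongest ones (X is zero-mean by construction here)."""
    y_centred = y - y.mean()
    scores = np.abs(X.T @ y_centred) / (np.linalg.norm(X, axis=0) + 1e-12)
    return np.argsort(scores)[-k:]

keep = select_top_features(X, y, k=10)
X_small = X[:, keep]      # the reduced, manageable search space
print(sorted(keep), X_small.shape)
```

The reduced `X_small` would then serve querying and analysis, with the full `X` consulted only when a candidate pattern needs to be checked against all measured variables.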
Following a top-down, system-level approach, Feedback System Control (FSC) has also been proposed recently to reduce the number of experiments in in silico clinical trials. FSC was shown to efficiently home in on an optimised drug combination with 10^2–10^6 times fewer experiments than a typical high-throughput ("brute force") approach. As opposed to collecting all measurable data and trying to find a needle in a haystack, the FSC scheme is a goal-oriented method, which uses the phenotypic outcome to tune the intervention of engineering systems, achieving system-in-system integration.
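The gap between brute-force screening and response-guided search can be sketched with a toy drug-combination surface. Everything below is invented (the dose grid, the response function and its optimum), and the published FSC work uses a differential-evolution-style controller rather than the simpler coordinate search shown here; the point is only that the measured phenotypic response, not a model of the system, drives the choice of the next combination to test.

```python
DOSES = range(10)            # 10 dose levels per drug
N_DRUGS = 4                  # a full screen would measure 10**4 combinations

def response(combo):
    """Toy phenotypic readout, peaked at a hypothetical optimum (7, 2, 5, 9)."""
    optimum = (7, 2, 5, 9)
    return -sum((c - o) ** 2 for c, o in zip(combo, optimum))

def fsc_like_search(sweeps=2):
    """Coordinate descent guided only by the measured response."""
    combo = [0] * N_DRUGS
    experiments = 0
    for _ in range(sweeps):
        for i in range(N_DRUGS):
            best_dose, best_resp = combo[i], response(combo)
            for d in DOSES:
                trial = list(combo)
                trial[i] = d
                experiments += 1            # one "experiment" per measurement
                if response(trial) > best_resp:
                    best_dose, best_resp = d, response(trial)
            combo[i] = best_dose
    return tuple(combo), experiments

best, n_exp = fsc_like_search()
print(best, n_exp)   # (7, 2, 5, 9) after 80 measurements, versus 10,000 for a full screen
```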
These various techniques allow us to find correlations, patterns and structures in overwhelming volumes of data, giving them value. They reinforce the fact that data do not possess inherent value in the absence of a means to make sense of them. They are meaningless until analysed for significance, visualised within a context, or compared to other data. This means that the value of a dataset will vary according to its context. The corollary of this is that the value of a dataset is the sum of its value in each analytical context. This fits neatly with the concept that it is the research results, services and products generated from data that will provide the value in a Big Data economy.
3.a. Statistical and mechanistic methods within the moving boundaries of the "dimensionality curse"
Ultimately, data will be as useful as the knowledge that can be derived from them, implicitly or explicitly. Two diverging modelling approaches need, therefore, to be taken into account here. On the one hand, there is the bottom-up approach, based on statistical models that can identify patterns and correlations between observed variables in large datasets, and that are necessarily dependent on the data on which they are trained.
These models will not reveal fundamental causality between variables or dynamic aspects, as would be needed for understanding complex biological processes. In the era of Big Data, however, a priori knowledge is supposedly not enough; and, given the fact that valuable information exists in various
4 Who in turn needs to be controlled or audited.
5 For instance, in genetics, DNA gene sequencing machines based on Big Data analytics can now read 26 billion characters of the human genome in seconds.
data sources, these can additionally be used to enrich the global amount of available information. Thus, Big Data (bottom-up) knowledge discovery and data mining (KDD) can remain the primary goal, even if, to address such a challenge, it becomes necessary to move beyond classical (one source, one modality) statistical simulation models, and to analyse and combine information from different data sources (e.g. biomedical data & literature) and modalities (e.g. clinical variables & images or streams).
On the other hand, following a top-down approach, mechanistic models, based on an understanding of the behaviour of biological systems' components, are obviously more difficult to build and generally larger in scale, but are more robust to noise and to local inaccuracies, and eventually provide, when executed properly, unprecedented power for extrapolation and prediction in domains in which the other techniques fail.
This is why approaches such as those advocated by molecular systems biology, or more recently by the Virtual Physiological Human across all scales, need to be fully exploited in healthcare. The development of such models, requiring identification of the system nonlinearities and model structure, and estimates of the system model parameters, implies, however, a computation time directly related to the number of terms and increasing dramatically with the model order. A trade-off between accuracy and complexity therefore becomes the way to avoid the "curse of dimensionality", which eventually leads to a computationally intractable combinatorial optimisation problem.
Furthermore, when studying highly complex and inherently non-linear physiological systems, it would not make sense to assume that their model order and structure can be sufficiently well known a priori. In fact, in biological systems analysis the main objective is precisely to gain insight into the underlying structure, and system identification is based on a learning process, associating parameter estimation with structure selection algorithms, aimed at finding the simplest possible model capable of revealing the unifying rule relating the components and behaviours of the system under study. However, since patterns discovered through KDD enrich already existing a priori knowledge, and the latest statistical simulation models can incorporate a priori logic (e.g. formulating constraints, imposing dependencies, etc.), a sensible goal is to combine the bottom-up and top-down approaches.
The search for parsimonious and computationally efficient methods enabling the determination of the simplest model structure has additionally led to the development of methods that can cope with partial and/or incomplete mechanistic knowledge, such as the nonlinear autoregressive moving average exogenous (NARMAX) structure detection approach, pioneered over many years by the signal processing and complex systems research of Stephen Billings and others at the University of Sheffield.
This method is based on black-box parametric system identification and leads to a mathematical model based on input/output causal relationships calculated with non-linear difference equations. It has demonstrated a remarkable capability of accommodating the dynamic, complex and nonlinear nature of real-world time-series prediction problems, successfully achieving the goal of determining the model form, estimating the numerical values of the unknown parameters, and eventually validating those results which have shown sufficient accuracy. In the analysis of physiological systems, this mathematical modelling makes it possible to integrate very large datasets into a consistent information framework, which can further be used to arrive at novel mechanistic insight and predictions.
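A minimal sketch of the structure-detection idea, assuming a toy single-input system and a greedy least-squares term selector (a much-simplified stand-in for the orthogonal forward-regression/ERR machinery of full NARMAX; the system, its coefficients, and the candidate library are all invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear system: y[t] = 0.5 y[t-1] + 0.3 u[t-1] + 0.2 u[t-1] y[t-1] + noise
n = 500
u = rng.uniform(-1.0, 1.0, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = (0.5 * y[t - 1] + 0.3 * u[t - 1]
            + 0.2 * u[t - 1] * y[t - 1] + 0.01 * rng.normal())

# Library of candidate lagged polynomial terms.
terms = {
    "y[t-1]":        y[:-1],
    "u[t-1]":        u[:-1],
    "y[t-1]^2":      y[:-1] ** 2,
    "u[t-1]^2":      u[:-1] ** 2,
    "u[t-1]*y[t-1]": u[:-1] * y[:-1],
}
target = y[1:]

def residual_norm(names):
    """Least-squares residual when regressing the target on the named terms."""
    A = np.column_stack([terms[m] for m in names])
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.linalg.norm(target - A @ coef)

def forward_select(k):
    """Greedily add the candidate term that most reduces the residual."""
    chosen = []
    for _ in range(k):
        best = min((t for t in terms if t not in chosen),
                   key=lambda name: residual_norm(chosen + [name]))
        chosen.append(best)
    return chosen

chosen = forward_select(k=3)
print(chosen)   # recovers the three true terms of the simulated system
```

Stopping the selection once the residual reduction falls below a threshold, rather than at a fixed `k`, is what yields the "simplest adequate model" the text describes.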
3.b. The value of bio-medical Big Data repositories
Given the fact that data are "experience goods", according to Arrow's paradox it should be true that, once access to them has been awarded, their value should be significantly reduced because of the inherent non-rivalry and non-excludability characteristics of open-access information. However, Big Data repositories coupled with analytics can be utilised in a variety of excludable ways, and their value is not critically influenced by Arrow's paradox, while the "experience goods" challenge mainly concerns the availability of "enough" descriptive information about the data, their structure, their process of collection, and possibly "teasers" about the analyses or outputs from the database.
However, it remains true – as stated in the OECD Report on "Supporting Investment in Knowledge Capital, Growth and Innovation" – that in order to reap this growing value there will be a need not only for clinicians and researchers to acquire Big Data analytics skills and services, but also for a framework for data repositories which adheres to international standards for the preservation of data, sets common storage protocols and metadata, protects the integrity of data, establishes rules for different levels of access, and defines common rules that facilitate the combining of datasets and improve interoperability. Such frameworks could, someday, render some of today's data protection rules and procedures invalid.
Such repositories can allow "testing" of the prior knowledge of clinicians, who identify the data features deemed to be key for specifying a patient's treatment, versus the correlations that big data crunching may highlight, possibly leading to further knowledge discovery.
Indeed, by statistically and semantically reasoning on the data, existing pathophysiological patterns may be revealed and inputted as a first step in a fractional factorial and model-driven research process supporting physicians in their
iterative and interactive quest to discover new knowledge. The goals here are: to be able to provide model-driven patient-specific predictions and simulations, and consequent optimised personalised clinical workflows; to allow for advanced similarity search among patients, such that clinicians can find "the patient like mine"; and to get support through risk stratification and outcome analysis. Eventually it is hoped that specific pathophysiological patterns ("disease signatures") can be detected, refined and made available to other clinicians and researchers in the form of pattern libraries. These pattern libraries, identifying homogeneous groupings among patients and model similarities, could be shared between researchers and clinicians to allow for data-intensive pathophysiological diagnoses. Allied to the above is the potential to revolutionise health communications by making it possible, on the basis of semantically advanced repositories, to use social media among patients aware of sharing highly similar conditions ("patients exactly like us"), empowering them to bridge the gap with the clinicians, especially in the case of paediatric patients and their parents.
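The "patient like mine" retrieval described above can be sketched as a nearest-neighbour search over the repository (all fields, values, and the cosine-similarity choice are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical repository: 200 patients by 6 normalised clinical variables
# (fields purely illustrative, e.g. age, BMI, ejection fraction, ...).
repository = rng.normal(size=(200, 6))

def most_similar(query, repository, k=3):
    """Indices of the k repository patients most similar to the query,
    ranked by cosine similarity of their feature profiles."""
    norm = lambda a: a / (np.linalg.norm(a, axis=-1, keepdims=True) + 1e-12)
    sims = norm(repository) @ norm(query)
    return np.argsort(sims)[::-1][:k]

# A query that is a near-duplicate of patient 0: it should rank first.
query = repository[0] + 0.01 * rng.normal(size=6)
print(most_similar(query, repository))
```

A real system would of course mix heterogeneous data types and learned similarity metrics; the sketch only shows the retrieval pattern that a "disease signature" library would build on.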
4. The “Long Tail” potential in bio-medical research
The Pareto Distribution, introduced in 1906 by the Italian economist Vilfredo Pareto, described a feature of social distributions which has been a recurrent mantra in organizational studies. Presented in simplified form as Pareto's Principle, or the "80/20 Rule", it states that 20 percent of something would normally be responsible for 80 percent of the results.
In the economic arena, this traditional 80/20 concept has begun to be reversed, as highlighted by the innovative insights in Chris Anderson's book, The Long Tail (2006). This developed the concept that, when transaction costs are greatly lowered, "the biggest money is in the smallest sales", whereby a series of small niches cumulatively achieve a much larger amount than the traditional focus on selling the preferred 20% of the items.
Internet businesses such as Amazon have demonstrated this, because they have infinite shelf space and a radically different cost structure from "bricks and mortar" retailers. The long tail has been greatly lengthened as a result. Consumers really can find and choose whatever they want, no matter how obscure or little sought after the item is. While such retail niches were not economically viable in the past, they can now better fulfil the market.
An analogy to epidemiological studies can apply here. What works for business transactions can also work for clinical interactions. Big Data applications in medicine radically change the capacity for going beyond the average patient population, searching for specific cohorts of patients fitting into very peculiar niches of their own.
Such an approach can show the way to truly personalised medicine. Applying the Long Tail insight to biomedical research and to drug discovery will lead to people receiving treatments and drugs specifically targeted to their own genomic, proteomic, and metagenomic characterisation. Tailoring treatments, drugs and research to everyone's individual needs is precisely what the long-tail approach is about. A focus on the long tail in healthcare should thus allow medicine to better address all those diseases and ailments suffered by a relatively small number of people, or by a large number of people whose common conditions have numerous underlying causes.
Big data allows for "long-tail medicine": drugs with enhanced personalised information content, based on customised algorithms tackling the individual disease conditions that can best be addressed only by personalised treatment. Notably, such conditions may often be the most serious. For example, cardiovascular disease is responsible globally for more morbidity, mortality and economic burden than any other disease. Despite growing knowledge about its various causes and risk factors, it remains difficult to tackle in a preventative sense. A key element of this difficulty results from the complex nature of the disease, arrived at by a multitude of pathways and influenced by genome, metagenome and environmental exposures. Thus, no single intervention or set of interventions applied in a blanket fashion to the population has been able to tackle this disease adequately. Personalised interventions, resulting from a big data focus on the disparate underlying genotypic and phenotypic drivers of the disease, offer a possible solution to this.
There are also "orphan" diseases, which affect relatively small numbers of people. The low prevalence of these diseases results in little or no direct research investment being made by industry to understand them or to develop new treatments for them. The biopharmaceutical industry in particular is still largely focusing on a "one size fits all" approach, but "one size" medicines do not fit all patients, and the same is true of the R&D process. The limitations of this approach have become increasingly clear, starting from the need for increased paediatric knowledge discovery, as exemplified at a basic level by the Paediatric-use marketing authorisations promoted by the European Medicines Agency.
Data sets from a sub-population or from longitudinal clinical data have the potential to expedite the development of targeted therapies in terms of both patient population and disease. This model of research is in its infancy but has tremendous implications for medical knowledge discovery and for future drug development.
5. Big Data Literacy in Healthcare
In parallel to the legal and technological challenges we have
outlined above, and to the fact that the healthcare domain
is known for its ontological complexity, variety of medical
data standards and variable data quality, there are also healthcare cultural changes that will be required to fully capitalise on Big Data Healthcare as a movement. This especially revolves around the need for medical staff to be educated using examples of its success. Clinicians will need to understand, and be persuaded of, the potential of Big Data Healthcare, as well as its limits. To maximise the acceptability and utility of Big Data Healthcare solutions in the clinical arena, clinicians should therefore be involved in the development of these solutions from their inception. A clinically led development ideology will thus ensure that technical know-how and innovation translate into clinically useful tools that fit more naturally into clinical workflows. Clinicians and engineers must work together to translate and extend their existing and advanced data-analysis technology (that is, the clinically trained human mind) into targeted big data analytical approaches that will achieve clinically useful outputs. Although engineers and clinicians have long collaborated successfully, development work in Big Data Healthcare will require particularly intimate reciprocal understanding by each disciplinary culture of the other. This will require further cultural development in both areas.
This should be achieved at a grass-roots level by a greater emphasis on clinical informatics in medical curricula, and by similar exposure of developing bioinformatics engineers to the unique challenges that medical bioinformatics faces.
At the same time, decision support systems will need to be more than black boxes: they must be capable of showing clinicians why they advocate particular courses of action, providing the necessary assurance that advice is based on sound principles.
Diffusion of medical technologies is necessarily a lengthy process. This means that these processes of education and culture shift should begin now, preparing new generations of clinicians and bioinformatics engineers for forthcoming data-intensive methodologies and collaboration.
The following points will need therefore to be fleshed out:
Management of big data
Seamless end-to-end big data curation
Data discovery, profiling, extraction, cleaning, integration, analysis, visualisation, summarisation, explanation
Use of big data
Appropriate use of big data – avoiding over-reliance
Responsible use of automated techniques
Communicating big data findings to patients
Integrating data analytics into clinical workflows
Data (clinical) scientist
6. The consequences for policy-making
When Jean Monnet launched the European construction in 1950, he based it on coal and steel, as these sectors were of strategic importance at the time. In an updated spirit, Neelie Kroes has suggested that the unifying vision for Europe should presently be based on two concepts: "being wired" and "being digital". Sometime earlier, she had also stated that "knowledge is the engine of our economy, and data is its fuel".
It is hard to disagree with her. However, this vision brings with it the need to go beyond the usual proliferation of EC-jargon acronyms that hint at barely specified future goals.
In line with the targets set out in the Europe 2020 Strategy, the issue of Big Data – as already mentioned – has now been singled out as a political priority at the 24-25 October European Council. There is consequently much discussion about creating a European Big Data Analytics Service (EBDAS) network, aimed at fostering a sustainable, smart and inclusive European Big Data economy, while the Digital Agenda for Europe (DAE) has been assigned the overall task of creating a European digital single market through a number of measures directed at the use of data sources in Europe. All this will need to pass through the EU Open Data strategy (conceived of as an amendment of the Public Sector Information Directive), the Data Value Strategy (addressing the entire data life-cycle), and the crucial Data Protection Regulation.
At the same time, due to the staggering, and ever growing, size and complexity of bio-medical big data, computational modelling in this area is likely to provide the most intriguing insights into the emerging complementary and synergistic relationship between computational and living systems. Bio-medical research institutions and related industries, as well as the whole healthcare and pharmaceutical sectors, must therefore be considered as key stakeholders in the European process leading to the next generation of data-centric systems. Whether these are systems capable of learning from data, or data-analysis products and applications capable of translating medical knowledge discovery into widespread medical practice, they will put predictive power in the hands of clinicians and healthcare policy makers.
First of all, this requires the swift definition of an updated legal and ethical framework that adequately protects citizens on a Europe-wide basis. This is a necessary premise for allowing private and public European investment to be directed towards this crucial area of innovation. Doing so will support the entrepreneurial initiatives aimed at commoditising once-complex big data analytics, foster massive clinical uptake, and reinforce European competitiveness in medical developments. FP8 European research funding needs to be directed accordingly.
Innovation and Big Data
Edwin Morley-Fletcher
President Lynkeus Consultancy
Rome, Italy
Beatrice Bressan
Coordinator, TOTEM Outreach, CERN, the European Organization for Nuclear Research
Geneva, Switzerland
A succession of paradigm shifts has accompanied the varying perceptions of wealth in
economics. Wealth was first thought to reside in agriculture; it was then seen as correlative to the division of labour; and it was later posited as corresponding to profits, in opposition to rents and wages, as industrialisation became, for some, synonymous with exploitation. Marginalism redirected economics away from this connection, positing wealth as the outcome of perfect competition, where profits would wane. Economies of scale were then left as the only factor of increasing returns operating as the engine of economic growth. This pleased at the same time free-traders, empire builders, and socialists and communists, both of whom were striving to transform the whole of society into a single factory. Schumpeter's 'creative destruction', followed, on the one hand, by Frank Knight's and John Maynard Keynes' distinction between risk and uncertainty, showed a different reading of capitalism, while on the other hand socialism's demise appeared to originate from having pursued accumulation without disruptive innovation. The sudden shock incurred since 2008 by globalisation and free flows of capital proved financial stability to be still an inherently uncertain precondition for securing further growth and a larger diffusion of wealth. Since 1990, however, knowledge had started to be theoretically perceived as the endogenous factor that really allows increasing returns. This implied the need to understand what interconnected set of market and non-market institutions could best make the innovation process work effectively, and what role should be taken by governments. Being immaterial, knowledge appears to be limitless, and cognitive capitalism's passage 'from atoms to bits' entails various post-scarcity features. Non-rivalry, non-excludability and cumulativeness, as well as network effects, jointly lay the ground for moving past abundance. Finding the adjacent scarcity allows the discovery of potential economies of scale which ensue from free (or nearly free) digital goods opening up new markets in which to exert monopoly power and reap extraordinary profits. Where the primary remaining scarce resource is human creativity, the relative advantages of peer production emerge, opening the way to criticism of traditional intellectual property rights. Peer-produced knowledge can be more cost-effective than either markets or hierarchies, provide better information about available human capital, and better allocate creative efforts, by taking into account the highly variable and individuated nature of human intellectual efforts. The ecosystem of technologies is expected to grow in complexity and colonise new areas. Some anticipate an 'intelligence explosion' that will ultimately make humans and their machines merge and become indistinguishable. The HBP (Human Brain Project), IBM's Watson, and MD-Paedigree (Model-Driven Paediatric European Digital Repository) testify to the growing and converging power of Big Data analytics when applied to healthcare. Amidst signals of growing difficulties in guaranteeing the sustainability of welfare states, a knowledge-based redesign of healthcare systems, rather than being a further cost driver, now needs to become a political and economic priority for overcoming the residual grip of scarcity which is hindering mankind from attaining its potential of holistic growth.
IIIa.1 The wealth of nations: agriculture, the division of labour, or profits?
A succession of paradigm shifts has accompanied the varying perceptions of wealth in
economics.
The Physiocrats1 believed that agriculture was the only activity generating a 'net product', while industry was deemed to be 'sterile'. Given this belief that Nature allowed mankind to obtain a surplus only in agriculture, they focused all their reform efforts on trying to spread what they termed 'la grande culture', i.e., a reorganisation of land plots leading to the establishment of big farms.
A great shift was brought forward by Adam Smith,2 who subverted the physiocrat approach
by persuasively arguing that the ‘wealth of nations’ resided instead in the division of labour.
His forecast was that the latter, by increasing specialization, and having it mediated through
the ‘invisible hand’ of the market, would bring about rising efficiency in production and
rising living standards.
However, then came David Ricardo. Building on Smith's foundations, Ricardo argued that the wealth of nations was founded on profits, in opposition to agricultural rents and working-class salaries. While wages were not due to increase in conditions of abundant population, because of the working of the 'iron law of wages', growth was destined – in Ricardo's forecast – to peter out owing to the scarcity of land and the correlative increase of rents, unless, through free trade, Britain were to take advantage of comparative costs and become the workshop of the world, purchasing most of her food overseas.
IIIa.2 Industrialisation and/or exploitation
Saint-Simon saw wealth as depending on the process of 'industrialisation' (this was the new
term that he coined), which allowed the translation of the scientific and technical revolution
of his age into economics. On this basis, he could project a future society in which poverty
and war would eventually be eradicated, provided the ‘industrial system’ were to be applied
on a sufficiently large scale. In his Parabole politique3 he posited that if France were to lose
her 50 best scientists, artists, military officers, physicians, entrepreneurs and craftsmen, it
would take her a whole generation to recover, while if 30,000 of her nobles and bureaucrats
were suddenly to disappear, this would not be such a dramatic loss, given that “current
society is truly an inverted world, where in all types of occupations men who are not capable
are in charge of directing the capable ones.”4,5
Karl Marx, introduced in his youth to the writings of Saint-Simon by his future father-in-law,6,7
stated that the bourgeoisie had “accomplished wonders” and “conducted expeditions”
that surpassed by far any of those of the past, and was “constantly revolutionising the
instruments of production, and thereby the relations of production, and with them the whole
relations of society.” By the “rapid improvement of all instruments of production, by the
1 F. Quesnay and the Marquis of Mirabeau, Philosophie rurale, 1763; Du Pont de Nemours, La Physiocratie, 1768.
2 A. Smith, An Inquiry into the Nature and Causes of the Wealth of Nations, 1776.
3 C.H. de Saint-Simon, L'Organisateur, 1819.
4 N. Coilly et Ph. Régnier, La parabole de Saint-Simon, in Le siècle des saint-simoniens: du Nouveau christianisme au canal de Suez, Bibliothèque nationale de France, 2006.
5 R.B. Carlisle, The Proffered Crown: Saint-Simonianism and the Doctrine of Hope, Johns Hopkins University Press, 1987.
6 Baron Ludwig von Westphalen, whose daughter Jenny Karl Marx (1818-1883) married in 1843.
7 M. Kowalewsky's personal recollections on Marx (1909), in L. Meldolesi, L'utopia realmente esistente: Marx e Saint-Simon, Laterza, 1982.
immensely facilitated means of communication”, the bourgeoisie – he reckoned – was
drawing “even the most barbarian nations into civilisation.” The cheap prices of commodities
– he added – were “the heavy artillery” with which all nations were being compelled, “on
pain of extinction, to adopt the bourgeois mode of production, […] i.e., to become bourgeois
themselves.”8
Modern industry, “by means of machinery, chemical processes and other methods”, was
“continually transforming not only the technical basis of production but also the functions of
the worker and the social combinations of the labour process”, revolutionising at the same
time “the division of labour within society, and incessantly throwing masses of capital and of
workers from one branch of production to another.”9
The flip side, in all this, was for Marx that profits were strictly founded on exploitation, i.e., on not recognising labour's real value, but only its money-mediated market value, where the labour force appeared to have a given price, as with any other good. This intertwining of exploitation and capital accumulation made capitalism an inherently contradictory process, leading inevitably – in Marx's forecast – to under-consumption crises.
IIIa.3 Perfect competition, the disappearance of profits, economies of scale
Marginalism10 redirected the economic discourse away from the dangerous connection between profits and exploitation, positing that ultimately wealth depended on the degree to which society could approach perfect competition, where marginal returns on all forms of resource investment would eventually be equalised.
This way, entrepreneurs’ profits would tend to fall in line with capitalists’ interests, and vice
versa. This link would work to minimise both their rates. Economies of scale11 could, however, allow for increasing returns operating as the engine of economic growth, but also
leading to the monopolisation of markets, as large firms would develop lower cost structures
than smaller firms, driving smaller competitors out of business. Though this was not fully realised at the time, economies of scale therefore contradicted the assumption of perfect competition being the preferred social outcome and the fundamental mechanism for maximising wealth.
The pursuit of economies of scale could justify free trade, as well as, conversely, empire
building and ‘imperial preference’. As a synonym of modern industry, economies of scale
also became a supporting argument for socialism. The greatest economy of scale would in
fact be attained by organising society as 'a single, huge industrial company'12: a goal
8 K. Marx and F. Engels, Manifesto of the Communist Party, 1848
9 K. Marx, Das Kapital, Kritik der politischen Ökonomie, Volume 1, Verlag von Otto Meissner, 1867
10 Started by three forerunners: William Stanley Jevons (1835-1882), Carl Menger (1840-1921), and Léon Walras (1834-1910).
11 Alfred Marshall (1842-1924).
12 K. Kautsky, Das Erfurter Programm. An earlier draft of this programme, adopted by the Social Democratic Party of Germany during the congress held at Erfurt in 1891, had been commented upon by Friedrich Engels (1820-1895), who, on 18 and 19 June 1891, had famously written that the aim should be “the transformation of present capitalist production […] into socialist production on behalf of society as a whole and according to a preconceived plan” (A Critique of the Draft Social-Democratic Program of 1891). Kautsky (1854-1938) had specified that it was “by no means necessary” that the passage to socialism “be accompanied with violence and bloodshed. […] Neither [was] it necessary that the social revolution be decided at one blow” since “revolutions prepare themselves by years or decades of economic and political struggle” (Das Erfurter Programm).
commonly shared by both Kautsky and Lenin,13
even though they conflicted on all other
issues.
IIIa.4 Creative destruction
At this point, however, a young Austrian economist, Joseph Schumpeter, had entered the scene. In a path-breaking book,14 both the static marginalist vision and Marx’s picture of exploitative capitalism were replaced by the process of what Schumpeter termed ‘creative destruction’, fuelled by a new character: “The dynamic, innovating entrepreneur as the lynchpin of the capitalist system, responsible not just for technical progress but for the very existence of a positive rate of profit.”15 As Schumpeter later further clarified,16 the initial super-profits of the innovator were a sort of ‘quasi-rent’, derived from a position of temporary monopoly in the supply of a particular good, which in due course would be eroded by imitators, requiring new innovations if the successful entrepreneur was to survive.
More recently, Schumpeter’s vision has been sharply criticised by the 2006 Nobel laureate in economics, Edmund Phelps.17 Schumpeter’s context was one in which an outstanding German intellectual personality like Max Weber had in his vocabulary “no room for experimentation, exploration, daring, and unknowability”18 (so much so – we may add – that moving away from the Weberian ‘synoptic rationality’ assumption with regard to public decision makers required, some decades later, the fundamental theoretical switch brought forward by Herbert Simon’s19 concept of ‘bounded rationality’20). It is not surprising, therefore, that Schumpeter still thought that, though “getting the job done” was a particularly onerous task, the “likelihood of an ‘innovation’ [was] as knowable as the prospects faced by established products. There [would not be any] chance of misjudgement, provided there [were] due diligence. An expert entrepreneur’s decision to accept a project and a veteran banker’s decision to back it [would be] correct ex ante, even uncanny, though ex post bad luck might bring a loss and good luck an abnormal profit.”21,22
13 V.I. Lenin, The State and Revolution (1917): “The whole of society will become a single office and a single factory, with equality of labour and pay.” For Lenin there was, however, a proviso: it should be borne in mind that “this ‘factory’ discipline, […] is by no means our ideal, or our ultimate goal. It is only a necessary step for thoroughly cleansing society […] and for further progress.”
14 J.A. Schumpeter, Theorie der wirtschaftlichen Entwicklung, Duncker & Humblot, 1912
15 M. Blaug, Great Economists before Keynes, Wheatsheaf Books, 1986
16 J.A. Schumpeter, Business Cycles: A Theoretical, Historical and Statistical Analysis of the Capitalist Process, McGraw-Hill, 1939; History of Economic Analysis, Allen & Unwin, 1954
17 In 2006, Edmund Phelps was awarded the Nobel Prize in Economic Sciences “for his analysis of intertemporal tradeoffs in macroeconomic policy”: www.nobelprize.org/nobel_prizes/economic-sciences/laureates/2006.
18 E. Phelps, Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge, and Change, Princeton University Press, 2013
19 In 1978, Herbert A. Simon was awarded the Nobel Prize in Economic Sciences “for his pioneering research into the decision-making process within economic organizations”: www.nobelprize.org/nobel_prizes/economic-sciences/laureates/1978.
20 H.A. Simon, Administrative Behavior: a Study of Decision-Making Processes in Administrative Organization, Macmillan, 1947; Models of Bounded Rationality: Behavioral Economics and Business Organization, The MIT Press, 1982.
21 E. Phelps, Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge, and Change, Princeton University Press, 2013
22 N. Rosenberg, Schumpeter and the Endogeneity of Technology: Some American Perspectives, Routledge, 2000.
On a more practical note, some years before, an influential American business consultant, Clayton Christensen, had already highlighted the “dilemmas posed to innovators by the conflicting demands of sustaining and disruptive technologies.”23,24 In a book later to be thought of as revolutionary in its own domain, he had shown how “very capable executives in […] extraordinarily successful companies, using their best managerial techniques”,25 could perfectly well face the challenges of developing and adopting sustaining innovations, but had often “led their firms toward failure”26 when confronting disruptive technological changes which would have allowed them to reap the real competitive advantage by “facilitating the emergence of new markets”,27 and where leadership would eventually have paid huge dividends. The point was – Christensen stated – that, on one hand, “innovation is inherently unpredictable”,28 because “not only are the market applications for disruptive technologies unknown at the time of their development, they are unknowable”; and, on the other hand, that “amid all the uncertainty surrounding disruptive technologies, managers can always count on one anchor”,29 namely that “experts’ forecasts will always be wrong.”30
IIIa.5 Risk and uncertainty
Whatever the current perception of the limitations of Schumpeter’s assumptions regarding the
key role of innovation in driving long-term economic growth, it deserves to be noted that the
initial focus on the crucial issue of unpredictability had been prompted, already in 1921, by
an American economist, Frank Knight. He had in fact posited that it was precisely “true
uncertainty” – which he defined as being “not susceptible to measurement and hence to
elimination” – that was “preventing the theoretically perfect outworking of the tendencies of
competition.”31
This was, according to him, the reason which gave “the characteristic form of
23 C.M. Christensen, The Innovator’s Dilemma, Harper, 2000
24 A similar distinction, dubbed ‘incremental innovation’ and ‘radical innovation’, is at the core of David Sainsbury’s Progressive Capitalism, though on a much different note. Following the literature on the “varieties of capitalism” (M. Morishima, Why Has Japan ‘Succeeded’?: Western Technology and the Japanese Ethos, Cambridge University Press, 1982; R. Dore, Taking Japan Seriously: A Confucian Perspective on Leading Economic Issues, Athlone Press, 1987; M. Albert, Capitalisme contre capitalisme, Seuil, 1991; D. De Giovanni et al., Le politiche industriali della Cee, Il Sole 24 Ore, 17, 24, 31 May, 1991; C. Hampden-Turner and F. Trompenaars, The Seven Cultures of Capitalism, Doubleday, 1993; K. Seitz, Die japanisch-amerikanische Herausforderung, Aktuell Verlag, 1994; S. Berger and R. Dore (Eds.), National Diversity and Global Capitalism, Cornell University Press, 1996; R. Dore, Stock Market Capitalism: Welfare Capitalism: Japan and Germany versus the Anglo-Saxons, Oxford University Press (OUP), 2000; P.A. Hall and D. Soskice (Eds.), Varieties of Capitalism: Institutional Foundations of Comparative Advantage, OUP, 2001; F. Giavazzi and R. Prodi, Discutono di modelli di capitalismo, Il Mulino, 2011), Sainsbury posits that Germany and Japan excel in the first type of innovation, while the US excels in the second, and that “neither variety of capitalism is consistently better than the other, and the task of economic policy-makers has therefore to be the improvement of their particular variety of capitalism and its constant adaptation to the changing economic and technological opportunities and challenges that it faces” (Progressive Capitalism: How To Achieve Economic Growth, Liberty and Social Justice, Biteback Publishing, 2013).
25 C.M. Christensen, The Innovator’s Dilemma, Harper, 2000
26 Ibidem
27 Ibidem
28 Ibidem
29 Ibidem
30 Ibidem
31 F.H. Knight, Risk, Uncertainty and Profit, Houghton Mifflin Co., 1921
‘enterprise’ to economic organization”, while also accounting for “the peculiar income of the entrepreneur”32: had there not been such uncertainty, the entrepreneur’s income would not be “‘what is left’, after the others are ‘determined’”33 – stated Knight – but only the normal return required to pay the manager’s wage and the competitive level of interest to creditors.
Though sharply aware of the crucial role of irreducible uncertainty in economic life, John Maynard Keynes objected that “to convert the business man into the profiteer” would be a sure way of striking “a blow to capitalism, because it [would] destroy the psychological equilibrium which permits the perpetuance of unequal rewards. The economic doctrine of normal profits, vaguely apprehended by everyone, is a necessary condition for the justification of capitalism. The business man is only tolerable so long as his gains can be held to bear some relation to what, roughly and in some sense, his activities have contributed to society.”34
Still, Keynes’s most eminent biographer, Robert Skidelsky, is right in stating that “uncertainty pervades Keynes’s picture of economic life. It explains why people hold savings in liquid form, why investment is volatile, and why the rate of interest doesn’t adjust savings and investment. […] Under capitalism, uncertainty is generated by the system itself, because it is an engine for accumulating capital goods whose rewards come not now but later. The engine of wealth creation is at the same time the source of economic and social instability.”35 Keynes’s prescription would follow Knight’s distinction between risk and uncertainty: in general terms, “risk could be left to look after itself; the government’s job was to reduce the impact of uncertainty.”36 Controlling demand, while not interfering with the supply of goods,37 was therefore for Keynes the first objective of a government policy aimed at keeping economic and political uncertainty within acceptable bounds. A second level of intrinsic uncertainty, related to investment and financial markets, should be countered, in practical terms, by national and international monetary policy, and culturally by spreading the philosophical choice of “arranging our affairs in such a way as to appeal to the money-motive as little as possible, rather than as much as possible.”38
IIIa.6 Accumulation without innovation
The disquieting association of profits, innovation, and uncertainty that was coming to maturity in those years was meanwhile looked at dismissively by those who were pursuing the maximum economies of scale through economic planning. In this sense, it can be noted that when Joseph Stalin promoted a process of accelerated industrialisation in the
32 Ibidem
33 Ibidem
34 J.M. Keynes, The Collected Writings of John Maynard Keynes, in R.E. Backhouse and B.W. Bateman, Capitalist Revolutionary: John Maynard Keynes, p. 60, Harvard University Press, 2011
35 R. Skidelsky, Keynes: The Return of the Master, Allen Lane, 2009
36 Ibidem
37 A choice which was in line with the traditional and highly influential approach of John Stuart Mill (1806-1873), who famously stated that “the laws and conditions of the Production of wealth partake of the character of physical truths. […] We cannot foresee to what extent the modes of production may be altered […] by future extensions of our knowledge of the laws of nature, suggesting new processes of industry of which we have at present no conception. But howsoever we may succeed in making for ourselves more space within the limits set by the constitution of things, we know that there must be limits. […] It is not so with the Distribution of wealth. That is the matter of human institution solely. The things once there, mankind, individually or collectively, can do with them as they like” (Principles of Political Economy, Book 2, 1852).
38 J.M. Keynes, The End of Laissez-Faire (1926), in Essays in Persuasion, The Collected Writings of John Maynard Keynes, Volume 9, p. 293, MacMillan-Cambridge University Press, 1989
Soviet Union, he ended up providing exactly what was to be the most compelling example of the unintended consequences of trying to use forced accumulation as the key to economic development. As Jeffrey Sachs and John McArthur put it, “the Soviet economy had very little technological change in the civilian sector for decades and, as a result, came about as close as possible to a case of a high saving rate combined with stagnant technology. It is probably fair to say that it proved a key result […]: capital accumulation without technological advancement eventually leads to the end of economic growth.”39 Ironically, the process which had had as its grand goal the “universal development of productive forces”,40 leading to a situation where “with the all-around development of the individual, […] all the springs of co-operative wealth [would] flow more abundantly”,41 turned endogenously, through centralised planning, into a system which in 1980 would be bluntly characterised as the Economics of Shortage by the great Hungarian economist, János Kornai.42
IIIa.7 The real engine of economic growth
When these outcomes suddenly became apparent to all, in the last decade of the 20th century, the euphoria engendered by the ‘victory of capitalism’ and the ‘demise of socialism’ induced many to believe that globalisation, in conjunction with the free flow of capital, would bring about increasing wealth, provided economic growth could be channelled within the boundaries of ecological sustainability.
Forgetting about intrinsic economic uncertainty, the theory of efficient markets assumed that “financial markets [were] equivalent to insurance markets”,43 until the financial crisis of 2008 proved once more that financial stability remained an inherently uncertain precondition for securing further growth and a wider diffusion of wealth.
The economic slowdown which followed not only led to increased competition, but also highlighted the growing impossibility for advanced economies of guaranteeing continual growth of output just by increasing the inputs to the production process, as understood during industrialisation. A new paradigm won through: the key driver was none of the factors referred to until then; knowledge, and its continuous translation into innovation and into knowledge and technology transfer, was instead to be seen as the real engine of long-term economic growth.
In broad conceptual terms, Hayek44 had laid the ground for this new way of looking at development by introducing, already in the 1930s, the idea that in a complex economy know-how had necessarily to be highly dispersed, and that no central planner would ever be able to put it all
39 J.D. Sachs and J.W. McArthur, Technological Advancement and Long-Term Economic Growth in Asia, in Chong-En Bai and Chi-Wa Yuen (Eds.), Technology and the New Economy, p. 161, MIT Press, 2002
40 K. Marx, The German Ideology, 1845.
41 K. Marx, Critique of the Gotha Programme, 1875
42 J. Kornai, Economics of Shortage, North-Holland, 1980; Growth, Shortage and Efficiency, Blackwell, 1982; Contradictions and Dilemmas: Studies on the Socialist Economy and Society, The MIT Press, 1986; The Socialist System: The Political Economy of Communism, Clarendon Press, 1992
43 R. Skidelsky, Keynes: The Return of the Master, Allen Lane, 2009
44 In 1974, Friedrich A. Hayek was awarded the Nobel Prize in Economic Sciences jointly with G. Myrdal “for their pioneering work in the theory of money and economic fluctuations and for their penetrating analysis of the interdependence of economic, social and institutional phenomena”: www.nobelprize.org/nobel_prizes/economic-sciences/laureates/1974.
together;45
furthermore, in 1968 he had come to dub competition itself as a “discovery
procedure.”46
IIIa.8 Endogenous Technological Change
How the reorientation of economic research could technically make its way towards recognising the fundamental role of knowledge is in its own right a fascinating tale. This tale has been neatly told by, amongst others,47,48 David Warsh,49 showing how it started from the matter-of-fact assessment by Robert Solow,50 in 1957, that technological change accounted for seven-eighths of the growth of income per person in the US, while increases in capital stock accounted for only one-eighth.51 The new perception of the problem then slowly moved on, passing first through Lucas52 and Prescott in 1971,53,54 then Martin Weitzman in 1982,55 followed by Elhanan Helpman and Paul Krugman56 in 1987,57,58 until Paul Romer entered the scene in 1990.59 Romer clearly posited knowledge as
45 F.A. Hayek, Socialist Calculation, p. 181 (1935); Economics and Knowledge, p. 33 (1937); The Use of Knowledge in Society, p. 77 (1945), in F.A. Hayek, Individualism and Economic Order, University of Chicago Press, 1948
46 F.A. Hayek, Competition as a Discovery Procedure (1968), in F.A. Hayek, New Studies in Philosophy, Politics, Economics and the History of Ideas, p. 179, Routledge & Kegan, 1978.
47 D. Acemoglu, Introduction to Modern Economic Growth, Princeton University Press, 2008
48 P. Aghion and P. Howitt, The Economics of Growth, The MIT Press, 2009
49 D. Warsh, Knowledge and the Wealth of Nations: A Story of Economic Discovery, Norton, 2006
50 In 1987, Robert Solow was awarded the Nobel Prize in Economic Sciences “for his contributions to the theory of economic growth”: www.nobelprize.org/nobel_prizes/economic-sciences/laureates/1987.
51 R.M. Solow, Technical Change and the Aggregate Production Function, Review of Economics and Statistics, Volume 39, N. 3, p. 312, 1957; The last 50 years in growth theory and the next 10, Oxford Review of Economic Policy, Volume 23, N. 1, p. 3, 2007
52 In 1995, Robert E. Lucas was awarded the Nobel Prize in Economic Sciences “for having developed and applied the hypothesis of rational expectations, and thereby having transformed macroeconomic analysis and deepened our understanding of economic policy”: www.nobelprize.org/nobel_prizes/economic-sciences/laureates/1995.
53 R.E. Lucas and E.C. Prescott, Investment Under Uncertainty, Econometrica, Volume 39, N. 5, p. 659, 1971
54 R. Lucas, On the Mechanics of Economic Development, Journal of Monetary Economics, Volume 22, Issue 1, p. 3, 1988; Why Doesn’t Capital Flow from Rich to Poor Countries?, American Economic Review, Volume 80, N. 2, p. 92, 1990
55 M. Weitzman, Increasing Returns and the Foundations of Unemployment Theory, Economic Journal, Volume 92, Issue 368, p. 782, 1982.
56 In 2008, Paul Krugman was awarded the Nobel Prize in Economic Sciences “for his analysis of trade patterns and location of economic activity”: www.nobelprize.org/nobel_prizes/economic-sciences/laureates/2008.
57 E. Helpman and P. Krugman, Market Structure and Foreign Trade: Increasing Returns, Imperfect Competition, and the International Economy, MIT Press, 1987
58 E. Helpman, The Mystery of Economic Growth, Harvard University Press, 2004
59 P.M. Romer, Endogenous Technological Change, Journal of Political Economy, Volume 98, N. 5, p. 71, 1990; Increasing Returns and Long Run Growth, Journal of Political Economy, Volume 94, N. 5, p. 1002, 1986; Growth Based on Increasing Returns due to Specialization, American Economic Review, Volume 77, N. 2, p. 55, 1987; Crazy Explanations for the Productivity Slowdown, NBER (National Bureau of Economic Research) Macroeconomics Annual, Volume 2, p. 163, 1987; New goods, old theory, and the welfare costs of trade restrictions, Journal of Development Economics, Volume 43, Issue 1, p. 5, 1994
the endogenous factor that economists had hitherto failed to take adequately into account: its accumulation and deepening were the source of increasing returns, which explained why economic growth could be accelerating in rich countries, where the standard of living was diverging rapidly from that of poorer countries, contradicting the classical economic law of diminishing returns. It was by investing in knowledge that final output could increase more than proportionately, because of the impetus given this way to the introduction of new technology.
A metaphor Romer later made use of was plainly taken from everybody’s experience in the kitchen: “To create valuable final products – he stated – we mix inexpensive ingredients together according to a recipe. […] If economic growth could be achieved only by doing more and more of the same kind of cooking, we would eventually run out of raw materials […]. History teaches us, however, that economic growth springs from better recipes, not just from more cooking. […] Every generation has perceived the limits to growth that finite resources and undesirable side effects would pose if no new recipes or ideas were discovered. And every generation has underestimated the potential for finding new recipes and ideas. We consistently fail to grasp how many ideas remain to be discovered. […] Possibilities do not add up. They multiply.”60
IIIa.9 The appropriate set of market and non-market institutions
Clearly, this new approach implied the need to understand what interconnected set of market and non-market institutions could best make the innovation process work effectively.61 This subsequently implied understanding what type of strategies governments should follow in order to foster highly innovative economic systems.
A crucial feature of any innovation strategy would be, for instance, betting on the promotion of enhanced higher education as a major long-term government investment. Such a public policy priority would go hand in hand with the need to foster sufficient scale for a dynamic environment in which scientific and technological progress could jointly feed the development of a competitive innovation system.62 As a further feature, the option of an export-oriented, open economy would assume the whole world as a potential market, compensating
60 P.M. Romer, Economic Growth, in D.R. Henderson (Ed.), The Concise Encyclopedia of Economics, Liberty Fund, 2007
61 The need to be aware of the “richness of the institutional alternatives between which we have to choose” had already been most influentially brought forward by Ronald Coase (The Choice of Institutional Framework, Journal of Law and Economics, Volume 17, Issue 2, p. 493, 1974; The Institutional Structure of Production, The American Economic Review, Volume 82, Issue 4, p. 713, 1992), Oliver Williamson (The Economic Institutions of Capitalism: Firms, Markets, Relational Contracting, The Free Press, 1986), and Douglass North (Institutions, Institutional Change and Economic Performance, Cambridge University Press, 1990).
62 Both the above-reported features are, by the way, not at all new. They can be traced back to Das Nationale System der Politischen Ökonomie (1841), where Friedrich List (1789-1846) had written that “the present state of the nations is the result of the accumulation of all discoveries, inventions, improvements, perfections, and exertions of all generations which have lived before us: they form the mental capital of the present human race, and every separate nation is productive only in the proportion in which it has known how to appropriate these attainments of former generations, and to increase them by its own acquirements. […] There scarcely exists a manufacturing business which has no relation to physics, mechanics, chemistry, mathematics or to the art of design, etc. No progress, no new discoveries and inventions can be made in these sciences by which a hundred industries and processes could not be improved or altered. In the manufacturing State, therefore, sciences and arts must necessarily become popular” (quoted in D. Sainsbury, Progressive Capitalism: How To Achieve Economic Growth, Liberty and Social Justice, Biteback Publishing, 2013).
in this way, through the scope of the market, for the significant R&D investments which had to be taken into account as necessary ‘fixed costs’ and ‘barriers to entry’ of innovation endeavours, and which therefore needed to be recouped through subsequent worldwide sales. Last but not least, special financing mechanisms beyond the banking sector would prove crucial, since banks would normally not fund such projects, even those based on excellent ideas, because the knowledge constituting their background would be intangible and non-collateralisable. And if banks do not and should not lend against non-collateralised ideas, the innovation process then requires somebody else able to consider that ‘creativity and vision are resources’:63 venture capitalists, in the first place, and more generally a capital market making it possible for successful IPOs (Initial Public Offerings) to raise equity. This way, the openness to innovation processes implies not only specific forms of organisation that develop, test, and prove ideas,64 and “bureaucratic regulatory environments that [do not] impede capital and labour movements [nor] place unnecessary burdens on firm creation and dissolution”,65 but also a general frame of mind that truly takes into account that the economic death of old sectors is part and parcel of the advance of new sectors.66
IIIa.10 Limitless knowledge
On top of these, other features appear to be even more disruptive on a deeper theoretical level. While, with industrial development, economic growth was based on the usage of limited natural resources, knowledge, being immaterial, appears to be limitless. The passage “from atoms to bits”67 seems to imply also that the number of bytes we make use of can continue to grow exponentially.
This is not, however, the only post-scarcity element characterising what has been termed ‘cognitive capitalism’.68,69,70 Paradoxically, Marx’s insight on machines, being “organs of the human brain, created by the human hand”, which express “the power of knowledge, objectified”,71 acquires in this context a prophetic echo. So much so as to announce, 150 years in advance, “the beginning of a new phase of the division of labour in which the development of fixed capital indicates to what degree general social knowledge has become a direct force of production, and to what degree, hence, the conditions of the process of social life itself have come under the control of the general intellect and been transformed in accordance with it; to what degree the powers of social production have been produced,
63 E. Phelps, Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge, and Change, Princeton University Press, 2013
64 Ibidem: McKinsey estimated that, out of 10,000 business ideas, 1,000 firms are founded, 100 receive venture capital, 20 go on to raise capital in an initial public offering of shares, and 2 become market leaders.
65 R.D. Atkinson and S.J. Ezell, Innovation Economics: The Race for Global Advantage, Yale University Press, 2012
66 Ibidem: Atkinson and Ezell remark that “even though Schumpeter was a European, Europeans are not Schumpeterians. They want the benefit of a knowledge-based economy without the creative destruction that not only accompanies it but also is required to achieve it”.
67 N. Negroponte, Being Digital, Knopf, 1995
68 C. Vercellone (Ed.), Capitalismo cognitivo: Conoscenza e finanza nell’epoca post-fordista, Manifestolibri, 2006
69 Y. Moulier Boutang, Le capitalisme cognitif: La nouvelle grande transformation, Editions Amsterdam, 2007
70 A. Fumagalli and S. Lucarelli, A model of Cognitive Capitalism: a preliminary analysis, European Journal of Economic and Social Systems, Volume 20, N. 1, p. 117, 2007
71 K. Marx, The Fragment on Machines, Grundrisse, 1857-1858
not only in the form of knowledge, but also as immediate organs of social practice, of the real
life process.”72
It is irrelevant to establish, here, whether Marx was positing, in these passages, “the possibility of a direct transition to communism.”73 What is significant for us is the fact that the current “radical transformation of the foundations of wealth”74 appears to be the consequence of a “virtualisation of the economy” entering a condition of growing non-scarcity.
The non-rivalness,75 non-excludability,76 cumulativeness,77 and network78 characteristics of knowledge have “the potential of creating a ‘combinatorial explosion’ [allowing it to achieve] almost infinitely increasing returns.”79 Correlatively, a knowledge-based economy would, however, be expected to prevent natural market incentives from achieving allocatively efficient outcomes. But instead of producing a “tragedy of the commons” – as a much-cited theory put forward by Garrett Hardin80 would have it – the final result looks likely to be a ‘happy ending’, provided it is understood that “for managing ‘knowledge commons’ the social regulations which will be needed are to be totally different from those generally used for regulating systems founded on exhaustible resources.”81,82
IIIa.11 Post-scarcity and networks
To explain this assumption, Yann Moulier Boutang has recourse to the metaphor of pollination by bees. Google – he writes – employs 19,000 people at its premises in Mountain View, California, while at the same time having 15,000,000 people who work for it for free, producing information and creating a network: “The value of this network creation activity can be compared to that developed by bees when they pollinate. The value of their honey, and of the wax extracted from their alveoles, when brought to the market, is from 350 to 1,000 times
72 Ibidem, Chapter The Development of Machinery: “In a word – Marx had stated, just a few lines above – it is the development of the social individual which appears as the great foundation-stone of production and of wealth”. In publishing Das Kapital, Marx would confine to a footnote the remark that: “A critical history of technology would show how little any of the inventions of the 18th century are the work of a single individual. Hitherto there is no such book.”
73 C. Vercellone, The Hypothesis of Cognitive Capitalism, paper presented at Birkbeck College and at the SOAS (School of Oriental and African Studies) Annual Conference on Towards a Cosmopolitan Marxism, 2005
74 Y. Moulier Boutang, Le capitalisme cognitif: La nouvelle grande transformation, Editions Amsterdam, 2007
75 Non-rivalness is the characteristic of indivisible benefits of consumption, such that one person’s consumption does not preclude that of another.
76 Non-excludability is the characteristic that makes it impossible to prevent others from sharing in the benefits of consumption.
77 Cumulativeness is the characteristic that allows the value of existing knowledge to be increased by adding further knowledge to it.
78 Network is the characteristic by which the utility that a user derives from consumption of a good increases with the number of other agents consuming the good.
79 D. Foray, L’économie de la connaissance, La Découverte, 2000
80 G. Hardin’s thesis was that, when sharing a resource, individuals acting independently and having in mind only their own self-interest would act contrary to the group’s long-term best interests and deplete the common resource (The Tragedy of the Commons, Science, Volume 162, N. 3859, p. 1243, 1968).
81 See footnote 74.
82 S.J. Liebowitz and S.E. Margolis, Network Externality: An Uncommon Tragedy, Journal of Economic Perspectives, Volume 8, N. 2, p. 133, Spring 1994
lower than the value of the pollination that they perform. Google achieves to draw market
profit from human pollination.”83
On a different side of the political spectrum, a similar perception of the role of networks – which exist when a product's value to the user increases as the number of users of the product grows, and "each new user of the product derives private benefits, but also confers external benefits (network externalities) on existing users"84 – has led to a focus on the concept that "today the most interesting business models are in finding ways to make money around Free"85, i.e., to make use of network effects and of the correlated economies of scale by providing a digital good for free (or almost free), so as to create a new market in which a strong monopoly power can be exerted and extraordinary profits reaped.86,87
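The external-benefit arithmetic can be illustrated with a toy calculation. The Metcalfe-style assumption that total network value grows with the number of user pairs is ours, for illustration only, and is not taken from the cited authors.

```python
# Toy model of a network externality: each new user gains a private
# benefit but also raises the value of the network for every existing user.

def network_value(n: int, value_per_link: float = 1.0) -> float:
    """Total value of an n-user network: one unit of value per user pair."""
    return value_per_link * n * (n - 1) / 2

def externality_of_joining(n: int) -> float:
    """Extra value conferred on the n existing users when user n+1 joins."""
    return network_value(n + 1) - network_value(n)

# e.g. n=10: total value 45.0, and the 11th user adds 10.0 units of value
# that accrue to the existing users, not to the newcomer alone.
for n in (10, 100, 1000):
    print(n, network_value(n), externality_of_joining(n))
```

The quadratic growth of `network_value` against the linear growth of user count is what makes "giving the good away" rational: each free copy enlarges the value of the installed base.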
Abundance thinking – as Chris Anderson has written – is not only discovering “what will
become cheaper, but also looking for what will become more valuable as a result of that
shift, and moving to that. It’s the engine of growth, something we've been riding since even
before David Ricardo defined the ‘comparative advantage’ of one country over another in the
eighteenth century. Yesterday’s abundance consisted of products from another country with
more plentiful resources or cheaper labour. Today’s also consists of products from the land of
silicon and glass threads.”88
Chris Anderson has tried to describe some of the conceptual passages implied by the shift
from scarcity to abundance as shown in table IIIa.1. His basic assumption is that the new way
to compete within the Free economic environment will be “to move past the abundance to
find the adjacent scarcity. If software is free, sell support. If phone calls are free, sell distant
labour and talent that can be reached by those free calls (the Indian outsourcing model in a
nutshell). If your skills are being turned into a commodity that can be done by software
(hello, travel agents, stockbrokers, and realtors), then move upstream to more complicated
problems that still require the human touch. Not only can you compete with Free in that
instance, but the people who need these custom solutions are often the ones most willing to
pay highly for them.”89
Table IIIa.1 Examples of scarcity and abundance thinking.90

|  | Scarcity | Abundance |
| Rules | Everything is forbidden unless it is permitted | Everything is permitted unless it is forbidden |
| Social model | Paternalism ('We know what's best') | Egalitarianism ('You know what's best') |
| Profit plan | Business model | We'll figure it out |
| Decision process | Top-down | Bottom-up |
| Management style | Command and control | Out of control |
IIIa.12 Intellectual property rights
Would, however, Free destroy the financial incentives needed to push individuals and firms to undertake the costs, risks, and efforts of developing new knowledge? Particularly in research – it had been stated – "there are huge incentives to free-ride the system: let someone else finance the big breakthroughs and let us focus our money on development, where big payoffs are not far away. Even among governments, there is now an incentive to free-ride and let some other government somewhere else in the world pay for the basic research."91

83. Y. Moulier Boutang, L'abeille et l'économiste, Carnets Nord, 2010
84. W.H. Page and J. E. Lopatka, Network Externalities, Encyclopedia of Law and Economics, 1999
85. C. Anderson, Free: The Future of a Radical Price, Random House, 2009
86. O. Bomsel, Gratuit! Du déploiement de l'économie numérique, Gallimard, 2007
87. O. Shy, The Economics of Network Industries, Cambridge University Press, 2001
88. See footnote 749
89. Ibidem
90. Ibidem
According to Christensen, the argument that Free represents an attack against intellectual
property rights “straddles the line between libre and gratis. […] The history of intellectual
property [is] based on the long traditions of the scientific world, where researchers freely
build on the published work of those who came before. In the same vein, the creators of the
patent system (led by Thomas Jefferson) wanted to encourage sharing of information, but
they realized that the only way people thought they could get paid for their inventions was-to
hold them secret. So the founding fathers found another way to protect inventors––the
seventeen-year patent period. In exchange for open publication of an invention (libre), the
inventor can charge a license fee (not gratis) to anyone who uses it for the term of the patent.
But after that term expires, the intellectual property will be free (gratis). […] However, a
growing community of creators doesn’t want to wait that long. They’re choosing to reject
these rights and release their ideas […] under licenses such as Creative Commons […]
believe that real Free – both gratis and libre – encourages innovation by making it easier for
other people to remix, mash up, and otherwise build on the work of others.”92
On the whole, innovative societies have adopted pragmatic compromises, such as that basic
scientific discoveries, in general, are not patentable, and that patents are limited to specific
new technologies, and – as we have just seen - are given for a limited period of time. Still, the
indivisibility, uncertainty and externalities characteristics of knowledge-generating activities
may determine a situation of ‘market failure’: on the one hand, markets provide too few
incentives to introduce new innovations, and the production of information may well,
therefore, be insufficient from a social point of view; on the other, innovators and creators
face a consistent risk of incomplete returns for their efforts.
IIIa.13 Governments’ support of scientific research
This problem of appropriability has widely led governments to support basic scientific
discovery through direct subsidization of primary research in universities, government
research laboratories, and even private companies that qualify for government grants. But in
fact the state does more than just fix market failures: “Large-scale and long-term government
investment has been the engine behind almost every general purpose technology in the last
century.”93,94,95,96,97
This is true for ICT (Information and Communication Technology), but also "three-quarters of the new molecular biopharmaceutical entities owe their creation to publicly funded laboratories," and "in the past ten years, the top ten companies in this industry have made more in profits than the rest of the Fortune 500 companies combined."98,99 Yet, in general, "nothing at all is paid in royalties. It is simply assumed that the public investment is meant to help create profits for the firms in question, with little to no thinking about the obvious distorted distribution of risk and reward this presents."100,101

91. L. Thurow had squarely advocated that, "just as the industrial revolution began with an enclosure movement in England that abolished common land and created private land, the world now needs an organized enclosure movement for intellectual property rights" (Building Wealth: The New Rules for Individuals, Companies, and Nations in a Knowledge-Based Economy, HarperCollins, 1999).
92. C. Anderson further posits: "Generation Free doesn't assume that as go bits, so should go atoms. They don't expect to get their clothes or apartments for free; indeed they're paying more than ever for those. Give the kids credit: They can differentiate between the physical and the virtual, and they tailor their behavior differently in each domain" (Free: The Future of a Radical Price, Random House, 2009).
93. V.W. Ruttan, Is War Necessary for Economic Growth?: Military Procurement and Technology Development, in M. Mazzucato, The Entrepreneurial State: Debunking Public vs. Private Sector Myths, p. 62, Anthem Press, 2013
94. F.L. Block and M.R. Keller (Eds.), State of Innovation: The U.S. Government's Role in Technology Development, Paradigm, 2011
95. E. Helpman (Ed.), General Purpose Technologies and Economic Growth, The MIT Press, 1998
This has led Mariana Mazzucato to devote a chapter of her 2013 book, The Entrepreneurial State, to the "socialisation of risk and privatisation of rewards," and to ask: "can the Entrepreneurial State eat its cake too?"102,103
Whatever the prevailing answer turns out to be, and whether or not research funding comes to look more like venture capital, maintaining a stake in the future exploitation of results, the key theoretical issue at stake is reaching a much deeper understanding of the way markets, organisations, and "commons-based peer production" reciprocally position themselves with regard to their specific comparative advantages.
IIIa.14 The remaining scarce resource is human creativity
Yochai Benkler is the author who has most eminently raised the possibility that a 'networked
information economy’ can prove more economically efficient than one in which innovation is
encumbered by the strict enactment of exclusive intellectual property rights, when the
marginal cost of re-producing most information is effectively nearing zero. In a context in
which physical capital costs for fixation and communication of knowledge are low and
widely distributed, and where existing information is itself a public good, the primary
remaining scarce resource – he states – is human creativity, and it is in this context that the
relative advantages of peer production emerge, highlighting how, under certain
circumstances, peering can in fact be a more cost-effective institutional form than either
markets or hierarchical organizations.104
Benkler has provided a simple table showing how ideal organisational forms can be deduced as a function of relative social cost (table IIIa.2).
Table IIIa.2 Ideal organisational forms as a function of relative social cost.105

|  | Property system more valuable than implementation costs | Property system implementation costs higher than opportunity cost |
| Market exchange of x more efficient than organizing or peering of x | Markets (Farmers markets) | Commons (Ideas & facts; roads) |
| Organizing x more efficient than market exchange or peering of x | Firms (Automobiles; shoes) | Common property regimes (Swiss pastures) |
| Peering of x more efficient than organizing or market exchange of x | Proprietary 'open source' efforts (Xerox's Eureka) | Peer production processes* (NASA106 Clickworkers) |

*'Cost' would include the negative effects of intellectual property on dissemination and downstream productive use.

96. B. Jovanovic and P.L. Rousseau, General purpose technologies, in P. Aghion and S. Durlauf (Eds.), Handbook of Economic Growth, Elsevier, 2005
97. R.G. Lipsey et al., Economic Transformations: General Purpose Technologies and Long-Term Economic Growth, OUP, 2005
98. The Fortune 500 is an annual list compiled and published by Fortune magazine that ranks the top 500 U.S. closely held and public corporations by their gross revenue, after adjustments made by Fortune to exclude the impact of excise taxes companies incurred. The list includes publicly and privately held companies for which revenues are publicly available. The first Fortune 500 list was published in 1955.
99. M. Mazzucato, The Entrepreneurial State: Debunking Public vs. Private Sector Myths, Anthem Press, 2013
100. M. Angell, The Truth about the Drug Companies, Random House, 2004
101. M. Mazzucato and G. Dosi (Eds.), Knowledge Accumulation and Industry Evolution: The Case of Pharma-Biotech, Cambridge University Press, 2006
102. See footnote 763
103. L. Burlamaqui et al., Knowledge Governance: Reasserting the Public Interest, Anthem, 2012
104. Y. Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom, Yale University Press, 2006
105. Y. Benkler, Coase's Penguin, or, Linux and The Nature of the Firm, The Yale Law Journal, Volume 112, Issue 3, p. 400, 2002
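As a minimal sketch, the comparisons underlying table IIIa.2 can be written as a decision function. The function name, the cost parameters, and the numeric inputs are illustrative assumptions; Benkler's argument is qualitative, not an algorithm.

```python
# Toy decision function mirroring table IIIa.2: the ideal organisational
# form falls out of (a) which coordination mechanism is cheapest and
# (b) whether a property system is worth its implementation cost.

def ideal_form(market_cost, organizing_cost, peering_cost,
               property_system_cost, property_system_value):
    """Pick the ideal organisational form from relative social costs."""
    cheapest = min(market_cost, organizing_cost, peering_cost)
    property_pays = property_system_value > property_system_cost
    if cheapest == market_cost:
        return "Markets" if property_pays else "Commons"
    if cheapest == organizing_cost:
        return "Firms" if property_pays else "Common property regimes"
    return "Proprietary 'open source'" if property_pays else "Peer production"

# Information goods: peering is cheapest and property adds cost, not value.
print(ideal_form(market_cost=5, organizing_cost=4, peering_cost=1,
                 property_system_cost=3, property_system_value=1))
# prints "Peer production"

# Shoes: organizing is cheapest and a property system is worth its cost.
print(ideal_form(market_cost=3, organizing_cost=1, peering_cost=9,
                 property_system_cost=1, property_system_value=5))
# prints "Firms"
```

The point of the sketch is only that the table's six cells are the cross-product of two independent comparisons, which is why strong intellectual property can be optimal in one cell and pure loss in another.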
IIIa.15 Different organisational modes for overcoming uncertainty
Behind this taxonomy lies a set of recent fundamental shifts in economic reasoning, which Yochai Benkler summarises in four conceptual moves:
1. Ronald Coase107 asked why clusters of individuals operate under the direction of an entrepreneur, a giver of commands, rather than interacting purely under the guidance of prices, and answered that using the price system is costly (in terms of 'transaction costs').108
2. Assuming that the cost of organization increases with size, Coase consequently posited that "any given firm will cease to grow when the increased complexity of its organization makes its internal decision costs higher than the costs that a smaller firm would incur to achieve the same marginal result."109 Coase could thus assume a "natural" – i.e., internal to the theory – limit on the size and number of organizations.
3. Harold Demsetz's treatment of property rights110 added a specific element of explanation: property in a resource emerges if the social cost of having no property in that resource exceeds the social cost of implementing a property system in it.
4. Benkler posits that, when "the social cost of using existing information as an input into new information production is zero", "the digitization of all forms of information and culture has made the necessary physical capital cheaper by orders of magnitude than in the past", and "the dramatic decline in communications costs radically reduces the cost of peering", the centrality of human capital to information production and its variability becomes "the primary source of efficiency gains from moving from markets or hierarchical organization to peering."111
Hence the conclusion that "commons-based peer production creates better information about available human capital and can better allocate creative effort to resources and projects."112

106. NASA (National Aeronautics and Space Administration)
107. In 1991, Ronald H. Coase was awarded the Nobel Prize in Economic Sciences "for his discovery and clarification of the significance of transaction costs and property rights for the institutional structure and functioning of the economy": www.nobelprize.org/nobel_prizes/economic-sciences/laureates/1991.
108. R.H. Coase, The Nature of the Firm (1937), now in The Firm, the Market, and the Law, University of Chicago Press, 1988
109. Ibidem
110. H. Demsetz, Toward a theory of property rights, American Economic Review, Volume 57, N. 2, p. 347, 1967; Ownership, Control, and the Firm: The Organization of Economic Activity, Blackwell, 1988
111. Y. Benkler, Coase's Penguin, or, Linux and The Nature of the Firm, The Yale Law Journal, Volume 112, Issue 3, p. 404, 2002
Different organisational modes have different strategies for overcoming persistent
uncertainty: markets reduce uncertainty regarding allocation decisions through prices that act
as clear and comparable signals for choosing which use of the relevant factors would be most
efficient; firms or hierarchical organizations resolve uncertainty by instituting a process by
which information about which actions to follow is ordered. Both are, however, 'lossy' media, in the sense – Benkler specifies – that "much of the information that was not introduced in a form or at a location that entitled it to 'count' toward an agent's decision is lost."113
IIIa.16 Information and allocation gains of peer production
It is with reference to this intrinsic ‘lossiness’ that peering may end up having lower
information opportunity costs than markets or hierarchies. In particular, Benkler suggests that the primary source of gains peer production offers – which he calls 'information gains' – is its capacity to collect and process information about human capital.
“Human intellectual effort – he posits – is highly variable and individuated. People have
different innate capabilities, personal, social, and educational histories, emotional
frameworks, and ongoing lived experiences. These characteristics make for immensely
diverse associations with, idiosyncratic insights into, and divergent utilization of, existing
information and cultural inputs at different times and in different contexts. Human creativity
is therefore very difficult to standardize and specify in the contracts necessary for either
market-cleared or hierarchically organized production.”114
In addition to these information gains, peering also has potential allocation gains, enabled by the large sets of resources, agents, and projects available to this form of organisation. "As
peer production relies on opening up access to resources for a relatively unbounded set of
agents, […] this is likely to be more productive than the same set could have been if divided
into bounded sets in firms.” Furthermore, while the use of a rival resource would have
excluded the use by others, this is not true for a purely non-rival good like information, where
the allocation gain can be attained by “allocating the scarce resource – human attention,
talent, and effort – given the presence of non-rival resources to which the scarce resource is
applied with only a probability of productivity.”115
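Benkler's allocation-gain claim can be made concrete with a toy simulation, whose set-up (random fit scores, pool sizes, the partitioning into firms) is entirely our assumption: when a task can draw on one large, open pool of agents rather than a bounded in-firm subset, the best available fit is never worse and is usually better.

```python
# Toy simulation: best agent-task fit in one open pool of 100 agents
# versus the average best fit inside ten bounded 10-agent "firms".
import random

random.seed(0)
fits = [random.random() for _ in range(100)]   # agent-task fit scores

open_pool_best = max(fits)                     # peer production: whole set
firm_bests = [max(fits[i:i + 10]) for i in range(0, 100, 10)]
avg_firm_best = sum(firm_bests) / len(firm_bests)

# The open pool's best fit dominates the average firm's best fit, because
# the maximum over the whole set is at least the maximum over any subset.
print(open_pool_best >= avg_firm_best)  # prints True
```

The inequality holds by construction; the simulation only puts numbers on how large the gap tends to be when talent is widely dispersed.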
Benkler’s conclusion entails, therefore, a criticism of any attempt to apply strong intellectual
property rights with regard to peer-produced information: “Since the core of commons-based
peer production entails provisioning without direct appropriation and since indirect
appropriation—intrinsic or extrinsic—does not rely on control of the information but on its
widest possible availability, intellectual property offers no gain, only loss, to peer
production.”116
The idea that openness and connectivity may, in the end, be more valuable to innovation than purely competitive mechanisms is shared also by Steven Johnson, who states that "we are often better served by connecting ideas than we are by protecting them. [...] good ideas may not want to be free, but they do want to connect, fuse, and recombine. They want to reinvent themselves by crossing conceptual borders. They want to complete each other as much as they want to compete."117

112. Y. Benkler, The Penguin and the Leviathan: How Cooperation Triumphs over Self-Interest, Crown Business, 2011
113. Y. Benkler, Coase's Penguin, or, Linux and The Nature of the Firm, The Yale Law Journal, Volume 112, p. 409, 2002
114. Ibidem, p. 412
115. Ibidem, p. 421
116. Ibidem
Ideas – he writes – "rise in liquid networks where connection is valued more than protection"118, and on this ground "perhaps 'commons' is the wrong word [...], though it has a long and sanctified history in intellectual property law. The commons metaphor doesn't suggest the patterns of recycling and exaptation119 and recombination that define so many innovation spaces. I prefer another metaphor from nature: the [coral] reef"120, which is the most extraordinary engine of biological innovation: "Coral reefs make up about one-tenth of one percent of the earth's surface, and yet roughly a quarter of the known species of marine life make their homes there"121, and what makes them so inventive is not "the struggle between the organisms but the way they have learned to collaborate."122
IIIa.17 An ecosystem of technologies leading to the singularity?
By analogy – says Johnson – Kuhn's paradigms of research are like coral reefs, "raised by myriads of tiny architects"123, which provide fertile environments for new developments and represent "the scientific world's equivalent of a software platform: a set of rules and conventions that govern the definition of terms, the collection of data, and the boundaries of inquiry for a particular field."124 Since modern scientific paradigms are rarely overthrown, but rather tend to be built upon, the outcome of such a process is a stacked platform that supports new paradigms above the old ones, and, "in a funny way, the real benefit [of it] lies in the knowledge you no longer need to have."125
Such an ecosystem of technologies is expected to grow in complexity and colonise new areas, just like life itself. Kevin Kelly calls it the 'technium', and describes it as a living, evolving organism with its own unconscious needs and tendencies: just as water "wants" to flow downhill and life tends to fill available ecological niches, so technology similarly tends to expand and evolve.126
In a visionary book published in 2005,127
the Director of Engineering at Google, Ray
Kurzweil, predicted that 2045 will be the year of the singularity,128
when computers meet or
exceed human computational ability and when their ability to recursively improve themselves
can lead to an ‘intelligence explosion’ that will affect all aspects of human culture and
technology, ultimately making humans and their machines merge and become
indistinguishable from each other. "Once a planet yields a technology-creating species – Kurzweil stated – and that species creates computation (as has happened here), it is only a matter of a few centuries before its intelligence saturates the matter and energy in its vicinity, and it begins to expand outward at least at the speed of light (with some suggestions of circumventing this limit). Such a civilization will then overcome gravity (through exquisite and vast technology) and other cosmological forces – or, to be fully accurate, it will manoeuvre and control these forces – and engineer the universe it wants. This is the goal of the singularity."129

117. S. Johnson, Where Good Ideas Come From: The Seven Patterns of Innovation, Penguin, 2011
118. Ibidem
119. The term is taken from S.J. Gould and E. Vrba, Exaptation: A Missing Term in the Science of Form, Paleobiology, Volume 8, N. 1, p. 4, 1982, where they suggested (p. 6) that "characters, evolved for other usages (or for no function at all), and later 'coopted' for their current role, be called exaptations".
120. S. Johnson, Where Good Ideas Come From: The Seven Patterns of Innovation, Penguin, 2011
121. Ibidem
122. Ibidem
123. This was the expression used by Charles Darwin admiring the Keeling Islands in April 1836 (The Voyage of the Beagle, Penguin, 1989).
124. See footnote 784
125. Ibidem
126. K. Kelly, What Technology Wants, Viking, 2010
127. R. Kurzweil, The Singularity Is Near: When Humans Transcend Biology, Duckworth, 2005
128. Mathematics defines a 'singularity' as a point at which an object changes its nature so as to attain properties that are no longer the expected norms for that class of objects.
IIIa.18 Big Data analytics and data-intensive healthcare
While the forecast of such a technological singularity remains highly controversial, in 2009 Microsoft Research published a book of essays entitled The Fourth Paradigm: Data-Intensive Scientific Discovery.130 One of the essays was dedicated to The Healthcare Singularity and the Age of Semantic Medicine,131 on the assumption that medicine would likely be the first Big Data domain to reach a threshold moment, dubbed the 'Healthcare Singularity', making medical knowledge become 'liquid' and its flow from research to practice ('bench to bedside') frictionless and immediate.132
In the few years since, a number of projects and achievements have brought similar goals much nearer. The HBP (Human Brain Project)133,134,135 has been awarded 1 billion euros of FET (Future and Emerging Technologies) flagship funding for the next 10 years by the European Commission. IBM's Watson,136 the artificially intelligent supercomputer system capable of answering questions posed in natural language, which won the American television quiz show Jeopardy!, can process 500 gigabytes, the equivalent of a million books, per second, and uses natural language capabilities, hypothesis generation, and evidence-based learning to support medical professionals as they make decisions. MD-Paedigree (Model-Driven European Paediatric Digital Repository),137 the clinically-led FP7 IP (7th Framework Programme Integrated Project) that won the Best Exhibit Award at the ICT '13 European Conference in Vilnius, Lithuania, in 2013, is applying Big Data analytics to build a scans-based, genomic and meta-genomic repository of routine patients in four disease areas. The 'model-driven' repository is meant to provide, as key functionalities: 1) similarity search, allowing clinicians to access 'patients like mine' (and to find decision support for optimal treatment, also based on comparative outcome analysis) and allowing patients to get in touch with 'patients exactly like me'; 2) model-based patient-specific simulation and prediction; and 3) patient-specific clinical workflows.
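The core of the 'patients like mine' idea can be sketched as a nearest-neighbour search over patient feature vectors. The feature names, toy data, and Euclidean metric below are illustrative assumptions, not MD-Paedigree's actual pipeline.

```python
# Toy 'patients like mine' lookup: rank a cohort by Euclidean distance
# from a query patient's numeric feature vector, return the k closest.
import math

def similar_patients(query, cohort, k=2):
    """Return the k cohort entries closest to the query's feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sorted(cohort, key=lambda p: dist(p["features"], query["features"]))[:k]

cohort = [
    {"id": "p1", "features": [8.0, 22.5, 1.1]},   # e.g. age, BMI, biomarker
    {"id": "p2", "features": [12.0, 30.0, 2.3]},
    {"id": "p3", "features": [9.0, 23.0, 1.0]},
]
query = {"id": "new", "features": [8.5, 22.0, 1.2]}
print([p["id"] for p in similar_patients(query, cohort)])  # → ['p1', 'p3']
```

A production system would add feature normalisation, clinically validated similarity metrics, and indexing for scale; the sketch only shows where comparative outcome analysis would plug in, namely on the retrieved neighbours.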
129. R. Kurzweil, The Singularity Is Near: When Humans Transcend Biology, Duckworth, 2005
130. T. Hey et al. (Eds.), The Fourth Paradigm: Data-Intensive Scientific Discovery, Microsoft Research, 2009
131. M. Gillam et al., The Healthcare Singularity and the Age of Semantic Medicine, Microsoft, 2009
132. Ibidem
133. H. Markram et al., The Human Brain Project: A Report to the European Commission, HBP-PS (Human Brain Project Preparatory Study) Consortium, EPFL (École Polytechnique Fédérale de Lausanne), 2012
134. R. Kurzweil, How To Create A Mind: The Secrets Of Human Thought Revealed, Viking, 2012
135. M. Kaku, The Future of the Mind: The Scientific Quest to Understand, Enhance and Empower the Mind, Penguin, 2014
136. J. E. Kelly III and S. Hamm, Smart Machines: IBM's Watson and the Era of Cognitive Computing, Columbia University Press, 2013
137. MD-Paedigree (Model-Driven European Paediatric Digital Repository) Newsletter, N. 1, 2013: www.md-paedigree.eu.
Big Data analytics, and specifically Big Data healthcare, are becoming an increasingly widespread concern.138,139,140,141,142,143
Due to the staggering, and ever growing, size and complexity of bio-medical Big Data,
computational modelling in medicine is likely to provide the most intriguing insights into the
emerging complementary and synergistic relationship between computational and living
systems. Bio-medical research institutions and related industries, as well as the whole
healthcare and pharmaceutical sectors, must therefore be considered as key stakeholders in
the process leading to the next generation of data-centric systems. Whether these are systems
capable of learning from data, or data-analysis products and applications capable of
translating medical knowledge discovery into widespread medical practice, they will put
predictive power in the hands of clinicians and healthcare policy makers.
At the same time, Big Data applications in medicine radically change the capacity for going beyond the average patient population, by searching for specific cohorts of patients fitting into very particular niches of their own. Such an approach can show the way to truly personalised medicine. Applying the Long Tail144 insight to bio-medical research and to drug discovery will lead to people receiving treatments and drugs specifically targeted to their own genomic, proteomic, and metagenomic characterisation. Tailoring treatments, drugs and research to everyone's individual needs is precisely the point of the long tail approach. Thus, a focus on the long tail in healthcare should allow medicine to better address all those diseases and ailments suffered by a relatively small number of people, or by a large number of people whose common conditions have numerous underlying causes. Furthermore, Big Data will allow for 'long-tail medicine': drugs with enhanced personalised information content, based on customised algorithms tackling the individual disease conditions that can only be addressed by personalised treatment.
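The long-tail arithmetic itself can be sketched with a toy Zipf-style catalogue; the exponent and the catalogue sizes are our assumptions, chosen only to show the mechanism. Past some rank, the many small niches jointly outweigh the few blockbusters at the head.

```python
# Toy long-tail model: demand proportional to 1/rank, so each niche item
# sells little, but the tail as a whole outsells the head.

def zipf_sales(n_items: int, s: float = 1.0):
    """Sales proportional to 1/rank**s for ranks 1..n_items."""
    return [1.0 / rank ** s for rank in range(1, n_items + 1)]

sales = zipf_sales(100_000)
head = sum(sales[:100])   # the top 100 'blockbusters'
tail = sum(sales[100:])   # everything else: the long tail
print(tail > head)        # prints True: the tail cumulatively wins
```

In the healthcare reading of the analogy, the 'head' is the average patient served by blockbuster drugs and the 'tail' is the mass of small, precisely characterised cohorts that only become addressable when transaction costs (here, the cost of finding and characterising each cohort) collapse.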
Amidst signals of growing difficulty in guaranteeing the long-term sustainability of all European welfare states, the acceleration of technological innovation in healthcare has long appeared as a further cost driver. Leaving aside whether such acceleration could clear the way to the hoped-for 'singularity', a knowledge-based redesign of healthcare systems (including how to assess value in people's lives) now needs to become a political and economic priority, in order to overcome the residual grip of scarcity which is hindering mankind from attaining its potential of holistic growth. Ultimately, wealth is health.
138. N. Silver, The Signal and the Noise: The Art and Science of Prediction, Penguin, 2012
139. B. Kayyali et al., The big-data revolution in US health care: Accelerating value and innovation, McKinsey & Company, 2013
140. Intel IT Centre, Big Data Analytics, Peer Research Report, Intel Corporation, 2013
141. E. Morley-Fletcher, Big Data Healthcare: An overview of the challenges in data-intensive healthcare, discussion paper, Networking Session, ICT (Information and Communications Technology) '13 Conference, 2013
142. V. Mayer-Schönberger and K. Cukier, Big Data: A Revolution That Will Transform How We Live, Work and Think, Murray, 2013
143. T. Davenport and J. Kim, Keeping Up with the Quants: Your Guide To Understanding and Using Analytics, Harvard Business Review Press, 2013
144. C. Anderson developed the concept that, when transaction costs are greatly lowered, "the biggest money is in the smallest sales", whereby a series of small niches cumulatively achieves a much larger amount than the traditional focus on selling a preferred selection of blockbuster items. The 'long tail' can be extremely lengthened as a result. Consumers really can find and choose whatever they want, no matter how little popular demand there is for it, and retail niches which were not economically viable in the past can now better fulfil the market. An analogy to epidemiological studies does apply here (The Long Tail: How Endless Choice is Creating Unlimited Demand, Random House, 2006).