
Page 1

Eurostat benchmarking project

Eurostat unit B1

Warsaw, 21-22/11/2018

Page 2

Where Do We Come From? What Are We? Where Are We Going?

Page 3

• 2016 – Best Practice Case Template - 16 open-ended questions

• 2017 – Gartner – Benchmark of MS architectures and IT organisation

Where Do We Come From?

Page 4

• What Are We?

• Where Are We Going?

Eurostat benchmarking project

Page 5

• The "Eurostat benchmarking project" covers Validation, ESBRs, "Data

Services" and Linked Open Data (LOD);

• The objective of the project is to provide Eurostat and the ESS partners

with tangible and actionable information about the statistical production

landscape (processes and tools);

• The project team (Eurostat project managers and contractor) analyses

ESS and Eurostat documentation, designs and carries out online

surveys (with the EU-Survey tool) as well as conducts interviews with

statisticians and IT professionals in the NSIs;

• The findings and recommendations will be collated in reports for

member states and Eurostat.

Scope of the project

Page 6

Example of benchmarking question (data validation)
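The original slide presumably reproduced the survey question itself, which is not recoverable here. As a purely hypothetical illustration of the kind of data validation rules such a benchmarking question asks NSIs about, the Python sketch below checks a completeness rule and an internal-consistency rule on a few sample records; all field names and values are invented for the example and are not part of the benchmarking material.

```python
# Hypothetical sketch of two basic validation rules (not taken from the benchmark).
# Rule 1: mandatory fields must be present and non-empty (completeness).
# Rule 2: component values must sum to the reported total (internal consistency).

records = [
    {"geo": "PL", "year": 2018, "total": 100, "male": 52, "female": 48},
    {"geo": "DE", "year": 2018, "total": 100, "male": 51, "female": 50},  # inconsistent
    {"geo": "FR", "year": 2018, "total": None, "male": 49, "female": 51},  # incomplete
]

MANDATORY = ("geo", "year", "total")

def validate(record):
    """Return a list of rule violations for one record."""
    errors = []
    # Rule 1: completeness check on mandatory fields.
    for field in MANDATORY:
        if record.get(field) in (None, ""):
            errors.append(f"missing mandatory field '{field}'")
    # Rule 2: consistency check, only meaningful if the total is present.
    if record.get("total") is not None:
        if record["male"] + record["female"] != record["total"]:
            errors.append("male + female does not equal total")
    return errors

for rec in records:
    problems = validate(rec)
    print(rec["geo"], "->", "OK" if not problems else "; ".join(problems))
```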

Page 7

Combining a backward- and forward-looking perspective

• Provide a constructive and open channel to assess what has been done across the ESS partners;

• Collect feedback on what was successful to date and where there is room for improvement;

• Gather a combined picture of the upcoming priorities of NSIs in the field of statistical data/metadata management and data analysis in the light of the expected trends (big data, AI, IoT, …);

Benchmarking approach

Page 8

Achievements and feedback

• Analyse what has been achieved across the ESS partners;

• Collect feedback on the work of the last few years to identify which initiatives added the highest value;

• NSIs can rate the effectiveness of the different projects and share their own achievements with the ESS as a whole;

• Countries can outline how they collaborated with other member states;

• Analyse how NSIs selected technology platforms for their pilots and POCs.

Benchmarking approach

Page 9

Priorities for the coming years

• Establish which projects are ongoing at the NSIs and their main objectives;

• Analyse investment plans for new technologies;

• Confirm whether mid- and long-term technology roadmaps exist;

• Establish the technology priorities across the ESS, with a focus on statistical data management and data analytics;

• Help to highlight common priorities, plans and collaboration potential between NSIs in the member states.

Benchmarking approach

Page 10

• One of the objectives of this benchmarking project is to help the ESS make effective investment decisions based on past projects and potential future synergies;

• Implement: Plan – Do – Check – Act;

• In a global data ecosystem enabled by the internet, big data, IoT, etc., it will be even more important to adopt practices such as continuous feedback and continuous improvement and to integrate them with existing processes in the future.

Measuring outcomes

Page 11

• The proposal is to focus the benchmarking on the following technologies:

o Data virtualization and the logical statistical data warehouse

o Advanced data analytics

o Decentralized computing

o Metadata management

o New data sources management

• Validate what has been tested by NSIs, what worked well and what did not;

• Analyse how technology platforms were selected – including vendor and open source solutions – to help share lessons learned and best practices;

• Lay the foundation for a technology solution catalogue;

• Establish how much appetite there is to collaborate across member states in structures such as a Centre of Excellence (CoE).

New technologies

Page 12

• Based on previous experience with data validation, a limited number of questions will be used to sketch an overall profile, with a tentative follow-up telephone interview.

• The questionnaires would address cross-functional S-DWH teams at the NSIs. The questionnaire will be elaborated in coordination with the CoE on DWH.

• The aim is to conduct the benchmarking in February 2019 and communicate the findings by the end of March 2019.

• The online questionnaire will be available for at least three weeks in February, after which the analysis will start.

Benchmarking method and high-level timeline

Page 13

• Online survey carried out via the EU-Survey tool;

• Creation of a dynamic questionnaire, in case not all NSIs have cross-functional S-DWH teams;

• Creation of an analysis presentation and a report with a detailed analysis and recommendations.

Execution of the benchmarking

Page 14

Where Are We Going?

Data → Information

Page 15

Importance of other topics

• Access to raw data? ELT instead of ETL? (a minimal sketch follows this list)

• Self-service BI

• Data governance

• Training

• Visualization

• Security
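The first item above only raises the ETL-versus-ELT question; the Python sketch below is a purely illustrative contrast of the two patterns, using an in-memory SQLite database as a stand-in for a statistical data warehouse. All table and column names are assumptions made for the example, not part of the presentation.

```python
# Minimal, hypothetical contrast of ETL and ELT against an in-memory SQLite
# database used as a stand-in warehouse.
import sqlite3

raw_rows = [("PL", "1 250"), ("DE", "2 300"), ("FR", None)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_data (geo TEXT, value TEXT)")
con.execute("CREATE TABLE etl_clean (geo TEXT, value INTEGER)")

# ETL: transform in application code first, then load only the cleaned rows;
# the raw source never reaches the warehouse.
for geo, value in raw_rows:
    if value is not None:
        con.execute("INSERT INTO etl_clean VALUES (?, ?)",
                    (geo, int(value.replace(" ", ""))))

# ELT: load the raw rows as-is, then transform inside the database with SQL,
# so analysts keep access to the untouched source data.
con.executemany("INSERT INTO raw_data VALUES (?, ?)", raw_rows)
con.execute("""
    CREATE VIEW elt_clean AS
    SELECT geo, CAST(REPLACE(value, ' ', '') AS INTEGER) AS value
    FROM raw_data
    WHERE value IS NOT NULL
""")

print(con.execute("SELECT * FROM etl_clean").fetchall())
print(con.execute("SELECT * FROM elt_clean").fetchall())
```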

Page 16

Thank you for your attention!