
CloudScapeBrazil 2016 Position Papers

Index

Welcome message

Interdisciplinary Research in Cloud Computing: future and challenges

Towards a Brazilian Programme for Open Research Data

Interdisciplinary Research for Cloud Computing: Future and challenges

HPC and Cloud Computing

Cloud28+ - A Cloud of Clouds to address the Digital Single Market Strategy and to support Enterprises and Public Sector Agencies

Legal and privacy aspects on the cloud

SLA-Ready: A lifecycle approach to Cloud Service Level Agreements so SMEs know what to expect, what to do and what to trust

CloudWATCH2 – Helping cloud service customers become security-savvy through risk management profiling

Protecting user data on the Cloud in the era of the new EU GDPR with Confidential and Compliant Clouds - Coco Cloud

Enhancing Cloud Security and Trust with Context-aware Usage Control Policies

Secure Data Processing in Untrusted Clouds

Towards Trust in cloud services with CLARUS – how we are tackling related legal issues

How WISER project is preparing the ground for cyber security challenges in the Digital Single Market

Cloud Federation & Open Science Cloud

HPC as a Service

EUBra-BIGSEA: Cloud QoS for Big Data Applications

Adaptive Virtual Network Provisioning for IaaS Clouds

A Federation of Testbeds for Experimentation in Next Generation Wireless-Optical Convergent Networks: A Case-Study of NFV/C-RAN

Testbed as a Service: Experimental Worldwide Laboratories

Cutting edge cloud technologies: 5G, Cloud and IoT, Fog computing

Cyber-Physical Systems of Systems meet the Cloud

Cloud + IoT = New Software Requirements

From Consumer to Industrial IoT: How Megatrends are shaping our society

Usto.re – Innovating and reducing costs on private cloud management solutions

Low-cost and Open Source Framework for Monitoring Power Usage Effectiveness on IaaS Clouds

The Future of IT is in the Hybrid Cloud


Welcome message

We are proud to share with you the proceedings of the third edition of Cloudscape Brazil, the event organised within the remit of the EUBrasilCloudFORUM project, which has received funding from the European Commission (DG CNECT, e-Infrastructures) and the Brazilian government under the 3rd Europe-Brazil coordinated call. Cloud computing and the data economy are key enablers for growth and new business opportunities but require research advances to fully realise their potential. EUBrasilCloudFORUM plays an instrumental role by coordinating groups of experts investigating ICT topics for future cooperation. Defining EU-Brazil co-operation priorities for future research on ICT, particularly cloud computing, big data and the Internet of Things (IoT), was a key theme at Cloudscape Brazil on 7 July in Porto Alegre, co-located with the annual conference of the Brazilian Computing Society (CSBC), 4-7 July, and the Workshop on Cloud Networks (6 July).

Policy perspectives and directions for future ICT co-operation. Policy makers and thought leaders from Brazil and Europe were on hand to set out their mutually agreed priorities for moving forward. Renowned ICT experts from the research community took stock of results from current and previous collaborative projects between Brazil and Europe and defined future research that is key to advancing the state of the art and beneficial to both sides through joint coordinated calls. The position papers collected under the label “Interdisciplinary Research in Cloud Computing: future and challenges” provide a framework for the new multidisciplinary collaborative work in cloud computing, big data, IT security and smart cities, among others, already in place in the two regions.

Cloudscape Brazil was also the launch pad for presenting novel, unpublished research at the infrastructure, platform and software levels, as well as at the networking level (look for it under the section “Cutting edge cloud technologies: 5G, Cloud and IoT, Fog computing”). This bleeding-edge research is bringing advances that will ultimately find their way into more sophisticated products and services on the international scene.

Legal and security high on the agenda. The Federal public administration in Brazil has been moving towards adopting cloud technology, with the Ministry of Planning investing substantial resources to address barriers including security, data protection, data storage and governance, as demonstrated by the new legal package on the protection of personal data, underpinning a free and secure Internet, and the new cyber security law soon to be approved by the Brazilian congress. Despite these efforts, a lack of knowledge about cloud computing laws, data protection issues and taxation is causing difficulties for companies in Brazil, particularly start-ups. Moreover, with the advent of cloud services in Brazil, everything has been regulated, but the regulation now needs to be made operational and adapted to the real needs of the market. It is therefore understandable why legal aspects were a recurring theme in all Cloudscape Brazil sessions. The interest was also confirmed by the number of position papers received, which we have collected in the “Legal and privacy aspects on the cloud” section of this document.


Business perspectives. Cloudscape Brazil capitalised on its industrial network with key players represented, a timely move as the new Brazilian government sees both industry and SMEs as two priority areas to focus on. High on the agenda for future EU-Brazil collaborative projects are the participation of industry players and SMEs, and taking research results to market in a short timeframe (within 6 months) – requirements that became a recurrent theme of the event. Perspectives on public-private partnerships advancing research come from organisations such as the Central Bank of Brazil, EMC Brazil, Philips Research Brazil, SAP Latin America and the Caribbean, and the Hewlett Packard Enterprise & Cloud28+ initiative. Their statements appear in the following pages.

Now, it’s your turn

EUBrasilCloudFORUM has set up a Working Group (WG) to support efforts in collecting inputs from a wider pool of experts in the EU and Brazil in relation to cloud computing, IoT and 5G, including security aspects, most of them represented among the authors listed in these pages. The time is now right to widen its scope and contribute to the EU-Brazil Research and Innovation Roadmap and Action Plan that EUBrasilCloudFORUM has the mandate to write, and to provide valuable insights in time for the annual EU-Brazil Policy Dialogue meetings on ICT planned for autumn 2016, where these topics are typically decided for future joint calls by the funding agencies in both countries. If you are interested in one or more of the topics listed in these pages, join our community on EUBrasilCloudforum.eu to become part of our active working group.

We would like to thank the conference of the Brazilian Computing Society for hosting us, and our supporters SBC, CTIC, USP, SNIA, STartUpFarm and ibict. We wish you all an interesting read.


Interdisciplinary Research in Cloud Computing: future and challenges


Towards a Brazilian Programme for Open Research Data

Author: Leonardo Lazarte (IBICT, Brazil)

Position Paper

The Brazilian Institute of Information for Science and Technology, IBICT, and the Brazilian National Research and Education Network, RNP, are working on a proposal for a national programme to promote Open Access to Research Data.

Current storage, communication and processing technologies, together with a mature scientific culture of Open Science, have led to a movement for open access to research data. This greatly enhances the strength of scientific research and expands its reach to researchers and regions which would otherwise have great difficulty accessing the advanced infrastructure for experiments and observations needed to obtain data for their research.

Open Access to Research Data nowadays has strategic value for scientific communities, countries and regions. Since it is a collaborative endeavour, this understanding has led to the establishment of networks, communities of practice and larger initiatives. RNP and IBICT participate in one of the largest, the Research Data Alliance, RDA, initially launched under the joint sponsorship of Australia, the European Union and the United States but now involving many countries all over the world.

The proposal for the Brazilian Programme is preliminarily based on four key areas:

Information architecture and tools for storage, discovery and retrieval of datasets

Many communities of practice in Brazil and all over the world are developing ontologies, metadata, ad-hoc or adapted tools to share research data among them. Adapting and promoting the visibility of those initiatives and interaction between them and with similar groups in other countries or regions is part of the programme, giving access to knowledge and tools already available.

Physical infrastructure to store, preserve and have access to data

RNP, the Brazilian NREN, has developed a national network with good capillarity, reaching most research institutions in the country. The challenge now is to build a storage system for the data generated and used by those institutions. One model being considered is the building of a Research Data Cloud, based partly on existing infrastructure and partly on new resources.

Policies which induce sharing of research data

Some agencies which finance and promote research already have policies which induce funding recipients to share their research data as part of the conditions for receiving support. The national programme will promote the refinement, expansion and adoption of those policies, involving government agencies, researchers and communities of practice.

Education and training of researchers, gatekeepers and data scientists

A key point in the adoption of a culture of data sharing is to promote researchers' awareness of the value of sharing, both for their own production and for the larger scientific community. This awareness should be complemented by more technical training on structuring data management plans and on the use of tools for storage and retrieval of data.

Information scientists and traditional gatekeepers to information should gain a new perspective on their enhanced roles as data scientists and gatekeepers to data.

Interaction with universities will be a key point in promoting new curricula and specific training for both groups: researchers and gatekeepers.

Continued international cooperation is a key point of the programme. On the one hand, it will expand already existing relations with broader initiatives. On the other hand, the national programme might grow into a regional initiative, as already exists in areas such as access to scientific production and regional digital networks.


Interdisciplinary Research for Cloud Computing: Future and challenges

Author: Wagner Meira Jr. (Federal University of Minas Gerais, Brazil)

Focus Area

» Security tradeoff vs. performance

» Data streaming and big data analytic services - processing massive data in real-time

» Trustworthy cloud platforms and services

» Cross-domain orchestration and programmable networks in clouds

» Data flow and portability of data: remedies to vendor lock-in

Who stands to benefit and how

Researchers and practitioners from all areas of Computer Science who develop both applications and system support for cloud computing.

Position Paper

The rise of cloud computing as a universal platform for providing all kinds of services to users and organizations demands research and development at all levels of computing systems. Starting from the hardware level, we may highlight the continuous need for mechanisms that improve support for virtualization, which also significantly affects the operating system architecture and the hardware interfaces and functions provided at user level.

The network subsystem, for instance, is key to first-class cloud services, given the strict requirements in terms of both latency (e.g., for effective collaborative work) and bandwidth (e.g., for high-resolution media). Latency-tolerating protocols over software-defined networks are examples of the state-of-the-art research demanded by cloud computing. Similarly, programming languages, together with their compilers, libraries and runtime systems, should provide novel primitives that allow programmers to develop applications that fit cloud-based platforms better or that handle the new requirements automatically.

One particular challenge is data management, since data are being generated at unprecedented rates, which not only overloads storage resources but also demands scalable algorithms. In this scenario, applications must process distributed data, moving it as little as possible; when migrating data is necessary, the transfer should overlap with actual computation as much as possible.
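As a minimal illustration of this overlap principle, the sketch below (plain Python; fetch_partition and process_partition are hypothetical placeholders for a real data source and a real analytics kernel, not code from any specific cloud platform) prefetches the next data partition while the current one is being processed.

```python
# Illustrative sketch: overlapping data migration with computation via prefetching.
from concurrent.futures import ThreadPoolExecutor

def fetch_partition(partition_id: int) -> bytes:
    # Placeholder: pull one data partition from remote storage.
    return b"..." * partition_id

def process_partition(data: bytes) -> int:
    # Placeholder: CPU-bound work on a partition already in local memory.
    return len(data)

def process_all(partition_ids):
    results = []
    with ThreadPoolExecutor(max_workers=1) as io_pool:
        # Start fetching the first partition before any computation begins.
        pending = io_pool.submit(fetch_partition, partition_ids[0])
        for next_id in partition_ids[1:]:
            data = pending.result()                              # wait for current partition
            pending = io_pool.submit(fetch_partition, next_id)   # prefetch the next one
            results.append(process_partition(data))              # compute while it downloads
        results.append(process_partition(pending.result()))
    return results

if __name__ == "__main__":
    print(process_all(list(range(1, 6))))
```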


Further, algorithms implemented on top of cloud platforms should work well for batch and stream processing, and even a mix of both, and exploit properly the computational resources available. For instance, classical problems such as load imbalance and scheduling should be revisited and strategies to address them reassessed and potentially adapted and improved.

Finally, building applications in this scenario is also reshaping the methods and practices of software engineering and human-computer interfaces, which should address the challenges associated with providing services based on distributed, geographically dispersed and potentially heterogeneous components, which makes orchestration harder, among other challenges. Similar issues arise in several other areas, such as computer security and IoT devices.

In summary, it is necessary to foster interdisciplinary research to enhance cloud-based platforms and services because isolated efforts in any context will be less effective for not considering their impact on other research areas. Such a need seems to be even more important considering that cloud platforms and their usage are a moving target which demands fast, versatile and integrated models, algorithms, mechanisms and techniques.


HPC and Cloud Computing

Authors: Philippe Olivier Alexandre Navaux, Eduardo Roloff, Emmanuell Diaz Carreño, Jimmy Valverde Sanchez (Federal University of Rio Grande do Sul, Brazil)

Focus Area

The area of our position is not directly related to the focus themes above; it concerns the HPC community and the benefits and difficulties of adopting the Cloud as an environment for executing HPC applications.

Who stands to benefit and how

General HPC users, with most benefit going to users who do not have access to large facilities and could profit from the elasticity and pay-per-use model of cloud computing.

Position Paper

The Cloud Computing model has arrived as a viable alternative to the clusters and supercomputers that have traditionally been used to execute High-Performance Computing (HPC) applications. By using the cloud, it is possible to configure experimental or production environments without upfront investments. Moreover, due to the elasticity characteristics of the model, it is possible to have access to a virtually unlimited amount of resources. These two characteristics of the cloud computing paradigm are the main ones attracting the attention of the HPC community.

However, the cloud is still not ready to substitute traditional HPC environments for all kinds of user needs. In the following text, we will point out some of the main issues of the model. One big problem of using the cloud is its heterogeneity and the virtualization of the cloud resources. Today, we do not have a cloud manager that is HPC oriented. For this reason, the allocation of machines in the cloud is not predictable in terms of network topology and resource isolation. The virtualization adds an extra layer that causes performance loss and could introduce noise in the execution as well. To address these problems, we need to implement a resource allocation strategy that takes into account the characteristics of HPC applications to provide an HPC cloud environment.

Additionally, it is necessary to explore the possibility of lightweight virtualization as an alternative to decrease the overhead of current virtualization technologies. The performance bottlenecks are also an issue for effective HPC in the Cloud. The biggest challenge in the cloud is the network performance, because normally the cloud providers use commodity interconnections that do not provide adequate performance for HPC applications.

The disk I/O performance is also a key issue for HPC and presents performance losses in cloud environments. The main reason for these two issues is the resource sharing normally used in the cloud, where several virtual machines are allocated on the same physical hardware. It is important to note that CPU isolation technology is much more developed than the virtualization of network and disk I/O.

The CPU performance in the cloud is close to the performance obtained in the physical processor. It is necessary to create virtualization techniques for network and disk I/O similar to the ones used in CPU virtualization. Another possibility is to create physical machines designed to be used for HPC in the cloud, with more network interfaces and disk controllers that could be allocated with more isolation levels. Clearly this strategy needs deeper studies to understand its economic viability.

On the user side, it is possible to explore the development of applications and runtime libraries that could benefit from the cloud's characteristics. Most importantly, the benefits of elasticity need to be explored, because applications could use this capability to reduce execution costs or to improve overall performance. Also, HPC runtimes could be extended to map application needs onto cloud environments to provide a better match.
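A back-of-the-envelope sketch of this cost/performance trade-off is given below, using purely illustrative numbers and an Amdahl-style time model rather than any measured cloud workload; run_cost and its parameters are invented for the example.

```python
# Illustrative only: when does adding cloud instances reduce cost,
# and when does it only improve turnaround time?
def run_cost(base_hours, n_instances, parallel_fraction, price_per_instance_hour):
    # Amdahl-style estimate of wall-clock time on n instances.
    hours = base_hours * ((1 - parallel_fraction) + parallel_fraction / n_instances)
    cost = hours * n_instances * price_per_instance_hour
    return hours, cost

if __name__ == "__main__":
    for n in (1, 4, 16, 64):
        hours, cost = run_cost(base_hours=100.0, n_instances=n,
                               parallel_fraction=0.95, price_per_instance_hour=0.50)
        print(f"{n:3d} instances: {hours:6.1f} h, ${cost:7.2f}")
```

With these illustrative numbers, scaling out always shortens turnaround time but increases total cost beyond a certain point, which is exactly the kind of decision an elasticity-aware application or runtime could make automatically.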

As a concluding remark, we understand that cloud computing has a tremendous potential to be used for HPC, but it is not yet fully compatible with all types of HPC applications.


Cloud28+ - A Cloud of Clouds to address the Digital Single Market Strategy and to support Enterprises and Public Sector Agencies

Authors: Claudio Caimi, Mailys Demeoulte, Andrea Monaci (Hewlett Packard Enterprise and Cloud28+)

Focus Area

In the digital era, enterprises trying to select the proper software services to support their business and enterprises making their living by selling digital services are both struggling to achieve their respective objectives. Cloud28+ is addressing these issues, covering special focus areas such as vendor lock-in, trustworthiness, SLAs and cross-domain orchestration. Cloud28+ combines a central platform—a cloud service library—with the distributed development of cloud services, as well as the subscription to those services and their execution via local IT service providers.

Who stands to benefit and how

Cloud28+ proposes a platform for cloud services. As such, the beneficiaries of the initiative are both cloud service vendors and cloud service buyers. On the vendor side is a community made up of cloud service providers, independent software vendors, system integrators, consulting firms and legal firms. On the end-user side are enterprises, either private or public, eager to exploit the economic and business advantages offered by the cloud.

Position Paper

In today’s age of smartphones and tablets, European consumers regularly turn to digital app marketplaces to meet their needs for entertainment and tools to improve their daily lives. In turn, these app stores offer software companies of all sizes the opportunity to reach millions of potential customers. App ratings and recommendations guide the consumer in his/her selection, providing an increased opportunity for quality service providers to prosper. While often invisible to the consumer, who enjoys a high level of freedom in selecting the services that best meet his/her needs on the device of his/her choice, the central marketplace provides a unified framework in regard to terms and conditions.

Where can digital users like European enterprises and public organizations go for a similar experience tied to services that match their business needs? Where can they find services that meet their heightened requirements concerning data privacy, service level agreements, and security? To meet this challenge of increasing the uptake of business cloud services and providing new business opportunities for local cloud service providers in Europe, Hewlett Packard Enterprise has initiated the Cloud28+ community.

Cloud28+ is a federation of Service Providers, resellers, Independent Software Vendors (ISVs), system integrators and government entities providing an online catalogue of trusted cloud services – a “Cloud of Clouds,” made in Europe and secured locally. By connecting their own catalogues or clouds of applications, service providers in the community can offer end customers a broader range of applications, while expanding their own geographic reach. In turn, end users can search for cloud applications according to workload requirements, geographic location and price. The central cloud repository provides end customers with a set framework for security and terms and conditions, while they can gain quality of service and adherence to native market conditions by selecting a local IT partner of their choice.

Cloud28+ combines a central platform—a cloud service library—with the distributed development of cloud services, as well as the subscription to those services and execution of them via local IT service providers. If a Service Provider in France, for example, designs a “smart city” cloud service for Paris and then publishes that service on the Cloud28+ platform, a Service Provider in the UK looking to roll out a similar service for the city of London could discover it in Cloud28+ and then subscribe to it via the UK Service Provider and execute it locally in its own data center for data privacy reasons. To ensure Cloud28+ remains open to all vendors and service providers, while easing interoperability issues, Hewlett Packard Enterprise advocates the use of open source technologies, including the cloud platform HPE Helion OpenStack.

As such, Cloud28+ benefits from ongoing HPE investment and leadership in the OpenStack community, including significant contributions in terms of funding, resource allocation, testing, code and training.
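To make the catalogue search described above concrete, the following sketch uses a hypothetical, simplified data model (CatalogueEntry, find_services and the sample entries are invented for illustration and are not the actual Cloud28+ API) to show how an end user might filter published services by workload, geography and price.

```python
# Minimal sketch of filtering a cloud service catalogue (hypothetical data model).
from dataclasses import dataclass

@dataclass
class CatalogueEntry:
    name: str
    workload: str          # e.g. "smart-city", "erp", "analytics"
    region: str            # where the service can be executed
    price_per_month: float

CATALOGUE = [
    CatalogueEntry("smart-city-paris", "smart-city", "FR", 1200.0),
    CatalogueEntry("smart-city-london", "smart-city", "UK", 1350.0),
    CatalogueEntry("retail-erp", "erp", "DE", 900.0),
]

def find_services(workload: str, region: str, max_price: float):
    return [e for e in CATALOGUE
            if e.workload == workload and e.region == region
            and e.price_per_month <= max_price]

if __name__ == "__main__":
    # A UK service provider looking for a smart-city service to run locally.
    print(find_services("smart-city", "UK", max_price=1500.0))
```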


Legal and privacy aspects on the cloud


SLA-Ready: A lifecycle approach to Cloud Service Level Agreements so SMEs know what to expect, what to do and what to trust

Authors: Stephanie Parker, Nicholas Ferguson and Matteo Scarpellini (Trust-IT Services, UK)

Focus Area 

Tools and services to help European companies manage the cloud service lifecycle with a strong focus on service level agreements. A business hub with practical guides, user stories and expert opinion, and an SLA Aid with a personalised report on choosing and using the right cloud service. A Common Reference Model to guide the industry on service level agreements, based on industry best practices and international standards and drawing on an in-depth analysis of current practices from a technical, legal, economic and customer perspective.

Who stands to benefit and how

As most businesses increasingly become digital over the next few years, they will breathe life into our economies and help create new jobs. It is hardly surprising that they are expected to gain the most from using cloud services. So what is holding them back?

A lack of trust in cloud services, caused by the lack of transparency in contract terms and pricing models. Customers also see the use of standardised cloud service level agreements (Cloud SLAs) as a critical step towards better understanding the levels of security and data protection offered and actually delivered, through monitoring of cloud service provider performance.

To lower the entry barriers for small and large firms alike, SLA-Ready provides a suite of practical tools and services, so they know what to expect, what to do and what to trust. SLA-Ready goes beyond the contract signing to assist cloud customers in every phase of the service lifecycle, from assessment to operation and termination. Special attention is paid to guiding customers on security and data protection levels so they do not need to be ICT-savvy.

Position Paper

The standardisation and transparency of SLAs is paramount to providing customers with enough information about what services to use and how to use them – all the way through the lifecycle. SLA-Ready has made a comprehensive analysis of current practices of Cloud SLAs, from a technical, legal, economic and sociological perspective, eliciting a set of requirements spanning general requirements, responsibility requirements, economic requirements and technical service level requirements. This is the basis for a Common Reference Model that describes, promotes and supports the uptake of cloud service level agreements by providing a common understanding of SLAs for cloud services.

The Common Reference Model integrates SLA components and terminology, SLA attributes, service level objectives (SLOs), guidelines and best practices. Most importantly from a business perspective, the Reference Model is the basis for a suite of practical tools and services tailored to different levels of knowledge: novice, basic knowledge and experienced user. The SLA Aid also walks current and prospective customers through the different stages of the cloud service management lifecycle based on a set of representative use cases: procurement of IaaS by a fintech firm, operational phases by small IT teams in local government agencies, an SME using SaaS, and an SME migrating from one SaaS cloud service provider to another.
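As an informal illustration of what machine-readable service level objectives can look like, the sketch below uses hypothetical attribute names (it is not the actual Common Reference Model schema) to express two SLOs and check them against values monitored during operation.

```python
# Illustrative sketch of SLO representation and monitoring (invented attribute names).
from dataclasses import dataclass
from typing import Callable

@dataclass
class SLO:
    attribute: str                              # e.g. "monthly_availability_percent"
    target: float
    satisfied: Callable[[float, float], bool]   # measured value vs. target

SLOS = [
    SLO("monthly_availability_percent", 99.9, lambda measured, target: measured >= target),
    SLO("max_incident_response_hours", 4.0, lambda measured, target: measured <= target),
]

def evaluate(measurements):
    # Compare each monitored value with the corresponding SLO target.
    for slo in SLOS:
        measured = measurements[slo.attribute]
        status = "met" if slo.satisfied(measured, slo.target) else "BREACHED"
        print(f"{slo.attribute}: target {slo.target}, measured {measured} -> {status}")

if __name__ == "__main__":
    evaluate({"monthly_availability_percent": 99.85, "max_incident_response_hours": 3.0})
```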

The SLA Aid is complemented by a Business Hub, a set of user-friendly guides on all the major aspects of cloud services with user tips and expert opinions, covering legal, operational and business aspects. Together, the tools and services increase both understanding and acceptance of responsibilities and risks. They help businesses get to grips with complex legal and technical terms, and help them compare, define and monitor security levels. Because businesses of all sizes only realise the benefits of the cloud when they truly understand and trust it, these online tools really are a “must use” for companies all over the globe, whether they are using cost-effective off-the-shelf services with standard contract terms or have some power to negotiate SLAs.

For cloud service providers, SLA-Ready is an important opportunity to showcase best practices on Cloud SLAs. Our survey for cloud service providers helps them evaluate the extent to which they are meeting customer needs. SLA-Ready will use the findings to develop an SLA-Readiness Index that customers can use to compare different practices – a great showcase for providers prioritising transparency and standardisation. Greater confidence in the market means companies can grow their business with new cloud-enabled products and services.

Links: www.sla-ready.eu | @SLAReady | http://www.sla-ready.eu/common-reference-model | http://www.sla-ready.eu/use-cases-0 | http://www.sla-ready.eu/crm-validation-smes


CloudWATCH2 – Helping cloud service customers become security-savvy through risk management profiling

Authors: Jesus Luna (Cloud Security Alliance), Nicholas Ferguson and Stephanie Parker (Trust-IT Services, UK)

Focus Area

Cloud computing has now reached a point where it is truly accessible to all types of organisations. However, many small and mid-sized organisations often lack IT expertise to assess security measures of cloud service providers and monitor the levels actually delivered. CloudWATCH2 fills this gap with a risk management profile that guides cloud service customers through security self-assessments.

Who stands to benefit and how

Cloud computing has become pretty much a staple of start-up and SME culture because it radically reduces barriers to entry in almost any sector without the need for upfront investments. Many of these small firms, however, lack the expertise to assess and monitor the security measures of a cloud service provider necessary to protect their data and business assets in an increasingly digital economy. Herein lies the value of CloudWATCH2, a coordination and support action funded under Horizon 2020 for software, services and cloud computing (DG CONNECT Unit E2).

In order to increase awareness of cloud security and help small firms and IT teams in government agencies monitor security levels, CloudWATCH2 provides a simple, efficient and inexpensive approach to identifying and managing their cloud security risks from both a technical and an organisational perspective. The approach is standards-based, leveraging standards developed by the US National Institute of Standards and Technology (NIST) and the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC).

Position Paper

Compliance with business, regulatory, operational and security requirements largely depends on the service and deployment model and the cloud architectural context. It is therefore imperative that all levels of an organisation understand their responsibilities for adequate cloud security and for managing information system-related security risks. The result of the CloudWATCH2 approach is a risk profile that guides cloud service customers through security self-assessments.


The approach taken by CloudWATCH2 takes into account security requirements elicited through relevant studies in Europe and globally and is instantiated on top of well-known best practices of the Cloud Security Alliance, namely the Cloud Control Matrix (CCM) and the Enterprise Architecture (EA). Both are widely used industrial practices and have been mapped to relevant standards such as NIST 800-53v4 and ISO/IEC 27002.
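The following sketch is only an illustration of the general idea of a guided self-assessment producing a risk profile; the control domains, weights and answers are invented (loosely inspired by CCM domain names) and do not reproduce the actual CloudWATCH2 questionnaire.

```python
# Illustrative sketch: aggregating self-assessment answers into a per-domain risk profile.
from collections import defaultdict

# Each question: (control domain, weight, answer); answer is True if the
# organisation already has the control in place.
ASSESSMENT = [
    ("Identity & Access Management", 3, True),
    ("Identity & Access Management", 2, False),
    ("Encryption & Key Management", 3, False),
    ("Business Continuity", 1, True),
]

def risk_profile(answers):
    exposure = defaultdict(lambda: [0, 0])   # domain -> [missed weight, total weight]
    for domain, weight, in_place in answers:
        exposure[domain][1] += weight
        if not in_place:
            exposure[domain][0] += weight
    # Risk score per domain: share of control weight not yet covered.
    return {d: missed / total for d, (missed, total) in exposure.items()}

if __name__ == "__main__":
    for domain, score in risk_profile(ASSESSMENT).items():
        print(f"{domain}: residual risk {score:.0%}")
```

Re-running the same assessment periodically, as the bullet list below recommends, would show how the residual risk per domain changes as new controls are put in place.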

There are many benefits for small firms and IT teams in public administrations (PAs) and government agencies.

» Simplicity: the guided self-assessment for SMEs and Public Administrations means they do not need expert knowledge of cloud security.

» Technical and organisational focus: SMEs and Public Administrations are guided in the elicitation of security controls considered “good enough” for their requirements. These controls are based on the Cloud Security Alliance Cloud Control Matrix (CCM) and cover both technical and organisational aspects of the cloud customer.

» Repeatable process: SMEs and Public Administrations can periodically re-assess their risks to identify opportunities for improving their risk profile.

» Standards-based: CloudWATCH2 leverages well-known standards and best practices to facilitate industrial uptake, with the Cloud Control Matrix and Enterprise Architecture both based on international standards from ISO/IEC and NIST.

» High automation potential: facilitating the development of software applications to empower SMEs and Public Administrations in creating and using risk profiles.

» Cloud specific: to the best of our knowledge, there are no other approaches aimed at developing cloud-specific risk profiles.

Links www.cloudwatchhub.eu, @CloudWatchHub https://cloudsecurityalliance.org/group/cloud-controls-matrix/


Protecting user data on the Cloud in the era of the new EU GDPR with Confidential and Compliant Clouds - Coco Cloud

Authors: Claudio Caimi, Fabio Martinelli, Mirko Manea, Marinella Petrocchi, Francesco Di Cerbo, Jose F. Ruiz, Paolo Mori (Coco Cloud Project)

Focus Area

Sharing data through the cloud securely and privately is a major concern in today's digital economy. The recent approval of the new European General Data Protection Regulation (GDPR) strengthens the data protection needs of individuals and the necessity of data-centric approaches to information security. Confidential and Compliant Clouds (Coco Cloud) is an EU FP7 project whose goal is to offer a data protection platform for information sharing between individuals and organisations or between organisations, including the sharing of personal or sensitive data, in accordance with applicable laws, vertical market regulations and end-user privacy preferences.

Who stands to benefit and how 

Coco Cloud is being piloted in three different domains: a public administration that needs to exchange citizen documents with other public offices; a hospital that needs to regulate the secure sharing of radiological examinations and studies with patients, between doctors, and between hospitals; and an enterprise where employees have to exchange sensitive business documents.

Position Paper

Coco Cloud envisions the control of the disseminated data (both on the Cloud and on premises) based on mutually agreed electronic Data Sharing Agreements (DSAs) that are uniformly and end-to-end enforced. These agreements can reflect legal and contractual policies or user-defined preferences. Simply said, DSAs specify rules applied for accessing or using the data they are linked to.

The project is creating an efficient and flexible framework for secure data management from the client to the cloud, and vice versa. We pursue this objective by addressing three dimensions:

1. The writing, understanding, analysis, management, enforcement and dissolution of data sharing agreements, from high level descriptions (close to natural language) to system enforceable data usage policies;

2. The development of a uniform enforcement infrastructure that seamlessly enforces data sharing agreements from the Cloud to the client (e.g., mobile devices) and back;


3. The “compliance by design” approach to address key challenges for legally compliant data sharing in the cloud, placing an early emphasis on understanding and incorporating legal and regulatory requirements into the data sharing agreements.

A DSA system takes care of handling the DSA lifecycle and provides the following core tools:

» DSA Authoring Tool is in charge of creating and managing DSAs in a user-friendly manner via web technologies. It provides an easy interface to express rules using a language called Controlled Natural Language for DSA, or, more concisely, CNL, based on domain-specific dictionaries (e.g., an ontology for the healthcare use case), which resembles common English.

» DSA Analyser and Conflict Solver analyse the rules in a DSA and solve potential conflicts. A conflict exists when two policies simultaneously allow and deny an access request under the same contextual conditions. In case a conflict is revealed, the Conflict Solver prioritizes the rules to be enforced (a minimal illustrative sketch of this conflict check appears at the end of this position paper).

» DSA Mapper translates the DSA policies from CNL into an enforceable XACML-based language. The outcome of this tool is an enforceable policy. Such a policy will be evaluated at each request to access and/or use the target data.

» DSA Lifecycle Manager orchestrates the usage of the DSA system components. It is the single entry point for accessing all the described tools.

A Coco Cloud Engine system is then in charge of the runtime enforcement of the defined DSA. It provides the capabilities for creating protected objects and for accessing and using them, taking care of enforcing the rules prescribed in the associated DSA. In particular, it features the Coco Cloud API, a unified frontend to the Coco Cloud services. This makes it possible to create Coco Cloud protected objects, i.e. encrypted digital containers storing the data along with its protection rules (a sticky policy).

The Coco Cloud API is also the entry point used to request access to and usage of protected objects. The engine runs seamlessly in the Cloud or on mobile devices and provides different kinds of protection, such as access control and usage control (continuous authorization and obligations). The engine evaluates the rules mapped from the DSA. The overall framework integrates with OpenStack (i.e., Swift) and provides supplemental services for managing identities, encryption and keys, platform integrity and auditing, all relying on open and industry standards.
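The sketch below is a minimal, hypothetical illustration of the conflict check performed by the DSA Analyser and Conflict Solver described above; the Rule representation and overlap test are invented for clarity and are not the project's CNL or XACML tooling.

```python
# Illustrative sketch: detecting rules that allow and deny the same action
# under overlapping contextual conditions (hypothetical rule model).
from dataclasses import dataclass, field

@dataclass
class Rule:
    effect: str                                      # "permit" or "deny"
    action: str                                      # e.g. "read", "share"
    conditions: dict = field(default_factory=dict)   # e.g. {"role": "doctor"}

def conditions_overlap(a: dict, b: dict) -> bool:
    # Conditions can apply to the same request unless they constrain
    # the same attribute with different values.
    return all(a[k] == b[k] for k in a.keys() & b.keys())

def find_conflicts(rules):
    conflicts = []
    for i, r1 in enumerate(rules):
        for r2 in rules[i + 1:]:
            if (r1.action == r2.action and r1.effect != r2.effect
                    and conditions_overlap(r1.conditions, r2.conditions)):
                conflicts.append((r1, r2))
    return conflicts

if __name__ == "__main__":
    rules = [
        Rule("permit", "read", {"role": "doctor"}),
        Rule("deny", "read", {"location": "outside-EU"}),   # overlaps with the permit
        Rule("deny", "share", {"role": "patient"}),
    ]
    for a, b in find_conflicts(rules):
        print("conflict:", a, "vs", b)
```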


Enhancing Cloud Security and Trust with Context-aware Usage Control Policies

Authors: Christian Jung, Manuel Rudolph, Reinhard Schwarz (Fraunhofer IESE, Germany)

Focus Area

Cloud infrastructures rely on virtualization to abstract the physical hardware. Multiple tenants can share physical hardware, and a single virtual resource may span multiple physical resources. The uncertainty about the location and context of virtual resources is a potential security threat. For instance, tenants may want to prevent their data from migrating to undesirable jurisdictions (e.g. outside the European Union) and to ensure that certain virtual resources share (or expressly do not share) a common physical resource. Cloud users demand fine-grained control of the collection, processing and storage of their data. For secure and resilient cloud application management, we suggest a usage control framework for cloud environments that enforces user-definable, context-aware usage control policies.

Who stands to benefit and how

In cloud environments, usage control may serve as an enabler. It provides not only fine-grained control over data flow and data usage, but also context-aware control decisions in virtual cloud environments that otherwise remain rather opaque to cloud users, preventing them from putting their critical business assets into the cloud. For example, in critical infrastructures such as smart energy management, different stakeholders need to exchange all kinds of business-related data for trading energy and managing the energy grid, and each stakeholder has specific business interests in sharing (or not sharing) data with collaboration partners or competitors. From a provider's perspective, usage control is an added value that may convince more users to opt for cloud deployment. Thus, usage control allows different stakeholders to enforce their individual security demands with respect to their data and functionality.

Position Paper

The value of data is increasingly appreciated, and data-centric business models are gaining popularity in the age of »big data« applications. However, to facilitate large-scale data business across organizational boundaries—especially in cloud environments—we need to reconcile two conflicting goals: the free exchange of data assets on the one hand, and the protection of intellectual property, trade secrets and privacy on the other. Even in a cloud environment with cross-organizational business transactions, users should keep full control over the collection, processing and storage of their data and application assets.


Unfortunately, the virtual resources of cloud infrastructures lack transparency, which is a major obstacle for trusted cloud deployment. Ideally, users would like to share their data according to well-defined, technically enforced usage policies, and they would like to adapt these policies at any time depending on their specific business needs to express exactly who may use their data and services under which conditions and for which purposes.

Providing a suitable usage control infrastructure within the cloud would foster trust among business partners, thus facilitating new types of data-centric business models while limiting the risks of privacy violations, intellectual property theft or license infringement. Apart from restricting the use of data in its original form, data usage control (a generalization of access control spanning the whole lifecycle of data, even after initial access has been granted) can also enforce data transformations to adapt the data to the intended usage scenario. For example, a usage control policy may require person-identifiable data attributes to be removed (or anonymized) before access is granted to a business partner. Moreover, usage control policies may take application context and cloud context into account, for example, to enforce data processing at restricted geo-locations or to grant processing only if suitable audit trails are recorded. Data usage control mechanisms can be integrated into several system layers (e.g., the cloud infrastructure and cloud service layers) in order to allow comprehensive usage control. These kinds of accompanying control measures pave the way for new business opportunities that would otherwise fail due to the security or privacy concerns of companies or customers.
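A minimal sketch of such a context-aware decision follows, with invented attribute names and a deliberately incomplete jurisdiction list; it illustrates the concept only and is not the Fraunhofer IESE framework itself.

```python
# Illustrative sketch: a context-aware usage control decision with obligations.
EU_JURISDICTIONS = {"DE", "FR", "IT", "ES", "NL", "BE", "PT"}   # incomplete, for illustration

def decide(action: str, context: dict):
    """Return (decision, obligations) for a usage request."""
    obligations = []
    # Context condition: data may only be processed inside permitted jurisdictions.
    if context.get("datacentre_country") not in EU_JURISDICTIONS:
        return "deny", obligations
    # Data transformation obligation: anonymise before sharing externally.
    if action == "share_with_partner":
        obligations.append("remove_or_anonymise_personal_attributes")
    # Accountability obligation: processing only with an audit trail.
    obligations.append("record_audit_trail")
    return "permit", obligations

if __name__ == "__main__":
    print(decide("share_with_partner", {"datacentre_country": "DE"}))
    print(decide("process", {"datacentre_country": "US"}))
```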

At Fraunhofer IESE in Kaiserslautern, we have been exploring the potential of usage control for several years in various contexts, such as mobile applications, the Internet of Things and cloud computing. We are exploring a data usage control framework that aims to offer a rich policy language, support for user-friendly policy specification, context-awareness capabilities, and policy decision and enforcement components for distributed usage control.

We have already shown the applicability of usage control in the context of secure cloud computing within the EU research project SECCRIT (FP7). In RESCUER (FP7) we analyzed the application of data usage control for a privacy-preserving emergency management platform.


Secure Data Processing in Untrusted Clouds

Authors: José Luis Vivas, Andrey Brito (SecureCloud project)

Focus Area

Trustworthy cloud platforms and services

Who stands to benefit and how

Developers will be able to use cloud services when building applications that handle sensitive data. Cloud providers, whether IaaS, PaaS or SaaS, will be able to offer services that are more robust against attacks aiming to steal confidential data.

Position Paper

Security is essential for organizations seeking to outsource critical infrastructure systems to the cloud. Confidentiality of information, integrity of data, privacy, resilience of processes, and availability of applications and data are of immediate concern to these organizations, which must comply with strict security requirements, since the interruption of critical infrastructures may have wide-ranging effects.

The domains of these organizations include traditional sectors such as electricity, health care, transportation and finance, but also novel critical infrastructures such as smart grids and future large-scale computing such as the Internet of Things (IoT) and Cyber-Physical Systems (CPS). Indeed, by supporting these critical infrastructures, the cloud itself also becomes a critical infrastructure.

The main objective of the SecureCloud project is to eliminate technical impediments to dependability in cloud computing by ensuring that confidentiality, integrity, availability and privacy requirements are met, focusing mainly on applications that support critical infrastructures throughout Brazil and Europe. Thereby, SecureCloud will encourage and enable a greater uptake of cost-effective, environment-friendly and innovative cloud solutions.

The innovative approach taken by SecureCloud leverages the emergence of secure commodity CPUs, a novel and promising technology whose objective is to enable a new generation of dependable applications by reducing the trusted computing base of applications to new hardware mechanisms provided by commodity CPUs such as Intel Software Guard Extensions (SGX), which is basically a set of new CPU instructions that can be used by applications to set aside protected regions of code and data, dubbed “enclaves”.

Enclaves are intended to isolate applications not only from other applications running within the same platform, but also from the underlying operating system or the hypervisor. This enables cloud consumers to retain control of their platforms and thereby run sensitive applications in a public cloud without the need to trust the cloud provider. The memory content of protected applications may also be encrypted within enclaves in order to prevent the operating system or the hypervisor from being able to read and/or modify application data. SecureCloud will therefore leverage Intel SGX as the root of application trust to provide confidentiality and integrity of sensitive data. It will moreover use OpenStack as a common cloud stack infrastructure, and extend standard container technology to allow the execution of Intel SGX secure enclaves within containers. SecureCloud will also use a Coordination Service to detect computer or application process failures, and restart either the application process on a different computer or a newly created virtual machine, or the container itself, depending on the requirements of the application process. Finally, SecureCloud will use Software-Defined Networks (SDN) to connect the application components both within and across datacentres.
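To illustrate the trust boundary only, the sketch below models the enclave idea in plain Python; it is not SGX SDK code, the function names are invented, and a real SecureCloud application would enter enclave code through the SGX SDK (ECALL interface) after remote attestation. It requires the third-party cryptography package for the Fernet cipher.

```python
# Conceptual model of the enclave trust boundary (NOT actual SGX code):
# the untrusted host only ever handles ciphertext; decryption and computation
# happen inside a function standing in for enclave code.
from cryptography.fernet import Fernet

DATA_KEY = Fernet.generate_key()   # in reality: provisioned to the enclave only
                                   # after successful remote attestation

def enclave_process(ciphertext: bytes) -> bytes:
    """Stand-in for enclave code: decrypt, compute, re-encrypt inside the boundary."""
    cipher = Fernet(DATA_KEY)
    reading = float(cipher.decrypt(ciphertext))   # plaintext exists only here
    result = reading * 1.21                       # e.g. apply a tariff to a meter reading
    return cipher.encrypt(str(result).encode())

def untrusted_host(encrypted_input: bytes) -> bytes:
    # The host (OS, hypervisor, cloud operator) only forwards opaque ciphertext.
    return enclave_process(encrypted_input)

if __name__ == "__main__":
    client_cipher = Fernet(DATA_KEY)
    encrypted = client_cipher.encrypt(b"42.0")    # data owner encrypts before upload
    encrypted_result = untrusted_host(encrypted)
    print(float(client_cipher.decrypt(encrypted_result)))   # 50.82
```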

SecureCloud will seamlessly integrate the new dependability features into a standard cloud stack to encourage easy migration of both critical and non-critical applications to the cloud without compromising application dependability. Moreover, SecureCloud will convincingly validate and demonstrate the benefits of the proposed approach by applying it to real-world scenarios involving smart grids, a demanding big data use case in the domain of critical infrastructures.


Towards Trust in cloud services with CLARUS – how we are tackling related legal issues

Authors: Pieter-Jan Ombelet, Stephanie Mihail (University of Leuven - KU Leuven, Belgium)

Focus Area

The aim of the Horizon 2020 CLARUS project is to offer cloud services that are compliant with global standards and European IPRs, reflecting federated and coherent road maps. CLARUS will impact the current cloud landscape by delivering key elements: reinforced European leadership in privacy-preserving technologies that safeguard the privacy of citizens in cloud computing environments, increased interoperability of systems and services from different vendors, and innovative research in security-enabling techniques and new architectures for secure delivery in the cloud. This paper provides an overview of the legal research undertaken within the context of CLARUS, with a particular focus on the legal issues encountered during the development and deployment of the CLARUS solution.

Who stands to benefit and how 

More specifically, CLARUS will enhance trust in cloud services by developing a secure framework for the storage and processing of data outsourced to the cloud. The CLARUS framework aims to allow end users to monitor, audit and retain control of the stored data while maintaining the functionality and cost-saving benefits of cloud services.

Position Paper

Tackling legal challenges

The main legal concerns impeding the mainstream adoption of the cloud relate to privacy and security matters, as well as the concepts of interoperability and portability. In this respect, multiple legal frameworks are analysed.

First and foremost, the development of CLARUS draws on the use of two distinct data sets, namely geospatial data and eHealth data. The first data set relates to freedom of information legislation, whereas the second is analysed under the EU privacy and data protection framework. In Europe, legislation on freedom of information is rather fragmented and divergent. Despite these differences, there is one clear exception regarding access to environmental and spatial data, which is of particular significance to the geospatial data used in the CLARUS project. More specifically, the EU has established a clear legal framework vis-à-vis the availability of public sector spatial data. There are three main instruments, namely the ACCESS Directive, the INSPIRE Directive and the PSI Re-use Framework.

Legal research for the eHealth use case has a clear focus on privacy and data protection issues, one of the priority areas for CLARUS. The legal requirements of the current data protection legislation, as well as of the recently adopted General Data Protection Regulation (GDPR), which will apply from 25 May 2018 and will be directly applicable in all EU Member States, are thoroughly analysed for the implementation of a legally compliant CLARUS solution. Attention is paid to the concept of ‘Privacy by Design’ and the associated implementation challenges, as well as to the consequences and implications of the ‘Safe Harbor’ ruling on the transfer of personal data, which invalidated the EU-US Safe Harbor Agreement regulating the exchange of personal data between the EU and the US. Currently, such transfers rely on standard contractual clauses (SCCs) and binding corporate rules (BCRs), while the ‘Privacy Shield’ Agreement is expected to be adopted soon. Regarding the security aspects relevant to CLARUS, we have explored the liability issues of cloud service providers and intermediaries, based on general tort law and the E-commerce Directive. Relevant cybercrime legislation and EU Directives on attacks against information systems are also analysed, as well as the European Investigation Order in criminal matters.

The added value of the CLARUS solution lies in enabling the delivery of new and improved services, while retaining full control over any potentially confidential or business-critical data outsourced to the cloud. For cloud service providers, the CLARUS trust-enabling solution will increase market potential by attracting a much broader spectrum of prospective cloud customers. These objectives will also be met by making a thorough analysis of the key elements of the applicable legal framework alongside the study of the recent milestone decisions from the European Court of Human Rights (ECtHR) and the Court of Justice of the European Union (CJEU).

The comprehensive legal guidance provided throughout the life cycle of the CLARUS project on the impact of these legal requirements on cloud services will enhance the trust of all stakeholders, particularly organisations on the demand side, which are currently hesitant to adopt cloud services. Furthermore, this research will result in concrete recommendations for partners and policy makers in order to ensure a fully compliant CLARUS solution, able to improve users’ trust, safety, privacy and confidence in cloud services.

How WISER project is preparing the ground for cyber security challenges in the Digital Single Market

Authors: Elena González, Antonio Álvarez, Aljosa Pasic (ATOS, Spain)

Focus Area

Started in June 2015, the WISER project will deliver, in late 2017, a cyber risk management framework that dynamically assesses the cyber risk to which the client's ICT infrastructure is exposed. This is done by continuously monitoring the risk associated with the cyber-climate of its ICT operational environment. The framework encompasses not only the technical side of cyber risk but also the business side, including socio-economic impact assessment. WISER builds on current state-of-the-art methodologies and tools, leveraging best practices from multiple industries.

Given the traditional approach, in which cyber risk management is performed only periodically, and the current state of the art, in which risk management frameworks lack an integrated, agile methodology for analysing cyber risks, there is a clear and growing demand for continuous monitoring of security-relevant events and dynamic assessment of risk. When a cyber-attack threatens valuable assets, reliable support for decision-making is essential. WISER provides that support, helping organisations adopt the correct measures while maximising the return on investment (RoI).

Who stands to benefit and how

Among the WISER goals, the highest priority is making cyber security affordable for SMEs. WISER therefore focuses mostly on the needs of SMEs, which often do not have the means to handle cyber risk with advanced methodologies and tools, and cannot usually afford consultancy services. WISER aims to be a sophisticated solution that is nevertheless easy for the end user to adopt. Although, as mentioned before, SMEs are the main target of the WISER solution, any organisation has to manage cyber security risks appropriately and show that it is capable of doing so successfully, as pointed out by the European Commission in its communication on ‘ICT Standardisation Priorities for the Digital Single Market’ (19/04/2016). The WISER conception is based on this growing security market need.

On top of this, WISER is facilitating the uptake of a cyber security culture that enhances business opportunities and competitiveness in the private sector, making cyber security a key selling point.

This means that, despite the strong focus on SMEs, WISER intends to provide affordable, effective and efficient cyber security to clients, no matter the segment they belong to.

Position Paper

Digital Single Market Strategy

As the European Commission continues to progress the development of a Digital Single Market (DSM), organisations across the region are beginning to think carefully about how the initiative would impact them. Coupled with the impending adoption of the General Data Protection Regulation (GDPR), privacy and security issues are quickly moving to the top of companies’ and politicians’ agendas.

The first step is to complete a cybersecurity and privacy assessment for the company's cross-border business and digital services. An organisation cannot defend itself perfectly against every threat, so technology decisions need to be risk-based decisions. Thinking carefully about the size of the organisation and its appetite for risk, businesses should consider which areas are more vulnerable to threats, establish priorities for mitigation goals, and establish cost-efficient mitigation measures. It is important to understand that there is no one-size-fits-all standard for risk assessment: any successful evaluation has to be based on a thorough expert analysis leading to a comprehensive and holistic picture of the business risks. WISER is aligned with the DSM, specifically with initiatives 12 and 13, contributing to increased cyber risk awareness by educating risk managers and boards of directors across the market.

To reach this new level in cyber security, WISER is developing a methodology based on a defined risk assessment cycle, in which risk is evaluated by correlating, through risk models and associated model rules, information coming both from the monitored infrastructure and from the client. The former provides technical information regarding the cyber climate and the presence (or absence) of incidents within the client's ICT infrastructure, while the latter gives valuable input about the infrastructure elements and their business value according to the client's criteria. On this basis, a dynamic risk evaluation is performed and expressed in both qualitative and quantitative terms. WISER thus goes a step further: it does not settle for detecting and reporting cyber incidents, but also evaluates their business importance, providing information that is crucial for top management, the people with decision-making capabilities. On top of that, besides the business impact, the tool reports on the societal component of the risk, which is one of the main novelties brought by WISER.
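
The correlation step described above can be pictured with the toy calculation below. It is not the WISER framework: the indicator names, weights and thresholds are invented, and the sketch only shows how monitored technical data and the client-declared business value of an asset might be combined into a quantitative score and a qualitative level.

    # Minimal sketch of a dynamic risk evaluation: likelihood is derived from
    # monitored technical indicators, impact from the client-declared business value.

    QUALITATIVE = [(0.0, "low"), (0.25, "medium"), (0.5, "high"), (0.75, "critical")]

    def likelihood(indicators):
        """Map monitored technical indicators (each in 0..1) to a likelihood score."""
        weights = {"failed_logins": 0.3, "unpatched_services": 0.4, "open_ports": 0.3}
        return sum(weights[k] * indicators.get(k, 0.0) for k in weights)

    def risk(asset, indicators):
        """Quantitative risk = likelihood x business impact, plus a qualitative label."""
        score = likelihood(indicators) * asset["business_value"]  # both in 0..1
        label = [lbl for threshold, lbl in QUALITATIVE if score >= threshold][-1]
        return score, label

    if __name__ == "__main__":
        web_shop = {"name": "web shop", "business_value": 0.9}          # client input
        metrics = {"failed_logins": 0.7, "unpatched_services": 0.5,
                   "open_ports": 0.2}                                   # from monitoring
        score, label = risk(web_shop, metrics)
        print(f"{web_shop['name']}: risk={score:.2f} ({label})")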

The cycle is completed with the decision-making process, which provides the client with decision-support tools that make it easier to select mitigation options integrating the technical, business and societal views of risk. The effect of the mitigation actions is measured in the following risk cycle, where the risk level evaluated by WISER should have diminished. In conclusion, WISER represents an advance over the state of the art by leveraging today's best practices and recent research results. It is not simply about monitoring cyber incidents; it is about assessing the risk they pose to a company. It considers not only the damage to the ICT infrastructure but also the damage to the business, providing a multi-level assessment. This risk evaluation evolves as the cyber climate changes. The definition of mitigation measures is assisted by the framework with solid criteria to apply to decision-making. And all of this comes with a strong focus on SMEs, with the aim of making cyber risk assessment and management affordable. In a nutshell, WISER aspires to drive a “cyber security for all” approach.

Cloud Federation & Open Science Cloud

HPC as a Service

Authors: Bruno Schulze and A. Roberto Mury (LNCC, Brazil)

Focus Area

» Building a culture of transparency for customer-friendly Service Level Agreements (SLAs).

» How can Cloud Service Providers make their SLAs more transparent? A Cloud Service Provider perspective

» What challenges do customers face when signing an SLA? A Cloud Service Customer perspective

» Cloud & IoT - Challenges of integration of big data and IoT using cloud as an enabler

Who stands to benefit and how

Academia and the research community running simulations and applications. The service could also be used for educational purposes and for deploying specialised computational services, with the built-in expertise hidden from the final user. There are potential application services and developments not yet being considered or properly addressed; there is a missing middle class of applications.

Position Paper

Several studies have been carried out to examine the limitations of clouds in supporting scientific applications. Most of them are dedicated to the behaviour of scientific applications, which are typically characterised by large amounts of information processing and massive use of computational resources. In this context, clouds emerge as a way to provide additional resources or to minimise the cost of acquiring new ones.

The use of clouds in support of scientific applications has inherent characteristics that differ from commercial workloads. Virtualization technologies are the basic elements of cloud infrastructure and, despite their significant advances, they still present limitations when confronted with the high computational power and communication demands of many scientific applications.

However, using virtualized resources demands a deeper understanding of the characteristics of the applications and of the cloud architecture. Our Distributed Scientific Computing group at the National Laboratory for Scientific Computing (ComCiDis/LNCC), together with other research groups, suggests that the different virtualization layers and hardware architectures used in the cloud infrastructure influence the performance of scientific applications. This influence leads to the concept of affinity, i.e., which group of scientific applications performs better on the virtualization layer and hardware architecture being used (a toy placement sketch based on this idea follows the list below). Exploring this affinity involves:

» Reducing cloud environment limitations in support of scientific applications;

» Providing the basis for the development of new cloud scheduling algorithms;

» Assisting the acquisition of new resources and cloud providers, looking for performance and resource usage optimization.
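
A toy version of affinity-driven placement is sketched below. The application classes, virtualization layers and scores are invented for illustration and are not measurements from ComCiDis/LNCC; the sketch only shows how an affinity table could drive a placement decision.

    # Illustrative affinity table: relative performance (1.0 = bare-metal baseline)
    # of application classes on different virtualization layers.  Values invented.
    AFFINITY = {
        "cpu-bound":    {"kvm": 0.97, "containers": 0.99, "xen": 0.95},
        "mpi-latency":  {"kvm": 0.80, "containers": 0.93, "xen": 0.78},
        "io-intensive": {"kvm": 0.85, "containers": 0.96, "xen": 0.88},
    }

    def best_layer(app_class):
        """Pick the virtualization layer with the highest affinity for the class."""
        layers = AFFINITY[app_class]
        return max(layers, key=layers.get)

    if __name__ == "__main__":
        for app in ("cpu-bound", "mpi-latency", "io-intensive"):
            print(f"{app:13s} -> schedule on {best_layer(app)}")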

EUBra-BIGSEA: Cloud QoS for Big Data Applications

Authors: Ignacio Blanquer (Polytechnic University of Valencia, Spain), Wagner Meira Jr. (Federal University of Minas Gerais, Brazil)

Focus Area 

Big Data applications normally deal with dynamic streams of data that require periodic processing. This is indeed the case of public transport data, which requires the repeated processing of massive data within a deadline. Adjusting the right amount of resources to complete the processing according to an SLA is a challenge when minimising cost without compromising Quality of Service. Resource management frameworks tend to overestimate requirements in both memory and computing.

Who stands to benefit and how

EUBra-BIGSEA is an API-centric project for providing cloud services to data analytics applications. Therefore, EUBra-BIGSEA targets application developers and data scientists, who can run their processing code efficiently on a self-adjusting platform. As its main use case, EUBra-BIGSEA targets public transportation analysis, and it will develop an application for citizens and municipalities to plan their journeys or to evaluate their transportation network.

Position Paper 

EUBra-BIGSEA is a European - Brazilian collaboration of Big Data Scientific Research through Cloud-Centric Applications. EUBra-BIGSEA will develop a programming interface for developing data analytic applications that will leverage proactive and reactive elasticity policies on top of a cloud-agnostic platform.

EUBra-BIGSEA has defined a cloud QoS software architecture able to deal with the three types of jobs that have been identified through the requirements analysis: long-running high-availability jobs, periodic and deadline-bounded batch jobs, and interactive tasks.

EUBra-BIGSEA uses a container model to embed client software dependencies and integrates several schedulers, namely Marathon, Chronos and YARN, to address the three types of jobs. This way, the same resource infrastructure is managed by a single service, which removes the need to statically partition the data centre and adjusts the active resources to the workload. This enables not only mixing multiple heterogeneous workloads but also allocating resources through local policies, thanks to a software component called EC3.

EUBra-BIGSEA instruments a Mesos framework with an elastic provisioning system that will allocate and reconfigure the Mesos slaves on demand. A service implements proactive policies to determine the amount of resources that should be assigned to a specific job to meet a given deadline. This service learns from the experience of past executions and readjusts the request in subsequent iterations. The reactive policies will allocate more resources if the job is about to miss its deadline. The use of containerised applications facilitates memory overcommitment by the Mesos framework, by means of the Cloud Virtual Machine Automated Procurement system (CLOUDVAMP), which over-commits physical memory.
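
The proactive and reactive policies can be pictured with the toy estimator below. It is only a sketch under simple assumptions (linear scaling and a record of past per-worker throughput), not the EUBra-BIGSEA scheduler: the proactive step sizes the allocation from past runs, and the reactive step adds workers when the projected finish time would miss the deadline.

    import math

    def estimate_workers(past_runs, input_size, deadline_s, safety=1.2):
        """Proactive policy: from past runs (records/s per worker), estimate how
        many workers are needed to process `input_size` records by the deadline."""
        per_worker_rate = sum(r["records"] / r["seconds"] / r["workers"]
                              for r in past_runs) / len(past_runs)
        needed = input_size / (per_worker_rate * deadline_s)
        return max(1, math.ceil(needed * safety))  # safety margin against variance

    def reactive_adjust(workers, done, total, elapsed_s, deadline_s):
        """Reactive policy: if the projected finish time misses the deadline,
        scale the number of workers proportionally to the overrun."""
        rate = done / elapsed_s                      # records/s with current allocation
        projected = elapsed_s + (total - done) / rate
        if projected <= deadline_s:
            return workers
        return math.ceil(workers * projected / deadline_s)

    if __name__ == "__main__":
        history = [{"records": 2_000_000, "seconds": 1800, "workers": 8},
                   {"records": 2_400_000, "seconds": 2000, "workers": 8}]
        w = estimate_workers(history, input_size=3_000_000, deadline_s=3600)
        print("proactive allocation:", w, "workers")
        print("reactive allocation:", reactive_adjust(w, done=900_000, total=3_000_000,
                                                      elapsed_s=1500, deadline_s=3600))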

Finally, the system will be monitored by means of OpenStack MONASCA, which will trigger resource reallocation when the metrics cross a given threshold. The platform's main use case is the periodic execution of a data processing algorithm over fresh data coming from the activity of the previous day and beyond.

This processing involves a series of steps that need to be done in a given order and completed by a specific deadline. Data must be retrieved and preprocessed to infer routes and vehicle usage. Models for delay estimation must be trained with the new data before the rush hour, but not so early that the data used is outdated. Therefore, this constitutes a clear use case for deadline-based QoS periodic jobs.

Adaptive Virtual Network Provisioning for IaaS Clouds

Authors: Guilherme Koslovski, Adriano Fiorese, Charles Miers, Felipe Rodrigo de Souza (Santa Catarina State University – Joinville, SC – Brazil)

Focus Area 

Cross-domain orchestration and programmable networks in clouds

Who stands to benefit and how

IaaS providers and tenants. Each end-user can compose a private virtual network for interconnecting a set of virtual machines.

Position Paper 

1. Introduction

Cloud Computing has revolutionised the provisioning of IT services. In particular, Infrastructure as a Service (IaaS) providers have been using virtualization technology to deliver on-demand virtual machines (VMs) [1]. VMs can be defined based on application requirements; parameters defining RAM, CPU and storage are usually available according to instance types (e.g., small, large). Moreover, the elastic provisioning of virtual resources has paved the road for composing adaptive virtual environments. Each VM can be dynamically adjusted to the application's load, increasing or decreasing its configuration (vCPUs, storage size, and memory capacity).

Taking this into account, we argue that network configuration and management are critical aspects for improving the performance of cloud-hosted applications. In cloud datacentres, applications can generate large volumes of data, and a significant portion of their running time is due to network activity. For instance, a Facebook cluster can spend up to 33% of its running time only on data transfers [2], while under-provisioned virtual networks can drastically affect the performance of hosted applications [3].

Providers have used virtualization-based tools for datacentre management (e.g., load balancing, upper-bound limits for bandwidth provisioning). However, tenants have no access to such technologies. The networking price model is based on data transfer volume, charging for upload and download rates. Little attention has been directed toward examining internal datacentre communication guided by the performance and security of hosted applications, though. A notable initiative is Amazon Virtual Private Cloud, in which tenants can specify some network requirements for VM traffic. Considering these points, we claim that adaptive virtual networking is a promising technology for improving the performance and security of hosted applications. Section 2 details how this technology should be used as a driving force for composing adaptive virtual infrastructures, while Section 3 concludes this position paper.

2. Virtual Network as a Service

Network virtualization was disseminated as a promising technology for overcoming the ossification of traditional network protocols and architectures [4][5]. However, the adoption of network virtualization has faced obstacles since its proposition due to the rigidity of network management. Recently, Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) have emerged as practical solutions for composing dynamic virtual networks [6]. SDN decouples the network control and data planes, enabling the development of new protocols and routing algorithms, while NFV introduces the provisioning of virtual resources (e.g., routers, switches, firewalls, load balancers) atop physical servers.

Cloud providers have been using SDN to manage datacentres; however, we argue that such ossification-breaking technologies must also be offered as a service to tenants, enabling the composition of fully virtualized infrastructures: a set of VMs interconnected by virtual networking resources. In this context, the number of virtual resources (machines, routers, switches and links) and their configuration (processing, storage, bandwidth, latency) can be dynamically adjusted based on the requirements of hosted applications. Similarly to other IT resources, a virtual network can be dynamically adjusted to accommodate load peaks. For instance, distributed applications with sporadic high networking demand can adjust virtual topology requirements just when networking activity is required, avoiding bandwidth under- and over-provisioning. Adaptive virtual network provisioning is a key concept for enabling network elasticity on IaaS providers. With respect to security aspects, network virtualization can improve confidentiality, accountability, availability and authenticity.
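
A minimal control loop for the kind of elastic bandwidth adjustment argued for above might look like the sketch below. The thresholds are illustrative, and set_bandwidth is a placeholder for whatever SDN/NFV interface a provider would actually expose to tenants.

    # Sketch of an adaptive virtual-link controller: scale allocated bandwidth up
    # when utilisation is high, release it when the link is mostly idle.

    SCALE_UP_AT = 0.80    # utilisation above which we add capacity
    SCALE_DOWN_AT = 0.30  # utilisation below which we release capacity
    STEP_MBPS = 100
    MIN_MBPS, MAX_MBPS = 100, 1000

    def set_bandwidth(link_id, mbps):
        """Placeholder for a call to the provider's SDN/NFV interface."""
        print(f"link {link_id}: allocate {mbps} Mbps")

    def adjust(link_id, allocated_mbps, used_mbps):
        """One adaptation step based on the measured utilisation of a virtual link."""
        utilisation = used_mbps / allocated_mbps
        if utilisation > SCALE_UP_AT and allocated_mbps < MAX_MBPS:
            allocated_mbps = min(MAX_MBPS, allocated_mbps + STEP_MBPS)
            set_bandwidth(link_id, allocated_mbps)
        elif utilisation < SCALE_DOWN_AT and allocated_mbps > MIN_MBPS:
            allocated_mbps = max(MIN_MBPS, allocated_mbps - STEP_MBPS)
            set_bandwidth(link_id, allocated_mbps)
        return allocated_mbps

    if __name__ == "__main__":
        alloc = 200
        for used in (50, 180, 190, 60):   # simulated measurements in Mbps
            alloc = adjust("vm1-vm2", alloc, used)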

The introduction of a virtual network hypervisor (based on SDN and NFV), between physical resources and virtual infrastructures, assures that tenants have no direct access to the host datacentre. Moreover, each sensitive task (e.g., switch configuration, firewall rule) can be trapped by the IaaS networking hypervisor, which is in charge of verifying security and isolation aspects.

3. Conclusion

Virtualization is recognised as a key technology for IaaS cloud providers. However, IaaS providers have been applying virtualization techniques for internal datacentre management only. A perspective for improving the performance and security of cloud-hosted applications is the adaptive provisioning of virtual networks. Indeed, IaaS providers must consider the network as a critical resource that can be configured and deployed on demand.

References

[1] S. S. Manvi and G. K. Shyam, “Resource management for Infrastructure as a Service (IaaS) in cloud computing: A survey,” Journal of Network and Computer Applications, vol. 41, pp. 424 – 440, 2014. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1084804513002099

[2] M. Rost, C. Fuerst, and S. Schmid, “Beyond the stars: Revisiting virtual cluster embeddings,” in ACM SIGCOMM Computer Communication Review (CCR), 2015.

[3] T. Truong Huu, G. Koslovski, F. Anhalt, J. Montagnat, and P. Vicat-Blanc Primet, “Joint Elastic Cloud and Virtual Network Framework for Application Performance-cost Optimization,” J. Grid Comput., vol. 9, no. 1, pp. 27–47, Mar. 2011. [Online]. Available: http://dx.doi.org/10.1007/s10723-010-9168-6

[4] M. Handley, “Why the internet only just works,” BT Technology Journal, vol. 24, no. 3, pp. 119–129, Jul. 2006. [Online]. Available: http://dx.doi.org/10.1007/s10550-006-0084-z

[5] N. M. M. K. Chowdhury and R. Boutaba, “Network Virtualization: State of the art and research challenges,” Comm. Mag., vol. 47, no. 7, pp. 20–26, Jul. 2009. [Online]. Available: http://dx.doi.org/10.1109/MCOM.2009.5183468

[6] W. Stallings, Foundations of Modern Networking: SDN, NFV, QoE, IoT, and Cloud, 1st ed. Addison-Wesley Professional, 2015.

A Federation of Testbeds for Experimentation in Next Generation Wireless-Optical Convergent Networks: A Case-Study of NFV/C-RAN

Authors: Juliano Araujo Wickboldt, Maicon Kist, Marcelo Marotta, and Cristiano Bonato Both (FUTEBOL Project)

Focus Area

» Federated cloud infrastructures for research and science-applications addressing societal challenges, big data value, open standards

» Cloud-Radio Access Networks (C-RAN)

» Network Functions Virtualization (NFV)

» Wireless-optical convergent network orchestration

Who stands to benefit and how

The federation of testbeds created by the FUTEBOL project will benefit: (i) experimenters who can take advantage of the infrastructure and tools enabled by FUTEBOL, helping them capitalize on their innovation efforts; (ii) innovators and researchers both in the academic and corporate research & development domains working on resource allocation including optimization in wireless and optical networks; (iii) telecommunication manufacturers, operators and service providers, such as, but not limited to, the 5G-PPP members, and organizations participating in the recent H2020 ICT calls; (iv) society as a whole including students, regulators, standardization groups, etc.

Position Paper

FUTEBOL (Federated Union of Telecommunications Research Facilities for an EU-Brazil Open Laboratory) is a research project that aims to create research infrastructure and tools to enable and promote the federation of experimental telecommunication resources, irrespective of their location in Brazil or Europe, across heterogeneous wired and wireless networks. The focus of FUTEBOL is on building upon current tools and platforms in support of end-to-end experimentation, creating a pool of, and giving open access to, shared experimental network resources that complement those available on each continent.

Besides deploying the testbed infrastructure and developing the associated control frameworks for experimentation, FUTEBOL will also deliver a series of showcases to demonstrate use cases of concrete research utilizing innovative and challenging network architectures. These use cases are based on the Internet of Things (IoT), heterogeneous and dense wireless network deployments (HetNets), and Cloud-Radio Access Networks (C-RAN). Every showcase will highlight the interplay between optical and wireless networking technologies. Moreover, at least some of the showcases will include interconnection between testbed infrastructures located on the two sides of the ocean (i.e., Brazil and Europe).

In this position paper, we describe a specific experiment currently being designed by FUTEBOL partners that explores virtualization in the context of C-RAN. In C-RAN, traditional Base Stations (BSs) are decoupled into Remote Radio Heads (RRHs) and Baseband Units (BBUs). RRHs are co-located at each cell site and perform only the radio frequency translation, whereas BBUs are placed at a central site and provide the fundamental physical layer capabilities. The transport of radio signals is carried by optical fronthaul/backhaul networks. In this context, BBU functionality can be implemented as Virtual Network Functions (VNFs) on top of standard datacentre hardware. This breaks the static relationship between RRHs and BBUs, as the baseband processing of an RRH is carried out by a chain of VNFs in the datacentre. Coordinating these chains of VNFs and configuring/programming the underlying optical network to adapt to demand fluctuation can be considered a major orchestration challenge (a toy sketch of such a decision follows the list below).

To perform this experiment, we will rely on the infrastructure deployed by FUTEBOL partners in Europe and Brazil, which includes state-of-the-art equipment such as Software-Defined Radio (SDR) and programmable optical switching technologies. Virtualization support will be added to these programmable platforms, allowing dynamic allocation and sharing of resources for experimentation purposes. In this specific experiment, we intend to show the impact of orchestrating virtualized network functions in the context of C-RANs to benefit bandwidth- and latency-sensitive applications. Examples of these applications are:

1. High-quality video transmission over wireless/mobile networks considering vertical and horizontal handover,

2. Response to mobile users demand fluctuation (i.e., churn),

3. Controlling backhaul/fronthaul congestion.
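
One way to picture the orchestration decision targeted by this experiment is the toy calculation below: given the load reported by each RRH, decide how many BBU VNF instances to run and how much fronthaul capacity to request from the optical network. The capacity figures are invented for illustration; this is not the FUTEBOL control framework.

    import math

    BBU_VNF_CAPACITY = 4.0      # illustrative: load units one BBU VNF can process
    FRONTHAUL_PER_RRH = 2.5     # illustrative: Gbps of fronthaul per active RRH

    def orchestrate(rrh_loads):
        """Decide VNF instances and fronthaul capacity from per-RRH load reports."""
        total_load = sum(rrh_loads.values())
        bbu_instances = max(1, math.ceil(total_load / BBU_VNF_CAPACITY))
        active_rrhs = sum(1 for load in rrh_loads.values() if load > 0)
        fronthaul_gbps = active_rrhs * FRONTHAUL_PER_RRH
        return bbu_instances, fronthaul_gbps

    if __name__ == "__main__":
        loads = {"rrh-1": 3.2, "rrh-2": 1.1, "rrh-3": 0.0, "rrh-4": 2.7}  # simulated
        vnfs, gbps = orchestrate(loads)
        print(f"run {vnfs} BBU VNF instance(s); configure {gbps} Gbps of fronthaul")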

Testbed as a Service: Experimental Worldwide Laboratories

Authors: Leandro Ciuffo, Tiago Salmito, Iara Machado (RNP, Brazil)

Focus Area

Trustworthy cloud platforms and services

Who stands to benefit and how

Researchers, professors, students and system developers, by having access to a myriad of research facilities “available in the cloud”

Position Paper

In the past years, several e-infrastructures and experimental platforms have been constructed by different organizations and projects. In Europe alone, for instance, the FIRE initiative (www.ict-fire.eu), financed by the European Commission, has been growing since its inception in 2010 with the ambition of being Europe's Open Lab for Future Internet research, development and innovation. Today, FIRE's offering includes 25 experimental facilities, ranging from IoT and robotics to SDN and wireless networks.

A usual trend after a proliferation of research infrastructures is the emergence of new projects to promote the interoperability and federation of platforms. In that way, each experimental platform (aka testbed) can be seen as a set of resources available for experimentation “in the cloud”. There are also testbeds for cloud experimentation (see https://www.chameleoncloud.org/). Another trend is the offering of such testbeds as a service, eventually generating revenues. This is the case of offering specialized platforms for software houses to deploy and test their solutions in an isolated environment at scale, outside their lab, to reduce the time for product development and prevent unexpected or errant behavior from interfering with production services (see the example of the FanTaaStic testbed: http://www.testbeds.eu). At another level, generating new services on top of testbeds, new concepts are emerging, like “Testing-as-a-Service” or “CEaaS: City Experimentation-as-a-Service”, as available in Bristol through its open IoT testbed (www.bristolisopen.com).

In Brazil, the FIBRE testbed (https://www.fibre.org.br) is starting to be offered as a service - at no charge - to the academic community, operated by the National Research and Education Network (RNP) in partnership with several Brazilian research and education institutions. Its infrastructure is composed of a federation of local testbeds, also known as experimentation islands, which provides support for unified resource sharing, manages the lifecycle of experiments and offers mechanisms to facilitate federation.

This year, the FIBRE testbed is entering a new production phase, with a reformulated architecture and modernised underlying tools, in order to keep up with advances in experimental testbed-as-a-service infrastructures and to support a wider community of researchers and use cases.

Cutting edge cloud technologies: 5G, Cloud and IoT, Fog computing

Cyber-Physical Systems of Systems meet the Cloud

Author: Andrea Bondavalli (University of Florence, Italy)

Focus Area

» Cloud & IoT - Challenges of integration of big data and IoT using cloud as an enabler

» Trustworthy cloud platforms and services

» Data streaming and big data analytic services - processing massive data in real-time

Who stands to benefit and how

Many organizations using and managing Cyber-Physical Systems need to perform ever more sophisticated real-time management of the data collected from sensors, both to guide the physical actuation of their systems and to have these systems interact with the surrounding environment and with similar systems. People, citizens and organizations also expect an increasingly sophisticated and yet smooth and unnoticeable immersion of their appliances in the surrounding environment.

Position Paper

A Cyber-Physical System of Systems (CPSoS) is an integration of a finite number of constituent systems (CSs), which are independent and operable, and which are networked together for a period of time to achieve a certain higher goal. This compound encompasses several CPSs, each consisting of a cyber system, a controlled object (a physical system) and possibly interacting humans.

Data management and processing requirements in such a context are demanding and can be met in the cloud, provided that some conditions are adequately satisfied. The interaction between CPSoSs and the cloud will be one of the most important drivers for the future development of information and communication technologies, bringing computing power, services and data everywhere and, conversely, making information about our physical world available to every interested stakeholder.

Obviously, many challenges need to be addressed over the coming years, as trust, security, resiliency and dependability are required in several different flavours. Essential to this horizon is the ability to master and reduce the complexity of the resulting infrastructures. The ever-growing cognitive complexity of large CPSoSs is a topic of utmost concern, being a major cause of cost overruns, unreliability or outright failure.

The soundness and practical demonstration of the assumption that all subsystems share a common notion of a dependable global physical time, with a specified precision, was achieved in the AMADEOS project. The availability of a dependable time simplifies many temporal coordination problems and provides the backbone of the temporal infrastructure of the architecture. On this basis, the systematic application of the well-proven divide-and-conquer principle allows one to master the complexity and improve the understanding of the behaviour of a CPSoS:

» Partition the CPSoS along stable Relied-Upon Interfaces (RUIs) into a set of nearly autonomous Constituent Systems (CSs). The RUIs are well defined in the domains of time and value, at the syntactic and semantic levels. Normal information flow among different components shall be explicitly managed, to identify potential error propagation channels.

» Decouple the real-time software architecture from the underlying hardware architecture to simplify software changes and hardware updates.

» Decouple the communication actions from the processing actions in order that the proper operation of the system in the temporal domain and value domain can be established independently.

A proper integration of CPSoS and Cloud

In the last decade, the development of cloud computing has brought about a major revolution with tremendous cost reductions in the domain of non-real-time software development and execution. The main characteristic of cloud computing is the isolation of the application software from the underlying hardware infrastructure, such that software components can be executed flexibly on different hardware configurations. However, temporal guarantees cannot be provided by today's cloud computing environments.

We need to set in motion a similar revolution in the field of cyber-physical systems of systems. Our innovative model of a time-triggered virtual machine with predictable temporal properties for the execution of a real-time software component provides the basis for a flexible software-hardware allocation that helps to tackle the problems of online evolution and fault tolerance of safety-critical real-time systems. The results of research in this direction, starting from the results of AMADEOS and pushing beyond it, will be of particular importance for the nascent field of fog computing, where real-time tasks are executed reliably on generic hardware servers at the edge of the cloud.
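
The time-triggered execution model mentioned above can be sketched as follows, assuming, as in AMADEOS, that a dependable global time with a known precision is available. The period, deadline and task are placeholders, and the standard monotonic clock stands in for the synchronised global clock.

    import time

    PERIOD_S = 0.100     # task is released every 100 ms on the global time grid
    DEADLINE_S = 0.020   # the task must complete within 20 ms of its release

    def global_time():
        """Placeholder for a dependable, synchronised global clock."""
        return time.monotonic()

    def control_task(cycle):
        """Placeholder real-time task (e.g., one control-loop iteration)."""
        return cycle * cycle

    def run(cycles):
        start = global_time()
        for cycle in range(cycles):
            release = start + cycle * PERIOD_S      # pre-planned release instant
            while global_time() < release:          # wait for the time trigger
                time.sleep(0.001)
            control_task(cycle)
            lateness = global_time() - release
            if lateness > DEADLINE_S:               # temporal property violated
                print(f"cycle {cycle}: deadline miss ({lateness * 1000:.1f} ms)")

    if __name__ == "__main__":
        run(cycles=10)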

Cloud + IoT = New Software Requirements

Author: Roberto C Mayer (MBI, Brazil)

Focus Area

VersaCloud is a cloud-based middleware server that implements time-limited transaction management, based on an improved time-limited transaction concept. It offers a wealth of interrelated services, both to developers of front-end applications and to back-end servers, through a strongly typed Application Program Interface (API), in order to reduce the cost of integrating big data and IoT using the cloud as an enabler, providing a trustworthy cloud platform and services, as well as cross-domain orchestration and programmable networks in clouds through transactions.

Who stands to benefit and how

VersaCloud benefits developers by providing the new software requirements imposed by the cloud environment as ready-to-use services on the cloud itself. As an integration tool, it offers services aimed both at developers of end-user applications (running on PCs, mobiles, tablets or inside a browser) and at back-end server developers.

Position Paper

Cloud Computing and the Internet of Things have opened up a new space for innovation opportunities, which not only present many new requirements compared to traditional software, but also share many of these requirements among themselves. Hence, development costs can be reduced by introducing new, appropriate software tools aimed specifically at this scenario. Among these common requirements, we cite just a few:

1. Internet connections between devices and servers being integrated are far less reliable than local area networks. Worse, many modern devices run on batteries – so that devices disappear from the network when they run out of energy.

2. End-user solutions become more and more global, as cloud computing does not respect national borders. Many locally developed solutions in fact solve global problems, but developing global solutions is still a challenge for many local developers.

3. As end-users access services from various devices to interact with back-end servers, they expect user interface customization to migrate transparently from one device to the others. This omni-channel user interface requires storing customization data in the cloud.

4. Many solutions need to charge users, sometimes on very small amounts (aka micro-billing). A flexible billing and payment system can be used by many applications.

5. As more and more applications are produced, controlling the upgrade of users from one version to another is also a standard challenge: it is necessary to avoid overloading the servers providing the new version. This problem is ‘enhanced’ by the ‘lean’ strategy of testing various versions of applications simultaneously with different users.

6. Not only must users be handled securely; security measures are also needed to guard access to back-end servers so that they cannot suffer attacks such as denial-of-service.

7. Every single transaction needs to be auditable, in order to allow measuring solutions (be it for proving SLAs, marketing purposes, etc)

These, and other additional requirements, have been worked into a new cloud-based middleware transaction server, called VersaCloud, based on a revised version of the transaction concept that includes time limits on execution. A US Provisional Patent Application was filed on April 5th, 2016. VersaCloud is currently available as a public beta.
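
The time-limited transaction idea can be illustrated with the small sketch below. It is not the VersaCloud API: it merely shows a wrapper that stages the effects of a unit of work, commits them only if the work finishes within its time limit, and discards them otherwise; a real server would enforce the limit pre-emptively rather than checking it after the fact.

    import time

    class TimeLimitExceeded(Exception):
        pass

    def run_time_limited(work, limit_s):
        """Execute `work` as a time-limited transaction: commit only if it
        finishes within `limit_s`, otherwise roll back and report the overrun."""
        changes = []                      # staged effects, applied only on commit
        started = time.monotonic()
        work(changes)
        elapsed = time.monotonic() - started
        if elapsed > limit_s:
            changes.clear()               # roll back: discard staged effects
            raise TimeLimitExceeded(f"transaction exceeded {limit_s}s ({elapsed:.2f}s)")
        return changes                    # commit: hand back the staged effects

    if __name__ == "__main__":
        def record_reading(changes):
            time.sleep(0.05)              # simulate work (e.g., an IoT device update)
            changes.append(("sensor-42", 21.5))

        try:
            print("committed:", run_time_limited(record_reading, limit_s=0.1))
        except TimeLimitExceeded as exc:
            print("aborted:", exc)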

Notes: 1. MBI is a member of BraFIP - the Brazilian Technology Platform - and Mr. Mayer currently holds its presidency. 2. A more complete version of this position paper was presented at the 2016 European IoT Week, held in Belgrade in May, and is available at http://www.mbi.com.br/mbi/english/versacloud/Cloud-IoT-New-Software-Requ...

From Consumer to Industrial IoT: How Megatrends are shaping our society

Author: Welson Barbosa (EMC and SNIA Brasil)

Focus Area 

Cloud & IoT - Challenges of integration of big data and IoT using cloud as an enabler

Who stands to benefit and how

This paper discusses, at a high level, how megatrends, along with IoT, are shaping the way people and companies interact with “things”, using technology to communicate and to generate new opportunities. Since it is an overview, it is recommended for all types of audiences.

Position Paper

We are currently living through a very intense and ever-changing period in the tech industry. In the past, change did not happen so fast: we had transitions or evolutions, from mainframes to minicomputers to PCs, from desktops to mobile devices, and changes usually took a few years to consolidate. Today, instead, we have to deal with at least five major waves, or tsunamis. Cloud, big data, social and mobile are changing the way we do business, communicate, drive, walk, talk, train, and so on. These changes are shaping the new millennium generation: they do not know what it is like to be disconnected, and most of them type more than they talk. These are not just technology changes; these trends are changing our lifestyle and our society. In addition, IoT (Internet of Things), a term coined a few years ago to represent, in simple terms, the hyper-connectivity or uber-connectivity of today's world, means that basically every device we use today is or can be connected to the Internet, generating data for us to use or for someone else to take advantage of, and it is growing extremely fast.

From the consumer side, we can mention a few examples. Sports gear: today your running watch, your Fitbit and your health apps are all connected and generating data. Most of this technology, once available only to professional athletes, is now massively available to whoever wants to track performance and health. With that type of data in hand, doctors can run diagnoses not only at your yearly checkup, but in real time, or whenever they need access to it. Your health data is now available in the cloud. In our houses, cameras, thermostats and sensors are sending out data all day long. Our cars can be monitored from our smartphones. Industry is also taking advantage of this revolution: automation and sensors are more present than ever in manufacturing, ensuring the quality of products that are shipped to customers, improving product lines and ensuring safety, among other things. Sensors in airplane turbines generate data for maintenance purposes and can now help airline companies reduce fuel costs and improve safety. Connected cars can be fixed without a long and very inefficient recall procedure, simply by being connected to the Internet. Insurance companies are now using GPS data to calculate a driver's risk and offer a better price based on driving patterns and routes.

Such complex scenarios create the need for a common platform, or, as some experts are calling it, an IoT PaaS (Internet of Things Platform as a Service). Back-end services and optimised infrastructures have become key elements for a successful IoT solution: one common and most likely open platform where data can be uploaded, analysed and transformed into genuinely useful information. A Platform-as-a-Service infrastructure is one of the key pillars for implementing an IoT strategy. Rapid software development, scalability, manageability and speed in responding to market needs are essential elements that will determine success or failure.

Another pillar is analytics: how to use big data to mine valuable information from private data lakes or perhaps shared/public data lakes, such as government open data. In summary, we still have a long way to go before a common framework can integrate different types of IoT data; however, the demand is present and conversations have begun. Soon our world will be more connected and productive, thanks to the universal access to information provided by cloud technologies.

Usto.re – Innovating and reducing costs on private cloud management solutions

Authors: Leandro M. Nascimento, Rodrigo E. Assad, Vinicius C. Garcia (Usto.re, Brazil)

Focus Area 

Cloud infrastructures; Cloud services for business

Who stands to benefit and how

Companies of all sizes that plan to maintain their own private cloud infrastructure; cloud entrepreneurs who envisage leveraging their cloud services by enhancing security and keeping all processed data in-house; government and/or public companies, which need to keep citizens' information inside the country's borders, avoiding espionage and improving national security; practitioners and enthusiasts of cloud computing.

Position Paper

Information security is a major concern of private and public organizations of all sizes. After WikiLeaks' publications, followed by Edward Snowden's revelations, a worldwide red alert was sounded. Overall, information is becoming the main asset of any type of business, and transmitting, processing or storing it needs to be done with caution.

Considering the current widespread popularity of cloud computing services, companies' information tends not to be maintained on their own premises, but on third-party servers. In order to avoid strategic information leaks and security breaches, the Brazilian government even sanctioned a federal law obliging public institutions to keep all information on their own servers. In agreement with Brazilian law, one of the first steps to taking security seriously is not giving away your sensitive data to other organizations to process and/or store; even transmitting sensitive data over the Internet can be dangerous. This context has fired up the search for products that can replace common cloud services while using internal company infrastructure.

Ustore has been developing and delivering such products to several clients in Brazil for about five years, leveraging private cloud infrastructures and allowing companies of all sizes to keep and maintain their sensitive data in house. This position paper describes Ustore's products and their evolution over the last year.

These products may help to reduce costs with third-party services in the long term, such as corporate e-mail or file storage. The following are some successful cases of Ustore products:

1. Corporate e-mail: uMail is the software that keeps corporate e-mail at maximum security. It encrypts all exchanged messages, provides access to contact lists and calendars, and integrates with a customised storage tool that allows all attached files to be saved properly in a virtual drive. This product was successfully sold and delivered to the Brazilian Department of Defense and is responsible for managing thousands of corporate mail accounts.

2. Cloud storage: uStorage and uDrive are two integrated products that control file backups and versioning inside the corporation, based on P2P technology to speed up file transmission and improve disaster recovery, since files are replicated across peers. In addition, uDrive uses iSCSI technology to improve file transfer speed. Both products were sold and delivered to the Brazilian DoD as a complete solution integrated with uMail.

3. Cloud infrastructure manager: uCloud can control in-house datacentres, including virtual machine, network and storage management. uCloud is on its way to being deployed as a solution for private cloud management at PRODAP (www.prodap.ap.gov.br), an IT government company of Amapá State, Brazil. The solution will be used to manage approximately 512 CPUs, 2,048 gigabytes of RAM and 60 terabytes of storage.

4. Desktop virtualization: integrated to uCloud solution, uVDI allows desktop virtualization over a private cloud infrastructure. This means that the company would not need to provide a computer for every employee, reducing costs of ownership and maintenance. With uVDI, it is possible to instantiate several virtual machines with the same configuration for hundreds of employees within minutes.

Low-cost and Open Source Framework for Monitoring Power Usage Effectiveness on IaaS Clouds

Authors: Charles Miers, Daniel Camargo, Guilherme Koslovski, Maurício Pillon (Computer Science Department, Santa Catarina State University – Joinville, SC – Brazil)

Focus Area

» Cloud & IoT - Challenges of integration of big data and IoT using cloud as an enabler

» Data flow and portability of data: remedies to vendor lock-in

Who stands to benefit and how

Clouds hosted by public and academic institutions can compose a low-cost and open source framework for monitoring datacentre PUE.

Position Paper 

1. Introduction

Recent studies indicate that almost 2% of the world's energy consumption is due to IT datacentres. Beyond the appealing social aspects, the reduction of energy consumption stands out for its immediate economic impact on IaaS providers. For instance, datacentre cooling accounts for 25-40% of energy consumption in small/medium-sized datacentres [1].

Moreover, inadequate environmental temperature reduces hardware lifetime [2]. Datacentres use a set of indicators to account for energy management efficiency. In this context, Power Usage Effectiveness (PUE) is a well-known metric, given by the ratio between total datacentre energy consumption (Ptotal) and IT equipment consumption (PIT): PUE = Ptotal / PIT. Accordingly, 1 ≤ PUE ≤ ∞, in which 1 indicates the maximum efficiency level. Public cloud providers have PUE values close to the optimal value: 1.11 for Google, 1.06 for Facebook, and 1.2 for Amazon EC2. Typically, datacentre monitoring frameworks are commercialized/licensed through expensive agreements using proprietary components. Although efficient, the trade-off between acquiring such solutions and their effective benefit must be considered by IaaS providers. Usually, proprietary solutions focus on monitoring either the IT or the support systems individually, without a proper approach for combining both sources of data. Expensive solutions are inadequate for private clouds hosted by public and academic institutions. We claim that the proliferation of low-cost sensors and actuators presents an opportunity for composing a monitoring framework based on open source software and hardware. In other words, an Internet of Things (IoT) [3] approach can be applied to monitoring and controlling energy consumption in cloud datacentres. Open source software and hardware bring technological independence, parameterization, cost reduction, and interoperability with legacy systems for cloud providers [4].
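
As a quick illustration of the metric, the helper below computes PUE from total facility power and IT power readings; the sample readings are invented.

    def pue(total_kw, it_kw):
        """Power Usage Effectiveness: total facility power divided by IT power."""
        if it_kw <= 0:
            raise ValueError("IT power must be positive")
        return total_kw / it_kw

    if __name__ == "__main__":
        # Invented readings from facility and IT power meters (kW).
        readings = [("small lab DC", 70.2, 30.0), ("after tuning", 51.3, 30.0)]
        for name, total_kw, it_kw in readings:
            print(f"{name}: PUE = {pue(total_kw, it_kw):.2f}")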

2. Green IoT Monitoring

Wireless sensor and actuator networks (WSANs) are composed of multiple nodes that interact to control specific environments. Nodes are generally classified as end nodes (monitoring and acting) and coordinators (data aggregation). In open source, low-cost scenarios, a WSAN is composed of low-power microcontrollers and boards, such as Arduino, XBee, and Banana Pi.

Networking is usually performed over low-power communication solutions such as the ZigBee protocol. Data (e.g., temperature, air flow, and energy consumption) are collected and processed by nodes and specialized servers. Actions are chosen based on datacentre management policies; for example, if the datacentre temperature rises above a given threshold, an actuator can reconfigure the cooling equipment. More complex policies can be composed to adapt the datacentre load and reduce the PUE value: as the cooling cost is proportional to server load, load balancing can be performed to avoid overloading the cooling equipment, and on IaaS clouds virtual machines can be migrated to other hosts to decrease the temperature of a datacentre zone.

Integrating IoT and cloud management frameworks is therefore a promising approach to decrease the datacentre PUE. A prototype based on OpenStack and Zabbix was built to monitor and control PUE on a private cloud datacentre (Tche cloud: http://labp2d.joinville.udesc.br). The low-cost, open source software/hardware framework monitors the cooling system and keeps the temperature at 23 °C, in compliance with TIA-942 [5]. As a result, our solution reduced energy consumption by 43.7% compared with the common scenario of a fixed 18 °C setpoint, and the PUE dropped from 2.34 to 1.71.
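
Below is a minimal sketch of the threshold policy described above, assuming hypothetical sensor and actuator functions in place of the real WSAN nodes and OpenStack/Zabbix integration; only the decision logic is illustrated.

# The target temperature follows the TIA-942-compliant setpoint used by the
# prototype; sensor/actuator functions are simplified stubs, not real APIs.
TEMP_SETPOINT_C = 23.0
TOLERANCE_C = 1.0

def read_zone_temperature(zone):
    # Placeholder: in the real framework this reading comes from a WSAN node.
    return {"row-a": 25.2, "row-b": 21.5}.get(zone, TEMP_SETPOINT_C)

def set_cooling_setpoint(zone, celsius):
    print(f"[{zone}] cooling setpoint -> {celsius} C")

def request_vm_migration(zone):
    print(f"[{zone}] asking the cloud manager to migrate VMs away")

def apply_policy(zone):
    temp = read_zone_temperature(zone)
    if temp > TEMP_SETPOINT_C + TOLERANCE_C:
        # Hot zone: reinforce cooling and shift load elsewhere.
        set_cooling_setpoint(zone, TEMP_SETPOINT_C)
        request_vm_migration(zone)
    elif temp < TEMP_SETPOINT_C - TOLERANCE_C:
        # Over-cooled zone: relaxing the setpoint back to 23 C saves energy.
        set_cooling_setpoint(zone, TEMP_SETPOINT_C)

for zone in ("row-a", "row-b"):
    apply_policy(zone)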

The datacentre energy consumption is divided into two main groups: IT equipment (52%) and support systems (48%) [6]. Among the IT equipment, the processor stands out, representing 15% of total consumption; among the support systems, the cooling system stands out with 38%. Besides its large power consumption, the processor is also a major source of heat (the higher the processor workload, the greater the cooling demand). Therefore, in a datacentre, the distribution of processing determines the physical location of heat sources, and minimizing processing concentration is one approach to reducing hot spots and saving cooling energy. Our orchestration proposal takes into account the monitoring data provided by our framework (e.g., temperature and equipment power consumption) and the workload of the servers (CPU), applying policies to:

1. Reduce core frequency (DVFS);

2. Turn off cores;

3. Turn off processors; and

4. Turn off servers.

These actions are chosen according to the ambient temperature and the processing load, as the sketch below illustrates.
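
The sketch below illustrates how such a decision ladder might be expressed; the choose_action function and its thresholds are assumptions made for the example, not values taken from the prototype.

# The thresholds below are invented for illustration; they are not values
# reported in this paper.
def choose_action(cpu_load, ambient_temp_c):
    # cpu_load is the average server utilisation in the range 0.0-1.0.
    # Pick the least disruptive action that still relieves heat and energy
    # pressure, from frequency scaling up to powering off a whole server.
    if cpu_load < 0.05 and ambient_temp_c > 26.0:
        return "turn off server"             # idle server in a hot zone
    if cpu_load < 0.15:
        return "turn off processor"          # consolidate onto fewer sockets
    if cpu_load < 0.40:
        return "turn off cores"              # park unused cores
    if ambient_temp_c > 24.0:
        return "reduce core frequency (DVFS)"  # shave heat, keep working
    return "no action"

print(choose_action(cpu_load=0.10, ambient_temp_c=27.0))  # turn off processor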


3. Considerations

Although large datacentres possess sophisticated systems to monitor their energy efficiency, the vast majority of datacentres in the world are small or medium-sized and lack such systems. An open source, low-cost solution is aimed primarily at meeting the requirements of these small and medium-sized datacentres, reducing their costs while contributing globally to lower energy consumption. Monitoring and managing the processor and cooling systems is the basis for achieving an efficient PUE value.

References

[1] M. David and R. Schmidt, "Impact of ASHRAE environmental classes on data centers," in 2014 IEEE ITherm, 2014, pp. 1092–1099.

[2] Y. Zhang, Z. Peng, J. Jiang, H. Li, and M. Fujita, "Temperature-aware software-based self-testing for delay faults," in DATE 2015, Mar. 2015, pp. 423–428.

[3] W. Stallings, Foundations of Modern Networking: SDN, NFV, QoE, IoT, and Cloud, 1st ed. Addison-Wesley Professional, 2015.

[4] M. Rodriguez, L. Ortiz Uriarte, Y. Jia, K. Yoshii, and R. Ross, "Wireless sensor network for DC environmental monitoring," in 2011 5th ICST, Nov. 2011.

[5] TIA-942, "ANSI/TIA-942 - telecommunications infrastructure standard for data centers," Telecommunications Industry Association (TIA), White Paper 942, 2005.

[6] E. N. Power and E. Logic, "Reducing data center energy consumption by creating savings that cascade across systems," Emerson Network Power, 2008.


The Future of IT is in the Hybrid Cloud

Author: Karin Breitman (EMC Brasil)

Focus Area 

Cloud Infrastructure

Who stands to benefit and how

Practitioners and academics alike

Position Paper

By 2020, more than seven billion people and businesses, and at least 30 billion devices, will be connected to the Internet, spawning a new world of digital business. The convergence of digitally enabled people, businesses and things will disrupt existing business models in powerful ways. All of this change is driven by software. Successful digital businesses don't just re-tool old processes and paradigms: using software, they reinvent the way people purchase and receive goods and services. With the help of software, intelligent devices are gathering and providing highly valuable information to us. They are inspiring a new generation of data-driven industries, services and business opportunities, and what we see today is just the beginning.

With all of this technology now in the palms of consumers, who expect the world at their fingertips, IT needs to be positioned to deliver that same experience, with no going back. A fundamental shift towards 3rd-platform, cloud-native applications is driving this digital transformation.

In this scenario, many organisations are looking for ways to drive more business value, redefine their business models, and build an enhanced customer experience in an increasingly digital world. IT must deliver enterprise IT services and applications with greater speed and agility, while reducing costs and minimising risks.

We argue that a hybrid cloud can help organisations innovate rapidly while still delivering enterprise-grade performance, resiliency and security. Hybrid Cloud infrastructures combine the control, reliability and confidence of private cloud with the simplicity, flexibility and cost efficiency of public clouds to transform delivery of IT services. If engineered correctly, hybrid clouds can deliver automated infrastructure services for traditional enterprise applications across private and public clouds with greater speed, scalability and agility while reducing costs and minimising risks.

Workflows and application blueprints transform what was once manual into automated, on-demand infrastructure provisioning, with management insights and cost transparency.


Self-service catalogues empower business users to procure traditional enterprise applications and IT services on demand, with service levels that align with workload and cost objectives. Finally, built-in security and data protection should also be present, so that users can run their hybrid cloud with confidence.