

Chair of Network Architectures and Services
Department of Informatics
Technical University of Munich

Network Security (NetSec)

IN2101 – WS 19/20

Prof. Dr.-Ing. Georg Carle

Dr. Holger Kinkelin
Jonas Jelten

Richard von Seck
Johannes Schleger

Acknowledgements: Dr. Marcel von Maltitz



Chapter 15: Privacy

Privacy

Introduction

A Cypherpunk’s Manifesto

Contextual Integrity

Privacy Protection Goals

General Data Protection Regulation

Personal Data

Roles

Scope

Principles

Chapter 15: Privacy 15-1


Chapter 15: Privacy

Secure Computation

Use Cases

Definition

Processing Model

Linear Secret Sharing

SPDZ

Bibliography

Chapter 15: Privacy 15-2


Chapter 15: Privacy

Privacy

Introduction

A Cypherpunk’s Manifesto

Contextual Integrity

Privacy Protection Goals

General Data Protection Regulation

Secure Computation

Bibliography

Chapter 15: Privacy 15-3


Introduction

Privacy

Privacy is often used intuitively, and most often in contexts where it is deemed to be missing, e.g. privacy breaches or data theft.

In order to develop technology for privacy protection, a better understanding of the concept is fundamental. Hence, we ask:

• What does privacy mean?

• How does it relate to our current understanding of security?

Chapter 15: Privacy — Privacy 15-4


A Cypherpunk’s Manifesto

A non-scientific attempt: Setting

In the 1990s, the U.S. government tried to constrain the use of strong cryptography to ensure that the NSA could break all ongoing encrypted communications. The reasons were largely the same as today. → Cryptowars

In 1993, Eric Hughes wrote A Cypherpunk’s Manifesto, a pamphlet arguing for cryptography and describing the Cypherpunk movement.

A fundamental goal is achieving and supporting personal privacy in the digital world.

From the Cypherpunk’s Manifesto [9]:

Privacy is necessary for an open society in the electronic age. Privacy is not secrecy. A private matter is something one doesn’t want the whole world to know, but a secret matter is something one doesn’t want anybody to know. Privacy is the power to selectively reveal oneself to the world.

There seem to be two main reasons people are drawn to Cypherpunks [. . . ]. The first reason is personal privacy. That is, tools for ensuring privacy, protection from a surveillance society, and individual choice. [...]

Chapter 15: Privacy — Privacy 15-5


A Cypherpunk’s Manifesto

A non-scientific attempt: Analysis

• Context: Political debate about the power of the government in the electronic world and encryption as a measure of self-protection.

• Adversary: Surveillance society, NSA, governments

• Assets: Personal information, and especially one’s own identity

• Remedies: (strong) Encryption, Anonymization (Mix networks, remailers, . . . )

Understanding of privacy
Privacy is mainly constituted by confidentiality of information and a mechanism of selective access control. Furthermore, privacy is often considered equal to anonymity, i.e. keeping one’s identity confidential.
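To make the remedies above concrete: a minimal sketch, not from the lecture, of the layered encryption behind mix networks and onion routing. The three-hop setup and the use of the Python cryptography package’s Fernet cipher are our illustrative assumptions; real systems such as Tor negotiate per-circuit keys.

from cryptography.fernet import Fernet

# One symmetric key per mix/hop; each hop only ever holds its own key.
hop_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes, keys: list) -> bytes:
    # Encrypt for the last hop first, so the first hop's layer ends up outermost.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def peel(onion: bytes, keys: list) -> bytes:
    # Each hop removes exactly one layer; no single hop sees both
    # the sender-facing ciphertext and the final plaintext.
    for key in keys:
        onion = Fernet(key).decrypt(onion)
    return onion

onion = wrap(b"selectively reveal oneself", hop_keys)
assert peel(onion, hop_keys) == b"selectively reveal oneself"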

Chapter 15: Privacy — Privacy 15-6


A Cypherpunk’s Manifesto

A non-scientific attempt: Critique
The main focus is keeping data inaccessible to unauthorized third parties. But what if the third party is not unauthorized?

Privacy should also cover the following cases (examples):

• Apple collects your fitness and workout data
  • Should it be able to offer you suggestions for improvement?
  • Should it be able to offer you sports equipment ads, if you opt for it?
  • Should it be able to provide (sell) this data to your insurance company?

• Facebook knows what you like, who you know, what you do, ...
  • Should it be possible to derive your relationship status, your sexual orientation, your state of employment?
  • Should it be allowed to provide targeted advertisement; can you later understand why you got to see a certain ad?
  • Should you be able to actually delete your account instead of just deactivating it?

• Amazon knows what you consume (read, listen to, watch, eat, wear, . . . ). Additionally, Alexa knows when you’re at home and what your voice sounds like1.
  • Should it be able to suggest articles for you based on products you bought for your little cousin instead of for yourself?
  • Should it be able to derive more knowledge about you than if you had bought everything at different shops?
  • Should it be able to derive your current emotional state based on your voice?

1 Here, we even assume that she is not eavesdropping the whole time.

Chapter 15: Privacy — Privacy 15-7


Contextual Integrity

Contextual Integrity [13, 12]

Facing current technological developments, Helen Nissenbaum proposed a concept for privacy termed Contextual Integrity.

Main idea: There are no areas of life not governed by norms of information flow, no information or spheres of life for which “anything goes.” Contextual integrity is given if two properties are fulfilled:

1. Appropriateness of Information: The information to be revealed is fitting for the given context.

2. Appropriateness of Distribution: The way the information is distributed is fitting for the given context.

Examples for 1):

• Health information is shared with the doctor, but not with the lawyer

• Religious affiliation is not shared with the employer

Examples for 2):

• Unidirectionality: The patient tells the doctor about her health, but the doctor does not tell the patient about his own.

• Non-transitiveness/Confidentiality: Information you share with a friend should not be forwarded by her to other friends.

• Free Choice: You tell your friends what you decide to share; the doctor, in contrast, may actively inquire about certain information.
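The two appropriateness properties can be phrased as a simple check of an information flow against context-specific norms. Below is a toy sketch, our own illustration rather than Nissenbaum’s formal model; all context names and norm tables are made up.

# Norms per context: which information types are appropriate (property 1)
# and which sender-receiver directions are permitted (property 2).
CONTEXT_NORMS = {
    "medical":   {"appropriate_info": {"health"},
                  "flows": {("patient", "doctor")}},   # unidirectional
    "workplace": {"appropriate_info": {"qualification"},
                  "flows": {("employee", "employer")}},
}

def respects_contextual_integrity(context, sender, receiver, info_type):
    norms = CONTEXT_NORMS[context]
    appropriate_information = info_type in norms["appropriate_info"]
    appropriate_distribution = (sender, receiver) in norms["flows"]
    return appropriate_information and appropriate_distribution

# Health data flowing from patient to doctor is fine ...
assert respects_contextual_integrity("medical", "patient", "doctor", "health")
# ... religious affiliation in the workplace context is not.
assert not respects_contextual_integrity("workplace", "employee", "employer", "religion")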

Chapter 15: Privacy — Privacy 15-8



Contextual Integrity

Limitation: Not every desired privacy property can be easily modeled as a question of information flow...

Privacy should also cover the following cases (examples):

• Apple collects your fitness and workout data
  • Should it be able to offer you suggestions for improvement?
  • Should it be able to offer you sports equipment ads, if you opt for it?
  • Should it be able to provide (sell) this data to your insurance company?

• Facebook knows what you like, who you know, what you do, ...
  • Should it be possible to derive your relationship status, your sexual orientation, your state of employment?
  • Should it be allowed to provide targeted advertisement; can you later understand why you got to see a certain ad?
  • Should you be able to actually delete your account instead of just deactivating it?

• Amazon knows what you consume (read, listen to, watch, eat, wear, . . . ). Additionally, Alexa knows when you’re at home and what your voice sounds like2.
  • Should it be able to suggest articles for you based on products you bought for your little cousin instead of for yourself?
  • Should it be able to derive more knowledge about you than if you had bought everything at different shops?
  • Should it be able to derive your current emotional state based on your voice?

2 Here, we assume that she is not eavesdropping the whole time.

Chapter 15: Privacy — Privacy 15-9



Privacy Protection Goals

Privacy Protection Goal [3, 7, 8, 14, 15]

Recap: We defined protection goals for security: confidentiality, integrity, and availability (CIA). They constitute criteria for assessing whether security is achieved in a certain context. This approach was highly successful and constitutes a de facto default understanding of security.

Idea: Establish protection goals for privacy which complement CIA and interrelate with them.

Proposal:

• Unlinkability: The inability to connect and combine initially separate information

• Transparency: The ability to observe the data handling and processing of a system

• Intervenability: The ability (by data and system owners) to influence all planned or ongoing processing of personal data
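As a hedged illustration of how one of these goals can be supported technically (our own sketch, not part of the cited proposal): unlinkability can be approximated by giving the same person a different keyed pseudonym in every processing context, so that data sets from different contexts cannot be joined without the key. The key and context names below are hypothetical.

import hashlib
import hmac

SECRET_KEY = b"hypothetical key held by the pseudonymizing party"

def pseudonym(user_id: str, context: str) -> str:
    # A keyed hash: without SECRET_KEY, pseudonyms from different
    # contexts cannot be linked back to the same user_id.
    msg = f"{context}:{user_id}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

# The same person appears under unrelated identifiers per context.
assert pseudonym("alice", "billing") != pseudonym("alice", "analytics")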

Chapter 15: Privacy — Privacy 15-10


Privacy Protection Goals

[Figure: the six protection goals (confidentiality, integrity, availability, unlinkability, transparency, intervenability) arranged as three dual pairs]

The set of six protection goals is interrelated as a triad of duals, i.e. pairs of conflicting goals forming a trade-off:

• Confidentiality vs. Availability: Confidentiality constrains access to information, while availability demands that information is accessible.

• Integrity vs. Intervenability: Integrity constrains the ability to change information, while intervenability requires that stakeholders are able to change information when necessary.

• Unlinkability vs. Transparency: Unlinkability demands that information cannot be combined, while transparency requires insight into how personal information of individuals is processed.

Chapter 15: Privacy — Privacy 15-11


Chapter 15: Privacy

Privacy

General Data Protection Regulation

Personal Data

Roles

Scope

Principles

Secure Computation

Bibliography

Chapter 15: Privacy 15-12


General Data Protection Regulation

General Data Protection Regulation

• EU law on data protection and privacy

• Law passed in May 2016

• Applicable since 25.05.2018

Fundamental goals: Protect personal data and keep individuals in control of their data. Harmonize legislation across the whole EU and enable easy, lawful data flows.

Chapter 15: Privacy — General Data Protection Regulation 15-13


Personal Data

General Data Protection Regulation: Personal Data

⇒ Any information relating to an identified or identifiable natural person (‘data subject’)

An identifiable natural person is one who

• can be identified, directly or indirectly,

• in particular by reference to an identifier such as a name, an identification number, location data,

• an online identifier or

• to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person

Chapter 15: Privacy — General Data Protection Regulation 15-14


Excursion: Quasi-Identifiers

Definition [16]: Given a population of entities, a Quasi-Identifier is a subset of properties which is unique for each entity in the population.

Based on the 1990 U.S. Census: 87% of US citizens have a unique combination of {ZIP code, birth date, sex} within that data set [16].

Reidentification attack:
Sweeney [16] obtained two data sets:

• Voter registration list for Cambridge, Massachusetts (public)

• Group Insurance Commission: “anonymized” patient-specific data of 135,000 state employees

→ The patient records of the Governor of Massachusetts could be uniquely identified; Sweeney sent them to him by mail.
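The linkage can be re-enacted in a few lines. The records below are invented; only the quasi-identifier {ZIP code, birth date, sex} mirrors Sweeney’s setup.

from collections import Counter

voters = [  # public voter list: name plus quasi-identifier (made-up data)
    ("Alice Example", "02138", "1945-07-31", "F"),
    ("Bob Example",   "02139", "1960-01-02", "M"),
]
medical = [  # "anonymized" records: quasi-identifier plus diagnosis
    ("02138", "1945-07-31", "F", "hypertension"),
]

# Count how often each quasi-identifier combination occurs in the medical data.
qi_counts = Counter((z, dob, sex) for z, dob, sex, _ in medical)

for name, z, dob, sex in voters:
    if qi_counts[(z, dob, sex)] == 1:  # unique combination: re-identified
        diagnosis = next(d for zz, bb, ss, d in medical
                         if (zz, bb, ss) == (z, dob, sex))
        print(f"{name} re-identified, diagnosis: {diagnosis}")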

Chapter 15: Privacy — General Data Protection Regulation 15-15


Roles

[Figure: the GDPR roles of data subject, data controller, and data processor; source: https://blogs.gartner.com/richard-watson/stop-agonising-gdrp-opt-emails-start-thinking-cloud-providers/]

Chapter 15: Privacy — General Data Protection Regulation 15-16


Scope

General Data Protection Regulation: Scope (Excerpt)

Art. 2 Material Scope:
1. This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.

Examples of personal data:

• Name, ID, . . .

• Cookies, IP address, . . .

• A combination of properties which identifies you uniquely

• Retrieved web page

• Logfiles: depends

Examples of filing systems:

• Databases

• CSV files

• Paper guest list on your desk

• Attendance list for a university course

• Your patient folder at your doctor’s

Chapter 15: Privacy — General Data Protection Regulation 15-17



Scope

General Data Protection Regulation: Scope (Excerpt)

Art. 2 Material Scope:
1. This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.

2. This Regulation does not apply to the processing of personal data:

. . .

c) by a natural person in the course of a purely personal or household activity

. . .

⇒ If you as a natural person handle data of others, the GDPR does not apply?

Chapter 15: Privacy — General Data Protection Regulation 15-18



Scope

General Data Protection Regulation: Scope (Excerpt)

Art. 3 Territorial Scope:

1. . . . applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.

2. . . . applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union . . .

• No matter where the person is, if the controller or processor is in the EU

• No matter where the controller or processor is, if the person is in the EU

Note: People do not have to be EU citizens; being in Europe, e.g. on vacation, is sufficient.

Chapter 15: Privacy — General Data Protection Regulation 15-19


Principles

General Data Protection Regulation: Principles

The principles of the GDPR describe how processing of personal data should be performed.

• Lawfulness, fairness and transparency: Adhere to the following articles of the GDPR and other laws in place, and let data subjects know how their data is used.

• Purpose limitation: Specify the purpose of data collection explicitly, use only legitimate purposes, and perform no further processing.

• Data minimization: Personal data should be adequate, relevant and limited to what is necessary in relation to the specified purposes.

• Accuracy: Personal data should be accurate. Outdated and inaccurate data must be rectified or erased.

• Storage limitation: Personal data should be kept in a form which permits identification of data subjects for no longer than is necessary for the specified purpose.

• Integrity and confidentiality: Ensure security of the personal data, including protection against unauthorised processing and against accidental loss, destruction or damage.

• Accountability: Demonstrate compliance with all aforementioned principles.

Chapter 15: Privacy — General Data Protection Regulation 15-20

Quiz

• How to understand security?
Using the protection goals of confidentiality, integrity and availability

• How to understand privacy?
Using the protection goals of unlinkability, transparency and intervenability

• How can security and privacy goals be interrelated?
A double triad, or a triad of duals: confidentiality ↔ availability, integrity ↔ intervenability, unlinkability ↔ transparency

• What does GDPR stand for?
General Data Protection Regulation

• Is company data protected by the GDPR?
No, only personal data of natural persons

• What is personal data?
Any information relating to an identified or identifiable natural person

• What is the difference between the data controller and the data processor?
  • Controller means the [entity] which determines the purposes and means of the processing of personal data
  • Processor means [an entity] which processes personal data on behalf of the controller

Chapter 15: Privacy — Quiz 15-21

Page 66: Network Security (NetSec)netsec.net.in.tum.de/slides/15_privacy.pdf · A Cypherpunk’s Manifesto A non-scientific attempt: Setting In the 1990’s the U.S. goverment tried to constrain

Chapter 15: Privacy

Privacy

General Data Protection Regulation

Secure Computation

Bibliography

Chapter 15: Privacy 15-22


Privacy Enhancing Technologies

Definition [17]:

Privacy-Enhancing Technologies is a system of ICT measures protecting informational privacy by eliminating or minimising personal data, thereby preventing unnecessary or unwanted processing of personal data, without the loss of the functionality of the information system.

ICT = Information and communication technology

Examples

• Mix networks (e.g. Tor) for browsing without revealing one's IP address

• Anonymous Credentials for authorization without identification

• Private Information Retrieval for querying a database without disclosing the query

• Secure Computation for processing data without having access to it

Chapter 15: Privacy — Privacy Enhancing Technologies 15-23


Use Cases

Secure Computation is needed if a computation on input data of different stakeholders should be carried out without making anything other than the result of the computation accessible.

Example 1 [18]: Millionaires' Problem. Two millionaires want to find out who is richer. Neither wants to give away any other information about their wealth.

Example 2 [18]: Secret Voting. A committee of m members wishes to decide on a yes-no action. Each member writes an opinion x_i, and the final action can be regarded as a function f(x_1, x_2, x_3, ..., x_m). f should be computed without anyone learning the opinion of any other member.

Example 3: Private Auctions. A group of m people bid in a second-price auction: the highest bidder wins and pays the price of the second-highest bidder. Everything but the name of the highest bidder and the second-highest price remains unknown to all participants.

Example 4: Multi-centric studies. A group of m research institutes perform the same study with different groups of people. They want to evaluate the results as if all data were present in a single database, but merging the individual data sets is not allowed.

Chapter 15: Privacy — Secure Computation 15-24


Definition

Definition (Secure Computation [4])
[T]he parties, or players, that participate are called P_1, ..., P_n. Each player P_i holds a secret input x_i, and the players agree on some function f that takes n inputs. Their goal is to compute y = f(x_1, ..., x_n) while making sure that the following two conditions are satisfied:

• Correctness: the correct value of y is computed; and

• Privacy: y is the only new information that is released.

Computing f such that privacy and correctness are achieved is referred to as computing f securely.

Chapter 15: Privacy — Secure Computation 15-25


Processing Model

Roles of participants

• I – Input parties: Provide data to the computation

• C – Computing parties: Perform computation on the provided data

• R – Result parties: Obtain the result from the computation

• (SC: Secure Computation)

Common interaction models [1]

Chapter 15: Privacy — Secure Computation 15-26


Linear Secret Sharing

Secure Computation can be realized using Linear Secret Sharing (LSS). However, LSS was initially intended for another problem.

Problem: A secret should be distributed among n parties so that it can only be used in cooperation.

Example 1: Decryption with a shared private key
Example 2: Missile launch codes shared among generals
Example 3: DNSSEC root key shared among seven people³

3 https://www.schneier.com/blog/archives/2010/07/dnssec_root_key.html

Chapter 15: Privacy — Secure Computation 15-27


Linear Secret Sharing

Approach: Additive Secret Sharing

Let x be a secret to be distributed among n parties. Define Z_p for a sufficiently large prime p. Choose shares r_1, ..., r_{n-1} uniformly at random in Z_p. Let r_n = x − r_1 − r_2 − ... − r_{n-1} mod p.

Option 1: ∀0 ≤ i < n: Send share ri to party Pi .

⇒ All parties have to cooperate for reconstruction.
⇒ (n,n)-threshold scheme

Option 2: ∀0 ≤ i < n: Send all shares but ri to party Pi .

⇒ Two parties are sufficient for reconstruction.
⇒ (2,n)-threshold scheme

More sophisticated approaches (e.g. Shamir's Secret Sharing) exist, which allow (t,n)-threshold sharings for an arbitrary threshold 1 ≤ t ≤ n.
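As a minimal illustration, the following Python sketch implements the (n,n) variant (illustrative code, not from the lecture; the prime P and the helper names are our own choices):

```python
import secrets

P = 2**61 - 1  # a sufficiently large prime (illustrative choice)

def share(x: int, n: int) -> list[int]:
    """Split x into n additive shares over Z_p ((n,n)-threshold, Option 1)."""
    r = [secrets.randbelow(P) for _ in range(n - 1)]  # r_1 ... r_{n-1} uniform
    r.append((x - sum(r)) % P)                        # r_n = x - r_1 - ... - r_{n-1} mod p
    return r

def reconstruct(shares: list[int]) -> int:
    """All shares are needed: sum(r_i) mod p recovers x."""
    return sum(shares) % P

assert reconstruct(share(42, 3)) == 42
```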

Chapter 15: Privacy — Secure Computation 15-28




Linear Secret Sharing

Correctness: Reconstruction is possible if two or more parties pool their information, since ∑_{i=1}^{n} r_i mod p = x.

Privacy: Party P_i knows that ∑_{j=1}^{n} r_j mod p = x holds, and all shares except r_i. However, for any possible value of x, a corresponding r_i can be found: ∀x ∈ Z_p : ∃ r_i = x − ∑_{j=1, j≠i}^{n} r_j mod p. Since each r_i is chosen uniformly at random, it does not provide any knowledge about x.
⇒ Shares do not leak any information about the original secret.

Chapter 15: Privacy — Secure Computation 15-30


Computation based on Linear Secret Sharing: Addition

So far, secrets can be split and shares can be recombined.

Observation: Computation on shares is possible.

Protocol Secure Addition [4]

Participants are P_1, P_2, P_3; input for P_i is x_i ∈ Z_p, where p is a fixed prime agreed on in advance.

1. Each P_i computes and distributes shares of his or her secret x_i: he or she chooses r_{i,1}, r_{i,2} uniformly at random in Z_p, and sets r_{i,3} = x_i − r_{i,1} − r_{i,2} mod p.

2. Each P_i sends privately r_{i,2}, r_{i,3} to P_1; r_{i,1}, r_{i,3} to P_2; and r_{i,1}, r_{i,2} to P_3 (note that this involves P_i sending "to himself or herself"). So P_1, for instance, now holds r_{1,1}, r_{1,2}, r_{1,3}, r_{2,2}, r_{2,3} and r_{3,2}, r_{3,3}.

3. Each P_j adds the corresponding shares of the three secrets: for ℓ ≠ j, s_ℓ = r_{1,ℓ} + r_{2,ℓ} + r_{3,ℓ} mod p, and announces s_ℓ to all parties. Each party computes and announces two values.

4. All parties compute the result v = s_1 + s_2 + s_3 mod p.
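A compact sketch of the protocol, simulating the three parties in one process (the function names are ours; a real deployment would exchange shares over private channels):

```python
import secrets

P = 2**61 - 1

def share3(x: int) -> list[int]:
    """Three additive shares of x over Z_p."""
    r1, r2 = secrets.randbelow(P), secrets.randbelow(P)
    return [r1, r2, (x - r1 - r2) % P]

def secure_addition(x1: int, x2: int, x3: int) -> int:
    # Steps 1-2: each P_i shares its secret; r[i][l] is share l of x_{i+1}.
    r = [share3(x) for x in (x1, x2, x3)]
    # Step 3: s_l = r_{1,l} + r_{2,l} + r_{3,l} mod p, announced by the parties.
    s = [(r[0][l] + r[1][l] + r[2][l]) % P for l in range(3)]
    # Step 4: every party computes v = s_1 + s_2 + s_3 mod p.
    return sum(s) % P

assert secure_addition(10, 20, 12) == 42
```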

Chapter 15: Privacy — Secure Computation 15-31


Computation based on Linear Secret Sharing: Addition

Each party P_i always also has its own shares, i.e. r_{i,1}, r_{i,2}, r_{i,3}.

Chapter 15: Privacy — Secure Computation 15-32


Computation based on Linear Secret Sharing: Addition

Correctness: v_sum = ∑_j s_j mod p = ∑_j ∑_i r_{i,j} mod p = ∑_i ∑_j r_{i,j} mod p = ∑_i x_i mod p

Privacy: Shares do not leak any information about the original secrets. Local computations on shares do not provide any further knowledge. P_i knows all intermediate values s_ℓ for ℓ ≠ i. The only new information for P_i during the protocol is s_i. However, when P_i obtains v_sum⁴, it can derive s_i efficiently, since s_i = v_sum − ∑_{j, j≠i} s_j.

4 This is legitimate, since it is the goal of the whole computation to make v_sum known to the parties.
Chapter 15: Privacy — Secure Computation 15-33


Computation based on Linear Secret Sharing: Multiplication

Protocol Secure Multiplication [4]

Participants are P_1, P_2, and P_3; input for P_1 is a ∈ Z_p; input for P_2 is b ∈ Z_p, where p is a fixed prime agreed on in advance. P_3 has no input.

1. P_1 distributes shares a_1, a_2, a_3 of a, while P_2 distributes shares b_1, b_2, b_3 of b. (Shares are distributed as in Option 2 above, so each P_i holds every share except a_i and b_i.)

2. P_1 locally computes u_1 = a_2·b_2 + a_2·b_3 + a_3·b_2 mod p,
   P_2 locally computes u_2 = a_3·b_3 + a_1·b_3 + a_3·b_1 mod p, and
   P_3 locally computes u_3 = a_1·b_1 + a_1·b_2 + a_2·b_1 mod p.

3. The players use Protocol Secure Addition to compute the sum u_1 + u_2 + u_3 mod p securely, where P_i uses u_i as input.
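A sketch continuing the addition example (share3, secure_addition, and P as in the previous sketch; the cross-term pairings follow the u_i formulas in step 2):

```python
def secure_multiplication(a: int, b: int) -> int:
    a1, a2, a3 = share3(a)   # P1 shares a; P_i receives all shares except index i
    b1, b2, b3 = share3(b)   # P2 shares b the same way
    # Step 2: each party multiplies the share pairs it can see locally.
    u1 = (a2 * b2 + a2 * b3 + a3 * b2) % P   # P1 sees a2, a3, b2, b3
    u2 = (a3 * b3 + a1 * b3 + a3 * b1) % P   # P2 sees a1, a3, b1, b3
    u3 = (a1 * b1 + a1 * b2 + a2 * b1) % P   # P3 sees a1, a2, b1, b2
    # Step 3: the u_i are summed securely, never announced in the clear.
    return secure_addition(u1, u2, u3)

assert secure_multiplication(6, 7) == 42
```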

Chapter 15: Privacy — Secure Computation 15-34



Computation based on Linear Secret Sharing: Multiplication

Correctness:

v_mult = a·b = (a_1 + a_2 + a_3)(b_1 + b_2 + b_3)
       = a_1b_1 + a_1b_2 + a_1b_3 + a_2b_1 + a_2b_2 + a_2b_3 + a_3b_1 + a_3b_2 + a_3b_3 mod p

Privacy: Shares do not leak any information about the original secrets. Local computations on shares do not provide any further knowledge. The intermediate values u_1, u_2, u_3 are summed up using Protocol Secure Addition, whose privacy and correctness have already been proven.

Chapter 15: Privacy — Secure Computation 15-36


Computation based on Linear Secret Sharing: Multiplication

Example: Matchmaking. Alice and Bob use TinderSMC. Alice is P_1, Bob is P_2, Tinder Corp. is P_3. P_3 is a third party, but need not be a trusted third party, since it does not learn anything during the computation. Alice and Bob want to find out whether both are interested in each other, using the following representation:

• If Alice is interested, x_1 = 1, else x_1 = 0

• If Bob is interested, x_2 = 1, else x_2 = 0

We want a secure function match() with the following requirements:

• Correctness: match(x_1, x_2) = 1 iff x_1 = 1 and x_2 = 1

• Privacy: for each P_i ∈ {P_1, P_2}, the protocol reveals only match(x_1, x_2); in particular, if x_i = 0, the result is always 0 and leaks nothing about the other party's input

⇒ Protocol Secure Multiplication computes exactly this function.
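Reusing secure_multiplication from the sketch above, matchmaking is a thin (hypothetical) wrapper:

```python
def match(x1: int, x2: int) -> int:
    """1 iff both parties are interested; a result of 0 reveals nothing
    beyond one's own disinterest."""
    return secure_multiplication(x1, x2)

assert match(1, 1) == 1 and match(0, 1) == 0 and match(1, 0) == 0
```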

Chapter 15: Privacy — Secure Computation 15-37


Computation based on Linear Secret Sharing: Multiplication

Consider the consequences of using Protocol Secure Addition as a subroutine in the last step of Protocol Secure Multiplication:

1. u_1, u_2, u_3 are the "new" secret inputs for the final addition.

2. They have to be transformed into secret shares themselves.

3. Secret sharing a value implies a round of communication: party P_i has to split u_i into r_{u_i,1}, r_{u_i,2}, r_{u_i,3} and distribute the shares to the other parties.

4. Addition is performed locally.

5. Recombination is another round of communication.⁵

Communication is often costly. Can we do without Protocol Secure Addition?

⇒ No

5 Even if Protocol Secure Addition were not used, recombination would still have to be carried out.
Chapter 15: Privacy — Secure Computation 15-38


Computation based on Linear Secret Sharing: Multiplication

Counter Example. The matchmaking setting is as before. However, Tinder Corp. wants to reduce costs and replaces the final Secure Addition with a plain announcement of the intermediate values u_1, u_2, u_3. Every party can then perform the final addition locally. Correctness is still given.

Recap:
P_1 locally computes u_1 = a_2·b_2 + a_2·b_3 + a_3·b_2 mod p,
P_2 locally computes u_2 = a_3·b_3 + a_1·b_3 + a_3·b_1 mod p, and
P_3 locally computes u_3 = a_1·b_1 + a_1·b_2 + a_2·b_1 mod p.

Vulnerability: P_1 obtains u_2.

Attack: Assume x_1 = 0; then P_1 should not find out x_2.

P_1 already knows b_2, b_3 and, of course, a_1, a_2, a_3. If P_1 can recover b_1, x_2 leaks.

Once P_1 obtains u_2, she can compute locally b_1 = (u_2 − a_3·b_3 − a_1·b_3) · (a_3)^{−1} mod p and then x_2 = b_1 + b_2 + b_3 mod p. Privacy is no longer given.

⇒ Further implication: collusion of a user with Tinder Corp. also undermines privacy.
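The attack is easy to reproduce. The sketch below (our construction, mirroring the formula above; pow(a3, -1, P) computes the modular inverse and assumes a3 ≠ 0, which holds with overwhelming probability) shows P_1 recovering Bob's input from the announced u_2:

```python
# P1's view in the broken variant (reuses share3 and P from above):
a1, a2, a3 = share3(0)   # Alice's input x1 = 0
b1, b2, b3 = share3(1)   # Bob's input x2 = 1; P1 legitimately sees only b2, b3
u2 = (a3 * b3 + a1 * b3 + a3 * b1) % P   # announced instead of added securely
# P1 solves u2 = a3*b3 + a1*b3 + a3*b1 for b1 via the modular inverse of a3:
b1_recovered = (u2 - a3 * b3 - a1 * b3) * pow(a3, -1, P) % P
assert (b1_recovered + b2 + b3) % P == 1   # x2 has leaked to P1
```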

Chapter 15: Privacy — Secure Computation 15-39


Security Model

Adversaries:

• Global Passive Observer: Secure end-to-end channels are assumed. Otherwise, eavesdropping on the network would reveal all shares.

• Single Party: Shares are proven not to leak any information about the secret values. The same holds for the protocols.

• Multiple Parties (Collusion): Two parties suffice to recover all secrets; scaling the number of parties does not yield further security. Reason: the applied secret sharing is a (2,n)-threshold scheme (see the sketch below).

⇒ Robustness against larger colluding groups is necessary
⇒ Security should scale with the number of participating parties
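The collusion case in a few lines (reuses share3 and P from the earlier sketches; the dictionaries model each party's view):

```python
# Two colluding parties jointly hold all three shares:
r1, r2, r3 = share3(42)
p1_view = {"r2": r2, "r3": r3}        # P1 received every share except r_1
p2_view = {"r1": r1, "r3": r3}        # P2 received every share except r_2
colluded = {**p1_view, **p2_view}     # the union covers r_1, r_2, r_3
assert sum(colluded.values()) % P == 42
```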

Chapter 15: Privacy — Secure Computation 15-40


SPDZ

SPDZ [6]⁶ is a state-of-the-art protocol for SMC. It mitigates several of the aforementioned weaknesses of other SMC approaches; furthermore, improving communication efficiency is another goal.

SPDZ uses the (n,n)-threshold scheme (Option 1 above)
⇒ Secure against up to n − 1 colluding parties
⇒ Security scales with n

Problem: Multiplication as described above does not work anymore, since with (n,n) sharing no party holds enough share pairs to compute the cross terms a_i·b_j locally.

6 SPDZ is an acronym of the researchers who developed the approach: I. Damgård, V. Pastro, N. P. Smart, and S. Zakarias.
Chapter 15: Privacy — Secure Computation 15-41


SPDZ

Protocol Secure Multiplication SPDZ
Participants are P_1, P_2, and P_3; input for P_1 is a ∈ Z_p; input for P_2 is b ∈ Z_p, where c = a·b and p is a fixed prime agreed on in advance. All other parties have no input. Let n denote the total number of parties/shares.

Furthermore, we assume there is a triplet⁷ x, y, z chosen uniformly at random such that x·y = z. Each P_i possesses shares x_i, y_i, z_i.

1. P_1 distributes share a_i of a to P_i, while P_2 distributes share b_i of b to P_i (e.g. P_1 sends a_2 to P_2 and a_3 to P_3; P_2 sends b_1 to P_1; ...).

2. P_i locally computes α_i := a_i − x_i and β_i := b_i − y_i and broadcasts them to every other party, so that all parties can reconstruct α = a − x and β = b − y.

3. P_i then locally computes c̄_i = z_i + α·y_i + β·x_i = z_i + (a − x)·y_i + (b − y)·x_i.

4. P_i sets c_i = c̄_i + αβ/n = c̄_i + (1/n)(a − x)(b − y).

5. P_i now possesses a valid share c_i of c = a·b.

Correctness:

c = ∑_{i=1}^{n} c_i = ∑_{i=1}^{n} (c̄_i + αβ/n) = αβ + ∑_{i=1}^{n} c̄_i = αβ + ∑_{i=1}^{n} (z_i + α·y_i + β·x_i)
  = αβ + z + α·y + β·x = (a − x)(b − y) + z + (a − x)·y + (b − y)·x
  = ab − ay − xb + xy + z + ay − xy + bx − xy
  = ab   (since z = xy)

Privacy: Shares do not leak any information about the original secrets. Local computations on shares do not provide any further knowledge. The public values a_i − x_i and b_i − y_i are uniformly random values, since the x_i and y_i were.
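A minimal in-process simulation of Beaver-triplet multiplication, assuming each party adds its 1/n portion of αβ as in step 4 (our own helper names and prime; the trusted dealer is simply simulated, cf. the Preprocessing Model below):

```python
import secrets

P = 2**61 - 1
N = 3  # number of parties

def share_n(v: int) -> list[int]:
    """(n,n) additive sharing: all N shares are needed for reconstruction."""
    r = [secrets.randbelow(P) for _ in range(N - 1)]
    return r + [(v - sum(r)) % P]

def beaver_multiply(a: int, b: int) -> int:
    # Offline phase (trusted dealer): share a random triplet with z = x*y.
    x, y = secrets.randbelow(P), secrets.randbelow(P)
    xs, ys, zs = share_n(x), share_n(y), share_n(x * y % P)
    # Online phase, step 1: the inputs are shared.
    a_sh, b_sh = share_n(a), share_n(b)
    # Step 2: alpha_i = a_i - x_i and beta_i = b_i - y_i are broadcast,
    # so every party learns alpha = a - x and beta = b - y.
    alpha = sum(a_sh[i] - xs[i] for i in range(N)) % P
    beta = sum(b_sh[i] - ys[i] for i in range(N)) % P
    # Steps 3-4: local product share plus each party's portion of alpha*beta.
    inv_n = pow(N, -1, P)
    c_sh = [(zs[i] + alpha * ys[i] + beta * xs[i] + alpha * beta * inv_n) % P
            for i in range(N)]
    return sum(c_sh) % P   # step 5 + recombination: c = a*b mod p

assert beaver_multiply(6, 7) == 42
```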

7 The usage of triplets for multiplication was proposed by D. Beaver [2]. In the literature, they are commonly known as Beaver triplets.
Chapter 15: Privacy — Secure Computation 15-43


SPDZ

• Multiplication works again

• It needs only a single round of communication

• One Beaver triplet is needed per multiplication

• Beaver triplets must not be reused; otherwise information may leak

• How are Beaver triplets obtained?

⇒ Introduction of a new execution model: the Preprocessing Model

Chapter 15: Privacy — Secure Computation 15-44


Preprocessing Model

The Preprocessing Model is a framework in which we assume the existence of a trusted dealer who distributes raw material (Beaver triplets) to the parties before the computation takes place.

Model:

• Computation is split into an offline (preprocessing) phase and an online phase

• During the online phase, the actual SMC computation is carried out

• During the offline phase, Beaver triplets are generated

Properties:

• The online phase is quite efficient

• The offline phase is still costly in time

• But: the offline phase is independent of the computations to be performed, i.e. it can be done without knowledge of the input values

Research on improving the offline phase is ongoing [5, 10, 11].

Chapter 15: Privacy — Secure Computation 15-45


Chapter 15: Privacy

Privacy

General Data Protection Regulation

Secure Computation

Bibliography

Chapter 15: Privacy 15-46


Bibliography

[1] D. W. Archer, D. Bogdanov, B. Pinkas, and P. Pullonen. Maturity and Performance of Programmable Secure Computation. IEEE Security and Privacy, 14(5):48–56, 2016.

[2] D. Beaver. Efficient Multiparty Protocols Using Circuit Randomization. In J. Feigenbaum, editor, Lecture Notes in Computer Science, volume 576, pages 420–432. Springer, 1991.

[3] K. Bock and M. Rost. Privacy By Design und die Neuen Schutzziele. DuD, 35(1):30–35, 2011.

[4] R. Cramer, I. B. Damgård, and J. B. Nielsen. Secure Multiparty Computation and Secret Sharing. Cambridge University Press, New York, NY, USA, 2015.

[5] I. Damgård, M. Keller, E. Larraia, V. Pastro, P. Scholl, and N. P. Smart. Practical covertly secure MPC for dishonest majority - Or: Breaking the SPDZ limits. In Lecture Notes in Computer Science, volume 8134, pages 1–18, 2013.

[6] I. Damgård, V. Pastro, N. Smart, and S. Zakarias. Multiparty computation from somewhat homomorphic encryption. In Lecture Notes in Computer Science, volume 7417, pages 643–662, 2012.

[7] M. Hansen. Top 10 Mistakes in System Design from a Privacy Perspective and Privacy Protection Goals. In J. Camenisch, B. Crispo, S. Fischer-Hübner, R. Leenes, and G. Russello, editors, Privacy and Identity Management for Life, pages 14–31, Berlin, Heidelberg, 2012. Springer Berlin Heidelberg.

[8] M. Hansen, M. Jensen, and M. Rost. Protection Goals for Privacy Engineering. In 2015 IEEE Security and Privacy Workshops, pages 159–166, 2015.


[9] E. Hughes. A Cypherpunk's Manifesto, 1993.

[10] M. Keller, E. Orsini, and P. Scholl. MASCOT. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, pages 830–842, 2016.

[11] M. Keller, V. Pastro, and D. Rotaru. Overdrive: Making SPDZ great again. Lecture Notes in Computer Science, 10822:158–189, 2018.

[12] H. Nissenbaum. Protecting Privacy in an Information Age: The Problem of Privacy in Public. Law and Philosophy, 17:559–596, 1998.

[13] H. Nissenbaum. Privacy as contextual integrity. Washington Law Review, pages 101–139, 2004.

[14] M. Rost. Bob, es ist Bob! FiFF-Kommunikation, (4):63–66, 2017.

[15] M. Rost and A. Pfitzmann. Datenschutz-Schutzziele — revisited. Datenschutz und Datensicherheit - DuD, 33(6):353–358, 2009.

[16] L. Sweeney. k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(5):1–14, 2002.

[17] G. W. von Blarkom, J. Borking, and J. Olk. Handbook of Privacy and Privacy-Enhancing Technologies. 2003.


[18] A. C. Yao. Protocols for secure computations. In Proceedings of the 23rd Annual Symposium on Foundations of Computer Science, pages 1–5, Washington, DC, USA, 1982. IEEE.

Chapter 15: Privacy — Bibliography 15-49