TRANSCRIPT
UMR 5205
Security and Privacy Issues in Multi-Scale Digital Ecosystems
1- An Overview
2- A Focus on Trust and Reputation Protocols
Lionel Brunie
National Institute of Applied Sciences (INSA)
LIRIS Laboratory / DRIM Team – UMR CNRS 5205
Lyon, France
http://liris.cnrs.fr/lionel.brunie
Master Course, Lyon, January 2015 - Security and Privacy in Digital Ecosystems 2
Objective of this Course
The objective of this course is:
To draw an overview of security and privacy challenges in multi-scale digital ecosystems
To identify candidate methodologies to address these issues
To analyze a privacy-preserving decentralized reputation protocol and finally,
To discuss some hints for a research agenda
Agenda
(Digital Ecosystems)
Security and Privacy
The Personalization vs Privacy Dilemma
Enforcing Security and Privacy: Identity, Location, Accountability, Trust, Reputation
Privacy-Preserving Trust and Reputation protocols
Some Hints for a Research Agenda
Agenda
(Digital Ecosystems)
Security and Privacy
The Personalization vs Privacy Dilemma
Enforcing Security and Privacy: Identity, Location, Accountability, Trust, Reputation
Privacy-Preserving Trust and Reputation protocols
A Proposition of Research Agenda
Security and Privacy: Definitions (1/2)
Both concepts designate properties of a system and, by extension, their enforcement
Security focuses on protecting users and businesses from intrusions, attacks, vulnerabilities, etc.
Security provides a safe environment and secure communication along with end user and business protection (Chang et al., 2005)
Feeling of security
Security and Privacy: Definitions (2/2)
Privacy: “State of being alone and not watched or disturbed by other people / State of being free from the attention of the public” (Oxford Dictionary)
The perception of privacy is shaped by:
the perceived identity of the information receiver
the perceived usage of the information
the subjective sensitivity of the disclosed information, and
the context in which the information is disclosed
(Adams, cited by Lederer et al., 2003)
“An individual actively yet intuitively monitors and adjusts his behavior in the presence of others in an attempt to control their conceptions of his identity”
(Goffman, cited by Lederer et al, 2003)
(The feeling of) security and privacy are user-sensitive (user-dependent) concepts
Security and privacy are key enablers for interaction, collaboration and DE dynamics
Agenda
Digital Ecosystems
Security and Privacy
The Personalization vs Privacy Dilemma
Enforcing Security and Privacy: Identity, Location, Accountability, Trust, Reputation
Privacy-Preserving Trust and Reputation protocols
A Proposition of Research Agenda
The Central Dilemma: Personalization vs Privacy
Pervasiveness/smartness means personalization
Personalization needs context and user (profile) information disclosure
Privacy needs context and user (profile) information hiding
The central dilemma: Personalization or Privacy
The central challenge: reconciling personalization and privacy
Agenda
(Digital Ecosystems)
Security and Privacy
The Personalization vs Privacy Dilemma
Enforcing Security and Privacy: Identity, Location, Accountability, Trust, Reputation
Privacy-Preserving Trust and Reputation protocols
A Proposition of Research Agenda
A Quick Focus on Identity Management
Who are you? Is this information private or public?
One identity? SIP address-of-record (RFC 3674)? RFID tag? Electronic Product Code (EPC) – EPCglobal Network – Object Naming Service (ONS)? IETF Host Identity Protocol (HIP)?
Multiple identities/avatars? Linked identities? Not a technological but a political question
From a technical point of view, identity management needs authentication:
through a third party (certificate, signature…)?
through a challenge/response protocol?
Both need some common framework => often incompatible with the dynamic interactions of open systems
through social recognition? => Trust and Reputation
Identity inference
The Special Concern of Location
“Where are you”: is this information private or public?
Location is the key component of all Location-Based Services (LBS)
Location is related to identity:
as a husband, I was on the French Riviera (with my wife)
as a PhD adviser, the secretary said I was “busy”
my wife changed her profile on Facebook to “Romantic weekend on the Riviera”
my wife is friends with my PhD students…
The Special Concern of Location (Cont’d)
IETF (SIMPLE WG): notions of presence system/server and “presentity”, i.e., a unique identity in a presence system
Main security and privacy requirements:
Updating presence information requires prior authentication and authorization
Subscribing to/watching presence information requires authentication
Watching requires compliance with the privacy filtering/policy, incl. authorization
Confidentiality and integrity of presence information and of the privacy policy must be ensured
(from Singh et al.)
The Special Concern of Location (Cont’d)
IETF (SIMPLE WG): privacy policy requirements (Geopriv, RFC 3693)
Authentication/authorization of watchers
Selective information: ability to specify what parts of presence information are given to watchers
Differential information: ability to distribute different presence information to different watchers
Authorization policy for anonymous subscriptions (cf. anonymity in RFC 3323)
National policy overrides the user’s policy
Authorization Policy = set of rules: (conditions, actions, transformations) (action + transformation = permission)
The Authorization Policy is a (very) sensitive information => needs protection!
Issue: enable a presence server to filter and distribute presence information while hiding its actual value (Singh et al., 2006)
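As a toy illustration of the rule model above (conditions, actions, transformations, with action + transformation forming the permission), here is a minimal Python sketch; the rule structure, helper names and example data are all hypothetical and not taken from any IETF specification:

```python
# Each rule is (condition, action, transformation); action + transformation
# form the permission granted to a watcher. All names/data are hypothetical.
def evaluate(rules, watcher, presence):
    """Return the (possibly reduced) presence info a watcher may see, or None."""
    for condition, action, transform in rules:
        if condition(watcher):
            return transform(presence) if action == "allow" else None
    return None  # default deny

rules = [
    (lambda w: w == "alice", "allow", lambda p: p),                           # full view
    (lambda w: w.endswith("@work"), "allow", lambda p: {"city": p["city"]}),  # coarse view
]
presence = {"city": "Lyon", "street": "Av. Jean Capelle", "status": "busy"}
print(evaluate(rules, "bob@work", presence))  # {'city': 'Lyon'}
print(evaluate(rules, "mallory", presence))   # None
```

The transformation is what implements “selective” and “differential” information: different watchers match different rules and receive differently reduced views.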
The Special Concern of Location (Cont’d)
What happens after you disclose your location information?
A first approach: use complex cryptography or steganography. No, bad idea
A second approach: pray that the watcher is a good and intelligent guy
A third approach: pray that the watcher is a good guy who will respect your location privacy rules (make sure he agrees with them); give him the usage rules that concern him; if he turns out to be a bad guy, have a discussion together (or start a legal action)
This last issue illustrates a very important privacy concern: the further (uncontrolled) usage of disclosed information (see also: Creative Commons)
Accountability
Definition: “Condition in which individuals who exercise power are constrained by external means and by internal norms” (Public Administration Dictionary)
Accountable services/systems:
Un-deniability (non-repudiation of actions)
Verifiability (correctness and deviations)
Detection of deviations
(from Malone et al.)
Requires monitoring/logging: easy with trusted third parties; complex otherwise
From Security to Trust and Reputation
Open, unpredictable, un-secure: life is risk
The alternative would be Big Brother…
Use security tools, forget security systems
Shift from the (false) determinism of security to probability, risk management, and social awareness
Trust
Gambetta (1990): “Trust […] is a particular level of the subjective probability with which an agent assesses that another agent […] will perform a particular action […] in a context in which it affects his own action”
Wang (2003): “an Agent’s belief in another Agent’s capabilities, honesty and reliability based on its own direct experiences”
Chang et al. (2005): “Belief that the Trusting Agent has in the Trusted Agent’s willingness and capability to deliver a quality of service in a given context and in a given Timeslot”
Jøsang et al. (2007):
Trust (reliability trust) is the subjective probability by which an individual, A, expects that another individual, B, performs a given action on which its welfare depends
Trust (decision trust) is the extent to which one party is willing to depend on something or somebody in a given situation with a feeling of relative security, even though negative consequences are possible
Marsh (1994): all studies on trust make the assumption of the presence of a society
Properties of Trust
Quantifiable as a subjective probability
Binary-Relational and Directional (non symmetric)
Contextual (wrt an action)
Dynamic (evolves with time)
Relativity / Subjectivity / Fuzziness
Is Trust Reflexive???
Is Trust transitive???
Computing Trust – Usage of Trust
Direct Trust, based on analysis of truster/trustee past interactions
Trust in the trustee’s profile elements (e.g., diploma, employer, experience…)
Transitive Trust / Trust recommendation and propagation (e.g., co-citation)
Record-based Trust, computed by analyzing the trustee (certified) record/history
Trust negotiation
Social-network-based Trust: trust ~ centrality
Reputation of the trustee in a group
Usage: P2P systems (e.g., EigenTrust), routing algorithms, roaming, dynamic virtual organizations…
Reputation
Reputation is the general opinion of the/a community about the trustworthiness of an individual or an entity
Reputation is computed by aggregating feedback values provided by rating agents
By nature, reputation is decentralized
Studied in economy, psychology, sociology, computing
Reputation Systems
Objective: discourage dishonest behavior
Reputation = aggregate of feedback provided by others
Good feedback => good reputation => benefits
Bad feedback => bad reputation => exclusion
An Example: the eBay Reputation System
Buyers leave feedback on a seller: Buyer 1: +1, Buyer 2: +1, Buyer 3: -1, Buyer 4: +1, Buyer 5: 0
Reputation = sum of feedback = 2
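The aggregation here is just the sum of signed feedback values; a one-line Python sketch using the buyers and values from the example:

```python
# Toy illustration of eBay-style reputation: sum of +1/0/-1 feedback values.
feedback = {"Buyer 1": +1, "Buyer 2": +1, "Buyer 3": -1, "Buyer 4": +1, "Buyer 5": 0}
reputation = sum(feedback.values())
print(reputation)  # 2
```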
Other examples
Rooting out fake identities in social networks: Unvarnished.com, Duedil.com
Defeating pollution in peer-to-peer file sharing networks: [Costa and Almeida, 2007], [Yu, 2006], EigenTrust [Kamvar et al., 2003]
Discouraging selfish behavior in mobile ad-hoc networks: [Hu and Burmester, 2006], [Buchegger et al., 2004], [Buchegger, 2002] [Miao et al., 2012]
Reputation Systems: Characteristics
Centralized vs decentralized architecture
Feedback aggregation model
Reputation visibility
Reputation durability
Feedback durability
Computing Reputation
Easy in a centralized or trusted environment: use a trusted third party to collect the feedback values and return the reputation value
More complex in decentralized environment
(Figure: each source agent s1…sn sends its local feedback l_si_t to the TTP; the querying agent q retrieves the reputation r_t)
Agenda
(Digital Ecosystems)
Security and Privacy
The Personalization vs Privacy Dilemma
Enforcing Security and Privacy: Identity, Location, Accountability, Trust, Reputation
Privacy-Preserving Trust and Reputation protocols
A Proposition of Research Agenda
Privacy-Preserving Trust and Reputation protocols
The fear of retaliation prevents source agents from providing their real feedback [Resnick, 2002], [ebay.com, 2008]
A privacy preserving trust system protects the user from unwanted information disclosure (~personalization vs privacy dilemma)
A privacy preserving reputation system protects a feedback provider by hiding either the feedback value or the identity of the user (~anonymization)
Issue: anonymity => risk of attacks
Sybil attack (attacker creates multiple identities)
Self-promotion, ballot stuffing, slandering (in coalition or not)
Whitewashing
Oscillation
Privacy-Preserving Reputation Protocol
Notation:
q – querying agent
t – target agent
s_i – source agent
l_i – local feedback of agent s_i about agent t, range: [-1, 1]
n – number of source agents
r_t – reputation of agent t, the sum of all l_i
Agent q needs to learn r_t = Σ l_i
The privacy of each l_i must be preserved
The computation must be efficient
An Existing Protocol: Secure Sum (technique: Secure Multi-Party Computation)
q requests the set of source agents S = {s1, s2, …, sn} from t
q draws a random value y = rand() and sends (V, y) to s1, where V = <s1, s2, …, sn>
s1 computes z1 = y + l1 and sends (V, z1) to s2
s2 computes z2 = z1 + l2 and sends (V, z2) to s3, and so on
sn computes zn = z(n-1) + ln and sends zn back to q
q computes r_t = zn - y
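A minimal single-process Python simulation of the Secure Sum round described above (the modulus is an illustrative implementation detail; a real deployment would exchange messages over a network):

```python
import random

# Sketch of Secure Sum: q masks the running total with a random y, each
# source s_i adds its feedback l_i, and q removes the mask at the end.
# A negative total would appear mod-wrapped; real protocols fix a range.
def secure_sum(feedbacks, modulus=10**9):
    y = random.randrange(modulus)       # q's secret random mask
    z = y
    for l in feedbacks:                 # each s_i only ever sees a masked sum
        z = (z + l) % modulus
    return (z - y) % modulus            # q removes the mask: r_t = sum of l_i

print(secure_sum([1, 1, -1, 1, 0]))     # 2
```

No single source agent learns anything from the partial sum it receives, since y is uniformly random; the attacks on the next slides show what colluding agents can still do.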
Attack 1
In Secure Sum, two colluding agents can recover the feedback of the agent between them: s1 knows z1 = y + l1 and s3 receives z2 = z1 + l2, so together they compute l2 = z2 - z1
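The collusion can be demonstrated directly on the simulated values (illustrative sketch; y stands for q's mask, which the colluders never learn):

```python
import random

# s1 records the partial sum z1 it forwards; s3 records the z2 it receives.
y = random.randrange(10**9)   # q's mask: unknown to every source agent
l1, l2 = 1, -1                # private feedback values of s1 and s2
z1 = y + l1                   # computed and forwarded by s1
z2 = z1 + l2                  # computed by s2, received by s3
recovered_l2 = z2 - z1        # the colluders' pooled knowledge: y cancels out
print(recovered_l2)           # -1: s2's private feedback is exposed
```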
Attack 2
A single source agent sx can be isolated: query t’s reputation before (r_t = 10) and after (r’_t = 9) sx submits its feedback lx, then lx = r’_t - r_t = 9 - 10 = -1
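In code, the attack is a single subtraction (values taken from the slide's example):

```python
before = 10          # r_t queried just before sx submits feedback
after = 9            # r'_t queried just after
lx = after - before  # sx's supposedly private feedback
print(lx)            # -1
```

This attack works against any protocol that lets reputation be queried at arbitrary times, regardless of how securely the sum itself is computed.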
Adversarial Models
The Semi-Honest Model: agents
Follow the protocol according to the specification
Passively attempt to learn the inputs of honest agents
The Disruptive Malicious Model: agents may additionally
Provide out-of-range values as their inputs
Make erroneous computations
Refuse to participate in the protocol
Drop messages
Prematurely abort the protocol
Wiretap and tamper with the communication channels
Semi-Honest Model: state of the art
Protocol | Key Security Mechanisms | Complexity
Clifton et al., 2003 – Secure Sum | Secure multi-party computation; collusion not permitted | O(n)
Gudes et al., 2009 – Scheme 1 | Trusted third parties; public-key cryptography | O(n)
Gudes et al., 2009 – Scheme 2 | Trusted third parties; public-key cryptography; secure product | O(n)
Gudes et al., 2009 – Scheme 3 | Secure multi-party computation | O(n²)
Nin et al., 2009 – Private collaborative networks | ElGamal encryption scheme | O(1)
Pavlov et al., 2004 – WSS-1 | Secure multi-party computation; secret sharing | O(N) + O(n²)
Disruptive Malicious Model: state of the art
Protocol | Key Security Mechanisms | Complexity
Pavlov et al., 2004 – WSS-2 | Verifiable secret sharing; discrete log commitment | O(n³)
Steinbrecher, 2006 | Central server; pseudonym / identity management | O(1)
Kinateder and Pearson, 2003 | Trusted platform; MIX cascades; digital signatures | –
Androulaki et al., 2008 | Anonymous credential systems; e-cash (bank); blind signatures; mixnets / onion routing | O(1)
Technique 1: Secret Sharing
• Split a secret into n shares: x1, x2, …, xn
• Send the shares to n agents
• m ≤ n agents are required to unlock the secret
(Figure: agent a distributes shares x1, x2, …, xn to agents u1, u2, …, un)
The k-Shares Protocol (1/2)
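The share-and-reconstruct idea can be sketched with additive n-out-of-n sharing, the variant relevant to k-Shares, where all shares are needed for reconstruction (a threshold m-out-of-n scheme such as Shamir's would use polynomial interpolation instead; this sketch is illustrative):

```python
import random

M = 10**9  # arithmetic modulus (illustrative choice)

def share(secret, n):
    """Split `secret` into n additive shares mod M."""
    shares = [random.randrange(M) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % M)  # last share fixes the total
    return shares

def reconstruct(shares):
    return sum(shares) % M

xs = share(7, 5)
print(reconstruct(xs))  # 7
# Any n-1 shares sum to a uniformly random value: they reveal nothing about 7.
```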
Technique 2: Trust-awareness
(Figure: agent a holds trust values for fellow agents v1…vn, e.g., 0.0, 0.1, 0.2, 0.5, 0.9, 1.0; some values are unknown)
The trustworthiness of fellow agents in the context of preserving privacy
The k-Shares Protocol (2/2)
Step 1: Initiation
q requests the set of source agents S = {s1, s2, …, sn} from t, then sends S to every source agent s1…sn
The k-Shares Protocol
Step 2: Select k Trustworthy Agents
From the set S, each source agent selects k agents u1, …, uk it considers trustworthy (e.g., s2: 0.5, s3: 1.0, s4: 0.9), such that p(u1 is malicious) × … × p(uk is malicious) is low
The k-Shares Protocol (cont’d)
Step 3: Prepare k+1 Shares
Each source agent s splits its feedback into k+1 shares such that l_st = x1 + x2 + … + xk + x(k+1)
Step 4: Distribute k Shares
s sends the shares x1…xk to the selected agents u1…uk and keeps x(k+1) for itself
The k-Shares Protocol (cont’d)
Step 5: Sum All Received Shares and x(k+1)
Each agent u computes σ_u = (sum of received shares) + x(k+1); the sum σ_u hides the received shares and x(k+1)
Step 6: Send the Local Sum σ_u to q
Step 7: q Computes the Reputation: r_t = (Σ σ_u) / n
The k-Shares Protocol (cont’d)
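Putting steps 1 to 7 together, here is a single-process Python sketch of one round (the cyclic choice of share recipients stands in for the trust-based selection of step 2, and scaling feedback to integers is an implementation choice, not part of the protocol):

```python
import random

M = 10**9       # arithmetic modulus (illustrative)
SCALE = 100     # feedback in [-1, 1] scaled to integers (implementation choice)

def split(value, k):
    """k+1 additive shares of `value` mod M: k to send, one to keep."""
    sent = [random.randrange(M) for _ in range(k)]
    kept = (value - sum(sent)) % M
    return sent, kept

def k_shares_round(feedbacks, k):
    n = len(feedbacks)
    inbox = [[] for _ in range(n)]
    kept = [0] * n
    for i, l in enumerate(feedbacks):                 # steps 3-4
        sent, kept[i] = split(l, k)
        for j, x in enumerate(sent):                  # cyclic recipients stand in
            inbox[(i + 1 + j) % n].append(x)          # for trust-based selection
    sigmas = [(sum(inbox[u]) + kept[u]) % M for u in range(n)]  # steps 5-6
    total = sum(sigmas) % M                           # step 7
    if total > M // 2:                                # undo modular wrap-around
        total -= M
    return total / (n * SCALE)

feedbacks = [int(l * SCALE) for l in (1.0, 0.5, -1.0, 1.0, 0.0)]
print(k_shares_round(feedbacks, k=2))  # 0.3
```

Each agent only ever reveals σ_u, a sum masked by other agents' random shares, so q learns the average feedback without learning any individual l_i.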
Reference: Omar Hasan, Lionel Brunie, and Elisa Bertino. Preserving Privacy of Feedback Providers in Decentralized Reputation Systems. Computers & Security, 2011.
Comparison
Protocol | Key Security Mechanisms | Complexity
Clifton et al., 2003 – Secure Sum | Secure multi-party computation; collusion not permitted | O(n)
Gudes et al., 2009 – Scheme 1 | Trusted third parties; public-key cryptography | O(n)
Nin et al., 2009 – Private collaborative networks | ElGamal encryption scheme | O(1)
Pavlov et al., 2004 – WSS-1 | Secure multi-party computation; secret sharing | O(N) + O(n²)
k-Shares | Secure multi-party computation; trust awareness; secret sharing | O(n)
Round-Trip | Secure multi-party computation; trust awareness; data perturbation | O(n)
The k-Shares Protocol (Malicious)
Technique 1: Additive Homomorphic Cryptosystems
Product of ciphertexts = sum of plaintexts: E(3) · E(4) = E(3+4) = E(7)
Paillier Cryptosystem [Paillier, 1999]
Used to encrypt the shares and the sum of the shares
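The additive homomorphism can be checked with a textbook Paillier implementation over tiny primes (insecure, purely illustrative):

```python
import random
from math import gcd

# Textbook Paillier over tiny primes: illustration only, NOT secure.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)                            # valid because g = n + 1

def enc(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:                       # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# Multiplying ciphertexts adds the plaintexts: E(3)·E(4) decrypts to 7.
c = (enc(3) * enc(4)) % n2
print(dec(c))  # 7
```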
The k-Shares Protocol (Malicious) (Cont’d)
Technique 2: Non-Interactive Zero-Knowledge Proofs (ZKP)
A prover convinces a verifier that a statement is true; no additional information is revealed
The k-Shares Protocol (Malicious) (Cont’d)
ZKP: Set Membership
Given a ciphertext Eu(x) and a public set S, agent u proves that x ∈ S without revealing x
ZKP: Plaintext Equality
Given two ciphertexts Eu(x) and Ev(x), agent u proves that both encrypt the same x without revealing x
Used to:
Prove that the feedback provided by an agent (i.e., the sum of its shares) is correct (lies in a specified interval)
Prove that the shares sent to the nodes are the correct ones
Prove that each agent computes its own sum correctly
Prove that the received σ_u has the correct value
Reference: O. Hasan, L. Brunie, E. Bertino, N. Shang. A Decentralized Privacy Preserving Reputation Protocol for the Malicious Adversarial Model. IEEE Transactions on Information Forensics and Security, vol. 8, no. 6, pp. 949-962, 2013.
Comparison
Protocol | Key Security Mechanisms | Complexity
Androulaki et al. | Anonymous credential systems; e-cash (bank); blind signatures; mixnets / onion routing | O(1)
Kinateder and Pearson | Trusted platform; MIX cascades; digital signatures | –
Pavlov et al. – WSS-2 | Verifiable secret sharing; discrete log commitment | O(n³)
Steinbrecher | Central server; pseudonym / identity management | O(1)
k-Shares (Malicious) | Additive homomorphic cryptosystems; zero-knowledge proofs | O(n) + O(log N)
Agenda
(Digital Ecosystems)
Security and Privacy
The Personalization vs Privacy Dilemma
Enforcing Security and Privacy: Identity, Location, Accountability, Trust, Reputation
Privacy-Preserving Trust and Reputation protocols
Some Hints for a Research Agenda
Some Hints for a Research Agenda
Seamless certified and secure integration of multiple heterogeneous ecosystems, e.g., sensor network and cloud infrastructure
Holistic trust, reputation and security business-centric value-aware framework (do not forget security…)
Lifecycle of a piece of information (is a piece of information a new “thing”?)
The issue of identity and anonymity
Personalization vs Privacy dilemma / User-centric privacy management proxy
Enforcing new rights: indifference and oblivion
A social Web of things: “[In the] Internet of Things (IoT) […] physical and virtual ‘things’ have identities […] and virtual personalities and […] are expected to become active participants in business, information and social processes […]” (CERP-IoT)
Identity? Personality? Relationship? Social network of things? Trust? Privacy?