Cookies, Lies and Consent
Enrico Gerding
What is this talk about?
The Biggest Lie
Solution 1: Read all terms and conditions
Alex Hern: "I read all the small print on the internet and it made me want to die "
http://www.theguardian.com/technology/2015/jun/15/i-read-all-the-small-print-on-the-internet
Issues
• It is impossible to read all terms and conditions
• Even if we read them, they are couched in legal language and it is difficult to understand what they mean in practice
• There is no choice: there is no negotiation of the terms; it's `take it or leave it'
Consequences
• We do not know what we agree to
• Consent becomes meaningless
• Leads to mistrust in businesses
Privacy Index
Why is this important?
• An increasing amount of information is being collected
– Browsing
– Social media
– Mobile devices
– Internet of Things
• Potential privacy issues
• Often not clear what information is collected and for which purpose
Samsung SmartTV
Samsung SmartTV has a voice command feature. The privacy policy reads:
“Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party,”
Source: The Daily Beast, 5 Feb 2015
Mobile App Permissions
• University of Southampton Campus App has access to:
– Device & app history: includes browsing history and information about other apps
– Identity: includes accounts
– Contacts: reading all contacts
– Location
– Photos/Media/Files: can access files on the device, including photos and videos taken
– Device ID & Call Information: phone number, whether calls are `active', remote number connected to the phone, etc.
Solution 2: Change the Law
• EU Cookie Law, adopted May 2011
• The goal was to `make consumers aware of how information about them is collected and used online, and give them a choice to allow it or not' http://www.cookielaw.org/the-cookie-law/
• Issues:
– Not many people understand what cookies are
– Still not clear how information is being used
– Often there is no choice
Towards Meaningful Consent in the Digital Economy
• How can we make consent more meaningful?
• Multi-disciplinary EPSRC-funded project
– ECS: m.c schraefel (PI), Enrico Gerding, Tim Baarslag, Richard Gomer, Dion Kitchener, Anna Soska
– Economics: Michael Vlassopoulis, Helia Marreiros
Towards Meaningful Consent in the Digital Economy
Research questions:
• Understand to what extent people care about privacy, and in which context
• Can we make consent more meaningful than talking about cookie technologies?
• Can consent be automated using agent-based technologies?
• Can consent be negotiated?
Current Situation
[Diagram: the Service Provider presents a Privacy Policy; the User can only answer Yes/No]
Agent-Based Privacy Negotiation (1)
[Diagram: the User states Privacy Preferences to an Agent, which conducts the Negotiation with the Service Provider]
Agent-Based Privacy Negotiation (2)
[Diagram: the Agent sits between the User and the Service Provider, maintaining a User Model (via Preference Elicitation) and a Provider Model (via Negotiation)]
• Negotiation:
– Determine which information is needed to reach an optimal outcome
– No cognitive costs, but risky if the user model is inaccurate
• Preference elicitation:
– Find the right time to ask the user for feedback
– Find the most information-revealing questions to ask
– Incurs cognitive costs
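The elicitation/negotiation trade-off can be sketched as a simple one-step decision rule: ask the user only when the expected improvement in the best offer outweighs the cognitive cost. Everything in this sketch (the offers, utilities, probabilities, and cost values) is a hypothetical illustration, not the project's actual model:

```python
# Hypothetical sketch: should the agent make an offer now, or first
# ask the user a question (incurring a cognitive cost)?
# All numbers and model structures here are illustrative assumptions.

def best_offer_value(user_model, offers):
    """Expected value of the best offer under the current user model."""
    return max(p_accept * user_model[o] for o, p_accept in offers.items())

offers = {"share location": 0.9, "share contacts": 0.4}  # acceptance probs
user_model_now = {"share location": 0.3, "share contacts": 0.8}

# Suppose asking one question would resolve the user's true utility for
# "share location" to either 0.1 or 0.7, each with probability 0.5.
cognitive_cost = 0.05
value_now = best_offer_value(user_model_now, offers)
value_after_asking = (
    0.5 * best_offer_value({"share location": 0.1, "share contacts": 0.8}, offers)
    + 0.5 * best_offer_value({"share location": 0.7, "share contacts": 0.8}, offers)
    - cognitive_cost
)

action = "ask user" if value_after_asking > value_now else "offer now"
print(action)
```

Here the question is worth asking: the chance of discovering a high utility for location sharing raises the expected best-offer value by more than the cognitive cost.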
Example
Suppose the agent can make 3 different offers to the service provider.
[Figure: for each offer, the user model gives a probability distribution over the user's utility, and the provider model gives an acceptance probability: Offer 1 = 0.7, Offer 2 = 0.5, Offer 3 = 0.2]
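The agent's choice among the three offers trades off the user's utility (from the user model) against the acceptance probability (from the provider model). A minimal sketch of that expected-value calculation, using the acceptance probabilities above but with hypothetical expected user utilities standing in for the user-model distributions:

```python
# Sketch: choosing among offers by expected value.
# Acceptance probabilities are from the slide; the expected user
# utilities are hypothetical placeholders for the user model.
offers = {
    "Offer 1": {"accept_prob": 0.7, "expected_utility": 0.3},  # hypothetical
    "Offer 2": {"accept_prob": 0.5, "expected_utility": 0.6},  # hypothetical
    "Offer 3": {"accept_prob": 0.2, "expected_utility": 0.9},  # hypothetical
}

def expected_value(offer):
    # If the provider rejects the offer, assume a fallback utility of 0.
    return offer["accept_prob"] * offer["expected_utility"]

best = max(offers, key=lambda name: expected_value(offers[name]))
print("Best offer:", best)
```

With these numbers the middle offer wins: a high-utility offer is worth little if the provider is unlikely to accept it, and vice versa.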
Trade-offs
• The agent can make an offer, but if it is accepted the agent is bound to the offer
– Risk
• The agent can ask the user to get more accurate information about their preferences, but this incurs a cognitive cost
Pandora’s problem
• Boxes cost to open and contain a prize drawn from a known distribution
• Open (i.e. find the instantiated value of) any number of boxes and stop at any point
• Once stopped, collect the highest reward found amongst the opened boxes
• Aim is to maximize the expected value of the greatest prize found, minus the sum of the costs of opening boxes
[Figure: n boxes with prize distributions F_1(x_1), F_2(x_2), …, F_n(x_n) and opening costs c_1, c_2, …, c_n, plus an initial fallback reward x_0]
Pandora’s Policy
• A fundamental result by Weitzman (1979) shows that we can compute, once, for every box i its index z_i, defined as the solution to:

c_i = E[max(x_i − z_i, 0)]

• Example: a box with utility uniformly distributed between 0 and 1 and an opening cost of 0.2 has index z ≈ 0.37, since E[max(x − z, 0)] = (1 − z)²/2 = 0.2 gives z = 1 − √0.4 ≈ 0.37
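The index z of a box solves c = E[max(x − z, 0)], and since the right-hand side is decreasing in z it can be found by bisection. A small numerical check of the uniform example on this slide:

```python
# Numerically solve c = E[max(x - z, 0)] for the Weitzman index z
# of a box with prize x ~ Uniform(0, 1) and opening cost c = 0.2.
# For this distribution, E[max(x - z, 0)] = (1 - z)^2 / 2 in closed form.

def expected_excess(z):
    """E[max(x - z, 0)] for x ~ Uniform(0, 1), with 0 <= z <= 1."""
    return (1 - z) ** 2 / 2

def weitzman_index(cost, lo=0.0, hi=1.0, tol=1e-9):
    """Bisection: expected_excess is decreasing in z on [0, 1]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if expected_excess(mid) > cost:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

z = weitzman_index(0.2)
print(round(z, 2))  # prints 0.37, matching the index on the slide
```

The closed-form solution is z = 1 − √(2c) = 1 − √0.4 ≈ 0.3675, which the bisection recovers.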
Pandora’s Policy
• With these indexes, Pandora follows these simple rules:
– SELECTION RULE: If a box is to be opened, it should be the closed box with the highest index.
– STOPPING RULE: Terminate the search whenever the maximum sampled reward exceeds the index of every closed box.
• Pandora’s policy is optimal in terms of expected reward
• We adapted Pandora’s policy for the privacy negotiation setting
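The selection and stopping rules can be implemented directly. The sketch below uses hypothetical boxes with deterministic prizes, so each index has the simple closed form prize − cost (assuming the prize exceeds the cost); with random prizes only the sampling step would change:

```python
# Sketch of Pandora's policy (Weitzman 1979): open the closed box with
# the highest index; stop once the best sampled reward exceeds every
# remaining index. Boxes here are hypothetical and deterministic, so
# each index is simply prize - cost.

def pandora(boxes, fallback=0.0):
    """boxes: list of (name, prize, cost). Returns (net_reward, opened)."""
    # SELECTION RULE: consider boxes in decreasing order of index.
    order = sorted(boxes, key=lambda b: b[1] - b[2], reverse=True)
    best, total_cost, opened = fallback, 0.0, []
    for name, prize, cost in order:
        index = prize - cost
        # STOPPING RULE: stop if the best reward found so far already
        # exceeds the highest index among the closed boxes.
        if best >= index:
            break
        total_cost += cost
        opened.append(name)
        best = max(best, prize)
    return best - total_cost, opened

net, opened = pandora([("A", 1.0, 0.2), ("B", 0.6, 0.1)])
print(net, opened)  # box A (index 0.8) is opened; B (index 0.5) stays shut
```

After opening A, the sampled reward 1.0 exceeds B's index 0.5, so the search stops with net reward 0.8.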
Tim Baarslag and Enrico H. Gerding. Optimal incremental preference elicitation during negotiation. In Proceedings of the Twenty-fourth International Joint Conference on Artificial Intelligence (IJCAI). AAAI Press, 2015
Negotiating Consent in Practice
• Development of mobile app which collects potentially sensitive user information
• Users receive monetary reward in return for granting permissions (data)
• Field studies with real users and their data
• Research questions:
– How privacy-sensitive are users in practice (as opposed to what they say they are)?
– User bother vs accuracy
– Effectiveness of the agent-based approach
Conclusions
• Nobody reads terms and conditions
• Privacy is an ever more important issue
• Meaningful consent is an interaction issue
• CS solutions:
– Agent-based approaches
– HCI
• Website: http://www.meaningfulconsent.org/