
Page 1: [IEEE 2013 Fourth International Conference on Information, Intelligence, Systems and Applications (IISA) - Piraeus, Greece (2013.07.10-2013.07.12)] IISA 2013 - Privacy as a Product:

Privacy as a Product: A Case Study in the m-Health Sector

Constantinos Patsakis
Distributed Systems Group
School of Computer Science and Statistics
Trinity College, Dublin, Ireland
[email protected]

Agusti Solanas
UNESCO Chair in Data Privacy
Department of Computer Engineering & Mathematics
Universitat Rovira i Virgili, Tarragona, Spain
[email protected]

Abstract—Business models based on offering free services to people in exchange for their data are gaining importance and prevalence. The most prominent examples are social networks and, more recently, mobile social networks. However, this trend is endangering users’ privacy.

We do not discuss the ethical and legal issues derived from this business model. Notwithstanding, we believe that users might have better and more privacy-aware alternatives enabling them to trade their privacy on their own. Thus, we introduce the concept of “Privacy as a Product” (PaaP) and we propose and describe a framework and a protocol based on the Raykova-Vo-Bellovin-Malkin protocol that enables users to share private data without the need for trusting the infrastructure (e.g. a social network).

We show that our proposal is feasible in terms of computational and storage overhead. Hence, our solution opens the door to the new concept of “Privacy as a Product” and could be the foundation for implementations of privacy-aware social networks in which privacy plays a more central role, as in the healthcare sector.

Index Terms—Privacy, Social Networks, Secure Multiparty Computation, m-Health

I. INTRODUCTION

Information and communication technologies (ICT) are fundamentally changing our society from its very core. The Internet is one of the most significant exponents of those changes and, far from being incremental, it has proven to be a clearly disruptive change. This became apparent after the dawn of Web 2.0, in which users jumped from a merely passive role to a more creative and interactive one. Internet users are no longer only information consumers but information producers, and they interact to obtain more information and better user experiences.

Another milestone of ICT is mobile telephony. An unprecedented number of people have access to the Internet through mobile devices, and this has changed the way in which we understand social networks and has increased the privacy risks.

In the next sections, we briefly discuss the evolution that privacy has experienced due to the use of ICT; we define the concepts of sensitive social network and mobile sensitive social network; and we summarize the contribution and plan of the article.

A. The Evolution of Privacy

Privacy is recognized as an individual right that might be understood collectively (i.e. it can only be understood in a society in which individuals interact and share information). From an individual perspective, privacy has changed significantly. Initially, privacy was a need that people struggled to cover. After this need was consolidated and globally accepted, privacy became a right recognized by the Universal Declaration of Human Rights.

With the growing use of ICT, privacy has become a commodity for which people might pay; in fact, privacy has become a service (e.g. proxies and privacy modules for the cloud).

Finally, privacy is now seen as a product or a good: something that can be sold or shared for a price or in exchange for a service or other products. So, privacy started as a need, transformed into a right, evolved into a service, and has ended up being a product.

Currently, most users of free Internet services (e.g. social networks and other services based on the so-called freemium model) are freely giving away their privacy (i.e. a precious product) in return for those apparently “free” services that they use. However, users have started to become more interested in protecting their privacy [4].

B. Mobile Sensitive Social Networks

Social networks (SN) are the natural result of the evolution of Web 2.0, in which people interact and create content on and for the Internet. Those contents are essentially built by the people who use SN and, in general, they decide whether to share that information publicly or with a reduced number of virtual friends. However, even when users decide to limit access to their information to a given number of users, the infrastructure of the SN (i.e. servers, storage devices, administrators, etc.) always has access to the information – to all the information. In fact, in the vast majority of SN, the service is provided for free, but users are forced to share their information with the SN infrastructure, which becomes a broker for their information. This information is processed and, sometimes, sold to third parties such as advertising companies that aim at promoting their products to segmented sets of users.


Social networks are technically similar. However, their contents might be significantly different and the needs of their users might highly differ too. In recent years we have witnessed the appearance of many social networks oriented to sensitive groups, namely chronic patients, disabled people, people with mental disorders, etc. We call them “Sensitive Social Networks” (SSN) and we define them as follows:

A Sensitive Social Network (SSN) is a social structure that consists of a group of people and organizations that share inherently private and sensitive information by means of an Internet-based infrastructure.

We can find a number of examples of SSN in the healthcare sector for people with cancer¹, diabetes², disabled people³, and more general ones for patients, doctors, and the like⁴.

In addition, SSN might allow interaction through mobile devices like smartphones, which leads to the concept of “Mobile Sensitive Social Network”, defined as follows:

A Mobile Sensitive Social Network (MSSN) is a social structure that consists of a group of people and organizations that share inherently private and sensitive information by means of mobile devices with self-location capabilities.

Devices with self-location capabilities allow users to obtain their location and share it with the MSSN to obtain location-based services (e.g. finding the closest pharmacy, finding lost patients, etc.). However, this information could also endanger the privacy of users by, for example, revealing their habits.

C. Contribution and plan of the article

This work concentrates on people using sensitive social networks (SSN), and we consider the more recent phenomenon of mobile sensitive social networks (MSSN). We propose a protocol that allows users to trade their privacy within the infrastructure of the social network so as to obtain a fairer service and to give them more control over their data.

The rest of the article is organized as follows: §2 provides the reader with some background on privacy techniques applied to ICT. §3 summarizes the main points of the Raykova-Vo-Bellovin-Malkin protocol on which our proposal is based. §4 describes our protocol and its main aims, proposes its application to the healthcare sector, and discusses several security issues. §5 analyzes the costs and overheads associated with the protocol and shows its feasibility. Finally, the article concludes with some final comments in §6.

II. BACKGROUND ON PRIVACY IN ICT

The study of privacy protection applied to ICT is very wide and embraces many fields of knowledge, from Cryptography and Statistics to Artificial Intelligence and Sociology. Many classifications can be used to organize the wide variety of techniques aimed at privacy protection. For the sake of clarity and simplicity we consider the perspective of the user of ICT, that is, how the user perceives that her privacy is protected.

¹http://cancermatch.com, http://www.ihadcancer.com
²http://www.diabeticconnect.com
³http://www.disaboomlive.com
⁴http://www.careconnectix.com, http://www.inspire.com

Figure 1. Scheme of privacy protection methods according to the perception of the user.

When an ICT user accesses a service, in general, she has to share some information with the provider of the service, namely identity, type of service required, location, etc. Clearly, the shared information depends on the service but, regardless of the exchanged information, users have to choose between four main options: (i) trust the provider and send him all the required information, (ii) individually protect the data sent to the untrusted provider, (iii) collaborate with the provider to protect her data, or (iv) collaborate with other users to protect their data from the provider. Figure 1 shows a scheme with these four possibilities and some techniques that can be applied in each case.

In the following sections we briefly summarize the most relevant techniques within each category.

A. Privacy based on trust

This is probably the most common situation. Users tend to trust service providers because, in fact, they do not really have alternatives in many cases. Since privacy is considered a right, most countries have regulations that compel companies and agencies to guarantee the privacy of their users. In addition to this legislation, companies might decide to adhere to a privacy policy that describes their practices with respect to the privacy of their users. Finally, when users’ data are released to third parties, they should be sanitized so as to guarantee users’ privacy. To do so, statistical disclosure control techniques are generally used.
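To give intuition for this kind of sanitization, the following sketch generalizes exact ages into ranges before release and checks that each released value appears at least k times. The dataset, the bin width, and this particular k-style check are our own illustrative choices, not a full statistical disclosure control method:

```python
from collections import Counter

def generalize_ages(ages, bin_width=10):
    """Generalize exact ages into ranges before release.
    bin_width is an illustrative parameter."""
    low = lambda a: (a // bin_width) * bin_width
    return [f"{low(a)}-{low(a) + bin_width - 1}" for a in ages]

def is_k_anonymous(values, k):
    """Check that every released value appears at least k times."""
    return all(count >= k for count in Counter(values).values())

released = generalize_ages([23, 27, 25, 34, 31])
# the exact ages are unique, but the generalized ranges repeat
print(released)                      # ['20-29', '20-29', '20-29', '30-39', '30-39']
print(is_k_anonymous(released, 2))   # True
```

Real statistical disclosure control handles many quasi-identifiers jointly; this merely illustrates the trade-off between data utility and disclosure risk.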

B. Privacy based on Individual User Actions

Despite the legislation, users might prefer to keep some of their private information away from the service provider. In this case, we assume that the user cannot collaborate with the service provider (e.g. consider the case of sending a query to an Internet search engine such as Google or Yahoo; the user cannot initiate a collaborative protocol with the search engine, which is only able to receive and answer queries).

C. Privacy based on Collaboration with the Provider

There are situations in which the service provider might collaborate with the user to protect her privacy by running privacy-aware protocols. We emphasize the following:

• Privacy Preserving Data Mining
• Private Information Retrieval

D. Privacy based on Collaboration with other Users

This is an evolution of the proposals described in II-B, in which users collaborate to protect their privacy. In this case, users do not want to trust the provider or other third parties. We emphasize the following approaches:

• Distributed obfuscation [14], [12].
• Distributed pseudonymizers [10].

III. THE RAYKOVA-VO-BELLOVIN-MALKIN SCHEME

Private information retrieval (PIR) schemes allow users to obtain information stored in the databases of an untrusted server, issuing queries without revealing the actual queries to the server and without the server learning which records have been returned as a result of the query. Even though the first two articles [6, 1] on PIR were very promising, as they showed that we can have solutions to this problem without having to download the whole database, to date we do not have any practical implementation, as the time and computational costs are prohibitive.
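As a concrete illustration of the PIR setting, the classic two-server scheme in the style of Chor et al. can be sketched in a few lines: two non-colluding servers each hold a full copy of the database, and neither one learns which bit the client retrieved. This is our own didactic sketch, not code from the cited papers:

```python
import secrets

def server_answer(db, subset):
    """Each server XORs together the database bits at the indices it receives."""
    answer = 0
    for j in subset:
        answer ^= db[j]
    return answer

def pir_retrieve(db_copy1, db_copy2, i):
    """Retrieve bit i. Each server sees only a uniformly random subset of
    indices, so neither learns i on its own."""
    n = len(db_copy1)
    s1 = {j for j in range(n) if secrets.randbits(1)}  # random subset
    s2 = s1 ^ {i}                                      # flip i's membership
    # the two subsets differ exactly in index i, so all other bits cancel
    return server_answer(db_copy1, s1) ^ server_answer(db_copy2, s2)

db = [0, 1, 1, 0, 1, 0, 1, 1]
assert all(pir_retrieve(db, db, i) == db[i] for i in range(len(db)))
```

The communication cost here is linear in the database size, which hints at why practical PIR is hard.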

In a different vein, fully homomorphic encryption, a mechanism quite recently proved to exist [5], could provide encrypted queries, but it also proves to be inefficient for practical applications.

In the quest for efficient solutions to such problems, an interesting scheme, that of Raykova, Vo, Bellovin and Malkin (RVBM) [11], provides a feasible solution with small overhead [9] to a problem closely related to PIR.

The RVBM scheme allows users/clients to anonymously perform queries on a server, without revealing any information about the query or the results, either to the owner of the database or to the server that hosts it. More interestingly, the server does not have access to the contents of the database, as all the contents are encrypted with keys selected by the users/clients.

The scheme has four main entities: the User, who supplies the data; the Querier, who queries the database; an Index Server (IS), which hosts the encrypted database; and the Query Router (QR), which acts as a proxy routing the queries from the Queriers to the Index Server, thereby providing anonymity to the Queriers. In order to enable the aforementioned features, the entities follow the procedure below, illustrated in Figure 2:

Figure 2. Raykova-Vo-Bellovin-Malkin scheme.

1) The User sends to the IS his/her encrypted data and a set of Bloom filters that are used to manage keywords that he/she has in a dictionary.
2) The User sends a key to the Querier.
3) The Querier encrypts his/her query with the key (received from the User) and sends the result to the QR.
4) The QR re-encrypts the query and forwards it to the IS.
5) The IS performs the search on the Bloom filters and sends the (encrypted) results to the QR.
6) The QR replies to the Querier with the re-encrypted results.

These steps allow the Querier to identify which records in the database contain a certain keyword, without revealing any information to the IS. Moreover, the scheme can be further extended to provide aggregation queries instead of just keyword searches.
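The search in step 5 relies on Bloom filters, a compact probabilistic set structure that answers “possibly present” or “definitely absent” for a keyword. The sketch below shows the bare idea; the size and hash-count parameters and the use of salted SHA-256 are our own illustrative choices, not those of the RVBM paper (where the stored tokens would additionally be keyed/encrypted):

```python
import hashlib

class BloomFilter:
    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, token):
        # derive num_hashes bit positions by hashing the token with different salts
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{token}".encode()).digest()
            yield int.from_bytes(digest, "big") % self.size

    def add(self, token):
        for pos in self._positions(token):
            self.bits[pos] = True

    def might_contain(self, token):
        # no false negatives; false positives occur with small probability
        return all(self.bits[pos] for pos in self._positions(token))

bf = BloomFilter()
bf.add("diabetes")
assert bf.might_contain("diabetes")
```

Because membership tests touch only a few bit positions, the IS can match keywords against many records quickly without ever seeing the plaintext dictionary.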

IV. OUR PROPOSAL

A. Assumptions and Desiderata

Today there is a wide variety of social networks used by millions of people worldwide, who share a mixture of public and private information. Many cases of identity theft and disclosure of sensitive private information have become known for celebrities, and poor privacy settings expose average users every day. Even leaving these events aside, the information that a user shares on a SN is exposed to the SN infrastructure, and it is traded by the SN, mainly to advertising companies, without the user knowing where exactly her data have been sent. However, there are situations in which people want to maintain their privacy because the shared information is rather sensitive; a clear example of this situation can be found in the healthcare sector, as we stated in the introduction.

In the healthcare sector, users may want to share with other users their symptoms, their illness/disease, the results of their MRIs, blood exams, their treatment drugs, the outcomes of their treatments, etc. Obviously, this type of information is very sensitive and, in most cases, any type of disclosure can have a huge impact on their personal and social life. Solutions like HealthVault⁵ try to manage such information. However, the information is shared with them as well, meaning that the company, the infrastructure and its associates must be trusted.

In this context, we create a transparent SN architecture where users store their data encrypted, with keys that are issued and managed by them, allowing only the entities that they choose to access their content. Simultaneously, the infrastructure allows targeted advertising, even though the SN does not have access to users’ data. Moreover, since users are in total control of their shared information, they can trade it and get paid by the advertising companies. The SN can profit from this market model, as it gets paid by the advertising companies for offering them data querying services. Additionally, the SN can take a share of every trade made between users and advertising companies. It should be noted here that, in the proposed SN scheme, the SN cannot be held responsible for information leakage.

B. The protocol

Our proposal uses the RVBM protocol, described in the previous section, as a wrapper, and places the parties in the right positions to fulfill our goals. The main differences include an initialization of cryptographic parameters for the two parties, a commitment from the user (Alice) regarding her data, the exchange of the data, and mutual receipts of the transaction. Finally, if they reach an agreement, the disclosure of Alice’s data and the transmission of a secret key from the bidder. This scheme allows queries only to appropriate clients, previously authenticated to the QS, without any further data disclosures. Moreover, its sub-linear time complexity makes it ideal for implementation. This way, the service providers have the sole task of storing and distributing the encrypted data.

Alice stores all her data encrypted on the IS, here also called the Hosting Server (HS), as it only hosts the files, and defines subsets of her data that can be exchanged. The IS does not have any access to the plaintext version of Alice’s data and performs the queries received from the QS of the RVBM protocol, allowing anonymous queries by the bidder (B) on the subsets authorized by the users. B queries the authorized subsets trying to find many hits for his keywords using the RVBM protocol. If he finds a subset with many hits, he issues a notice to the QS for the specific user of the IS. From the IS, the notice is forwarded to the appropriate user. If the user is interested in receiving a bid, she approves the bid request. The IS then informs Alice and the bidder B, through the QS, of each other’s public keys and current addresses, and issues an offerID along with a nonce n_IS.

⁵Microsoft HealthVault: www.healthvault.com/

The bidder sends Alice a hello message, his identity ID_B, a nonce, the offerID and a set of supported encryption, hashing and compression algorithms, digitally signed with his private key. Alice replies with her identity ID_A, a nonce, and the selected algorithms, and digitally signs them with her private key. Using the supplied nonces, they can both create the session key k_s and encrypt the rest of their communication. B sends Alice a set of bidding functions f_1, ..., f_n, from which she picks one to use for the secure computation. Alice replies with her desired function f. They perform the secure computation of f(x, y), where x is supplied by Alice and represents her privacy exposure measurement for disclosing this information, and y is the number of keyword hits that B got when querying the subset. They both learn the value of f(x, y). B now sends Alice his offer:

offer = offerID || ID_B || ID_user || bid_value || H(m) || timestamp || BTTL || bid_user_rights || K_Serv || t1 || t2

digitally signed with his private key Priv_B, where:

• bid_value is the value that B is offering,
• timestamp shows when the bid is given,
• BTTL is the bid’s time to live for a response,
• bid_user_rights is a text/link/identifier stating what the bidder’s and user’s rights in the agreement are,
• K_Serv is the access key for the offered service, and
• t1 to t2 is the validity period of the key K_Serv.
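Structurally, the offer is an ordered concatenation of these fields that is hashed and then signed. A sketch of the unsigned message construction follows; the field values, the use of SHA-256 for H(m), and the literal "||" separator are our assumptions for illustration, and the signature with Priv_B is abstracted away:

```python
import hashlib
import time

def build_offer(offer_id, id_b, id_user, bid_value, m,
                bttl, bid_user_rights, k_serv, t1, t2):
    """Concatenate the offer fields in the order given in the text.
    m is the agreed information (bytes); H(m) binds the offer to it."""
    h_m = hashlib.sha256(m).hexdigest()
    timestamp = str(int(time.time()))
    fields = [offer_id, id_b, id_user, str(bid_value), h_m,
              timestamp, str(bttl), bid_user_rights, k_serv, str(t1), str(t2)]
    return "||".join(fields)

offer = build_offer("offer-42", "B", "alice", 100, b"medical-subset",
                    3600, "rights-v1", "kserv-token", 0, 86400)
assert offer.startswith("offer-42||B||alice||100||")
```

Including H(m) rather than m itself lets B commit to the agreed information without disclosing it in the offer.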

If Alice accepts this offer, she replies with two messages: an accept_reply/deny_reply, which is intended for B, and a receipt_IS intended for the IS, where:

accept_reply = 1 || offerID || H(offer) || timestamp

deny_reply = 0 || offerID || H(offer) || timestamp

receipt_IS = [0/1] || offerID || n_IS

both digitally signed with her private key Priv_A. B will now send to Alice:

Enc_PrivB(H(Alice_reply)) || Enc_PrivB([0/1] || offerID || n_IS)

Alice can now send B the decryption key k_decr for the agreed information m, digitally signed with her private key. Alice contacts the IS informing it about the outcome of the bid, showing it the signed messages. Alice shows Enc_PrivB([0/1] || offerID || n_IS), while B shows receipt_IS = [0/1] || offerID || n_IS. If both messages agree, then the IS allows B to download Alice’s data from the IS. Given the data, B checks their validity and grants Alice’s key K_Serv the agreed service.

In order to keep the rules of the bid private, the bidding function f remains secret from eavesdroppers, so it is sent encrypted from B to Alice. Moreover, Alice may choose which function is going to be used, if she thinks she can get a better deal from it. The two signed messages, the offer and the accept/deny reply, cannot be forged and can be used by each side to prove the deal. B can prove that Alice agreed to the terms of the agreement if she denies it, while on the other hand Alice can prove that, even though she accepted the offer, B did not fulfill his part of the deal. Moreover, Alice’s signed reply is a receipt for B, while Enc_PrivB(H(Alice_reply)) is a receipt for Alice that B is aware of her reply.


Putting the HS in the scheme enables it to have control over the transactions; hence it may request from either party a payment in exchange for its services. Therefore, all parties benefit from the deal. If Alice or B decide to stop the transaction at any point, they will end up with a key that cannot be further used, so no information is leaked. The receipts that the HS receives do not let it know any other details of the agreement apart from their outcome: deal or no deal.

The scenarios in which two parties cooperate to play maliciously against the third party are the following three. If Alice cooperates with the HS, Alice cannot access the service offered by B, since the key K_Serv will never work. If Alice cooperates with B, B will not be able to retrieve the data without the HS, since the data are stored there. Finally, if the HS and B cooperate, since neither of them owns the decryption key, Alice’s data cannot be accessed.

Alice’s privacy exposure x, used in calculating f(x, y), can be measured using, for example, the Liu and Terzi privacy risk score [7]. One may argue that B is not a user of the SN and that this information is not leaked to the other users; in that case, one may use the dichotomous variant of the Liu and Terzi metric.
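For intuition, a Liu and Terzi style score combines, per profile item, how sensitive the item is with how visible the user has made it; the dichotomous case reduces visibility to shared-or-not. The toy version below is a simplified stand-in under these assumptions, not their exact formula, and the items and weights are made up:

```python
def privacy_risk_score(items):
    """items: list of (sensitivity, visibility) pairs, both in [0, 1].
    The score grows with the sensitivity and visibility of shared items."""
    return sum(sensitivity * visibility for sensitivity, visibility in items)

# dichotomous case: visibility is 1 if the item is shared, 0 otherwise
profile = [(0.9, 1), (0.5, 0), (0.2, 1)]  # e.g. diagnosis shared, address withheld, city shared
assert privacy_risk_score(profile) == 0.9 * 1 + 0.5 * 0 + 0.2 * 1
```

In our protocol such a score would be Alice’s private input x to the secure computation of f(x, y), so B never sees it directly.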

C. Threat models

The first attack scenario would be to attack the Index Server. However, such an attack would not lead to any usable results. As discussed previously, all the information stored in the IS is already encrypted with keys that the IS is not aware of. Therefore, the outcome of the attack would be encrypted data that can only be decrypted by the owners, or by the users that have been granted that right by the owners.

The trading protocol has been formally proved secure using Scyther [2], as it satisfies non-injective agreement as defined in [3], non-injective synchronisation as defined in [3], and aliveness of all entities as defined in [8]. Additionally, the values offerID, n_IS, n_B, params, timestamp, BTTL, bid_user_rights, K_Serv, t1, t2, n_U, m, ad and k_decr are only disclosed to the proper entities and remain secret from outside parties.

If the Bidder and the Hosting Server cooperate, only the information that has already been shared by the user can be disclosed, a risk that is always taken for granted: if someone has access to some information, he/she might distribute it. Since the Bidder only has access to the information for which the user has provided the key, and which belongs to the past, the Bidder cannot access new data. Moreover, if the user becomes aware of such a leakage, the source might be traced, with the cooperation of other users who have similar issues, or with the use of robust watermarks if the content is multimedia. We should note here that neither the Hosting Server nor the Bidder, even in cooperation, can forge the signature of the User to claim additional access to his data. The same applies to a possible cooperation of the User and the HS, who cannot forge a sealed agreement with the Bidder. The standalone cases (e.g. attacks by the User or the Bidder alone) obviously cannot be launched.

Figure 3. The proposed model.

In our protocol, to “seal the deals” we use digital signatures, which can expose the true identity of the user who reaches an agreement, as someone might be using the same private/public key pair in the SN and in other services outside the SN. His public key, which is used for verification, can then be found on the Internet, revealing his true identity. Even if this is not the case and the user has different private/public key pairs, he could again be identified by combining the results of two different deals: in both deals the same public key would have been used, therefore we know that the same person has reached two agreements and, depending on the content, we can extract even more information.

To address this shortcoming, we propose the use of an anonymous signature scheme, like the one proposed in [15]. This approach has the obvious advantage of not disclosing the user’s true identity whilst, at the same time, keeping the advantages of digital signatures. It could be argued that such schemes allow users to escape a sealed deal, but this is not the case in our case study. The user’s consent to sealing the deal is guaranteed by the fact that she provides a decryption key that allows access to her encrypted data. Since neither the HS nor the Bidder can have access to the decryption key, the key is the proof of the user’s consent to the deal.

D. Application case: The Healthcare Sector

A very good target for our proposed scheme is the case of social networks that manage content in the healthcare sector. Specifically, we focus on social networks for m-health, which are Mobile Sensitive Social Networks. Users carry their mobile phones at all times, so with today’s smartphones it is very easy for them to have their whole medical record with them and to update their health SN profile automatically. Moreover, due to their self-location capabilities, smartphones may track users’ trajectories [13]. Mobile phones can securely store a user’s private and public keys on their SIM cards and manage them efficiently. Hence, the problem of issuing and managing keys can be drastically minimized.

Medical, pharmaceutical and food companies, hospitals and doctors can anonymously query patients’ data, without exposing any information that could identify the patients. Thus, a huge database of potential users, with real-time data, can be queried, and the data can be traded if the users agree. This new environment allows patients to receive advertisements for new drugs related to their diseases and to get real-time recommendations from doctors. Patients with mild cognitive impairment or dementia may securely share their position, so that they can be easily transferred to their homes. New services with on-demand doctors may arise, where gadgets connected to smartphones share real-time measurements with the doctors so as to receive appropriate medical instructions.

These services could be offered individually; nevertheless, this would demand many intermediate entities, which could gain access to patients' sensitive data. Therefore, the novelty of our approach is the unification of all these services into a single framework, with total user control over the shared information and without any information leakage.

V. CONCLUSIONS

Trading personal/private information is a problem beyond the scope of ethics. In many situations, one might say that we have already crossed the ethical borderline; therefore, either the line has to be redrawn and some compromises made, or a more secure environment has to be created.

In this context, we have proposed a protocol that can be seen as a novel application of private information retrieval. This work illustrates how an individual may securely share/trade part of her private data. The proposed infrastructure creates new roles in social networks, each of which can monetize its participation, and a new market model arises.

The case study in the health sector, especially in m-health social networks, shows that such an approach not only has much potential, but can also extend currently offered services, reinforce users' trust and lead to new, more privacy-aware business models. Furthermore, we show that the introduction of the concept of privacy as a product is feasible and very practical in many cases. The main product, users' data, which has until now been a trading asset for SNs, becomes a trading asset for users as well. The adoption of the whole concept not only returns control of the information to the users, but may also lead to more finely crafted services.

REFERENCES

[1] Benny Chor, Eyal Kushilevitz, Oded Goldreich, and Madhu Sudan. Private information retrieval. Journal of the ACM (JACM), 45(6):965–981, November 1998.

[2] C. Cremers. Scyther. people.inf.ethz.ch/cremersc/scyther/index.html, 2012.

[3] Cas Cremers, Sjouke Mauw, and Erik de Vink. Defining authentication in a trace model, 2003.

[4] Ratan Dey, Zubin Jelveh, and Keith W. Ross. Facebook users have become much more private: A large-scale study. In PerCom Workshops, pages 346–352, 2012.

[5] Craig Gentry. Fully homomorphic encryption using ideal lattices. In Proceedings of the 41st Annual ACM Symposium on Theory of Computing, STOC '09, pages 169–178, New York, NY, USA, 2009. ACM.

[6] Eyal Kushilevitz and Rafail Ostrovsky. Replication is not needed: Single database, computationally-private information retrieval (extended abstract). In Proceedings of the 38th Annual IEEE Symposium on Foundations of Computer Science, pages 364–373, 1997.

[7] Kun Liu and Evimaria Terzi. A framework for computing the privacy scores of users in online social networks. ACM Trans. Knowl. Discov. Data, 5(1):6:1–6:30, December 2010.

[8] Gavin Lowe. A hierarchy of authentication specifications. Pages 31–43. IEEE Computer Society Press, 1997.

[9] Vasilis Pappas, Mariana Raykova, Binh Vo, Steven M. Bellovin, and Tal Malkin. Private search in the real world. In Robert H'obbes' Zakon, John P. McDermott, and Michael E. Locasto, editors, ACSAC, pages 83–92. ACM, 2011.

[10] Pablo A. Pérez-Martínez, Agusti Solanas, and Antoni Martínez-Ballesté. Location privacy through users' collaboration: A distributed pseudonymizer. In Mobile Ubiquitous Computing, Systems, Services and Technologies, International Conference on, pages 338–341, 2009.

[11] Mariana Raykova, Binh Vo, Steven M. Bellovin, and Tal Malkin. Secure anonymous database search. In Proceedings of the 2009 ACM Workshop on Cloud Computing Security, CCSW '09, pages 115–126, New York, NY, USA, 2009. ACM.

[12] David Rebollo-Monedero, Jordi Forné, Agusti Solanas, and Antoni Martínez-Ballesté. Private location-based information retrieval through user collaboration. Computer Communications, 33(6):762–774, 2010.

[13] Agusti Solanas, Antoni Martínez, Pablo A. Pérez-Martínez, Albert Fernández, and Javier Ramos. m-Carer: Privacy-aware monitoring for people with mild cognitive impairment and dementia. IEEE Journal on Selected Areas in Communications, 2013. (in press).

[14] Agusti Solanas and Antoni Martínez-Ballesté. A TTP-free protocol for location privacy in location-based services. Computer Communications, 31(6):1181–1191, 2008.

[15] Guomin Yang, Duncan S. Wong, Xiaotie Deng, and Huaxiong Wang. Anonymous signature schemes. In Public Key Cryptography (PKC 2006), pages 347–363. Springer, 2006.