


1536-1233 (c) 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.

This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/TMC.2019.2897099, IEEE Transactions on Mobile Computing


DPavatar: A Real-time Location Protection Framework for Incumbent Users in Cognitive Radio Networks

Jianqing Liu, Chi Zhang, Beatriz Lorenzo, and Yuguang Fang, Fellow, IEEE

Abstract—Dynamic spectrum sharing between licensed incumbent users (IUs) and unlicensed wireless industries has been well recognized as an efficient approach to solving spectrum scarcity as well as creating spectrum markets. Recently, both U.S. and European governments issued rulings on opening up spectrum that was initially licensed to sensitive military/federal systems. However, this raises serious national-security concerns over the operational privacy (e.g., location, time, and frequency of use) of IUs. Although several works have proposed obfuscation methods to address this problem, these techniques rely only on syntactic privacy models and lack a rigorous privacy guarantee. In this paper, we propose a comprehensive framework to provide real-time differential location privacy for sensitive IUs. We design a utility-optimal differentially private mechanism to reduce the loss in spectrum efficiency while protecting IUs from harmful interference. Furthermore, we strategically combine differential privacy with another privacy notion, expected inference error, to provide double-shield protection for the IU's location privacy. Extensive simulations are conducted to validate our design and demonstrate significant improvements in utility and location privacy compared with other existing mechanisms.

Index Terms—Differential privacy, location privacy, Bayesian inference attack, cognitive radio networks, optimization.


1 INTRODUCTION

Radio spectrum is a precious resource and a prerequisite for modern communication systems. However, current spectrum utilization efficiency is relatively low. For example, as of September 2012, the National Telecommunications and Information Administration (NTIA) estimated that 43 percent of the spectrum between 225 MHz and 3,700 MHz (the spectrum most highly desired by the wireless community) is predominantly occupied by federal systems but is used inefficiently. Concurrently, there is an ever-increasing industrial demand for wireless spectrum. To mitigate the problem of unbalanced spectrum usage, the dynamic spectrum access (DSA) technique has been well recognized as a promising solution. The Federal Communications Commission (FCC) in the U.S. recently issued a ruling that the 3,550-3,700 MHz band, which was initially licensed to DoD's military radar systems, will be opened to wireless industries for shared usage [1]. In Europe, a similar DSA framework named licensed shared access (LSA) is also being developed, and the

J. Liu is with the Department of Electrical and Computer Engineering, University of Alabama in Huntsville, Huntsville, AL 35899 USA. E-mail: [email protected]
C. Zhang is with the University of Science and Technology of China and the Key Laboratory of Electromagnetic Space Information, Chinese Academy of Sciences, Hefei 230027, P. R. China. E-mail: [email protected]
B. Lorenzo is with the Atlantic Research Center for Information and Communication Technologies (AtlantTIC), University of Vigo, Vigo 36310, Spain. E-mail: [email protected]
Y. Fang is with the Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611 USA. E-mail: [email protected]
The work of J. Liu and Y. Fang was supported by the US National Science Foundation under grants CNS-1343356 and IIS-1722791. The work of C. Zhang was supported by the National Key Research and Development Program of China under Grant 2017YFB0802202 and by the Natural Science Foundation of China (NSFC) under Grant 61201240. The work of B. Lorenzo was supported by MINECO, Spain, under Grant EUIN2017-88225.

2.3-2.4 GHz band originally belonging to military aircraft services and police wireless systems is now open for wireless industries for shared access [2]. Among these rulings, a spectrum access system (SAS) database is recommended to be set up to manage the shared spectrum usage between licensed incumbent users (IUs) and unlicensed secondary users (SUs) [1]. Specifically, SUs send queries to the SAS database for the available spectrum and allowable transmit power. The SAS database responds according to IUs' operational information (e.g., location, frequency band, and time of operation) to ensure that no harmful interference is generated.

It is recognized that most IUs are not classified systems, like fixed satellite radars whose operational information is public, but some are quite sensitive, such as the Ground/Air Task Oriented Radar (GATOR), which is used to detect unmanned aerial systems, cruise missiles, rockets, mortars, etc. [3]. In practice, the GATOR system operates in the lower adjacent 3.5 GHz band; has very low interference thresholds (e.g., -102 dBm for GATOR Block 3); and is frequently redeployed by a medium tactical vehicle replacement (MTVR) to fulfill various homeland defense missions [4]. The SAS database, therefore, incorporates protection for the GATOR system to avoid the high adjacent-channel leakage from the 3.5 GHz band. Unfortunately, research studies have shown that the query-response process, which is initially intended to protect IUs from power interference, reveals unexpected useful information to adversaries who can compromise IUs' operational privacy [5], [6].

As we know, traditional cyber-attacks apply the in-field spectrum sensing technique to triangulate the radar system, whereas military systems normally react using frequency hopping over a wide range of spectrum [7]. However, in the SAS database-based DSA system, adversaries gain extra or asymmetric advantages in this cyberspace battlefield, meaning that they can remotely initiate a large number of innocuous database queries and infer the possible operational information of radar systems before deploying in-field sensing or jamming. This unfortunately benefits adversaries in terms of reducing their attacking cost and increasing their attacking accuracy. Therefore, to ensure successful military operations, it is critical to develop obfuscation techniques in the SAS database to complicate and confound adversaries in conducting cyber-attacks, which in turn avails IUs in this cyber battle.

Even though prior research work [5], [8]–[10] proposed several privacy-preserving mechanisms to protect IUs' operational privacy in terms of location, time, and frequency of use, these approaches only focused on one-time randomization or obfuscation. Besides, these approaches could fail for two reasons. First, in a real-time setting, independently applying these mechanisms multiple times may help adversaries to remove the noise and compromise IUs' privacy. Second, the added noise could be filtered out by adversaries with certain prior knowledge, which these works clearly ignore. Furthermore, these mechanisms typically add extra non-existing dummy records in the spatial, temporal, or spectral domain to preserve IUs' operational privacy, which significantly reduces spectrum efficiency.

In this paper, we consider a DSA system where users join and leave dynamically, and we intend to protect the IU's location privacy in real time while minimizing the loss to SUs' spectrum efficiency. Specifically, we leverage user diversity to include SUs and the IU in a location cloaking set such that every user therein is subject to obfuscation and could be released to represent the IU. Here, these SUs are coined as "avatars" and we propose a framework called DPavatar to achieve Differential location Privacy for the IU by leveraging avatar SUs in a real-time sequence. Our contributions are listed as follows.

• To the best of our knowledge, this is the first work to incorporate SUs to obfuscate the IU's location. The benefit is to reduce the loss in spectrum efficiency by elevating any SU as the IU and protecting it from interference. However, natural concerns could arise in the sense that these SUs could infer the IU's location and the IU may suffer from harmful interference. We tackle the first concern by constructing the location cloaking set satisfying reciprocity, a property that preserves spatial K-anonymity so that the included SUs cannot distinguish which one is the real IU. For the second concern, we design a feedback control mechanism to suppress the power interference to the IU.

• The obfuscation is performed at certain time instants, and we design an interference-utility-aware differentially private mechanism which is cast as a linear optimization problem. The optimal DP obfuscation distribution is obtained while the loss in spectrum efficiency is minimized.

• This is the first work so far to address the IU's location privacy in the real-time setting. Specifically, we consider privacy loss due to continuous application of the obfuscation mechanism and apply the (ω, ε)-differential privacy mechanism to guarantee that the IU retains ε-differential location privacy in any time window of length ω.

• We argue that differential privacy only confines the adversary's information gain between prior and posterior knowledge but may not be sufficient for location protection if an adversary has certain prior knowledge that can uniquely identify the IU. We prove this claim both theoretically and numerically in this paper. To cope with this issue, we combine DP with another privacy notion named expected inference error [11], which takes the adversary's specific prior knowledge into consideration. We claim that this strategic combination can double-shield the IU's location privacy by simultaneously limiting the information leakage of the DP mechanism and ensuring that the inference error remains bounded under inference attacks with whatever prior information the adversary may have.

The remainder of this paper is organized as follows. Section II describes the related work. Section III introduces some background information and describes the adversary model. Essential preliminaries follow in Section IV. Then our DPavatar scheme is discussed in detail in Section V. Section VI gives the privacy analysis. The performance evaluations are presented in Section VII and finally, Section VIII concludes the paper and outlines the future work.

2 RELATED WORK

Spectrum sharing is an interactive process among different parties that could introduce security or privacy threats. One type of attack, such as the Primary User Emulation Attack (PUEA) [12] and the Spectrum Sensing Data Falsification (SSDF) attack [13], is about compromising the security of spectrum sharing, which is not necessarily based on the SAS database system. Therefore, in this survey, we focus on works protecting the IU's operational privacy in database-driven DSA. The authors in [14], [15] consider an adversary model in which the SAS database is semi-trusted. Cryptographic tools, namely CP-ABE in [14] and the homomorphic Paillier cryptosystem in [15], are used to protect the IU's operational information, such as the IU's antenna height and gain, transmission power, and interference threshold, against the SAS database. Obfuscation strategies can also be applied to protect the IU's operational information from being inferred. In [9], Robertson et al. insert into the SAS database dummy records of frequency bands that the IU operates on, so that adversaries cannot distinguish the IU's exact operating spectrum. In [8], Bahrak et al. apply K-anonymity to preserve the IU's location privacy and add false entries in the time domain to prevent adversaries from learning the IU's correct operational time. In [5], [10], the authors add random noise to the IU's operating information, such as the radius of the protected contour and the interference threshold. As a result, the adversaries may not correctly infer the IU's location.

Although these obfuscation strategies seem workable at first glance, they rely on syntactic privacy models without a rigorous privacy guarantee. On the other hand, differential privacy [16], initially applied in statistical databases, has been accepted as a standard for privacy protection. Recently, the notion of differential privacy has been extended to preserve location privacy [17]–[20]. In general, differentially private location obfuscation requires a location set containing the actual user's location and "neighbouring" locations, such that they have similar probabilities (bounded by e^ε) to produce a pseudo-location. Andrés et al. [17] propose the notion of geo-indistinguishability, and a planar Laplace mechanism is developed to produce fake locations from a polar Laplacian distribution. Based on this work, Bordenabe et al. propose a utility-optimal obfuscation mechanism to achieve ε-geo-indistinguishability [18]. Chatzikokolakis et al. define a privacy mass over an area and generalize geo-indistinguishability by adaptively adjusting the privacy parameter. Xiao et al. in [19] consider temporal correlations of the user's mobility pattern and obfuscate the location based on a planar isotropic mechanism.

However, most of them only consider obfuscation in a static scenario without accounting for the privacy loss due to continuously applying differentially private obfuscation. Therefore, we resort to the ω-event ε-differential privacy proposed in [21], [22] to protect the IU's location privacy on-the-fly in a dynamic spectrum sharing environment. Besides, it is well known that location obfuscation affects the utility (or quality) of services. Here, releasing a pseudo-location in the spectrum sharing scenario could incur harmful power interference to the IU, which is much more severe than merely degrading utility. Therefore, we propose an interference-aware differentially private mechanism in this paper to remedy this problem.

On the other hand, it has been recently recognized that differential privacy may not be effective in location protection against Bayesian adversaries with sufficient prior knowledge [20], [23], [24]. Shokri et al. in [24] are the first to combine differential privacy with the adversary's expected inference error using a linear programming framework to demonstrate the potential for privacy protection improvements. Based on this work, Yu et al. [23] propose a user-defined personalized error bound in the differential privacy framework to limit Bayesian adversaries' inference accuracy. In this paper, we intend to bound the Bayesian adversary's inference error in our differential privacy framework to provide double-shield protection for the IU's location privacy.

3 BACKGROUND AND ADVERSARY MODEL

3.1 Overview of Spectrum Access Systems

An SAS generally consists of IUs, SUs, and a centralized database system that coordinates the spectrum sharing between IUs and SUs. For instance, the 3.5 GHz SAS operates via a three-tier spectrum access approach, wherein IUs operate at the first tier with the highest priority, while SUs operate either at the second tier with licensed priority access (the small-cell services), or at the third tier with general authorized access. Fig. 1 shows a typical spectrum access process, which is explained in detail as follows. Firstly, IUs send the SAS database their operational data, such as location x_{u*}, occupied channels f_{u*}, and interference threshold Λ^{th}_{u*} for interference protection. Moreover, in our design, IUs can send a personalized error bound ϕ to the SAS database, specifying the required protection level against Bayesian adversaries. After receiving queries from SUs for available channels at specific locations x_u, the SAS database responds with a list \vec{f} of available channels along with the allowable transmit powers \vec{P}_f. The power is calculated based on an accurate radio propagation model and IUs' operational data¹ such that the accumulated interference from all operating SUs is below IUs' interference threshold. Finally, upon receiving responses from the SAS database, the SU claims a channel and registers its usage at the SAS database.
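The paper defers the actual interference computation to the Longley-Rice model inside the SAS database. As a rough, self-contained illustration of the idea only (not the paper's method), the sketch below uses free-space path loss to cap a single SU's transmit power so that its interference at the IU sits a fixed margin below the threshold Λ^{th}. The function name, the margin, and the propagation model are all our own illustrative assumptions.

```python
import math

def max_allowed_power_dbm(su_pos_m, iu_pos_m, iu_threshold_dbm,
                          freq_mhz, margin_db=6.0):
    """Toy SAS power cap for one SU (illustrative, not Longley-Rice).

    Free-space path loss: FSPL(dB) = 20 log10(d_km) + 20 log10(f_MHz) + 32.44.
    Received interference = P_tx(dBm) - FSPL(dB); we solve for the P_tx
    that lands margin_db below the IU's interference threshold.
    """
    d_km = math.dist(su_pos_m, iu_pos_m) / 1000.0
    fspl_db = 20 * math.log10(max(d_km, 1e-3)) + 20 * math.log10(freq_mhz) + 32.44
    return iu_threshold_dbm + fspl_db - margin_db

# An SU 5 km from a GATOR-like IU (-102 dBm threshold) at 3550 MHz:
p_max = max_allowed_power_dbm((0.0, 0.0), (5000.0, 0.0), -102.0, 3550.0)
```

A real SAS would additionally sum the interference contributions of all active SUs against the threshold, rather than budgeting one SU at a time.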

Figure 1: The system model of the SAS database. (The figure depicts the SAS database with the DPavatar protection scheme and access control; the IU notifies the database of its operation data and privacy parameter, while the SU performs the 1. Query, 2. Respond, 3. Register exchange.)

3.2 Adversary Model

We assume Bayesian adversaries whose goal is to geo-locate the sensitive IU (e.g., the GATOR) through innocuous queries to the SAS database. We assume adversaries have sufficient computational resources in the sense that (1) they can generate a large number of fake locations to query the SAS database simultaneously; and (2) they can perform real-time analysis of database responses to triangulate the sensitive IU. Moreover, adversaries may possess a variety of prior knowledge, which is obtained from other public databases. For instance, the NTIA in [3] lists 30 coarse-grained installation locations of the GATOR system across the United States. In particular, we can assume the adversary has a prior (probability) distribution π of the IU over the set of possible locations Ψ (e.g., one of those 30 installation cities).

We consider the adversary as an informed one in the sense that it knows the location obfuscation mechanism, i.e., how it works and the exact obfuscation technique A (i.e., its probability distribution). Under this assumption, by geo-locating every released (real and dummy) IU using algorithms available in [5], [6], Bayesian adversaries aim to filter out the dummy IUs and infer the real one. Particularly, Bayesian adversaries will first compute the following posterior probability distribution:

Pr(x | x_u) = π(x) Pr(A(x_u | x)) / Σ_{x'∈Ψ} π(x') Pr(A(x_u | x'))    (1)

Based on the posterior distribution, the adversary could infer the IU's location through the Bayesian Inference Attack, represented by

x̂_{u*} = arg max_{x∈Ψ} Pr(x | x_u)    (2)

We further quantify the IU's location privacy as the Bayesian adversary's error in her inference attack. We see that, for a given released dummy IU u, the IU's conditional location privacy (i.e., the conditional expected inference error) can be computed as follows:

Err = Σ_{x∈Ψ} Pr(x | x_u) · d_h(x̂_{u*}, x)    (3)

where d_h(·, ·) is the Hamming distance between two points: it equals 1 if x̂_{u*} ≠ x and 0 otherwise. Note that this is an absolute measure of the adversary's capability to geo-locate the IU given a specific prior knowledge π.

1. Since the SAS database knows users' exact locations, radio specs (e.g., antenna height/gain, terrain data, weather, etc.) and transmit power, it can adopt a channel model (e.g., Longley-Rice) for interference calculation.
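The attack in Eqs. (1)–(3) can be sketched numerically. The prior π, the candidate set Ψ, and the obfuscation matrix A below are toy values chosen for illustration, not taken from the paper.

```python
# Sketch of the Bayesian inference attack in Eqs. (1)-(3).
# A[x][z] = Pr(mechanism releases z | true location x).
pi = {"cityA": 0.5, "cityB": 0.3, "cityC": 0.2}   # adversary's prior over Psi
A = {                                             # obfuscation distribution
    "cityA": {"cityA": 0.6, "cityB": 0.2, "cityC": 0.2},
    "cityB": {"cityA": 0.2, "cityB": 0.6, "cityC": 0.2},
    "cityC": {"cityA": 0.2, "cityB": 0.2, "cityC": 0.6},
}

def posterior(z):
    """Eq. (1): Pr(x | released z) via Bayes' rule."""
    norm = sum(pi[x] * A[x][z] for x in pi)
    return {x: pi[x] * A[x][z] / norm for x in pi}

def map_estimate(z):
    """Eq. (2): the adversary's maximum-a-posteriori guess."""
    post = posterior(z)
    return max(post, key=post.get)

def expected_error(z):
    """Eq. (3): expected Hamming-distance inference error."""
    post = posterior(z)
    guess = map_estimate(z)
    return sum(p for x, p in post.items() if x != guess)

guess, err = map_estimate("cityA"), expected_error("cityA")
```

With these toy numbers, a released "cityA" yields posterior mass 0.75 on cityA, so the adversary guesses cityA and her expected inference error, i.e., the IU's conditional location privacy, is 0.25.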

In the end, we claim that the SAS database is trusted since it is authorized and supervised by the FCC and NTIA to manage the DSA system, whereas the semi-trusted database model has been addressed elsewhere, e.g., in Liu et al. [14]. Besides, the operating SUs that have obtained spectrum access opportunities are assumed to be curious about the IU's location, but they do not actively collude with each other or with outside attackers.

4 PRELIMINARIES

4.1 Differential Privacy

Differential privacy has become a de facto standard privacy model. It was originally introduced in statistical databases, requiring that a randomized mechanism should produce similar results when a query is applied to neighbouring databases. Formally, the notion of ε-differential privacy is formulated as follows [25].

Definition 4.1 (Differential Privacy). A privacy mechanism A gives ε-differential privacy, where ε > 0, if for any neighbouring databases D and D′ differing in at most one record, and for all sets S ⊆ Range(A), the following holds:

e^{-ε} ≤ Pr(A(D) ∈ S) / Pr(A(D′) ∈ S) ≤ e^ε    (4)

The commonly used technique to achieve ε-differential privacy is the Laplace mechanism [25], whose main idea is to add random noise drawn from a Laplace distribution to the statistics to be published. One of the most important properties of DP mechanisms is the composition theorem, meaning any DP mechanism is bound to cause privacy loss when used repeatedly [21].
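A minimal sketch of the Laplace mechanism for a counting query (sensitivity 1); the function and parameter names are ours. Under sequential composition, invoking it r times with budget ε each consumes rε of privacy budget in total.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Add Laplace(0, sensitivity/epsilon) noise via inverse-CDF sampling.

    For a counting query over neighbouring databases, sensitivity = 1,
    so the noise scale is 1/epsilon; smaller epsilon means more noise.
    """
    scale = sensitivity / epsilon
    u = rng.random() - 0.5                 # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    # Clamp guards the measure-zero u = -0.5 case before taking the log.
    return true_value - scale * sign * math.log(max(1.0 - 2.0 * abs(u), 1e-300))

random.seed(7)
noisy_count = laplace_mechanism(100, sensitivity=1, epsilon=0.5)
```

The noise is zero-mean, so repeated noisy releases average back toward the true count, which is exactly the accumulation of privacy loss that the composition theorem quantifies.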

Theorem 4.1 (Sequential Composition). Let A_1, A_2, ..., A_r be a set of mechanisms where each A_i provides ε_i-differential privacy. Let A be another mechanism that executes A_1(D), A_2(D), ..., A_r(D) using independent randomness for each A_i. Then A satisfies (Σ_{i=1}^{r} ε_i)-differential privacy.

4.2 Differentially Private Location Obfuscation

The ε-differential privacy concept in the statistical data release context can be naturally extended to the setting of location privacy preservation. Here, we define ε-differential privacy on a discretized location set containing the IU, with the intuition that the released location z will not help an adversary to differentiate any instance inside this location set.

Definition 4.2 (Differential Location Privacy). A randomized mechanism A satisfies ε-differential privacy on a location set X if for any released location z ∈ X and any two locations x, x′ ∈ X, the following holds:

e^{-ε} ≤ Pr(A(x) = z) / Pr(A(x′) = z) ≤ e^ε    (5)

The location set could be created in various ways according to different problem settings [17]–[19], [23]. Given the location set, noise could be generated to satisfy differential privacy following approaches such as the 2-dimensional Laplace distribution [17], the utility-optimal mechanism [18], [26], the planar isotropic mechanism [19], and the exponential mechanism [23].
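Once an obfuscation mechanism over a finite location set is written as a stochastic matrix, the condition of Definition 4.2 can be checked mechanically. The checker and the two toy mechanisms below are our own illustrations.

```python
import math

def satisfies_eps_location_dp(A, eps):
    """Check Definition 4.2 on a discrete obfuscation matrix.

    A[x][z] = Pr(released z | true location x). Requires
    Pr(A(x) = z) / Pr(A(x') = z) <= e^eps for every x, x', z.
    """
    locs = list(A)
    for z in A[locs[0]]:
        for x in locs:
            for x2 in locs:
                p, q = A[x][z], A[x2][z]
                if q == 0 and p > 0:
                    return False          # unbounded ratio
                if q > 0 and p / q > math.exp(eps) + 1e-12:
                    return False
    return True

# The uniform mechanism hides the input perfectly (eps = 0 suffices);
# a skewed mechanism needs eps >= ln(0.8/0.2) ~= 1.386.
uniform = {x: {"a": 0.5, "b": 0.5} for x in ("x1", "x2")}
skewed = {"x1": {"a": 0.8, "b": 0.2}, "x2": {"a": 0.2, "b": 0.8}}
```

This check is worst-case over all location pairs, which is why it is independent of any prior the adversary may hold.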

4.3 (ω, ε)-Privacy

ω-event ε-differential privacy [21], [27], (ω, ε)-privacy for short when there is no confusion, is proposed as an extension of differential privacy to address the release of infinite streams. The model emphasizes the guarantee of user-level ε-differential privacy in any ω contiguous time intervals in a sliding window.

Definition 4.3 ((ω, ε)-privacy). The randomized mechanism A gives (ω, ε)-privacy if for any two ω-neighbouring series of input databases D_ω and D′_ω, and for all sets S ⊆ Range(A), the following holds:

e^{-ε} ≤ Pr(A(D_ω) ∈ S) / Pr(A(D′_ω) ∈ S) ≤ e^ε    (6)

Theorem 4.2. [21] Let A be a randomized mechanism that takes as input a stream D_t, where D_t[i] = D_i ∈ D, and outputs s_t = (s_1, ..., s_t) ∈ S. Suppose A can be decomposed into t randomized mechanisms A_1, A_2, ..., A_t such that A_i(D_i) = s_i, and each A_i generates independent randomness and satisfies ε_i-differential privacy. Then A achieves (ω, ε)-privacy iff

∀i ∈ [t],  Σ_{k=i-ω+1}^{i} ε_k ≤ ε

This theorem views ε as the privacy budget for every subsequence of length ω anywhere (i.e., in any sliding window of ω) in the original series of input databases. This is the fundamental theorem based on which we design a novel real-time (ω, ε)-privacy mechanism by properly allocating portions of ε across multiple time instants.
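The window condition of Theorem 4.2 is easy to verify for a concrete budget schedule. The sketch below reads the condition as "every window of ω consecutive per-instant budgets sums to at most ε" (our reading of the theorem); the schedule values are toy examples.

```python
def satisfies_w_event_privacy(budgets, w, eps):
    """Theorem 4.2 window condition on a per-instant budget schedule.

    budgets[i] is the eps_i spent at time instant t_i (0.0 at instants
    where no obfuscation was performed). Every sliding window of w
    consecutive instants must stay within the total budget eps.
    """
    return all(sum(budgets[max(0, i - w + 1): i + 1]) <= eps + 1e-12
               for i in range(len(budgets)))

# Spending eps/2 at every other instant keeps any window of w = 4 within eps = 1;
# spending eps/2 at three consecutive instants does not.
ok = satisfies_w_event_privacy([0.5, 0.0, 0.5, 0.0, 0.5, 0.0], w=4, eps=1.0)
bad = satisfies_w_event_privacy([0.5, 0.5, 0.5, 0.0], w=4, eps=1.0)
```

Allocating "portions of ε across multiple time instants", as the paper proposes, amounts to choosing a schedule for which this check holds at every instant.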

5 DPAVATAR: REAL-TIME DIFFERENTIAL LOCATION PRIVACY FOR THE IU

5.1 Overview

Fig. 2 illustrates the basic idea of how our proposed DPavatar protection scheme works. For presentation clarity, we first give an overview of each function block in Fig. 2 and then outline some assumptions that are essential to our design.

Firstly, we consider a time-slotted system where theSAS database only registers one user per time instant forthe spectrum access. This is because the mutual authen-tication is executed in the registration phase and limitingthe number of registered users per time instant can thwartthe denial-of-service (DoS) attack. Besides, the time intervalcan be dynamically adjusted based on the number of users

Page 5: DPavatar: A Real-time Location Protection Framework for ... · privacy models, lacking rigorous privacy guarantee. In this paper, we propose a comprehensive framework to provide real-time

1536-1233 (c) 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.

This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/TMC.2019.2897099, IEEETransactions on Mobile Computing


Figure 2: The overview of the DPavatar protection scheme. (Flowchart: at each time instant, the scheme first asks "Does the interference threshold exceed?" — if yes, a member of the location cloaking set is released; if no, it asks "Is this the sampling time instant?" — if yes, IBA adaptive sampling and dynamic budget allocation feed the DP location obfuscation, which outputs the avatar IU. Timeline: the IU joins at t0 and leaves at tN, while SUs join and leave throughout t0, t1, ..., tN.)

participating in the DSA system. Next, suppose that when the IU registers in the SAS database at t0, several operating SUs (i.e., avatars), together with the IU, are selected to form a location cloaking set. We assume these avatar SUs are provided with incentives (e.g., power interference protection) to obfuscate the IU's location from t0 to tN, but they might be curious about the IU's location. After t0, the following procedures are carried out at each time instant ti (t0 < ti ≤ tN):

• When an SU joins the DSA system at ti, the SAS database checks whether the accumulated power interference to any user in the location cloaking set exceeds the threshold;

• If yes, the incoming SU is rejected and the interfered avatar IU is released at ti for public queries; otherwise, the scheme continues to verify whether ti is a sampling time instant at which to perform the differentially private (DP) location obfuscation;

• If not, the SAS database reuses the last released avatar IU for public queries; otherwise, the Interference and Budget Aware (IBA) adaptive sampling module calculates the next sampling time instant and meanwhile informs the dynamic budget allocation block to determine the privacy budget (i.e., ε) for the location obfuscation;

• The DP location obfuscation block then takes ε and the location cloaking set as input and produces the avatar IU at ti.

Note that if only a few SUs participate in the DSA system, our scheme naturally degenerates to the conventional approaches [5], [8], in the sense that dummy records (e.g., false locations, noisy interference thresholds) are injected to create the location cloaking region, which is similar to the Exclusion Zone (EZ) proposed by NTIA [28]. However, this work examines how the design (i.e., privacy, spectrum efficiency, and power interference) differs when the DSA system includes a large number of SUs. In other words, we aim to investigate how user diversity can be fully leveraged to better protect the IU's location privacy. Next, we elaborate on each function block in more detail.

5.2 Location Cloaking Set

The application of standard differential privacy to statistical databases is intended to "hide" a true database among its "neighbouring databases," which are obtained by adding or removing one record (or one user). However, location protection for a user involves only a single record, namely his own location. Hence, differential privacy in this context requires defining location points "neighbouring" the targeted user's location. One intuitive method is to include a set of "neighbouring" locations together with the user's actual location in a location set. A differentially private mechanism can then be applied to this set such that a perturbed location is released but the real location cannot be distinguished from the other instances in the set.

Instead of using a fake location, we argue that releasing the location of an operating SU to represent the location of the IU is more beneficial, for two reasons. First, if an adversary with certain prior information knows that no radio devices are transmitting at the released fake location, it can confidently exclude that location from those where the IU could possibly operate, thus reducing the size of the anonymity set. Second, releasing the "IU's" location, whether real or fake, inevitably restricts SUs' spectrum access and reduces spectrum efficiency. However, reporting the location of an SU (e.g., a pair of transceivers or a WiFi access point along with its served users) reduces the overall spectrum efficiency loss, because the SAS database protects it from power interference in the same way as the IU is protected.

With this in mind, we intend to construct a location cloaking set containing the real IU and its "neighbouring" SUs. However, selecting the location cloaking set is by no means trivial, especially when it comes to protecting the IU's location against operating SUs. In particular, at t0 when the IU arrives, some SUs might be cleared because of their interference to the IU, which unfortunately could help these SUs geo-locate the IU given this "firsthand experience." Moreover, after a long observation from t0 to tN, all instances (i.e., locations) in the location cloaking set are likely to have been released. Without proper design, the IU's location could be an easily discerned "outlier" among the others. For instance, in [8], Bahrak et al. create a K-anonymity cloaking set by including the K − 1 users nearest to the targeted IU. However, we argue that using nearest neighbouring users may compromise the IU's location in a scenario such as the one shown in Fig. 3. Since the adversary knows all locations in the cloaking set after continuous observation, it could confidently decide that u∗ is the IU's real location, as the cloaking sets would have been different if u1, u2, or u3 were the real IU.

The above observation motivates us to design a location cloaking set satisfying two properties: (1) every user/location in the location cloaking set excludes its interfered SUs; (2) the location cloaking set should preserve reciprocity, whose definition can be found in [29].

To give a general picture, Alg. 1 outlines the procedure for creating the location cloaking set. Firstly, based on the accurate channel fading model (e.g., the Longley-Rice (L-R) model [30]) available at the SAS database and the operating parameters of all users, the SAS database is aware of


Figure 3: An example of forming the location cloaking set using nearest neighbouring users. Suppose the size of the location cloaking set is 4 and {u∗, u1, u2, u3} is the cloaking set produced by the nearest-neighbour algorithm. However, if u1 were the IU, its cloaking set would have been {u1, u6, u7, u8}. Similarly, if u2 were the IU, its cloaking set would have been {u2, u3, u4, u5}.

Algorithm 1 Creating the Location Cloaking Set

Input: channel fading model; operating frequency, location, and transmit power of all users; system parameters: εth, Λth_u∗, φ, x_u∗.
Output: location cloaking set X.
1: Construct a conflict graph G(V, E);
2: Find the maximum independent set I_u∗;
3: Calculate the size of the location cloaking set from (10);
4: Create the location cloaking set X of size K from I_u∗.

the interference relationship between any two users and can construct a conflict graph G(V, E). Specifically, each vertex v (v ∈ V) represents a user, while an edge (u, v) ((u, v) ∈ E) between two vertices u and v indicates that these two users interfere with each other (i.e., conflict). Secondly, using G(V, E), the SAS database determines the maximum independent set (MIS) [31] I_u∗ that includes the real IU u∗. The algorithm in [31] can be utilized as an efficient approach to constructing such an MIS.
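As an illustration of Steps 1-2 of Alg. 1, the sketch below represents the conflict graph as adjacency sets and grows an independent set seeded with the real IU u∗. It is a simple greedy stand-in for the MIS algorithm of [31], with hypothetical user names:

```python
def independent_set_with_seed(conflicts, seed):
    """Greedy maximal independent set guaranteed to contain `seed`
    (the real IU u*).  `conflicts` maps each user to the set of users
    it interferes with (the edges of the conflict graph G(V, E)).
    A simplified stand-in for the MIS construction of [31]."""
    independent = {seed}
    excluded = set(conflicts[seed])        # neighbours of the seed
    for v in sorted(conflicts):            # deterministic visiting order
        if v not in independent and v not in excluded:
            independent.add(v)
            excluded |= conflicts[v]       # block v's neighbours
    return independent

# Toy conflict graph: u* conflicts with u1; u1 with u2; u2 with u3.
conflicts = {
    "u*": {"u1"},
    "u1": {"u*", "u2"},
    "u2": {"u1", "u3"},
    "u3": {"u2"},
}
mis = independent_set_with_seed(conflicts, "u*")
assert "u*" in mis and "u1" not in mis     # the IU stays, its neighbour is out
```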

Next, we determine the size of the location cloaking set to guarantee a certain level of inference error against Bayesian adversaries, and then create the location cloaking set of size K from I_u∗. From our prior discussion, in Step 4 of Alg. 1 the location cloaking set should be constructed to preserve reciprocity, which naturally leads us to an approach based on the Hilbert space-filling curve [32]. The Hilbert curve maps the 2-D Cartesian coordinate of each user to a 1-D value and ensures that if two points are in close proximity in 2-D space, with high probability they are also close after the 1-D transformation. Fig. 4, for instance, illustrates the locations of 10 users and the Hilbert curve for an 8×8 space partitioning. It also shows the users' sorted Hilbert values and how to construct cloaking sets of different sizes. In particular, users are split into K-buckets, each containing exactly K users except the last one, which may contain up to 2K−1 users. A cloaking set generated in this way satisfies reciprocity because all users in the same bucket generate the same cloaking set. Inspired by the algorithm in [33], the cloaking set can be generated as follows. Firstly, the SAS database obtains the IU's Hilbert value H(u∗) and its position R_u∗ in the sorted sequence of all instances in I_u∗. Given K, the SAS

Figure 4: Forming location cloaking sets of different sizes based on an 8 × 8 Hilbert curve. (a) The 8 × 8 Hilbert curve over the users' locations; (b) the users' sorted Hilbert values (2, 9, 13, 25, 32, 36, 49, 51, 54, 57) and the resulting buckets for K = 3 and K = 4.

database then calculates the start and end positions defining the K-bucket that includes H(u∗), as follows:

start = R_u∗ − (R_u∗ mod K)
end = start + K − 1
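The bucketing above can be sketched as follows, assuming the Hilbert values have already been computed (the values mirror Fig. 4 with hypothetical user names, and the remainder handling is our reading of the "at most 2K−1 users" clause):

```python
def cloaking_bucket(hilbert_values, target, K):
    """Return the reciprocal K-bucket containing `target` (the real IU).
    Users are sorted by Hilbert value and split into buckets of exactly
    K users; the remainder joins the last bucket, which may therefore
    hold up to 2K-1 users, as described in the text."""
    order = sorted(hilbert_values, key=hilbert_values.get)
    rank = order.index(target)               # position R_u* in the sorted sequence
    n = len(order)
    last = max(n // K - 1, 0)                # index of the last (possibly oversized) bucket
    b = min(rank // K, last)
    start = b * K                            # start = R_u* - (R_u* mod K), clipped
    end = n - 1 if b == last else start + K - 1
    return order[start:end + 1]

# Sorted Hilbert values as in Fig. 4 (user names illustrative):
hv = {"u1": 2, "u2": 9, "u3": 13, "u4": 25, "u5": 32,
      "u6": 36, "u7": 49, "u8": 51, "u9": 54, "u10": 57}
assert cloaking_bucket(hv, "u4", K=3) == ["u4", "u5", "u6"]
# Reciprocity: every user in a bucket generates the same bucket.
assert cloaking_bucket(hv, "u8", K=3) == cloaking_bucket(hv, "u10", K=3)
```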

After the location cloaking set of K users is formed, the SAS database excludes all other users that interfere with any of these K users. We will show later in Section 6 that the proposed Alg. 1 can prevent adversaries from discerning the real IU among the K − 1 avatar SUs. The natural question then arises of how K should be chosen. As shown in Fig. 2, the DP obfuscation mechanism is applied to this location cloaking set to provide geo-indistinguishability for these K users. However, studies [23] have shown that the notion of differential privacy does not protect against Bayesian inference attacks using prior information, whereas the other privacy notion, namely the expected inference error, promotes resilience to Bayesian inference attacks but lacks a differential privacy guarantee in terms of geo-indistinguishability. Therefore, we intend to combine these complementary privacy notions to provide double-shield protection for the IU's location privacy. Specifically, we base the selection of K on the expected inference error, which can be thought of as the first phase; the DP obfuscation mechanism is then applied to this location cloaking set in the later phase.

First of all, let us examine the relationship between these two privacy notions. For demonstration, we give an example evaluating the IU's location privacy when differential privacy is applied against a Bayesian inference attack. As shown in Fig. 5(a), the adversary may have prior knowledge (denoted prior 1) of the IU's appearance at specific locations in a given area of interest. On the other hand, the adversary could obtain side information that eliminates loc. #4 from the possible IU locations (denoted prior 2). We apply the exponential mechanism [23] to achieve differential privacy on a location cloaking set of 10 possible locations. The adversary's expected inference errors for the IU's appearance at each of these locations are evaluated based on (2) and (3). As shown in Fig. 5(b), the adversary could identify the IU at loc. #4 or loc. #5 under prior 1 or prior 2, respectively. This figure also shows that by eliminating loc. #4 from the possible locations in the cloaking set, the adversary increases its inference success


Figure 5: Demonstrative example of how differential privacy can protect the IU's absolute location privacy when the adversary has different prior knowledge. (a) The adversary's prior information on the IU's location in a 16 × 13 region; (b) the expected inference error at each location ID under priors 1 and 2; (c) ε versus inference error.

probability at every other possible location. In particular, for the adversary with prior 1 and the IU operating at loc. #6, Fig. 5(c) plots how differential privacy helps protect the IU's location privacy. We can see that the adversary's expected inference error decreases as ε increases. However, with improved prior information, the adversary's inference error can be reduced despite the protection of differential privacy. Clearly, adversaries with different prior knowledge have different capabilities to compromise the IU's absolute location privacy in terms of inference error. In the extreme case, an adversary with certain prior information could uniquely geo-locate the IU, and differential privacy clearly has no control over this.

From the above numerical analysis, we conclude that differential privacy alone is not robust enough to ensure an absolute location privacy guarantee against Bayesian adversaries with certain prior knowledge. Furthermore, we also observe that when a differentially private mechanism is applied to the location cloaking set, it allows the adversary to narrow its guesses to a smaller area. Therefore, we propose a dynamic selection of the cloaking set size K that takes the adversary's prior knowledge π and the IU's privacy requirement φ into consideration. Next, we theoretically explore how to decide K in accordance with π and φ. First, let x̂_u∗ = arg max_{x∈Ψ} Pr(x | x_u) be the Bayesian adversary's estimated location. Based on (3), the conditional expected inference error can be calculated as:

Err = Σ_{x∈X} [Pr(x | x_u) / Σ_{y∈X} Pr(y | x_u)] · d_h(x̂_u∗, x)
    = Σ_{x∈X} [π(x) Pr(A(x_u | x)) / Σ_{y∈X} π(y) Pr(A(x_u | y))] · d_h(x̂_u∗, x)
    ≥ e^{−ε} Σ_{x∈X} [π(x) / Σ_{y∈X} π(y)] · d_h(x̂_u∗, x)
    ≥ e^{−ε} Σ_{x∈X} [π(x) / Σ_{y∈X} π(y)] · d_h(x̂′_u∗, x)
    = e^{−ε} Σ_{x∈X\{x̂′_u∗}} π(x) / Σ_{y∈X} π(y)    (7)

where the first inequality is due to (5) and the second inequality is due to x̂′_u∗ = arg max_{x∈X} Pr(x | x_u), which means the guessed location is now within the cloaking set. The above derivation validates our earlier numerical analysis: the adversary's inference error, i.e., the IU's absolute location privacy, depends on the adversary's prior knowledge π(x)/Σ_{y∈X} π(y), whereas for a given ε, differential privacy only guarantees a lower bound on it. Since the SAS database may not know the adversary's prior information, we consider a simple scenario in which all locations in X have equal prior probability, so that π(x)/Σ_{y∈X} π(y) = 1/K. As a result, (7) becomes

Err ≥ e^{−ε} (1 − 1/K)    (8)

Here, ε is the privacy budget chosen for the DP obfuscation, and we shall see in Section 5.5 that any allocated privacy budget ε is no greater than εth. Thus, inequality (8) can be further written as:

Err ≥ e^{−ε} (1 − 1/K) ≥ e^{−εth} (1 − 1/K)    (9)

To guarantee that the protection of the IU's location privacy satisfies its specified requirement φ in every DP obfuscation, we must have Err ≥ φ. It is sufficient to ensure e^{−εth}(1 − 1/K) ≥ φ. According to (9), the size of the location cloaking set can therefore be determined as:

K ≥ 1 / (1 − e^{εth} φ)    (10)

Intuitively, the cloaking set should be as large as possible. However, we shall see later that the larger the cloaking set, the higher the computational complexity of the obfuscation phase. Besides, when the cloaking set is large, the obfuscated locations may be distant from the IU's location, which in turn causes high obfuscation error in terms of power interference to the IU. Therefore, we select K as the lower bound in (10).
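Inequality (10) directly yields the smallest admissible K. A minimal sketch with illustrative εth and φ (the feasibility guard reflects that (10) only makes sense when e^{εth}·φ < 1):

```python
import math

def cloaking_set_size(eps_th, phi):
    """Smallest integer K satisfying (10): K >= 1 / (1 - exp(eps_th)*phi),
    equivalently exp(-eps_th) * (1 - 1/K) >= phi.  Feasible only when
    exp(eps_th) * phi < 1."""
    if math.exp(eps_th) * phi >= 1:
        raise ValueError("privacy requirement phi is infeasible for this eps_th")
    return math.ceil(1 / (1 - math.exp(eps_th) * phi))

# Illustrative parameters, not the paper's simulation settings:
K = cloaking_set_size(eps_th=0.5, phi=0.3)
# The chosen K indeed meets the inference-error requirement (9):
assert math.exp(-0.5) * (1 - 1 / K) >= 0.3
```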

5.3 Interference Check

After the IU joins the system and the location cloaking set is constructed at t0, new SUs may join the system at any time instant from t0 to tN. To avoid harmful interference to the IU, while ensuring that the IU cannot be inferred from the deterministic behaviour of releasing the IU as soon as the interference threshold is exceeded, we propose that any avatar SU or the real IU be released if the incoming SUs' transmissions interfere with it. In so doing, the adversary is expected to be incapable of discerning the IU from the other avatar SUs through this function block.

To be specific, the aggregated power interference to any user u in the location cloaking set is:

P_int,u = Σ_{u′∈I_u} P_u′ · h_{u′,u}    (11)


where I_u is the set of SUs that interfere with the user u in the location cloaking set, P_u′ is the transmit power of the SU u′, and h_{u′,u} represents the wireless channel gain between u′ and u. Since the SAS database adopts an accurate radio propagation model or deploys in-field sensors for interference management, the elements of I_u can be easily obtained. Thus, the aggregated power interference P_int,u can be calculated in a timely manner.

Whenever the power interference exceeds the threshold, the interfered avatar SU or the real IU is released as the "avatar" IU, and the respective interfering SUs are excluded.
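Equation (11) and the release rule can be sketched as follows, with hypothetical power and channel-gain lookups standing in for the SAS database's propagation model:

```python
def aggregated_interference(u, interferers, tx_power, gain):
    """Aggregated power interference at user u, Eq. (11):
    P_int,u = sum over u' in I_u of P_u' * h_{u',u}.
    `interferers[u]` plays the role of I_u; `tx_power` and `gain` are
    assumed lookups derived from the SAS database's channel model."""
    return sum(tx_power[v] * gain[(v, u)] for v in interferers[u])

def users_to_release(cloaking_set, interferers, tx_power, gain, threshold):
    """Members of the cloaking set whose aggregated interference exceeds
    the IU's threshold; each would be released as the 'avatar' IU."""
    return [u for u in cloaking_set
            if aggregated_interference(u, interferers, tx_power, gain) > threshold]

# Toy data (illustrative units):
interferers = {"u1": ["s1", "s2"], "u2": ["s1"]}
tx_power = {"s1": 0.1, "s2": 0.2}
gain = {("s1", "u1"): 0.5, ("s2", "u1"): 0.4, ("s1", "u2"): 0.1}
assert users_to_release(["u1", "u2"], interferers, tx_power, gain,
                        threshold=0.1) == ["u1"]
```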

5.4 IBA Adaptive Sampling

If the interference threshold of every user in the location cloaking set is respected at ti, the DPavatar scheme then decides whether to perform DP obfuscation. The reason is that each obfuscation comes at the cost of privacy budget (ε), while the total budget is constant. We are thus motivated to apply a sampling mechanism that selects certain time instants for DP obfuscation while reusing the previously released avatar IU between two consecutive obfuscation time instants, as shown in Fig. 2. Although the budget can be saved for future use in this way, using one avatar IU for a long time may cause power interference to other avatars or even the real IU. Based on this observation, we design an adaptive sampling mechanism that jointly considers the incurred power interference and the remaining privacy budget.

Previous work [34] in the context of numerical data (or histogram) publication adopts PID control [35] to adjust the sampling rate according to historical data dynamics. However, that framework uses fixed-rate sampling and allocates an equal budget to each predefined sampling time instant, which is not suitable in our setting, where we must track the aggregated power interference (i.e., the feedback error) as well as the remaining privacy budget and adjust the sampling rate on the fly. In this paper, we propose a novel Interference and Budget Aware (IBA) adaptive sampling mechanism based on feedback error to strike a balance between power interference and the remaining privacy budget. In particular, we use PID control to characterize the effect of power interference on sampling intervals, and then determine the next sampling time instant by jointly considering the remaining privacy budget.

In our framework, the feedback error is defined as the ratio of the power interference increase between two sampling time instants to the remaining interference tolerance budget:

E_{t_n} = (P^{t_n}_{int} − P^{t_{n−1}}_{int}) / max{Λ^{th}_{u∗} − P^{t_n}_{int}, 0},    (12)

where P^{t_n}_{int} = max{P^{t_n}_{int,u1}, P^{t_n}_{int,u2}, ..., P^{t_n}_{int,uK}} measures the largest power interference level experienced by any user in the location cloaking set. We treat every user equally so that adversaries cannot deduce extra information from the variation of the sampling time instants.

Furthermore, the remaining interference tolerance budget defined in (12) captures the fact that the feedback error is infinite if the interference power exceeds the IU's threshold, whereas it is a proportional measure of the interference increase otherwise. We can then apply PID control to evaluate the effect of the feedback errors on the sampling rate. First of all, the PID error control law ∆ is defined as:

∆ = C_p E_{t_n} + C_i (Σ_{j=n−T_i+1}^{n} E_{t_j}) / T_i + C_d (E_{t_n} − E_{t_{n−1}}) / (t_n − t_{n−1})    (13)

where the first term is the proportional error, with C_p the proportional gain that amplifies the current error; the second term is the integral error, i.e., the accumulated error over the past integral time window T_i, with C_i the integral gain; and the third term is the derivative error, capturing predicted errors to prevent large errors in the future, with C_d the derivative gain. In general, the control gains C_p, C_i, and C_d are the weights forming the final calibrated PID error control and satisfy:

C_p, C_i, C_d ≥ 0,  C_p + C_i + C_d = 1    (14)

From our prior discussion, the sampling rate should increase when the feedback error is high; in other words, avatar IUs should be released frequently to suppress the power interference caused by incoming SUs. With this in mind, the adjusted sampling interval (the time between two sampling time instants) is determined as:

I′ = max{1, I + θ(1 − e^{∆ − ν·λ_r})},    (15)

where θ is a system parameter that scales the sampling interval adjustment and ν is a scaling factor that unifies the PID error control and the remaining budget λ_r = 1/ε_r. Here, we assume the smallest sampling interval is 1. The time unit can be of any granularity, such as seconds, minutes, or hours, depending on the time scale of the SUs' joining rate. The rationale for (15) is as follows. Intuitively, the sampling interval I is a decreasing function of the PID control error ∆ and of the remaining privacy budget ε_r. In other words, obfuscation should be applied frequently when the PID error is large (to avoid interference) or when enough privacy budget is left (to select an avatar with high utility). That is, ε_r (or λ_r) and ∆ jointly determine whether the sampling interval should be reduced or increased. Thus, ν is introduced as a weighting factor to balance λ_r against ∆ when determining the new sampling interval. Specifically, e^{∆ − ν·λ_r} is greater than 1 when ∆ > ν·λ_r, and the new sampling interval is reduced; e^{∆ − ν·λ_r} is less than 1 when ∆ < ν·λ_r, and the new sampling interval is increased.
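Equations (13) and (15) can be sketched together as follows; the gains, θ, and ν below are illustrative values, not the paper's tuned parameters:

```python
import math

def pid_error(errors, times, Cp, Ci, Cd, Ti):
    """PID control law of Eq. (13) over the history of feedback errors
    E_{t_j} from Eq. (12); `errors`/`times` are aligned lists with the
    current sample last.  Gains satisfy Cp, Ci, Cd >= 0, Cp+Ci+Cd = 1."""
    e_now, e_prev = errors[-1], errors[-2]
    integral = sum(errors[-Ti:]) / Ti                      # window average
    derivative = (e_now - e_prev) / (times[-1] - times[-2])
    return Cp * e_now + Ci * integral + Cd * derivative

def next_interval(I, delta, eps_r, theta=2.0, nu=0.1):
    """New sampling interval, Eq. (15):
    I' = max(1, I + theta*(1 - exp(delta - nu*lambda_r))),
    with lambda_r = 1/eps_r the inverse of the remaining budget."""
    lam_r = 1.0 / eps_r
    return max(1.0, I + theta * (1.0 - math.exp(delta - nu * lam_r)))

errors, times = [0.05, 0.10, 0.40], [1, 2, 3]
delta = pid_error(errors, times, Cp=0.6, Ci=0.2, Cd=0.2, Ti=3)
# A rising interference error shrinks the interval; a calm one grows it.
assert next_interval(I=4, delta=delta, eps_r=0.5) < 4
assert next_interval(I=4, delta=0.0, eps_r=0.5) > 4
```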

5.5 Dynamic Budget Allocation

Next, if ti is a sampling time instant, DP obfuscation should be performed over the location cloaking set. Our next objective is thus to determine the amount of privacy budget to allocate for this obfuscation.

In this section, we propose a dynamic budget allocation mechanism that adapts to the variation of sampling intervals and allocates the privacy budget accordingly to guarantee (ω, ε)-privacy, in the sense that the total budget spent within any sliding window of length ω is less than the


total budget ε. Recall that in the IBA adaptive sampling mechanism, the sampling interval decreases when the feedback error increases. In other words, we anticipate more sampling time instants within a time window of length ω, which leads us to allocate a smaller share of the budget at each sampling time instant so that more budget is reserved for the successive ones. Conversely, when the feedback error decreases, there are fewer sampling time instants within the ω time window, and a larger fraction of the remaining budget can be allocated at the current sampling time instant to preserve utility.

With this in mind, we first obtain the remaining budget in the window [n − ω + 1, n] as ε_r = ε − Σ_{i=n−ω+1}^{n−1} ε_i. Next, we determine the fraction of the privacy budget to allocate at the current sampling time instant. Based on our previous analysis, a logarithmic function captures well the relationship between the sampling interval I′ and the fraction f. Thus, we define the privacy budget allocated at the current sampling time instant as:

ε_n = min{µ ln(I′ + 1) · ε_r, ε_th}    (16)

where µ is a normalizing factor that scales the fraction into (0, 1), and we use ln(I′ + 1) instead of ln I′ to avoid zero since I′ could be 1. Besides, the reason we cap the budget at ε_th is to avoid leaving too little budget for the future.
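A sketch of the allocation rule (16) with its sliding-window bookkeeping; the values of µ and ε_th are illustrative:

```python
import math

def allocate_budget(history, w, eps_total, I_new, mu=0.5, eps_th=0.4):
    """Per-instant budget of Eq. (16):
    eps_n = min(mu * ln(I' + 1) * eps_r, eps_th),
    where eps_r = eps_total minus the budget spent over the previous
    w-1 instants (the rest of the sliding window)."""
    eps_r = eps_total - sum(history[-(w - 1):])
    return min(mu * math.log(I_new + 1) * eps_r, eps_th)

history = [0.2, 0.1, 0.0, 0.1]          # eps_i spent at earlier instants
eps_n = allocate_budget(history, w=4, eps_total=1.0, I_new=3)
assert 0 < eps_n <= 0.4                  # never exceeds the eps_th cap
# Longer intervals between obfuscations earn at least as large a share:
assert allocate_budget(history, 4, 1.0, I_new=6) >= eps_n
```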

5.6 Differentially Private Location Obfuscation

Given the location cloaking set X, our scheme achieves differential privacy (with privacy budget ε_n) on it to protect the IU's location. We seek an optimal obfuscation mechanism A(·|·) such that the utility loss is minimized while the differential privacy guarantee and the interference protection are satisfied. In particular, the mechanism A(·|·) achieving differential privacy means that any two users in the location cloaking set have approximately the same probability (within a factor of e^ε) of generating the released avatar IU. The utility loss in our context is defined as the loss in spectrum efficiency.

In our proposed scheme, operating SUs, instead of non-existing dummy users, are selected to obfuscate the real IU's location. The benefit is that when an avatar SU is released as the IU, it is protected from harmful interference, so its quality of service is guaranteed. Even though the avatar SU rejects its interfering SUs in the same way as the real IU does, the overall spectrum efficiency loss is much smaller than when relying on non-existing users. Specifically, the utility loss in spectrum efficiency (bits/s/Hz) incurred by releasing avatar u as the IU is:

c(u) = Σ_{u′∈I_u} log2(1 + P_Rx(u′) / (P_int,u′ + N0)) − [ log2(1 + P_Rx(u) / (min{Λ^{th}_{u∗}, P_int,u} + N0)) − log2(1 + P_Rx(u) / (P_int,u + N0)) ]    (17)

where P_Rx(·) is the receiving power at a particular user and P_int,· is the aggregated interference power at that user, calculated in the same way as in (11). Note that the first term represents the loss of spectrum efficiency from expelling SUs, while the two terms in square brackets represent the spectrum efficiency gain from elevating the avatar SU to the role of IU. In particular, if the real IU is released, the utility loss in spectrum efficiency is:

c(u∗) = Σ_{u∈I_{u∗}} log2(1 + P_Rx(u) / (P_int,u + N0))    (18)

With the utility function defined above, we also assume a prior π over X, representing the probability of the real IU being at each location at any given time. Therefore, given the allocated privacy budget ε_n at the nth sampling time instant, we can construct an optimal mechanism by solving a linear optimization problem that minimizes the expected utility loss subject to ε_n-differential privacy:

minimize_A  Σ_{x_{u∗}, x_u ∈ X} π(x_{u∗}) Pr(A(x_u | x_{u∗})) c(u)
subject to  Pr(A(x_u | x_{u∗})) ≤ e^{ε_n} Pr(A(x_u | x′))   ∀ x_{u∗}, x′, x_u ∈ X
            Σ_{x_u∈X} Pr(A(x_u | x_{u∗})) = 1               ∀ x_{u∗} ∈ X
            0 ≤ Pr(A(x_u | x_{u∗})) ≤ 1                     ∀ x_{u∗}, x_u ∈ X
            E(P_int,u∗) ≤ Λ^{th}_{u∗}                       ∀ x_{u∗} ∈ X    (19)

In the above optimization problem, the first three constraints guarantee differential privacy, while the last constraint ensures that the expected interference to the real IU from releasing u does not exceed the threshold. More explicitly, the fourth constraint can be written as:

Σ_{x_{u∗}∈X} Σ_{x_u∈X} π(x_{u∗}) Pr(A(x_u | x_{u∗})) Σ_{u∈I_{u∗}\I_u} P_u h_{u,u∗} ≤ Λ^{th}_{u∗}.    (20)

We also observe that there are |X|² decision variables and up to O(|X|³) constraints in (19). Based on our analysis in Section 5.2, it is clear that as the size of the location cloaking set increases, the IU's location privacy is better preserved but the computational complexity grows dramatically. We examine this tradeoff in the evaluation section. So far, once the optimal obfuscation strategy A is obtained, the SAS database can follow this probability distribution and release the avatar IU accordingly. To give a clearer picture of the workflow and interactions of the above mechanisms, Algorithm 2 sketches the outline of the DPavatar protection scheme.
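The LP (19) can be assembled mechanically. The sketch below solves a toy three-location instance with SciPy's `linprog` (assumed available); the prior, costs, and interference matrix are made-up inputs supplied directly rather than derived from (17) and (20):

```python
import numpy as np
from scipy.optimize import linprog

def optimal_obfuscation(pi, cost, eps_n, interf, lam_th):
    """Solve a toy instance of the LP (19).  pi: prior over the K
    cloaking-set locations; cost[j]: utility loss c(u_j) of releasing
    u_j; interf[i][j]: interference to the IU at location i when u_j is
    released (an assumed input); lam_th: the IU's threshold."""
    K = len(pi)
    idx = lambda i, j: i * K + j            # flatten p[i][j] = Pr(A(x_j | x_i))
    c = np.array([pi[i] * cost[j] for i in range(K) for j in range(K)])

    A_ub, b_ub = [], []
    for j in range(K):                      # DP: p[i][j] <= e^eps_n * p[k][j]
        for i in range(K):
            for k in range(K):
                if i == k:
                    continue
                row = np.zeros(K * K)
                row[idx(i, j)] = 1.0
                row[idx(k, j)] = -np.exp(eps_n)
                A_ub.append(row); b_ub.append(0.0)
    # Expected-interference constraint, cf. (20):
    row = np.array([pi[i] * interf[i][j] for i in range(K) for j in range(K)])
    A_ub.append(row); b_ub.append(lam_th)

    A_eq = np.zeros((K, K * K))             # each row of the mechanism sums to 1
    for i in range(K):
        A_eq[i, i * K:(i + 1) * K] = 1.0
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=A_eq, b_eq=np.ones(K), bounds=[(0, 1)] * (K * K),
                  method="highs")
    return res.x.reshape(K, K)

P = optimal_obfuscation(pi=[0.5, 0.3, 0.2], cost=[1.0, 2.0, 3.0],
                        eps_n=0.5, interf=[[0.0, 0.1, 0.2]] * 3, lam_th=0.15)
assert np.allclose(P.sum(axis=1), 1.0)      # a valid probability mechanism
```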

6 PRIVACY ANALYSIS

We show in this section that the design blocks, namely the location cloaking set and the IBA adaptive sampling, leak no extra information that would facilitate adversaries' attacks on our scheme. Based on that, we prove that the proposed DPavatar scheme achieves (ω, ε)-privacy over the IU's operating time. Before the analysis, we make one fundamental assumption: adversaries cannot manipulate the SAS database by registering fake SUs at chosen locations, owing to the authentication process.

Firstly, the location cloaking set is constructed to preserve reciprocity by searching for the MIS and then


Algorithm 2 DPavatar Protection Scheme

Input: user join/leave dynamics; channel fading model; system parameters: ε, ω, θ, ν, εth, µ, Cp, Ci, Cd; IU's parameters: Λth_u∗, φ, x_u∗; the next sampling time instant ns; the time span N.
Output: x_u
1: construct the location cloaking set according to Alg. 1;
2: ns ← 1; I ← 1;
3: for n = 1:N do
4:   for i = 1:K do
5:     if P^n_{int,ui} > Λth_u∗ then
6:       release x^ns_u ← x^n_{ui};
7:       jump to Step 3;
8:     end if
9:   end for
10:  if n == ns then
11:    n is a sampling time instant;
12:    calculate P^n_int according to (11);
13:    calculate the PID error ∆ according to (13);
14:    update the interval I′ according to (15);
15:    ns ← ns + I′; I ← I′;
16:    compute the remaining budget ε_r = ε − Σ_{i=n−ω+1}^{n−1} ε_i;
17:    calculate the budget allocation ε_n from (16);
18:    solve the optimization problem (19) and obtain A;
19:    release x^n_u according to A;
20:  else
21:    n is not a sampling time instant; keep using x^ns_u;
22:  end if
23: end for

using Hilbert curve. Although the number of MISs in agraph is dependent on the number of vertices n and edges,normally for a random graph (without adversarial manipu-lations) like our scenario, the number of MISs containing thereal IU is at least n1−τ (τ is a small value) whereas the num-ber of elements in any MIS is almost surely 2 (1 + O (1)) log2n[36]. This indicates that the operating SUs and adversarieshave extremely low probability to discern the IU from theMIS phase when the number of operating SUs are large. Onthe other hand, the sufficient number of elements in any MISguarantees the Hilbert curve can be applied to construct thelocation cloaking set to preserve reciprocity.
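As a concrete illustration of the MIS step, the hedged sketch below uses networkx to grow a maximal independent set around the real IU in a random interference graph. The graph model (G(n, p)) and all indices here are toy assumptions, not the paper's simulation setup.

```python
import networkx as nx

# Toy interference graph: 50 users, random interference edges.
G = nx.gnp_random_graph(50, 0.08, seed=1)
iu = 0                                 # index of the real IU (assumed)

# networkx grows a maximal independent set starting from the given nodes,
# so the returned set is guaranteed to contain the IU.
mis = nx.maximal_independent_set(G, nodes=[iu], seed=1)

assert iu in mis
# No two members of the MIS interfere with each other.
assert all(not G.has_edge(u, v) for u in mis for v in mis if u != v)
```

Because many different MISs contain the IU, observing one released MIS gives an adversary little leverage to single the IU out, which is the intuition behind the n^{1−τ} bound cited above.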

Secondly, it might seem that adversaries can learn extra information about where the IU is from changes in the sampling interval. However, the IBA adaptive sampling block calculates the sampling interval based on the maximum power interference to all the users in the location cloaking set. By doing so, adversaries cannot deduce which specific user's power interference actually caused the variation of the sampling interval.
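A minimal sketch of the adaptive-sampling controller follows. The exact forms of the PID error (13) and the interval update (15) are not reproduced here, so the FAST-style rule below (after Fan and Xiong [34], on which the paper builds) is an assumption for illustration: the interval grows when the feedback error ∆ is below the setpoint υ and shrinks sharply when it is above.

```python
# Assumed PID combination for the error Delta of (13); with the paper's
# setting {Cp, Ci, Cd} = {1, 0, 0} only the proportional term survives.
def pid_error(errors, Cp=1.0, Ci=0.0, Cd=0.0, window=5):
    """Combine proportional, integral, and derivative feedback terms."""
    p = errors[-1]
    i = sum(errors[-window:]) / min(len(errors), window)
    d = errors[-1] - errors[-2] if len(errors) > 1 else 0.0
    return Cp * p + Ci * i + Cd * d

# Assumed FAST-style stand-in for the interval update (15).
def next_interval(I, delta, theta=5, upsilon=0.1):
    """Grow the sampling interval for small Delta, shrink it for large Delta."""
    return max(1, round(I + theta * (1 - (delta / upsilon) ** 2)))
```

For example, with ∆ = 0.05 (below the setpoint) the interval stretches from 3 to 7, while ∆ = 0.2 collapses it back to 1.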

Last but not least, we argue that the DPavatar scheme preserves (ω, ε)-privacy for the IU from t0 to tN. First of all, the DP obfuscation scheme is applied to the location cloaking set, which has been proven to be private and cannot be manipulated at any stage. Then, the dynamic budget allocation guarantees that Σ_{i=n−ω+1}^{n−1} ε_i ≤ ε for any sliding window of size ω, and at any sampling time instant ε_th ≤ ε_n ≤ ε. Therefore, the DP obfuscation at any time instant is ε_n-differentially private and, consequently, the DPavatar scheme is (ω, ε)-private across t0 to tN.
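The sliding-window argument can be checked numerically. The allocation rule below is a simplified stand-in for (16): spend a fraction mu of the remaining window budget, never less than ε_th, and skip sampling when the remainder falls below ε_th. The parameter values are illustrative, not the paper's; under any rule that never spends more than ε_r, every window of size ω stays within ε.

```python
# Simplified stand-in for the allocation rule (16); parameters are illustrative.
eps_total, eps_th, mu, omega = 2.0, 0.3, 0.5, 5
spent = []  # eps_i consumed at each time instant (0.0 when no sampling occurs)

for n in range(30):
    # Step 16: remaining budget over the last omega - 1 instants.
    eps_r = eps_total - sum(spent[max(0, n - omega + 1):n])
    if eps_r >= eps_th:
        spent.append(max(eps_th, mu * eps_r))   # sample with budget eps_n
    else:
        spent.append(0.0)                       # skip: keep using the old avatar

# (omega, eps)-privacy invariant: every window of omega instants stays within eps.
windows = [sum(spent[i:i + omega]) for i in range(len(spent) - omega + 1)]
assert max(windows) <= eps_total + 1e-9
```

The invariant holds because the branch condition guarantees the allocation never exceeds ε_r, so adding ε_n to the previous ω − 1 allocations cannot push the window sum past ε.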

7 PERFORMANCE EVALUATION

In this section, we conduct simulations to evaluate the performance of our DPavatar scheme and compare it with other differentially private mechanisms in terms of utility and privacy.

We focus on a 13×13 km² geographical area in which the IU may operate. At the initialization shown in Fig. 6, we randomly generate 50 users and scatter them over this area. We randomly select one of them as the IU, e.g., the solid dot in Fig. 6, whose interference tolerance threshold is assumed to be 0 dBm and whose personalized privacy requirement is ϕ. For the channel model, we consider large-scale fading, described as P_Rx = γ d^(−ξ) P_Tx [37]. Here, we set the antenna gain γ = 2.5, the transmitting power P_Tx = 10 W, and the path loss factor ξ = 4, while d is the Euclidean distance between transceiver pairs. We assume that two nodes interfere with each other when the interference power exceeds −40 dBm, so we can build a bipartite interference graph for these users, as shown in Fig. 6. Considering the AWGN channel, we assume the noise power N0 = 10⁻⁹ W. Since we consider a real-time scenario, we assume the IU's operation duration is 30 minutes and the granularity of SUs' joining/leaving is in seconds. Specifically, we set each time interval to 10 seconds, during which one SU joins and one leaves the DSA system with probabilities α and β, respectively. Equivalently, the SUs' average net arrival rate is ρ = 6(α − β) users/minute. Note that the departing SUs are those who finish their services, not the expelled SUs. On the other hand, incoming SUs are admitted according to the SAS database access control and are randomly placed in this area. In the simulation, we set {Cp, Ci, Cd} = {1, 0, 0}, θ = 5, υ = 0.1, ε_th = 0.3, and µ = 0.1, while the privacy budget ε and time window ω are varied to examine their impacts on the performance.
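The channel model above can be written down directly. The short sketch below uses the paper's parameters (γ = 2.5, P_Tx = 10 W, ξ = 4, −40 dBm interference threshold) and assumes distances are in meters to derive the implied interference range.

```python
import math

# Channel-model sketch: large-scale fading P_Rx = gamma * d^(-xi) * P_Tx,
# with an interference edge whenever the received power exceeds -40 dBm.
gamma, P_tx, xi = 2.5, 10.0, 4      # antenna gain, Tx power (W), path loss factor

def received_power_w(d):
    """Received power in watts at distance d (assumed meters)."""
    return gamma * d ** (-xi) * P_tx

def to_dbm(p_watts):
    return 10 * math.log10(p_watts / 1e-3)

def interferes(d, thresh_dbm=-40.0):
    """True when two nodes at distance d form an interference edge."""
    return to_dbm(received_power_w(d)) > thresh_dbm

# Implied interference range: solve gamma * d^(-xi) * P_tx = -40 dBm for d.
thresh_w = 1e-3 * 10 ** (-40 / 10)               # -40 dBm in watts
d_star = (gamma * P_tx / thresh_w) ** (1 / xi)   # roughly 126 m under these units
```

Evaluating `interferes` on both sides of `d_star` confirms the threshold behavior used to build the bipartite interference graph in Fig. 6.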

Figure 6: The initialization of the simulations, where 50 users are distributed in a 13×13 km² area and their bipartite interference relationships are also shown.

The metrics we use in the evaluation are the following: the size of X, the computational time of DP obfuscation, the utility loss in terms of spectrum efficiency, the privacy in terms of the adversary's expected inference error, and the aggregated power interference to the IU. We examine how the design variables impact the system performance with respect to these metrics.

Figure 7: System performance at a given sampling time instant. (a) |X| vs. ϕ; (b) utility loss vs. ε; (c) utility loss comparisons; (d) location privacy comparisons.

7.1 Performance of the Obfuscation at a Sampling Time Instant

First, we focus on evaluating the performance at one specific sampling time instant when the obfuscation mechanism is performed. Simulations are conducted with the IU at each of the 50 possible locations, and we then take the average of 100 independent runs.

|X| vs. ϕ. To perform location obfuscation, the location cloaking set should be constructed by taking the adversary's prior knowledge into consideration. Fig. 7a shows the impact of the privacy parameters ϕ and ε on the selection of the location cloaking set. For the same ε, as ϕ increases, the cloaking set size (technically, its lower bound) becomes larger because more SUs must be included in the obfuscation to satisfy the IU's higher inference-error (location privacy) demand. On the other hand, for the same ϕ, the cloaking set size increases as ε increases, which coincides with our theoretical analysis of (7).

Utility loss vs. ε. At the sampling time instant when the location obfuscation is performed according to the optimization problem (19), the lower ε is, the tighter the differential privacy constraint (the first inequality in (19)) is, which forces more similar probability distributions for every cloaking user. As a result, the released avatar may not be the one yielding low utility loss. When ε increases, the utility loss decreases but barely changes once ε becomes large enough, as shown in Fig. 7b, which coincides with our intuition about differentially private mechanisms. On the other hand, given the same ε, the utility loss decreases as ϕ increases. The reason is that a larger ϕ yields a bigger cloaking set with more SUs, which takes advantage of user diversity and can generate an avatar with lower utility loss. However, we shall see later that increasing the cloaking set has a negative impact on the power interference to the IU.

Utility loss comparison. We compare the utility loss of our scheme with that obtained by applying the exponential mechanism [23]. The results are shown in Fig. 7c, from which we can see that our differentially private mechanism, by solving a utility-based optimization problem, outperforms the exponential mechanism for all settings of the privacy parameters. Moreover, the trend of the utility loss across the different privacy parameter values coincides with the previous results.
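For reference, the exponential-mechanism baseline [23] can be sketched as follows: a candidate location z is released with probability proportional to exp(−ε · loss(x, z)/2), assuming the loss score has sensitivity 1. The candidate set and loss function below are toy values, not the paper's.

```python
import math
import random

def exponential_mechanism(x, candidates, loss, eps, rng):
    """Sample a release z with Pr[z] proportional to exp(-eps * loss(x, z) / 2)."""
    weights = [math.exp(-eps * loss(x, z) / 2) for z in candidates]
    return rng.choices(candidates, weights=weights, k=1)[0]

rng = random.Random(0)
candidates = [0, 1, 2]                 # toy cloaking locations
loss = lambda x, z: abs(x - z)         # toy distance-based utility loss
released = exponential_mechanism(1, candidates, loss, 0.5, rng)
```

Unlike the LP in (19), this baseline weights candidates only by their individual scores, which is why a tailored utility-optimal mechanism can beat it.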

Location privacy comparison. Next, we compare the location privacy provided by our scheme and by the exponential mechanism. As shown in Fig. 7d, our scheme provides a higher expected inference error, i.e., more location privacy, than the exponential mechanism at most locations. Interestingly, the large variation of the inference error in our scheme arises because the different locations forming the cloaking set have different utility loss values, which the obfuscation mechanism takes into account when solving the optimization (19).

Computational cost for solving problem (19):

{ε_th, ϕ}        |X|   Time (s)
{0.1, 0.7}        5    0.0156
{0.25, 0.7}      10    0.1092
{0.25, 0.725}    15    1.7316
{0.27, 0.725}    20    8.7985

Table 1: Execution time of our obfuscation approach for different values of ϕ and ε_th.

Responsiveness: From the previous analysis, we know that a larger cloaking set increases the adversary's expected inference error. Here, we examine the computational time for solving optimization (19) for different cloaking set sizes. Note that the delay of our framework mainly comes from constructing the location cloaking set, calculating the privacy budget, and solving optimization problem (19). On one hand, the cloaking set construction is a one-time computation at system initialization, so it can be considered to have no impact on the responsiveness of our framework afterwards. On the other hand, solving optimization problem (19) dominates the delay of our framework, while the overhead for calculating the privacy budget is negligible. Thus, we only consider the responsiveness of our framework in terms of the delay incurred in solving (19). The measurements are taken on a desktop with an Intel i3 processor and 8 GB of memory, using the Simplex algorithm to solve problem (19). Table 1 shows that the execution time of our obfuscation approach increases with the size of the cloaking set. However, we anticipate that by using a spanning graph to approximate the Hamming distance in (19), as in [18], and with the higher computing capacity of a real SAS database (e.g., enterprise-level computers), the responsiveness of our framework will be greatly improved.


Figure 8: Impact of design variables ε and ω on system performance. (a) Sampling times vs. ε; (b) sampling times vs. ω; (c) total utility loss vs. ε; (d) total utility loss vs. ω.

7.2 Real-time Performance

Now, we examine the system performance in real time as SUs join and depart the targeted area. The number of time instants during the IU's operation duration can be calculated as N = 180. Here, we fix the IU's personalized privacy requirement at ϕ = 0.7.

Sampling times vs. ε. First, we set ω = 20 and examine how the total privacy budget affects the number of sampling points. Here, a sampling point is a time instant at which the obfuscation is about to execute. As shown in Fig. 8a, the number of sampling points increases with ε, since a larger privacy budget allows our scheme to squeeze more sampling time instants into any sliding window of size ω, with each instant allocated a fair amount of ε. Besides, for a given ε, more incoming SUs cause higher power interference to the users in the location cloaking set, so the number of sampling time instants increases to suppress the interference.

Sampling times vs. ω. Here, we set the privacy budget ε = 2.0 and vary the length of the sliding window to see its impact on the number of sampling points. As shown in Fig. 8b, the number of sampling points decreases as ω increases. The reason is intuitive: with ε fixed, the sampling rate must be lower when the window size increases in order to conserve the privacy budget. On the other hand, more incoming SUs again increase the number of sampling points, for the same reason as before. It should be noted that the number of sampling time instants affects the computational load on the SAS database, since it must solve optimization problem (19) at each of them.

Total utility loss vs. ε. Besides the complexity cost, we now show the total utility loss with respect to the design parameters ε and ω. First, setting ρ = 5, we conduct simulations to examine the impact of ε on the total utility loss; the results are shown in Fig. 8c. Clearly, a higher privacy budget yields a larger utility loss, since our observation in Fig. 8a indicates that obfuscation happens more frequently when the privacy budget is higher. As a result, the avatar is released more often to expel interfering SUs, which brings down the total utility.

Figure 9: Utility/interference-error tradeoff w.r.t. the design variables. (a) Utility vs. error w.r.t. ε; (b) utility vs. error w.r.t. ω.

Total utility loss vs. ω. The same effect can be observed for the design parameter ω, as shown in Fig. 8d. A larger sliding window reduces the total utility loss because the obfuscation is performed less frequently and fewer SUs are expelled.

From the previous analysis, it may seem that ε and ω should be chosen so that obfuscation happens as rarely as possible to reduce the utility loss. However, we shall see that there is a tradeoff between the utility loss and the IU's interference error, which should be balanced by selecting appropriate ε and ω.

Utility loss vs. interference error w.r.t. ε. First, we examine the tradeoff between the total utility loss and the IU's interference error w.r.t. ε. Here, the interference error is averaged over the total time horizon (i.e., N = 180) and ω is set to 20. The result shown in Fig. 9a indicates that a higher privacy budget yields a lower average interference error, since more obfuscations are performed and avatars are frequently released to suppress the power interference. On the other hand, the opposite trend is observed for the utility loss curve, as analyzed previously. Therefore, there is a clear tradeoff between these two quantities, and ε should be chosen to strike a balance between them.

Utility loss vs. interference error w.r.t. ω. Similarly, when we conduct simulations on the tradeoff w.r.t. ω, the result shown in Fig. 9b implies the same observation. In practice, the design goal should be to choose ε and ω so as to cause as few obfuscations as possible while keeping the IU's interference error below the threshold.

7.3 Performance Comparison with Other Schemes

To assess the performance of our scheme against the state of the art, we use Clark et al. [5] and Bahrak et al. [8] as benchmark schemes. Both works focus on location protection for static IUs under a strong adversary assumption that all SUs are malicious. Specifically, the former applies power allocation randomization and false location injection interchangeably to maximize the IU's privacy-preservation time (coined Clark'16), whereas the latter obfuscates the IU's location within a larger area (coined Bahrak'14). Since our work is based on a weaker adversary assumption that operating SUs are not malicious, to make a fair comparison we adopt the same threat model as in [5], [8], so the IU has to inject dummy locations whose release still ensures a differential privacy guarantee (coined mDPavatar). However, one can easily expect that the total spectrum-efficiency loss increases as the IU introduces more dummy locations to substitute for the untrusted SUs. A simple numerical simulation for an anonymity set of k = 15 over N = 200 time instants is shown in Fig. 10a.

Figure 10: Comparison with the state-of-the-art schemes. (a) Utility loss vs. number of injected dummy IUs; (b) utility loss of different schemes.

Note that the simulation settings in Clark'16, Bahrak'14, and our work differ, so we adapt them to our setting for a fair comparison. Besides, we follow [5] and divide the whole area into small grids, each of which contains only one user. The IU's privacy is then evaluated using Eq. (3), but with the Euclidean distance measure applied to unify the three schemes. Moreover, we let these schemes run for a sufficient amount of time and then calculate the total utility loss. With respect to a Bayesian adversary, in both our work and [5] the IU's corresponding privacy level can be derived after sufficient runtime (or adversary inference periods).

Since there is a clear tradeoff between privacy and utility, we only compare the utility of these schemes at the same privacy target. Here, we set the privacy level to 4.3 km, and the randomization parameters of the designs can then be traced backward. Specifically, in our design, the budget {ε_th, ϕ} = {0.25, 0.725} yields an anonymity set of size 15, which satisfies the above privacy level. Similarly, Clark'16 is estimated to apply a uniform distribution of mean 3.7 dB to reduce the SUs' power assignment profile, for a setting of P_Tx = 10 W and a new IU interference threshold of −80 dBm; this calculation is based on their proposed single PU error (SPE) adversarial estimation scheme. In addition, Bahrak'14 is expected to generate a protected contour of 16.81 × 16.81 km² to provide the same privacy level. Notice that a protected contour of this size covers the whole region of interest, so Bahrak'14 serves as the baseline case with maximum utility loss.

Simulation results are shown in Fig. 10b. It is clear that our scheme causes the least spectrum-efficiency loss over a long time period, but the performance gap between our scheme and Clark'16 shrinks as the IU operates longer. The reason is that when N is relatively small, our scheme selects an avatar with utility awareness as in (19), whereas Clark'16 randomizes the power allocation in an attempt to prolong the expected time of privacy preservation [5]. However, as N increases, this gain vanishes because our scheme releases avatars directly, without solving (19), due to power interference from newly joined SUs. In other words, our scheme then operates similarly to Clark'16, in the sense that dummy records (in mDPavatar) and obfuscated responses (in Clark'16) are applied, respectively.

8 CONCLUSION

In this paper, we proposed DPavatar, a real-time differential privacy framework to preserve the IU's location privacy. This work novelly leverages operating SUs as the obfuscation objects and includes them in a location cloaking set, over which we developed a utility-optimal differentially private mechanism. We strategically combined differential privacy with the expected inference error in our design to improve the IU's location privacy. Moreover, we adopted the (ω, ε)-privacy notion to ensure a differential privacy guarantee for the IU in real time. A privacy analysis was given, and evaluation results showed that the DPavatar framework achieves better utility and location privacy than existing solutions and other differentially private mechanisms.

REFERENCES

[1] F. C. Commission, "Enabling innovative small cell use in 3.5 GHz band NPRM & order," 2012. [Online]. Available: https://apps.fcc.gov/edocs_public/attachmatch/FCC-12-148A1_Rcd.pdf

[2] K. Kim, Y. Jin, D. Kum, S. Choi, M. Mueck, and V. Ivanov, "ETSI-standard reconfigurable mobile device for supporting the licensed shared access," Mobile Information Systems, 2016.

[3] U. A. F. E. Segundo, "Ground/air task oriented radar (G/ATOR)," 2015. [Online]. Available: http://www.dtic.mil/docs/citations/AD1019434

[4] U. M. Corps, "AN/TPS-80 ground/air task oriented radar (G/ATOR)," 2017, https://marinecorpsconceptsandprograms.com/programs/aviation/antps-80-groundair-task-oriented-radar-gator.

[5] M. Clark and K. Psounis, "Can the privacy of primary networks in shared spectrum be protected," in International Conference on Computer Communications (INFOCOM). IEEE, 2016, pp. 1–9.

[6] P. R. Vaka, S. Bhattarai, and J.-M. Park, "Location privacy of non-stationary incumbent systems in spectrum sharing," in Global Communications Conference (Globecom). IEEE, 2016, pp. 1–6.

[7] D. of Defense, "Electromagnetic spectrum strategy," 2013. [Online]. Available: http://archive.defense.gov/news/dodspectrumstrategy.pdf

[8] B. Bahrak, S. Bhattarai, A. Ullah, J.-M. Park, J. Reed, and D. Gurney, "Protecting the primary users' operational privacy in spectrum sharing," in International Symposium on Dynamic Spectrum Access Networks (DySpan). IEEE, 2014, pp. 236–247.

[9] A. Robertson, J. Molnar, and J. Boksiner, "Spectrum database poisoning for operational security in policy-based spectrum operations," in Military Communications Conference (Milcom). IEEE, 2013, pp. 382–387.

[10] Z. Zhang, H. Zhang, S. He, and P. Cheng, "Achieving bilateral utility maximization and location privacy preservation in database-driven cognitive radio networks," in International Conference on Mobile Ad Hoc and Sensor Systems (Mass). IEEE, 2015, pp. 181–189.

[11] R. Shokri, G. Theodorakopoulos, C. Troncoso, J.-P. Hubaux, and J.-Y. Le Boudec, "Protecting location privacy: optimal strategy against localization attacks," in Proceedings of the 2012 ACM Conference on Computer and Communications Security. ACM, 2012, pp. 617–627.

[12] C. Chen, H. Cheng, and Y.-D. Yao, "Cooperative spectrum sensing in cognitive radio networks in the presence of the primary user emulation attack," IEEE Transactions on Wireless Communications, vol. 10, no. 7, pp. 2135–2141, 2011.

[13] L. Zhang, G. Ding, Q. Wu, Y. Zou, Z. Han, and J. Wang, "Byzantine attack and defense in cognitive radio networks: A survey," IEEE Communications Surveys & Tutorials, vol. 17, no. 3, pp. 1342–1363, 2015.


[14] J. Liu, C. Zhang, H. Ding, H. Yue, and Y. Fang, "Policy-based privacy-preserving scheme for primary users in database-driven cognitive radio networks," in Global Communications Conference (Globecom). IEEE, 2016, pp. 1–6.

[15] Y. Dou, K. C. Zeng, H. Li, Y. Yang, B. Gao, C. Guan, K. Ren, and S. Li, "P2-SAS: preserving users' privacy in centralized dynamic spectrum access systems," in Proceedings of the 17th ACM International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc). ACM, 2016, pp. 321–330.

[16] C. Dwork, "Differential privacy: A survey of results," in International Conference on Theory and Applications of Models of Computation. Springer, 2008, pp. 1–19.

[17] M. E. Andrés, N. E. Bordenabe, K. Chatzikokolakis, and C. Palamidessi, "Geo-indistinguishability: Differential privacy for location-based systems," in Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security. ACM, 2013, pp. 901–914.

[18] N. E. Bordenabe, K. Chatzikokolakis, and C. Palamidessi, "Optimal geo-indistinguishable mechanisms for location privacy," in Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security. ACM, 2014, pp. 251–262.

[19] Y. Xiao and L. Xiong, "Protecting locations with differential privacy under temporal correlations," in Proceedings of the 22nd ACM SIGSAC Conference on Computer and Communications Security. ACM, 2015, pp. 1298–1309.

[20] K. Chatzikokolakis, C. Palamidessi, and M. Stronati, "Constructing elastic distinguishability metrics for location privacy," Proceedings on Privacy Enhancing Technologies, vol. 2015, no. 2, pp. 156–170, 2015.

[21] G. Kellaris, S. Papadopoulos, X. Xiao, and D. Papadias, "Differentially private event sequences over infinite streams," Proceedings of the VLDB Endowment, vol. 7, no. 12, pp. 1155–1166, 2014.

[22] Q. Wang, Y. Zhang, X. Lu, Z. Wang, Z. Qin, and K. Ren, "RescueDP: Real-time spatio-temporal crowd-sourced data publishing with differential privacy," in International Conference on Computer Communications (INFOCOM). IEEE, 2016, pp. 1–9.

[23] L. Yu, L. Liu, and C. Pu, "Dynamic differential location privacy with personalized error bounds," Network and Distributed System Security Symposium (NDSS), 2017.

[24] R. Shokri, "Privacy games: Optimal user-centric data obfuscation," Proceedings on Privacy Enhancing Technologies, vol. 2015, no. 2, pp. 299–315, 2015.

[25] C. Dwork, F. McSherry, K. Nissim, and A. Smith, "Calibrating noise to sensitivity in private data analysis," in Theory of Cryptography Conference. Springer, 2006, pp. 265–284.

[26] J. Liu, C. Zhang, and Y. Fang, "EPIC: A differential privacy framework to defend smart homes against internet traffic analysis," IEEE Internet of Things Journal, vol. 5, no. 2, pp. 1206–1217, 2018.

[27] Q. Wang, Y. Zhang, X. Lu, Z. Wang, Z. Qin, and K. Ren, "Real-time and spatio-temporal crowd-sourced social network data publishing with differential privacy," IEEE Transactions on Dependable and Secure Computing, vol. 15, no. 4, pp. 591–606, 2018.

[28] NTIA, "3.5 GHz exclusion zone analyses and methodology," 2015. [Online]. Available: https://www.ntia.doc.gov/report/2015/35-ghz-exclusion-zone-analyses-and-methodology

[29] P. Kalnis, G. Ghinita, K. Mouratidis, and D. Papadias, "Preventing location-based identity inference in anonymous spatial queries," IEEE Transactions on Knowledge and Data Engineering, vol. 19, no. 12, pp. 1719–1733, 2007.

[30] U. FCC, "Longley-Rice methodology for evaluating TV coverage and interference," OET Bulletin, vol. 69, 2004.

[31] J. M. Robson, "Algorithms for maximum independent sets," Journal of Algorithms, vol. 7, no. 3, pp. 425–440, 1986.

[32] J. K. Lawder and P. J. H. King, "Querying multi-dimensional data indexed using the Hilbert space-filling curve," ACM SIGMOD Record, vol. 30, no. 1, pp. 19–24, 2001.

[33] G. Ghinita, P. Kalnis, and S. Skiadopoulos, "PRIVE: anonymous location-based queries in distributed mobile systems," in Proceedings of the 16th International Conference on World Wide Web. ACM, 2007, pp. 371–380.

[34] L. Fan and L. Xiong, "An adaptive approach to real-time aggregate monitoring with differential privacy," IEEE Transactions on Knowledge and Data Engineering, vol. 26, no. 9, pp. 2094–2106, 2014.

[35] M. King, Process Control: A Practical Approach. John Wiley & Sons, 2016.

[36] U. Feige and E. Ofek, "Finding a maximum independent set in a sparse random graph," Lecture Notes in Computer Science, vol. 3624, p. 282, 2005.

[37] J. Liu, H. Ding, Y. Cai, H. Yue, Y. Fang, and S. Chen, "An energy-efficient strategy for secondary users in cooperative cognitive radio networks for green communications," IEEE Journal on Selected Areas in Communications, vol. 34, no. 12, pp. 3195–3207, 2016.

Jianqing Liu (M'18) received the Ph.D. degree from the University of Florida in 2018 and the B.Eng. degree from the University of Electronic Science and Technology of China in 2013. He is currently an assistant professor in the Department of Electrical and Computer Engineering at the University of Alabama in Huntsville. His research interest is to apply cryptography, differential privacy, and convex optimization to design secure and efficient protocols for networked systems.

Chi Zhang received the B.E. and M.E. degrees in electrical and information engineering from the Huazhong University of Science and Technology, China, in 1999 and 2002, respectively, and the Ph.D. degree in electrical and computer engineering from the University of Florida in 2011. He joined the School of Information Science and Technology, University of Science and Technology of China, as an Associate Professor in 2011. His research interests include network protocol design and performance analysis, and network security, particularly for wireless networks and social networks. He received the 7th IEEE ComSoc Asia-Pacific Outstanding Young Researcher Award. He is a member of the IEEE.

Beatriz Lorenzo (S'08–M'12) received the M.Sc. degree in telecommunication engineering from the University of Vigo, Spain, in 2008, and the Ph.D. degree from the University of Oulu, Finland, in 2012. Since 2014, she has been a Senior Researcher with the Atlantic Research Center for Telecommunication Technologies, University of Vigo. Her research interests include wireless networks, network architectures and protocol design, mobile computing, optimization, and network economics. She received the Fulbright Visiting Scholar Fellowship, University of Florida, from 2016 to 2017.

Yuguang Fang (F'08) received the M.S. degree from Qufu Normal University, China, in 1987, the Ph.D. degree from Case Western Reserve University in 1994, and the Ph.D. degree from Boston University in 1997. He joined the Department of Electrical and Computer Engineering, University of Florida, in 2000, where he has been a Full Professor since 2005. He held a University of Florida Research Foundation Professorship from 2006 to 2009, a Changjiang Scholar Chair Professorship with Xidian University, China, from 2008 to 2011 and with Dalian Maritime University since 2015, and a Guest Chair Professorship with Tsinghua University, China, from 2009 to 2012. He is a fellow of the IEEE and AAAS. He was the Editor-in-Chief of the IEEE Transactions on Vehicular Technology from 2013 to 2017 and of IEEE Wireless Communications from 2009 to 2012.