The interdisciplinary Research Group on Law, Science, Technology & Society
Vrije Universiteit Brussel
Katja de Vries
Prof. Dr. Mireille Hildebrandt
Deliverable D9.7 – Report on the Legal Framework of the Use of SMTs at EU and International Level
SIAM – Security Impact Assessment Measures
WP 9 – Legal Frameworks – Regulative Techniques
Project number: 261826
Call (part) identifier: FP7-Security-2010-1
Funding scheme: Collaborative Project
Table of Contents
List of figures…..…………………….........................................................................…....... p. 7
List of tables in the annex………………………………………………………………............…......... p. 8
Summary ………………………………………………….……………………………………............…......... p. 9
Chapter 1. The legal compatibility of SMTs – a broader view in a double sense …. p. 11
1.1 A broader view in terms of the legal framework …..………………………....... p. 14
1.1.1 Legal norms do not exist in isolation: law as a
“system” or “legal totality”...................................................………..... p. 14
1.1.2 The legal framework of the Council of Europe (CoE) –
and why it is of specific importance to the SIAM
assessment process ............................................................ p. 28
1.1.3 The EU legal framework – and why it is of specific
importance to the SIAM assessment process ...................... p. 29
1.1.4 Fundamental Rights Impact Assessment and Legal Protection
by Design in the EU legal framework (with special attention to
their role in Data Protection)............................................................ p. 31
(a) LPbD in EU data protection: Data Protection by Design ... p. 32
(b) IA in EU data protection: Data Protection Impact
Assessment (DPIA) …………………………………………………… p. 36
(c) IA and LPbD with regard to other fundamental rights… p. 38
1.2 A broader view in terms of the relation between SMTs and legal
normativity: KORA as one particular form of Legal Protection by Design
(LPbD) .......................................................….................................................. p. 39
1.2.a. Theoretical assumptions underlying the designing of
norms into the architecture of an SMT.……..………............................. p. 41
1.2.b. The particularities of designing legal norms, especially
those expressing fundamental rights, into SMT architecture........... p. 43
1.2.c. Assessing how the notions “privacy by design” and
“privacy by default” fit in the broader field of LPbD........................... p. 44
1.2.d. Assessing how the KORA method fits in the broader
field of LPbD....................................................................................... p. 46
Chapter 2. The legal frameworks of fundamental rights of the Council of Europe and
the EU. Assessing the legal compatibility of SMTs with European fundamental rights
(examples: Smart CCTV and Passenger Profiling) and inferring LPbD implications from
them (example: Smart CCTV) ……………………….......................................................... p. 48
PART 1:
General observations about the legal frameworks of
fundamental rights of the Council of Europe and the EU.
2.1 Some general observations……………..……………………………………………..…. p. 50
2.1.1 The relation between D4.2 and D9.7..................................... p. 50
2.1.2 Fundamental rights – some introductory remarks………………….. p. 51
2.1.3 Smart CCTV and Passenger Profiling – two examples that are
looked at in more detail..................................................................... p. 51
2.2 Differences in legal effect of (i) the European Convention on Human Rights,
(ii) Convention 108, (iii) the EU Charter of Fundamental Rights and (iv) EU
Directives, Framework Decisions and Regulations ……………..…................. p. 57
2.3 Proportionality and Fair Balancing.........……………………........................ p. 65
2.3.1 In the case law of the ECtHR and ECJ ..................................... p. 65
2.3.2 In LPbD and IA......................................................................... p. 73
2.3.3 Proportionality in the quadruple structure of the legal
compatibility analysis in 2.4 and 2.5................................................ p. 77
2.3.4 A typology of technological and organizational design
implications...................................................................................... p. 79
PART 2:
Fundamental rights of the Council of Europe and the EU.
Assessing the legal compatibility of SMTs with them (examples: Smart
CCTV and Passenger Profiling) and inferring LPbD implications from them
(example: Smart CCTV)
2.4 Council of Europe: the ECHR....………………………………………………………... p. 83
a. Art. 2: Right to life
b. Art. 3: Prohibition of torture
c. Art. 5: Freedom from unlawful detention
d. Art. 6: Presumption of innocence and fair trial
e. Art. 8: Respect for private and family life:
f. Art. 9(1): Freedom of thought, conscience and religion
g. Art. 14: Prohibition of discrimination with regard to the exercise
of other human rights
2.5 The EU: the EU Charter of Fundamental Rights and secondary legislation with
regard to the protection of fundamental rights……………………………....…..... p. 116
2.5.1 The EU Charter of Fundamental Rights (CFREU)................... p. 118
a. Art. 1: Human dignity
b. Art. 2(1): Right to life
c. Art. 3(1): Right to the integrity of the person
d. Art. 4: Prohibition of torture and inhuman or degrading treatment
e. Art. 6: Right to liberty and security
f. Art. 7: Respect for private and family life
g. Art. 8: Protection of personal data
h. Art. 10(1): Freedom of thought, conscience and religion
i. Art. 21: Non-discrimination
j. Art. 24(2): The rights of the child
k. Art. 25: The rights of the elderly
l. Art. 26: Integration of persons with disabilities
m. Art. 35: Health care
n. Art. 45(1): Freedom of movement
o. Art. 47: Right to an effective remedy and to a fair trial
p. Art. 48: Presumption of innocence and right of defense
2.5.2 EU secondary law with regard to Fundamental Rights............... p. 134
(a) Data Protection……………………………..…………………….......... p. 134
Data protection Directive 95/46/EC
Framework Decision 2008/977/JHA
Proposed General data protection Regulation
Proposed Law Enforcement Data Protection Regulation
(b) Anti-discrimination.............................................……….................... p. 146
Racial equality Directive 2000/43/EC
Employment equality Directive 2000/78/EC
Gender Recast Directive 2006/54/EC
Gender Goods and Services Directive 2004/113/EC
Proposed Equal Treatment Directive COM (2008) 426
(c) Freedom of movement..........................................................…....... p. 152
Directive 2004/38/EC on the right to move
and reside freely
Annex………..…………………………………………….………………………............…........................ p. 154
Internet resources of relevant legislative texts at the European level for the
SIAM Database (organized according to the freedom infringement typology
presented in D4.2)…………………………………………………………………………......................... p. 164
Bibliography…………………………………………….………………………............…........................ p. 169
List of figures
Fig. 1 – How in D9.7 the “background” of D9.2-D9.6 and D9.8 is foregrounded…… p. 13
Fig. 2 – The inductively inferred Freedom Infringements of D4.2 mapped against
some of the relevant rights and legal instruments of the EU and CoE legal
frameworks...................................................................................................p. 27
Fig. 3 – IA and LPbD. Two ways of bringing SMTs and Fundamental Rights into
alignment with each other and preventing clashes between them………....p. 32
Fig. 4 – Discrepancies and overlaps between the categories of sensitive data
and the prohibited grounds for discrimination………………………………………... p. 144
Fig. 5 – The different protective scopes for the various prohibited grounds
of discrimination...........................................................................................p. 148
List of tables in the annex
Table 1 – Compatibility of Passenger Profiling with relevant ECHR rights....................... p.155
Table 2 – Compatibility of Smart CCTV with relevant ECHR rights................................... p.156
Table 3 – LPbD for Smart CCTV based on relevant ECHR rights....................................... p. 157
Table 4 – Compatibility of Passenger Profiling with relevant CFREU rights......................p. 158
Table 5 – Compatibility of Smart CCTV with relevant CFREU rights..................................p. 159
Table 6 – LPbD for Smart CCTV based on relevant CFREU rights.......................................p. 160
Table 7 – Compatibility of Passenger Profiling with relevant secondary EU legislation...p. 161
Table 8 – Compatibility of Smart CCTV with relevant secondary EU legislation ..............p. 162
Table 9 – LPbD for Smart CCTV based on relevant secondary EU legislation ..................p. 163
Summary
In order to assess and/or increase the level of compatibility of SMTs with European
fundamental rights, legal normativity has to be translated and articulated into SMT
architecture. Legal normativity operates by immanently (re-)constructing the legal
framework to which a legal norm belongs: that is, legal norms are never interpreted in
isolation but always in relation to their legal pedigree, to other legal norms and to the
specifics of an individual case. Legal normativity functions in a way that is distinctly different
from other types of normativity, such as technological normativity, which works through a
folding-in or “black-boxing” of its pedigree (e.g., a body scanner “works” independently from
the institutional, political and legal steps that lead up to its acquisition) and through a high
level of autonomy towards “contextual” factors (e.g., the body scanner is “indifferent” to
whoever passes the scanner: it “treats” all alike).
Because legal norms protecting fundamental rights are not organized in a hierarchical
way, their mode of operation is less about pedigree and more about proportionality testing
and fair balancing. This requires an imaginative and constructivist approach: establishing
whether an infringement on a fundamental right is justified or resolving conflicts between
several fundamental rights should preferably not be done in terms of a zero-sum game but
in terms of convergence, reconciliation, the construction of an optimal composition and of a
win-win situation.
The question addressed in this deliverable is how to preserve the legal (systematic-
fractal) mode of operation, specifically the one of proportionality testing and fair balancing,
when translating the legal requirements of fundamental rights into technological or
organizational requirements (Legal Protection by Design). We relate this question to existing
forms of LPbD, such as privacy and data protection by design (and by default), and to the
KORA method presented in D9.2.
The theoretical first half of this deliverable (chapter 1) is followed by a practical and applied
one (chapter 2) in which the compatibility between SMTs and the relevant European
fundamental rights is assessed. The theoretical findings of chapter 1 are applied in the
analysis of chapter 2 by looking at the limitations to a fundamental right, and the possible
conflicts and convergences with other rights. Moreover, the analysis presented in chapter 2:
- reviews the European fundamental rights framework, describing the fundamental
rights and freedoms protected by the European Convention for the Protection of
Human Rights and Fundamental Freedoms (Council of Europe), by the EU Charter of
Fundamental Rights and by various secondary EU instruments concerning data
protection, anti-discrimination law and freedom of movement;
- discusses the case law that is relevant for SMTs;
- maps the inductively generated typology of freedom infringements presented in D4.2
onto the European fundamental rights framework;
- assesses the territorial reach, protective scope and applicability of each of the
relevant legal instruments (including a clarification of the “horizontal effect” of
fundamental rights and the “positive obligations” that they can impose on States);
- presents ways in which proportionality testing and fair balancing can be taken into
account when performing a fundamental rights impact assessment (FRIA) or when
creating LPbD;
- assesses the compatibility of SMTs with the relevant European fundamental rights
and freedoms;
- illustrates this compatibility analysis by paying specific detailed attention to the legal
compatibility of two particular SMTs: smart CCTV and passenger profiling;
- shows which legal design implications could be drawn from the fundamental
rights frameworks of the EU and the CoE for smart CCTV.
The compatibility assessments and LPbD analyses of chapter 2 are illustrated with tables
which are presented in the annex.
Chapter 1.
The legal compatibility of SMTs – a broader view in a double sense
As stated in the WP9 Procedural Guidelines, the general purpose of WP9 is:
“… to gain an understanding of the legal framework that regulates the use of SMTs.
Its aim is to find out whether implemented techniques actually achieve their
objectives and whether they are ready to face the challenges of evolving, more
intensive and extensive SMTs.” (p. 1)
In WP9, an understanding of the legal framework regulating the use of SMTs is gained
through a bottom-up, empirical approach in the SIAM case studies (D9.3, D9.4, D9.5, D9.6
and D9.8): data are mainly gathered through interviews and workshops with experts,
stakeholders and practitioners who are involved with the regulation of SMTs at each of the
four case study sites. The bottom-up, empirical findings are presented against the
background of the KORA method (“Concretization of Legal Requirements”, D9.2). Given the
fact that the legal framework regulating SMTs is difficult to grasp for actors lacking expert
legal knowledge, the application of the KORA method to the SIAM case studies offers a
refreshing and much needed level of concreteness. The concreteness that animates WP9 is a
concreteness in a double sense: it follows both from the process of concretization fostered
by the KORA method – what’s in a name! – as well as from the fact that this method is
applied to the concrete SIAM case studies (London Tube, Turin Metro, Berlin Brandenburg
International Airport and Ben Gurion International Airport). Such concreteness is very much
called for when developing a tool that strives to offer support to actors who are faced with
the concrete decision of choosing the most appropriate SMT.
However, when looking at the concrete case studies through the lens of the KORA
method, the analysis is limited to the SMTs used at the site of the case study, the particular
(national) regime governing the SMTs at this site, and the specific relationship between law
and technology as set out by the KORA method. Without abandoning the commitment to
concreteness that guides work package 9 as a whole, D9.7 aims to offer a broader
outlook on the applicable legal framework and on the methodological ways of approaching
the relation between law and SMTs. It should be stressed that this “broadening” is not
meant as a euphemism for abstraction. D9.7 does not zoom out into more general
observations but shifts the spotlight (see figure 1) onto (1) the legal frameworks of the EU
and of the Council of Europe, and (2) onto how the KORA method belongs to a broader field
of legal protection by design (LPbD).
These two matters – that is, firstly, the systemic interpretation of applicable norms
on the level of European fundamental rights and, secondly, the broader philosophical
assumptions about the relation between legal norms and their embodiment – form the
horizon against which D9.2-D9.6 and D9.8 should be understood. Thus, to put it more
precisely, the two aforementioned European legal frameworks and the large conceptual
category of LPbD operate as the constitutive periphery of the analyses of D9.2-D9.6 and
D9.8. It is only against the background of this periphery that (a) the KORA method as an
instrument for the evaluation and design of SMTs (D9.2) and, (b) the assessment of the
compatibility between the SMTs and the relevant fundamental rights provisions (D9.8)
based on the case study reports of D9.3-D9.6, can emerge.
Why is it important to complement D9.2-9.6 and D9.8 with an overview (1) of the
legal framework of the EU and of the Council of Europe and (2) of the broader conceptual
category “legal protection by design” (LPbD)? The answer to the first part of this question
lies mainly in the inherently “systematic” nature of law (explored in more detail in section
1.1), which risks getting lost when assessing SMTs only according to their compatibility with
isolated requirements derived from legal norms. The answer to the latter part of this
question (explored in more detail in section 1.2) is mainly to clarify some of the strengths
and limitations of the KORA method as compared to other forms of LPbD.
Figure 1. The analyses presented in D9.2-D9.6 and D9.8 are informed and shaped by (a) the legal
framework of the Council of Europe and the EU, and (b) the KORA method, which is one
possible way of realizing legal protection by design (LPbD). In D9.7 this “background” is
foregrounded.
1.1 A broader view in terms of the legal framework
Why is it important to complement D9.2-9.6 and D9.8 with an overview of the legal
fundamental rights framework(s)1 of the EU and of the Council of Europe2? This question can
be divided into two sub-questions: firstly, why it is not sufficient to study legal norms in
isolation and what should be understood by a “legal framework” (section 1.1.1), and,
secondly, why the legal frameworks of the Council of Europe (section 1.1.2) and the EU
(sections 1.1.3 and 1.1.4) are of specific importance to the SIAM assessment process.
1.1.1 Legal norms do not exist in isolation: law as a “system” or “legal
totality”
One reason for providing an overview of the legal frameworks of the EU and of the Council
of Europe is that it offers some practical guidance with regard to any other legal aspects
discussed in the SIAM project. Because these two legal frameworks inform the majority of
applicable legal norms within any airport and public transportation system in the member
states of the EU3 and the Council of Europe4, their systematic presentation provides a
1 In principle we discuss two separate legal frameworks: the legal framework of the EU should be distinguished from the one of the Council of Europe, as they both derive their legal force from different sources. However, especially since the entry into force of the Lisbon Treaty, the ties between the two frameworks are becoming closer. The most important step in this regard is that the Treaty of Lisbon (Art. 6(2)) obliges the EU to accede to the European Convention on Human Rights (ECHR) of the Council of Europe (CoE). Another important connection between the two frameworks is created by the fact that in the EU Charter of Fundamental Rights (Art. 52(3)) it is specified that, when equivalent fundamental rights exist in the ECHR, the interpretation of these ECHR rights sets a minimum standard for the understanding of the corresponding CFREU rights. Finally, the EU Court of Justice (ECJ) and the European Court of Human Rights (ECtHR) are known for paying close attention to each other’s case law, and occasionally referring to it. Thus, on occasion the EU legal framework incorporates the ECHR (an instrument of the CoE) into its own legal body. Because, as we explain in the second half of section 1.1.1, a legal framework can be understood as an immanent entity, i.e., not existing in abstracto but only in relation to a concrete case, we can say that, while the CoE and EU legal frameworks normally act as two separate frameworks, they can on occasion act as one unified legal framework.

2 We use the expression “legal fundamental rights framework of the Council of Europe” as a totum pro parte. We do not discuss all fundamental rights instruments of the Council of Europe, but focus on the most important one: the European Convention on Human Rights (ECHR). Next to the ECHR we also give a concise description of CoE Convention 108 on data protection (see below, section 2.2.2).

3 At present (August 2013) the 28 member states of the EU are (between brackets the date of their entry): Austria (1995), Belgium (1952), Bulgaria (2007), Croatia (2013), Cyprus (2004), Czech Republic (2004), Denmark
guiding structure for the reader to resort to when reading the immanent, piecemeal legal
analyses presented in the other deliverables of work package 9, and the inductively
generated typology of freedoms5 presented in D4.2 (see figure 2).
However, an even more important reason for providing an overview of these two
legal frameworks has to do with the nature of legal norms as such: legal norms never exist in
isolation and can only be understood within a particular legal framework. That is, in order to
operate as a legal norm, the relevant part of the legal “system” or legal “totality”6, to which
these legal norms belong and from which they derive their legal force, has to be mobilized. It
should therefore be noted that when assessing the legal compatibility of SMTs, the
assessment can never be limited to matching one SMT with one legal norm. An assessment
of the legal compatibility of an SMT never concerns a legal norm in isolation, but takes
place through the act of interpreting a norm as part of the legal system or legal totality to
which it belongs. The exact nature of the legal “system” or “totality” (what it is made of, how
it comes into being, how it persists in existence and what differentiates it from other
“systems”) is subject to extensive theoretical debates. Is the legal system an order of norms
invented by legal scholars who create a logical and hierarchic order from the chaotic social
(1973), Estonia (2004), Finland (1995), France (1952), Germany (1952), Greece (1981), Hungary (2004), Ireland (1973), Italy (1952), Latvia (2004), Lithuania (2004), Luxembourg (1952), Malta (2004), Netherlands (1952), Poland (2004), Portugal (1986), Romania (2007), Slovakia (2004), Slovenia (2004), Spain (1986), Sweden (1995), United Kingdom (1973).

4 At present (August 2013) the 47 member states of the Council of Europe are: Albania, Andorra, Armenia, Austria, Azerbaijan, Belgium, Bosnia and Herzegovina, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Georgia, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Liechtenstein, Lithuania, Luxembourg, Malta, Republic of Moldova, Monaco, Montenegro, Netherlands, Norway, Poland, Portugal, Romania, Russian Federation, San Marino, Serbia, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, The former Yugoslav Republic of Macedonia, Turkey, Ukraine, United Kingdom. All of these 47 member states fall under the jurisdiction of the European Court of Human Rights (ECtHR) in Strasbourg and are legally bound by the European Convention on Human Rights.

5 The seven types of Freedoms identified in D4.2 are (p. 9): (a) Bodily integrity, (b) Equal Treatment and non-discrimination, (c) Freedom of movement, (d) Freedom from unlawful detention, (e) Presumption of innocence, (f) Fair trial and due process, and (g) Privacy and data protection.

6 We use the terms “legal system” and “legal totality” as interchangeable synonyms. However, it should be noted that legal system is the more frequently used expression in legal theory and jurisprudence (see especially: Hart, 1997; Kelsen, 2005). The reason for using the term legal totality is that it is a more open concept: it acknowledges the permanent reference to the body of law as a whole but does not incorporate an assumption about how the legal totality is organized (whether it is organized in a hierarchical way and thus “truly” systematic or not) and what it exactly consists of (rules, norms, principles, techniques, etc.). This is especially important when “balancing” fundamental rights. Fundamental rights, as we argue in this deliverable, do not allow for a hierarchical ordering between them. Nevertheless, the proportionality reasoning used to solve conflicts between them is not any less legal because of this lack of hierarchy and builds on the idea of a particular “coherence” between the various rights. One could call this “coherence” a “legal system”, but this would require that the term “system” is understood in a non-hierarchical way. To avoid confusion, in the context of fundamental rights it might be easier to speak of “legal totality” instead.
reality of legislation and case law? (cf. Kelsen, 2005) Is law a systematized whole of social
rules, and is the legal validity of a rule dependent on whether it can be traced
back to an ultimate social rule that recognizes it as a legal rule? (cf. Hart, 1997) Is it the
image of law as “a seamless web” (Dworkin, 1978, p. 115), and the subsequent “integrity” of
law (Dworkin, 1998), invented through an interpretative attitude in adjudication that unfolds
according to “two dimensions of integrity” (Dworkin, 2006, p. 264): firstly that a judge
should strive to optimize the consistency of a judgment with its legal precedents, and
secondly that a legal interpretation of the body of law provides it with a justification that is
best “from the point of view of political morality” (Dworkin, 1998, p. 411)? Despite the vast
theoretical differences separating legal theorists like Kelsen, Hart and Dworkin, they could all
agree that when law is mobilized this means that a legal totality is mobilized with
regard to an individual case. (Latour, 2010, p. 256) Of course, this is not to say that every
legal case has to discuss every existing piece of legislation, doctrine and case law – that
would be both absurd and physically impossible. It does mean, however, that every legal
case has to articulate its connection to “the” Law as a coherent, and maybe even systematic,
totality. In the rulings of contemporary Western Courts, which include the European Court
of Human Rights (ECtHR) and the EU Court of Justice (ECJ), this connection is made under
headings such as “The Law”, “Legal Context” or “Legal Grounds”. The connection will often
be partly implicit, but without it all rulings would lose their legal force. It is this connection
between legal totality and individual case that explains why “[l]aw is fractal” (idem, p. 256):
legal reasoning requires that an isolated legal norm or local case is interwoven with law in its
entirety. “[B]y talking about Law we always talk about the entire corpus of law.” (idem, p.
258) This systematic-fractal nature of law is exemplified by how a judge always has to
consider the effect of his ruling on two levels: the effect in the individual case brought
before the judge and the effect on (case) law as a whole.
In contrast to law, the normative force and interpretation of an informal social norm
(e.g., that it is impolite to burp during dinner) or a norm embedded in technology (e.g., the
lock on a door “embodies” the norm that people without a key should not be able to enter)
are not dependent on whether a legal totality is mobilized. A legal grounding, explicitly
attaching isolated facts and norms to the legal system as a whole, is only needed when law
is mobilized (in legal adjudication). Of course, one might object that no norm can be
understood in isolation: every interpretation requires a "fore-conception of completeness”
(“Vorgriff der Vollkommenheit”, Gadamer, 2004, p. 294), which assumes that the object of
interpretation is a coherent whole, and every interpretation mobilizes some totality (e.g., a
line from the New Testament is interpreted in line with the spirit of the Old and New
Testament as a whole or the informal social norm against burping is interpreted through the
lens of the principles underlying a particular Western middle class etiquette as a whole).
Moreover, one could also object that the interpretation of legal norms, like any other norm,
does not only depend on a systematic interpretation but also relies on grammatical,
historical and teleological interpretations (see chapter 2.1.3 of D9.2 and Larenz, 1992).
Nevertheless, legal interpretation distinguishes itself from other interpretative practices by
its emphasis on legally systematic interpretation. It is only with legal reasoning that a legal
totality has to be mobilized: for example, interpreting the right to respect for private life (art.
8 ECHR) from an anthropological perspective7 (linking it to the totality of a particular culture or
society) or in line with the New Testament (linking it to the totality of a particular text) does not
make a connection to the legal totality. That legal reasoning requires the mobilization of a
legal totality sounds like a hopeless tautology as long as we do not specify what makes a
totality legal. If we were to define what makes a particular totality legal, we would have to
plunge into the aforementioned deep theoretical issues again. For example: does a legal
totality have to be backed up by a “coercive authority” (Hildebrandt, 2008b, p. 174)
in order to be legal? Or can a legal totality exist in a non-state society? Is a legal totality a
semiotic fiction (that is, “the Law” is postulated as an enunciator, allowing legal practitioners
to claim that “The Law says that….”8) immanently constructed by adhering to a particular
7 This is not to say that an anthropological perspective on law is not interesting or helpful – just that it’s not a legal perspective. Already in the early eighteenth century Montesquieu noted the extent to which laws are influenced by their social and natural setting: “Laws should be so appropriate to the people for whom they are made that it is very unlikely that the laws of one nation can suit another. Laws must relate to the nature and the principle of the government that is established or that one wants to establish […]. They should be related to the physical aspect of the country; to the climate, be it freezing, torrid, or temperate; to the properties of the terrain, its location and extent; to the way of life of the peoples, be they plowmen, hunters, or herdsmen; they should relate to the degree of liberty that the constitution can sustain, to the religion of the inhabitants, their inclinations, their wealth, their number, their commerce, their mores and their manners; finally, the laws are related to one another, to their origin, to the purpose of the legislator, and to the order of things on which they are established. They must be considered from all these points of view.” (Montesquieu, 1748/1989, p. 8-9)

8 When “The Law” is understood as a postulated enunciator, and thus as a semiotic fiction, the word “fiction” should not be understood in a pejorative way, as something that lacks reality or objectivity and belongs to the realm of phantasms and fairytales. To say that “The Law says that….” requires many legal operations such as providing a convincing legal qualification of the facts, establishing sufficient proof, interpreting and balancing norms, etc.
way of legal reasoning and particular legal “techniques”9, or is a legal totality something that
has a material and organizationally embodied existence10 and therefore only really came
into being after the invention of the printing press, which provided the necessary material
support for a satisfactory level of legal certainty, and after the emergence of the modern
state, which guarantees an exclusively legal domain – exempt from moral and political
pressure?
9 This position has recently been defended by Latour (2010, p. 254). If one puts the stress on the legal operations and techniques that bring about the totality of law as a semiotic fiction (De Vries & Van Dijk, 2013; Greimas & Landowski, 1990), the origins of law can be traced to the Roman republic. (Schiavone, 2012; Thomas, 2011) Schiavone argues that law (ius), as “a sort of concrete rationalism” (p. 62) constituting a sphere separate from religion and politics, is a Roman invention that fully developed during the middle and late republic (274 BC – 30 BC). “[I]t was Roman law alone that provided the paradigm enabling us to recognize as ‘legal’ the prescriptive practices that were originally integral parts of radically different contexts and systems – theological apparatuses with varying links to royalty, kinship ties, and political institutions. However, it was only in Rome that the ordering inevitably found in any human community was subjected at an early point to a strict specialization, in turn transformed into a strong grounded social technology which identified, once and for all, the juridical function and its experts, the ‘jurists’ (a word unknown to any ancient language but Latin), detaching them from any other cultural production or institutional center – from religion, morals, or even politics – and endowing them with a clear, autonomous, and definite identity. From then on, law would be seen in every depiction and image, even the simplest and most unassuming, as something entirely apart – a compact, impenetrable corpus – and would always be distinguished by the delineating of regulatory devices with a special and powerful rationality. Its separatedness came to be regarded as a peculiar feature of the West: around this isolation an extraordinary ideological discourse quickly took shape to recast it as ‘independence’ and ‘neutrality’ – of norms, procedures, judges – making it one of the underlying values of our civilization.” (Schiavone 2012, p. 3-4)

10 This position has been extensively defended by one of the authors of this deliverable: see particularly Hildebrandt (2008). “The systematic nature of modern legal systems builds on the need for systemisation, rationalisation and linear thinking that is inherent in the affordances of the printing press.” (Hildebrandt, 2011a, p. 236) It should be noted that Hildebrandt makes a link between the printing press and modern law: clearly, there existed law before the printing press, but it lacked the rationalized and systematic interpretative dynamics between the individual case and “written, enacted codes and authoritative, written judgments.” (Hildebrandt, 2013b, p. 14) Hildebrandt argues that this particular interpretative legal dynamics is a modern characteristic of law and only emerged after the invention of the printing press and the advent of the Modern state. The question is, thus, whether the legal interpretative fractal dynamics (going back-and-forth between individual case and legal system) could exist in an oral, non-state society or whether these dynamics can only be found in modern law because they depend on a strong material support (writing as such only offers limited support in comparison to the printing press) in which a legal system can exist and a clear delineation of the scope of the power of the legal system (the Modern state in which there is a separation of powers and the rule of law is backed by State force). The answer to this question is an empirical one, which has not been conclusively answered yet. It should, however, be noted that the answer is likely to show a process of gradual and local change; of a historical development in which the fractal interpretative dynamics slowly became more important to law.

For example, Vismann (2008) stresses the importance of the media embodying law, but instead of merely focusing on the printing press she looks at how the development of various filing systems – from antiquity to our present day – has allowed for the emergence of something like a “totality of law” or a “legal system”. See also Latour (2013, p. 371): “The diffusion of writing has certainly made these traces [which allow acts to be linked to legal qualifications and to be attributed to legal subjects] easier to follow and to archive, but even among peoples said to be “without writing,” the anthropology of law attests to hundreds of astonishing procedures for attaching promises to their authors by solemn oaths and imposing rituals. On this point, writing has only accentuated the habit of already well-established links. Which explains, moreover, why even the most exotic collectives have always been recognized as perfectly capable of producing law.”
For example, Vismann (2008) stresses the importance of the media embodying law, but instead of merely focusing on the printing press she looks at how the development of various filing systems – from antiquity to our present day – has allowed for the emergence of something like a “totality of law” or a “legal system”. See also Latour (2013, p. 371): “The diffusion of writing has certainly made these traces [which allow acts to be linked to legal qualifications and to be attributed to legal subjects] easier to follow and to archive, but even among peoples said to be “without writing,” the anthropology of law attests to hundreds of astonishing procedures for attaching promises to their authors by solemn oaths and imposing rituals. On this point, writing has only accentuated the habit of already well-established links. Which explains, moreover, why even the most exotic collectives have always been recognized as perfectly capable of producing law.”
While giving an exhaustive definition of the nature of a legal system or totality is clearly
beyond the scope of this deliverable, it is important to understand that a legal norm does
not operate in isolation: a legal norm always operates by mobilizing a cluster consisting of
legal norms, rules, principles, procedural formats, etc. Such a cluster is first and foremost
held together by deriving its legal force from the same source, but can additionally also be
bound together by shared legal principles, institutions, precedents or a common matter of
concern. When the same cluster is mobilized over and over again it might gain a certain
recognizable stability. It is then that one could call it “a legal framework.” Thus, for example,
one can say that a legal norm should be understood within the context of the German
constitutional framework, the framework of the European Convention of Human Rights
(ECHR) or the framework of EU labor law. The legal framework is that part of the legal
totality that is mobilized when a legal norm becomes active (invoked and interpreted). Thus,
we can conclude that a legal framework is an immanent part of a legal totality (i.e., legal
system), and that we can study it without getting into philosophical discussions about
whether the legal totality (i.e., legal system) to which it belongs is semiotically presumed,
materially embodied or both.
In this deliverable we focus our attention on the fundamental rights frameworks of
the Council of Europe and of the EU. These frameworks are not static givens but are
permanently rearticulated and reframed. They exist as a mixture of legislation, case law,
legal doctrine and legislative documentation, printed in hard copy or stored on servers, and
the practices of the officials and practitioners engaging with these documents.
While attentiveness to the legal framework is always important when dealing with
legal norms, this is particularly the case when dealing with norms that express fundamental
rights.
One of the important reasons for this is that fundamental rights often clash – or at
least seem to clash – with each other. For example, when passengers at an airport are
screened by means of a passenger profiling system, from the perspective of anti-
discrimination rights it might be beneficial to keep track of the ethnicity, age and gender of
all flagged passengers (this is how one can monitor whether the system is discriminatory), and, at
the same time, from the perspective of data protection and the right to respect for private
life this might be highly problematic (the processing of sensitive data is only allowed if a
set of strict conditions is fulfilled):
“If you start from the well-acknowledged observation that in order to combat
discrimination, it must be possible to compare individual situations by having access
to certain personal data, it becomes immediately apparent that tensions will
inevitably occur between this aspect of the right to respect for private life and the
fight against discrimination.” (Bribosia & Rorive, 2010, p. 8)
Thus, especially within the field of fundamental rights (Brems, 2008), which are hardly ever
structured in a hierarchical way11, it is important to know the principles and balancing
techniques that make it possible to resolve conflicts between rights. The tension between
anti-discrimination and data protection is only one example of how two rights can clash.
Clashes can emerge virtually between any two rights. What’s more: even within one right
clashes can occur. Zucca (2007, p. 26) calls such clashes “between two instantiations of the
same norm” intra-rights conflicts and opposes them to inter-rights conflicts between
“different fundamental rights norms.” Think, for example, of how a substantive
interpretation of the principle of non-discrimination (equality of results) can call for
affirmative action (positive discrimination), while a formal interpretation of the principle of
non-discrimination (equal distribution of opportunities) calls for exactly the opposite: that all
are treated equally and that any positive discrimination should be avoided.
In the context of SMTs that aim to protect against acts of terrorism and serious crimes,
there is one fundamental rights conflict that deserves particular attention. This is the
possible conflict between the positive obligations following from right to life, protected by
Article 2(1) ECHR and by Article 2 CFREU, and the negative obligations to abstain from
interference that follow from some of the other fundamental rights, such as the right to
respect for private life or freedom of religion. What should be understood as the positive
obligations following from the right to life is explained in Osman v United Kingdom.12 In this
case the ECtHR noted that the right to life of art. 2(1) ECHR imposes not only negative
but also positive obligations on the State:
11
In contrast, Zucca (2007) proposes that it would be a good idea to articulate or create certain “rules of priority” within fundamental right systems: “qualified and contextualised ‘presumption[s] of priority’” (p. 37) would give guidance as to which right should be given priority in constitutional conflicts and thus put an end to the opacity and indeterminacy of balancing rights that are, at least in principle, all considered equally important. 12
ECtHR, Osman v United Kingdom, nr. 23452/94, 28 October 1998. Also see: Sottiaux, 2008, p. 4 ff.
“… the first sentence of Article 2 § 1 enjoins the State not only to refrain from the
intentional and unlawful taking of life, but also to take appropriate steps to safeguard
the lives of those within its jurisdiction. It is common ground that the State’s
obligation in this respect extends beyond its primary duty to secure the right to life
by putting in place effective criminal-law provisions to deter the commission of
offences against the person backed up by law-enforcement machinery for the
prevention, suppression and sanctioning of breaches of such provisions. […] that
Article 2 of the Convention may also imply in certain well-defined circumstances a
positive obligation on the authorities to take preventive operational measures to
protect an individual whose life is at risk from the criminal acts of another
individual.” (section 115)
It would thus be incorrect to say that the working of an SMT is only curtailed by fundamental
rights, or that the only thing fundamental rights do is place limits on the effectiveness
of SMTs. Quite the contrary: an SMT can be the very embodiment of several fundamental
rights requirements, both those that impose the positive obligation that security should be
effectively protected and those that impose negative obligations on the operation of the
SMT. An SMT that is compatible with fundamental rights embodies how these rights are
understood and balanced together.
A tension similar to the one between fundamental rights, and between positive and
negative obligations, also exists within most fundamental rights insofar as they
are not absolute and can be limited or qualified. For example, whereas Art. 2 (right to life)
and Art. 3 (prohibition of torture) ECHR are absolute, Art. 8 (respect for private and family
life) and Art. 9(1) (freedom of thought, conscience and religion) are subject to a
qualification: a limitation or interference with these latter rights can be justified when it is in
accordance with law, has a legitimate aim and is necessary in a democratic society.13
13
The qualification clause is often to be found in the second section of the Article formulating a qualified right. For example, Art. 8(2) is the qualification of Art 8(1): 8(1). Everyone has the right to respect for his private and family life, his home and his correspondence. 8(2). There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
Already in the 1970s the ECtHR noted that, for example, secret surveillance might be a
justified infringement on art. 8 ECHR:
“Democratic societies nowadays find themselves threatened by highly sophisticated
forms of espionage and by terrorism, with the result that the State must be able, in
order effectively to counter such threats, to undertake the secret surveillance of
subversive elements operating within its jurisdiction.”14
To summarize, interpreting a qualified right such as Art. 8 would be impossible without
taking into account the extensive legal framework of case law and doctrine regarding the
legality, legitimacy and necessity of infringements on fundamental rights.
Clashes between various fundamental rights and the existence of justified
infringements on particular fundamental rights are poignant examples of the impossibility of
an isolated, non-systematic, interpretation of a legal norm expressing a fundamental right.
However, even in situations where there is no clash between rights, nor a qualified right
which is at stake, the interpretation of a legal norm will still be guided by the legal
framework. It is precisely here that legal theorists like Hart, Kelsen and Dworkin help to
remind us that law is inherently fractal: interpretation moves back and forth between the
individual case, the individual norm and the body of law as a whole. When interpreting a legal norm, the effect of this
interpretation is not limited to an individual case but affects the legal framework as a whole.
The importance of a legally systematic interpretation has not only been stressed by legal
theorists, but is also expressed by legal practitioners. In fact, even the European Court of
Human Rights (ECtHR), the adjudicative body that is most crucial for the effective protection
of European fundamental rights, has expressed the importance of a systematic
interpretation of the European Convention of Human Rights (ECHR) on several occasions:
“…the Convention must also be read as a whole, and interpreted in such a way as to
promote internal consistency and harmony between its various provisions.”15
14
Klass and Others v. Germany, 6 September 1978, no. 5029/71, Series A, Vol. 28, § 42. 15
ECtHR, Rantsev v. Cyprus and Russia , no. 25965/04, 7 January 2010, § 274; ECtHR, Demir and Baykara v Turkey, no. 34503/97, 12 November 2008, § 66.
A legal framework, such as that of the ECHR, not only requires internal consistency but
can also indicate what other interpretative methods should be followed. The ECtHR has, for
example, expressed that the ECHR has to be interpreted both in an evolutive manner as well
as in a systematic way:
“While the Court must take a dynamic and flexible approach to the interpretation of
the Convention, which is a living instrument, any interpretation must also accord
with the fundamental objectives of the Convention and its coherence as a system of
human rights protection.”16
Each legal framework sets different requirements with regard to the interpretation of legal
norms and the resolution of conflicts between them. Therefore, instead of talking about the
entirety of the body of law in an abstract manner, we discuss some important interpretative
principles of two legal frameworks that are of particular importance for SIAM: the legal
framework of the EU and that of the ECHR (as established by the Council of Europe). While
an exhaustive overview of these two frameworks goes beyond the scope of this deliverable,
chapter 2 will present two aspects of the EU and CoE legal frameworks that are of particular
importance when studying fundamental rights in relation to SMTs in airports and public
transportation systems. The first aspect (section 2.3) is the balancing between and within
rights, and the second one (section 2.2) is the protective scope and the jurisdictional
applicability of these two frameworks (who is bound by their norms and in which way are
these norms given effect?).
The reason why these two aspects are of specific importance to the legal regulation
of SMTs in airports and public transportation systems is that such SMTs are socio-technological
settings that often mix many different actors and interests. In such a
maelstrom of actors and interests17 legal concepts such as “proportionality” and “fair
16
ECtHR, Pretty v. UK, no. 2346/02, 29 April 2002, § 54. 17
Each SMT, and each regulatory tool that governs a particular use of an SMT, tries to create some order in the maelstrom of actors and interests. For example, Bellanova and De Hert (2013) show how there is a whole set of passenger profiling systems with regard to travelers moving between the European Union (EU) and the United States (US), and that each of these systems has a slightly different scope and rationale: one system might focus on the use of commercial data (e.g., the 2012 EU-US PNR agreement), while another focuses on biometric data (e.g., the 2008 Prüm treaty); one system might be mainly used to regulate migration, while another has the detection and prevention of serious crime and terrorism as its main aim; etc. However, in a
balance” help to establish whether an infringement on a right is justified and how clashes
between and within rights can be resolved. Secondly, given the mix of actors and interests
involved in most SMTs, it is important to identify exactly to whom a legal framework applies.
For example, an airport or transportation company might be attracted by the idea of combining
data from different sources (data explicitly provided by the passenger, data gathered by data
brokers18, data collected in commercial transactions, data captured by security cameras,
etc.) for different kinds of passenger profiling (public security, private security, and
commercial). Is such a mix of data sources and processing purposes legal? When taking into
account that one of the guiding principles of EU data protection is purpose limitation and
specification19, it is clear that the legality of such a situation is far from self-evident. On the
other hand, SMTs at airports and transportation sites will often inevitably mix public and
private interests. The current negotiations within the EU regarding the handling of Passenger
Name Records20 (PNR) are a good example of the way in which private and public interests
become entangled with each other. The proposed PNR Directive21 aims to regulate the
transfer of PNR data from commercial carriers to national authorities in order to enable the
context where many actors and interests are involved it is particularly tempting to re-use a data set for a new purpose or combine several data sets into a new one. 18
An example of a situation in which the multiplicity of processing purposes and the involved actors was controversial, was the “Secure Flight” program in the US. This “Secure Flight” program was created by the US Transportation Security Administration (TSA) and became the center of controversy when it turned out that the TSA made use of the services of a commercial data broker to which they transferred passenger data. (EDRI, 2005, 14 July) In 2005 the US Congress suspended funding for the program. It is clear, however, that the TSA is still interested in the possibility of using the services of commercial data broker firms to screen passengers. In January 2013 the TSA posted a Market Research request for information regarding “Third Party Screening”. (https://www.fbo.gov) Several commentators have expressed their concern about this. (Stanley, 2013, 11 January; Sternstein, 2013, 16 January). 19
The principle of purpose limitation (comprising both the requirement of initial purpose specification and a compatibility test for purposes that were not initially envisioned) is formulated in Article 6(1)(b) of Directive 95/46/EC: “Member States shall provide that personal data must be collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes. Further processing of data for historical, statistical or scientific purposes shall not be considered as incompatible provided that Member States provide appropriate safeguards”. On 2 April 2013 Working Party 29 (an independent advisory body on data protection and privacy set up under art. 29 of Directive 95/46/EC) adopted Opinion 03/2013 on purpose limitation, which states that while “[t]he principle of purpose limitation continues to be considered as sound and valid… [a] lack of harmonised interpretation has led to divergent applications of the notions of purpose limitation and incompatible processing in the different Member States, especially in comparison to other principles”. (p. 5) 20
“PNR data is information provided by passengers, and collected by the air carriers for their own commercial purposes. It contains several different types of information, such as travel dates, travel itinerary, ticket information, contact details, travel agent at which the flight was booked, means of payment used, seat number and baggage information. The data is stored in the airlines' reservation and departure control databases.” (EU Passenger Name Record (PNR) - Frequently Asked Questions, MEMO/11/60, 02 February 2011) 21
The Passenger Name Record (PNR) Directive was first proposed by the Commission (COM(2011) 32 final, 2 February 2011) and later refined by the Council (nr. 8916/12, 23 April 2012).
latter to use and share these data with authorities from other Member States for the
purpose of preventing, detecting, investigating and prosecuting terrorist offences and
serious crime. When many actors and interests are involved in an SMT, the question of
which legal framework applies is crucial. The answer to this question is likely to be a
complicated one: whether a legal instrument applies might differ depending on the source
of the data, the actors and the interests involved. Does the SMT involve a possible freedom
infringement by a State against a citizen? Does (part of) the SMT concern police and judicial
cooperation in combating crime? Does the SMT have to do with the provision of goods and
services? Although the question of the protective scope of a legal instrument cannot be
exhaustively answered by looking at the legal framework to which it belongs, the legal
framework is the place where one should begin looking when one wants to know which legal
instruments apply. For example, in principle the European Convention of Human Rights only
has a vertical effect: it protects against fundamental rights infringements committed by
States. Article 34 ECHR provides that only complaints against States are admissible before the
European Court of Human Rights.22 Yet, as will be further illustrated in section 2.2, the idea
that the ECHR only has a vertical effect should be nuanced. One of the reasons why the ECHR
can sometimes also have an impact on relations between citizens (horizontal effect) is
because of the so-called positive obligations: many of the fundamental freedoms protected
in the ECHR do not only require an absence of unjustified State interference, but also impose
positive obligations onto States to secure these freedoms.
In chapter 2 of this deliverable we give a more detailed analysis of the protective
scope of the legal frameworks of the EU and the ECHR (Council of Europe) and explore how
notions such as fair balance and proportionality operate within these frameworks. However,
before we can introduce the legal frameworks of the EU and the ECHR (Council of Europe) in
chapter 2, there are two preliminary issues that need some further clarification. Thus, in the
remainder of chapter 1 we will first take a quick look at the political and institutional setting
of the Council of Europe (1.1.2) and the EU (1.1.3), and the specific role of Impact
Assessment (IA) and Legal Protection by Design (LPbD) in the EU legal framework (1.1.4).
The basic understanding of the political and institutional setting of the EU and the CoE
22
Art 34 ECHR. Individual applications. The Court may receive applications from any person, nongovernmental organisation or group of individuals claiming to be the victim of a violation by one of the High Contracting Parties of the rights set forth in the Convention or the Protocols thereto. The High Contracting Parties undertake not to hinder in any way the effective exercise of this right.
(provided in sections 1.1.2 and 1.1.3) will be a helpful stepping stone for chapter 2, when we
explore in which SMT settings (with a particular focus on smart CCTV and passenger profiling
systems) particular legal instruments of the CoE and the EU apply. The second issue that we
explore in the remainder of this chapter (1.2) is the implications of the systematic and
fractal nature of law for Legal Protection by Design (LPbD), and in particular the KORA
method. The conceptual issues presented in 1.2 are a stepping stone for those parts of
chapter 2 in which we look at the design implications that follow from an LPbD analysis
when applied to smart CCTV.
[Figure 2 is a cross-table, of which only the row and column labels could be recovered here.
Rows – freedoms identified in SIAM: bodily integrity; equal treatment and non-discrimination; freedom of movement; freedom from unlawful detention; presumption of innocence; fair trial and due process; privacy and data protection.
Columns – Council of Europe, European Convention on Human Rights (ECHR): Art. 2 right to life; Art. 3 prohibition of torture; Art. 5 freedom from unlawful detention; Art. 6 presumption of innocence and fair trial; Art. 8 respect for private and family life; Art. 9(1) freedom of thought, conscience and religion; Art. 14 prohibition of discrimination with regard to the exercise of other human rights.
Columns – primary EU law, EU Charter (CFR): Art. 1 human dignity; Art. 2(1) right to life; Art. 3(1) right to the integrity of the person; Art. 4 prohibition of torture and inhuman or degrading treatment; Art. 6 right to liberty and security; Art. 7 respect for private and family life; Art. 8 protection of personal data; Art. 10(1) freedom of thought, conscience and religion; Art. 21 non-discrimination; Art. 24(2) the rights of the child; Art. 25 the rights of the elderly; Art. 26 integration of persons with disabilities; Art. 35 health care; Art. 45(1) freedom of movement and of residence; Art. 47 right to an effective remedy and to a fair trial; Art. 48 presumption of innocence and right of defense.
Columns – secondary EU law, Directives and Regulations: Data Protection Directive 95/46/EC; Framework Decision 2008/977/JHA (data protection in police and judicial cooperation in criminal matters); Employment Equality Directive 2000/78/EC; Racial Equality Directive 2000/43/EC; Gender Recast Directive 2006/54/EC; Gender Goods and Services Directive 2004/113/EC; Directive 2004/38/EC on the right to move and reside freely.]
Figure 2. The inductively inferred Freedom Infringements of D4.2 mapped against some of the relevant rights and legal instruments of the EU and CoE legal frameworks.
1.1.2 The legal framework of the Council of Europe (CoE) - and why it is of
specific importance to the SIAM assessment process.
The legal frameworks of the EU and of the Council of Europe are frameworks with a major
impact and territorial reach. Either directly or indirectly (through their impact on national
laws) they inform the majority of applicable legal norms within any airport or public
transportation system in the member states of the EU and the Council of Europe. However,
the way in which these two frameworks exercise their influence differs. Looking at the
political and institutional setting of the Council of Europe (1.1.2) and the EU (1.1.3) helps to
understand the differences in their rationales and ways of functioning.
In the aftermath of the horrors of the Second World War, on the 5th of May 1949, ten
European countries brought the Council of Europe into existence by signing the Treaty of
London. The Council of Europe (CoE) is an international organization that aims to “achieve a
greater unity between its members for the purpose of safeguarding and realising the ideals
and principles which are their common heritage and facilitating their economic and social
progress.” (art. 1 (a) of the CoE Statute23) Today the CoE has 47 members. It covers a
significantly larger territory than the European Union (28 member states), stretching
deep into Eurasian territory with members such as Turkey, Armenia and the Russian
Federation. Then again, compared to the European Union the scope and intensity of the
cooperation within the CoE is rather limited. The CoE produces treaties, officially known as
Conventions. These Conventions do not have direct binding legal force. A Convention
only becomes binding when it is ratified by the national parliament of a member state and
subsequently made part of national law. The most important and influential Convention
of the CoE is the European Convention on Human Rights (ECHR). The ECHR was
inspired by the Universal Declaration of Human Rights (adopted by the United Nations in
1948). Contrary, however, to the UN Universal Declaration of Human Rights the ECHR was
created as a more regional document (not universal but European) and it did not only enshrine
a set of venerable human rights but also brought the European Court of Human Rights
(1959) into being. The establishment of this court has given the ECHR considerably more bite
than the Universal Declaration of Human Rights.
23
Statute of the Council of Europe, London, 5 May 1949. Online available at: http://conventions.coe.int/Treaty/en/Treaties/Html/001.htm
When the ECHR was signed on 4 November 1950 and entered into force on 3
September 1953, the possibility was created for all individuals in CoE member states to bring
a legal action against a member state before the European Court of Human Rights in
Strasbourg (ECtHR). The condition for bringing a legal action before the Court is that an
individual believes that a member state has violated his or her fundamental human rights as
protected by the ECHR and that all national remedies have been exhausted. Thus, the
main rationale of the ECHR is to offer individual citizens protection against State power by
providing a concrete legal route of redress when fundamental rights have been subjected to
State infringement.
1.1.3 The EU legal framework - and why it is of specific importance to the SIAM
assessment process.
Today’s European Union, an economic and political union of 28 member states, began in
1950 when Robert Schuman, the French Foreign Minister, proposed to create a
supranational organisation that would be responsible for the "Franco-German coal and steel
production." (Blair, 2005, p. 3) One year later Schuman's proposal resulted in the European
Coal and Steel Community (ECSC), consisting of six member states: France, West Germany,
Italy, Belgium, Luxembourg and the Netherlands. It is important to note that the set-up of
the ECSC was guided by the neologism of supranationality: requiring more political
integration than a mere international organization like the Council of Europe but less than a
federation like the USA. A second point that made the ECSC a distinctly different
international organization from the Council of Europe was its economic orientation –
towards a common market. Both the Council of Europe and the ECSC were built against the
background of the atrocities of the Second World War and were fuelled by the idea that
these atrocities should never be allowed to happen again: having a common market for coal
and steel, the two main resources needed to wage a war, was as much a buffer against
totalitarian inhumanities as the human rights protected by the Court in Strasbourg.
However, contrary to the institutions of the Council of Europe, the ECSC and the two other
bodies that were at the basis of the European Union – the European Economic Community (EEC)
and the European Atomic Energy Community (Euratom), both established in 1957 – mixed
political idealism with economic ingenuity. The main concern of the CoE was, and still is, the
power relationship between individual and State, whereas the EU and its predecessors had,
and to a certain extent still have, their main focus on the constitution and preservation of a
particular economic, political and legal structure. While it is clear that during the last
decades the protection of fundamental rights, adjusting the power imbalances within the
power relationship between individual and State, has gained an increasing importance
within the EU, especially since the Charter of Fundamental Rights of the EU (CFREU) entered
into force in 200924 and the EU made a commitment25 to accede to the ECHR, the concern
with the EU as an economic, political and legal structure continues to give a distinctly different
flavor to the legal framework of the EU when compared to that of the ECHR (CoE).
This is not to say that there is no overlap between the way the legal frameworks of
the EU and the ECHR function – quite the contrary. It is widely acknowledged that the
highest court in matters of European Union law, that is, the European Court of Justice (ECJ),
and the European Court of Human Rights (ECtHR), pay close attention to each other’s
rulings (Bratza, 2013). Moreover, once the EU accedes to the ECHR, which is to be
expected sooner rather than later, the mutual dialogue between the two courts will likely
become even more intense. Nevertheless, the legal framework of the EU is one that is
mainly focused on regulating the area of the EU in accordance with certain economic,
political and legal aspirations. When comparing the legal framework of the ECHR (CoE) and
the EU, one sees that the former is mainly characterized by an ex post (or post active)
approach of imposing sanctions after infringements have already taken place, which is the
classical fundamental rights approach. In contrast, as Gellert et al. (2012) have pointed out,
the EU legal framework with regard to fundamental rights combines the classical ex post
approach with an ex ante approach that aims to create legal and institutional structures
which would prevent infringements from happening in the first place. This is clearly visible in
the field of data protection and anti-discrimination law: detailed secondary EU law,
subjective rights empowering data subjects and possible victims of discrimination, and
supervisory bodies all aim to prevent infringements. (cf. Gellert, et al., 2012) Two additional
24
Charter of Fundamental Rights of the EU, 2000/C 364/01. Entered into force on 1 December 2009, as part of the Treaty of Lisbon.
25
See for the latest negotiations with regard to the accession of the EU to the ECHR: http://www.coe.int/t/dghl/standardsetting/hrpolicy/Accession/Meeting_reports/47_1(2012)R03_EN_final.pdf (accessed 2 August 2013). See also: Polakiewicz, 2013.
ex ante instruments that have gained popularity over the last years within the EU legal
framework with regard to fundamental rights are fundamental right impact assessments
(FRIAs) and Legal Protection by Design (LPbD).
1.1.4 Fundamental Right Impact Assessment and Legal Protection by Design
in the EU legal framework (with special attention to their role in Data
Protection)
FRIAs (cf. Wright & Hert, 2012, with regard to privacy impact assessments) assess the impact
of policies and technologies on fundamental rights. LPbD (Hildebrandt, 2011a) poses design
requirements on policies and technologies regarding their compatibility with certain legal
norms, particularly those which are embodied in fundamental rights. Both instruments are
ex ante instruments that tinker with the relationship between, in this case, an SMT and a
fundamental right in order to prevent fundamental rights infringements by the SMT from
happening (figure 3).
LPbD and FRIAs are related instruments. Although LPbD is not
necessarily preceded by a FRIA (LPbD is often conceived as a continuous obligation, while
FRIAs are not), the outcomes of a FRIA can influence LPbD, and the fact that LPbD has been
implemented will not go unnoticed in a FRIA26. One legal domain that has been particularly
appreciative of the possibilities of FRIAs and LPbD is EU data protection.
26
In the proposed General Data Protection Regulation (see below, footnote 32) the link between IA and LPbD is not made explicit. EDRI proposes to adjust this by obliging a controller to take the results of the DPIA of Article 33 into account when developing Data Protection by Design according to Article 23 GDPR. (European Digital Rights (EDRI), 2012b, commentary on Article 23 GDPR)
Figure 3. IA and LPbD. Two ways of bringing SMTs and fundamental rights into alignment with
each other and preventing clashes between them: studying the effect of the SMT on fundamental
rights (Impact Assessment), and translating legal requirements into the design of the SMT
(Legal Protection by Design).
(a) LPbD in EU data protection: Data Protection by Design.
Legal Protection by Design, even though not named as such and hardly formalized,
has long been a well-known instrument in data protection. In EU data protection
legislation the first outlines of Data Protection by Design (DPbDesign) are formulated in
Data Protection Directive 95/46/EC27 (DPD). Section 46 of the Preamble of the DPD states
that “appropriate organizational and technical measures” have to be taken “particularly in
order to maintain security and thereby to prevent any unauthorized processing”28, and the
27
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Official Journal L 281, 23/11/1995, pp. 31-50.
28
“… the protection of the rights and freedoms of data subjects with regard to the processing of personal data requires that appropriate technical and organizational measures be taken, both at the time of the design of the processing system and at the time of the processing itself, particularly in order to maintain security and thereby to prevent any unauthorized processing; whereas it is incumbent on the Member States to ensure that controllers comply with these measures; whereas these measures must ensure an appropriate level of security, taking into account the state of the art and the costs of their implementation in relation to the risks inherent in the processing and the nature of the data to be protected” (section 46 of the Preamble to Data Protection Directive 95/46/EC)
same phrasing returns in more detail in Article 17 (Security of processing), which requires
that such measures be “implemented” having regard “to the state of the art and the
cost of their implementation.” Because the DPD was heavily influenced by the 1981 Council
of Europe Convention nr. 108 for data protection29 (“Convention 108”) it is interesting to
contrast Art. 17 DPD with the corresponding Article on data security in Convention 108:
“Data security. Appropriate security measures shall be taken for the protection of
personal data stored in automated data files against accidental or unauthorised
destruction or accidental loss as well as against unauthorised access, alteration or
dissemination.” (Art. 7 of Convention 108)
When comparing Art. 7 of Convention 108 to Art. 17 DPD, one can see that the idea of
technical and organizational measures and the focus on implementation, appear as novelties
in the latter legal instrument:
“Security of processing. Member States shall provide that the controller must
implement appropriate technical and organizational measures to protect personal
data against accidental or unlawful destruction or accidental loss, alteration,
unauthorized disclosure or access, in particular where the processing involves the
transmission of data over a network, and against all other unlawful forms of
processing. Having regard to the state of the art and the cost of their
implementation, such measures shall ensure a level of security appropriate to the
risks represented by the processing and the nature of the data to be protected.” (Art.
17(1) DPD, italics ours)
Although the expression DPbDesign is not used in Art. 17 DPD, words like “technical” and
“implementation” have opened up a way of thinking that is characteristic of DPbDesign. Art.
22 (Security of Processing) of the Council Framework Decision on the protection of personal
29
Council of Europe, Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, nr. 108, Strasbourg, 28 January 1981.
data processed in the framework of police and judicial cooperation in criminal matters30 (JHA
Framework Decision), also speaks of the obligation to “implement appropriate technical and
organisational measures to protect personal data” – thereby reiterating the wording of Art.
17(1) DPD. The similarities31 between Art. 17 DPD and Art. 22
of the JHA Framework Decision – a legal instrument that was adopted more than a decade
after the DPD and which regards a certain type of law enforcement data that fall outside the
scope of the general data protection regime of the DPD – are a strong indicator that
DPbDesign is there to stay.
This impression is reinforced by the recently proposed General Data Protection
Regulation32 (GDPR), which is supposed to replace Data Protection Directive 95/46/EC. The
GDPR again speaks of “the appropriate organizational and technical measures” in the
context of the security of the processing (Art. 30 of the proposed GDPR). Moreover, the
proposed GDPR complements this with a general article on Data Protection by Design and by
Default (Art. 23 GDPR). In contrast to Art. 17 DPD and Art. 30 of the proposed GDPR, “the
appropriate organizational and technical measures and procedures” in Art. 23 of the
proposed GDPR are not merely aimed at providing an appropriate level of security against
unlawful processing but aim to meet all requirements of the proposed GDPR:
30
Council Framework Decision 2008/977/JHA of 27 November 2008 on the protection of personal data processed in the framework of police and judicial cooperation in criminal matters, Official Journal L 350/60, 30 December 2008.
31
One difference between Art. 17 DPD and Art. 22 of the JHA Framework Decision is the level of detail with which the various security hazards are described. Given that the DPD is a general data protection instrument whereas the JHA Framework Decision concerns law enforcement data, this detailed concern for data security is not very surprising. Article 22(2) JHA Framework Decision specifies that measures should be implemented designed to: (a) deny unauthorised persons access to data-processing equipment used for processing personal data (equipment access control); (b) prevent the unauthorised reading, copying, modification or removal of data media (data media control); (c) prevent the unauthorised input of data and the unauthorised inspection, modification or deletion of stored personal data (storage control); (d) prevent the use of automated data-processing systems by unauthorised persons using data communication equipment (user control); (e) ensure that persons authorised to use an automated data-processing system only have access to the data covered by their access authorisation (data access control); (f) ensure that it is possible to verify and establish to which bodies personal data have been or may be transmitted or made available using data communication equipment (communication control); (g) ensure that it is subsequently possible to verify and establish which personal data have been input into automated data-processing systems and when and by whom the data were input (input control); (h) prevent the unauthorised reading, copying, modification or deletion of personal data during transfers of personal data or during transportation of data media (transport control); (i) ensure that installed systems may, in case of interruption, be restored (recovery); (j) ensure that the functions of the system perform, that the appearance of faults in the functions is reported (reliability) and that stored data cannot be corrupted by means of a malfunctioning of the system (integrity).
32
Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), Brussels, 25.1.2012, COM(2012) 11 final.
“Having regard to the state of the art and the cost of implementation, the controller
shall, both at the time of the determination of the means for processing and at the
time of the processing itself, implement appropriate technical and organisational
measures and procedures in such a way that the processing will meet the
requirements of this Regulation and ensure the protection of the rights of the data
subject.” (Art. 23(1) of the proposed GDPR)33
A similar extension of the obligation on the data controller to take “appropriate
technical and organisational measures” beyond the “security of processing” is present in the
proposed Law Enforcement Data Protection Directive34 (LEDPD), which is supposed to
replace the JHA Framework Decision. While Art. 27 of the proposed LEDPD almost literally
reiterates Art. 22 of the JHA Framework Decision, obliging the data controller or processor
to implement measures designed to protect the security of the processing (unauthorized
access, modification of the data, etc.), Art. 19 of the proposed LEDPD presents the obligation
to “data protection by design and by default” in the same general way as Art. 23 of the
proposed GDPR. Yet there are also striking differences between Art. 23 of the proposed
GDPR and Art. 19 of the proposed LEDPD. One such difference is that the continuous
obligation to implement appropriate technical and organisational measures (“both at the
time of the determination of the means for processing and at the time of the processing
itself”) of Art. 23(1) of the proposed GDPR is absent from Art. 19 of the proposed LEDPD.
33
Article 23(1) gives the main definition of Data Protection by Design, while the remainder of the Article mainly shows that an exact understanding of this notion has not crystallized yet. Article 23(2) obliges the data controller to implement mechanisms to ensure Data Protection by Default, which is a certain form of Data Protection by Design based on the idea “that privacy intrusive features of a certain product or service are initially limited to what is necessary for the simple use of it”. (European Data Protection Supervisor, 2012, 7 March, pp. 29-30) However, as the European Data Protection Supervisor (EDPS) argued in his Opinion on the GDPR, Article 23(2) does not give “any clear substance” to “data protection by default”: “The first sentence does not add much to the general principles of data processing in Article 5, and the data minimisation principle in Article 5(c) in particular, except from the confirmation that such principles should also be embedded in the design of relevant systems.” (European Data Protection Supervisor, 2012, 7 March, p. 29) Articles 23(3) and (4) create the possibility for the Commission to specify further criteria and requirements (3), and technical standards (4) that follow from Data Protection by Design and Default.
34
Proposal for a Directive of the European parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data, 25 January 2012, COM(2012) 10 final.
(b) IA in EU data protection: Data Protection Impact Assessment (DPIA).
The predecessor of the Data Protection Impact Assessment (DPIA) can be
found in Articles 18, 19 and 20 of the DPD, regarding the obligation of data controllers to
notify the supervisory authority (Art. 18) and the obligation of the supervisory authority to
“determine the processing operations likely to present specific risks to the rights and
freedoms of data subjects” and to “check that these processing operations are examined
prior to the start thereof” (Art. 20). Though the idea of IA might already be present in the
DPD, its presence in the proposed GDPR is incomparably more detailed and explicit. Article
33(1) GDPR (Data Protection Impact Assessment) opens:
“Where processing operations present specific risks to the rights and freedoms of
data subjects by virtue of their nature, their scope or their purposes, the controller or
the processor acting on the controller's behalf shall carry out an assessment of the
impact of the envisaged processing operations on the protection of personal data.”
After the first introductory section, Article 33 (footnote 35) specifies a set of processing
operations that pose specific risks (Art. 33(2)), and gives some minimum standards as to which elements a
DPIA should contain (Art. 33(3)). These elements, which a DPIA should at least contain, are: (i)
a general description of the envisaged processing operations, (ii) an assessment of the risks
to the rights and freedoms of data subjects, (iii) the measures envisaged to address the risks,
(iv) safeguards, security measures and mechanisms to ensure the protection of personal
data, and (v) a demonstration of compliance with the proposed GDPR, taking into account
the rights and legitimate interests of data subjects and other persons concerned.
The list of the minimal elements a DPIA should contain (Art. 33(3)) is followed by the
obligation to consult data subjects as stakeholders (Art. 33(4)), by an exemption from doing a
DPIA if the data controller is a public authority or body and the processing is
necessary for compliance with a legal obligation to which the controller is subject (Art.
33(5)), and by the possibility for the Commission to further specify the requirements,
standards and procedures of DPIAs (Art. 33(6) and (7)).
Despite the relatively detailed description of DPIAs in Article 33 GDPR much is still
unclear. As has been noted by several commentators36, it is also unclear why the DPIA is
lacking in the proposed LEDPD. As EDRI puts it:
“The Commission’s wording would result in a situation where a shopping mall
wanting to install video surveillance would need to carry out a DPIA in accordance
with Article 33 of the General Data Protection Regulation, while the police would not
have to do so when installing an identical system in the public space just outside the
mall.” (European Digital Rights (EDRI), 2012a, justification for a new Recital 43a)
It is only in practice that the format of DPIAs gets more body and structure. In this respect
the DPIA template with regard to Smart Metering37, submitted by the Commission to the Article
29 Working Party for comments on 8 January 2013, is an interesting example of how a DPIA
could look in practice. The Article 29 WP has pointed out that a DPIA should have sufficient
specificity and look at “actual impacts on the data subjects.” (Article 29 Data Protection
Working Party, 2013, 22 April, p. 7) Furthermore, WP 29 suggests “to consider the
opportunity of defining a generic DPIA methodology from which field specific efforts could
benefit.” (p. 9)
35
Article 33. Data protection impact assessment. 33(1). [see the in-text quotation] 33(2). The following processing operations in particular present specific risks referred to in paragraph 1: 33(2a) a systematic and extensive evaluation of personal aspects relating to a natural person or for analysing or predicting in particular the natural person's economic situation, location, health, personal preferences, reliability or behaviour, which is based on automated processing and on which measures are based that produce legal effects concerning the individual or significantly affect the individual; 33(2b) information on sex life, health, race and ethnic origin or for the provision of health care, epidemiological researches, or surveys of mental or infectious diseases, where the data are processed for taking measures or decisions regarding specific individuals on a large scale; 33(2c) monitoring publicly accessible areas, especially when using optic-electronic devices (video surveillance) on a large scale; 33(2d) personal data in large scale filing systems on children, genetic data or biometric data; 33(2e) other processing operations for which the consultation of the supervisory authority is required pursuant to point (b) of Article 34(2). 33(3). [see the in-text quotation] 33(4). The controller shall seek the views of data subjects or their representatives on the intended processing, without prejudice to the protection of commercial or public interests or the security of the processing operations. 33(5). Where the controller is a public authority or body and where the processing results from a legal obligation pursuant to point (c) of Article 6(1) providing for rules and procedures pertaining to the processing operations and regulated by Union law, paragraphs 1 to 4 shall not apply, unless Member States deem it necessary to carry out such assessment prior to the processing activities. 33(6). The Commission shall be empowered to adopt delegated acts in accordance with Article 86 for the purpose of further specifying the criteria and conditions for the processing operations likely to present specific risks referred to in paragraphs 1 and 2 and the requirements for the assessment referred to in paragraph 3, including conditions for scalability, verification and auditability. In doing so, the Commission shall consider specific measures for micro, small and medium-sized enterprises. 33(7). The Commission may specify standards and procedures for carrying out and verifying and auditing the assessment referred to in paragraph 3. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 87(2).
36
(Article 29 Data Protection Working Party, 2012, 23 March, p. 29; European Data Protection Supervisor, 2012, 7 March, p. 62; European Digital Rights (EDRI), 2012a, justification for a new Recital 43a)
37
Prepared by Smart Grids Task Force Expert Group 2 of the Directorate-General Energy.
(c) Impact Assessment and Legal Protection by Design with regard to other fundamental
rights.
Although data protection is the field within EU fundamental rights protection in which the
proactive approach through LPbD and IA is most developed, this is not to say that LPbD and
IA can only be applied within this domain. As we will show below, in section 1.2, LPbD and
IA can be used for the protection of many more fundamental rights than merely data
protection – though admittedly some ingenuity will often be needed to do this in an optimal
way. In this respect the KORA method offers interesting possibilities, because it is a general
method that can be used both for IA and to create LPbD. Moreover, it works, in principle,
for any legal requirement. However, in order to make the most of this method, we will
now (section 1.2) look at the implications of the inherently systematic nature of law for the
KORA method, and situate it in the broader field of Legal Protection by Design.
1.2 A broader view in terms of the relation between SMTs and legal
normativity: KORA as one particular form of Legal Protection by Design
(LPbD)
Legal Protection by Design (LPbD) is a term proposed by Hildebrandt (2011a) which conveys
the idea that legal norms can be articulated in architecture and which is especially
concerned with the articulation of fundamental rights in ICT architecture. LPbD is based on
the idea that “the legal requirements of fundamental rights such as privacy and data
protection must be translated into computer system hardware, code, protocols and
organizational standards to sustain the effectiveness of such right in a changing
technological landscape.” (Hildebrandt, 2013b, p. 10)
LPbD should be distinguished from the related notion of techno-regulation (Berg &
Leenes, 2013; Brownsword, 2008; Hildebrandt, 2011b; Leenes, 2010, 2011) which can be
described as the intentional use of “built-in mechanisms to influence people’s behaviour.”
(Koops, 2008, p. 158; as cited and elaborated upon in: Leenes, 2010, p. 21)
Firstly, techno-regulation is thus not limited to any specific kind of norm, while LPbD
only regards legal norms.38 In LPbD the democratic and systematic pedigree39 of legal norms
is of utmost importance: as we will discuss later, this is not only to establish the legal validity
and democratic legitimacy of a norm, but also to try to preserve something of the fractal
mode of operation of a legal norm (the going back-and-forth between an individual norm, or
individual case, and the body of law as a whole).
Secondly, it is worth noting that the adjective “techno-” is not present in LPbD.
Consequently, the notion of “design” in “Legal Protection by Design” does not necessarily mean
legal protection by means of technological design. While it is true that LPbD recognizes that
38
Hildebrandt (2011a) explains the role of the three constituent words in the notion “Legal Protection by Design” as follows: “In using the term legal I emphasize the role of the democratic legislator as well as the possibility to contest the way the norm affects human behaviour. In using the term protection I emphasize that this is not about implementing written legal rules by means of technological enforcement. I also avoid the term regulation that easily resonates the top-down managerial governmental model discussed in the previous section. Finally, in using the term design I emphasize that this is not only about engineering but also about human-machine-interfacing, highlighting that such inscription of legal norms is not only a matter of technique but also an art”. (p. 247) 39
In jurisprudence the word “pedigree” is strongly associated with the so-called “pedigree-thesis”: the tenet of legal positivism that “legal rules must be identifiable by their pedigree, not their moral content”. (Sebok, 1998, p. 284) We use the word pedigree without committing ourselves to this legal positivist tenet.
technology has a normative impact, the word “design” goes beyond merely technological
design. Instead LPbD is an anticipation of what specific technological infrastructures make
possible and impossible in terms of the exercise of fundamental rights.
Thirdly, LPbD adheres to a preservationist normativity with regard to a particular
social, political and legal constellation – “a fragile historical artifact” (D4.2., p. 7) – that is
best described as a constitutional democracy guided by the rule of law.40 LPbD is not about
just any architectural articulation of just any legal norm, but about the architectural
articulation of a legal norm that contributes to the existence of constitutional democracies
guided by the rule of law – hence the specific focus of LPbD on fundamental rights. A
preservationist attitude should not be confused with a conservative one. As the Red Queen
in Lewis Carroll’s Through the Looking-Glass remarks: “It takes all the running you can do, to
keep in the same place.” This observation, which has become well known as
a metaphor for how species have to continuously adapt to maintain relative fitness in relation
to their environment (“the evolutionary arms race”: Ridley, 1993), can be extended to the
preservation of the “fragile historical artifact” of constitutional democracy, the rule of law,
and the exercise of fundamental rights and freedoms. How can a milieu governed by the
rule of law be preserved in the face of a changing technological environment which sometimes has
detrimental effects41 on it? What is needed is a way to compensate for these detrimental
effects. When we focus on SMTs there are two – often intertwined – ways of compensating
for any detrimental effects they might have. One way is to adapt the SMTs to accommodate
existing legal norms that contribute to the existence of a milieu governed by the rule of law.
Another way is to articulate legal meta-norms42 that describe how specific technologies have
to be shaped in order to preserve a milieu that operates according to the rule of law, and
consequently fosters principles such as fair trial and good governance. Such legal meta-norms
preserve by legally imposing the reinvention of “old” technological affordances that allow
for the continued existence of the social, political and legal artifact that we wish to preserve.
40
See for a discussion of what the rule of law entails, for example: Committee on Legal Affairs and Human Rights, 2007, 6 July. 41
Not all technological developments have a detrimental effect on constitutional democracy, the rule of law and the exercise of fundamental freedoms and rights. In fact, quite a few technological innovations might contribute to them. For example, a well-known line of argument is that the internet, in principle, is a democratic, decentralized medium that does not distinguish between professional journalists and amateurs and is less susceptible to centralized control than classical mass media. Others, such as Morozov (2011, 2013), are skeptical about the democratic affordances of the internet. 42
Clearly, merely articulating these legal meta-norms will not suffice: they also have to be legally accepted by the legislator, judges, legal doctrine, etc., in order to be more than a dead letter of law.
For example, in order to preserve the “old” technological affordance of large scale
readability, and consequently legal certainty, as created in conjunction with the invention of
the printing press (around 1450), legal meta-norms have to be created that demand
transparency (and thus readability and legal certainty) from opaque automated profiling
systems. Hildebrandt and Tielemans (2013) call such legal meta-norms “technology specific
legislation to ensure the objectives of technology neutral law.” (p. 520)
The KORA (concretization of legal requirements) method described in D9.2 is one way of
practicing LPbD. What makes the KORA method special within the broader field of LPbD is
the combined level of abstraction and detail of its guidelines for transposing a legal norm into
technical design proposals. Much has been said and proposed about the relation between
legal norms and the possibility of their embodiment in technology (Brownsword, 2005;
Cavoukian, 2012; Hildebrandt, 2008b; Hildebrandt, 2011a; Leenes, 2011; Lessig,
2006; Reidenberg, 1998). However, most proposals either focus on a limited set of
legal norms and present findings that are not easily applicable to other legal norms (this is
especially the case for “privacy by design” and “data protection by design”, which form a
disproportionately large part of the wider field of “legal protection by design”), or present
theoretical and ethical explorations of the issues at stake without offering concrete and
detailed guidelines (Brownsword, 2005; Hildebrandt, 2008b; Leenes, 2011).
We will now take a closer look at (a) some of the theoretical assumptions underlying the
designing of norms into the architecture of an SMT, (b) the particularities of designing legal
norms into SMT architecture, (c) how the notions “privacy by design” and “privacy by
default” fit in the broader field of LPbD, and (d) how the KORA method fits in the
broader field of LPbD.
(a) Theoretical assumptions underlying the designing of norms into the architecture of
an SMT.
Security measures and technologies (SMTs) that are operational in mass transport sites, for
example passenger profiling systems or smart CCTV systems, are socio-technical settings
(see D2.2) that are meant to fulfill certain aims such as the detection of possibly threatening
objects, events or people, controlling who can access which areas, understanding what is
happening within a controlled area, enforcing compliance, containing imminent threats, or
supporting any of the aforementioned security procedures by making them more
streamlined, efficient, and qualitatively better. However, what an SMT does (that is: how it
steers behavior) and what it should do are hardly ever clear givens.
A technology can be used for many different purposes. Some of these purposes might be
illegal, socially disruptive or morally despicable, while others further the common good,
bring economic wealth, are fully legitimate or realize high-standing moral aims. A knife can
cut in many ways: it can be an instrument for brutal killings, as well as for the production of
beautifully carved practical wooden artifacts. The multiplicity of purposes that can be
achieved by a single artifact has resulted in a longstanding fallacy that technology is neutral.
However, rejecting an all too simplistic technological determinism (“Give a man a knife and
he will kill”) and moral univocity of technology (“A knife will inherently bring about evil”),
does not mean that one has to embrace the equally one-dimensional thesis of the neutrality
of technology (Heidegger, 1977). As Kranzberg (1986) saliently put it: “Technology is neither
good nor bad; nor is it neutral.” So what is technology, if it is neither good, nor bad, nor
neutral? While technology is rarely truly deterministic, it is always normative, that is, it always
steers behavior in a particular direction rather than another. In the second half of the
twentieth century philosophers, sociologists and historians of technology (Latour, 1992;
Winner, 1986) have pointed this out over and again. Every artifact, device, system,
technique, technology or practice has a normative impact that is “situated in the way a
specific technology induces/enforces or inhibits/rules out certain types of behavior.”
(Mireille Hildebrandt, 2008, p. 177) To give a banal example: while there is no iron law that
says that giving a sharp knife to a toddler as a toy will end in tears and bloodshed, the
“affordances” (Norman, 1990, 1998) of a stuffed animal or puzzle (that is, the possible set of
actions that they allow for), seem more likely to result in successful and safe children’s play.
The theoretical considerations that there is more to (socio-)technological design than
the realization of one singular aim, that its “felicity” or success is not measurable on a one-
dimensional scale (De Vries, 2013), and that its normativity is a matter of likelihood rather
than determinism, are slowly becoming incorporated[43] in the practice of (socio-)technological
design under the guise of notions such as “value sensitive design” (e.g. Friedman, 2004; Mary
Flanagan, Daniel C. Howe, & Nissenbaum, 2008) and the morality of technical artifacts
(Verbeek, 2011). Within the field of computer science these notions seem to fit well[44] in the
practice of “requirements elicitation” and “requirements engineering” (Alexander &
Beus-Dukic, 2009; Sommerville & Sawyer, 1997).

[43] The dominant one-dimensional approach in technological design is poignantly described in D9.2: “For technology, anything that functions, that works in a technological sense is ‘right’. The only limitations are feasibility and the laws of nature. Technological thinking is not concerned with the social and political consequences of the introduction of new technologies. It rather excludes any humane and societal responsibility of the technician through a purely functional approach.” (pp. 14-15)
Within the broader field of normative design, the design of legal norms in ICT
architecture is a relatively young one. Three notions that have contributed much to its
popularity are Cavoukian’s “Privacy by Design” and “Privacy by Default” (notably: 2009,
2012), Lessig’s “Code as Law” (Lessig, 2006) and Reidenberg’s “Lex informatica” (1998).
(b) The particularities of designing legal norms, especially those expressing fundamental
rights, into SMT architecture.
As we argued above in section 1.1, the way in which legal norms are constituted in a
constitutional democracy and their systemic, fractal mode of operation when interpreted
makes them different from other norms. Their pedigree, both in terms of how they come
about and the systematic-fractal way in which they are interpreted, continuously affects
their meaning and effectiveness. This continuous effect of pedigree and necessity for
systematic-fractal interpretation does not have a similarly important role for the functioning
of non-legal norms. For example, a speed bump (which embodies the norm that cars must
slow down) will slow a car down even if it was illegally placed, but a fine for speeding will be
nullified if it can be shown that the legislation on which it was based contradicts a higher
legal norm or was not made according to the right legislative procedures: legal normativity
operates by unfolding its pedigree, while technological normativity works through a folding-
in or “black-boxing” of its pedigree. (De Vries & Van Dijk, 2013)
Is it possible to preserve something of the particular way in which legal norms function when
transposing them into an SMT architecture? Given the difference between how technical
and legal normativity function, this is far from obvious. One way to deal with this conundrum
is to conceive legal norms as ethical norms, and consequently ignore their particular legal
mode of operation. This approach seems particularly well suited for legal norms expressing
fundamental rights because, in comparison to “ordinary” legal norms, they have more of a
double existence: next to their existence as a legal norm they also exist as an ethical norm.
(cf. Sen, 2004) Another way is to take a functionalist approach: one accepts that law
operates differently than, e.g., software code, but by placing both law and code under the
general heading of “regulation” one can simply pick the mode of operation that is best at
realizing a certain regulatory goal:

“In real space we recognize how laws regulate – through constitutions, statutes, and
other legal codes. In cyberspace we must understand how code regulates – how the
software and hardware that make cyberspace what it is regulate cyberspace as it is.”
(Lessig, 2006, p. 6)

[44] See for example: Gürses, Gonzalez Troncoso, & Diaz, 2011.
The problem with this approach is that it assumes that regulatory goals exist independently of
the way in which they are realized, and that no distinction can be drawn between goals that
are inherently legal and goals that are inherent to software. (cf. Gutwirth, De Hert, & De
Sutter, 2008) Finally, there is also
the option to create technological affordances that mimic (parts of) the way in which legal
normativity operates. One way to do this is by mimicking the constructivist legal process
(exploring the pedigree of a legal norm and interpreting it in a systematic-fractal way, for
example by balancing it with requirements from other legal norms and the particularities of
the setting in which it is operative) during the process of finding which requirements should
be embedded in the technology. Another way of mimicking legal normativity is to embed the
(pseudo-)legal reasoning that went into the design of an SMT into the SMT itself. For
example, we could imagine that while waiting in a passenger line at the airport, one would
not only be informed that one is being profiled by an automated passenger profiling
system but also about the (pseudo-)legal reasoning that underlies its design. Below, in
section 1.2(d), we will show how the KORA method allows for this latter solution. First,
however, we will take a look (section 1.2(c)) at “privacy by design” and “privacy by default”,
which are probably the most archetypical forms of LPbD (Klitou, 2011; 2012, pp. 264-290;
Krebs, 2013), and explore whether they preserve anything of the legal mode of operation.
(c) Assessing how the notions “privacy by design” and “privacy by default” fit in the
broader field of LPbD
The term “privacy by design” (PbDesign) was first coined in 1997 by Ann Cavoukian (2012;
Krebs, 2013), the Information & Privacy Commissioner of Ontario, Canada. “Privacy” is used in this
expression in the sense of “information privacy” (Solove, Rotenberg, & Schwartz, 2006),
which is an equivalent of what is called “data protection” within Europe and does not square
with the European understanding of privacy as a fundamental freedom derived from Art. 8
ECHR. PbDesign is a combination of the ethical and regulatory understanding of privacy: it
is “the merger of two objectives: the protection and control of personally identifiable
information and privacy, and the advancement of the commercial application of
technologies in a sustainable but competitive manner.” (Krebs, 2013, pp. 2-3) PbDesign is
based on seven foundational principles, one of which is that one should strive to apply the
principle of Privacy by Default (PbDefault), which says that the default settings of an ICT
system or business practice should be such that no more data than necessary to fulfill the
basic goal are used (data minimization). The other six principles mainly explain the
theoretical tenets underlying PbDesign, which have much in common with tenets that are
also found in “value sensitive design.” The seven foundational principles of PbDesign[45] can
be summarized as follows:
1. PbDesign tries to prevent infringements from happening: it is “proactive not
reactive; preventative not remedial”;
2. PbDesign requires that “if an individual does nothing” the default setting of
any given IT system or business practice does not use more data than strictly
necessary (data minimization);
3. PbDesign requires that “privacy becomes an essential component of the core
functionality being delivered” and not an add-on that is later added and that
diminishes its functionality;
4. PbDesign “seeks to accommodate all legitimate interests and objectives in a
positive-sum “win-win” manner”;
5. PbDesign seeks to offer “End-to-End Security”, that is “Full Lifecycle
Protection”;
6. PbDesign requires that the “component parts and operations” should be as
“visible and transparent” as possible, “to users and providers alike”;
7. PbDesign requires making the design as user-centric as possible.
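The idea behind principle 2 (Privacy by Default) can be made concrete with a small, purely illustrative sketch. The settings object and all field names below are hypothetical, not drawn from Cavoukian's text; the point is only that “if an individual does nothing”, every optional data flow stays off, and enabling one requires an explicit act of the data subject:

```python
from dataclasses import dataclass

@dataclass
class PassengerDataSettings:
    # Strictly necessary for the basic goal (e.g., a boarding-pass check):
    collect_booking_reference: bool = True
    # Everything beyond the basic goal defaults to False (data minimization):
    collect_location_history: bool = False
    collect_facial_image: bool = False
    share_with_partners: bool = False

def fields_collected(settings: PassengerDataSettings) -> list[str]:
    """Return the names of data categories that will actually be collected."""
    return [name for name, enabled in vars(settings).items() if enabled]

# A passenger who "does nothing" is processed with the minimal data set:
default = PassengerDataSettings()
assert fields_collected(default) == ["collect_booking_reference"]

# Opting in requires an explicit action by the individual:
opted_in = PassengerDataSettings(collect_facial_image=True)
```

Note that the burden of action is reversed compared to an opt-out design: the privacy-friendly state is the one that obtains without any intervention.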
[45] http://www.privacybydesign.ca/index.php/about-pbd/7-foundational-principles/
Of the seven principles it is principle 4 that seems most in line with a legal way of balancing
interests: not as a trade-off between different rights and interests, but in a constructive
manner wherein as much as possible of all these rights, aims and interests are respected. As
we will show in chapter 2, principle 4 of Cavoukian’s PbDesign is very much in line with the
so-called “strong proportionality test” that the ECJ and ECtHR use to construct a composition
of all the different interests and rights at stake in which they “are all preserved in an optimal
way.” (De Vries, Bellanova, De Hert, & Gutwirth, 2011, p. 21; Gutwirth et al., 2011) However,
principle 4 places no specific value on preserving the reasoning leading up to a certain
win-win design[46] in a legible and transparent form embodied within this design. The other
six foundational principles seem to preserve even less of the legal mode of reasoning. For
example, one could imagine situations in which a fundamental rights interpretation would
not lead to a requirement of data minimization by default, e.g., because the sparse use of
information leads to discrimination based on crude stereotypes, because the security of the
SMT depends on abundant information, because the abundance of information provides a
certain protective opacity for the data subjects or because the creation of a data vault
allowing data subjects to see which data have been gathered and used serves the principles
of data protection better than data minimization. Sticking to principle 2 (data minimization
by default) in such a situation could be grounded in ethical or regulative modes of
reasoning, but seems not very compatible with a legal mode of reasoning.
(d) Assessing how the KORA method fits in the broader field of LPbD.
LPbD can regard both broad and open legal norms (for example, the fundamental right to
respect for private and family life, as protected in Article 8 of the European Convention on
Human Rights) and more specific and detailed legal rules (for example, the specific rule with
regard to data quality found in art. 6(b) of DPD 95/46/EC, which states that “personal data
must be collected for specified, explicit and legitimate purposes and not further processed in
a way incompatible with those purposes”).
[46] Cavoukian (2012, p. 3) writes with regard to such win-win solutions: “The notion that privacy requirements must be traded off against others (e.g. security vs. privacy or performance vs. privacy) is discarded as a dated formulation from the past. Innovative privacy solutions must prevail. Among those who have answered the call are the Ontario Lottery and Gaming Corporation, who have used privacy-protective facial recognition technology to ensure that self-excluded gamblers are kept off-site without compromising the privacy of other patrons. The Toronto Transit Commission have developed an approach to video surveillance that is both comprehensive and privacy protective.”
KORA, by adhering to a method of concretization of requirements, is less suited to
extract technical requirements from very specific legal rules and works best with broad
fundamental rights.
“KORA is based upon the most permanent legal norms, which – through their
fundamental and technology neutral nature – provide the framework for all future
societal developments.” (D9.2, p. 23; see also p. 16 of D9.2, on the “planning
reliability” following from stable basic rights, in contrast to quickly changing “acts of
parliament and regulations”)
In a way this might sound surprising: after all, is it not easier to implement a requirement
following from a detailed legal rule, than from a liberty like privacy, “which resists definition
ex ante”? (Hildebrandt & Tielemans, 2013, p. 517) By following a very strict guideline about
how to concretize a fundamental right (first into a “legal requirement”, then into “legal
criteria”, subsequently into “technical objectives” and finally into “technical design
proposals”; see D9.2. p. 26) and by preserving every step that is made, the KORA method
mimics legal reasoning: it has a concern for the “pedigree” leading from a fundamental
right to a design proposal and uses a systematic-fractal way of interpreting a legal norm that
allows for a contestability and transparency with a distinct legal flavor. It would be
interesting to explore in more detail how KORA would deal with balancing several rights and
interests. Because the principle of proportionality is “an implicit part of the KORA method”
(D9.2, p. 65), allowing for the concretization of qualified rights (requiring an “internal”
proportionality test with regard to whether an infringement is justified), it seems very well
possible to extend the KORA method to explore the extent to which several norms can be
concretized together, without a trade-off between them (in German legal doctrine this is
often called “practical concordance” or “praktische Konkordanz”, Marauhn & Ruppel, 2008).
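The idea of preserving every concretization step can be sketched in a few lines of illustrative code. The chain type, its fields, and the example contents below are our own hypothetical constructions (not an official KORA tool); they only show how each step can record its parent so that the “pedigree” from fundamental right to design proposal stays legible and contestable:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConcretizationStep:
    level: str                                    # e.g. "legal requirement", "technical objective"
    content: str
    parent: Optional["ConcretizationStep"] = None  # the step this one was derived from

    def pedigree(self) -> list[str]:
        """Walk back to the fundamental right this step derives from, root first."""
        chain, step = [], self
        while step is not None:
            chain.append(f"{step.level}: {step.content}")
            step = step.parent
        return list(reversed(chain))

# Hypothetical example chain for Article 8 ECHR and a smart CCTV system:
right = ConcretizationStep("fundamental right", "respect for private life (Art. 8 ECHR)")
req = ConcretizationStep("legal requirement", "surveillance must be foreseeable", right)
crit = ConcretizationStep("legal criterion", "data subjects are informed of profiling", req)
obj = ConcretizationStep("technical objective", "display profiling notice at camera sites", crit)

# The full reasoning trail remains available for inspection or contestation:
for line in obj.pedigree():
    print(line)
```

Unlike a black-boxed design decision, each technical objective here can be traced back and challenged at any intermediate level of the concretization.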
In order to see how LPbD, and KORA in particular, can be of help for performing FRIAs and in
establishing design requirements, we will now explore (chapter 2) the specifics of legal
reasoning with regard to SMTs within the framework of EU and CoE fundamental rights.
Chapter 2.
The legal frameworks of fundamental rights of the Council of Europe
and the EU. Assessing the legal compatibility of SMTs with European
fundamental rights (examples: Smart CCTV and Passenger Profiling)
and inferring LPbD implications from them (example: Smart CCTV).
In chapter 1 we explored the theoretical issues with regard to (i) the relation between an
individual legal norm and the legal framework to which it belongs, and (ii) Legal Protection
by Design (LPbD) as an overarching concept that includes the KORA method. We argued with
regard to the first issue that the interpretation of legal norms, especially those expressing
fundamental rights and freedoms, cannot take a norm in isolation but requires a
systematic-fractal way of reasoning, that is, a legal way of reasoning which assumes that
there is a legal totality or legal system and which goes back and forth between an individual
case and other cases, norms, rules, interests, principles, etc., that are related to each other
through a particular legal framework. When fundamental rights are at stake, the systematic-
fractal way of legal reasoning is especially important to deal with conflicting or qualified
rights that require balancing (proportionality test). Secondly, when we looked at Legal
Protection by Design (LPbD) and Fundamental Right Impact Assessments (FRIAs) we noted
that the existing forms of LPbD and FRIAs are not very well suited for the preservation of the
systematic-fractal way in which law operates, and consequently also have difficulties in
preserving something of the legal operation of balancing fundamental rights with other
interests and rights. However, we also concluded that it is not impossible to preserve
something of the systematic-fractal way of legal reasoning in FRIAs and LPbD, and that the
KORA method (despite the fact that, in its basic form, it focuses on the concretization of
merely one legal norm) might be very well suited to do this.
The theoretical first chapter is now followed by a practical one (chapter 2), in which
the theoretical explorations are applied. In this applied, practically oriented second half of
the deliverable we give an analysis of the relevant European legal frameworks with regard to
the freedoms that seem to be at stake in the case of SMTs (see the inductively generated list
of seven types of freedom infringements identified in D4.2).
In the first part of chapter 2 we introduce the structure of this chapter and present
some general observations with regard to the legal frameworks of the EU. In the second part
of chapter 2 we look at the most relevant fundamental rights and freedoms protected by the
ECHR, Convention 108, the CFREU, and several secondary EU instruments with regard to
data protection, anti-discrimination and freedom of movement. We pay specific attention to
(1) the proportionality tests used to establish whether there are limitations on the
protection of these fundamental rights and freedoms, and (2) whether there are certain
obvious convergences or conflicts with other rights or legal instruments. Based on these
considerations we assess the compatibility of SMTs with these fundamental rights and
freedoms. To illustrate our compatibility analysis we pay specific detailed attention to the
legal compatibility of two particular SMTs: smart CCTV and passenger profiling. Finally, we
explore with regard to smart CCTV which legal design implications could be drawn from the
fundamental right frameworks of the EU and the CoE.
2.1. Some general observations
2.1.1 The relation between D4.2 and D9.7
Based on the overview of SMTs operational at airports and mass transportation sites (D2.2),
deliverable 4.2 engaged with a set of freedoms that may be affected by the applications of
smart CCTV, human security officers, body scanners, liquid detectors, biometrics, RFID,
border control systems and passenger profiling systems. One could easily imagine situations
in which the use of the aforementioned SMTs results in, for example, an invasion of one’s
privacy, a restriction on freedom of movement, or an infringement of bodily integrity. In
D4.2 these invasions, restrictions, and infringements have been further elaborated and
consequently inductively categorized into seven types of Freedom Infringements (FIs): (1)
bodily integrity, (2) equal treatment and non-discrimination, (3) freedom of movement, (4)
freedom from unlawful detention, (5) presumption of innocence, (6) fair trial and due
process, and (7) privacy and data protection.
As explained in D4.2, a better understanding of what is at stake when a freedom is
violated or infringed upon requires that one engages with international and European
fundamental rights law. We reiterate that fundamental rights law has a recent but
impressive history, which is constitutive for constitutional democracy and the rule of law as
we know them today. Especially in relation to security, international fundamental rights law
can help to prevent the kind of trade-offs that are often deemed inevitable in times of
emergency. The possibility to contest governmental measures by invoking fundamental
rights conventions in a court of law has created a particular understanding of fundamental
rights and freedoms in which proportionality and fair balancing (see section 2.3, below) play
a crucial role.
This deliverable (D9.7), in which we map the inductively generated typology of
freedoms presented in D4.2 against the fundamental rights legal frameworks of the EU and
the Council of Europe (see above, figure 2), is an important supplement to D4.2. Without
D9.7 the importance of proportionality and fair balancing with regard to FIs would be too
easily overlooked.
The legal instruments that are looked at in D9.7 are the European Convention on
Human Rights (ECHR) of 1950, Convention 108 of the Council of Europe on the automatic
processing of personal data of 1981, the Charter of Fundamental Rights of the European
Union (CFREU) of 2000 and various
legal instruments of the EU concerning data protection, non-discrimination and freedom of
movement.
2.1.2 Fundamental rights – some introductory remarks
Fundamental rights have developed along three axes. The first generation of fundamental
rights concerns civil and political rights and freedoms such as the right to life, liberty, security
and property, freedom of speech, assembly and association, freedom of religion, freedom
from discrimination, and freedom to vote. These freedoms encompass bodily integrity,
privacy, freedom from torture and inhuman and degrading treatment. For a long time they
were understood as negative rights, meaning that they prohibited governments from
interfering with the exercise of these rights. The second generation of fundamental rights
concerns social and economic rights such as the right to housing, employment, healthcare
and social security. The right to education, for instance, sits on the cusp between first and
second generation rights: governments should not unduly interfere with education but, like
most social and economic rights, the right to education can only be exercised if
governments invest in a public education system for those who could otherwise not afford
to educate their children. The third generation of fundamental rights concerns rights such as
the right to sustainable development, the right to self-determination of a people, and
other cultural and group rights, including the right to development.
It is important to note that many legal scholars have acknowledged that even the
first generation of fundamental rights and freedoms often require a positive intervention of
the state, for instance when guaranteeing the horizontal effects (see below, section 2.2) of
these rights.
2.1.3 Smart CCTV and Passenger Profiling – two examples that are
looked at in more detail
Taking into account that assessing the fundamental right compatibility of SMTs is an
inherently situated exercise, we look at two SMTs in more detail to exemplify the
functioning of the European fundamental rights frameworks. These two SMTs are smart
CCTV and passenger profiling. Moreover, smart CCTV is also used as an exemplary case to
show how the European fundamental rights frameworks can be used as a source from which
LPbD design requirements can be drawn.
For more detail about smart CCTV and passenger profiling we refer to D2.2, in which
they are discussed extensively. The reason why we chose to look at these two particular
SMTs is that both will often rely on machine learning, and more specifically on the subfield
of automated-algorithmic profiling involving human data subjects, which is currently a hot
societal and political topic (Article 29 Data Protection Working Party, 2013, 13 May;
Council of Europe, 2010; B. Custers, Zarsky, Schermer, & Calders, 2012; B. H. M. Custers,
2004; de Goede, 2012; Ferraris, Bosco, Cafiero, D’Angelo, & Suloyeva, 2013; González Fuster,
Gutwirth, & Ellyne, 2010), and which also has the special interest of the authors of this
deliverable (De Vries, 2013; Gellert, et al., 2012; Mireille Hildebrandt, 2008a, 2008c;
Hildebrandt & De Vries, 2013). The Article 29 Working Party, drawing on the CoE
Recommendation regarding profiling (Council of Europe, 2010), has recently given the
following definition of “profiling”:
“‘Profiling’ means any form of automated processing of personal data, intended to
analyse or predict the personality or certain personal aspects relating to a natural
person, in particular the analysis and prediction of the person’s health, economic
situation, performance at work, personal preferences or interests, reliability or
behaviour, location or movements.” (Article 29 Data Protection Working Party, 2013,
13 May, pp. 2-3)
While computerized databases and the automated processing of data regarding natural
persons are practices that have been sources of legal and ethical concern for at least half a
century (Westin, 1967), automated profiling practices have recently posed new challenges.
The concern here is not just the data collection, storage, processing or retrieval as such, but
that computational-algorithmic operations make it possible to squeeze additional
knowledge out of these data (De Vries, 2013), that is, as the CoE stated above, “to analyse or
predict the personality or certain personal aspects relating to a natural person.”
The computational operations that allow for automated data analysis and prediction
are known under a variety of names such as machine learning, data mining, profiling, and
knowledge discovery in databases. Data processing is no longer just about storing and
organizing data, but about attributing meaning to these data: a task that used to be reserved
for humans. The extent to which the attribution of meaning is delegated to automated
computational operations performed by a machine varies widely. Decisions based on machine
learning are hardly ever fully automated[47] – a passenger profiling system that flags a
passenger as a possible drug trafficker or a smart camera that classifies a suitcase as a
possibly dangerous object will often function as a decision support system to a human
security officer. It is the extent and intensity of the support that varies widely. When
acquiring or designing an SMT, the extent to which the attribution of meaning is supported
by, or even delegated to, automated operations is one particularly important aspect[48] in
assessing its compatibility with European fundamental rights.
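The decision-support arrangement described above can be sketched in a few lines of purely illustrative code (all names, scores, and thresholds are hypothetical): an automated profiling component rarely decides on its own, but flags cases for a human security officer, so the final attribution of meaning stays with a person.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Flag:
    passenger_id: str
    risk_score: float  # produced by some automated model (not shown here)
    reason: str

def automated_screen(passenger_id: str, risk_score: float) -> Optional[Flag]:
    """Flag passengers above an (assumed) threshold for human review."""
    if risk_score >= 0.8:
        return Flag(passenger_id, risk_score, "score above review threshold")
    return None

def human_review(flag: Flag, officer_confirms: bool) -> str:
    """The legally relevant decision is taken by the officer, not the model."""
    return "secondary inspection" if officer_confirms else "cleared"

flag = automated_screen("P123", 0.91)
assert flag is not None
# The system only supports the decision; the officer may overrule the flag:
assert human_review(flag, officer_confirms=False) == "cleared"
```

The extent of delegation varies along exactly this seam: the more the flagging threshold and its follow-up are trusted without review, the closer the system moves to a fully automated decision.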
The various ways in which the attribution of meaning is supported by automated,
computational operations allows for a crude classification of three of the SMTs described in
D2.2: human security officers, passenger profiling systems and smart CCTV. Looking at these
three SMTs from the perspective of varying modalities in “profiling”, one can see a gradual
continuity between the SMT “human security officers” (who practice a kind of “natural”
profiling) on the one hand, and “passenger profiling systems” (which consist of a mix of
human security officers, data bases, 1-to-1 identification on No Fly lists, Passenger Name
[47] Thus, the scope of Article 15(1) of DPD 95/46/EC on “Automated individual decisions” is relatively limited, because it only concerns decisions that are based solely on the automated processing of data: “Member States shall grant the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”
[48] Other aspects that are of particular importance when assessing the compatibility of an SMT with European fundamental rights are (1) the aims and effects of the SMT, and (2) the kind of data which are processed by the SMT. So, for example, one cannot make big sweeping statements about the general fundamental rights impact of a smart CCTV system in an airport without looking into its aims and effects (e.g., is the aim to identify dangerous passengers and prevent them from boarding a plane, or is it a system targeting smaller security threats such as pickpockets or wrongly parked cars?), and the particular data it deals with (e.g., does it recognize objects, particular activities, human characteristics such as gender, age or ethnicity, or faces of particular individuals based on 1-to-1 matching with pictures of faces stored in a database?). We return to the importance of these aspects when discussing proportionality in section 2.3.
Records[49] (PNR), risk profiles[50], etc.) and “smart CCTV” (lots of automation, sometimes
automated 1-to-1 identification with data bases, sometimes inferential behavioral profiles
about certain categories of actions or people). Passenger profiling systems and Smart CCTV
systems are both variations on practices that existed before, namely the practice of human
security officers that profile people, based on their expertise and biases, and the practice of
human security officers looking at and interpreting “ordinary” CCTV footage. Part of the task of
interpreting information is delegated to such automated profiling machines that are part of
passenger profiling systems and smart CCTV systems.
Apart from the fact that the term “profiling” is useful in describing the gradual
relation between the three aforementioned SMTs, it is also useful to distinguish between
different varieties of a particular SMT like “passenger profiling” or “smart CCTV”. Within the
SMTs “passenger profiling” and “smart CCTV” one could distinguish four “ideal types”[51] of
profiling modalities (a real SMT will most likely be a mix of several of these
modalities):
1. Unstructured data & human expertise. Data (passenger data and/or CCTV
footage) are available and possibly stored within an ICT system, but there is no
layer of computational machine intelligence added: that is, the interpretation and
analysis of the data is in no sense automated and relies completely on human
expertise. For example, a human security officer looking at CCTV cameras thinks
that a passenger behaves suspiciously or considers the combination of data on an
entry visa to be in need of further investigation.
[49] “PNR data is unverified information provided by passengers, and collected by and held in the carriers’ reservation and departure control systems for their own commercial purposes. It contains several different types of information, such as travel dates, travel itinerary, ticket information, contact details, the travel agent at which the flight was booked, means of payment used, seat number and baggage information.” (Proposal for a Directive of the European Parliament and the Council on the use of Passenger Name Record data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime. Brussels, 2 February 2011, p. 3)
[50] See in particular: Proposal for a Directive of the European Parliament and the Council on the use of Passenger Name Record data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime. Brussels, 2 February 2011, COM(2011) 32 final(2011), and Proposal for a Directive of the Council and the European Parliament on the use of Passenger Name Record data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime. Brussels, 23 April 2012, nr. 8916/12.
[51] An “ideal type” is an analytical construct that is used to grasp a much more complex and diversified reality: “[A]n ideal-type […] is not a description of reality but […] is formed by the one-sided accentuation of one or more points of view and by the synthesis of a great many diffuse, discrete, more or less present and occasionally absent concrete individual phenomena, which are arranged according to those one-sidedly emphasized viewpoints into a unified analytical construct (Gedankenbild). In its conceptual purity, this mental construct (Gedankenbild) cannot be found empirically anywhere in reality. It is a utopia.” (Weber, 1949, p. 90)
2. Structured, searchable data & human expertise. Data (passenger data and/or
CCTV footage) are available and stored within an ICT system and are organized in
such a way that they are easily searchable with a search query. However, the
interpretation and analysis of the data remain entirely dependent on human expertise.
For example, a passenger database that is searchable based on criteria such as
date of entry or nationality, or a CCTV footage database that is searchable based
on the meta-data of the footage (location and time of the recording).
3. Automated analysis of the data based on explicit, top-down programmed rules.
Data (passenger data and/or CCTV footage) are available and stored within an ICT
system that is capable of analyzing them through the algorithmic application of
rules that have been programmed in a top-down manner. This requires that the
programmer has domain specific knowledge. So, for example, there might be
experts who argue that “if a passenger travels alone, and orders a special meal,
and does not have a return ticket, then this passenger needs to be flagged as a
possibly dangerous traveler.” Such a rule can readily be translated into a
computer-executable rule. What is delegated to the ICT system is not the creation
of a rule but its execution. Similarly, one could equip a CCTV camera with
algorithmically executable rules as to what defines a woman, and what defines a
man; what defines nervous behavior and what defines “normal” behavior; etc.
4. Automated analysis of the data based on machine learning algorithms: rules that
have been generated (partly) in an automated inductive, bottom-up way. Creating
a top-down definition of what defines a certain category (“suspicious behavior”,
“wrongly parked car”, “female”, “dangerous situation”, etc.) is far from easy.
Even though human security officers might possess the skills to distinguish different
categories in practice, articulating their knowledge as explicit rules (i.e., a model
that would allow a machine to recognize and classify data into different
categories) can often be challenging. This is where the field of machine learning is
of great use: instead of providing an ICT system with explicit rules with regard to
particular categories, one gives it more general rules with regard to how to create
a model of a particular category. To put it differently, the machine is provided
with rules about how to infer, or “learn”, a model from data it has been provided
with. Sometimes one attempts to teach a machine a data model by giving it
labeled examples (“Here are 5000 examples of passenger profiles of passengers
that have been arrested for smuggling drugs onto a plane and 5000 examples of
passenger profiles that did not smuggle drugs – according to which model can we
separate these two groups best?”), by correcting incorrect categorizations (“You
incorrectly classified this passenger profile as belonging to a drug smuggler!”) and
giving positive reinforcement for correct classifications (“You correctly classified
this passenger profile as belonging to a drug smuggler!”), or by giving it unlabeled
data and very general instructions about how to recognize similarity (“Here are
5000 examples of passenger profiles – please categorize them into four groups in
such a way that the distance within these four groups is minimized and the
distance between them is maximized”). However, teaching (and learning) through
(un)labeled examples and feedback is not an exact science. One can compare this
to the human process of learning: it is well known that a rule inferred by a human
pupil might be quite different from the one intended by the teacher. A toddler
whose parents have spent several days pointing out instances of dogs can
surprise the parents by pointing to a visiting neighbor and triumphantly saying: ‘Dog!’
What is the rule inferred by the child? And by which standards do we judge if the
child is mistaken? Maybe the child was very perceptive and noticed a dog-like
feature of the neighbor which was overlooked by the parents? Finding standards
to judge the “correctness” or “success” of a data model produced through
“machine learning” is equally difficult. As we will see in sections 2.4 and 2.5, this
difficulty is of importance when assessing the proportionality of a fundamental
rights infringement by an SMT.
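The contrast between level 3 (an executed rule) and level 4 (a learned rule) can be sketched in a few lines of illustrative Python. All feature names, profiles and numbers below are hypothetical, and the minimal nearest-centroid learner merely stands in for the far richer machine learning techniques discussed above; this is a sketch, not a description of any real screening system.

```python
# Illustrative sketch only: features, thresholds and toy data are hypothetical.

# Level 3: a top-down, expert-authored rule; the machine merely *executes* it.
def flag_by_rule(profile):
    """The expert rule quoted in the text: alone + special meal + no return ticket."""
    return (profile["travels_alone"]
            and profile["special_meal"]
            and not profile["return_ticket"])

# Level 4: a "rule" *inferred* from labeled examples.
def learn_centroids(examples):
    """examples: list of (feature_vector, label); returns label -> mean vector."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in vec] for lab, vec in sums.items()}

def classify(centroids, features):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

# Toy labeled data; feature order: [travels_alone, special_meal, return_ticket].
training = [
    ([1, 1, 0], "flag"), ([1, 1, 0], "flag"), ([1, 0, 0], "flag"),
    ([0, 0, 1], "ok"),   ([0, 1, 1], "ok"),   ([1, 0, 1], "ok"),
]
centroids = learn_centroids(training)

print(flag_by_rule({"travels_alone": True, "special_meal": True,
                    "return_ticket": False}))   # prints: True
print(classify(centroids, [1, 1, 0]))           # prints: flag
```

The legal relevance of the contrast is visible in the code itself: the level 3 rule is readable and contestable on its face, whereas the level 4 “rule” exists only implicitly, as centroids derived from whatever training data happened to be supplied.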
Our analysis of the compatibility of smart CCTV and passenger profiling systems (sections
2.4 and 2.5) is best read with the four aforementioned levels of automation in the
attribution of meaning kept in mind.
2.2 Differences in legal effect of (i) the European Convention on Human
Rights, (ii) Convention 108, (iii) the EU Charter of Fundamental Rights
and (iv) EU Directives, Framework Decisions and Regulations
2.2.1 Council of Europe (i): European Convention on Human Rights (ECHR)
The fundamental rights and freedoms protected by the European Convention on Human
Rights (ECHR) regulate the relation between State and citizen (vertical effect) in the 47
Council of Europe Member States. Although fundamental rights primarily play out in the
vertical relationship between governments and their subjects, various human rights can also
easily be infringed upon in horizontal relationships, for example by other big players, such as
large companies that are capable of infringing the privacy of their employees or of
discriminating against persons on the basis of protected grounds (e.g. ethnic background or gender).
In order to understand how the ECHR can also have a “horizontal effect” in such horizontal
relationships between the subjects of a State, we need to take a look at the notions
“negative” and “positive” obligations. While the classical understanding of fundamental
rights is that they impose an obligation of abstention from interference on a State (a
“negative obligation”), the ECtHR has made it clear that positive obligations (an
obligation on the State to actively do something) can also follow from fundamental rights. By
now, we can safely assume that governments can be held accountable for failing to fulfill
their positive duties when they have failed to enact national legislation that prohibits
infringements on ECHR fundamental freedoms. Governments can comply with their positive
obligations in various ways. After all, while not-doing (a negative obligation) is a relatively
unequivocal injunction, the injunction to do something (a positive obligation) can take many
forms. In Cossey v. United Kingdom52 the Court states with regard to Art. 8 ECHR (respect for
private and family life):
“…the notion of ‘respect’ is not clear-cut, especially as far as the positive obligations
inherent in that concept are concerned: having regard to the diversity of the
52. ECtHR, Cossey v. United Kingdom, 27 September 1990, no. 10843/84, § 37.
practices followed and the situations obtaining in the Contracting States, the notion’s
requirements will vary considerably from case to case.” (§ 37)
A positive obligation can entail that a State has to take measures to effectively investigate
alleged fundamental right infringements53 or to guarantee the effective enjoyment of a
right.54 However, currently the most common positive obligation imposed on a State
(Roagna, 2012, p. 60-77) is an obligation that involves doing something with legislation: for
example, the obligation to adopt legislation that will enable the individual to do something
or to clarify existing legislation. Von Hannover v. Germany provides an example of a positive
obligation to clarify existing legislation. In this case the Court was
faced with the question whether the publication of pictures of Princess Caroline of Monaco
in some German magazines infringed on the right to respect for private and family life (Art. 8
ECHR). The Court decided that the German laws made use of an unclear distinction between
figures of contemporary society “par excellence” and “relatively” public figures (§ 73) and
that these laws did not specify sufficiently when and where public figures could expect to be
“in a protected sphere or, on the contrary, in a sphere in which they must expect
interference from others, especially the tabloid press.” (§ 72) Thus, the lack of respect for
private life stemmed from the fact that the ambiguously formulated German laws failed to
sufficiently protect the freedom protected in Art. 8 ECHR. What is important about the
introduction of positive obligations is that it gives fundamental rights an indirect horizontal
effect: an infringement occurring between individuals can be requalified as an infringement
that is ultimately originating from the State. Thus, a citizen whose fundamental ECHR rights
are infringed upon by a private actor (for example, a public transportation company or a
carrier) can turn to the ECtHR to oblige a State to oblige this private actor to act in
accordance with the ECHR (indirect horizontal effect).
53. For example, ECtHR, Kaya v. Turkey, 19 February 1998, no. 22729/93.
54. For example, one of the positive obligations following from the right to assembly (Art. 11 ECHR) is the obligation to provide protection to ensure peaceful conduct at demonstrations. ECtHR, Oya Ataman v. Turkey, 5 December 2006, no. 74552/01.
2.2.2 Council of Europe (ii): Convention 108 on Data Protection
Convention nr. 108 for data protection55 is an international treaty adopted by the Council of
Europe in 1981, which is currently in the process of being modernized56. It applies to all forms
of automated data processing and its scope covers processing by both private and public
bodies. One of the reasons why Convention 108 is an important document is that it is
the first legally binding instrument regarding data protection and has had an enormous
impact on the content and structure of data protection instruments which were developed
later. Its influence is particularly noticeable in EU Data Protection Directive 95/46/EC (DPD).
The legal concepts and standards in the DPD bear such similarity to those articulated in
Convention 108 that one could even speak of a European style of data protection (Greenleaf,
2012) which finds its origin in this international treaty. Another reason why Convention 108
is of great importance is that it is a legal instrument of a truly international character that
allows any State (including those that are not Members of the Council of Europe) to accede to
the treaty. Signatory states57 are required to take the necessary steps to apply its principles
in their domestic legislation. However, because Convention 108 only establishes “bilateral,
intra-state cooperation and not the establishment of an international organization or
committee authorized to deal with the relevant issues” (de Hert & Papakonstantinou, 2013,
p. 278) it is also a bit of a toothless document. When a signatory State infringes on
Convention 108 there is, in sharp contrast to the European Convention on Human Rights
(ECHR) discussed above, no court one can turn to. There is no doubt that Convention 108
and its modernized successor, which is currently under construction, will continue to be an
important global standard in Data Protection and will give guidance to many other legislative
instruments regarding data protection. Taking into consideration that the Convention lacks
concrete means of legal enforcement, and also taking into account that there is considerable
overlap between the Convention and the DPD in terms of content, we decided not to include
its principles in the SMT compatibility analysis in the second half of this chapter. While
Convention 108 and its successor can provide interesting pointers with regard to the future
55. Council of Europe, Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, nr. 108, Strasbourg, 28 January 1981.
56. http://www.coe.int/t/dghl/standardsetting/dataprotection/modernisation_en.asp
57. At the moment (August 2013) there are 47 signatory States: http://www.conventions.coe.int/Treaty/Commun/ChercheSig.asp?NT=108&CM=2&DF=&CL=ENG
direction of data protection on an international level, to the user of the SIAM tool they are
of limited use.
2.2.3 EU (i): EU Charter of Fundamental Rights (CFREU)
Primary EU legislation mainly consists of the founding Treaties of the EU and can thus
be understood as the “constitutional” law of the EU. It establishes the powers and
responsibilities of the various EU bodies and the fundamental principles and rules according
to which the EU bodies operate. When the Charter of Fundamental Rights of the European
Union58 (CFREU) was first published in the Official Journal of the EU in 2000, its legal status
was unclear. However, the Treaty of Lisbon59 made it clear that the CFREU, like the Treaty
itself, is primary EU law. The CFREU became legally binding, primary EU law on 1 December
2009 when it entered into force as part of the Treaty of Lisbon.60
“The Union recognises the rights, freedoms and principles set out in the Charter of
Fundamental Rights of the European Union of 7 December 2000, as adapted at
Strasbourg, on 12 December 2007, which shall have the same legal value as the
Treaties.” (Art. 6(1) of the Lisbon Treaty)
The Charter applies to EU institutions and to all legislation or other legal acts and actions
performed within EU jurisdiction. This also means that EU member states only have an
obligation to comply with the CFREU when they implement EU law. The European Court of
Justice enforces the Charter in relation to European Union law. The legal effect of the CFREU
is thus a horizontal effect in the sense that it affects all the acts of EU bodies. It does not
have a vertical effect (the relation between State and citizen) or a horizontal effect between
citizens. Its application is therefore limited in comparison to the ECHR, which has a vertical effect
(the relation between State and citizen), and sometimes also an indirect horizontal effect on
the relations between citizens (by imposing positive obligations on States to regulate those
relations), which are both enforced by the ECtHR in Strasbourg. When a State infringes
58. Charter of Fundamental Rights of the European Union of 7 December 2000, Official Journal of the European Communities OJ C 364, 18 December 2000, p. 1-22.
59. Treaty of Lisbon Amending the Treaty on European Union and the Treaty Establishing the European Community, 13 December 2007, Official Journal C 306, 2007, p. 1-229.
60. Treaty of Lisbon Amending the Treaty on European Union and the Treaty Establishing the European Community, 13 December 2007, Official Journal C 306, 2007, p. 1-229.
directly, or indirectly (by allowing a private actor to infringe), on a fundamental right
protected by the ECHR, the citizen whose right is infringed can turn to the ECtHR in
Strasbourg. In contrast, when it concerns a fundamental right protected by the CFREU, there
is no equivalent route to have one’s rights protected and enforced: a citizen cannot directly file a
complaint at the European Court of Justice (ECJ) in Luxembourg.
Nevertheless, this does not mean that the influence of the Charter is limited to the
actions of EU bodies. When one realizes that every action of every EU body has to comply
with the fundamental rights standards set in the CFREU, it is not surprising that the CFREU
has a very noticeable impact on daily life within the EU. Or, to put it more legally: because
the Charter is primary EU law, it acts as a source of secondary EU law which affects every
citizen, private and public body within the EU. Some of the rights enshrined in the CFREU,
such as the right to protection of personal data (Art. 8 CFREU), the right to non-
discrimination (Art. 21(1) CFREU), and the freedom of movement and residence (Art. 45
CFREU), act as legal sources for secondary legislation (see below) in those fields.
Secondly, there is also a more subtle and indirect influence exercised by the Charter.
This effect is, for example, reflected in the annual assessment of the effect of the Charter61
which shows that national judges in the EU member states tend to use it as “an additional
argument or as confirmation of existing (Union) law, or as a 'source of inspiration'”62. One of
the reasons why it is relatively easy for national judges to use the Charter as an additional
source of inspiration, is its similarity to the ECHR. In fact, Art. 52(3) of the CFREU states that
when rights from the CFREU overlap with those of the ECHR, they should be interpreted
according to the interpretation given by the ECtHR to the corresponding rights in the ECHR:
“Insofar as this Charter contains rights which correspond to rights guaranteed by the
Convention for the Protection of Human Rights and Fundamental Freedoms, the
meaning and scope of those rights shall be the same as those laid down by the said
Convention. This provision shall not prevent Union law providing more extensive
protection.” (Art. 52(3) CFREU)
61. http://www.aca-europe.eu/en/colloquiums/colloq_en_23.html
62. General ACA report, p. 7: http://www.aca-europe.eu/seminars/DenHaag2011/Gen_Report_en.pdf
The relation of the CFREU to secondary EU law and the correspondence of certain
CFREU rights with ECHR equivalents make the discussion of CFREU rights very dependent
on these two legal sources. In the second half of chapter two, when discussing the rights of
the CFREU that are relevant for SMTs, we often found that we could discuss a CFREU right in
sufficient detail by merely referring to our analysis of the corresponding ECHR right or the
corresponding secondary legislation. We only present an autonomous analysis of CFREU
rights when a corresponding ECHR right and/or secondary EU law is lacking.
2.2.4 EU (ii): EU Directives, Regulations and Framework Decisions (secondary
EU legislation)
EU Directives and Regulations are two particular forms of secondary EU legislation.
Secondary EU legislation is guided by and based on primary EU legislation. An example of
primary EU legislation is, as we explained above, the CFREU. Framework Decisions are
secondary legislative instruments that were produced within the former Third
Pillar of the EU (regarding police and judicial co-operation in criminal justice matters) and
that function very similarly to Directives. Since the dissolution of the EU pillar structure in
2009 no new Framework Decisions can be adopted. However, Framework Decisions
that were adopted before 2009 and that have not been repealed or superseded by more
recent legislation are still in force. Framework Decisions are similar to Directives in the
sense that both are a form of secondary EU legislation which is not self-executing, but
obliges (or, in the case of Framework Decisions: obliged) EU Member States to transpose the
EU legislation into national legislation. Given that Framework Decisions are no longer used
we will not discuss them in any further detail and focus instead on the similarities and
differences between Directives and Regulations. While Directives are in principle as binding
as Regulations, they are not self-executing and thus leave member states some room as to
the exact content and shape of the national transposition of the Directive. EU Regulations
and the domestic implementations of EU Directives are directly binding for all EU citizens
and can be directly invoked in a national court. EU Directives primarily address the EU
member states, but can acquire direct vertical effect (can be invoked by a citizen against a State)
if the state fails to implement the Directive in domestic legislation. EU Regulations and
Directives are thus legal instruments that have a very real and practical bite to them. In the
second half of this chapter we discuss the following EU Framework Decisions, Directives and
Regulations:
(a) With regard to data protection:
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995
on the protection of individuals with regard to the processing of personal data and on the
free movement of such data, Official Journal L 281, 23 November 1995, p. 31-50 (“Data
Protection Directive”).
Framework Decision 2008/977/JHA of 27 November 2008 on the protection of
personal data processed in the framework of police and judicial cooperation in criminal
matters, Official Journal L 350/60, 30 December 2008, p. 60-71 (“JHA Framework Decision of
Data protection”).
Proposal for a Regulation of the European Parliament and of the Council on the
protection of individuals with regard to the processing of personal data and on
the free movement of such data, 25 January 2012 COM(2012) 11 final (“Proposed General
Data Protection Regulation”).
Proposal for a Directive of the European parliament and of the Council on the
protection of individuals with regard to the processing of personal data by competent
authorities for the purposes of prevention, investigation, detection or prosecution of criminal
offences or the execution of criminal penalties, and the free movement of such data, 25
January 2012 COM(2012) 10 final (“Proposed Law Enforcement Data Protection Directive”).
(b) With regard to anti-discrimination:
Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment
between persons irrespective of racial or ethnic origin, Official Journal L 180, 19 July 2000, p.
22-26 (“Race Directive”).
Directive 2000/78/EC of 27 November 2000 establishing a general framework for
equal treatment in employment and occupation, Official Journal L 303, 2 December 2000, p.
16-22 (“Employment Equality Directive”).
Directive 2006/54/EC of 5 July 2006 on the implementation of the principle of equal
opportunities and equal treatment of men and women in matters of employment and
occupation (recast), Official Journal L 204, 26 July 2006, p. 23-34 (“Gender Recast Directive”).
Directive 2004/113/EC of 13 December 2004 implementing the principle of equal
treatment between men and women in the access to and supply of goods and services,
Official Journal L 373, 21 December 2004, p. 37-43 (“Gender Goods and Services Directive”).
Proposal for a Council Directive on implementing the principle of equal treatment
between persons irrespective of religion or belief, disability, age or sexual orientation, 2 July
2008, COM (2008) 426 (“Proposed Equal Treatment Directive”).
(c) With regard to freedom of movement:
Directive 2004/38/EC of the European parliament and of the Council of 29 April 2004
on the right of citizens of the Union and their family members to move and reside freely
within the territory of the Member States, Official Journal L 229, 29 June 2004, p. 35-48
2.3 Proportionality and Fair Balancing
2.3.1 Proportionality and fair balancing in the case law of the ECtHR and ECJ
“International human rights law as well as fundamental rights granted by national
constitutions employ various strategies to limit the scope of fundamental rights,
without losing their substance. Limitation is inevitable, either because they clash with
public goods that are conditional for the effectiveness of fundamental rights or
because various rights or liberties clash and must be aligned one way or another.”
(Hildebrandt, 2013a, p. 19)
When several fundamental rights are in conflict with each other or when limitations on the
scope of a fundamental right have to be established, a Court can take several approaches.
(a) The triple proportionality test of the ECtHR: limitations on a fundamental right that
are provided for by law, have a legitimate objective and are necessary in a
democratic society.
De Schutter and Tulkens (2008) list the three classical approaches followed by the European
Court of Human Rights (ECtHR): (i) a proportionality analysis that assesses whether the limitation is
“necessary in a democratic society”, (ii) a proportionality analysis balancing the two rights
against each other, or (iii) leaving the proportionality analysis to the national authorities, based on
the so-called “national margin of appreciation” doctrine which says that “State authorities”
are more in touch with local laws and morals and thus “in principle in a better position than
the international judge to give an opinion [...] on the ‘necessity’ of a ‘restriction’ [...].”63
Clearly, there are important differences between these three approaches. Which approach is
taken is often a matter of framing. For example, an airplane passenger who feels that a body
scanner procedure infringes on his or her right to respect for private life (Art. 8 ECHR), can
63. ECtHR, Handyside v. The United Kingdom, no. 5493/72, Judgment of 7 December 1976, § 48.
either question this infringement in terms of the necessity of the limitation on private life to
realize a particular public interest (e.g., security when travelling in an airplane), or as a
conflict between two rights that need to be balanced against each other: the right following
from Art. 8 (“Respect for private life”) against the right to life (Art. 2 ECHR, which can be
protected, e.g., by imposing a security screening with a body scanner on all passengers).
While one could argue that the former approach obfuscates the true conflict (De Schutter &
Tulkens, 2008), it is also true that the necessity test has been developed in more detail in the
case law of the ECtHR than the fair balancing of two rights. Similarly, there are arguments
both in favour of and against the “margin of appreciation” approach. However,
whichever of the three aforementioned approaches is taken, they all encompass some
element of “proportionality analysis.”
When talking about the use of “proportionality analysis” in a fundamental rights
context, it is a fallacy to understand it as a neutral cost-benefit analysis between several
rights or interests. As convincingly argued by Waldron (2003), and discussed by one of us in
D4.2 (p. 7-8 and: Hildebrandt, 2013a), the often invoked image of a balance weighing liberty
against security, is a flawed metaphor for proportionality analysis (see also: De Schutter &
Tulkens, 2008). One of the problems of framing proportionality in economic terms like “cost-
benefit analysis”, “trade-off” or “zero-sum game”, is the lack of a common measure for the
“‘stuff’ we are balancing.” (Hildebrandt, 2013a, p. 16) There is no common standard like
“pounds”, “Euros” or “kilograms” that can be invoked when weighing fundamental rights. It
would be a plain absurdity to state that the impact of an SMT equals 2 lbs increase in
security, 5 lbs of privacy loss and 3 lbs of growth in the amount of anti-discrimination
prevention.
“[T]he scale analogy is not really appropriate, since the interests on both sides are
incommensurate. It is more like judging whether a particular line is longer than a
particular rock is heavy.”64
Even in relative terms the weighing of fundamental rights against other interests, or various
fundamental rights against each other, suffers from the problem of incommensurability
64. Justice Antonin Scalia, concurring opinion in Bendix Autolite Corp. v. Midwesco Enterprises, Inc., et al., 486 U.S. 888 (108 S. Ct. 2218, 100 L. Ed. 2d 896), decided 17 June 1988.
(Endicott, 2012): who is to say that the security of X weighs more than the privacy of Y? And
even if we would assume that there is a common measure, it would be difficult to find truly
relevant empirical evidence with regard to this trade-off. For example, the fact that a smart
CCTV system correctly identifies 80% of the faces in a standardized facial recognition data
base, says little about how well such a system performs in a real life setting and even less
about the increase or loss in such abstract concepts as “security” or “privacy.”
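The gap between benchmark accuracy and real-life performance can be illustrated with a short base-rate calculation. The sensitivity, specificity and prevalence figures below are purely hypothetical, invented for the sake of the example:

```python
# Hypothetical arithmetic only: the numbers are invented to illustrate why
# benchmark accuracy says little about real-life performance.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a person flagged by the system is actually a 'match'."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A system that "correctly identifies 80%" of sought persons (sensitivity 0.8)
# and wrongly flags only 5% of innocent passengers (specificity 0.95),
# applied to a population in which 1 in 10,000 passengers is actually sought:
ppv = positive_predictive_value(0.8, 0.95, 0.0001)
print(round(ppv, 4))  # prints: 0.0016
```

Even with a seemingly impressive 80% detection rate, fewer than 2 in 1000 alarms would concern a sought person under these assumptions, which illustrates why a benchmark score says little about the increase in “security” an SMT actually delivers.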
Proportionality analysis, which has little to do with economic cost-benefit analyses, is
better understood as an utterly legal operation, adhering closely to the systematic-fractal
logic described in chapter 1 of this deliverable, which protects a particular political
aspiration as to how power should be distributed in constitutional democracies guided by the
rule of law. Thus, proportionality analysis will often consist of two parts. The first one is
the procedural-analytical part, in which legal (i.e., systematic-fractal) logic is used to assess
the legal grounds of a fundamental rights infringement. Is the infringement justified by the
legal system? Does it have a legal grounding that justifies it? The second part is a
substantive-creative one, in which the legal (i.e., systematic-fractal) logic is used to assess
whether the rights and interests at stake have been combined into a well-balanced
composition or reconciliation. (Gutwirth, et al., 2011, p. 27) If this is not the case, the
substantive-creative proportionality test strives to present a composition that contains
preferably no trade-offs and that preserves the different interests and rights “in an optimal
way.” (De Vries, et al., 2011, p. 21)
The European Court of Human Rights (ECtHR) uses a triple test to assess whether a
limitation of fundamental rights65 is permissible. The triple limitation test assesses whether
the limitation (1) is in accordance with a national law of good quality66, that is, a law which is
65. A basic introduction with regard to the triple test of the ECtHR can be found in: Gerards, 2011; Sottiaux, 2008. For an interesting critical re-articulation see: De Schutter & Tulkens, 2008. The relation between the triple test and privacy impact assessments is explored in: De Hert, 2012.
66. “The Court would reiterate its opinion that the phrase "in accordance with the law" does not merely refer back to domestic law but also relates to the quality of the law, requiring it to be compatible with the rule of law, which is expressly mentioned in the preamble to the Convention […]. The phrase thus implies - and this follows from the object and purpose of Article 8 (art. 8) - that there must be a measure of legal protection in domestic law against arbitrary interferences by public authorities with the rights safeguarded by paragraph 1 (art. 8-1) […]. Especially where a power of the executive is exercised in secret, the risks of arbitrariness are evident […]. Undoubtedly, as the Government rightly suggested, the requirements of the Convention, notably in regard to foreseeability, cannot be exactly the same in the special context of interception of communications for the purposes of police investigations as they are where the object of the relevant law is to place restrictions on the conduct of individuals. In particular, the requirement of foreseeability cannot mean that an individual should be enabled to foresee when the authorities are likely to intercept his communications so that he can adapt his conduct accordingly. Nevertheless, the law must be sufficiently clear in its terms to give citizens an adequate indication as to the circumstances in which and the conditions on which public authorities are empowered to resort to this secret and potentially dangerous interference with the right to respect for private life and correspondence”. (ECtHR, Malone v. the United Kingdom, Judgment of 2 August 1984, no. 8691/79, Series A no. 82, § 67)
sufficiently clear and precise in its wording to be accessible and foreseeable, and which is
compatible with the rule of law (legality), (2) pursues a legitimate aim such as, for example,
national security or safety (legitimacy), and (3) is necessary in a democratic society
(necessity).
The legality and legitimacy requirements can be understood as the procedural-
analytical part of a proportionality test, while the third step acts as its substantive-creative
part. The use of the triple proportionality test depends largely on the fundamental right(s) at
stake. The ECHR contains a few so-called “notstandfest” absolute rights, such as the
prohibition of torture (Art. 3 ECHR) and the prohibition of slavery (Art. 4(1) ECHR), to which
there can be no limitations, and which do not allow any exceptions or derogations:
“The Court has confirmed that even in the most difficult circumstances, such as the
fight against terrorism and organised crime, the Convention prohibits in absolute
terms torture and inhuman or degrading treatment or punishment, irrespective of
the conduct of the person concerned.”67
However, as Gerards (2011, p. 105-106) points out, even with such absolute rights
proportionality can nevertheless play a role, namely by defining the scope of the protection.
A right can be absolute, but when the Court states that only infringements with “a minimum
level of severity”68 fall under the protective scope of a right, proportionality considerations
can enter the stage through the backdoor.69
The majority of the rights in the ECHR are, however, not absolute. They either
contain an explicit clause containing the general limitation of the triple proportionality test
adapt his conduct accordingly. Nevertheless, the law must be sufficiently clear in its terms to give citizens an adequate indication as to the circumstances in which and the conditions on which public authorities are empowered to resort to this secret and potentially dangerous interference with the right to respect for private life and correspondence”. (ECtHR, Malone v. the United Kingdom, Judgment of 2 August 1984, no. 8691/79, Series A no. 82, §67)
67
ECtHR, Gäfgen v. Germany, no. 22978/05, Judgment of 1 June 2010, §87.
68
“The Court recalls that ill-treatment must attain a minimum level of severity if it is to fall within the scope of Article 3. The assessment of this minimum is relative: it depends on all the circumstances of the case, such as the nature and context of the treatment, its duration, its physical and mental effects and, in some instances, the sex, age and state of health of the victim”. (ECtHR, A. v. the United Kingdom, no. 100/1997/884/1096, Judgment of 23 September 1998, §20)
69
The “relativity” of the absolute protective scope of fundamental rights can also be seen in the case law of the Federal Constitutional Court of Germany. This is especially exemplified by the way in which the Court establishes the absolute essence of a fundamental right (Wesensgehalt): this is not done in an abstract manner but in relation to the particular case. (Dammann, 2011; De Hert, De Vries, & Gutwirth, 2009)
(e.g., Art. 8 [Right to respect for private and family life], Art. 9 [Freedom of thought,
conscience and religion], Art. 10 [Freedom of expression] and Art. 11 [Freedom of assembly
and association] ECHR) or specific limitations70 (e.g., Art. 5 [Right to liberty and security] and Art. 6 [Right to
fair trial]), or are considered to be limited in an implicit manner by a general principle of
limitation that applies to all fundamental rights that are not explicitly absolute (e.g., Art. 6
[Right to fair trial], Art. 5 [Right to liberty and security] and Art. 14 [Prohibition of
discrimination]). For example, with regard to Article 14 ECHR the Court argued that it would
be absurd to think of the prohibition of discrimination as an absolute right on which no
limitations are possible:
“In spite of the very general wording of the French version ("sans distinction
aucune"), Article 14 does not forbid every difference in treatment in the exercise of
the rights and freedoms recognised. […] It is important, then, to look for the criteria
which enable a determination to be made as to whether or not a given difference in
treatment, concerning of course the exercise of one of the rights and freedoms set
forth, contravenes Article 14. On this question the Court, following the principles
which may be extracted from the legal practice of a large number of democratic
States, holds that the principle of equality of treatment is violated if the distinction
has no objective and reasonable justification. The existence of such a justification
must be assessed in relation to the aim and effects of the measure under
consideration, regard being had to the principles which normally prevail in
democratic societies. A difference of treatment in the exercise of a right laid down in
the Convention must not only pursue a legitimate aim: Article 14 is likewise violated
when it is clearly established that there is no reasonable relationship of
proportionality between the means employed and the aim sought to be realised.”71
(italics ours)
70
For example, the right to liberty and security (Art. 5) is limited by several specific habeas corpus limitations: the deprivation or restriction of one’s liberty can be justified only when one is lawfully detained, which mainly implies an arrest based on reasonable suspicion and that one is brought before a competent court. Similarly, there can be specific circumstances (notably those of Art. 15) in which the right to fair trial is limited (Art. 6). Contrary to Art. 5 the limitations of Art. 6 are not explicitly named, but are so-called implied limitations. See e.g. ECtHR, Golder v. the United Kingdom, no. 4451/70, Judgment of 21 February 1975, §38.
71
ECtHR, Certain aspects of the laws on the use of languages in education in Belgium, no. 1474/62, Judgment of 23 July 1968, §I.B.10.
The triple proportionality test is formulated most explicitly in Art. 8 (Right to respect for
private and family life). This, however, does not mean that it is any more (or less) subject to
limitations than other fundamental rights in the ECHR. While the exact manner in which the
triple proportionality test is applied differs slightly per right, this is a difference that is not so
much found in the wording of an Article but that is mainly developed in and through case
law and that is subject to change over time. For example, the majority of the case law of the
ECtHR on private life (Art. 8) has to do with the legality of the interference: the Court
tends to test the legitimacy requirement in a very marginal way (virtually any, more or less,
reasonable aim can be a legitimate aim) and has until recently72 been quite reluctant with
regard to applying the necessity test in a strict manner. Especially when compared to how
the “necessity requirement” has been applied in relation to freedom of expression (Art. 10
ECHR), its use with regard to Art. 8 has been rather frugal. (cf. De Hert, 2005) One of the
reasons for the Court to be reluctant to assess whether a limitation on a fundamental right is
“necessary in a democratic society” is that it is a less procedural and more creative-
substantial test: deciding on an optimal composition of various interests and rights seems to
entail a rather political stance. So what does the necessity requirement entail precisely? The
term “creative-substantial” is ours – the Court is more cautious in defining what necessity is:
“The Court notes […] that […] whilst the adjective ‘necessary’ […] is not synonymous
with ‘indispensable’ […], ‘absolutely necessary’ and ‘strictly necessary’ and […] the
phrase ‘to the extent strictly required by the exigencies of the situation’, neither has
[…] the flexibility of such expressions as ‘admissible’, ‘ordinary’ […] ‘useful’,
‘reasonable’ […] or ‘desirable’, […] [it implies a] pressing social need”73
Notwithstanding this fuzzy formulation, there can be no doubt that the necessity
requirement allows for a substantive-creative approach in which the Court takes a
deliberative approach, characteristic for decision-making under uncertainty (De Schutter &
72
During the last decade the necessity test seems to become more important in assessing whether a limitation on private life is justified. See, e.g., ECtHR, S. and Marper v. the United Kingdom, nos. 30562/04 and 30566/04, Judgment of 4 December 2008.
73
ECtHR, Handyside v. The United Kingdom, no. 5493/72, Judgment of 7 December 1976, § 48.
Tulkens, 2008), and tries to come up with a constructive way of optimizing74 all interests and
rights at stake. (Van Drooghenbroeck, 2001, p. 302 ff) The necessity requirement is to a
certain extent comparable to what German constitutional doctrine calls the method of
‘practical concordance’ (Marauhn & Ruppel, 2008), which aims to reconcile rights in conflict
“to the fullest extent possible, without any of the two rights in conflict having been
completely sacrificed to the other”, but supplements this with a “constructivist dimension”
that forces the Court to “develop imaginative solutions.” (De Schutter & Tulkens, 2007, p.
34; 2008)
As we will argue below, in section 2.3.2, it is the deliberative and the constructive
aspect of proportionality analysis that would make it possible to extend its logic into the
field of, respectively, FRIAs and LPbD. First, however, we will take a look at the role of
proportionality analysis in the legal instruments of the EU and in the case law of the ECJ.
(b) Proportionality and fair balancing in EU legal instruments and the case law of the ECJ:
the logic of fundamental rights combined with the logic of the regulation of a
common market.
As noted above, in section 1.1.3, the logic of the legal instruments of the EU is Janus-faced.
While they show a considerable overlap with the fundamental rights logic found in the ECHR
and the case law of the ECtHR, they are also informed by a completely different logic and
concern, namely that of regulating the internal common market of the EU. For example,
when looking at Art. 1 of the Data Protection Directive 95/46/EC (DPD), we clearly see this
74
The question is, of course, which standards should be used to assess whether the rights in conflict have been sufficiently optimized. De Schutter and Tulkens (2008) argue that the case law of the ECtHR shows that “personal autonomy” and “democratic society” are two crucial notions in assessing the felicity of the way a conflict between rights is resolved. As an aside, it is interesting to note that the importance of these two notions fits rather well with the genealogy of proportionality analysis. Proportionality analysis is not a timeless or universal method but a modus operandi originally derived from German constitutional doctrine, which in turn has its roots in the Prussian “administrative law-police law (Polizeirecht)” (Stone Sweet & Mathews, 2008, p. 98) of the late eighteenth century. This Prussian Polizeirecht was characterized by an almost “pastoral” concern (the population as the “flock” cared for by the State) for the well-being of society as a whole and particularly the relation between citizen and State: “[Polizeirecht] subsumed measures designed to promote the public welfare, morality, and public safety, encompassing nearly the whole of the state’s (then fairly primitive) interventions in society.” (Stone Sweet & Mathews, 2008, p. 98-99) The novelty introduced by the logic of eighteenth century Polizeirecht is that the practice of policing is wrought with the paradox that it is “both individualising and totalitarian” (Foucault, 2000, p. 325) and that it fosters “both citizens’ lives and the state’s strength” (Foucault, 2000, p. 322-323). Even today it is the resolution of this (seeming) paradox that still permeates any proportionality analysis with respect to fundamental rights.
double objective. The first objective of the DPD is to “protect the fundamental rights and
freedoms of natural persons, and in particular their right to privacy with respect to the
processing of personal data” and the second objective is to ensure “the free flow of personal
data” within the internal common market of the EU. While the fundamental rights concern
for maintaining a particular power relation between State and citizen and the related
proportionality analysis are important within the EU legal framework, they are always
combined with a regulatory concern for the internal market of the EU that is absent in the
fundamental rights framework of the ECHR. This, of course, is especially noticeable in
secondary EU legislation elaborating on particular EU fundamental rights. For example,
when comparing the non-discrimination provision of the ECHR (Art. 14) with the one of the
CFREU (Art. 21) the differences seem small. It is only when juxtaposing Art. 14 ECHR with the
various EU Directives (secondary EU legislation) regarding anti-discrimination, that the
differences between the ECHR approach to anti-discrimination and the EU approach become
palpable. We will return to this below, in section 2.4. and 2.5. For now we will exemplify the
Janus-faced EU approach – a fundamental rights logic mixed with a regulatory approach
regarding the common market – by taking a closer look at the DPD. In the DPD we see a
level of regulatory detail that is completely absent in a legal document like the ECHR. Based
on this observation one could be tempted to conclude that assessing the compatibility of an
SMT with secondary EU legislation is a completely straightforward exercise – like going
through a detailed checklist of requirements – while a compatibility assessment of an SMT
with fundamental rights derived from the ECHR requires the more complex, creative
proportionality approach described above. (cf. De Hert, 2012) While this observation is not
incorrect, it does obfuscate the fact that the EU regulatory approach is strongly inspired by
concerns very similar to those found in the proportionality analyses of the ECtHR. Continuing
to look at the DPD as an exemplary case, we can see how some of the data processing
requirements of the DPD mirror the triple proportionality test of the ECHR. To begin with
there is the legality and the legitimacy of the processing of personal data. Art. 7 of the DPD
addresses these two requirements by enumerating the grounds that legitimize the
processing of personal data:
(a) the data subject has unambiguously given his consent; or
(b) processing is necessary for the performance of a contract to which the data
subject is party or in order to take steps at the request of the data subject prior to
entering into a contract; or
(c) processing is necessary for compliance with a legal obligation to which the
controller is subject; or
(d) processing is necessary in order to protect the vital interests of the data subject;
or
(e) processing is necessary for the performance of a task carried out in the public
interest or in the exercise of official authority vested in the controller or in a third
party to whom the data are disclosed; or
(f) processing is necessary for the purposes of the legitimate interests75 pursued by
the controller or by the third party or parties to whom the data are disclosed, except
where such interests are overridden by the interests for fundamental rights and
freedoms of the data subject.
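To make the disjunctive structure of this enumeration concrete, the following sketch (our own simplification; the shorthand descriptions of the grounds are ours, not the Directive's wording) treats the six grounds as alternatives, any one of which legitimizes the processing:

```python
# Illustrative sketch (ours) of Art. 7 DPD: the processing of personal
# data is legitimate only if at least one of the grounds (a)-(f) applies.
ART7_GROUNDS = {
    "a": "unambiguous consent of the data subject",
    "b": "necessary for the performance of a contract with the data subject",
    "c": "necessary for compliance with a legal obligation of the controller",
    "d": "necessary to protect the vital interests of the data subject",
    "e": "necessary for a task in the public interest or official authority",
    "f": "necessary for the legitimate interests of the controller or a third "
         "party, unless overridden by the data subject's rights and freedoms",
}

def processing_legitimate(applicable_grounds):
    """Art. 7 is disjunctive: one applicable ground (a)-(f) suffices."""
    return any(g in ART7_GROUNDS for g in applicable_grounds)

print(processing_legitimate({"e"}))   # ground (e) applies → True
print(processing_legitimate(set()))   # no ground applies → False
```

Note that, as footnote 75 argues, the grounds overlap in practice; the sketch only captures their formal "any one suffices" logic, not their relative weight.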
In addition, the requirements with regard to data minimization in Art. 6 DPD (purpose
specification, use limitation, accuracy and completeness of the data, and deletion and
anonymisation of the data as soon as they are no longer needed for the purpose that led to
their collection76), confidentiality and security of the data processing (Art. 16 and 17 DPD),
and the additional strict requirements of Art. 8 with regard to sensitive data (regarding racial
or ethnic origin, political opinions, religious or philosophical beliefs, trade-union
membership, health or sex life), are all requirements that also could play a role in the
“necessary in a democratic society”-test of the ECHR. Thus, while in practice the
considerations in a proportionality analysis of the ECtHR regarding data processing are likely
to overlap with the requirements posed to data processing by Art. 6, 7, 8, 16 and 17 DPD, a
legal EU instrument like the DPD articulates in much more regulatory detail with which
75
One could argue that the “legitimate interest”-ground of Art. 7(f) encompasses all the grounds mentioned in Art. 7(a)-(e). Elsewhere we wrote: “Contrary to what is often argued, we do not believe that the consent criterion of Article 7(a) DP[D] is the most important. Since art. 7(e) and (f) do already justify any processing of personal data tending to the realisation of a legitimate aim of the data controller, the legitimacy by consent criterion foreseen by art. 7(a) will often, if not always, be superfluous. If the consent criterion could supersede the other “legitimate aims” criteria this would perversely imply that consent could legitimize processing for “illegitimate aims”, which would be an unacceptable outcome.” (Gellert, et al., 2012, p. 77)
76
This simplified paraphrasing of the data minimization requirements of Art. 6 DPD has been inspired by Hildebrandt, 2013b, p. 33.
requirements an SMT should comply. Moreover, in contrast to the fundamental rights
protected in the ECHR, regulatory EU instruments like the DPD aim to prevent infringements.
(cf. Gellert, et al., 2012) In the DPD this preemptive objective is achieved by granting certain
individual rights to data subjects, such as the right to be informed about essential aspects of
the data processing and to access and correct one’s data (Art. 10, 11 and 12 DPD), and the
imposition of certain “structural” obligations on EU Member States, such as the obligation to
provide for a national supervisory body (Art. 28) and for adequate judicial remedies,
sanctions and damages (Art. 22-23).
To summarize: the protection of fundamental rights within the EU framework does
contain elements of proportionality analysis that are akin to those used by the ECtHR.
However, because of the regulatory nature of the EU framework, the EU proportionality
analysis is implied in requirements that are incomparably more detailed than those
contained in the ECHR. Moreover, the EU legal framework also differs from the ECHR
framework in that one of its objectives is to preempt fundamental rights infringements, thereby
creating an attractive and reliable internal market within the EU.
2.3.2 Proportionality and fair balancing in LPbD and FRIAs
The fact that proportionality analysis plays a role in both the EU legal framework and that of
the ECHR, makes it possible to extend the deliberative (articulating all the relevant points of
view in a transparent manner) and the substantive-constructive logic of proportionality
analysis into the field of, respectively, FRIAs and LPbD.
We realize that when we propose to use the operations underlying fundamental
rights proportionality analysis as a model for FRIAs and LPbD, this differs from how most
FRIAs and LPbD are currently framed. The only type of FRIA which currently has been
elaborated upon in some detail is the DPIA (data protection impact assessment). These
DPIAs are currently modeled in a way that resembles a straightforward checklist (De Hert,
2012). The form and wording of DPIAs are mainly inspired by environmental or health
impact assessments (Gellert, 2013), and consequently the DPIA discourse is one of risks and
precaution. However, we would like to argue that while a discourse of risks and precaution
might be suitable when assessing the expected impact of a policy or technology on
environment or health, this discourse cannot be simply transposed to FRIAs. Impact
assessments with regard to fundamental rights should be performed in a more legal way,
and the deliberative aspect of the legality-, legitimacy- and necessity-tests found in the
general proportionality test of the ECHR seems particularly appropriate. Or, as De Hert
(2012) argues, the current straightforward DPIAs should be supplemented with an IA that
conforms more to the subtle legal logic of proportionality analysis. De Hert therefore calls
for supplementing DPIAs with privacy impact assessments (PIAs):
“It is an ideal test, consisting of a bundle of more specific tests or criteria. Together,
they are suited for an honest privacy impact assessment carried out by a government
or a private actor undertaking an initiative or designing the functionalities of a
technology and open to critical self-assessment.” (De Hert, 2012, p. 43)
The substantive-constructive approach of the necessity test of the ECHR is not just
suitable for FRIAs but also seems to be very well suited for LPbD. In fact one could argue that
the necessity test of the ECHR comprises an element of composition, that is, of design, with
all the rights and interests that are at stake when creating or using a particular SMT. We
argue that what should differentiate LPbD from techno-regulation or value sensitive design
is precisely that it should preserve something of the systematic-fractal logic of the law which
can combine several interests and rights into one SMT.
The formulation of Art. 23(1) of the proposed GDPR, with regard to the obligation for
data controllers to implement the relevant protection of personal data at the level of design,
seems to support the kind of legally proportionate design that we envision:
“Having regard to the state of the art and the cost of implementation, the controller
shall, both at the time of the determination of the means for processing and at the
time of the processing itself, implement appropriate technical and organisational
measures and procedures in such a way that the processing will meet the
requirements of this Regulation and ensure the protection of the rights of the data
subject.” (Art. 23(1) proposed GDPR, italics ours)
The Data Protection by Design (DPbDesign) in Art. 23 of the proposed GDPR has two types of
built-in proportionality considerations. Firstly, it states that the measures have to be
technically and economically feasible. This implies, as Hildebrandt & Tielemans (2013, p.
517) argue that data “controllers will not be confronted with unreasonably costly
requirements or with an obligation to integrate requirements for which no technical solution
has yet been developed. At the same time, it forces them to implement technical solutions
that are available if the cost is not prohibitive. Once technical solutions for particular legal
obligations are on the market at a reasonable price, data controllers will have to use them or
implement their own equivalent or better solutions.” Secondly, the word “appropriate” is a
strong indicator of the proportionality incorporated in DPbDesign. The task of the data
controller to find appropriate technical and organizational measures can thus be understood
as being conceptually related with the proportionality tests performed by the judges of the
ECtHR and ECJ in the sense that it is a task of composition between various requirements,
interests, alternative routes for realizing the same goal, and safeguards.
“To opt for the word ‘appropriate’ shows that the controller still has discretion
concerning which technical measures or procedures he will implement. Furthermore,
it is open for the controller to define what the purpose of the processing is, and
whether it is necessary to process, collect, and store the data for that purpose.”
(Hildebrandt & Tielemans, 2013, p. 517)
Of course, it cannot be expected of a data controller to perform a proportionality
analysis in the same way as the ECtHR would do that. Even when professional legal advice is
sought, the legal expertise of a data controller cannot be expected to attain the levels of
legal ingenuity of the ECtHR or the ECJ. However, what can be expected of the data
controller is that the fundamental rights concerns are taken into account next to other
concerns. For example, a data controller acquiring or maintaining a smart CCTV system will
have to pick the system that combines all the interests and rights in the best possible way,
thus embracing a sophisticated pragmatism, imbued with a basic sense of legal logic, that
can incorporate fundamental right considerations next to those of profit and efficiency:
“ [C]urrent techniques are often merely ‘designed to promote corporate and state
interests such as profit, prosperity, security and safety, often at the expense of any
given citizen’. (Kerr, 2013, p. 101) In a constitutional democracy, which claims to
cherish values such as due process and privacy, pragmatism should not merely be
about the best score on a standardized database or about the biggest profit, but
should also reckon with other values and perspectives.” (De Vries, 2013, p. 24-25)
The analysis presented in sections 2.4 and 2.5, assessing the legal compatibility of smart CCTV
and passenger profiling systems with the European fundamental rights framework, shows
how legal norms should be taken into account and that an all too simple understanding of
the utilitarian formula “whatever works best” will not suffice when designing or acquiring
SMTs.
2.3.3 Proportionality in the quadruple structure of the legal
compatibility analysis in 2.4 and 2.5
Based on our considerations with regard to the systematic-fractal logic of law (sections 1.1.1
and 1.2b) and the suitability of proportionality analysis in assessing the compatibility of
SMTs with European fundamental rights (sections 2.3.1 and 2.3.2) we now propose a
quadruple structure for the SMT-fundamental rights compatibility analysis presented below
(sections 2.4 and 2.5).
When looking at a particular Article of the ECHR or CFREU, or at an EU Directive or
Regulation concerning EU fundamental rights, we take four aspects into consideration:
(i) Under what conditions are limitations to the fundamental right justified?
(ii) Are there specific conflicts or convergences with other fundamental rights
that are to be expected when assessing the compatibility of this right with an
SMT?
(iii) Which compatibility issues can be envisioned in relation to smart CCTV and
passenger profiling?
(iv) What are the design implications for smart CCTV?
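The four questions above can be recorded in a simple template; the following sketch is our own illustration (all field names are hypothetical) of how one row of the quadruple analysis could be captured:

```python
from dataclasses import dataclass, field

# Hypothetical record template (field names are ours) for one row of the
# quadruple SMT-fundamental rights compatibility analysis.
@dataclass
class QuadrupleAnalysis:
    right: str                   # the fundamental right under review
    limitation_conditions: str   # (i) when are limitations justified?
    conflicts_convergences: list = field(default_factory=list)  # (ii) other rights
    compatibility_issues: list = field(default_factory=list)    # (iii) smart CCTV / profiling
    design_implications: list = field(default_factory=list)     # (iv) smart CCTV

row = QuadrupleAnalysis(
    right="Art. 8 ECHR (private and family life)",
    limitation_conditions="triple test: legality, legitimacy, necessity",
    conflicts_convergences=["positive security obligation (Art. 2 ECHR)"],
)
print(row.right)  # → Art. 8 ECHR (private and family life)
```

The template only mirrors the four questions; filling it in for a concrete SMT is, of course, the substantive legal work done in sections 2.4 and 2.5.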
The first two questions reflect that legal norms can never be interpreted in isolation
and that the proportionality analysis used in assessing the compatibility of the use of SMT X
in situation Y with fundamental right Z, is a specific way of constructing the legal relation
between an individual case (e.g., the use of SMT X in situation Y) and the fundamental right
in its interrelatedness with the legal framework to which it belongs (e.g., fundamental right
Z, belonging to legal framework A, which is invoked by looking at the conflicts and
convergences of right Z to rights B, C, D, etc.).
With regard to the second question (conflicts or convergences with other
fundamental rights) we noted some recurrent issues in our analysis.
Firstly, there is the possible conflict between the positive obligation to provide
security, derived from either Art. 2 ECHR (Right to life) or Art. 2 CFREU (Right to life), and the
negative obligations (to abstain from interference) following from most of the other
European fundamental rights that are discussed.
A second tension that reoccurs throughout our analysis is between data protection
and anti-discrimination rights. This tension can arise, firstly, between the data protection
requirements that the processing of data should be minimized and that the processing of
sensitive personal data is only permitted in exceptional cases, and the anti-discrimination
requirement that, from the perspective of an effective fight against unwarranted
discrimination, it can be helpful to store data regarding ethnicity, nationality, gender,
religion, political orientation, age, etc. A second reason why a tension between data
protection and anti-discrimination rights could arise is the fact that data minimization might
increase the reliance on discriminatory heuristics: lacking detailed personal information,
decision makers might be tempted to base their decisions on broad categorizations involving,
for example, gender, age and nationality in an unwarranted discriminatory way (e.g., “all
male nationals of country X in age category Y are considered possible suspects and will be
treated differently from other passengers”).
Apart from conflicts and tensions between rights, we also look at possible
convergences between them. As we argued in chapter 1, legal norms operate in a
systematic-fractal way: therefore not only conflicts but also convergences between rights
can be crucial for their interpretation and application in a particular case. When two
rights converge in a certain way, this can strengthen a particular interpretation of them.
Conflicts and convergences can sometimes go together. For example, next to the two
recurrent conflicts between data protection and anti-discrimination rights we also noted a
likely convergence between them: the additional strong protection of sensitive
personal data (Art. 8 DPD) converges in some cases with the protected grounds of EU anti-
discrimination law. Another recurrent convergence between the rights that we studied is the
convergence of habeas corpus and fair trial (Art. 5 and 6 ECHR, and Art. 6 CFREU) with
procedural requirements such as can be found, for example, in Art. 15 of the DPD (a decision
which produces legal effects or significantly affects the data subject may not be based solely
on automated processing of data). Interpreted in light of each other these legal provisions
seem to suggest that the requirement of due process (creating an equality of arms between
individual citizen and the State through such means as, e.g., the right to adversarial process,
the right to a public hearing before an independent and impartial tribunal within a
reasonable time, and the presumption of innocence) fuelling habeas corpus (Art. 5 ECHR)
and fair trial (Art. 6 ECHR) can also play a role before the moment of an actual investigation,
arrest or trial, and that Art. 15 DPD can be interpreted as a requirement of due processing.
(Coudert, De Vries, & Kowalewski, 2008)
2.3.4 A typology of technological and organizational design implications
With regard to the fourth question (design implications for smart CCTV) of our quadruple
analysis, we found that there were several recurrent technological and organizational design
solutions.77 The design solutions that we propose in our analyses below, in sections 2.4. and
2.5, answer six types of requirements that follow from European fundamental rights. The
first of these six requirements is a meta-requirement: it requires one to follow a (proto-
)legal (systematic-fractal) way of reasoning when looking for the optimal SMT in terms of
fundamental rights compatibility, and to make this reasoning accessible. The other five
types of requirements (always applied with the first meta-requirement in mind) are: the
enhancement of (a) security and confidentiality, (b) transparency, (c) accountability, and the
minimization of (d) data use (not more than strictly necessary), and (e) unwarranted and
illegitimate discrimination.
77
Close adaptation of the list presented by one of us in: Hildebrandt, 2013b, p. 20-22. However, there are three important differences between the two lists: (1) the original list (Hildebrandt, 2013b, p. 20-22) is strictly focused on technological solutions, whereas the design suggestions presented in this deliverable are both technological and organizational; (2) the current list is organized according to the functionality of the design solutions; (3) the description of the technological design solutions is more extensive in the original list.
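As a compact overview of the typology, the six requirement types can be summarized as a mapping from requirement to example design solutions; the dictionary below is merely our own summary device, and the example entries are drawn from the subsections of this typology:

```python
# Our own summary of the six requirement types for SMT design solutions.
# The first entry is the meta-requirement framing the other five; empty
# lists are placeholders for solutions detailed in the corresponding text.
DESIGN_REQUIREMENTS = {
    "(proto-)legal reasoning (meta-requirement)": [
        "basic proportionality analysis justifying the SMT",
        "alternatives and safeguards",
        "making the analysis easily accessible",
    ],
    "security and confidentiality": [
        "separation of data streams",
        "end-to-end encryption",
        "secure authentication",
    ],
    "transparency and awareness": [
        "personal data vaults or similar solutions",
    ],
    "accountability": [],        # see the corresponding discussion in the text
    "data minimization": [],     # not more data use than strictly necessary
    "non-discrimination": [],    # minimizing unwarranted discrimination
}
print(len(DESIGN_REQUIREMENTS))  # → 6
```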
(a) Applying a (proto-)legal reasoning:
- A (basic) proportionality analysis justifying the choice for a particular SMT.
- Offering alternatives to an SMT to enhance proportionality.
- Creating safeguards that compensate for potential fundamental right
infringements.
- Assessing as much as possible what the effects of an SMT are in practice and
whether it effectively achieves the aim(s) it is supposed to fulfill.
- Notification of the presence of a particular SMT.
- Making the (basic) proportionality analysis easily accessible.
By listing all potentially relevant fundamental rights and studying their potential
convergences and conflicts in a particular SMT setting, one should strive to create
an SMT-setting that optimizes all rights and interests involved. In order to do this
one should, as much as possible, assess what the effects and effectiveness of the
particular SMT setting is. Two other important aspects within a proportionality
analysis are the possibility for alternatives (e.g., if a passenger objects to a body
scanner one should be able to opt for a traditional frisk-and-search) and
safeguards (e.g., judicial oversight, access to judicial remedies, etc.).
A proportionality analysis following from this meta-requirement would partly
coincide with a data protection impact assessment (DPIA) but would also go
beyond it, as it regards all fundamental rights potentially involved. Moreover, this
meta-requirement would establish a clear link between fundamental rights
impact assessments (FRIAs) and legal protection by design (LPbD): by making the
FRIA easily accessible, the design of the SMT becomes more easily contestable.
One could even imagine technical solutions, such as a smartphone app that
informs travelers and passengers of the considerations underlying a particular
SMT as they pass it.
As we argued earlier in this deliverable, it would be unreasonable to expect a
data processor to perform a proportionality analysis at the level at which this is
done by the Human Rights Court in Strasbourg (ECtHR). However, one could
expect a data controller to perform a basic or “proto-”proportionality analysis.
The KORA method described in D9.2 is well suited to this because it is a
method that enforces articulation and deliberation: the process of concretisation
forces the user of the KORA method to lay out all the considerations leading up to
the choice of a particular SMT, making this choice easier to contest.
(b) Enhancement of security and confidentiality (limiting data access):
- Separation of data streams
- End-to-end encryption
- Secure authentication
All of the aforementioned design solutions enhance security and confidentiality
by limiting data access to those who really have the right to access them.
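A minimal sketch of how these three measures could work together follows. All names are invented for illustration (the `STREAM_PERMISSIONS` and `SECRETS` tables, the role labels), and the HMAC tag merely stands in for a full authentication protocol:

```python
import hashlib
import hmac

# Illustrative only: each data stream is kept separate and mapped to the roles
# allowed to read it; a request must carry a valid HMAC tag (a stand-in for
# secure authentication) computed with the requester's shared secret.
STREAM_PERMISSIONS = {
    "pnr_data": {"border_police"},
    "cctv_feed": {"security_operator"},
}
SECRETS = {"border_police": b"bp-secret", "security_operator": b"so-secret"}

def sign(role: str, stream: str) -> str:
    """Authenticate a request for a stream (sketch of secure authentication)."""
    return hmac.new(SECRETS[role], stream.encode(), hashlib.sha256).hexdigest()

def may_access(role: str, stream: str, tag: str) -> bool:
    """Grant access only to authenticated requests from an authorized role."""
    if role not in SECRETS or role not in STREAM_PERMISSIONS.get(stream, set()):
        return False  # separation of data streams: role not cleared for this stream
    expected = hmac.new(SECRETS[role], stream.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)  # constant-time tag comparison

print(may_access("border_police", "pnr_data", sign("border_police", "pnr_data")))   # True
print(may_access("border_police", "cctv_feed", sign("border_police", "cctv_feed"))) # False
```

Note that even a correctly signed request is refused when the role is not cleared for the stream, which is the point of keeping the data streams separate.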
(c) Enhancement of transparency and awareness:
- Personal Data Vaults or similar solutions, creating transparency as to which
data are used.
- See also: meta-requirement (a) concerning the accessibility of the (proto-
)legal reasoning underlying an SMT.
Personal data vaults can enhance transparency, for example, when PNR data or
API data are used: of course everything depends on the particular circumstances,
but there is no a priori reason to assume that concealing from a passenger which
personal data are used will enhance security. In fact, being transparent about
data usage might even enhance security by producing a deterrent effect. One
could also think of an ICT-system that shows the passenger which authorities or
bodies have accessed data. Personal data vaults will, of course, be of little use
when the SMT processes data that are not linkable to an identified or identifiable
person. This is, for example, the case with a smart CCTV system that categorizes
movements or activities and that is not linked to a database leading back to
identified natural persons (faces, gait, etc.). In this case transparency can be
enhanced by drawing design implications from the legal meta-requirement
(notification of the presence of a particular SMT combined with an accessible
justification for it).
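The idea of an ICT system that shows a passenger which authorities or bodies have accessed their data could be sketched as a simple append-only access log queried per data subject. All names here (`AccessLog`, the authority labels) are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a transparency log for a personal data vault: every
# read by an authority is recorded, and the data subject can list which bodies
# accessed which of their data, and when.
@dataclass
class AccessLog:
    entries: list = field(default_factory=list)

    def record(self, subject_id: str, authority: str, data_item: str) -> None:
        """Append one access event; nothing is ever deleted or overwritten."""
        self.entries.append({
            "subject": subject_id,
            "authority": authority,
            "item": data_item,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    def report_for(self, subject_id: str) -> list:
        """What a passenger would see: all accesses to his or her own data."""
        return [e for e in self.entries if e["subject"] == subject_id]

log = AccessLog()
log.record("passenger-42", "border_police", "PNR record")
log.record("passenger-42", "customs", "API data")
log.record("passenger-7", "border_police", "PNR record")
for entry in log.report_for("passenger-42"):
    print(entry["authority"], "accessed", entry["item"])
```

The per-subject filtering reflects the transparency aim: each passenger sees only the accesses concerning his or her own data, not those of others.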
(d) Enhancement of accountability:
- Transparent information about remedies to compensate for potential
fundamental rights infringements.
The creation of judicial remedies and appropriate damages and sanctions is the
responsibility of the State, but the data controller can add to the accountability of
an SMT by providing clear information about the possibilities for judicial
remedies.
(e) Data minimization:
- Privacy preserving data mining (PPDM) and aggregation techniques to achieve
anonymisation, ensuring that no more information than necessary is used
(data minimization). These techniques are only useful for analytics on data
aggregates.
- Management of credentials instead of identification, ensuring that no more
information than necessary is used (data minimization). This is a
straightforward technique for situations when data need to be linked back to
one particular person, but full identification (name, address, ID-number, etc.)
is not necessary.
- Metadata that define when, how or how long data can be used (e.g., type of
data; ground for processing; allowed purpose of processing per controller;
consent for which purpose for which controller/processor; allowed recipients
of the data; time stamps for release, processing operations, erasure;
linkability; anonymisation, etc.). This is a rather elaborate technique that
allows for fine-grained specifications of the who, when, where, how and why
of data access and usage.
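Purely as an illustration with invented field names, such usage-restricting metadata could take the form of a machine-readable policy attached to a data item and checked before every processing operation:

```python
from datetime import datetime, timedelta, timezone

# Illustrative "sticky" metadata policy attached to one data item; the field
# names (allowed_purposes, allowed_recipients, erase_after) are invented here.
policy = {
    "data_type": "API data",
    "allowed_purposes": {"border_control"},
    "allowed_recipients": {"border_police"},
    "erase_after": datetime.now(timezone.utc) + timedelta(hours=24),
}

def use_permitted(policy: dict, recipient: str, purpose: str) -> bool:
    """Allow processing only within the purpose, recipient and time limits."""
    return (
        recipient in policy["allowed_recipients"]
        and purpose in policy["allowed_purposes"]
        and datetime.now(timezone.utc) < policy["erase_after"]
    )

print(use_permitted(policy, "border_police", "border_control"))  # True
print(use_permitted(policy, "marketing_dept", "advertising"))    # False
```

Because the check is evaluated per controller, per purpose and per time stamp, it supports exactly the fine-grained who/when/why specifications described above.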
(f) Transparency about and adjustment of prohibited discrimination:
- Discrimination Aware Data Mining (DADM), providing transparency and the
possibility to adjust biases within data mining operations. (Custers, et al.,
2012; Dwork, Hardt, Pitassi, Reingold, & Zemel, 2012; Hajian, 2013; Kamiran,
2011)
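As a very simple illustration of the kind of transparency such techniques can provide (a toy diagnostic, not taken from the cited works), one can compare the rate at which a screening model flags members of one group against another:

```python
# Toy example: measure how often a (hypothetical) screening model flags
# passengers from one group compared to another. A ratio far below 1.0
# signals a bias that may need adjustment within the mining operation.
passengers = [
    {"group": "A", "flagged": True},  {"group": "A", "flagged": True},
    {"group": "A", "flagged": False}, {"group": "A", "flagged": True},
    {"group": "B", "flagged": True},  {"group": "B", "flagged": False},
    {"group": "B", "flagged": False}, {"group": "B", "flagged": False},
]

def selection_rate(rows: list, group: str) -> float:
    """Fraction of a group's members that the model flags."""
    members = [r for r in rows if r["group"] == group]
    return sum(r["flagged"] for r in members) / len(members)

def disparate_impact(rows: list, protected: str, reference: str) -> float:
    """Ratio of selection rates; values well below 1.0 indicate possible bias."""
    return selection_rate(rows, protected) / selection_rate(rows, reference)

print(round(disparate_impact(passengers, "B", "A"), 2))  # 0.33
```

Actual DADM methods go much further (adjusting the model, not just measuring it), but even this simple ratio makes an otherwise opaque bias visible and therefore contestable.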
2.4 Council of Europe: the ECHR
The analyses presented below regard Art. 2 ECHR (right to life), Art. 3 ECHR (prohibition of
torture and inhuman and degrading treatment), Art. 5 ECHR (prohibition of illegal detention
and habeas corpus), Art. 6 ECHR (presumption of innocence, due process, fair trial), Art. 7
ECHR (prohibition of retroactive
criminalization) and Art. 8 ECHR (respect for private life).
We also look at Art. 14 ECHR, even though it differs from the other rights that are
discussed: unlike the other rights, this Article in principle does not grant an autonomous
right but merely an ancillary one. This means that Art. 14 does not prohibit illegitimate
discrimination as such, but that it prohibits unequal treatment in the enjoyment of other
fundamental ECHR rights (situations such as: one ethnic group has freedom of assembly
while another has not; a difference is made in the respect for the private and family life
according to gender; etc.). This also explains why there is relatively little ECtHR case law on
Art. 14: after all, once the Court establishes that one of the fundamental rights of the
Convention has been infringed, there will be no need to explore whether it has been
infringed in a discriminatory way (Art. 14). However, since Protocol 12 entered into force in
2005, the scope of Art. 14 has broadened. (Edel, 2010; Fribergh & Kjaerum, 2011) Protocol
12 guarantees equal treatment in the enjoyment of any right, i.e. also rights granted by
national law. Consequently, Protocol 12 has the potential of turning Art. 14 into a free-
standing, “general prohibition of discrimination.”79 Moreover, independent of the question
whether Art. 14 functions as an ancillary or autonomous right, it is of particular importance
to SIAM because it relates the compatibility assessments of SMTs with fundamental rights to
the distribution of freedom infringements.
Art. 15 opens the possibility to derogate from the ECHR fundamental rights in time of
emergency. It should be noted that we do not discuss Art. 15 separately, but that it is an
important article that we take into account when discussing the limitations of fundamental
78 Protocol No. 12 to the 1950 European Convention for the Protection of Human Rights and Fundamental Freedoms, ETS No. 177, adopted on 4 November 2000 (Rome); entry into force on 1 April 2005. Currently (August 2013) 18 member states have ratified the Protocol (out of a total of 47 Council of Europe member states). See: http://conventions.coe.int/Treaty/Commun/QueVoulezVous.asp?NT=177&CM=8&DF=15/09/2013&CL=ENG
79 ECtHR, Sejdić and Finci v. Bosnia and Herzegovina, nos. 27996/06 and 34836/06, judgment of 22 December 2009.
rights. Of all the articles that we discuss, only Article 2 (right to life), Article 3 (prohibition
of torture) and Article 7 (no punishment without law) cannot be derogated from under Article
15; for Article 2 an exception is made for deaths resulting from lawful acts of war.
Note that citizens of the member states of the Council of Europe can invoke all the
aforementioned articles before their national courts, and if they find that their rights have
been violated without an effective remedy within their national jurisdiction, they can bring
their case against their government before the European Court of Human Rights.
ARTICLE 2. Right to life.
1. Everyone's right to life shall be protected by law. No one shall be deprived of his life
intentionally save in the execution of a sentence of a court following his conviction
of a crime for which this penalty is provided by law.
2. Deprivation of life shall not be regarded as inflicted in contravention of this article
when it results from the use of force which is no more than absolutely necessary:
(a) in defence of any person from unlawful violence;
(b) in order to effect a lawful arrest or to prevent the escape of a person lawfully
detained;
(c) in action lawfully taken for the purpose of quelling a riot or insurrection.

General remarks:
Art. 2 (see for a detailed description: Korff, 2006) is very important for SIAM because, when
interpreted as imposing not merely negative obligations on a State (not to kill citizens) but
also positive obligations (to take measures to prevent citizens from being killed), it can be
used as a legal ground for arguing that, in so far as SMTs are necessary to protect security
and prevent deaths, the State is obliged to implement them. Creating security is thus one of
the positive obligations following from Art. 2. However, the extent and content of a positive
obligation are more difficult to establish than those of a negative one, and will largely
depend on a proportionality analysis that takes the specific circumstances into account.
(i) Under what conditions are limitations to Art. 2 justified?
Article 2(2) names various specific limitations to the right to life: when death results
from absolutely necessary violence in defence of another person from unlawful
violence, to effect a lawful arrest or prevent an escape, or to lawfully quell a riot or
insurrection. It is of course open to discussion what “absolutely necessary” means,
but it is clear that the three derogations of Art. 2(2) cover a rather broad range of
settings in which law enforcement officials can derogate from the right to life. Next
to the derogations in Art. 2(2), there is also one named in Art. 15(2): death resulting
“from lawful acts of war.” Nevertheless, Art. 2 is considered to be an absolute right
because no derogations to the right to life are possible other than the four
aforementioned. The general principle of derogation (allowing for fundamental rights
limitations as long as they are lawful, have a legitimate aim and are necessary in a
democratic society) does not apply.
However, the question of derogations applies only to the negative obligations
(not killing) that follow from Art. 2. In the context of SIAM the positive obligations
(acts to prevent killing and preserve life) following from Art. 2 are in fact of bigger
importance than the negative ones, because one of the positive obligations that can
follow from Art. 2 is the obligation to provide protection and security against life
threatening situations. States can ensure the right to life by means of legislation or by
other means. This is, of course, not to say that the State has a positive obligation to
protect the right to life against just any private interference threatening life or is to
be blamed for the realization of just any life threatening situation. What can be
expected of the State is limited by reasonableness, and, consequently, requires a
strict proportionality test.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Because the extent and content of a positive obligation are more difficult to establish
than those of a negative one (not-doing is rather unequivocal, whereas doing something
can mean many things), they will largely depend on a proportionality analysis that does
not only take the specific circumstances into account but also looks at the interaction of
Art. 2 with other fundamental rights. For example, when a lawfully detained person
manages to commit suicide and it is clear that this could have been prevented by
more intensive surveillance (such as permanent camera observation in the cell of the
detainee) the Court needs to strike a balance80 between the positive obligation of the
State to protect human life and the negative obligation on the State to respect
private life (Art. 8), in which it is perfectly possible that the obligations of Art. 8
outweigh those of Art. 2 or the other way around.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?

80 Cf. ECtHR, Keenan v. the United Kingdom, no. 27229/95, judgment of 3 April 2001.
One could imagine that a smart CCTV system or passenger profiling system, used by
law enforcement officials, which clearly underperforms in providing security in a
situation where there is a concrete life threatening situation, could be understood as
an infringement on the positive obligation to protect life. To put it differently, a smart
CCTV or passenger profiling system that is fully ineffective and fails to provide
sufficient security (within the limits of reasonableness, that is, e.g., taking into
account costs, state of technical development and the availability of alternative
SMTs) could fail to comply with Art. 2. This makes it very important (not only for law
enforcement officials acting in the name of the State, but also sometimes for private
actors due to the horizontal effect of Art. 2) to continuously assess the actual
effectiveness of an SMT.
The compatibility of Smart CCTV and Passenger Profiling systems with negative
obligations following from Art. 2 will only become an issue when law enforcement
officials kill somebody based, or partly based, on the information of these two SMTs.
(iv) What are the design implications for Smart CCTV?
When Smart CCTV is used as a measure against serious crime, the most important
design implication following from Art. 2 is that one should strive for a design that is
actually effective in providing security from life threatening dangers. This
corresponds to design requirement (a) in section 2.3.4: to assess effectiveness,
alternatives and safeguards in a (proto-)legal manner and make this reasoning
accessible. Accessibility can be achieved by, for example, (electronic) publication or
by educating employees who interact with passengers.
When Smart CCTV is used to spot non-life threatening dangers (for example,
wrongly parked cars, theft, or persons without a residence permit) there are not
likely to be any design implications based on either positive or negative obligations
following from Art. 2.
ARTICLE 3. Prohibition of torture. No one shall be subjected to torture or to inhuman or degrading treatment or punishment.

General remarks:
Art. 3 (see for a detailed description: Reidy, 2002) is very important for SIAM because it
creates boundaries that SMTs should not cross in order not to become inhumane, degrading
or plain torture. Both negative and positive obligations can follow from Art. 3, creating
a horizontal effect:
“[T]he obligation on High Contracting Parties under Article 1 of the Convention to
secure to everyone within their jurisdiction the rights and freedoms defined in the
Convention, taken together with Article 3, requires states to take measures designed
to ensure that individuals within their jurisdiction are not subjected to torture or
inhuman or degrading treatment, including such ill-treatment administered by
private individuals” (Reidy, 2002, p. 37)
Among the positive obligations following from Art. 3 is the obligation to properly investigate
allegations of torture, inhumane or degrading treatment and to offer appropriate redress
when an infringement of Art. 3 has occurred.
(i) Under what conditions are limitations to Art. 3 justified?
Of all the Convention freedoms and rights discussed in section 2.4, Article 3 ECHR is
the only one which is fully absolute: it does not allow any derogations or limitations.
“Even in the most difficult circumstances, such as the fight against terrorism
and organised crime, the Convention prohibits in absolute terms torture and
inhuman or degrading treatment or punishment. Whether or not any
individual has committed a terrorist or other serious criminal offence, or is
suspected of such, it is irrelevant for determining whether the treatment
inflicted on that person infringes the prohibition against ill-treatment.”
(Reidy, 2002, pp. 19-20)
However, as explained before, the absoluteness of Art. 3 only applies to treatments
with a certain minimum level of severity. For example, it is highly unlikely that an
ordinary frisking at an airport would fall under the protective scope of Art. 3. Since
Ireland v. the United Kingdom81 (1978) the ECtHR has made it clear that:
“…ill-treatment must attain a minimum level of severity if it is to fall within
the scope of Article 3 (art. 3). The assessment of this minimum is, in the
nature of things, relative; it depends on all the circumstances of the case,
such as the duration of the treatment, its physical or mental effects and, in
some cases, the sex, age and state of health of the victim, etc.” (§ 162)
Because only infringements with “a minimum level of severity” fall under the
protective scope of Art. 3, proportionality considerations can, to a certain extent,
enter the stage through the backdoor.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
The absolute nature of Art. 3 does not allow for any derogations in the name of a
public interest such as security: “a terrorist cannot be tortured […] even if it were
demonstrable that this could save many innocent lives.” (De Schutter & Tulkens,
2007, p. 12; 2008) Consequently one can safely assume that the absolute nature of
the negative obligations of Art. 3 would also prevail when the situation is formulated
as a conflict with the positive obligations following from Art. 2 (to protect human
lives) and that, no matter how many lives are at stake, there can be no
justification for crossing the minimum level of severity of ill-treatment.
81 ECtHR, Ireland v. the United Kingdom, no. 5310/71, judgment of 18 January 1978, § 162.

While there is no rule that says that “in a situation of conflict between an
‘absolute’ right on the one hand, and a ‘relative’ right on the other, priority should
necessarily […] be recognized to the former” (De Schutter & Tulkens, 2007, p.
13; 2008), Art. 3 is such a pivotal and absolute right that it is likely to prevail if it
would be in conflict with one of the qualified or limited rights (e.g. Arts. 5, 6, 8, 9 or
14) of the Convention.
However, conflicts between Art. 3 and the aforementioned rights will be rare,
in as far as they all share the same logic of freedoms that require the State to abstain
from certain interferences. Instead, convergences are more likely. Examples of
situations in which Art. 3 could converge with other ECHR rights are, e.g., the
unlawful killing of (or attempt to kill) a person (Art. 2), detention without a lawful
court assessing and approving of that detention (Art. 5), or institutional
discrimination82 (Art. 14).
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
The application and use of knowledge generated by smart CCTV and passenger
profiling should never lead to an inhumane or degrading treatment. Such treatment
would be incompatible with Art. 3. This could, for example, be the case when smart
CCTV or a passenger profiling system leads to institutionalized racism or
discrimination.
(iv) What are the design implications for Smart CCTV?
One of the design implications could be the use of Discrimination Aware Data Mining
(DADM) to prevent institutionalised discrimination or racism.
Art. 3 can also be understood as requiring a continuous (self-)critical attitude in order
to keep the use of an SMT within the boundaries set out by this freedom. This
requirement has similarities to design requirement (a) of section 2.3.4, as it requires
a (proto-)legal assessment of the smart CCTV system, but it also differs from it as it is
not guided by a proportionality analysis (Art. 3 being an absolute freedom).
82 ECtHR, Abdulaziz, Cabales and Balkandali v. the United Kingdom, nos. 9214/80, 9473/81 and 9474/81, judgment of 28 May 1985.
ARTICLE 5. Freedom from unlawful detention
1. Everyone has the right to liberty and security of person. No one shall be deprived
of his liberty save in the following cases and in accordance with a procedure
prescribed by law:
(a) the lawful detention of a person after conviction by a competent court;
(b) the lawful arrest or detention of a person for non-compliance with the
lawful order of a court or in order to secure the fulfilment of any obligation
prescribed by law;
(c) the lawful arrest or detention of a person effected for the purpose of
bringing him before the competent legal authority on reasonable suspicion
of having committed an offence or when it is reasonably considered
necessary to prevent his committing an offence or fleeing after having done
so;
(d) the detention of a minor by lawful order for the purpose of educational
supervision or his lawful detention for the purpose of bringing him before
the competent legal authority;
(e) the lawful detention of persons for the prevention of the spreading of
infectious diseases, of persons of unsound mind, alcoholics or drug addicts,
or vagrants;
(f) the lawful arrest or detention of a person to prevent his effecting an
unauthorized entry into the country or of a person against whom action is
being taken with a view to deportation or extradition.
2. Everyone who is arrested shall be informed promptly, in a language which he
understands, of the reasons for his arrest and the charge against him.
3. Everyone arrested or detained in accordance with the provisions of paragraph
1(c) of this article shall be brought promptly before a judge or other officer
authorized by law to exercise judicial power and shall be entitled to trial within a
reasonable time or to release pending trial. Release may be conditioned by
guarantees to appear for trial.
4. Everyone who is deprived of his liberty by arrest or detention shall be entitled to
take proceedings by which the lawfulness of his detention shall be decided
speedily by a court and his release ordered if the detention is not lawful.
5. Everyone who has been the victim of arrest or detention in contravention of the
provisions of this article shall have an enforceable right to compensation.
General remarks:
When SMTs are used to detect criminal or dangerous activities, some of those detections are
likely to result in arrests and detentions of people suspected of such activities: actions that
rob persons of their “liberty and security”83. This makes Art. 5 (see for a detailed description:
Macovei, 2002), which protects from unlawful detention (i.e., against the arbitrary
deprivation of one’s “physical liberty” and “personal security”84) and thus expresses the
habeas corpus principle, very important to SIAM.
(i) Under what conditions are limitations to Art. 5 justified?
Art. 5(1) gives a list of situations in which the right to liberty and security can be
lawfully limited, e.g., when one is arrested on reasonable suspicion of a crime,
imprisoned in fulfillment of a conviction by a competent court, or detained to
prevent the spreading of infectious diseases. On top of the specific limitations listed
in Art 5(1), the fact that a State is “in time of war” or subjected to an “other public
emergency threatening the life of the nation” (Art. 15), can also make it permissible
to limit the right to liberty and security. Finally, the general principle of derogation
(allowing for fundamental right limitations as long as they are lawful, have a
legitimate aim and are necessary in a democratic society) also applies and can also
limit the freedom protected by Art. 5.
Taken together, this is quite an extensive list of derogations. Nevertheless, the case-
law of the ECtHR shows that:
“…the limitations on the right to liberty should be seen as exceptional and
only permitted where a cogent justification for them is provided; their
implementation cannot begin with any assumption that anything which public
authorities propose is necessarily appropriate.” (Macovei, 2002, p. 7)
The Court gives teeth to this idea with a “presumption in favour of liberty”: the
burden of proof is placed “on those who have taken away someone’s liberty.”
83 “Liberty and security” are considered as one legal concept and are not granted separate meanings.
84 ECtHR, Kurt v. Turkey, no. 24276/94, judgment of 25 May 1998, § 123.
(Macovei, 2002, p. 8) This burden requires law enforcement officials to be self-critical
to ensure that, when they use their powers to deprive persons of their liberty,
they observe the limits imposed by Article 5. (Macovei, 2002, p. 8) In addition, Art. 5
stipulates that a detention can only be lawful when the conditions of Arts. 5(2), 5(3),
5(4) and 5(5) are observed: a detainee has to be informed of the reasons for the
arrest and any charges in a language he or she understands, get prompt access to
judicial proceedings to determine the legality of the arrest or detention and be given
a trial within a reasonable time, and be given compensation in the case of arrest or
detention in violation of Art. 5.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
There could be a conflict with the positive obligations following from Art. 2 to protect
human life.
There can be convergences with other Convention rights and freedoms,
especially with the negative obligations following from rights that require the State to
abstain from certain interferences (e.g., Art. 8 ECHR).
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
When the use of smart CCTV or a passenger profiling system leads to the arrest or
detention of a person, these SMTs can be scrutinized from the perspective of Art. 5.
When the smart CCTV or passenger profiling system relies on a machine-generated,
complex or opaque data model, compatibility issues with Art. 5(2) can arise, because
this Article requires that everyone who is arrested is informed, in a language which
he or she understands, of the reasons for his arrest and the charge against him. We
would like to argue that it is possible to interpret “in a language which he
understands” as requiring the translation of the calculative, algorithmic operations
underlying automated profiling into a comprehensible natural language that the
suspect understands.
(iv) What are the design implications for Smart CCTV?
The design implications of Art. 5 for smart CCTV systems have to do with design
requirements (d) the enhancement of accountability, (c) the enhancement of
transparency and awareness, and (a) the performance and publication of a basic
proportionality analysis.
ARTICLE 6. Presumption of Innocence and Fair Trial
1. In the determination of his civil rights and obligations or of any criminal charge
against him, everyone is entitled to a fair and public hearing within a reasonable
time by an independent and impartial tribunal established by law. Judgment shall
be pronounced publicly but the press and public may be excluded from all or part of
the trial in the interests of morals, public order or national security in a democratic
society, where the interests of juveniles or the protection of the private life of the
parties so require, or to the extent strictly necessary in the opinion of the court in
special circumstances where publicity would prejudice the interests of justice.
2. Everyone charged with a criminal offence shall be presumed innocent until proved
guilty according to law.
3. Everyone charged with a criminal offence has the following minimum rights:
(a) to be informed promptly, in a language which he understands and in detail, of
the nature and cause of the accusation against him;
(b) to have adequate time and facilities for the preparation of his defence;
(c) to defend himself in person or through legal assistance of his own choosing
or, if he has not sufficient means to pay for legal assistance, to be given it free
when the interests of justice so require;
(d) to examine or have examined witnesses against him and to obtain the
attendance and examination of witnesses on his behalf under the same
conditions as witnesses against him;
(e) to have the free assistance of an interpreter if he cannot understand or speak
the language used in court.

General remarks:
Art. 6 (see for a detailed description: Mole & Harby, 2006) is very important to SIAM because
it codifies the principle of the fair trial and the idea of due process. It notably guarantees
that if criminal charges are brought against a person, this person must be given the means to
contest (1) that he or she has committed the incriminated action and (2) that his or her
actions can be qualified as a criminal offence. It also forces the state to integrate a set of
principles that aim to make such procedures fair, transparent and proportional:
- Principle of access to an independent and impartial court
- Principle of external publicity
- Principle of internal publicity, or equality of arms between defence and prosecution.
This includes being notified of the charge, and having access to and being allowed
to contest all the evidence brought before the court
- Presumption of innocence
In a broader sense this due process right guarantees that persons suspected of a criminal
offence are notified, have access to the charge and evidence against them and cannot be
treated as criminals until an objective third party has established their guilt on the basis of a
contradictory procedure.
To the extent that specific types of persons become suspect due to security measures
that aim to filter the population of a transport hub or airport, the right to due process may
be at stake. Can passengers reasonably foresee whether they will be suspected? Will they be
notified at some point of the measures taken against them? Are eventual infringements
evaluated by an independent and impartial third party? Do they have a chance to contest
their categorization as outliers or threats to security?
(i) Under what conditions are limitations to Art. 6 justified?
Art. 6 is not an absolute right. Art. 6(1) enumerates some situations in which the right to fair
trial can be lawfully limited and can also be derogated from when a State is “in time of war”
or subjected to an “other public emergency threatening the life of the nation” (Art. 15).
Most important, however, is the implied limitation of the right to fair trial by the general
principle of derogation (allowing for fundamental right limitations as long as they are lawful,
have a legitimate aim and are necessary in a democratic society). Lawful limitations to Art. 6
are permissible as long as the “essence of the right” is preserved:
“The Court recalls that Article 6 § 1 embodies the “right to a court” […]. However,
this right is not absolute, but may be subject to limitations; these are permitted by
implication since the right of access by its very nature calls for regulation by the
State. In this respect, the Contracting States enjoy a certain margin of appreciation,
although the final decision as to the observance of the Convention's requirements
rests with the Court. It must be satisfied that the limitations applied do not restrict or
reduce the access left to the individual in such a way or to such an extent that the
very essence of the right is impaired. Furthermore, a limitation will not be compatible
with Article 6 § 1 if it does not pursue a legitimate aim and if there is not a
reasonable relationship of proportionality between the means employed and the aim
sought to be achieved.”85
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
One could envision situations in which a conflict between Art. 6 and the positive
obligations following from Art. 2 (to protect human life) could arise, that is where a
fair and public trial would be detrimental to the prevention of a life-threatening
situation, for example, because certain dangerous information is released. In such a
scenario it will be necessary to strike a fair balance: this will not require an either-or
decision but a composition that aspires to optimize – as far as possible – all the rights
and interests involved.
Art. 6 is informed by a broader idea of "due process", which "provides
subjects with a right to contest a decision against their interests" and consequently
also the right "to be made aware of and to be capable of contesting the violations of
other rights." (Hildebrandt, 2013a, p. 13) In the context of profiling, this broad idea of
due process
“…would entail an effective right to be made aware of automated decision-
systems and the envisaged effects they have and the right to object against
being submitted to such decision making (Hildebrandt & De Vries, 2013;
Steinbock, 2005).” (Hildebrandt, 2013a, p. 13)
The principle of “due process” is articulated in various provisions of the ECHR. For
example, next to Art. 6, Art. 5(2) (that everyone who is arrested is informed in a
language which he understands of the reasons for his arrest and the charge against
85 cf. ECtHR, Tinnelly & Sons Ltd and McElduff and Others v. the United Kingdom, no. 62/1997/846/1052–1053, judgment of 10 July 1998, § 72.
him) can also be seen as an expression of due process. Due process also informs
certain provisions within the EU legislation regarding data protection, such as Art.
15(1) of DPD 95/46 (prohibition of decisions based solely on automated
processing):
“…the right to every person not to be subject to a decision which produces
legal effects concerning him or significantly affects him and which is based
solely on automated processing of data intended to evaluate certain personal
aspects relating to him, such as his performance at work, creditworthiness,
reliability, conduct, etc.” (art. 15(1) DPD 95/46)
In conjunction with Art. 15(1), one of the access provisions of Art. 12 DPD (the right to
access the logic behind automated decisions) is also particularly relevant:
“[E]very data subject [has] the right to obtain from the controller knowledge
of the logic involved in any automatic processing of data concerning him at
least in the case of the automated decisions referred to in Article 15 (1).” (Art.
12 DPD 95/46)86
One could argue that Arts. 12 and 15(1) DPD, Art. 5(2) and Art. 6 ECHR provide for a
continuum of due process protection in different stages: Arts. 12 and 15(1) DPD cover
situations where the automated profiling "merely" significantly affects the data
subject, Art. 5(2) ECHR gains relevance when the data subject has been arrested, and
Art. 6 ECHR covers the situation where the State begins legal proceedings, based
on criminal charges, against the data subject. Because each of these stages will affect
the other, it seems reasonable to understand the aforementioned provisions in their
interrelatedness through the concept of due process.
86 In the proposed GDPR the requirements of Arts. 15(1) and 12 DPD are duplicated in Art. 20 of the proposed GDPR, and supplemented with the obligation for data controllers to provide "information as to the existence of processing" for automated decision-making. This addition to Art. 20 addresses the problem that in the current regime of the DPD Art. 15 will often be of little avail because one is not aware of the automated decisions in the first place, and will thus not feel inclined to use the right of access granted by Arts. 15(1) and 12 DPD. See for an extensive and critical analysis of Art. 20 GDPR: (Hildebrandt, 2012, pp. 50-51)
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
The compatibility issues with Art. 6 are very similar to those named under Art. 5
ECHR. Incompatibility can mainly arise from profile intransparency.
(iv) What are the design implications for Smart CCTV?
The design implications of Art. 6 for smart CCTV systems are very similar to those of
Art. 5 ECHR: they have to do with design requirements (d) the enhancement of
accountability, (c) the enhancement of transparency and awareness, and (a) the
performance and publication of a basic proportionality analysis. (See section 2.3.4 for
a description of these design implications.)
ARTICLE 8. Respect for private and family life 1. Everyone has the right to respect for his private and family life, his home and his correspondence. 2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
General remarks:
Many of the SMTs described in SIAM have to do with surveillance and entail some kind of
infringement on private life. It is therefore clear that the right to respect for private and
family life (Art. 8) is of utmost importance to SIAM (see for a detailed description of Art. 8:
Kilkelly, 2001).
(i) Under what conditions are limitations to Art. 8 justified?
Next to the "state of war or emergency"-derogation of Art. 15 ECHR, the right to
respect for private life can be limited when an infringement passes the general
threefold proportionality test (allowing for fundamental right limitations as long as
they are lawful, have a legitimate aim and are necessary in a democratic society).
Because this threefold proportionality test is articulated very explicitly in Art. 8(2), we
take a more detailed look at how it functions in the context of Art. 8.
The conditions that must be fulfilled in order to legitimize an infringement
or violation of Art. 8 are:
1. The security measure that infringes on the right to privacy must be based on law.
This implies:
a. Reasonable foreseeability
b. Proportional safeguards, e.g. warrants needed for more serious
infringements
2. The security measure that infringes on the right to privacy must be necessary in a
democratic society. This implies:
a. That the measure will probably be effective in achieving the goal of
increasing security
b. That no other, less infringing, measures are possible
(subsidiarity)
3. The security measure that infringes on the right to privacy must have a legitimate aim:
a. national security,
b. public safety or the economic well-being of the country,
c. the prevention of disorder or crime,
d. the protection of health or morals, or
e. the protection of the rights and freedoms of others
4. The security measure must be proportionate in relation to the targeted
security benefit.
The threefold proportionality requirements of lawfulness, necessity in a democratic
society, and a legitimate aim, imply that security measures that violate privacy
necessitate institutionalized checks that preempt ad hoc, ineffective, and
disproportional measures. At the same time they require counter-infringement
measures that mitigate and compensate for the infringements and ensure that
decisions are taken at the right level, are transparent and contestable in a court of
law.
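Procedurally, the conditions above amount to a conjunctive checklist: if any single step fails, the infringement is not justified. The following sketch illustrates this structure; it is a purely illustrative aid, and the class and field names are our own invention, not part of the SIAM assessment tools:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical checklist mirroring the Art. 8(2) proportionality test.
# All names are illustrative; they do not come from any legal instrument.
@dataclass
class ProportionalityCheck:
    has_legal_basis: bool              # 1. based on law (foreseeable, with safeguards)
    likely_effective: bool             # 2a. probably effective for the security goal
    no_less_infringing_option: bool    # 2b. subsidiarity
    legitimate_aim: Optional[str]      # 3. one of the aims listed in Art. 8(2)
    proportionate: bool                # 4. infringement vs. targeted security benefit

    # Plain class attribute (no annotation, so not a dataclass field).
    LEGITIMATE_AIMS = {
        "national security", "public safety", "economic well-being",
        "prevention of disorder or crime", "protection of health or morals",
        "protection of rights and freedoms of others",
    }

    def passes(self) -> bool:
        """An infringement is only justifiable if every step holds."""
        return (self.has_legal_basis
                and self.likely_effective
                and self.no_less_infringing_option
                and self.legitimate_aim in self.LEGITIMATE_AIMS
                and self.proportionate)

# Example: a measure with a legal basis and a legitimate aim, for which a
# less infringing alternative nonetheless exists, fails on subsidiarity.
check = ProportionalityCheck(True, True, False, "public safety", True)
print(check.passes())  # False
```

The conjunctive structure makes explicit that the steps cannot be traded off against one another: a highly effective measure is still impermissible if it lacks a legal basis or a legitimate aim.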
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Situations in which a conflict between Art. 8 and the positive obligations following
from Art. 2 (to protect human life) arise are very widespread. Such a conflict basically
emerges whenever surveillance is detrimental to Art. 8 and beneficial to Art. 2.
Another possible conflict is the one between Art. 8 and anti-discrimination
rights, which can be either derived from Art. 14 ECHR or from one of the EU
instruments against specific forms of discrimination. This conflict has been
poignantly formulated by Strahilevitz (2008) and amounts to the fact that a strong
protection of personal information can lead to situations in which distasteful,
discriminatory heuristics are used, and consequently to an increase of infringements
on the right not to be discriminated against on the basis of
protected or arbitrary grounds. For example, lacking any detailed information about
whether passengers have committed aggressive criminal offences in the past,
security officers (and the profiling systems on which they rely) might take recourse to
discriminatory proxies such as “I have the impression that a high percentage of
passengers of ethnicity X and gender Y have a criminal record with violent offences,
so I will use ‘ethnicity’ and ‘gender’ as a proxy for ‘criminal record with violent
offences’.” According to Strahilevitz a decrease in privacy protection could thus
“reduce the prevalence of distasteful statistical discrimination.” (2008, p. 364) In
addition, privacy and data minimization (one of the obligations following from EU
data protection law) could interfere with the realization of effective anti-
discrimination measures because they would make it impossible to discover
discrimination through the practice of recording and statistically analyzing sensitive
data with regard to how often a certain negative treatment is applied to particular
“categories” of people (ethnicity, gender, age, etc.).
The potential conflicts described above are, however, not always as black-
and-white as Strahilevitz makes them look. A maximal intrusion on private life will
not necessarily result in a maximal protection of the right to life. One could easily
envision situations in which a more privacy-friendly SMT is, in fact, also a more
effective one in terms of security. Similarly, as one of us has argued elsewhere
(Gellert, et al., 2012), the conflicts between privacy and anti-discrimination rights can
often be rearticulated in such a way that they turn into a convergence instead. The
choice “between either total transparency or total privacy” (Gellert, et al., 2012, p.
82) is a flawed one. As the European Court of Justice showed in Huber87, anti-
discrimination rights and data protection provisions are not in conflict with each
other but supplementary. The collection and analysis of statistical data to monitor
discrimination can be perfectly in line with data protection when these data are
anonymized. Similarly, combating discrimination can be a legitimate aim from
the perspective of Art. 8 of DPD 95/46, making the data processing legitimate:
87 ECJ, Huber v. Germany, C-524/06, judgment of 16 December 2008.
“By default, data protection allows for the processing of personal data, but
only at certain conditions. These conditions have been explained in the
previous section: in addition to pursuing a legitimate aim, the processing
must be necessary and proportional to this aim. Therefore, the point is more
about determining the necessity and the proportionality of a processing in
view of the legitimate aim […].” (Gellert, et al., 2012, p. 82)
In summary, when balancing Art. 8 ECHR against other rights, the alleged conflict
should preferably not be framed in terms of a zero-sum game but (as much as
possible) in terms of convergence, reconciliation, the construction of an optimal
composition and of a win-win situation.
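The Huber-compatible approach described above, monitoring discrimination through anonymized aggregates rather than through records traceable to individuals, can be sketched as follows (all figures and group labels are invented for illustration):

```python
# Illustrative sketch: discrimination monitoring on anonymized aggregates.
# Only group-level counts are kept, never individual records, which is how
# such monitoring can remain compatible with data protection requirements.
# The figures below are invented for illustration only.
aggregates = {
    # group label: (number of secondary screenings, number of passengers)
    "group A": (120, 4000),
    "group B": (310, 3900),
}

# Per-group rate of the negative treatment under scrutiny.
rates = {g: screened / total for g, (screened, total) in aggregates.items()}
for group, rate in sorted(rates.items()):
    print(f"{group}: screening rate {rate:.1%}")

# A large ratio between group rates is a signal worth investigating,
# not in itself proof of unlawful discrimination.
ratio = max(rates.values()) / min(rates.values())
print(f"rate ratio: {ratio:.2f}")
```

Nothing in this computation requires knowing which individual was screened; the legal point is precisely that the monitoring works on anonymized counts.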
As regards convergences between Art. 8 ECHR and other rights, there is a
wide range of possibilities. The right to self-determination that can be derived from
Art. 8 is echoed in other rights such as the aforementioned prohibition of
discrimination (Art. 14). One convergence that deserves special attention is the one
between Art. 8 and EU data protection (Art. 8 CFREU and the several secondary
legislative instruments, of which DPD 95/46/EC is undoubtedly the most important).
Based on the Art. 8 ECHR case-law of the ECtHR and the structure of the CFREU,
Gutwirth et al. (2011) have convincingly shown that privacy and data protection are
partly overlapping but not coinciding legal concepts:
“…data protection is both broader and narrower than privacy. It is narrower
because it only deals with personal data, whereas the scope of privacy is
wider. It is broader, however, because the processing of personal data can
have consequences not only in terms of privacy, but also in terms of other
constitutional rights. For example, data processing can impact upon people’s
freedom of expression, freedom of religion and conscience, voting rights, etc.
Most importantly, the knowledge of individuals that can be inferred from
their personal data may also bear risks of discrimination.” (p. 7)
To summarize, the protective scope of data protection and respect for private
life can be conceptualized as two overlapping Venn-circles. Contrary to the protective
scope of data protection, which is limited to the processing of personal data, the
protective scope of Art. 8 is not limited to any particular area of life as it protects
personal autonomy (i.e., the possibility of self-determination with regard to one’s
body, relations with others, identity88, etc.) in any imaginable area of life. It is
important to underline that the widespread idea that the protective scope of respect
for private life is limited to a secluded sphere of intimacy, or only concerns
situations of surveillance, is thus a grave misunderstanding of Art. 8 ECHR:
“As the Court has had previous occasion to remark, the concept of ‘private
life’ is a broad term not susceptible to exhaustive definition. It covers the
physical and psychological integrity of a person. It can sometimes embrace
aspects of an individual's physical and social identity. Elements such as, for
example, gender identification, name and sexual orientation and sexual life
fall within the personal sphere protected by Article 8. Article 8 also protects a
right to personal development, and the right to establish and develop
relationships with other human beings and the outside world. Although no
previous case has established as such any right to self-determination as being
contained in Article 8 of the Convention, the Court considers that the notion
of personal autonomy is an important principle underlying the interpretation
of its guarantees.”89
This is not to say that the protective scope of data protection is always
smaller than the one granted by Art. 8 ECHR. On the contrary, data protection offers
protection in many situations that do not threaten personal autonomy but in which
data are not processed in a legitimate and fair way. In these instances the protective
scope of data protection is broader than the one of Art. 8 ECHR.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
88 Agre and Rotenberg have defined privacy poignantly as “the freedom from unreasonable constraints on the construction of one’s own identity.” (2001, p. 7)
89 ECtHR, Pretty v. UK, no. 2346/02, judgment of 29 April 2002, § 61.
Smart CCTV systems and passenger profiling systems are very likely to violate or
infringe on Art. 8 ECHR. However, as long as these infringements pass the triple
proportionality test, they can be completely compatible with Art. 8. Compatibility
issues will mainly result from the lack of a good quality legal basis for the data
processing, a disproportion between the scope or intrusiveness of the infringement
posed by the SMT and the legitimate aim it strives to achieve, or the lack of
necessity of such an infringement in a democratic society.
(iv) What are the design implications for Smart CCTV?
The design implications of Art. 8 for smart CCTV are primarily those grouped under
design requirement (a) in section 2.3.4, that concern the performance and
publication of a basic proportionality analysis (including the exploration of
alternatives, safeguards and the effectiveness of the SMT). In addition (depending on
the particular circumstances and the results of proportionality analysis) other design
implications might be the ones listed under (c) (the enhancement of transparency
and awareness), (d) (the enhancement of accountability) and (e) (data minimization).
ARTICLE 9. Freedom of thought, conscience and religion 1. Everyone has the right to freedom of thought, conscience and religion; this right includes freedom to change his religion or belief, and freedom, either alone or in community with others and in public or private, to manifest his religion or belief, in worship, teaching, practice and observance. 2. Freedom to manifest one's religion or beliefs shall be subject only to such limitations as are prescribed by law and are necessary in a democratic society in the interests of public safety, for the protection of public order, health or morals, or for the protection of the rights and freedoms of others.
General remarks:
In line with the argument presented in D9.2 (p. 39) we believe that Art. 9 (see for a detailed
description: Murdoch, 2012) is important to SIAM because SMTs can be in conflict with
religious and moral requirements. This is clearly the case for persons whose conscience or
religion does not approve of SMTs, such as particular body scanners that are able to look
“behind” one’s clothes, or SMTs that force one to remove particular garments (passenger
identification, metal detectors, etc.).
Moreover, one could also imagine that when an SMT has a strong “chilling effect”, i.e., when
the mere presence of an SMT discourages individuals from manifesting their religion or
beliefs, this could be framed as an infringement of Art. 9. Such a chilling effect is highly
problematic because “respect for thought, conscience and religion may now be
considered a prerequisite of democratic society.” (Murdoch, 2012, p. 83)
(i) Under what conditions are limitations to Art. 9 justified?
The freedom of thought, conscience and religion is not absolute. Next to the “state of
war or emergency”-derogation of Art. 15 ECHR, the freedom of thought, conscience
and religion protected by Art. 9 is a qualified freedom. This implies that it can be
limited when an infringement passes the general threefold proportionality test
(allowing for fundamental right limitations as long as they are lawful, have a
legitimate aim and are necessary in a democratic society) explicitly stated in Art. 9(2).
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
One could easily envision situations in which a conflict between Art. 9 and the
positive obligations following from Art. 2 (to protect human life) could arise, that is,
where the protection of a manifestation of one’s religion or belief has a detrimental
effect on the protection of human life through security measures. Classical examples
are religious garments, such as a facial veil that prevents proper identification,
or religious prescriptions of chastity that conflict with body scanners or pat-downs
performed by a member of the other sex. Again we would like to repeat what we
have argued before: that the balancing of (seemingly) conflicting rights should
preferably not be framed in terms of a zero-sum game but (as much as possible) in
terms of convergence, reconciliation, the construction of an optimal composition and
of a win-win situation. In the case of the aforementioned conflicts one could think of
constructive alternatives, such as facial identification or pat down by a member of
the same sex in an enclosed space combined with, for example, a metal detection.
Of the many possible convergences, we would like to give some special
attention to the convergence between Art. 9 and Art. 14 (prohibition of
discrimination). The latter, discussed in greater detail below, is an ancillary right that
explicitly mentions that discrimination based on “religion, political or other opinion” is
prohibited. Because Art. 14 has lower evidential standards than Art. 9, an
infringement that cannot be proven under Art. 9, can still stand a chance under Art.
14 if it relates to a differential treatment based on religion or “other opinion.” The
reason for the lower evidential standards of Art. 14 is that discrimination is often
subtle and difficult to prove, and that setting high evidential standards would make
an effective protection of Art. 14 impossible:
“As it may be difficult in practice to establish a prima facie case of
discrimination even where such discrimination exists (if, for example, a non-
discriminatory rule is applied in a discriminatory manner so as to constitute
indirect discrimination), the Strasbourg Court has recently accepted in D.H.
and others v. the Czech Republic90 that ‘less strict evidential rules’ should
apply in the field of discrimination in order to guarantee those concerned ‘the
effective protection of their rights’.” (Murdoch, 2012, p. 76)
Consequently, the Court held in D.H. and others v. the Czech Republic that in the case
of applicability of Art. 14, mere statistical evidence of discrimination can suffice.
In addition, the scope of protection of the Convention right that is infringed upon in
conjunction with Art. 14 (an infringement of Art. 14 always involves another
Convention right, such as Art. 9, whose uneven distribution of enjoyment can be
deemed discriminatory under Art. 14) is normally interpreted more broadly when
Art. 14 is involved.
Taking into account the lower evidential standards and the wider scope of
protection, Art. 14 can provide effective protection in cases that fall beyond the
scope of, or are unprovable under, Art. 9.
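The evidential role that D.H. and Others assigns to statistics can be made concrete: whether two groups’ rates of adverse treatment differ beyond what chance would explain is a standard two-proportion test. A minimal sketch using only the Python standard library (the figures are invented; statistical significance is of course only the starting point of a legal appraisal, not its conclusion):

```python
import math

def two_proportion_z(hits_a: int, n_a: int, hits_b: int, n_b: int) -> float:
    """z-statistic for the difference between two observed proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    # Pooled proportion under the null hypothesis of equal treatment.
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented example: 80 of 400 passengers in group A receive a secondary
# check, against 30 of 400 in group B.
z = two_proportion_z(80, 400, 30, 400)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests the gap is unlikely to be chance at the 5% level
```

A statistically significant difference of this kind is the sort of prima facie indication the Court accepted; it then falls to the respondent to adduce a non-discriminatory justification.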
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
Incompatibility with Art. 9 can arise when the profiling underlying a smart CCTV or
passenger profiling system
(a) …constitutes an interference with Art. 9, for example, because:
- the profiling system discriminates (directly or indirectly) with regard to
thought, religion or belief;
- interferes with a manifestation of religion or belief (for example, there might
be religions or beliefs that object to the practice of being automatically
categorized);
- creates a “chilling effect” that discourages one from manifesting one’s religion
or belief, and,
(b) …when this interference does not pass the triple proportionality test (in
accordance with law, legitimate aim, and necessary in a democratic society) set
out in Art. 9(2).
90 ECtHR, D.H. and Others v. the Czech Republic, no. 57325/00, judgment of 13 November 2007.
(iv) What are the design implications for Smart CCTV?
The primary design implication following from Art. 9 with regard to smart CCTV is
design requirement (a) of section 2.3.4, which concerns the performance and
publication of a basic proportionality analysis (including the exploration of
alternatives, safeguards and the effectiveness of the SMT). In addition (depending on
the particular circumstances and the results of the proportionality analysis) other
design implications might be the ones listed under (c) (the enhancement of
transparency and awareness), (d) (the enhancement of accountability), (e) (data
minimization), and (f) Discrimination Aware Data Mining (DADM) to prevent
discrimination based on religion or belief.
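A most basic DADM-style audit compares a profiling system’s flagging rates across groups defined by a protected attribute such as religion or belief. The following sketch is purely illustrative (the decision data are invented; actual DADM techniques go considerably further, e.g. by also detecting proxies for protected attributes and by repairing the learned model):

```python
# Invented audit data: (protected_attribute_value, flagged_by_profile).
# In practice these would be the logged outputs of a profiling system.
decisions = [
    ("group A", True), ("group A", False), ("group A", False), ("group A", False),
    ("group B", True), ("group B", True), ("group B", True), ("group B", False),
]

def flag_rate(group: str) -> float:
    """Share of members of `group` that the profile flagged."""
    flags = [flagged for g, flagged in decisions if g == group]
    return sum(flags) / len(flags)

# A simple group-fairness measure: the difference between the groups'
# flagging rates; 0.0 would mean statistically equal treatment.
score = flag_rate("group B") - flag_rate("group A")
print(f"difference in flag rates: {score:.2f}")  # 0.75 - 0.25 = 0.50
```

Monitoring such a measure over time is one way a smart CCTV or passenger profiling system could operationalize design requirement (f).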
ARTICLE 14. Prohibition of discrimination with regard to the exercise of other human rights The enjoyment of the rights and freedoms set forth in this Convention shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status.
General remarks:
Art. 14 is of particular importance to SIAM because it relates the assessment of the compatibility
of SMTs with fundamental rights to the distribution of infringements on the enjoyment of
these rights. Particularly when SMTs are based on automated profiling, i.e. a practice that is
inherently about differentiation and categorization, the distribution of the enjoyment of a
right is of utmost importance.
In contrast to the rights discussed above, Art. 14 is not an autonomous but an
ancillary right, that is, “a ‘parasitic’ norm” and “the ‘Cinderella’ of the Convention.” (Gerards,
2013, p. 100) The reason for this characterization of Art. 14 as “ancillary” lies in the fact that
it only prohibits discrimination in the enjoyment of the other fundamental rights guaranteed
by the Convention. As an ancillary right it does not have an autonomous existence but
always regards the way in which the enjoyment of other fundamental rights should be
distributed (see for a detailed description of Art. 14: Edel, 2010; Fribergh & Kjaerum, 2011).
In fact, the ancillary nature of Art. 14 is also precisely the reason why Art. 14 is so important
for SIAM: it adds an extra dimension to the protection of the rights and freedoms
discussed above. Even when an infringement on one of the aforementioned rights (e.g.
freedom of thought or respect for private life) is justified on the grounds stipulated in the
particular ECHR Article protecting this right, the infringement can still be scrutinized if it does
not apply equally to all and if this uneven distribution in the enjoyment of the right is not
justified in terms of Art. 14.
Since the introduction of Protocol No. 12, the ancillary nature of Art. 14 should be
nuanced. Protocol No. 12 has extended the scope of Art. 14: it removes the limitation that
Art. 14 only applies to the enjoyment of Convention rights and extends its protection to the
enjoyment of any right, also on a national level. Consequently, for those member states that
have ratified Protocol No. 12, Art. 14 has become less ancillary, and has turned into an all but
general prohibition of discrimination guaranteeing that no-one shall be discriminated against
on any ground by any public authority.
Article 14 enumerates several protected grounds of discrimination: sex, race, colour,
language, religion, political or other opinion, national or social origin, association with a
national minority, property, birth or other status. However, the formulation “…on any
ground such as…” does seem to indicate that this list is not limitative. The case-law of the
ECtHR is ambivalent and inconsistent as to whether other grounds than the ones named in
Art. 14 are also protected (Gerards, 2013): sometimes the Court admits cases that concern a
non-listed ground, while in other instances cases are declared inadmissible for concerning a
non-listed ground. In Kjeldsen, Busk Madsen and Pedersen91 (1976) the ECtHR held that any
difference in treatment that was not based on “a personal characteristic” was inadmissible:
this both extends and limits the list of grounds of Art. 14. Yet, in later cases the Court has
dealt with differences in treatment on their merits, without investigating on which grounds
they were made and whether this ground would qualify as a personal characteristic or not.
(Gerards, 2013)
Although the majority of the case-law regarding Art. 14 ECHR deals with the grounds
that are explicitly mentioned in its text, or at least with “personal characteristics”, it is
apparently not limited to them. The reason for the wavering position of the ECtHR might be
a pragmatic one: interpreting the grounds of Art. 14 in an exhaustive manner allows the
Court to throw many cases out on grounds of inadmissibility. A consistent choice for a
limited interpretation might lower the workload92 of the ECtHR, while a consistent choice for
a non-exhaustive interpretation might increase it. As it stands, there have been cases of
discriminatory situations based on non-listed grounds in which the Court has felt that it was
necessary to intervene. Thus, notwithstanding the inconsistencies, the case law of the ECtHR
shows that interpreting Art. 14 in a non-exhaustive manner is definitely a possibility. If Art.
14 is indeed understood as enumerating protected grounds in a non-limitative way, this
would make its anti-discriminatory protection very different from the one granted by the
91 ECtHR, Kjeldsen, Busk Madsen and Pedersen v. Denmark, nos. 5095/71, 5920/72 and 5926/72, judgment of 7 December 1976.
92 “[A] clear choice for a nondiscrimination approach might help to reduce the influx of cases, as potential applicants may quickly learn to understand that their cases will be inadmissible. Given the current overload and backlog, this may be an important consideration for the Court.” (Gerards, 2013, p. 118)
legislative framework of the EU93, which only protects against discrimination based on a limited
set of grounds: sex, racial or ethnic origin, religion or belief,
disability, age or sexual orientation (Art. 19 TFEU). When interpreted in a non-exhaustive
way, Art. 14 can be used to contest any discrimination that allegedly lacks reasonableness.
Art. 14 would thus be operating according to an “equal treatment rationale”, instead of one
based on non-discrimination:
“Article 14 can be regarded as an expression of the general principle of equality.
Different from the prohibition of discrimination, the equality principle is a rather
‘empty’ legal principle with no moral content of its own. […] If this perspective is
taken, each difference in treatment that affects an applicant’s Convention rights
should be assessed by the Court for reasonableness and fairness. The ground on
which the difference in treatment is based is not relevant to the applicability for a
test of justification. The only relevant question is if one group or person is allowed to
exercise a certain right or receive a certain benefit, whilst this is not permitted for
another person or group. The equal treatment approach is radically different from
the nondiscrimination approach, which clearly does have a normative content of its
own.” (Gerards, 2013, p. 118-9)
Interpreting Art. 14 “as an expression of the general principle of equality” gives it the
potential to become extremely important in assessing automated profiling practices that
are often based on grounds that belong neither to the limited set of grounds protected
by secondary EU legislation nor to the ones explicitly enumerated in Art. 14, but that are
nonetheless potentially undesirable from a fundamental rights perspective. For example, we
could think of the situation in which a (justified) infringement of the right to free movement
is unevenly distributed and (a) passengers of a certain ethnicity, and (b) passengers with a
low income, are disproportionately often prohibited from taking a particular mode of
93 Apart from Art. 21 CFREU, which duplicates the non-limitative formulation of Art. 14 ECHR. However, until now no direct impact of Art. 21 on the secondary EU anti-discrimination legislation has been noticeable. The main source for secondary EU anti-discrimination legislation is Article 19 of the Treaty on the Functioning of the European Union (TFEU, 2008), which lists the protected grounds of discrimination in a limitative way: “…take appropriate action to combat discrimination based on sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation.” Article 19 was first formulated as Article 13 of the Treaty Establishing the European Community (TEC, 1997; entry into force in 1999). With the entry into force of the Lisbon Treaty (2009) the TFEU has replaced the TEC. The text of ex Art. 13 TEC is identical to Art. 19 TFEU.
transportation. As we will see below in section 2.5, EU law prohibits the first difference in
treatment, based on ethnicity. With regard to the second infringement, regarding
passengers with a low income, Art. 14 ECHR can offer some protection: not always in the
same categorical manner as EU anti-discrimination law94, but by assessing and evaluating the
reasons adduced in justifying the difference in treatment in a highly individualized,
case-by-case manner.
(i) Under what conditions are limitations to Art. 14 justified?
The ancillary prohibition of discrimination of Art. 14 is not absolute. Firstly, there is
the “state of war or emergency”-derogation of Art. 15 ECHR. Secondly, the
prohibition of discrimination can be limited according to the implied (not explicitly
mentioned in Art. 14) general limitation principle: if a limitation on Art. 14 passes the
threefold proportionality test (in accordance with law, a legitimate aim and necessary
in a democratic society) it is considered to be compatible with Art. 14.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
One could envision situations in which a conflict between Art. 14 and the positive
obligations following from Art. 2 (to protect human life) could arise, that is, where
abstaining from a differential treatment infringing on Art. 14 would have a
detrimental effect on the protection of human life through security measures. For
example, one could imagine a situation in which adherence to a particular
apocalyptic or violent belief is a very good predictor of life-threatening crime. In such
a case the two rights will have to be balanced against each other, taking into account
all the particulars (e.g. is there a concrete threat to life, or is it a more abstract and
generalized fear of terrorist acts?).

94 The protection against differential treatment based on the limited set of protected anti-discriminatory grounds of the EU is very categorical: even if there is plausible actuarial or statistical data providing empirical support for a difference in treatment, this does not justify the difference in treatment. “[I]n order to see such discrimination [based on sex] in perspective, it might be helpful to imagine a situation in which (as is perfectly plausible) statistics might show that a member of one ethnic group lived on average longer than those of another. To take those differences into account when determining the correlation between contributions and entitlements under the Community pension scheme would be wholly unacceptable, and I cannot see that the use of the criterion of sex rather than ethnic origin can be more acceptable.” Opinion of Advocate-General Sharpston on: ECJ, Lindorfer v. Council, C-227/04P (September 11, 2007). In Test-Achats (C-236/09, 1 March 2011) the decision of the ECJ, regarding insurance discrimination based on sex, was in line with the Opinion of the A-G in Lindorfer.
Because Art. 14 is an ancillary right, it converges inherently with the other
Convention rights. Two convergences are particularly interesting, namely the one
with, respectively, Art. 8 (respect for private life), and Art. 9 (freedom of conscience,
religion and belief). We extensively discussed these two convergences above, under
the heading of Art. 8 and of Art. 9, so there is no need to repeat them here.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
Smart CCTV systems and Passenger profiling systems are, at least to a large extent,
based on automated profiling. Automated profiling is a practice that is inherently
about differentiation and categorization. When the differentiations and
categorizations based on automated profiling result in differential treatments that
affect the distribution of enjoyment of Convention rights (Art. 14) or national rights
(Protocol 12), this can amount to an infringement of Art. 14. Whether such an
infringement is compatible with Art. 14 will depend on the outcome of the general,
threefold proportionality test.
(iv) What are the design implications for Smart CCTV?
First and foremost, the design implication for smart CCTV is to implement some form
of Discrimination Aware Data Mining (DADM) tool (that is, requirement (f) in section
2.3.4). However, one could imagine situations in which the differentiation and
categorization through the layer of intelligence added to a CCTV system is used
without resulting in differential treatments that are incompatible with Art. 14, for
example, by human security officers that have been trained to avoid differential
treatments or only use them when they are in line with the triple proportionality test.
In such a situation DADM, removing all prohibited discrimination from the data
analysis, might throw out the baby with the bathwater.
Thus, the question whether Art. 14 requires the use of DADM can best be answered
by following requirement (a) of section 2.3.4: the meta-requirement that requires the
performance and publication of a basic proportionality analysis (including the
exploration of alternatives, safeguards and the effectiveness of the SMT). In addition
(depending on the results of the proportionality analysis) other design implications
might be the ones listed under (c) (the enhancement of transparency and
awareness), (d) (the enhancement of accountability) and (e) (data minimization).
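To make the kind of check a DADM tool performs more concrete, the disparate impact of a profiling output can be quantified as the ratio between group-specific flag rates. The sketch below is purely illustrative: the records, group labels and the 0.8 threshold are hypothetical assumptions, not taken from any existing DADM implementation.

```python
# Hypothetical profiling log: (protected_group, flagged_by_profiler)
records = [
    ("group_a", True), ("group_a", False), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

def flag_rate(records, group):
    """Fraction of a group's members flagged for differential treatment."""
    flags = [flagged for g, flagged in records if g == group]
    return sum(flags) / len(flags)

rate_a = flag_rate(records, "group_a")  # 1 of 4 flagged -> 0.25
rate_b = flag_rate(records, "group_b")  # 2 of 4 flagged -> 0.50

# Disparate impact ratio: least-affected rate over most-affected rate.
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

# Illustrative threshold: a ratio well below 1 signals that the differential
# treatment is unevenly distributed across groups and should trigger the
# proportionality analysis of requirement (a).
needs_review = ratio < 0.8
```

A real DADM tool would go further (e.g. correcting the underlying model rather than merely flagging it), but even this simple ratio makes the uneven distribution of differential treatment measurable.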
2.5 The EU: the EU charter of fundamental rights and secondary
legislation with regard to the protection of fundamental rights
The analyses presented below regard the EU charter of fundamental rights (CFREU) and a
selection of the secondary EU legislation with regard to the protection of fundamental
rights.
Compared to the analyses of the ECHR rights in section 2.4, the analyses of CFREU
rights in section 2.5.1 (below) will be very brief. This can be explained by the fact that there
are several CFREU Articles that duplicate ECHR articles. Art. 52(3) CFREU stipulates that the
protection95 and interpretation of an ECHR right set a minimum standard for the
corresponding CFREU right. This means that a CFREU right can occasionally offer a
protection that is broader than the one given by the equivalent ECHR right. However, broadly
speaking the interpretation and protection of rights that are protected in both the ECHR and
the CFREU are likely to be very similar. Going into these subtle differences between
corresponding CFREU and ECHR rights goes beyond the scope of this deliverable. The reader
who is nevertheless interested in these differences can begin by comparing the precise
wordings of the various corresponding rights, and have a look at the abundant scholarly
literature comparing ECHR and CFREU (see e.g. above in section 1.1.3 and: Bratza, 2013). We
limit ourselves to simply quoting all the relevant CFREU rights and we will only provide short
commentaries to those CFREU rights that have no equivalent in the ECHR. A relatively
limited discussion of the CFREU rights also seems justified given the fact that the CFREU does
not have vertical direct effect (citizen versus State), and that, in contrast to the enforcement
of the ECHR by the ECtHR in Strasbourg, there is no court to which a citizen could turn to
complain about a CFREU infringement. For a more detailed discussion of the CFREU rights
we refer to the existing commentaries on the CFREU (EU Network of independent experts on
fundamental rights, 2006; JUSTICE - the independent human rights and law reform
organisation, 2004).
In section 2.5.2 we discuss the relevant secondary legislation. In comparison to our
discussion of the CFREU rights in 2.5.1 we will do this rather extensively, but compared to
the ECHR analyses our analyses will seem frugal. This seemingly frugal approach follows
from the fact that we will often refer to matters we have already discussed in section 2.4 or
elsewhere in this deliverable (notably in section 2.3.1.b).

95 The CFREU rights are also absolute or limited in the same way as the corresponding ECHR rights. See, for example: European Commission (EC). (2011). Commission staff working paper. Operational Guidance on taking account of Fundamental Rights in Commission Impact Assessments. Brussels, 6.5.2011, SEC(2011) 567 final. Brussels: European Union.
2.5.1 The EU Charter of Fundamental Rights (CFREU)
(a) CFREU Chapter 1 – Dignity (Arts. 1-5)
General remarks:
It is important that SMTs respect and protect human dignity, which makes Art. 1 CFREU very
relevant to SIAM.
Human dignity is not a legal concept that is named in any of the fundamental rights
articles of the ECHR. However, the wording of Art. 1 CFREU corresponds almost literally to Art. 1(1) of the
German Grundgesetz (Constitution of 1949). It can also be found in the preamble of the
1948 Universal Declaration of Human Rights. The German Constitutional Court has
interpreted human dignity as the inviolable essence of all the other rights in the German
Constitution.
“‘[H]uman dignity’ means that the human being has a right to ‘social value and
respect’. Everyone possesses dignity as a human creature (as a ‘generic being’)
‘regardless of his/her innate characteristics, achievements and social status’. Even
through unworthy behaviour it cannot be lost. It cannot be taken away from any
human being.” (EU Network of independent experts on fundamental rights, 2006, p.
27)
One can safely assume that Art. 1 CFREU has a similar pivotal role and meaning (recognition
of the equal and intrinsic value of all humans that needs to be respected) as the first article
of the German constitution.
ARTICLE 1. Human dignity. Human Dignity is inviolable. It must be respected and protected.
(i) Under what conditions are limitations to Art. 1 justified?
Art. 1 CFREU is an absolute right to which no derogations are possible. In fact, it
constitutes the inviolable essence (unantastbarer Kernbereich) of all other
fundamental rights of the CFREU.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Art. 1 is the inviolable essence of all other rights: limitations on any fundamental
right that result in a violation of human dignity are not permissible – there is no
provision or proportionality test that could justify such a limitation. As such Art. 1
CFREU has a point of convergence with all other rights of the CFREU.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
Because Art. 1 CFREU regards the very substance of all fundamental rights (respect
for the intrinsic value of all humans), incompatibility with this right would most likely
only occur if smart CCTV and passenger profiling were used for clearly
degrading or inhumane treatments, discriminatory actions, etc. An example would be
using passenger profiling to gravely humiliate all members of a certain ethnic group.
However, in conjunction with other fundamental rights other compatibility
issues are imaginable. In fact, if Art. 1 CFREU is indeed very similar to Art. 1(1) of the
German Constitution, it could be used as a source from which “new” fundamental
rights could be derived.
(iv) What are the design implications for Smart CCTV?
Art. 1 is so broad and substantial that this is hard to predict. It is clear however that
requirement (a) should be taken into account (see section 2.3.4).
ARTICLE 2(1). Right to life 1. Everyone has the right to life.

[Art. 2(1) CFREU corresponds to Art. 2 ECHR. See above, section 2.4.]

ARTICLE 3(1). Right to the integrity of the person 1. Everyone has the right to respect for his or her physical and mental integrity.

General remarks:
The right to integrity of the person (Art. 3(1) CFREU) is not present in the ECHR. However, if
Art. 3(1) CFREU is interpreted as an absolute right it can be understood as the positive
articulation of the prohibition of torture (Art. 4 CFREU and Art. 3 ECHR), and if it is
interpreted as a limited, qualified right it can be understood as a reiteration of Art. 8 ECHR
(the right to respect for private life also protects the physical and mental integrity of a
person). Thus it is clear that the reasons why Art. 3 has been included in the CFREU are:
“…symbolic and systematic reasons in the context of the first Chapter on
Dignity, to provide for a specific right to personal integrity as a link between
the inviolability of human dignity in Article 1 and the prohibition of torture in
Article 4.” (EU Network of independent experts on fundamental rights, 2006,
p. 37)
This also means that there is no further need to discuss Art. 3 CFREU and that it suffices to
refer the reader to our discussion of Art. 3 ECHR and Art. 8 ECHR in section 2.4 (above).

[Art. 3(1) corresponds to Art. 3 ECHR and/or Art. 8 ECHR. See above, section 2.4.]

ARTICLE 4. Prohibition of torture and inhuman or degrading treatment. No one shall be subjected to torture or to inhuman or degrading treatment or punishment.

[Art. 4 CFREU corresponds to Art. 3 ECHR. See above, section 2.4.]

(b) CFREU Chapter 2 – Freedoms (Arts. 6-19)
[Art. 6 CFREU corresponds to Art. 5 ECHR. See above, section 2.4.]
[Art. 7 CFREU corresponds to Art. 8(1) ECHR. See above, section 2.4.]
ARTICLE 6. Right to liberty and security. Everyone has the right to liberty and security of person.
ARTICLE 7. Respect for private and family life. Everyone has the right to respect for his or her private and family life, home and communications.
General remarks:
The right to the protection of personal data is very important to SIAM because many SMTs
rely on the automated processing of personal data. Art. 8 CFREU has no equivalent in the
ECHR. Fuelled by the emergence of new information and communication technologies, the
idea that the processing of personal data deserves a protection that does not (fully) coincide
with the protection of private life (Art. 8 ECHR and Art. 7 CFREU), needed almost half a
century (González Fuster, 2013) to take its current shape. The turbulent story of the
emancipation of one right from another culminated in 1995 in the EU data protection
directive (DPD 95/46/EC), in 2009 in the recognition of two separate rights in the CFREU
(respectively Arts. 7 and 8), and will soon (probably in 2014) be embodied in a brand new
General Data Protection Regulation (proposed GDPR), which will replace the current DPD
95/46.
Data protection differs in several respects from respect of private life. Firstly, it has a
different scope: data protection only relates to the processing of personal data, whereas
respect for private life can in principle apply to all areas of life as long as personal
autonomy is at stake in some way. In contrast, data protection covers all situations of data
processing of personal data and is thus not limited to situations in which personal autonomy
is challenged. See for a description of the protective scope of the right to data protection
and the right to respect for private life our discussion of Art. 8 ECHR (above, section 2.4).
ARTICLE 8. Protection of personal data
1. Everyone has the right to the protection of personal data concerning him or her.
2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
3. Compliance with these rules shall be subject to control by an independent authority.

Secondly, the logic of data protection differs from the one of the respect for private life. De
Hert and Gutwirth (2008) have characterized this difference as one of transparency versus
opacity:
“Privacy sets prohibitive limits that shield the individual against the State (and other)
powers warranting a certain level of opacity of the citizen, whilst data protection
channels legitimate use of power, imposing a certain level of transparency and
accountability to power.” (Gutwirth, et al., 2011, p. 8)
This difference can also be related to the fact that EU data protection is a right that has a
Janus-faced concern: it is both grounded in fundamental rights concerns as well as in a
concern for the efficient functioning of the internal common market of the EU (see section
2.3.1(b)). Obviously, data processing is an important factor for the economic flourishing of
the common market. The default position in data protection is thus that personal data can
be processed but that the processing should happen in conformity with data protection rules
that guarantee legitimacy, fairness and accountability.
Art. 8 CFREU does not give a very precise definition of the meaning and functioning
of EU data protection rules. Looking at the recent history of the emergence of the right to
the protection of personal data, one cannot fail to notice that the data
protection right in the CFREU is a constitutionalized summary of a more detailed development
that took place in the secondary legislation of the EU and, to a certain extent, in the case law
of the ECtHR. Therefore, in order to understand the precise meaning (What does “personal
data” mean? What is “processing”? What is a “legitimate basis”? Etc.) and functioning
(What limitations are permissible? How can a data subject enforce this right? Etc.) of Art. 8
CFREU we refer the reader to our discussion of the secondary EU legislation with regard to
data protection (below, section 2.5.2).
[See for a more detailed discussion of the meaning and functioning of Art. 8 CFREU our
analysis of a selection of relevant secondary EU legislation relating to data protection, in
section 2.5.2, below]
[Art. 10 CFREU corresponds to Art. 9(1) ECHR. See above, section 2.4.]
ARTICLE 10 (1). Freedom of thought, conscience and religion 1. Everyone has the right to freedom of thought, conscience and religion. This right includes freedom to change religion or belief and freedom, either alone or in community with others and in public or in private, to manifest religion or belief, in worship, teaching, practice and observance.
(c) CFREU Chapter 3 – Equality (Arts. 20-26)
General remarks:
Art. 21(1) CFREU corresponds to a large extent to Art. 14 ECHR. We therefore refer the
reader for further details to our discussion of Art. 14 ECHR in section 2.4 (above).
However, on top of the discussion of Art. 14 ECHR, presented in section 2.4, we
would like to make three additional observations about Art. 21 CFREU. These observations
concern the relation between Art. 21 CFREU and secondary anti-discriminatory EU legislation
(observation 1), and some salient differences between Art. 21 CFREU and Art. 14 ECHR
(observations 2 and 3).
Observation 1: There is a potentially interesting relation between Art. 21(1) CFREU
and the current secondary EU legislation with regard to the prohibition of
discrimination, discussed below in section 2.5.2. Current secondary EU legislation is mainly
based on Article 13 of the Treaty Establishing the European Community96 (TEC, 1997; entry
into force in 1999) which states that “[…] the Council […] may take appropriate action to
combat discrimination based on sex, racial or ethnic origin, religion or belief, disability, age
or sexual orientation.”

96 Now replaced by Article 19 of the Treaty on the Functioning of the European Union (TFEU, 2008). The content of Art. 19 TFEU and Art. 13 TEC is identical.

ARTICLE 21. Non-discrimination
1. Any discrimination based on any ground such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation shall be prohibited.
2. Within the scope of application of the Treaty establishing the European Community and of the Treaty on European Union, and without prejudice to the special provisions of those Treaties, any discrimination on grounds of nationality shall be prohibited.

It should be noted that Art. 13 TEC lists a limited set of prohibited
discriminatory grounds, which in fact was a substantial enlargement of the set of grounds
protected by older EU law (Meenan, 2007). When the EEC (European Economic Community
– the predecessor of the EU) was first created in 1957, there were only two protected
grounds: nationality and gender (the latter only with regard to equal pay). (Meenan, 2007, p.
11) Secondary anti-discriminatory EU legislation follows the logic of Art. 13 TEC (now Art. 19 TFEU) and
prohibits discrimination on a limited set of grounds. Moreover, this protection only prohibits
discrimination in a limited set of areas of life (see section 2.5.2, below, for a discussion in
greater detail). In contrast, Art. 21 CFREU offers a much broader protection, which can either be
understood as a broad principle of non-discrimination (regarding all discrimination based on
“personal characteristics”) or as an even broader principle of equal treatment in general. (See:
Gerards, 2013, discussed above in section 2.4 with regard to Art. 14 ECHR) Because all EU
legislation should be made in accordance with the CFREU, this might imply that the
secondary anti-discriminatory EU legislation will have to be adjusted or supplemented.
Observation 2: Art. 21(1) CFREU, contrary to Art. 14 ECHR, is not an ancillary but an
autonomous right.
Observation 3: Art. 21 CFREU, contrary to Art. 14 ECHR, contains a separate
prohibition of discrimination between EU-nationals based on nationality (Art. 21(2) CFREU).
Given that one of the main concerns of the EU is the establishment and functioning of the
common internal EU market, this specific provision with regard to EU nationals, and
nationality based discrimination between them, is not surprising.
[Art. 21 CFREU corresponds – apart from the differences discussed above – to Art. 14 ECHR.
See above, section 2.4.]
ARTICLE 24 (2). The rights of the child In all actions relating to children, whether taken by public authorities or private
institutions, the child's best interests must be a primary consideration.
General remarks:
Art. 24(2) is important to SIAM because it imposes a specific obligation to take children’s
best interests into account when designing or acquiring SMTs. Children are human beings
below the age of 18 (cf. Art. 1 of the 1989 UN Convention on the Rights of the Child97).
Art. 24 CFREU builds on the 1989 UN Convention on the Rights of the Child. There is
no equivalent in the ECHR.
(i) Under what conditions are limitations to Art. 24 justified?
The triple proportionality test (in accordance with law, legitimate aim and necessary
in a democratic society) applies.
Moreover, the requirement that the child’s “best interests” should be a
primary consideration already seems to imply the possibility of a conflict of rights and
interests in which the interests of the child should be guiding.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Firstly, Art. 24 CFREU converges with Art. 1 CFREU (human dignity) and non-
discrimination of all persons (Art. 21(1) CFREU).
Secondly, taking into account other rights of the CFREU will contribute to the
“best interests” of the child. As such Art. 24(2) can converge with virtually any of the
other CFREU rights. What Art. 24(2) adds to all the other CFREU rights is that it
forces one to engage with the possibility that the person subjected to an alleged freedom
infringement is a child. For example, in considering the safety of the radiation level of
a body scanner (related to Art. 2, 3 and 7 CFREU and art. 2 and 8 ECHR), one should
pay specific attention to whether the radiation level is also safe for children or provide an
alternative which is appropriate for children.
97 UN Convention on the Rights of the Child, 20 November 1989, A/RES/44/25.

(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
See compatibility issues with other CFREU rights. Similar issues can arise with regard
to children. However, in contrast to freedom infringements concerning adults, the
best interests of the child deserve specific consideration.
(iv) What are the design implications for Smart CCTV?
The design of Smart CCTV has to take the best interests of children into account, and
should not merely focus on adults. If children are not taken into account during the
design process, one could, for example, imagine a situation in which the activity and
behavior patterns of children do not fit the standard profiles and will result in a
disproportionate amount of false positives.
General remarks:
Art. 25 is important to SIAM because it imposes a specific obligation to take the rights of the
elderly into account when designing or acquiring SMTs. There is no equivalent in the ECHR.
(i) Under what conditions are limitations to Art. 25 justified?
The triple proportionality test (in accordance with law, legitimate aim and necessary
in a democratic society) applies.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Art. 25 CFREU converges with Art. 1 CFREU (human dignity) and non-discrimination of
all persons (Art. 21(1) CFREU).
ARTICLE 25. The rights of the elderly The Union recognises and respects the rights of the elderly to lead a life of dignity and
independence and to participate in social and cultural life.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
See compatibility issues with other CFREU rights. Similar issues can arise with regard
to the elderly. However, in contrast to freedom infringements concerning the non-elderly,
the interests of the elderly can differ and deserve specific consideration.
(iv) What are the design implications for Smart CCTV?
The design of smart CCTV has to take the specific problems of the elderly into account,
and should not merely focus on the non-elderly. Problems associated with age include
an increased vulnerability to injuries and illness, diminished agility, and poverty. If
specific age-related problems are not taken into account during the design process,
one could, for example, imagine a situation in which the activities and behavioral
patterns of elderly do not fit the standard profiles (a trembling gait, changed facial
traits in facial recognition, etc.) and will result in a disproportionate amount of false
positives.
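The false-positive problem sketched above can be made measurable. The following sketch is hypothetical (the alarm log and group labels are invented for illustration): it compares the false positive rate of a profiling system across age groups, which is one concrete way to test whether standard profiles fit one group worse than another.

```python
# Hypothetical alarm log: (age_group, actually_a_threat, alarm_raised)
log = [
    ("adult",   False, False), ("adult",   False, False),
    ("adult",   False, True),  ("adult",   False, False),
    ("elderly", False, True),  ("elderly", False, True),
    ("elderly", False, False), ("elderly", False, False),
]

def false_positive_rate(log, group):
    """Share of non-threatening persons in a group who still trigger an alarm."""
    alarms = [alarm for g, threat, alarm in log if g == group and not threat]
    return sum(alarms) / len(alarms)

fpr_adult = false_positive_rate(log, "adult")      # 1 of 4 -> 0.25
fpr_elderly = false_positive_rate(log, "elderly")  # 2 of 4 -> 0.50

# A markedly higher rate for one age group suggests the standard profiles
# were designed or tuned without that group's activity patterns in mind.
```

Such a per-group evaluation could support both the design process and the proportionality analysis of requirement (a).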
[The considerations with regard to Art. 26 CFREU are similar to the ones made in the analyses
above, with regard to children (Art. 24(2) CFREU) and, especially, the elderly (Art. 25 CFREU). Replacing
the word “elderly” in Art. 25 CFREU with “persons with disabilities” will give the reader a
basic impression of the issues at stake]
ARTICLE 26. Integration of persons with disabilities The Union recognises and respects the right of persons with disabilities to benefit from
measures designed to ensure their independence, social and occupational integration and
participation in the life of the community.
(d) CFREU Chapter 4 – Solidarity (Arts. 27-38)
General remarks:
Art. 35 CFREU is of some importance to SIAM when it is interpreted as imposing an
obligation to pay specific attention to health when developing SMTs. Even though the Article
mainly focuses on solidarity and the right to access to health care and the right to benefit
from medical treatment, which has a limited importance to SIAM, it can be understood as
supportive of other rights relating to physical and mental integrity (Arts. 3 and 8 CFREU, and
8 ECHR). When designing or acquiring SMTs it is necessary to take health impacts into
account. Next to the specific rights of children and the elderly, and the integration of people
with disabilities, health is also a perspective that deserves specific attention when designing
or acquiring an SMT.
[The considerations with regard to Art. 35 CFREU are similar to the ones made in the analyses
above, with regard to children (Art. 24(2) CFREU), the elderly (Art. 25 CFREU) and the integration
of people with disabilities (Art. 26 CFREU). However, because Art. 35 has limited importance to
SIAM we do not go into this any further here]
ARTICLE 35. Health care Everyone has the right of access to preventive health care and the right to benefit from medical treatment under the conditions established by national laws and practices. A high level of human health protection shall be ensured in the definition and implementation of all Union policies and activities.
(e) CFREU Chapter 5 – Citizen’s rights (Arts. 39-46)
[See for a more detailed discussion of the meaning and functioning of Art. 45(1) CFREU our
analysis of a selection of relevant secondary EU legislation relating to freedom of movement
and residence within the EU, in section 2.5.2, below]
ARTICLE 45(1). Freedom of movement and of residence Every citizen of the Union has the right to move and reside freely within the territory of the Member States
(f) CFREU Chapter 6 – Justice (Arts. 47-50)
[Art. 47 CFREU corresponds to Art. 6 ECHR. See above, section 2.4.]
[Art. 48 CFREU corresponds to Art. 6(2) and (3) ECHR. See above, section 2.4.]

ARTICLE 47. Right to an effective remedy and to a fair trial
Everyone whose rights and freedoms guaranteed by the law of the Union are violated has the right to an effective remedy before a tribunal in compliance with the conditions laid down in this Article.
Everyone is entitled to a fair and public hearing within a reasonable time by an independent and impartial tribunal previously established by law. Everyone shall have the possibility of being advised, defended and represented.
Legal aid shall be made available to those who lack sufficient resources in so far as such aid is necessary to ensure effective access to justice.

ARTICLE 48. Presumption of innocence and right of defence
1. Everyone who has been charged shall be presumed innocent until proved guilty according to law.
2. Respect for the rights of the defence of anyone who has been charged shall be guaranteed.
2.5.2 Selection of secondary EU legislation with regard to Fundamental Rights
(a) Data Protection: Current and Proposed legislation
We discuss four instruments with regard to data protection. We devote most
attention to the legal instruments regulating general data protection: DPD 95/46 and
its successor (the proposed GDPR). The current and proposed legal instruments with
regard to data processing in the field of law enforcement (the JHA Framework Decision
and the proposed LEDPD) duplicate many of the legal concepts from general data
protection but often set weaker and less strict requirements for the processing.
(European Digital Rights (EDRi), 2012)
Current and proposed legislation with regard to general data protection:
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on
the protection of individuals with regard to the processing of personal data and on the free
movement of such data, Official Journal L 281, 23/11/1995, p. 31-50.
Proposal for a Regulation of the European Parliament and of the Council on the protection
of individuals with regard to the processing of personal data and on
the free movement of such data (General Data Protection Regulation), Brussels, 25.1.2012
COM(2012) 11 final.
Current and proposed legislation with regard to data protection in the field of law
enforcement and judicial cooperation in criminal law:
Council Framework Decision 2008/977/JHA of 27 November 2008 on the protection of
personal data processed in the framework of police and judicial cooperation in criminal
matters, Official Journal L 350/60, 30 December 2008.
Proposal for a Directive of the European parliament and of the Council on the protection of
individuals with regard to the processing of personal data by competent authorities for the
purposes of prevention, investigation, detection or prosecution of criminal offences or the
execution of criminal penalties, and the free movement of such data. 25 January 2012
COM(2012) 10 final.
Relevant provisions from DPA 95/46/EC Definitions:
- Personal data is “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity.” [Art. 2(a) DPD]
- Data processing is “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction.” [Art. 2(b) DPD]
- A data controller is a “natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data.” [Art. 2(d) DPD]
- A data processor is a “natural or legal person, public authority, agency or any other body which processes personal data on behalf of the controller.” [Art. 2(e) DPD]
Lawfulness and legitimacy: The processing of personal data is legitimized when the processing is “necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests for fundamental rights and freedoms of the data subject.” (Art. 7(f) DPD) Arts. 7(a)-(e) name other legitimizing grounds that indicate that a legitimate interest is pursued, of which the most famous is that “the data subject has unambiguously given his consent.” (Art. 7(a)) However, the grounds of Arts. 7(a)-(e) are always subordinated to Art. 7(f): when the processing is based on, e.g., consent to processing for an illegitimate interest, or on the performance of a contract pursuing illegitimate interests, the processing cannot be considered compatible with the DPD.
Requirements of “proportionality” and “necessity”:
- Data minimization: purpose specification and use limitation (“personal data must be collected for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes”), accuracy and completeness of the data, and deletion and anonymisation of the data as soon as they are no longer needed for the purpose that led to their collection. (Art. 6 DPD)
- Confidentiality and security of the data processing. (Arts. 16 and 17 DPD)
- Extra strict requirements with regard to sensitive data regarding racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, health or sex life. (Art. 8 DPD)
- No decisions with legal effect, or that significantly affect the data subject, and that are based solely on automated processing are permitted. (Art. 15(1) DPD)
- Data transfers outside the EU: the basic principle is that transfers are only allowed to countries that ensure “an adequate level of protection”. Derogations from this principle can be made for several reasons, most importantly if the transfer is necessary for the performance of a contract, concerns an important public interest or concerns vital interests of the data subject. (Arts. 25-26 DPD)
Preemption of infringements - individual rights and structural requirements:
- The data subject has the individual rights to be informed about essential aspects of the data processing, to access and correct one’s data, and to object to their processing when data are disclosed to third parties, used for direct marketing or when challenging the legitimacy of the processing. (Arts. 10, 11, 12 and 14 DPD)
- EU Member States have to provide for a national supervisory body. (Art. 28 DPD)
- Requirement of data protection by design (“appropriate organizational and technological measures”) with regard to the security of the processing. (Art. 17 DPD)
- “Proto” impact assessment: obligation for data controllers to notify the supervisory authority of processing (Art. 18 DPD) and for the supervisory authority to “determine the processing operations likely to present specific risks to the rights and freedoms of data subjects” and to “check that these processing operations are examined prior to the start thereof” (Art. 20 DPD).
Judicial remedies:
- EU Member States have to provide for adequate judicial remedies, sanctions and damages (Art. 22-23 DPD).
Relevant differences between the proposed GDPR and DPD 95/46/EC
Definitions – additional clarifications:
- Consent has to be “specific, informed and explicit” (i.e., implicit, uninformed consent to some very broad and abstractly defined processing is not good enough). (Art. 4(8) and Recital 25 of the GDPR)
- Location data and online identifiers, such as cookies or IP addresses, are not necessarily personal data when taken in isolation, but when they can be associated with an individual profile (e.g., behavioral advertising or a risk profile), as will often be the case, they fall under the definition of personal data.
Requirements of “proportionality” and “necessity”:
- The prohibition of Art. 15(1) DPD (no significant decisions permitted that are solely based on automated processing) is presented under the denominator of “profiling” and is given more “bite” by obliging the data controller to inform the data subject of the fact that such automated profiling is taking place (in the existing DPD regime the data subject will often fail to use the right to object to profiling because he or she is unaware of it).
- Processing of data of children: “the processing of personal data of a child below the age of 13 years shall only be lawful if and to the extent that consent is given or authorised by the child's parent or custodian.” (Art. 8 GDPR)
- Data transfers outside the EU: in the DPD transfers are “prohibited unless…”, whereas in the GDPR transfers are “allowed as long as the following conditions are fulfilled”. (Art. 40 GDPR) Compared to the DPD, the GDPR lists more grounds that legitimize transfers to third countries.
Preemption of infringements - individual rights and structural requirements:
- A new individual right: “Right to be forgotten”. (Art. 17 GDPR) However, Article 17 does not add a lot in comparison to the existing DPD regime; it mainly makes explicit what is implicit in the DPD. It rephrases one of the existing data minimization principles (that “data which are no longer necessary in relation to the purposes for which they were collected” should be erased) as an individual right, reiterates the right to object against illegitimate processing, and makes explicit that withdrawal of consent renders any further processing illegitimate and gives the data subject a ground to ask for erasure of the data. Moreover, Art. 17 GDPR states that all of the above is especially important when it concerns data that were made available when the data subject was a child.
- A new individual right: “Right to data portability”. (Art. 18 GDPR) It gives the data subject the right to obtain a copy of an uploaded profile and subsequently upload it on a platform of a competing service provider.
- The rather bureaucratic requirement to notify the supervisory authority of processing (Art. 18 DPD) will be abolished;
- Instead data controllers and processors in companies with more than 250 employees will have to maintain extensive documentation (purpose of the processing, data categories, data transfers, time limits to how long data are kept, etc.) of all processing operations under their responsibility. (Art. 28 GDPR)
- Breach notification: Supervisory authorities and affected data subjects have to be notified of data security breaches. (Arts. 31 and 32 GDPR)
- Companies with more than 250 employees have to appoint a data protection officer (Arts. 35, 36 and 37 GDPR).
- Companies working in multiple EU Member States will be subject to the jurisdiction of a single data protection authority, namely the one in the main place of their establishment. (Art. 51 GDPR)
- Data protection by design (“appropriate organizational and technological measures”) is no longer only required for the security of the processing, but concerns all data protection requirements. (Art. 23 GDPR)
- Impact assessment: data controllers (i.e., this is no longer the duty of the supervisory authority) will have to carry out “an assessment of the impact of the envisaged processing operations on the protection of personal data” when “processing operations present specific risks to the rights and freedoms of data subjects by virtue of their nature, their scope or their purposes”. (Art. 33 GDPR)
Judicial remedies:
- Administrative fines for data protection violations could range up to 2 percent of a company’s annual worldwide income. (Art. 79 GDPR)
JHA Framework Decision versus DPD 95/46/EC
Scope: The JHA Framework Decision only partly complements the scope of the DPD:
- The JHA Framework Decision applies to “processing of personal data in the framework of police and judicial cooperation in criminal matters” (Art. 1(1) JHA Framework Decision) and is thus limited to cross-border data processing.
- DPD 95/46 applies to all “processing of personal data wholly or partly by automatic means, and to the processing otherwise than by automatic means of personal data which form part of a filing system or are intended to form part of a filing system”, but does not apply to “processing operations concerning public security, defence, State security (including the economic well-being of the State when the processing operation relates to State security matters) and the activities of the State in areas of criminal law”. (Art. 3 DPD)
Requirements of “proportionality” and “necessity”:
- Data transfers outside the EU: transfers are only allowed towards a competent authority, when it is necessary for the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, when the Member State from which the data are transferred has consented and when the receiving State can ensure “an adequate level of protection”. National law can derogate from the latter condition (adequate protection) in situations concerning “legitimate specific interests of the data subject” or “legitimate prevailing interests, especially important public interests.” (Art. 13(1) and (3) JHA Framework Decision) These derogations make transfers to countries without adequate safeguards easier than under the general data protection regime.
Lawfulness and legitimacy: The JHA Framework Decision states that processing of data “shall be lawful” (Art. 3 JHA Framework Decision) but does not specify what “lawful” entails and does not give a list of legal grounds for lawful processing (as the DPD does in Art. 7). This clearly has much less specificity and “bite” than the DPD.
Preemption of infringements - individual rights and structural requirements:
- The individual rights to be informed about essential aspects of the data processing, to access and correct one’s data (rectify or erase), and to object to their processing are much weaker than in the DPD because of all the exceptions (Arts. 16, 17 and 18 JHA Framework Decision). For example, the right of access may be restricted by Member States “(a) to avoid obstructing official or legal inquiries, investigations or procedures; (b) to avoid prejudicing the prevention, detection, investigation and prosecution of criminal offences or for the execution of criminal penalties; (c) to protect public security; (d) to protect national security; (e) to protect the data subject or the rights and freedoms of others.” (Art. 18(2))
- The right of access is further limited by the fact that the data which the data subject wishes to access should be provided without “excessive delay or expense” (Art. 17(1)). This means that access can be delayed and that the data subject can be charged for this service – as long as it is not excessive.
- There is no “proto” impact assessment but only an obligation to consult the national supervisory authority when a new filing system is created and the processing is expected to pose “specific risks for the fundamental rights and freedoms, and in particular the privacy, of the data subject”. (Art. 23 JHA Framework Decision)
The proposed Law Enforcement Data Protection Directive (LEDPD) versus its predecessor (JHA Framework Decision) and the proposed GDPR
Scope: Compared to the JHA Framework Decision the LEDPD has a larger scope and is a true complement to general data protection: the JHA Framework Decision only applies where personal data are exchanged between Member States, whereas the LEDPD covers all data processing (also domestic) with regard to criminal law enforcement.
- The JHA Framework Decision applies to “processing of personal data in the framework of police and judicial cooperation in criminal matters”. (Art. 1(1) JHA Framework Decision)
- The LEDPD covers all processing of personal data for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties. (Art. 2 LEDPD)
Lawfulness and legitimacy: In contrast to the JHA Framework Decision the LEDPD includes a specification of the grounds for lawful processing. (Art. 7 LEDPD) The commentary to Art. 7 of the LEDPD clarifies: “Article 7 sets out the grounds for lawful processing, when necessary for the performance of a task carried out by a competent authority based on national law, to comply with a legal obligation to which the data controller is subject, in order to protect the vital interests of the data subject or another person or to prevent an immediate and serious threat to public security. The other grounds for lawful processing in Article 7 of Directive 95/46/EC are not appropriate for the processing in the area of police and criminal justice.”
Requirements of “proportionality” and “necessity”:
- Data transfers outside the EU: the LEDPD follows the JHA Framework Decision to a large extent. However, the extensive list of broadly formulated derogations in Art. 36 LEDPD from the “adequate level of protection” and “appropriate safeguards” principles, combined with the fact that the existence of “appropriate safeguards” is left to the self-assessment of the controller or processor, creates a very weak level of protection, also when compared to the level of protection given by the JHA Framework Decision.
Preemption of infringements - individual rights and structural requirements: With regard to individual rights and structural requirements the provisions of the JHA Framework Decision and the LEDPD are very much alike in the following respects:
- The JHA Framework Decision and the LEDPD differ from general data protection (the DPD and the proposed GDPR) by providing weak individual rights (due to exceptions) with regard to the possibilities to be informed about essential aspects of the data processing, to access and correct one’s data (rectify or erase), and to object to their processing. The new right to “be forgotten” has not been included in the LEDPD.
- The JHA Framework Decision and the LEDPD offer few, if any, structural requirements, such as the GDPR requirements to carry out an Impact Assessment and to maintain extensive documentation of processing. However, in contrast to the JHA Framework Decision, the LEDPD does include the obligation for Data Protection by Design and Default (Art. 19 LEDPD), though this obligation is weaker and less specific than its equivalent in the proposed GDPR.
The LEDPD differs from the JHA Framework Decision (non-exhaustive list) in the following respects:
- The provision in Art. 17(1) JHA Framework Decision limiting the right of access by creating an option for non-excessive delay and a charge for the service has been removed.
- Instead, the LEDPD now contains a provision relating to the exercise of all individual rights of the data subject, stating that the data subject might be denied in one’s rights when the request is considered “vexatious”: “Where requests are vexatious, in particular because of their repetitive character, or the size or volume of the request, the controller may charge a fee for providing the information or taking the action requested, or the controller may not take the action requested. In that case, the controller shall bear the burden of proving the vexatious character of the request”. (Art. 10(5) LEDPD) This provision clearly gives a rather broad possibility for denying the individual rights.
- Moreover, Art. 10(4) LEDPD states that “Member States shall provide that the controller informs the data subject about the follow-up given to their request without undue delay”. Similarly to the JHA Framework Decision delays are thus permitted as long as they are not “undue”. In contrast to the JHA Framework Decision this provision does not just relate to the right of access but to all individual rights.
- The LEDPD gives Member States the option to exclude whole categories of data from the right of access. (Art. 13(2) LEDPD) The JHA Framework Decision did not provide Member States with this option.
Some remarks with regard to the structure of the analysis:
The EU data protection instruments operate according to a logic that differs from the triple
proportionality approach of the ECHR. While a freedom like the respect for private life (Art.
8 ECHR) follows the structure that this freedom might not be infringed upon unless the
infringement passes the proportionality test (freedom is default, infringement is the
exception), EU data protection reverses the reasoning by making processing the default, as
long as the conditions of the data protection instruments are taken into account (“there is
no infringement” is default, as long as all requirements are fulfilled). Despite their
fundamentally different structure there are striking similarities between the elements
constituting the “unless”-condition of the ECHR freedoms and the “as long as”-conditions of
the EU data protection instruments.
In contrast to the relevant rights of the ECHR and the CFREU, we cannot reproduce the
whole text of the relevant EU data protection instruments here. Instead we have
summarized some of the most salient and relevant provisions in the four textboxes above.
We have structured our summaries according to six categories: (1) definitions, (2) scope, (3)
lawfulness and legitimacy, (4) requirements of proportionality and necessity, (5) preemption
of infringements - individual rights and structural requirements, and (6) judicial remedies.
This categorization allows for a good comparison with the ECHR and CFREU rights. The
conditions of categories 3 and 4 can be compared with the ECHR proportionality test.
Category 5 contains requirements that regulate processing and preempt infringements –
there are no corresponding requirements included in the ECHR rights, simply because these
rights do not conform to the regulatory and preemptive logic of the EU secondary legal
instruments that we discuss in this deliverable.
Keeping these caveats in mind, we found that with only a few minor adjustments we
could nevertheless largely follow the same quadruple structure of analysis that we used for
our assessment of the relevant ECHR and CFREU rights.
General remarks:
Many of the SMTs described in SIAM have to do with the processing of personal data. It is
therefore clear that the aforementioned EU legal instruments with regard to data protection
are of utmost importance to SIAM.
All of the four aforementioned instruments contain similar elements and
follow approximately the same structure. Firstly, the processing should have a lawful
ground. Secondly, the processing should take place in a fair, legitimate and
proportional way: the processing should have a specific, explicit and legitimate
purpose, it should be adequate, relevant and not excessive in relation to that
purpose, the data should be accurate and kept no longer than necessary and the
processing should happen in a way that guarantees security and confidentiality.
Moreover, if the personal data are sensitive, processing is not allowed unless the
data subject has given explicit consent or a limited list of very vital interests is at
stake. Extra strict conditions also apply in the case of automated profiling without
any human oversight leading up to decisions with a significant impact for the data
subject: this is only allowed in the performance of a contract or if authorized by “a
law which also lays down measures to safeguard the data subject's legitimate
interests.” (Art. 15(2)(b) DPD; Art. 20(2)(b) GDPR; Art. 7 JHA Framework Decision; Art.
9(1) LEDPD) Thirdly, each of the four legal EU instruments grants a set of individual
rights to the data subject creating possibilities for the data subject to be informed
about essential aspects of the data processing, to access and correct one’s data
(rectify or erase), and to object to their processing if there are justified grounds to do
so. Compared to the strength of the individual rights granted by the current and
proposed general data protection instruments, the ones granted in the legal
instruments relating to law enforcement are significantly weaker. When comparing
the current DPD with its successor, one can see that the individual rights of the data
subject have been strengthened in the proposed GDPR. In contrast, when comparing
the JHA Framework Decision with its successor, the individual rights seem to have
stayed equally weak (some rights might have become slightly stronger, while others
have become slightly weaker). Fourthly, each of the discussed legal instruments
offers some structural-preemptive measures to prevent unlawful data processing of
personal data. These measures include the establishment of supervisory authorities,
data protection by design and default and, in the case of general data protection,
impact assessments. Again there is a significant difference between general and law
enforcement data protection: the latter does not include an obligation for an impact
assessment of the processing, has a weaker version of data protection by design and
by default, and grants fewer powers to supervisory authorities98. Also when comparing
the two current instruments (general and law enforcement) with the two proposed
ones, one can see that the structural-preemptive measures have been significantly
strengthened. Fifthly, each of these legal instruments sets certain conditions for
transfers of personal data to non-EU States. The basic principle in all four instruments
is that the receiving State should have an “adequate level” of data protection, and
that if the Commission has not established whether a State has an “adequate level”,
the existence of appropriate safeguards should be assessed. Data transfers to States
that do not have an appropriate level of protection and do not have appropriate
safeguards, can only take place if a certain set of conditions, set out in each of the
instruments, is fulfilled. These conditions differ across the four instruments. A
general observation is that, compared to the current legislative instruments, the
proposed ones have weaker conditions for transfers. This is especially the case in the
proposed LEDPD, where the assessment of the presence of “appropriate safeguards”
is left to a self-assessment by the data controller (Art. 35(1)b LEDPD) and where
there is a long list of broadly formulated derogations to the “adequate level” and
“appropriate safeguards” requirements. Sixthly, each of the legal instruments
provides for judicial remedies, sanctions and damages. The proposed GDPR creates
the possibility to impose large administrative fines on big multinationals (up to 2
percent of a company’s annual worldwide income). (Art. 79 GDPR)
(i) Under what conditions is processing of personal data justified?
Processing of personal data is only allowed when there is a legal ground legitimizing
the processing; when the processing is done to realize a specific, explicit and
legitimate aim; and when it happens in a proportional way. The processing should be
in accordance with all the requirements set out by the applicable legal instrument.
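The cumulative conditions described above (a legal ground, a specific and explicit purpose, proportionality) can be read as a pre-screening checklist. The following sketch is purely illustrative and does not encode any actual legal instrument; the class, function and field names are our own invention, and a real assessment would of course require legal judgment rather than a mechanical test:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ProcessingOperation:
    legal_ground: str                # e.g. "consent", "legitimate interest", or "" if none
    purposes: List[str]              # specified, explicit purposes
    data_categories: List[str]       # categories of personal data collected
    necessary_categories: List[str]  # categories actually needed for the purposes

def screening_issues(op: ProcessingOperation) -> List[str]:
    """Return a list of prima facie data protection issues (empty = none found)."""
    issues = []
    if not op.legal_ground:
        issues.append("no legal ground legitimizing the processing")
    if not op.purposes:
        issues.append("no specific, explicit purpose has been defined")
    # Proportionality / data minimization: data collected beyond what the
    # stated purpose requires is flagged as excessive.
    excessive = set(op.data_categories) - set(op.necessary_categories)
    if excessive:
        issues.append("data not necessary for the purpose: " + ", ".join(sorted(excessive)))
    return issues

op = ProcessingOperation(
    legal_ground="consent",
    purposes=["access control at station entrance"],
    data_categories=["face image", "travel history"],
    necessary_categories=["face image"],
)
print(screening_issues(op))  # flags "travel history" as excessive for the stated purpose
```

Such a checklist can at most surface obvious problems early in the design stage; whether the processing is actually in accordance with all requirements of the applicable legal instrument remains a matter of legal assessment.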
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
98 EDPS/12/7 of 7th March 2012, available at http://europa.eu/rapid/pressReleasesAction.do?reference=EDPS/12/7&format=HTML&aged=0&language=EN&guiLanguage=fr.
Because the secondary data protection instruments are not on an equal level with
the CFREU rights (primary EU law), they cannot be in conflict with a fundamental
right on an equal footing. Therefore the whole issue of fair balancing between
conflicting rights does not apply here. If a provision of a secondary legal EU
instrument is not in accordance with one of the CFREU rights, it will have to be
declared invalid.
However, it can be helpful to interpret certain provisions of the secondary
data protection instruments in light of a particular fundamental right. In this way
provisions can be put in broader perspective. For example, in our discussion of Art. 6
ECHR (fair trial) we argued that profile transparency granted by the DPD (“knowledge
of the logic involved in any automatic processing of data concerning him at least in
the case of the automated decisions referred to in Article 15 (1) DPD”), Art. 5(2) ECHR
(stating that everyone who is arrested has the right to be informed, in a language
which he understands, of the reasons for the arrest and the charges) and Art. 6 ECHR
(part of “fair trial” is that a suspect is notified of the charge, given access to all the
evidence brought before the court and provided with the opportunity to contest it)
provide for a continuum of due process protection in different stages of suspicion.
Secondly, when interpreting secondary data protection instruments it is
important to take their overlaps and differences in protective scope with ECHR rights,
CFREU rights and other secondary legal instruments into account. For example, in our
analysis of Art. 8 ECHR we already discussed the overlaps and differences in
protective scope between data protection and Art. 8 ECHR (respect for private life).
Another interesting set of overlaps and differences in protective scope occurs
between the data protection provisions on “sensitive data”99 and anti-discrimination
law. (Gellert, et al., 2012) The data categorized as sensitive in all of these provisions
are exactly the same, apart from the ones in the proposed GDPR, which has an
overlapping but more extensive list. When comparing the (original or extended) list
of categories of sensitive personal data in the data protection instruments, one
99 Provisions with regard to sensitive personal data can be found not only in the four legislative instruments discussed here (Art. 8 DPD; Art. 9 proposed GDPR; Art. 6 JHA Framework Decision; Art. 8 proposed LEDPD), but also, for example, in Art. 11(3) of the Council Proposal for a Passenger Name Record (PNR) Directive, COM(2011) 32 final (2011), a legislative instrument that we briefly discussed in section 2.1.3 in relation to passenger profiling.
cannot fail to notice both the overlap and differences with the set of protected
grounds of EU anti-discrimination law. Overlapping with Art. 21 CFREU (prohibition of
discrimination) are: racial or ethnic origin, political opinions, religious or philosophical
beliefs, trade-union membership, and the processing of data concerning health or sex
life. Overlapping with Art. 13 TEC (the source for all current secondary EU anti-
discrimination law) are: racial or ethnic origin, religion or belief and sexual
orientation. However, some of the protected anti-discrimination grounds are
strikingly absent from the list of sensitive personal data: categories such as sex, age,
and nationality.
The fact that there is only a partial overlap between the categories of sensitive
data and prohibited grounds of discrimination can be explained by the fact that EU
data protection law and EU anti-discrimination law have partly overlapping and partly
different rationales, which is reflected in their respective protective scopes. Both
data protection and anti-discrimination law share a concern for anti-discrimination and
the need to combat differential treatment on “distasteful” grounds such as race. The
difference between these two legal regimes is that data protection is concerned with the
process of data processing, while anti-discrimination law is concerned with differential
treatments, which sometimes happen to be the outcome of the processing of (sensitive)
personal data.
Recently the European Union Agency for Fundamental Rights (FRA) suggested in
Opinion (1/2011) on the proposed PNR-profiling Directive (COM(2011) 32 final) that
the discrepancy between the categories of sensitive data and the prohibited grounds
for discrimination should be dissolved by classifying all data related to the prohibited
grounds of Art. 21 CFREU as sensitive, because the prohibition of the processing of such data
would help to pre-empt direct discrimination. The Commission has expressed its
approval of this suggestion (Computers, Privacy and Data Protection Conference,
Brussels, 27 January 2012). However, this could lead to quite absurd results, because
categories as sex, age, birth, nationality and language probably belong to the most
frequently processed personal data.
Data Protection

Art. 8(1) DPD; Art. 6 JHA Framework Decision; Art. 8 LEDPD; Art. 11(3) of the proposed PNR Directive, COM(2011) 32 final (2011):
“Member States shall prohibit the processing of personal data revealing: racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.”

Art. 9(1) proposed GDPR:
“The processing of personal data, revealing: race or ethnic origin, political opinions, religion or beliefs, trade-union membership, and the processing of genetic data or data concerning health or sex life or criminal convictions or related security measures shall be prohibited.”

Anti-Discrimination

Art. 21 CFREU:
“(1) Any discrimination based on any ground such as: sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation shall be prohibited. (2) Within the scope of application of the Treaty […] any discrimination on grounds of nationality shall be prohibited.”

Art. 13 TEC:
“…take appropriate action to combat discrimination based on sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation.”
Figure 4. Discrepancies and overlaps between the categories of sensitive data and the prohibited
grounds for discrimination. In this table the bold categories are the ones that overlap, the
italic ones partly overlap, and the underlined ones are new additions in the proposal for a
Regulation (2012) which would replace Directive 95/46.
Reproduced, with minor adjustments, from: Gellert, et al., 2012.
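The discrepancy discussed above can be restated as a simple set comparison. The sketch below uses simplified labels of our own, not the official legal wording, and only the selected Art. 21(1) CFREU grounds highlighted in the text; it merely illustrates which of those grounds lack a counterpart in the sensitive-data lists:

```python
# Categories of sensitive data under Art. 8(1) DPD and equivalents (simplified labels)
sensitive_data = {
    "racial or ethnic origin", "political opinions",
    "religious or philosophical beliefs", "trade-union membership",
    "health", "sex life",
}

# Selected protected grounds of Art. 21(1) CFREU discussed in the text (simplified labels)
discrimination_grounds = {
    "racial or ethnic origin", "political opinions",
    "religious or philosophical beliefs",
    "sex", "age", "nationality",
}

# Grounds present in both regimes, and grounds absent from the sensitive-data lists
print(sorted(discrimination_grounds & sensitive_data))
print(sorted(discrimination_grounds - sensitive_data))  # sex, age and nationality are "strikingly absent"
```

The second set difference reproduces the observation above: categories such as sex, age and nationality are protected grounds of discrimination without being listed as sensitive personal data.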
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
Smart CCTV systems and passenger profiling systems that process personal data
(data that can be related to an identified or identifiable person) fall under the scope
of the legal data protection instruments. Smart CCTV systems and passenger profiling
systems that process personal data consequently have to be in accordance with all
the provisions of the applicable legal instrument. As shown in the previous section,
anti-discrimination law and data protection law have complementary scopes: smart
CCTV systems and passenger profiling systems that do not process personal data but
do result in illegal differential treatments fall under the scope of anti-discrimination
law.
(iv) What are the design implications for Smart CCTV?
Data protection design implications for smart CCTV are: (a) a proto-legal
proportionality test; (b) enhancement of security and confidentiality (limiting data
access); (c) enhancement of transparency and awareness; (d) enhancement of
accountability; (e) data minimization; and, possibly, (f) Discrimination-Aware
Data Mining (DADM), to filter out sensitive grounds.
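Point (f) can be illustrated with a minimal sketch of one simple DADM strategy: suppressing sensitive attributes, and attributes that act as close proxies for them, before a profiling model is trained. This is only one DADM technique among several, and the attribute names, threshold and crude proxy measure below are invented for illustration:

```python
from collections import defaultdict

def determination(records, protected, candidate):
    """Fraction of records in which the candidate attribute's value
    determines the protected attribute's value (a crude proxy test,
    standing in for a proper statistical dependence measure)."""
    by_candidate = defaultdict(list)
    for r in records:
        by_candidate[r[candidate]].append(r[protected])
    matched = sum(max(vals.count(v) for v in set(vals))
                  for vals in by_candidate.values())
    return matched / len(records)

def suppress_sensitive(records, protected, proxy_threshold=0.95):
    """Drop protected attributes and any attribute that almost fully
    determines a protected attribute (a likely proxy)."""
    candidates = set(records[0]) - set(protected)
    proxies = {a for a in candidates for p in protected
               if determination(records, p, a) >= proxy_threshold}
    drop = set(protected) | proxies
    return [{k: v for k, v in r.items() if k not in drop} for r in records]

# Toy data: "postcode" perfectly predicts the sensitive ground and is
# therefore suppressed along with it; "luggage_items" is retained.
records = [
    {"ethnic_origin": "A", "postcode": "1000", "luggage_items": 1},
    {"ethnic_origin": "A", "postcode": "1000", "luggage_items": 2},
    {"ethnic_origin": "B", "postcode": "2000", "luggage_items": 1},
    {"ethnic_origin": "B", "postcode": "2000", "luggage_items": 2},
]
cleaned = suppress_sensitive(records, protected=["ethnic_origin"])
print(sorted(cleaned[0]))  # → ['luggage_items']
```

Suppressing proxies, not only the protected attribute itself, matters because indirect discrimination (discussed below) typically operates through apparently neutral attributes such as a postcode.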
(b) Anti-discrimination
Legislation with regard to general anti-discrimination law:
Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment
between persons irrespective of racial or ethnic origin, Official Journal L 180, 19 July 2000, p.
22-26 (“Race Directive”).
Directive 2000/78/EC of 27 November 2000 establishing a general framework for
equal treatment in employment and occupation, Official Journal L 303, 2 December 2000, p.
16-22 (“Employment Equality Directive”).
Directive 2006/54/EC of 5 July 2006 on the implementation of the principle of equal
opportunities and equal treatment of men and women in matters of employment and
occupation (recast), Official Journal L 204, 26 July 2006, p. 23-34 (“Gender Recast
Directive”).
Directive 2004/113/EC of 13 December 2004 implementing the principle of equal
treatment between men and women in the access to and supply of goods and services,
Official Journal L 373, 21 December 2004, p. 37-43 (“Gender Goods and Services
Directive”).
Proposal for a Council Directive on implementing the principle of equal treatment
between persons irrespective of religion or belief, disability, age or sexual orientation, 2 July
2008, COM (2008) 426 (“Proposed Equal Treatment Directive”).
Exemplary provisions from the Employment Directive (ED); very similar provisions are present in the other anti-discrimination Directives (Race, Gender Recast, Gender Goods and Services, and the Proposed Equal Treatment Directive).
Definitions: concept of discrimination (Art. 2(2) ED):
- “Direct discrimination shall be taken to occur where one person is treated less favourably than another is, has been or would be treated in a comparable situation, […]”
- “Indirect discrimination shall be taken to occur where an apparently neutral provision, criterion or practice would put persons having a particular religion or belief, a particular disability, a particular age, or a particular sexual orientation at a particular disadvantage compared with other persons unless that provision, criterion or practice is objectively justified by a legitimate aim and the means of achieving that aim are appropriate and necessary […].”
Exceptions:
- A genuine and determining occupational requirement: “[…] Member States may provide that a difference of treatment which is based on a characteristic related to any of the grounds referred to in Article 1 shall not constitute discrimination where, by reason of the nature of the particular occupational activities concerned or of the context in which they are carried out, such a characteristic constitutes a genuine and determining occupational requirement, provided that the objective is legitimate and the requirement is proportionate.” (Art. 4 ED)
- Positive action: “With a view to ensuring full equality in practice, the principle of equal treatment shall not prevent any Member State from maintaining or adopting specific measures to prevent or compensate for disadvantages linked to any of the grounds referred to in Article 1.” (Art. 7 ED)
Defence of rights:
“1. Member States shall ensure that judicial and/or administrative procedures, including where they deem it appropriate conciliation procedures, for the enforcement of obligations under this Directive are available to all persons who consider themselves wronged by failure to apply the principle of equal treatment to them, even after the relationship in which the discrimination is alleged to have occurred has ended. 2. Member States shall ensure that associations, organisations or other legal entities which have, in accordance with the criteria laid down by their national law, a legitimate interest in ensuring that the provisions of this Directive are complied with, may engage, either on behalf or in support of the complainant, with his or her approval, in any judicial and/or administrative procedure provided for the enforcement of obligations under this Directive.” (Art. 9 ED)
Figure 5. The different protective scopes for the various prohibited grounds of discrimination.
Expanded and adjusted version of a table in: CEJI Policy Response, 2010, p. 2.
General remarks:
EU secondary anti-discrimination law is of pivotal importance to SIAM because several of
the discussed SMTs are based on profiling, a practice that inherently involves
differentiation and classification. These differentiations and classifications can form a basis
for differential treatment. Not all differential treatment is prohibited: the EU
anti-discrimination Directives only cover six protected grounds (see figure 5), and the
scope of protection for each of these grounds is, moreover, limited to a particular set of areas
of life (see again figure 5). Discrimination based on race or ethnic origin is prohibited in
almost every area of life, whereas the other grounds are protected in a more limited set of
areas (see, for greater detail on this asymmetrical protection, Gellert et al., 2012,
pp. 67-68). The proposed Equal Treatment Directive would even out the scope of protection
across the various protected grounds.
An important conceptual distinction in EU anti-discrimination law is the one between
direct and indirect discrimination. Both direct and indirect discrimination are prohibited in all
of the recent Directives.
“Direct discrimination occurs when a person is treated in a less favourable way than
another person and this difference is based directly on a forbidden ground. For
instance, the Race Equality (RE) Directive states that ‘direct discrimination shall be
taken to occur where one person is treated less favourably than another is, has been
or would be treated in a comparable situation on grounds of racial or ethnic origin’
(art. 2(2a)). Indirect discrimination makes a conceptual shift from consistency to
substance (Fredman, 2002) by providing protection from apparently neutral
provisions, criteria or practices which have the ‘side effect’ of discriminating against
one of the specific forbidden grounds. Discrimination based on a neutral ‘proxy’ that
disadvantages a protected group is thus prevented, ‘unless that provision, criterion
or practice is objectively justified by a legitimate aim and the means of achieving that
aim are appropriate and necessary’ (art. 2(2b) Race Directive).” (Gellert, et al., 2012,
p. 65)
(i) Under what conditions are infringements on prohibited discrimination grounds
justified?
The conditions under which differential treatments based on protected grounds are
allowed are extremely limited. There are basically two situations in which differential
treatment is allowed: (1) in the name of positive action (e.g., Art. 7 Employment
Directive; Art. 5 Race Directive; Art. 6 Gender Goods and Services Directive) or, (2) if
there is a “genuine and determining occupational requirement.” (e.g., Art. 4(1)
Employment Directive; Art. 4(1) Race Directive; Art. 14(2) Gender Recast Directive)
For example, if there is a job opening for the position of a Catholic priest,
differentiation based on religion will be considered justified because of the “nature
of the particular occupational activities.” Only the protection against age
discrimination in the field of employment, and gender discrimination in the field of
access to and supply of goods and services, can be derogated from on the basis of a more
general derogation: “if…they are objectively and reasonably justified by a legitimate
aim” (Art. 6 Employment Directive, with regard to differentiation based on age) and
“if the provision of the goods and services exclusively or primarily to members of one
sex is justified by a legitimate aim and the means of achieving that aim are
appropriate and necessary” (Art. 4(5) Gender Goods and Services Directive). The
Gender Goods and Services Directive used to contain another derogation in Art. 5(2),
permitting proportionate differences in individuals’ premiums and benefits based on
sex if these differential treatments were based “on relevant and accurate actuarial
and statistical data”, but in Test-Achats [100] the European Court of Justice held that this
provision was invalid due to incompatibility with Arts. 21 and 23 of the CFREU
(non-discrimination and equality between women and men). This shows that at the EU
level the prohibition of discrimination is protected in a rather categorical way,
allowing for very few exceptions. Importantly for SMTs based on automated profiling,
the Test-Achats case showed that reliance on “relevant and accurate actuarial and
statistical data” is in itself not a sufficient justification for a differential treatment.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Because the secondary anti-discrimination instruments are not on an equal level with
the CFREU rights (primary EU law), they cannot be in conflict with a fundamental
right on an equal footing. The issue of fair balancing between fundamental rights
therefore does not apply.
For convergences with the right to respect for private life (Art. 8 ECHR) and with
data protection, we refer the reader to our analyses in sections 2.4 and 2.5.2,
respectively (above).
[100] Test-Achats v. Council, C-236/09, judgment of 1 March 2011.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
Travel is a service that can be bought. With respect to anti-discrimination, the area
of life relevant to SIAM will therefore mainly be “access to and supply of goods and
services.” Within this area, discrimination based on sex, race and ethnic origin is
prohibited. Discrimination based on race or ethnic origin is completely prohibited in
this field, while differential treatment based on sex is only allowed if it pursues a
legitimate aim and the means to achieve that aim are appropriate and necessary
(Art. 4(5) Gender Goods and Services Directive). If the proposed Equal Treatment
Directive comes into force, discrimination based on the four remaining protected
grounds will also become prohibited in this area.
(iv) What are the design implications for Smart CCTV?
The anti-discrimination design implication for smart CCTV is: (f) Discrimination-Aware
Data Mining (DADM).
(c) Freedom of movement
Legislation with regard to freedom of movement:
Directive 2004/38/EC of the European Parliament and of the Council of 29 April 2004
on the right of citizens of the Union and their family members to move and reside freely
within the territory of the Member States, Official Journal L 229, 29 June 2004, p. 35-48
General remarks:
Free movement within the EU, particularly of EU nationals but also of others, is one of the
fundamental objectives of the EU. However, freedom of movement, or the so-called right of
“exit and entry”, is not an absolute right. The provisions regulating freedom of movement are
quite complex and detailed. We leave out the complexities and focus on the basic
principle that entry to and exit from Member States should not be prohibited. For SIAM this is
of utmost importance because transportation sites and airports are exactly the places where
exit and entry into another State can be scrutinized, delayed or prohibited.
Freedom of Movement and Residence Directive, Arts. 4, 5 and 27 (right of exit and entry, and restrictions on the right of entry):
Article 4. Right of exit. 1. Without prejudice to the provisions on travel documents applicable to national border controls, all Union citizens with a valid identity card or passport and their family members who are not nationals of a Member State and who hold a valid passport shall have the right to leave the territory of a Member State to travel to another Member State. [...]
Article 5. Right of entry. 1. Without prejudice to the provisions on travel documents applicable to national border controls, Member States shall grant Union citizens leave to enter their territory with a valid identity card or passport and shall grant family members who are not nationals of a Member State leave to enter their territory with a valid passport. No entry visa or equivalent formality may be imposed on Union citizens. [...]
Art. 27. General principles restricting the right of entry. [...] Member States may restrict the freedom of movement [...] of Union citizens and their family members, irrespective of nationality, on grounds of public policy, public security or public health. These grounds shall not be invoked to serve economic ends.
(i) Under what conditions are infringements on free movement of EU nationals within
the EU justified?
Art. 27 states that the right of entry can be restricted on grounds of public
policy, public security or public health, but underlines that these grounds shall not
be invoked to serve economic ends.
(ii) Are there specific conflicts or convergences with other fundamental rights that are to
be expected when assessing the compatibility of this right with an SMT?
Because Directive 2004/38/EC, as secondary legislation, is not on an equal level with
the CFREU rights (primary EU law), it cannot be in conflict with a fundamental
right on an equal footing. The issue of fair balancing is therefore not relevant here.
The fact that freedom of movement is included in the CFREU as a fundamental
freedom gives the provisions in this Directive additional weight.
(iii) Which compatibility issues can be envisioned in relation to Smart CCTV and Passenger
Profiling?
Smart CCTV systems and passenger profiling systems can be used to target persons
who are to be denied exit from or entry to another EU Member State. When the
grounds underlying a restriction on the right of entry or exit are not public policy,
public security or public health but, for example, the targeting of people of a certain
economic status (“poor people”) or ethnic origin, the systems used for this type of
targeting are incompatible with Directive 2004/38/EC.
(iv) What are the design implications for Smart CCTV?
The design implication for smart CCTV in this respect is: (f) Discrimination-Aware Data
Mining (DADM).
Annex:
- Three tables illustrating the analyses of section 2.4 (ECHR fundamental rights)
- Six tables illustrating the analyses of section 2.5 (CFREU fundamental rights and EU
secondary legislation)
Legal framework: Council of Europe – European Convention on Human Rights (ECHR). SMT: Passenger profiling.

Passenger profiling based on risk-categories:
- Art. 2 (Right to life): (1) Potential incompatibility if the system is used to protect from a concrete, life-threatening danger and is ineffective in a disproportional way; (2) passenger profiling systems that are connected to weapons or devices that can threaten life are incompatible.
- Art. 3 (Prohibition of torture): (1) Incompatibility if risk-profiles lead to institutionalized racism or other severe discrimination, because this could be characterized as inhumane or degrading treatment; (2) incompatibility if used to target passengers in order to submit them to torture, inhumane or degrading treatment.
- Art. 5 (Freedom from unlawful detention): Incompatibility if used to arrest persons and the profiling is based on such a complex or opaque data model that the reasons for the arrest are difficult to articulate.
- Art. 6 (Presumption of innocence and fair trial): Incompatibility if used as evidence and the profiling is based on such a complex or opaque data model that the reasons for the arrest are difficult to articulate. However, this is mainly an issue of evidence law.
- Art. 8 (Respect for private and family life): Likely to infringe on Art. 8, but the question is whether this infringement is incompatible. This is only the case when the triple proportionality test (in accordance with law, legitimate aim, necessary in a democratic society) is not passed.
- Art. 9(1) (Freedom of thought, conscience and religion): Incompatibility when (1) the system infringes (e.g. discrimination on the basis of belief, interference with the manifestation of belief, or a “chilling” effect), AND (2) the triple proportionality test (in accordance with law, legitimate aim, necessary in a democratic society) is not passed.
- Art. 14 (Prohibition of discrimination with regard to the exercise of other human rights): Incompatibility when (1) the risk-profiling results in differential treatment, AND (2) the triple proportionality test (in accordance with law, legitimate aim, necessary in a democratic society) is not passed.

Passenger profiling based on detecting specified individuals:
- Idem (see above).
- See above, point 2. Less likely to cause compatibility issues, but still: see above.
- Less likely to cause compatibility issues, but still: see above.
- Idem (see above). Less likely to cause compatibility issues, but still: see above.
- Less likely to be incompatible, but still: see above.
Legal framework: Council of Europe – European Convention on Human Rights (ECHR). SMT: Smart CCTV.

1-to-1 recognition: Facial recognition (1-to-1):
- Art. 2 (Right to life): (1) Potential incompatibility if the system is used to protect from a concrete, life-threatening danger and is ineffective in a disproportional way; (2) smart CCTV systems that are connected to weapons or devices that can threaten life are incompatible.
- Art. 3 (Prohibition of torture): Incompatibility if used to target passengers in order to submit them to torture, inhumane or degrading treatment.
- Art. 5 (Freedom from unlawful detention): Not very likely that compatibility issues emerge, unless a person is arrested based on a facial match that is disputed and the facial recognition is based on such a complex or opaque data model that the reasons for the arrest are difficult to articulate.
- Art. 6 (Presumption of innocence and fair trial): Not very likely that compatibility issues emerge, unless a person is arrested based on a disputed facial match, the match is used as evidence in court, and the facial recognition is based on such a complex or opaque data model that the reasons for the arrest are difficult to articulate. However, this is mainly an issue of evidence law.
- Art. 8 (Respect for private and family life): Intrusive form of surveillance, likely to pose an infringement. Will be incompatible if the triple proportionality test (in accordance with law, legitimate aim, necessary in a democratic society) is not passed.
- Art. 9(1) (Freedom of thought, conscience and religion): Incompatibility when (1) the system infringes (e.g. discrimination on the basis of belief, interference with the manifestation of belief, or a “chilling” effect), AND (2) the triple proportionality test (in accordance with law, legitimate aim, necessary in a democratic society) is not passed.
- Art. 14 (Prohibition of discrimination with regard to the exercise of other human rights): Not very likely to be incompatible.

Categorical recognition (“profiling”): Facial expression and gait recognition:
- Art. 2: Idem (see above).
- Art. 3: Idem (see above).
- Art. 5: Incompatibility if used to arrest persons and the profiling is based on such a complex or opaque data model that the reasons for the arrest are difficult to articulate.
- Art. 6: Incompatibility if used as evidence and the profiling is based on such a complex or opaque data model that the reasons for the arrest are difficult to articulate. However, this is mainly an issue of evidence law.
- Art. 8: Idem (see above).
- Art. 9(1): Idem (see above).
- Art. 14: Incompatibility when (1) the risk-profiling results in differential treatment, AND (2) the triple proportionality test (in accordance with law, legitimate aim, necessary in a democratic society) is not passed.

Categorical recognition (“profiling”): Activity recognition:
- Arts. 2, 3, 5, 6, 9(1) and 14: Idem (see above).
- Art. 8: Slightly less intrusive, but still: see above.

Categorical recognition (“profiling”): Object recognition:
- Art. 2: Idem (see above).
- Arts. 3, 5, 6, 8, 9(1) and 14: Unlikely to result in compatibility issues.
Legal framework: Council of Europe – European Convention on Human Rights (ECHR). Suggestions for LPbD with regard to Smart CCTV.

(a) Proto-legal reasoning:
- Art. 2 (Right to life): Yes, assessment of effectiveness when used for securing lives.
- Art. 3 (Prohibition of torture): Yes, partly, in the sense that a permanent self-critical attitude is required to stay within the boundaries of Art. 3.
- Arts. 5, 6, 8, 9(1) and 14: Yes, in the sense that a basic proportionality test is performed and published.
(b) Enhancement of security and confidentiality (limiting data access): (no entries)
(c) Enhancement of transparency and awareness: Yes, for five of the seven Articles.
(d) Enhancement of accountability: Yes, for five of the seven Articles.
(e) Data minimization (several methods): Yes, for three of the seven Articles.
(f) Discrimination-Aware Data Mining (DADM): Yes, to prevent institutionalized racism or other severe discrimination (Art. 3); Yes for two further Articles.
(g) Other: Permanent self-critical attitude.
Legal framework: Primary EU law – EU Charter of Fundamental Rights (CFR). SMT: Passenger profiling.

Passenger profiling based on risk-categories:
- Art. 1 (Human dignity): Incompatibility mainly when severe infringements occur, such as institutionalized racism.
- Art. 2 (Right to life): see Art. 2 ECHR.
- Art. 3(1) (Right to the integrity of the person): see Arts. 3 and 8 ECHR.
- Art. 4 (Prohibition of torture and inhuman or degrading treatment): see Art. 3 ECHR.
- Art. 6 (Right to liberty and security): see Art. 5 ECHR.
- Art. 7 (Respect for private and family life): see Art. 8(1) ECHR.
- Art. 8(1,2) (Protection of personal data): see secondary EU law.
- Art. 10(1) (Freedom of thought, conscience and religion): see secondary EU law.
- Art. 21(1) (Non-discrimination): see Art. 14 ECHR.
- Art. 24(2) (The rights of the child): Incompatibility if no attention has been given to the effect on children.
- Art. 25 (The rights of the elderly): Incompatibility if no attention has been given to the effect on the elderly.
- Art. 26 (Integration of persons with disabilities): Incompatibility if no attention has been given to the effect on persons with disabilities.
- Art. 35(2) (Health care): Incompatibility if no attention has been given to the effect on health.
- Art. 45 (Freedom of movement and of residence): see secondary EU law.
- Arts. 47 and 48 (Right to an effective remedy and to a fair trial; presumption of innocence and right of defence): see Art. 6 ECHR.

Passenger profiling based on detecting specified individuals:
- Idem (see above).
Legal framework: Primary EU law – EU Charter of Fundamental Rights (CFR). SMT: Smart CCTV.

1-to-1 recognition: Facial recognition (1-to-1):
- Art. 1 (Human dignity): If performed in a way infringing dignity.
- Art. 2 (Right to life): see Art. 2 ECHR.
- Art. 3(1) (Right to the integrity of the person): see Arts. 3 and 8 ECHR.
- Art. 4 (Prohibition of torture and inhuman or degrading treatment): see Art. 3 ECHR.
- Art. 6 (Right to liberty and security): see Art. 5 ECHR.
- Art. 7 (Respect for private and family life): see Art. 8(1) ECHR.
- Art. 8(1,2) (Protection of personal data): see secondary EU law.
- Art. 10(1) (Freedom of thought, conscience and religion): see secondary EU law.
- Art. 21(1) (Non-discrimination): see Art. 14 ECHR.
- Art. 24(2) (The rights of the child): Incompatibility if no attention has been given to the effect on children.
- Art. 25 (The rights of the elderly): Incompatibility if no attention has been given to the effect on the elderly.
- Art. 26 (Integration of persons with disabilities): Incompatibility if no attention has been given to the effect on persons with disabilities.
- Art. 35(2) (Health care): Incompatibility if no attention has been given to the effect on health.
- Art. 45 (Freedom of movement and of residence): see secondary EU law.
- Arts. 47 and 48 (Right to an effective remedy and to a fair trial; presumption of innocence and right of defence): see Art. 6 ECHR.

Categorical recognition (“profiling”): Facial expression and gait recognition: Idem (see above).
Categorical recognition (“profiling”): Activity recognition: Idem (see above).
Categorical recognition (“profiling”): Object recognition: Unlikely to result in compatibility issues.
Legal framework: Primary EU law – EU Charter of Fundamental Rights (CFR). Suggestions for LPbD with regard to Smart CCTV.

(a) Proto-legal reasoning:
- Art. 1 (Human dignity): Yes.
- Art. 2 (Right to life): see Art. 2 ECHR.
- Art. 3(1) (Right to the integrity of the person): see Arts. 3 and 8 ECHR.
- Art. 4 (Prohibition of torture and inhuman or degrading treatment): see Art. 3 ECHR.
- Art. 6 (Right to liberty and security): see Art. 5 ECHR.
- Art. 7 (Respect for private and family life): see Art. 8(1) ECHR.
- Art. 8(1,2) (Protection of personal data): see secondary EU law.
- Art. 10(1) (Freedom of thought, conscience and religion): see secondary EU law.
- Art. 21(1) (Non-discrimination): see Art. 14 ECHR.
- Art. 45 (Freedom of movement and of residence): see secondary EU law.
- Arts. 47 and 48 (Right to an effective remedy and to a fair trial; presumption of innocence and right of defence): see Art. 6 ECHR.
(b) Enhancement of security and confidentiality (limiting data access): (no entries)
(c) Enhancement of transparency and awareness: (no entries)
(d) Enhancement of accountability: (no entries)
(e) Data minimization (several methods): (no entries)
(f) Discrimination-Aware Data Mining (DADM): (no entries)
(g) Other:
- Art. 24(2): design with children in mind.
- Art. 25: design with the elderly in mind.
- Art. 26: design with disabilities in mind.
- Art. 35(2): design with health in mind.
Legal framework: Secondary EU law – EU Directives and Regulations. SMT: Passenger profiling.

Passenger profiling based on risk-categories:
- Data Protection Directive 95/46/EC (general data protection): When processing personal data outside the scope of law enforcement, the processing needs to take place in accordance with all the requirements of the DPD. One of the main rationales informing many of the provisions of the DPD is that the processing is done based on a legal ground legitimizing it, that there is a specific, explicit and legitimate aim, and that it happens in a proportional way. If not, the processing is incompatible with the DPD.
- Framework Decision 2008/977/JHA (data protection in police and judicial cooperation in criminal matters): When processing personal data inside the scope of law enforcement, the processing needs to take place in accordance with all the requirements of the Framework Decision. This regime is similar to that of the DPD but has weaker constraints.
- Employment Equality Directive 2000/78/EC: Incompatibility when differential treatment occurs in the field of employment (e.g. only Catholic people are employed to deal with the passenger profiling SMT). Protected grounds are: religion or belief, disability, age and sexual orientation.
- Racial Equality Directive 2000/43/EC: Incompatibility when differential treatment based on race occurs in any field (employment, services, social protection, etc.). Passenger profiling resulting in direct or indirect discrimination based on race is incompatible with the Race Directive.
- Gender Recast Directive 2006/54/EC: Incompatibility when differential treatment based on sex occurs in the field of employment (e.g. only male operators are employed to deal with the passenger profiling SMT).
- Gender Goods and Services Directive 2004/113/EC: Incompatibility when differential treatment based on sex occurs with regard to access to goods and services. A passenger profiling system that structurally stops male passengers from boarding a plane or using the subway is incompatible with this Directive.
- Proposed Equal Treatment Directive (COM (2008) 426): Has not yet entered into force, but would make differential treatment based on religion or belief, disability, age or sexual orientation in any field (employment, services, social protection, etc.) incompatible with this Directive.
- Directive 2004/38/EC (right to move and reside freely): Incompatibility if exit from or entry to a State is not allowed based on a ground that is unrelated to public policy, public security or public health.

Passenger profiling based on detecting specified individuals:
- Data Protection Directive 95/46/EC: Idem (see above).
- Framework Decision 2008/977/JHA: Idem (see above).
- Employment Equality Directive 2000/78/EC: Idem (see above).
- Racial Equality Directive 2000/43/EC: Direct discrimination is unlikely, but indirect discrimination can occur.
- Gender Recast Directive 2006/54/EC: Idem (see above).
- Gender Goods and Services Directive 2004/113/EC: Direct discrimination is unlikely, but indirect discrimination can occur.
- Proposed Equal Treatment Directive: Direct discrimination is unlikely, but indirect discrimination can occur.
- Directive 2004/38/EC: Idem (see above).
Legal framework: Secondary EU law – EU Directives and Regulations. SMT: Smart CCTV.

1-to-1 recognition: Facial recognition (1-to-1):
- Data Protection Directive 95/46/EC (general data protection): When processing personal data outside the scope of law enforcement, the processing needs to take place in accordance with all the requirements of the DPD. One of the main rationales informing many of the provisions of the DPD is that the processing is done based on a legal ground legitimizing it, that there is a specific, explicit and legitimate aim, and that it happens in a proportional way. If not, the processing is incompatible with the DPD.
- Framework Decision 2008/977/JHA (data protection in police and judicial cooperation in criminal matters): When processing personal data inside the scope of law enforcement (judicial cooperation, not domestic processing; the latter would be covered by the proposed LEDPD), the processing needs to take place in accordance with all the requirements of the Framework Decision. This regime is similar to that of the DPD but has weaker constraints.
- Employment Equality Directive 2000/78/EC: Incompatibility when differential treatment occurs in the field of employment (e.g. only Catholic people are employed to deal with the Smart CCTV system). Protected grounds are: religion or belief, disability, age and sexual orientation.
- Racial Equality Directive 2000/43/EC: Incompatibility when differential treatment based on race occurs in any field (employment, services, social protection, etc.). Use of Smart CCTV resulting in direct or indirect discrimination based on race is incompatible with the Race Directive.
- Gender Recast Directive 2006/54/EC: Incompatibility when differential treatment based on sex occurs in the field of employment (e.g. only male operators are employed to deal with the Smart CCTV system).
- Gender Goods and Services Directive 2004/113/EC: Incompatibility when differential treatment based on sex occurs with regard to access to goods and services. A Smart CCTV system that results in male passengers being prevented from boarding a plane or using the subway disproportionately often is incompatible with this Directive.
- Proposed Equal Treatment Directive (COM (2008) 426): Has not yet entered into force, but would make differential treatment based on religion or belief, disability, age or sexual orientation in any field (employment, services, social protection, etc.) incompatible with this Directive.
- Directive 2004/38/EC (right to move and reside freely): Incompatibility if exit from or entry to a State is not allowed based on a ground that is unrelated to public policy, public security or public health.

Categorical recognition (“profiling”): Facial expression and gait recognition:
- Data Protection Directive 95/46/EC: The main question here is whether the processed data are personal data (can be related to an identified or identifiable person). If not, the DPD does not apply. If yes: see above.
- Framework Decision 2008/977/JHA: The main question here is whether the processed data are personal data (can be related to an identified or identifiable person). If not, the JHA Framework Decision does not apply. If yes: see above.
- All other Directives: Idem (see above).

Categorical recognition (“profiling”): Activity recognition:
- Idem (see above) for all instruments.

Categorical recognition (“profiling”): Object recognition:
- Data Protection Directive 95/46/EC: does not apply.
- Framework Decision 2008/977/JHA: does not apply.
- The five anti-discrimination Directives: do not apply.
- Directive 2004/38/EC: Idem (see above).
Legal framework: secondary EU law (EU Directives and Regulations)
- Data protection: Data Protection Directive 95/46/EC; Framework Decision 2008/977/JHA (data protection in police and judicial cooperation in criminal matters)
- Anti-discrimination: Employment Equality Directive 2000/78/EC; Racial Equality Directive 2000/43/EC; Gender Recast Directive 2006/54/EC; Gender Goods and Services Directive 2004/113/EC; Proposed Equal Treatment Directive
- Freedom of movement: Directive 2004/38/EC on the right to move and reside freely

Suggestions for LPbD with regard to Smart CCTV:
(a) Proto-legal reasoning: relevant to both data protection instruments.
(b) Enhancement of security and confidentiality (limiting data access): relevant to both data protection instruments.
(c) Enhancement of transparency and awareness: relevant to both data protection instruments.
(d) Enhancement of accountability: relevant to both data protection instruments.
(e) Data minimization (several methods): relevant to both data protection instruments.
(f) Discrimination Aware Data Mining (DADM): relevant to all eight instruments (data protection, anti-discrimination and freedom of movement).
(g) Other: relevant to both data protection instruments.
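Suggestion (f), Discrimination Aware Data Mining, is the only LPbD technique listed above that speaks to all three legal frameworks at once. As a purely illustrative sketch (not part of the SIAM deliverable, and simpler than the DADM techniques in the literature), the following Python snippet shows the kind of statistical check such techniques build on: measuring whether an automated selection decision treats a protected group differently from everyone else. The data, function name and figures are hypothetical.

```python
# Illustrative sketch only: a disparate-impact check of the kind that
# Discrimination Aware Data Mining (DADM) builds on. The records, the
# function name and the numbers below are hypothetical examples.

def disparate_impact(records, protected_key, protected_value, selected_key):
    """Ratio of the selection rate of the protected group to that of the rest.

    A ratio far from 1.0 indicates that the automated decision (e.g. a
    Smart CCTV system flagging passengers for secondary screening)
    treats the protected group differently from everyone else.
    """
    protected = [r for r in records if r[protected_key] == protected_value]
    rest = [r for r in records if r[protected_key] != protected_value]
    rate = lambda grp: sum(r[selected_key] for r in grp) / len(grp)
    return rate(protected) / rate(rest)

# Synthetic passenger records: 'sex' is the protected attribute,
# 'stopped' the automated selection decision (1 = flagged).
passengers = (
    [{"sex": "male", "stopped": 1}] * 30
    + [{"sex": "male", "stopped": 0}] * 70
    + [{"sex": "female", "stopped": 1}] * 10
    + [{"sex": "female", "stopped": 0}] * 90
)

ratio = disparate_impact(passengers, "sex", "male", "stopped")
print(ratio)  # 3.0: male passengers are stopped three times as often
```

In the DADM literature (e.g. Kamiran 2011 and Hajian 2013, both in the bibliography), measures of this kind are fed back into the mining step, so that a classifier is trained or post-processed to keep the ratio within acceptable bounds; that feedback loop is what makes DADM a "by design" protection rather than an after-the-fact audit.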
Internet resources of relevant legislative texts at the European level for the SIAM Database (organized according to the freedom infringement typology101 presented in D4.2):
General on EU legislation, EU Case law
Official website EU legislation: http://europa.eu/legislation_summaries/index_en.htm
Official website Court of Justice of the European Union (ECJ): http://curia.europa.eu
All the judgments of the ECtHR that we refer to in this deliverable are available from the
HUDOC site: http://hudoc.echr.coe.int
Security:
Airport - official website:
http://europa.eu/legislation_summaries/transport/air_transport/tr0028_en.htm
Railway safety - official website:
http://europa.eu/legislation_summaries/transport/rail_transport/l24201a_en.htm
Fight against terrorism - official website EU:
http://europa.eu/legislation_summaries/justice_freedom_security/fight_against_terrorism/index_en.htm
Detection technology in the work of law enforcement, customs and other security services:
Official site EU (German, Spanish, French):
http://europa.eu/legislation_summaries/justice_freedom_security/fight_against_terrorism/l11099_en.htm
Fight against organized crime - official site EU:
http://europa.eu/legislation_summaries/justice_freedom_security/fight_against_organised_crime/index_en.htm
Fight against trafficking in human beings - official site EU:
http://europa.eu/legislation_summaries/justice_freedom_security/fight_against_trafficking_in_human_beings/index_en.htm
Combating drugs - official site EU:
http://europa.eu/legislation_summaries/justice_freedom_security/combating_drugs/index_en.htm

101 See also figure 2 in this deliverable.
Freedom Infringements:
a. General:
The European Convention on Human Rights (Council of Europe) - official site:
http://conventions.coe.int/Treaty/en/Treaties/Html/005.htm
http://www.hri.org/docs/ECHR50.html
Human rights handbooks, practical guides to the implementation of the ECHR - official site
(Council of Europe) CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/index_handbooks_EN.asp
Official site of the European Court of Human Rights:
http://www.echr.coe.int/echr/homepage_EN
Human rights, general - official site EU:
http://europa.eu/legislation_summaries/human_rights/index_en.htm
Charter of Fundamental Rights of the European Union (CFREU) - official site EU:
http://europa.eu/legislation_summaries/justice_freedom_security/combating_discrimination/l33501_en.htm
Main trends in the recent case law of the EU Court of Justice and the European Court of
Human Rights in the field of fundamental rights, UNHCR, the UN Refugee Agency:
http://www.unhcr.org/refworld/type,CASELAWCOMP,,,5086711d2,0.html
b. Bodily integrity
A guide to the implementation of Art. 3 ECHR, CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/HRHAND-06(2003)_en.pdf
Human Rights Handbooks, no. 8, Art. 2 ECHR, CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/HRHAND-08(2006)_en.pdf
Art. 1 CFREU, Human Dignity: http://www.eucharter.org/home.php?page_id=8
Art. 2 CFREU, Right to Life: http://www.eucharter.org/home.php?page_id=9
Art. 3 CFREU, Right to the integrity of the person:
http://www.eucharter.org/home.php?page_id=10
c. Equal treatment and non-discrimination
Racial discrimination EU, Official site:
http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0043:en:HTML
Combating discrimination EU, Official site:
http://europa.eu/legislation_summaries/justice_freedom_security/combating_discrimination/index_en.htm
d. Privacy & Data protection
Privacy
Art. 8 ECHR, Official site European Court of Human Rights:
http://echr-online.com/art-8-echr/introduction
Factsheet on Data protection as part of art. 8 ECHR - official site ECHR:
http://www.echr.coe.int/Documents/FS_Data_ENG.pdf
Standard Approach, prof. Douwe Korff, London Metropolitan University:
http://ec.europa.eu/justice/news/events/conference_dp_2009/presentations_speeches/KORFF_Douwe_a.pdf
Human Rights Handbooks, no. 1, art. 8, CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/HRHAND-01(2003)_en.pdf
UK Human Rights Blog:
http://ukhumanrightsblog.com/incorporated-rights/articles-index/article-8-of-the-echr/
Data Protection
Data protection directive - official site EU:
http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML
Data protection Council Framework Decision on police and criminal matters - official site EU:
http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2008:350:0060:01:EN:HTML
Proposed General Data Protection Regulation - official site EU:
http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf
Proposed Law Enforcement Data Protection Directive - official site EU:
http://ec.europa.eu/home-affairs/doc_centre/police/docs/com_2012_10_en.pdf
An activist platform that provides excellent information on the proposed EU Data Protection
legislation: http://protectmydata.eu
e. Freedom of movement
Mobility and passenger rights - official website EU:
http://europa.eu/legislation_summaries/transport/mobility_and_passenger_rights/index_en.htm
Free movement of persons, asylum and immigration - official site EU:
http://europa.eu/legislation_summaries/justice_freedom_security/free_movement_of_persons_asylum_immigration/index_en.htm
f. Fair trial and due process
Human rights handbooks, 3, Art. 6 ECHR, Official site CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/HRHAND-03(2006)_en.pdf
g. Presumption of innocence
EU Charter of Fundamental Rights, site EU Charter: http://www.eucharter.org/
Human rights handbooks, 3, Art. 6 ECHR, Official site CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/HRHAND-03(2006)_en.pdf
Commission Green Paper on the presumption of innocence, Official site EU:
http://europa.eu/legislation_summaries/justice_freedom_security/judicial_cooperation_in_criminal_matters/l16032_en.htm
h. Freedom from unlawful detention:
Human rights handbooks, no. 5, art. 5 ECHR, official site CoE:
http://www.coe.int/t/dgi/publications/hrhandbooks/HRHAND-05(2004)_en.pdf
Art. 6 CFREU, Right to liberty and security, site EU Charter:
http://www.eucharter.org/home.php?page_id=13
Bibliography
Agre, P. E., & Rotenberg, M. (2001). Technology and Privacy: The New Landscape. Cambridge, Massachusetts: MIT Press.
Alexander, I., & Beus-Dukic, L. (2009). Discovering requirements: how to specify products and services. Chichester: Wiley.
Article 29 Data Protection Working Party. (2012, 23 March). Opinion 01/2012 on the Data Protection Reform Proposals.
Article 29 Data Protection Working Party. (2013, 13 May). Advice paper on essential elements of a definition and a provision on profiling within the EU General Data Protection Regulation. 00530/12/EN WP 191.
Article 29 Data Protection Working Party. (2013, 22 April). Opinion 04/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (‘DPIA Template’) prepared by Expert Group 2 of the Commission’s Smart Grid Task Force.
Bellanova, R., & De Hert, P. (2013). Practices and modes of transatlantic data-processing. From sorting countries to sorting individuals? In S. Body-Gendrot, M. Hough, K. Kerezsi, R. Lévy & S. Snacken (Eds.), The Routledge Handbook of European Criminology (pp. 514-535). Milton Park: Routledge.
Berg, B., & Leenes, R. (2013). Abort, Retry, Fail: Scoping Techno-Regulation and Other Techno-Effects. In M. Hildebrandt & J. Gaakeer (Eds.), Human Law and Computer Law: Comparative Perspectives (Vol. 25, pp. 67-87): Springer Netherlands.
Blair, A. (2005). The European Union since 1945. Harlow: Pearson/Longman.
Bratza, N. (2013). The European Convention on Human Rights and the Charter of Fundamental Rights of the European Union: A Process of Mutual Enrichment. In The Court of Justice and the Construction of Europe: Analyses and Perspectives on Sixty Years of Case-law - La Cour de Justice et la Construction de l'Europe: Analyses et Perspectives de Soixante Ans de Jurisprudence (pp. 167-181). T. M. C. Asser Press.
Brems, E. (2008). Conflicts between fundamental rights. Antwerpen: Intersentia.
Bribosia, E., & Rorive, I. (2010). In search of a balance between the right to equality and other fundamental rights. Brussels: Publications Office of the European Union.
Brownsword, R. (2005). Code, control, and choice: why East is East and West is West. Legal Studies, 25(1), 1-21.
Brownsword, R. (2008). So What Does the World Need Now? Reflections on Regulating Technologies. In R. Brownsword & K. Yeung (Eds.), Regulating technologies: legal futures, regulatory frames and technological fixes (pp. 23-48). Oxford: Hart.
Cavoukian, A. (2009). Privacy by Design.Take the Challenge. Retrieved from http://www.ipc.on.ca/images/Resources/PrivacybyDesignBook.pdf
Cavoukian, A. (2012). Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices Retrieved from http://www.ipc.on.ca/images/Resources/operationalizing-pbd-guide.pdf
CEJI Policy Response. (2010). Proposal for an Equality Directive. Brussels: CEJI - A Jewish Contribution to an Inclusive Europe.
Committee on Legal Affairs and Human Rights. (2007, 6 July). The principle of the Rule of Law. Report nr. 11343. Strasbourg: Council of Europe.
Coudert, F., De Vries, K., & Kowalewski, J. (2008). Legal Implications of Forensic Profiling: of Good Old Dataprotection Legislation and Novel Legal Safeguards for Due Processing. In Z. Geradts & P. Sommer (Eds.), Forensic Profiling. Deliverable 6.7c of the FIDIS (The Future of Identity in the Information Society) Consortium (pp. 38-67). http://www.fidis.net: EU Sixth Framework Programme.
Council of Europe. (2010). Recommendation CM/Rec(2010)13 and the Explanatory Memorandum. The Protection of Individuals with regards to Automatic Processing of Personal Data in the Context of Profiling. Strasbourg.
Custers, B., Zarsky, T., Schermer, B., & Calders, T. (Eds.). (2012). Discrimination and Privacy in the Information Society. Effects of Data Mining and Profiling Large Databases. Dordrecht: Springer.
Custers, B. (2004). The Power of Knowledge. Ethical, Legal, and Technological Aspects of Data Mining and Group Profiling in Epidemiology. Nijmegen: Wolf Legal Publishers.
Dammann, I. (2011). Der Kernbereich der privaten Lebensgestaltung. Zum Menschenwürde- und Wesensgehaltsschutz im Bereich der Freiheitsgrundrechte. Berlin: Duncker & Humblot.
de Goede, M. (2012). Speculative Security. The Politics of Pursuing Terrorist Monies. Minneapolis: University of Minnesota Press.
De Hert, P. (2005). Balancing security and liberty within the European human rights framework. A critical reading of the Court’s case law in the light of surveillance and criminal law enforcement strategies after 9/11. Utrecht Law Review, 1(1), 68-96.
De Hert, P. (2012). A Human Rights Perspective on Privacy and Data Protection Impact Assessments. In D. Wright & P. Hert (Eds.), Privacy Impact Assessment (Vol. 6, pp. 33-76): Springer Netherlands.
De Hert, P., De Vries, K., & Gutwirth, S. (2009). Duitse rechtspraak over remote searches, datamining en afluisteren op afstand. Het arrest Bundesverfassungsgericht 27 februari 2008 (Online Durchsuchung) in breder perspectief [German case law on remote searches, data mining and remote interception. The decision of the Federal Constitutional Court of 27 February 2008 (Online Durchsuchung) put into a broader perspective]. Computerrecht(5), 200-211.
De Hert, P., & Gutwirth, S. (2008). Regulating profiling in a democratic constitutional state. In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European citizen. Cross disciplinary perspectives. (pp. 271-291). Dordrecht: Springer.
de Hert, P., & Papakonstantinou, V. (2013). Three Scenarios for International Governance of Data Privacy: Towards an International Data Privacy Organization, Preferably a UN Agency? ISJLP, 9, 271-327.
De Schutter, O., & Tulkens, F. (2007). Rights in Conflict: the European Court of Human Rights as a Pragmatic Institution Reflexive Governance in the Public interest (RefGov). EU FP6 project.
De Schutter, O., & Tulkens, F. (2008). Rights in Conflict: the European Court of Human Rights as a Pragmatic Institution. In E. Brems (Ed.), Conflicts between fundamental rights (pp. 169-216). Antwerpen: Intersentia.
De Vries, K. (2013). Privacy, due process and the computational turn. A parable and a first analysis. In M. Hildebrandt & K. De Vries (Eds.), Privacy, Due Process and the Computational Turn.The Philosophy of Law Meets the Philosophy of Technology (pp. 9-38). London: Routledge.
De Vries, K., Bellanova, R., De Hert, P., & Gutwirth, S. (2011). The German Constitutional Court Judgment on Data Retention: Proportionality Overrides Unlimited Surveillance (Doesn’t It?). In S. Gutwirth, Y. Poullet, P. De Hert & R. Leenes (Eds.), Data Protection: An Element of Choice (pp. 3-24). Springer: Dordrecht.
De Vries, K., & Van Dijk, N. (2013). A Bump in the Road. Ruling out Law from Technology. In M. Hildebrandt & J. Gaakeer (Eds.), Human Law and Computer Law: Comparative Perspectives (pp. 89-121). Dordrecht: Springer.
Dwork, C., Hardt, M., Pitassi, T., Reingold, O., & Zemel, R. (2012). Fairness Through Awareness. In S. Goldwasser (Ed.), Proceedings of the 3rd Innovations in Theoretical Computer Science (ITCS). Cambridge, MA, USA, January 8-10, 2012 (pp. 214-226). New York: ACM Press.
Dworkin, R. (1978). Hard Cases. In Taking rights seriously (pp. 81-130). Cambridge, Mass.: Harvard University Press.
Dworkin, R. (1998). Law's empire. Oxford: Hart.
Dworkin, R. (2006). Justice in robes. Cambridge, Mass.: Belknap Press of Harvard University Press.
Edel, F. (2010). Prohibition of Discrimination Under the European Convention on Human Rights (Human Rights Files, No. 22). Council of Europe.
EDRI. (2005, 14 July). EU Passenger Data Possibly Used Commercially. Retrieved from http://www.edri.org/edrigram/number3.14/PNR
Endicott, T. (2012). Proportionality and Incommensurability. Oxford Legal Studies Research Paper, No. 40/2012. Retrieved from http://ssrn.com/abstract=2086622
EU Network of independent experts on fundamental rights. (2006). Commentary of the Charter of Fundamental Rights of the European Union. Retrieved from http://ec.europa.eu/justice/fundamental-rights/files/networkcommentaryfinal_en.pdf
European Commission (EC). (2011). Commission staff working paper. Operational Guidance on taking account of Fundamental Rights in Commission Impact Assessments. Brussels, 6.5.2011, SEC(2011) 567 final Brussels: European Union.
European Data Protection Supervisor. (2012, 7 March). Opinion of the European Data Protection Supervisor on the data protection reform package.
European Digital Rights (EDRi). (2012). EDRi's Position on the Directive on Data Protection in Law-Enforcement (Proposal for a Directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data, COM(2012)10 final). Retrieved from http://dpreformlawenforcement.files.wordpress.com/2012/12/edri-position-papers-directive1.pdf
European Digital Rights (EDRI). (2012a). Everything you need to know about the Data Protection Directive for Law Enforcement, from http://policingprivacy.eu/
European Digital Rights (EDRI). (2012b). Everything you need to know about the Data Protection Regulation, from http://protectmydata.eu/
Ferraris, V., Bosco, F., Cafiero, G., D’Angelo, E., & Suloyeva, Y. (2013). Defining Profiling. Working paper on definition and domain of application of profiling. Profiling. Protecting Citizens' Rights Fighting Illicit Profiling: research project funded by the European Commission, DG Justice, under the Fundamental Rights and Citizens programme.
Foucault, M. (2000). "Omnes et Singulatim": Towards a Critique of Political Reason. In J. D. Faubion (Ed.), Power. Essential works of Foucault, 1954-1984 (pp. 298-325). London: Allen Lane.
Fribergh, E., & Kjaerum, M. (2011). Handbook on European non-discrimination law. Luxembourg: Publications Office of the European Union.
Friedman, B. (2004). Value sensitive design. In W. S. Bainbridge (Ed.), Berkshire encyclopedia of human-computer interaction (pp. 769-774). Great Barrington: Berkshire Publishing Group.
Gadamer, H.-G. (2004). Truth and method (J. Weinsheimer & D. G. Marshall, Trans. 2nd rev. ed.). London: Continuum.
Gellert, R. (2013). Cross-pollination between privacy/data protection and sustainable development: the case of smart grids. Paper presented at the 30th Monday Researchers' Gathering (Faculty of Law, Metajuridica, 13 June 2013), Vrije Universiteit Brussel.
Gellert, R., de Vries, K., De Hert, P., & Gutwirth, S. (2012). A Comparative Analysis of Anti-Discrimination and Data Protection Legislations. In B. Custers, T. Calders, B. Schermer & T. Zarsky (Eds.), Discrimination and Privacy in the Information Society (pp. 61-89). Berlin: Springer.
Gerards, J. (2011). EVRM - Algemene beginselen [ECHR - General Principles]. Den Haag: Sdu Uitgevers.
Gerards, J. (2013). The Discrimination Grounds of Article 14 of the European Convention on Human Rights. Human Rights Law Review, 13(1), 99-124. doi: 10.1093/hrlr/ngs044
González Fuster, G. (2013). The Emergence of Personal Data Protection as a Fundamental Right of the European Union. Ph.D., Vrije Universiteit Brussel, Brussels.
González Fuster, G., Gutwirth, S., & Ellyne, E. (2010). Profiling in the European Union: A high-risk practice. INEX Policy Brief No. 10, June 2010.
Greenleaf, G. (2012). The influence of European data privacy standards outside Europe: implications for globalization of Convention 108. International Data Privacy Law, 2(2), 68-92. doi: 10.1093/idpl/ips006
Greimas, A. J., & Landowski, É. (1990). The Semiotic Analysis of Legal Discourse: Commercial Laws That Govern Companies and Groups of Companies (P. Perron & F. H. Collins, Trans.). In A. J. Greimas (Ed.), The social sciences : a semiotic view (pp. 102-138). Minneapolis: University of Minnesota Press.
Gürses, S., Gonzalez Troncoso, C., & Diaz, C. (2011). Engineering Privacy by Design. Paper presented at the Computer Privacy and Data Protection (CPDP), Brussels. https://lirias.kuleuven.be/handle/123456789/356730
Gutwirth, S., De Hert, P., & De Sutter, L. (2008). The trouble with technology regulation from a legal perspective. Why Lessig’s ‘optimal mix’ will not work. In R. Brownsword & K. Yeung (Eds.), Regulating Technologies. Legal Futures, Regulatory Frames and Technological Fixes (pp. 193-218). Oxford: Hart Publishers.
Gutwirth, S., Gellert, R., Bellanova, R., Friedewald, M., Schutz, P., Wright, D., et al. (2011). Deliverable D1: Legal, social, economic and ethical conceptualisations of privacy and data protection. http://www.prescient-project.eu/prescient/inhalte/download/PRESCIENT-D1---final.pdf: PRESCIENT (Privacy and emerging fields of science and technology: Towards a common framework for privacy and ethical assessment), a project within the Seventh Framework Programme for research and technological development.
Hajian, S. (2013). Simultaneous discrimination prevention and privacy protection in data publishing and mining. Universitat Rovira i Virgili, Tarragona.
Hart, H. L. A. (1997). The concept of law (2nd ed.). Oxford: Clarendon Press.
Heidegger, M. (1977). The question concerning technology (W. Lovitt, Trans.). In The question concerning technology, and other essays (pp. 1-35). New York: Harper & Row.
Hildebrandt, M. (2008a). Defining profiling: a new type of knowledge? In M. Hildebrandt & S. Gutwirth (Eds.), Profiling and the Identity of the European Citizen (pp. 39-50). Springer.
Hildebrandt, M. (2008b). Legal and Technological Normativity: more (and less) than twin sisters. Techné: Research in Philosophy and Technology, 12(3), 169–183.
Hildebrandt, M. (2008c). Profiling and the Identity of the European Citizen. In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European Citizen: Cross-disciplinary Perspectives (pp. 320-360). Dordrecht: Springer.
Hildebrandt, M. (2008). A Vision of Ambient Law. In R. Brownsword & K. Yeung (Eds.), Regulating technologies: legal futures, regulatory frames and technological fixes. (pp. 175-191). Oxford: Hart.
Hildebrandt, M. (2011a). Legal protection by design: Objections and refutations. Legisprudence, 5(2), 223-248.
Hildebrandt, M. (2011b). Technologische en juridische normativiteit: het tekort van het reguleringsparadigma. Een respons op Leenes’ ‘Technoregulering’. Recht der Werkelijkheid, 32(1), 53-59.
Hildebrandt, M. (2012). The Dawn of a Critical Transparency Right for the Profiling Era. In J. Bus, M. Crompton, M. Hildebrandt & G. Metakides (Eds.), Digital Enlightenment Yearbook (pp. 41-56). Amsterdam: IOS Press.
Hildebrandt, M. (2013a). Balance or Trade-off? Online Security Technologies and Fundamental Rights. Philosophy & Technology, 1-23.
Hildebrandt, M. (2013b). Legal Protection by Design in the Smart Grid. Privacy, Data Protection & Profile Transparency: Smart Energy Collective.
Hildebrandt, M., & De Vries, K. (Eds.). (2013). Privacy, Due Process and the Computational Turn. London: Routledge.
Hildebrandt, M., & Tielemans, L. (2013). Data Protection by Design and Technology Neutral Law. Computer Law & Security Review, 29(5), 509-521.
JUSTICE - the independent human rights and law reform organisation. (2004). EU Charter website. Retrieved from http://www.eucharter.org/
Kamiran, F. (2011). Discrimination-aware Classification. Ph.D., Technical University of Eindhoven.
Kelsen, H. (2005). Pure theory of law. Clark, N.J.: Lawbook Exchange.
Kerr, I. (2013). Prediction, pre-emption, presumption: the path of law after the computational turn. In M. Hildebrandt & K. De Vries (Eds.), Privacy, Due Process and the Computational Turn. The Philosophy of Law Meets the Philosophy of Technology (pp. 91-120). London: Routledge.
Kilkelly, U. (2001). The Right to Respect for Private and Family Life. A guide to the implementation of Article 8 of the European Convention on Human Rights. Human Rights Handbook, nr. 1 Retrieved from <http://www.coe.int/T/E/Human_rights/handbookse.asp>
Klitou, D. (2011). Privacy by Design and Privacy-Invading Technologies: Safeguarding Privacy, Liberty and Security in the 21st Century. Legisprudence, 5(3), 297–329.
Klitou, D. (2012). Privacy-invading technologies : safeguarding privacy, liberty & security in the 21st century. Ph.D., Leiden University, Leiden.
Koops, B.-J. (2008). Criteria for Normative Technology - The Acceptability of "Code as Law" in light of Democratic and Constitutional Values. In R. Brownsword & K. Yeung (Eds.), Regulating technologies: legal futures, regulatory frames and technological fixes (pp. 157-174). Oxford: Hart.
Korff, D. (2006). The right to life. A guide to the implementation of Article 2 of the European Convention on Human Rights. Retrieved from http://www.echr.coe.int/library/DIGDOC/DG2/HRHAND/DG2-EN-HRHAND-06(2003).pdf
Kranzberg, M. (1986). Technology and History: “Kranzberg’s Laws". Technology and Culture, 27, 544–560.
Krebs, D. (2013). "Privacy by Design": Nice-to-have or a Necessary Principle of Data Protection Law? JIPITEC, 4(1), 2-20.
Larenz, K. (1992). Methodenlehre der Rechtswissenschaft (3rd, completely revised ed.). Berlin: Springer-Verlag.
Latour, B. (1992). Where are the missing masses? The sociology of a few mundane artifacts. In W. E. Bijker & J. Law (Eds.), Shaping technology/building society: studies in sociotechnical change (pp. 225-258). Cambridge, Mass.: MIT Press.
Latour, B. (2010). The Making of Law. An Ethnography of the Conseil d’Etat. Cambridge: Polity Press.
Latour, B. (2013). An Enquiry into Modes of Existence. an anthropology of the Moderns (C. Porter, Trans.). Cambridge, Mass.: Harvard University Press.
Leenes, R. (2010). Harde lessen. Apologie van technologie als reguleringsinstrument. [Hard lessons. Apology for technology as a regulatory tool]. Tilburg: Universiteit van Tilburg.
Leenes, R. (2011). Technoregulering: regulering of ‘slechts’ disciplinering [Technoregulation: regulation or 'mere' disciplination]. Recht der Werkelijkheid, 32(1), 47-52.
Lessig, L. (2006). Code: version 2.0 (2nd ed.). New York: Basic Books.
Macovei, M. (2002). The right to liberty and security of the person. A guide to the implementation of Article 5 of the European Convention on Human Rights. Retrieved from http://www.echr.coe.int/library/DIGDOC/DG2/HRHAND/DG2-EN-HRHAND-05(2004).pdf
Marauhn, T., & Ruppel, N. (2008). Balancing Conflicting Human Rights: Konrad Hesse’s Notion of ‘Praktische Konkordanz’ and the German Federal Constitutional Court. In E. Brems (Ed.), Conflicts between fundamental rights (pp. 273-296). Antwerpen: Intersentia.
Flanagan, M., Howe, D. C., & Nissenbaum, H. (2008). Embodying Values in Technology: Theory and Practice. In J. van den Hoven & J. Weckert (Eds.), Information Technology and Moral Philosophy (pp. 322–353). Cambridge: Cambridge University Press.
Meenan, H. (2007). Equality law in an enlarged European Union : understanding the Article 13 Directives. Cambridge: Cambridge University Press.
Mole, N., & Harby, C. (2006). The right to a fair trial. A guide to the implementation of Article 6 of the European Convention on Human Rights. Human rights handbooks, No. 3 Retrieved from http://www.echr.coe.int/library/DIGDOC/DG2/HRHAND/DG2-EN-HRHAND-03(2006).pdf
Montesquieu, C. (1748/1989). The spirit of the laws (A. M. Cohler, B. C. Miller & H. S. Stone, Trans.). Cambridge: Cambridge University Press.
Morozov, E. (2011). The net delusion : the dark side of internet freedom (1st ed.). New York, N.Y.: Public Affairs.
Morozov, E. (2013). To save everything, click here : technology, solutionism, and the urge to fix problems that don't exist. London: Allen Lane.
Murdoch, J. (2012). Protecting the right to freedom of thought, conscience and religion under the European Convention on Human Rights Retrieved from http://www.coe.int/t/dghl/cooperation/capacitybuilding/Source/documentation/hb09_rightfreedom_en.pdf
Norman, D. A. (1990). The design of everyday things. New York: Doubleday/Currency.
Norman, D. A. (1998). The invisible computer: why good products can fail, the personal computer is so complex, and information appliances are the solution. Cambridge, Mass.: MIT Press.
Polakiewicz, J. (2013). EU law and the ECHR: Will EU accession to the European Convention on Human Rights square the circle? Paper presented at the Fundamental Rights In Europe: A Matter For Two Courts, Oxford Brookes University. http://www.coe.int/t/dghl/standardsetting/hrpolicy/accession/Accession_documents/Oxford_18_January_2013_versionWeb.pdf
Reidenberg, J. R. (1998). Lex Informatica: The Formulation of Information Policy Rules Through Technology. Texas Law Review, 76(3), 553-584.
Reidy, A. (2002). The prohibition of torture. A guide to the implementation of Article 3 of the European Convention on Human Rights. Retrieved from http://www.echr.coe.int/library/DIGDOC/DG2/HRHAND/DG2-EN-HRHAND-06(2003).pdf
Ridley, M. (1993). The red queen: sex and the evolution of human nature. London: Viking.
Roagna, I. (2012). Protecting the right to respect for private and family life under the European Convention on Human Rights. Strasbourg: Council of Europe.
Schiavone, A. (2012). The Invention of Law in the West (J. Carden & A. Shugaar, Trans.). Cambridge, MA: Belknap Press of Harvard University Press.
Sebok, A. J. (1998). Legal positivism in American jurisprudence. Cambridge: Cambridge University Press.
Sen, A. (2004). Elements of a Theory of Human Rights. Philosophy and Public Affairs, 32(4), 315-356.
Solove, D. J., Rotenberg, M., & Schwartz, P. M. (Eds.). (2006). Information Privacy Law. New York: Aspen.
Sommerville, I., & Sawyer, P. (1997). Requirements engineering: a good practice guide. Chichester: Wiley.
Sottiaux, S. (2008). Terrorism and the limitation of rights: the ECHR and the US Constitution. Oxford: Hart.
Stanley, J. (2013, 11 January). TSA Once Again Considering Using Commercial Data To Profile Passengers. Retrieved from http://www.aclu.org/
Steinbock, D. (2005). Data Matching, Data Mining, and Due Process. Georgia Law Review, 40(1), 1-84.
Sternstein, A. (2013, 16 January). Big Data meets big brother in the passenger screening line. Nextgov. Retrieved from http://www.nextgov.com/big-data/2013/01/big-data-meets-big-brother-passenger-screening-line/60698/
Stone Sweet, A., & Mathews, J. (2008). Proportionality Balancing and Global Constitutionalism. Columbia Journal of Transnational Law, 47, 73-165.
Strahilevitz, L. J. (2008). Privacy versus Antidiscrimination. University of Chicago Law Review, 75(1), 363-382.
Thomas, Y. (2011). Les opérations du droit. Paris: Seuil/Gallimard.
Van Drooghenbroeck, S. (2001). La proportionnalité dans le droit de la Convention européenne des droits de l'Homme. Prendre l'idée simple au sérieux. Brussels: Bruylant.
Verbeek, P.-P. (2011). Moralizing technology: understanding and designing the morality of things. Chicago; London: University of Chicago Press.
Vismann, C. (2008). Files (G. Winthrop-Young, Trans.). Stanford: Stanford University Press.
Waldron, J. (2003). Security and liberty: The image of balance. Journal of Political Philosophy, 11(2), 191-210.
Weber, M. (1949). The methodology of the social sciences (E. Shils & H. A. Finch, Trans.). New York: Free Press.
Westin, A. F. (1967). Privacy and freedom (1st ed.). New York: Atheneum.
Winner, L. (1986). Do artifacts have politics? In L. Winner (Ed.), The whale and the reactor: a search for limits in an age of high technology (pp. 19-39). Chicago: University of Chicago Press.
Wright, D., & De Hert, P. (2012). Privacy Impact Assessment. Dordrecht: Springer.
Zucca, L. (2007). Constitutional dilemmas: conflicts of fundamental legal rights in Europe and the USA. Oxford: Oxford University Press.