

A DPIA is a PIA: consequences in terms of implementation and scope

Estelle De Marco, 3 June 2021. First published on Linkedin on 31 March 2019.

 

According to the guidelines increasingly provided on the carrying out of a Data Protection Impact Assessment (DPIA), the latter notion expresses the same concept as the notion of Privacy Impact Assessment (PIA)[1] (I). This is indeed true, but the correct conclusions in terms of implementation and scope are not always drawn from this statement (II).

I - The notions of PIA and of DPIA express the same concept

Both a PIA and a DPIA consist in assessing the impact of an initiative on a series of fundamental rights and freedoms[0], the right primarily concerned being either privacy or personal data protection (I.1). As a result, the only difference between the two concepts lies in the distinction that may be made between the right to private life and the right to personal data protection, which has no practical effects (I.2).

I.1 - A PIA and a DPIA both consist in assessing the impact of a project on a series of fundamental rights and freedoms

Reasoning by analogy from the definition of a PIA, which consists in assessing the impact of a project on a series of fundamental rights and freedoms, primarily the right to private life (I.1.1), a DPIA consists in assessing the impact of a project on a series of fundamental rights and freedoms, primarily the right to personal data protection (I.1.2).

I.1.1 - A PIA consists in assessing the impact of a project on a series of fundamental rights and freedoms, primarily the right to private life

A PIA[2] has been defined by the consortium of the PIAF EU project as "a process for assessing the impacts on privacy of a project [...] or other initiative [...] and [...] for taking remedial actions as necessary in order to avoid or minimise the negative impacts"[3].

In this definition, as in most of the other, very similar, definitions that have been proposed[4], the notion of "privacy" is understood as covering all the fundamental rights and freedoms[5] that might be impacted by the project or initiative under assessment, either without particular restriction or restricting the targeted rights and freedoms to those that might be indirectly impacted through a privacy limitation.[6]

A PIA is therefore a tool that assesses the respect of general requirements for protecting a series of fundamental rights and freedoms, without being restricted to the bounds of special legislation[7], in relation to a project or initiative and its possible impacts. As a consequence, the assessment of these impacts may lead to determining certain measures to be implemented that are not provided for by law, and even measures aiming to mitigate the breach of a legal requirement that is difficult to apply in particular circumstances[8].

I.1.2 - A DPIA consists in assessing the impact of a project on a series of rights and freedoms, primarily the right to personal data protection

Reasoning by analogy, if a PIA consists in assessing the impacts of an initiative on the right to private life, understood as covering impacts on all fundamental rights, or at least those exercised behind the wall of the private sphere, a DPIA consists for its part in assessing the impacts of an initiative on the right to personal data protection, understood as covering impacts on all fundamental rights[9], or at least those restricted, by extension, because of the processing or non-processing[10] of personal information. This reasoning is perfectly consistent with the definition of an impact assessment, which is "the process of identifying the future consequences of a current or proposed action" according to the International Association for Impact Assessment (IAIA)[11].

Consequently, a PIA and a DPIA are indeed similar concepts, and they differ only in the distinction that may be drawn between privacy and personal data protection, which in turn may lead to differentiating the lists of fundamental rights that are indirectly protected under one or the other of these spheres of protection. However, such a difference does not have any practical effect.

I.2 - The potential difference between privacy and personal data protection has no practical effects

The debate relating to the extent of the overlap between private life and personal data protection is still ongoing (I.2.1) but has no practical effects since both these rights protect, directly or indirectly, the same rights and freedoms (I.2.2).

I.2.1 - The extent of the overlap between private life and personal data: a still ongoing debate...

All legal experts agree that the sphere of protected privacy and the sphere of protected personal data overlap, but disagreement persists concerning the exact extent of this overlap.

To summarise the debate, a first doctrinal approach considers the notion of personal data to be entirely included in the definition of private life, defined extensively[12]. In this approach, private life covers all the data surrounding an individual who might be identified as their subject, which are protected against any interference by third parties that is not legitimate, necessary and proportionate in the sense given to these terms by the European Convention on Human Rights (ECHR)[13] and the European Union Charter of Fundamental Rights (EUCFR)[14] (texts which the GDPR transposes in more practical terms[15]), along with the freedoms exercised on the basis of the secrecy or control of these data[16].

Conversely, other doctrinal approaches give protected privacy a more restrictive definition, which leads to considering the latter and protected personal data as two different spheres that overlap without being exactly the same, primarily because some personal data, such as elements of the public life of an individual, are considered to be non-private in nature and therefore not to benefit from the protection granted to private life[17]. These approaches, while perfectly understandable, appear nevertheless questionable in the light of the case law of the European Court of Human Rights (ECtHR)[18] and the opinion of both the Council of Europe and the European Union Agency for Fundamental Rights[19].

However, this theoretical debate does not appear to have practical consequences.

I.2.2 - ...which has no practical effects

Whether or not the privacy and personal data spheres exactly coincide, both of these spheres protect, directly or indirectly, the same list of fundamental rights.

This is obvious in the first hypothesis. In the second, it can be observed that each of these rights includes the other in the indirect protection it offers to other fundamental rights.

Indeed, the right to private and family life includes the right to personal data protection under Article 8 of the ECHR[20]. At the European Union level, the right to personal data protection was later laid down as a stand-alone right in Article 8 of the EUCFR[21]. As a result, and for example, the notion of PIA preceded the notion of DPIA[22], and the concepts of privacy by design and by default preceded the concepts of data protection by design and by default[23]. Taking into account that the right to personal data protection is originally an expression of the right to private life, it is consequently one of the first fundamental rights on which the impacts of a given initiative will be assessed within the framework of a PIA.

For its part, the right to personal data protection is sometimes considered to be a right that is, or that must definitely be, independent from the right to private life, essentially because, according to the advocates of this approach, it also protects information of a non-private nature, as already described, which in turn makes it possible to safeguard fundamental rights that are not always protected by the private sphere[24]. However, this statement alone suffices to conclude that the right to private life remains one of the first rights on which the impacts of a given initiative will be assessed within the framework of a DPIA. This was in addition confirmed by Directive 95/46/EC, which clearly announced protecting "the fundamental rights and freedoms of natural persons, and in particular their right to privacy with respect to the processing of personal data"[25], even though the GDPR - which protects the same rights as the Directive did - now disconnects the two rights[26].

Since the assessment of the impacts on privacy leads to analysing impacts on the right to personal data protection, and conversely, all the fundamental rights protected by one or the other of these rights will consequently be included in the values to be protected, both in a PIA and in a DPIA. As a result, both will analyse impacts on the same rights and freedoms, which renders them identical in terms of content, primarily as regards implementation and scope[27].

II - Consequences in terms of implementation and scope of a DPIA

Since a DPIA is similar to a PIA, the method employed should be the same. This has consequences for the implementation of the different steps of a DPIA (II.1) and for its scope (II.2).

II.1 - Implementation of the different DPIA steps

Like a PIA, a DPIA should be initiated in each situation where the right to privacy and/or to personal data protection is at risk, which is basically the case in most situations of personal data processing. However, not every situation requires the same level of assessment, which means that the first DPIA steps may be sufficient in most cases (II.1.1). That being said, the analysis highlights that once these steps are completed, the remaining step is often mandatory under Article 32 GDPR (II.1.2).

II.1.1 - Performance of the first DPIA steps in most situations

According to the state of the art, a PIA is an ethical process to be initiated prior to the start of any project[28]. Several PIA methods exist, but it is possible to derive from them a seven-step method[29], in which the first steps consist (together with the identification of the responsible teams and of the scope of the analysis) in verifying the compliance of the project with applicable laws, including data protection legislation and ECHR requirements, which in turn implies performing tests of legal basis, legitimate aim[30], necessity and proportionality. If these first analyses do not highlight particular risks where the applicable law is properly and correctly applied, the PIA can be ended there.

Identifying the probability of risks during this phase is particularly easy. Indeed, practice shows that performing, for one and the same data processing operation, an analysis of compliance with both the ECHR requirements and the GDPR requirements generally yields the same results. In other words, both analyses identify the same measures to be implemented in order to protect the rights and freedoms at stake. Where the analysis of compliance with ECHR requirements identifies additional protection measures to be implemented that are not provided for in the GDPR, this means that the data processing operations under analysis are more sensitive than those to which the GDPR's main principles are supposed to apply, and that they are likely to pose risks to rights and freedoms: that simple observation should systematically trigger the performance of the other PIA steps, as the sketch below illustrates.
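
To make this early-exit rule concrete, here is a minimal sketch in Python (all function names, variable names and example measures are hypothetical illustrations, not part of any method cited in this article): it compares the sets of protection measures identified by the two analyses and recommends continuing the PIA only when the ECHR-based tests reveal safeguards that the GDPR analysis alone did not.

```python
# Illustrative sketch only: the measures and helper functions are hypothetical;
# the decision rule mirrors the reasoning described in the text above.

def measures_from_gdpr_analysis(processing: dict) -> set[str]:
    """Protection measures identified by the GDPR compliance analysis."""
    # In a real assessment these would come from a documented legal review,
    # not from code; a fixed example set is returned here.
    return {"purpose limitation", "data minimisation", "access control"}

def measures_from_echr_tests(processing: dict) -> set[str]:
    """Measures identified by the tests of legal basis, legitimate aim,
    necessity and proportionality (ECHR requirements)."""
    return {"purpose limitation", "data minimisation", "access control",
            "independent oversight"}  # example of an additional safeguard

def must_continue_pia(processing: dict) -> bool:
    """Early-exit rule: continue the remaining PIA steps only if the
    ECHR-based tests reveal safeguards not already required by the GDPR,
    a sign the processing is more sensitive than the ordinary case."""
    extra = measures_from_echr_tests(processing) - measures_from_gdpr_analysis(processing)
    return bool(extra)

if __name__ == "__main__":
    processing = {"name": "example processing operation"}
    if must_continue_pia(processing):
        print("Additional ECHR safeguards found: perform the remaining PIA steps.")
    else:
        print("No particular risk highlighted: the PIA can end here.")
```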

The GDPR does not particularly contradict this approach, even if it is not extremely clear in this regard. Indeed, data controllers and processors have the duty to ensure that GDPR requirements are met and that the rights of the data subject are protected[31], which implies, upstream, performing a GDPR compliance analysis and, within the framework of a comprehensive understanding of the notion of "rights protection", tests of legal basis, legitimate aim, necessity and proportionality. In this regard, it may also be noted that several GDPR provisions imply the performance of necessity and proportionality tests[32], which are themselves likely to reveal a discrepancy between their results and those obtained through the pure and simple implementation of the GDPR. Moreover, the hypotheses in which a DPIA is or may be mandatory are hypotheses in which, most of the time, such a discrepancy will be noticed.

It is therefore regrettable that these tests (which can be performed in just a few hours), at least those of necessity and proportionality, are not explicitly mandatory upstream of any processing (unless it is absolutely unlikely to entail risks), accompanied by a clear method for performing them (which any specialist may easily derive from ECtHR case law and the opinions of the Article 29 Working Party[33]). This would have had the merit of favouring legal certainty, whereas the obligation to perform all the PIA steps when processing operations are "likely to result in a high risk"[34] remains very confusing (and proceeds from a logical error, to paraphrase a comment by Professor Veselin Tselkov[35]), despite the existence of lists provided by supervisory authorities. Indeed, it is inevitably difficult to identify the likelihood of high risks without performing a risk analysis, in the absence of an alternative method for identifying this threshold, particularly within a framework where the GDPR suggests that a DPIA might be mandatory in any situation implying the "use of new technologies"[36].

In any event, an ethical[37] (and legally secure) approach suggests performing such a test of compliance with ECHR requirements, or at least tests of necessity and proportionality, after completion of the GDPR compliance analysis, which is the basis of any compliance strategy. Once these tests are done, it may be noted that the sole remaining DPIA step according to the GDPR is an analysis of risks to rights and freedoms, which is itself mandatory in most situations under Article 32 of the GDPR.

II.1.2 - The remaining step is often mandatory under Article 32 GDPR

According to the GDPR, a DPIA consists in the performance of tests of necessity and proportionality, of an analysis of compliance with the GDPR, and of an analysis of risks to rights and freedoms. The GDPR adds other steps, which are however inherent to any compliance or risk analysis (namely, upstream, the description of the operations being analysed and, downstream, the measures envisaged to address the weaknesses found during the assessments).

That being said, Article 32 of the GDPR requires controllers and processors to "implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk" (of "varying likelihood and severity for the rights and freedoms of natural persons"), which will in most situations imply performing a risk analysis targeting rights and freedoms in addition to the information system, even a simplified or partial one (covering, for example, risks to fundamental rights only), unless such an analysis has already been carried out and appropriately covers the project under assessment. Indeed, once again, it seems impossible to identify and mitigate the risks posed by a situation without having performed a risk analysis in relation to that situation, except where the latter is extremely clear and controlled.
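
As a rough illustration of what such a simplified, rights-oriented risk analysis might look like, the following Python sketch rates each feared event for its likelihood and for its severity for rights and freedoms, then flags the events that call for mitigation measures. The scales, the threshold and the feared events are hypothetical assumptions, and the likelihood-times-severity scoring is a common risk-management convention, not a requirement of Article 32.

```python
from dataclasses import dataclass

# Illustrative sketch only: scales, threshold and feared events are
# hypothetical examples, not values prescribed by the GDPR.

@dataclass
class FearedEvent:
    description: str
    likelihood: int   # 1 (negligible) .. 4 (maximal)
    severity: int     # 1 (negligible) .. 4 (maximal), for rights and freedoms

    @property
    def level(self) -> int:
        # Common convention: risk level as likelihood x severity.
        return self.likelihood * self.severity

MITIGATION_THRESHOLD = 6  # example threshold above which measures are needed

events = [
    FearedEvent("unauthorised access to the data set", 2, 3),
    FearedEvent("re-identification of data subjects", 1, 4),
    FearedEvent("discriminatory use of profiling results", 3, 4),
]

# Review events from highest to lowest risk level.
for e in sorted(events, key=lambda e: e.level, reverse=True):
    action = ("define mitigation measures"
              if e.level >= MITIGATION_THRESHOLD else "accept / monitor")
    print(f"{e.description}: level {e.level} -> {action}")
```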

As a consequence, simple compliance with GDPR requirements leads to implementing one to two of the three DPIA steps, namely a GDPR compliance analysis and a risk analysis, the only missing analysis being that of necessity and proportionality, which, from our perspective, should be a prerequisite for any evaluation of the need to perform both a risk analysis and a DPIA, in addition to ensuring compliance with Articles 24, 25 and 28 of the GDPR.

At this stage of the reflection, we can note that in several situations, a DPIA may have been completed or almost completed before Article 35 of the GDPR is even implemented (the remaining actions to be undertaken in order to obtain a DPIA being minor). The most crucial question therefore appears not to relate to the situations in which the assessment must be implemented, but to the scope of this assessment, particularly in relation to the risk analysis.

II.2 - Scope of a DPIA

The statement that the terms PIA and DPIA are interchangeable implies giving the analysis the same scope, both in relation to the action that can cause risks to fundamental rights (II.2.1) and in relation to the values to be protected against risks (II.2.2).

II.2.1 - Scope of analysis concerning the action which can cause risks to fundamental rights

Most of the definitions that have been given of a DPIA reduce the analysis to the impacts of the data processing operations at stake, without taking into account the whole initiative that includes this processing, and without considering the need to assess initiatives that do not consist in personal data processing but that could have an impact on the right to personal data protection.

Indeed, a data protection impact assessment (DPIA) has been defined by the European Commission as "a systematic process for evaluating the potential impact of risks where processing operations are likely to present specific risks to the rights and freedoms of data subjects by virtue of their nature, their scope or their purposes"[38]. It has been defined by the Article 29 working party as "a process designed to describe the processing, assess the necessity and proportionality of a processing and to help manage the risks to the rights and freedoms of natural persons resulting from the processing of personal data (by assessing them and determining the measures to address them)"[39]. Once again, in these definitions the notion of "risks" is understood as risks for privacy and personal data protection, covering other fundamental rights[40].

This interpretation does not appear appropriate, since it harms the protection of personal data itself, by ignoring initiatives that could lead to an interference with the right to the protection of personal data without themselves being personal data processing operations[41]. Even though it would be impractical to ask every individual to assess the impacts of their day-to-day initiatives on the protection of third parties' fundamental rights (keeping in mind that the essence of privacy impact assessments is precisely to enable such an assessment on a voluntary basis, any impact being liable to sanction under the general civil liability regime of most countries[42]), it would seem wise to include in a DPIA at least the assessment of the risks posed by the context of the assessed personal data processing operation, both in order to remain faithful to the notion of PIA and to adequately protect individuals' fundamental rights.

The GDPR does not necessarily contradict such an approach: even though its Article 35 considers a DPIA to be the assessment of the "impact of the envisaged processing operations", this must be done taking into account "the nature, scope, context and purposes of the processing" - which means that the GDPR commands assessing the impacts of the processing operations with regard to their whole context, which might for example lead to processing data that were not intended to be processed, due to a misuse of the project. This leads to extending the scope of the assessment to the impacts on fundamental rights of any project, system or initiative that includes a personal data processing operation (and even a non-personal data processing operation where there is a risk that personal data are included in it), as soon as the assessed elements are likely to influence the nature, the content or the scope of this data processing.

As stated previously, and as stated explicitly in the GDPR, this assessment is an assessment of the impacts on fundamental rights of the project, system or initiative at stake. This second statement implies embedding, within the scope of the analysis, fundamental rights as values to be protected against risks.

II.2.2 - Scope of analysis concerning the values to be protected against risks

While most DPIA methods agree on the use of international standards (adapted to the particularities of fundamental rights protection) to perform a risk analysis[43], some methods limit the risk analysis, in practice, to the risks posed to the personal data being processed, considering that impacts on fundamental rights will only result from an impact first suffered by one of the personal data items being processed[44].

However, this approach drastically reduces the scope of the DPIA and prevents the identification of risks to freedoms that result from a correct use (as planned, in compliance with the GDPR) of the data under processing. This approach does not seem compliant with GDPR requirements.

Basically, a risk analysis implies, before identifying feared events and threat scenarios, identifying the so-called "primary assets", which are the non-material resources or processes that need to be protected. In other words, these are the non-material resources and processes whose availability, confidentiality, integrity and other potential security criteria (to be determined in a subsequent sub-step) must be ensured[45].

In a DPIA, these primary assets must indeed include the personal data or (depending on the exact nature of the project to be assessed), more broadly, the data that will be processed by the system or that will result from the system's processing operations. But since the analysis must more generally target risks to the fundamental rights of data subjects and other persons[46], a DPIA that complies with both GDPR requirements and the state of the art relating to PIAs must also include, in the list of primary assets, the fundamental rights that are at risk[47]. Failing that, it would be impossible, for example, to identify risks such as "discrimination, [...] financial loss [or] damage to the reputation"[48], which might occur even though the integrity, confidentiality and availability of the data under processing are ensured.

Finally, the list of primary assets should also include GDPR compliance requirements[49]. Indeed, Article 35 of the GDPR states that a DPIA must contain at least "the measures envisaged to address the risks, including safeguards [...] to demonstrate compliance with this Regulation [...]"[50]. Addressing the risks to freedoms with measures that make it possible to demonstrate compliance with the GDPR implies that this compliance is ensured, and therefore that the risks of non-compliance are under control, which cannot fully be the case without a proper assessment of the risks targeting precisely such compliance, not only in terms of unlawful data destruction, alteration and access, but also in relation to the data controller's other obligations[51].
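
Putting the three preceding paragraphs together, a minimal sketch of such an extended primary asset register follows. The asset names, categories and protection criteria are illustrative assumptions drawn from the discussion above, not an official taxonomy: the point is simply that the data under processing, the fundamental rights at risk and the GDPR compliance requirements sit side by side in the register, each with the criteria whose preservation must be ensured.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the categories and criteria listed here are
# assumptions drawn from the discussion above, not an official taxonomy.

@dataclass
class PrimaryAsset:
    name: str
    category: str                 # "data", "fundamental right" or "compliance"
    criteria: list[str] = field(default_factory=list)

asset_register = [
    # Data processed by, or resulting from, the system under assessment.
    PrimaryAsset("personal data set", "data",
                 ["availability", "integrity", "confidentiality"]),
    # Fundamental rights at risk (cf. GDPR Recital 75).
    PrimaryAsset("non-discrimination", "fundamental right",
                 ["absence of discriminatory use of the data"]),
    PrimaryAsset("reputation", "fundamental right",
                 ["absence of reputational damage from disclosure or misuse"]),
    # GDPR compliance requirements as assets in their own right.
    PrimaryAsset("lawfulness and transparency of processing", "compliance",
                 ["valid legal basis", "information of data subjects"]),
]

for asset in asset_register:
    print(f"[{asset.category}] {asset.name}: protect {', '.join(asset.criteria)}")
```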

Conclusion

A DPIA is a PIA, and it must be performed under the GDPR where processing operations are likely to result in high risks to persons' rights and freedoms. Ideally, however, such an assessment should be initiated, through a simple analysis of necessity and proportionality, within the framework of any data processing, unless the impacts of the latter are clearly controlled. This analysis highlights whether a risk may arise from the data processing and its context, and therefore indicates whether both a complete DPIA and a risk analysis under Article 32 of the GDPR (at least in a simplified form) are required or at least highly recommended. Where a risk analysis is performed, this step will form a full DPIA, together with the necessity and proportionality tests and the GDPR compliance analysis, which is supposed to have been performed as a first step aiming to identify data protection weaknesses.

Such an approach brings legal certainty[52] and makes it possible to set aside the current debates around the precise situations in which a DPIA is or is not mandatory. However, legal certainty also implies being mindful of the scope of the analyses to be performed, in order to make sure that the latter include, on the one hand, all the contextual elements of the processing that may have an impact on the rights and freedoms of natural persons and, on the other hand, these same rights and freedoms and the GDPR compliance requirements as primary assets, in other words as values to be protected from risks and their impacts.

 

The current publication is created within the project "INtroduction of the data protection reFORM to the judicial system" (INFORM). The project is funded by the European Union's Justice Programme (2014-2020) under Grant Agreement n° 763866. The content of this publication represents the views of the author only and is his/her sole responsibility. The European Commission does not accept any responsibility for use that may be made of the information it contains.

 

Bibliography

[0] On the notion of fundamental rights, human rights and civil liberties, see Estelle De Marco, "6.4 - Human Rights, Civil Liberties and Fundamental Freedoms" in Cormac Callanan, Marco Gercke, Estelle De Marco, Hein Dries-Ziekenheiner, Internet blocking - balancing cybercrime responses in democratic societies, October 2009, Aconite Internet Solutions, p. 137 (French version of 11 May 2010, juriscom.net).

[1] See for ex. CNIL, CNIL publishes an update of its PIA Guides, 26 February 2018, PIA Manual 1 - Methodology, p. 1, https://www.cnil.fr/en/cnil-publishes-update-its-pia-guides (last accessed on 29 March 2019): "the acronym "PIA" is used interchangeably to refer to Privacy Impact Assessment (PIA) and Data Protection Impact Assessment (DPIA)"; Article 29 Data Protection Working Party, Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is "likely to result in a high risk" for the purposes of Regulation 2016/679 (WP248), 4 April 2017, https://ec.europa.eu/newsroom/article29/items/611236 (last accessed on 9 June 2021), p. 4: "Note: the term "Privacy Impact Assessment" (PIA) is often used in other contexts to refer to the same concept".

[2] For further information see Estelle De Marco, Comparative study between Directive 95/46/EC & the GDPR including their relations to fundamental rights, March 2018, Deliverable D2.10, INFORM project (INtroduction of the data protection reFORM to the judicial system), JUST-JTRA-EJTR-AG-2016, GA n° 763866, Section 2.4.1.2. For a comprehensive presentation of the notion, origin and content of a PIA, see for ex. Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), 11 July 2017, version 2.4a.2, MANDOLA project (Monitoring ANd Detecting OnLine hAte speech) - GA n° JUST/2014/RRAC/AG/HATE/6652, http://mandola-project.eu/publications, Section 3 (URLs last accessed on 29 March 2019).

[3] Paul De Hert, Dariusz Kloza, David Wright et al., Recommendations for a privacy impact assessment framework for the European Union, November 2012, Deliverable D3, PIAF project (Privacy Impact Assessment Framework), Grant agreement JUST/2010/FRAC/AG/1137 - 30-CE-0377117/00-70, p. 5.

[4] See for ex. David Wright and Paul De Hert, "Introduction to Privacy Impact Assessment", in David Wright and Paul De Hert, Privacy Impact Assessment, Law, Governance and Technology Series volume 6, Springer, 2012, pp. 3 s., in particular pp. 5 s.; Roger Clarke, An Evaluation of Privacy Impact Assessment Guidance Documents, International Data Privacy Law 1, 2 (March 2011) 111-120, available at http://www.rogerclarke.com/DV/PIAG-Eval.html (last accessed on 29 March 2019): a PIA is "a systematic process that identifies and evaluates, from the perspectives of all stakeholders, the potential effects on privacy of a project, initiative or proposed system or scheme, and includes a search for ways to avoid or mitigate negative privacy impacts".

[5] The state of the art commands to preserve fundamental rights and interests first and to preserve derived rights or interests as a second step, where such protection does not harm the first one. See for ex. Art. 15,3 of the Council of Europe Convention on cybercrime and its explanatory report n° 148.

[6] See for example Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, op. cit. (footnote n°2), Section 3.1.1.; Paul De Hert, Dariusz Kloza, David Wright et al., Recommendations for a privacy impact assessment framework for the European Union, op. cit. (footnote n°3), p. 14; Paul De Hert, "A Human Rights Perspective on Privacy and Data Protection Impact Assessments", in David Wright and Paul De Hert, Privacy Impact Assessment, Law, Governance and Technology Series volume 6, Springer, 2012, pp. 33 s.; Colin Bennett, In Defence of Privacy, Surveillance & Society, Vol. 8, No. 4, 2011, pp. 485-496, mentioned by Gary T. Marx, Privacy Is Not Quite Like the Weather, in David Wright and Paul De Hert, Privacy Impact Assessment, op. cit., foreword p. vi.

[7] See for instance Roger Clarke, Privacy Impact Assessments, 19 April 1999, last update on 26 May 2003, available at http://www.rogerclarke.com/DV/PIA.html (last accessed on 29 March 2019): "A PIA (...) considers the impacts of a proposed action, and is not constrained by questions of whether the action is already authorised by law. Moreover, to the extent that relevant codes or standards exist, it does not merely accept them, but considers whether they address the public's needs".

[8] Which might for example be the case of the principle of data minimisation, within the framework of a project aiming at performing big data analyses.

[9] See for ex. Felix Bieker, Michael Friedewald, Marit Hansen, Hannah Obersteller, and Martin Rost, "A Process for Data Protection Impact Assessment under the European General Data Protection Regulation", in K. Rannenberg and D. Ikonomou, Privacy Technologies and Policy, Fourth Annual Privacy Forum, APF 2016, Frankfurt. Springer, Heidelberg, New York, Dordrecht, London. According to the authors, a DPIA is "an instrument to identify and analyse risks for individuals, which exist due to the use of a certain technology or system by an organization in their various roles (as citizens, customers, patients, etc.) [...]".

[10] Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), op. cit. (footnote n°2), Section 3.1.2.

[11] See the IAIA website at https://www.iaia.org/; see also Roger Clarke, Privacy Impact Assessments, 19 April 1999, last update on 26 May 2003, available at http://www.rogerclarke.com/DV/PIA.html (last accessed on 29 March 2019), "Origins and definition".

[12] Estelle De Marco, The definition of private life, 15 March 2019, Sections 1.1 and II.

[13] And clarified by the ECtHR.

[14] Which has the same meaning and scope as the ECHR as regards the rights laid down in the latter: EUCFR, Art. 52, 3.

[15] Estelle De Marco, Using the philosophy underlying the data protection legislation to teach GDPR awareness, 15 March 2019, https://www.linkedin.com/pulse/using-philosophy-underlying-data-protection-teach-gdpr-de-marco/ (URL last accessed on 29 March 2019).

[16] At least insofar as it relates to the action that consists of exercising the freedom at stake; the protection or the balancing of the other aspects of that freedom - such as the content and extent of this exercise - against third parties' rights might rest on another legal basis, such as the right to freedom of expression or the freedom of assembly.

[17] See for instance the discussion in Paul De Hert, Dariusz Kloza, David Wright et al., Recommendations for a privacy impact assessment framework for the European Union, op. cit. (footnote n°3), p. 14. See also Estelle De Marco, Comparative study between Directive 95/46/EC & the GDPR including their relations to fundamental rights, op. cit. (footnote n°2), Section 2.1.

[18] The Court protects for example the "social life" of public figures under the right to private life enshrined in Article 8 of the ECHR, unless the context of the interference appears to be necessary in order to protect a contradictory interest (such as the right to information of the general public) and to be proportionate to this aim. See for ex. ECtHR, Mosley v. the United Kingdom, appl. n° 48009/08, 10 May 2011, §§ 129-130; Handbook on European data protection law, European Union Agency for Fundamental rights and Council of Europe, 2014, http://www.echr.coe.int/Documents/Handbook_data_protection_ENG.pdf, p. 22 s. (last accessed on 29 March 2019).

[19] Handbook on European data protection law, European Union Agency for Fundamental rights and Council of Europe, 2014, op.cit. (footnote n°18), p. 15: "the right to protection of personal data forms part of the rights protected under Article 8 of the ECHR which guarantees the right to respect for private and family life, home and correspondence and lays down the conditions under which restrictions of this right are permitted". For further details see Estelle De Marco, Comparative study between Directive 95/46/EC & the GDPR including their relations to fundamental rights, op. cit. (footnote n°2), Section 2.2.2.1.

[20] In relation to other instruments that protect privacy and personal data see Estelle De Marco, Identification and analysis of the legal and ethical framework, July 2017, Deliverable D2.2, MANDOLA project (Monitoring ANd Detecting OnLine hAte speech) - GA n° JUST/2014/RRAC/AG/HATE/6652, http://mandola-project.eu/publications, Sections 4.1.1 and 4.2.1 (URLs last accessed on 29 March 2019).

[21] In relation to other instruments that protect privacy and personal data see Estelle De Marco, Identification and analysis of the legal and ethical framework, op. cit., Sections 4.1.1 and 4.2.1.

[22] The concept of PIA has been known since the mid-1990s: see Paul De Hert, Dariusz Kloza, David Wright et al., Recommendations for a privacy impact assessment framework for the European Union, op. cit. (footnote n°3), p. 5; see also Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), op. cit. (footnote n°2), Section 3.1.

[23] The privacy by design concept has been developed by Dr. Ann Cavoukian, former Information and Privacy Commissioner of Ontario, in the 1990s. See Ann Cavoukian, Privacy by Design in Law, Policy and Practice, A White Paper for Regulators, Decision-makers and Policy-makers, August 2011, https://gpsbydesign.org/resources-item/privacy-by-design-in-law-policy-and-practice-a-white-paper-for-regulators-decision-makers-and-policy-makers/ (last accessed on 29 March 2019), p. 3.

[24] See above footnote n°17.

[25] Directive 95/46/EC, Article 1, §1.

[26] GDPR, Article 1, §2.

[27] For further details see Estelle De Marco, Comparative study between Directive 95/46/EC & the GDPR including their relations to fundamental rights, op. cit. (footnote n°2), Sections 2.2.1.2 and 2.2.2.2.

[28] Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), op.cit. (footnote n°2), Section 3.1.

[29] Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), op.cit. (footnote n°2), Sections 3 and 3.4.

[30] Tests of legal basis and legitimate aim are not always necessary (and they are not required by the GDPR), but they may sometimes highlight particular privacy issues.

[31] See especially GDPR, Arts. 24,1; 25,1; 28,1.

[32] See for ex. GDPR, Art. 5, 1, c; Art. 6, 1, b-e. In addition, the tests of compatibility and of legitimate interest (respectively Arts. 6, 4 and 6, 1, f) are themselves reduced analyses of necessity and proportionality.

[33] Predecessor of the European Data Protection Board.

[34] GDPR, Art. 35. In relation to situations where a DPIA is necessary, see for ex. Estelle De Marco, Comparative study between Directive 95/46/EC & the GDPR including their relations to fundamental rights, op. cit. (footnote n°2), Section 3.7.3.5.

[35] Prof. Veselin Tselkov, Member of the Board of the Bulgarian Supervisory Authority, opinion expressed at the INFORM final conference "Data Protection Summit: Beyond being INFORMed", held on 7 March 2019 in Sofia, http://www.netlaw.bg/en/a/inform-final-conference-beyond-being-informed (URL last accessed on 29 March 2019).

[36] GDPR, Art. 35.

[37] On the notion of legal ethics, see Estelle De Marco, Comparative study between Directive 95/46/EC & the GDPR including their relations to fundamental rights, March 2018, Deliverable D2.10, INFORM project (INtroduction of the data protection reFORM to the judicial system), JUST-JTRA-EJTR-AG-2016, GA n° 763866, Section 2.4.3, footnote n° 404; Estelle De Marco et al., MANDOLA Deliverable D2.2 - Identification and analysis of the legal and ethical framework, version 2.2.4 of 12 July 2017, MANDOLA project (Monitoring ANd Detecting OnLine hAte speech) - GA n° JUST/2014/RRAC/AG/HATE/6652, available at http://mandola-project.eu/publications, Section 3.3 p 14.

[38] EC recommendation of 9 March 2012 on preparations for the roll-out of smart metering systems (2012/148/EU), §I, 3 (c), available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2012:073:0009:0022:EN:PDF (last accessed on 29 March 2019). The Article 29 Data Protection Working Party supports this definition: see Article 29 Data Protection Working Party, Opinion 04/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems ('DPIA Template') prepared by Expert Group 2 of the Commission's Smart Grid Task Force (WP 205), adopted on 22 April 2013, p. 7.

[39] Article 29 Data Protection working party, Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is "likely to result in a high risk" for the purposes of Regulation 2016/679 (WP248), 4 April 2017, https://ec.europa.eu/newsroom/article29/items/611236, p. 4 (last accessed on 9 June 2021).

[40] See for example CNIL, CNIL publishes an update of its PIA Guides, 26 February 2018, op. cit. (footnote n° 1): "the term "privacy" is used as shorthand to refer to all fundamental rights and freedoms (including those mentioned in Articles 7 and 8 of the [EUCharter], Article 1 of the [Directive-95-46] and the Article 1 of the [DP-Act]: "human identity, human rights, privacy, or individual or public liberties")"; Article 35 of the GDPR evokes the "risk to the rights and freedoms of natural persons".

[41] For example, the processing of simulated data, and therefore of non-personal data, may lead to the creation of false information that might, by coincidence, be linked to a real existing person. Another example could be the development of an application for reporting criminal offences, accompanied by information on how to behave when facing online illegal content. The accompanying information itself, even though it would not process any personal data, might impact the right to the protection of personal data and other rights, since it might mislead users about how to behave when a potentially illegal content is found, and therefore lead them to write information online or to report information to the system or elsewhere, in other words to create information that will be processed but that would not have been processed had they not been misled.

[42] Estelle De Marco, Identification and analysis of the legal and ethical framework, op. cit. (footnote n° 20), Section 4.3.3.2, footnote n°349.

[43] The risk assessment international standard is primarily the norm ISO/IEC 27001, together with methods based on this standard such as the ENISA emerging and future risks framework (2010), available on the ENISA dedicated webpages at https://www.enisa.europa.eu/topics/threat-risk-management/risk-management, and the EBIOS method proposed by the French National agency for information systems security (ANSSI), available at https://www.ssi.gouv.fr/guide/ebios-2010-expression-des-besoins-et-identification-des-objectifs-de-securite/. More information on this topic is available in Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), op. cit. (footnote n°2), Section 3.1.2. See also, along the same lines, the position of the ICO in its Conducting privacy impact assessments code of practice, 2014, version 1.0, p. 23, which can still be found at https://www.pdpjournals.com/docs/88317.pdf (updated guidelines may be found on the ICO website at https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/data-protection-impact-assessments-dpias/). See also the new standard ISO/IEC 29134, which however creates a series of confusions compared to ISO 27001: Fabrizio Bottacin and Cesare Gallotti, PIA and proposals from ISO/IEC 29134 and ICO, 17 January 2017, Euro Privacy, https://europrivacy.info/2017/01/17/pia-and-proposals-from-isoiec-29134-and-ico/. URLs last accessed on 29 March 2019.

[44] See CNIL, CNIL publishes an update of its PIA Guides, 26 February 2018, op. cit. (footnote n° 1), p. 6.

[45] Estelle De Marco, Privacy Impact Assessment of the MANDOLA outcomes, Deliverable D2.4a (Intermediate), op. cit. (footnote n°2), Section 4.3.2.

[46] GDPR, Art. 35, 7, d.

[47] On the same line, see the ICO DPIA guidelines, "Step 5: How do we identify and assess risks?", at https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/data-protection-impact-assessments-dpias/how-do-we-do-a-dpia/#how10 (URL last accessed on 29 March 2019).

[48] GDPR, Recital n°75.

[49] It should be noted that adding ECHR requirements to this list is an ideal complement.

[50] See also GDPR, recital 84 and article 5, §2.

[51] On the same line, see ICO DPIA guidelines, "Step 5: How do we identify and assess risks?", op. cit., which evokes compliance risks.

[52] Judicial courts are generally not bound by supervisory authorities' decisions or guidelines.


 

 

