Abstract
Data are the new oil of our society but, unlike oil, businesses are not allowed to work with and re-use them freely. To the extent that data fall under the category of “personal data”, businesses must comply with the data protection legal framework. In order to do this, it is primarily necessary to design internal and automatic procedures to understand whether the sharing of data, as a further processing operation, is compatible with the original purpose, and whether appropriate safeguards – such as anonymisation – can be implemented without compromising the achievement of the aim pursued through the sharing. When the aim of the sharing requires businesses to disclose personal data, they must identify a legal ground to rely upon and comply with several data protection rules. The aim of this paper is to briefly analyze the solutions adopted by stakeholders under the EU data protection legal framework.
Keywords
- Data sharing
- EU data protection law
- Purpose limitation
- Data minimization
- Anonymised data
- Data subjects’ rights
- Privacy by design
1 Introduction
Nowadays the market offers a wide range of activities which may result in the online sharing of data of any nature and for any reason: from government open data initiatives to the many projects for sharing research data within the scientific community, from social networks to cloud services.
Sharing data not only has an economic value, but is fundamental for the progress of mankind and of a data-driven economy, a priority also recognized by the Digital Single Market Strategy of the European Commission [1, 2].Footnote 1 On the other hand, however, in certain circumstances the sharing of data can jeopardize fundamental rights of individuals, as the information may be used to discriminate against individuals by refusing to provide certain persons with services because of their health, religious or economic status [3]. It can also be used in a way, or for purposes, which can affect individuals’ dignity and, last but not least, violate the individuals’ right to respect for their private life and personal data.Footnote 2 Control of individuals through their personal data remains one of the main topics and challenges to be addressed and solved when the free flow of data is a priority of the new economy. Toward this goal the European Commission recently announced its new strategy for the Digital Single Market, including the adoption of future-proof legislation that will support the free flow of data [4].
For the purpose of this paper it is worth specifying that there is no definition of “sharing of data” under the EU Data Protection Law, including the Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data. Some EU data protection authorities, which have provided guidance on data sharing agreements, define “data sharing” as the “disclosure of the data by transmission, dissemination or otherwise making it available” in many different contexts, i.e. within the public or private sectors, or among the public and/or private organizations [5].
In practice, “data sharing” is commonly used by operators when referring to different activities which can be classified under two specific categories of data processing operations explicitly mentioned by the EU Data Protection Law: “disclosure” and “dissemination”.
In particular, where data are communicated, released or in any other way made available to a limited number of individuals (i.e. identifiable recipients), this is regarded as “disclosure”. Conversely, “dissemination” usually occurs when personal data are spread among an indefinite number of unknown persons.Footnote 3 Keeping in mind the difference between disclosure and dissemination can be worthwhile when speaking about online sharing: sharing information through the web may be subject to restrictions that apply to dissemination only.
By way of example, the sharing of information carried out by governmental bodies in the context of open data initiatives, or a database freely accessible online from anywhere and by anybody in the world, can be regarded as “dissemination”. On the other hand, where information is exchanged between two organizations through, e.g., a data room available in the cloud, this is more correctly referred to as “disclosure” of data.
Irrespective of whether data sharing is disclosure or dissemination, the fact remains that sharing personal data constitutes a “personal data processing operation,” a condition that triggers the application of the EU Data Protection Law.Footnote 4
The goal of this paper is primarily to introduce the reader to the data protection legal framework which applies when a sharing of personal data occurs. Data controllers who are about to share information that can fall under the definition of “personal data” should first conduct a so-called “data protection impact assessment” in order to identify the roles of the parties involved in the sharing, and assess the purposes – and the possible related risks – arising from this sharing of personal data. Furthermore, since sharing personal data may also imply a re-use of the personal data for purposes other than the original one(s), data controllers need to find a legal ground to rely upon. The analysis continues by focusing on the main solutions usually adopted by businesses that decide to share their data. On the one hand, there are businesses which implement de-identification techniques for the purpose of avoiding the application of the EU Data Protection Law; this paper tries to highlight the related risks. On the other hand, there are businesses that cannot avoid sharing data in the form of “personal data” for the purposes for which the data have been collected and, therefore, they have no choice: they must comply with the data protection legal framework. In that case technology may help these businesses to implement systems according to the privacy by design principle, as described below.
The structure of this paper is as follows: Sect. 2 illustrates the limitations to data sharing deriving from the “purpose limitation principle” and provides criteria for the compatibility assessment; Sect. 3 describes the main current trends implemented by businesses to share data; Sect. 4 draws the conclusions.
2 Data Reusability and the “Purpose Limitation” Principle
The aims which can lead data controllersFootnote 5 to share personal data are countless: a large number of private organizations regularly disclose personal data for executing contracts, fulfilling obligations provided under applicable laws or for the purposes of scientific research. Other businesses collect personal data with the specific purpose of selling them to other private organizations, which use the data for their own business purposes. In this respect, however, it is worth underlining that the disclosure or the dissemination is often a subsequent phase of the processing of personal data in the data life-cycle. Data are indeed mainly collected for purposes other than disclosure to third parties (except in the case of a business that collects data for the sole purpose of reselling them). By way of example, healthcare providers and telecommunication companies daily collect large amounts of personal data of, respectively, their patients and customers for the main purpose of providing them with their services. Nevertheless, such providers may subsequently want, or be obliged, to disclose the above data for a wide range of reasons.
Given the above, sharing of data for purposes other than those for which the data were originally collected may be considered as a re-use of data or, more precisely, in accordance with data protection terminology, as a “further processing”.
Now, according to one of the pillarsFootnote 6 of the EU Data Protection Law (the purpose limitation principle) personal data must be processed only for specified, explicit and legitimate purposes (purpose specification) and must not be further processed in a manner that is incompatible with those purposes (compatible use) [6].Footnote 7
2.1 The Compatibility Test
Compatibility needs to be assessed on a case-by-case basis through a substantive compatibility assessment of all relevant circumstances, taking into account specific key factors based on the guidance given under Article 6, paragraph 4, of the Regulation (EU) 2016/679, which can be summarized as follows:
-
any link between the purposes for which the personal data have been collected and the purposes of the further processing: this may also cover situations where there is only a partial or even non-existent link with the original purpose;
-
the context in which personal data have been collected and the reasonable expectations of data subjects as to the further use of their personal data. In this case the transparency originally provided by the data controller about the use of the data at the time of collection is of paramount importance: the more the further use is expected, the more likely it is to be considered compatible. In order to assess the reasonable expectations of individuals as to the use of their data, attention should also be given to the environment and context in which data are collected (i.e. the nature of the relationship between data controller and data subjects could raise reasonable expectations of strict confidentiality or secrecy, as is usually the case in healthcare provider–patient or bank–account holder relationships);
-
the nature of the personal data and the impact of the further processing on data subjects: particular attention must be paid when the re-use involves special categories of personal data such as sensitive data,Footnote 8 as well as in the case of biometric, genetic or location data and other kinds of information requiring special safeguards (e.g. personal data of children);Footnote 9
-
the safeguards adopted by the controller to ensure fair processing and to prevent any undue impact on data subjects: this factor is probably the most important one, because under certain circumstances it can help businesses to compensate for a change of purpose when all the other factors are deficient. In particular, in addition to appropriate “technical and organizational measures to ensure functional separation” (e.g. partial or full anonymisation, pseudonymization and aggregation of data), the data controller must have implemented “additional steps for the benefit of the data subjects such as increased transparency, with the possibility to object or provide specific consent” [6]. A minimal sketch of how such an assessment might be recorded internally is given after this list.
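By way of illustration only, the factor-based assessment above could be captured inside an internal compliance system as a structured checklist. The following Python sketch is a hypothetical example of such a record – the class, field names and pre-screening logic are merely illustrative assumptions, not requirements of the Regulation – and it can only prompt, never replace, the case-by-case legal assessment.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CompatibilityAssessment:
    """Checklist mirroring the compatibility factors discussed above (illustrative only)."""
    link_to_original_purpose: bool        # any link between original and further purpose?
    within_reasonable_expectations: bool  # would data subjects expect this further use?
    involves_special_categories: bool     # sensitive, biometric, genetic, location, children's data
    safeguards: List[str] = field(default_factory=list)  # e.g. "pseudonymization", "aggregation"

    def needs_closer_review(self) -> bool:
        # Hypothetical pre-screening rule: flag the case for legal review whenever a factor
        # weighs against compatibility and no compensating safeguard is documented.
        risky = (not self.link_to_original_purpose
                 or not self.within_reasonable_expectations
                 or self.involves_special_categories)
        return risky and not self.safeguards

# Example: customer data re-used for aggregate statistics with functional separation in place.
case = CompatibilityAssessment(
    link_to_original_purpose=True,
    within_reasonable_expectations=False,
    involves_special_categories=False,
    safeguards=["aggregation", "pseudonymization", "updated privacy notice"],
)
print(case.needs_closer_review())  # False: safeguards compensate, but a human still decides
```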
However, the newly approved Regulation (EU) 2016/679 allows the data controller to avoid the assessment of the above factors for ascertaining the compatibility of the further processing if the controller can rely on the specific consent of data subjects to the use of their personal data for the further processing, or if the further processing is mandated by a legislative or statutory provision with which the data controller must comply.
Processing of personal data in a manner incompatible with the original purposes of the collection infringes the EU Data Protection Law and is thus prohibited.
A key concept introduced by the EU Data Protection Law (Recital 50, Regulation (EU) 2016/679) is a “presumption of compatibility” with the initial purposes of the collection in the event that the further processing is carried out for “purposes in the public interest, scientific or historic research purposes or statistical purposes”, provided that appropriate safeguards for the rights and freedoms of data subjects are adopted in accordance with the principle of minimization.Footnote 10
Finally, it should be clarified that, whenever personal data are shared with a third-party service provider acting as data processor appointed by the data controller,Footnote 11 this cannot be considered a further processing for a new purpose: such a disclosure of personal data occurs in order to fulfill the purposes of the collection. For example, disclosing personal data of employees to a payroll provider appointed as data processor by the employer does not need a compatibility assessment, since it is carried out for the purpose of executing the employment contract, which is the original purpose of the collection of the employees’ personal data; similarly, the use by the data controller of cloud-based services for processing operations related to the contract with customers does not technically trigger any “disclosure”, or even a “further processing”, provided that the service provider acts as data processor under the instructions of the data controller.
2.2 The Roles of the Parties
Given the above, it is therefore necessary, as a preliminary step, to identify the roles of the parties and the purposes of the disclosure, in order to assess whether the data are shared with a third party that will act as data controller.
In the event that a business intends to disclose data to a third party that will use them autonomously (as a data controller), it would be required to assess the nature of the data and the purposes of their collection in order to evaluate whether the new purposes for which personal data are disclosed may be considered compatible with the original purposes. To this extent, a data protection impact assessment (and/or the implementation of privacy-preserving technologies, if applicable) may be worthwhile in order to adequately set up technical procedures. This is not required if the data controller has obtained the specific consent of the data subjects (or if the disclosure of personal data to the third party is required by law).
One of the most important queries that a system should be able to formulate, and to which a system should be able to respond, is to what extent the sharing of “personal data” – i.e. any information relating to an identified natural person or to a natural person who can be identified, also by reference to an identifier such as a name, an identification number, location data, an online identifier, or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person (Article 4, Regulation (EU) 2016/679) – is necessary for achieving the aim pursued through the sharing. In many cases businesses do not really need to share information that can be linked to an individual, i.e. personal data, because the purpose for which the information is shared can also be achieved using anonymous data. Appropriate safeguards, such as the implementation of anonymisation techniques, may help businesses to comply with data protection laws and, at the same time, enable them to make the necessary data available for the sharing [7]. A minimal sketch of a data-minimisation filter along these lines is given below.
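The following Python sketch is such a filter under simplifying assumptions: the field names and the purpose-to-fields mapping are hypothetical, and the only point it illustrates is that the data released are limited to what the declared purpose requires.

```python
# Hypothetical set of direct identifiers that should not leave the organization by default.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "national_id"}

# Hypothetical mapping between a declared sharing purpose and the fields it genuinely needs.
PURPOSE_FIELDS = {
    "aggregate_statistics": {"age_band", "region", "diagnosis"},
    "service_delivery": {"name", "email", "subscription_plan"},
}

def minimise_for_sharing(record: dict, purpose: str) -> dict:
    """Release only the fields the declared purpose requires. Listing a direct identifier
    for a purpose is an explicit design decision to be justified in the impact assessment."""
    needed = PURPOSE_FIELDS[purpose]
    shared = {k: v for k, v in record.items() if k in needed}
    leaking = DIRECT_IDENTIFIERS & shared.keys()
    if leaking:
        print(f"warning: sharing direct identifiers {sorted(leaking)} for purpose '{purpose}'")
    return shared

record = {"name": "Jane Doe", "email": "jane@example.com", "age_band": "30-39",
          "region": "Tuscany", "diagnosis": "x", "subscription_plan": "basic"}
print(minimise_for_sharing(record, "aggregate_statistics"))
# {'age_band': '30-39', 'region': 'Tuscany', 'diagnosis': 'x'} – no direct identifiers shared
```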
3 Current Trends for Sharing Data
When the sharing of data is with a third party that receives the data to process them autonomously for its own purposes, depending on the answer given to the above queries, businesses can follow two different paths:
-
they may choose to release data in a sufficiently aggregated (or even effectively anonymized) form in order to strengthen the protection of the individuals to whom data relate (this would partially simplify the related data protection obligations); or
-
they may need to disclose the information as “personal data” and thus they should design and adopt technical solutions in order to comply with further data protection rules, such as the provision of smart mechanisms to give, and also withdraw, consent (if required) or to opt out (if applicable), and implement tools for improving the data subjects’ control over their data and simplifying the fulfillment of data subjects’ requests relating to the exercise of their rights.
3.1 Sharing De-Identified Data
Anonymization.
The EU Data Protection Law does not apply to “anonymous data”Footnote 12 or to “personal data rendered anonymous in such a manner that a data subject is not, or is no longer, identifiable” (Recital 26, Regulation (EU) 2016/679).
Anonymisation is usually intended as a process through which personal data are manipulated (concealed or hidden) to make it difficult to identify data subjects [8]. This can be done either by deleting or omitting identifying details or by aggregating information [9], as illustrated in the sketch below.
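Purely as an illustration of deletion and aggregation, the following Python sketch suppresses direct identifiers and generalises two quasi-identifiers (age into a ten-year band, postcode into an area prefix), followed by a rudimentary k-anonymity-style check. The field names, generalisation choices and threshold are assumptions, and passing such a check does not by itself make the data anonymous in the legal sense discussed below.

```python
from collections import Counter

def generalise(record: dict) -> dict:
    """Suppress direct identifiers and coarsen quasi-identifiers (field names are hypothetical)."""
    out = dict(record)
    out.pop("name", None)                                            # suppression
    out.pop("email", None)
    age = out.pop("age")
    out["age_band"] = f"{(age // 10) * 10}-{(age // 10) * 10 + 9}"   # generalisation
    out["postcode"] = out["postcode"][:3]                            # keep only an area prefix
    return out

def k_anonymous(records, quasi_identifiers, k=2) -> bool:
    """Rudimentary check: every quasi-identifier combination appears in at least k records."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(c >= k for c in counts.values())

raw = [{"name": "A", "email": "a@example.com", "age": 34, "postcode": "50121", "diagnosis": "x"},
       {"name": "B", "email": "b@example.com", "age": 37, "postcode": "50122", "diagnosis": "y"}]
shared = [generalise(r) for r in raw]
print(k_anonymous(shared, ("age_band", "postcode"), k=2))  # True for this toy data set
```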
However, in recent years practice has revealed that anonymized data can often be easily re-identified or de-anonymized, especially considering that the ever wider scale of electronic processing of data and increasingly sophisticated data mining processes and data analytics make it possible to combine data sets from many different sources and derive new information that can lead to the re-identification of a person [8].Footnote 13
The issue of re-identification risks has recently been addressed by the Article 29 Working Party [10], according to which information is no longer “personal data” only when it is anonymized to the effect that it is no longer possible to associate it with an individual by using “all the means likely reasonably to be used” either by the controller or by a third party.
In the opinion of the Article 29 Working Party, the outcome of such anonymisation should be, in the current state of technology, as permanent as erasure, i.e. it should make it impossible to process the data as personal data. Only in the presence of irreversible anonymisation is it possible to state that the EU Data Protection Law no longer applies. By setting the risk threshold at zero for any potential recipient of the data, the consequence is that, except in very limited cases, there are no existing techniques that can achieve the required degree of anonymization. In that case, sharing the data requires the consent of the data subjects or, alternatively, reliance on one of the other legitimate grounds provided by the EU Data Protection Law.
However, part of the doctrine has criticised this strict approach, considering that the Article 29 Working Party follows an absolute definition of acceptable risk in the form of “zero risk”, while the legislation on the protection of personal data (i.e. the current Directive 95/46/EC on the protection of personal data, and the Regulation (EU) 2016/679) itself does not require a zero-risk approach [11]. Indeed, the EU Data Protection Law provides that the impossibility of re-identification should be approached in light of the “all means likely reasonably” test. The most recent legislation indeed moves in this direction: according to the new wording of Recital 26 of the Regulation (EU) 2016/679, “to determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out,Footnote 14 either by the controller or by another person to identify the natural person directly or indirectly. To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments”. It seems reasonable, indeed, that the wording used by the EU legislator was not intended to adopt a zero-risk expectation.
Additionally, it is worth outlining that anonymisation itself constitutes a further processing of personal data and, as such, it must satisfy the requirement of compatibility with the original purposes of the collection. According to the Opinion of the Article 29 Working Party [10], for the anonymisation to be considered compatible with the original purposes of the processing, the anonymisation process should produce reliably anonymized information.
In the light of the above, businesses should bear in mind that implementing anonymisation techniques does not exempt them from obligations provided under the EU Data Protection Law. Indeed, the purpose limitation principle still applies even when the further processing is aimed at anonymizing personal data.
Given this, it is recommended for businesses – also where they are not strictly required to do so under mandatory lawsFootnote 15 – (i) to carry out an effective data protection impact assessment to verify the compatibility of the anonymization with the purposes for which data were originally collected; (ii) to identify what data may be available for sharing and at what level of anonymization and aggregation; and (iii) to detect any risks of re-identification, taking into account also the technological, economic and organizational capacity of the third-party recipients.
Pseudonymization.
An alternative, but significantly different, method to reduce the possibility of immediate identification of data subjects is pseudonymization, i.e. the procedure of replacing identification data (e.g. the name of an individual or other direct identifiers) with codes or numbers.
According to the Opinion of the Article 29 Working Party [10] and the EU Data Protection Law (Recital 28 and Article 4, paragraph 1, n. 5, Regulation (EU) 2016/679), pseudonymization is a measure to protect personal data, to the extent that it can reduce (but does not exclude) the risks related to the processing of personal data. Implementing pseudonymization techniques is indeed very often expressly recommended also by the data protection regulators to help controllers and processors meet their data protection obligations. In particular, under the EU Data Protection Law pseudonymization is explicitly defined as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organizational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”.
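As an illustration of this definition, the following Python sketch replaces a direct identifier with a keyed pseudonym; the secret key plays the role of the “additional information” that must be kept separately under technical and organizational measures. The field names and the key handling shown here are simplifying assumptions, not a prescribed implementation.

```python
import hmac
import hashlib
import secrets

# The key is the "additional information": it must be stored separately from the shared data
# (e.g. in a key vault), otherwise re-identification is trivial for whoever holds both.
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed pseudonym (HMAC-SHA256)."""
    out = dict(record)
    identifier = out.pop("national_id")
    out["pseudonym"] = hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()
    return out

record = {"national_id": "RSSMRA80A01H501U", "diagnosis": "x"}
print(pseudonymise(record))
# The same identifier always maps to the same pseudonym, so records remain linkable:
# the data are protected, but they are still personal data.
```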
The Article 29 Working Party stressed that equating pseudonymized data with anonymized data is one of the common misconceptions among data controllers. Pseudonymized data, indeed, continue to allow an individual to be singled out and linked across different data sets, and their use remains subject to data protection rules. Accordingly, applying pseudonymization techniques should not be seen by businesses sharing personal data as a way to avoid the application of data protection rules. On the contrary, it should be seen as a measure to strengthen the individuals’ data protection rights.
3.2 Sharing Personal Data
If sharing of personal data (as opposed to anonymous information) is strictly necessary to achieve the aim pursued when the data are disclosed, businesses should take the necessary safeguards in order to comply with data protection rules, but also ensure that third parties receiving personal data will do the same: the disclosing party should honor its obligations towards the individuals from whom it collected personal data and should take appropriate actions to ensure that the receiving party is bound by certain conditions for the processing of the shared personal data.
Preserving data protection while sharing personal data could be achieved by implementing privacy by design strategies in different phases of the data life-cycle: from the collection to the moment of the re-use by the recipients. The aim of these strategies is indeed to mitigate the risks of unlawful processing by:
-
increasing the awareness of data subjects about any further processing operations: the more transparency controllers guarantee on the data life-cycle, the more likely it is that any further use they contemplate may be considered “compatible” (compatibility is not required if a new consent is obtained for the sharing). Using standardized machine-readable icons, as recommended by the EU Data Protection Law (Recital 60, Regulation (EU) 2016/679), will be one of the preferable ways to offer clear and easily visible transparency to the data subjects [12–14];
-
even when this is not strictly mandatory, carrying out a data protection impact assessment in order to identify the risks which may originate from sharing of data as well as to detect possible liabilities for each of the parties involved in the data life-cycle;
-
developing appropriate and non-traditional privacy policies which should automatically be enforced against any party involved in the data life-cycle, e.g. using data sharing agreements [15];
-
complying with the minimization principle by reducing the identifiability of individuals as much as possible, keeping in mind that using pseudonymization techniques does not exclude the risks of re-identification;
-
implementing new tools to allow data subjects to control how their data are processed and shared, and to actively exercise their rights at any time; this is of paramount importance for the future, since the EU Data Protection Law provides a wide range of rights in favor of data subjects – from the right of erasure (i.e. the right to be forgotten) to the right of data portability – the exercise of which by a large group of individuals might strain the resources of businesses in terms of personnel, time and costs. A minimal sketch of such a tool is given after this list.
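The following Python sketch, under simplifying assumptions, illustrates a tool that lets data subjects trigger erasure and portability requests. The storage layer and identifiers are hypothetical; a real system would also have to authenticate the requester and notify every recipient of the data.

```python
import json

class SubjectRightsService:
    def __init__(self, store: dict):
        self.store = store  # data_subject_id -> personal data record (hypothetical storage)

    def export_portable_copy(self, subject_id: str) -> str:
        """Right to data portability: return the subject's data in a machine-readable format."""
        return json.dumps(self.store[subject_id])

    def erase(self, subject_id: str) -> None:
        """Right to erasure: delete the record (a real system would also notify recipients)."""
        self.store.pop(subject_id, None)

service = SubjectRightsService({"subject-42": {"email": "jane@example.com", "plan": "basic"}})
print(service.export_portable_copy("subject-42"))
service.erase("subject-42")
print("subject-42" in service.store)  # False
```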
On the other hand, controllers should also identify the implications of relying on one legal ground rather than another.
The EU Data Protection Law indeed allows the processing of personal data only to the extent that it relies on at least one of the specific and limited legal grounds provided under Article 6.Footnote 16 When businesses decide to share personal data on the basis of the data subjects’ consent, they may rely on new consent mechanisms; the traditional notice-and-consent paradigm gave, indeed, only an apparent, but inconsistent, self-determination [16]. However, over the last years, new technologies have reinforced the concept of consent by creating user-friendly consent mechanisms which are mainly based on engineered banner solutions. Moreover, taking into account that the EU Data Protection Law provides that consent can also be given by a statement or by a clear affirmative action, other types of consent tools may be developed which involve practical positive actions of the user through new sensors (e.g. gestures, spatial patterns, behavioral patterns, motions [17]). Given that the EU Data Protection Law (Article 7, paragraph 3, Regulation (EU) 2016/679) requires that it shall be as easy to withdraw consent as to give it, businesses shall also implement appropriate mechanisms to allow individuals to easily revoke their consent, as sketched below.
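The following is a minimal sketch, assuming a very simple data model: a consent record in which withdrawal is a single operation, no harder than granting consent. The field names are hypothetical, and a real implementation would also need to log the withdrawal and propagate it to every recipient of the data.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_subject_id: str
    purpose: str                      # e.g. "sharing with research partner"
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal must not be harder than giving consent: one call, no extra conditions.
        self.withdrawn_at = datetime.now(timezone.utc)

consent = ConsentRecord("subject-42", "sharing with research partner", datetime.now(timezone.utc))
consent.withdraw()
print(consent.is_valid())  # False: the data must no longer be shared on this basis
```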
Such consent management should be implemented by adopting the approach of privacy preferences, according to which sticky policies can provide a mechanism for attaching privacy preferences to specific data sets and accordingly drive data processing decisions; a sketch of this idea follows.
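The sketch below illustrates, again in hypothetical Python terms, the sticky-policy idea: the privacy preferences travel with the data set, and any processing request is checked against them before the data are used. The policy vocabulary shown is an assumption, not the model defined in [15].

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StickyPolicy:
    allowed_purposes: List[str]
    allow_onward_sharing: bool = False

@dataclass
class PolicyBoundDataset:
    records: List[dict]
    policy: StickyPolicy

    def process(self, purpose: str):
        # Every recipient must check the attached policy before any processing takes place.
        if purpose not in self.policy.allowed_purposes:
            raise PermissionError(f"purpose '{purpose}' not permitted by the attached policy")
        return self.records  # actual processing would happen here

bundle = PolicyBoundDataset(records=[{"pseudonym": "ab12", "diagnosis": "x"}],
                            policy=StickyPolicy(allowed_purposes=["statistical analysis"]))
bundle.process("statistical analysis")   # permitted
# bundle.process("marketing")            # would raise PermissionError
```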
When, on the other hand, businesses decide to rely on their legitimate interest in sharing data, they shall implement appropriate mechanisms in order to inform individuals about the further processing and permit them to exercise their data protection rights, including the right to object, i.e. to opt out (Article 21, Regulation (EU) 2016/679).
4 Conclusion
Re-using personal data by sharing them for purposes other than those for which the data were originally collected is not forbidden by the EU Data Protection Law; on the contrary, it can be carried out on the basis of various legal grounds, e.g. the “compatibility test” or consent.
However, if the sharing involves large amounts of personal data, it is of paramount importance for businesses to carry out a prior data protection impact assessment in order to identify the roles and the potential risks related to the sharing, and to design strategies for preserving privacy along the entire data chain.
To this extent, the current market scenario has started to provide the first examples of privacy-preserving solutions, and these technologies bode well for the future.
Notes
- 1.
For a complete overview on the value of the sharing economy in the European Union see European Parliamentary Research Service “The Cost of Non-Europe in the Sharing Economy” (2016), according to which the new data protection legislation (i.e. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data - General Data Protection Regulation) will help the development of the sharing economy by enabling citizens to exercise effectively their rights to personal data protection and modernizing and unifying rules for businesses to make the most of the Digital Single Market.
- 2.
The right to data protection is a fundamental right in EU Law. It is established under Article 8 of the Charter of Fundamental Rights of the European Union which became legally binding as EU primary law with the coming into force of the Lisbon Treaty on December 1, 2009. Even if fundamental, the right to the protection of personal data is not an absolute right, but must be considered in relation to its function within society. Therefore, a balancing exercise with other rights (e.g. freedom of expression, access to document, freedom of the arts and sciences) is necessary when applying and interpreting the right to data protection.
- 3.
The distinction between “disclosure” and “dissemination” is important for some data processing operations. By way of example, under Article 26 of the Italian Data Protection Code (Legislative Decree 30 June 2003 n. 196) disseminating health data is prohibited while their disclosure, under certain conditions, is permitted.
- 4.
Regulation (EU) 2016/679 provides two scopes of application: material and territorial. According to the material scope (Article 2), the Regulation applies to the processing of personal data wholly or partly by automated means and to the processing, other than by automated means, of personal data which form part of a filing system or are intended to form part of a filing system. Pursuant to the territorial scope, the Regulation applies to (i) the processing of personal data in the context of the activities of an establishment of a controller or a processor in the EU, regardless of whether the processing takes place in the EU or not; (ii) to the processing of personal data of data subjects who are in the EU by a controller or processor not established in the EU, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the EU; or (b) the monitoring of their behaviour as far as their behaviour takes place within the EU; (iii) to the processing of personal data by a controller not established in the EU, but in a place where Member State law applies by virtue of public international law.
- 5.
Under the Regulation (EU) 2016/679 a “data controller” is the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data.
- 6.
The EU Data Protection Law permits the processing of personal data only under specific and limited circumstances (i.e. legal grounds) and requires the data controller to comply with the following principles: lawfulness, fairness and transparency (i.e. data must be processed lawfully, fairly and in a transparent manner); purpose limitation; data minimization (i.e. data must be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed); accuracy (i.e. data must be accurate and, where necessary, kept up to date); storage limitation (i.e. data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed); integrity and confidentiality (i.e. data must be processed in a manner that ensures appropriate security of the personal data).
- 7.
“Specification of purpose is an essential first step in applying data protection laws and designing data protection safeguards for any processing operation. Indeed, specification of the purpose is a pre-requisite for applying other data quality requirements, including the adequacy, relevance, proportionality and accuracy of the data collected and the requirements regarding the period of data retention. The principle of purpose limitation is designed to establish the boundaries within which personal data collected for a given purpose may be processed and may be put to further use.” (Article 29 Working Party, WP203, p. 4).
- 8.
Regulation (EU) 2016/679 provides additional safeguards for the processing of “special categories” of personal data which are, by their nature, particularly sensitive in relation to fundamental rights and freedoms (i.e. data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation).
- 9.
Regulation (EU) 2016/679 provides special care for children by introducing limits and additional requirements when the processing concerns data of children under 16 years.
- 10.
Under Article 89 of the Regulation (EU) 2016/679 pseudonymization is suggested as an appropriate safeguard when the processing of personal data occurs for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, without prejudice to other techniques which may be more effective.
- 11.
Under the Regulation (EU) 2016/679 a “processor” is the natural person or the entity that processes personal data on behalf of a controller. By way of example, where IT services are provided by a third party service provider, the organization using the IT services offered by the third party appoints the IT provider as its data processor.
- 12.
Recital 26 of the Regulation (EU) 2016/679 defines “anonymous data” as “the information which does not relate to an identified or identifiable natural person”.
- 13.
See the AOL and Netflix cases in Ohm (2009) [8], pp. 1717–1722.
- 14.
This is a new wording introduced by the Regulation (EU) 2016/679 and draws on the Opinion of the Article 29 Working Party where it stated that an effective anonymisation solution prevents all parties from singling out an individual in a dataset, from linking two records within a dataset (or between two separate datasets) and from inferring any information in such dataset. Another factor pointed out by the Article 29 Working Party in its Opinion 5/2014 in assessing the notion of “impossibility” is the robustness of the anonymisation technique employed. In assessing the robustness of different techniques of anonymization, the following questions should be taken into account: (1) is it still possible to single out an individual; (2) is it still possible to link records relating to an individual, and (3) can information be inferred concerning an individual? Using these three questions, the Article 29 Working Party produced a table that shows the strengths and weakness of the different techniques of anonymisation.
- 15.
According to Article 35, paragraph 1, of the Regulation (EU) 2016/679 carrying out an assessment of the impact of the processing operations on the protection of personal data is required when, given the type of processing, in particular if new technologies are involved, and taking into account the nature, scope, context and purposes of the processing, it is likely that the processing results in a high risk to the rights and freedoms of natural persons. Pursuant to Article 35, paragraph 3.a, it is specifically required to carry out a data protection impact assessment in case of: (a) systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person; (b) processing on a large scale of special categories of data or of personal data relating to criminal convictions and offences; or (c) systematic monitoring of a publicly accessible area on a large scale.
- 16.
According to Article 6 of the Regulation (EU) 2016/679, personal data can be processed if the data subject has given his/her consent or if the processing is necessary:
-
(a)
for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
-
(b)
for compliance with a legal obligation to which the controller is subject;
-
(c)
in order to protect the vital interests of the data subject or of another natural person;
-
(d)
for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; or
-
(e)
for the purposes of the legitimate interests pursued by the controller or by a third party.
References
European Parliamentary Research Service, The Cost of Non-Europe in the Sharing Economy (2016)
European Commission, European Cloud Initiative - Building a competitive data and knowledge economy in Europe – COM (2016). 178 final
European Union Agency for Fundamental Rights, Handbook on European Data Protection Law (2014)
European Commission. http://europa.eu/rapid/press-release_IP-16-1407_en.htm
Information Commissioner’s Office, The Data Sharing Code of Practice (2011). https://ico.org.uk/media/for-organisations/documents/1068/data_sharing_code_of_practice.pdf
Article 29 Data Protection Working Party, Opinion 03/2013 on Purpose Limitation (WP203), 2 April 2013
Fisk, G., Ardi, C., Pickett, N., Heidemann, J., Fisk, M., Papadopoulos, C.: Privacy principles for sharing cyber security data. In: Security and Privacy Workshops (SPW). IEEE (2015)
Ohm, P.: Broken promises of privacy: responding to the surprising failure of anonymisation. UCLA Law Rev. 57, 1701 (2009)
Hon, W.K., Millard, C., Walden, I.: The problem of ‘personal data’ in cloud computing – what information is regulated? The cloud of unknowing. Int. Data Priv. Law 1(4), 211–228 (2011). Queen Mary School of Law Legal Studies Research Paper No. 75/2011
Article 29 Working Party, Opinion 05/2014 on Anonymisation Technique (WP216), 10 April 2014
El Emam, K., Alvarez, C.: A critical appraisal of the article 29 working party opinion 05/2014 on data anonymization techniques. Int. Data Priv. Law 5(1), 73–87 (2015)
Edwards, L., Abel, W.: The Use of Privacy Icons and Standard Contract Terms for Generating Consumer Trust and Confidence in Digital Services, CREATe Working Paper 2014/15, 31 October 2014
Holtz, L.-E., Nocun, K., Hansen, M.: Towards displaying privacy information with icons. In: Fischer-Hübner, S., Duquenoy, P., Hansen, M., Leenes, R., Zhang, G. (eds.) Privacy and Identity Management for Life. IFIP AICT, vol. 352, pp. 338–348. Springer, Heidelberg (2011)
ENISA, On the Security, Privacy and Usability of Online Seals. An Overview (2013)
Caimi, C., Gambardella, C., Manea, M., Petrocchi, M., Stella, D.: Legal and technical perspectives in data sharing agreements definition. In: Berendt, B., et al. (eds.) APF 2015. LNCS, vol. 9484, pp. 178–192. Springer, Heidelberg (2016). doi:10.1007/978-3-319-31456-3_10
Mantelero, A.: The future of consumer data protection in the E.U. rethinking the “notice and consent” paradigm in the new era of predictive analytics. Comput. Law Secur. Rev. 30(6), 643–660 (2014)
ENISA, Privacy by Design in Big Data (2015)