Big Data and GDPR
Sari Depreeuw
Beta Cowork Brussels
21 March 2019
1
Overview
• Introduction
• Does GDPR apply?
– « personal data »
– « processing »
• Processing within limits
– Principles
– Legal grounds
• How to do big data (right)?
– Impact assessment
– Data protection by design & by default
– Transparency
– Data quality
– Data subjects’ rights
2
Introduction
3
Introduction
4
Risks
• Tracking & surveillance (government/commercial)
• Security
• Trust (understanding, control)
• Inaccuracy
• Discrimination / influencing (nudging) / unfair treatment
/ perpetuating existing (economic) imbalance
5
Opportunities
• Autonomous cars
• Insurance
• Marketing / ads (online, offline)
• Retail
• Mobility
• Search
• Public sector / private sector
6
“Big data”
1. Collecting data
– Large quantities of data
– From diverse sources
– Variety of data
(personal and non-personal data)
2. Analysing data
– Correlations – profiles – algorithms (black box)
– “Anonymisation”
3. Applying profiles (if applicable)
– Fitting individuals in profiles
7
Does the GDPR apply?
“personal data”, “processing”, “anonymisation” v.
“pseudonymisation”
8
Personal data
• “Personal data” (art. 4(1) GDPR)
– Any information relating to an identified or identifiable
natural person (‘data subject’)
– Broad notion!
• “Identifiable natural person”
– Anyone who can be identified - directly or indirectly, e.g.
by reference to:
• identifier (e.g. name, identification number, location data,
online identifier)
• factors specific to the physical, physiological, genetic, mental,
economic, cultural or social identity of that natural person
• e.g. combination of IP address, MAC address, browser type,
type of phone
9
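A minimal Python sketch of the point above: the attribute values are made up, but combining such otherwise "anonymous" technical attributes yields a stable identifier that singles out one visitor.

import hashlib

def device_fingerprint(ip, mac, user_agent, phone_model):
    # Combine technical attributes into one stable value.
    raw = "|".join([ip, mac, user_agent, phone_model])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# The same visitor always maps to the same value, so he or she can be
# recognised ("singled out") across visits, even without a name or e-mail address.
fp = device_fingerprint("192.0.2.10", "00:1B:44:11:3A:B7", "Mozilla/5.0 (Android)", "Pixel 3")
print(fp)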
Personal data (2)
• Natural person “identifiable”?
– Assessment on the basis of all the means reasonably likely to
be used either by the controller or by any other person to
identify the said person (Recital 26 of GDPR)
– Factors:
• costs of identification
• intended purpose (identify individuals?)
• advantages expected by the controller, interests at stake for the
individual
• future technological evolution (dynamic test)!
– “Singling out”
• Not necessary to know name or e-mail address
• Impact on individual’s behaviour
10
Personal data (3)
• “data relating to an individual”
= data referring to the identity, characteristics or behaviour
of an individual or such information that is used to
determine or influence the way in which that person is
treated or evaluated (WP 29)
Examples:
– A natural person’s name, picture, phone number (private
or professional), bank account number, e-mail address.
– CJEU: employees’ working time records, person’s image
by CCTV, tax data, information in a press release where
(unnamed) person was easily identifiable, fingerprint, IP
address, exam scripts.
Even if data subject acts in professional context!
11
Processing
= Any operation or set of operations which is performed on
personal data or on sets of personal data, whether or not
by automated means (art. 4(2) GDPR)
– E.g. collection, recording, organisation, structuring, storage, adaptation or
alteration, retrieval, consultation, use, disclosure by transmission,
dissemination or otherwise making available, alignment or combination,
restriction, erasure or destruction
– Broad interpretation
– CJEU: loading on internet page; collection, publication, transfer on CD-ROM,
text messaging; communication in response to a request for access to
documents; crawling, indexing, transmitting by search engine; leaking to
press; video recording.
12
Anonymisation
• “Processing” of “personal data”
– Collecting of data
– Analysing
– Applying (targeting)
• Anonymisation?
= act of processing
Principle: no more “personal” data (rec. 26)
In practice: pseudonymisation
Data sets – combination – re-identification – targeting
13
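A minimal sketch (hypothetical phone number, assuming the token is just an unsalted hash) of why hashing is pseudonymisation rather than anonymisation: whoever can enumerate the input space can re-identify the data subject.

import hashlib

def pseudonymise(phone):
    # Deterministic, unsalted hash: the same number always yields the same token,
    # so tokens can still be linked and matched across data sets.
    return hashlib.sha256(phone.encode("utf-8")).hexdigest()

token = pseudonymise("+32470123456")

# Re-identification by enumerating a plausible input space:
for n in range(1_000_000):
    candidate = "+3247" + str(n).zfill(7)
    if pseudonymise(candidate) == token:
        print("re-identified:", candidate)
        break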
PROCESSING WITHIN LIMITS
Principles for processing, legal grounds for
processing, sensitive data, profiling
14
Principles for processing (art. 5 GDPR)
a) Lawfulness, fairness and transparency
b) Purpose limitation: Data collection for specified, explicit and
legitimate purposes and not further processed in a manner that is
incompatible with those purposes;
further processing for archiving purposes in the public interest,
scientific or historical research purposes or statistical purposes shall
not be considered to be incompatible with the initial purposes
c) Data minimisation: adequate, relevant and limited to what is
necessary in relation to the purposes for which they are processed
d) Accuracy: accurate and, where necessary, kept up to date; every
reasonable step must be taken to ensure that personal data that are
inaccurate, having regard to the purposes for which they are
processed, are erased or rectified without delay
15
Principles for processing (2)
e) Storage limitation: kept in a form which permits identification of data
subjects for no longer than is necessary for the purposes for which
the personal data are processed; storage for longer periods if
processing solely for archiving purposes in the public interest,
scientific or historical research purposes or statistical purposes
subject to safeguards;
f) Integrity and confidentiality: processed in a manner that ensures
appropriate security of the personal data, including protection against
unauthorised or unlawful processing and against accidental loss,
destruction or damage, using appropriate technical or organisational
measures.
g) Accountability: Controller is responsible for compliance with these
principles and should be able to demonstrate such compliance.
16
Bottlenecks
• Transparency
>< black box, opacity
• Purpose limitation
>< discovery, unpredictability
>< repurposing data
• Data minimisation
>< collecting all available data
>< generating new data
provided >< observed, derived, inferred
17
Lawfulness (art. 6 GDPR)
• Legal grounds for processing :
– Consent
– Performance of contract
– Legal obligation of controller
– Vital interests of data subject or other
– Public interest or official authority of controller
– Legitimate interests of controller or other
• Unless overriding interest or fundamental rights and
freedoms of data subject (esp. child)
18
Consent (art. 7 GDPR)
• Freely given, specific, informed, unambiguous (art. 4(11) GDPR)
– Statement or affirmative action >< pre-ticked box
– Concerning all purposes (initial + further processing)
To be obtained prior to any processing based on consent
– Intelligible // public
– Using clear and plain language >< “legalese”
– Otherwise: not binding
– Valid consent = illusion?
• Withdrawal of consent
– At any time
– As easy as granting consent
– Without detriment
19
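A minimal sketch of what "specific" consent and easy withdrawal could look like in a data model (the field and purpose names are hypothetical): one record per purpose, with a withdrawal timestamp rather than a blanket "I agree".

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str                        # one record per purpose, no blanket consent
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self):
        # Withdrawal possible at any time, as easy as granting, without detriment.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def valid(self):
        return self.withdrawn_at is None

consents = [
    ConsentRecord("user-42", "newsletter", datetime.now(timezone.utc)),
    ConsentRecord("user-42", "ad-profiling", datetime.now(timezone.utc)),
]
consents[1].withdraw()                  # ad profiling stops; the newsletter consent is unaffected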
Bottlenecks
• Consent (informed, specific)
>< unpredictable outcome
>< limited understanding (technophobia)
Granular, real-time consent?
• Performance of contract
>< necessity
• Legal obligation
>< limited cases
• Legitimate interests
>< necessity
>< privacy, non-discrimination (cf. Google Spain)
risk: balance in favour of data subject
20
Sensitive data (art. 9 GDPR)
• Special protection for « special categories » of data
Data revealing racial or ethnic origin, political opinions,
religious or philosophical beliefs, or trade union
membership
Genetic data, biometric data for the purpose of uniquely
identifying a natural person, data concerning health or
data concerning a natural person's sex life or sexual
orientation
21
Sensitive data (2)
• Principle: prohibition
• Exceptions:
– Explicit consent
– Rights re employment, social security, social protection law
– Vital interests
– Not for profit organisation with political, philosophical,
religious or trade union aim
– Prior publicity by data subject
– Legal claims
– Substantial public interest, public health
– Health care related processing
– Archiving, research, statistics
22
Profiling (art. 22 GDPR)
• To be subject to (i) a decision (ii) based solely on
automated processing, including profiling, (iii) which
produces legal effects concerning him or her or similarly
significantly affects him or her
• 'profiling' means “any form of automated processing of
personal data consisting of using those data to evaluate
certain personal aspects relating to a natural person, in
particular to analyse or predict aspects concerning that
natural person's performance at work, economic
situation, health, personal preferences, interests,
reliability, behaviour, location or movements” (art. 4(4) GDPR)
23
Profiling (2)
• Principle: data subject’s right not to be subject to automated
decision
Profiling as such not prohibited.
• Except:
– Necessary for contract (conclusion / performance)
– Authorised by law
– Explicit consent
Safeguards: rights, freedoms, interests of data subject
minimum: human intervention + defence, right to contest
• No decisions based on sensitive data => prohibited discrimination
Unless explicit consent or public interest + safeguards
→ Art. 29 WP Guidelines on Automated individual decision-making and
Profiling for the purposes of GDPR (WP251) v. 6 Feb 18
24
HOW TO DO BIG DATA (RIGHT)?
Impact assessment, data protection by design,
transparency, information, consent
25
Starting point
• Assume personal data is processed
– Collecting => source(s)?
– Analysing => statistics?
– Applying => automated? Significant impact on data
subject?
• Define purposes (initial + further processing)
Apply data processing principles (bottlenecks)
• Define legal basis
– Legitimate interest
– Consent
26
Impact assessment (art. 35)
• Data protection impact assessment
– // type of processing (new technologies)
– // nature, scope, context and purposes of the processing
→ Potential “high risk” to the rights and freedoms of natural
persons
• Required:
– a systematic and extensive evaluation of personal aspects
relating to natural persons, based on automated processing,
including profiling, and on which decisions are based that
produce legal effects concerning the natural person or similarly
significantly affect the natural person;
– processing on a large scale of sensitive data / data relating to
criminal convictions and offences; or
– a systematic monitoring of a publicly accessible area on a large
scale.
27
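A rough screening sketch of the three art. 35(3) triggers listed above (the flag names are invented; processing that is otherwise "likely to result in a high risk" may still need a DPIA):

def dpia_required(systematic_profiling_with_legal_effects,
                  large_scale_sensitive_or_criminal_data,
                  large_scale_public_monitoring):
    # Any one of the art. 35(3) cases is enough to require a DPIA.
    return (systematic_profiling_with_legal_effects
            or large_scale_sensitive_or_criminal_data
            or large_scale_public_monitoring)

if dpia_required(True, False, False):
    print("Carry out a DPIA before the processing starts.")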
Impact assessment (2)
• DPIA:
– Description of processing operations, purposes, legitimate
interest of controller;
– Assessment of necessity and proportionality of processing // purposes;
– Assessment of risks to the rights and freedoms of data
subjects;
– Measures envisaged to address the risks
e.g. https://www.cnil.fr/en/privacy-impact-assessment-pia
28
DP by design
• Data protection by design and by default (art. 25 GDPR)
– By design:
• Data protection incorporated “in the design” of the solution
– // state of the art technology, cost of implementation, scope, context and
purposes of processing, risk for rights of data subject
– “Appropriate” technical and organisational measures (e.g.
pseudonymisation, data minimisation, data segregation)
• Entire lifecycle management of personal data
• E.g. security measures // risk
Anonymisation (cf. Art. 29 Working Party, Opinion 05/2014)
Pseudonymisation & encryption, access control, audit logs, guarantee
ongoing confidentiality, integrity, availability, resilience of system,
restoring availability and access to data in case of physical or technical
accident, regular testing and assessing security measures
• E.g. functional separation (statistics >< impact on individual)
29
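As one possible "appropriate technical measure", a sketch of keyed pseudonymisation (HMAC), assuming the secret key is stored and access-controlled separately from the data set; without the key, tokens cannot be recomputed from guessed identifiers.

import hashlib
import hmac

SECRET_KEY = b"keep-me-in-a-separate-key-vault"   # hypothetical; stored apart from the data

def pseudonym(subject_id):
    # Keyed hash: unlike the plain unsalted hash shown earlier,
    # re-identification requires access to the key.
    return hmac.new(SECRET_KEY, subject_id.encode("utf-8"), hashlib.sha256).hexdigest()

analytics_row = {"user": pseudonym("user-42"), "pages_viewed": 17}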
DP by default
– By default:
• As a starting point: only data necessary per specific purpose
(limitation re amount of data collected, extent of processing, storage
period and accessibility)
– “Appropriate” technical and organisational measures
– No accessibility to indefinite number of people without intervention
of the individual
– E.g. control data collected through form (fields), processes to manage
duration of storage (alerts, automated deletion or pseudonymisation)
– To be documented!
30
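A sketch of one such process to manage duration of storage, assuming a hypothetical record layout with a collected_at timestamp and a retention period fixed per purpose:

from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)      # hypothetical retention period for this purpose

def retention_sweep(records):
    # Keep only records still within the retention period; expired records are
    # dropped (or could be pseudonymised) instead of being kept "just in case".
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["collected_at"] >= cutoff]

records = [
    {"id": 1, "collected_at": datetime.now(timezone.utc) - timedelta(days=400)},
    {"id": 2, "collected_at": datetime.now(timezone.utc) - timedelta(days=30)},
]
records = retention_sweep(records)   # record 1 is removed, record 2 is kept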
Fairness, transparency
• Impact of processing
e.g. ads v. different treatment
Prohibited discrimination (e.g. ethnicity, gender, religion)
• Legitimate expectations
e.g. loyalty card; social media (cf. Cambridge Analytica)
Transparency, prior information
Evolving attitude of data subjects (generational?)
• Information
About existence of processing (tracking?), methods
cf. Facebook decisions (Brussels court of appeal)
31
Data quality
• Accurate, up to date data
cf. right to rectification
• Algorithmic accountability
logic – discrimination (perpetuating) – active detection (algorithm, data sets)
inaccurate predictions – associations – correlation / causation
32
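One way to make "active detection" concrete: a minimal sketch (the groups and outcomes are invented) comparing favourable-decision rates across groups; a large gap is a signal to inspect the algorithm and the data sets, not proof of discrimination in itself.

from collections import defaultdict

def outcome_rates_by_group(rows):
    # rows: iterable of (group, outcome) pairs, outcome 1 = favourable decision.
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in rows:
        totals[group] += 1
        positives[group] += outcome
    return {g: positives[g] / totals[g] for g in totals}

rates = outcome_rates_by_group(
    [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
)
print(rates)   # e.g. group A ~0.67 vs group B ~0.33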
Rights of data subject (1)
• Information + access (art. 13-15 GDPR)
– Who (controller), why (purpose), what (data, processing,
source), how (recipients, access, rectification, profiling), how
long (retention period), remedies (SA, DPO)
– Existence of automated decision making, incl. profiling: logic,
significance, consequences for data subject
– Right to obtain a copy
including observed, inferred data.
importance of proper data management!
Give access to profiles, labels?
– Information in plain language – intelligible (public)
Legal design: visualisation tools? Icons? Simple text?
– How to inform data subject where data from various sources?
33
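A sketch of why proper data management matters for access requests: the stores and field names are hypothetical, but the copy handed to the data subject should bring provided, observed and inferred data together in one intelligible export.

def export_subject_data(subject_id, provided_store, observed_store, inferred_store):
    # Assemble one copy of everything held about the data subject, including
    # data generated by the controller (profiles, labels).
    return {
        "provided": provided_store.get(subject_id, {}),   # e.g. form input
        "observed": observed_store.get(subject_id, {}),   # e.g. clickstream, tracking
        "inferred": inferred_store.get(subject_id, {}),   # e.g. applied profile / labels
    }

copy_for_subject = export_subject_data(
    "user-42",
    {"user-42": {"email": "a@example.org"}},
    {"user-42": {"pages_viewed": 17}},
    {"user-42": {"segment": "frequent traveller"}},
)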
Rights of data subject (2)
• Rectification and erasure (art. 16-17 GDPR)
inaccurate data => rectification
Erasure => “right to be forgotten”!
– Limited cases incl. withdrawal of consent, data no longer
necessary,…
Cf. also CJEU Google Spain (C-131/12)
• Right to object (art. 21 GDPR)
– To processing (and profiling) on public interest/legitimate interest
ground
• If particular situation
• Controller may establish compelling legitimate grounds
– To direct marketing (including profiling)
– To processing for scientific/historical research/statistical purposes
34
Rights of data subject (3)
• Restriction of processing (art. 18 GDPR)
• Data portability (art. 20 GDPR)
• “Automated individual decision making”, incl. profiling
(art. 22 GDPR)
– Right not to be subject to decision based solely on
automated processing if legal effects or significant
impact
35
Automated decisions
• Right of information
existence + logic >< IP, trade secrets
• Right of access (incl. copy)
to personal data incl. generated data (application of profile)
existence + logic >< IP, trade secrets
• Right of rectification
inaccurate profiling?
• Right to erasure
Withdrawal of consent
• Right to object
Legitimate interest (incl. profiling) data subject > controller
(i) particular situation, or (ii) direct marketing and profiling
• Right not to be subject to automated decision making
36
Moving target
• Anonymise where possible
– Pseudonymisation is a valid plan B
• Data protection by design
– Think about design – all the time
– Start by impact assessment
– Integrate GDPR + « ethical » principles
– Check algorithms + data sets
• Communicate transparently
– Privacy notices
• Document choices
37
Moving target (2)
• Opportunities?
• Algorithmic transparency (accountability)
Diverse development or testing teams?
Audits?
Information?
• Innovative communication (legal design)
Bite-size, evolving messages
Plain text
Visualisation tools? Icons?
Access to profile, applied labels?
• Consent
Granular consent // information?
38
Many thanks!
Sari Depreeuw
sdp@daldewolf.com
39