Lilian Edwards
Professor of E-Governance
University of Strathclyde
Lilian.edwards@strath.ac.uk
@lilianedwards
Pangloss: http://blogscript.blogspot.co.uk/
Slave to the
Algorithm
March 2016
Slave to the algo-ri(y)thm
 Algorithms are Big News : ATI, Royal Society,
CCCAV
 Historical pre-digital notion – everything from
knitting patterns to – “a set of logical
instructions intended to solve a problem” –
often different answers depending on
variables. (Braun)
Slave to the Algorithm  2016
Machine learning algorithms
 Eg machine vision. Is that a table or a packing case?
 Machine language analysis. Eg number 2
 Movements eg backing up a truck
 Typically, use a training set of many thousands or millions of
examples – inputs and outputs
 Already defined what a successful output is (eg yes, that was
a 2)
 Train system to recognise inputs that produce the successful output
 Apply to new data
 Usually much preparation (“cleaning”) of data sets used -
Gillespie; context of use;
 Not nearly as automated or neutral a process as it is usually
presented
 Arguably running the world – “algorithmic governance”
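The training loop sketched in these bullets – labelled examples, a predefined "successful output", training, then application to new data – can be illustrated with a toy supervised classifier. This is a minimal sketch with invented data and features, not any production system:

```python
# Minimal sketch of the supervised-learning loop described above:
# (1) a training set of labelled examples, (2) a "successful output"
# defined in advance, (3) training, (4) application to new data.
# Data, features and labels here are invented for illustration.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(examples):
    """'Training' for 1-nearest-neighbour is just memorising the
    labelled examples -- real systems fit model parameters instead."""
    return list(examples)

def predict(model, features):
    """Classify a new input by the label of the closest training example."""
    return min(model, key=lambda ex: distance(ex[0], features))[1]

# Labelled training set: (feature vector, label) pairs.
# The labels define in advance what a "successful output" is.
training_set = [
    ((0.9, 0.1), "2"),       # eg invented pixel-density features of a digit
    ((0.8, 0.2), "2"),
    ((0.1, 0.9), "not-2"),
    ((0.2, 0.8), "not-2"),
]

model = train(training_set)
print(predict(model, (0.85, 0.15)))  # applied to new data -> "2"
```

The "cleaning" and selection steps the slide mentions happen before data ever reaches code like this, which is one reason the process is less automated than it looks.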
Algorithm + data = ?
 Now – twin sibling to Big Data?
 the “key logic governing the flows of information upon which our
society depends” (Gillespie, 2013)
 “together data structures and algorithms are two halves of the
ontology of the world according to a computer” (Manovich, 1999)
 Considerable interest, coders -> business, sociologists,
politicians, lawyers; “Governing Algorithms”, MIT, 2013; Mayer-
Schoenberger & Cukier, Big Data (2013); Morozov, To Save
Everything, Click Here (2013); Pariser, The Filter Bubble (2011);
Kitchin, The Programmable City
 Law – Pasquale, The Black Box Society (2015); Edwards
 Has change happened because of better algorithms or more
data? Esp in the Internet industries, eg search.
Why are algorithms important to society, governance, innovation?
 Predictive profiling of persons
 OBA/targeted ads on Google/social networks, etc;
 price and service discrimination;
 criminal/terrorist profiling; -> pre-crime?
 Future health, obesity, Alzheimer’s risks
 Non-personal predictions of what is important/significant/popular/profitable;
 eg “trending topics” on Twitter;
 Google News; top Google results on search by keywords;
 automated stock exchanges;
 recommendations on Netflix/Amazon etc
 Filtering online of unwanted content
 Spam algorithms, Google Hell (anti SEO)
 Ideal? Twitter UK anti-women trolling cases summer 2013: ACPO “They [Twitter]
are ingenious people, it can't be beyond their wit to stop these crimes”
 “Real world” as well as online effects: Algorithms to instruct robots on how to
behave adaptively when circumstances change from original programming;
driverless cars liability?
 Almost hopelessly wide topic! See *Kohl (2013) 12 IJLIT 187.
Are algorithms intrinsically fair,
neutral or objective?
Please please believe me..
Are algorithms “fair”, “neutral”,“objective”? Some key themes
 Kitchin: algorithms alleged to be “technical, benign and commonsensical”
 “strictly rational concerns marrying the certainties of mathematics with
the objectivity of technology”
 Why is “automated” taken to => “neutral” / “objective”? A game can after
all be rigged..
 Yet tendency to regard algorithmic reality as real eg
 first page of Google search;
 Facebook Newsfeed emotional manipulation experiment (merely beta testing?)
June 2014
 Curated newsfeeds favouring one or other of polarised political views – the
“echo chamber” worry and the Trump/Brexit effect?
 Not new: cf “computer says no”, early credit scoring errors
 Kohl : “the automation-neutrality platitude”
Why might algorithms not be
neutral?
 Revealed unintentional bias eg Harvard medical admissions
BUT ALSO
 Selection of data for training set, and ..
 Selection of data as inputs:
 How was the data to which the algorithm is applied selected and made
“algorithm-ready”? (“translation”)
 The evaluation of relevance to a successful task – (exclusion/inclusion;
demotion/promotion; manual intervention;)
 Complexity and iterative dynamic change: “algorithmic systems are not
standalone little boxes but massive networked ones with hundreds of
hands”
 Overwhelmingly – the black box effect: causality is not shown, merely
statistical correlation
 Legal implications (and remedies):
 discrimination in profile-based advertising (Latanya Sweeney);
 ranking of legal vs illegal download sites on Google;
 adult content ranking eg not in Amazon top sellers ;
 unfair competition issues – see also re Google Search results; defamation
and autocomplete cases.
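The "selection of data for training set" point above can be made concrete with a toy sketch (all data invented): a perfectly "neutral" learning rule trained on skewed historical decisions simply reproduces the skew – the bias sits in the data, not the rule.

```python
# Sketch (with invented data) of how training-set selection, not the
# learning rule itself, can drive biased outputs. The "algorithm" below
# is a neutral majority vote over past decisions for each group.
from collections import Counter

def train(past_decisions):
    """Learn, per group, the most common historical outcome."""
    by_group = {}
    for group, outcome in past_decisions:
        by_group.setdefault(group, Counter())[outcome] += 1
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

# Historical decisions already skewed against group "B".
history = [
    ("A", "admit"), ("A", "admit"), ("A", "reject"),
    ("B", "reject"), ("B", "reject"), ("B", "admit"),
]

model = train(history)
print(model["A"], model["B"])  # the skew is learned, not invented by the code
```

Auditing the learning rule alone would find nothing wrong here; only inspecting the training data (or the outputs) reveals the bias – the black-box problem in miniature.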
Enslaving the algorithm: (1)
competition remedy
 Repeated claims Google manipulates search to demote competitors,
promote own products
 Early US case law: Search King v Google (2003) – Google’s rankings not
challengeable, as “opinion”, 1st Am protected!
 However EU competition regulators, national & Commission, and the FTC in
the US have taken allegations more seriously, eg Foundem (UK), Ciao (EU),
ejustice.fr (Fr) – proposed remedies April 2013 – architectural, labelling
remedies – fairly minor.
 Can a notion of fairness/neutrality/impartiality be reasonably imposed on
Google’s proprietary algorithm?
 Is there any canonical form of Google’s front page?
 It’s Google’s game and they make the rules? But can clearly make or break
businesses due to market dominance.
 Reliance and trust. Google: “Our users trust our objectivity and no short term goal
could ever justify breaching that trust”
 Could it ever be “neutral” to suit everyone?
Enslaving the algorithm: (2) libel
 Algorithmic defamation! Eg. Bettina Wulff case, Germany 2012/3
 Google’s defenses: “The search terms in Google Autocomplete reflect the
actual search terms of all users” (the “crowdsourcing” defense). Also – automation;
objectivity
 (If Google’s rankings are only “opinion”, are its autocomplete suggestions not
even more so?)
 But Wulff won: Google liable only on notice and failure to remove
 But French and UK courts disagree
 “Crowdsourced” defense could inspire astroturfing – Morozov suggests
competitors could hire Mechanical Turks to diss opponents..
 Is there social interest in making autocomplete too risky to keep turned on?
 Is repressing questionable autocompletes a further version of the filter bubble?
Example
 Try looking up “Tories are”, “Labour are”, “Lib Dems are”...
 “Users who enter “Labour are” are offered
completed terms including “… finished”, “… a
joke”, and “… right wing”. Similarly, entering “Lib
Dems are” offers up “… finished”, “… pointless”
and “… traitors”. But entering “Conservatives are”
or “Tories are” offers no search suggestions at all.”
 Guardian, 2 Feb 2016
 Shows lack of transparency and accountability
within Google algorithm?
The algorithm as black box
 Example: the Google search algorithm is not just
PageRank (counting links) but c. 200 other signals, changed
regularly – c. 500-600 times/year – some clues given to the
SEO industry
 Why accepted as trade secret?
 revenue depends on it – key market advantage?
 Secrecy prevents rampant gaming/ “SEO”
 ? Disclosure might disrupt the useful claims of
automation, neutrality, objectivity
 Do we have any rights to audit the algorithm? Should
we? Would it help any?
 Would it be disastrous for Google to disclose given:
 Value comes from the big data not the algorithm?
 The algorithm is constantly changed?
 Does Google KNOW what its algorithm is doing??
 Could DP data subject access rights help??
Data Protection Directive
Art 12: “every data subject [has] the right to obtain from the
controller..
- knowledge of the logic involved in any automatic processing of
data concerning him at least in the case of the automated
decisions referred to in Article 15(1)”
Art 15(1): every person has the right “not to be subject to a
decision which produces legal effects concerning him or
significantly affects him and which is based solely on automated
processing of data intended to evaluate certain personal aspects
relating to him, such as his performance at work, creditworthiness,
reliability, conduct, etc.”
Rec 41: “any person must be able to exercise the right of access to
data relating to him which are being processed, in order to verify in
particular the accuracy of the data and the lawfulness of the
processing”
..“this right must not adversely affect trade secrets or intellectual
property and in particular the copyright protecting the software”
Draft DP Regulation (Jan 16)
 New Art 15: Rights of access
 Right to obtain where personal data is being
processed..
 “(h) the existence of automated decision making
including profiling [see art 20] .. and at least in
those cases, meaningful information about the
logic involved, as well as the significance and
envisaged consequences of such processing..”
 *Rec 51: “This right should not adversely affect
the rights and freedoms of others, including trade
secrets or intellectual property…However, the
result of these considerations should not be that
all information is refused to the data subject…”
Final issues to consider
 Transparency
 Audit
 Comprehension (“algorithmists”)
 Control
 Remedies?
 Existing laws – discrimination, employment, health and safety, data
protection, liability for algorithms (eg stock mkt?)
 Better consents?
 Access to data laws
 Access to algorithm?
 “Human in the loop”
 Human rights infringement? Title to sue?
 Second and third party audit – trust seals etc
 Architecture control – privacy and equality by design?
 cf PbDesign literature – PIAs etc
Editor's Notes
  • #9: (Eg, Amazon does not measure “sales rank” of adult books) 2. How far can humans interfere, both overtly and in devising/tweaking the algorithms, before not “automated”? What interventions are justified/neutral?
  • #13: punish those not playing ball (eg Italian newspapers refusing to be spidered)
  • #14: How difficult would it be for Google to police this given they filter for copyright autosuggest (since 2012)?