


The Ethics of Algorithmic Governance


Robert Domanski

1 June 2018


Transcript of The Ethics of Algorithmic Governance

What is Algorithmic Bias?

Social prejudices embedded mathematically into algorithms with widespread influence ("Weapons of Math Destruction")
Causes of Algorithmic Bias

1) The design of the algorithm itself
2) The code written to implement it
3) The data used to train it
Role of Government?
Fire-alarm vs. police-patrol oversight models

NYC Algorithm-monitoring Task Force
Algorithmic Bias in Policy
Transparency & Accountability

Algorithmic auditing

Analysis of algorithm design
Code review and testing of outputs
Scrutiny of selected training data
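The auditing steps above can be sketched in code. The example below is a minimal, hypothetical illustration of the "testing of outputs" step: disaggregating a model's error rate by demographic group to surface disparate impact. The function names and toy records are assumptions for illustration, not anything from the talk.

```python
# Hypothetical sketch of one algorithmic-auditing step:
# testing a model's outputs for disparate error rates across groups.

def false_positive_rate(records):
    """Share of truly negative cases that the model flagged positive."""
    negatives = [r for r in records if not r["actual"]]
    if not negatives:
        return 0.0
    flagged = sum(1 for r in negatives if r["predicted"])
    return flagged / len(negatives)

def audit_by_group(records, group_key="group"):
    """Disaggregate the false-positive rate by demographic group."""
    groups = {}
    for r in records:
        groups.setdefault(r[group_key], []).append(r)
    return {g: false_positive_rate(rs) for g, rs in groups.items()}

# Toy data: model predictions vs. actual outcomes for two groups.
records = [
    {"group": "A", "predicted": True,  "actual": False},
    {"group": "A", "predicted": False, "actual": False},
    {"group": "B", "predicted": False, "actual": False},
    {"group": "B", "predicted": False, "actual": False},
]
rates = audit_by_group(records)
# Group A: 1 of 2 true negatives flagged -> 0.5; Group B: 0 of 2 -> 0.0
```

A real audit would also cover the other two steps (design analysis and scrutiny of training data), which do not reduce to a single metric.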

Ethical Code of Conduct

New Hippocratic Oath

What are the ethical responsibilities of:

Artificial Intelligence
Individuals ("Citizen Data Scientists")
Algorithmic Bias
Algorithmic Governance
Towards a New Framework
The Ethics of Algorithmic Governance
Robert Domanski, PhD
City University of New York


Google Images search for "three black teenagers" vs. "three white teenagers"
Google Photos' image-labeling algorithm tagged photos of African-Americans as "gorillas" and "apes"
Predatory Advertising

Additional issues / questions
To what extent human oversight?

To what extent government regulation?
Search-engine autocomplete suggestions:

"Jews are..." → "not human"
"Women should..." → "stay at home", "be slaves", "be in the kitchen"
"Women should not..." → "have rights"
A first-of-its-kind initiative designed to counter systemic algorithmic bias within the city government's internal systems
Criminal Sentencing
Predictive Policing

Pretrial Integrity & Safety Act (2017)
"Pornification" of "black girls"
A study by ProPublica compared predicted recidivism to actual recidivism.

"Risk scores" were wrong 40% of the time

Black defendants were falsely labeled future criminals at almost twice the rate of white defendants
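The disparity described above can be expressed as a ratio of per-group false-positive rates. The counts below are made-up round numbers for illustration, not ProPublica's actual figures.

```python
# Illustrative arithmetic for the kind of disparity ProPublica reported:
# the false-positive rate is the share of defendants labeled high-risk
# who did not in fact reoffend. All counts are hypothetical.

def fpr(falsely_labeled_high_risk, total_non_reoffenders):
    return falsely_labeled_high_risk / total_non_reoffenders

black_fpr = fpr(450, 1000)   # hypothetical: 45% falsely labeled
white_fpr = fpr(230, 1000)   # hypothetical: 23% falsely labeled

disparity = black_fpr / white_fpr
# ratio of roughly 2: falsely labeled at almost twice the white rate
```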
Predictive Policing
"Strategic Subject Lists"

NYC's "Stop & Frisk" program

Of the nearly 5 million people stopped between 2003 and 2013, in some years nearly 90% were Black or Latino
Money Bail System
Black and Latino men are assigned bail amounts 35% and 19% higher, respectively, than white men
More Ideas
Increased diversity on engineering design teams

Increased involvement of social scientists with engineering design teams
Sample questions
Source: Electronic Frontier Foundation