
List of Algorithm Audits

A continually-updated list of studies from the CSCW 2021 paper, "Problematic Machine Behavior: A Systematic Literature Review of Algorithm Audits"

This repository is a work in progress and is open to community edits.

To add a study, simply create an issue or pull request.

Definitions

Algorithm Audit: an empirical study investigating a public algorithmic system for potential problematic behavior.

  • empirical study: an experiment or analysis (quantitative or qualitative) that generates evidence-based claims with well-defined outcome metrics. Purely opinion or position papers were excluded, although position papers with substantial empirical components were included.
  • algorithmic system: any socio-technical system influenced by at least one algorithm, including systems that also rely on human judgment and/or other non-algorithmic components.
  • public: an algorithmic system is public if it is used in a commercial context or another public setting such as law enforcement, education, criminal punishment, or public transportation.
  • problematic behavior: refers in this study to discrimination, distortion, exploitation, or misjudgment, as well as various types of behavior within each of these categories. A behavior is problematic when it causes harm (or potential harm). The ACM Code of Ethics gives as examples of harm "unjustified physical or mental injury, unjustified destruction or disclosure of information, and unjustified damage to property, reputation, and the environment."

Studies

Discrimination

The algorithm disparately treats or disparately impacts people on the basis of their race, age, gender, location, socioeconomic status, and/or intersectional identity. For example, an algorithm implicated in discrimination may systematically favor people who identify as male, or reinforce harmful stereotypes about elderly people.
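As an illustration only (not drawn from any specific study in the list), here is a minimal sketch of one common way an audit can quantify disparate impact: comparing favorable-outcome rates between groups using the selection-rate ratio and the conventional "four-fifths rule" heuristic. The group labels, data, and 0.8 threshold below are hypothetical assumptions, not part of the reviewed papers.

```python
# Minimal sketch: quantify disparate impact as the ratio of favorable-outcome
# rates between a protected group and a reference group. Group names, data,
# and the 0.8 ("four-fifths rule") threshold are illustrative assumptions.

def selection_rate(outcomes):
    """Fraction of favorable outcomes (1 = favorable, 0 = unfavorable)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(protected_outcomes, reference_outcomes):
    """Ratio of the protected group's selection rate to the reference group's."""
    return selection_rate(protected_outcomes) / selection_rate(reference_outcomes)

if __name__ == "__main__":
    # Hypothetical audit data: 1 = favorable algorithmic decision, 0 = unfavorable.
    group_a = [1, 0, 1, 1, 0, 1, 0, 1]   # reference group
    group_b = [0, 0, 1, 0, 1, 0, 0, 1]   # protected group

    ratio = disparate_impact_ratio(group_b, group_a)
    print(f"Disparate impact ratio: {ratio:.2f}")
    print("Potential disparate impact" if ratio < 0.8 else "No disparate impact flagged")
```

Many of the studies below use richer measures (regression controls, matched pairs, intersectional breakdowns), but a rate comparison of this kind is the basic shape of a discrimination audit.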

Pricing

Advertising

Search

Recommendation

Computer Vision

Criminal Punishment

Language Processing

Distortion

The algorithm presents media that distorts or obscures an underlying reality. For example, an algorithm implicated in distortion may favor content from a given political perspective, hyper-personalize output for different users, change its output frequently and without good reason, or provide misleading information to users.

Search

Mapping

Recommendation

Advertising

Language Processing

Exploitation

The algorithm inappropriately uses content from other sources and/or sensitive personal information from people. For example, an algorithm implicated in exploitation may infer sensitive personal information from users without proper consent, or feature content from an outside source without attribution.

Advertising

Search

Misjudgment

The algorithm makes incorrect predictions or classifications. Notably, misjudgment can often lead to discrimination, distortion, and/or exploitation, but some studies in the review focused on this initial error of misjudgment without exploring second-order problematic effects. An algorithm implicated in misjudgment may incorrectly classify a user’s employment status or mislabel a piece of political news as being primarily about sports, for example.

Criminal Punishment

Advertising
