Bias in Computer Systems

Citation: Batya Friedman, Helen Nissenbaum (1996) Bias in Computer Systems. ACM Transactions on Information Systems 14(3), 330–347.
DOI (original publisher): 10.1145/230538.230561
Download: http://vsdesign.org/publications/pdf/64 friedman.pdf

Summary

Authors "use the term bias to refer to computer systems that systematically and unfairly discriminate against certain individuals or groups of individuals in favor of others ... joined with an unfair outcome."

The authors examined 17 computer systems from diverse fields and found that three categories of bias emerged:

  • Preexisting bias, in which a computer system embodies bias that already exists outside the system, entering the design either explicitly or implicitly:
    • originating from individuals with significant input into the system design;
    • originating from society at large.
  • Technical bias, arising from technical constraints or considerations:
    • Computer Tools
    • Decontextualized Algorithms
    • Random Number Generation (a minimal sketch of this failure mode follows the list)
    • Formalization of Human Constructs
  • Emergent bias, arising in the context of use:
    • New Societal Knowledge
    • Mismatch between users and design (different expertise or values)
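
The paper itself contains no code. As a minimal illustrative sketch (ours, not the authors') of the "Random Number Generation" source of technical bias, the Python fragment below shows the classic modulo skew: reducing a uniform random byte with "%" over a range that does not divide 256 evenly makes low-numbered outcomes slightly but systematically more likely, for example when "randomly" assigning cases to reviewers.

    import random
    from collections import Counter

    def biased_pick(n_choices: int) -> int:
        # Naive reduction of a uniform byte (0-255) with modulo.
        # If n_choices does not divide 256, low residues are favored.
        return random.randrange(256) % n_choices

    def unbiased_pick(n_choices: int) -> int:
        # The standard fix, rejection sampling: discard bytes from the
        # uneven tail so that every residue is equally likely.
        limit = 256 - (256 % n_choices)
        while True:
            byte = random.randrange(256)
            if byte < limit:
                return byte % n_choices

    # With 7 choices, residues 0-3 each arise from 37 of the 256 byte
    # values, residues 4-6 from only 36, so the same outcomes are
    # favored on every draw.
    print(sorted(Counter(biased_pick(7) for _ in range(700_000)).items()))

The skew per draw is tiny, but it is systematic rather than random, which is precisely the property the authors use to distinguish bias from mere error.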

The authors describe three of the examined systems in detail.

Remedying bias in computer systems involves:

  • Identifying/diagnosing bias (a simple diagnostic sketch follows this list)
  • Correcting and avoiding bias
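
The paper prescribes no particular diagnostic procedure, so the sketch below is only an illustration under assumed inputs: given a hypothetical audit log of (group, outcome) records from a deployed system, it compares favorable-outcome rates per group. A persistent gap does not by itself prove unfairness, but it is the kind of systematic disparity that should trigger the scrutiny the authors call for.

    from collections import defaultdict

    # Hypothetical audit log of (group, got_favorable_outcome) records;
    # in a real audit these would come from the system under scrutiny.
    records = [
        ("group_a", True), ("group_a", True), ("group_a", False),
        ("group_b", True), ("group_b", False), ("group_b", False),
    ]

    def favorable_rates(records):
        # Tally favorable outcomes per group and return each group's rate.
        totals = defaultdict(int)
        favorable = defaultdict(int)
        for group, ok in records:
            totals[group] += 1
            favorable[group] += int(ok)
        return {g: favorable[g] / totals[g] for g in totals}

    print(favorable_rates(records))  # group_a: ~0.67, group_b: ~0.33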

The design of computer systems must be scrutinized not in isolation but with knowledge of the relevant biases at work in the wider world.

Designers must have professional backing for their efforts to minimize bias in systems.

Not all bias can be addressed within the design of the computer system itself.

In the conclusion:

"As with other criteria for good computer systems, such as reliability, accuracy, and efficiency, freedom from bias should be held out as an ideal."

Theoretical and Practical Relevance

Quote from paper:

Computer systems, for instance, are comparatively inexpensive to disseminate, and thus, once developed, a biased system has the potential for widespread impact. If the system becomes a standard in the field, the bias becomes pervasive. If the system is complex, and most are, biases can remain hidden in the code, difficult to pinpoint or explicate, and not necessarily disclosed to users or their clients. Furthermore, unlike in our dealings with biased individuals with whom a potential victim can negotiate, biased systems offer no equivalent means for appeal.