Assessing risk, automating racism
Science (American Association for the Advancement of Science), 2019-10, Vol.366 (6464), p.421-422
2019

Details

Author(s) / Contributors
Title
Assessing risk, automating racism
Is part of
  • Science (American Association for the Advancement of Science), 2019-10, Vol.366 (6464), p.421-422
Place / Publisher
Washington: The American Association for the Advancement of Science
Year of publication
2019
Link to full text
Source
American Association for the Advancement of Science
Descriptions / Notes
  • A health care algorithm reflects underlying racial bias in society. As more organizations and industries adopt digital tools to identify risk and allocate resources, the automation of racial discrimination is a growing concern. Social scientists have been at the forefront of studying the historical, political, economic, and ethical dimensions of such tools (1–3). But most analysts do not have access to widely used proprietary algorithms and so cannot typically identify the precise mechanisms that produce disparate outcomes. On page 447 of this issue, Obermeyer et al. (4) report one of the first studies to examine the outputs and inputs of an algorithm that predicts the health risk, and influences the treatment, of millions of people. They found that because the tool was designed to predict the cost of care as a proxy for health needs, Black patients with the same risk score as White patients tend to be much sicker, because providers spend much less on their care overall. This study contributes greatly to a more socially conscious approach to technology development, demonstrating how a seemingly benign choice of label (that is, health cost) initiates a process with potentially life-threatening results. Whereas in a previous era the intention to deepen racial inequities was more explicit, today coded inequity is perpetuated precisely because those who design and adopt such tools are not thinking carefully about systemic racism.
Language
English
Identifiers
ISSN: 0036-8075
eISSN: 1095-9203
DOI: 10.1126/science.aaz3873
Title ID: cdi_proquest_miscellaneous_2309488170
