Biased AI Can Be Bad for Your Health

Here’s how to promote algorithmic fairness

Sharona Hoffman
Wed, 04/07/2021 - 12:02

Artificial intelligence holds great promise for improving human health by helping doctors make accurate diagnoses and treatment decisions. But it can also lead to discrimination that harms minorities, women, and economically disadvantaged people.

The question is, when healthcare algorithms discriminate, what recourse do people have?

A prominent example of this kind of discrimination is an algorithm used to refer chronically ill patients to programs that care for high-risk patients. A 2019 study found that, in selecting patients for these beneficial services, the algorithm favored white patients over sicker African American patients. The reason: it used past medical expenditures as a proxy for medical need.

Poverty and difficulty accessing healthcare often prevent African Americans from spending as much money on healthcare as others. The algorithm misinterpreted their low spending as indicating they were healthy, and deprived them of critically needed support.
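This proxy failure can be sketched in a few lines of code. The following is a toy illustration with hypothetical patients and numbers (nothing here reflects the actual algorithm from the study): when a risk score is built from past spending rather than actual health status, a sicker patient who spent less is ranked below a healthier patient who spent more.

```python
# Hypothetical data: (name, chronic condition count, past spending in USD).
patients = [
    ("Patient A", 5, 3_000),  # sicker, but spent less due to access barriers
    ("Patient B", 2, 9_000),  # healthier, but spent more
]

def spending_proxy_score(patient):
    """Risk score based only on past spending -- the flawed proxy."""
    _name, _conditions, spending = patient
    return spending

def need_based_score(patient):
    """Risk score based on actual health status (condition count)."""
    _name, conditions, _spending = patient
    return conditions

# The spending proxy flags the healthier, higher-spending patient as highest risk.
flagged_by_proxy = max(patients, key=spending_proxy_score)[0]
# A need-based measure flags the sicker patient instead.
flagged_by_need = max(patients, key=need_based_score)[0]

print(flagged_by_proxy)  # Patient B
print(flagged_by_need)   # Patient A
```

The point of the sketch is that the code contains no explicit reference to race, yet it still discriminates, because the proxy variable (spending) is correlated with unequal access to care.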

 …


© 2025 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
"Quality Digest" is a trademark owned by Quality Circle Institute Inc.
