{"id":9151,"date":"2024-10-31T19:26:08","date_gmt":"2024-10-31T19:26:08","guid":{"rendered":"https:\/\/www.tun.com\/home\/?p=9151"},"modified":"2025-03-14T19:31:11","modified_gmt":"2025-03-14T19:31:11","slug":"umich-researchers-develop-algorithm-to-counteract-racial-bias-in-ai-medical-diagnoses","status":"publish","type":"post","link":"https:\/\/www.tun.com\/home\/umich-researchers-develop-algorithm-to-counteract-racial-bias-in-ai-medical-diagnoses\/","title":{"rendered":"UMich Researchers Develop Algorithm to Counteract Racial Bias in AI Medical Diagnoses"},"content":{"rendered":"\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-uagb-blockquote uagb-block-e7eb3fc3 uagb-blockquote__skin-border uagb-blockquote__stack-img-none\"><blockquote class=\"uagb-blockquote\"><div class=\"uagb-blockquote__content\">Researchers at the University of Michigan have developed an algorithm to correct racial bias in medical data, ensuring AI systems can make fairer and more accurate predictions. 
This advancement promises to minimize health care disparities, significantly affecting how AI aids in diagnosing illnesses like sepsis.<\/div><footer><div class=\"uagb-blockquote__author-wrap uagb-blockquote__author-at-left\"><\/div><\/footer><\/blockquote><\/div>\n\n\n\n<div class=\"wp-block-group is-content-justification-space-between is-nowrap is-layout-flex wp-container-core-group-is-layout-0dfbf163 wp-block-group-is-layout-flex\"><div style=\"font-size:16px;\" class=\"has-text-align-left wp-block-post-author\"><div class=\"wp-block-post-author__content\"><p class=\"wp-block-post-author__name\">The University Network<\/p><\/div><\/div>\n<\/div>\n<\/div><\/div>\n\n\n\n<p>A new study from researchers at the University of Michigan has revealed significant racial disparities in medical testing, underscoring the urgency for equitable health care solutions. The researchers have identified a systematic bias that puts Black patients at a disadvantage, affecting the accuracy of AI models used in diagnosing critical conditions.<\/p>\n\n\n\n<p>Unveiled in two key studies, one <a href=\"https:\/\/journals.plos.org\/globalpublichealth\/article?id=10.1371\/journal.pgph.0003555\" title=\"\">published<\/a> in PLOS Global Public Health and another <a href=\"https:\/\/proceedings.mlr.press\/v235\/chang24e.html\" title=\"\">presented<\/a> at the International Conference on Machine Learning in Vienna, Austria, this research demonstrates how medical data often used to train AI is biased against Black patients. 
The team discovered that Black patients with identical medical conditions are less likely than their white counterparts to receive essential diagnostic tests.<\/p>\n\n\n\n<p>&#8220;If there are subgroups of patients who are systematically undertested, then you are baking this bias into your model,&#8221; corresponding author Jenna Wiens, an associate professor of computer science and engineering at the University of Michigan, said in a <a href=\"https:\/\/news.umich.edu\/accounting-for-bias-in-medical-data-helps-prevent-ai-from-amplifying-racial-disparity\/\" title=\"\">news release<\/a>. &#8220;Adjusting for such confounding factors is a standard statistical technique, but it&#8217;s typically not done prior to training AI models. When training AI, it&#8217;s really important to acknowledge flaws in the available data and think about their downstream implications.&#8221;<\/p>\n\n\n\n<p>The research found that white patients were tested up to 4.5% more often than Black patients with similar medical needs. This bias was evident in data from Michigan Medicine in Ann Arbor and the Medical Information Mart for Intensive Care (MIMIC) dataset from Beth Israel Deaconess Medical Center in Boston.<\/p>\n\n\n\n<p>To tackle this issue, the team developed a novel algorithm designed to identify patients who, despite being untested, were likely suffering from severe conditions based on their race and vital signs. 
This allowed AI models to compensate for the bias without excluding any patient records.<\/p>\n\n\n\n<p>&#8220;Approaches that account for systematic bias in data are an important step towards correcting some inequities in health care delivery, especially as more clinics turn toward AI-based solutions,&#8221; Trenton Chang, a doctoral student in computer science and engineering at the University of Michigan and the first author of both studies, said in the news release.<\/p>\n\n\n\n<p>In tests on simulated data, the algorithm significantly improved the accuracy of AI models in diagnosing illnesses like sepsis, achieving accuracy rates comparable to models trained on unbiased datasets.<\/p>\n\n\n\n<p>This work charts a path for health care AI that mitigates existing biases rather than perpetuating them. By addressing a crucial flaw in current AI training practices, bias-correcting algorithms like this one could lead to more equitable health care outcomes and inspire the development of fairer, more inclusive AI systems, a significant step towards a more just health care system as more clinics adopt AI-based solutions.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A new study from researchers at the University of Michigan has revealed significant racial disparities in medical testing, underscoring the urgency for equitable health care solutions. The researchers have identified a systematic bias that puts Black patients at a disadvantage, affecting the accuracy of AI models used in diagnosing critical conditions. 
Unveiled in two key [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"single-no-separators","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[8],"tags":[76],"class_list":["post-9151","post","type-post","status-publish","format-standard","hentry","category-ai","tag-university-of-michigan"],"acf":[],"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"The University Network","author_link":"https:\/\/www.tun.com\/home\/author\/funky_junkie\/"},"uagb_comment_info":0,"uagb_excerpt":"A new study from researchers at the University of Michigan has revealed significant racial disparities in medical testing, underscoring the urgency for equitable health care solutions. The researchers have identified a systematic bias that puts Black patients at a disadvantage, affecting the accuracy of AI models used in diagnosing critical conditions. 
Unveiled in two key&hellip;","_links":{"self":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/9151","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/comments?post=9151"}],"version-history":[{"count":8,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/9151\/revisions"}],"predecessor-version":[{"id":9253,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/9151\/revisions\/9253"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/media?parent=9151"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/categories?post=9151"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/tags?post=9151"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}