
AI Bias Harms Black Families and Businesses. Howard University Is Working to Change That

By: 3BL Media

SOURCE: The Mastercard Center for Inclusive Growth

DESCRIPTION:

For most people, artificial intelligence algorithms can be a mystery. Raw data goes in one end and answers come out the other. That information could be a credit decision, a prioritization line for organ donations or even what show you should watch next on Netflix.

But in the past few years one thing has become increasingly clear: these algorithms often disadvantage communities of color, especially when it comes to finance. Racial biases can seep into artificial intelligence when the teams of AI scientists and engineers building it lack diverse backgrounds. Through their choices around which data to use and which data matters more, they can create algorithms that systematically disadvantage Black people and small business owners.

Howard University, the historically Black university in Washington, D.C., is taking a novel approach to this problem, and it starts with future developers. Its new Center for Applied Data Science and Analytics, funded in part through a $5 million grant from Mastercard, will train the next generation of data scientists on how to eliminate biases in AI. The center’s new graduate data science program, which launches this fall, will teach students to address data equity issues.

“There are potential dangers when we don’t have diversity in the skills of individuals creating algorithms,” says Howard Provost Anthony Wutoh.

AI bias can show up in a host of ways. For example, a consumer might not qualify for a credit card because the approval algorithm looks for a history of previous credit card payments, while the applicant may have paid only in cash.

A bank might deny a loan applicant who wants to buy a new home based on their zip code because the AI model is built on the assumption that someone from a poorer neighborhood will default on a loan, "codifying past injustices into their model," writes Cathy O'Neil in her book "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy."

“This problem is critical to solve since biases in AI can effectively block access to vital services if not addressed,” says Salah Goss, senior vice president for Mastercard’s Center for Inclusive Growth, the company’s philanthropic hub that is administering the grant. “Howard University is taking a novel approach to solving this problem by investing in a new generation of professionals who can combat the issue at its source.”

Bias in AI can create unfairly negative results not only for Black communities but also for other communities of color and just about anyone using these systems. That means fixing AI biases can help create equity within finance and credit for Black consumers and business owners, while also creating positive downstream impact for other groups historically locked out of the formal economy.

“If you’re from an impoverished family, probably no one in your family had a credit card,” says William Southerland, a Howard biochemistry professor who will be running CADSA. “You may have been equally as responsible financially, but your evaluation may come out noisy.”

As Wutoh noted, AI biases extend well beyond the world of finance. Researchers found that a health care algorithm that predicts which patients need extra care incorrectly identified Black patients as healthier and needing less treatment, potentially jeopardizing their health. ProPublica reporters discovered that a model used in bail decisions flagged Black defendants as nearly twice as likely to commit another crime.

Building more equitable AI could help solve these types of biases and prevent the racial wealth gap – Black families in America have one-tenth the wealth of white families – from widening even further. In recent years, more companies, non-profits and even the U.S. government have taken on the topic of eliminating biases from AI. IBM, for example, created an AI ethics board, while U.S. lawmakers penned the Algorithmic Accountability Act.

The new center at Howard is taking an interdisciplinary approach to improving AI. Professors will come not only from computer science and engineering but also from the arts. By bringing a broader range of perspectives to building algorithms, the hope is to eliminate bias before it becomes a problem.

“We want to play a role in democratizing data science,” Southerland says.

The $5 million grant to Howard is part of $10 million in Mastercard giving to help Historically Black Colleges and Universities close the racial wealth and opportunity gap and create a more inclusive economy. That giving includes helping Morehouse College and Spelman College launch the Center for Black Entrepreneurship, which will develop cutting-edge entrepreneurial programming, thought leadership, networking, and academic and mentorship opportunities for aspiring Black entrepreneurs. The investment is the latest in Mastercard's In Solidarity initiative to combat racism and create equal opportunities for all.

“We want to use data science to answer some of the broader social questions we believe Howard can significantly impact, including around healthcare and economic disparities and the drive for criminal justice reform,” Wutoh says. “Data science touches everything, and it’s going to continue to be increasingly impactful in everything that we do.”

Photos courtesy of Howard University.

Check out more content from The Mastercard Center for Inclusive Growth

Tweet me: AI bias harms Black families and businesses. Read how Howard University is working to change that: https://mstr.cd/3p3spHO @CNTR4Growth

KEYWORDS: NYSE:MA, The Mastercard Center for Inclusive Growth, Howard University, racial bias, mastercard, #InclusiveGrowth, #InSolidarity, #DataScience, #DataForGood, #Data4Good, AI bias, #BlackHistoryMonth, #FinancialInclusion, African American Small Businesses

[Photo: two Black Howard University students in masks work on laptops]
