Duke professors work to bring equity to computing, tackle algorithmic bias

Faculty members across Duke’s computer science and electrical and computer engineering departments have been working to promote equity in their fields.

Equity problems in computing can arise at the level of software development itself. Just as an author’s implicit beliefs and biases can manifest in literature, the way a programmer constructs a predictive model can be shaped by their own beliefs and biases. The lack of diversity in the field means that algorithms can inadvertently perpetuate stereotypes.

This phenomenon – called algorithmic bias – can have serious consequences given the public’s reliance on predictive models in many critical areas. For example, a 2019 study found that an algorithm used to estimate future medical costs for more than 200 million hospital patients across the US favored white patients over Black patients.

It’s important that high-stakes models don’t put marginalized groups at a disadvantage, according to Jian Pei, a professor of electrical and computer engineering.

Eliminating this bias became one of Pei’s main goals. In dealing with algorithmic bias, Pei uses a three-pronged approach: awareness, effective measurement of bias and the use of fairness-aware techniques.

“We need to use cases and examples to demonstrate in detail the harm that a lack of fairness, diversity and equality in algorithm design can cause. Once people know about these issues, the next question is how we can measure them,” said Pei. “People don’t know how to use those concepts, or how to deal with those ideas. It is important to create quantitative measures, such as how unfair [a model] is, or how close it is to being fair, and whether trade-offs are possible.”
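
As a rough sketch of what one such quantitative measure can look like, the Python snippet below computes the demographic parity difference, a standard textbook fairness metric measuring the gap in favorable-prediction rates between two groups. The data are invented for illustration, the metric is not one Pei names in the article, and the example assumes NumPy is available.

import numpy as np

def demographic_parity_difference(y_pred, group):
    """Gap in favorable-prediction rates between two groups.

    0.0 means the model predicts the favorable outcome at the same
    rate for both groups; larger values indicate greater disparity.
    """
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    rate_0 = y_pred[group == 0].mean()  # favorable rate, group 0
    rate_1 = y_pred[group == 1].mean()  # favorable rate, group 1
    return abs(rate_0 - rate_1)

# Hypothetical predictions (1 = favorable outcome) and group labels.
preds = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(demographic_parity_difference(preds, groups))  # 0.2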

Pei explained that fairness should be considered at every stage of the data science life cycle, including how data is collected, how it is sampled for analysis and how results are evaluated after processing. Patching outcomes that favor certain groups over others is not enough; rather, models should be designed to make unbiased predictions in the first place.
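
To make the sampling stage concrete, here is a minimal hypothetical illustration using scikit-learn (the article names no tools, so the library choice is an assumption): stratifying a train/test split on a protected attribute keeps a minority group from being under- or over-represented in the data used to evaluate a model.

from sklearn.model_selection import train_test_split

# Hypothetical dataset in which group 1 is a 20% minority.
X = [[i] for i in range(100)]
group = [0] * 80 + [1] * 20

# A purely random split can distort the minority group's share of the
# held-out set; stratifying on the protected attribute preserves its
# proportion in both splits.
X_train, X_test, g_train, g_test = train_test_split(
    X, group, test_size=0.25, stratify=group, random_state=0)
print(sum(g_test) / len(g_test))  # 0.2, matching the population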

“We must strengthen our methods in ways that prevent future problems. This is not just about fixing the existing problem; we need to improve the whole system going forward,” said Pei.

Pei is not alone in his efforts. Cynthia Rudin, the Earl D. McLean, Jr. Professor of computer science, electrical and computer engineering, mathematical sciences and biostatistics & bioinformatics, has emphasized interpretability and transparency in her work. Rudin has received acclaim for her research examining bias in black-box models used in health care and criminal justice settings, algorithms whose inner workings are hidden.

“In criminal justice, we are trying to show that black box models can be replaced by interpretable models without losing accuracy, making the models less susceptible to the unobservable selection and data entry errors that black boxes have,” Rudin wrote in an email.
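
Rudin’s published work centers on purpose-built interpretable models such as rule lists and scoring systems; as a loose stand-in for the comparison she describes, the sketch below pits a depth-limited decision tree, whose full logic can be printed and audited, against a black-box ensemble on synthetic data. The dataset and model choices are illustrative assumptions, not her methodology, and the example assumes scikit-learn is installed.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)

black_box = RandomForestClassifier(n_estimators=200, random_state=0)
interpretable = DecisionTreeClassifier(max_depth=3, random_state=0)

# Compare cross-validated accuracy of the two models.
print(cross_val_score(black_box, X, y).mean())
print(cross_val_score(interpretable, X, y).mean())

# Unlike the forest, the tree's complete decision logic can be
# printed and inspected line by line.
print(export_text(interpretable.fit(X, y)))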

As a professor, Rudin has approached equity in the classroom from several angles. For example, she has made her courses more accessible by posting materials online, both for students who cannot attend class and for members of the general public. Rudin also revised her curriculum last year to include interpretability as a core concept.

“As soon as we start working on something high-stakes where errors matter, like health care decisions or loans, it’s very important that we understand what predictive models are doing,” Rudin wrote. “I decided that if you’re going to be an expert in machine learning, you should know something about interpretability.”

Rudin said her students were “the first students to be taught these things in a classroom anywhere in the world.”


Gautam Sirdeshmukh | Staff Reporter

Gautam Sirdeshmukh is a Trinity senior and staff reporter for the news department. He was previously the health and science editor for Volume 117 of The Chronicle.


