Auditing Algorithms @ Northeastern


This site is the homepage for the Algorithm Auditing Research Group within the Khoury College of Computer Sciences at Northeastern University. Here, you will find explanations of and links to our work, as well as open-source data and code from our research.

Today, we are surrounded by algorithmic systems in our everyday lives. Examples on the web include Google Search, which personalizes search results to try to surface more relevant content; Amazon and Netflix, which recommend products and media; and Facebook, which personalizes each user's news feed to highlight engaging content. Algorithms are also increasingly appearing in real-world contexts, like surge pricing for vehicles from Uber; predictive policing algorithms that attempt to infer where crimes will occur and who will commit them; and credit scoring systems that determine eligibility for loans and credit cards. The proliferation of algorithms is driven by the explosion of Big Data that is available about people's online and offline behavior.

Although there are many cases where algorithms are beneficial to users, scientists and regulators are concerned that they may also harm individuals. For example, sociologists and political scientists worry that online Filter Bubbles may create "echo chambers" that increase political polarization. Similarly, personalization on e-commerce sites can be used to implement price discrimination. Furthermore, algorithms may exhibit racial and gender discrimination if they are trained on biased datasets. As algorithmic systems proliferate, the potential for (unintentional) harmful consequences to users increases.
