Question Your Search: How Algorithms Perpetuate Inequality
Whether you know it or not, algorithms power many of the online services we rely upon, customizing the content we see based on our personal information, actions, and interactions online. But these algorithms are imperfect and can distort, exclude, or misrepresent our view of others, ourselves, and the world.
You may have heard stories of how algorithms create racist and sexist imagery, determine hiring practices, manage merchant ships and supply chains, cause issues in law enforcement, impact vaccine distribution, and impact what you see on social media.
The weakness in this system is complex: algorithms run on logic written by humans, and humans have biases. But algorithms are not the sole determinant of our unique experiences on the internet, or of why we may see inequality reinforced by online platforms. Much of what we find online is heavily shaped by the economics that have allowed the internet to thrive. Many of our favorite search engines, sites, games, and apps are free to use, but they are not free to operate. They are funded by those who have money and need an audience. Those parties get the privilege of shaping much of what we see, what we see first, and what we see most, in the form of advertising, sponsored posts, and, most importantly, priority positioning before our eyes. They all have an agenda: advertisers want to make you aware of their wares and convince you to buy them. Others may have more insidious motivations, such as promoting racism or peddling conspiracy theories.
What does this have to do with you and your everyday digital life? Everything.
In our 16th episode of Managing Family Life Online Webinar Series, “How Algorithms Perpetuate Inequality,” we spoke to one of the nation’s leading experts Dr. Safiya Noble, Associate Professor, UCLA Department of Information Studies and Co-Founder, UCLA Center for Critical Internet Inquiry (C2i2) about racism, sexism, and the role of online algorithms in perpetuating them.
Although the discussion covered a broad range of societal concerns, a few key takeaways include:
- Search engines are driven by advertising, not popularity. A person’s first instinct is to believe that the top results in an online search are the most popular or most searched-for items; unfortunately, this is not the case. We believe this because our most common searches are usually safe, banal topics, such as finding the closest coffee shop or restaurant, and those results tend to be accurate. However, when you start looking into deeper topics, the results are driven by keywords that companies or individuals have paid for to push certain sites to the top of your search. This can lead to the spread of biased content or misinformation, because sources presented to us first can seem like the truth.
- Instant gratification is not instant knowledge. We need to return to a place where knowledge is contested. As a society, we need to break the recent mindset that the right answer is a simple Google search or Wikipedia review away, and that what you need to know, and can trust to be accurate, is available at the push of a button. This shortsightedness compounds the harm to the historically oppressed, and helps explain why the legacies, lessons, and viewpoints of underrepresented groups, or of those without the means, never make it into the larger digital lexicon. We collectively need to slow down, reflect on what we are learning and digesting, think more critically, and move beyond the ‘fast food nation’ approach to digital education.
- There are alternatives to your search. First and foremost, the libraries of America are still around and, when open (some may be closed due to COVID concerns), serve as an incredible resource. For anything deeper than a trivial search, the recommendation is still to consult legitimate, validated resources. Consider services like DuckDuckGo, which is less likely to serve content based on tracking your behavior (as Google does) and more likely to deliver content based solely on the term you’re searching for. You can also try logging out and using Incognito or Private mode in browsers such as Chrome or Safari, which limits their ability to track your searches and makes them less likely to rely on algorithms to serve you results.
To hear more on this topic, listen to the full conversation with Dr. Noble.
You can also find our most recent list of tips to help here. Additional helpful resources include:
- Algorithms of Oppression, by Dr. Safiya Umoja Noble
- “Coded Bias”: an exploration of the fallout from MIT Media Lab researcher Joy Buolamwini’s discovery of racial bias in facial recognition algorithms.
- Time 100: Tristan Harris and Safiya U. Noble on the impact of algorithms, moderated by the Duke and Duchess of Sussex
- UCLA Center for Critical Internet Inquiry