
Technology Connections: Algorithmic Bias

An algorithm is a set of instructions used to perform a task. Algorithms are often taught in math class, but they are pervasive in modern life: they control stoplights, bus schedules, how we grade papers, and even how we sort books. An algorithm can be anything from a delicious cookie recipe to PageRank, Google's infamous Internet search algorithm. In library land, algorithms control most of the research we and our students do online.
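
To make the idea concrete, here is a minimal sketch in Python (our illustration, not anything from a real system) of a familiar library algorithm: shelving books in call-number order, written as explicit steps.

```python
# A hypothetical example: shelving books in call-number order, step by step.

def shelve_books(books):
    """Return the books in call-number order using simple, explicit steps."""
    shelved = []
    remaining = list(books)          # copy so we don't modify the caller's list
    while remaining:                 # step 1: while books are left on the cart...
        next_book = min(remaining, key=lambda b: b["call_number"])  # step 2: find the lowest call number
        remaining.remove(next_book)  # step 3: take it off the cart
        shelved.append(next_book)    # step 4: put it on the shelf
    return shelved

cart = [
    {"title": "Algorithms of Oppression", "call_number": "025.04"},
    {"title": "Weapons of Math Destruction", "call_number": "005.7"},
]
for book in shelve_books(cart):
    print(book["call_number"], book["title"])
```

The recipe is nothing more than a repeatable set of steps; that is all an algorithm is, whether it shelves books or ranks a billion web pages.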

Algorithms appear on their face to be unbiased; after all, they are just sets of steps. However, algorithms are only as good as their creators and produce results only as good as their ingredients. Returning to our cookie recipe: if the baker is a fan of crispy cookies, there is going to be more butter, and bakers with a sweet tooth will call for a little more vanilla. Similarly, different search engines have different algorithms and, as a result, serve up different results. You can try this yourself by searching the same terms in Google and DuckDuckGo. Each company has a different recipe for anticipating what you want, so slightly different results appear.
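
A toy sketch shows why (our invented data and scoring functions, not any real engine's code): two different ranking "recipes" applied to the same pages return those pages in different orders.

```python
# Two hypothetical ranking recipes applied to the same two pages.

pages = [
    {"title": "Local bookstore",  "popularity": 40, "keyword_matches": 5},
    {"title": "Big-box retailer", "popularity": 90, "keyword_matches": 2},
]

def rank_by_popularity(page):   # recipe A: favor heavily visited sites
    return page["popularity"]

def rank_by_relevance(page):    # recipe B: favor pages matching the query
    return page["keyword_matches"]

for recipe in (rank_by_popularity, rank_by_relevance):
    results = sorted(pages, key=recipe, reverse=True)
    print(recipe.__name__, "->", [p["title"] for p in results])
```

Recipe A puts the big-box retailer first; recipe B puts the local bookstore first. Same pages, same query, different recipe, different "answer."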

In our cookie recipe example, both end results are lovely. I have a ridiculous sweet tooth and would like them equally. In online research, however, those differences are not so innocuous. Online shopping, for example, is an area where results vary based on the algorithms a company creates. Google has changed its search algorithm several times to favor big businesses over small ones (Grind et al. 2019). As a result, the sites at the top of the results are the ones that have paid Google for that spot, whether or not that is the place you actually wanted to shop. Facebook allowed companies to draw a literal red line around areas in which they did not want their ads to appear. Specifically, Facebook designed an algorithm that allowed advertisers to exclude women in the workforce, moms of grade-school kids, foreigners, Puerto Rico islanders, or people interested in parenting, accessibility, service animals, hijab fashion, or Hispanic culture ("HUD Charges" 2019). So, algorithms can be helpful in locating information, but acknowledging their power to do harm is important when we teach students how to do online research responsibly.

Algorithms can have bias baked in by their creators, but algorithms also have the power to learn. When you are logged into a site, your search results may be "customized" or "optimized" based on what the algorithm already knows about you, and every search adds to its knowledge of which links you, personally, would like to see at the top of your results. Engineers also give algorithms datasets to teach them how to respond to a variety of queries, and those datasets carry both implicit and explicit biases.

Computer algorithms are dependent on the quality of their ingredients. Search engines learn about you based on what you search, when you search, and where you search, and they return results shaped by those pieces of information. The quality of the results is directly related to what the algorithm knows about the searcher, what other searchers clicked on previously, and what the company perceives to be the most profitable.
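
One simplified way personalization can work (our assumption for illustration, not any vendor's actual method) is to nudge results up the list when they match topics the searcher has clicked before, so two students typing the same query see different orderings.

```python
# A hypothetical personalization step: prior clicks boost matching results.

def personalize(results, click_history):
    """Re-rank results, boosting topics the user has clicked on before."""
    def score(result):
        base = result["relevance"]
        boost = click_history.get(result["topic"], 0)  # learned from past searches
        return base + boost
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Football roster",   "topic": "football", "relevance": 3},
    {"title": "Women's swim team", "topic": "swimming", "relevance": 3},
]

# A frequent football clicker sees football first; a swim fan sees swimming first.
print([r["title"] for r in personalize(results, {"football": 5})])
print([r["title"] for r in personalize(results, {"swimming": 5})])
```

Notice that nothing here is malicious; the bias emerges simply from which clicks the algorithm was fed.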

Librarians have several tools to combat algorithmic bias. First, we have the power of attention: we can point out and draw attention to bias when we see it. Second, we have the power to teach students about algorithmic hygiene. And last, we have the power of encouragement.

Attention is one of the most powerful tools at our disposal. Reporting the presence of bias to the companies that supply our databases is an important function. When we see racist captions or biased or incomplete search results, we should report them. For example, when we searched "Charlottesville" in one of our paid databases, eight pages of football players appeared before the first female athlete, despite the fact that two female swimmers competed in this past summer's Olympics. Bringing this to the attention of our sales representative led to a meeting with the content and engineering leads to work on getting better content to our students. The quality of the students' research will improve because we drew attention to the problem.

We also have the power to teach students how to maintain their algorithmic hygiene. Here are a few tips and tricks to help students clean up their algorithms:

  1. Delete cookies and search history regularly via "My Google Activity" dashboards and browser settings.
  2. Turn off the "private results" option.
  3. Use the incognito or private browsing tab with your "Signed out search activity" turned off.
  4. Spend time on a variety of news sites, as this helps keep your results politically balanced.

Knowing that algorithmic bias is embedded in search results changes the way we teach our students to do research. We can teach them to use search terms instead of asking full questions. We can help them recognize bias when they see it through critical reading skills and digital citizenship lessons. And we can help them understand not just that bias is embedded in our research tools, but that they have the power to change it.

As librarians, we also have the power to encourage. Bringing bias to the attention of its maker helps to repair damage already done, but encouraging our students to be proactive in technology, to get hired by companies like Google, Facebook, and Mozilla, and to change the way technology is made from the inside out will hopefully lead to lasting change.

Works Cited

Grind, Kirsten, Sam Schechner, Robert McMillan, and John West. "How Google Interferes with Its Search Algorithms and Changes Your Results." Wall Street Journal, November 15, 2019. https://www.wsj.com/articles/how-google-interferes-with-its-search-algorithms-and-changes-your-results-11573823753.

"HUD Charges Facebook with Housing Discrimination over Company's Targeted Advertising Practices." Press Release (March 28, 2019). US Department of Housing and Urban Development https://archives.hud.gov/news/2019/pr19-035.cfm.

About the Authors

Ida Mae Craddock, MEd, is the librarian at Burley Middle School in Albemarle County, VA. A graduate of Old Dominion University's Darden College of Education, her research focuses on maker education and libraries as hubs of innovation. She is also an adjunct professor of library science at Old Dominion University, where she was named a Darden Fellow. Winner of the Magna Award from the National School Boards Association and Virginia's 2019 Librarian of the Year, Ida Mae also co-authors the technology column for School Library Connection.

Heather Moorefield-Lang, EdD, is an associate professor for the Department of Library and Information Science at the University of North Carolina at Greensboro. To see more of Heather's work visit her institutional repository page at https://libres.uncg.edu/ir/uncg/clist.aspx?id=14828, email her at hmoorefield@gmail.com, or follow her on Twitter @actinginthelib.
