
IP 3: Algorithms


“At a time when state funding for public goods such as universities, schools, libraries, archives, and other important memory institutions is in decline in the US, private corporations are providing products, services and financing on their behalf. With these trade-offs comes an exercising of greater control over the information, which is deeply consequential for those already systematically oppressed…” (Noble, p. 123)



1. Explain in your own words what "content prioritization" (Noble, p. 156) means (give some examples) and how (in lay terms) content prioritization algorithms work.


Content prioritization describes how an algorithm (and by extension, the humans who program it) chooses what content to present to users: literally, which content should be given the highest priority. Without content prioritization algorithms, sifting through the sheer volume of online content would be impossible, so algorithms do the legwork of delivering what you want and weeding out content deemed irrelevant or low quality. In theory, that is. In practice, these systems rely on machine learning and data science: they parse data and rank information, websites, and media based on criteria that companies (and the human programmers within them) determine.
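To make that concrete, here is a minimal, hypothetical sketch in Python of what "ranking based on criteria" can look like. The content items, signals, and weights are all invented for illustration and don't reflect any real platform's system; the point is simply that a human-chosen scoring formula decides what rises to the top.

```python
# A minimal, hypothetical sketch of content prioritization. Every name,
# signal, and weight here is invented for illustration, not taken from
# any real platform. The key point: humans choose the criteria and the
# weights, and the ranking simply follows from those choices.

# Each piece of content carries some measurable signals.
content = [
    {"title": "Article A", "relevance": 0.9, "popularity": 0.2, "ad_revenue": 0.1},
    {"title": "Article B", "relevance": 0.5, "popularity": 0.9, "ad_revenue": 0.8},
    {"title": "Article C", "relevance": 0.7, "popularity": 0.6, "ad_revenue": 0.4},
]

# The company decides how much each signal matters.
weights = {"relevance": 0.5, "popularity": 0.3, "ad_revenue": 0.2}

def priority_score(item):
    """Combine the signals into a single number used to order results."""
    return sum(weights[signal] * item[signal] for signal in weights)

# Highest score is shown first -- this ordering *is* the prioritization.
for item in sorted(content, key=priority_score, reverse=True):
    print(f"{item['title']}: {priority_score(item):.2f}")
```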

Google is the king of content prioritization, but the way its search engine chooses to prioritize things can contribute to negative stereotypes. For example, Noble notes that in 2012, "searching for 'Latinas' or 'Asian women' brings about results that focus on porn, dating, and fetishization" (Noble, p. 159). She goes on to talk about how these already marginalized groups have little say or recourse over the negative narratives being presented about them to a global audience. At the time Noble published this book (2018), Google would likely have been considered the most powerful public content prioritization platform; however, I would offer TikTok as a more current contender for first place. Much of the reason for TikTok's massive global success is the uncanny ability of its algorithm to take a user's input (how long a video is watched, whether it is shared, whether a comment is made, etc.) and curate an incredibly personalized media selection for that individual user. Its appeal is that each user is essentially "training" their very own media delivery algorithm. However, from the beginning, TikTok users have been aware that the algorithm is flawed: it notably suppresses BIPOC creators, prioritizes conventionally attractive bodies, and often "shadow bans" creators by de-prioritizing their videos. In this way, the algorithm shapes what content users see. On the surface this may seem trivial, but what it is really doing is reinforcing a single, dominant narrative, one that is often racist, colonialist, sexist, and homophobic.
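As a rough illustration of the kind of signal-weighting described above, the following Python sketch shows how engagement signals (watch time, shares, comments) could feed a per-user interest score. TikTok's actual recommendation system is proprietary; every weight, signal name, and function here is an assumption made only to show the general idea of a feed "learning" from one user's behaviour.

```python
# A toy illustration of engagement-weighted recommendation. TikTok's real
# system is proprietary; these signals, weights, and the update rule are
# assumptions made purely to show how a feed could "learn" from one user.

from collections import defaultdict

# How much each kind of interaction is assumed to count as interest.
SIGNAL_WEIGHTS = {"watch_seconds": 0.05, "shared": 2.0, "commented": 1.5}

# Running interest score per topic, built up from one user's behaviour.
user_interest = defaultdict(float)

def record_interaction(topic, watch_seconds=0, shared=False, commented=False):
    """Update the user's interest in a topic from one video's engagement."""
    user_interest[topic] += SIGNAL_WEIGHTS["watch_seconds"] * watch_seconds
    user_interest[topic] += SIGNAL_WEIGHTS["shared"] * shared
    user_interest[topic] += SIGNAL_WEIGHTS["commented"] * commented

# The user watches a few videos...
record_interaction("cooking", watch_seconds=45, shared=True)
record_interaction("politics", watch_seconds=3)
record_interaction("cooking", watch_seconds=60, commented=True)

# ...and the next feed is ordered by the interests those interactions built.
for topic, score in sorted(user_interest.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{topic}: {score:.2f}")
```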


2. With control over the “largest digital repository in the world” (Noble, p. 187), how have Google’s content prioritization algorithms been “consequential for those already systematically oppressed”? How do they impact your professional life? (give specific examples and briefly discuss)


For those already systematically oppressed, prioritization algorithms act out racialized violence made worse by the guise of "neutrality". Noble gives the example of Kandis, "a small business owner who runs the only local African American hair salon within a predominantly White neighborhood, located near a prestigious college town in the United States" (Noble, p. 173). When Yelp became popular, it broke down the previous means of advertising: word of mouth. Its algorithms effectively require business owners to pay in order to be displayed high enough in the prioritized results to attract customers. They also fail to recognize that Black patrons often have a cultural distrust of things such as "checking in" or leaving written reviews. Obviously this is only one example of how a lack of diversity and cultural awareness on the part of programmers can have very negative real-world consequences, especially for those who are already systematically oppressed.

In my professional life, one way that I've had to contend with content prioritization algorithms is when doing research for unit and lesson planning. Google often seems to prioritize American content creators, who are working with very different curricula than Canadian educators. It often takes a considerable amount of "Google-fu" to find the kind of content that is meaningful to me. A recent example of this was the difficulty of finding Canadian-centric resources for Black History Month: a quick Google search returns almost exclusively American content, and the search must be significantly refined to surface Canadian-centric examples.


3. What are some ways PageRank impacts your personal life? (specific examples and briefly discuss) (How) can you impact PageRank? Explain.


A personal example of how PageRank affects my life is when my partner and I travel abroad. When you're in Mexico, for example, and Google "best restaurants in Puerto Vallarta", the first two to three pages will be filled with websites that have the best (and most) English reviews. We discovered very quickly during our travels that most of the people leaving the reviews that push these pages to the top of Google's rankings have no idea what they're talking about. They're tourists (often American) who generally don't understand the local cuisine or have the ability to make an informed comment. Now we don't even bother to Google restaurants when we travel; we either ask a local or wander until we find a place that is busy with local patrons. The algorithm just isn't set up to accommodate countries where many locals don't have access to the internet, or where reviews are left in other languages.

Noble outlined a few ways to affect PageRank, such as paying for advertising, but my favourite was "google bombing", which she describes as "the practice of excessively hyperlinking to a website (repeatedly coding HTML to link a page to a term or phrase) to cause it to rise to the top of PageRank, but it is also seen as a type of 'hit and run' activity that can deliberately co-opt terms and identities on the web for political, ideological, and satirical purposes" (Noble, p. 47). I'd be curious to know whether or not the current iteration of Google's algorithms has managed to "patch" this PageRank workaround.
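For readers curious about the mechanics, below is a simplified Python sketch of PageRank (basic power iteration with damping, ignoring refinements such as dangling-node handling) that shows why google bombing worked: adding many inbound hyperlinks to a target page raises its score. The page names and link structure are made up purely for illustration.

```python
# A simplified PageRank sketch (power iteration with damping), only to show
# why "google bombing" worked: adding many inbound links to a target page
# raises its score. Page names and link structure are invented.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1 / n for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Share of rank flowing in from every page that links here.
            incoming = sum(
                rank[src] / len(targets)
                for src, targets in links.items()
                if page in targets
            )
            new_rank[page] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# Before the "bomb": the target page has one inbound link.
web = {
    "target": [],
    "site_a": ["target"],
    "site_b": ["site_a"],
    "site_c": ["site_b"],
}
print("before:", round(pagerank(web)["target"], 3))

# After the "bomb": many pages now hyperlink to the target.
web["site_b"].append("target")
web["site_c"].append("target")
print("after: ", round(pagerank(web)["target"], 3))
```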




References

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. (Introduction, Chapter 1, Conclusion)

