At the start of Black History Month 2019, Google designed its daily-changing homepage logo to include an image of Sojourner Truth, the great 19th-century African-American abolitionist and women's rights activist. But what would Truth say about Google's continual lack of care and respect toward people of colour?
While bringing more attention to Sojourner Truth is laudable, Google can do better. As a professor and researcher of digital cultures, I have found that a lack of care and investment by tech companies towards users who are not white and male allows racism and sexism to creep into search engines, social networks and other algorithmic technologies.
Whiteness over-represented
For instance, when I search for “woman” or “girl” via Google’s image search, the vast majority of results are pictures of thin white women (with the notable exceptions being a woman in a hijab, a white woman without a nose and a disabled girl).
People of colour are not completely absent, but they are underrepresented in Google's image search. For example, of the first 50 images returned when searching for "girl," 46 displayed white girls, three showed Asian girls and only one included a Black girl. These well-documented disparities in search-engine results are due in part to the dismally low number of Black women working at Google – only 1.2 per cent of its workforce.
To make matters worse, Google suggests that I narrow down my search results with adjectives ranging from "attractive" to "skinny" to "pregnant." In contrast, when searching for "men" (a category that also overrepresents whiteness), the first three suggested terms are "cartoon," "hair style" and "old." These terms may be descriptive, but they also replicate the stereotype that women are valued primarily for their beauty and reproductive organs, while men are valued for their personality and wisdom. Stereotyping is endemic to almost any digital technology that, like Google, aims to replicate how humans already sort information.
Social media researcher and UCLA professor Safiya Noble has written most extensively on this topic. In her book Algorithms of Oppression, she points out that Google suggests racist and sexist search results are the user’s fault since they simply reflect our own cultural assumptions and previous search histories. Noble also illustrates how Google’s algorithms skew their results in ways that prioritise advertisers and the white affluent audiences they are often trying to attract.
My research has shown how these biased practices are unthinkingly adopted from earlier industries and technologies dominated by white men.
After reading Noble's work, many of my students decided to test Google out themselves. They found that while the specific searches Noble performed now lead to reasonable results, many others do not. Google has clearly made changes since Noble conducted her research; it has not publicly stated why, but the timing suggests the changes were a response to her work.
What caught one student's eye was that Google's algorithms still appear to favour sexualised images of Latinas and of Asian women and girls, both in the search results and in the images displayed. The student's informal searches for these groups returned scantily clad women far more often than equivalent searches for white women and girls.
Suggestive autocompletion
The racism on Google is certainly not limited to the search result images it displays. It is also evident in its autocomplete function, which tries to guess what exactly you want to search for.
For years, searches for variations on "Black women" led to racist and sexist suggestions. Now Google often suggests nothing at all. For instance, when I type "why are Black women so" into Google's search bar – the search that is on the cover of Noble's book – it does not autocomplete at all.
This reduction of autocomplete functionality is Google’s typical response when journalists and scholars point to blatant racism on its site. At this moment, Google no longer autocompletes, among other things, the phrases “blacks are,” “asians are,” “homosexuals are,” “Latinos are,” “Muslims are,” and “Jews are” but does autocomplete “whites are,” “Latinas are,” “heterosexuals are” and “Canadians are.”
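Readers can informally probe these autocomplete gaps themselves. The minimal Python sketch below queries Google's unofficial suggest endpoint (suggestqueries.google.com) – an undocumented interface that may be changed, rate-limited or blocked at any time, and whose results vary by locale and moment. It is an illustration of how one might replicate these observations, not the method used in the research described above; the function name is my own.

```python
# A minimal sketch for informally checking which search prefixes
# Google is willing to autocomplete. Uses an unofficial, undocumented
# suggest endpoint that may change or be blocked at any time.
import json
import urllib.parse
import urllib.request


def autocomplete(prefix: str) -> list[str]:
    """Return Google's autocomplete suggestions for a search prefix."""
    url = (
        "https://suggestqueries.google.com/complete/search"
        "?client=firefox&q=" + urllib.parse.quote(prefix)
    )
    with urllib.request.urlopen(url) as resp:
        # The response is a JSON array: [prefix, [suggestion, ...], ...]
        data = json.loads(resp.read())
    return data[1]


if __name__ == "__main__":
    for prefix in ["whites are", "blacks are"]:
        print(prefix, "->", autocomplete(prefix))
```

Running it with pairs of prefixes lets you compare, at a given moment and in a given locale, which phrases Google completes and which it leaves blank.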
This tendency to reduce rather than improve functionality when it comes to minorities goes beyond autocomplete. For instance, when people began pointing out in 2015 that Google Photos was misidentifying Black people as gorillas, Google's response was simply to stop using "gorilla" as a label at all. Rather than fix its algorithm, Google chose to break it further: the service can now no longer identify gorillas either.
Long-term fixes
At the beginning of 2018, Wired magazine and others pointed out that Google still could not accurately identify or label either gorillas or Black people. Reducing functionality by simply turning off technologies may be a fine short-term response, but in the long term, the internet becomes a less welcoming and useful space for people of colour and women.
Considering Google's status as a monopoly and its desire to continually present itself as a company serving the public good, it is surprisingly quiet about these vital areas where it must improve. The same is true of virtually every other major tech company, though some, like Facebook, are beginning to bow to public pressure by kicking white nationalists off their sites.
This comes weeks after Facebook was heavily criticised and sued for its role in the Christchurch mosque attacks in New Zealand. Given the company's long history of censoring Black Lives Matter posts rather than those made by white supremacists, and its willingness to keep allowing Holocaust denial on its platform, this certainly feels like too little, too late.
Jonathan Cohn, Assistant Professor of Digital Cultures, University of Alberta.
This article first appeared on The Conversation.