Updated: Dec 16, 2021
Everyone knows Google.
Everyone uses Google.
Google seems harmless but are we sure Google is not influencing our personal beliefs?
“to use the Google search engine to obtain information about (someone or something) on the World Wide Web” – definition of the verb ‘google’, Merriam-Webster Dictionary.
In their article “‘Why do white people have thin lips?’ Google and the perpetuation of stereotypes via auto-complete search forms” (2013), Baker and Potts investigated Google’s role in reinforcing stereotypes about age, gender, race, and sexuality among its users. In other words, when you open your browser and type the terms “how women” into the search engine, Google will auto-complete your sentence with various suggestions, such as “how women think” or “how women test men”. Most of the suggestions proposed by Google’s algorithm frame women in terms of their relation to men; out of 10 results, 6 fall into this category.
The Internet is a wide space “where identities are co-constructed and negotiated and it is still unclear whether this technology would ever have been able to foster a non-racist, non-sexist and non-classist cultural model” (Baker & Potts, 2013, p. 187). Considering how immense this digital space is, it is impossible not to encounter the social issues discussed. However, Baker and Potts not only question Google’s position but also argue that the World Wide Web “is not as democratic as these interfaced communities of engaged and like-minded individuals” (ibid., 188).
“At the time of writing, the World Wide Web’s most widely used search engine is Google. This popularity may be attributed in part to Google’s prominent role at the forefront of a radical shift in emergent technology” – ibid.
As they discuss, in the beginning Web 1.0 offered this immense space for both producers and consumers to connect and access the flow of information, although only a few ways of exchanging information were possible at the time.
Nowadays, Web 2.0 is a “platform” (ibid.) designed to be used interactively. For example, if a piece of content is produced and published on the Internet by an Italian, another user from another country, with a different cultural background, can see, use, modify, or alter that same content in any way they want, giving it another meaning. The issue arises when a global community (understood here as a racial, gender, or age community) is constantly stereotyped through the content proposed in the search engine’s results. When that content is clicked on by thousands of users, it becomes popular on the Internet and appears higher in the results.
Consequently, when one starts to type “how women”, no matter the user’s actual interest, the search engine generates the types of results discussed previously, some of which are gender-role stereotypes.
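The feedback loop described above, where clicks make a suggestion more popular, which in turn attracts more clicks, can be sketched as a toy model. This is a hypothetical illustration, not Google’s actual system; the `ToySuggester` class and its data are invented for the example.

```python
from collections import Counter

# Hypothetical toy model of a popularity feedback loop (not Google's
# actual ranking): suggestions for a prefix are ordered by how often
# they have been clicked, and every click reinforces that order.
class ToySuggester:
    def __init__(self):
        self.clicks = Counter()

    def record_click(self, query):
        # Each click makes this query more likely to be suggested again.
        self.clicks[query] += 1

    def suggest(self, prefix, k=3):
        # Return the k most-clicked completions starting with the prefix.
        matches = [(q, c) for q, c in self.clicks.items() if q.startswith(prefix)]
        matches.sort(key=lambda qc: -qc[1])
        return [q for q, _ in matches[:k]]

s = ToySuggester()
for query, n in [("how women think", 5), ("how women test men", 3), ("how women vote", 1)]:
    for _ in range(n):
        s.record_click(query)
print(s.suggest("how women"))  # most-clicked completions come first
```

Under this model, whatever completion users already click most is shown first, regardless of whether it is a stereotype, which is the self-reinforcing dynamic the article criticises.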
“to believe unfairly that all people or things with a particular characteristic are the same” – ‘Stereotype’ definition, Merriam-Webster Dictionary.
To personalise results, in 2009 “Google altered its search facility by incorporating personalisation algorithms in order to build a profile of the sort of person who is using a particular machine” (ibid.); algorithms based on the “location where a person logs on from, the sort of browser they are using or the amount of time they take between entering a search term and clicking on a result” (Pariser, quoted in ibid.). In this way, the results are tailored to each user.
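Signal-based personalisation of the kind Pariser describes can be sketched as a simple re-ranking step. The scoring weights, field names, and data below are my own assumptions for illustration, not a description of Google’s implementation.

```python
# Hypothetical sketch of personalisation: re-rank results by a crude
# affinity score built from a user profile (location, prior interests).
def personalise(results, profile):
    """Return results re-ranked by affinity to the profile; illustrative only."""
    def score(result):
        s = 0
        if result.get("region") == profile.get("location"):
            s += 2  # content local to the user is boosted
        if result.get("topic") in profile.get("interests", set()):
            s += 1  # topics the user has shown interest in are boosted
        return s
    return sorted(results, key=score, reverse=True)

profile = {"location": "UK", "interests": {"sport"}}
results = [
    {"url": "a.com", "region": "US", "topic": "news"},
    {"url": "b.com", "region": "UK", "topic": "sport"},
]
print(personalise(results, profile)[0]["url"])  # the UK sport page wins
```

Even this crude version shows the ‘filter bubble’ effect: whatever matches the existing profile rises to the top, so the user keeps seeing more of what they already saw.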
Nowadays, the Internet does not only help people exchange information but also lets them monetise the content they produce. It has therefore come under the control of businesses and other authorities, even though it is ostensibly all about the client/user’s preferences. Baker and Potts (2013, p. 188) argue that the ‘filter bubble’ is dangerous for two reasons:
“it involves the use of people’s personal information without their knowledge”,
“it can result in people being enclosed in a loop, whereby they are only directed to sites which they have previously shown interest in”.
As a result, Google uses personal data to tailor the user’s experience, directing them to sites that the “filter bubble” assumes they might be interested in or find more enjoyable.
In this way, users are also directed to “sites that are already widely popular” (ibid.), those that appear at the top of the list. Even if users assume they have complete independence and freedom in what they access on the Internet, Google steers their searches, to some extent, towards the most popular content, the content that generates money.
Google Instant was launched in 2010, and its purpose is to use “auto-completion algorithms to attempt to guess what is being searched for, from the moment that the user begins to type” (Google, quoted in Baker & Potts, 2013, p. 189).
For example, if we type the letter ‘d’ into the Google search box, a menu will open and show various options that are the most popular based on certain aspects, such as the location of the user.
Subsequently, the user receives the most popular responses in a matter of seconds. These responses, in turn, shape the user’s interest, steering them to click on sites they might not have considered initially.
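The guess-as-you-type behaviour of the single-letter ‘d’ example can be illustrated with a prefix lookup over a sorted list of popular queries. The query list and function below are invented for the sketch; real systems use far richer ranking signals.

```python
import bisect

# Illustrative as-you-type completion over a sorted list of "popular"
# queries (hypothetical data, not Google's index).
QUERIES = sorted(["define", "dictionary", "dog breeds", "download", "weather"])

def complete(prefix, k=3):
    # Binary-search for the first query >= prefix, then collect the
    # consecutive entries that actually share the prefix.
    i = bisect.bisect_left(QUERIES, prefix)
    out = []
    while i < len(QUERIES) and QUERIES[i].startswith(prefix) and len(out) < k:
        out.append(QUERIES[i])
        i += 1
    return out

for typed in ["d", "do", "dow"]:
    print(typed, "->", complete(typed))
```

Each additional keystroke narrows the candidate list instantly, which is why suggestions appear from the very first letter typed.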
Baker and Potts (2013, p. 189) describe the Google medium as a “social practice” because these repetitive actions of searching and receiving quick, popular responses constitute “social identities of and relationships between people and groups of people” (Fairclough & Wodak, quoted in ibid.).
Scholars, including the ones mentioned already, argue that this “conversation” with a search engine raises questions of how the user understands those responses.
For example, if we type “why do Asians” in the search box, almost all results are based on stereotypes about this culture, such as eating too much rice or having less body hair than other cultures. In this way, the results stereotype not only Asian cultures but other cultures as well: if Asians have less body hair, the implication is that Europeans, for example, have more.
“Stangor (2000, pp. 11–13) summarises the outcomes of stereotyping as producing negative behaviours towards others (discrimination), positive behaviours towards those who are viewed as belonging to the same social group as us (in-group favouritism) and a tendency for people to perceive individuals from the same group as more similar than they really are (perceptual accentuation)” – Baker & Potts, 2013, p. 190.
In any case, Google does not invent the results. The reason those results are mostly about Asian cultural myths is that a great number of people have raised those questions in their conversations with the search engine. If the questions are frequent and more people click on the same results, those results will not disappear; they will only become more popular. In some cases, some of the search engine’s responses “have the capacity to cause offence” (ibid.) to the people involved.
In their article, Baker and Potts give the example of a man who sued Google because “auto-complete suggested words such as conman and fraud when his name was typed into the search box” (ibid.). Moreover, some users, especially young ones, might not recognise stereotypes and may adopt those results as personal beliefs about other cultures or genders in various contexts.
“Other users, who hold such stereotypes, may feel that their attitudes are validated, because the questions appear on a reputable search page such as Google” – ibid.
To conclude, it is impossible to know how many people typed those queries and received those responses, but it is certain that some Internet users may be influenced by these conversations with the search engine. For further research, Google Instant and the ‘filter bubble’ might offer insights for race, gender, sexuality, and cultural studies alike.
Baker, P., & Potts, A. (2013) ‘“Why do white people have thin lips?” Google and the perpetuation of stereotypes via auto-complete search forms’, Critical Discourse Studies, 10(2), pp. 187–204. DOI: 10.1080/17405904.2012.744320 (Accessed 20 October 2021).
This article is written as part of an assignment for the Digital Activism class in the MA Media and Creative Cultures program at the University of Greenwich.