To draw attention to the ongoing problem of everyday sexism, the United Nations organisation UN Women portrays the online search engine Google as an objective guarantor of truth. The reactions are divided.

Produced by the agency Memac Ogilvy & Mather Dubai and released in March 2013, the video The Autocomplete Truth served as a contribution to the eponymous women’s rights campaign of the ‘United Nations Entity for Gender Equality and the Empowerment of Women’, UN Women for short. The video’s message: the 21st century is celebrated as the century of women, but appearances are deceptive. Even a simple Google search reveals how much prejudice and discrimination women are still exposed to in everyday life. On closer inspection, however, the understanding of Google as an impartial ‘truth machine’ turns out to be problematic.

The video begins with a review of historical milestones in the women’s rights movement over the past 150 years, accompanied by sentimental piano music. For example, a protest march by the National Woman Suffrage Association (NWSA) in 1869 is shown, followed by the 1893 referendum in which Colorado became the first US state to grant women the right to vote by popular vote. Also featured are Valentina Tereshkova, the first woman in space; Nandini Satpathy, who in 1966 became one of the first women in India to be elected to high political office; and the athlete Sarah Attar, who became the first woman to represent Saudi Arabia at the 2012 Olympics.

But then, in 2013, the steady progress suddenly comes to a halt. A simulated Google search query fades in – based, the video claims, on actual suggestions by the autocomplete function. The principle: when a user starts typing in the Google search bar, a drop-down list displays a series of suggestions that most likely match the user’s interest. The Google algorithm derives these suggestions from the popularity of related search queries. In the video, the incomplete entry ‘Women should’ is followed by suggestions such as ‘Women should stay at home’, ‘Women should be slaves’, and ‘Women should be in the kitchen’. Instead of selecting one of these suggestions, however, the query is completed by hand: ‘Women should be seen as equal by now.’
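The principle the video relies on can be illustrated with a toy sketch. The following Python snippet is an assumption-laden simplification, not Google’s actual system (which also weighs personalisation, freshness, and safety filters); it merely shows the core idea of ranking completions for a typed prefix by query popularity, using an entirely fictitious query log:

```python
# Toy model of popularity-ranked autocomplete.
# The query log and its counts are fictitious, for illustration only.
from collections import Counter

query_log = Counter({
    "women should be seen as equal by now": 120,
    "women's suffrage timeline": 80,
    "world cup schedule": 300,
    "weather today": 500,
})

def autocomplete(prefix: str, top_n: int = 4) -> list[str]:
    """Return the top_n most popular logged queries starting with prefix."""
    matches = [(q, n) for q, n in query_log.items()
               if q.startswith(prefix.lower())]
    matches.sort(key=lambda item: -item[1])  # most popular first
    return [q for q, _ in matches[:top_n]]

print(autocomplete("wo"))
# The suggestion list simply mirrors whatever queries dominate the log --
# which is exactly why the video treats it as a window onto mass opinion.
```

The design point worth noting: such a ranker has no notion of whether a query is endorsed, ironic, or historical; it only counts. That limitation is precisely what the critique below turns on.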

The message is clear: society can boast of selective achievements in the field of equality, but one only has to scratch the surface with the right tools and outdated role models emerge. Google, of all things – the symbol of a new, progressive net age – reveals an antiquated and still omnipresent sexism. As the word ‘truth’ in the video’s title already signals, the search engine is understood here as a kind of ‘truth generator’: an unbiased algorithmic nexus that mathematically proves how sexism is reflected in search queries made worldwide. Although everyone on the net is an individual, the claim goes, the thinking of the masses as a whole can be quantified.

This pattern of argumentation seems intuitively plausible. After all, the great strength of the clip is that it is based on data anyone can verify at home and that it leaves out individual expressions of opinion. From the outset, the video thus counters the objection that it merely presents the ideologically dressed-up views of eternally dissatisfied, man-hating feminists. For algorithms, the video suggests, have no political agenda, unlike activists. If the search engine’s corpus of data points to a problem, then that problem exists.

However, the interpretation of these data must be viewed critically. The video rests on the problematic assumption that a search query (‘women belong in the kitchen’) automatically implies agreement (‘women actually belong in the kitchen’). Yet the connection between entering a search query and the personal convictions of Google users cannot be reconstructed beyond doubt. In principle, it is quite possible – however likely or unlikely – that some queries such as ‘Women should not speak in church’ were motivated solely by a historical interest in how the oppression of women has been legitimised.

Nevertheless, provided that the data is correct, the video does show that the overwhelming majority of queries containing the phrase ‘Women should’ are characterised by sexist formulations. This fact alone can be read as a problem. It remains unclear, however, how many of the thousands upon thousands of daily Google searches actually begin with ‘Women should’. For the experiment re-enacted in the video is a circular one: when people google antiquated role models – that is, claims about what women should do by their very nature (‘Women should’) – it is not surprising that they are confronted with precisely such views. The very phrase ‘Women should’ presupposes outdated gender ideals, which – unsurprisingly – are then also reflected in the autocomplete suggestions.

Thus, while the phrase completions provide information on possible sexist views among users who google ‘Women should’, they say nothing about the totality of Google users. Here, at the latest, the lack of clear figures and usage statistics becomes apparent. How many users never enter such search terms in the first place – and are therefore not represented by the autocomplete algorithm – remains unknown. Although the accusation of widespread sexism can by no means be dispelled this way – on the contrary – it is at least doubtful that the Google algorithm represents a meaningful measuring instrument.

The Autocomplete Truth was a great success. In the weeks following its release, the video became a viral hit on social networking sites, and by the end of 2016 it had been picked up nearly 600 times by a wide range of media outlets worldwide, including the BBC, CNN, the Huffington Post, and The Guardian (see Ben-Yehoshua 2016). The hashtag #womenshould spread on Twitter across 50 countries (ibid.). According to estimates, the campaign reached nearly 1.2 billion people, and it was awarded several prizes.

In the comment sections, the video sparked controversy. While some users experimented with Google themselves and arrived at similar results, others stated that they could not reproduce them and questioned the representativeness of the campaign. Some users also pointed out that the search term ‘Men should’ likewise led to suggestions with negative connotations. Today, search phrases such as ‘women should’, ‘women cannot’, ‘women shouldn’t’ or ‘women need to’ no longer generate automatic completions. Whether the campaign’s success prompted Google to intervene as a form of damage limitation has not been confirmed by official sources.

Angela Pelivan