Are SEOs Responsible for Google Search Bias?

In 2016, the UN declared internet access a human right.

This decision was taken on the understanding that the Internet is a tool allowing everyone to access information at an affordable price and to express themselves.

This resolution has prompted discussions about access in terms of infrastructure, where fiber optic cables are installed or upgraded, or allowing uninterrupted access during civil unrest and other emergencies.


While these are valid and important points, the internet is not just wires and screens, and the information viewed can be changed based on algorithms.

The Internet has become so integrated into our lives that it is now part of our social infrastructure (much like medical or educational services).

It is well documented that there are biases in medical and educational spaces, including access to care and quality of care, but what about search results?

Are they fair? Are they representative of the world around us? Or are they causing more harm than good?

What’s in an Algorithm?

In digital marketing, “algorithm” is a term that is thrown around daily, whether or not anyone understands what it means. Each platform has one (or more), and our job is to try to satisfy them.

An algorithm is a procedure followed when a system performs a calculation.

This process takes input and uses formulas, rules, or other problem-solving operations to produce output.

For search, this means that queries entered into a search box are the input and the SERP (search engine results page) is the output.
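To make the input-to-output idea concrete, here is a toy sketch of a "search algorithm" that scores documents by simple term matching. This is a hypothetical illustration only, nothing like Google's actual ranking systems:

```python
# Toy illustration of "query in, ranked results out" -- NOT Google's algorithm.
def search(query, documents):
    """Score each document by how often it contains the query terms,
    and return titles ranked from best to worst match."""
    terms = query.lower().split()
    scored = []
    for title, text in documents.items():
        score = sum(text.lower().count(term) for term in terms)
        if score > 0:
            scored.append((score, title))
    # Highest score first: this ordering IS the "SERP" of our toy system.
    return [title for score, title in sorted(scored, reverse=True)]

# Hypothetical mini-index of three pages.
docs = {
    "Camera reviews": "best camera gear for professional photographers",
    "Travel blog": "travel photos from last summer",
    "Portfolio tips": "how professional photographers build a portfolio",
}
print(search("best photographers", docs))  # ['Camera reviews', 'Portfolio tips']
```

Even this trivial example shows the key point: the output depends entirely on what the input corpus looks like and what the scoring rule rewards.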

This is a very simplistic explanation of what is happening. Google uses several algorithms in combination with AI (Artificial Intelligence) and machine learning.

Dissecting the whole system would be far beyond my scope and beyond the purpose of this article.

The Canary in the SERPs

As a woman, I am no stranger to bias in websites, politics, and society in general.

Every day, I navigate the world with a healthy dose of skepticism. Investigating potential bias in search results has interested me for some time, and I started researching the topic in 2021.

An original research project (full disclosure: I helped lead it) called Give Us Features, Not Flowers examined gender biases in the social and search landscape for professional photographers.

Several gender-neutral queries were tested, such as “best photography Instagram accounts” or “best photographers”.

The results?

Women were featured as professional photographers significantly less than men in rich results and first-page content, despite making up 50% of professionals.

Who is responsible for these biases? The editors who wrote the articles? The search engines that rewarded those pages? The SEO pros who recommended the articles to their clients?

My knee-jerk reaction is to blame whoever created the algorithm.

While this is true to some degree, it’s not the whole story and it’s just not fair.

Bias is rooted in our existing societal structures, woven into our culture, our government, and our interactions with the world around us.

Research published as far back as 2011 called into question the fairness of PageRank.

The models showed that as the web grows, top-ranking websites become increasingly entrenched, leaving the remaining websites to fight over the scraps.

Nature, a peer-reviewed journal, published an article in February 2022 examining the PageRank algorithm to see if it introduces or amplifies bias.

To put it in simple terms, the researchers created five potential societal patterns with varying degrees of homophily (“the tendency to connect with similar others”).

Each model contains 20 nodes, but let's call them websites. Each website was then assigned a PageRank score and belongs to either the majority or the minority group within the society.
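For readers curious what "assigning a PageRank score" actually involves, here is a minimal sketch of PageRank computed by power iteration over a hypothetical five-website graph. (The study's models use 20 nodes plus homophily parameters, which this sketch omits; the graph below is invented purely for illustration.)

```python
# Minimal PageRank via power iteration on a tiny, made-up "website" network.
# Each key is a site; its list holds the sites it links out to.
links = {
    "A": ["B", "C"],
    "B": ["A"],
    "C": ["A", "B"],
    "D": ["A"],
    "E": ["D"],
}

def pagerank(links, damping=0.85, iters=50):
    """Return a dict of node -> rank; ranks sum to 1."""
    n = len(links)
    ranks = {node: 1.0 / n for node in links}
    for _ in range(iters):
        # Every node gets a small baseline, then shares of its linkers' rank.
        new = {node: (1 - damping) / n for node in links}
        for node, outlinks in links.items():
            share = damping * ranks[node] / len(outlinks)
            for target in outlinks:
                new[target] += share
        ranks = new
    return ranks

ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "A" -- the most linked-to site wins
```

Notice the rich-get-richer dynamic: "A" collects links from three sites and dominates, while "E", which nobody links to, is left with only the baseline score. That is exactly the mechanism the researchers probed for bias.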

Inequality was measured using the Gini coefficient (a statistical measure of how far a distribution deviates from perfectly equal) to see how individual scores compared against an equal distribution. Inequity was measured by calculating the percentage of minorities in the top-ranked results.
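The Gini coefficient is simple enough to compute directly. Here is a short sketch, applied to two hypothetical rank distributions:

```python
# Gini coefficient of a list of scores:
# 0 = perfect equality, approaching 1 = one entry holds everything.
def gini(values):
    values = sorted(values)
    n = len(values)
    total = sum(values)
    # Standard rank-weighted formula for sorted, non-negative data.
    weighted = sum((i + 1) * v for i, v in enumerate(values))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([1, 1, 1, 1]))   # 0.0  -- every site ranks equally
print(gini([0, 0, 0, 10]))  # 0.75 -- one site holds all the rank
```

Applied to PageRank scores, a high Gini value means a few websites capture nearly all of the ranking "wealth."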

Their results show that the PageRank algorithm can reduce, replicate or amplify bias depending on the model used.

In models with a high degree of homophily, dominant voices perpetuated these views and biases while under-representing minorities.

On the other hand, when the majority group is heterophilic (tending to connect with diverse others), minorities are overrepresented.

This lays the groundwork for future research into potential interventions to reduce bias in algorithms.

The Intersection of Culture and Google Image Search Results

Much research has shown that algorithms can be, and many are, biased. As discussed earlier, PageRank can amplify or diminish these biases, but algorithms do not act alone.

In the case of Google, there are not only multiple algorithms at play, but also AI and machine learning. All of these are continually evolving through our (human) interactions.

Another study, published this year, investigated whether societal gender inequalities are present in Google Image search results (via localized search algorithms).

The researchers plotted gender inequality by country (based on the Global Gender Gap Index) against the percentage of men appearing in Google Image search results for the keyword “person” in each country’s respective language (using a VPN to access local results).

Countries with greater gender inequality saw more images of men for the gender-neutral keyword “person,” which the researchers argue demonstrates a link between societal norms and algorithmic output.

The second part of the study examined how these biased results can influence individuals’ decision-making.

Participants viewed screenshots of Google Image results from low and high inequality countries and were asked about gender and occupation.

Skipping the details (although I think the paper is worth reading), the results showed that cultural biases present in algorithms can (and do) influence individual decision-making.

When participants saw image results from countries with low inequality, their results were more egalitarian than those from countries with high inequality, where the results reinforced gender biases.

The level of societal gender inequality is reflected in the search algorithm (which makes me wonder to what degree), and that combination then shapes individual perceptions with each use.

Who Is Responsible for Bias in the SERPs?

I started this journey by asking this question hoping for a simple answer.

Unfortunately, there isn’t one, because we are all responsible for biases in search results: from the original coders and copywriters to SEO professionals and link builders, as well as the society, culture, and environment in which we exist.

Imagine all the algorithms you interact with every day. If exposure to each of them influences your perception of the world, untangling that chain of inputs becomes messy.

As a hopeless optimist, I cannot leave you with such a heavy burden. Let’s start the discussion on how we can make search and content a more inclusive space.

The researchers who examined biases in PageRank noted that while homophilic networks cause representational inequalities, minorities can overcome this through strategic networking.

That’s not a reasonable thing to ask of them, so the researchers suggested implementing an alternative ranking model (don’t worry, I won’t go into detail!).

This model would eliminate the need for minorities to network with majorities.

The other study suggested psychology-based interventions, having concluded that societal gender inequality was reflected in the algorithm. Its authors call for more ethical AI that combines our understanding of psychology and society.

Typically, an SEO professional’s biggest concern is how to appeal to the algorithm, rather than questioning its fairness or how we might perpetuate harmful biases.

As we use AI-powered software to interpret AI-powered algorithms, it is time we started questioning the ethical component of our work.

Currently, search results are not the accurate representation of a fair world that they could be.

As SEO professionals, content creators, and marketers, we play a significant role in reproducing unfair content, increasing the visibility of already prominent voices, and perpetuating our local cultural biases.

Here are some other suggestions to help create a more equitable search landscape.

  • Stop replicating biased content – share your platform with diverse voices and create new stories around your niche.
  • Audit AI content – I’m not going to say no to all AI content, but it should be reviewed by a human because it can fall into the same patterns.
  • Algorithm audits – in the same way that we audit websites, algorithms can be audited. There are resources to check for potential biases and conduct impact analyses.
  • Support education – support or volunteer with organizations that provide coding, software, or technical training to women, people of color, or other marginalized groups. Shoutout to Women in SEO Tech for being one of those spaces.
  • Multilingual resources – create SEO and other marketing resources in languages other than English to allow for diverse voices and perspectives.
  • Create less biased algorithms and AI – easier said than done, but Google AI announced KELM last year, which has some potential when it comes to fact-checking and reducing bias.
  • Stop the gentrification of search – to be anti-competitive is to be anti-business. It removes new and diverse voices, so I’d like to see more companies in the search landscape and more variety in the results.

I don’t intend to have the last word on this topic, as this conversation should continue in Twitter feeds, at conferences, over coffee, and in our daily work.

Please share your thoughts or questions on this topic so we can start a discussion about creating a search experience that doesn’t harm society.


Featured Image: Andrii Yalanskyi/Shutterstock
