[VoxSpace Selects] Algorithmic Bias: After GOT, Are Google’s Search Results Oblivious To Marginalized Communities?

Algorithms Can Multiply Human Prejudices, And That Is No Longer A Myth

In April, with the final installment of Game of Thrones hitting the small screens, a fresh wave of debate arose over the inexplicable scarcity of people of colour, most of whom were carefully pushed to the margins. AI’s “white guy problem” is no less a threat to racial minorities and ethnic groups, and it has unabashedly continued to fan centuries of sexism. Google’s algorithms, which surface the most “relevant information” within a split second, indeed work in mysterious ways to serve affluent white audiences and advertisers. Far from being conjecture, this claim rests on substantial research documented in Algorithms of Oppression by Safiya Umoja Noble, which shows that Google’s deeply ingrained views stem from users’ prejudices and language use, resulting in negative autocomplete suggestions.
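Noble’s point about autocomplete can be made concrete with a toy model. The sketch below is not Google’s actual system; it is a minimal, purely frequency-ranked completer, with a hypothetical query log, that shows how suggestions mechanically inherit whatever skew the log contains.

```python
from collections import Counter


def build_autocomplete(query_log):
    """Rank completions purely by how often past users typed them.

    Because ranking is driven only by frequency, the suggestions
    reproduce whatever biases are present in the query log.
    """
    counts = Counter(query_log)

    def suggest(prefix, k=3):
        # Keep queries that extend the prefix, most frequent first.
        matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda qn: -qn[1])
        return [q for q, _ in matches[:k]]

    return suggest


# Hypothetical, deliberately skewed query log.
suggest = build_autocomplete([
    "why are cats cute",
    "why are cats cute",
    "why are cats aloof",
])
# suggest("why are cats") ranks "cute" first simply because it
# appears more often in the log, not because it is more true.
```

Swap the harmless cat queries for prejudiced ones and the same mechanism surfaces the prejudice at the top of the list.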

When technology starts sending ripples across society, as if it had a mind of its own, the results vary. From showing men advertisements for high-profile jobs on Google platforms to dumping cosmetic products on women, there is no end to it. Google image search results perhaps bear the brunt, systematically sorting men and women into different job designations and personality types, while searches involving black people disproportionately surface results about poverty and criminal records. Since the publication of Noble’s treatise on algorithmic bias, Google has retracted many of the earlier result variations that pushed minorities into a deep abyss of stereotypes, apart from politically offending conservatives in the USA.

Besides, Google has since reduced the autocomplete variations that previously revealed how it stored and interpreted users’ queries and popular notions. However, traces of its former bigotry remain obvious even to the naked eye. Search for “happy people” and the first thirteen images are dominated by slim white people with broad grins; similarly, “famine” produces images from third-world countries, mostly South Asian and African faces, glossing over the history of famines in world powers like the United Kingdom, Japan, and the US. Only last year, in 2019, the Google Photos app was lumping black people together, labelling them “gorillas” and “chimps.”

Technocratic Discrimination In The Indian Context

Early in 2019, around January, Alex Iwobi, who plays for Arsenal, was called a “gorilla” and trolled by actress Esha Gupta on Twitter. The incident brought to light the staunch racism in India that refuses to diffuse after years of complacency with normalizing the concept of the “fair sex” and placing it on a sacred pedestal. Casual sexism, offensive remarks about skin tone, and body shaming are rampant in India, often perpetrated by people in power: celebrities, cricketers, and journalists. It goes on without recognition of the implications for a society that already suffers from a lack of diversity. It would hardly be an exaggeration to link the exacerbation of casteism and colourism in India to the blows dealt by the internet itself. Everyday discrimination has started to provoke furore in different parts of the country, yet we still fail to recognize the unnumbered ways in which it manifests itself, mostly fuelled by capitalism.

Google has a tendency to abide by a white, male-centric worldview, perhaps fashioned after its two creators, Larry Page and Sergey Brin. Most of the time it arbitrarily spins off a world that clusters white supremacy, racism, and open sexism into a uniform whole. Google’s algorithms act as a double-edged sword: on one hand they are in sync with already prejudiced behaviours and beliefs, producing fresh and relevant content; on the other, they help prejudices thrive. We let ourselves be manipulated by the information Google delivers simply because we assume its truth and accuracy, without considering the ethical implications. This vicious cycle plays an instrumental role in keeping alive the biased perspectives of the numerical majority in our society.
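The “vicious cycle” described above is, mechanically, a rich-get-richer feedback loop: results that get shown get clicked, and clicks push them further up. The simulation below is a hedged sketch of that dynamic, not of Google’s ranking; the item names and scores are invented for illustration.

```python
import random


def simulate_feedback_loop(initial_scores, steps=1000, seed=42):
    """Toy model of a ranking feedback loop.

    Items are shown with probability proportional to their current
    score, and every impression nudges that score upward, so a small
    initial gap between items tends to widen over time.
    """
    rng = random.Random(seed)
    scores = dict(initial_scores)
    items = list(scores)
    for _ in range(steps):
        weights = [scores[i] for i in items]
        shown = rng.choices(items, weights=weights, k=1)[0]
        scores[shown] += 1  # each impression reinforces the ranking
    return scores


# Hypothetical example: two results that start almost equal.
result = simulate_feedback_loop({"result_a": 11, "result_b": 10})
```

Run it repeatedly and the initially favoured result usually pulls far ahead, which is the point: the loop does not need malicious intent, only a small starting bias.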

The Pangs Of Tapping Away On Google!

Discrimination has seeped so comfortably into the fabric of our daily interactions with machines that it has lulled us into a stupor regarding the whole affair. We depend on Google for almost everything, from conspiracy theories and daily information to weather forecasts and politics, so much so that we fail to comprehend how it has peddled misinformation, propaganda, and manipulative content since 1998.

“Autocomplete isn’t an exact science and we’re always working to improve our algorithms.” - A Google spokesperson to The Guardian.