Magazine excerpt: Future Insight

Written by Rohit Talwar and Helena Calle on 3 September 2018 in Features

How can we tackle gender imbalance in the personalities of AI learning tools? Rohit Talwar and Helena Calle investigate.

The expected growth in the use of artificial intelligence (AI) in learning applications is raising concerns about both the potential gendering of these tools and the risk that they will embody the inherent biases of their developers.

Why the concern? Well, to make it easier for us to integrate AI tools and chatbots into our lives, designers often give them human attributes.

For example, applications and robots are frequently given a personality and gender. Unfortunately, in many cases, gender stereotypes are being perpetuated. The types of roles robots are designed to perform usually reflect over-generalisations of feminine or masculine attributes. 

Feminine personalities in AI tools such as chatbots and consumer devices like Amazon’s Alexa are often designed to have sympathetic features and perform tasks related to care-giving, assistance or service. Examples include Emma the floor cleaning robot and Apple’s Siri, your personal iPhone assistant.


Conversely, male robots are usually designed as strong, intelligent and able to perform ‘dirty’ jobs. They typically work in analytical roles, logistics and security. Examples include Ross the legal researcher, Stan the robotic parking valet and Leo the airport luggage porter.

Gendering of technology is problematic because it perpetuates stereotypes and struggles present in society today. It can also help reinforce the inequality of opportunity between genders. These stereotypes aren't beneficial for either men or women, as they can limit a person's possibilities and confine personalities within artificial boundaries.

Response strategies

We propose four strategies to help tackle this issue at different stages of the problem:

  • Mix it up – developers of AI learning solutions can experiment with allocating different genders and personality traits to their tools (a rough sketch of this idea follows this list).
  • Gender-based testing – new tools can be tested on different audiences to assess the impact of, say, a quantum mechanics teaching aid with a female voice but a distinctly masculine persona.
  • Incentives for women in technology – by the time people reach the developer stage, biases may already have set in. So, given the likely growth in demand for AI-based applications in learning and other domains, organisations and universities could sponsor women to undertake technology degrees and qualifications that emphasise a gender-balanced approach in everything they do, from the structure of the faculty to the language used.
  • Gender-neutral schooling – the challenge here is to provide gender-neutral experiences from the start, as the early experiences offered to children usually perpetuate stereotypes. How many opportunities do boys have to play with dolls at school without being bullied? Teachers' interactions are crucial in role modelling and addressing 'appropriate' or 'inappropriate' behaviour. For example, some studies show that teachers give boys more opportunities to expand on their ideas orally, and reward them more for doing so, than girls. Conversely, girls can be punished more severely for the use of bad language.
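
By way of illustration only, here is a minimal sketch of what the 'mix it up' strategy could look like in code, written in Python. The Persona class, the persona pool and the assign_persona function are hypothetical names invented for this example, not part of any real product or library; the point is simply that a tool's voice and traits can be drawn from a varied pool rather than hard-wired to one gendered default.

    import random
    from dataclasses import dataclass

    # Hypothetical persona definition for an AI learning assistant.
    # All names and fields here are invented for illustration.
    @dataclass
    class Persona:
        name: str
        voice: str     # e.g. an identifier for a text-to-speech voice
        traits: list   # personality descriptors used when generating dialogue

    # A deliberately varied pool: voices and traits are mixed rather than
    # pairing the 'assistant' role with a single gendered default.
    PERSONA_POOL = [
        Persona("Alex", voice="voice_a", traits=["direct", "encouraging"]),
        Persona("Sam",  voice="voice_b", traits=["patient", "analytical"]),
        Persona("Riva", voice="voice_c", traits=["curious", "assertive"]),
    ]

    def assign_persona(seed=None):
        """Pick a persona at random so no single stereotype becomes the default."""
        rng = random.Random(seed)
        return rng.choice(PERSONA_POOL)

    if __name__ == "__main__":
        print(assign_persona())

A gender-based testing programme, as in the second strategy, could then compare how learners respond across these randomly assigned personas.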

 

About the author

The authors are futurists with Fast Future, a professional foresight firm and publishing business. Visit www.fastfuture.com and read their latest books – Beyond Genuine Stupidity: Ensuring AI Serves Humanity and The Future Reinvented: Reimagining Life, Society and Business.

 

This is a piece from September's TJ magazine.
