In recent years, chatbots have been increasingly used by organizations to interact with their customers. Interestingly, most of these chatbots are female, displaying stereotypical representations in their avatars, profile pictures and language. Given the harmful effects of gender stereotypes at the societal level – particularly for women – it is crucial to understand the effects of such stereotypes when they are transmitted by chatbots.
The problematic stereotypes associated with chatbots and virtual assistants
In the IT industry, chatbots with female personalities are generally perceived as more approachable and friendly, and are considered ideal for customer service roles. They are also generally seen as more helpful and empathetic in their language, which can be valuable when customers are frustrated, upset or confused. Male chatbots, on the other hand, are expected to be more assertive and direct, an approach seen as useful for tasks that require quick and decisive action, such as refund processing or dispute resolution. According to the IT industry, chatbots with a male identity can be more effective as a tool for handling customer inquiries. The advice to businesses is to choose a chatbot personality that suits their particular context, as long as the AI chatbot is designed to fulfill the business's tasks and goals.
The technology for virtual assistants based on artificial intelligence gained popularity as early as 2011. To the millions of obscene insults Siri received in the years that followed, she replied, “I would blush if I could.” According to a UNESCO report, the assistant’s compliance in the face of sexual harassment remained unchanged for years, until an update in 2019 changed Siri’s response to “I don’t know how to answer this.”
AI voice assistants designed to sound and communicate like young women continue to reinforce harmful gender biases. A UNESCO study found that using female voices by default in artificial intelligence – as Microsoft has done with Cortana, Amazon with Alexa, Google with Google Assistant and Apple with Siri – reinforces the belief that women exist only to help men get on with more important things.
The solutions
The increasingly blurred line between the perception of “female machines” and real women has real-life consequences for women and girls, in the ways they are perceived and treated, and in the way they perceive themselves.
Perhaps the simplest way companies have tackled gender bias in AI is by adding male voice alternatives or removing the default female voice, thus forcing users to choose the identity of their virtual assistant.