In recent years, more and more companies have been using chatbots to give users written access to data and services.
Gender stereotypes are built into this type of artificial intelligence at the design stage. The frequent assignment of a female gender to such systems has drawn accusations that they encourage the objectification of women.
The design stage
A study conducted in Germany in 2019 found that over 76% of chatbots are given female characteristics at the design stage. Their gender identity is revealed by their names, by distinctive features of their avatars such as hair ribbons, and by the language they use.
The situation in Bulgaria
To investigate how gender-specific cues are implemented in the design of chatbots in Bulgaria, 11 chatbots freely accessible through the websites of companies operating in Bulgaria as of the end of March 2023 were examined. The focus was on three parameters: whether the chatbot's name indicates gender, what its avatar looks like, and whether its description hints at gender. In light of the German research, it was hypothesized that most chatbots on the Bulgarian market would have female names, avatars and descriptions.
In Bulgaria, chatbots (virtual assistants that respond to customer inquiries through text messages) operate mostly on the websites of service companies. The chatbots examined come from the telecommunications, banking, insurance, courier services and pet supplies industries. Of the 11 chatbots studied, 7 have female names, 1 has a male name and 3 have neutral names. Three have female avatars, 2 have avatars without gender markers, and 6 have no avatar at all. In addition, 4 speak with feminine pronouns and/or have feminine descriptions, 1 speaks with masculine pronouns, and the remaining 6 express themselves in a gender-neutral way. For example, A1 Bulgaria’s chatbot Ava has a female name but does not use gendered pronouns when communicating. Unlike Ava, UBB’s Kate asks users “What can I do to help?”, a phrase whose Bulgarian form is grammatically marked as feminine.
The results
The analysis shows that 72.72% of the chatbots (8 of 11) have gender characteristics: 45% use gendered pronouns or gendered descriptions and 27% have gendered avatars. Of the chatbots analyzed, 18% combine all three types of gender-revealing characteristics, while another 18% are neutral and reveal no gender characteristics at all. The results support the conclusion that 63% of the chatbots (7 of 11) present female characteristics. Gender bias is particularly visible in the design of chatbots for customer service and sales applications.
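To make the arithmetic behind these shares explicit, the short Python sketch below recomputes them from the counts reported above. It is illustrative only and not part of the original study; the mapping of each reported percentage onto a specific count (for example, 72.72% onto the 8 gendered names) is an inference from the data, not something stated in the source.

```python
# Minimal sketch (not from the original study): recompute the reported
# percentages from the counts in the sample of 11 chatbots.
TOTAL = 11

names = {"female": 7, "male": 1, "neutral": 3}             # counts from the text
avatars = {"female": 3, "no_gender_marker": 2, "none": 6}
pronouns = {"feminine": 4, "masculine": 1, "neutral": 6}

def share(count, total=TOTAL):
    """Format a count as a fraction of the sample and a rounded percentage."""
    return f"{count}/{total} = {100 * count / total:.2f}%"

print("Gendered names:   ", share(names["female"] + names["male"]))              # 8/11 = 72.73%
print("Gendered pronouns:", share(pronouns["feminine"] + pronouns["masculine"])) # 5/11 = 45.45%
print("Gendered avatars: ", share(avatars["female"]))                            # 3/11 = 27.27%
print("Female names:     ", share(names["female"]))                              # 7/11 = 63.64%
```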
The topic of artificial intelligence and the stereotypes that the creators of these technologies embed in applications such as chatbots is new to Bulgaria. With the introduction of this technology, gender bias has appeared on the websites of companies in the service sector, as chatbots display characteristics, avatars and pronouns that are mostly female. This reinforces the stereotype that women’s role is to serve and help men. Eradicating bias and discrimination from AI requires diverse teams to work on application design. One option is for companies to drop the default male or female gender and offer a neutral virtual assistant.
*The text is part of a scientific report, “Chatbots with female names – gender stereotyping and gender bias in artificial intelligence”, by the author, published in the proceedings of the XV International Scientific Conference on e-Management and e-Communications, organized by Technical University – Sofia between June 25th and 29th, 2023.