Do Better: Building Gender-Inclusive Conversational AIs
Featuring:
Andreea Danielescu, Manager at Accenture Labs
Rebecca Evanhoe, Conversation Designer & Consultant
Content Warning: Some mention of sexual harassment.
What problems do we see with gender bias in conversational interfaces?
There is a high potential for bias in this field: the technology is built by people who carry their own biases, it is designed to mimic human behavior, and it is trained on human data collected and labeled by other people.
Devices seem to work better for men than for women. Assistant personalities display or reinforce gender stereotypes, and some interactions even make light of, or tacitly encourage, harmful behavior such as sexual harassment.
An unfortunate and disturbing truth is that people do sexually harass chatbots and digital assistants: “30% of all inputs to chatbots are off-topic, abusive, romantic or sexual in nature” (a figure based on the chatbot Mitsuku).
Many of these devices are female-presenting even if they don’t technically have a gender. That is harmful: these devices exist to serve us, and making the servant female signals that women are obliging, docile, eager-to-please helpers. It also reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.
One possible solution is non-binary speech that goes beyond pitch. There is not one single non-binary vocal performance but many different tonalities.
The voice creation process looks like this:
- Select a voice actor and record six hours of audio
- Create the first round of voice options
- Conduct round 1 of surveying with the non-binary community
- Refine based on feedback, and add audio data from non-binary individuals to train the model
- Conduct round 2 of surveying with the non-binary community
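To make the “beyond pitch” point concrete: a first sanity check in a pipeline like this might measure where a recording’s fundamental frequency sits relative to the pitch ranges listeners typically hear as masculine (~85–155 Hz) or feminine (~165–255 Hz). The sketch below is illustrative only, not the team’s actual tooling, which the talk doesn’t describe; it assumes the librosa library and a hypothetical audio file name.

```python
# Illustrative sketch (not the project's actual tooling): estimate the
# median fundamental frequency (F0) of a voice recording with librosa.
import librosa
import numpy as np

# Hypothetical file name for a candidate voice recording.
y, sr = librosa.load("voice_candidate.wav", sr=None)

# pyin returns a frame-by-frame F0 estimate; unvoiced frames come back as NaN.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y,
    fmin=librosa.note_to_hz("C2"),  # ~65 Hz lower search bound
    fmax=librosa.note_to_hz("C6"),  # ~1047 Hz upper search bound
    sr=sr,
)

median_f0 = float(np.nanmedian(f0))
print(f"Median F0: {median_f0:.1f} Hz")
```

Pitch alone is a crude signal, though: two voices with the same median F0 can be perceived very differently, which is exactly why the process above leans on rounds of listener surveys rather than acoustic measurements alone.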
We need to make sure that we are representing everyone we serve, and representing them well.