Gender stereotyping and YouTube's most-viewed ad of 2018
'Alexa loses her voice' has become YouTube's most-viewed ad of 2018, with over 50 million views. It's fun, sassy and clearly effective - what's not to like?
Pithy, laugh-out-loud responses from Gordon Ramsay, Cardi B, Rebel Wilson and Anthony Hopkins made this Super Bowl ad an instant classic: Amazon's virtual assistant Alexa 'lost her voice' and had to be replaced by a cast of celebrities who, well, just didn't have a clue how to answer the kind of demands routinely made of Alexa.
So there's nothing wrong with the ad; it's just that its popularity highlights another hot topic going into 2019 - the fact that the helpful virtual assistants built on artificial intelligence (AI) tend to have female voices.
Uh-oh.
Yup. That's where the unconscious bias comes in. Think about it: Alexa, Siri, Cortana...
AI is becoming increasingly sophisticated, widening our human horizons in hitherto unimaginable ways. Alongside this, however, there are serious concerns that existing inequalities or 'isms' are being reinforced, simply because AI programmers have their own personalities and cultural references, which inevitably feed into the data sets.
'AI is actually pretty stupid,' says one coder. 'It might be able to beat you at chess, but it won't notice if you keel over with a heart attack during the game. It has no soft skills. It only reflects its builders.'
Unconscious bias - and we're all more guilty of it than we like to think - is insidious and complex. By the way, when we quoted that coder, did your brain throw up a male or female image?
Apple's Siri and Google Assistant can be switched to male voices - traditionally associated with authority and expertise - and there are ongoing efforts to develop a genderless voice, explained one of several speakers tackling this issue at European Women in Technology's recent conference.
Though we've come a long way from bikini-clad girls being used to sell cars, she asked how we can avoid building friendly, faceless assistants - the kind that can be shouted at with impunity - as female. Build a checklist before you build any AI, she recommended, and step outside the 'geek pool' to build it, asking diverse people for input.
Of course gender stereotyping is only one of the concerns in AI's galloping progress (for example, parents are worried their kids will grow up thinking it's okay to issue commands with no please or thank you), but it is one that's drawing attention now.
Defenders will point to market research 'proving' that consumers prefer a female voice for the helpful assistant role...
'I had to voice the emergency routines for fighter pilots once,' says one female voice artist. 'The pitch and timbre of a female voice carried better through what could be the extreme noise of engine damage or the plane breaking up, so the pilot could hear it. I could see why a female voice was preferable in that situation, but otherwise? What's the excuse?'
To have your say on this and other ads, get busy voting/sharing/commenting on the ADDS site!
Posted by Tree Elven on 07/01/2019