  1. #1
    flumperino

    Female-voice AI reinforces bias, says UN report

    AI-powered voice assistants with female voices are perpetuating harmful gender biases, according to a UN study.
    These female helpers are portrayed as "obliging and eager to please", reinforcing the idea that women are "subservient", it finds.


    Particularly worrying, it says, is how they often give "deflecting, lacklustre or apologetic responses" to insults.
    The report calls for technology firms to stop making voice assistants female by default.


    The report's title is borrowed from a response Siri gives to being called a sexually provocative term.
    "Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation," the report says.


    "Because the speech of most voice assistants is female, it sends a signal that women are... docile helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility," the report says.
    "In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."


    Research firm Canalys estimates that approximately 100 million smart speakers - the hardware that allows users to interact with voice assistants - were sold globally in 2018.
    And, according to research firm Gartner, by 2020 some people will have more conversations with voice assistants than with their spouses.


    Voice assistants now manage an estimated one billion tasks per month, according to the report, and the vast majority - including those designed by Chinese tech giants - have obviously female voices.
    Microsoft's Cortana was named after a synthetic intelligence in the video game Halo that projects itself as a sensuous unclothed woman, while Apple's Siri means "beautiful woman who leads you to victory" in Norse. While Google Assistant has a gender-neutral name, its default voice is female.


    The report calls on developers to create a neutral machine gender for voice assistants, to programme them to discourage gender-based insults and to announce the technology as non-human at the outset of interactions with human users.



    The report also highlights the digital skills gender gap, from the lack of internet use among girls and women in sub-Saharan Africa and parts of South Asia to the declining uptake of ICT studies among girls in Europe.
    According to the report, women make up just 12% of AI researchers.


  2. #2
    flumperino

    Re: Female-voice AI reinforces bias, says UN report

    lol, this is just getting silly now.

    Makes all voices male
    "Where are all the women, that's sexist."

    Makes all the voices female
    "That's still sexist, you misogynist."

    I would say let's just get rid of all the men and be done with it......but that would probably still end up being our fault too.


