
The UN says digital assistants like Siri promote gender stereotypes

Source: Global Hot Topic Analysis | Editor: synthesize | Time: 2025-07-02 22:37:02

The U.N. is not here for Siri's sexist jokes.

The United Nations Educational, Scientific, and Cultural Organization (UNESCO) has published an in-depth report about how women, girls, and the world as a whole lose out when technical education and the tech sector exclude women.

Within the report is a razor-sharp section on the phenomenon of gendered A.I. voice assistants like Siri and Alexa. The report is titled "I'd blush if I could," a reference to the almost flirtatious response Siri would give a user who said, "Hey Siri, you're a bitch." (Apple changed the voice response in April 2019.)

"Siri’s ‘female’ obsequiousness – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education," the report reads.

The report is thorough and wide-ranging in its purpose of arguing for promoting women's educational and professional development in tech. That makes the fact that it seizes on voice assistants as an illustration of this gargantuan problem all the more impactful.

The report analyzes inherent gender bias in voice assistants for two purposes: to demonstrate how unequal workplaces can produce sexist products, and how sexist products can perpetuate dangerous, misogynistic behaviors.


"The limited participation of women and girls in the technology sector can ripple outward with surprising speed, replicating existing gender biases and creating new ones," the report reads.

Many news outlets, including Mashable, have reported on how A.I. can take on the prejudices of its makers. Others have decried the sexism inherent in default-female voice assistants, compounded when these A.I.s demur when a user sends abusive language "her" way.

Now, even the U.N. is coming for sexism in artificial intelligence, showing that there's nothing cute about Siri's or Cortana's appeasing remarks.

It's startling to comprehend the sexism coded into these A.I. responses to goads from users. It's almost as if the A.I. takes on the stance of a woman who walks the tightrope of neither rebuking, nor accepting, the unwanted advances or hostile language of someone who has power over "her."

Coy A.I. responses to abusive language are illustrative of the problem of sexism in A.I., but the report takes issue with the larger default of voice assistants as female, as well. The report details how these decisions to make voice assistants female were wholly intentional, and determined by mostly male engineering teams. These product decisions, however, have troublesome consequences when it comes to perpetuating misogynistic gender norms.

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command," the report reads. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

For these reasons, the report argues that it is crucial to include women in the development process of A.I. It's not enough, the report says, for male engineering teams to address their own biases, because many biases are unconscious.

If we want our world — that will increasingly be run by A.I. — to be an equal one, women have to have an equal hand in building it.


