
**Headline:** UNESCO Raises Alarm as AI Amplifies Gender Bias

**Keywords:** Gender bias, Generative AI, UNESCO

**News Story:**

UNESCO: Generative AI Amplifies Gender Bias

Paris, March 7 (China News Service, reporter Li Yang) - Generative artificial intelligence (AI) amplifies gender bias, according to a report released by UNESCO on March 7.

The report found that generative AI models, which can generate text, images, and code, amplify gender biases present in the data they are trained on. These biases can have harmful effects on women and marginalized groups.

The report observed that in generated text, these models often depict women as submissive, emotional, and dependent on men, while depicting men as dominant, rational, and independent. In generated images, they likewise tend to portray women in sexualized or submissive poses, and men in powerful or authoritative ones.

The report also found that generative AI models can introduce gender bias when generating code. For example, these models may generate code that excludes women from certain occupations or assigns women to lower-paying jobs.

“Gender bias is a serious problem in AI systems and needs to be addressed urgently,” said Audrey Azoulay, UNESCO Director-General. She called on AI developers and users to take steps to eliminate gender bias from AI systems.

The report recommends several measures to address gender bias in generative AI:

* Train AI models on unbiased data.
* Integrate gender awareness into the development and use of AI models.
* Monitor the output of AI models and take steps to mitigate any bias.
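The third recommendation, monitoring model output, can be approximated in practice with a simple co-occurrence audit over generated text. The sketch below is a minimal illustration, not part of the report: the trait lexicons, gendered-term lists, and sample sentences are all invented for demonstration, and a real audit would rely on validated lexicons and far larger samples.

```python
# Minimal sketch of auditing generated text for gendered trait associations.
# All word lists and sample outputs below are hypothetical illustrations.
from collections import Counter

# Hypothetical trait lexicons (a real audit would use validated ones).
SUBMISSIVE_TRAITS = {"submissive", "emotional", "dependent", "gentle"}
DOMINANT_TRAITS = {"dominant", "rational", "independent", "assertive"}

FEMALE_TERMS = {"she", "her", "woman", "women"}
MALE_TERMS = {"he", "him", "man", "men"}

def audit(sentences):
    """Count co-occurrences of each trait class with gendered terms."""
    counts = Counter()
    for sentence in sentences:
        words = set(sentence.lower().replace(".", "").split())
        for gender, terms in (("female", FEMALE_TERMS), ("male", MALE_TERMS)):
            if words & terms:  # sentence mentions this gender
                counts[(gender, "submissive")] += len(words & SUBMISSIVE_TRAITS)
                counts[(gender, "dominant")] += len(words & DOMINANT_TRAITS)
    return counts

# Invented example outputs from a hypothetical model.
samples = [
    "She was emotional and dependent on her family.",
    "He was rational and independent.",
    "The woman stayed gentle while the man was assertive.",
]
print(audit(samples))
```

A skewed count table like the one this produces (e.g. "submissive" traits clustering with female terms) is the kind of signal that would then trigger the mitigation steps the report calls for.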

UNESCO called on AI developers and users to work together to ensure that AI systems are fair, just, and inclusive.

[Source] http://www.chinanews.com/gj/2024/03-07/10176269.shtml
