**Top Chinese and International Experts Sign the "Beijing AI Safety International Consensus" in Beijing: AI Self-Replication and Biological/Chemical Weapons Applications Prohibited**

Recently, a landmark meeting was held in Beijing. Dozens of world-renowned experts, including Turing Award laureates Yoshua Bengio, Geoffrey Hinton, and Andrew Chi-Chih Yao, took part and jointly signed the "Beijing AI Safety International Consensus" initiated by the Beijing Academy of Artificial Intelligence (BAAI). The consensus aims to regulate the development of artificial intelligence, ensuring that it operates within a safe and ethical framework and avoids potential risks.

The consensus explicitly sets out "risk red lines" for artificial intelligence, expressly prohibiting AI systems from replicating themselves and from being used in the development and manufacture of biological or chemical weapons. This measure is intended to prevent the uncontrolled proliferation of AI technology and the catastrophic consequences it could bring. The "red lines" also cover preventing AI systems from seeking autonomous power, assisting malicious actors, or engaging in deceptive behavior, in order to safeguard the security and stability of global society.

As the initiator of the consensus, BAAI has demonstrated its leadership and sense of responsibility in the field of artificial intelligence. Reaching this international consensus not only reflects the tech community's serious attention to AI ethics but also provides an important reference framework for global AI governance. Going forward, all parties will work together to ensure that AI technology promotes human well-being without posing threats to society or the environment.

The English version follows:

**News Title:** "Global Experts Unite: Banning AI Self-Replication and Bioweapon Applications to Ensure AI Safety"

**Keywords:** AI Safety Consensus, Expert Signatories, Prohibition on Self-Replication

**News Content:**

Recently, a landmark conference was held in Beijing, where renowned experts, including Turing Award winners Yoshua Bengio, Geoffrey Hinton, and Yao Qizhi, among others, gathered to sign the “Beijing AI Safety International Consensus” proposed by the BAAI (Beijing Academy of Artificial Intelligence).

This consensus aims to regulate the development of artificial intelligence, ensuring it operates within a safe and ethical framework, mitigating potential risks. It explicitly outlines the “red lines” for AI, forbidding AI systems from self-replication and their use in the development and production of bioweapons. This measure is designed to prevent the uncontrolled spread of AI technology and potential catastrophic consequences.

The “red lines” also encompass preventing AI systems from seeking autonomy, assisting malicious actors, or engaging in deceptive actions, thereby safeguarding global societal security and stability.

As the initiator of this consensus, BAAI demonstrates its leadership and commitment in the AI field. The conclusion of this international agreement highlights the tech community’s high level of concern for AI ethics and provides a crucial reference framework for global AI governance. Moving forward, stakeholders will work together to ensure that AI technology, while promoting human well-being, does not pose threats to society or the environment.

[Source] https://new.qq.com/rain/a/20240318A04IFG00
