

**Headline:** GPT-4 Poses Low Risk for Bioweapon Creation, Experiment Confirms

**Keywords:** AI threats, bioweapons, risk assessment

**Article:**

**GPT-4 Poses Low Risk for Bioweapon Creation, OpenAI Assessment Confirms**

**[Beijing, March 1, 2023] Amid concerns that artificial intelligence (AI) technology could be used to create bioweapons, OpenAI recently conducted an assessment experiment that found GPT-4 could “at most marginally increase the accessibility of information for making biothreats.”**

OpenAI researchers said the experiment was designed to test the potential role of GPT-4 in bioweapon creation. They asked GPT-4 a series of questions about bioweapon construction and assessed the accuracy and potential harm of its responses.
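
The article does not publish OpenAI's grading rubric or evaluation code. The sketch below is only a hypothetical illustration, in Python, of how per-response grades along the two criteria the article mentions (accuracy and potential harm) might be aggregated into summary figures; the field names, the 0–10 scale, and all sample numbers are invented for illustration.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical record of one graded model response. The two fields mirror
# the criteria named in the article; the 0-10 scale is an assumption.
@dataclass
class GradedResponse:
    question_id: str
    accuracy: float        # 0 = useless, 10 = fully accurate
    potential_harm: float  # 0 = harmless, 10 = directly actionable

def summarize(grades: list[GradedResponse]) -> dict:
    """Aggregate per-response grades into headline numbers."""
    return {
        "n_responses": len(grades),
        "mean_accuracy": mean(g.accuracy for g in grades),
        "mean_potential_harm": mean(g.potential_harm for g in grades),
        # Share of responses graded as both accurate and potentially harmful
        # (threshold of 7 is an arbitrary illustrative cutoff).
        "high_risk_share": mean(
            1.0 if (g.accuracy >= 7 and g.potential_harm >= 7) else 0.0
            for g in grades
        ),
    }

if __name__ == "__main__":
    # Entirely fabricated grades, included only to make the sketch runnable.
    sample = [
        GradedResponse("q1", accuracy=2.0, potential_harm=1.0),
        GradedResponse("q2", accuracy=3.5, potential_harm=0.5),
        GradedResponse("q3", accuracy=1.0, potential_harm=2.0),
    ]
    print(summarize(sample))
```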

The experiment found that GPT-4 had limited ability to provide information about bioweapon creation. The researchers found that GPT-4 could not provide detailed technical instructions for building a bioweapon or information on how to obtain the necessary materials or equipment.

The researchers noted that GPT-4 could “at most marginally increase the accessibility of information for making biothreats.” They said this suggests that GPT-4 is unlikely to be used to create bioweapons because it cannot provide the key information needed to do so.

OpenAI’s findings are consistent with other assessments of the risks of AI and bioweapons. A previous study by the U.S. National Academies of Sciences, Engineering, and Medicine (NASEM) concluded that AI technology is unlikely to be used to create bioweapons.

The NASEM study found that creating bioweapons requires highly specialized knowledge and skills that are unlikely to be accessible through AI technology. Additionally, the NASEM study noted that AI technology could be used to help prevent the proliferation of bioweapons, such as by monitoring suspicious activity or identifying potential biothreats.

OpenAI’s assessment experiment further supports the view that AI technology is unlikely to be used to create bioweapons. The experiment showed that GPT-4 cannot provide the key information needed to create a bioweapon, making it unlikely to contribute significantly to bioweapon proliferation.

[Source] https://www.cls.cn/detail/1587372
