【Stanford Professor Fei-Fei Li Speaks at Tech Summit: Cautious on AI Doomsday Claims, Worried Overhype Could Harm Society】

On Thursday, U.S. local time, renowned Chinese-American AI scientist and Stanford University professor Fei-Fei Li held an in-depth conversation with prominent tech journalist Emily Chang at a tech summit in San Francisco, focused on AI safety and ethics. In the interview, Li voiced skepticism about the public's excessive fear of an AI doomsday, arguing that such worries may be overblown and could even distract the public from genuinely urgent problems.

Li argued that society should pay more attention to the concrete problems AI is already creating, such as the spread of misinformation. While the rise of generative AI has brought new challenges, she stressed, the pervasive "pessimism" may be exaggerated. She stated candidly: "What worries me more is that overhyping AI could trigger unnecessary panic and even mislead the public, rather than the risk of AI itself driving humanity to extinction."

Li's remarks have prompted wide reflection in the tech community. She called on the public and industry to view AI's development rationally: to recognize its potential while squarely facing the new problems it brings, and to steer technological progress wisely so that its impact on society remains positive. Her comments also underscore the importance of ethics in AI research and application, offering an insightful reference point for future technological development.

**Keywords:** Fei-Fei Li interview, AI safety, overhype

【Source】https://new.qq.com/rain/a/20240510A048CQ00
