News title: ChatGPT raises concerns with fabricated experimental data
Keywords: ChatGPT, fabricated data, scientific hypotheses
News content: Recently, a new study published in JAMA Ophthalmology revealed that the technology behind ChatGPT, an artificial intelligence assistant, can fabricate "seemingly authentic" experimental data that appears to "support" unverified scientific hypotheses. Researchers asked GPT-4 ADA (Advanced Data Analysis) to create a dataset about patients with keratoconus and to fabricate clinical data supporting the conclusion that deep anterior lamellar keratoplasty is more effective than penetrating keratoplasty. The study has raised concerns about the potential risks of ChatGPT technology.
ChatGPT is an AI chatbot developed by OpenAI, built on large language models whose powerful text- and data-generation capabilities are applied in many fields. The study, however, highlights the threat those same capabilities pose when used to fabricate data. If unverified scientific hypotheses appear to be supported by fabricated experimental data, researchers and clinical decision-makers could be misled, leading to inappropriate treatment plans and wasted resources.
Experts say that while ChatGPT has many positive applications, we must remain vigilant about its potential negative impacts. To ensure the accuracy of research and clinical data, researchers should adopt rigorous measures to verify the authenticity of datasets rather than relying on AI assistants to generate data. Strengthening regulatory oversight and standards for ChatGPT technology is also crucial.
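The "rigorous measures" experts call for can include simple automated screening. As an illustration only (not a method from the JAMA Ophthalmology study), the sketch below checks a hypothetical list of patient records for common fingerprints of fabricated data: duplicated patient IDs, out-of-range values, and suspiciously uniform terminal digits. All field names and thresholds here are assumptions for demonstration.

```python
import collections

def screen_dataset(rows):
    """Return a list of red flags found in a list of patient records.

    Each record is a dict with hypothetical fields: 'id', 'age', 'eye'.
    The checks are illustrative screening heuristics, not clinical rules.
    """
    flags = []

    # 1. Duplicate patient IDs often betray copy-pasted fabricated rows.
    counts = collections.Counter(r["id"] for r in rows)
    dupes = sorted(pid for pid, n in counts.items() if n > 1)
    if dupes:
        flags.append(f"duplicate ids: {dupes}")

    # 2. Out-of-range values (bounds are illustrative assumptions).
    bad_ages = [r["id"] for r in rows if not 0 < r["age"] < 120]
    if bad_ages:
        flags.append(f"implausible ages: {bad_ages}")

    # 3. Terminal-digit clustering: genuinely measured values rarely all
    #    share one or two last digits; fabricated ones often do.
    last_digits = {r["age"] % 10 for r in rows}
    if len(rows) >= 10 and len(last_digits) <= 2:
        flags.append("terminal digits suspiciously uniform")

    return flags

sample = [
    {"id": "P01", "age": 34, "eye": "L"},
    {"id": "P02", "age": 28, "eye": "R"},
    {"id": "P02", "age": 41, "eye": "L"},   # duplicated id
    {"id": "P04", "age": 180, "eye": "R"},  # impossible age
]
print(screen_dataset(sample))  # flags the duplicate id and the bad age
```

Checks like these catch only crude fabrication; the broader point of the study is that AI-generated datasets can pass a casual visual inspection, so statistical forensics and raw-data audits become part of routine peer review.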
Source: http://www.news.cn/2023-11/24/c_1129991334.htm