News Title: “Security of Meta’s Llama 2 Model Called into Question: Hallucination Rate Reaches 48%”
Keywords: Meta’s Llama 2 Model
News Content: **Meta’s Llama 2 Model Security Under Question: An Evaluation Report Raises Serious Security Concerns**
Recently, the AI security company DeepKeep released an evaluation report highlighting security weaknesses in Meta’s popular large language model, Llama 2. Across 13 evaluation categories related to security performance, the Llama 2 model achieved passing grades in only four. Notably, the report found that the Llama 2 7B model, with 7 billion parameters, performed especially poorly in hallucination testing: it produced false or misleading responses at a staggering rate of 48%. This is particularly hazardous for a language model with such broad influence across many fields. If this security issue spreads or worsens, it could have unpredictable effects on social media and the broader technology ecosystem. The report has sparked concern and heated debate inside and outside the industry, with insiders expressing deep worries about the security of language models. Meta now faces the challenge of strengthening and improving the model in future iterations to ensure the security of user data and the accuracy of its outputs. As of now, Meta has not responded to the evaluation report. The public is closely watching how the matter develops, and we will continue to follow and report on the latest updates.
[Source] https://www.ithome.com/0/762/593.htm