The Danger of Artificial Voices
By Henry Fraser; translated by Du An
Published March 2026

In 2024, the tragic news broke that US teenager Sewell Seltzer III took his own life after forming a deep emotional attachment to an artificial intelligence (AI) chatbot on the Character.AI website.

As his relationship with the companion AI became increasingly intense, the 14-year-old began withdrawing from family and friends, and was getting in trouble at school.

In a lawsuit filed against Character.AI by the boy’s mother, chat transcripts show intimate and often highly sexual conversations between Sewell and the chatbot Dany, modelled on the Game of Thrones character Daenerys Targaryen.

They discussed crime and suicide, and the chatbot used phrases such as “that’s not a reason not to go through with it”.

This is not the first known instance of a vulnerable person dying by suicide after interacting with a chatbot persona.

A Belgian man took his life in 2023 in a similar episode involving Character.AI’s main competitor, Chai AI. When this happened, the company told the media they were “working our hardest to minimise harm”.

In a statement to CNN, Character.AI said it takes "the safety of our users very seriously" and has introduced "numerous new safety measures over the past six months".

This article appeared in English World (《英语世界》), 2026, Issue 2.