The problem with AI and ‘empathy’ - FT中文网
If technology redefines what our language means it could also change our perceptions of ourselves
[Image caption: Research suggests that LLMs read or predict people’s emotions, and write in a way which gives us the impression of empathy]

One after another, the “uniquely human” traits we once thought would remain untouched by the rise of the machines have started to look vulnerable after all. First it was creativity. Is empathy next?

If you have been reading the research of late, you could be forgiven for thinking so. In one study, a team of licensed healthcare professionals compared the responses of chatbots and real doctors to patient questions posed in an online forum. The chatbot responses were rated significantly higher not just for quality, but for empathy.

In another piece of research, the large language models ChatGPT-4, ChatGPT-o1, Gemini 1.5 Flash, Copilot 365, Claude 3.5 Haiku and DeepSeek V3 outperformed humans on five standard emotional intelligence tests, achieving an average accuracy of 81 per cent, compared with the 56 per cent human average reported in the original validation studies. This, the authors argued, added to “the growing body of evidence that LLMs like ChatGPT are proficient — at least on par with, or even superior to, many humans — in socio-emotional tasks traditionally considered accessible only to humans”.

But before we conclude that AI is more empathic than humans, can I suggest that we stop for a moment and give ourselves a shake?

To be “empathic”, after all, means to be able to put oneself in someone else’s shoes. The Cambridge Dictionary defines empathy as “the ability to share someone else’s feelings or experiences by imagining what it would be like to be in that person’s situation”. But LLMs do not, and cannot, feel. What the research suggests they can do, rather well, is to read or predict other people’s emotions (at least in test conditions), and to write in a way which gives people the impression of empathy. It would be a dangerous mistake to allow the definition of the word “empathy” to quietly morph into something which need only meet this description.

Am I splitting hairs? One could take the utilitarian view that what really matters is not whether machines can feel, but whether their expressions of empathy can have a positive impact on human patients or customers. In an article titled “In praise of empathic AI”, a group of psychologists argue that “perceived expressions of empathy can leave beneficiaries feeling that someone is concerned for them, that they are validated and understood. If more people feel heard and cared for with the assistance of AI, this could increase human flourishing”.

There is indeed evidence to suggest that some therapeutic conversations with chatbots, with sufficient guardrails, can have positive effects on people’s mental health. They can also, of course, have very dangerous effects on some vulnerable people, as recent instances of “AI psychosis” make clear.

Either way, we must find a different word, or set of words, to describe what LLMs are doing in these interactions. Because if we call it “empathy”, one risk is that it might change our perceptions of ourselves, and not necessarily for the better. As the psychologists say in their paper, AI’s expressions of empathy “do not seem to suffer from typical human limitations” such as growing weary over time.

But these are not limitations of human empathy — they are features of it. And if we grow frustrated with real human empathy, compared with the indefatigable simulation of it we can receive on demand from LLMs, that might drive us apart. We might grow to prefer our chatbot companions and forget what we are missing from one another.

The other problem with calling machines “empathic” is that it provides cover for actions which would otherwise feel morally uncomfortable, such as leaving lonely elderly people alone with chatbots to converse with, in lieu of making sure they have regular human company. If a machine were described as “more empathic” than a human care worker, that would conceal from view what had really been lost along the way.

It is not unusual for new technologies to quietly change the meaning of words. As the late cultural critic Neil Postman wrote, the invention of writing changed what we once meant by “wisdom”. The telegraph changed what we once meant by “information”. The television changed what we once meant by “news”.

“The old words still look the same, are still used in the same kinds of sentences,” Postman wrote in his book Technopoly in 1992. “But they do not have the same meanings; in some cases, they have the opposite meanings.” What is really dangerous, he added, is that when technology redefines words with deep roots, “it does not pause to tell us. And we do not pause to ask”.

Copyright notice: this article is copyright FT中文网. No organisation or individual may reproduce, copy or otherwise use all or part of it without permission; infringement will be pursued.
