Let's Put Ourselves in Sally's Shoes: Shoes-of-Others Prefixing Improves Theory of Mind in Large Language Models.pdf (AI paper, English version)


Let's Put Ourselves in Sally's Shoes: Shoes-of-Others Prefixing Improves Theory of Mind in Large Language Models

Kazutoshi Shinoda, Nobukatsu Hojo, Kyosuke Nishida, Yoshihiro Yamazaki, Keita Suzuki, Hiroaki Sugiyama, Kuniko Saito

NTT Corporation, Japan

kazutoshi.shinoda@

Abstract

Recent studies have shown that Theory of Mind (ToM) in large language models (LLMs) has not reached human-level performance yet. Since fine-tuning LLMs on ToM datasets often degrades their generalization, several inference-time methods have been proposed to enhance ToM in LLMs. However, existing inference-time methods for ToM are specialized for inferring beliefs from contexts involving changes in the world state. In this study, we present a new inference-time method for ToM, Shoes-of-Others (SoO) prefixing, which makes fewer assumptions about contexts and is applicable to broader scenarios. SoO prefixing simply specifies the beginning of LLM outputs with "Let's put ourselves in A's shoes.", where A denotes the target character's name. We evaluate SoO prefixing on two benchmarks that assess ToM in conversational and narrative contexts without chan

Figure 1: Shoes-of-Others prefixing specifies the beginning of outputs and then LLMs generate the continuation. The above example from ToMATO (Shinoda
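The abstract describes SoO prefixing as fixing the beginning of the model's output so the LLM generates only the continuation. A minimal sketch of how one might implement this with a chat API that supports prefilling the assistant turn; the function names and message format below are illustrative assumptions, not code from the paper:

```python
def soo_prefix(character: str) -> str:
    """Build the Shoes-of-Others prefix for a target character A."""
    return f"Let's put ourselves in {character}'s shoes."

def build_messages(context: str, question: str, character: str) -> list[dict]:
    """Assemble a chat prompt whose final assistant turn is prefilled
    with the SoO prefix, so the model continues from that prefix."""
    return [
        {"role": "user", "content": f"{context}\n\nQuestion: {question}"},
        # Prefilled assistant turn: generation starts after this text.
        {"role": "assistant", "content": soo_prefix(character)},
    ]

msgs = build_messages(
    "Sally puts her marble in the basket and leaves. Anne moves it to the box.",
    "Where will Sally look for her marble?",
    "Sally",
)
print(msgs[-1]["content"])  # → Let's put ourselves in Sally's shoes.
```

Whether prefilling is done via an assistant-message prefix (as above) or by concatenating the prefix onto a plain completion prompt depends on the serving API; the key point from the paper is only that the output must begin with the fixed sentence.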
