BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

Mike Lewis*, Yinhan Liu*, Naman Goyal*, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer

Facebook AI
mikelewis,yinhanliu,naman@
Abstract

We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder) and GPT (with the left-to-right decoder).

1 Introduction

… masked tokens are predicted (Yang et al., 2019), and the available context for replacing masked tokens (Dong et al., 2019). However, these methods typically focus on particular types of end tasks (e.g. span prediction, generation, etc.), limiting their applicability.

In this paper, we present BART, which pre-trains a model combining Bidirectional and Auto-Regressive Transformers. BART is a denoising autoencoder built with a sequence-to-sequence model that is applicable to a very wide range of end tasks. Pretraining has two stages: (1) text is corrupted with an arbitrary noising function, and (2) a sequence-to-sequence model is learned to reconstruct the original text.
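To make the two-stage recipe concrete, below is a minimal sketch of one noising function the paper describes, text infilling, in which spans with Poisson-distributed lengths are each replaced by a single mask token. This is an illustrative approximation, not the authors' implementation; the function name, its parameters, and the whitespace tokenization are our own.

    import numpy as np

    MASK = "<mask>"

    def text_infilling(tokens, mask_ratio=0.3, poisson_lambda=3.0, seed=0):
        """Corrupt a token list by replacing random spans with one <mask>.
        Span lengths are drawn from Poisson(poisson_lambda); masking stops
        once roughly `mask_ratio` of the original tokens are removed."""
        rng = np.random.default_rng(seed)
        out = list(tokens)
        budget = int(len(tokens) * mask_ratio)
        while budget > 0 and len(out) > 1:
            span = int(np.clip(rng.poisson(poisson_lambda), 1,
                               min(budget, len(out) - 1)))
            start = int(rng.integers(0, len(out) - span + 1))
            out[start:start + span] = [MASK]  # whole span becomes one token
            budget -= span
        return out

    original = "the quick brown fox jumps over the lazy dog".split()
    corrupted = text_infilling(original)
    # Seq2seq pretraining pair: the encoder reads `corrupted`, and the
    # decoder is trained to regenerate `original` left to right.
    print(corrupted, "->", original)

In the paper's setup, the corrupted sequence is consumed by the bidirectional encoder while the autoregressive decoder is trained with a reconstruction (cross-entropy) loss against the original text.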
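Since the pretrained denoiser is what later gets fine-tuned for downstream tasks, it can be instructive to query it directly. The snippet below uses the third-party Hugging Face `transformers` reimplementation of BART (our addition, not part of the paper) to reconstruct a masked span:

    # Requires: pip install transformers torch
    from transformers import BartForConditionalGeneration, BartTokenizer

    tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
    model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

    # Feed a corrupted sentence to the encoder; the decoder then
    # generates a reconstruction of the original text.
    corrupted = "BART is trained by corrupting text and <mask> the original text."
    inputs = tokenizer(corrupted, return_tensors="pt")
    output_ids = model.generate(inputs["input_ids"], num_beams=4, max_length=30)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))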