BERT: A Survey and Experimental Analysis of Deep Bidirectional Transformer Pre-training


Review: Paper Reading

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Instructor Fan, 2020/06/07

Content
Introduction
Related Work
BERT
Experiments
Ablation Studies
Conclusion

Introduction
BERT: Bidirectional Encoder Representations from Transformers
Bidirectional pre-training for language representations
Pre-trained representations reduce the need for heavily-engineered task-specific architectures
Fine-tuning requires little additional task-specific work
BERT advances the state of the art on 11 NLP tasks.
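The bidirectional pre-training above rests on BERT's masked-language-model objective: a fraction of input tokens is selected, and each selected token is replaced by [MASK] 80% of the time, by a random token 10% of the time, and left unchanged 10% of the time. A minimal sketch of that corruption step (function and variable names are illustrative; real BERT operates on WordPiece ids in batches):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style masked-LM corruption (sketch).

    Each position is selected with probability `mask_prob`; a selected
    token becomes [MASK] 80% of the time, a random vocab token 10%,
    and stays unchanged 10%. `labels` records the original token at
    every selected position (None elsewhere), which is what the model
    is trained to predict.
    """
    rng = random.Random(seed)
    corrupted, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # supervision target at this position
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: keep the original token unchanged
    return corrupted, labels
```

The 10% "keep unchanged" case forces the model to maintain a useful representation of every input token, since it cannot tell which positions carry a prediction target.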

Related Work
2.1 Unsupervised Feature-based Approaches (not bidirectional)
Left-to-right language models
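The "not bidirectional" limitation shows up concretely in the attention mask: a left-to-right language model lets position i attend only to positions 0..i, while BERT's Transformer encoder lets every position attend to every other. A small sketch of the two mask patterns (function names are illustrative):

```python
import numpy as np

def causal_mask(n):
    # Left-to-right LM: lower-triangular mask, position i sees only 0..i.
    return np.tril(np.ones((n, n), dtype=bool))

def bidirectional_mask(n):
    # BERT encoder: every position attends to every position.
    return np.ones((n, n), dtype=bool)
```

In scaled dot-product attention, positions where the mask is False are set to -inf before the softmax, so the causal variant can never use right-context tokens.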
