Review: Paper Reading
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Teacher Fan (范老师), 2020/06/07
Contents: Introduction, Related Work, BERT, Experiments, Ablation Studies, Conclusion
Introduction
BERT: Bidirectional Encoder Representations from Transformers
- Bidirectional pre-training of language representations
- Pre-trained representations reduce the need for many heavily engineered task-specific architectures
- Fine-tuning requires little additional task-specific work (see the sketch below)
- BERT advances the state of the art on 11 NLP tasks
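To make the fine-tuning point above concrete, here is a minimal sketch of fine-tuning a pre-trained BERT model for sentence classification. It assumes the Hugging Face transformers library; the model name ("bert-base-uncased"), the toy input, the label, and the learning rate are illustrative assumptions, not details taken from the slides or the paper.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load a pre-trained BERT checkpoint plus a small classification head
# (the checkpoint name is an assumption for illustration).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# A toy single-sentence classification example.
batch = tokenizer(
    ["the movie was great"],
    padding=True,
    truncation=True,
    return_tensors="pt",
)
labels = torch.tensor([1])

# One fine-tuning step: all pre-trained parameters are updated end-to-end,
# with only the classification head added on top of BERT.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
```

The design point the slide makes is that no task-specific architecture is engineered here: the same pre-trained encoder is reused, and only a thin output layer changes per task.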
Related Work
2.1 Unsupervised Feature-based Approaches (not bidirectional)
- Left-to-right language models