Deep Learning and Its Applications: Slides 0405, Self-Supervised Learning (Contrastive and Reconstruction-Based)



Self-supervised Learning: Contrastive Learning and Reconstruction-Based Learning

Self-supervised Learning
  • Pretext tasks
  • Contrastive-based learning
  • Generative / reconstruction-based learning
  • Popular period: 2015-2022

Outline (SJTU Deep Learning Lecture 3)
  • Contrastive-based learning
  • Generative / reconstruction-based learning

Contrastive Learning (InfoNCE, 2018)
  • Attract the features of positive samples
  • Repel the features of negative samples
  • Data augmentations
Reference: Representation Learning with Contrastive Predictive Coding, arXiv preprint arXiv:1807.03748 (DeepMind)
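The InfoNCE objective can be written as L = -log[exp(q·k+/τ) / Σ_i exp(q·k_i/τ)], where the sum runs over the positive key and all negative keys. A minimal NumPy sketch (the function name and array-based interface are illustrative, not the paper's implementation):

```python
import numpy as np

def info_nce_loss(q, k_pos, k_negs, temperature=0.07):
    """InfoNCE: -log( exp(q.k+/t) / (exp(q.k+/t) + sum_i exp(q.k_i/t)) ).

    q:      query feature vector, shape (d,)
    k_pos:  positive key, shape (d,)
    k_negs: negative keys, shape (n, d)
    """
    # L2-normalize all features, as is standard in contrastive learning.
    q = q / np.linalg.norm(q)
    k_pos = k_pos / np.linalg.norm(k_pos)
    k_negs = k_negs / np.linalg.norm(k_negs, axis=1, keepdims=True)
    # Similarity logits; the positive sits at index 0.
    logits = np.concatenate([[q @ k_pos], k_negs @ q]) / temperature
    # Numerically stable log-softmax of the positive entry.
    logits = logits - logits.max()
    return -(logits[0] - np.log(np.exp(logits).sum()))
```

With an aligned positive and orthogonal negatives the loss is near zero; a negative identical to the positive pushes it up to log 2, matching the attract/repel intuition above.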

Contrastive Learning: MoCo (CVPR 2020)

Components:
1. A stochastic data augmentation module that randomly transforms any given data sample into two correlated views of the same sample, x_i and x_j, considered a positive pair.
2. Two neural-network-based encoders, q(·) and k(·), that extract representation vectors from the augmented samples; the extracted feature pair is denoted (q, k+), where k+ is the positive sample.
3. A memory bank that stores a set of keys {k1, k2, ...} (initialized randomly from a Gaussian distribution). For each query q, every pair (q, k_i) is treated as a negative pair.
4. A contrastive loss defined for a contrastive prediction task; this loss updates the parameters of the q(·) encoder.
5. A momentum-based optimization method that updates the k(·) encoder.

Positives: the images augmented from the same image are recognized as positives, and all other images are recognized as negatives.

Objective function: the InfoNCE loss over (q, k+) and the stored negative keys.
Optimization method: θ_k ← m·θ_k + (1 − m)·θ_q. Here m is a momentum coefficient (default 0.999).
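The momentum update above can be sketched in a few lines of NumPy, treating each encoder as a dict of parameter arrays (the function name and dict-based interface are illustrative assumptions, not MoCo's actual code):

```python
import numpy as np

def momentum_update(theta_q, theta_k, m=0.999):
    """MoCo-style momentum update: theta_k <- m*theta_k + (1 - m)*theta_q.

    theta_q, theta_k: dicts mapping parameter names to NumPy arrays.
    Only the query encoder q(.) is updated by gradients; the key encoder
    k(.) trails it slowly, which keeps the stored keys consistent.
    """
    return {name: m * theta_k[name] + (1.0 - m) * theta_q[name]
            for name in theta_k}
```

With the default m = 0.999 the key encoder moves only 0.1% of the way toward the query encoder per step, which is why the keys in the memory bank stay comparable across iterations.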

Contrastive Learning: MoCo (CVPR 2020)

Components:
1. A stochastic data augmentation module that randomly transforms any given data sample into two correlated views of the same example, denoted x̃_i and x̃_j, which are considered a positive pair.
2. A neural-network-based encoder f(·) that extracts representation vectors from augmented data samples.
3. A small neural-network projection head g(·) that maps representations to the space where the contrastive loss is applied.
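The projection head g(·) is typically a small two-layer MLP applied on top of the encoder output h = f(x̃). A minimal NumPy sketch (the function name and explicit weight arguments are illustrative assumptions):

```python
import numpy as np

def projection_head(h, W1, b1, W2, b2):
    """g(h) = W2 @ relu(W1 @ h + b1) + b2.

    Maps the encoder representation h into the (usually lower-dimensional)
    space where the contrastive loss is computed. After pre-training, g is
    discarded and h itself is used for downstream tasks.
    """
    hidden = np.maximum(W1 @ h + b1, 0.0)  # ReLU nonlinearity
    return W2 @ hidden + b2
```

Separating f and g matters: the contrastive loss shapes g's output space, while f's representation h stays more general for transfer.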
