Deep Learning and its Application: Lecture Slides 0525, Gradient Back-Propagation and Neural Networks


Training Techniques: Deep Learning and its Application

Back-Propagation

Back-propagation Algorithm (1)
Review of multi-layer neural networks: the feedforward operation is a chain of function calculations, y = f_L(... f_2(f_1(x))), so each layer's output is the next layer's input.
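As a concrete illustration of this chain view (not from the slides; the layer sizes and random weights below are made-up assumptions), a forward pass is just nested function calls:

```python
# Minimal sketch: the feedforward pass as a chain of function compositions.
import numpy as np

def layer(W, b, h):
    """Return a function computing h(W @ z + b) for input z."""
    return lambda z: h(W @ z + b)

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
identity = lambda a: a

rng = np.random.default_rng(0)
f1 = layer(rng.standard_normal((4, 3)), np.zeros(4), sigmoid)   # hidden layer
f2 = layer(rng.standard_normal((2, 4)), np.zeros(2), identity)  # output layer

x = rng.standard_normal(3)
y = f2(f1(x))   # forward pass = chained function calls
print(y)
```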

Back-propagation Algorithm (2)
Loss function example: square error, L = (1/2)(y - t)^2.
NN example: a simple one-layer linear model, y = w^T x.
So the derivative of the loss function (for a single sample) is dL/dw = (y - t) x.
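A minimal numeric sketch of this single-sample gradient, assuming the standard forms L = (1/2)(y - t)^2 and y = w^T x reconstructed above (the sample values are my own):

```python
import numpy as np

x = np.array([1.0, 2.0])    # input
t = 0.5                     # target
w = np.array([0.3, -0.1])   # weights

y = w @ x                   # linear model output
L = 0.5 * (y - t) ** 2      # square-error loss
grad_w = (y - t) * x        # dL/dw = (y - t) * x, by the chain rule
print(L, grad_w)
```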

Back-propagation Algorithm (3)
General unit activation in a multilayer network: a_j = Σ_i w_ji z_i with z_j = h(a_j), where h is the activation function and z_i is the input/output of the hidden layer.
Forward propagation: calculate a_j and z_j for each unit.
The loss L depends on w_ji only through a_j, so dL/dw_ji = (dL/da_j)(da_j/dw_ji) = δ_j z_i, where δ_j = dL/da_j is the error signal.

Back-propagation Algorithm (4)
Output unit with a linear output function: δ_k = y_k - t_k.
Hidden unit j, which sends inputs to units k: δ_j = h'(a_j) Σ_k w_kj δ_k (check all nodes j connects to, and apply the chain rule).
Update the weights (learning rate η): w_ji ← w_ji - η δ_j z_i. These updates appear in the sketch after the four-step summary below.

Back-propagation Algorithm (5)
The BP algorithm for a multi-layer NN can be decomposed into the following four steps:
1. Feed-forward computation
2. Back-propagation to the output layer
3. Back-propagation to the hidden layer
4. Weight updates
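The four steps map directly onto code. Below is a minimal NumPy sketch for one sigmoid hidden layer and a linear output with square-error loss; the shapes, seed, and learning rate are illustrative assumptions, not the lecture's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
x, t = rng.standard_normal(3), rng.standard_normal(2)   # one training sample
W1, b1 = rng.standard_normal((4, 3)), np.zeros(4)
W2, b2 = rng.standard_normal((2, 4)), np.zeros(2)
eta = 0.1                                               # learning rate

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

# 1. Feed-forward computation
a1 = W1 @ x + b1
z1 = sigmoid(a1)
y = W2 @ z1 + b2                     # linear output unit

# 2. Back-propagation to the output layer: delta_k = y_k - t_k
delta2 = y - t

# 3. Back-propagation to the hidden layer:
#    delta_j = h'(a_j) * sum_k w_kj delta_k, with h'(a) = z(1 - z) for sigmoid
delta1 = z1 * (1.0 - z1) * (W2.T @ delta2)

# 4. Weight updates: w <- w - eta * delta * input
W2 -= eta * np.outer(delta2, z1); b2 -= eta * delta2
W1 -= eta * np.outer(delta1, x);  b1 -= eta * delta1
```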

Activation Function -- Sigmoid
Sigmoid function: σ(x) = 1 / (1 + e^(-x)), squashing inputs into (0, 1).

Activation Function -- Sigmoid
Sigmoids saturate and kill gradients: when the neuron's activation saturates at either tail, 0 or 1, the gradient in these regions is almost zero.
Sigmoid outputs are not zero-centered: if the data coming into a neuron is always positive, then the gradients on the weights will all become positive or all negative (the zigzag problem).
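A quick numeric check of the saturation claim, using the standard derivative σ'(x) = σ(x)(1 - σ(x)) (the probe points are my own):

```python
import numpy as np

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
for x in [0.0, 2.0, 5.0, 10.0]:
    s = sigmoid(x)
    print(f"x={x:5.1f}  sigma={s:.6f}  sigma'={s * (1 - s):.6f}")
# sigma'(0) = 0.25, but sigma'(10) ~ 4.5e-5: almost no gradient flows back.
```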

Activation Function -- Tanh
Tanh function: tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)); outputs lie in (-1, 1) and are zero-centered.
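One way to see that tanh keeps the sigmoid's shape while fixing the centering is the identity tanh(x) = 2σ(2x) - 1, verified numerically below (the test grid is arbitrary):

```python
import numpy as np

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
x = np.linspace(-3, 3, 7)
print(np.allclose(np.tanh(x), 2 * sigmoid(2 * x) - 1))   # True
```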

Activation Function -- ReLU
ReLU (Rectified Linear Unit): f(x) = max(0, x).
Q: Is ReLU a linear or a non-linear activation function?

Activation Function -- ReLU
ReLU greatly accelerates the convergence of stochastic gradient descent compared to the sigmoid/tanh functions. ReLU can be implemented by simply thresholding a matrix of activations at zero.
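A sketch of that thresholding implementation in NumPy (the example matrix is made up):

```python
import numpy as np

a = np.array([[-1.5, 0.3], [2.0, -0.2]])
relu = np.maximum(0.0, a)       # elementwise max(0, x)
print(relu)                     # [[0.  0.3], [2.  0. ]]
```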
