- 2026-02-26 发布于山东
Recurrent Neural Network
Deep Learning and its Application
The Memory Problem
The contribution of the input at the first time step to the output will decrease exponentially.
Long-Term Dependency
When t is large, could an RNN learn long-term dependencies? In theory, yes. In practice, no! Problem: vanishing gradient.
SJTU Deep Learning Lecture
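The vanishing-gradient claim can be illustrated numerically (a sketch added here, not from the slides): in a linear RNN the gradient through T steps is a product of T copies of the recurrent weight matrix, so when its spectral norm is below 1 the product shrinks exponentially in T.

```python
import numpy as np

# Sketch (not from the slides): in a linear RNN, the gradient of the loss
# at step T w.r.t. the hidden state at step 1 is a product of T copies of
# the recurrent weight matrix W_hh. If its largest singular value is
# below 1, that product -- and hence the first input's contribution --
# decays exponentially in T.
W_hh = 0.9 * np.eye(4)       # recurrent weights with spectral norm 0.9

grad = np.eye(4)
for _ in range(50):          # backpropagate through 50 time steps
    grad = grad @ W_hh.T

print(np.linalg.norm(grad))  # ~0.9**50 * 2 ≈ 0.01: vanished
```

With a spectral norm above 1 the same loop explodes instead, which is the other half of the standard RNN training difficulty.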
Long Short-Term Memory (LSTM)
First proposed in 1997. (Graph from Wikipedia.)
Long Short-Term Memory (LSTM)
(Graph from http://colah.github.io/posts/2015-08-Understanding-LSTMs/)
Long Short-Term Memory (LSTM)
Q: What's the input and output of an LSTM module?
LSTM: Inputs and Outputs
h_t, c_t = LSTM(x_t, h_{t-1}, c_{t-1})
Gates
Gates are a way to optionally let information through. They are composed of a sigmoid neural net layer and a pointwise multiplication operation.
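In symbols (standard notation, added for clarity since the slide's diagram is not in the text): a gate computes a sigmoid activation and multiplies it elementwise into the value being gated, so each component passes a fraction between 0 and 1 of the signal.

```latex
g = \sigma(W x + b), \qquad v_{\text{gated}} = g \odot v, \qquad g \in (0,1)
```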
LSTM: Forget Gate
The forget gate controls whether to “forget” the cell memory.
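The slide's formula was an image; in the standard notation (as in the Colah post cited above) the forget gate reads:

```latex
f_t = \sigma\!\left(W_f \cdot [h_{t-1}, x_t] + b_f\right)
```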
LSTM: Cell State
Use the inputs to calculate the new cell state candidate.
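In the same standard notation (reconstructed, since the slide formula is missing from the text), the candidate cell state is:

```latex
\tilde{c}_t = \tanh\!\left(W_c \cdot [h_{t-1}, x_t] + b_c\right)
```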
LSTM: Input Gate
The input gate controls whether to “incorporate” the newly calculated cell memory.
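The input gate (standard form, reconstructed) has the same shape as the forget gate, with its own weights:

```latex
i_t = \sigma\!\left(W_i \cdot [h_{t-1}, x_t] + b_i\right)
```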
LSTM: Cell
The cell state is a combination of memory and update.
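"Memory + update" in symbols (standard form): the forget gate scales the old cell state and the input gate scales the new candidate.

```latex
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t
```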
LSTM: Output Gate
The output gate controls whether to “output” to the outside world.
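In the standard notation (reconstructed), the output gate filters a squashed copy of the cell state to produce the hidden state:

```latex
o_t = \sigma\!\left(W_o \cdot [h_{t-1}, x_t] + b_o\right), \qquad h_t = o_t \odot \tanh(c_t)
```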
LSTM Formulation Wrap-up
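The full formulation can be wrapped up as one step function. Below is a minimal NumPy sketch (my own, not from the slides; the stacked weight layout and variable names are a common convention, not the lecture's):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: h_t, c_t = LSTM(x_t, h_{t-1}, c_{t-1}).

    W has shape (4*H, D+H), stacking the forget / input / candidate /
    output weights; this stacking is an implementation choice, not
    something specified on the slides.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * H:1 * H])        # forget gate
    i = sigmoid(z[1 * H:2 * H])        # input gate
    c_tilde = np.tanh(z[2 * H:3 * H])  # new cell state candidate
    o = sigmoid(z[3 * H:4 * H])        # output gate
    c = f * c_prev + i * c_tilde       # cell state: memory + update
    h = o * np.tanh(c)                 # hidden state / output
    return h, c

# Usage: one step with input size D=3 and hidden size H=5.
D, H = 3, 5
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), W, b)
print(h.shape, c.shape)  # (5,) (5,)
```

Since h_t = o_t ⊙ tanh(c_t) with o_t in (0,1), every component of h stays in (-1, 1), while the cell state c is unbounded.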
LSTM: Reason for Success
Most important: the direct self-link in cells. If there is no forget gate (f_t = 1), then the error flows back through the cell state undiminished.
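Concretely (a reconstruction of the missing slide formula, using the standard notation): with f_t = 1 the cell recursion becomes additive, so along the direct cell-to-cell path the Jacobian is the identity and the gradient neither vanishes nor explodes.

```latex
c_t = c_{t-1} + i_t \odot \tilde{c}_t
\quad\Longrightarrow\quad
\frac{\partial c_t}{\partial c_{t-1}} = I \ \text{(along the direct cell path)}
```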
Variant: Peephole LSTM
Peephole LSTM: let the gate layers look at the cell state.
“Recurrent nets that time and count,” IJCNN 2000
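In the peephole variant (from the IJCNN 2000 paper cited on the slide), the gates additionally receive the cell state; in the notation above:

```latex
f_t = \sigma\!\left(W_f \cdot [c_{t-1}, h_{t-1}, x_t] + b_f\right), \quad
i_t = \sigma\!\left(W_i \cdot [c_{t-1}, h_{t-1}, x_t] + b_i\right), \quad
o_t = \sigma\!\left(W_o \cdot [c_t, h_{t-1}, x_t] + b_o\right)
```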
Is LSTM the Only Solution?
No! For example, the Gated Recurrent Unit (GRU) has similar effects. Direct memory! Gates!
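For reference, the standard GRU equations (Cho et al., 2014; added here, as the slide's formula image is not in the text): an update gate z_t and a reset gate r_t play the roles of the LSTM gates, and the hidden state itself carries the direct memory.

```latex
z_t = \sigma\!\left(W_z \cdot [h_{t-1}, x_t]\right), \qquad
r_t = \sigma\!\left(W_r \cdot [h_{t-1}, x_t]\right)
```

```latex
\tilde{h}_t = \tanh\!\left(W \cdot [r_t \odot h_{t-1}, x_t]\right), \qquad
h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
```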
Extension: Bi-directional RNN/LSTM
A uni-directional RNN/LSTM can be extended to process the sequence in both directions.
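The bi-directional construction, in the standard formulation (added here, since the slide text is cut off): run one LSTM forward and an independent one backward over the sequence, then combine (typically concatenate) their hidden states at each step.

```latex
\overrightarrow{h}_t = \mathrm{LSTM}_f\!\left(x_t, \overrightarrow{h}_{t-1}\right), \qquad
\overleftarrow{h}_t = \mathrm{LSTM}_b\!\left(x_t, \overleftarrow{h}_{t+1}\right), \qquad
h_t = [\overrightarrow{h}_t; \overleftarrow{h}_t]
```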