Deep Learning and its Application (深度学习及其应用), Slides 21: Recurrent Neural Networks


Recurrent Neural Network
Deep Learning and its Application

Why do We Need RNN?
DNN structure is great for input of fixed size.
SJTU Deep Learning Lecture.

Why do We Need RNN?
Q: What if we want to make a prediction from a sequence of events? ... Today's weather?

Why do We Need RNN?
Solution 1: Only consider the most recent several days: ... Today's weather?
Solution 2: Define a way that "encodes" the whole history. RNN does that in a neural-network style.
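The "encode the whole history" idea can be sketched in a few lines. This is a minimal illustration, not from the slides: all names (`W_xh`, `W_hh`, `encode_history`) and shapes are assumptions. The point is that one shared update rule folds a sequence of any length into a fixed-size state vector.

```python
import numpy as np

# Minimal sketch (assumed names/shapes): a recurrent state h "encodes"
# the whole input history x_1..x_t in a fixed-size vector.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5                       # e.g. 3 weather features per day
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))

def encode_history(xs):
    """Fold an arbitrary-length sequence into one fixed-size state vector."""
    h = np.zeros(n_hidden)
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h)    # the same update at every step
    return h

days = rng.normal(size=(7, n_in))           # a week of observations
h = encode_history(days)                    # h.shape == (n_hidden,)
```

Contrast with Solution 1: a fixed window of k days needs an input layer of size k * n_in and forgets everything older, whereas the recurrent state is the same size regardless of how long the history is.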

Serial Order
Michael Jordan, Serial order: A parallel distributed processing approach (Tech. Rep. No. 8604). San Diego: University of California, Institute for Cognitive Science.
"It is my view that many of these problems disappear when a clear distinction is made between the state of the system and the output of the system." - Michael I. Jordan, Serial order: a parallel distributed processing approach. Technical report, June 1985 - March 1986.

Elman Network
Jeffrey Elman, Finding Structure in Time, 1990.

Early Study (Elman Network)
Serial order: output-to-state recurrence. Elman network (trained using backpropagation): state-to-state recurrence.
[1] Michael Jordan, Serial order: A parallel distributed processing approach (Tech. Rep. No. 8604). San Diego: University of California, Institute for Cognitive Science.
[2] Jeffrey Elman, Finding Structure in Time, 1990.

Serial Order (remake version)
Dashed line: from time t-1 to t. Called "state units" in Jordan's paper.
Michael Jordan, Serial order: A parallel distributed processing approach (Tech. Rep. No. 8604). San Diego: University of California, Institute for Cognitive Science.
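The Jordan-style wiring can be sketched as one time step. This is an assumed minimal reconstruction (names `W_ih`, `W_sh`, `W_ho`, `jordan_step` are mine, not from the slides): the state units carry the previous *output* back as extra input, which is the dashed t-1 to t connection.

```python
import numpy as np

# Sketch of a Jordan-style step (assumed names/shapes): the "state units"
# feed the previous output back into the hidden layer at the next step.
rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 2, 4, 1
W_ih = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_sh = rng.normal(scale=0.1, size=(n_hidden, n_out))   # state -> hidden
W_ho = rng.normal(scale=0.1, size=(n_out, n_hidden))

def jordan_step(x, state):
    """One step: hidden sees the input plus the previous output (the state)."""
    h = np.tanh(W_ih @ x + W_sh @ state)
    y = 1.0 / (1.0 + np.exp(-(W_ho @ h)))  # sigmoid output
    return y, y                            # the new state IS the output

state = np.zeros(n_out)
for x in rng.normal(size=(5, n_in)):       # dashed line: state from t-1 to t
    y, state = jordan_step(x, state)
```

This makes Jordan's distinction concrete: the output and the state here happen to share values, but they play different roles, and the state is what gives the network memory.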

Elman Network (remake version)
Dashed line: from time t-1 to t.
Jeffrey Elman, Finding Structure in Time, 1990.
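The Elman variant differs only in what the dashed connection copies. Again a minimal assumed sketch (names `W_ch`, `elman_step` are mine): the context units hold a copy of the previous *hidden* layer rather than the previous output.

```python
import numpy as np

# Sketch of an Elman-style step (assumed names/shapes): context units hold
# a copy of the previous hidden layer, fed back at the next step.
rng = np.random.default_rng(2)
n_in, n_hidden, n_out = 2, 4, 1
W_ih = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_ho = rng.normal(scale=0.1, size=(n_out, n_hidden))

def elman_step(x, context):
    """One step: hidden sees the input plus last step's hidden activations."""
    h = np.tanh(W_ih @ x + W_ch @ context)
    y = 1.0 / (1.0 + np.exp(-(W_ho @ h)))
    return y, h                            # the new context is the hidden layer

context = np.zeros(n_hidden)
for x in rng.normal(size=(5, n_in)):       # dashed line: hidden from t-1 to t
    y, context = elman_step(x, context)
```

Because the recurrence is state-to-state rather than output-to-state, the memory is as wide as the hidden layer instead of the output layer, which is the form modern RNNs inherit.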

Revisit the XOR Problem
How to make it a temporal learning task? Elman's way: turn XY-Z into the stream [X Y Z]. X and Y are the two input bits; Z is the output bit, Z = XOR(X, Y).
Truth table: (0,0) -> 0; (1,0) -> 1; (1,1) -> 0; (0,1) -> 1.
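The construction above can be sketched as a data generator. This is an assumed illustration of the idea (the function name `xor_stream` is mine): concatenate many [X, Y, Z] triples into one long bit stream, then train the network to predict the next bit at every position; only every third bit (Z) is predictable from what came before.

```python
import random

# Sketch of Elman-style temporal XOR data (assumed construction): a single
# bit stream of concatenated [X, Y, Z] triples with Z = XOR(X, Y).
random.seed(0)

def xor_stream(n_triples):
    """Build a next-bit-prediction stream; X and Y are random, Z = X xor Y."""
    bits = []
    for _ in range(n_triples):
        x, y = random.randint(0, 1), random.randint(0, 1)
        bits += [x, y, x ^ y]              # Z = XOR(X, Y)
    return bits

stream = xor_stream(4)                     # 4 triples -> 12 bits
```

On such a stream, prediction error should stay near chance on the X and Y positions and drop on the Z positions, which is how the temporal network's success on XOR is measured.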

The Results of the XOR RNN Test
[Figure: results of the XOR RNN test]

Recurrent Neural Network (RNN)
