
Answer booklet for students

November 21, 2024


Copyright © 2023 Simon Prince.

Answer booklet

This document accompanies the book Understanding Deep Learning. It contains answers to a selected subset of the problems at the end of each chapter of the main book. The remaining answers are available only to instructors via the MIT Press.

This booklet has not yet been checked very carefully. I really need your help in this regard, and I'd be very grateful if you would mail me at udlbookmail@ if you cannot understand the text or if you think that you have found a mistake. Suggestions for extra problems will also be gratefully received!

Simon Prince

November 21, 2024


Chapter 2

Supervised learning

Problem 2.1 To walk “downhill” on the loss function (equation 2.5), we measure its gradient with respect to the parameters $\phi_0$ and $\phi_1$. Calculate expressions for the slopes $\partial L/\partial \phi_0$ and $\partial L/\partial \phi_1$.
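
A worked sketch, assuming equation 2.5 is the usual least-squares loss $L[\boldsymbol{\phi}] = \sum_{i=1}^{I} (\phi_0 + \phi_1 x_i - y_i)^2$: differentiating term by term gives

$$\frac{\partial L}{\partial \phi_0} = \sum_{i=1}^{I} 2\,(\phi_0 + \phi_1 x_i - y_i), \qquad \frac{\partial L}{\partial \phi_1} = \sum_{i=1}^{I} 2\,x_i\,(\phi_0 + \phi_1 x_i - y_i).$$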

Problem 2.2 Show that we can find the minimum of the loss function in closed form by setting the expressions for the derivatives from problem 2.1 to zero and solving for $\phi_0$ and $\phi_1$.
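
A sketch of the closed-form solution under the same least-squares assumption: setting both derivatives to zero yields a pair of linear “normal” equations, whose solution is

$$\phi_1 = \frac{\sum_{i}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i}(x_i - \bar{x})^2}, \qquad \phi_0 = \bar{y} - \phi_1\bar{x},$$

where $\bar{x}$ and $\bar{y}$ denote the means of the inputs and outputs, respectively.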

Problem 2.3 Consider reformulating linear regression as a generative model, so we have $x = \text{g}[y, \boldsymbol{\phi}] = \phi_0 + \phi_1 y$. What is the new loss function? Find an expression for the inverse function $y = \text{g}^{-1}[x, \boldsymbol{\phi}]$ that we would use to perform inference. Will this model make the same predictions as the discriminative version for a given training dataset $\{x_i, y_i\}$? One way to establish this is to write code that fits a line to three data points using both methods and see if the result is the same.

Answer

The new loss function is the sum of squared deviations in $x$ rather than in $y$:

$$L[\boldsymbol{\phi}] = \sum_{i=1}^{I} \bigl(\phi_0 + \phi_1 y_i - x_i\bigr)^2.$$

We can trivially solve for $y$ by rearranging the generative model $x = \phi_0 + \phi_1 y$, giving the inverse function

$$y = \text{g}^{-1}[x, \boldsymbol{\phi}] = \frac{x - \phi_0}{\phi_1}.$$
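
As the problem suggests, we can check numerically whether the two formulations agree. The following is a minimal sketch (the three data points and variable names are illustrative choices, not from the book): it fits the discriminative model $y = \phi_0 + \phi_1 x$ and the generative model $x = \phi_0 + \phi_1 y$ by least squares, then compares their predictions at a test input. Unless the points are exactly collinear, the two fitted lines differ, so in general the generative reformulation does not make the same predictions as the discriminative one.

import numpy as np

# Three illustrative, non-collinear data points.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 4.0])

def least_squares_fit(inp, out):
    # Fit out ~ phi0 + phi1 * inp by minimizing the sum of squared errors
    # (the closed-form solution from problem 2.2).
    phi1 = np.sum((inp - inp.mean()) * (out - out.mean())) / np.sum((inp - inp.mean()) ** 2)
    phi0 = out.mean() - phi1 * inp.mean()
    return phi0, phi1

# Discriminative model: y = phi0 + phi1 * x.
phi0_d, phi1_d = least_squares_fit(x, y)

# Generative model: x = phi0 + phi1 * y, inverted for inference.
phi0_g, phi1_g = least_squares_fit(y, x)

x_test = 1.5
y_disc = phi0_d + phi1_d * x_test        # direct prediction
y_gen = (x_test - phi0_g) / phi1_g       # prediction via the inverse function
print(f"discriminative prediction: {y_disc:.4f}")   # 3.4167 for these points
print(f"generative prediction:     {y_gen:.4f}")    # 3.4444 for these points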
