Discussion of "Least Angle Regression" by Efron et al.
arXiv:math/0406470v1 [math.ST] 23 Jun 2004

The Annals of Statistics
2004, Vol. 32, No. 2, 469–475
DOI: 10.1214/009053604000000067
© Institute of Mathematical Statistics, 2004

DISCUSSION OF "LEAST ANGLE REGRESSION" BY EFRON ET AL.

By Saharon Rosset and Ji Zhu

IBM T. J. Watson Research Center and Stanford University

1. Introduction. We congratulate the authors on their excellent work. The paper combines elegant theory and useful practical results in an intriguing manner. The LAR–Lasso–boosting relationship opens the door for new insights on existing methods' underlying statistical mechanisms and for the development of new and promising methodology. Two issues in particular have captured our attention, as their implications go beyond the squared error loss case presented in this paper, into wider statistical domains: robust fitting, classification, machine learning and more. We concentrate our discussion on these two results and their extensions.

2. Piecewise linear regularized solution paths. The first issue is the piecewise linear solution paths to regularized optimization problems. As the discussion paper shows, the path of optimal solutions to the "Lasso" regularized optimization problem

    \hat{\beta}(\lambda) = \arg\min_{\beta} \| y - X\beta \|_2^2 + \lambda \| \beta \|_1    (2.1)

is piecewise linear as a function of \lambda; that is, there exist \infty > \lambda_0 > \lambda_1 > \cdots > \lambda_m = 0 such that for all \lambda \ge 0 with \lambda_k \ge \lambda \ge \lambda_{k+1}, we have

    \hat{\beta}(\lambda) = \hat{\beta}(\lambda_k) - (\lambda - \lambda_k)\gamma_k.

In the discussion paper's terms, \gamma_k is the "LAR" direction for the kth step of the LAR–Lasso algorithm. This property allows the LAR–Lasso algorithm to generate the whole path of Lasso solutions, \hat{\beta}(\lambda), for "practically" the cost of one least squares calculation on the data (this is exactly the case for LAR but not for LAR–Lasso, which may be significantly more computationally intensive on some data sets). The important practical consequence is that it is not necessary

This is an electronic reprint of the original article published by the In
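The piecewise-linearity property above can be illustrated numerically with scikit-learn's `lars_path`, which implements the LAR–Lasso algorithm: the solution at any intermediate \lambda is recovered exactly by linear interpolation between the breakpoint solutions. This is only an illustrative sketch on synthetic data; the data set, variable names, and the constant 1/n rescaling of the penalty used by scikit-learn (which does not affect piecewise linearity) are our choices, not part of the paper.

```python
import numpy as np
from sklearn.linear_model import lars_path

# Synthetic data for illustration (sizes and coefficients chosen arbitrarily).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 1.5])
y = X @ beta_true + 0.5 * rng.standard_normal(50)

# method="lasso" follows the LAR-Lasso path: `alphas` holds the breakpoints
# lambda_0 > lambda_1 > ... > lambda_m = 0 (up to scikit-learn's scaling of
# the penalty), and coefs[:, k] is the optimal coefficient vector at
# breakpoint k. One pass yields the entire regularization path.
alphas, active, coefs = lars_path(X, y, method="lasso")

# Piecewise linearity: between consecutive breakpoints the solution is an
# affine function of lambda, so interpolating the breakpoint solutions
# reproduces the exact solution at any intermediate lambda.
lam = 0.5 * (alphas[0] + alphas[1])
t = (alphas[0] - lam) / (alphas[0] - alphas[1])
beta_lam = (1 - t) * coefs[:, 0] + t * coefs[:, 1]

# Cross-check: ask lars_path to terminate the path exactly at lam.
alphas_chk, _, coefs_chk = lars_path(X, y, method="lasso", alpha_min=lam)
print(np.allclose(beta_lam, coefs_chk[:, -1]))  # True
```

The cross-check works because `alpha_min` makes `lars_path` interpolate its final point to exactly that penalty value, so agreement with the hand-computed interpolation confirms that the path is linear between knots.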

