
WJ-CH05-模式识别-模式选择 英文版 演示教学.ppt (WJ-CH05 Pattern Recognition – Feature Selection, English-language teaching slides)


(This document has 47 pages.)

模式识别 Pattern Recognition — Chapter 5: FEATURE SELECTION
Wang Jie (PhD, Professor, Doctoral Supervisor), School of Electrical Engineering, Zhengzhou University, wj@zzu.edu.cn

The goals: select the "optimum" number l of features, and select the "best" l features.

A large l has a three-fold disadvantage: high computational demands, low generalization performance, and poor error estimates.

Given N training points, l must be large enough to learn what makes classes different and what makes patterns in the same class similar.

Sequential backward selection. Start from the full set of N features and, at each step, discard the feature whose removal degrades the criterion C the least, until l features remain.

Sequential forward selection. Here the reverse procedure is followed. Compute C for each individual feature and select the "best" one, say x1. For all possible 2-D combinations containing x1, i.e., [x1, x2], [x1, x3], [x1, x4], compute C and choose the best, say [x1, x3]. For all possible 3-D combinations containing [x1, x3], e.g., [x1, x3, x2], etc., compute C and choose the best one. The procedure is repeated until the "best" vector with l features has been formed. This is also a suboptimal technique, requiring lN − l(l−1)/2 evaluations of C.

Floating Search Methods. The above two procedures suffer from the nesting effect: once a bad choice has been made, there is no way to reconsider it in the following steps. In the floating search methods, one is given the opportunity to reconsider a previously discarded feature, or to discard a feature that was previously chosen. The method is still suboptimal, but it leads to improved performance at the expense of complexity.

Remarks: Besides suboptimal techniques, optimal searching techniques can also be used, provided that the optimizing cost has certain properties, e.g., monotonicity. Instead of using a class-separability measure (filter techniques) or using the classifier directly (wrapper techniques), one can modify the cost function of the classifier so that feature selection and classifier design are performed in a single step (embedded methods). For the choice of the separability measure, a variety of costs has been proposed, including information-theoretic ones.
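The greedy forward procedure can be sketched in a few lines of Python. This is a minimal sketch: the function name `sequential_forward_selection` and the additive toy criterion `C` are assumptions for illustration only; in practice C would be a class-separability measure evaluated on training data.

```python
def sequential_forward_selection(features, l, C):
    """Greedy forward selection: start empty and repeatedly add the
    feature that maximizes the separability criterion C on the
    enlarged subset, until l features have been chosen."""
    selected, remaining = [], list(features)
    while len(selected) < l and remaining:
        best = max(remaining, key=lambda f: C(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy additive criterion (illustration only): each feature contributes
# a fixed weight, so C(subset) is just the sum of the weights.
weights = {0: 0.1, 1: 0.9, 2: 0.5, 3: 0.2}
def C(subset):
    return sum(weights[f] for f in subset)

print(sequential_forward_selection([0, 1, 2, 3], 2, C))  # [1, 2]
```

For N = 4 and l = 2 the search evaluates C seven times (4 single features, then 3 pairs), matching the lN − l(l−1)/2 count: 2·4 − 1 = 7.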
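The floating idea can also be sketched (in the spirit of sequential floating forward selection): after every forward step, features are conditionally discarded again whenever the reduced set beats the best subset previously found at that size. The criterion table `score` below is fabricated for illustration so that features 'a' and 'b' are weak alone but strong together, which is exactly where the nesting effect traps plain forward selection.

```python
def floating_forward_selection(features, l, C):
    """Minimal floating-search sketch: a forward step followed by
    conditional backward steps that undo earlier choices whenever the
    reduced subset improves on the best one seen at that size."""
    selected, remaining = [], list(features)
    best = {}  # best criterion value found so far for each subset size
    while len(selected) < l and remaining:
        # Forward step: add the feature that maximizes C.
        f = max(remaining, key=lambda x: C(selected + [x]))
        selected.append(f)
        remaining.remove(f)
        best[len(selected)] = max(best.get(len(selected), float("-inf")),
                                  C(selected))
        # Floating (backward) step: reconsider earlier choices.
        while len(selected) > 2:
            worst = max(selected,
                        key=lambda x: C([s for s in selected if s != x]))
            if worst == f:   # the newest feature is the weakest link:
                break        # keep it and resume the forward search
            reduced = [s for s in selected if s != worst]
            if C(reduced) > best.get(len(reduced), float("-inf")):
                selected = reduced
                remaining.append(worst)
                best[len(reduced)] = C(reduced)
            else:
                break
    return selected

# Fabricated criterion table (frozenset -> separability score),
# chosen so 'a' and 'b' are weak alone but strong together.
score = {
    frozenset("a"): 0.3, frozenset("b"): 0.3,
    frozenset("c"): 0.8, frozenset("d"): 0.2,
    frozenset("ab"): 1.0, frozenset("ac"): 0.85, frozenset("bc"): 0.84,
    frozenset("ad"): 0.4, frozenset("bd"): 0.4, frozenset("cd"): 0.5,
    frozenset("abc"): 0.95, frozenset("abd"): 1.3,
    frozenset("acd"): 0.86, frozenset("bcd"): 0.86,
}
def C(subset):
    return score[frozenset(subset)]
```

On this toy criterion with l = 3, plain forward selection is trapped by its early choice of 'c' and ends with {'a', 'b', 'c'} (score 0.95), while the floating version backtracks, drops 'c', and reaches {'a', 'b', 'd'} (score 1.3).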
