Data Mining: Concepts and Techniques (8)
Ensemble Methods: Increasing the Accuracy

- Ensemble methods
  - Use a combination of models to increase accuracy
  - Combine a series of k learned models, M1, M2, …, Mk, with the aim of creating an improved model M*
- Popular ensemble methods
  - Bagging: averaging the prediction over a collection of classifiers
  - Boosting: weighted vote with a collection of classifiers
  - Ensemble: combining a set of heterogeneous classifiers

Bagging: Bootstrap Aggregation

- Analogy: diagnosis based on multiple doctors' majority vote
- Training
  - Given a set D of d tuples, at each iteration i, a training set Di of d tuples is sampled with replacement from D (i.e., a bootstrap sample)
  - A classifier model Mi is learned from each training set Di
- Classification: to classify an unknown sample X
  - Each classifier Mi returns its class prediction
  - The bagged classifier M* counts the votes and assigns the class with the most votes to X
- Prediction: can be applied to the prediction of continuous values by taking the average of the individual predictions for a given test tuple
- Accuracy
  - Often significantly better than a single classifier derived from D
  - For noisy data: not considerably worse, and more robust
  - Proven to improve accuracy in prediction
- (A minimal bagging sketch is given after these slides.)

Boosting

- Analogy: consult several doctors and combine their weighted diagnoses, where each weight is assigned based on that doctor's previous diagnosis accuracy
- How boosting works
  - Weights are assigned to each training tuple
  - A series of k classifiers is iteratively learned
  - After a classifier Mi is learned, the weights are updated to allow the subsequent classifier, Mi+1, to pay more attention to the training tuples that were misclassified by Mi
  - The final M* combines the votes of each individual classifier, where the weight of each classifier's vote is a function of its accuracy
- The boosting algorithm can be extended for numeric prediction
- Compared with bagging: boosting tends to have greater accuracy, but it also risks overfitting the model to misclassified data
- (A minimal boosting sketch is given after these slides.)
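The bagging slide lends itself to a short sketch. The code below is a minimal illustration of the procedure as described: k base classifiers, each trained on a bootstrap sample of D drawn with replacement, with the bagged classifier M* assigning the class that receives the most votes. The choice of DecisionTreeClassifier as the base learner, the parameter names, and the toy data are illustrative assumptions, not part of the original slides.

```python
# Minimal bagging sketch: k models trained on bootstrap samples, majority vote.
import numpy as np
from collections import Counter
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, k=25, random_state=0):
    """Learn k models M_1..M_k, each from a bootstrap sample of (X, y)."""
    rng = np.random.default_rng(random_state)
    d = len(X)
    models = []
    for _ in range(k):
        idx = rng.integers(0, d, size=d)          # sample d tuples with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """The bagged classifier M*: each M_i votes, the majority class wins."""
    votes = np.array([m.predict(X) for m in models])      # shape (k, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)                # toy labels
    models = bagging_fit(X, y, k=25)
    print(bagging_predict(models, X[:5]), y[:5])
```

For continuous-valued prediction, the same sketch would simply replace the majority vote in `bagging_predict` with the mean of the individual predictions, as the slide notes.

The boosting slide describes re-weighting tuples after each round and weighting each classifier's vote by its accuracy. One standard way to make that concrete is the AdaBoost update; the slides do not fix the exact formulas, so the error-based vote weight `alpha`, the exponential weight update, and the use of depth-1 decision stumps below are assumptions taken from AdaBoost for two-class labels coded as -1/+1.

```python
# Minimal AdaBoost-style boosting sketch: tuple weights are increased for
# misclassified tuples so the next classifier focuses on them, and each
# classifier's vote is weighted by a function of its (weighted) accuracy.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, k=50):
    """y must be coded as -1/+1; returns the k stumps and their vote weights."""
    d = len(X)
    w = np.full(d, 1.0 / d)                        # start with equal tuple weights
    models, alphas = [], []
    for _ in range(k):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])                 # weighted training error of M_i
        if err >= 0.5:                             # no better than chance: stop early
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))   # vote weight from accuracy
        w = w * np.exp(-alpha * y * pred)          # boost weights of misclassified tuples
        w = w / w.sum()                            # renormalise
        models.append(stump)
        alphas.append(alpha)
    return models, alphas

def adaboost_predict(models, alphas, X):
    """M*: the sign of the accuracy-weighted sum of the individual votes."""
    agg = sum(a * m.predict(X) for m, a in zip(models, alphas))
    return np.sign(agg)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = rng.normal(size=(300, 2))
    y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)     # toy labels
    models, alphas = adaboost_fit(X, y, k=50)
    acc = np.mean(adaboost_predict(models, alphas, X) == y)
    print(f"training accuracy: {acc:.3f}")
```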