[Engineering] Data Mining: Concepts and Techniques — CH03
Dimensionality Reduction

- Feature selection (i.e., attribute subset selection): select a minimum set of features such that the probability distribution of the different classes, given the values of those features, is as close as possible to the original distribution given the values of all features.
- Reduces the number of attributes appearing in the discovered patterns, making the patterns easier to understand.
- Heuristic methods (needed because the number of candidate attribute subsets is exponential):
  - step-wise forward selection
  - step-wise backward elimination
  - combined forward selection and backward elimination
  - decision-tree induction

Example of Decision Tree Induction

- Initial attribute set: {A1, A2, A3, A4, A5, A6}
- [Figure: a decision tree with A4? at the root and A1? and A6? at internal nodes; each leaf is labeled Class 1 or Class 2]
- Reduced attribute set (the attributes actually tested in the tree): {A1, A4, A6}

Relevance Measures

- A quantitative relevance measure determines the classifying power of an attribute within a set of data.
- Methods:
  - information gain (ID3)
  - gain ratio (C4.5)
  - Gini index
  - χ² contingency table statistics
  - uncertainty coefficient

Information-Theoretic Approach

- Decision tree:
  - each internal node tests an attribute
  - each branch corresponds to an attribute value
  - each leaf node assigns a classification
- ID3 algorithm:
  - builds a decision tree from training objects with known class labels, then uses it to classify test objects
  - ranks attributes with the information-gain measure
  - a tree of minimal height requires the least number of tests to classify an object (see the example below)

Top-Down Induction of Decision Tree

- Attributes = {Outlook, Temperature, Humidity, Wind}; class PlayTennis = {yes, no}
- [Figure: the induced tree tests Outlook at the root: sunny → test Humidity (high → no, normal → yes); overcast → yes; rain → test Wind (strong → no, weak → yes)]

Entropy and Information Gain

- S contains si tuples of class Ci for i = 1, …, m
- The information (entropy) required to classify an arbitrary tuple is I(s1, …, sm) = −Σi (si/s) · log2(si/s), where s = Σi si
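The heuristic subset-selection methods listed above can be sketched in code. The sketch below implements step-wise forward selection greedily; the `score` callable is a hypothetical placeholder (e.g., cross-validated accuracy of a classifier trained on the candidate subset), not something defined in the slides:

```python
def forward_selection(attributes, score, max_features=None):
    """Greedy step-wise forward selection.

    score(subset) -> float, higher is better; the scoring function is an
    assumed input (e.g., cross-validated accuracy), not part of the slides.
    Starts from the empty set and repeatedly adds the single attribute that
    most improves the score, stopping when no attribute helps.
    """
    selected = []
    remaining = list(attributes)
    best_score = float("-inf")
    while remaining and (max_features is None or len(selected) < max_features):
        # Tentatively add each remaining attribute; keep the best improvement.
        cand, cand_score = None, best_score
        for a in remaining:
            s = score(selected + [a])
            if s > cand_score:
                cand, cand_score = a, s
        if cand is None:  # no attribute improves the score: stop
            break
        selected.append(cand)
        remaining.remove(cand)
        best_score = cand_score
    return selected
```

Step-wise backward elimination is the mirror image: start from the full attribute set and repeatedly drop the attribute whose removal hurts the score least.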