Hybrid Sampling Extreme Learning Machine for Sequential Imbalanced Data
Abstract: To address the difficulty that existing machine learning algorithms can hardly improve the classification accuracy of minority-class samples in sequential imbalanced data classification, an online sequential extreme learning machine based on a hybrid sampling strategy is proposed. The algorithm improves the classification accuracy of the minority class while limiting the accuracy loss on the majority class, and consists of an offline stage and an online stage. In the offline stage, a balanced sampling strategy is adopted: principal curves are used to build confidence regions for the majority and minority classes respectively, and, without changing the distribution characteristics of the samples, the confidence regions are used to oversample the minority class and undersample the majority class, yielding a balanced offline sample set from which the initial model is built. In the online stage, only the sequentially arriving majority-class data are undersampled; the most valuable majority samples are selected according to sample importance, and the network weights are then updated dynamically. Theoretical analysis proves that the information loss of the proposed algorithm has an upper bound. Simulation experiments on UCI benchmark datasets and real-world Macau air pollution forecasting data show that, compared with the existing Online Sequential Extreme Learning Machine (OSELM), Extreme Learning Machine (ELM) and Meta-Cognitive Online Sequential Extreme Learning Machine (MCOSELM) algorithms, the proposed algorithm achieves higher prediction accuracy for the minority class with good numerical stability.
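To make the offline rebalancing step concrete, the following is a minimal Python sketch of the idea described above, not the paper's exact procedure: the per-class principal curve is approximated here by the first principal-component line (a common initialization for principal curves), the "confidence region" is taken as the samples closest to that line, minority samples are synthesized by interpolating inside the region, and majority samples farthest from their own region are discarded. The function names, the 75th-percentile band width and the interpolation rule are illustrative assumptions.

```python
# Minimal sketch of the offline balanced-sampling stage (illustrative only).
# Assumptions: each class's principal curve is approximated by its first
# principal-component line; the "confidence region" is the set of samples
# within the 75th-percentile orthogonal distance of that line; oversampling
# interpolates pairs of in-region minority samples; undersampling keeps the
# majority samples nearest to their own line. The paper's exact
# principal-curve construction may differ.
import numpy as np


def pc_line(X):
    """Return (mean, unit direction) of the first principal component of X."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[0]


def distance_to_line(X, mu, d):
    """Orthogonal distance of each row of X to the line mu + t * d."""
    centered = X - mu
    proj = np.outer(centered @ d, d)
    return np.linalg.norm(centered - proj, axis=1)


def rebalance(X_min, X_maj, rng=np.random.default_rng(0)):
    """Oversample the minority class and undersample the majority class so
    that both end up with roughly the same number of samples."""
    n_target = (len(X_min) + len(X_maj)) // 2

    # Minority class: synthesize points between samples lying inside the band.
    mu, d = pc_line(X_min)
    dist = distance_to_line(X_min, mu, d)
    core = X_min[dist <= np.percentile(dist, 75)]
    synth = []
    while len(X_min) + len(synth) < n_target:
        a, b = core[rng.integers(len(core), size=2)]
        synth.append(a + rng.uniform() * (b - a))  # convex combination
    X_min_bal = np.vstack([X_min] + synth) if synth else X_min

    # Majority class: keep only the samples closest to its own line.
    mu, d = pc_line(X_maj)
    keep = np.argsort(distance_to_line(X_maj, mu, d))[:n_target]
    return X_min_bal, X_maj[keep]
```

The rebalanced offline set would then be used to train the initial ELM model in the usual least-squares way.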
Key words: extreme learning machine; online sequential data; imbalanced classification; principal curve
CLC number: TP181
Document code: A
Hybrid sampling extreme learning machine for sequential imbalanced data
MAO Wentao1,2*, WANG Jinwan1, HE Ling1, YUAN Peiyan1,2
1. College of Computer and Information Engineering, Henan Normal University, Xinxiang Henan 453007, China;
2. Engineering Laboratory of Intellectual Business and Internet of Things Technologies, Henan Province, Xinxiang Henan 453007, China
Abstract: Many traditional machine learning methods tend to produce biased classifiers, which leads to low classification accuracy for the minor class on sequential imbalanced data. To improve the classification accuracy of the minor class, a new hybrid sampling online extreme learning machine for sequential imbalanced data was proposed. The algorithm improves the classification accuracy of the minor class while reducing the accuracy loss of the major class, and contains two stages. In the offline stage, principal curves were introduced to model the confidence regions of the minor class and the major class respectively based on a balanced sampling strategy; oversampling of the minority and undersampling of the majority were performed within these confidence regions, and the initial model was then established. In the online stage, only the most valuable samples of the major class were chosen according to sample importance, and the network weights were updated dynamically. A theoretical proof shows that the information loss of the proposed algorithm has an upper bound. Experiments were conducted on two UCI datasets and the real-world air pollution forecasting data of Macau; the results show that, compared with Online Sequential Extreme Learning Machine (OSELM), Extreme Learning Machine (ELM) and Meta-Cognitive Online Sequential Extreme Learning Machine (MCOSELM), the proposed algorithm achieves higher prediction accuracy for the minor class with good numerical stability.
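The online stage builds on the standard OS-ELM recursive least-squares update. The sketch below, under stated assumptions, shows that update together with a simple importance filter for sequentially arriving majority-class chunks; the residual-based importance score and the keep_ratio parameter are illustrative stand-ins, since the abstract does not give the paper's exact importance measure.

```python
# Sketch of the online stage: the standard OS-ELM recursive update, applied to
# minority chunks as-is and to majority chunks after an importance-based
# undersampling step. The residual-based importance score and keep_ratio are
# assumptions for illustration; the paper's criterion may differ.
import numpy as np

rng = np.random.default_rng(0)


def hidden_output(X, W, b):
    """Sigmoid hidden-layer output H = g(X W + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))


class OSELM:
    def __init__(self, n_input, n_hidden):
        # Input weights and biases are random and stay fixed, as in any ELM.
        self.W = rng.standard_normal((n_input, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        self.P = None      # RLS inverse-covariance state
        self.beta = None   # output weights

    def fit_initial(self, X0, T0):
        """Offline stage: train on the rebalanced initial sample set."""
        H0 = hidden_output(X0, self.W, self.b)
        self.P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(H0.shape[1]))
        self.beta = self.P @ H0.T @ T0

    def partial_fit(self, Xk, Tk):
        """Online stage: recursive update for one sequentially arriving chunk."""
        Hk = hidden_output(Xk, self.W, self.b)
        K = np.linalg.inv(np.eye(Hk.shape[0]) + Hk @ self.P @ Hk.T)
        self.P = self.P - self.P @ Hk.T @ K @ Hk @ self.P
        self.beta = self.beta + self.P @ Hk.T @ (Tk - Hk @ self.beta)

    def predict(self, X):
        return hidden_output(X, self.W, self.b) @ self.beta


def select_valuable_majority(model, X_maj, T_maj, keep_ratio=0.3):
    """Keep only the majority samples the current model fits worst (largest
    residual) -- an assumed stand-in for the sample-importance score."""
    err = np.abs(model.predict(X_maj) - T_maj).sum(axis=1)
    k = max(1, int(keep_ratio * len(err)))
    idx = np.argsort(err)[-k:]
    return X_maj[idx], T_maj[idx]
```

In this reading, a minority-class chunk is passed to partial_fit unchanged, whereas a majority-class chunk is first filtered by select_valuable_majority, mirroring the abstract's statement that only the sequentially arriving majority data are undersampled before the weights are updated.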