 Zhai Junhai,Zang Liguang,Zhou Zhaoyi.An Approach of Integrating Dropout Extreme Learning Machine for Data Classification[J].Journal of Nanjing Normal University(Natural Science Edition),2017,40(03):59.[doi:10.3969/j.issn.1001-4616.2017.03.009]

An Approach of Integrating Dropout Extreme Learning Machine for Data Classification

Journal of Nanjing Normal University (Natural Science Edition) [ISSN:1001-4616/CN:32-1239/N]

Volume:
Vol. 40
Issue:
No. 03, 2017
Pages:
59
Column:
Computer Science
Publication date:
2017-09-30

Article Info

Title:
An Approach of Integrating Dropout Extreme Learning Machine for Data Classification
Article ID:
1001-4616(2017)03-0059-08
Author(s):
Zhai Junhai1,2, Zang Liguang3, Zhou Zhaoyi1
(1.Key Lab of Machine Learning and Computational Intelligence,College of Mathematics and Information Science,Hebei University,Baoding 071002,China)(2.College of Mathematics,Physics and Information Engineering,Zhejiang Normal University,Jinhua 321004,China)(3.College of Computer Science and Technology,Hebei University,Baoding 071002,China)
Keywords:
extreme learning machine; randomization methods; retraining; generalization; ensemble
CLC number:
TP181
DOI:
10.3969/j.issn.1001-4616.2017.03.009
Document code:
A
Abstract:
Extreme learning machine (ELM) is a fast algorithm with good generalization for training single-hidden layer feed-forward neural networks (SLFNs). However, applying ELM to a practical problem requires first selecting an appropriate SLFN architecture, which is very difficult for a given problem. To deal with this problem, an ensemble learning method is proposed in this paper; it does not require selecting the SLFN architecture in advance. The proposed method includes 3 steps: (1) initialize a comparatively large SLFN; (2) use ELM to retrain several SLFNs, each with a random subset of hidden nodes dropped out; (3) integrate the trained SLFNs by majority voting and use the ensemble to classify test instances. The proposed approach was experimentally compared with the traditional ELM on 10 data sets, and the results confirm that it outperforms the traditional ELM in classification performance.
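The three steps above lend themselves to a compact implementation, because ELM solves its output weights in closed form via the Moore-Penrose pseudoinverse rather than by gradient descent. The following is a minimal Python/NumPy sketch of the procedure as described in the abstract, not the authors' implementation; the tanh activation and the parameter values L, n_models, and drop_rate are illustrative assumptions, since the abstract does not specify them.

import numpy as np

def train_elm(X, T, W, b, keep):
    # Train one ELM on the hidden nodes selected by the boolean mask `keep`.
    # X: (n, d) inputs; T: (n, c) one-hot targets.
    H = np.tanh(X @ W[:, keep] + b[keep])   # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T            # output weights via pseudoinverse
    return beta

def ensemble_dropout_elm(X, T, X_test, L=500, n_models=15, drop_rate=0.3, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Step 1: initialize one comparatively large SLFN
    # (random input weights and hidden biases, as in standard ELM).
    W = rng.standard_normal((d, L))
    b = rng.standard_normal(L)
    votes = []
    for _ in range(n_models):
        # Step 2: drop a random subset of hidden nodes and retrain with ELM.
        keep = rng.random(L) > drop_rate
        beta = train_elm(X, T, W, b, keep)
        H_test = np.tanh(X_test @ W[:, keep] + b[keep])
        votes.append(np.argmax(H_test @ beta, axis=1))
    # Step 3: combine the trained SLFNs by majority voting
    # (ties resolved toward the lower class index by bincount/argmax).
    votes = np.stack(votes)                 # shape (n_models, n_test)
    return np.array([np.bincount(col).argmax() for col in votes.T])

Note that the random input weights W and biases b of the initial large SLFN are shared by all ensemble members; each member differs only in which hidden nodes survive dropout, so step (2) retrains sub-networks of one network rather than training independent networks.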

References:

[1]HUANG G B,ZHU Q Y,SIEW C K. Extreme learning machine:a new learning scheme of feedforward neural networks[C]//Proceedings of International Joint Conference on Neural Networks(IJCNN2004),vol. 2,Budapest,Hungary,25-29 July,2004:985-990.
[2]HUANG G B,CHEN L,SIEW C K. Universal approximation using incremental constructive feedforward networks with random hidden nodes[J]. IEEE transactions on neural networks,2006,17(4):879-892.
[3]MOHAMMED A A,MINHAS R,JONATHAN Q M,et al. Human face recognition based on multidimensional PCA and extreme learning machine[J]. Pattern recognition,2011,44(10/11):2588-2597.
[4]CHACKO B P,VIMAL V R V,RAJU G,et al. Handwritten character recognition using wavelet energy and extreme learning machine[J]. International journal of machine learning and cybernetics,2012,3(2):149-161.
[5]ZHANG Xian,WANG Hongli. Time series prediction based on sequential regularized extreme learning machine and its application[J]. Acta aeronautica et astronautica sinica,2011,32(7):1302-1308.
[6]YANG H,THOMAS P,OLGA F. Fault detection based on signal reconstruction with auto-associative extreme learning machines[J]. Engineering applications of artificial intelligence,2017,57:105-117.
[7]ZHU H,TSANG E C,WANG X Z,et al. Monotonic classification extreme learning machine[J]. Neurocomputing,2017,225:205-213.
[8]CHEN Y L,WU W. Mapping mineral prospectivity using an extreme learning machine regression[J]. Ore geology reviews,2017,80:200-213.
[9]XU S,WANG J. A fast incremental extreme learning machine algorithm for data streams classification[J]. Expert systems with applications,2016,65:332-344.
[10]ANAM K,AL-JUMAILY A. Evaluation of extreme learning machine for classification of individual and combined finger movements using electromyography on amputees and non-amputees[J]. Neural networks,2017,85:51-68.
[11]IOSIFIDIS A,TEFAS A,PITAS I. Approximate kernel extreme learning machine for large scale data classification[J]. Neurocomputing,2017,219:210-220.
[12]QIU Rihui,LIU Kangling,TAN Hailong,et al. Classification algorithm based on extreme learning machine and its application in fault identification[J]. Journal of Zhejiang University(engineering science),2016,50(10):1965-1972.
[13]HUANG G B,CHEN L,SIEW C K. Universal approximation using incremental constructive feedforward networks with random hidden nodes[J]. IEEE transactions on neural networks,2006,17(4):879-892.
[14]HUANG G B,LI M B,CHEN L,et al. Incremental extreme learning machine with fully complex hidden nodes[J]. Neurocomputing,2008,71(4/5/6):576-583.
[15]RONG H J,ONG Y S,TAN A H,et al. A fast pruned-extreme learning machine for classification problem[J]. Neurocomputing,2008,72(1/2/3):359-366.
[16]AKAIKE H. Information theory and an extension of the maximum likelihood principle[C]//Second International Symposium on Information Theory. Budapest:Akademiai Kiado,1992:267-281.
[17]ZHAI J H,SHAO Q Y,WANG X Z. Improvements for P-ELM1 and P-ELM2 pruning algorithms in extreme learning machines[J]. International journal of uncertainty,fuzziness and knowledge-based systems,2016,24(3):327-345.
[18]ZHAI J H,SHAO Q Y,WANG X Z. Architecture selection of ELM networks based on sensitivity of hidden nodes[J]. Neural processing letters,2016,44(2):1-19.
[19]MICHE Y,SORJAMAA A,BAS P,et al. OP-ELM:optimally pruned extreme learning machine[J]. IEEE transactions on neural networks,2010,21(1):158-162.
[20]HINTON G E,SRIVASTAVA N,KRIZHEVSKY A,et al. Improving neural networks by preventing co-adaptation of feature detectors[DB/OL]. https://arxiv.org/abs/1207.0580,2012.
[21]SRIVASTAVA N,HINTON G,KRIZHEVSKY A,et al. Dropout:a simple way to prevent neural networks from overfitting[J]. Journal of machine learning research,2014,15(1):1929-1958.
[22]BALDI P,SADOWSKI P. The dropout learning algorithm[J]. Artificial intelligence,2014,210:78-122.
[23]WAGER S,WANG S,LIANG P. Dropout training as adaptive regularization[C]//Advances in Neural Information Processing Systems,Lake Tahoe,Nevada,2013:351-359.
[24]IOSIFIDIS A,TEFAS A,PITAS I. DropELM:fast neural network regularization with dropout and drop connect[J]. Neurocomputing,2015,162:57-66.
[25]BA L J,FREY B. Adaptive dropout for training deep neural networks[C]//Advances in Neural Information Processing Systems,Lake Tahoe,Nevada,2013.
[26]LI Z,GONG B,YANG T. Improved dropout for shallow and deep learning[C]//Advances in Neural Information Processing Systems,Barcelona,Spain,2016:1-10.
[27]YANG W,JIN L,TAO D,et al. Drop sample:a new training method to enhance deep convolutional neural networks for large-scale unconstrained handwritten Chinese character recognition[J]. Pattern recognition,2016,58:190-203.
[28]KLEIN E B,STONE W N,HICKS M W,et al. Understanding Dropouts[J]. Advances in neural information processing systems,2013,26(2):89-100.
[29]FRANK A,ASUNCION A. UCI machine learning repository[DB/OL]. http://archive.ics.uci.edu/ml,2013.

Similar References:

[1]Ma Yuemei,Fu Hao,Liu Guojun,et al.Full Reference Color Image Quality Assessment Method via Low-level Features Combination with Extreme Learning Machine[J].Journal of Nanjing Normal University(Natural Science Edition),2022,45(04):91.[doi:10.3969/j.issn.1001-4616.2022.04.013]

Memo:
Received: 2017-03-18.
Foundation items: Supported by the National Natural Science Foundation of China (71371063), the Natural Science Foundation of Hebei Province (F2017201026), and the Fund of the Top Key Discipline of Computer Science and Technology of Zhejiang Province (Zhejiang Normal University).
Corresponding author: Zhai Junhai, Ph.D., professor. Research interests: data mining and pattern recognition. E-mail: mczjh@126.com
Last Update: 2017-09-30