
An Approach of Integrating Dropout Extreme Learning Machine for Data Classification

Journal of Nanjing Normal University (Natural Science Edition) [ISSN:1001-4616/CN:32-1239/N]

Issue:
2017, No. 3
Page:
59-
Research Field:
Computer Science
Publishing date:

Info

Title:
An Approach of Integrating Dropout Extreme Learning Machine for Data Classification
Author(s):
Zhai Junhai(1,2), Zang Liguang(3), Zhou Zhaoyi(1)
(1. Key Lab of Machine Learning and Computational Intelligence, College of Mathematics and Information Science, Hebei University, Baoding 071002, China) (2. College of Mathematics, Physics and Information Engineering, Zhejiang Normal University, Jinhua 321004, China)
Keywords:
extreme learning machine; randomization methods; retrain; generalization; ensemble
PACS:
TP181
DOI:
10.3969/j.issn.1001-4616.2017.03.009
Abstract:
Extreme learning machine (ELM) is an algorithm for training single-hidden-layer feed-forward neural networks (SLFNs) with fast speed and good generalization. When applying ELM to practical problems, an appropriate SLFN architecture must first be selected, yet selecting such an architecture is very difficult. To deal with this problem, an ensemble learning method is proposed in this paper; with the proposed method, it is unnecessary to select an appropriate SLFN architecture. The proposed method consists of three steps: (1) initialize a big SLFN; (2) retrain several SLFNs, each with a random subset of hidden nodes dropped out, using ELM; (3) integrate the trained SLFNs by majority voting, and use the integrated model to classify testing instances. We experimentally compared the proposed approach with traditional ELM on 10 data sets, and the results confirm that the proposed approach outperforms traditional ELM in classification performance.
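The three steps of the abstract can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation: the sigmoid activation, the drop rate of 0.3, the ensemble size, and all function names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, Y, W, b):
    """ELM training: hidden layer H = sigmoid(XW + b), output weights beta = pinv(H) Y."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.linalg.pinv(H) @ Y

def predict(X, W, b, beta):
    """Class prediction of one SLFN: argmax over output-layer activations."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)

def dropout_elm_ensemble(X, y, n_hidden=100, n_members=7, drop_rate=0.3):
    n_classes = int(y.max()) + 1
    Y = np.eye(n_classes)[y]                       # one-hot targets
    # step 1: initialize one "big" SLFN with random input weights and biases
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    members = []
    for _ in range(n_members):
        # step 2: drop a random subset of hidden nodes and retrain the rest with ELM
        keep = rng.random(n_hidden) > drop_rate
        beta = train_elm(X, Y, W[:, keep], b[keep])
        members.append((keep, beta))
    return W, b, members

def ensemble_predict(X, W, b, members, n_classes):
    # step 3: majority voting over the retrained sub-networks
    votes = np.stack([predict(X, W[:, keep], b[keep], beta)
                      for keep, beta in members])
    return np.apply_along_axis(
        lambda v: np.bincount(v, minlength=n_classes).argmax(), 0, votes)

# toy demo: two well-separated Gaussian blobs
Xd = np.vstack([rng.normal(-2, 0.5, size=(50, 2)),
                rng.normal( 2, 0.5, size=(50, 2))])
yd = np.array([0] * 50 + [1] * 50)
W, b, members = dropout_elm_ensemble(Xd, yd, n_hidden=40, n_members=5)
acc = (ensemble_predict(Xd, W, b, members, 2) == yd).mean()
```

Because the random input weights W and biases b are shared across the ensemble and only the output weights are resolved per member, each retraining is a single pseudo-inverse, which preserves ELM's speed advantage.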

References:

[1] HUANG G B,ZHU Q Y,SIEW C K. Extreme learning machine:a new learning scheme of feedforward neural networks[C]//Proceedings of International Joint Conference on Neural Networks(IJCNN2004),vol. 2,Budapest,Hungary,25-29 July,2004:985-990.
[2]HUANG G B,CHEN L,SIEW C K. Universal approximation using incremental constructive feedforward networks with random hidden nodes[J]. IEEE transactions on neural networks,2006,17(4):879-892.
[3]MOHAMMED A A,MINHAS R,JONATHAN Q M,et al. Human face recognition based on multidimensional PCA and extreme learning machine[J]. Pattern recognition,2011,44(10/11):2588-2597.
[4]CHACKO B P,VIMAL V R V,RAJU G,et al. Handwritten character recognition using wavelet energy and extreme learning machine[J]. International journal of machine learning and cybernetics,2012,3(2):149-161.
[5]ZHANG X,WANG H L. Time series prediction based on sequential regularized extreme learning machine and its application[J]. Acta aeronautica et astronautica sinica,2011,32(7):1302-1308.
[6]YANG H,THOMAS P,OLGA F. Fault detection based on signal reconstruction with auto-associative extreme learning machines[J]. Engineering applications of artificial intelligence,2017,57:105-117.
[7]ZHU H,TSANG E C,WANG X Z,et al. Monotonic classification extreme learning machine[J]. Neurocomputing,2017,225:205-213.
[8]CHEN Y L,WU W. Mapping mineral prospectivity using an extreme learning machine regression[J]. Ore geology reviews,2017,80:200-213.
[9]XU S,WANG J. A fast incremental extreme learning machine algorithm for data streams classification[J]. Expert systems with applications,2016,65:332-344.
[10]ANAM K,AL-JUMAILY A. Evaluation of extreme learning machine for classification of individual and combined finger movements using electromyography on amputees and non-amputees[J]. Neural networks,2017,85:51-68.
[11]IOSIFIDIS A,TEFAS A,PITAS I. Approximate kernel extreme learning machine for large scale data classification[J]. Neurocomputing,2017,219:210-220.
[12]QIU R H,LIU K L,TAN H L,et al. Classification algorithm based on extreme learning machine and its application in fault identification[J]. Journal of Zhejiang University(Engineering Science),2016,50(10):1965-1972.
[13]HUANG G B,CHEN L,SIEW C K. Universal approximation using incremental constructive feedforward networks with random hidden nodes[J]. IEEE transactions on neural networks,2006,17(4):879-892.
[14]HUANG G B,LI M B,CHEN L,et al. Incremental extreme learning machine with fully complex hidden nodes[J]. Neurocomputing,2008,71(4/5/6):576-583.
[15]RONG H J,ONG Y S,TAN A H,et al. A fast pruned-extreme learning machine for classification problem[J]. Neurocomputing,2008,72(1/2/3):359-366.
[16]AKAIKE H. Information theory and an extension of the maximum likelihood principle[C]//Second International Symposium on Information Theory. Budapest:Academiai Kiado,1992:267-281.
[17]ZHAI J H,SHAO Q Y,WANG X Z. Improvements for P-ELM1 and P-ELM2 pruning algorithms in extreme learning machines[J]. International journal of uncertainty,fuzziness and knowledge-based systems,2016,24(3):327-345.
[18]ZHAI J H,SHAO Q Y,WANG X Z. Architecture selection of ELM networks based on sensitivity of hidden nodes[J]. Neural processing letters,2016,44(2):1-19.
[19]MICHE Y,SORJAMAA A,BAS P,et al. OP-ELM:optimally pruned extreme learning machine[J]. IEEE transactions on neural networks,2010,21(1):158-162.
[20]HINTON G E,SRIVASTAVA N,KRIZHEVSKY A,et al. Improving neural networks by preventing co-adaptation of feature detectors[DB/OL]. https://arxiv.org/abs/1207.0580,2012.
[21]SRIVASTAVA N,HINTON G,KRIZHEVSKY A,et al. Dropout:a simple way to prevent neural networks from overfitting[J]. Journal of machine learning research,2014,15(1):1929-1958.
[22]BALDI P,SADOWSKI P. The dropout learning algorithm[J]. Artificial intelligence,2014,210(210):78-122.
[23]WAGER S,WANG S,LIANG P. Dropout training as adaptive regularization[C]//Advances in Neural Information Processing Systems,Lake Tahoe,Nevada,2013:351-359.
[24]IOSIFIDIS A,TEFAS A,PITAS I. DropELM:fast neural network regularization with dropout and drop connect[J]. Neurocomputing,2015,162:57-66.
[25]BA L J,FREY B. Adaptive dropout for training deep neural networks[C]//Advances in Neural Information Processing Systems,Lake Tahoe,Nevada,2013.
[26]LI Z,GONG B,YANG T. Improved dropout for shallow and deep learning[C]//Advances in Neural Information Processing Systems,Barcelona,Spain,2016:1-10.
[27]YANG W,JIN L,TAO D,et al. Drop sample:a new training method to enhance deep convolutional neural networks for large-scale unconstrained handwritten Chinese character recognition[J]. Pattern recognition,2016,58:190-203.
[28]KLEIN E B,STONE W N,HICKS M W,et al. Understanding Dropouts[J]. Advances in neural information processing systems,2013,26(2):89-100.
[29]FRANK A,ASUNCION A. UCI machine learning repository[DB/OL]. http://archive.ics.uci.edu/ml,2013.

Last Update: 2017-09-30