Estimation of the optimal number of neurons in extreme learning machine using simulated annealing and the golden section

datacite.rights: http://purl.org/coar/access_right/c_abf2
dc.contributor.author: Gelvez-Almeida, E
dc.contributor.author: Mora, M
dc.contributor.author: Huérfano-Maldonado, Y
dc.contributor.author: Salazar-Jurado, E
dc.contributor.author: Martínez-Jeraldo, N
dc.contributor.author: Lozada-Yavina, R
dc.contributor.author: Baldera-Moreno, Y
dc.contributor.author: Tobar, L
dc.date.accessioned: 2023-08-18T18:12:03Z
dc.date.available: 2023-08-18T18:12:03Z
dc.date.issued: 2023
dc.description.abstract: Extreme learning machine is a neural network algorithm widely accepted in the scientific community due to the simplicity of the model and its good results in classification and regression problems; digital image processing, medical diagnosis, and signal recognition are some applications in the field of physics addressed with these neural networks. The algorithm must be executed with an adequate number of neurons in the hidden layer to obtain good results, and identifying that number is an open problem in the extreme learning machine field. The search has a high computational cost if carried out sequentially, given the complexity of the calculations as the number of neurons increases. In this work, we use golden section search and simulated annealing as heuristic methods to calculate the appropriate number of neurons in the hidden layer of an extreme learning machine. For the experiments, three real databases were used for the classification problem and a synthetic database for the regression problem. The results show that, on the highest-dimensional database, the search for the appropriate number of neurons is accelerated up to 4.5× with simulated annealing and up to 95.7× with golden section search, compared to a sequential method.
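The record contains no code, so the following is a minimal, illustrative Python sketch (not the authors' implementation) of the approach described in the abstract: an extreme learning machine whose output weights come from the Moore-Penrose pseudoinverse, plus golden section search and simulated annealing over the number of hidden neurons. The helper names (train_elm, elm_error, golden_section_neurons, annealing_neurons), the sigmoid activation, one-hot targets, search bounds, neighbourhood step, and cooling schedule are all assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, T, n_hidden):
    # ELM training: random input weights and biases, sigmoid hidden layer,
    # output weights from the Moore-Penrose pseudoinverse (least squares).
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def elm_error(X_tr, T_tr, X_va, T_va, n_hidden):
    # Validation misclassification rate for a given hidden-layer size
    # (T_tr and T_va are one-hot class targets).
    W, b, beta = train_elm(X_tr, T_tr, n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X_va @ W + b)))
    return np.mean(np.argmax(H @ beta, axis=1) != np.argmax(T_va, axis=1))

def golden_section_neurons(err, lo, hi, tol=2):
    # Golden section search over the integer neuron count, assuming the
    # validation error is roughly unimodal in the number of neurons.
    phi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    while b - a > tol:
        c = int(round(b - phi * (b - a)))
        d = int(round(a + phi * (b - a)))
        if err(c) < err(d):
            b = d
        else:
            a = c
    return (a + b) // 2

def annealing_neurons(err, lo, hi, t0=1.0, cooling=0.9, steps=50, step=50):
    # Simulated annealing over the neuron count with a random integer
    # neighbourhood and a geometric cooling schedule (illustrative choices).
    n = int(rng.integers(lo, hi + 1))
    e = err(n)
    best_n, best_e, t = n, e, t0
    for _ in range(steps):
        cand = int(np.clip(n + rng.integers(-step, step + 1), lo, hi))
        e_cand = err(cand)
        if e_cand < e or rng.random() < np.exp((e - e_cand) / max(t, 1e-9)):
            n, e = cand, e_cand
            if e < best_e:
                best_n, best_e = n, e
        t *= cooling
    return best_n

Given training and validation splits, one might call, for example, golden_section_neurons(lambda n: elm_error(X_tr, T_tr, X_va, T_va, n), 10, 1000). Golden section search shrinks the interval of candidate neuron counts geometrically with only two error evaluations per iteration, which is consistent with the larger speedup the abstract reports for that method.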
dc.format.mimetype: pdf
dc.identifier.doi: https://doi.org/10.1088/1742-6596/2515/1/012003
dc.identifier.issn: 1742-6596
dc.identifier.uri: https://hdl.handle.net/20.500.12442/13163
dc.language.iso: eng
dc.publisher: IOP Publishing
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.source: Journal of Physics: Conference Series
dc.source: Vol. 2515 (2023)
dc.subject: Extreme Learning Machine
dc.subject: Classification and regression problems
dc.subject: Digital image processing
dc.subject: Neural networks
dc.subject: Simulated annealing
dc.subject: Sequential method
dc.title: Estimation of the optimal number of neurons in extreme learning machine using simulated annealing and the golden section
dc.type.driver: info:eu-repo/semantics/article
dc.type.spa: Artículo científico (Scientific article)
oaire.version: info:eu-repo/semantics/publishedVersion

Files

PDF.pdf (677.12 KB, Adobe Portable Document Format)