Estimation of the optimal number of neurons in extreme learning machine using simulated annealing and the golden section

Date

2023

Journal Title

Journal ISSN

Volume Title

Publisher

IOP Publishing

Abstract

Extreme learning machine is a neural network algorithm widely accepted in the scientific community due to the simplicity of the model and its good results in classification and regression problems; digital image processing, medical diagnosis, and signal recognition are some applications in the field of physics that have been addressed with these networks. The algorithm must be run with an adequate number of neurons in the hidden layer to obtain good results, and identifying that number is an open problem in the extreme learning machine field. The search has a high computational cost when carried out sequentially, since the complexity of the calculations grows with the number of neurons. In this work, we use golden section search and simulated annealing as heuristic methods to determine the appropriate number of neurons in the hidden layer of an extreme learning machine; for the experiments, three real databases were used for the classification problem and a synthetic database for the regression problem. The results show that, on the highest-dimensional database, the search for the appropriate number of neurons is accelerated by up to 4.5× with simulated annealing and up to 95.7× with golden section search compared to a sequential method.
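The following is a minimal sketch, not the authors' code, of the idea described in the abstract: treating the validation error of an extreme learning machine as a function of the hidden-layer size and searching that size with golden section search and simulated annealing instead of a sequential sweep. The ELM here is a basic single-hidden-layer network with random input weights and a least-squares output layer; the synthetic regression data, the search bounds, the cooling schedule, and the assumption that the validation error is roughly unimodal in the neuron count are all illustrative choices, not taken from the paper.

import numpy as np

def elm_validation_error(n_hidden, X_tr, y_tr, X_va, y_va, seed=0):
    """Train an ELM with n_hidden neurons and return validation MSE."""
    rng = np.random.default_rng(seed)
    d = X_tr.shape[1]
    W = rng.standard_normal((d, n_hidden))      # random input weights
    b = rng.standard_normal(n_hidden)           # random biases
    H_tr = np.tanh(X_tr @ W + b)                # hidden-layer activations
    beta = np.linalg.pinv(H_tr) @ y_tr          # output weights via pseudo-inverse
    pred = np.tanh(X_va @ W + b) @ beta
    return float(np.mean((pred - y_va) ** 2))

def golden_section_neurons(err, lo, hi, tol=2):
    """Golden section search over the integer neuron count in [lo, hi],
    assuming the validation error is roughly unimodal in that range."""
    phi = (np.sqrt(5) - 1) / 2                  # inverse golden ratio, ~0.618
    a, b = lo, hi
    c = int(round(b - phi * (b - a)))
    d = int(round(a + phi * (b - a)))
    fc, fd = err(c), err(d)
    while b - a > tol:
        if fc < fd:                             # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = int(round(b - phi * (b - a)))
            fc = err(c)
        else:                                   # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = int(round(a + phi * (b - a)))
            fd = err(d)
    return min(range(a, b + 1), key=err)

def simulated_annealing_neurons(err, lo, hi, n_iter=40, T0=1.0, seed=0):
    """Simulated annealing over the integer neuron count in [lo, hi]."""
    rng = np.random.default_rng(seed)
    current = int(rng.integers(lo, hi + 1))
    f_cur = err(current)
    best, f_best = current, f_cur
    for k in range(n_iter):
        T = T0 * (1 - k / n_iter) + 1e-9        # simple linear cooling schedule
        m = max(1, (hi - lo) // 10)
        cand = int(np.clip(current + rng.integers(-m, m + 1), lo, hi))
        f_cand = err(cand)
        # accept better candidates always, worse ones with Boltzmann probability
        if f_cand < f_cur or rng.random() < np.exp(-(f_cand - f_cur) / T):
            current, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = current, f_cur
    return best

# Illustrative usage on a synthetic regression problem.
rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(600, 5))
y = np.sin(X.sum(axis=1)) + 0.05 * rng.standard_normal(600)
X_tr, y_tr, X_va, y_va = X[:400], y[:400], X[400:], y[400:]
err = lambda n: elm_validation_error(n, X_tr, y_tr, X_va, y_va)

print("golden section choice:", golden_section_neurons(err, lo=10, hi=500))
print("simulated annealing choice:", simulated_annealing_neurons(err, lo=10, hi=500))

Both searches evaluate far fewer neuron counts than a sequential sweep over the whole range, which is the source of the speedups reported in the abstract.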

Description

Keywords

Extreme Learning Machine, Classification and regression problems, Digital image processing, Neural networks, Simulated annealing, Sequential method

Citation

Collections