Authors: Mastorocostas, P. A.; Varsamis, D. N.; Mastorocostas, C. A.

Record dates: 2015-06-29; 2024-09-27
Date of issue: 2006

URIs:
http://www.actapress.com/PaperInfo.aspx?PaperID=23195&reason=500
https://repository2024.ihu.gr/handle/123456789/1559

Abstract: The RPROP algorithm was originally developed in [5] for static networks and constitutes one of the best-performing first-order learning methods for neural networks [6]. However, RPROP does not fully eliminate the problem of poor convergence to local minima, which is faced by all gradient descent-based methods. Hence, in an attempt to alleviate this drawback, a combination of RPROP with the global search technique of Simulated Annealing (SA) was introduced in [7]. The resulting algorithm, named SARPROP, proved to be an efficient learning method for static neural networks.

A fast and efficient training method for block-diagonal recurrent fuzzy neural networks is proposed. The method modifies the Simulated Annealing RPROP algorithm, originally developed for static models, so that it can be applied to dynamic systems. A comparative analysis with a series of algorithms and recurrent models is given, indicating the effectiveness of the proposed learning approach.

Extent: 6
Language: en
License: Attribution-NonCommercial-NoDerivatives 4.0 International (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Title: A Simulated Annealing-based Learning Algorithm for Block-Diagonal Recurrent Neural Networks
Type: Conference paper
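Since the abstract above describes coupling RPROP's sign-based step adaptation with a decaying Simulated Annealing perturbation, a minimal illustrative sketch of that general idea follows. The function name sarprop_like_step, the constants, and the exact form of the annealed noise term are assumptions made for illustration only; they do not reproduce the update rules of the paper's algorithm for block-diagonal recurrent fuzzy neural networks.

```python
# Minimal, illustrative sketch of an RPROP-style update with a simulated
# annealing perturbation. Constants and the noise term are assumptions; this
# is not the paper's SARPROP variant for recurrent fuzzy neural networks.
import numpy as np

def sarprop_like_step(w, grad, prev_grad, step, epoch,
                      eta_plus=1.2, eta_minus=0.5,
                      step_min=1e-6, step_max=50.0,
                      noise_scale=0.01, temperature=0.01,
                      rng=None):
    """One per-weight update: sign-based step adaptation (RPROP) plus a
    noise contribution that decays with the epoch (the SA ingredient)."""
    rng = np.random.default_rng(0) if rng is None else rng
    sign_change = grad * prev_grad

    # Consistent gradient sign: the search direction is stable, grow the step.
    step = np.where(sign_change > 0.0,
                    np.minimum(step * eta_plus, step_max), step)

    # Sign flip: a minimum was overshot; shrink the step and add a small,
    # annealed random term so the search can escape shallow local minima.
    anneal = 2.0 ** (-temperature * epoch)
    noise = noise_scale * anneal * np.abs(rng.standard_normal(w.shape))
    step = np.where(sign_change < 0.0,
                    np.maximum(step * eta_minus + noise, step_min), step)

    # Move each weight against the sign of its gradient by the adapted step.
    w_new = w - np.sign(grad) * step

    # Where the sign flipped, zero the stored gradient so the step size is
    # not adapted again on the next iteration (standard RPROP bookkeeping).
    grad = np.where(sign_change < 0.0, 0.0, grad)
    return w_new, grad, step
```

In this sketch a caller would keep grad, step, and the previous gradient as per-weight arrays across epochs, with step initialized to a small constant; applying such a scheme to a recurrent model additionally requires gradients computed through time, which is precisely the extension the paper addresses.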