Browse by Author "Varsamis, Dimitris"
Now showing 1 - 4 of 4
Item: A generalized Takagi–Sugeno–Kang recurrent fuzzy-neural filter for adaptive noise cancelation (2008-10)
Mastorocostas, Paris; Varsamis, Dimitris; Hilas, Constantinos; Mastorocostas, Constantinos
This paper presents a recurrent fuzzy-neural filter for adaptive noise cancelation. The cancelation task is transformed into a system-identification problem, which is tackled by use of the dynamic neuron-based fuzzy neural network (DN-FNN). The fuzzy model is based on Takagi–Sugeno–Kang fuzzy rules, whose consequent parts consist of linear combinations of dynamic neurons. The orthogonal least squares method is employed to select the number of rules, along with the number and kind of dynamic neurons that participate in each rule. Extensive simulation results are given, and a performance comparison with a series of other dynamic fuzzy and neural models is conducted, underlining the effectiveness of the proposed filter and its superior performance over competing models.

Item: An Optimal Bivariate Polynomial Interpolation Basis for the Application of the Evaluation-Interpolation Technique (2014-01-01)
Varsamis, Dimitris; Karampetakis, Nicholas; Mastorocostas, Paris
A new basis of interpolation points for the special case of the Newton two-variable polynomial interpolation problem is proposed. This basis is applicable when upper bounds of the total degree and of the degree in each variable are known. It is shown that, under certain conditions that depend on the degrees of the interpolation polynomial, this new basis coincides with the known triangular/rectangular basis or is a polygonal basis. In all cases it uses the fewest interpolation points, with further consequences for the complexity of the algorithms used.

Item: Optimal Degree Estimation of the Determinant of a Polynomial Matrix (2014-03-01)
Varsamis, Dimitris; Karampetakis, Nicholas
In this paper we present a recursive formula for finding the degree of the determinant of a bivariate polynomial matrix. The recursive formula returns the optimal estimate of the degree but has a very high computational cost. It can, however, be recast as an assignment problem, which is solved with the Hungarian method at a much lower computational cost. The contribution of the proposed formula is to reduce the number of interpolation points required by the evaluation-interpolation technique, that is, to reduce the evaluations, which are computations of determinants or matrix inverses.

Item: A Recurrent Neural Network-based Forecasting System for Telecommunications Call Volume (2013-09-01)
Mastorocostas, Paris; Hilas, Constantinos; Varsamis, Dimitris; Dova, Stergiani
A recurrent neural network-based forecasting system for telecommunications call volume is proposed in this work. In particular, the forecaster is a Block-Diagonal Recurrent Neural Network with internal feedback. The model's performance is evaluated on real-world telecommunications data, where an extensive comparative analysis with a series of existing forecasters is conducted, including both traditional models and neural and fuzzy approaches.
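The assignment-problem idea described in "Optimal Degree Estimation of the Determinant of a Polynomial Matrix" can be illustrated with a short sketch: an upper bound on the degree of det P(x, y) is the maximum, over all permutations σ, of the sum of entry degrees deg p(i, σ(i)), which is exactly a maximum-weight assignment problem. This is a hedged illustration, not the paper's algorithm: it uses SciPy's `linear_sum_assignment` (a modern solver equivalent in result to the Hungarian method) rather than a hand-rolled Hungarian implementation, the `det_degree_bound` name and example degree matrix are hypothetical, and cancellations in the determinant can make the true degree strictly smaller than this bound.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def det_degree_bound(deg):
    """Upper bound on deg(det P) from the degrees of the matrix entries.

    deg[i][j] is the (total) degree of the polynomial entry p_ij.
    Since det P is a signed sum over permutations of products of entries,
    deg(det P) <= max over permutations s of sum_i deg[i][s(i)],
    i.e. a maximum-weight assignment on the degree matrix.
    """
    deg = np.asarray(deg, dtype=float)
    rows, cols = linear_sum_assignment(deg, maximize=True)
    return int(deg[rows, cols].sum())


# Hypothetical 3x3 matrix of entry degrees (for illustration only)
D = [[2, 1, 0],
     [1, 3, 2],
     [0, 2, 1]]
print(det_degree_bound(D))  # 6, from the identity permutation 2 + 3 + 1
```

Knowing such a degree bound tells the evaluation-interpolation technique how many evaluation points suffice to reconstruct the determinant by interpolation, which is the cost reduction the abstract refers to.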