Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Online Publication Date: 2021-03-23
Print Publication Date: 2021-06
Permanent link to this record: http://hdl.handle.net/10754/668212
Abstract: The paper investigates the approximation error of two-layer feedforward Fourier Neural Networks (FNNs). Such networks are motivated by the approximation properties of Fourier series. Several implementations of FNNs have been proposed since the 1980s: by Gallant and White, Silvescu, Tan, Zuo and Cai, and Liu. The main focus of our work is Silvescu's FNN, because its activation function does not fit into the category of networks in which an activation function is applied to a linearly transformed input; networks of the latter kind were extensively described by Hornik. For Silvescu's non-trivial FNN, the convergence rate is proven to be of order O(1/n). The paper goes on to investigate the classes of functions approximated by Silvescu's FNN, which turn out to lie in the Schwartz space and the space of positive definite functions.
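To make the distinction concrete, the following is a minimal sketch (not the authors' code) of a two-layer network with a Silvescu-style hidden unit: each unit computes a product of per-coordinate cosines rather than applying a nonlinearity to the linear combination w·x + b, so it does not fit Hornik's g(w·x + b) template. All names, shapes, and the random parameter initialization are illustrative assumptions.

```python
import numpy as np

def silvescu_fnn(x, W, theta, c):
    """Evaluate a two-layer Silvescu-style Fourier neural network (sketch).

    Hidden unit k computes prod_j cos(W[k, j] * x[j] + theta[k, j]),
    a product over input coordinates, rather than a single nonlinearity
    applied to the linearly transformed input W[k] @ x + b.
    The output is a linear combination of the hidden units with weights c.
    """
    # Broadcasting: W has shape (n, d), x has shape (d,).
    hidden = np.prod(np.cos(W * x + theta), axis=1)  # shape (n,)
    return c @ hidden

# Tiny illustrative example: n = 3 hidden units, d = 2 inputs.
rng = np.random.default_rng(0)
n, d = 3, 2
W = rng.normal(size=(n, d))
theta = rng.normal(size=(n, d))
c = rng.normal(size=n)
x = np.array([0.5, -1.0])
y = silvescu_fnn(x, W, theta, c)
```

With all frequencies and phases set to zero, every hidden unit outputs 1 and the network returns the sum of the output weights, which is a quick sanity check on the product-of-cosines structure.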
Citation: Zhumekenov, A., Takhanov, R., Castro, A. J., & Assylbekov, Z. (2021). Approximation error of Fourier neural networks. Statistical Analysis and Data Mining: The ASA Data Science Journal. doi:10.1002/sam.11506
Sponsors: This work was supported by the Nazarbayev University faculty-development competitive research grants program, Grant Number 240919FD3921, and by the Committee of Science of the Ministry of Education and Science of the Republic of Kazakhstan, IRN AP05133700.
License: Except where otherwise noted, this item's license is described as follows: This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.