KAUST Department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Electrical Engineering Program
Permanent link to this record: http://hdl.handle.net/10754/655989
Abstract: This paper studies the performance of general regularized discriminant analysis (RDA) classifiers based on a Gaussian mixture model with class-dependent means and covariances. RDA offers a rich class of regularization options, covering as special cases the regularized linear discriminant analysis (RLDA) and regularized quadratic discriminant analysis (RQDA) classifiers. Using fundamental results from random matrix theory, we analyze RDA under the double asymptotic regime in which the data dimension and the training size grow proportionally. Under this regime and some mild assumptions, we show that the classification error converges to a deterministic quantity that depends only on the data statistical parameters and dimensions. This result can be leveraged to select the regularization parameters that minimize the classification error, thus yielding the optimal classifier. Numerical results on synthetic data are provided to validate our theoretical findings and demonstrate the high accuracy of our derivations.
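To make the setting concrete, the following is a minimal sketch of an RDA-style classifier of the kind the abstract describes: per-class Gaussian discriminants whose sample covariances are shrunk toward the identity by a regularization parameter `gamma`. The specific shrinkage form `(1 - gamma) * S_k + gamma * I`, the identity target, and all function names are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def fit_rda(X, y, gamma=0.5):
    """Fit per-class means and regularized covariances.
    Illustrative shrinkage: Sigma_k = (1 - gamma) * S_k + gamma * I."""
    d = X.shape[1]
    params = {}
    for k in np.unique(y):
        Xk = X[y == k]
        mu = Xk.mean(axis=0)
        S = np.cov(Xk, rowvar=False)
        Sigma = (1 - gamma) * S + gamma * np.eye(d)
        # Precompute the inverse and log-determinant for the Gaussian score.
        _, logdet = np.linalg.slogdet(Sigma)
        params[k] = (mu, np.linalg.inv(Sigma), logdet,
                     np.log(len(Xk) / len(X)))  # log class prior
    return params

def predict_rda(params, X):
    """Assign each sample to the class maximizing the quadratic
    discriminant score: Gaussian log-likelihood plus log prior."""
    classes = list(params.keys())
    scores = []
    for mu, Sinv, logdet, logpi in params.values():
        diff = X - mu
        maha = np.einsum('ij,jk,ik->i', diff, Sinv, diff)
        scores.append(-0.5 * (maha + logdet) + logpi)
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]
```

With `gamma` near 1 the class covariances collapse toward a common scaled identity (an RLDA-like rule), while `gamma` near 0 keeps the raw per-class covariances (a QDA-like rule); the paper's asymptotic error expression is what allows tuning such parameters without cross-validation.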
Conference/Event name: 2018 IEEE International Symposium on Information Theory (ISIT)