
dc.contributor.advisor: Al-Naffouri, Tareq Y.
dc.contributor.author: Elkhalil, Khalil
dc.date.accessioned: 2019-06-25T13:56:10Z
dc.date.available: 2019-06-25T13:56:10Z
dc.date.issued: 2019-06
dc.identifier.citation: Elkhalil, K. (2019). Random Matrix Theory: Selected Applications from Statistical Signal Processing and Machine Learning. KAUST Research Repository. https://doi.org/10.25781/KAUST-Q7W78
dc.identifier.doi: 10.25781/KAUST-Q7W78
dc.identifier.uri: http://hdl.handle.net/10754/655682
dc.description.abstract: Random matrix theory is an outstanding mathematical tool that has demonstrated its usefulness in many areas, ranging from wireless communication to finance and economics. The main motivation behind its use comes from the fundamental role that random matrices play in modeling unknown and unpredictable physical quantities. In many situations, meaningful metrics expressed as scalar functionals of these random matrices arise naturally. Along this line, the present work leverages tools from random matrix theory in an attempt to answer fundamental questions related to applications from statistical signal processing and machine learning. In the first part, this thesis addresses the development of analytical tools for the computation of the inverse moments of random Gram matrices with one-side correlation. Such a question is mainly driven by applications in signal processing and wireless communications wherein such matrices naturally arise. In particular, we derive closed-form expressions for the inverse moments and show that the obtained results can help approximate several performance metrics of common estimation techniques. Then, we carry out a large dimensional study of discriminant analysis classifiers. Under mild assumptions, we show that the asymptotic classification error approaches a deterministic quantity that depends only on the means and covariances associated with each class as well as the problem dimensions. Such a result permits a better understanding of the underlying classifiers in practical, large but finite dimensions, and can be used to optimize their performance. Finally, we revisit kernel ridge regression and study a centered version of it that we call centered kernel ridge regression, or CKRR for short. Relying on recent advances in the asymptotic properties of random kernel matrices, we carry out a large dimensional analysis of CKRR under the assumption that both the data dimension and the training size grow simultaneously large at the same rate. We particularly show that both the empirical and prediction risks converge to a limiting risk that relates the performance to the data statistics and the parameters involved. Such a result is important as it permits a better understanding of kernel ridge regression and makes it possible to efficiently optimize its performance. (Illustrative numerical sketches of two of these techniques follow the record below.)
dc.language.iso: en
dc.subject: Random matrix theory
dc.subject: discriminant analysis
dc.subject: kernel regression
dc.subject: High dimensional statistics
dc.title: Random Matrix Theory: Selected Applications from Statistical Signal Processing and Machine Learning
dc.type: Dissertation
dc.contributor.department: Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division
thesis.degree.grantor: King Abdullah University of Science and Technology
dc.contributor.committeemember: Kammoun, Abla
dc.contributor.committeemember: Hero, Alfred
dc.contributor.committeemember: Zhang, Xiangliang
dc.contributor.committeemember: Alouini, Mohamed-Slim
thesis.degree.discipline: Electrical Engineering
thesis.degree.name: Doctor of Philosophy
refterms.dateFOA: 2019-06-25T13:56:11Z
kaust.request.doi: yes
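
On the first part of the abstract: the closed-form expressions are not reproduced there, but the quantity they approximate can be illustrated numerically. Below is a minimal Monte Carlo sketch, in Python, of the first inverse moment (1/m) E[tr G^{-1}] of a Gram matrix G = (1/n) H^T H with one-side correlation H = R^{1/2} X. The exponential (Toeplitz) correlation model, the dimensions, and the 1/n normalization are illustrative assumptions, not taken from the thesis.

import numpy as np

rng = np.random.default_rng(0)

n, m = 256, 64    # samples x variables; n > m keeps G invertible
rho = 0.4         # assumed exponential correlation coefficient

# One-side correlation: R[i, j] = rho^|i - j| on the n side (assumed model).
R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = np.linalg.cholesky(R)   # R = L L^T, so each column of H = L X has covariance R

def first_inverse_moment(trials=200):
    """Monte Carlo estimate of (1/m) E[tr G^{-1}] for G = (1/n) H^T H."""
    acc = 0.0
    for _ in range(trials):
        X = rng.standard_normal((n, m))
        H = L @ X
        G = H.T @ H / n
        acc += np.trace(np.linalg.inv(G)) / m
    return acc / trials

print(f"first inverse moment ~ {first_inverse_moment():.4f}")

A closed-form expression of the kind derived in the thesis would replace the Monte Carlo loop with a deterministic formula in R and the dimensions; the estimate above serves only as a reference point against which such a formula can be checked.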
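
Similarly, for the last part: the abstract names centered kernel ridge regression (CKRR) without giving its equations. The sketch below implements one standard centering construction, K_c = P K P with the centering projector P = I - (1/n) 1 1^T, followed by an ordinary kernel ridge solve on the centered quantities. The RBF kernel, the synthetic Gaussian data, the target function, and all parameter values are illustrative assumptions rather than the thesis's setup. In the regime described in the abstract, both printed risks should concentrate around deterministic limits as p and n grow together at a fixed rate.

import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

# Synthetic data; 1/sqrt(p) scaling keeps ||x|| of order one as p grows.
p, n, n_test, lam = 100, 400, 200, 0.1
X = rng.standard_normal((n, p)) / np.sqrt(p)
X_test = rng.standard_normal((n_test, p)) / np.sqrt(p)
f = lambda Z: np.sin(Z.sum(axis=1))          # hypothetical target function
y, y_test = f(X), f(X_test)

# Centered kernel ridge regression: center K and y, ridge-solve, add the mean back.
K = rbf_kernel(X, X)
P = np.eye(n) - np.ones((n, n)) / n           # centering projector
K_c = P @ K @ P
alpha = np.linalg.solve(K_c + n * lam * np.eye(n), P @ y)

# Centered cross-kernel for the test points: k_c(x) = P (k(x) - (1/n) K 1).
k_test = rbf_kernel(X, X_test)                # n x n_test cross-kernel
k_c = P @ (k_test - (K @ np.ones((n, 1))) / n)
y_hat = y.mean() + k_c.T @ alpha

print("empirical risk :", np.mean((y - (y.mean() + K_c @ alpha)) ** 2))
print("prediction risk:", np.mean((y_test - y_hat) ** 2))

The in-sample fitted values use K_c @ alpha because, for the training points themselves, the centered cross-kernel reduces to K_c.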


Files in this item

Name: Final Thesis.pdf
Size: 1.291 MB
Format: PDF
Description: Final thesis
