On the Efficient Simulation of Outage Probability in a Log-normal Fading Environment
KAUST Department: Applied Mathematics and Computational Science Program
Center for Uncertainty Quantification in Computational Science and Engineering (SRI-UQ)
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Electrical Engineering Program
Permanent link to this record: http://hdl.handle.net/10754/622924
Abstract: The outage probability (OP) of the signal-to-interference-plus-noise ratio (SINR) is an important metric used to evaluate the performance of wireless systems. One difficulty in assessing the OP is that, in realistic scenarios, closed-form expressions cannot be derived. This is, for instance, the case in a Log-normal environment, where evaluating the OP of the SINR amounts to computing the probability that a sum of correlated Log-normal variates exceeds a given threshold. Since such a probability does not admit a closed-form expression, it has thus far been evaluated by several approximation techniques, whose accuracies are not guaranteed in the region of small OPs. For these regions, simulation techniques based on variance reduction algorithms are a good alternative, being fast and highly accurate for estimating rare-event probabilities. This constitutes the major motivation behind our work. More specifically, we propose a generalized hybrid importance sampling scheme, based on a combination of mean shifting and covariance matrix scaling, to evaluate the OP of the SINR in a Log-normal environment. We further our analysis by providing a detailed study of two particular cases. Finally, the performance of these techniques is assessed both theoretically and through various simulation results.
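To make the idea concrete, the following is a minimal sketch of a hybrid importance sampling estimator for the tail probability of a sum of correlated Log-normal variates. All problem parameters here (dimension, mean, covariance, threshold, shift, and scaling factor) are illustrative assumptions, not values from the paper, which derives optimized biasing parameters; the sketch only shows the mechanics of sampling from a mean-shifted, covariance-scaled Gaussian and reweighting by the likelihood ratio.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): X ~ N(mu, Sigma) in R^d, and we
# estimate P(S > gamma) where S = sum_i exp(X_i) is a sum of correlated
# Log-normal variates.
d = 4
mu = np.zeros(d)
Sigma = 0.5 * np.eye(d) + 0.5 * np.ones((d, d))  # equicorrelated exponents
gamma = 100.0                                    # high threshold: rare event
n = 100_000

# Hybrid IS proposal: shift the mean toward the rare-event region and scale
# the covariance. These values are ad hoc; the paper optimizes them.
shift = 3.0 * np.ones(d)
scale = 1.2

p = multivariate_normal(mean=mu, cov=Sigma)            # nominal density
q = multivariate_normal(mean=mu + shift, cov=scale * Sigma)  # IS proposal

# Sample from the proposal and reweight by the likelihood ratio p/q.
X = rng.multivariate_normal(mu + shift, scale * Sigma, size=n)
S = np.exp(X).sum(axis=1)
w = np.exp(p.logpdf(X) - q.logpdf(X))  # importance weights
est = np.mean((S > gamma) * w)         # unbiased estimate of P(S > gamma)

# Crude Monte Carlo for comparison: at rare thresholds it may see no hits
# at all, which is exactly the failure mode IS addresses.
Y = rng.multivariate_normal(mu, Sigma, size=n)
crude = np.mean(np.exp(Y).sum(axis=1) > gamma)
print(f"IS estimate: {est:.3e}, crude MC estimate: {crude:.3e}")
```

Because the proposal concentrates samples where the rare event occurs, the indicator is hit frequently and the small weights keep the estimator unbiased; crude Monte Carlo with the same budget has far fewer (possibly zero) hits, hence much higher relative error.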
Citation: Ben Rached N, Kammoun A, Alouini M-S, Tempone R (2017) On the Efficient Simulation of Outage Probability in a Log-normal Fading Environment. IEEE Transactions on Communications: 1–1. Available: http://dx.doi.org/10.1109/TCOMM.2017.2669979.