Presentations
Recent Submissions

Solution of Density-Driven Groundwater Flow with Uncertain Porosity and Permeability Coefficients (2019-03) [Presentation] In many countries, groundwater is a strategic reserve, used as drinking water and as an irrigation resource. Therefore, accurate modeling of the pollution of the soil and the groundwater aquifer is highly important. As a model, we consider a density-driven groundwater flow problem with uncertain porosity and permeability. This problem may arise in geothermal reservoir simulation, natural saline-disposal basins, and the modeling of contaminant plumes and subsurface flow. This strongly nonlinear problem describes how salt or polluted water streams down, building "fingers". The solving process requires a very fine unstructured mesh and, therefore, high computational resources. Consequently, we run the parallel multigrid solver UG4 (https://github.com/UG4/ughub.wiki.git) on the Shaheen II supercomputer. The parallelization is done in both the physical space and the stochastic space. The novelty of this work is the estimation of the risk that the pollution will reach a specific critical concentration. Additionally, we demonstrate how the multigrid UG4 solver can be run in a black-box fashion for testing different scenarios in the density-driven flow. We solve Elder's problem in 2D and 3D domains, where the unknown porosity and permeability are modeled by random fields. For approximations in the stochastic space, we use the generalized polynomial chaos expansion.

Efficient Simulations for Contamination of Groundwater Aquifers under Uncertainties (2019-02-25) [Presentation] Accidental contamination of groundwater can be extremely hazardous, and thus accurately predicting the fate of pollutants in groundwater is essential. Certain pollutants are soluble in water and can leak into groundwater systems, for example as seawater intrusion into coastal aquifers or as wastewater leaks. Indeed, some pollutants can change the density of the fluid and induce density-driven flows within the aquifer. This causes faster propagation of the contamination due to convection. Thus, simulation and analysis of this density-driven flow play an important role in predicting how pollution can migrate through an aquifer. We propose a new parallel algorithm to compute a functional approximation of the quantity of interest (QoI). Namely, we approximate the QoI with the polynomial chaos expansion (PCE), where all PCE coefficients are computed in parallel. We demonstrate 2D and 3D examples.
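The parallel-PCE idea in this abstract can be sketched in a few lines: project a scalar QoI onto a Hermite chaos basis by Gauss-Hermite quadrature and compute the independent coefficients concurrently. This is a minimal illustration, not the talk's implementation: `qoi` is a hypothetical stand-in for the groundwater solver, and threads stand in for the distributed parallelism described in the talk.

```python
import math
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from numpy.polynomial import hermite_e as He

def qoi(theta):
    # hypothetical scalar quantity of interest; exp(theta) has the known
    # Hermite-PCE expansion c_n = sqrt(e)/n!, so results are easy to check
    return np.exp(theta)

nodes, weights = He.hermegauss(40)        # probabilists' Gauss-Hermite rule
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0,1) density
u_vals = qoi(nodes)                       # in practice each evaluation is a PDE solve

def pce_coeff(n):
    # projection: c_n = E[u(theta) He_n(theta)] / E[He_n^2], with E[He_n^2] = n!
    basis = He.hermeval(nodes, np.eye(n + 1)[n])
    return float(np.sum(weights * u_vals * basis)) / math.factorial(n)

# the coefficients are independent of each other, so they can be computed in
# parallel; in the real setting each is a weighted sum of expensive solves
with ThreadPoolExecutor() as pool:
    coeffs = list(pool.map(pce_coeff, range(6)))
```

Since the same solver evaluations `u_vals` are reused by every coefficient, the expensive part (the forward solves at the quadrature nodes) also parallelizes trivially.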

Numerical methods for density-driven groundwater flow with uncertain data (2019-02-20) [Presentation] Accurate modeling of contamination in subsurface flow and water aquifers is crucial for agriculture and environmental protection. We consider density-driven subsurface flow and estimate how uncertainty in the permeability and porosity propagates to the solution, i.e., the mass fraction. We take an Elder-like problem as a numerical benchmark, and we use random fields to model the limited knowledge of the porosity and permeability. We construct a low-cost generalized polynomial chaos expansion (gPCE) surrogate model, where the gPCE coefficients are computed by projection on sparse and full tensor grids. We parallelize both the numerical solver for the deterministic problem, based on the multigrid method, and the quadrature over the parametric space.
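The projection of gPCE coefficients on a full tensor grid, as mentioned above, can be sketched for two uniform random parameters standing in for porosity and permeability. The quadratic QoI below is a toy whose Legendre expansion is known in closed form, so this is a sanity-checkable sketch rather than the authors' solver; in the talk, each grid-point evaluation would be a multigrid solve.

```python
import numpy as np
from itertools import product
from numpy.polynomial import legendre as L

def qoi(y1, y2):
    # toy QoI of two uniform(-1,1) parameters; equals
    # P1(y1)P1(y2) + (2/3)P2(y1) + 1/3 in the Legendre basis
    return y1 * y2 + y1 ** 2

x, w = L.leggauss(8)
w = w / 2.0                              # normalize to the uniform density on [-1,1]
Y1, Y2 = np.meshgrid(x, x, indexing="ij")
W = np.outer(w, w)                       # full tensor-grid quadrature weights
U = qoi(Y1, Y2)                          # solver evaluations on the tensor grid

def coeff(i, j):
    # projection: c_ij = E[u P_i P_j] / (E[P_i^2] E[P_j^2]),
    # with E[P_n^2] = 1/(2n+1) for the uniform density
    Pi = L.legval(x, np.eye(i + 1)[i])
    Pj = L.legval(x, np.eye(j + 1)[j])
    return float(np.sum(W * U * np.outer(Pi, Pj))) * (2 * i + 1) * (2 * j + 1)

coeffs = {(i, j): coeff(i, j) for i, j in product(range(3), repeat=2)}
```

A sparse (Smolyak) grid would replace the full tensor product `W` with a telescoping combination of smaller grids; the projection formula is unchanged.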

Integration Strategy for Heterogeneously Integrated Wearable and Implantable Electronics (2018 International Flexible Electronics Technology Conference (IFETC), IEEE, 2019-01-09) [Presentation] We live in a world where electronics play a critical enabling role. Specifically, mature and advanced CMOS technology, with its art and science of miniaturization, has propelled a variety of CMOS devices to a level where their lofty performance-over-cost benefit has ushered in a wide application spectrum, ranging from computers to displays to today's home automation. Going forward, we may want to ask ourselves a few important questions: 1. Can CMOS technology be expanded further to add new functionalities to CMOS devices while keeping their existing attributes intact? 2. Will this exercise yield a better functionality-over-cost metric? 3. If the first two questions are addressed well, will the existing applications be strengthened and/or diversified? Will new applications emerge?

GPU-based large-scale scientific visualization (SIGGRAPH Asia 2018 Courses, SA '18, ACM Press, 2018-11-29) [Presentation] Display-aware processing with a flexible new image pyramid (spdf map): a consistent, sparse representation of pixel-footprint PDFs, and unified evaluation of many important nonlinear image operations. Local Laplacian filtering for gigapixel images: an efficient CUDA implementation; the precomputation is costly but performed only once, while run-time storage and computation are similar to standard pyramids.

Integer programming for layout problems (SIGGRAPH Asia 2018 Courses, SA '18, ACM Press, 2018-11-29) [Presentation]

Uncertainty quantification of groundwater contamination (2018-10-08) [Presentation] In many countries, groundwater is a strategic reserve, used as drinking water and as an irrigation resource. Therefore, accurate modeling of the pollution of the soil and groundwater aquifer is highly important. As a model, we consider a density-driven groundwater flow problem with uncertain porosity and permeability. This problem may arise in geothermal reservoir simulation, natural saline-disposal basins, and the modeling of contaminant plumes and subsurface flow. This strongly nonlinear problem describes how salt or polluted water streams down, building "fingers". The solving process requires a very fine unstructured mesh and, therefore, high computational resources. Consequently, we run the parallel multigrid solver UG4 (https://github.com/UG4/ughub.wiki.git) on the Shaheen II supercomputer. The parallelization is done in both the physical space and the stochastic space. The novelty of this work is the estimation of the risk that the pollution will reach a specific critical concentration. Additionally, we demonstrate how the multigrid UG4 solver can be run in a black-box fashion for testing different scenarios in the density-driven flow. We solve Elder's problem in 2D and 3D domains, where the unknown porosity and permeability are modeled by random fields. For approximations in the stochastic space, we use the generalized polynomial chaos expansion. We compute different quantities of interest, such as the mean, variance and exceedance probabilities of the concentration. As a reference solution, we use the solution obtained with the quasi-Monte Carlo method.
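Once a surrogate of the concentration is available, the exceedance probabilities mentioned in this abstract are cheap to estimate by sampling the surrogate instead of the PDE solver. A minimal sketch, assuming a hypothetical one-parameter surrogate in place of the fitted gPCE expansion; the threshold value is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate(theta):
    # hypothetical stand-in for the gPCE surrogate of the salt mass fraction
    # at a monitoring point; cheap to sample, unlike the full UG4 solve
    return np.exp(theta)

theta = rng.standard_normal(200_000)      # samples of the stochastic input
conc = surrogate(theta)
threshold = 1.0                           # hypothetical critical concentration
p_exceed = float(np.mean(conc > threshold))  # P(concentration > threshold)
mean, var = conc.mean(), conc.var()       # other quantities of interest
```

A quasi-Monte Carlo reference, as in the talk, would replace `standard_normal` with a transformed low-discrepancy sequence.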

Ultraviolet FSO to laser-based VLC – the role of group-III-nitride devices (2018-10-04) [Presentation]

Overview of Low-rank and Sparse Techniques in Spatial Statistics and Parameter Identification (2018-10-03) [Presentation] Motivation: improve statistical models by implementing more efficient numerical tools. Major goal: develop new statistical tools to address new problems. Overview: low-rank matrices, sparse matrices, hierarchical matrices; approximation of Matérn covariance functions and the joint Gaussian likelihood; identification of unknown parameters via maximizing the Gaussian log-likelihood; low-rank tensor methods.

Multilevel Monte Carlo Acceleration of Seismic Wave Propagation under Uncertainty (2018-09-06) [Presentation] We consider forward seismic wave propagation in an inhomogeneous linear viscoelastic medium with random wave speeds and densities, subject to deterministic boundary and initial conditions. We study this forward problem as a first step towards the treatment of inverse problems, where the goal is to determine, for example, earthquake source locations from seismograms recorded by a small number of seismic sensors at the Earth's surface. Existing results on earthquake source inversion for a given event show a large variability, which indicates that the inherent uncertainty of the Earth parameters should be taken into account. Here this uncertainty is modeled through random parameters. We propose multilevel Monte Carlo simulations for computing statistics of quantities of interest that are motivated by the choice of loss function for the corresponding inverse problem, presenting a case study based on experimental seismic data from a passive experiment in Tanzania. This work provides a benchmark for the implementation of multilevel algorithms to accelerate seismic inversion, addressing earthquake source estimation as well as inferring Earth structure.
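The multilevel Monte Carlo structure behind this abstract can be illustrated with a toy hierarchy in which "level" means the resolution of a quadrature rule rather than a wave-propagation mesh; the telescoping estimator and the sample allocation (many cheap coarse samples, few expensive fine ones) are the same. All names and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def qoi_level(a, level):
    # toy "level-l solver": trapezoid rule for int_0^1 sin(a*x) dx on 2^level
    # cells; in the talk this would be a viscoelastic wave solve on mesh level l
    x = np.linspace(0.0, 1.0, 2 ** level + 1)
    y = np.sin(a * x)
    return float(np.sum((y[:-1] + y[1:]) * 0.5 * (x[1] - x[0])))

def mlmc_mean(levels, samples_per_level):
    # telescoping estimator E[P_L] ~ E[P_0] + sum_l E[P_l - P_{l-1}], with the
    # same random input a used on the fine and coarse level of each correction
    est = 0.0
    for l, n in zip(levels, samples_per_level):
        a = rng.uniform(0.5, 1.5, size=n)   # random medium parameter
        fine = np.array([qoi_level(ai, l) for ai in a])
        if l == levels[0]:
            est += fine.mean()
        else:
            coarse = np.array([qoi_level(ai, l - 1) for ai in a])
            est += (fine - coarse).mean()   # corrections have small variance
    return est

# many coarse samples, few fine ones
est = mlmc_mean(levels=[2, 3, 4, 5], samples_per_level=[4000, 1000, 250, 60])
```

Because the level corrections have rapidly decaying variance, far fewer samples are needed on the expensive fine levels than plain Monte Carlo would require.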

Multilevel ensemble Kalman filtering for spatiotemporal processes (2018-07-04) [Presentation] The ensemble Kalman filter (EnKF) is a sequential filtering method that uses an ensemble of particle paths to estimate the means and covariances required by the Kalman filter through sample moments, i.e., the Monte Carlo method. EnKF is often both robust and efficient, but its performance may suffer in settings where the computational cost of accurate particle simulations is high. The multilevel Monte Carlo method (MLMC) is an extension of the classical Monte Carlo method which, by sampling stochastic realizations on a hierarchy of resolutions, may reduce the computational cost of moment approximations by orders of magnitude. In this talk I will present ideas on combining MLMC and EnKF to construct the multilevel ensemble Kalman filter (MLEnKF) for the setting of finite- and infinite-dimensional state spaces. Theoretical results and numerical studies of the performance gain of MLEnKF over EnKF will also be presented. (Joint work with Alexey Chernov, Kody J. H. Law, Fabio Nobile, and Raul Tempone.) References: [1] H. Hoel, K. Law, and R. Tempone (2016). Multilevel ensemble Kalman filtering. SIAM J. Numer. Anal. 54(3), 1813–1839. [2] A. Chernov, H. Hoel, K. Law, F. Nobile, and R. Tempone (2016). Multilevel ensemble Kalman filtering for spatially extended models. arXiv e-prints, arXiv:1608.08558 [math.NA].
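The EnKF analysis step described in this abstract (a Kalman update built from sample moments) can be sketched as follows. This is the single-level, perturbed-observation variant on a toy two-dimensional state, not the multilevel scheme of the talk; all dimensions and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(ens, y, H, r):
    # ens: (d, N) forecast ensemble; y: (m,) observation; H: (m, d) linear
    # observation operator; r: observation noise variance.
    N = ens.shape[1]
    mean = ens.mean(axis=1, keepdims=True)
    A = ens - mean                            # ensemble anomalies
    C = A @ A.T / (N - 1)                     # sample covariance (Monte Carlo)
    S = H @ C @ H.T + r * np.eye(H.shape[0])  # innovation covariance
    K = C @ H.T @ np.linalg.inv(S)            # Kalman gain from sample moments
    y_pert = y[:, None] + np.sqrt(r) * rng.standard_normal((H.shape[0], N))
    return ens + K @ (y_pert - H @ ens)       # perturbed-observation update

# toy example: estimate a 2D state from a noisy observation of its first
# component; prior N(2,1) on x1, observation 0.0 with variance 0.25
prior = rng.standard_normal((2, 5000)) + np.array([[2.0], [0.0]])
H = np.array([[1.0, 0.0]])
post = enkf_update(prior, y=np.array([0.0]), H=H, r=0.25)
```

MLEnKF replaces the single sample covariance `C` with a telescoping sum of covariance estimates over a hierarchy of resolutions, in the spirit of MLMC.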

Adaptive Strategies in Date Palm Revealed by Confocal Imaging Technologies (2018-06-22) [Presentation] Date palms are confronted by harsh environmental conditions and have therefore adopted various strategies to survive the hostile environment. To unravel the underlying mechanisms of adaptation to desert conditions, we conducted a detailed analysis of date palm tissue anatomy at different developmental stages. Using confocal imaging, we reveal new anatomical features and complex structures in roots, shoots and leaves, explaining the strategies of adaptation of the date palm to desert conditions.

Integrative Approach Toward Revealing and Understanding Complexity of Root System Architecture in Date Palm (2018-06-22) [Presentation] The evolution from primordial aquatic organisms to vascular terrestrial plants has been accompanied by increasing complexity in the structure and functions of their vegetative and reproductive organs. Plants have undergone dramatic changes in their root systems to adapt to terrestrial life. The development of complex and diverse root architectures gave plants the ability to colonize new, and particularly arid and dry, environments. The fruits of the date palm, Phoenix dactylifera, are known for their high nutritive, economic and social value. In arid and semi-arid areas, the date palm plays an important role in affecting the microclimate by creating a microsystem that allows desert farming. Understanding the properties of growth and development in date palm is an essential step towards gaining insights into how plants have evolved their strategies to cope with changes in their surroundings and survive in challenging habitats like the desert. To unravel the underlying mechanisms of date palm adaptation to desert conditions, we conducted a detailed analysis of date palm anatomy at different stages of development, from germination to adult plants. Using state-of-the-art imaging technologies, we unraveled new developmental mechanisms in date palm occurring during germination, plant growth and development. Micro-CT X-ray imaging combined with high-resolution microscopy revealed that date palm roots bear structures that have not been previously described. Some of these structures are conserved only among desert palm species. In addition, a comparative study of date palm cultivars originating from different geographical habitats (Tunisia, the UAE and KSA) and having distinct levels of tolerance to soil salinity revealed substantial differences in root system architecture.

Role of the library's subscription licenses in promoting open access to scientific research (2018-04-30) [Presentation] This presentation, based on KAUST's experience to date, will attempt to explain the different ways of bringing open access models into scientific publishers' licenses. Our dual approach with offset pricing is to redirect subscription money to publishing money and to embed green open access deposition terms, in understandable language, in our license agreements. Resolving the inherent complexities in open access publishing, repository depositions and offsetting models will save libraries money, as well as time wasted on tedious and unnecessary administration work. Researchers will also save time through overall clarity and transparency. This will enable trust and, where mistakes are made (and there inevitably will be with untried models), we can learn from these mistakes and build better, more robust services, with automatic deposition of our articles into our repository fed by the publishers themselves. The plan is to cover all publishers with OA license terms protecting KAUST authors' rights while continuing our subscriptions to them. Marketing campaigns and awareness sessions are planned, in addition to establishing LibGuides to help researchers and to manage offset pricing models.

Application of Parallel Hierarchical Matrices in Spatial Statistics and Parameter Identification (2018-04-20) [Presentation] Parallel H-matrices in spatial statistics: 1. Motivation: improve the statistical model. 2. Tools: hierarchical matrices [Hackbusch 1999]. 3. The Matérn covariance function and joint Gaussian likelihood. 4. Identification of unknown parameters via maximizing the Gaussian log-likelihood. 5. Implementation with HLIBPro.
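Item 4 above, maximizing the Gaussian log-likelihood, can be sketched with a dense computation for the Matérn ν = 1/2 (exponential) kernel. Here a dense Cholesky factorization stands in for the H-matrix arithmetic that HLIBPro provides at large n, and the grid of candidate length scales is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def exp_cov(pts, ell, sigma2):
    # Matérn covariance with nu = 1/2 (exponential kernel); dense here,
    # compressed hierarchically in the talk's setting
    d = np.abs(pts[:, None] - pts[None, :])
    return sigma2 * np.exp(-d / ell)

def log_likelihood(z, pts, ell, sigma2):
    # joint Gaussian log-likelihood; the Cholesky factor gives both the
    # log-determinant and the quadratic form z^T C^{-1} z
    C = exp_cov(pts, ell, sigma2)
    L = np.linalg.cholesky(C)
    alpha = np.linalg.solve(L, z)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    n = len(z)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + alpha @ alpha)

# identify the length scale by maximizing over a grid of candidates
pts = np.sort(rng.uniform(0.0, 10.0, 200))
z = np.linalg.cholesky(exp_cov(pts, 1.0, 1.0)) @ rng.standard_normal(200)
grid = [0.25, 0.5, 1.0, 2.0, 4.0]
best = max(grid, key=lambda ell: log_likelihood(z, pts, ell, 1.0))
```

The dense factorization costs O(n^3); the point of the H-matrix approach is to bring the factorization, solve and log-determinant down to near-linear complexity.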

Tucker tensor analysis of Matérn functions in spatial statistics (2018-04-20) [Presentation] Low-rank Tucker tensor methods in spatial statistics: 1. Motivation: improve statistical models. 2. Motivation: disadvantages of matrices. 3. Tools: the Tucker tensor format. 4. Tensor approximation of the Matérn covariance function via FFT. 5. Typical statistical operations in the Tucker tensor format. 6. Numerical experiments.
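The Tucker format of item 3 can be illustrated with a truncated higher-order SVD, one common way to compute a Tucker decomposition (not necessarily the talk's FFT-based construction), applied to a separable exponential-kernel tensor that has exact multilinear rank (1, 1, 1).

```python
import numpy as np

def unfold(T, mode):
    # mode-n unfolding of a 3-way tensor into a matrix
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    # truncated higher-order SVD: factor matrices from the leading left
    # singular vectors of each unfolding, core by projecting T onto them
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    core = np.einsum('abc,ai,bj,ck->ijk', T, U[0], U[1], U[2])
    return core, U

# separable kernel e^{-(|x|+|y|+|z|)} = e^{-|x|} e^{-|y|} e^{-|z|} on a grid:
# a 3-way tensor that Tucker compresses exactly with ranks (1, 1, 1)
x = np.linspace(0.0, 1.0, 30)
T = np.exp(-(np.abs(x)[:, None, None]
             + np.abs(x)[None, :, None]
             + np.abs(x)[None, None, :]))
core, U = hosvd(T, ranks=(1, 1, 1))
approx = np.einsum('ijk,ai,bj,ck->abc', core, U[0], U[1], U[2])
err = np.linalg.norm(T - approx) / np.linalg.norm(T)
```

Storage drops from 30^3 entries to three factors of 30 entries plus a 1x1x1 core; non-separable Matérn kernels need larger, but still small, multilinear ranks.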

Exploring offset pricing models and article deposit terms at King Abdullah University of Science & Technology (KAUST) (2018-04-09) [Presentation] In the 'normal' world of retail and commerce, you pay for an item and receive the item. In the world of academic journals, you prepay for the item, you might receive the item, and you might get some money back, depending on which journals you did or didn't receive. In the world of offset pricing, you prepay, then you pay again; you sometimes use vouchers; you might get a discount (the following year); then you might get money back, or you might not. Are publishers knowingly placing barriers to offset models, and not transparently offsetting the APCs against the subscription cost, in order to raise more income? Whether by design or by accident, it is a complex world that needs a time commitment, which not all librarians can give, to understand fully. The new model of scholarly communication, which leading universities (including KAUST) want to introduce, is based on shifting subscription costs to publishing costs, not on doubling the payment channels to the publishers. Can we get to a mutually beneficial position where the author can deposit the accepted version of the article into the institutional repository without any embargo period, since the institute agrees to pay the subscription fee on an ongoing basis? The required model does not adversely affect the vendors' revenue. This presentation, based on KAUST's experience to date, will attempt to explain the different models of offset pricing while outlining KAUST's dual approach to the problem: redirecting subscription money to publishing money, and embedding open access terms in understandable language in our license agreements. Why have we accepted IOP's offset offer and not Springer's, even though we were considered among the first movers and an important institution? Why is this important?
Resolving the inherent complexities in offsetting models will save libraries money, as well as time wasted on tedious and unnecessary administration work. Researchers do not want to know about offsetting agreements, nor should they need to know. It is difficult enough to do and write up valuable research without having to do further research on offset pricing models. The authors of the articles, without whom, as academic librarians or publishers, we would be redundant, are often the neglected link in the chain. Finally, the institutional repository needs to know what we are up to. The current answer to many queries, "it depends on the publisher", isn't good enough. There has to be a standard model. What is needed overall is clarity and transparency. This will enable trust and, where mistakes are made (and there inevitably will be with untried models), we can learn from these mistakes and build better, more robust services, with automatic deposition of our articles into our repository fed by the publishers themselves. If libraries can organize as groups at the regional or (with more difficulty) international level, more favorable licensing agreements, including standardized offset pricing language, can be negotiated, to the advantage of all parties: publishers, libraries and, most importantly, authors. It is incumbent on us to familiarize ourselves with the pricing models, in all their complexity, and to strive through collective organization to have these models simplified and standardized. Let's turn that subscription money into publishing money.

Application of Parallel Hierarchical Matrices and Low-Rank Tensors in Spatial Statistics and Parameter Identification (2018-03-12) [Presentation] Part 1: Parallel H-matrices in spatial statistics. 1. Motivation: improve the statistical model. 2. Tools: hierarchical matrices. 3. The Matérn covariance function and joint Gaussian likelihood. 4. Identification of unknown parameters via maximizing the Gaussian log-likelihood. 5. Implementation with HLIBPro. Part 2: Low-rank Tucker tensor methods in spatial statistics.

Making discoveries – how library systems development can facilitate user success (2018-03-07) [Presentation]