• Integration Strategy for Heterogeneously Integrated Wearable and Implantable Electronics

      Hussain, Muhammad Mustafa (IEEE, 2019-01-09)
      We live in a world where electronics play a critical enabling role. Specifically, mature and advanced CMOS technology, with its art and science of miniaturization, has propelled a variety of CMOS devices to a level where their high performance-over-cost benefit has opened a wide application spectrum, ranging from computers to displays to today's home automation. Going forward, we may want to ask ourselves a few important questions: 1. Can CMOS technology be expanded further to add new functionalities to CMOS devices while keeping their existing attributes intact? 2. Will this exercise achieve a better functionality-over-cost metric? 3. If the first two questions are addressed well, will existing applications be strengthened and/or diversified, and will new applications emerge?
    • Adaptive Strategies in Date Palm Revealed by Confocal Imaging Technologies

      Xiao, Ting Ting; Blilou, Ikram (2018-06-22)
      Date palms are confronted by harsh environmental conditions and have therefore adopted various strategies to survive in this hostile environment. To unravel the underlying mechanisms of adaptation to desert conditions, we conducted a detailed analysis of date palm tissue anatomy at different developmental stages. Using confocal imaging, we reveal new anatomical features and complex structures in roots, shoots and leaves, explaining the strategies by which date palm adapts to desert conditions.
    • Integrative Approach Toward Revealing and Understanding Complexity of Root System Architecture in Date Palm

      Blilou, Ikram (2018-06-22)
      The evolution from primordial aquatic organisms to vascular terrestrial plants has been accompanied by increasing complexity in the structure and functions of their vegetative and reproductive organs. Plants have undergone dramatic changes in their root systems to adapt to terrestrial life. The development of complex, diverse root architectures gave plants the ability to colonize new, and particularly arid and dry, environments. Date palm (Phoenix dactylifera) fruits are known for their high nutritive, economic and social values. In arid and semi-arid areas, the tree plays an important role in shaping the microclimate by creating a microsystem that allows desert farming. Understanding the properties of growth and development in date palm is an essential step towards gaining insights into how plants have evolved their strategies to cope with changes in their surroundings and survive in challenging habitats like the desert. To unravel the underlying mechanisms of date palm adaptation to desert conditions, we conducted a detailed analysis of date palm anatomy during different stages of development, from germination to adult plants. Using state-of-the-art imaging technologies, we uncovered new developmental mechanisms in date palm occurring during germination, plant growth and development. Micro-CT X-ray imaging combined with high-resolution microscopy revealed that date palm roots bear structures that have not been previously described. Some of these structures are conserved only among desert palm species. In addition, a comparative study of date palm cultivars originating from different geographical habitats (Tunisia, UAE and KSA) and having distinct levels of tolerance to soil salinity revealed substantial differences in root system architecture.
    • Multilevel hybrid split-step implicit tau-leap

      Ben Hammouda, Chiheb; Moraes, Alvaro; Tempone, Raul (2016-08-15)
      In biochemically reactive systems with small copy numbers of one or more reactant molecules, the dynamics is dominated by stochastic effects. To approximate those systems, discrete state-space and stochastic simulation approaches have been shown to be more relevant than continuous state-space and deterministic ones. In systems characterized by having simultaneously fast and slow timescales, existing discrete state-space stochastic path simulation methods, such as the stochastic simulation algorithm (SSA) and the explicit tau-leap method, can be very slow. Implicit approximations have been developed to improve numerical stability and provide efficient simulation algorithms for those systems. Here, we propose an efficient Multilevel Monte Carlo (MLMC) method in the spirit of the work by Anderson and Higham (2012). This method uses split-step implicit tau-leap (SSI-TL) at levels where the explicit-TL method is not applicable due to numerical stability issues. We present numerical examples that illustrate the performance of the proposed method.
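The telescoping MLMC construction described above can be made concrete on a toy pure-death process. The following is a minimal sketch, not the authors' SSI-TL method: it couples explicit tau-leap paths on consecutive time grids with the split-Poisson coupling of Anderson and Higham (2012), and the rate constant, horizon and sample counts are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def coupled_tau_leap(x0, c, T, nc):
    """One coupled (coarse, fine) tau-leap path pair for the pure-death
    process X -> X - 1 with propensity a(x) = c*x, using the split-Poisson
    coupling of Anderson and Higham (2012). The coarse grid has nc steps,
    the fine grid 2*nc."""
    hf = T / (2 * nc)                   # fine step size
    xc = xf = float(x0)
    for _ in range(nc):
        ac = c * xc                     # coarse propensity, frozen over the coarse step
        for _ in range(2):              # two fine substeps per coarse step
            af = c * xf
            m = min(af, ac)
            common = rng.poisson(m * hf)                        # shared jumps
            xf = max(xf - common - rng.poisson((af - m) * hf), 0.0)
            xc = max(xc - common - rng.poisson((ac - m) * hf), 0.0)
    return xc, xf

def mlmc_mean(x0, c, T, L, N):
    """Telescoping MLMC estimate of E[X(T)]: a cheap coarsest-level mean
    plus coupled fine-minus-coarse corrections, N samples per level."""
    n0 = 4                              # steps on the coarsest level
    est = np.mean([coupled_tau_leap(x0, c, T, n0 // 2)[1] for _ in range(N)])
    for l in range(1, L + 1):
        pairs = [coupled_tau_leap(x0, c, T, n0 * 2 ** (l - 1)) for _ in range(N)]
        est += np.mean([xf - xc for xc, xf in pairs])
    return est

est = mlmc_mean(x0=100.0, c=1.0, T=1.0, L=3, N=2000)  # E[X(1)] is approx. 100*exp(-1)
```

Because the coupled differences have small variance, summing the coarse mean and the corrections reproduces the finest-level mean at a fraction of its stand-alone cost.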
    • Multilevel Monte Carlo Acceleration of Seismic Wave Propagation under Uncertainty

      Ballesio, Marco; Beck, Joakim; Pandey, Anamika; Parisi, Laura; von Schwerin, Erik; Tempone, Raul (2018-09-06)
      We consider forward seismic wave propagation in an inhomogeneous linear viscoelastic medium with random wave speeds and densities, subject to deterministic boundary and initial conditions. We study this forward problem as a first step towards the treatment of inverse problems, where the goal is to determine, for example, earthquake source locations from seismograms recorded by a small number of seismic sensors at the Earth’s surface. Existing results on earthquake source inversion for a given event show a large variability, which indicates that the inherent uncertainty of the Earth parameters should be taken into account. Here this uncertainty is modeled through random parameters. We propose multilevel Monte Carlo simulations for computing statistics of quantities of interest that are motivated by the choice of loss function for the corresponding inverse problem, presenting a case study based on experimental seismic data from a passive experiment in Tanzania. This work provides a benchmark for the implementation of multilevel algorithms to accelerate seismic inversion, addressing earthquake source estimation as well as inferring Earth structure.
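The cost savings of multilevel Monte Carlo come from spending most samples on cheap, coarse solves. A minimal sketch of the standard sample-allocation rule (N_l proportional to sqrt(V_l / C_l), from Giles' MLMC analysis) follows; the per-level variances and costs are made-up numbers, whereas in practice they would be estimated from pilot runs of the wave solver.

```python
import numpy as np

def mlmc_allocation(var, cost, eps):
    """Samples per level minimizing total cost subject to a statistical
    error budget eps^2: N_l = ceil(mu * sqrt(V_l / C_l)) with
    mu = sum_l sqrt(V_l * C_l) / eps^2 (Giles' optimal allocation)."""
    var, cost = np.asarray(var, float), np.asarray(cost, float)
    mu = np.sum(np.sqrt(var * cost)) / eps**2
    return np.ceil(mu * np.sqrt(var / cost)).astype(int)

# illustrative numbers: the variance of level differences shrinks as the
# mesh is refined while the cost per sample grows
N = mlmc_allocation(var=[1.0, 0.25, 0.0625], cost=[1.0, 4.0, 16.0], eps=0.01)
```

Most of the budget lands on the coarsest level, which is what makes the multilevel estimator cheaper than single-level Monte Carlo at the finest resolution.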
    • Multilevel ensemble Kalman filtering for spatio-temporal processes

      Hoel, Hakon; Chernov, Alexey; Law, Kody; Nobile, Fabio; Tempone, Raul (2018-07-04)
      The ensemble Kalman filter (EnKF) is a sequential filtering method that uses an ensemble of particle paths to estimate the means and covariances required by the Kalman filter by the use of sample moments, i.e., the Monte Carlo method. EnKF is often both robust and efficient, but its performance may suffer in settings where the computational cost of accurate simulations of particles is high. The multilevel Monte Carlo method (MLMC) is an extension of the classical Monte Carlo method, which by sampling stochastic realizations on a hierarchy of resolutions may reduce the computational cost of moment approximations by orders of magnitude. In this talk I will present ideas on combining MLMC and EnKF to construct the multilevel ensemble Kalman filter (MLEnKF) for the setting of finite and infinite dimensional state spaces. Theoretical results and numerical studies of the performance gain of MLEnKF over EnKF will also be presented. (Joint work with Alexey Chernov, Kody J. H. Law, Fabio Nobile, and Raul Tempone.) References: [1] H. Hoel, K. Law, and R. Tempone (2016). Multilevel ensemble Kalman filtering. SIAM J. Numer. Anal. 54(3), 1813–1839. [2] A. Chernov, H. Hoel, K. Law, F. Nobile, and R. Tempone (2016). Multilevel ensemble Kalman filtering for spatially extended models. ArXiv e-prints. arXiv: 1608.08558 [math.NA].
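The sample-moment estimates at the heart of EnKF can be illustrated with a minimal perturbed-observation analysis step. This is a generic textbook sketch, not the MLEnKF of the talk; the dimensions and the scalar toy example at the end are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def enkf_analysis(Xf, y, H, R):
    """Perturbed-observation EnKF analysis step.
    Xf: forecast ensemble, shape (d, N); y: observation, shape (p,);
    H: observation operator, shape (p, d); R: observation covariance (p, p)."""
    d, N = Xf.shape
    xm = Xf.mean(axis=1, keepdims=True)
    A = Xf - xm                                   # ensemble anomalies
    C = A @ A.T / (N - 1)                         # sample covariance (Monte Carlo moments)
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)  # Kalman gain from sample moments
    eps = rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return Xf + K @ (y[:, None] + eps - H @ Xf)   # analysis ensemble

# scalar toy: prior N(0, 1), observe y = 2 with unit observation noise
N = 5000
Xf = rng.standard_normal((1, N))
Xa = enkf_analysis(Xf, np.array([2.0]), np.eye(1), np.eye(1))
```

For this scalar toy the analysis ensemble mean moves to about 1 and the variance to about 0.5, matching the exact Kalman posterior for a prior N(0, 1) and unit-noise observation of 2.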
    • Multilevel ensemble Kalman filtering for spatially extended models

      Hoel, Hakon; Chernov, Alexey; Law, Kody JH; Nobile, Fabio; Tempone, Raul (2018-01-10)
      The ensemble Kalman filter (EnKF) is a sequential filtering method that uses an ensemble of particle paths to estimate the means and covariances required by the Kalman filter by the use of sample moments, i.e., the Monte Carlo method. EnKF is often both robust and efficient, but its performance may suffer in settings where the computational cost of accurate simulations of particles is high. The multilevel Monte Carlo method (MLMC) is an extension of the classical Monte Carlo method, which by sampling stochastic realizations on a hierarchy of resolutions may reduce the computational cost of moment approximations by orders of magnitude. In this talk I will present ideas on combining MLMC and EnKF to construct the multilevel ensemble Kalman filter (MLEnKF) for the setting of finite and infinite dimensional state spaces. Theoretical results and numerical studies of the performance gain of MLEnKF over EnKF will also be presented. (Joint work with Alexey Chernov, Kody J. H. Law, Fabio Nobile, and Raul Tempone.)
    • Study of Regional Volcanic Impact on the Middle East and North Africa using high-resolution global and regional models

      Osipov, Sergey; Dogar, Muhammad Mubashar; Stenchikov, Georgiy L. (2016-04)
      High-latitude winter warming after strong equatorial volcanic eruptions, caused by circulation changes associated with the anomalously positive phase of the Arctic Oscillation, has been a subject of active research during the recent decade. However, the severe winter cooling in the Middle East observed after the Mt. Pinatubo eruption of 1991, although recognized, was not thoroughly investigated. These severe regional climate perturbations in the Middle East cannot be explained solely by radiative volcanic cooling, which suggests that a contribution of forced circulation changes could be important and significant. To better understand the mechanisms of the Middle East climate response and evaluate the contributions of dynamic and radiative effects, we conducted a comparative study using the Geophysical Fluid Dynamics Laboratory global High Resolution Atmospheric Model (HiRAM), with an effectively regional-model resolution of 25 km, and the regional Weather Research and Forecasting (WRF) model, focusing on the eruption of Mount Pinatubo on June 15, 1991, which was followed by a pronounced positive phase of the Arctic Oscillation. The WRF model has been configured over the Middle East and North Africa (MENA) region. The WRF code has been modified to interactively account for the radiative effect of volcanic aerosols. Both HiRAM and WRF capture the main features of the MENA climate response and show that in winter the dynamic effects in the Middle East prevail over the direct radiative cooling from volcanic aerosols.
    • Regional Climate Response to Volcanic Radiative Forcing in Middle East and North Africa

      Stenchikov, Georgiy L.; Dogar, Muhammad Mubashar (2012-04)
      We have tested the regional climate sensitivity in the Middle East and North Africa (MENA) to radiation perturbations caused by the large explosive equatorial volcanic eruptions of the second half of the 20th century, El Chichon and Pinatubo, which occurred in 1982 and 1991, respectively. The observations and reanalysis data show that the surface volcanic cooling in the MENA region is two to three times larger than the global mean response to volcanic forcing. The Red Sea surface temperature also appears to be very sensitive to the external radiative impact. For example, the sea surface cooling associated with the 1991 Pinatubo eruption caused deep-water mixing and coral bleaching for a few years. To better quantify these effects we use the Geophysical Fluid Dynamics Laboratory global High Resolution Atmospheric Model (HiRAM) to conduct simulations of both the El Chichon and Pinatubo impacts with an effective 25-km grid spacing. We find that the circulation changes associated with the positive phase of the Arctic Oscillation amplified the winter temperature anomalies in 1982-1984 and 1991-1993. The dynamic response to volcanic cooling is also characterized by a southward shift of the Intertropical Convergence Zone in summer and an associated impact on precipitation patterns. These results suggest that the climate regime in the MENA region is highly sensitive to external forcing, which is important for better understanding climate variability and change in this region.
    • Study of Ocean Response to Periodic and Constant Volcanic Radiative Forcing

      Dogar, Muhammad Mubashar; Stenchikov, Georgiy L. (2013-12)
      It is known that volcanic radiative impacts can produce long-term perturbations of the ocean heat content. In this study we systematically compare the effect of periodic volcanic forcing with that of an equivalent time-averaged radiative cooling. One could expect that sporadic strong cooling should initiate more vigorous vertical mixing of the upper ocean layer and therefore cool the ocean more effectively than a uniform radiative forcing. However, long-term simulations show that on average the ocean heat content responses to periodic and constant forcings are almost identical. To better understand this apparent contradiction we conducted two sets of parallel simulations using the Geophysical Fluid Dynamics Laboratory Coupled Model CM2.1: the first with uniform volcanic forcing and the second with periodic volcanic forcing with 10- and 50-year repeat cycles. We found that the average perturbations of surface temperature, precipitation, ocean heat content, and sea level rise in both sets of simulations are similar, but the responses of the Atlantic Meridional Overturning Circulation are significantly different, which explains the differences in the relaxation processes. These findings could be important for ocean initialization in long-term climate studies and for geoengineering applications.
    • Uncertainty quantification of groundwater contamination

      Litvinenko, Alexander; Logashenko, Dmitry (2018-10-08)
      In many countries, groundwater is a strategic reserve, used as drinking water and as an irrigation resource. Therefore, accurate modeling of the pollution of soil and groundwater aquifers is highly important. As a model, we consider a density-driven groundwater flow problem with uncertain porosity and permeability. This problem may arise in geothermal reservoir simulation, natural saline-disposal basins, modeling of contaminant plumes and subsurface flow. This strongly non-linear problem describes how salt or polluted water streams down, building "fingers". The solution process requires a very fine unstructured mesh and, therefore, high computational resources. Consequently, we run the parallel multigrid solver UG4 (https://github.com/UG4/ughub.wiki.git) on the Shaheen II supercomputer. The parallelization is done in both the physical space and the stochastic space. The novelty of this work is the estimation of the risk that the pollution will reach a specific critical concentration. Additionally, we demonstrate how the multigrid UG4 solver can be run in a black-box fashion for testing different scenarios in the density-driven flow. We solve Elder's problem in 2D and 3D domains, where the unknown porosity and permeability are modeled by random fields. For approximations in the stochastic space, we use the generalized polynomial chaos expansion. We compute different quantities of interest, such as the mean, variance and exceedance probabilities of the concentration. As a reference, we use the solution obtained with the quasi-Monte Carlo method.
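The stochastic-space machinery, generalized polynomial chaos and exceedance probabilities, can be sketched on a one-parameter toy. The `surrogate_concentration` function below is a hypothetical stand-in for the expensive UG4 density-driven flow solve; the Hermite degree, sample sizes and critical concentration are illustrative assumptions.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

def surrogate_concentration(xi):
    # hypothetical stand-in for the expensive flow solve: concentration at
    # an observation point as a function of one standard-normal parameter
    # xi driving the uncertain log-permeability
    return np.exp(0.1 * xi)

# sample the random parameter and evaluate the "solver"
xi = rng.standard_normal(2000)
c = surrogate_concentration(xi)

# generalized polynomial chaos: regress onto probabilists' Hermite
# polynomials He_k, which are orthogonal under the standard normal weight
deg = 4
V = hermevander(xi, deg)                      # design matrix [He_0 ... He_4]
coef, *_ = np.linalg.lstsq(V, c, rcond=None)

# mean and variance follow from orthogonality: E[He_j He_k] = k! delta_jk
facts = np.array([factorial(k) for k in range(deg + 1)], dtype=float)
pc_mean = coef[0]
pc_var = np.sum(coef[1:] ** 2 * facts[1:])

# probability that the concentration exceeds a critical level (plain MC)
c_crit = 1.1
p_exceed = np.mean(surrogate_concentration(rng.standard_normal(20000)) > c_crit)
```

Once the chaos coefficients are in hand, the mean and variance come for free from orthogonality, with no further solver calls.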
    • Ultraviolet FSO to laser-based VLC – the role of group-III-nitride devices

      Ooi, Boon S.; Sun, Xiaobin; Shen, Chao; Guo, Yujian; Liu, Guangyu; Ng, Tien Khee (2018-10-04)
    • Overview of Low-rank and Sparse Techniques in Spatial Statistics and Parameter Identification

      Litvinenko, Alexander (2018-10-03)
      Motivation: improve statistical models by implementing more efficient numerical tools. Major goal: develop new statistical tools to address new problems. Overview: low-rank matrices, sparse matrices and hierarchical matrices; approximation of Matern covariance functions and the joint Gaussian likelihood; identification of unknown parameters via maximizing the Gaussian log-likelihood; low-rank tensor methods.
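As a concrete instance of the low-rank theme, the following sketch compresses a Matern (nu = 1/2, i.e. exponential) covariance matrix on a 1-D grid with a truncated SVD; the grid size and correlation length are arbitrary illustrative choices. Hierarchical matrices refine this idea by applying low-rank compression only to off-diagonal blocks.

```python
import numpy as np

# Matern covariance with nu = 1/2 (the exponential kernel) on a 1-D grid
n, ell = 200, 0.5
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# low-rank approximation by truncated SVD: the best rank-k approximation
# in the Frobenius norm, by the Eckart-Young theorem
U, s, Vt = np.linalg.svd(C)

def rel_error(k):
    """Relative Frobenius error of the rank-k truncation."""
    Ck = (U[:, :k] * s[:k]) @ Vt[:k, :]
    return np.linalg.norm(C - Ck) / np.linalg.norm(C)
```

A small rank already captures the 200 x 200 matrix to a few percent, which is the observation that low-rank and hierarchical formats exploit at much larger scale.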
    • Exploring off-set pricing models and article deposit terms at King Abdullah University of Science & Technology (KAUST)

      Buck, Stephen; Vijayakumar, J.K. (2018-04-09)
      In the ‘normal’ world of retail and commerce you pay for an item and receive the item. In the world of academic journals you prepay for the item, you might receive the item, and you might get some money back, depending on what journals you did or didn’t receive. In the world of offset pricing you prepay, then you pay again; you sometimes use vouchers; you might get a discount (the following year); then you might get money back, or you might not. Are publishers knowingly placing barriers to offset models, and not transparently offsetting the APCs against the subscription cost, in order to raise more income? Whether by design or accident, it is a complex world that needs a time commitment, which not all librarians can give, to understand fully. The new model of scholarly communication, which leading universities (including KAUST) want to introduce, is based on shifting subscription costs to publishing costs, not on doubling the payment channels to the publishers. Can we get to a mutually beneficial position where the author can deposit the accepted version of the article into the institutional repository without any embargo period, given that the institute agrees to pay the subscription fee on an ongoing basis? The required model does not adversely affect the vendors’ revenue. This presentation, based on KAUST’s experience to date, will attempt to explain the different models of offset pricing while outlining KAUST’s dual approach to the problem: redirecting subscription money to publishing money and embedding open access terms in understandable language in our license agreements. Why have we accepted IOP’s offset offer and not Springer’s, even though we were considered among the early adopters and an important institution? Why is this important? Resolving the inherent complexities in offsetting models will save libraries money and also time wasted on tedious and unnecessary administrative work.
Researchers do not want to know about offsetting agreements, nor should they need to. It is difficult enough to do and write up valuable research without having to do further research on offset pricing models. The authors of the articles, without whom, as academic librarians or publishers, we would be redundant, are often the neglected link in the chain. Finally, the institutional repository needs to know what we are up to. The current answer to many queries, “it depends on the publisher,” isn’t good enough. There has to be a standard model. What is needed overall is clarity and transparency. This will enable trust and, where mistakes are made, and there inevitably will be with untried models, we can learn from these mistakes and build better, more robust services, with automatic deposit of our articles into our repository fed by the publishers themselves. If libraries can organize as groups at regional or (with more difficulty) international level, more favorable licensing agreements, including standardized offset pricing model language, can be negotiated, which will be advantageous to all parties: publishers, libraries and, most importantly, authors. It is incumbent on us to familiarize ourselves with the pricing models, in all their complexity, and to strive through collective organization to have these models simplified and standardized. Let’s turn that subscription money into publishing money.
    • Role of library's subscription licenses in promoting open access to scientific research

      Buck, Stephen (2018-04-30)
      This presentation, based on KAUST’s experience to date, will attempt to explain the different ways of bringing open access models into scientific publishers’ licenses. Our dual approach with offset pricing is to redirect subscription money to publishing money and to embed green open access deposit terms in understandable language in our license agreements. Resolving the inherent complexities in open access publishing, repository deposits and offsetting models will save libraries money and also time wasted on tedious and unnecessary administrative work. Researchers will also save time thanks to overall clarity and transparency. This will enable trust and, where mistakes are made, and there inevitably will be with untried models, we can learn from these mistakes and build better, more robust services, with automatic deposit of our articles into our repository fed by the publishers themselves. The plan is to cover all publishers with OA license terms securing KAUST authors’ rights while continuing our subscriptions to them. Marketing campaigns and awareness sessions are planned, in addition to establishing LibGuides to help researchers and to manage offset pricing models.
    • Application of Parallel Hierarchical Matrices in Spatial Statistics and Parameter Identification

      Litvinenko, Alexander (2018-04-20)
      Parallel H-matrices in spatial statistics 1. Motivation: improve statistical model 2. Tools: Hierarchical matrices [Hackbusch 1999] 3. Matern covariance function and joint Gaussian likelihood 4. Identification of unknown parameters via maximizing Gaussian log-likelihood 5. Implementation with HLIBPro
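Item 4 above, identification via maximizing the Gaussian log-likelihood, can be sketched at small scale with a dense Cholesky factorization standing in for the H-matrix machinery. The grid size, true length scale and replicate count below are illustrative assumptions, not values from the talk.

```python
import numpy as np
from math import log, pi

rng = np.random.default_rng(2)

n = 100
x = np.linspace(0.0, 1.0, n)
dist = np.abs(x[:, None] - x[None, :])

def exp_cov(ell):
    """Matern nu = 1/2 (exponential) covariance on the 1-D grid."""
    return np.exp(-dist / ell)

def log_lik(ell, samples):
    """Average Gaussian log-likelihood of the samples for length scale ell,
    via a dense Cholesky (the role played by H-matrices at large scale)."""
    C = exp_cov(ell) + 1e-8 * np.eye(n)    # jitter for numerical stability
    L = np.linalg.cholesky(C)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    z = np.linalg.solve(L, samples.T)      # whitened samples, shape (n, m)
    quad = np.mean(np.sum(z ** 2, axis=0))
    return -0.5 * (n * log(2 * pi) + logdet + quad)

# synthetic data drawn with a known "true" length scale
ell_true = 0.3
L_true = np.linalg.cholesky(exp_cov(ell_true) + 1e-8 * np.eye(n))
samples = (L_true @ rng.standard_normal((n, 50))).T   # 50 replicates

# maximize the log-likelihood over a grid of candidate length scales
grid = np.linspace(0.05, 1.0, 20)
ell_hat = grid[np.argmax([log_lik(e, samples) for e in grid])]
```

A grid search keeps the sketch simple; in practice a gradient-based optimizer would be used, with the Cholesky replaced by an H-matrix factorization to reach large n.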
    • Tucker tensor analysis of Matern functions in spatial statistics

      Litvinenko, Alexander (2018-04-20)
      Low-rank Tucker tensor methods in spatial statistics 1. Motivation: improve statistical models 2. Motivation: disadvantages of matrices 3. Tools: Tucker tensor format 4. Tensor approximation of Matern covariance function via FFT 5. Typical statistical operations in Tucker tensor format 6. Numerical experiments
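The claim behind item 4, that the Matern kernel admits an accurate low-rank tensor representation, can be checked numerically with a plain higher-order SVD in place of the FFT-based construction from the talk; the grid size and ranks below are illustrative choices.

```python
import numpy as np

# 3-D Matern (nu = 1/2) kernel evaluated on a tensor grid of displacements
m = 40
h = np.linspace(0.0, 1.0, m)
H1, H2, H3 = np.meshgrid(h, h, h, indexing="ij")
K = np.exp(-np.sqrt(H1**2 + H2**2 + H3**2) / 0.5)

def hosvd_approx(T, r):
    """Rank-(r, r, r) Tucker approximation via the higher-order SVD:
    one ordinary SVD per mode unfolding, then projection onto the factors."""
    U = []
    for mode in range(3):
        unf = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U.append(np.linalg.svd(unf, full_matrices=False)[0][:, :r])
    core = np.einsum("ijk,ia,jb,kc->abc", T, U[0], U[1], U[2], optimize=True)
    return np.einsum("abc,ia,jb,kc->ijk", core, U[0], U[1], U[2], optimize=True)

def tucker_error(r):
    """Relative error of the rank-(r, r, r) Tucker reconstruction."""
    return np.linalg.norm(K - hosvd_approx(K, r)) / np.linalg.norm(K)
```

Storage drops from m^3 entries to r^3 + 3*m*r, which is the advantage over matrices that motivates the Tucker format for statistical operations on large grids.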