We present uncertainty estimates for the response of a multiscale in-stent restenosis model, obtained by both non-intrusive and semi-intrusive uncertainty quantification. The in-stent restenosis model is a fully coupled multiscale simulation of post-stenting tissue growth, in which the most costly submodel is the blood flow simulation. Surrogate modeling for non-intrusive uncertainty quantification treats the whole model as a black box and maps directly from the three uncertain inputs to the quantity of interest, the neointimal area. The corresponding uncertainty estimates matched the results of quasi-Monte Carlo simulations well. In the semi-intrusive uncertainty quantification, the most expensive submodel is replaced with a surrogate model. We developed a surrogate model for the blood flow simulation using a convolutional neural network. The semi-intrusive method with the new surrogate model provided efficient estimates of uncertainty and sensitivity while retaining relatively high accuracy; it outperformed results obtained with earlier surrogate models and achieved estimates comparable to those of the non-intrusive method at similar efficiency. The presented results on uncertainty propagation with non-intrusive and semi-intrusive metamodeling methods allow us to draw some conclusions on the advantages and limitations of these methods.
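The quasi-Monte Carlo baseline mentioned above can be sketched in a few lines. The sketch below is illustrative only: the real in-stent restenosis model and its three uncertain inputs are replaced by a hypothetical stand-in function, and a Halton low-discrepancy sequence stands in for whatever quasi-random generator the study actually used.

```python
# Halton low-discrepancy sequence: coordinate d of sample i is the radical
# inverse of i in the d-th prime base; a standard quasi-Monte Carlo point set.
def halton(i, base):
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

# Hypothetical stand-in for the black-box multiscale model: maps three
# uncertain inputs (scaled to [0, 1]) to a scalar quantity of interest.
def model(x1, x2, x3):
    return x1 + x2 * x3

def quasi_mc_uq(n=4096):
    """Estimate mean and variance of the quantity of interest with quasi-MC."""
    samples = [model(halton(i, 2), halton(i, 3), halton(i, 5))
               for i in range(1, n + 1)]
    mean = sum(samples) / n
    var = sum((y - mean) ** 2 for y in samples) / (n - 1)
    return mean, var

mean, var = quasi_mc_uq()
```

For this toy model the analytic mean is 0.75, which the quasi-MC estimate recovers closely; a surrogate-based non-intrusive method aims to reproduce such estimates with far fewer expensive model evaluations.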
Shear thickening of particle suspensions is characterized by a transition between lubricated and frictional contacts between the particles. Using 3D numerical simulations, we study how the interparticle friction coefficient influences the effective macroscopic friction coefficient and hence the microstructure and rheology of dense shear thickening suspensions. We propose expressions for the effective friction coefficient in terms of the distance to jamming for varying shear stresses and particle friction coefficient values. We find the effective friction coefficient to be rather insensitive to the interparticle friction, which is perhaps surprising but agrees with recent theory and experiments.
In this paper, the sensitivity analysis of a single scale model is employed in order to reduce the input dimensionality of the related multiscale model, thereby improving the efficiency of its uncertainty estimation. The approach is illustrated with two examples: a reaction model and the standard Ornstein-Uhlenbeck process. Additionally, a counterexample shows that an uncertain input should not be excluded from uncertainty quantification without estimating the response sensitivity to this parameter. In particular, an analysis of the function defining the relation between single scale components is required to understand whether single scale sensitivity analysis can be used to reduce the dimensionality of the overall multiscale model input space.
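The kind of variance-based sensitivity analysis used for such input-space reduction can be sketched with a pick-freeze (Saltelli-style) estimator of first-order Sobol indices. The toy response below is a hypothetical stand-in, not the reaction or Ornstein-Uhlenbeck models from the text; its third input is deliberately near-inert, the situation in which one might consider dropping an input from the multiscale UQ.

```python
import random

# Hypothetical single-scale response: one dominant input, two weak ones.
def response(x):
    return x[0] + 0.1 * x[1] + 0.01 * x[2] ** 2

def first_order_sobol(f, dim, n=20000, seed=1):
    """Pick-freeze Monte Carlo estimator of first-order Sobol indices.

    S_i ~ mean(f(B) * (f(A_B^i) - f(A))) / Var(f), where A_B^i is the
    sample matrix A with column i replaced by the corresponding column of B.
    """
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(a) for a in A]
    fB = [f(b) for b in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(dim):
        fABi = (f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B))
        cov = sum(yb * (yab - ya)
                  for yb, yab, ya in zip(fB, fABi, fA)) / n
        indices.append(cov / var)
    return indices

S = first_order_sobol(response, 3)
```

Here the first index is close to one and the other two are close to zero, suggesting the weak inputs could be fixed; the counterexample in the text warns that this conclusion is only safe after checking how the single scale components are coupled.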
A family of semi-intrusive uncertainty propagation (UP) methods for multiscale models is introduced. The methods are semi-intrusive in the sense that inspection of the model is limited to the level of the single scale systems, which are themselves treated as black boxes. The goal is to estimate the uncertainty in the results of multiscale models in a reduced amount of time compared with black-box Monte Carlo (MC). In the resulting semi-intrusive MC method, the required number of samples of an expensive single scale model is minimized in order to reduce the execution time of the overall UP. In the metamodeling approach, the expensive model component is replaced completely by a computationally much cheaper surrogate model. These semi-intrusive algorithms have been tested on two case studies based on reaction-diffusion dynamics. The results demonstrate that the proposed semi-intrusive methods can yield a significant reduction of the computational time for multiscale UP, while still computing accurate estimates of the uncertainties. The semi-intrusive methods can therefore be a valid alternative when the uncertainties of a multiscale model cannot be estimated by black-box MC methods in a feasible amount of time.
In dynamical systems, local interactions between dynamical units generate correlations which are stored and transmitted throughout the system, generating the macroscopic behavior. However, a framework to quantify and study this at the microscopic scale is missing. Here we propose an 'information processing' framework based on Shannon mutual information quantities between the initial and future states. We apply it to the 256 elementary cellular automata (ECA), which are the simplest possible dynamical systems exhibiting behaviors ranging from simple to complex. Our main finding for ECA is that only a few features are needed for full predictability and that the 'information integration' (synergy) feature is always the most predictive. Finally, we apply the formalism to foreign exchange (FX) and interest-rate swap (IRS) time series data and find that the 2008 financial crisis marks a sudden and sustained regime shift (FX and EUR IRS) resembling tipping point behavior. The USD IRS market instead exhibits a slow and steady progression, which appears consistent with the hypothesis that this market is (part of) the driving force behind the crisis. Our work suggests that the proposed framework is a promising way of predicting emergent complex systemic behaviors in terms of the local information processing of units.
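The core quantity, mutual information between initial and future states of an elementary cellular automaton, can be computed exactly for small rings by enumerating all initial configurations. The sketch below illustrates only this single ingredient under assumed settings (ring of 8 cells, 3 time steps, one cell's state), not the authors' full feature set or synergy measure.

```python
from itertools import product
from math import log2

def step(state, rule):
    """One synchronous update of an elementary CA (Wolfram rule) on a ring."""
    n = len(state)
    return tuple(
        (rule >> ((state[(i - 1) % n] << 2)
                  | (state[i] << 1)
                  | state[(i + 1) % n])) & 1
        for i in range(n)
    )

def mutual_information(rule, n=8, t=3, cell=0):
    """I(s_cell(0); s_cell(t)) in bits, exact over all 2**n initial states."""
    joint = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
    for init in product((0, 1), repeat=n):
        s = init
        for _ in range(t):
            s = step(s, rule)
        joint[(init[cell], s[cell])] += 1
    total = float(2 ** n)
    p = {k: v / total for k, v in joint.items()}
    px = {a: p[(a, 0)] + p[(a, 1)] for a in (0, 1)}
    py = {b: p[(0, b)] + p[(1, b)] for b in (0, 1)}
    return sum(pv * log2(pv / (px[a] * py[b]))
               for (a, b), pv in p.items() if pv > 0)
```

Two sanity checks: rule 204 (the identity map) preserves the full bit, giving exactly 1 bit of mutual information, while rule 0 erases everything, giving exactly 0 bits.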
We expect that multiscale simulations will be one of the main high performance computing workloads in the exascale era. We propose multiscale computing patterns as a generic vehicle to realise load balanced, fault tolerant and energy aware high performance multiscale computing. Multiscale computing patterns should lead to a separation of concerns, whereby application developers can compose multiscale models and execute multiscale simulations, while pattern software realises optimized, fault tolerant and energy aware multiscale computing. We introduce three multiscale computing patterns, present an example of the extreme scaling pattern, and discuss our vision of how this may shape multiscale computing in the exascale era.
The Fisher-Rao metric from Information Geometry is related to phase transition phenomena in classical statistical mechanics. Several studies propose to extend the use of Information Geometry to study more general phase transitions in complex systems. However, it is unclear whether the Fisher-Rao metric does indeed detect these more general transitions, especially in the absence of a statistical model. In this paper we study the transitions between patterns in the Gray-Scott reaction-diffusion model using Fisher information. We describe the system by a probability density function that represents the size distribution of blobs in the patterns and compute its Fisher information with respect to changing the two rate parameters of the underlying model. We estimate the distribution non-parametrically so that we do not assume any statistical model. The resulting Fisher map can be interpreted as a phase-map of the different patterns. Lines with high Fisher information can be considered as boundaries between regions of parameter space where patterns with similar characteristics appear. These lines of high Fisher information can be interpreted as phase transitions between complex patterns.
The Fisher information matrix is a widely used measure for applications ranging from statistical inference, information geometry, and experiment design to the study of criticality in biological systems. Yet there is no commonly accepted non-parametric algorithm to estimate it from real data. In this rapid communication we show how to accurately estimate the Fisher information in a non-parametric way. We also develop a numerical procedure to minimize the errors by choosing the interval of the finite difference scheme necessary to compute the derivatives in the definition of the Fisher information. Our method uses the recently published "Density Estimation using Field Theory" algorithm to compute the probability density functions for continuous densities. We use the Fisher information of the normal distribution to validate our method, and as an example we compute the temperature component of the Fisher information matrix in the two-dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature.
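The finite-difference construction at the heart of this estimate can be illustrated with the same normal-distribution validation case mentioned above. The sketch below is a simplified stand-in: the density is the known analytic Gaussian rather than one estimated non-parametrically from data, and the grid, bounds, and difference step are assumed values, not the optimized interval the paper derives.

```python
import math

def normal_pdf(x, mu, sigma=1.0):
    return (math.exp(-0.5 * ((x - mu) / sigma) ** 2)
            / (sigma * math.sqrt(2.0 * math.pi)))

def fisher_information(mu=0.0, h=1e-3, lo=-10.0, hi=10.0, steps=20000):
    """I(mu) = integral of (d_mu p(x; mu))**2 / p(x; mu) dx.

    The parameter derivative is taken by a central finite difference of
    step h; the integral uses the midpoint rule on a uniform grid.
    """
    dx = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        x = lo + (k + 0.5) * dx
        p = normal_pdf(x, mu)
        if p > 1e-300:  # skip far tails to avoid division underflow
            dp = (normal_pdf(x, mu + h) - normal_pdf(x, mu - h)) / (2.0 * h)
            total += dp * dp / p * dx
    return total
```

For the normal distribution the location component of the Fisher information is 1/sigma**2, so with sigma = 1 the numerical estimate should return approximately 1, which is exactly the validation check the abstract describes.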
We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple-to-use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and compare its performance to MUSCLE 1, file copy, MPI, MPWide, and GridFTP. The local throughput of MPI is about two times higher, so very tightly coupled code should use MPI as a single submodel of MUSCLE 2; the distributed performance of GridFTP is lower, especially for small messages. We test the performance of a canal system model with MUSCLE 2, where it introduces an overhead as small as 5% compared to MPI.
Multiscale simulations are essential in the biomedical domain to accurately model human physiology. We present a modular approach for designing, constructing and executing multiscale simulations on a wide range of resources, from desktops to petascale supercomputers, including combinations of these. Our work features two multiscale applications, in-stent restenosis and cerebrovascular bloodflow, which combine multiple existing single-scale applications to create a multiscale simulation. These applications can be efficiently coupled, deployed and executed on computers up to the largest (peta) scale, incurring a coupling overhead of 1 to 10% of the total execution time.