A. Narayan, J. Jakeman, T. Zhou.
A Christoffel function weighted least squares algorithm for collocation approximations, In Mathematics of Computation, Vol. 86, No. 306, pp. 1913--1947. 2017.
ISSN: 0025-5718, 1088-6842
We propose, theoretically investigate, and numerically validate an algorithm for the Monte Carlo solution of least-squares polynomial approximation problems in a collocation framework. Our method is motivated by generalized Polynomial Chaos approximation in uncertainty quantification, where a polynomial approximation is formed from a combination of orthogonal polynomials. A standard Monte Carlo approach would draw samples according to the density of orthogonality. Our proposed algorithm instead samples with respect to the equilibrium measure of the parametric domain and subsequently solves a weighted least-squares problem, with weights given by evaluations of the Christoffel function. We present theoretical analysis to motivate the algorithm, and numerical results showing that our method is superior to standard Monte Carlo methods in many situations of interest.
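The sampling-and-weighting recipe above can be sketched in a few lines. The following is a minimal illustration (not the authors' implementation), assuming a Legendre basis on [-1, 1], where the equilibrium measure is the arcsine (Chebyshev) density; the target function, degree, and sample count are arbitrary choices for demonstration:

```python
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(0)

def f(x):                       # toy target function (an assumption)
    return np.exp(x) * np.sin(3 * x)

N = 20                          # number of basis polynomials
M = 200                         # number of Monte Carlo samples

# Sample from the equilibrium measure of [-1, 1] (the arcsine/Chebyshev
# density) rather than from the uniform density of orthogonality.
x = np.cos(np.pi * rng.uniform(size=M))

# Vandermonde-like matrix of Legendre polynomials, orthonormal with
# respect to the uniform probability measure on [-1, 1]: sqrt(2k+1) P_k(x).
V = L.legvander(x, N - 1) * np.sqrt(2 * np.arange(N) + 1)

# Christoffel-function weights: w(x) = N / sum_k p_k(x)^2.
w = N / np.sum(V**2, axis=1)

# Weighted least squares: scale each row (and datum) by sqrt(w).
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(V * sw[:, None], f(x) * sw, rcond=None)

# Evaluate the polynomial approximation on a dense test grid.
xt = np.linspace(-1, 1, 1000)
Vt = L.legvander(xt, N - 1) * np.sqrt(2 * np.arange(N) + 1)
err = np.max(np.abs(Vt @ coef - f(xt)))
print(f"max error: {err:.2e}")
```

Replacing the arcsine samples with uniform ones and dropping the weights recovers the standard Monte Carlo approach the paper compares against.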
Parallel code portability in the petascale era requires modifying existing codes to support new architectures with large core counts and SIMD vector units. OpenMP is a well established and increasingly supported vehicle for portable parallelization. As architectures mature and compiler OpenMP implementations evolve, best practices for code modernization change as well. In this paper, we examine the impact of newer OpenMP features (in particular OMP SIMD) on the Intel Xeon Phi Knights Landing (KNL) architecture, applied in optimizing loops in the single moment 6-class microphysics module (WSM6) in the US Navy's NEPTUNE code. We find that with functioning OMP SIMD constructs, low thread invocation overhead on KNL and reduced penalty for unaligned access compared to previous architectures, one can leverage OpenMP 4 to achieve reasonable scalability with relatively minor reorganization of a production physics code.
T.A.J. Ouermi, A. Knoll, R.M. Kirby, M. Berzins. Optimization Strategies for WRF Single-Moment 6-Class Microphysics Scheme (WSM6) on Intel Microarchitectures, In Proceedings of the Fifth International Symposium on Computing and Networking (CANDAR '17), Awarded Best Paper, IEEE, 2017.
Optimizations in the petascale era require modifications of existing codes to take advantage of new architectures with large core counts and SIMD vector units. This paper examines high-level and low-level optimization strategies for numerical weather prediction (NWP) codes. These strategies employ thread-local structures of arrays (SOA) and OpenMP directives such as OMP SIMD. These optimization approaches are applied to the Weather Research and Forecasting single-moment 6-class microphysics scheme (WSM6) in the US Navy NEPTUNE system. The results of this study indicate that the high-level approach with SOA and low-level OMP SIMD improves thread and vector parallelism by increasing data and temporal locality. The modified version of WSM6 runs 70x faster than the original serial code. This is about 23.3x faster than the performance achieved by Ouermi et al., and 14.9x faster than the performance achieved by Michalakes et al.
Revisiting Abnormalities in Brain Network Architecture Underlying Autism Using Topology-Inspired Statistical Inference, In Connectomics in NeuroImaging, Springer International Publishing, pp. 98--107. 2017.
A large body of evidence relates autism with abnormal structural and functional brain connectivity. Structural covariance MRI (scMRI) is a technique that maps brain regions with covarying gray matter density across subjects. It provides a way to probe the anatomical structures underlying intrinsic connectivity networks (ICNs) through the analysis of the gray matter signal covariance. In this paper, we apply topological data analysis in conjunction with scMRI to explore network-specific differences in the gray matter structure in subjects with autism versus age-, gender- and IQ-matched controls. Specifically, we investigate topological differences in gray matter structures captured by structural covariance networks (SCNs) derived from three ICNs strongly implicated in autism, namely, the salience network (SN), the default mode network (DMN) and the executive control network (ECN). By combining topological data analysis with statistical inference, our results provide evidence of statistically significant network-specific structural abnormalities in autism, from SCNs derived from SN and ECN. These differences in brain architecture are consistent with direct structural analysis using scMRI (Zielinski et al. 2012).
Large-scale parallel applications with complex global data dependencies beyond those of reductions pose significant scalability challenges in an asynchronous runtime system. Internodal challenges include identifying the all-to-all communication of data dependencies among the nodes. Intranodal challenges include gathering these data dependencies into usable data objects while avoiding data duplication. This paper addresses these challenges within the context of a large-scale, industrial coal boiler simulation using the Uintah asynchronous many-task runtime system on GPU architectures. We show a significant reduction in time spent analyzing data dependencies through refinements in our dependency search algorithm. Multiple task graphs are used to eliminate subsequent analysis when task graphs change in predictable and repeatable ways. A combined data store and task scheduler redesign reduces data dependency duplication, ensuring that problems fit within host and GPU memory. These modifications did not require any changes to application code or sweeping changes to the Uintah runtime system. We report results running on the DOE Titan system on 119K CPU cores and 7.5K GPUs simultaneously. Our solutions can be generalized to other task dependency problems with global dependencies among thousands of nodes which must be processed efficiently at large scale.
Modern science is inundated with ever-increasing data sizes as computational capabilities and image acquisition techniques continue to improve. For example, simulations are tackling ever larger domains with higher fidelity, and high-throughput microscopy techniques generate larger data that are fundamental to gathering biologically and medically relevant insights. As image sizes exceed memory, and sometimes even local disk space, each step in a scientific workflow is impacted. Current software solutions enable data exploration with only limited interactivity for visualization and analytic tasks. Furthermore, analysis on HPC systems often requires complex hand-written parallel implementations of algorithms that suffer from poor portability and maintainability. We present a software infrastructure that simplifies end-to-end visualization and analysis of massive data. First, a hierarchical streaming data access layer enables interactive exploration of remote data, with fast data fetching to test analytics on subsets of the data. Second, a library simplifies the process of developing new analytics algorithms, allowing users to rapidly prototype new approaches and deploy them in an HPC setting. Third, a scalable runtime system automates the mapping of analysis algorithms to whatever computational hardware is available, reducing the complexity of developing scalable algorithms. We demonstrate the usability and performance of our system on a use case from neuroscience: filtering, registration, and visualization of tera-scale microscopy data. We evaluate the performance of our system on a leadership-class supercomputer, Shaheen II.
This article surveys the history and current state of the art of visualization in meteorology, focusing on visualization techniques and tools used for meteorological data analysis. We examine characteristics of meteorological data and analysis tasks, describe the development of computer graphics methods for visualization in meteorology from the 1960s to today, and visit the state of the art of visualization techniques and tools in operational weather forecasting and atmospheric research. We approach the topic from both the visualization and the meteorological side, showing visualization techniques commonly used in meteorological practice, and surveying recent studies in visualization research aimed at meteorological applications. Our overview covers visualization techniques from the fields of display design, 3D visualization, flow dynamics, feature-based visualization, comparative visualization and data fusion, uncertainty and ensemble visualization, interactive visual analysis, efficient rendering, and scalability and reproducibility. We discuss demands and challenges for visualization research targeting meteorological data analysis, highlighting aspects in demonstration of benefit, interactive visual analysis, seamless visualization, ensemble visualization, 3D visualization, and technical issues.
P. Seshadri, A. Narayan, S. Mahadevan. Effectively Subsampled Quadratures for Least Squares Polynomial Approximations, In SIAM/ASA Journal on Uncertainty Quantification, pp. 1003--1023. Jan, 2017.
This paper proposes a new deterministic sampling strategy for constructing polynomial chaos approximations for expensive physics simulation models. The proposed approach, effectively subsampled quadratures, involves sparsely subsampling an existing tensor grid using QR column pivoting. For polynomial interpolation using hyperbolic or total order index sets, we then solve the resulting square least-squares problem. For polynomial approximation, we use a column pruning heuristic that removes columns based on the highest total orders and then solves the resulting tall least-squares problem. While we provide bounds on the condition number of such tall submatrices, it is difficult to ascertain how column pruning affects solution accuracy, as this is problem specific. We conclude with numerical experiments on an analytical function and a model piston problem that show the efficacy of our approach compared with randomized subsampling. We also show an example where this method fails.
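The subsampling step (selecting rows of a weighted Vandermonde matrix via QR with column pivoting, then solving a square system on the selected points) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the grid size, total degree, and test function are assumptions:

```python
import numpy as np
from numpy.polynomial import legendre as L
from scipy.linalg import qr

# 2D Gauss-Legendre tensor grid: the 'existing tensor grid' to subsample.
n1d = 15
x1d, w1d = L.leggauss(n1d)
X, Y = np.meshgrid(x1d, x1d, indexing="ij")
pts = np.column_stack([X.ravel(), Y.ravel()])      # 225 grid points
wts = np.outer(w1d, w1d).ravel()                   # tensor quadrature weights

# Total-order polynomial index set of degree <= p in 2 variables.
p = 6
idx = [(i, j) for i in range(p + 1) for j in range(p + 1) if i + j <= p]

def basis_col(i, j):
    """Product Legendre basis term P_i(x) * P_j(y) at all grid points."""
    ci = np.zeros(i + 1); ci[-1] = 1.0
    cj = np.zeros(j + 1); cj[-1] = 1.0
    return L.legval(pts[:, 0], ci) * L.legval(pts[:, 1], cj)

# Weighted Vandermonde: rows = quadrature points, columns = basis terms.
A = np.sqrt(wts)[:, None] * np.column_stack([basis_col(i, j) for i, j in idx])

# QR with column pivoting on A^T selects m well-conditioned rows of A,
# i.e. a subsample of the tensor grid matching the basis size.
m = len(idx)
_, _, piv = qr(A.T, pivoting=True)
sub = piv[:m]

# Solve the square system on the subsampled points (toy model function).
f = np.exp(0.5 * (pts[:, 0] + pts[:, 1]))
coef = np.linalg.solve(A[sub], (np.sqrt(wts) * f)[sub])

approx = (A / np.sqrt(wts)[:, None]) @ coef        # evaluate on full grid
err = np.max(np.abs(approx - f))
print(f"kept {len(sub)} of {len(pts)} points, max error {err:.2e}")
```

Only 28 of the 225 tensor-grid points are retained here, while the approximation remains accurate for this smooth function.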
T. Sodergren, J. Hair, J.M. Phillips, B. Wang. Visualizing Sensor Network Coverage with Location Uncertainty, In CoRR, Vol. abs/1710.06925, 2017.
We present an interactive visualization system for exploring the coverage in sensor networks with uncertain sensor locations. We consider a simple case of uncertainty where the location of each sensor is confined to a discrete number of points sampled uniformly at random from a region with a fixed radius. Employing techniques from topological data analysis, we model and visualize network coverage by quantifying the uncertainty defined on its simplicial complex representations. We demonstrate the capabilities and effectiveness of our tool via the exploration of randomly distributed sensor networks.
A. Suh, M. Hajij, B. Wang, C. Scheidegger, P. Rosen. Driving Interactive Graph Exploration Using 0-Dimensional Persistent Homology Features, In CoRR, 2017.
Graphs are commonly used to encode relationships among entities, yet their abstractness makes them incredibly difficult to analyze. Node-link diagrams are a popular method for drawing graphs. Classical techniques for node-link diagrams include layout methods that rely on derived information to position points, which often lack interactive exploration functionality, and force-directed layouts, which ignore global structures of the graph. This paper addresses the graph drawing challenge by leveraging topological features of a graph as derived information for interactive graph drawing. We first discuss extracting topological features from a graph using persistent homology. We then introduce interactive persistence barcodes to study the substructures of a force-directed graph layout; in particular, we add contracting and repulsing forces guided by the 0-dimensional persistent homology features. Finally, we demonstrate the utility of our approach across three datasets.
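Computing 0-dimensional persistent homology of an edge-weighted graph amounts to a union-find sweep over edges sorted by weight: each merge of two components kills one of them and emits a bar. A minimal sketch follows (the convention that all vertices are born at filtration value 0 is an assumption; the paper's exact pipeline may differ):

```python
def zero_dim_persistence(n_vertices, edges):
    """edges: list of (weight, u, v) tuples. Vertices are born at 0.
    Returns (birth, death) bars for components that die, plus one
    essential bar (0, inf) per surviving connected component."""
    parent = list(range(n_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    bars = []
    for w, u, v in sorted(edges):           # sweep edges by weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # merge: one component dies at w
            parent[ru] = rv
            bars.append((0.0, w))
    roots = {find(x) for x in range(n_vertices)}
    bars += [(0.0, float("inf"))] * len(roots)
    return bars

# Toy graph: two clusters {0,1,2} and {3,4} joined by a heavy edge.
edges = [(0.1, 0, 1), (0.2, 1, 2), (0.15, 3, 4), (0.9, 2, 3)]
print(zero_dim_persistence(5, edges))
```

The long bar ending at 0.9 reflects the two-cluster structure; in the paper's setting, such features guide the contracting and repulsing forces added to the layout.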
J. Tate, K. Gillette, B. Burton, W. Good, J. Coll-Font, D. Brooks, R. MacLeod. Analyzing Source Sampling to Reduce Error in ECG Forward Simulations, In Computing in Cardiology, Vol. 44, 2017.
A continuing challenge in validating ECG Imaging is the persistent error in the associated forward problem observed in experimental studies. One possible cause of error is insufficient representation of the cardiac sources, which is often measured from only the ventricular epicardium, ignoring the endocardium and the atria. We hypothesize that measurements that completely cover the heart are required for accurate forward solutions. In this study, we used simulated and measured cardiac potentials to test the effect of different levels of sampling on the forward simulation. We found that omitting source samples on the atria increases the peak RMS error by a mean of 464 μV when compared to the fully sampled cardiac surface. Increasing the sampling on the atria in stages reduced the average error of the forward simulation proportionally to the number of additional samples and revealed that some strategies, such as adding samples to the AV plane and the atrial roof, may reduce error with fewer samples. Based on these results, we can design a sampling strategy to use in future validation studies.
W. Thevathasan, B. Debu, T. Aziz, B. R. Bloem, C. Blahak, C. Butson, V. Czernecki, T. Foltynie, V. Fraix, D. Grabli, C. Joint, A. M. Lozano, M. S. Okun, J. Ostrem, N. Pavese, C. Schrader, C. H. Tai, J. K. Krauss, E. Moro.
Pedunculopontine nucleus deep brain stimulation in Parkinson's disease: A clinical review, In Movement Disorders, Vol. 33, No. 1, pp. 10--20. 2017.
Pedunculopontine nucleus region deep brain stimulation (DBS) is a promising but experimental therapy for axial motor deficits in Parkinson's disease (PD), particularly gait freezing and falls. Here, we summarise the clinical application and outcomes reported during the past 10 years. The published dataset is limited, comprising fewer than 100 cases. Furthermore, there is great variability in clinical methodology between and within surgical centres. The most common indication has been severe medication refractory gait freezing (often associated with postural instability). Some patients received lone pedunculopontine nucleus DBS (unilateral or bilateral) and some received costimulation of the subthalamic nucleus or internal pallidum. Both rostral and caudal pedunculopontine nucleus subregions have been targeted. However, the spread of stimulation and variance in targeting means that neighboring brain stem regions may be implicated in any response. Low stimulation frequencies are typically employed (20-80 Hz). The fluctuating nature of gait freezing can confound programming and outcome assessments. Although firm conclusions cannot be drawn on therapeutic efficacy, the literature suggests that medication refractory gait freezing and falls can improve. The impact on postural instability is unclear. Most groups report a lack of benefit on gait or limb akinesia or dopaminergic medication requirements. The key question is whether pedunculopontine nucleus DBS can improve quality of life in PD. So far, the evidence supporting such an effect is minimal. Development of pedunculopontine nucleus DBS to become a reliable, established therapy would likely require a collaborative effort between experienced centres to clarify biomarkers predictive of response and the optimal clinical methodology.
We present a new method for progressive volume rendering by accumulating object-space samples over successively rendered frames. Existing methods for progressive refinement either use image space methods or average pixels over frames, which can blur features or integrate incorrectly with respect to depth. Our approach stores samples along each ray, accumulates new samples each frame into a buffer, and progressively interleaves and integrates these samples. Though this process requires additional memory, it ensures interactivity and is well suited for CPU architectures with large memory and cache. This approach also extends well to distributed rendering in cluster environments. We implement this technique in Intel's open source OSPRay CPU ray tracing framework and demonstrate that it is particularly useful for rendering volumetric data with costly sampling functions.
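The accumulate-and-reintegrate idea can be illustrated for a single ray: new object-space samples are added to a persistent buffer each frame, and the volume-rendering integral is re-evaluated over the depth-sorted, interleaved set. The scalar field, transfer function, and per-frame sample counts below are assumptions for illustration only; the actual OSPRay implementation maintains such buffers per pixel in C++:

```python
import numpy as np

rng = np.random.default_rng(2)

def field(t):
    """Assumed scalar field along the ray: a Gaussian bump at t = 0.5."""
    return np.exp(-((t - 0.5) ** 2) / 0.02)

def transfer(v):
    """Assumed transfer function: emission (color) and extinction."""
    return v, 4.0 * v

def integrate(ts):
    """Front-to-back emission-absorption integration over depth-sorted samples."""
    ts = np.sort(ts)
    color, transmittance, prev = 0.0, 1.0, 0.0
    for t in ts:
        dt = t - prev                       # spacing to previous sample
        c, sigma = transfer(field(t))
        alpha = 1.0 - np.exp(-sigma * dt)   # opacity of this segment
        color += transmittance * alpha * c
        transmittance *= 1.0 - alpha
        prev = t
    return color

# Each frame appends a few jittered (stratified) samples to the per-ray
# buffer; the integral is progressively refined over the combined set.
samples = np.empty(0)
estimates = []
for frame in range(8):
    new = (np.arange(8) + rng.uniform(size=8)) / 8.0
    samples = np.concatenate([samples, new])
    estimates.append(integrate(samples))

print([round(e, 4) for e in estimates])
```

Successive estimates change less and less as the sample buffer fills in, which is the refinement behavior the method exploits while keeping each frame interactive.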
W. Usher, P. Klacansky, F. Federer, P. T. Bremer, A. Knoll, J. Yarch, A. Angelucci, V. Pascucci.
A Virtual Reality Visualization Tool for Neuron Tracing, In IEEE Transactions on Visualization and Computer Graphics, IEEE, 2017.
Tracing neurons in large-scale microscopy data is crucial to establishing a wiring diagram of the brain, which is needed to understand how neural circuits in the brain process information and generate behavior. Automatic techniques often fail for large and complex datasets, and connectomics researchers may spend weeks or months manually tracing neurons using 2D image stacks. We present a design study of a new virtual reality (VR) system, developed in collaboration with trained neuroanatomists, to trace neurons in microscope scans of the visual cortex of primates. We hypothesize that using consumer-grade VR technology to interact with neurons directly in 3D will help neuroscientists better resolve complex cases and enable them to trace neurons faster and with less physical and mental strain. We discuss both the design process and technical challenges in developing an interactive system to navigate and manipulate terabyte-sized image volumes in VR. Using a number of different datasets, we demonstrate that, compared to widely used commercial software, consumer-grade VR presents a promising alternative for scientists.
Longitudinal Modeling of Multi-modal Image Contrast Reveals Patterns of Early Brain Growth, In Medical Image Computing and Computer Assisted Intervention - MICCAI 2017, Springer International Publishing, pp. 75--83. 2017.
The brain undergoes rapid development during early childhood as a series of biophysical and chemical processes occur, which can be observed in magnetic resonance (MR) images as a change over time of white matter intensity relative to gray matter. Such a contrast change manifests in specific patterns in different imaging modalities, suggesting that brain maturation is encoded by appearance changes in multi-modal MRI. In this paper, we explore the patterns of early brain growth encoded by multi-modal contrast changes in a longitudinal study of children. For a given modality, contrast is measured by comparing histograms of intensity distributions between white and gray matter. Multivariate non-linear mixed effects (NLME) modeling provides subject-specific as well as population growth trajectories that account for contrast from multiple modalities. The multivariate NLME procedure and resulting non-linear contrast functions enable the study of maturation in various regions of interest. Our analysis of several brain regions in a study of 70 healthy children reveals a posterior-to-anterior pattern of timing of maturation in the major lobes of the cerebral cortex, with posterior regions maturing earlier than anterior regions. Furthermore, we find significant differences in maturation rates between males and females.
Adaptive Mesh Refinement (AMR) methods are widespread in scientific computing, and visualizing the resulting data with efficient and accurate rendering methods can be vital for enabling interactive data exploration. In this work, we detail a comprehensive solution for directly volume rendering block-structured (Berger-Colella) AMR data in the OSPRay interactive CPU ray tracing framework. In particular, we contribute a general method for representing and traversing AMR data using a kd-tree structure, and four different reconstruction options, one of which in particular (the basis function approach) is novel compared to existing methods. We demonstrate our system on two types of block-structured AMR data and compressed scalar field data, and show how it can be easily used in existing production-ready applications through a prototypical integration in the widely used visualization program ParaView.
Y. Wan, C. Hansen. Uncertainty Footprint: Visualization of Nonuniform Behavior of Iterative Algorithms Applied to 4D Cell Tracking, In Computer Graphics Forum, Wiley, 2017.
Research on microscopy data from developing biological samples usually requires tracking individual cells over time. When cells are three-dimensionally and densely packed in a time-dependent scan of volumes, tracking results can become unreliable and uncertain. Not only are cell segmentation results often inaccurate to start with, but there is also no simple method to evaluate the tracking outcome. Previous cell tracking methods have been validated against benchmark data from real scans or artificial data, whose ground truth results are established by manual work or simulation. However, the wide variety of real-world data makes an exhaustive validation impossible. Established cell tracking tools often fail on new data, whose issues are also difficult to diagnose with only manual examination. Therefore, data-independent tracking evaluation methods are needed for the explosion of microscopy data of increasing scale and resolution. In this paper, we propose the uncertainty footprint, an uncertainty quantification and visualization technique that examines nonuniformity at local convergence for an iterative evaluation process on a spatial domain supported by partially overlapping bases. We demonstrate that the patterns revealed by the uncertainty footprint indicate data processing quality in two algorithms from a typical cell tracking workflow: cell identification and association. A detailed analysis of the patterns further allows us to diagnose issues and design methods for improvements. A 4D cell tracking workflow equipped with the uncertainty footprint is capable of self-diagnosis and correction, achieving higher accuracy than previous methods whose evaluation is limited by manual examination.
Image segmentation and registration techniques have enabled biologists to place large amounts of volume data from fluorescence microscopy, morphed three-dimensionally, onto a common spatial frame. Existing tools built on volume visualization pipelines for single channel or red-green-blue (RGB) channels have become inadequate for the new challenges of fluorescence microscopy. For a three-dimensional atlas of the insect nervous system, hundreds of volume channels are rendered simultaneously, whereas fluorescence intensity values from each channel need to be preserved for versatile adjustment and analysis. Although several existing tools have incorporated support of multichannel data using various strategies, the lack of a flexible design has made true many-channel visualization and analysis unavailable. The most common practice for many-channel volume data presentation is still converting and rendering pseudosurfaces, which are inaccurate for both qualitative and quantitative evaluations.
Here, we present an alternative design strategy that accommodates the visualization and analysis of about 100 volume channels, each of which can be interactively adjusted, selected, and segmented using freehand tools. Our multichannel visualization includes a multilevel streaming pipeline plus a triple-buffer compositing technique. Our method also preserves original fluorescence intensity values on graphics hardware, a crucial feature that allows graphics-processing-unit (GPU)-based processing for interactive data analysis, such as freehand segmentation. We have implemented the design strategies as a thorough restructuring of our original tool, FluoRender.
The redesign of FluoRender not only maintains the existing multichannel capabilities for a greatly extended number of volume channels, but also enables new analysis functions for many-channel data from emerging biomedical-imaging techniques.
Restricted and repetitive behaviors are defining features of autism spectrum disorder (ASD). Under revised diagnostic criteria for ASD, this behavioral domain now includes atypical responses to sensory stimuli. To date, little is known about the neural circuitry underlying these features of ASD early in life.
Longitudinal diffusion tensor imaging data were collected from 217 infants at high familial risk for ASD. Forty-four of these infants were diagnosed with ASD at age 2. Targeted cortical, cerebellar, and striatal white matter pathways were defined and measured at ages 6, 12, and 24 months. Dependent variables included the Repetitive Behavior Scale-Revised and the Sensory Experiences Questionnaire.
Among children diagnosed with ASD, repetitive behaviors and sensory response patterns were strongly correlated, even when accounting for developmental level or social impairment. Longitudinal analyses indicated that the genu and cerebellar pathways were significantly associated with both repetitive behaviors and sensory responsiveness but not social deficits. At age 6 months, fractional anisotropy in the genu significantly predicted repetitive behaviors and sensory responsiveness at age 2. Cerebellar pathways significantly predicted later sensory responsiveness. Exploratory analyses suggested a possible disordinal interaction based on diagnostic status for the association between fractional anisotropy and repetitive behavior.
Our findings suggest that restricted and repetitive behaviors contributing to a diagnosis of ASD at age 2 years are associated with structural properties of callosal and cerebellar white matter pathways measured during infancy and toddlerhood. We further identified that repetitive behaviors and unusual sensory response patterns co-occur and share common brain-behavior relationships. These results were strikingly specific given the absence of association between targeted pathways and social deficits.
L. Yang, A. Narayan, P. Wang.
Sequential data assimilation with multiple nonlinear models and applications to subsurface flow, In Journal of Computational Physics, Vol. 346, pp. 356--368. Oct, 2017.
Complex systems are often described with competing models. Such divergence of interpretation may stem from differences in model fidelity, mathematical simplicity, and, more generally, our limited knowledge of the underlying processes. Meanwhile, available but limited observations of the system state can further complicate one's choice of prediction model. Over the years, data assimilation techniques, such as the Kalman filter, have become essential tools for improved system estimation by incorporating both model forecasts and measurements, but their potential to mitigate the impact of the aforementioned model-form uncertainty has yet to be developed. Building on an earlier study of the multi-model Kalman filter, we propose a novel framework to assimilate multiple models with observation data for nonlinear systems, using the extended Kalman filter, the ensemble Kalman filter, and the particle filter, respectively. Through numerical examples of subsurface flow, we demonstrate that the new assimilation framework provides an effective and improved forecast of system behaviour.
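The core of such a framework is a per-model forecast/update cycle combined with a Bayesian reweighting of the competing models by how well each forecast explains the observation. The toy sketch below uses the ensemble Kalman filter variant on a scalar state with two invented linear models; all dynamics and parameters are assumptions for illustration, not the paper's subsurface-flow setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two competing forecast models for a scalar state (illustrative only):
# the first matches the true dynamics, the second is biased.
models = [lambda x: 0.9 * x, lambda x: 0.7 * x + 0.3]
truth_step = lambda x: 0.9 * x

n_ens, obs_var = 100, 0.05**2
x_true = 1.0
ensembles = [rng.normal(1.0, 0.2, n_ens) for _ in models]
weights = np.full(len(models), 1.0 / len(models))   # model probabilities

for _ in range(20):
    x_true = truth_step(x_true)
    y = x_true + rng.normal(0.0, np.sqrt(obs_var))  # noisy observation

    likelihoods = np.empty(len(models))
    for k, (m, ens) in enumerate(zip(models, ensembles)):
        ens = m(ens) + rng.normal(0.0, 0.02, n_ens)     # model forecast
        # Ensemble Kalman update (identity observation operator).
        P = np.var(ens, ddof=1)
        K = P / (P + obs_var)                           # Kalman gain
        perturbed = y + rng.normal(0.0, np.sqrt(obs_var), n_ens)
        ensembles[k] = ens + K * (perturbed - ens)
        # Gaussian likelihood of the observation under the forecast.
        likelihoods[k] = np.exp(-0.5 * (y - ens.mean())**2 / (P + obs_var))

    weights = weights * likelihoods
    weights /= weights.sum()                            # Bayes update

estimate = sum(w * e.mean() for w, e in zip(weights, ensembles))
print(f"weights: {np.round(weights, 3)}, estimate: {estimate:.3f}, truth: {x_true:.3f}")
```

After a few assimilation cycles, the weight of the biased model collapses toward zero and the combined estimate tracks the true state, which is the mechanism by which multi-model assimilation mitigates model-form uncertainty.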