SCI Publications
2018
S. Petruzza, A. Gyulassy, V. Pascucci, P. T. Bremer.
A Task-Based Abstraction Layer for User Productivity and Performance Portability in Post-Moore’s Era Supercomputing, In 3rd International Workshop on Post-Moore’s Era Supercomputing (PMES), 2018.
The proliferation of heterogeneous computing architectures in current and future supercomputing systems dramatically increases the complexity of software development and exacerbates the divergence of software stacks. Currently, task-based runtimes attempt to alleviate these impediments; however, their effective use requires expertise and deep integration that does not facilitate reuse and portability. We propose to introduce a task-based abstraction layer that separates the definition of the algorithm from the runtime-specific implementation while maintaining performance portability.
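To make the separation concrete, below is a minimal Python sketch of the pattern the abstract describes: the algorithm is written once against a generic task interface, and thin adapters map it onto concrete runtimes. The interface and the serial backend are illustrative assumptions, not the authors' API.

```python
# A minimal sketch of a task-based abstraction layer: the algorithm is defined
# against TaskRuntime only; swapping the backend does not change the algorithm.
from abc import ABC, abstractmethod

class TaskRuntime(ABC):
    """Backend-agnostic interface the algorithm is written against."""
    @abstractmethod
    def submit(self, fn, *deps):
        ...
    @abstractmethod
    def wait(self, handle):
        ...

class SerialRuntime(TaskRuntime):
    # Stand-in backend; a real adapter would target e.g. Legion or Charm++.
    def submit(self, fn, *deps):
        return fn(*(self.wait(d) for d in deps))
    def wait(self, handle):
        return handle

def algorithm(rt: TaskRuntime):
    # The task graph is defined once, independent of the runtime used.
    a = rt.submit(lambda: 2)
    b = rt.submit(lambda: 3)
    return rt.wait(rt.submit(lambda x, y: x + y, a, b))

print(algorithm(SerialRuntime()))  # 5
```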
A. Prakosa, H. J. Arevalo, D. Deng, P. M. Boyle, P. P. Nikolov, H. Ashikaga, J. J. E. Blauer, E. Ghafoori, C. J. Park, R. C. Blake, F. T. Han, R. S. MacLeod, H. R. Halperin, D. J. Callans, R. Ranjan, J. Chrispin, S. Nazarian, N. A. Trayanova.
Personalized virtual-heart technology for guiding the ablation of infarct-related ventricular tachycardia, In Nature Biomedical Engineering, Springer Nature America, Inc, September, 2018.
DOI: 10.1038/s41551-018-0282-2
Ventricular tachycardia (VT), which can lead to sudden cardiac death, occurs frequently in patients with myocardial infarction. Catheter-based radio-frequency ablation of cardiac tissue has achieved only modest efficacy, owing to the inaccurate identification of ablation targets by current electrical mapping techniques, which can lead to extensive lesions and to a prolonged, poorly tolerated procedure. Here, we show that personalized virtual-heart technology based on cardiac imaging and computational modelling can identify optimal infarct-related VT ablation targets in retrospective animal (five swine) and human studies (21 patients), as well as in a prospective feasibility study (five patients). We first assessed, using retrospective studies (one of which included a proportion of clinical images with artefacts), the capability of the technology to determine the minimum-size ablation targets for eradicating all VTs. In the prospective study, VT sites predicted by the technology were targeted directly, without relying on prior electrical mapping. The approach could improve infarct-related VT ablation guidance, where accurate identification of patient-specific optimal targets could be achieved on a personalized virtual heart before the clinical procedure.
N. Ramesh, T. Tasdizen.
Semi-supervised learning for cell tracking in microscopy images, In 2018 IEEE 15th International Symposium on Biomedical Imaging (ISBI 2018), IEEE, April, 2018.
This paper discusses an algorithm for semi-supervised learning to predict cell division and motion in microscopy images. The cells to be tracked are detected using extremal region selection and are represented as nodes in a graph. The supervised loss minimizes the error in the predictions of the division and move classifiers. The unsupervised loss constrains the incoming links for every detection such that only one of the links is active; similarly, for the outgoing links, we enforce that at most two links are active. The supervised and unsupervised losses are embedded in a Bayesian framework for probabilistic learning. The classifier predictions are used to model flow variables for every edge in the graph. Cell lineages are recovered by formulating tracking as an energy minimization problem with constraints, solved using integer linear programming. The unsupervised loss adds a significant improvement in the prediction of the division classifier.
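The lineage-solving step the abstract describes is an integer linear program over link variables. The following is a minimal sketch using the PuLP solver; the toy graph and classifier scores are invented for illustration.

```python
# ILP sketch of cell linking: at most one incoming and at most two outgoing
# active links per detection (two outgoing = a division). Scores are hypothetical.
import pulp

detections = ["a1", "a2", "b1", "b2", "b3"]
links = {("a1", "b1"): 0.9, ("a1", "b2"): 0.7, ("a2", "b2"): 0.4, ("a2", "b3"): 0.8}

prob = pulp.LpProblem("cell_lineage", pulp.LpMaximize)
x = {e: pulp.LpVariable(f"x_{e[0]}_{e[1]}", cat="Binary") for e in links}

# objective: prefer links the move/division classifiers score highly
prob += pulp.lpSum(score * x[e] for e, score in links.items())

for d in detections:
    incoming = [x[e] for e in links if e[1] == d]
    outgoing = [x[e] for e in links if e[0] == d]
    if incoming:  # at most one active incoming link per detection
        prob += pulp.lpSum(incoming) <= 1
    if outgoing:  # at most two active outgoing links (a cell division)
        prob += pulp.lpSum(outgoing) <= 2

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("active links:", [e for e in links if x[e].value() == 1])
```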
M. Razi, A. Narayan, R. M. Kirby, D. Bedrov.
Fast predictive models based on multi-fidelity sampling of properties in molecular dynamics simulations, In Computational Materials Science, Vol. 152, Elsevier BV, pp. 125--133. September, 2018.
DOI: 10.1016/j.commatsci.2018.05.029
In this paper we introduce a novel approach for enhancing the sampling convergence for properties predicted by molecular dynamics. The proposed approach is based upon the construction of a multi-fidelity surrogate model using computational models with different levels of accuracy. While low-fidelity models produce results with lower accuracy at lower computational cost, in this framework they provide the basis for identifying the optimal sparse sampling pattern at which high-fidelity models are evaluated to construct an accurate surrogate model. Such an approach can provide significant computational savings for estimating the quantities of interest of the underlying physical/engineering systems. In the present work, this methodology is demonstrated for molecular dynamics simulations of a Lennard-Jones fluid. Fidelity levels are defined based upon the integration time step employed in the simulation. The proposed approach is applied to two canonical problems: (i) a single-component fluid and (ii) a binary glass-forming mixture. The results show about 70% computational savings for the estimation of averaged system properties such as total energy, self-diffusion coefficient, radial distribution function, and mean squared displacement, with reasonable accuracy.
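As a schematic of the idea (not the paper's exact algorithm), the sketch below uses many cheap low-fidelity snapshots to select a few informative parameter samples via column-pivoted QR, runs the "expensive" model only there, and reuses the cheap model's expansion coefficients with the expensive snapshots. The two toy models stand in for MD runs at large and small time steps.

```python
# Multi-fidelity surrogate sketch: cheap model everywhere, expensive model at
# a handful of QR-selected samples. Both toy models are illustrative assumptions.
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
r = np.linspace(0.8, 2.5, 40)                # e.g. radii of an RDF-like QoI
params = rng.uniform(1.0, 2.0, size=100)     # candidate parameter values

def low_fidelity(p):                         # cheap, slightly wrong model
    return np.exp(-(r - p) ** 2 / 0.1)

def high_fidelity(p):                        # accurate, expensive model
    return np.exp(-(r - p) ** 2 / 0.1) + 0.05 * np.sin(6 * r) * p

L = np.column_stack([low_fidelity(p) for p in params])   # cheap snapshots
_, _, piv = qr(L, pivoting=True)             # pick informative columns
sel = piv[:8]                                # only 8 high-fidelity runs
H = np.column_stack([high_fidelity(params[i]) for i in sel])

def surrogate(p):
    # coefficients fitted on the cheap model, reused with expensive snapshots
    c, *_ = np.linalg.lstsq(L[:, sel], low_fidelity(p), rcond=None)
    return H @ c

p_test = 1.37
print(f"surrogate error: {np.linalg.norm(surrogate(p_test) - high_fidelity(p_test)):.4f}")
```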
M. Reblin, D. Ketcher, P. Forsyth, E. Mendivil, L. Kane, J. Pok, M. Meyer, Y. Wu, J. Agutter.
Outcomes of an electronic social network intervention with neuro-oncology patient family caregivers, In Journal of Neuro-Oncology, Springer Nature, pp. 1--7. May, 2018.
DOI: 10.1007/s11060-018-2909-2
Introduction
Informal family caregivers (FCG) are an integral and crucial human component in the cancer care continuum. However, research and interventions to help alleviate the documented anxiety and burden on this group are lacking. To address the absence of effective interventions, we developed the electronic Support Network Assessment Program (eSNAP), which aims to automate the capture and visualization of social support, an important target for overall FCG support. This study seeks to describe the preliminary efficacy and outcomes of the eSNAP intervention.
Methods
Forty FCGs were enrolled in a longitudinal, two-group randomized design comparing the eSNAP intervention in caregivers of patients with primary brain tumors against controls who did not receive the intervention. Participants were followed for six weeks with questionnaires assessing demographics, caregiver burden, anxiety, depression, and social support. Questionnaires were given at baseline (T1) and at 3 weeks (T2) and 6 weeks (T3) post-baseline.
Results
FCGs reported high caregiver burden and distress at baseline, with burden remaining stable over the course of the study. The intervention group was significantly less depressed, but anxiety remained stable across groups.
Conclusions
With the lessons learned and feedback obtained from FCGs, this study is the first step to developing an effective social support intervention to support FCGs and healthcare providers in improving cancer care.
A. Rodenhauser, W.W. Good, B. Zenger, J. Tate, K. Aras, B. Burton, R.S. MacLeod.
PFEIFER: Preprocessing Framework for Electrograms Intermittently Fiducialized from Experimental Recordings, In The Journal of Open Source Software, Vol. 3, No. 21, The Open Journal, pp. 472. Jan, 2018.
DOI: 10.21105/joss.00472
Preprocessing Framework for Electrograms Intermittently Fiducialized from Experimental Recordings (PFEIFER) is a MATLAB Graphical User Interface designed to process bioelectric signals acquired from experiments.
PFEIFER was specifically designed to process electrocardiographic recordings from electrodes placed on or around the heart or on the body surface. Specific steps included in PFEIFER allow the user to remove some forms of noise, correct for signal drift, and mark specific instants or intervals in time (fiducialize) within all of the time-sampled channels. PFEIFER includes many unique features that allow the user to process electrical signals in a consistent and time-efficient manner, with additional options for advanced user configurations and input. PFEIFER is structured as a consolidated framework that provides many standard processing pipelines but also has the flexibility to allow the user to customize many of the steps. PFEIFER allows the user to import time-aligned cardiac electrical signals, semi-automatically determine fiducial markings from those signals, and perform computational tasks that prepare the signals for subsequent display and analysis.
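PFEIFER itself is a MATLAB GUI; the Python sketch below only illustrates one kind of processing step it provides, baseline-drift removal, here done by fitting and subtracting a low-order polynomial per channel. The polynomial approach is an assumption for illustration, not PFEIFER's actual implementation.

```python
# Minimal baseline-drift correction sketch for multichannel electrograms.
import numpy as np

def remove_drift(signals, order=3):
    """signals: (channels, samples) array of electrograms."""
    n = signals.shape[1]
    t = np.linspace(-1.0, 1.0, n)          # normalized time axis
    corrected = np.empty_like(signals)
    for ch, s in enumerate(signals):
        coeffs = np.polyfit(t, s, order)   # slow trend = low-order polynomial
        corrected[ch] = s - np.polyval(coeffs, t)
    return corrected

# synthetic electrogram: beat-like oscillation plus a slow quadratic drift
t = np.linspace(0, 1, 2000)
sig = np.sin(2 * np.pi * 5 * t) + 0.8 * t ** 2
clean = remove_drift(sig[None, :])
print("residual mean after correction:", float(np.mean(clean)))
```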
U. Ruede, K. Willcox, L. C. McInnes, H. De Sterck, G. Biros, H. Bungartz, J. Corones, E. Cramer, J. Crowley, O. Ghattas, M. Gunzburger, M. Hanke, R. Harrison, M. Heroux, J. Hesthaven, P. Jimack, C. Johnson, K. E. Jordan, D. E. Keyes, R. Krause, V. Kumar, S. Mayer, J. Meza, K. M. Mørken, J. T. Oden, L. Petzold, P. Raghavan, S. M. Shontz, A. Trefethen, P. Turner, V. Voevodin, B. Wohlmuth, C. S. Woodward.
Research and Education in Computational Science and Engineering, In SIAM Review, Vol. 60, No. 3, SIAM, pp. 707--754. Jan, 2018.
DOI: 10.1137/16m1096840
This report presents challenges, opportunities and directions for computational science and engineering (CSE) research and education for the next decade. Over the past two decades the field of CSE has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers with algorithmic inventions and software systems that transcend disciplines and scales. CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments—including the architectural complexity of extreme-scale computing, the data revolution and increased attention to data-driven discovery, and the specialization required to follow the applications to new frontiers—is redefining the scope and reach of the CSE endeavor. With these many current and expanding opportunities for the CSE field, there is a growing demand for CSE graduates and a need to expand CSE educational offerings. This need includes CSE programs at both the undergraduate and graduate levels, as well as continuing education and professional development programs, exploiting the synergy between computational science and data science. Yet, as institutions consider new and evolving educational programs, it is essential to consider the broader research challenges and opportunities that provide the context for CSE education and workforce development.
A. Sanderson, A. Humphrey, J. Schmidt, R. Sisneros.
Coupling the Uintah Framework and the VisIt Toolkit for Parallel In Situ Data Analysis and Visualization and Computational Steering, In High Performance Computing, June, 2018.
Data analysis and visualization are an essential part of the scientific discovery process. As HPC simulations have grown, I/O has become a bottleneck, which has required scientists to turn to in situ tools for simulation data exploration. Incorporating additional data, such as runtime performance data, into the analysis or I/O phases of a workflow is routinely avoided for fear of exacerbating performance issues. This paper presents how the Uintah Framework, a suite of HPC libraries and applications for simulating complex chemical and physical reactions, was coupled with VisIt, an interactive analysis and visualization toolkit, to allow scientists to perform parallel in situ visualization of simulation and runtime performance data. The coupling also made it possible to create a "simulation dashboard" that allows for in situ computational steering and visual debugging.
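Stripped to its essentials, the coupling pattern is a simulation loop that hands each timestep's data and runtime statistics to a visualization hook, which can return updated steering parameters. The sketch below is a schematic of that pattern only; the names are illustrative, not the actual Uintah/VisIt libsim API.

```python
# Schematic in situ coupling: per-timestep handoff plus computational steering.
class InSituHook:
    """Stand-in for the visualization side of the coupling (not the real API)."""
    def __init__(self):
        self.steering = {"output_interval": 10}   # steerable parameter

    def process(self, timestep, grid, runtime_stats):
        # A real hook would hand the data to VisIt for rendering; here we
        # just report on it and return (possibly updated) steering values.
        if timestep % self.steering["output_interval"] == 0:
            print(f"t={timestep}: max field = {max(grid):.1f}, "
                  f"task wait = {runtime_stats['task_wait_s']:.3f}s")
        return self.steering

def simulate(num_steps, hook):
    grid = [300.0] * 8                            # toy simulation field
    for ts in range(num_steps):
        grid = [v + 1.5 for v in grid]            # stand-in for physics tasks
        stats = {"task_wait_s": 0.001 * ts}       # runtime performance data
        hook.steering = hook.process(ts, grid, stats)   # in situ handoff

simulate(25, InSituHook())
```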
A. Sanderson, X. Tricoche.
Exploration of periodic flow fields, In 18th International Symposium on Flow Visualization, 2018.
One of the difficulties researchers face when exploring flow fields is understanding the respective strengths and limitations of the visualization and analysis techniques that can be applied to their particular problem. We consider in this paper the visualization of doubly periodic flow fields. Specifically, we compare and contrast two traditional visualization techniques, the Poincaré plot and the finite-time Lyapunov exponent (FTLE) plot, with a technique recently proposed by the authors, which enhances the Poincaré plot with analytical results that reveal the topology. As is often the case, no single technique achieves a holistic visualization of the flow field that would address all the needs of the analysis. Instead, we show that additional insight can be gained from applying them in combination.
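For readers unfamiliar with the first of these techniques, a Poincaré (stroboscopic) plot of a time-periodic flow records particle positions once per forcing period. A minimal sketch, assuming a toy forced velocity field in place of the paper's data:

```python
# Poincaré plot sketch: sample a trajectory once per forcing period.
import numpy as np
from scipy.integrate import solve_ivp

T = 2 * np.pi                     # forcing period of the flow

def velocity(t, xy):
    """Toy time-periodic 2D velocity field (a forced Duffing-type system)."""
    x, y = xy
    return [y, x - x**3 + 0.1 * np.cos(t)]

def poincare(x0, y0, n_periods=200):
    """Record the particle position once per period (stroboscopic map)."""
    pts, state = [], [x0, y0]
    for _ in range(n_periods):
        sol = solve_ivp(velocity, (0, T), state, rtol=1e-8, atol=1e-10)
        state = sol.y[:, -1]
        pts.append(state.copy())
    return np.array(pts)

section = poincare(0.5, 0.0)
print(section[:5])                # each row: one crossing of the section
```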
I. J. Schwerdt, A. Brenkmann, S. Martinson, B. D. Albrecht, S. Heffernan, M. R. Klosterman, T. Kirkham, T. Tasdizen, L. W. McDonald IV.
Nuclear proliferomics: A new field of study to identify signatures of nuclear materials as demonstrated on alpha-UO3, In Talanta, Vol. 186, Elsevier BV, pp. 433--444. Aug, 2018.
DOI: 10.1016/j.talanta.2018.04.092
The use of a limited set of signatures in nuclear forensics and nuclear safeguards may reduce the discriminating power for identifying unknown nuclear materials, or for verifying processing at existing facilities. Nuclear proliferomics is a proposed new field of study that advocates for the acquisition of large databases of nuclear material properties from a variety of analytical techniques. As demonstrated on a common uranium trioxide polymorph, α-UO3, in this paper, nuclear proliferomics increases the ability to improve confidence in identifying the processing history of nuclear materials. Specifically, α-UO3 was investigated from the calcination of unwashed uranyl peroxide at 350, 400, 450, 500, and 550 °C in air. Scanning electron microscopy (SEM) images were acquired of the surface morphology, and distinct qualitative differences are presented between unwashed and washed uranyl peroxide, as well as the calcination products from the unwashed uranyl peroxide at the investigated temperatures. Differential scanning calorimetry (DSC), UV–Vis spectrophotometry, powder X-ray diffraction (p-XRD), and thermogravimetric analysis-mass spectrometry (TGA-MS) were used to understand the source of these morphological differences as a function of calcination temperature. Additionally, the SEM images were manually segmented using Morphological Analysis for MAterials (MAMA) software to identify quantifiable differences in morphology for three different surface features present on the unwashed uranyl peroxide calcination products. No single quantifiable signature was sufficient to discern all calcination temperatures with a high degree of confidence; therefore, advanced statistical analysis was performed to allow the combination of a number of quantitative signatures, with their associated uncertainties, to allow for complete discernment by calcination history. Furthermore, machine learning was applied to the acquired SEM images to demonstrate automated discernment with at least 89% accuracy.
B. Summa, N. Faraj, C. Licorish, V. Pascucci.
Flexible Live‐Wire: Image Segmentation with Floating Anchors, In Computer Graphics Forum, Vol. 37, No. 2, Wiley, pp. 321--328. May, 2018.
DOI: 10.1111/cgf.13364
We introduce Flexible Live‐Wire, a generalization of the Live‐Wire interactive segmentation technique with floating anchors. In our approach, the user input for Live‐Wire is no longer limited to the setting of pixel‐level anchor nodes, but can use more general anchor sets. These sets can be of any dimension, size, or connectedness. The generality of the approach allows the design of a number of user interactions while providing the same functionality as the traditional Live‐Wire. In particular, we experiment with this new flexibility by designing four novel Live‐Wire interactions based on specific primitives: paint, pinch, probable, and pick anchors. These interactions are only a subset of the possibilities enabled by our generalization. Moreover, we discuss the computational aspects of this approach and provide practical solutions to alleviate any additional overhead. Finally, we illustrate our approach and new interactions through several example segmentations.
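At its core, Live-Wire computes shortest paths over a pixel graph whose costs are derived from the image, and the floating-anchor generalization corresponds naturally to seeding the search with an anchor set rather than a single pixel. A minimal sketch, with an invented toy cost image:

```python
# Live-Wire core as multi-source Dijkstra over a per-pixel cost field; seeding
# with several pixels at distance 0 plays the role of a general anchor set.
import heapq
import numpy as np

def live_wire(cost, anchors):
    """cost: 2D array of per-pixel costs; anchors: iterable of (row, col) seeds."""
    dist = np.full(cost.shape, np.inf)
    heap = []
    for a in anchors:                      # multi-source seeding = anchor set
        dist[a] = 0.0
        heapq.heappush(heap, (0.0, a))
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < cost.shape[0] and 0 <= nc < cost.shape[1]:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist  # backtracking along decreasing dist yields the wire

img_cost = np.ones((64, 64))
img_cost[:, 32] = 0.05                       # a cheap vertical "edge" to follow
d = live_wire(img_cost, [(0, 32), (0, 31)])  # a small anchor set
print(d[63, 32])
```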
T. Tasdizen, M. Sajjadi, M. Javanmardi, N. Ramesh.
Improving the robustness of convolutional networks to appearance variability in biomedical images, In 2018 IEEE 15th International Symposium on Biomedical Imaging (ISBI 2018), IEEE, April, 2018.
DOI: 10.1109/isbi.2018.8363636
While convolutional neural networks (CNN) produce state-of-the-art results in many applications including biomedical image analysis, they are not robust to variability in the data that is not well represented by the training set. An important source of variability in biomedical images is the appearance of objects such as contrast and texture due to different imaging settings. We introduce the neighborhood similarity layer (NSL) which can be used in a CNN to improve robustness to changes in the appearance of objects that are not well represented by the training data. The proposed NSL transforms its input feature map at a given pixel by computing its similarity to the surrounding neighborhood. This transformation is spatially varying, hence not a convolution. It is differentiable; therefore, networks including the proposed layer can be trained in an end-to-end manner. We demonstrate the advantages of the NSL for the vasculature segmentation and cell detection problems.
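A hedged sketch of such a layer in PyTorch: each pixel's feature vector is replaced by its cosine similarity to every neighbor in a k × k window, giving a spatially varying, differentiable transform. The exact similarity measure and normalization used in the paper may differ.

```python
# Neighborhood-similarity sketch: cosine similarity of each pixel's feature
# vector to its k x k neighborhood (an assumed form of the NSL idea).
import torch
import torch.nn.functional as F

def neighborhood_similarity(x, k=3):
    """x: (B, C, H, W) feature map -> (B, k*k, H, W) similarity map."""
    b, c, h, w = x.shape
    xn = F.normalize(x, dim=1)                        # unit feature vectors
    pad = k // 2
    # (B, C*k*k, H*W): each column holds the k*k neighborhood of one pixel
    patches = F.unfold(xn, kernel_size=k, padding=pad)
    patches = patches.view(b, c, k * k, h, w)
    center = xn.unsqueeze(2)                          # (B, C, 1, H, W)
    return (patches * center).sum(dim=1)              # cosine similarities

feat = torch.randn(1, 16, 8, 8)
print(neighborhood_similarity(feat).shape)            # torch.Size([1, 9, 8, 8])
```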
S. Thomas, J. Silvernagel, N. Angel, E. Kholmovski, E. Ghafoori, N. Hu, J. Ashton, D.J. Dosdall, R.S. MacLeod, R. Ranjan.
Higher contact force during radiofrequency ablation leads to a much larger increase in edema as compared to chronic lesion size, In Journal of Cardiovascular Electrophysiology, Wiley, June, 2018.
DOI: 10.1111/jce.13636
1 Introduction
Reversible edema is a part of any radiofrequency ablation but its relationship with contact force is unknown. The goal of this study was to characterize through histology and MRI, acute and chronic ablation lesions and reversible edema with contact force.
2 Methods and results
In a canine model (n = 14), chronic ventricular lesions were created with a 3.5‐mm tip ThermoCool SmartTouch (Biosense Webster) catheter at 25 W or 40 W for 30 seconds. Repeat ablation was performed after 3 months to create a second set of lesions (acute). Each ablation procedure was followed by in vivo T2‐weighted MRI for edema and late‐gadolinium enhancement (LGE) MRI for lesion characterization. For chronic lesions, the mean scar volumes at 25 W and 40 W were 77.8 ± 34.5 mm3 (n = 24) and 139.1 ± 69.7 mm3 (n = 12), respectively. The volume of chronic lesions increased (25 W: P < 0.001, 40 W: P < 0.001) with greater contact force. For acute lesions, the mean volumes of the lesion were 286.0 ± 129.8 mm3 (n = 19) and 422.1 ± 113.1 mm3 (n = 16) for 25 W and 40 W, respectively (P < 0.001 compared to chronic scar). On T2‐weighted MRI, the acute edema volume was on average 5.6–8.7 times higher than the acute lesion volume and increased with contact force (25 W: P = 0.001, 40 W: P = 0.011).
3 Conclusion
With increasing contact force, there is a marginal increase in lesion size, but it is accompanied by significantly larger edema. The reversible edema, which is much larger than the chronic lesion volume, may explain some chronic procedure failures.
J.N. Todd, T.G. Maak, G.A. Ateshian, S.A. Maas, J.A. Weiss.
Hip chondrolabral mechanics during activities of daily living: Role of the labrum and interstitial fluid pressurization, In Journal of Biomechanics, Vol. 69, Elsevier BV, pp. 113--120. March, 2018.
DOI: 10.1016/j.jbiomech.2018.01.001
Osteoarthritis of the hip can result from mechanical factors, which can be studied using finite element (FE) analysis. FE studies of the hip often assume there is no significant loss of fluid pressurization in the articular cartilage during simulated activities and approximate the material as incompressible and elastic. This study examined the conditions under which interstitial fluid load support remains sustained during physiological motions, as well as the role of the labrum in maintaining fluid load support and the effect of its presence on the solid phase of the surrounding cartilage. We found that dynamic motions of gait and squatting maintained consistent fluid load support between cycles, while static single-leg stance experienced slight fluid depressurization with significant reduction of solid phase stress and strain. Presence of the labrum did not significantly influence fluid load support within the articular cartilage, but prevented deformation at the cartilage edge, leading to lower stress and strain conditions in the cartilage. A morphologically accurate representation of collagen fibril orientation through the thickness of the articular cartilage was not necessary to predict fluid load support. However, comparison with a simplified fibril reinforcement underscored its physiological importance. The results of this study demonstrate that an elastic incompressible material approximation is reasonable for modeling a limited number of cyclic motions of gait and squatting without significant loss of accuracy, but is not appropriate for static motions or numerous repeated motions. Additionally, effects seen from removal of the labrum motivate evaluation of labral reattachment strategies in the context of labral repair.
L. Tu, M. Styner, J. Vicory, S. Elhabian, R. Wang, J. Hong, B. Paniagua, J.C. Prieto, D. Yang, R. Whitaker, S. M. Pizer.
Skeletal Shape Correspondence through Entropy, In IEEE Transactions on Medical Imaging, Vol. 37, No. 1, IEEE, pp. 1--11. Jan, 2018.
DOI: 10.1109/tmi.2017.2755550
We present a novel approach for improving the shape statistics of medical image objects by generating correspondence of skeletal points. Each object's interior is modeled by an s-rep, i.e., by a sampled, folded, two-sided skeletal sheet with spoke vectors proceeding from the skeletal sheet to the boundary. The skeleton is divided into three parts: the up side, the down side, and the fold curve. The spokes on each part are treated separately and, using spoke interpolation, are shifted along that skeleton in each training sample so as to tighten the probability distribution on those spokes' geometric properties while sampling the object interior regularly. As with the surface/boundary-based correspondence method of Cates et al., entropy is used to measure both the probability distribution tightness and the sampling regularity, here of the spokes' geometric properties. Evaluation on synthetic and real-world lateral ventricle and hippocampus data sets demonstrates improvement in the performance of statistics using the resulting probability distributions. This improvement is greater than that achieved by an entropy-based correspondence method on the boundary points.
W. Usher, P. Klacansky, F. Federer, P. T. Bremer, A. Knoll, J. Yarch, A. Angelucci, V. Pascucci.
A virtual reality visualization tool for neuron tracing, In IEEE Transactions on Visualization and Computer Graphics, Vol. 24, No. 1, IEEE, pp. 994--1003. Jan, 2018.
DOI: 10.1109/tvcg.2017.2744079
Tracing neurons in large-scale microscopy data is crucial to establishing a wiring diagram of the brain, which is needed to understand how neural circuits in the brain process information and generate behavior. Automatic techniques often fail for large and complex datasets, and connectomics researchers may spend weeks or months manually tracing neurons using 2D image stacks. We present a design study of a new virtual reality (VR) system, developed in collaboration with trained neuroanatomists, to trace neurons in microscope scans of the visual cortex of primates. We hypothesize that using consumer-grade VR technology to interact with neurons directly in 3D will help neuroscientists better resolve complex cases and enable them to trace neurons faster and with less physical and mental strain. We discuss both the design process and technical challenges in developing an interactive system to navigate and manipulate terabyte-sized image volumes in VR. Using a number of different datasets, we demonstrate that, compared to widely used commercial software, consumer-grade VR presents a promising alternative for scientists.
W. Usher, S. Rizzi, I. Wald, J. Amstutz, J. Insley, V. Vishwanath, N. Ferrier, M. E. Papka, V. Pascucci.
libIS: A Lightweight Library for Flexible In Transit Visualization, In Proceedings of the Workshop on In Situ Infrastructures for Enabling Extreme-Scale Analysis and Visualization, ACM Press, 2018.
DOI: 10.1145/3281464.3281466
As simulations grow in scale, the need for in situ analysis methods to handle the large data produced grows correspondingly. One desirable approach to in situ visualization is in transit visualization. By decoupling the simulation and visualization code, in transit approaches alleviate common difficulties with regard to the scalability of the analysis, ease of integration, usability, and impact on the simulation. We present libIS, a lightweight, flexible library which lowers the bar for using in transit visualization. Our library works on the concept of abstract regions of space containing data, which are transferred from the simulation to the visualization clients upon request, using a client-server model. We also provide a SENSEI analysis adaptor, which allows for transparent deployment of in transit visualization. We demonstrate the flexibility of our approach on batch analysis and interactive visualization use cases on different HPC resources.
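The client-server transfer model can be sketched in a few lines: the simulation side owns regions of space and ships one to an analysis client on request. Plain TCP and pickle stand in for libIS's actual transport, and every name below is illustrative.

```python
# Toy request-driven region transfer between a "simulation" and a "client".
import pickle
import socket
import threading
import time

# one toy "region": bounds in grid coordinates plus its field data
REGIONS = {0: {"bounds": (0, 0, 32, 32), "field": [1.0] * 1024}}

def simulation_server(port=5555):
    srv = socket.create_server(("127.0.0.1", port))
    conn, _ = srv.accept()
    region_id = int.from_bytes(conn.recv(4), "little")    # client's request
    payload = pickle.dumps(REGIONS[region_id])
    conn.sendall(len(payload).to_bytes(8, "little") + payload)
    conn.close()
    srv.close()

threading.Thread(target=simulation_server, daemon=True).start()
time.sleep(0.2)                                           # let the server bind

cli = socket.create_connection(("127.0.0.1", 5555))
cli.sendall((0).to_bytes(4, "little"))                    # request region 0
size = int.from_bytes(cli.recv(8), "little")
buf = b""
while len(buf) < size:
    buf += cli.recv(65536)
print("received region bounds:", pickle.loads(buf)["bounds"])
```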
F. Wang, W. Li, S. Wang, C.R. Johnson.
Association Rules-Based Multivariate Analysis and Visualization of Spatiotemporal Climate Data, In ISPRS International Journal of Geo-Information, Vol. 7, No. 7, MDPI AG, pp. 266. July, 2018.
DOI: 10.3390/ijgi7070266
Understanding atmospheric phenomena involves analysis of large-scale spatiotemporal multivariate data. The complexity and heterogeneity of such data pose a significant challenge in discovering and understanding the association between multiple climate variables. To tackle this challenge, we present an interactive heuristic visualization system that supports climate scientists and the public in their exploration and analysis of atmospheric phenomena of interest. Three techniques are introduced: (1) web-based spatiotemporal climate data visualization; (2) multiview and multivariate scientific data analysis; and (3) data mining-enabled visual analytics. The Arctic System Reanalysis (ASR) data are used to demonstrate and validate the effectiveness and usefulness of our method through a case study of "The Great Arctic Cyclone of 2012". The results show that different variables have strong associations near the polar cyclone area. This work also provides techniques for identifying multivariate correlation and for better understanding the driving factors of climate phenomena.
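To indicate what the mined rules look like, here is a small sketch using the mlxtend implementation of Apriori on discretized, invented climate-variable flags; the paper applies this kind of mining to ASR reanalysis fields.

```python
# Association-rule mining sketch: discretized variable states as transactions.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# each row = one grid cell / time step; flags are hypothetical discretizations
transactions = pd.DataFrame({
    "low_pressure": [1, 1, 1, 0, 1, 0],
    "high_wind":    [1, 1, 0, 0, 1, 0],
    "low_sea_ice":  [1, 1, 1, 0, 1, 1],
}, dtype=bool)

itemsets = apriori(transactions, min_support=0.5, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```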
Z. Xiong, V. V. Fedorov, X. Fu, E. Cheng, R. MacLeod, J. Zhao.
Fully Automatic Left Atrium Segmentation from Late Gadolinium Enhanced Magnetic Resonance Imaging Using a Dual Fully Convolutional Neural Network, In IEEE Transactions on Medical Imaging, IEEE, pp. 1--1. 2018.
DOI: 10.1109/tmi.2018.2866845
Atrial fibrillation (AF) is the most prevalent form of cardiac arrhythmia. Current treatments for AF remain suboptimal due to a lack of understanding of the underlying atrial structures that directly sustain AF. Existing approaches for analyzing atrial structures in 3D, especially from late gadolinium-enhanced (LGE)-MRIs, rely heavily on manual segmentation methods which are extremely labor-intensive and prone to errors. As a result, a robust and automated method for analyzing atrial structures in 3D is of high interest. We have therefore developed AtriaNet, a 16-layer convolutional neural network (CNN), on 154 3D LGE-MRIs with a spatial resolution of 0.625 mm × 0.625 mm × 1.25 mm from patients with AF, to automatically segment the left atrial (LA) epicardium and endocardium. AtriaNet consists of a multi-scaled, dual pathway architecture that captures both the local atrial tissue geometry and the global positional information of the LA using 13 successive convolutions, and 3 further convolutions for merging. By utilizing computationally efficient batch prediction, AtriaNet was able to successfully process each 3D LGE-MRI within one minute. Furthermore, benchmarking experiments showed that AtriaNet outperformed state-of-the-art CNNs, with a DICE score of 0.940 and 0.942 for the LA epicardium and endocardium, respectively, and an inter-patient variance of <0.001. The estimated LA diameter and volume computed from the automatic segmentations were accurate to within 1.59 mm and 4.01 cm³ of the ground truths. Our proposed CNN was tested on the largest known dataset for LA segmentation, and to the best of our knowledge, it is the most robust approach that has ever been developed for segmenting LGE-MRIs. The increased accuracy of atrial reconstruction and analysis could potentially improve the understanding and treatment of AF.
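The dual-pathway idea can be sketched in PyTorch as follows: one branch sees the local patch at full resolution, another a wider but downsampled context, and the two are merged before final per-pixel classification. Layer counts and sizes here are toy values, not AtriaNet's actual 13 + 3 design.

```python
# Toy dual-pathway segmentation network: local detail + downsampled context.
import torch
import torch.nn as nn

class DualPathway(nn.Module):
    def __init__(self, in_ch=1, classes=2):
        super().__init__()
        self.local_path = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU())
        self.global_path = nn.Sequential(
            nn.AvgPool2d(4),                       # wider context, coarser grid
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False))
        self.merge = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, classes, 1))             # per-pixel LA / background

    def forward(self, x):
        return self.merge(torch.cat([self.local_path(x), self.global_path(x)], 1))

net = DualPathway()
print(net(torch.randn(1, 1, 64, 64)).shape)        # torch.Size([1, 2, 64, 64])
```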
Z. Yang, D. Sahasrabudhe, A. Humphrey, M. Berzins.
A Preliminary Port and Evaluation of the Uintah AMT Runtime on Sunway TaihuLight, In 9th IEEE International Workshop on Parallel and Distributed Scientific and Engineering Computing (PDSEC 2018), IEEE, May, 2018.
The Sunway TaihuLight is currently the world's fastest supercomputer, with low power consumption per flop and a unique set of architectural features. Application performance depends heavily on adapting codes to make the best use of these features. Porting large codes to novel architectures such as Sunway is both time-consuming and expensive, as modifications throughout the code may be needed. One alternative to conventional porting is an approach based upon Asynchronous Many Task (AMT) runtimes such as the Uintah framework considered here. Uintah structures the problem as a series of tasks that are executed by the runtime via a task scheduler. The central challenge in porting a large AMT runtime like Uintah is thus to devise an appropriate scheduler and to write tasks that take advantage of a particular architecture. It is shown how an asynchronous Sunway-specific scheduler, based upon MPI and athread, may be written and how individual task code for a typical but model structured-grid fluid-flow problem needs to be refactored. Preliminary experiments show that it is possible to obtain a strong-scaling efficiency ranging from 31.7% to 96.1% for different problem sizes with full optimizations. The asynchronous scheduler developed here improves overall performance over a synchronous alternative by up to 22.8%, and the fluid-flow simulation reaches 1.17% of the theoretical peak of the running nodes. Conclusions are drawn for the porting of full-scale Uintah applications.
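The asynchronous scheduling idea reduces to dependency-driven dispatch: a task enters the ready queue as soon as its predecessors finish, with no global barrier between phases. A toy sketch in plain Python (Uintah's Sunway scheduler itself uses MPI and the athread library):

```python
# Dependency-driven task dispatch (Kahn-style), the core of an async scheduler.
# Task graph: each task lists the tasks it depends on.
tasks = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

indegree = {t: len(deps) for t, deps in tasks.items()}
successors = {t: [] for t in tasks}
for t, deps in tasks.items():
    for d in deps:
        successors[d].append(t)

ready = [t for t, n in indegree.items() if n == 0]   # initially runnable tasks
while ready:
    task = ready.pop()            # dispatch to a free compute element
    print("ran", task)
    for s in successors[task]:    # releasing successors needs no global barrier
        indegree[s] -= 1
        if indegree[s] == 0:
            ready.append(s)
```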