SCI Publications
2019
L. Zhou, R. Netzel, D. Weiskopf, C. R. Johnson.
Spectral Visualization Sharpening, In ACM Symposium on Applied Perception 2019, No. 18, Association for Computing Machinery, pp. 1--9. 2019.
DOI: https://doi.org/10.1145/3343036.3343133
In this paper, we propose a perceptually-guided visualization sharpening technique. We analyze the spectral behavior of an established comprehensive perceptual model to arrive at our approximated model based on an adapted weighting of the bandpass images from a Gaussian pyramid. The main benefit of this approximated model is its controllability and predictability for sharpening color-mapped visualizations. Our method can be integrated into any visualization tool as it adopts generic image-based post-processing, and it is intuitive and easy to use, as viewing distance is the only parameter. Using highly diverse datasets, we show the usefulness of our method across a wide range of typical visualizations.
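The bandpass-reweighting idea can be sketched in a few lines. The per-level weights below are arbitrary placeholders, not the perceptually derived weights from the paper, which depend on viewing distance:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen_via_bandpass_weights(img, weights=(1.0, 1.6, 1.3, 1.1)):
    """Reweight difference-of-Gaussian (bandpass) levels of an image.

    `img` is assumed to lie in [0, 1]; `weights` are placeholder per-level
    gains (all-ones weights reproduce the input exactly).
    """
    levels, current = [], img.astype(float)
    for _ in weights:
        blurred = gaussian_filter(current, sigma=1.0)
        levels.append(current - blurred)   # bandpass image at this scale
        current = blurred                  # remaining low-pass content
    out = current + sum(w * b for w, b in zip(weights, levels))
    return np.clip(out, 0.0, 1.0)
```

With all weights equal to one the levels telescope back to the input, which makes the effect of boosting an individual band easy to reason about.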
2018
O. Abdullah, L. Dai, J. Tippetts, B. Zimmerman, A. Van Hoek, S. Joshi, E. Hsu.
High resolution and high field diffusion MRI in the visual system of primates (P3.086), In Neurology, Vol. 90, No. 15 Supplement, Wolters Kluwer Health, Inc., 2018.
ISSN: 0028-3878
Objective: Establishing a primate multiscale genetic brain network linking key microstructural brain components to social behavior remains an elusive goal.
Background: Diffusion MRI, which quantifies the magnitude and anisotropy of water diffusion in brain tissues, offers an unparalleled opportunity to link the macroconnectome (resolution of ~0.5 mm) to the histology-based microconnectome at synaptic resolution.
Design/Methods: We tested the hypothesis that the simplest (and most clinically used) reconstruction technique, known as diffusion tensor imaging (DTI), will yield similar brain connectivity patterns in the visual system (from optic chiasm to visual cortex) compared to more sophisticated and accurate reconstruction methods including diffusion spectrum imaging (DSI), q-ball imaging (QBI), and generalized q-sampling imaging. We obtained high-resolution diffusion MRI data on an ex vivo brain from Macaca fascicularis: 7T MRI, 0.5 mm isotropic resolution, 515 diffusion volumes up to a b-value (i.e., diffusion sensitivity) of 40,000 s/mm², with a scan time of ~100 hrs.
Results: Tractography results show that despite the limited ability of DTI to resolve crossing fibers at the optic chiasm, DTI-based tracts mapped to the known projections of layers in the lateral geniculate nucleus and to the primary visual cortex. The other reconstruction methods were superior at resolving crossing fibers in localized regions.
Conclusions: Despite its simplifying assumptions, DTI-based fiber tractography can be used to generate accurate brain connectivity maps that conform to established neuroanatomical features in the visual system.
K. A. Aiello, S. P. Ponnapalli, O. Alter.
Mathematically universal and biologically consistent astrocytoma genotype encodes for transformation and predicts survival phenotype, In APL Bioengineering, Vol. 2, No. 3, AIP Publishing, pp. 031909. September, 2018.
DOI: 10.1063/1.5037882
DNA alterations have been observed in astrocytoma for decades. A copy-number genotype predictive of a survival phenotype was only discovered by using the generalized singular value decomposition (GSVD) formulated as a comparative spectral decomposition. Here, we use the GSVD to compare whole-genome sequencing (WGS) profiles of patient-matched astrocytoma and normal DNA. First, the GSVD uncovers a genome-wide pattern of copy-number alterations, which is bounded by patterns recently uncovered by the GSVDs of microarray-profiled patient-matched glioblastoma (GBM) and, separately, lower-grade astrocytoma and normal genomes. Like the microarray patterns, the WGS pattern is correlated with an approximately one-year median survival time. By filling in gaps in the microarray patterns, the WGS pattern reveals that this biologically consistent genotype encodes for transformation via the Notch together with the Ras and Shh pathways. Second, like the GSVDs of the microarray profiles, the GSVD of the WGS profiles separates the tumor-exclusive pattern from normal copy-number variations and experimental inconsistencies. These include the WGS technology-specific effects of guanine-cytosine content variations across the genomes that are correlated with experimental batches. Third, by identifying the biologically consistent phenotype among the WGS-profiled tumors, the GBM pattern proves to be a technology-independent predictor of survival and response to chemotherapy and radiation, statistically better than the patient's age and tumor's grade, the best other indicators, and MGMT promoter methylation and IDH1 mutation. We conclude that by using the complex structure of the data, comparative spectral decompositions underlie a mathematically universal description of the genotype-phenotype relations in cancer that other methods miss.
D. N. Anderson, B. Osting, J. Vorwerk, A. D. Dorval, C. R. Butson.
Optimized programming algorithm for cylindrical and directional deep brain stimulation electrodes, In Journal of Neural Engineering, Vol. 15, No. 2, pp. 026005. 2018.
Objective. Deep brain stimulation (DBS) is a growing treatment option for movement and psychiatric disorders. As DBS technology moves toward directional leads with increased numbers of smaller electrode contacts, trial-and-error methods of manual DBS programming are becoming too time-consuming for clinical feasibility. We propose an algorithm to automate DBS programming in near real-time for a wide range of DBS lead designs. Approach. Magnetic resonance imaging and diffusion tensor imaging are used to build finite element models that include anisotropic conductivity. The algorithm maximizes activation of target tissue and utilizes the Hessian matrix of the electric potential to approximate activation of neurons in all directions. We demonstrate our algorithm's ability in an example programming case that targets the subthalamic nucleus (STN) for the treatment of Parkinson's disease for three lead designs: the Medtronic 3389 (four cylindrical contacts), the direct STNAcute (two cylindrical contacts, six directional contacts), and the Medtronic-Sapiens lead (40 directional contacts). Main results. The optimization algorithm returns patient-specific contact configurations in near real-time (less than 10 s even for the most complex leads). When the lead was placed centrally in the target STN, the directional leads were able to activate over 50% of the region, whereas the Medtronic 3389 could activate only 40%. When the lead was placed 2 mm lateral to the target, the directional leads performed as well as they did in the central position, but the Medtronic 3389 activated only 2.9% of the STN. Significance. This DBS programming algorithm can be applied to cylindrical electrodes as well as novel directional leads that are too complex to be programmed manually.
This algorithm may reduce clinical programming time and encourage the use of directional leads, since they activate a larger volume of the target area than cylindrical electrodes in central and off-target lead placements.
D. N. Anderson, G. Duffley, J. Vorwerk, A. Dorval, C. R. Butson.
Anodic Stimulation Misunderstood: Preferential Activation of Fiber Orientations with Anodic Waveforms in Deep Brain Stimulation, In Journal of Neural Engineering, IOP Publishing, Oct, 2018.
DOI: 10.1088/1741-2552/aae590
Objective: During deep brain stimulation (DBS), it is well understood that extracellular cathodic stimulation can cause activation of passing axons. Activation can be predicted from the second derivative of the electric potential along an axon, which depends on axonal orientation with respect to the stimulation source. We hypothesize that fiber orientation influences activation thresholds and that fiber orientations can be selectively targeted with DBS waveforms. Approach: We used bioelectric field and multicompartment NEURON models to explore preferential activation based on fiber orientation during monopolar or bipolar stimulation. Preferential fiber orientation was extracted from the principal eigenvectors and eigenvalues of the Hessian matrix of the electric potential. We tested cathodic, anodic, and charge-balanced pulses to target neurons based on fiber orientation in general and clinical scenarios. Main Results: Axons passing the DBS lead have positive second derivatives around a cathode, whereas orthogonal axons have positive second derivatives around an anode, as indicated by the Hessian. Multicompartment NEURON models confirm that passing fibers are activated by cathodic stimulation, and orthogonal fibers are activated by anodic stimulation. Additionally, orthogonal axons have lower thresholds compared to passing axons. In a clinical scenario, fiber pathways associated with therapeutic benefit can be targeted with anodic stimulation at 50% lower stimulation amplitudes. Significance: Fiber orientations can be selectively targeted with simple changes to the stimulus waveform. Anodic stimulation preferentially activates orthogonal fibers, approaching or leaving the electrode, at lower thresholds for similar therapeutic benefit in DBS with decreased power consumption.
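The Hessian-based orientation analysis in the two abstracts above can be illustrated numerically. The snippet below is a toy sketch, not the papers' finite element or NEURON models: for an idealized monopolar point source in a homogeneous medium, the second directional derivative of the extracellular potential along a fiber direction is positive (depolarizing) for passing fibers near a cathode and negative for fibers oriented toward the electrode, which would instead favor anodic stimulation.

```python
import numpy as np

def potential(p, I=-1e-3, sigma=0.2):
    """Extracellular potential of a monopolar point source in an infinite
    homogeneous medium (I < 0 models a cathode; sigma in S/m)."""
    return I / (4.0 * np.pi * sigma * np.linalg.norm(p))

def hessian(f, p, h=1e-5):
    """Numerical Hessian of a scalar field at p via central differences."""
    p = np.asarray(p, float)
    n = p.size
    H = np.zeros((n, n))
    E = np.eye(n) * h
    for i in range(n):
        for j in range(n):
            H[i, j] = (f(p + E[i] + E[j]) - f(p + E[i] - E[j])
                       - f(p - E[i] + E[j]) + f(p - E[i] - E[j])) / (4 * h * h)
    return H

# Evaluate 1 mm lateral to the source; axons modeled as straight fibers.
H = hessian(potential, np.array([0.0, 1e-3, 0.0]))
passing = np.array([1.0, 0.0, 0.0])   # fiber passing tangentially
radial = np.array([0.0, 1.0, 0.0])    # fiber oriented toward the electrode
d2_passing = passing @ H @ passing    # > 0: depolarized by the cathode
d2_radial = radial @ H @ radial       # < 0: would require anodic stimulation
```

The eigenvectors and eigenvalues of `H` generalize this directional test, which is how the papers extract preferential fiber orientations.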
G.A. Ateshian, J.J. Shim, S.A. Maas, J.A. Weiss.
Finite Element Framework for Computational Fluid Dynamics in FEBio, In Journal of Biomechanical Engineering, Vol. 140, No. 2, ASME International, pp. 021001. Jan, 2018.
DOI: 10.1115/1.4038716
The mechanics of biological fluids is an important topic in biomechanics, often requiring the use of computational tools to analyze problems with realistic geometries and material properties. This study describes the formulation and implementation of a finite element framework for computational fluid dynamics (CFD) in FEBio, a free software designed to meet the computational needs of the biomechanics and biophysics communities. This formulation models nearly incompressible flow with a compressible isothermal formulation that uses a physically realistic value for the fluid bulk modulus. It employs fluid velocity and dilatation as essential variables: The virtual work integral enforces the balance of linear momentum and the kinematic constraint between fluid velocity and dilatation, while fluid density varies with dilatation as prescribed by the axiom of mass balance. Using this approach, equal-order interpolations may be used for both essential variables over each element, contrary to traditional mixed formulations that must explicitly satisfy the inf-sup condition. The formulation accommodates Newtonian and non-Newtonian viscous responses as well as inviscid fluids. The efficiency of numerical solutions is enhanced using Broyden's quasi-Newton method. The results of finite element simulations were verified using well-documented benchmark problems as well as comparisons with other free and commercial codes. These analyses demonstrated that the novel formulation introduced in FEBio could successfully reproduce the results of other codes. The analogy between this CFD formulation and standard finite element formulations for solid mechanics makes it suitable for future extension to fluid–structure interactions (FSIs).
D. Ayyagari, N. Ramesh, D. Yatsenko, T. Tasdizen, C. Atria.
Image reconstruction using priors from deep learning, In Medical Imaging 2018: Image Processing, SPIE, March, 2018.
Tomosynthesis, i.e., reconstruction of 3D volumes using projections from a limited perspective, is a classic ill-posed, under-constrained inverse problem. Data insufficiency leads to reconstruction artifacts that vary in severity depending on the particular problem, the reconstruction method, and the object being imaged. Machine learning has been used successfully in tomographic problems where data is insufficient, but the challenge with machine learning is that it introduces bias from the learning dataset. We propose a novel framework to improve the quality of the tomosynthesis reconstruction that limits the learning-dataset bias by maintaining consistency with the observed data. Convolutional neural networks (CNNs) are embedded as regularizers in the reconstruction process to introduce the expected features and characteristics of the likely imaged object. The minimization of the objective function keeps the solution consistent with the observations and limits the bias introduced by the machine learning regularizers, improving the quality of the reconstruction. The proposed method has been developed and studied in the specific problem of cone beam tomosynthesis fluoroscopy (CBT-fluoroscopy), but it is a general framework that can be applied to any image reconstruction problem that is limited by data insufficiency.
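The data-consistent regularization scheme can be sketched as follows. The learned CNN prior is replaced here by a plain Gaussian smoother so the example is self-contained, and the objective, step size, and iteration count are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def reconstruct(A, b, shape, lam=0.05, step=0.05, iters=300):
    """Gradient descent on ||A x - b||^2 + lam * ||x - D(x)||^2.

    D(x) stands in for a learned (CNN) regularizer; here it is a Gaussian
    smoother so the sketch runs without a trained network.
    """
    x = np.zeros(shape)
    for _ in range(iters):
        prior = gaussian_filter(x, sigma=1.0)            # placeholder for CNN(x)
        data_grad = ((A @ x.ravel() - b) @ A).reshape(shape)
        x -= step * (data_grad + lam * (x - prior))      # stay data-consistent
    return x
```

The data-fidelity term dominates the update, so the learned prior can only nudge the solution within the set of reconstructions that remain consistent with the observed projections, which is the bias-limiting mechanism the abstract describes.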
M. Berzins.
Nonlinear stability and time step selection for the MPM method, In Computational Particle Mechanics, Jan, 2018.
ISSN: 2196-4386
DOI: 10.1007/s40571-018-0182-y
The Material Point Method (MPM) has been developed from the Particle in Cell (PIC) method over the last 25 years and has proved its worth in solving many challenging problems involving large deformations. Nevertheless, there are many open questions regarding the theoretical properties of MPM. For example, while Fourier methods, as applied to PIC, may provide useful insight, the nonlinear nature of MPM makes it necessary to use a full nonlinear stability analysis to determine a stable time step. To begin to address this, the stability analysis of Spigler and Vianello is adapted to MPM and used to derive a stable time step bound for a model problem. This bound is contrasted against traditional speed-of-sound and CFL bounds and shown to be a realistic stability bound for a model problem.
H. Bhatia, A.G. Gyulassy, V. Lordi, J.E. Pask, V. Pascucci, P.T. Bremer.
TopoMS: Comprehensive topological exploration for molecular and condensed‐matter systems, In Journal of Computational Chemistry, Vol. 39, No. 16, Wiley, pp. 936--952. March, 2018.
DOI: 10.1002/jcc.25181
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed‐matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared‐memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command‐line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state‐of‐the‐art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero‐de‐la‐Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine‐grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.
A. Bock, E. Axelsson, C. Emmart, M. Kuznetsova, C. Hansen, A. Ynnerman.
OpenSpace: Changing the Narrative of Public Dissemination in Astronomical Visualization from What to How, In IEEE Computer Graphics and Applications, Vol. 38, No. 3, IEEE, pp. 44--57. May, 2018.
DOI: 10.1109/mcg.2018.032421653
We present the development of an open-source software called OpenSpace that bridges the gap between scientific discoveries and public dissemination and thus paves the way for the next generation of science communication and data exploration. We describe how the platform enables interactive presentations of dynamic and time-varying processes by domain experts to the general public. The concepts are demonstrated through four cases: Image acquisitions of the New Horizons and Rosetta spacecraft, the dissemination of space weather phenomena, and the display of high-resolution planetary images. Each case has been presented at public events with great success. These cases highlight the details of data acquisition, rather than presenting the final results, showing the audience the value of supporting the efforts of the scientific discovery.
B.M. Burton, K.K. Aras, W.W. Good, J.D. Tate, B. Zenger, R.S. MacLeod.
A Framework for Image-Based Modeling of Acute Myocardial Ischemia Using Intramurally Recorded Extracellular Potentials, In Annals of Biomedical Engineering, Springer Nature, May, 2018.
DOI: 10.1007/s10439-018-2048-0
The biophysical basis for electrocardiographic evaluation of myocardial ischemia stems from the notion that ischemic tissues develop, with relative uniformity, along the endocardial aspects of the heart. These injured regions of subendocardial tissue give rise to intramural currents that lead to ST segment deflections within electrocardiogram (ECG) recordings. The concept of subendocardial ischemic regions is often used in clinical practice, providing a simple and intuitive description of ischemic injury; however, such a model grossly oversimplifies the presentation of ischemic disease—inadvertently leading to errors in ECG-based diagnoses. Furthermore, recent experimental studies have brought into question the subendocardial ischemia paradigm suggesting instead a more distributed pattern of tissue injury. These findings come from experiments and so have both the impact and the limitations of measurements from living organisms. Computer models have often been employed to overcome the constraints of experimental approaches and have a robust history in cardiac simulation. To this end, we have developed a computational simulation framework aimed at elucidating the effects of ischemia on measurable cardiac potentials. To validate our framework, we simulated, visualized, and analyzed 226 experimentally derived acute myocardial ischemic events. Simulation outcomes agreed both qualitatively (feature comparison) and quantitatively (correlation, average error, and significance) with experimentally obtained epicardial measurements, particularly under conditions of elevated ischemic stress. Our simulation framework introduces a novel approach to incorporating subject-specific, geometric models and experimental results that are highly resolved in space and time into computational models. 
We propose this framework as a means to advance the understanding of the underlying mechanisms of ischemic disease while simultaneously putting in place the computational infrastructure necessary to study and improve ischemia models aimed at reducing diagnostic errors in the clinic.
B.M. Burton, K.K. Aras, W.W. Good, J.D. Tate, B. Zenger, R.S. MacLeod.
Image-Based Modeling of Acute Myocardial Ischemia Using Experimentally Derived Ischemic Zone Source Representations, In Journal of Electrocardiology, Vol. 51, No. 4, Elsevier BV, pp. 725--733. July, 2018.
DOI: 10.1016/j.jelectrocard.2018.05.005
Background
Computational models of myocardial ischemia often use oversimplified ischemic source representations to simulate epicardial potentials. The purpose of this study was to explore the influence of biophysically justified, subject-specific ischemic zone representations on epicardial potentials.
Methods
We developed and implemented an image-based simulation pipeline, using intramural recordings from a canine experimental model to define subject-specific ischemic regions within the heart. Static epicardial potential distributions, reflective of ST segment deviations, were simulated and validated against measured epicardial recordings.
Results
Simulated epicardial potential distributions showed strong statistical correlation and visual agreement with measured epicardial potentials. Additionally, we identified and described in what way border zone parameters influence epicardial potential distributions during the ST segment.
Conclusion
From image-based simulations of myocardial ischemia, we generated subject-specific ischemic sources that accurately replicated epicardial potential distributions. Such models are essential in understanding the underlying mechanisms of the bioelectric fields that arise during ischemia and are the basis for more sophisticated simulations of body surface ECGs.
M.J.M. Cluitmans, S. Ghimire, J. Dhamala, J. Coll-Font, J.D. Tate, S. Giffard-Roisin, J. Svehlikova, O. Doessel, M.S. Guillem, D.H. Brooks, R.S. Macleod, L. Wang.
P1125 Noninvasive localization of premature ventricular complexes: a research-community-based approach, In EP Europace, Vol. 20, No. Supplement, Oxford University Press, March, 2018.
DOI: 10.1093/europace/euy015.611
Background: Noninvasive localization of premature ventricular complexes (PVCs) to guide ablation therapy is one of the emerging applications of electrocardiographic imaging (ECGI). Because of its increasing clinical use, it is essential to compare the many implementations of ECGI that exist to understand the specific characteristics of each approach.
Objective: Our consortium is a community of researchers aiming to collaborate in the field of ECGI, and to objectively compare and improve methods. Here, we will compare methods to localize the origin of PVCs with ECGI.
Methods: Our consortium hosts a repository of ECGI data on its website. For the current study, participants analysed simulated electrocardiograms from premature beats, freely available on that website. These PVCs were simulated to originate from eight ventricular locations and the resulting body-surface potentials were computed. These body-surface electrocardiograms (and the torso-heart geometry) were then provided to the study participants to apply their ECGI algorithms to determine the origin of the PVCs. Participants could choose freely among four different source models, i.e., representations of the bioelectric fields reconstructed from ECGI: 1) epicardial potentials (POTepi), 2) epicardial & endocardial potentials (POTepi&endo), 3) transmembrane potentials on the endocardium and epicardium (TMPepi&endo) and 4) transmembrane potentials throughout the myocardium (TMPmyo). Participants were free to employ any software implementation of ECGI and were blinded to the ground truth data.
Results: Four research groups submitted 11 entries for this study. The figure shows the localization error between the known and reconstructed origin of each PVC for each submission, categorized per source model. Each colour represents one research group and some groups submitted results using different approaches. These results demonstrate that the variation of accuracy was larger among research groups than among the source models. Most submissions achieved an error below 2 cm, but none performed with a consistent sub-centimetre accuracy.
Conclusion: This study demonstrates a successful community-based approach to study different ECGI methods for PVC localization. The goal was not to rank research groups but to compare both source models and numerical implementations. PVC localization with these methods was not as dependent on the source representation as it was on the implementation of ECGI. Consequently, ECGI validation should not be performed on generic methods, but should be specifically performed for each lab's implementation. The novelty of this study is that it achieves this in the first open, international comparison of approaches using a common set of gold standards. Continued collaborative validation is essential to understand the effect of implementation differences, in order to reach significant improvements and arrive at clinically-relevant sub-centimetre accuracy of PVC localization.
M. Cluitmans, D. H. Brooks, R. MacLeod, O. Dössel, M. S. Guillem, P. M. van Dam, J. Svehlikova, B. He, J. Sapp, L. Wang, L. Bear.
Validation and Opportunities of Electrocardiographic Imaging: From Technical Achievements to Clinical Applications, In Frontiers in Physiology, Vol. 9, Frontiers Media SA, pp. 1305. 2018.
ISSN: 1664-042X
DOI: 10.3389/fphys.2018.01305
Electrocardiographic imaging (ECGI) reconstructs the electrical activity of the heart from a dense array of body-surface electrocardiograms and a patient-specific heart-torso geometry. Depending on how it is formulated, ECGI allows the reconstruction of the activation and recovery sequence of the heart, the origin of premature beats or tachycardia, the anchors/hotspots of re-entrant arrhythmias and other electrophysiological quantities of interest. Importantly, these quantities are directly and noninvasively reconstructed in a digitized model of the patient’s three-dimensional heart, which has led to clinical interest in ECGI’s ability to personalize diagnosis and guide therapy.
Despite considerable development over the last decades, validation of ECGI is challenging. Firstly, results depend considerably on implementation choices, which are necessary to deal with ECGI’s ill-posed character. Secondly, it is challenging to obtain (invasive) ground truth data of high quality. In this review, we discuss the current status of ECGI validation as well as the major challenges remaining for complete adoption of ECGI in clinical practice.
Specifically, showing clinical benefit is essential for the adoption of ECGI. Such benefit may lie in patient outcome improvement, workflow improvement, or cost reduction. Future studies should focus on these aspects to achieve broad adoption of ECGI, but only after the technical challenges have been solved for that specific application/pathology. We propose ‘best’ practices for technical validation and highlight collaborative efforts recently organized in this field. Continued interaction between engineers, basic scientists and physicians remains essential to find a hybrid between technical achievements, pathological mechanisms insights, and clinical benefit, to evolve this powerful technique towards a useful role in clinical practice.
A. Erdemir, P.J. Hunter, G.A. Holzapfel, L.M. Loew, J. Middleton, C.R. Jacobs, P. Nithiarasu, R. Löhner, G. Wei, B.A. Winkelstein, V.H. Barocas, F. Guilak, J.P. Ku, J.L. Hicks, S.L. Delp, M.S. Sacks, J.A. Weiss, G.A. Ateshian, S.A. Maas, A.D. McCulloch, G.C.Y. Peng.
Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research, In Journal of Biomechanical Engineering, Vol. 140, No. 2, ASME International, pp. 024701. Jan, 2018.
DOI: 10.1115/1.4038768
The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software was at first a utilitarian interest and is now a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics, provide an opportunity for repurposing and reuse, and serve as a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains also motivates sharing of modeling resources, as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as for the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing.
There is a strong will to facilitate model sharing, and there are corresponding initiatives by the scientific journals. Outside the publishing enterprise, infrastructure to facilitate model sharing in biomechanics exists, and simulation software developers are interested in accommodating the community's needs for sharing of modeling resources. Encouragement for the use of standardized markups, concerns related to quality assurance, acknowledgement of increased burden, and the importance of stewardship of resources are noted. In the short term, it is advisable that the community build upon recent strategies and experiments with new pathways for continued demonstration of model sharing, its promotion, and its utility. Nonetheless, the need for a long-term strategy to unify approaches in sharing computational models and related resources is acknowledged. Development of a sustainable platform supported by a culture of open model sharing will likely evolve through continued and inclusive discussions bringing all stakeholders to the table, e.g., by possibly establishing a consortium.
M.D. Foote, B. Zimmerman, A. Sawant, S. Joshi.
Real-Time Patient-Specific Lung Radiotherapy Targeting using Deep Learning, In 1st Conference on Medical Imaging with Deep Learning (MIDL 2018), Amsterdam, The Netherlands, 2018.
Radiation therapy presents a need for dynamic tracking of a target tumor volume. Fiducial markers such as implanted gold seeds have been used to gate radiation delivery, but the markers are invasive and gating significantly increases treatment time. Pretreatment acquisition of a 4DCT allows for the development of accurate motion estimation for treatment planning. A deep convolutional neural network and subspace motion tracking are used to recover anatomical positions from a single radiograph projection in real time. We approximate the nonlinear inverse of a diffeomorphic transformation composed with radiographic projection as a deep network that produces subspace coordinates to define the patient-specific deformation of the lungs from a baseline anatomic position. The geometric accuracy of the subspace projections on real patient data is similar to the accuracy attained by original image registration between individual respiratory-phase image volumes.
S. Guler, M. Dannhauer, B. Roig-Solvas, A. Gkogkidis, R. Macleod, T. Ball, J. G. Ojemann, D. H. Brooks.
Computationally optimized ECoG stimulation with local safety constraints, In NeuroImage, Vol. 173, Elsevier BV, pp. 35--48. June, 2018.
DOI: 10.1016/j.neuroimage.2018.01.088
Direct stimulation of the cortical surface is used clinically for cortical mapping and modulation of local activity. Future applications of cortical modulation and brain-computer interfaces may also use cortical stimulation methods. One common method to deliver current is through electrocorticography (ECoG) stimulation, in which a dense array of electrodes is placed subdurally or epidurally to stimulate the cortex. However, proximity to cortical tissue limits the amount of current that can be delivered safely. It may be desirable to deliver higher current to a specific local region of interest (ROI) while limiting current to other local areas more stringently than is guaranteed by global safety limits. Two commonly used global safety constraints bound the total injected current and individual electrode currents. However, these two sets of constraints may not be sufficient to prevent high current density locally (hot-spots). In this work, we propose an efficient approach that prevents current density hot-spots in the entire brain while optimizing ECoG stimulus patterns for targeted stimulation. Specifically, we maximize the current along a particular desired directional field in the ROI while respecting three safety constraints: one on the total injected current, one on individual electrode currents, and the third on the local current density magnitude in the brain. This third set of constraints creates a computational barrier due to the huge number of constraints needed to bound the current density at every point in the entire brain. We overcome this barrier by adopting an efficient two-step approach. In the first step, the proposed method identifies the safe brain region, which cannot contain any hot-spots, solely based on the global bounds on total injected current and individual electrode currents. In the second step, the proposed algorithm iteratively adjusts the stimulus pattern to arrive at a solution that exhibits no hot-spots in the remaining brain.
We report on simulations on a realistic finite element (FE) head model with five anatomical ROIs and two desired directional fields. We also report on the effect of ROI depth and desired directional field on the focality of the stimulation. Finally, we provide an analysis of optimization runtime as a function of different safety and modeling parameters. Our results suggest that optimized stimulus patterns tend to differ from those used in clinical practice.
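The two-step idea above can be illustrated on a toy problem. This is a minimal sketch, not the paper's FE pipeline: the transfer matrix `A`, the ROI objective vector `c`, and all bounds are hypothetical random stand-ins, current density is reduced to a scalar per node, and rather than the paper's iterative constraint adjustment this version simply imposes density constraints on every node not certified safe in step 1 and solves one linear program.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Hypothetical stand-ins for quantities a finite-element head model would provide.
n_elec, n_nodes = 16, 200
A = 0.05 * rng.normal(size=(n_nodes, n_elec))  # node current density per unit electrode current
c = rng.normal(size=n_elec)                    # maps electrode currents to ROI directional current

i_max, I_tot, d_max = 1.0, 4.0, 0.2            # per-electrode, total, and local-density bounds

# Step 1: a node is "safe" if its worst-case density under the global bounds
# alone (|i_k| <= i_max, sum|i_k| <= I_tot) already stays below d_max.
# The worst case is a fractional-knapsack bound on |a . i|.
def worst_case_density(row):
    budget, total = I_tot, 0.0
    for a in np.sort(np.abs(row))[::-1]:
        use = min(i_max, budget)
        total += a * use
        budget -= use
        if budget <= 0.0:
            break
    return total

unsafe = np.array([worst_case_density(r) > d_max for r in A])

# Step 2: linear program over i = p - m (p, m >= 0), which makes the
# total-current (L1) bound linear; density constraints only on unsafe nodes.
n = n_elec
obj = np.concatenate([-c, c])                  # maximize c . i
A_ub = [np.ones(2 * n)]                        # sum(p + m) <= I_tot
b_ub = [I_tot]
for row in A[unsafe]:
    A_ub.append(np.concatenate([row, -row]))   #  a . i <= d_max
    b_ub.append(d_max)
    A_ub.append(np.concatenate([-row, row]))   # -a . i <= d_max
    b_ub.append(d_max)
A_eq = np.concatenate([np.ones(n), -np.ones(n)])[None, :]  # zero net injected current
res = linprog(obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=A_eq, b_eq=[0.0], bounds=[(0.0, i_max)] * (2 * n))
i_opt = res.x[:n] - res.x[n:]
```

Because safe nodes are certified by the global bounds alone, the resulting pattern respects the density limit everywhere even though only the unsafe nodes appear as explicit constraints.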
L. Guo, A. Narayan, T. Zhou.
A gradient enhanced ℓ1-minimization for sparse approximation of polynomial chaos expansions, In Journal of Computational Physics, Vol. 367, Elsevier BV, pp. 49--64. Aug, 2018.
We investigate a gradient-enhanced ℓ1-minimization for constructing sparse polynomial chaos expansions. In addition to function evaluations, measurements of the function gradient are also included to accelerate the identification of expansion coefficients. By designing appropriate preconditioners for the measurement matrix, we show that gradient-enhanced ℓ1-minimization leads to stable and accurate coefficient recovery. The framework for designing preconditioners is quite general, and it applies to the recovery of functions whose domain is bounded or unbounded. Comparisons between the gradient-enhanced approach and the standard ℓ1-minimization are also presented, and numerical examples suggest that the inclusion of derivative information can guarantee sparse recovery at a reduced computational cost.
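A small sketch of the idea in one dimension, under stated simplifications: the orthonormal Legendre basis stands in for a polynomial chaos basis, value and derivative rows are stacked into one measurement matrix, simple row normalization stands in for the paper's preconditioning, and the ℓ1 problem is solved with plain ISTA rather than any solver used in the paper. The sparse coefficient vector and all sizes are illustrative.

```python
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(1)
N, m = 15, 10                         # basis size (degrees 0..14), sample points
x = rng.uniform(-1.0, 1.0, m)

# Legendre basis orthonormal w.r.t. the uniform density on [-1, 1],
# evaluated at the samples, plus its derivative.
scale = np.sqrt(2.0 * np.arange(N) + 1.0)
V = L.legvander(x, N - 1) * scale
D = np.column_stack(
    [L.legval(x, L.legder(np.eye(N)[:, k])) for k in range(N)]) * scale

c_true = np.zeros(N)
c_true[[1, 4, 9]] = [1.0, -0.5, 0.8]  # sparse ground-truth coefficients
A = np.vstack([V, D])                 # function values stacked with gradients
y = A @ c_true

# Row normalization: a crude stand-in for the paper's preconditioning.
w = 1.0 / np.linalg.norm(A, axis=1)
A, y = A * w[:, None], y * w

# ISTA (proximal gradient) for min 0.5*||A c - y||^2 + lam*||c||_1
lam = 1e-4
step = 1.0 / np.linalg.norm(A, 2) ** 2
c = np.zeros(N)
for _ in range(30000):
    c = c - step * (A.T @ (A @ c - y))                       # gradient step
    c = np.sign(c) * np.maximum(np.abs(c) - step * lam, 0.0) # soft threshold
```

With gradients included, the 10 sample points contribute 20 rows, so far fewer function evaluations are needed for the same number of measurements.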
L. Guo, A. Narayan, L. Yan, T. Zhou.
Weighted approximate Fekete points: sampling for least-squares polynomial approximation, In SIAM Journal on Scientific Computing, Vol. 40, No. 1, SIAM, pp. A366--A387. Jan, 2018.
DOI: 10.1137/17m1140960
We propose and analyze a weighted greedy scheme for computing deterministic sample configurations in multidimensional space for performing least-squares polynomial approximations on $L^2$ spaces weighted by a probability density function. Our procedure is a particular weighted version of the approximate Fekete points method, with the weight function chosen as the (inverse) Christoffel function. Our procedure has theoretical advantages: when linear systems with optimal condition number exist, the procedure finds them. In the one-dimensional setting with any density function, our greedy procedure almost always generates optimally conditioned linear systems. Our method also has practical advantages: our procedure is impartial to the compactness of the domain of approximation and uses only pivoted linear algebraic routines. We show through numerous examples that our sampling design outperforms competing randomized and deterministic designs when the domain is both low and high dimensional.
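A one-dimensional sketch of the procedure, under stated assumptions: the candidate pool, its size, and the polynomial degree are illustrative, the orthonormal Legendre basis on [-1, 1] plays the role of the weighted $L^2$ basis, and the greedy selection is realized (as in approximate Fekete point methods) by column-pivoted QR on the transposed, inverse-Christoffel-weighted design matrix.

```python
import numpy as np
from scipy.linalg import qr
from numpy.polynomial import legendre as L

rng = np.random.default_rng(2)
N, M = 10, 2000                        # polynomial space dimension, candidate pool size
cand = rng.uniform(-1.0, 1.0, M)

# Legendre basis orthonormal under the uniform density on [-1, 1].
scale = np.sqrt(2.0 * np.arange(N) + 1.0)
V = L.legvander(cand, N - 1) * scale

# Inverse-Christoffel weights: w(x)^2 = N / sum_k phi_k(x)^2.
w = np.sqrt(N / np.sum(V ** 2, axis=1))
B = w[:, None] * V

# Greedy (approximate Fekete) selection: column-pivoted QR on B^T picks,
# one candidate at a time, the point that most enlarges the design's volume.
_, _, piv = qr(B.T, pivoting=True)
sel = piv[:N]
pts = np.sort(cand[sel])               # the N selected sample points
cond = np.linalg.cond(B[sel])          # conditioning of the selected design
```

Since the N selected points are distinct, the square design is invertible, so any polynomial of degree below N is reproduced exactly by least squares on these points; the weighting keeps the selected design well conditioned.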
M. Hajij, B. Wang, C. Scheidegger, P. Rosen.
Visual Detection of Structural Changes in Time-Varying Graphs Using Persistent Homology, In 2018 IEEE Pacific Visualization Symposium (PacificVis), IEEE, April, 2018.
DOI: 10.1109/pacificvis.2018.00024
Topological data analysis is an emerging area in exploratory data analysis and data mining. Its main tool, persistent homology, has become a popular technique to study the structure of complex, high-dimensional data. In this paper, we propose a novel method using persistent homology to quantify structural changes in time-varying graphs. Specifically, we transform each instance of the time-varying graph into a metric space, extract topological features using persistent homology, and compare those features over time. We provide a visualization that assists in time-varying graph exploration and helps to identify patterns of behavior within the data. To validate our approach, we conduct several case studies on real-world datasets and show how our method can find cyclic patterns, deviations from those patterns, and one-time events in time-varying graphs. We also examine whether a persistence-based similarity measure satisfies a set of well-established, desirable properties for graph metrics.
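The pipeline (graph instance, to metric space, to persistence features, to comparison over time) can be sketched in a heavily simplified form. Assumptions not taken from the paper: only 0-dimensional persistence is computed (whose death times for a Vietoris–Rips filtration are exactly the minimum-spanning-tree edge lengths), the diagrams are compared with a crude L2 distance on sorted deaths rather than a bottleneck or Wasserstein distance, and the two-cluster example graphs are invented for illustration.

```python
import numpy as np

def shortest_paths(W):
    """All-pairs shortest paths (Floyd-Warshall); np.inf marks a missing edge."""
    D = W.copy()
    np.fill_diagonal(D, 0.0)
    for k in range(len(D)):
        D = np.minimum(D, D[:, k:k + 1] + D[k:k + 1, :])
    return D

def h0_diagram(D):
    """0-dim persistence deaths of the Vietoris-Rips filtration of a finite
    metric space: the minimum-spanning-tree edge lengths (Prim's algorithm)."""
    n = len(D)
    tree = [0]
    dist = D[0].copy()
    deaths = []
    for _ in range(n - 1):
        dist[tree] = np.inf            # ignore nodes already in the tree
        j = int(np.argmin(dist))       # cheapest attachment to the tree
        deaths.append(dist[j])
        tree.append(j)
        dist = np.minimum(dist, D[j])
    return np.sort(np.array(deaths))

def two_clusters(bridge):
    """Two unit-weight triangles (nodes 0-2 and 3-5) joined by one bridge edge."""
    W = np.full((6, 6), np.inf)
    for a, b in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
        W[a, b] = W[b, a] = 1.0
    W[2, 3] = W[3, 2] = bridge
    return W

# Two "time steps" of the graph: the bridge between clusters weakens from 1 to 5.
d0 = h0_diagram(shortest_paths(two_clusters(1.0)))
d1 = h0_diagram(shortest_paths(two_clusters(5.0)))
change = np.linalg.norm(d1 - d0)       # crude L2 proxy for diagram distance
```

The weakened bridge shows up as a single long-lived connected component (death time 5), so the diagram distance flags the structural change while the unchanged intra-cluster structure contributes nothing.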