SCI Publications

2010


M. Jolley, J. Stinstra, J. Tate, S. Pieper, R.S. Macleod, L. Chu, P. Wang, J.K. Triedman. “Finite element modeling of subcutaneous implantable defibrillator electrodes in an adult torso,” In Heart Rhythm, Vol. 7, No. 5, pp. 692--698. May, 2010.
DOI: 10.1016/j.hrthm.2010.01.030
PubMed ID: 20230927
PubMed Central ID: PMC3103844

ABSTRACT

BACKGROUND:
Totally subcutaneous implantable defibrillators are in development, but optimal electrode configurations are not known.

OBJECTIVE:
We used image-based finite element models (FEM) to predict the myocardial electric field generated during defibrillation shocks (pseudo-DFT) in a wide variety of reported and innovative subcutaneous electrode positions to determine factors affecting optimal lead positions for subcutaneous implantable cardioverter-defibrillators (S-ICD).

METHODS:
An image-based FEM of an adult man was used to predict pseudo-DFTs across a wide range of technically feasible S-ICD electrode placements. Generator location, lead location, length, geometry and orientation, and spatial relation of electrodes to ventricular mass were systematically varied. Best electrode configurations were determined, and spatial factors contributing to low pseudo-DFTs were identified using regression and general linear models.

RESULTS:
A total of 122 single-electrode/array configurations and 28 dual-electrode configurations were simulated. Pseudo-DFTs for single-electrode orientations ranged from 0.60 to 16.0 (mean 2.65 ± 2.48) times that predicted for the base case, an anterior-posterior configuration recently tested clinically. A total of 32 of 150 tested configurations (21%) had pseudo-DFT ratios ≤1, indicating the possibility of multiple novel, efficient, and clinically relevant orientations. Favorable alignment of the lead-generator vector with the ventricular myocardium and increased lead length were the most important factors correlated with pseudo-DFT, accounting for 70% of the predicted variation (R² = 0.70, each factor P < .05) in a combined general linear model in which parameter estimates were calculated for each factor.

CONCLUSION:
Further exploration of novel and efficient electrode configurations may be of value in the development of S-ICD technologies and implant procedures. FEM modeling suggests that choosing configurations that maximize shock vector alignment with the center of myocardial mass and using longer leads are more likely to result in lower DFTs.



E. Jurrus, A.R.C. Paiva, S. Watanabe, J.R. Anderson, B.W. Jones, R.T. Whitaker, E.M. Jorgensen, R.E. Marc, T. Tasdizen. “Detection of Neuron Membranes in Electron Microscopy Images Using a Serial Neural Network Architecture,” In Medical Image Analysis, Vol. 14, No. 6, pp. 770--783. 2010.
DOI: 10.1016/j.media.2010.06.002
PubMed ID: 20598935



J. Krüger. “A New Sampling Scheme for Slice Based Volume Rendering,” In Proceedings of IEEE/EG International Symposium on Volume Graphics (2010), pp. 1--4. 2010.
DOI: 10.2312/VG/VG10/001-004

ABSTRACT

In this paper we present a novel approach to generate proxy geometry for slice based volume rendering. The basic idea is derived from the behavior of a ray-caster and is a simple extension of the well known 2D object-aligned texture stack based technique. From this our novel scheme inherits the advantage that it enables hardware-based volume rendering for devices that do not support 3D textures. On these devices previous object-aligned 2D texture based approaches suffered from disturbing view angle dependent stack-switching artifacts which are avoided by our novel method. Our approach also shows benefits compared to the widely used view aligned slicing algorithm as it avoids jagged boundary artifacts and increases performance.
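For readers unfamiliar with the baseline the paper extends, the following is a minimal NumPy sketch of the well-known object-aligned 2D texture-stack technique: pick the volume axis most aligned with the view direction, then composite the slices back to front. The constant per-slice opacity, array names, and blending details are illustrative assumptions, not the paper's sampling scheme.

```python
import numpy as np

def composite_object_aligned(volume, view_dir, alpha=0.05):
    """Back-to-front composite an object-aligned slice stack.

    volume   : 3D scalar array with intensities in [0, 1]
    view_dir : 3-vector; the stack axis is the one most aligned with it
    alpha    : constant per-slice opacity (toy transfer function)
    """
    axis = int(np.argmax(np.abs(view_dir)))      # dominant view axis
    stack = np.moveaxis(volume, axis, 0)         # slices along that axis
    if view_dir[axis] > 0:                       # order slices back to front
        stack = stack[::-1]

    image = np.zeros(stack.shape[1:])
    for slice_ in stack:                         # classic 2D-texture slicing
        image = (1.0 - alpha) * image + alpha * slice_
    return image

# Example: a random volume viewed roughly along +z
vol = np.random.rand(64, 64, 64)
img = composite_object_aligned(vol, view_dir=np.array([0.2, 0.1, 0.97]))
```

Switching the dominant axis as the camera rotates is what causes the stack-switching artifacts mentioned above; the paper's contribution is a sampling scheme that avoids them while keeping the 2D-texture-only hardware requirement.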



S. Kumar, V. Vishwanath, P. Carns, V. Pascucci, R. Latham, T. Peterka, M. Papka, R. Ross. “Towards Efficient Access of Multi-dimensional, Multi-resolution Scientific Data,” In Proceedings of the 5th Petascale Data Storage Workshop, Supercomputing 2010, pp. (in press). 2010.



S.S. Kuppahally, N. Akoum, N.S. Burgon, T.J. Badger, E.G. Kholmovski, S. Vijayakumar, S.N. Rao, J. Blauer, E.N. Fish, E.V. Dibella, R.S. Macleod, C. McGann, S.E. Litwin, N.F. Marrouche. “Left atrial strain and strain rate in patients with paroxysmal and persistent atrial fibrillation: relationship to left atrial structural remodeling detected by delayed-enhancement MRI,” In Circ Cardiovasc Imaging, Vol. 3, No. 3, pp. 231--239. 2010.
PubMed ID: 20133512



S.S. Kuppahally, N. Akoum, T.J. Badger, N.S. Burgon, T. Haslam, E. Kholmovski, R.S. Macleod, C. McGann, N.F. Marrouche. “Echocardiographic left atrial reverse remodeling after catheter ablation of atrial fibrillation is predicted by preablation delayed enhancement of left atrium by magnetic resonance imaging,” In American Heart Journal, Vol. 160, No. 5, pp. 877--884. 2010.
DOI: 10.1016/j.ahj.2010.07.003
PubMed ID: 21095275
PubMed Central ID: PMC2995281

ABSTRACT

BACKGROUND:
Atrial fibrosis is a hallmark of atrial structural remodeling (SRM) and leads to structural and functional impairment of the left atrium (LA) and persistence of atrial fibrillation (AF). This study was conducted to assess LA reverse remodeling after catheter ablation of AF in mild and moderate to severe LA SRM.

METHODS:
Catheter ablation was performed in 68 patients (age 62 ± 14 years, 68% males) with paroxysmal (n = 26) and persistent (n = 42) AF. The patients were divided into group 1 with mild LA SRM (<10%, n = 31) and group 2 with moderate to severe LA SRM (>10%, n = 37) by delayed-enhancement magnetic resonance imaging (DEMRI). Two-dimensional echocardiography, LA strain, and strain rate during left ventricular systole by velocity vector imaging were performed preablation and at 6 ± 3 months postablation. The long-term outcome was monitored for 12 months.

RESULTS:
Patients in group 1 were younger (57 ± 15 vs 66 ± 13 years, P = .009) with a male predominance (80% vs 57%, P < .05) as compared to group 2. Postablation, group 1 had a significant increase in average LA strain (Δ: 14% vs 4%, P < .05) and strain rate (Δ: 0.5 vs 0.1 cm/s, P < .05) as compared to group 2. There was a trend toward more patients with persistent AF in group 2 (68% vs 55%, P = .2), but it was not statistically significant. Group 2 had more AF recurrences (41% vs 16%, P = .02) at 12 months after ablation.

CONCLUSION:
Mild preablation LA SRM by DEMRI predicts favorable LA structural and functional reverse remodeling and long-term success after catheter ablation of AF, irrespective of the paroxysmal or persistent nature of AF.



N. Lange, M.B. Dubray, J.E. Lee, M.P. Froimowitz, A. Froehlich, N. Adluru, B. Wright, C. Ravichandran, P.T. Fletcher, E.D. Bigler, A.L. Alexander, J.E. Lainhart. “Atypical diffusion tensor hemispheric asymmetry in autism,” In Autism Research, Vol. 3, No. 6, pp. 350--358. 2010.
DOI: 10.1002/aur.162
PubMed ID: 21182212



J.A. Levine, D.J. Swenson, Z. Fu, R.S. MacLeod, R.T. Whitaker. “A Comparison of Delaunay Based Meshing Algorithms for Electrophysiological Cardiac Simulations,” In Virtual Physiological Human, pp. 181--183. 2010.



A. Lex, M. Streit, E. Kruijff, D. Schmalstieg. “Caleydo: Design and Evaluation of a Visual Analysis Framework for Gene Expression Data in its Biological Context,” In Proceedings of the IEEE Symposium on Pacific Visualization (PacificVis '10), pp. 57--64. 2010.
ISBN: 978-1-4244-6685-6
DOI: 10.1109/PACIFICVIS.2010.5429609

ABSTRACT

The goal of our work is to support experts in the process of hypotheses generation concerning the roles of genes in diseases. For a deeper understanding of the complex interdependencies between genes, it is important to bring gene expressions (measurements) into context with pathways. Pathways, which are models of biological processes, are available in online databases. In these databases, large networks are decomposed into small sub-graphs for better manageability. This simplification results in a loss of context, as pathways are interconnected and genes can occur in multiple instances scattered over the network. Our main goal is therefore to present all relevant information, i.e., gene expressions, the relations between expression and pathways and between multiple pathways in a simple, yet effective way. To achieve this we employ two different multiple-view approaches. Traditional multiple views are used for large datasets or highly interactive visualizations, while a 2.5D technique is employed to support a seamless navigation of multiple pathways which simultaneously links to the expression of the contained genes. This approach facilitates the understanding of the interconnection of pathways, and enables a non-distracting relation to gene expression data. We evaluated Caleydo with a group of users from the life science community. Users were asked to perform three tasks: pathway exploration, gene expression analysis and information comparison with and without visual links, which had to be conducted in four different conditions. Evaluation results show that the system can improve the process of understanding the complex network of pathways and the individual effects of gene expression regulation considerably. Especially the quality of the available contextual information and the spatial organization was rated good for the presented 2.5D setup.



A. Lex, M. Streit, C. Partl, K. Kashofer, D. Schmalstieg. “Comparative Analysis of Multidimensional, Quantitative Data,” In IEEE Transactions on Visualization and Computer Graphics, Vol. 16, No. 6, pp. 1027--1035. 2010.

ABSTRACT

When analyzing multidimensional, quantitative data, the comparison of two or more groups of dimensions is a common task. Typical sources of such data are experiments in biology, physics or engineering, which are conducted in different configurations and use replicates to ensure statistically significant results.  One common way to analyze this data is to filter it using statistical methods and then run clustering algorithms to group similar values. The clustering results can be visualized using heat maps, which show differences between groups as changes in color. However, in cases where groups of dimensions have an a priori meaning, it is not desirable to cluster all dimensions combined, since a clustering algorithm can fragment continuous blocks of records.  Furthermore, identifying relevant elements in heat maps becomes  more difficult as the number of dimensions increases. To aid in such situations, we have developed Matchmaker, a visualization technique that allows researchers to arbitrarily  arrange and  compare multiple groups of dimensions at the same time.  We create separate groups of dimensions which can be clustered individually, and place them in an arrangement of heat maps reminiscent of parallel coordinates. To identify relations, we render bundled curves and ribbons between related records in different groups. We then allow interactive drill-downs using enlarged detail views of the data, which enable in-depth comparisons of clusters between groups. To reduce visual clutter, we minimize crossings between the views. This paper concludes with two case studies. The first demonstrates the value of our technique for the comparison of clustering algorithms. In the second, biologists use our system to investigate why certain strains of mice develop liver disease while others remain healthy, informally showing the efficacy of our system  when analyzing multidimensional data containing distinct groups of dimensions.



J. Li, D. Xiu. “Evaluation of Failure Probability via Surrogate Models,” In Journal of Computational Physics, Vol. 229, No. 23, pp. 8966--8980. 2010.
DOI: 10.1016/j.jcp.2010.08.022

ABSTRACT

Evaluation of failure probability of a given system requires sampling of the system response and can be computationally expensive. Therefore it is desirable to construct an accurate surrogate model for the system response and subsequently to sample the surrogate model. In this paper we discuss the properties of this approach. We demonstrate that the straightforward sampling of a surrogate model can lead to erroneous results, no matter how accurate the surrogate model is. We then propose a hybrid approach by sampling both the surrogate model in a “large” portion of the probability space and the original system in a “small” portion. The resulting algorithm is significantly more efficient than the traditional sampling method, and is more accurate and robust than the straightforward surrogate model approach. Rigorous convergence proof is established for the hybrid approach, and practical implementation is discussed. Numerical examples are provided to verify the theoretical findings and demonstrate the efficiency gain of the approach.

Keywords: Failure probability, Sampling, Polynomial chaos, Stochastic computation
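To make the hybrid idea concrete, here is a hedged NumPy sketch in which the surrogate is sampled everywhere and the original model is re-evaluated only for samples whose surrogate response falls near the failure threshold (the "small" portion of the probability space). The margin parameter gamma, the toy limit state, and the deliberately biased surrogate are assumptions for illustration, not the formulation in the paper.

```python
import numpy as np

def hybrid_failure_probability(model, surrogate, samples, threshold, gamma):
    """Estimate P[model(x) > threshold] with surrogate-assisted Monte Carlo.

    The surrogate is trusted away from the failure boundary; samples whose
    surrogate response lies within +/- gamma of the threshold are
    re-evaluated with the original (expensive) model.
    """
    g_surr = np.array([surrogate(x) for x in samples])
    near_boundary = np.abs(g_surr - threshold) < gamma      # "small" portion
    g = g_surr.copy()
    g[near_boundary] = [model(x) for x in samples[near_boundary]]
    return np.mean(g > threshold), int(near_boundary.sum())

# Toy example: limit state g(x) = x1 + x2, surrogate with a small bias
rng = np.random.default_rng(0)
samples = rng.standard_normal((100_000, 2))
model = lambda x: x[0] + x[1]
surrogate = lambda x: x[0] + x[1] + 0.05     # hypothetical biased surrogate
p_fail, n_exact = hybrid_failure_probability(model, surrogate, samples,
                                             threshold=3.0, gamma=0.2)
print(p_fail, n_exact)
```

The sketch illustrates the paper's observation: sampling the biased surrogate alone would shift the estimated failure probability no matter how many samples are drawn, while the hybrid correction only needs exact evaluations for the small fraction of samples near the threshold.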



W. Liu, P. Zhu, J.S. Anderson, D. Yurgelun-Todd, P.T. Fletcher. “Spatial Regularization of Functional Connectivity Using High-Dimensional Markov Random Fields,” In Medical Image Computing and Computer-Assisted Intervention (MICCAI 2010), Vol. 14, pp. 363--370. 2010.
PubMed ID: 20879336



Z. Liu, C. Goodlett, G. Gerig, M. Styner. “Evaluation of DTI Property Maps as Basis of DTI Atlas Building,” In SPIE Medical Imaging, Vol. 7623, 762325, February, 2010.
DOI: 10.1117/12.844911



Z. Liu, Y. Wang, G. Gerig, S. Gouttard, R. Tao, T. Fletcher, M.A. Styner. “Quality control of diffusion weighted images,” In SPIE Medical Imaging, Vol. 7628, 76280J, February, 2010.
DOI: 10.1117/12.844748



Y. Livnat, P. Gesteland, J. Benuzillo, W. Pettey, D. Bolton, F. Drews, H. Kramer, M. Samare. “A Novel Workbench for Epidemic Investigation and Analysis of Search Strategies in Public Health Practice,” In Proceedings of AMIA 2010 Annual Symposium, pp. 647--651. 2010.



M.A.S. Lizier, M.F. Siqueira, J.D. Daniels II, C.T. Silva, L.G. Nonato. “Template-based Remeshing for Image Decomposition,” In Proceedings of the 23rd SIBGRAPI Conference on Graphics, Patterns and Images, pp. 95--102. 2010.



J. Luitjens, M. Berzins. “Improving the Performance of Uintah: A Large-Scale Adaptive Meshing Computational Framework,” In Proceedings of the 24th IEEE International Parallel and Distributed Processing Symposium (IPDPS10), Atlanta, GA, pp. 1--10. 2010.
DOI: 10.1109/IPDPS.2010.5470437

ABSTRACT

Uintah is a highly parallel and adaptive multi-physics framework created by the Center for Simulation of Accidental Fires and Explosions in Utah. Uintah, which is built upon the Common Component Architecture, has facilitated the simulation of a wide variety of fluid-structure interaction problems using both adaptive structured meshes for the fluid and particles to model solids. Uintah was originally designed for, and has performed well on, about a thousand processors. The evolution of Uintah to use tens of thousands of processors has required improvements in memory usage, data structure design, load balancing algorithms, and cost estimation in order to improve strong and weak scalability up to 98,304 cores for situations in which the mesh used varies adaptively and also cases in which particles that represent the solids move from mesh cell to mesh cell.

Keywords: csafe, c-safe, scirun, uintah, fires, explosions, simulation



J. Luitjens, J. Guilkey, T. Harman, B. Worthen, S.G. Parker. “Adaptive Computations in the Uintah Framework,” In Advanced Computational Infrastructures for Parallel/Distributed Adaptive Applications, Ch. 1, Wiley Press, 2010.



J. Luitjens. “The Scalability of Parallel Adaptive Mesh Refinement Within Uintah,” Ph.D. dissertation, School of Computing, University of Utah, 2010.

ABSTRACT

Solutions to Partial Differential Equations (PDEs) are often computed by discretizing the domain into a collection of computational elements referred to as a mesh. This solution is an approximation with an error that decreases as the mesh spacing decreases. However, decreasing the mesh spacing also increases the computational requirements. Adaptive mesh refinement (AMR) attempts to reduce the error while limiting the increase in computational requirements by refining the mesh locally in regions of the domain that have large error while maintaining a coarse mesh in other portions of the domain. This approach often provides a solution that is as accurate as that obtained from a much larger fixed mesh simulation, thus saving on both computational time and memory. However, these AMR operations have historically limited the overall scalability of the application.

Adapting the mesh at runtime necessitates scalable regridding and load balancing algorithms. This dissertation analyzes the performance bottlenecks for a widely used regridding algorithm and presents two new algorithms which exhibit ideal scalability. In addition, a scalable space-filling curve generation algorithm for dynamic load balancing is also presented. The performance of these algorithms is analyzed by determining their theoretical complexity, deriving performance models, and comparing the observed performance to those performance models. The models are then used to predict performance on larger numbers of processors. This analysis demonstrates the necessity of these algorithms at larger numbers of processors. This dissertation also investigates methods to more accurately predict workloads based on measurements taken at runtime. While the methods used are not new, the application of these methods to the load balancing process is. These methods are shown to be highly accurate and able to predict the workload within 3% error. By improving the accuracy of these estimations, the load imbalance of the simulation can be reduced, thereby increasing the overall performance.
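As a rough illustration of space-filling-curve-based load balancing of the kind mentioned above, the hedged sketch below orders 2D patches by their Morton (Z-order) index and then cuts the ordered list into contiguous, roughly equal-cost chunks. The bit width, the cost model, and the greedy splitting rule are illustrative assumptions, not Uintah's actual algorithm.

```python
def morton2d(i, j, bits=16):
    """Interleave the bits of (i, j) to obtain a Z-order (Morton) index."""
    code = 0
    for b in range(bits):
        code |= ((i >> b) & 1) << (2 * b) | ((j >> b) & 1) << (2 * b + 1)
    return code

def partition_patches(patches, costs, nprocs):
    """Assign patches to processors in Morton order, balancing estimated cost.

    patches : list of (i, j) integer patch coordinates
    costs   : estimated workload per patch (same order as patches)
    nprocs  : number of processors
    """
    order = sorted(range(len(patches)), key=lambda k: morton2d(*patches[k]))
    target = sum(costs) / nprocs
    assignment, proc, acc = {}, 0, 0.0
    for k in order:                 # greedy cut of the 1D curve into chunks
        assignment[patches[k]] = proc
        acc += costs[k]
        if acc >= target and proc < nprocs - 1:
            proc, acc = proc + 1, 0.0
    return assignment

# Example: a 4x4 grid of patches, uniform cost, 4 processors
patches = [(i, j) for i in range(4) for j in range(4)]
print(partition_patches(patches, costs=[1.0] * 16, nprocs=4))
```

Because the curve preserves spatial locality, neighboring patches tend to land on the same processor, which keeps communication low while the per-patch cost estimates (the focus of the workload-prediction work described above) drive how the curve is cut.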

Finally, the scalability of AMR simulations as a whole using these algorithms is tested within the Uintah computational framework. Scalability tests are performed using up to 98,304 processors and nearly ideal scalability is demonstrated.



C. Mahnkopf, T.J. Badger, N.S. Burgon, M. Daccarett, T.S. Haslam, C.T. Badger, C.J. McGann, N. Akoum, E. Kholmovski, R.S. Macleod, N.F. Marrouche. “Evaluation of the left atrial substrate in patients with lone atrial fibrillation using delayed-enhanced MRI: implications for disease progression and response to catheter ablation,” In Heart Rhythm, Vol. 7, No. 10, pp. 1475--1481. 2010.
PubMed ID: 20601148