SCI Publications
2013
S. Gerber, O. Ruebel, P.-T. Bremer, V. Pascucci, R.T. Whitaker.
Morse-Smale Regression, In Journal of Computational and Graphical Statistics, Vol. 22, No. 1, pp. 193--214. 2013.
DOI: 10.1080/10618600.2012.657132
J.M. Gililland, L.A. Anderson, H.B. Henninger, E.N. Kubiak, C.L. Peters.
Biomechanical analysis of acetabular revision constructs: is pelvic discontinuity best treated with bicolumnar or traditional unicolumnar fixation?, In Journal of Arthroplasty, Vol. 28, No. 1, pp. 178--186. 2013.
DOI: 10.1016/j.arth.2012.04.031
Pelvic discontinuity in revision total hip arthroplasty presents problems with component fixation and union. A construct was proposed based on bicolumnar fixation for transverse acetabular fractures. Each of 3 reconstructions was performed on 6 composite hemipelvises: (1) a cup-cage construct, (2) a posterior column plate construct, and (3) a bicolumnar construct (no. 2 plus an antegrade 4.5-mm anterior column screw). Bone-cup interface motions were measured while cyclical loads were applied in both walking and descending-stair simulations. The bicolumnar construct was the most stable of the three. The descending-stair mode yielded more significant differences between constructs. The bicolumnar construct provided improved component stability. Placing an antegrade anterior column screw through a posterior approach is a novel method of providing anterior column support in this setting.
S. Gratzl, A. Lex, N. Gehlenborg, H. Pfister, M. Streit.
LineUp: Visual Analysis of Multi-Attribute Rankings, In IEEE Transactions on Visualization and Computer Graphics (InfoVis '13), Vol. 19, No. 12, pp. 2277--2286. 2013.
ISSN: 1077-2626
DOI: 10.1109/TVCG.2013.173
Rankings are a popular and universal approach to structure otherwise unorganized collections of items by computing a rank for each item based on the value of one or more of its attributes. This allows us, for example, to prioritize tasks or to evaluate the performance of products relative to each other. While the visualization of a ranking itself is straightforward, its interpretation is not because the rank of an item represents only a summary of a potentially complicated relationship between its attributes and those of the other items. It is also common that alternative rankings exist that need to be compared and analyzed to gain insight into how multiple heterogeneous attributes affect the rankings. Advanced visual exploration tools are needed to make this process efficient.
In this paper we present a comprehensive analysis of requirements for the visualization of multi-attribute rankings. Based on these considerations, we propose a novel and scalable visualization technique - LineUp - that uses bar charts. This interactive technique supports the ranking of items based on multiple heterogeneous attributes with different scales and semantics. It enables users to interactively combine attributes and flexibly refine parameters to explore the effect of changes in the attribute combination. This process can be employed to derive actionable insights into which attributes of an item need to be modified in order for its rank to change.
Additionally, through integration of slope graphs, LineUp can also be used to compare multiple alternative rankings on the same set of items, for example, over time or across different attribute combinations. We evaluate the effectiveness of the proposed multi-attribute visualization technique in a qualitative study. The study shows that users are able to successfully solve complex ranking tasks in a short period of time.
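The core scoring step behind such multi-attribute rankings can be illustrated with a short sketch: normalize each heterogeneous attribute to a common scale, combine the normalized values with user-adjustable weights, and rank by the combined score. The sketch below is illustrative only; the attribute names, weights, and functions are hypothetical and not the LineUp/Caleydo API.

```python
# Minimal sketch (not the LineUp/Caleydo API) of the scoring idea behind
# multi-attribute rankings: normalize heterogeneous attributes to [0, 1],
# combine them with user-chosen weights, and rank by the combined score.
# Attribute names and weights below are illustrative only.

def normalize(values, invert=False):
    """Map raw attribute values to [0, 1]; invert for 'lower is better'."""
    lo, hi = min(values), max(values)
    span = hi - lo or 1.0
    scaled = [(v - lo) / span for v in values]
    return [1.0 - s for s in scaled] if invert else scaled

def rank(items, weights):
    """items: {name: {attr: value}}, weights: {attr: (weight, invert)}."""
    names = list(items)
    scores = {n: 0.0 for n in names}
    for attr, (w, invert) in weights.items():
        norm = normalize([items[n][attr] for n in names], invert)
        for n, s in zip(names, norm):
            scores[n] += w * s
    return sorted(names, key=scores.get, reverse=True)

universities = {
    "A": {"citations": 92.0, "tuition": 45000.0},
    "B": {"citations": 78.0, "tuition": 12000.0},
    "C": {"citations": 85.0, "tuition": 30000.0},
}
# Re-running rank() with different weights mimics the interactive
# attribute-combination refinement described in the paper.
print(rank(universities, {"citations": (0.7, False), "tuition": (0.3, True)}))
```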
A. Grosset, M. Schott, G.-P. Bonneau, C.D. Hansen.
Evaluation of Depth of Field for Depth Perception in DVR, In Proceedings of the 2013 IEEE Pacific Visualization Symposium (PacificVis), pp. 81--88. 2013.
L.K. Ha, J. King, Z. Fu, R.M. Kirby.
A High-Performance Multi-Element Processing Framework on GPUs, SCI Technical Report, No. UUSCI-2013-005, SCI Institute, University of Utah, 2013.
Many computational engineering problems, ranging from finite element methods to image processing, involve batch processing of a large number of data items. While multi-element processing has the potential to harness the computational power of parallel systems, current techniques often concentrate on maximizing elemental performance. Frameworks that take this greedy optimization approach often fail to extract the maximum processing power of the system for multi-element processing problems. By utilizing the knowledge that the same operation will be performed on a large number of items, we can organize the computation to maximize the computational throughput available in parallel streaming hardware. In this paper, we analyze weaknesses of existing methods and propose efficient parallel programming patterns, implemented in a high-performance multi-element processing framework, to harness the processing power of GPUs. Our approach is capable of leveling out the performance curve even for small element sizes.
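As a language-level illustration of the throughput idea described above (apply one operation across a large batch of small elements rather than tuning each element in isolation), the sketch below contrasts element-at-a-time processing with a single batched call. It is not the paper's GPU framework, and the array shapes are hypothetical.

```python
# Illustrative sketch only (not the paper's framework): the throughput idea of
# applying one operation across a large batch of small elements at once,
# rather than optimizing each element in isolation. Batched 3x3 matrix-vector
# products stand in for per-element finite element or image-processing work.
import numpy as np

rng = np.random.default_rng(0)
n_elements = 10_000
A = rng.standard_normal((n_elements, 3, 3))   # one small operator per element
x = rng.standard_normal((n_elements, 3))      # one small input per element

# Element-at-a-time processing: each iteration is tiny, so fixed per-call
# overhead dominates and parallel hardware is underutilized.
y_loop = np.stack([A[i] @ x[i] for i in range(n_elements)])

# Batched processing: one call expresses the whole multi-element workload,
# letting the backend schedule it for throughput.
y_batched = np.einsum("eij,ej->ei", A, x)

assert np.allclose(y_loop, y_batched)
```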
M. Hall, R.M. Kirby, F. Li, M.D. Meyer, V. Pascucci, J.M. Phillips, R. Ricci, J. Van der Merwe, S. Venkatasubramanian.
Rethinking Abstractions for Big Data: Why, Where, How, and What, In Cornell University Library, 2013.
Big data refers to large and complex data sets that, under existing approaches, exceed the capacity and capability of current compute platforms, systems software, analytical tools and human understanding [7]. Numerous lessons on the scalability of big data can already be found in asymptotic analysis of algorithms and from the high-performance computing (HPC) and applications communities. However, scale is only one aspect of current big data trends; fundamentally, current and emerging problems in big data are a result of unprecedented complexity: in the structure of the data and how to analyze it, in dealing with unreliability and redundancy, in addressing the human factors of comprehending complex data sets, in formulating meaningful analyses, and in managing the dense, power-hungry data centers that house big data.
The computer science solution to complexity is finding the right abstractions, those that hide as much triviality as possible while revealing the essence of the problem that is being addressed. The "big data challenge" has disrupted computer science by stressing to the very limits the familiar abstractions which define the relevant subfields in data analysis, data management and the underlying parallel systems. Efficient processing of big data has shifted systems towards increasingly heterogeneous and specialized units, with resilience and energy becoming important considerations. The design and analysis of algorithms must now incorporate emerging costs in communicating data driven by IO costs, distributed data, and the growing energy cost of these operations. Data analysis representations as structural patterns and visualizations surpass human visual bandwidth, structures studied at small scale are rare at large scale, and large-scale high-dimensional phenomena cannot be reproduced at small scale.
As a result, not enough of these challenges are revealed by isolating abstractions in a traditional software stack or standard algorithmic and analytical techniques, and attempts to address complexity either oversimplify or require low-level management of details. The authors believe that the abstractions for big data need to be rethought, and this reorganization needs to evolve and be sustained through continued cross-disciplinary collaboration.
In what follows, we first consider the question of why big data and why now. We then describe the where (big data systems), the how (big data algorithms), and the what (big data analytics) challenges that we believe are central and must be addressed as the research community develops these new abstractions. We equate the biggest challenges that span these areas of big data with big mythological creatures, namely cyclops, that should be conquered.
M. Hall, J.C. Beckvermit, C.A. Wight, T. Harman, M. Berzins.
The influence of an applied heat flux on the violence of reaction of an explosive device, In Proceedings of the Conference on Extreme Science and Engineering Discovery Environment: Gateway to Discovery, San Diego, California, XSEDE '13, pp. 11:1--11:8. 2013.
ISBN: 978-1-4503-2170-9
DOI: 10.1145/2484762.2484786
Keywords: DDT, cook-off, deflagration, detonation, violence of reaction, c-safe
D.K. Hammond, Y. Gur, C.R. Johnson.
Graph Diffusion Distance: A Difference Measure for Weighted Graphs Based on the Graph Laplacian Exponential Kernel, In Proceedings of the IEEE global conference on information and signal processing (GlobalSIP'13), Austin, Texas, pp. 419--422. 2013.
DOI: 10.1109/GlobalSIP.2013.6736904
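For context, a graph diffusion distance built on the Laplacian exponential kernel can be sketched as below. The specific definition used here (Frobenius norm between the heat kernels of the two graphs, maximized over a grid of diffusion times) follows the commonly cited formulation and should be checked against the paper before use.

```python
# Sketch of a graph diffusion distance built on the Laplacian exponential
# kernel, in the spirit of the cited paper; the exact definition (Frobenius
# norm between heat kernels, maximized over diffusion time t) follows the
# commonly cited formulation and is not copied from the paper.
import numpy as np
from scipy.linalg import expm

def laplacian(W):
    """Graph Laplacian L = D - W for a symmetric weighted adjacency matrix."""
    return np.diag(W.sum(axis=1)) - W

def diffusion_distance(W1, W2, ts=np.linspace(0.1, 5.0, 50)):
    """max_t || exp(-t L1) - exp(-t L2) ||_F over a grid of diffusion times."""
    L1, L2 = laplacian(W1), laplacian(W2)
    return max(np.linalg.norm(expm(-t * L1) - expm(-t * L2), "fro") for t in ts)

# Two small weighted graphs on the same vertex set (illustrative values).
W_a = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
W_b = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
print(diffusion_distance(W_a, W_b))
```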
X. Hao, P.T. Fletcher.
Joint Fractional Segmentation and Multi-Tensor Estimation in Diffusion MRI, In Proceedings of the International Conference on Information Processing in Medical Imaging (IPMI), Lecture Notes in Computer Science (LNCS), pp. (accepted). 2013.
In this paper we present a novel Bayesian approach for fractional segmentation of white matter tracts and simultaneous estimation of a multi-tensor diffusion model. Our model consists of several white matter tracts, each with a corresponding weight and tensor compartment in each voxel. By incorporating a prior that assumes the tensor fields inside each tract are spatially correlated, we are able to reliably estimate multiple tensor compartments in fiber crossing regions, even with low angular resolution diffusion-weighted imaging (DWI). Our model distinguishes the diffusion compartment associated with each tract, which reduces the effects of partial voluming and achieves more reliable statistics of diffusion measurements. We test our method on synthetic data with known ground truth and show that we can recover the correct volume fractions and tensor compartments. We also demonstrate that the proposed method results in improved segmentation and diffusion measurement statistics on real data in the presence of crossing tracts and partial voluming.
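For reference, multi-tensor approaches of this kind build on a per-voxel signal model of the following standard form; the notation is generic and not necessarily the paper's.

```latex
% Standard multi-compartment (multi-tensor) diffusion signal model that
% approaches of this kind build on; notation is generic, not necessarily the
% paper's. S_0 is the non-diffusion-weighted signal, b the b-value, g the
% gradient direction, and w_k, D_k the per-compartment weight and tensor.
\[
  S(g) \;=\; S_0 \sum_{k=1}^{K} w_k \,
  \exp\!\bigl(-b\, g^{\top} D_k\, g\bigr),
  \qquad w_k \ge 0, \quad \sum_{k=1}^{K} w_k = 1 .
\]
```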
M.D. Harris, S.P. Reese, C.L. Peters, J.A. Weiss, A.E. Anderson.
Three-dimensional Quantification of Femoral Head Shape in Controls and Patients with Cam-type Femoroacetabular Impingement, In Annals of Biomedical Engineering, Vol. 41, No. 6, pp. 1162--1171. 2013.
DOI: 10.1007/s10439-013-0762-1
M.D. Harris, M. Datar, R.T. Whitaker, E.R. Jurrus, C.L. Peters, A.E. Anderson.
Statistical Shape Modeling of Cam Femoroacetabular Impingement, In Journal of Orthopaedic Research, Vol. 31, No. 10, pp. 1620--1626. 2013.
DOI: 10.1002/jor.22389
M.D. Harris.
The geometry and biomechanics of normal and pathomorphologic human hips, Note: Ph.D. Thesis, Department of Bioengineering, University of Utah, 2013.
C.R. Henak, A.E. Anderson, J.A. Weiss.
Subject-specific analysis of joint contact mechanics: application to the study of osteoarthritis and surgical planning, In Journal of Biomechanical Engineering, Vol. 135, No. 2, 2013.
DOI: 10.1115/1.4023386
PubMed ID: 23445048
Advances in computational mechanics, constitutive modeling, and techniques for subject-specific modeling have opened the door to patient-specific simulation of the relationships between joint mechanics and osteoarthritis (OA), as well as patient-specific preoperative planning. This article reviews the application of computational biomechanics to the simulation of joint contact mechanics as relevant to the study of OA. This review begins with background regarding OA and the mechanical causes of OA in the context of simulations of joint mechanics. The broad range of technical considerations in creating validated subject-specific whole joint models is discussed. The types of computational models available for the study of joint mechanics are reviewed. The types of constitutive models that are available for articular cartilage are reviewed, with special attention to choosing an appropriate constitutive model for the application at hand. Issues related to model generation are discussed, including acquisition of model geometry from volumetric image data and specific considerations for acquisition of computed tomography and magnetic resonance imaging data. Approaches to model validation are reviewed. The areas of parametric analysis, factorial design, and probabilistic analysis are reviewed in the context of simulations of joint contact mechanics. Following the review of technical considerations, the article details insights that have been obtained from computational models of joint mechanics for normal joints; patient populations; the study of specific aspects of joint mechanics relevant to OA, such as congruency and instability; and preoperative planning. Finally, future directions for research and application are summarized.
H.B. Henninger, C.J. Underwood, S.J. Romney, G.L. Davis, J.A. Weiss.
Effect of Elastin Digestion on the Quasi-Static Tensile Response of Medial Collateral Ligament, In Journal of Orthopaedic Research, pp. (published online). 2013.
DOI: 10.1002/jor.22352
Elastin is a structural protein that provides resilience to biological tissues. We examined the contributions of elastin to the quasi-static tensile response of porcine medial collateral ligament through targeted disruption of the elastin network with pancreatic elastase. Elastase concentration and treatment time were varied to determine a dose response. Whereas elastin content decreased with increasing elastase concentration and treatment time, the change in peak stress after cyclic loading reached a plateau above 1 U/ml elastase and 6 h treatment. For specimens treated with 2 U/ml elastase for 6 h, elastin content decreased approximately 35%. Mean peak tissue strain after cyclic loading (4.8%, p ≥ 0.300), modulus (275 MPa, p ≥ 0.114) and hysteresis (20%, p ≥ 0.553) were unaffected by elastase digestion, but stress decreased significantly after treatment (up to 2 MPa, p ≤ 0.049). Elastin degradation had no effect on failure properties, but tissue lengthened under the same pre-stress. Stiffness in the linear region was unaffected by elastase digestion, suggesting that enzyme treatment did not disrupt collagen. These results demonstrate that elastin primarily functions in the toe region of the stress–strain curve, yet contributes load support in the linear region. The increase in length after elastase digestion suggests that elastin may pre-stress and stabilize collagen crimp in ligaments.
C.R. Henak, E. Carruth, A.E. Anderson, M.D. Harris, B.J. Ellis, C.L. Peters, J.A. Weiss.
Finite element predictions of cartilage contact mechanics in hips with retroverted acetabula, In Osteoarthritis and Cartilage, Vol. 21, pp. 1522--1529. 2013.
DOI: 10.1016/j.joca.2013.06.008
Background
A contributory factor to hip osteoarthritis (OA) is abnormal cartilage mechanics. Acetabular retroversion, a version deformity of the acetabulum, has been postulated to cause OA via decreased posterior contact area and increased posterior contact stress. Although cartilage mechanics cannot be measured directly in vivo to evaluate the causes of OA, they can be predicted using finite element (FE) modeling.
Objective
The objective of this study was to compare cartilage contact mechanics between hips with normal and retroverted acetabula using subject-specific FE modeling.
Methods
Twenty subjects were recruited and imaged: 10 with normal acetabula and 10 with retroverted acetabula. FE models were constructed using a validated protocol. Walking, stair ascent, stair descent and rising from a chair were simulated. Acetabular cartilage contact stress and contact area were compared between groups.
Results
Retroverted acetabula had superomedial cartilage contact patterns, while normal acetabula had widely distributed cartilage contact patterns. In the posterolateral acetabulum, average contact stress and contact area during walking and stair descent were 2.6–7.6 times larger in normal than retroverted acetabula (P ≤ 0.017). Conversely, in the superomedial acetabulum, peak contact stress during walking was 1.2–1.6 times larger in retroverted than normal acetabula (P ≤ 0.044). Further differences varied by region and activity.
Conclusions
This study demonstrated superomedial contact patterns in retroverted acetabula vs widely distributed contact patterns in normal acetabula. Smaller posterolateral contact stress in retroverted acetabula than in normal acetabula suggests that increased posterior contact stress alone may not be the link between retroversion and OA.
C.R. Henak, A.K. Kapron, B.J. Ellis, S.A. Maas, A.E. Anderson, J.A. Weiss.
Specimen-specific predictions of contact stress under physiological loading in the human hip: validation and sensitivity studies, In Biomechanics and Modeling in Mechanobiology, pp. 1--14. 2013.
DOI: 10.1007/s10237-013-0504-1
Hip osteoarthritis may be initiated and advanced by abnormal cartilage contact mechanics, and finite element (FE) modeling provides an approach with the potential to allow the study of this process. Previous FE models of the human hip have been limited by single specimen validation and the use of quasi-linear or linear elastic constitutive models of articular cartilage. The effects of the latter assumptions on model predictions are unknown, partially because data for the instantaneous behavior of healthy human hip cartilage are unavailable. The aims of this study were to develop and validate a series of specimen-specific FE models, to characterize the regional instantaneous response of healthy human hip cartilage in compression, and to assess the effects of material nonlinearity, inhomogeneity and specimen-specific material coefficients on FE predictions of cartilage contact stress and contact area. Five cadaveric specimens underwent experimental loading, cartilage material characterization and specimen-specific FE modeling. Cartilage in the FE models was represented by average neo-Hookean, average Veronda Westmann and specimen- and region-specific Veronda Westmann hyperelastic constitutive models. Experimental measurements and FE predictions compared well for all three cartilage representations, which was reflected in average RMS errors in contact stress of less than 25 %. The instantaneous material behavior of healthy human hip cartilage varied spatially, with stiffer acetabular cartilage than femoral cartilage and stiffer cartilage in lateral regions than in medial regions. The Veronda Westmann constitutive model with average material coefficients accurately predicted peak contact stress, average contact stress, contact area and contact patterns. The use of subject- and region-specific material coefficients did not increase the accuracy of FE model predictions. The neo-Hookean constitutive model underpredicted peak contact stress in areas of high stress. The results of this study support the use of average cartilage material coefficients in predictions of cartilage contact stress and contact area in the normal hip. The regional characterization of cartilage material behavior provides the necessary inputs for future computational studies, to investigate other mechanical parameters that may be correlated with OA and cartilage damage in the human hip. In the future, the results of this study can be applied to subject-specific models to better understand how abnormal hip contact stress and contact area contribute to OA.
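For context, the two hyperelastic representations compared in the abstract are commonly written with the following strain energy densities; the volumetric term and coefficient conventions vary by implementation and are not taken from the paper.

```latex
% Standard forms of the two hyperelastic strain energy densities compared in
% the abstract, written in terms of the deviatoric invariants \tilde{I}_1,
% \tilde{I}_2 and the Jacobian J. The volumetric term U(J) and coefficient
% conventions vary by implementation and are not taken from the paper.
\[
  W_{\text{neo-Hookean}} \;=\; \frac{\mu}{2}\,\bigl(\tilde{I}_1 - 3\bigr) \;+\; U(J),
\]
\[
  W_{\text{Veronda-Westmann}} \;=\; C_1\!\left[e^{\,C_2(\tilde{I}_1 - 3)} - 1\right]
  \;-\; \frac{C_1 C_2}{2}\,\bigl(\tilde{I}_2 - 3\bigr) \;+\; U(J),
  \qquad U(J) \;=\; \frac{K}{2}\bigl(\ln J\bigr)^{2}.
\]
```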
C.R. Henak.
Cartilage and labrum mechanics in the normal and pathomorphologic human hip, Note: Ph.D. Thesis, Department of Bioengineering, University of Utah, 2013.
H. Hernandez, J. Knezevic, T. Fogal, T. Sherman, T. Jevremovic.
Visual numerical steering in 3D AGENT code system for advanced nuclear reactor modeling and design, In Annals of Nuclear Energy, Vol. 55, pp. 248--257. 2013.
Keywords: Numerical steering, AGENT code, Deterministic neutron transport codes, Method of Characteristics, R-functions, Numerical visualizations
K. Higuchi, M. Akkaya, M. Koopmann, J.J. Blauer, N.S. Burgon, K. Damal, R. Ranjan, E. Kholmovski, R.S. Macleod, N.F. Marrouche.
The Effect of Fat Pad Modification during Ablation of Atrial Fibrillation: Late Gadolinium Enhancement MRI Analysis, In Pacing and Clinical Electrophysiology (PACE), Vol. 36, No. 4, pp. 467--476. April, 2013.
DOI: 10.1111/pace.12084
PubMed ID: 23356963
PubMed Central ID: PMC3651513
Background: Magnetic resonance imaging (MRI) can visualize locations of both the ablation scar on the left atrium (LA) after atrial fibrillation (AF) ablation and epicardial fat pads (FPs) containing ganglionated plexi (GP).
Methods: We investigated 60 patients who underwent pulmonary vein antrum (PVA) isolation along with LA posterior wall and septal debulking for AF. FPs around the LA surface in well-known GP areas (which were considered as the substitution of GP areas around the LA) were segmented from the dark-blood MRI. Then the FP and the ablation scar image visualized by late gadolinium enhancement (LGE)-MRI on the LA were merged together. Overlapping areas of FP and the ablation scar image were considered as the ablated FP areas containing GP. Patients underwent 24-hour Holter monitoring after ablation for the analysis of heart rate variability.
Results: Ablated FP area was significantly wider in patients without AF recurrence than that in patients with recurrence (5.6 ± 3.1 cm² vs 4.2 ± 2.7 cm², P = 0.03). The mean values of both percentage of differences greater than 50 ms in the RR intervals (pRR > 50) and standard deviation of RR intervals over the entire analyzed period (SDNN), which were obtained from 24-hour Holter monitoring 1 day post-AF ablation, were significantly lower in patients without recurrence than those in patients with recurrence (5.8 ± 6.0% vs 14.0 ± 10.1%, P = 0.0005; 78.7 ± 32.4 ms vs 109.2 ± 43.5 ms, P = 0.005). There was a significant negative correlation between SDNN and the percentage of ablated FP area (Y = -1.3168X + 118.96, R² = 0.1576, P = 0.003).
Conclusion: Extensively ablating the LA covering GP areas along with PVA isolation enhanced denervation of the autonomic nervous system and seemed to improve procedural outcome in patients with AF.
Keywords: ganglionated plexi, fat pad, atrial fibrillation, catheter ablation, LGE-MRI
J. Hinkle, S. Joshi.
PDiff: Irrotational Diffeomorphisms for Computational Anatomy, In Proceedings of the International Conference on Information Processing in Medical Imaging (IPMI), Lecture Notes in Computer Science (LNCS), pp. (accepted). 2013.
