Mesh U-Nets for 3D Cardiac Deformation Modeling

Marcel Beetz*, Jorge Corral Acero, Abhirup Banerjee, Ingo Eitel, Ernesto Zacur, Torben Lange, Thomas Stiermaier, Ruben Evertz, Sören J. Backhaus, Holger Thiele, Alfonso Bueno-Orovio, Pablo Lamata, Andreas Schuster, Vicente Grau

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

During a cardiac cycle, the heart anatomy undergoes a series of complex 3D deformations, which can be analyzed to diagnose various cardiovascular pathologies including myocardial infarction. While volume-based metrics such as ejection fraction are commonly used in clinical practice to assess these deformations globally, they only provide limited information about localized changes in the 3D cardiac structures. The objective of this work is to develop a novel geometric deep learning approach to capture the mechanical deformation of complete 3D ventricular shapes, offering potential to discover new image-based biomarkers for cardiac disease diagnosis. To this end, we propose the mesh U-Net, which combines mesh-based convolution and pooling operations with U-Net-inspired skip connections in a hierarchical step-wise encoder-decoder architecture, in order to enable accurate and efficient learning directly on 3D anatomical meshes. The proposed network is trained to model both cardiac contraction and relaxation, that is, to predict the 3D cardiac anatomy at the end-systolic phase of the cardiac cycle based on the corresponding anatomy at end-diastole and vice versa. We evaluate our method on a multi-center cardiac magnetic resonance imaging (MRI) dataset of 1021 patients with acute myocardial infarction. We find mean surface distances between the predicted and gold standard anatomical meshes close to the pixel resolution of the underlying images and high similarity in multiple commonly used clinical metrics for both prediction directions. In addition, we show that the mesh U-Net compares favorably to a 3D U-Net benchmark by using 66% fewer network parameters and drastically smaller data sizes, while at the same time improving predictive performance by 14%. We also observe that the mesh U-Net is able to capture subpopulation-specific differences in mechanical deformation patterns between patients with different myocardial infarction types and clinical outcomes.
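For readers unfamiliar with the building blocks named in the abstract, the sketch below illustrates how spectral graph convolutions, precomputed mesh down-/up-sampling matrices, and U-Net-style skip connections can be combined into a mesh U-Net that maps end-diastolic vertex coordinates to end-systolic ones (or vice versa). This is an illustrative sketch, not the authors' implementation: the use of PyTorch Geometric's ChebConv, the precomputed sparse sampling matrices, the channel widths, and the Chebyshev order K are all assumptions.

```python
import torch
import torch.nn as nn
from torch_geometric.nn import ChebConv  # spectral graph convolution


class MeshUNet(nn.Module):
    """Hierarchical encoder-decoder on a fixed mesh hierarchy (illustrative sketch)."""

    def __init__(self, edge_indices, down_mats, up_mats,
                 channels=(3, 16, 32, 64), K=6):
        # edge_indices: connectivity (edge_index) of each mesh resolution at which
        #   a convolution is applied, ordered fine -> coarse
        # down_mats / up_mats: precomputed sparse vertex sampling matrices,
        #   e.g. obtained from mesh simplification (assumption)
        # channels: per-level feature widths; input/output are 3D vertex coordinates
        # K: Chebyshev polynomial order of the spectral filters
        super().__init__()
        self.edge_indices, self.down_mats, self.up_mats = edge_indices, down_mats, up_mats
        self.enc = nn.ModuleList(
            [ChebConv(channels[i], channels[i + 1], K) for i in range(len(channels) - 1)]
        )
        self.dec = nn.ModuleList(
            [ChebConv(2 * channels[i + 1], channels[i], K)  # 2x: skip-connection concat
             for i in reversed(range(len(channels) - 1))]
        )

    def forward(self, x):
        # x: (num_vertices, 3) end-diastolic (or end-systolic) vertex coordinates
        skips = []
        for level, conv in enumerate(self.enc):
            x = torch.relu(conv(x, self.edge_indices[level]))
            skips.append(x)                                  # keep for skip connection
            x = torch.sparse.mm(self.down_mats[level], x)    # mesh pooling to coarser mesh
        for step, conv in enumerate(self.dec):
            level = len(self.enc) - 1 - step
            x = torch.sparse.mm(self.up_mats[level], x)      # mesh unpooling to finer mesh
            x = torch.cat([x, skips[level]], dim=1)          # U-Net skip connection
            x = conv(x, self.edge_indices[level])
            if step < len(self.dec) - 1:
                x = torch.relu(x)
        return x  # predicted vertex coordinates of the opposite cardiac phase
```

Training such a network would then minimize a vertex-wise distance (e.g. mean squared error) between the predicted mesh and the gold-standard mesh of the target cardiac phase, in line with the surface-distance evaluation described in the abstract.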

Original language: English
Title of host publication: Statistical Atlases and Computational Models of the Heart. Regular and CMRxMotion Challenge Papers - 13th International Workshop, STACOM 2022, Held in Conjunction with MICCAI 2022, Revised Selected Papers
Editors: Oscar Camara, Esther Puyol-Antón, Avan Suinesiaputra, Alistair Young, Chen Qin, Maxime Sermesant, Shuo Wang
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 245-257
Number of pages: 13
ISBN (Print): 9783031234422
DOIs
Publication status: Published - 2022
Event: 13th International Workshop on Statistical Atlases and Computational Models of the Heart, STACOM 2022, held in conjunction with the 25th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2022 - Singapore, Singapore
Duration: 18 Sept 2022 - 18 Sept 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13593 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th International Workshop on Statistical Atlases and Computational Models of the Heart, STACOM 2022, held in conjunction with the 25th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2022
Country/Territory: Singapore
City: Singapore
Period: 18/09/2022 - 18/09/2022

Keywords

  • 3D heart contraction
  • Acute myocardial infarction
  • Cardiac mechanics
  • Cardiac MRI
  • Geometric deep learning
  • Mesh sampling
  • Spectral graph convolutions
