McSTRA: A multi-branch cascaded swin transformer for point spread function-guided robust MRI reconstruction

Research output: Contribution to journal › Article › Research › peer-reviewed


Abstract

Deep learning MRI reconstruction methods are often based on convolutional neural network (CNN) models; however, they are limited in capturing global correlations among image features due to the intrinsic locality of the convolution operation. Conversely, recent vision transformer (ViT) models can capture global correlations by applying self-attention operations on image patches. Nevertheless, existing transformer models for MRI reconstruction rarely leverage the physics of MRI. In this paper, we propose a novel physics-based transformer model, the Multi-branch Cascaded Swin Transformers (McSTRA), for robust MRI reconstruction. McSTRA combines several interconnected MRI physics-related concepts with Swin transformers: it exploits global MRI features via the shifted-window self-attention mechanism; it extracts MRI features belonging to different spectral components via a multi-branch setup; it alternates between intermediate de-aliasing and data consistency via a cascaded network with intermediate loss computations; furthermore, we propose a point spread function-guided positional embedding generation mechanism for the Swin transformers that exploits the spread of the aliasing artifacts for effective reconstruction. With the combination of all these components, McSTRA outperforms state-of-the-art methods while demonstrating robustness in adversarial conditions such as higher accelerations, noisy data, different undersampling protocols, out-of-distribution data, and anatomical abnormalities.
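Two of the physics-related building blocks named in the abstract have standard formulations that can be sketched briefly. A hard data-consistency step replaces the network's k-space estimate with the measured samples wherever k-space was actually acquired, and the point spread function of a sampling mask (whose aliasing spread motivates the proposed positional embeddings) is simply the inverse Fourier transform of that mask. The following is a minimal single-coil Cartesian NumPy sketch under those assumptions; the function names are illustrative and do not reflect the authors' implementation:

```python
import numpy as np

def data_consistency(x_recon, k_measured, mask):
    """Hard data-consistency step on an intermediate reconstruction.

    x_recon    : complex image estimate from the de-aliasing stage, shape (H, W)
    k_measured : undersampled k-space measurements, shape (H, W), zero where unsampled
    mask       : boolean sampling mask, shape (H, W), True where k-space was acquired
    """
    # Transform the current image estimate to k-space.
    k_recon = np.fft.fft2(x_recon)
    # At sampled locations, trust the measured data; elsewhere keep the estimate.
    k_dc = np.where(mask, k_measured, k_recon)
    # Return to image space.
    return np.fft.ifft2(k_dc)

def sampling_psf(mask):
    """Point spread function of an undersampling mask.

    The inverse FFT of the mask shows how a point source spreads (aliases)
    under that sampling pattern; full sampling yields an ideal delta.
    """
    return np.fft.ifft2(mask.astype(complex))
```

In a cascaded network of the kind the abstract describes, a de-aliasing stage and a `data_consistency` call would simply alternate, with losses computed on the intermediate outputs.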

Original language: English
Article number: 107775
Number of pages: 15
Journal: Computers in Biology and Medicine
Volume: 168
DOIs
Publication status: Published - Jan 2024

Keywords

  • Accelerated MRI
  • Deep learning
  • Image reconstruction
  • k-space
  • Physics-based
  • Point spread function
  • Swin transformer
  • Undersampled