Hybrid window attention based transformer architecture for brain tumor segmentation

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


Because MRI intensities are inconsistent across institutions, it is essential to extract universal features from multi-modal MRIs to segment brain tumors precisely. To this end, we propose a volumetric vision transformer that combines two windowing strategies in its attention mechanism to extract fine features, together with a local distributional smoothness (LDS) term during model training, inspired by virtual adversarial training (VAT), to make the model robust. We trained and evaluated the network architecture on the FeTS Challenge 2022 dataset. On the online evaluation, we obtained Dice Similarity Scores of 85.70%, 90.59% and 87.27%, and 95% Hausdorff Distances of 10.46 mm, 7.40 mm and 12.66 mm, for the enhancing tumor, whole tumor and tumor core, respectively. Overall, the experimental results verify our method's effectiveness, yielding better segmentation accuracy for each tumor sub-region. Our code implementation is publicly available.
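The Dice Similarity Score reported above measures the voxel overlap between a predicted segmentation mask and the ground truth for each tumor sub-region. A minimal sketch of how it is typically computed, assuming binary NumPy masks (the function name and toy volumes are illustrative, not from the paper's released code):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice Similarity Coefficient between two binary segmentation masks.

    Dice = 2 * |pred ∩ target| / (|pred| + |target|); eps avoids 0/0
    when both masks are empty.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))

# Toy 3D volumes standing in for one tumor sub-region (e.g. whole tumor):
# the two masks each cover 32 voxels and overlap on 16, so Dice = 0.5.
pred = np.zeros((4, 4, 4), dtype=bool)
pred[:2] = True
target = np.zeros((4, 4, 4), dtype=bool)
target[1:3] = True
print(f"Dice: {dice_score(pred, target):.4f}")  # → Dice: 0.5000
```

In challenge evaluations such as FeTS, this score is computed separately per sub-region (enhancing tumor, whole tumor, tumor core) and averaged over cases, alongside the 95th-percentile Hausdorff Distance.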

Original language: English
Title of host publication: 8th International Workshop, BrainLes 2022, Held in Conjunction with MICCAI 2022, Singapore, September 18, 2022, Revised Selected Papers, Part II
Editors: Spyridon Bakas, Alessandro Crimi, Ujjwal Baid, Sylwia Malec, Monika Pytlarz, Bhakti Baheti, Maximilian Zenk, Reuben Dorent
Place of Publication: Cham, Switzerland
Number of pages: 10
ISBN (Electronic): 9783031441530
ISBN (Print): 9783031441523
Publication status: Published - 2023
Event: 8th International MICCAI Brainlesion Workshop, BrainLes 2022 - Singapore, Singapore
Duration: 18 Sept 2022 – 18 Sept 2022

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 8th International MICCAI Brainlesion Workshop, BrainLes 2022


Keywords

  • Brain Tumor Segmentation
  • Deep Learning
  • Medical Image Segmentation
  • Virtual Adversarial Training
  • Vision Transformers
