A deep ensemble learning framework for glioma segmentation and grading prediction

Bibliographic Details
Main Authors: Liang Wen, Hui Sun, Guobiao Liang, Yue Yu
Format: Article
Language: English
Published: Nature Portfolio 2025-02-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-87127-z
Description
Summary: Abstract The segmentation and risk grade prediction of gliomas based on preoperative multimodal magnetic resonance imaging (MRI) are crucial tasks in computer-aided diagnosis. Owing to the significant inter- and intra-tumor heterogeneity, existing methods mainly rely on single-task approaches and overlook the inherent correlation between the segmentation and grading tasks. Moreover, the limited availability of glioma grading data poses an additional challenge. To address these issues, we propose a deep ensemble learning framework based on multimodal MRI and the U-Net model that simultaneously performs glioma segmentation and risk grade prediction. We introduce asymmetric convolution and dual-domain attention in the encoder to fully integrate effective information from the different modalities and to enhance feature extraction from critical regions, and we construct a dual-branch decoder that combines spatial features with global semantic information for both segmentation and grading. In addition, we propose a weighted composite adaptive loss function to balance the optimization objectives of the two tasks. Experimental results on the BraTS dataset demonstrate that our method outperforms state-of-the-art methods, yielding superior segmentation accuracy and more precise risk grade prediction.
ISSN: 2045-2322
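
The abstract names three concrete components: asymmetric convolution, dual-domain attention in the encoder, and a weighted composite adaptive multi-task loss. The paper's code is not reproduced here; the following is only a minimal PyTorch sketch of one common reading of each term, assuming "asymmetric convolution" means factorized (1x3 / 3x1) kernels, "dual-domain attention" means channel plus spatial attention, and the adaptive loss follows uncertainty-style weighting of the two task losses. All class names, shapes, and hyperparameters below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AsymmetricConvBlock(nn.Module):
    """Square 3x3 branch plus parallel 1x3 and 3x1 (asymmetric) branches,
    summed before batch norm and ReLU -- an assumed factorization scheme."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.square = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.hor = nn.Conv2d(in_ch, out_ch, (1, 3), padding=(0, 1))
        self.ver = nn.Conv2d(in_ch, out_ch, (3, 1), padding=(1, 0))
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.square(x) + self.hor(x) + self.ver(x)))

class DualDomainAttention(nn.Module):
    """Channel attention (squeeze-and-excitation style) followed by a
    spatial attention map -- one plausible reading of 'dual-domain'."""
    def __init__(self, ch: int, reduction: int = 8):
        super().__init__()
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(ch // reduction, ch, 1), nn.Sigmoid())
        self.spatial = nn.Sequential(
            nn.Conv2d(2, 1, 7, padding=3), nn.Sigmoid())

    def forward(self, x):
        x = x * self.channel(x)                       # reweight channels
        s = torch.cat([x.mean(1, keepdim=True),       # avg-pooled map
                       x.amax(1, keepdim=True)], 1)   # max-pooled map
        return x * self.spatial(s)                    # reweight locations

class WeightedCompositeLoss(nn.Module):
    """Learnable log-variance weights balancing the segmentation and
    grading losses (uncertainty weighting, Kendall et al. 2018) -- an
    assumed stand-in for the paper's weighted composite adaptive loss."""
    def __init__(self):
        super().__init__()
        self.log_var_seg = nn.Parameter(torch.zeros(()))
        self.log_var_cls = nn.Parameter(torch.zeros(()))

    def forward(self, seg_loss, cls_loss):
        return (torch.exp(-self.log_var_seg) * seg_loss + self.log_var_seg
                + torch.exp(-self.log_var_cls) * cls_loss + self.log_var_cls)

if __name__ == "__main__":
    block = AsymmetricConvBlock(4, 32)   # 4 MRI modalities stacked as channels
    attn = DualDomainAttention(32)
    x = torch.randn(2, 4, 128, 128)      # hypothetical batch of 2D slices
    feats = attn(block(x))
    print(feats.shape)                   # torch.Size([2, 32, 128, 128])
```

In this sketch the attended encoder features would feed both decoder branches, and the single scalar returned by `WeightedCompositeLoss` would be backpropagated so that the two log-variance parameters adapt the task weighting during training.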