
Multimodal hybrid mamba classification model for tumor pathological grade prediction using magnetic resonance images.

Neural Networks: The Official Journal of the International Neural Network Society, 2026, Vol. 198, p. 108726
TL;DR The Multimodal Hybrid Mamba (MSHM) classification model strikes a balance between high performance and efficiency, advancing both tumor pathological grade prediction and multimodal medical image analysis.
OpenAlex Topics · AI in cancer detection · Brain Tumor Detection and Classification · MRI in cancer diagnosis

Zhou L, Fu T, Qu X, Wu J, Huang Y, Song H, Fan J, Ai D, Xiao D, Xian J, Yang J


Cite this paper

APA Zhou, L., Fu, T., et al. (2026). Multimodal hybrid mamba classification model for tumor pathological grade prediction using magnetic resonance images. Neural Networks: The Official Journal of the International Neural Network Society, 198, 108726. https://doi.org/10.1016/j.neunet.2026.108726
MLA Zhou, Langtao, et al. "Multimodal hybrid mamba classification model for tumor pathological grade prediction using magnetic resonance images." Neural Networks: The Official Journal of the International Neural Network Society, vol. 198, 2026, p. 108726.
PMID 41719867

Abstract

Malignant tumors present a significant global health challenge, and accurate pathological grading is essential for personalized treatment. Traditional grading methods, which rely on invasive biopsies, are limited by tumor location. In contrast, magnetic resonance imaging (MRI) offers a non-invasive, high-resolution tool, with multi-sequence MRI (e.g., T1, T2, T1C) enabling comprehensive tumor assessment. However, existing methods often struggle to capture cross-modal correlations and global dependencies. To address this limitation, we propose the Multimodal Hybrid Mamba (MSHM) classification model for tumor pathological grade prediction. The model integrates convolutional neural networks for shallow feature extraction, Mamba encoders for modeling global dependencies, and cross-modal attention to fuse multi-sequence MRI data. The Mamba-Fusion module further refines the global features, enhancing lesion recognition and computational efficiency. Experimental results demonstrate that MSHM outperforms existing methods, achieving 98.36 ± 1.00% AUC and 92.08 ± 3.26% F1-Score on a private multi-center orbital adnexal lymphoma dataset, and 98.93 ± 0.19% AUC and 95.82 ± 0.62% F1-Score on the public glioma BraTS 2024 dataset. Additionally, MSHM performs exceptionally well on the LLD-MMRI dataset, achieving 99.25 ± 0.26% AUC and 96.97 ± 0.55% F1-Score in distinguishing between benign and malignant liver lesions, further validating the model's robust performance across diverse datasets. Ablation studies confirm the effectiveness of the proposed modules. Overall, MSHM strikes a balance between high performance and efficiency, advancing both tumor pathological grade prediction and multimodal medical image analysis.
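The architecture the abstract outlines (a CNN stem for shallow features, Mamba encoders for global dependencies, and cross-modal attention fusing multi-sequence MRI) can be sketched at a toy scale. The NumPy snippet below is a minimal illustration under loud assumptions: the shapes, random weights, and the simple diagonal linear recurrence standing in for a Mamba-style selective scan are all hypothetical stand-ins, not the paper's implementation.

```python
import numpy as np

# Illustrative sketch only: a valid-3x3 conv stem, a toy linear-recurrence
# "SSM" encoder, and cross-modal attention where T1C tokens query T1/T2 tokens.
rng = np.random.default_rng(0)

def conv_stem(x, w):
    """Valid 3x3 convolution + ReLU. x: (C_in, H, W); w: (C_out, C_in, 3, 3)."""
    C_out, C_in, kH, kW = w.shape
    H, W = x.shape[1] - kH + 1, x.shape[2] - kW + 1
    out = np.zeros((C_out, H, W))
    for co in range(C_out):
        for i in range(H):
            for j in range(W):
                out[co, i, j] = np.sum(w[co] * x[:, i:i + kH, j:j + kW])
    return np.maximum(out, 0.0)

def ssm_scan(tokens, a=0.9):
    """Toy diagonal recurrence h_t = a*h_{t-1} + (1-a)*x_t over the token
    sequence, standing in for a Mamba-style selective scan (an assumption)."""
    h = np.zeros(tokens.shape[1])
    out = np.empty_like(tokens)
    for t, x in enumerate(tokens):
        h = a * h + (1.0 - a) * x
        out[t] = h
    return out

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_tokens, kv_tokens, Wq, Wk, Wv):
    """Queries from one modality attend over keys/values from the others."""
    Q, K, V = q_tokens @ Wq, kv_tokens @ Wk, kv_tokens @ Wv
    scores = softmax(Q @ K.T / np.sqrt(Q.shape[1]), axis=-1)
    return scores @ V

D = 4  # feature channels (illustrative)
stem_w = rng.normal(0, 0.1, (D, 1, 3, 3))
Wq, Wk, Wv = (rng.normal(0, 0.1, (D, D)) for _ in range(3))
W_head = rng.normal(0, 0.1, (D, 2))  # binary grade head (low vs. high)

# Three synthetic 10x10 MRI "sequences" (T1, T2, T1C) for one lesion patch.
t1, t2, t1c = (rng.normal(size=(1, 10, 10)) for _ in range(3))

def encode(img):
    """Per-modality path: conv stem -> flatten to tokens -> toy SSM encoder."""
    feat = conv_stem(img, stem_w)   # (D, 8, 8)
    tokens = feat.reshape(D, -1).T  # (64, D)
    return ssm_scan(tokens)

t1_tok, t2_tok, t1c_tok = encode(t1), encode(t2), encode(t1c)

# Cross-modal fusion: T1C tokens query the other sequences, residual add.
fused = t1c_tok + cross_attention(
    t1c_tok, np.vstack([t1_tok, t2_tok]), Wq, Wk, Wv
)

# Global average pool and classify.
probs = softmax(fused.mean(axis=0) @ W_head)
print(probs)  # two class probabilities summing to 1
```

The real model would use learned weights, deeper stems, a genuine selective-scan Mamba block, and the paper's Mamba-Fusion refinement; this sketch only shows how the three stages compose.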

MeSH Terms

Humans; Magnetic Resonance Imaging; Neural Networks, Computer; Neoplasm Grading; Neoplasms
