
C2HFusion: Clinical context-driven hierarchical fusion of multimodal data for personalized and quantitative prognostic assessment in pancreatic cancer.

Medical Image Analysis, 2026, Vol. 109, p. 103937

Zeng B, Xu Y, Wang P, Lu T, Xie Z, Zeng M, Zhou J, Liu L, Sun H, Chen X


Cite this paper

APA Zeng B, Xu Y, et al. (2026). C2HFusion: Clinical context-driven hierarchical fusion of multimodal data for personalized and quantitative prognostic assessment in pancreatic cancer. Medical Image Analysis, 109, 103937. https://doi.org/10.1016/j.media.2026.103937
MLA Zeng B, et al. "C2HFusion: Clinical context-driven hierarchical fusion of multimodal data for personalized and quantitative prognostic assessment in pancreatic cancer." Medical Image Analysis, vol. 109, 2026, p. 103937.
PMID 41564638

Abstract

Pancreatic ductal adenocarcinoma (PDAC) is a highly aggressive malignancy. Accurate prognostic modeling enables reliable risk stratification to identify patients most likely to benefit from adjuvant therapy, thereby facilitating individualized clinical management and potentially improving patient outcomes. Although recent deep learning approaches have shown promise in this area, their effectiveness is often constrained by fusion strategies that fail to fully capture the hierarchical and complementary information across heterogeneous clinical modalities. To address these limitations, we propose C2HFusion, a novel fusion framework inspired by clinical decision-making for personalized prognostic risk assessment. C2HFusion is unique in that it integrates multimodal data across multiple representational levels and structural forms. At the imaging level, it extracts and aggregates tumor-level features from multi-sequence MRI using cross-attention, effectively capturing complementary imaging patterns. At the patient level, it encodes structured data (e.g., laboratory results, demographics) and unstructured data (e.g., radiology reports) as contextual priors, which are then fused with imaging representations through a novel feature modulation mechanism. To further enhance this cross-level integration, a scalable Mixture-of-Clinical-Experts (MoCE) module dynamically routes different modalities through specialized branches and adaptively optimizes feature fusion for more robust multimodal modeling. Validation on multi-center real-world datasets covering 681 PDAC patients shows that C2HFusion consistently outperforms state-of-the-art methods in overall survival prediction, achieving over a 5% improvement in C-index. These results highlight its potential to improve prognostic accuracy and support more informed, personalized clinical decision-making.
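The three fusion stages the abstract describes (cross-attention over multi-sequence MRI features, context-driven feature modulation, and gated Mixture-of-Clinical-Experts routing) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the FiLM-style scale-and-shift form of the modulation, the softmax gate, and all function names, dimensions, and random stand-in features are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q, kv):
    """Tumor-level features of one MRI sequence attend to another's."""
    d = q.shape[-1]
    scores = q @ kv.T / np.sqrt(d)        # (n_q, n_kv) similarity
    return softmax(scores, axis=-1) @ kv  # (n_q, d) aggregated features

def film_modulate(img_feat, ctx, W_gamma, W_beta):
    """Clinical context scales and shifts the imaging representation
    (one plausible form of 'feature modulation')."""
    gamma = ctx @ W_gamma
    beta = ctx @ W_beta
    return gamma * img_feat + beta

def moce(x, experts, W_gate):
    """Mixture-of-experts: a softmax gate weights expert-branch outputs."""
    gate = softmax(x @ W_gate)                 # (k,) routing weights
    outs = np.stack([x @ W for W in experts])  # (k, d) expert outputs
    return gate @ outs                         # (d,) fused feature

# Toy dimensions: 4 tumor patches, 8-dim features, 6-dim clinical context.
d, d_ctx, n_experts = 8, 6, 3
t1 = rng.normal(size=(4, d))   # stand-in for one MRI sequence's features
t2 = rng.normal(size=(4, d))   # stand-in for a second sequence's features

# 1) Imaging level: cross-attention between sequences, then pooling.
fused = cross_attention(t1, t2)
pooled = fused.mean(axis=0)

# 2) Patient level: encoded labs/demographics/reports modulate imaging.
ctx = rng.normal(size=d_ctx)
W_gamma = rng.normal(size=(d_ctx, d))
W_beta = rng.normal(size=(d_ctx, d))
modulated = film_modulate(pooled, ctx, W_gamma, W_beta)

# 3) Mixture-of-Clinical-Experts: gated routing over expert branches.
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
W_gate = rng.normal(size=(d, n_experts))
risk_feature = moce(modulated, experts, W_gate)
print(risk_feature.shape)  # (8,)
```

In the paper this final representation would feed a survival head optimized for the C-index; here the sketch stops at the fused feature to keep the three stages visible.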

MeSH Terms

Humans; Pancreatic Neoplasms; Prognosis; Magnetic Resonance Imaging; Carcinoma, Pancreatic Ductal; Multimodal Imaging; Image Interpretation, Computer-Assisted; Precision Medicine; Risk Assessment; Female; Male; Deep Learning
