
AEGFN: adaptive evidence-gated fusion network for medical image prediction via conflict compensation and DS evidence theory-driven regularization.

Biomedical physics & engineering express, 2025, Vol. 11(6)
Authors

Tan X, Wang H, Hu S, Ge Y, Liang R, Wu D


Cite this paper

APA Tan, X., Wang, H., Hu, S., Ge, Y., Liang, R., & Wu, D. (2025). AEGFN: adaptive evidence-gated fusion network for medical image prediction via conflict compensation and DS evidence theory-driven regularization. Biomedical physics & engineering express, 11(6). https://doi.org/10.1088/2057-1976/ae183b
MLA Tan, X., et al. "AEGFN: adaptive evidence-gated fusion network for medical image prediction via conflict compensation and DS evidence theory-driven regularization." Biomedical physics & engineering express, vol. 11, no. 6, 2025.
PMID: 41195538

Abstract

The discrepancy in reliability and evidence conflict in multimodal data fusion for medical image prediction significantly undermines the accuracy of clinical decision-making. To address this challenge, we propose an Adaptive Evidence-Gated Fusion Network (AEGFN) based on Dempster-Shafer (DS) evidence theory. This framework models the evidence quality and cognitive uncertainty of CT images, image sequences, and clinical data using the Dirichlet distribution. We innovatively introduce an Evidence-Attention Gate (EAG) to dynamically adjust fusion weights for high-conflict modalities (conflict > 0.6), enabling conflict-aware uncertainty compensation. Additionally, a hybrid loss function combining KL divergence regularization with uncertainty-weighted cross-entropy is designed to balance model confidence and generalization. Evaluated on colorectal cancer (656 cases) and radiation pneumonitis (117 cases) datasets for binary classification tasks (predicting patient death and predicting RP occurrence), AEGFN achieves classification accuracies of 95.04% (AUC 0.97) and 82.34% (AUC 0.8312), outperforming the state-of-the-art method DDEF by 0.66% and 1.94%, respectively. This work provides a robust and interpretable solution for multimodal medical prediction, enhancing the reliability of clinical decision support systems.
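The abstract does not give the exact formulation of the Evidence-Attention Gate, but the DS-theoretic machinery it builds on is standard. The sketch below (a minimal NumPy illustration, not the paper's implementation) shows the subjective-logic reading of a Dirichlet output (evidence → belief masses plus an uncertainty mass), Dempster's conflict measure between two modalities, and a toy gate that down-weights the less certain modality when conflict exceeds the 0.6 threshold before combining. The halving rule inside `gated_fuse` is an assumed placeholder for the learned gate.

```python
import numpy as np

K = 2  # binary task, e.g. patient death vs. survival

def dirichlet_uncertainty(evidence):
    """Subjective-logic reading of a Dirichlet head: for non-negative
    evidence e_k, alpha_k = e_k + 1, S = sum(alpha), belief b_k = e_k / S,
    and epistemic uncertainty u = K / S, so that sum(b) + u = 1."""
    alpha = evidence + 1.0
    S = alpha.sum()
    return evidence / S, K / S

def ds_conflict(b1, b2):
    """Dempster's conflict mass: total belief the two sources assign
    to mutually exclusive classes."""
    return sum(b1[i] * b2[j] for i in range(K) for j in range(K) if i != j)

def gated_fuse(b1, u1, b2, u2, tau=0.6):
    """Toy conflict-aware gate (an assumption, not the paper's exact EAG):
    when conflict > tau, halve the beliefs of the more uncertain modality,
    moving that mass into its uncertainty, then apply Dempster's rule."""
    C = ds_conflict(b1, b2)
    if C > tau:
        if u1 > u2:
            u1, b1 = u1 + 0.5 * b1.sum(), 0.5 * b1
        else:
            u2, b2 = u2 + 0.5 * b2.sum(), 0.5 * b2
        C = ds_conflict(b1, b2)
    norm = 1.0 - C  # renormalize after discarding conflicting mass
    b = (b1 * b2 + b1 * u2 + b2 * u1) / norm
    u = (u1 * u2) / norm
    return b, u, C

# Two sharply conflicting modalities: one votes class 0, the other class 1.
# Raw conflict is 0.64 (> 0.6); after compensation it drops to 0.32.
b, u, C = gated_fuse(*dirichlet_uncertainty(np.array([8.0, 0.0])),
                     *dirichlet_uncertainty(np.array([0.0, 8.0])))
```

The fused output remains a valid opinion (beliefs plus uncertainty sum to one), and the gate keeps a single dominant-but-conflicted modality from being amplified by Dempster's renormalization, which is the failure mode the conflict-aware compensation targets.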
