
An interpretable hybrid deep learning framework for gastric cancer diagnosis using histopathological imaging.

Scientific Reports, 2025, Vol. 15(1), p. 34204

Ren T, Govindarajan V, Bourouis S, Wang X, Ke S


Cite this article

APA Ren T, Govindarajan V, et al. (2025). An interpretable hybrid deep learning framework for gastric cancer diagnosis using histopathological imaging. Scientific Reports, 15(1), 34204. https://doi.org/10.1038/s41598-025-15702-5
MLA Ren T, et al. "An interpretable hybrid deep learning framework for gastric cancer diagnosis using histopathological imaging." Scientific Reports, vol. 15, no. 1, 2025, p. 34204.
PMID 41034364

Abstract

The increasing incidence of gastric cancer and the complexity of histopathological image interpretation present significant challenges for accurate and timely diagnosis. Manual assessments are often subjective and time-intensive, leading to a growing demand for reliable, automated diagnostic tools in digital pathology. This study proposes a hybrid deep learning approach combining convolutional neural networks (CNNs) and Transformer-based architectures to classify gastric histopathological images with high precision. The model is designed to enhance feature representation and spatial contextual understanding, particularly across diverse tissue subtypes and staining variations. Three publicly available datasets (GasHisSDB, TCGA-STAD, and NCT-CRC-HE-100K) were utilized to train and evaluate the model. Image patches were preprocessed through stain normalization, augmented using standard techniques, and fed into the hybrid model. The CNN backbone extracts local spatial features, while the Transformer encoder captures global context. Performance was assessed using fivefold cross-validation and evaluated through accuracy, F1-score, AUC, and Grad-CAM-based interpretability. The proposed model achieved a 99.2% accuracy on the GasHisSDB dataset, with a macro F1-score of 0.991 and AUC of 0.996. External validation on TCGA-STAD and NCT-CRC-HE-100K further confirmed the model's robustness. Grad-CAM visualizations highlighted biologically relevant regions, demonstrating interpretability and alignment with expert annotations. This hybrid deep learning framework offers a reliable, interpretable, and generalizable tool for gastric cancer diagnosis. Its superior performance and explainability highlight its clinical potential for deployment in digital pathology workflows.
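The macro F1-score reported above is the unweighted mean of per-class F1 scores, so every tissue class contributes equally regardless of how many patches it has. A minimal sketch of that computation (the class labels and toy predictions here are illustrative, not taken from the paper's data):

```python
def macro_f1(y_true, y_pred):
    """Macro F1: unweighted mean of per-class F1 scores."""
    labels = sorted(set(y_true) | set(y_pred))
    f1_scores = []
    for c in labels:
        # Per-class counts: true positives, false positives, false negatives.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        f1_scores.append(f1)
    return sum(f1_scores) / len(f1_scores)

# Hypothetical binary patch labels (GasHisSDB distinguishes abnormal/normal).
y_true = ["normal", "abnormal", "normal", "abnormal", "normal"]
y_pred = ["normal", "abnormal", "normal", "normal", "normal"]
print(round(macro_f1(y_true, y_pred), 3))  # → 0.762
```

Because the mean is unweighted, a single poorly classified minority class lowers the macro score noticeably, which is why it is a stricter summary than plain accuracy on imbalanced histopathology datasets.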

MeSH Terms

Stomach Neoplasms; Humans; Deep Learning; Neural Networks, Computer; Image Interpretation, Computer-Assisted; Image Processing, Computer-Assisted
