Translational deep learning models for risk stratification to predict prognosis and immunotherapy response in gastric cancer using digital pathology.
APA
Nguyen MH, Do-Huu HH, et al. (2025). Translational deep learning models for risk stratification to predict prognosis and immunotherapy response in gastric cancer using digital pathology. Journal of Translational Medicine, 23(1), 1419. https://doi.org/10.1186/s12967-025-07416-z
MLA
Nguyen MH, et al. "Translational Deep Learning Models for Risk Stratification to Predict Prognosis and Immunotherapy Response in Gastric Cancer Using Digital Pathology." Journal of Translational Medicine, vol. 23, no. 1, 2025, p. 1419.
PMID
41444960
Abstract
[BACKGROUND] Gastric cancer (GC) is one of the leading causes of cancer-related deaths globally, with a 5-year survival rate of less than 40%. While immune checkpoint inhibitors have provided promising therapeutic options for advanced GC, only a small proportion of patients benefit. In this study, we developed a deep learning model using whole-slide images to predict prognoses and sensitivity to immune checkpoint inhibitors in GC patients by predicting a novel marker.
[METHODS] Formalin-fixed, paraffin-embedded whole-slide images from 292 patients in the Cancer Genome Atlas-Stomach Adenocarcinoma cohort were analyzed. Tumor regions were identified using a ResNet50-based tumor detection model and validated on the HiESD dataset. Tiles classified as malignant were extracted for subsequent analysis. Risk score prediction models were developed using convolutional neural networks, clustering-constrained attention multiple-instance learning (CLAM), and dual-stream multiple-instance learning (DSMIL). Attention heatmap visualization was used to interpret tumor microenvironment (TME) features, and a multi-modal classification framework utilizing a support vector machine (SVM) was developed to assess the impact of clinical variables on model performance. The results were evaluated using the area under the receiver operating characteristic curve (AUROC), accuracy, sensitivity, specificity, and F1 score.
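As a rough illustration (not the authors' code), the attention-pooling step shared by CLAM and DSMIL — aggregating per-tile embeddings into one slide-level representation via learned attention weights — can be sketched in a few lines of numpy. The embedding dimensions and all weights below are placeholder assumptions, and DSMIL's second, per-instance stream is omitted.

```python
import numpy as np

def attention_mil_pool(tile_feats, V, w):
    """Attention pooling over the tile embeddings of one whole-slide image.

    tile_feats: (n_tiles, d) feature vectors from a backbone (e.g. ResNet50)
    V: (k, d) projection matrix, w: (k,) attention vector -- placeholder weights.
    Returns the attention-weighted bag embedding and the attention weights.
    """
    scores = np.tanh(tile_feats @ V.T) @ w   # (n_tiles,) raw attention scores
    a = np.exp(scores - scores.max())        # numerically stable softmax
    a /= a.sum()
    return a @ tile_feats, a                 # bag embedding (d,), weights (n_tiles,)

# Hypothetical slide: 6 malignant tiles with 8-dim features
rng = np.random.default_rng(0)
tiles = rng.normal(size=(6, 8))
V, w = rng.normal(size=(4, 8)), rng.normal(size=4)
bag_embedding, attn = attention_mil_pool(tiles, V, w)
```

A slide-level risk score would then come from a classification head on `bag_embedding`; the attention weights `attn` are what the paper's heatmaps visualize per tile.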
[RESULTS] The tumor detection model achieved an AUROC of 0.99 on the training set, 0.92 on the test set, and 0.87 on the external test set. Among the risk score prediction models, DSMIL demonstrated the highest performance, with an AUROC of 0.73 and accuracy of 0.73 on the training set and an AUROC of 0.70 and accuracy of 0.68 on the internal test set. High-risk patients exhibited worse survival outcomes and lower immunotherapy response rates compared with low-risk patients. Feature attribution analysis using attention heatmaps confirmed that the model prioritized TME components, specifically regions with dense lymphocytic infiltration. The multi-modal analysis showed that a model using image features alone outperformed the model combining image features with clinicopathological data.
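For reference, the evaluation metrics the study reports can all be computed from predicted probabilities and binary labels; a minimal numpy sketch (with a made-up four-patient example, not the study's data) follows. AUROC is computed threshold-free as the probability that a random positive outranks a random negative; the other metrics use a 0.5 threshold.

```python
import numpy as np

def binary_metrics(y_true, y_prob, thresh=0.5):
    """AUROC plus accuracy, sensitivity, specificity, and F1 at `thresh`."""
    y_true = np.asarray(y_true, dtype=bool)
    y_prob = np.asarray(y_prob, dtype=float)
    y_pred = y_prob >= thresh
    tp = np.sum(y_pred & y_true)
    tn = np.sum(~y_pred & ~y_true)
    fp = np.sum(y_pred & ~y_true)
    fn = np.sum(~y_pred & y_true)
    sens = tp / (tp + fn)                    # recall on positives
    spec = tn / (tn + fp)                    # recall on negatives
    prec = tp / (tp + fp)
    f1 = 2 * prec * sens / (prec + sens)
    # AUROC: fraction of positive/negative pairs ranked correctly (ties count half)
    pos, neg = y_prob[y_true], y_prob[~y_true]
    auroc = ((pos[:, None] > neg[None, :]).sum()
             + 0.5 * (pos[:, None] == neg[None, :]).sum()) / (pos.size * neg.size)
    return {"auroc": auroc, "accuracy": (tp + tn) / y_true.size,
            "sensitivity": sens, "specificity": spec, "f1": f1}

m = binary_metrics([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])
# m["auroc"] == 0.75 and m["accuracy"] == 0.75 for this toy example
```

This sketch omits edge-case handling (e.g. no predicted positives makes precision undefined), which a production implementation would need.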
[CONCLUSION] Deep learning models leveraging whole-slide images show potential in predicting prognoses and immunotherapy responses in GC. By integrating tumor-specific and tumor microenvironmental features, this approach offers a scalable, objective tool for personalized treatment planning, improving precision oncology strategies.
[SUPPLEMENTARY INFORMATION] The online version contains supplementary material available at 10.1186/s12967-025-07416-z.