AI-assisted rapid on-site evaluation (ROSE) of EUS-FNA cytopathology for pancreatic solid lesions: A two-stage deep learning approach.
APA
Tian Y, Zhang J, et al. (2025). AI-assisted rapid on-site evaluation (ROSE) of EUS-FNA cytopathology for pancreatic solid lesions: A two-stage deep learning approach. Endoscopic Ultrasound, 14(6), 314-327. https://doi.org/10.1097/eus.0000000000000154
MLA
Tian Y, et al. "AI-assisted rapid on-site evaluation (ROSE) of EUS-FNA cytopathology for pancreatic solid lesions: A two-stage deep learning approach." Endoscopic Ultrasound, vol. 14, no. 6, 2025, pp. 314-327.
PMID
41585852
Abstract
[BACKGROUND] EUS-guided fine-needle aspiration (EUS-FNA) combined with rapid on-site evaluation (ROSE) is crucial for diagnosing pancreatic solid lesions. However, the widespread application of ROSE faces challenges, particularly the limited availability of specialized cytopathologists for immediate interpretation. Furthermore, the inherent variability in EUS-FNA sample quality (e.g., cellularity and debris) and suboptimal smear preparation during procedures can impede reliable automated analysis. This study aimed to develop and validate a novel 2-stage deep learning-based diagnostic model for the automated analysis of EUS-FNA cytological images of pancreatic solid lesions captured using mobile phones during the ROSE process, with the goal of automating diagnosis and improving efficiency.
[METHODS] We retrospectively collected 882 EUS-FNA cytological images, captured using mobile phones during ROSE, from 92 patients with pancreatic solid lesions at Peking University First Hospital. A 2-stage deep learning model was developed. The first stage utilized YOLOv8n-p2 for robust tissue cell detection, addressing background complexity and enabling preprocessing steps such as cell size normalization. In the second stage, DenseNet201 was employed to classify the detected cellular regions as malignant or normal. The dataset was divided into training, validation, and testing sets, and model performance was evaluated on a dedicated test set consisting of whole images.
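The detect-then-classify structure described above can be sketched as follows. This is a minimal structural illustration only: `detector` stands in for the paper's YOLOv8n-p2 stage, `classifier` for DenseNet201, and both are hypothetical injected callables, not the authors' models. The `crop_size` of 224 and the max-score aggregation across detected cells are assumptions for illustration; the abstract does not specify either.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Box = Tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixel coordinates


@dataclass
class TwoStageRose:
    """Structural sketch of a two-stage ROSE pipeline (hypothetical stubs)."""
    detector: Callable[[object], List[Box]]   # stage 1: tissue-cell detection
    classifier: Callable[[object], float]     # stage 2: P(malignant) per crop
    crop_size: int = 224                      # assumed normalization target

    def predict_image(self, image) -> Tuple[float, int]:
        """Return (image-level malignancy score, number of detected cells)."""
        boxes = self.detector(image)
        if not boxes:
            return 0.0, 0  # no cells detected -> treat the image as normal
        # Score each detected region after size normalization, then
        # aggregate to an image-level score (max over crops, an assumption).
        scores = [self.classifier(self._normalize_crop(image, b)) for b in boxes]
        return max(scores), len(boxes)

    def _normalize_crop(self, image, box: Box):
        # Placeholder for "cell size normalization": in a real pipeline this
        # would crop the box and resize it to crop_size x crop_size.
        return ("crop", box, self.crop_size)
```

With stub callables, `TwoStageRose(detector=lambda im: [(0, 0, 50, 50)], classifier=lambda c: 0.9).predict_image(img)` yields the per-image score and cell count that a downstream decision rule would consume.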
[RESULTS] On the independent test set, the proposed 2-stage method achieved high performance in classifying images for malignancy, yielding an accuracy of 93.3%, an area under the precision-recall curve (AUC-PR) of 0.958, and an area under the receiver operating characteristic curve (AUC-ROC) of 0.962.
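The reported metrics (AUC-ROC and AUC-PR) can be reproduced from per-image scores with short stdlib-only implementations; in practice one would use `sklearn.metrics.roc_auc_score` and `average_precision_score`, so the functions below are illustrative reimplementations, not the authors' evaluation code.

```python
from typing import List


def roc_auc(labels: List[int], scores: List[float]) -> float:
    """AUC-ROC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive outscores a randomly chosen negative
    (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


def pr_auc(labels: List[int], scores: List[float]) -> float:
    """AUC-PR as average precision: the mean of precision evaluated at
    each true positive, scanning predictions from highest score down."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    ap = 0.0
    for i in order:
        if labels[i] == 1:
            tp += 1
            ap += tp / (tp + fp)  # precision at this new true positive
        else:
            fp += 1
    return ap / sum(labels)
```

For a perfect ranking both functions return 1.0; values of 0.962 (AUC-ROC) and 0.958 (AUC-PR), as reported, indicate that malignant images are ranked above normal ones in the large majority of positive-negative pairs.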
[CONCLUSION] The diagnostic accuracy of the proposed AI method is comparable to that of traditional expert-involved ROSE, while offering the potential to significantly reduce diagnostic time and dependency on immediate pathologist availability. This 2-stage deep learning approach effectively analyzes variable mobile phone-captured EUS-FNA images, demonstrating its potential as an automated, AI-assisted diagnostic system for the ROSE process. It holds significant potential for clinical application, particularly in facilitating timely pancreatic cancer diagnosis in resource-constrained settings and in scenarios where on-site pathological expertise is limited.