
A navigation-guided 3D breast ultrasound scanning and reconstruction system for automated multi-lesion spatial localization and diagnosis.

Medical Image Analysis 📖 Journal · 2026 · Vol. 110 · p. 103965 · cited by 1
TL;DR An intelligent navigation-guided breast ultrasound scanning system delivering seamless 3D reconstruction, nipple-centric lesion localization, and video-based malignancy prediction with full adaptation to the routine workflow is developed by coupling precise 3D spatial annotation with foundation-model-enhanced spatiotemporal characterization.
Sources: PubMed · DOI · OpenAlex · Semantic Scholar (last updated 2026-04-29)
OpenAlex topics · AI in cancer detection · Ultrasound Imaging and Elastography · Breast Lesions and Carcinomas

Zhang Y, Yan Y, Wang K, Cai M, Xiang Y, Guo Y, Tu P, Ying T, Chen X


🔬 Key clinical statistics (auto-extracted from the abstract; verify against the original)
  • p-value p < 0.0001

Cite this paper

APA: Zhang, Y., Yan, Y., Wang, K., Cai, M., Xiang, Y., Guo, Y., Tu, P., Ying, T., & Chen, X. (2026). A navigation-guided 3D breast ultrasound scanning and reconstruction system for automated multi-lesion spatial localization and diagnosis. Medical Image Analysis, 110, 103965. https://doi.org/10.1016/j.media.2026.103965
MLA: Zhang, Yi, et al. "A navigation-guided 3D breast ultrasound scanning and reconstruction system for automated multi-lesion spatial localization and diagnosis." Medical Image Analysis, vol. 110, 2026, p. 103965.
PMID: 41633129

Abstract

Handheld ultrasound (HHUS) is indispensable for breast cancer screening but remains compromised by operator-dependent acquisition, subjective 2D interpretation and clock-face annotation. Existing spatial tracking systems for HHUS typically lack integration, adaptability, flexibility, and robust 3D representation. Additionally, current deep learning diagnostic methods are predominantly based on single ultrasound images, whereas video-based malignancy classification approaches suffer from limited temporal interpretability. In this study, we develop an intelligent navigation-guided breast ultrasound scanning system delivering seamless 3D reconstruction, nipple-centric lesion localization, and video-based malignancy prediction with full adaptation to the routine workflow. Specifically, a Hybrid Lesion-informed Spatiotemporal Transformer (HLST) is proposed to selectively fuse intra- and peri-lesional dynamics augmented from a prompt-driven BUS-SAM-2 foundation model for sequence-level classification. Moreover, a geometry-adaptive clock projection and analysis method is designed to enable automated standardized clock-face orientation and lesion-to-nipple distance measurement for breasts of arbitrary shape, eliminating patient-attached fiducials or pre-marked landmarks. Validation on three breast phantoms demonstrated high correlations with CT reference (r > 0.99 for distance, r > 0.97 for 3D size, and r = 1.00 for clockwise angle; p < 0.0001). Clinical evaluation in 43 female patients (30 abnormal breasts) yielded median clock-face orientation and size discrepancies of 0 h and 0.7 mm × 0.6 mm, respectively, versus conventional reports. Meanwhile, HLST achieved superior performance (86.1% accuracy) on the BUV dataset. By coupling precise 3D spatial annotation with foundation-model-enhanced spatiotemporal characterization, the proposed system offers a reliable, streamlined workflow that standardizes follow-up, guides biopsies, and promotes diagnostic confidence in HHUS practice.
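The nipple-centric annotation described in the abstract locates each lesion by a clock-face hour around the nipple plus a lesion-to-nipple distance. The abstract does not disclose the paper's geometry-adaptive projection, so the sketch below is only a minimal illustration of the basic idea: project the nipple-to-lesion offset into a coronal plane, measure the angle clockwise from 12 o'clock, and quantize it to the nearest hour. The coordinate conventions and the `clock_face_annotation` helper are assumptions for illustration, not the authors' method.

```python
import math

def clock_face_annotation(nipple, lesion, side="left"):
    """Map a lesion position to (clock hour, in-plane distance from nipple).

    nipple, lesion: (x, y, z) tuples in an assumed patient-based frame where
    +x points toward the patient's left, +y points superior (toward the head),
    and z is the anterior-posterior depth axis (ignored by the projection).
    Returns (hour in 1..12, distance in the same units as the input).
    """
    dx = lesion[0] - nipple[0]
    dy = lesion[1] - nipple[1]
    # Mirror the lateral axis for the right breast so the clock face is
    # always read as the examiner faces the patient.
    if side == "right":
        dx = -dx
    # Angle measured clockwise from 12 o'clock (the superior direction).
    angle = math.degrees(math.atan2(dx, dy)) % 360.0
    hour = round(angle / 30.0) % 12  # 30 degrees per clock hour
    if hour == 0:
        hour = 12
    distance = math.hypot(dx, dy)  # in-plane lesion-to-nipple distance
    return hour, distance
```

For example, a lesion offset 20 mm directly superior to the nipple maps to 12 o'clock at 20 mm; one offset laterally on the left breast maps to 3 o'clock. A real system would additionally account for breast curvature, which is presumably what the paper's geometry-adaptive projection addresses.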
