
Trustworthy tree-based machine learning by MoS₂ flash-based analog content-addressable memory with inherent soft boundaries

Nature Communications, 2026 · Open Access

Wen B, Gao G, Xu Z, Jiang M, Mao R, Qi X, Chen J, Yin X, Hu XS, Li C


DOI 10.1038/s41467-026-72118-z
PMID 42031751

Abstract

The rapid advancement of artificial intelligence has raised concerns regarding its trustworthiness, especially in terms of interpretability and robustness. Tree-based models such as Random Forest excel in interpretability and accuracy on tabular data, but scaling them remains computationally expensive due to poor data locality and high data dependence. Previous efforts to accelerate these models with analog content-addressable memory (CAM) have struggled because the sharp decision boundaries they must implement are highly susceptible to device variations, leading to poor hardware performance and vulnerability to adversarial attacks. Here, we present a hardware-software co-design approach using MoS₂ flash-based analog CAM with inherent soft boundaries, enabling efficient inference with soft tree-based models. Our fabricated analog CAM arrays achieve 96% accuracy on the Wisconsin Diagnostic Breast Cancer dataset, and our experimentally calibrated model shows only a 0.6% accuracy drop on MNIST under 10% device threshold variation, compared with 45.3% for traditional decision trees.
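The robustness gap the abstract describes can be illustrated with a toy simulation (this is an illustrative sketch, not the paper's implementation: the two-row CAM, the feature ranges, the sigmoid steepness `BETA`, and the noise level `SIGMA` are all assumptions). Each CAM row stores a `[low, high]` range per feature. A hard match requires every feature to fall inside its range, so one slightly shifted bound can kill the match entirely; a soft match sums graded, sigmoid-shaped containment scores across features and takes the best-scoring row, so a shifted bound only lowers the score slightly.

```python
import math
import random

random.seed(42)

BETA = 10.0   # steepness of the soft boundary (hypothetical value)
SIGMA = 0.1   # std-dev of per-bound noise, ~10% of full scale (assumption)
D = 4         # features per CAM row

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Two CAM rows, one per class: each stores a [low, high] range per feature.
rows = [
    ("class0", [(0.0, 0.5)] * D),
    ("class1", [(0.5, 1.0)] * D),
]

def hard_match(x, bounds):
    # Sharp boundary: the row matches only if EVERY feature is in range.
    return all(lo <= xi <= hi for xi, (lo, hi) in zip(x, bounds))

def soft_match(x, bounds):
    # Soft boundary: graded containment summed over features, so one
    # slightly-off bound lowers the score instead of killing the match.
    return sum(sigmoid(BETA * (xi - lo)) * sigmoid(BETA * (hi - xi))
               for xi, (lo, hi) in zip(x, bounds))

def predict_hard(x, cam_rows):
    for label, bounds in cam_rows:
        if hard_match(x, bounds):
            return label
    return None  # no exact match: classification fails

def predict_soft(x, cam_rows):
    return max(cam_rows, key=lambda r: soft_match(x, r[1]))[0]

def perturb(cam_rows):
    # Model device threshold variation as Gaussian noise on every bound.
    return [(label, [(lo + random.gauss(0, SIGMA),
                      hi + random.gauss(0, SIGMA))
                     for lo, hi in bounds])
            for label, bounds in cam_rows]

# Toy dataset: each sample drawn inside its class's stored range.
data = []
for _ in range(200):
    data.append(([random.uniform(0.0, 0.5) for _ in range(D)], "class0"))
    data.append(([random.uniform(0.5, 1.0) for _ in range(D)], "class1"))

def accuracy(predict, cam_rows):
    return sum(predict(x, y_rows) == y
               for (x, y), y_rows in ((d, cam_rows) for d in data)) / len(data)

noisy = perturb(rows)
print("hard, clean :", accuracy(predict_hard, rows))
print("soft, clean :", accuracy(predict_soft, rows))
print("hard, noisy :", accuracy(predict_hard, noisy))
print("soft, noisy :", accuracy(predict_soft, noisy))
```

Both matchers are perfect on clean ranges, but under noisy bounds the hard AND-style match loses whole samples whenever any one of the 2·D bounds shifts past a feature value, while the soft argmax degrades gracefully — the same qualitative effect, on a toy scale, as the 45.3% vs 0.6% MNIST accuracy drop reported in the abstract.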
