In this repository, we will collect and document startup companies, researchers, and their outstanding work related to autonomous medical ultrasound systems.
-
Robotic ultrasound imaging: State-of-the-art and future perspectives, Medical Image Analysis (2023.10).
Zhongliang Jiang, Septimiu E. Salcudean, Nassir Navab. [Paper][Code]
-
Video-based AI for beat-to-beat assessment of cardiac function, Nature (2020).
David Ouyang, Bryan He, Amirata Ghorbani, Neal Yuan, Joseph Ebinger, Curt P. Langlotz, Paul A. Heidenreich, Robert A. Harrington, David H. Liang, Euan A. Ashley, and James Y. Zou. [Dataset][Paper][Code]
Description: The dataset contains 10,030 apical-4-chamber echocardiography videos from individuals who underwent imaging between 2016 and 2018 as part of routine clinical care at Stanford University Hospital. Each video was cropped and masked to remove text and information outside of the scanning sector. The resulting images were then downsampled by cubic interpolation into standardized 112x112 pixel videos.
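The preprocessing described above (masking out everything outside the scan sector, then cubic-interpolation downsampling to 112x112) can be sketched roughly as follows; the `preprocess_frame` helper, the mask handling, and the input frame size are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np
from scipy.ndimage import zoom

def preprocess_frame(frame, mask, out_size=112):
    """Zero out pixels outside the scan sector, then downsample by cubic interpolation.
    `mask` is a hypothetical boolean array marking the ultrasound sector."""
    masked = np.where(mask, frame, 0.0)
    fy = out_size / masked.shape[0]
    fx = out_size / masked.shape[1]
    return zoom(masked, (fy, fx), order=3)  # order=3 -> cubic spline interpolation

# Example: a fake 600x600 grayscale frame with a full-frame mask
frame = np.random.rand(600, 600)
mask = np.ones_like(frame, dtype=bool)
small = preprocess_frame(frame, mask)
print(small.shape)  # (112, 112)
```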
-
Straight to the point: reinforcement learning for user guidance in ultrasound, arXiv:1903.00586 (2019).
Fausto Milletari, Vighnesh Birodkar, Michal Sofka. [Paper][Code]
- Method: deep reinforcement learning in a simulated training environment built from ultrasound image and probe pose pairs collected over a 7x7 grid of spatial bins on patients' chests. Rotations and tilts are collected only in the bin marked as "correct", since collecting them in every bin would be prohibitively time-consuming.
- Data: trained on 22 environments (~160,000 images) and tested on 5 environments (~40,000 images).
- Criteria: whether the model provides correct guidance towards the target bin; this criterion may be appropriate for probe guidance, but not for autonomous robotic plane localization.
- Experiments: conducted entirely offline; since the simulated environment is a simplified version of real-world conditions, performance in the real world remains unclear.
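The 7x7-bin setup and its "correct guidance" criterion can be illustrated with a minimal sketch; the grid coordinates, action encoding, and target bin below are hypothetical, not taken from the paper:

```python
GRID = 7           # 7x7 spatial bins on the chest
TARGET = (3, 3)    # hypothetical bin labelled "correct"

# Discrete probe movements: 0=up, 1=down, 2=left, 3=right
ACTIONS = {0: (-1, 0), 1: (1, 0), 2: (0, -1), 3: (0, 1)}

def correct_guidance(bin_pos, target=TARGET):
    """Return the set of actions that move the probe one bin closer to the target.
    Mirrors the evaluation criterion: guidance is 'correct' if it points toward
    the target bin, not whether a precise plane is reached."""
    dr = target[0] - bin_pos[0]
    dc = target[1] - bin_pos[1]
    good = set()
    if dr < 0:
        good.add(0)  # target is above
    if dr > 0:
        good.add(1)  # target is below
    if dc < 0:
        good.add(2)  # target is to the left
    if dc > 0:
        good.add(3)  # target is to the right
    return good

print(correct_guidance((6, 1)))  # {0, 3}: move up and right toward (3, 3)
```

An agent's suggested action at each bin can then be scored against this set, which is exactly why the metric says little about fine-grained plane localization.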
-
Automatic Probe Movement Guidance for Freehand Obstetric Ultrasound, MICCAI (2020).
Richard Droste, Lior Drukker, Aris T. Papageorghiou, J. Alison Noble. [Paper][Code]
- Method: supervised learning with RNN architecture.
- Data: 5,079 demonstrations of standard-plane acquisitions from 464 second- and third-trimester scans performed by 17 accredited sonographers.
- Criteria: whether the model provides correct guidance towards the target; this criterion may be appropriate for probe guidance, but not for autonomous robotic plane localization.
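A rough sketch of how a recurrent model can map a sequence of per-frame features to a movement suggestion; the GRU dimensions, the 4-way action head, and all names here are illustrative assumptions rather than the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUGuidance:
    """Toy GRU that consumes per-frame feature vectors and outputs a
    distribution over probe-movement classes (hypothetical setup)."""
    def __init__(self, d_in, d_h, n_actions=4):
        s = 0.1
        self.Wz = rng.normal(0, s, (d_h, d_in + d_h))  # update gate
        self.Wr = rng.normal(0, s, (d_h, d_in + d_h))  # reset gate
        self.Wh = rng.normal(0, s, (d_h, d_in + d_h))  # candidate state
        self.Wo = rng.normal(0, s, (n_actions, d_h))   # action head
        self.d_h = d_h

    def forward(self, seq):
        h = np.zeros(self.d_h)
        for x in seq:                      # one GRU step per frame
            xh = np.concatenate([x, h])
            z = sigmoid(self.Wz @ xh)
            r = sigmoid(self.Wr @ xh)
            h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))
            h = (1 - z) * h + z * h_cand
        logits = self.Wo @ h
        e = np.exp(logits - logits.max())
        return e / e.sum()                 # softmax over movement classes

model = GRUGuidance(d_in=16, d_h=32)
probs = model.forward(rng.normal(size=(10, 16)))  # a 10-frame clip
print(probs.shape)
```

In practice the per-frame features would come from a CNN over the ultrasound image (plus probe pose), and the model would be trained on the sonographers' demonstrations.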
-
USFM: A Universal Ultrasound Foundation Model Generalized to Tasks and Organs towards Label Efficient Image Analysis, arXiv:2401.00153 (2024).
Jing Jiao, Jin Zhou, Xiaokang Li, Menghua Xia, Yi Huang, Lihong Huang, Na Wang, Xiaofan Zhang, Shichong Zhou, Yuanyuan Wang, Yi Guo. [Paper][Code]
Insight: frequency-domain information may be an effective way to distinguish visually similar ultrasound images, as shown in Fig. 1(c).
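As a loose illustration of this insight, one could compare images via a radial energy profile of their 2D Fourier spectra; `spectral_signature` and its banding scheme are hypothetical, not the method used in USFM:

```python
import numpy as np

def spectral_signature(img, bands=8):
    """Hypothetical descriptor: mean Fourier magnitude in concentric
    frequency bands. Images that look alike spatially can still have
    distinct frequency-energy profiles."""
    f = np.fft.fftshift(np.fft.fft2(img))
    mag = np.abs(f)
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)   # distance from the DC component
    rmax = r.max()
    sig = np.array([
        mag[(r >= rmax * b / bands) & (r < rmax * (b + 1) / bands)].mean()
        for b in range(bands)
    ])
    return sig / sig.sum()                 # normalized band-energy profile

img = np.random.rand(64, 64)
sig = spectral_signature(img)
print(sig.shape)  # (8,)
```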