Automatic Detection of Breast Lesions in Automated 3D Breast Ultrasound with Cross-Organ Transfer Learning

Project Leaders

Zhengrui Huang

Partner Organisations

Hangzhou First People's Hospital

Breast cancer remains a major threat to women's health, and no treatment is universally effective, which makes early detection and diagnosis pivotal for reducing mortality. Over the past few decades, imaging techniques including X-ray mammography, ultrasound, and magnetic resonance imaging (MRI) have been widely deployed to provide detailed views of breast tissue and streamline the detection of breast cancer. While these methods excel at screening for breast tumors, they are poorly suited to monitoring patients across disease stages and cannot predict disease before it manifests.

Moreover, the etiology of breast cancer remains incompletely understood. The human body is a complex, interconnected system in which the states of different organs often reflect related disease processes, so the states and developments of different body parts can be leveraged to predict breast cancer. The widespread adoption of electronic health record (EHR) systems, which store extensive medical data across long periods, offers a significant opportunity for early prediction of breast cancer. Using large-scale longitudinal health records, this project aims to develop an early prediction system for breast cancer and to uncover the evolving patterns of this complex disease.

Project Example


SAMASK-CLTR: A spatial-aware mask-guided learning model for benign and malignant tumor classification in ABUS/ABVS

Project Leaders

Peirong Xu

We propose SAMASK-CLTR (Spatial-Aware Mask Prompting with Convolutional Transformer Architecture), a hybrid framework that combines the feature-extraction power of CNNs with the global modeling capability of Transformers. In our approach, ResNet-50 extracts hierarchical, multi-scale features that are refined by a Transformer encoder-decoder to capture global context. Crucially, during decoding, a mask prompt enhanced with 3D positional encoding guides the network to focus on key tumor regions, directly addressing the challenges of precise localization and classification. Experiments on 6,973 ABUS/ABVS images (6,873 clinical cases from internal datasets and 100 cases from the public ABUS Challenge Cup) demonstrate that SAMASK-CLTR achieves AUCs of 88.45% and 70.46% on the internal and external datasets, respectively. These results highlight the potential of our framework to enhance breast cancer diagnosis by improving the accuracy and reliability of lesion classification.
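The mask-prompting idea described above can be sketched in miniature: a sinusoidal positional encoding over 3D voxel coordinates, and an attention step whose logits receive an additive bias at mask-flagged (tumor) positions so that decoding concentrates on those regions. This is a minimal NumPy illustration of the general mechanism, not the SAMASK-CLTR implementation; the function names, the additive-bias scheme, and all shapes are assumptions for demonstration.

```python
import numpy as np

def positional_encoding_3d(coords, d_model):
    """Sinusoidal encoding over (z, y, x) voxel coordinates.

    coords: (N, 3) float array of voxel positions.
    d_model: embedding size, must be divisible by 6 (sin+cos per axis).
    Returns an (N, d_model) encoding.
    """
    assert d_model % 6 == 0
    d_axis = d_model // 3
    freqs = 1.0 / (10000 ** (np.arange(0, d_axis, 2) / d_axis))
    parts = []
    for axis in range(3):
        ang = coords[:, axis:axis + 1] * freqs  # (N, d_axis/2)
        parts.append(np.sin(ang))
        parts.append(np.cos(ang))
    return np.concatenate(parts, axis=1)

def mask_prompted_attention(q, k, v, mask, bias=4.0):
    """Scaled dot-product attention with an additive logit bias.

    mask: (N_k,) array, 1.0 at positions flagged by the mask prompt.
    The bias raises the attention weight of flagged keys, steering
    every query toward the masked (tumor) region.
    """
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d) + bias * mask  # (N_q, N_k)
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v, w

# Toy usage: 8 voxels with random features plus 3D positional encoding;
# the mask flags voxel 2 as a tumor region.
rng = np.random.default_rng(0)
coords = rng.integers(0, 16, size=(8, 3)).astype(float)
feat = rng.standard_normal((8, 12)) + positional_encoding_3d(coords, 12)
mask = np.zeros(8)
mask[2] = 1.0
out, weights = mask_prompted_attention(feat, feat, feat, mask)
```

In the full model the mask prompt would come from a learned segmentation branch and the biased attention would run inside the Transformer decoder; the additive logit bias shown here is simply one common way to inject such a prompt.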

3D Breast Ultrasound Object Detection

Project Leaders

Xin Qian

Breast cancer is one of the most common malignancies among women globally, and early, accurate diagnosis is crucial for improving survival rates. Breast ultrasound (BUS) serves as an essential screening tool because it is radiation-free, cost-effective, and highly sensitive, especially for dense breasts and younger women. However, traditional 2D ultrasound suffers from high operator dependency, significant noise, and ambiguous lesion boundaries, which limit its accuracy in distinguishing benign from malignant lesions. In contrast, 3D breast ultrasound provides comprehensive lesion information, including morphology, margins, and internal echogenicity, overcoming some of these limitations. This study aims to develop a deep learning-based 3D object detection framework to enhance BUS effectiveness, focusing on accurate lesion localization, benign-malignant classification, and few-shot learning. Innovations include multi-scale 3D feature fusion, dynamic enhancement modeling, and interpretability methods to improve diagnostic precision and reliability.
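The multi-scale 3D feature fusion mentioned above can be sketched as a top-down pyramid merge, in the spirit of a feature pyramid network: coarser feature volumes are upsampled and added to finer ones, so each scale carries both local detail and global context before detection heads run on it. This is a minimal NumPy sketch under assumed (C, D, H, W) volume shapes; the function names and the nearest-neighbour upsampling choice are illustrative assumptions, not the project's actual design.

```python
import numpy as np

def upsample3d_nearest(vol, factor=2):
    """Nearest-neighbour upsampling of a (C, D, H, W) feature volume."""
    return vol.repeat(factor, axis=1).repeat(factor, axis=2).repeat(factor, axis=3)

def fuse_pyramid(levels):
    """Top-down fusion of a 3D feature pyramid.

    levels: list of (C, D, H, W) volumes, finest resolution first,
    each level half the spatial size of the previous one.
    Returns one fused volume per level (finest first): each coarser
    fused map is upsampled and added into the next finer level.
    """
    fused = [levels[-1]]  # start from the coarsest level
    for finer in reversed(levels[:-1]):
        fused.append(finer + upsample3d_nearest(fused[-1]))
    return list(reversed(fused))

# Toy usage: a 3-level pyramid of constant feature volumes.
c = 4
pyramid = [
    np.ones((c, 8, 8, 8)),  # fine
    np.ones((c, 4, 4, 4)),  # mid
    np.ones((c, 2, 2, 2)),  # coarse
]
fused = fuse_pyramid(pyramid)
```

With all-ones inputs the finest fused map accumulates contributions from every level, which makes the top-down information flow easy to verify by hand.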

Project Example

A predicted case from our model; the red box marks possible lesions identified by the model.