Two-stage CNNs for BI-RADS classification of computer-processed breast ultrasound images (BioMedical Engineering OnLine)

1. Summary

Background: Automatically assigning Breast Imaging Reporting and Data System (BI-RADS) categories from a single ultrasound modality remains a challenge. To address this, we propose a two-stage grading system based on convolutional neural networks (CNNs) that automatically assigns breast tumors in ultrasound images to five BI-RADS categories.

Methods: The proposed automatic grading system consists of two stages: tumor identification and tumor grading. The tumor recognition network, called ROI-CNN, identifies tumor-containing regions in raw breast ultrasound images. The subsequent tumor classification network (G-CNN) then extracts discriminative features to classify the identified regions of interest (ROIs) into Category "3", Category "4A", Category "4B", Category "4C", and Category "5". In particular, to make the regions predicted by the ROI-CNN fit the tumors more closely, we insert a level-set-based refinement step between the identification and grading stages.
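The two-stage flow described above can be sketched in plain Python. This is a minimal illustration only, not the authors' implementation: the real ROI-CNN and G-CNN are deep networks, so `roi_from_mask` (bounding box from a predicted tumor mask) and `grade_roi` (a toy linear classifier over hand-picked intensity features) are hypothetical stand-ins, and the level-set refinement is omitted.

```python
import numpy as np

BIRADS_CATEGORIES = ["3", "4A", "4B", "4C", "5"]

def roi_from_mask(mask):
    """Stage 1 stand-in: derive a bounding box (y0, y1, x0, x1) from a
    binary tumor mask, playing the role of the ROI-CNN's localization."""
    ys, xs = np.nonzero(mask)
    return ys.min(), ys.max() + 1, xs.min(), xs.max() + 1

def softmax(logits):
    z = np.exp(logits - logits.max())
    return z / z.sum()

def grade_roi(roi, weights, bias):
    """Stage 2 stand-in: a toy linear classifier over simple intensity
    statistics, mapping the cropped ROI to the five BI-RADS categories."""
    feat = np.array([roi.mean(), roi.std(), float(roi.size)])
    return softmax(weights @ feat + bias)

def two_stage_grade(image, mask, weights, bias):
    y0, y1, x0, x1 = roi_from_mask(mask)   # identification stage
    roi = image[y0:y1, x0:x1]              # crop the detected region
    probs = grade_roi(roi, weights, bias)  # grading stage
    return BIRADS_CATEGORIES[int(probs.argmax())], probs
```

The point of the structure is the decoupling: the localization stage only has to produce a tumor region, and the grading stage only ever sees cropped ROIs, so each network can specialize.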

Results: We tested the proposed two-stage grading system on 2238 breast tumor ultrasound images. Using accuracy as the metric, our automated assessment of breast tumor grade was comparable to the physicians' subjective categorization. Experimentally, the two-stage framework achieved correct rates of 0.998 for Category "3", 0.940 for "4A", 0.734 for "4B", 0.922 for "4C", and 0.876 for "5".
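The per-category correct rate used above is simply, for each category, the fraction of images with that ground-truth label that the system assigned to it. A small sketch (the function name is illustrative, not from the paper):

```python
from collections import Counter

def per_category_accuracy(y_true, y_pred):
    """Correct rate per ground-truth category: for each label, the share
    of its samples that were predicted with exactly that label."""
    correct, total = Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        correct[t] += int(t == p)
    return {c: correct[c] / total[c] for c in total}
```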

Conclusion: This scheme extracts effective features from breast ultrasound images by decoupling the recognition features and the classification features into separate neural networks.

Origin blog.csdn.net/qq_40108803/article/details/116273519