
Details

Author(s) / Contributors
Title
SD-CNN: A shallow-deep CNN for improved breast cancer diagnosis
Is part of
  • Computerized medical imaging and graphics, 2018-12, Vol.70, p.53-62
Place / Publisher
United States: Elsevier Ltd
Year of publication
2018
Source
Access via ScienceDirect (Elsevier)
Descriptions / Notes
  • Highlights:
    • Implement a state-of-the-art deep CNN as a feature generator for CEDM image diagnosis.
    • Train a shallow CNN to synthesize the CEDM recombined image.
    • Improve diagnosis performance by combining the synthetic recombined image with FFDM images.
  • Abstract: Breast cancer is the second leading cause of cancer death among women worldwide. Nevertheless, it is also one of the most treatable malignancies if detected early. Screening for breast cancer with full-field digital mammography (FFDM) is widely used; however, it shows limited performance for women with dense breasts. An emerging technology in the field is contrast-enhanced digital mammography (CEDM), which includes a low-energy (LE) image similar to FFDM and a recombined image that leverages tumor neoangiogenesis, similar to breast magnetic resonance imaging (MRI). CEDM has shown better diagnostic accuracy than FFDM. While promising, CEDM is not yet widely available across medical centers. In this research, we propose a Shallow-Deep Convolutional Neural Network (SD-CNN), in which a shallow CNN is developed to derive “virtual” recombined images from LE images, and a deep CNN is employed to extract novel features from LE, recombined, or “virtual” recombined images for ensemble models that classify cases as benign vs. cancer. To evaluate the validity of our approach, we first develop a deep CNN using 49 CEDM cases collected from Mayo Clinic to demonstrate the contribution of recombined images to improved breast cancer diagnosis (accuracy 0.85, AUC 0.84 using LE imaging vs. accuracy 0.89, AUC 0.91 using both LE and recombined imaging). We then develop a shallow CNN using the same 49 CEDM cases to learn the nonlinear mapping from LE to recombined images. Next, we use 89 FFDM cases from INbreast, a public database, to generate “virtual” recombined images. Using FFDM alone provides an accuracy of 0.84 (AUC = 0.87), whereas SD-CNN improves the diagnostic accuracy to 0.90 (AUC = 0.92).
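  • Illustrative sketch: to make the two-stage pipeline described in the abstract concrete, the following minimal PyTorch sketch shows the idea of a shallow CNN that synthesizes a “virtual” recombined image from an LE/FFDM image and a deep CNN used as a fixed feature generator, whose concatenated features would then be fed to an ensemble classifier (benign vs. cancer). The layer sizes, the ResNet-18 backbone, and all function names are illustrative assumptions; the paper's actual architectures and classifiers are not given in this record.

    # Minimal sketch of the SD-CNN pipeline outlined in the abstract.
    # NOTE: layer sizes, the ResNet-18 backbone, and the feature layout are
    # assumptions for illustration, not details taken from the paper.
    import torch
    import torch.nn as nn
    from torchvision import models


    class ShallowCNN(nn.Module):
        """Shallow CNN mapping a low-energy (LE) mammogram to a
        'virtual' recombined image (nonlinear image-to-image mapping)."""

        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 1, kernel_size=3, padding=1),  # synthetic recombined image
            )

        def forward(self, le_image):
            return self.net(le_image)


    class DeepFeatureExtractor(nn.Module):
        """Deep CNN (here an ImageNet-pretrained ResNet-18, an assumption)
        used only as a fixed feature generator."""

        def __init__(self):
            super().__init__()
            backbone = models.resnet18(weights="IMAGENET1K_V1")
            # Drop the final classification layer; keep the pooled features.
            self.features = nn.Sequential(*list(backbone.children())[:-1])

        def forward(self, x):
            # Replicate the single-channel mammogram to 3 channels for the backbone.
            x = x.repeat(1, 3, 1, 1)
            return self.features(x).flatten(1)


    def sd_cnn_features(le_image, shallow_cnn, extractor):
        """Concatenate deep features from the LE image and from the
        'virtual' recombined image synthesized by the shallow CNN."""
        with torch.no_grad():
            virtual_recombined = shallow_cnn(le_image)
            f_le = extractor(le_image)
            f_vr = extractor(virtual_recombined)
        # The concatenated vector would be passed to an ensemble classifier.
        return torch.cat([f_le, f_vr], dim=1)


    if __name__ == "__main__":
        le = torch.randn(2, 1, 224, 224)  # two dummy LE mammogram patches
        feats = sd_cnn_features(le, ShallowCNN().eval(), DeepFeatureExtractor().eval())
        print(feats.shape)  # torch.Size([2, 1024])

    In this sketch the deep CNN is kept frozen and acts purely as a feature generator, matching the abstract's description of extracting features from LE and “virtual” recombined images for downstream ensemble models.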
