A Novel Transfer-Learning Model for Automatic Detection and Classification of Breast Cancer Based Deep CNN
Kafrelsheikh Journal of Information Sciences
Article 8, Volume 2, Issue 1, August 2021, Pages 1-9
Document Type: Original Article
DOI: 10.21608/kjis.2021.192207
Authors
Abeer Saber 1; Mohamed Sakr 2; Osama Abou-Seida 3; Arabi Keshk 4
1 Department of Computer Science, Faculty of Computers and Information, Kafrelsheikh University, Egypt
2 Department of Computer Science, Faculty of Computers and Information, Menoufia University, Egypt
3 Department of Computer Science, Faculty of Computers and Information Sciences, Kafrelsheikh University, Egypt
4 Faculty of Computers and Information, Menoufia University, Egypt
Abstract
Breast cancer (BC), in which breast cells grow out of control, is a leading cause of cancer death among women. Early detection of BC increases the likelihood of survival by enabling patients to receive timely care. In this context, a new deep learning (DL) model is presented for the automatic detection and classification of suspicious breast regions based on the transfer learning (TL) technique. Pre-trained Visual Geometry Group (VGG)-19, VGG16, and InceptionV3 networks are used in the presented model to transfer their learned parameters and improve the performance of breast tumor classification. The main goals of this work are to automatically determine the affected breast tumor region using segmentation, to reduce training time, and to improve classification performance. In the presented model, the Mammographic Image Analysis Society (MIAS) dataset is used for extracting the breast tumor features. Four evaluation metrics are used to assess the presented model: accuracy, sensitivity, specificity, and area under the ROC curve (AUC). The experiments showed that transferring parameters from VGG16 is more powerful for BC classification than VGG19 and InceptionV3, with an overall specificity, accuracy, sensitivity, and AUC of 98%, 96.8%, 96%, and 0.99, respectively.
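The transfer-learning setup described in the abstract — reusing the convolutional parameters of a pre-trained VGG16 and attaching a new classifier head for tumor classes — can be sketched in Keras as follows. This is a minimal illustration under assumed hyperparameters (head size, dropout rate, optimizer); it is not the authors' exact architecture or training configuration.

```python
# Sketch of VGG16-based transfer learning for breast tumor classification.
# Assumptions (not from the paper): 224x224 RGB inputs, a 256-unit dense
# head with dropout, and the Adam optimizer.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16


def build_tl_model(input_shape=(224, 224, 3), num_classes=2, weights="imagenet"):
    # Load VGG16 without its original 1000-class ImageNet head.
    base = VGG16(weights=weights, include_top=False, input_shape=input_shape)
    # Freeze the transferred convolutional parameters so only the new
    # classifier head is trained, which shortens training time.
    base.trainable = False
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_tl_model()
    model.summary()
```

The same pattern applies to VGG19 or InceptionV3 by swapping the `VGG16` import for the corresponding class in `tensorflow.keras.applications`.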