Application of Improved NASNet Algorithm in Breast Ultrasound Diagnosis
Yi Sanli1,2*, She Furong1,2, Yang Xuelian1,2, Chen Dong3, Luo Xiaomao3*
1(School of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650504, China)
2(Key Laboratory of Computer Technology Application of Yunnan Province, Kunming 650504, China)
3(Department of Ultrasound Medicine, Yunnan Cancer Hospital, Kunming 650118, China)
Abstract  Ultrasound imaging is of great significance in the clinical diagnosis of breast diseases; however, breast ultrasound images have low resolution and the available sample sizes are small. Although NASNet is suited to small-sample data, its large number of parameters makes it difficult to train. This paper proposed an improved NASNet classification algorithm to distinguish benign from malignant breast masses. First, NASNet was pre-trained on ImageNet using transfer learning, and the learned features were transferred directly to benign and malignant tumor recognition on breast ultrasound images, which reduced the computational cost and improved accuracy. Then, to enhance the network's ability to extract ultrasound image features while keeping it lightweight, depthwise separable convolutions were deeply integrated into NASNet to construct a larger-scale network. Finally, to strengthen the weights of the features most relevant to the disease and further improve the extraction of high-order feature information, an SE (squeeze-and-excitation) module was added to select the channel features that carry greater weight in ultrasound images. To verify the algorithm, 5-fold cross-validation was performed in experiments on a local hospital dataset and on public datasets, and the algorithm was compared with widely used classification algorithms. The local hospital dataset contained 1 350 ultrasound images, and the two public datasets contained 895 ultrasound images. On the local hospital data, the Acc, Sen, and F1 were all 97.52%. When the public datasets were used for training and validation and the local hospital dataset was used for testing, the Acc, Sen, and F1 were 96.31%, 96.31%, and 96.39%, respectively. In the mixed-data experiment combining the local hospital data and the public data, the Acc, Sen, and F1 were all 98.27%. The results showed that the improved algorithm outperformed the other algorithms and was better suited to classifying benign and malignant breast tumors from a small number of ultrasound images.
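The three components described in the abstract (an ImageNet-pre-trained NASNet backbone used for transfer learning, a depthwise separable convolution stage, and an SE channel-attention module) can be sketched in tf.keras as follows. This is only a minimal illustration of the general idea, not the authors' implementation: the NASNetMobile backbone, the 224×224×3 input size, the 512-filter separable convolution, and the SE reduction ratio of 16 are all assumptions not specified in the abstract.

```python
# Minimal sketch of the described pipeline (assumed configuration, not the paper's code).
import tensorflow as tf
from tensorflow.keras import layers, Model

def se_block(x, ratio=16):
    """Squeeze-and-excitation: re-weight channels by their global importance."""
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                  # squeeze
    s = layers.Dense(channels // ratio, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)     # excitation
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                        # channel re-scaling

def build_model(input_shape=(224, 224, 3)):
    # Transfer learning: reuse NASNet weights learned on ImageNet and freeze them,
    # so only the newly added layers are trained at first.
    backbone = tf.keras.applications.NASNetMobile(
        include_top=False, weights="imagenet", input_shape=input_shape)
    backbone.trainable = False

    inputs = layers.Input(shape=input_shape)
    x = backbone(inputs, training=False)
    # Depthwise separable convolution keeps the added stage lightweight.
    x = layers.SeparableConv2D(512, 3, padding="same", activation="relu")(x)
    x = se_block(x)                                         # SE channel attention
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)      # benign vs. malignant
    return Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```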
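The 5-fold cross-validation protocol mentioned in the abstract could be run along the lines below, reusing build_model from the previous sketch. The arrays X and y are synthetic stand-ins for the ultrasound images and benign/malignant labels, and the epoch and batch-size values are arbitrary assumptions rather than the paper's settings.

```python
# Sketch of 5-fold cross-validation for the benign/malignant classifier.
# X and y are synthetic placeholders; training settings are illustrative only.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, recall_score, f1_score

X = np.random.rand(40, 224, 224, 3).astype("float32")   # placeholder images
y = np.random.randint(0, 2, size=40)                     # placeholder labels

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
fold_metrics = []
for train_idx, val_idx in skf.split(X, y):
    model = build_model()                                # fresh model per fold
    model.compile(optimizer="adam", loss="binary_crossentropy")
    model.fit(X[train_idx], y[train_idx], epochs=5, batch_size=8, verbose=0)
    pred = (model.predict(X[val_idx]) > 0.5).astype(int).ravel()
    fold_metrics.append([accuracy_score(y[val_idx], pred),   # Acc
                         recall_score(y[val_idx], pred),     # Sen
                         f1_score(y[val_idx], pred)])        # F1
print("mean Acc/Sen/F1 over 5 folds:", np.mean(fold_metrics, axis=0))
```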
Received: 09 September 2021
Corresponding Authors:
*E-mail: 152514845@qq.com; blueskyluoxiaomao@163.com