Small Intestinal Polyp Detection in Wireless Capsule Endoscopy Images
Fan Shanhui1, Liu Shichen1, Cao E1, Fan Yihong2, Wei Kaihua1, Li Lihua1*
1(College of Life Information Science and Instrument Engineering, Hangzhou Dianzi University, Hangzhou 310018, China) 2(Department of Gastroenterology, Zhejiang Provincial Hospital of Traditional Chinese Medicine, Hangzhou 310006, China)
Abstract: Polyps are among the most common small intestinal diseases. Wireless capsule endoscopy (WCE) is a routine clinical method for diagnosing intestinal diseases. A single WCE examination produces a massive number of images, of which only a few may show abnormalities; reviewing them is therefore time consuming for doctors and prone to false or missed detections. Consequently, an automatic polyp detection method is of great value for supporting doctors with better accuracy and efficiency. This study proposed a novel framework combining deep learning, transfer learning and data augmentation for polyp detection. The dataset used for model training and evaluation contained 6 920 normal images and 6 864 polyp images, augmented from an original dataset of 4 300 normal images and 429 polyp images. Specifically, three convolutional neural networks of different depths (AlexNet, VGGNet and GoogLeNet) were trained from scratch. GoogLeNet achieved the best performance, with a sensitivity of 97.18%, a specificity of 98.78% and an accuracy of 97.99%. However, because training deeper networks requires more time and more powerful hardware, a transfer learning strategy was also adopted by fine-tuning a pretrained AlexNet. This model achieved a high accuracy of 97.74%, a sensitivity of 96.57%, a specificity of 98.89% and an area under the receiver operating characteristic curve (AUC) of 0.996. The proposed method provides an effective way to accurately and automatically detect intestinal polyps with limited training data, lower time cost and modest computing requirements, and has the potential to help doctors efficiently detect intestinal polyps in WCE images.
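As a rough illustration of the transfer learning step summarized above, the sketch below fine-tunes an ImageNet-pretrained AlexNet for two-class normal-vs-polyp classification and then reports sensitivity, specificity, accuracy and AUC. This is a minimal PyTorch sketch, not the authors' implementation; the directory layout, augmentation operations, class-index assignment and hyperparameters are assumptions made only for illustration.

```python
# Minimal sketch: fine-tune an ImageNet-pretrained AlexNet for normal-vs-polyp
# classification. Paths ("wce/train", "wce/val"), augmentations, epoch count and
# hyperparameters are hypothetical; the paper's exact settings may differ.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms
from sklearn.metrics import roc_auc_score

# Simple augmentation (flips/rotations) to illustrate how the polyp set could be enlarged.
train_tf = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomVerticalFlip(),
    transforms.RandomRotation(90),
    transforms.CenterCrop(227),        # AlexNet-sized input
    transforms.ToTensor(),
])
val_tf = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.CenterCrop(227),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("wce/train", train_tf)   # hypothetical folders: normal/, polyp/
val_set = datasets.ImageFolder("wce/val", val_tf)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Transfer learning: start from ImageNet weights and replace the 1000-way output
# layer with a 2-way layer (newer torchvision prefers the weights= argument).
model = models.alexnet(pretrained=True)
model.classifier[6] = nn.Linear(4096, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

for epoch in range(10):                # epoch count is illustrative
    model.train()
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

# Evaluation: sensitivity, specificity, accuracy and AUC, treating "polyp"
# as the positive class (assumed to be class index 1 under ImageFolder).
model.eval()
tp = tn = fp = fn = 0
scores, labels = [], []
with torch.no_grad():
    for x, y in val_loader:
        prob = torch.softmax(model(x.to(device)), dim=1)[:, 1].cpu()
        pred = (prob >= 0.5).long()
        scores.extend(prob.tolist())
        labels.extend(y.tolist())
        tp += int(((pred == 1) & (y == 1)).sum())
        tn += int(((pred == 0) & (y == 0)).sum())
        fp += int(((pred == 1) & (y == 0)).sum())
        fn += int(((pred == 0) & (y == 1)).sum())

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
auc = roc_auc_score(labels, scores)
print(f"Sens={sensitivity:.4f} Spec={specificity:.4f} Acc={accuracy:.4f} AUC={auc:.4f}")
```

Training the same architectures from scratch differs only in the starting weights (random initialization instead of ImageNet pretraining), which is why the fine-tuned model reaches comparable accuracy with less training time and lighter hardware requirements.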
Fan Shanhui, Liu Shichen, Cao E, Fan Yihong, Wei Kaihua, Li Lihua. Small Intestinal Polyp Detection in Wireless Capsule Endoscopy Images. Chinese Journal of Biomedical Engineering, 2019, 38(5): 522-532.