A Novel Lesion Segmentation Algorithm based on U-Net Network for Tuberculosis CT Image

Shaoyue Wen, Jing Liu*, Wenge Xu

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

6 Citations (Scopus)

Abstract

Lung CT images provide essential information for lung disease diagnosis and lung surgery. However, the traditional detection approach of manual segmentation is laborious and time-consuming. This paper presents an automatic tuberculosis (TB) lesion segmentation method based on the U-Net neural network for detecting TB. In addition, we combined an edge detection algorithm, the Canny edge detector, with this network to obtain a more accurate TB lesion boundary. The method is trained on two split databases with 3576 lung CT images obtained by data augmentation from 447 discontinuous lung CT images. The results show that the proposed approach is effective for complex TB lesions, achieving a high Dice coefficient (91.2%).
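The abstract reports segmentation quality as a Dice coefficient of 91.2%. For reference, the Dice coefficient between a predicted binary mask and a ground-truth mask is 2|A∩B| / (|A|+|B|); a minimal sketch of that computation on flattened 0/1 masks is shown below (the function name and example masks are illustrative, not taken from the paper).

```python
def dice_coefficient(pred, target):
    """Dice = 2*|A ∩ B| / (|A| + |B|) for binary masks given as flat 0/1 sequences."""
    intersection = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    # Convention: two empty masks are a perfect match (Dice = 1.0).
    return 2.0 * intersection / total if total else 1.0

# Toy example: 2 overlapping pixels, 3 predicted and 3 true foreground pixels.
pred   = [1, 1, 0, 1, 0, 0]
target = [1, 0, 0, 1, 1, 0]
print(dice_coefficient(pred, target))  # 2*2 / (3+3) ≈ 0.667
```

In practice the same formula is applied to full 2D mask arrays after thresholding the network's output.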

Original language: English
Title of host publication: 10th International Conference on Control, Automation and Information Sciences, ICCAIS 2021 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 909-914
Number of pages: 6
ISBN (Electronic): 9781665440295
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: 10th International Conference on Control, Automation and Information Sciences, ICCAIS 2021 - Xi'an, China
Duration: 14 Oct 2021 - 17 Oct 2021

Publication series

Name: 10th International Conference on Control, Automation and Information Sciences, ICCAIS 2021 - Proceedings

Conference

Conference: 10th International Conference on Control, Automation and Information Sciences, ICCAIS 2021
Country/Territory: China
City: Xi'an
Period: 14/10/21 - 17/10/21

Keywords

  • Image processing
  • Lesions segmentation
  • Tuberculosis
  • Unet

