Lightweight UAV Image Segmentation Model Design with Edge Feature Aggregation

Fengyufan Yang, Liye Jia, Erick Purwanto, Jeremy Smith, Ka Lok Man, Yutao Yue*

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

Unmanned Aerial Vehicles (UAVs) have become an attractive platform for smart city applications, particularly environmental sensing tasks. Segmentation of downward-looking UAV images can extract meaningful information, but segmenting small objects remains challenging. This paper proposes a method that aggregates edge features to enhance segmentation performance, together with a re-weighting scheme that alleviates the imbalanced class distribution of objects of interest. Experimental results demonstrate the effectiveness of the designed network architecture, achieving an IoU increase of roughly 2-4% for the targeted small-object classes, while the proposed model retains a lightweight structure.
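The abstract mentions a re-weighting method to counter the imbalanced distribution of object classes. The paper does not specify the exact scheme; a common approach, sketched below under that assumption, is to weight each class inversely to its pixel frequency so that rare small-object classes contribute more to the loss. The function name `inverse_frequency_weights` and the toy label map are illustrative, not from the paper.

```python
import numpy as np

def inverse_frequency_weights(label_map, num_classes, eps=1e-6):
    """Per-class weights inversely proportional to pixel frequency.

    Rare (e.g. small-object) classes receive larger weights, so a
    weighted cross-entropy loss penalises their errors more heavily.
    This is one plausible re-weighting scheme, not necessarily the
    one used in the paper.
    """
    counts = np.bincount(label_map.ravel(), minlength=num_classes).astype(float)
    freqs = counts / counts.sum()
    weights = 1.0 / (freqs + eps)
    # Normalise so the weights sum to num_classes (mean weight of 1).
    return weights / weights.sum() * num_classes

# Toy 8x8 label map: class 0 dominates, class 2 is a single rare pixel.
labels = np.zeros((8, 8), dtype=np.int64)
labels[4:, :] = 1   # half the image is class 1
labels[0, 0] = 2    # one "small object" pixel of class 2
w = inverse_frequency_weights(labels, num_classes=3)
print(w)  # the rare class 2 receives by far the largest weight
```

Such a weight vector would typically be passed to the loss function (e.g. as the `weight` argument of a cross-entropy loss) during training.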

Original language: English
Title of host publication: 2023 International Conference on Platform Technology and Service
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 7-12
Number of pages: 6
ISBN (Electronic): 9798350305999
DOIs
Publication status: Published - 2023
Event: 9th International Conference on Platform Technology and Service, PlatCon 2023 - Busan, Korea, Republic of
Duration: 16 Aug 2023 - 18 Aug 2023

Publication series

Name: 2023 International Conference on Platform Technology and Service, PlatCon 2023 - Proceedings

Conference

Conference: 9th International Conference on Platform Technology and Service, PlatCon 2023
Country/Territory: Korea, Republic of
City: Busan
Period: 16/08/23 - 18/08/23

Keywords

  • Edge feature
  • Imbalanced classes
  • Lightweight model
  • Semantic segmentation
  • UAV image
