High accurate environmental sound classification: Sub-spectrogram segmentation versus temporal-frequency attention mechanism

Tianhao Qiao, Shunqing Zhang*, Shan Cao, Shugong Xu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

In the important and challenging field of environmental sound classification (ESC), a crucial and often decisive factor is the feature representation ability, which directly affects classification accuracy. Classification performance therefore depends largely on whether effective, representative features can be extracted from the environmental sound. In this paper, we first propose an ESC framework based on sub-spectrogram segmentation with score-level fusion, and adopt the proposed convolutional recurrent neural network (CRNN) to improve classification accuracy. By evaluating numerous truncation schemes, we numerically determine the optimal number of sub-spectrograms and the corresponding band ranges. On this basis, we further propose a joint attention mechanism that combines temporal and frequency attention, and use a global attention mechanism when generating the attention map. Finally, numerical results show that the two proposed frameworks achieve 82.1% and 86.4% classification accuracy, respectively, on the public environmental sound dataset ESC-50, more than a 13.5% improvement over the traditional baseline scheme.
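The sub-spectrogram segmentation with score-level fusion described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names (`split_subspectrograms`, `score_level_fusion`), the uniform band split, and the equal fusion weights are all assumptions for demonstration; the paper instead determines the band ranges by evaluating truncation schemes and classifies each band with a CRNN.

```python
import numpy as np

def split_subspectrograms(spec, n_bands):
    """Split a (freq_bins, time_frames) spectrogram into n_bands
    contiguous frequency sub-bands. Hypothetical helper: the paper
    searches over truncation schemes rather than splitting uniformly."""
    return np.array_split(spec, n_bands, axis=0)

def score_level_fusion(scores, weights=None):
    """Fuse per-sub-spectrogram class-score vectors by weighted
    averaging, one common form of score-level fusion."""
    scores = np.asarray(scores)  # shape (n_bands, n_classes)
    if weights is None:
        weights = np.full(len(scores), 1.0 / len(scores))
    return np.average(scores, axis=0, weights=weights)

# Toy example: a fake 128-bin x 100-frame spectrogram, 4 sub-bands.
spec = np.random.rand(128, 100)
bands = split_subspectrograms(spec, 4)
# In the paper each band feeds its own CRNN classifier; here we use
# random stand-in scores for the 50 classes of ESC-50.
fake_scores = [np.random.rand(50) for _ in bands]
fused = score_level_fusion(fake_scores)
```

Each sub-band classifier emits one score vector, and the fused vector is then argmaxed to obtain the final class prediction.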

Original language: English
Article number: 5500
Journal: Sensors
Volume: 21
Issue number: 16
DOIs
Publication status: Published - 2 Aug 2021
Externally published: Yes

Keywords

  • Convolutional recurrent neural network
  • Environmental sound classification
  • Score level fusion
  • Sub-spectrogram segmentation
  • Temporal-frequency attention mechanism
