Cross Domain Person Re-Identification with Large Scale Attribute Annotated Datasets

Bolei Xu, Jingxin Liu, Xianxu Hou, Ke Sun, Guoping Qiu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We propose a novel deep convolutional neural network framework called deep augmented attribute network (DAAN) to learn augmented attribute features for the cross-domain person re-identification (person Re-ID) task. We observed that, in some cases, different persons can have similar attributes (e.g., wearing similar clothes), which motivates us to distinguish such pedestrians by further learning complementary image features. We thus construct a deep neural network with three branches: 1) the attribute branch predicts the attributes of the input image; 2) the augmentation branch generates complementary features that are fused with the output of the attribute branch to form the augmented attribute features; and 3) the reconstruction branch refines the augmented attribute features on the target dataset. In order to learn precise and detailed attributes for pedestrians, we manually labeled two large datasets (CUHK03 and Market-1501) with 25 pre-defined mid-level semantic attributes. We evaluate DAAN on a series of cross-domain person Re-ID tasks, where it outperforms prior state-of-the-art cross-domain algorithms by around 6%.
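
The abstract describes a three-branch layout: attribute prediction, complementary-feature augmentation, and reconstruction-based refinement of the fused representation. The following is a minimal PyTorch sketch of that layout only; the ResNet-50 backbone, layer sizes, sigmoid-plus-concatenation fusion, and feature-level reconstruction are assumptions for illustration, not the authors' exact implementation.

```python
# Hypothetical sketch of the three-branch DAAN layout described in the abstract.
# Backbone choice, layer sizes, and fusion by concatenation are assumptions.
import torch
import torch.nn as nn
from torchvision import models


class DAANSketch(nn.Module):
    def __init__(self, num_attributes: int = 25, aug_dim: int = 256):
        super().__init__()
        # Shared CNN backbone (assumed ResNet-50) producing a 2048-d image feature.
        backbone = models.resnet50(weights=None)
        self.backbone = nn.Sequential(*list(backbone.children())[:-1])

        # 1) Attribute branch: predicts the 25 mid-level semantic attributes.
        self.attribute_branch = nn.Linear(2048, num_attributes)

        # 2) Augmentation branch: learns complementary image features that are
        #    fused with the attribute outputs into the augmented attribute feature.
        self.augmentation_branch = nn.Linear(2048, aug_dim)

        # 3) Reconstruction branch: sketched here as a small decoder that
        #    reconstructs the backbone feature from the fused representation,
        #    standing in for the refinement on the target dataset.
        self.reconstruction_branch = nn.Sequential(
            nn.Linear(num_attributes + aug_dim, 1024),
            nn.ReLU(inplace=True),
            nn.Linear(1024, 2048),
        )

    def forward(self, images: torch.Tensor):
        feat = self.backbone(images).flatten(1)                # (B, 2048)
        attr_logits = self.attribute_branch(feat)              # (B, 25)
        comp_feat = self.augmentation_branch(feat)             # (B, 256)
        # Fuse attribute predictions with complementary features.
        augmented_attr = torch.cat([torch.sigmoid(attr_logits), comp_feat], dim=1)
        recon = self.reconstruction_branch(augmented_attr)     # (B, 2048)
        return attr_logits, augmented_attr, recon


if __name__ == "__main__":
    model = DAANSketch()
    x = torch.randn(2, 3, 256, 128)  # a common person Re-ID input resolution
    attr_logits, augmented_attr, recon = model(x)
    print(attr_logits.shape, augmented_attr.shape, recon.shape)
```

In such a sketch, the augmented attribute feature (`augmented_attr`) would serve as the Re-ID descriptor, while attribute supervision and a reconstruction objective would train the first and third branches, respectively.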

Original language: English
Article number: 8630965
Pages (from-to): 21623-21634
Number of pages: 12
Journal: IEEE Access
Volume: 7
DOIs
Publication status: Published - 2019
Externally published: Yes

Keywords

  • Deep learning
  • cross domain
  • person re-identification
