SG-GAN: Adversarial Self-Attention GCN for Point Cloud Topological Parts Generation

Yushi Li, George Baciu

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)


Point clouds are fundamental in the representation of 3D objects. However, they can also be highly unstructured and irregular, which makes it difficult to directly extend 2D generative models to three-dimensional space. In this article, we cast the problem of point cloud generation as a topological representation learning problem. In order to capture the representative features of 3D shapes in the latent space, we propose a hierarchical mixture model that integrates self-attention with an inference tree structure for constructing a point cloud generator. Based on this, we design a novel Generative Adversarial Network (GAN) architecture that is capable of generating recognizable point clouds in an unsupervised manner. The proposed adversarial framework (SG-GAN) relies on a self-attention mechanism and a Graph Convolution Network (GCN) to hierarchically infer the latent topology of 3D shapes. Embedding and transferring the global topology information in a tree framework allows our model to capture and enhance structural connectivity. Furthermore, the proposed architecture enables our model to generate partial 3D structures. Finally, we propose two gradient penalty methods to stabilize the training of SG-GAN and overcome the mode collapse to which GANs are prone. To demonstrate the performance of our model, we present both quantitative and qualitative evaluations and show that SG-GAN is more efficient to train and exceeds the state of the art in 3D point cloud generation.
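The abstract names two building blocks of the generator: scaled dot-product self-attention over point features and graph convolution over the inferred shape topology. As a rough, hedged illustration of those two standard operations only (not the authors' actual SG-GAN architecture, whose tree-structured wiring is not specified in the abstract), a minimal NumPy sketch:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over N point features.

    X: (N, F) per-point features; Wq/Wk/Wv: (F, D) projection matrices.
    Returns (N, D) features mixed by pairwise attention weights.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]))  # (N, N), rows sum to 1
    return A @ V

def gcn_layer(X, A, W):
    """One graph-convolution layer with symmetric normalization.

    Computes ReLU(D^{-1/2} (A + I) D^{-1/2} X W), the common GCN update.
    A: (N, N) adjacency of the point graph; W: (F, D) weight matrix.
    """
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)      # ReLU activation

# Toy usage: 8 points with 4-dimensional features (all weights random).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))
att = self_attention(X, rng.normal(size=(4, 4)),
                     rng.normal(size=(4, 4)), rng.normal(size=(4, 4)))
A = (rng.random((8, 8)) < 0.3).astype(float)
A = np.maximum(A, A.T)                          # symmetric adjacency
H = gcn_layer(X, A, rng.normal(size=(4, 16)))
```

In SG-GAN these operations are composed inside a tree-structured generator and trained adversarially with gradient penalties; the sketch above shows only the per-layer mechanics.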

Original language: English
Pages (from-to): 3499-3512
Number of pages: 14
Journal: IEEE Transactions on Visualization and Computer Graphics
Issue number: 10
Publication status: Published - 1 Oct 2022
Externally published: Yes


  • 3D shape generation
  • Generative adversarial network
  • binary tree
  • gradient penalty
  • graph convolution network
  • point cloud learning
  • self-attention
