Edge Guided GANs with Semantic Preserving for Semantic Image Synthesis
We propose ECGAN, a novel edge-guided GAN for the challenging semantic image synthesis task
- we propose to use edges as an intermediate representation, which are further adopted to guide image generation via a proposed attention-guided edge transfer module. Edge information is produced by a convolutional generator and provides detailed structural cues.
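A minimal sketch of how an attention-guided edge transfer step might look. The module name, layer choices, and the sigmoid-gated residual formulation are assumptions for illustration, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class EdgeTransfer(nn.Module):
    """Hypothetical sketch: transfer edge structure into the image branch
    via a learned sigmoid attention map computed from edge features."""

    def __init__(self, channels):
        super().__init__()
        # 1x1 conv + sigmoid produces a per-pixel, per-channel gate in [0, 1]
        self.attn = nn.Sequential(nn.Conv2d(channels, channels, 1), nn.Sigmoid())

    def forward(self, edge_feat, img_feat):
        gate = self.attn(edge_feat)          # attention map from the edge branch
        return img_feat + gate * edge_feat   # inject gated edge detail into image features
```

The residual form lets the image branch keep its content while the attention map decides where edge detail is injected.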
- we design an effective semantic preserving module that selectively highlights class-dependent feature maps according to the original semantic layout, preserving the semantic information.
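One plausible realization of such selective highlighting is channel-wise gating conditioned on the one-hot semantic layout. The module name and the pooling-plus-1x1-conv design below are illustrative assumptions, not the paper's exact formulation:

```python
import torch
import torch.nn as nn

class SemanticPreserve(nn.Module):
    """Hypothetical sketch: predict per-channel weights from the semantic
    layout and use them to highlight class-dependent feature maps."""

    def __init__(self, n_classes, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),            # summarize which classes are present
            nn.Conv2d(n_classes, channels, 1),  # map class statistics to channel weights
            nn.Sigmoid(),                       # weights in [0, 1]
        )

    def forward(self, feat, layout_onehot):
        w = self.gate(layout_onehot)  # (B, C, 1, 1) channel weights
        return feat * w               # re-weight feature maps by semantic content
```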
- inspired by recent advances in contrastive learning, we propose a novel contrastive learning method that enforces pixel embeddings belonging to the same semantic class to generate more similar image content than those from different classes. Doing so captures more semantic relations by explicitly exploring the structures of labeled pixels across multiple input semantic layouts.
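A pixel-wise contrastive objective of this kind is often implemented as an InfoNCE-style loss over sampled pixel embeddings, where positives share a semantic label. The function below is a generic sketch under that assumption; the interface (flattened embeddings plus class labels) and temperature value are illustrative, not the paper's exact loss:

```python
import torch
import torch.nn.functional as F

def pixel_contrastive_loss(emb, labels, tau=0.1):
    """Hypothetical sketch of a supervised pixel contrastive loss.
    emb: (N, D) sampled pixel embeddings; labels: (N,) semantic class ids."""
    emb = F.normalize(emb, dim=1)                      # cosine similarity space
    sim = emb @ emb.t() / tau                          # pairwise similarity logits
    pos = labels.unsqueeze(0) == labels.unsqueeze(1)   # same-class pairs are positives
    pos.fill_diagonal_(False)                          # a pixel is not its own positive
    # numerically stable log-softmax over all non-self pairs
    logits = sim - sim.max(dim=1, keepdim=True).values.detach()
    not_self = ~torch.eye(len(labels), dtype=torch.bool, device=emb.device)
    denom = (torch.exp(logits) * not_self).sum(dim=1)
    log_prob = logits - torch.log(denom + 1e-8)
    # average log-probability over positives, for anchors that have positives
    has_pos = pos.any(dim=1)
    loss = -(log_prob * pos).sum(dim=1)[has_pos] / pos.sum(dim=1)[has_pos]
    return loss.mean()
```

In practice the embeddings would be sampled from the generator's feature maps, with labels taken from the input semantic layouts, so that same-class pixels across images are pulled together and different-class pixels pushed apart.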