Exploring Incompatible Knowledge Transfer
in Few-shot Image Generation

1Singapore University of Technology and Design (SUTD)
2Sea AI Lab (SAIL), Singapore


Few-shot image generation (FSIG) learns to generate diverse and high-fidelity images from a target domain using a few (e.g., 10) reference samples. Existing FSIG methods select, preserve, and transfer prior knowledge from a source generator (pretrained on a related domain) to learn the target generator. In this work, we investigate an underexplored issue in FSIG, dubbed incompatible knowledge transfer, which significantly degrades the realism of synthetic samples (e.g., "trees/buildings on the sea" or "cat with glasses"). Empirical observations show that the issue stems from the least significant filters of the source generator.

To this end, we propose knowledge truncation, a complementary operation to knowledge preservation, to mitigate incompatible knowledge transfer in FSIG; we realize it with Removing In-Compatible Knowledge (RICK), a lightweight pruning-based method. Extensive experiments show that knowledge truncation is simple and effective, consistently achieving state-of-the-art performance, including in challenging setups where the source and target domains are more distant.



1: We consider the problem of FSIG with Transfer Learning using very limited target samples (e.g., 10-shot).
2: Our work makes two contributions:
  • We explore incompatible knowledge transfer in FSIG, reveal that SOTA methods fail to handle this issue, investigate its underlying cause, and show that fine-tuning alone is inadequate for removing incompatible knowledge.
  • We propose knowledge truncation to alleviate incompatible knowledge transfer, and realize it with a lightweight filter-pruning based method.
3: Below is the schematic diagram of the proposed Removing In-Compatible Knowledge (RICK) for FSIG: During adaptation, we periodically (e.g., every 50 iterations) estimate the importance of each filter for the target domain. We zero out (i.e., "remove") the filters with the least importance; the same operations are applied to the discriminator. Similar to state-of-the-art knowledge preservation methods, we also preserve the filters with the highest importance and fine-tune the others.
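The steps above can be sketched in PyTorch. This is a minimal illustration, not the paper's exact implementation: the importance proxy (a Fisher-style first-order estimate from weight-gradient products), the quantile thresholds, and the function names are our assumptions for illustration.

```python
import torch
import torch.nn as nn

def filter_importance(conv: nn.Conv2d) -> torch.Tensor:
    """Hypothetical per-filter importance proxy: mean |weight * grad| over each
    output filter (a first-order, Fisher-style estimate). The paper's actual
    importance measure may differ."""
    g = conv.weight.grad
    if g is None:
        return torch.zeros(conv.out_channels)
    return (conv.weight.detach() * g).abs().mean(dim=(1, 2, 3))

def truncate_and_preserve(conv: nn.Conv2d, prune_q: float = 0.1, keep_q: float = 0.9):
    """Knowledge truncation sketch: zero out the least-important filters
    (removing incompatible knowledge), flag the most important ones for
    preservation (freezing), and leave the rest for ordinary fine-tuning.
    Intended to be called periodically during adaptation (e.g., every 50 iters)."""
    imp = filter_importance(conv)
    lo = torch.quantile(imp, prune_q)   # threshold below which filters are removed
    hi = torch.quantile(imp, keep_q)    # threshold above which filters are preserved
    prune_mask = imp <= lo
    keep_mask = imp >= hi
    with torch.no_grad():
        conv.weight[prune_mask] = 0.0   # "remove" incompatible filters
        if conv.bias is not None:
            conv.bias[prune_mask] = 0.0
    # Caller uses keep_mask to zero those filters' gradients after backward,
    # effectively freezing (preserving) them; the remaining filters fine-tune.
    return prune_mask, keep_mask
```

In practice this would be applied to every convolutional layer of the generator and discriminator, with the masks refreshed at each periodic importance estimation.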

Experiment Results

Experiment results of FSIG on few-shot target samples. FFHQ is the source domain. Left: 10-shot real target samples for adaptation. Mid: We visualize the images generated by the adapted generator under different methods. Images in each column are from the same noise input. Notably, our method preserves the source knowledge useful for the target domain while reliably removing the incompatible source knowledge. For example, in SOTA methods, (top, orange frames) hats, doodles, sunglasses, and beards are transferred and lead to generated babies with degraded realism; (bottom, blue frames) hats, glasses, human face texture, and artifacts are transferred and lead to generated cats with low realism. In contrast, our method addresses this issue across different setups. Right: Quantitatively, we measure the quality and diversity of generated images via FID and intra-LPIPS.
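For readers unfamiliar with the diversity metric, intra-LPIPS assigns each generated image to its perceptually closest few-shot reference and averages the pairwise perceptual distance within each resulting cluster. The sketch below is a plausible implementation under our own assumptions (function name, clustering details); in practice `dist` would be a learned perceptual distance such as the `lpips.LPIPS` model rather than the simple callable used here.

```python
import torch

def intra_lpips(fakes, refs, dist):
    """Sketch of intra-LPIPS: cluster generated images (`fakes`) by their
    nearest reference in `refs` under perceptual distance `dist`, then
    average the pairwise distances within each cluster. Higher values
    indicate more diversity among generated samples."""
    # Assign each generated image to its closest reference.
    assign = [min(range(len(refs)), key=lambda k: float(dist(f, refs[k])))
              for f in fakes]
    cluster_means = []
    for k in range(len(refs)):
        members = [f for f, a in zip(fakes, assign) if a == k]
        # All pairwise distances within this cluster (skip singleton clusters).
        pair_dists = [float(dist(members[i], members[j]))
                      for i in range(len(members))
                      for j in range(i + 1, len(members))]
        if pair_dists:
            cluster_means.append(sum(pair_dists) / len(pair_dists))
    return sum(cluster_means) / max(len(cluster_means), 1)
```

With 10-shot references, this yields 10 clusters; a generator that merely memorizes the references collapses each cluster and drives intra-LPIPS toward zero.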


If you find our research useful in your work, please consider citing our paper:

	@inproceedings{zhao2023exploring,
		title={Exploring Incompatible Knowledge Transfer in Few-shot Image Generation},
		author={Zhao, Yunqing and Du, Chao and Abdollahzadeh, Milad and Pang, Tianyu and Lin, Min and Yan, Shuicheng and Cheung, Ngai-Man},
		booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
		year={2023}
	}

We also present related research on few-shot image generation via parameter-efficient Adaptation-Aware Kernel Modulation (AdAM, NeurIPS 2022), which fine-tunes pretrained GANs:

				@inproceedings{zhao2022fewshot,
					title={Few-shot Image Generation via Adaptation-Aware Kernel Modulation},
					author={Yunqing Zhao and Keshigeyan Chandrasegaran and Milad Abdollahzadeh and Ngai-man Cheung},
					booktitle={Advances in Neural Information Processing Systems},
					editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
					year={2022}
				}