Regional Traditional Painting Generation Based on Controllable Disentanglement Model | IEEE Journals & Magazine | IEEE Xplore


Abstract:

Automatic generation of painting images is an interesting and difficult task, especially for regional traditional paintings, which have unique cultural styles but lack large-scale training sets. In this paper, a hierarchical painting generation method is proposed that disentangles the generation of content and style. Mimicking the human painting process, the proposed method first introduces multiple content blocks and gradually generates image contents. In each block, a spatial self-modulation module is proposed to inject local details while preserving the global layout. After the preliminary generation of contents, a series of style blocks gradually adjusts the artistic style. In the style block, an edge-oriented style-modulation module is proposed, which focuses on lines and edges. In addition, edge adversarial training is used to further improve the quality of generated lines. To train and evaluate the proposed method, we construct datasets for five types of Chinese folk paintings. Experimental results demonstrate that the proposed method can generate high-quality and diverse painting images. More importantly, it disentangles content and style sufficiently, so that the generation of specific contents or styles can be controlled freely. The datasets and source code are available at https://github.com/Ritsu-mio/HPGN.
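The abstract does not give the exact architecture of the spatial self-modulation module or of the edge pathway used for edge adversarial training. The following NumPy sketch only illustrates the two underlying ideas under plain assumptions: `spatial_self_modulation` normalizes a feature map and re-injects spatially varying scale/shift predicted from the feature itself (the weight matrices stand in for learned 1x1 convolutions), and `sobel_edges` shows the kind of edge map an edge-focused discriminator could be trained on. All names and weights here are illustrative stand-ins, not the authors' implementation.

```python
import numpy as np

def spatial_self_modulation(feat, w_gamma, w_beta):
    # Normalize each channel, then modulate with per-pixel scale/shift
    # predicted from the feature map itself: local detail is injected
    # while the normalized global layout is preserved.
    mu = feat.mean(axis=(1, 2), keepdims=True)
    sigma = feat.std(axis=(1, 2), keepdims=True) + 1e-5
    norm = (feat - mu) / sigma
    # w_gamma / w_beta act like learned 1x1 convs over channels.
    gamma = 1.0 + np.einsum('oc,chw->ohw', w_gamma, feat)
    beta = np.einsum('oc,chw->ohw', w_beta, feat)
    return gamma * norm + beta

def sobel_edges(img):
    # Sobel gradient magnitude of a grayscale image; in edge adversarial
    # training, a discriminator judging such maps would push the
    # generator toward cleaner lines.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 16, 16))          # toy (C, H, W) features
w_g = rng.standard_normal((8, 8)) * 0.1
w_b = rng.standard_normal((8, 8)) * 0.1
out = spatial_self_modulation(feat, w_g, w_b)    # shape (8, 16, 16)
edges = sobel_edges(rng.random((32, 32)))        # shape (30, 30)
```

In the paper's pipeline, modules like the first would sit inside each content block, while edge maps like the second would feed both the edge-oriented style modulation and the edge discriminator.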
Page(s): 6913 - 6925
Date of Publication: 25 July 2023
