DOI: 10.1145/3330393.3330411
Research article

Optimal Transport of Deep Feature for Image Style Transfer

Published: 10 May 2019

Abstract

Image style transfer is a classic image editing task that aims to render a content image in an arbitrary visual style. In recent years it has been shown that the features of a convolutional neural network trained on sufficient labeled data are powerful enough to address the style transfer problem, and recent analyses of neural style transfer allow it to be cast as a distribution alignment problem. In this paper, we address this problem by incorporating the theory of optimal transport in a simple and intuitive way. The main component of our method is an optimal transportation map, derived from the Monge-Kantorovich theory of mass transportation, that aligns the deep feature distribution of the content image with that of the style image. We compare the generated stylized images against a number of representative algorithms to demonstrate the effectiveness of our approach, and show that our results are simultaneously more visually consistent and better stylized.
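
The abstract frames style transfer as aligning the distribution of a content image's deep features with that of a style image via an optimal transportation map. As a rough, illustrative sketch only (not the paper's exact algorithm), the Python snippet below shows one standard way such a map is realized: the content and style features of a CNN layer are approximated as Gaussian distributions, and the closed-form Monge-Kantorovich map between two Gaussians transports the content features onto the style statistics. The function names (gaussian_ot_map, sqrtm_psd), the feature layout, and the Gaussian assumption are ours, not the authors'.

import numpy as np

def sqrtm_psd(mat, eps=1e-8):
    # Matrix square root of a symmetric positive semi-definite matrix,
    # computed via eigendecomposition.
    vals, vecs = np.linalg.eigh(mat)
    vals = np.clip(vals, eps, None)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

def gaussian_ot_map(content_feat, style_feat):
    # Transport content features onto the style feature statistics using
    # the closed-form Monge-Kantorovich map between Gaussian approximations.
    # content_feat, style_feat: (C, N) arrays of deep features, e.g. a CNN
    # layer flattened to C channels x N spatial locations.
    mu_c = content_feat.mean(axis=1, keepdims=True)
    mu_s = style_feat.mean(axis=1, keepdims=True)
    xc = content_feat - mu_c
    xs = style_feat - mu_s

    cov_c = xc @ xc.T / xc.shape[1]   # (C, C) content covariance
    cov_s = xs @ xs.T / xs.shape[1]   # (C, C) style covariance

    cov_c_sqrt = sqrtm_psd(cov_c)
    cov_c_inv_sqrt = np.linalg.inv(cov_c_sqrt)

    # A = Sigma_c^{-1/2} (Sigma_c^{1/2} Sigma_s Sigma_c^{1/2})^{1/2} Sigma_c^{-1/2}
    middle = sqrtm_psd(cov_c_sqrt @ cov_s @ cov_c_sqrt)
    A = cov_c_inv_sqrt @ middle @ cov_c_inv_sqrt

    # Affine Monge-Kantorovich map: T(x) = mu_s + A (x - mu_c)
    return A @ xc + mu_s

In a complete pipeline the transported features would still have to be mapped back to an image, for example by optimizing the output image to reproduce them or by passing them through a trained decoder; the paper should be consulted for the exact reconstruction step it uses.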


Cited By

  • Energy-Based Domain Adaptation Without Intermediate Domain Dataset for Foggy Scene Segmentation. IEEE Transactions on Image Processing, 33, 6143-6157 (2024). DOI: 10.1109/TIP.2024.3483566
  • Exact Fusion via Feature Distribution Matching for Few-Shot Image Generation. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 8383-8392 (Jun 2024). DOI: 10.1109/CVPR52733.2024.00801
  • Exact Feature Distribution Matching for Arbitrary Style Transfer and Domain Generalization. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 8025-8035 (Jun 2022). DOI: 10.1109/CVPR52688.2022.00787
  • Generating Natural Images with Direct Patch Distributions Matching. Computer Vision – ECCV 2022, 544-560 (Oct 2022). DOI: 10.1007/978-3-031-19790-1_33
  • Artist-based painting classification using Markov random fields with convolution neural network. Multimedia Tools and Applications, 79(17-18), 12635-12658 (May 2020). DOI: 10.1007/s11042-019-08547-4
  • Iterative Feature Transformation for Fast and Versatile Universal Style Transfer. Computer Vision – ECCV 2020, 169-184 (Nov 2020). DOI: 10.1007/978-3-030-58529-7_11

    Information

    Published In

    ICMSSP '19: Proceedings of the 2019 4th International Conference on Multimedia Systems and Signal Processing
    May 2019
    213 pages
    ISBN:9781450371711
    DOI:10.1145/3330393
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    In-Cooperation

    • Shenzhen University
    • Sun Yat-Sen University

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 10 May 2019

    Author Tags

    1. Distribution Alignment
    2. Neural Networks
    3. Optimal Transport
    4. Style Transfer

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Zhejiang Provincial Basic Research Program
    • Natural Science Foundation of Zhejiang Province

    Conference

    ICMSSP 2019
