DOI: 10.1145/3520304.3529071

Poster

MOPINNs: an evolutionary multi-objective approach to physics-informed neural networks

Published: 19 July 2022

Abstract

This paper introduces Multi-Objective Physics-Informed Neural Networks (MOPINNs). MOPINNs use an evolutionary multi-objective optimization (EMO) algorithm to find the set of trade-offs between the data loss and the physics loss of a PINN, and therefore allow practitioners to identify which of these trade-offs best represents the solution they want to reach. We discuss how MOPINNs avoid the difficulty of manually weighting the different loss functions; to the best of our knowledge, this is the first work relating multi-objective optimization problems (MOPs) and PINNs via evolutionary algorithms. To assess the feasibility of the technique, we provide an exploratory analysis in which MOPINNs are applied to PDEs of particular interest: the heat, wave, and Burgers equations.
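The abstract does not fix a concrete setup, so the following is a minimal sketch of the MOPINN idea under stated assumptions: NSGA-II (through the pymoo library) as the EMO algorithm, the 1D heat equation u_t = alpha * u_xx as the PDE, and a small fully connected network whose flattened weights form the decision vector. The two objectives are the mean squared error on initial/boundary data and the mean squared PDE residual at collocation points; the architecture, sampling scheme, and hyperparameters are illustrative choices, not the authors' configuration.

```python
# Hedged sketch of a bi-objective PINN (MOPINN-style) for the 1D heat equation.
# Assumptions: NSGA-II via pymoo as the EMO algorithm; a tiny MLP whose flattened
# weights are the decision variables. Not the authors' exact setup.
import numpy as np
import torch
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

torch.set_default_dtype(torch.float64)


def make_net():
    # Small fully connected surrogate u_theta(x, t); purely illustrative.
    return torch.nn.Sequential(
        torch.nn.Linear(2, 16), torch.nn.Tanh(),
        torch.nn.Linear(16, 16), torch.nn.Tanh(),
        torch.nn.Linear(16, 1),
    )


def set_params(net, flat):
    # Copy a flat decision vector into the network parameters.
    idx = 0
    with torch.no_grad():
        for p in net.parameters():
            n = p.numel()
            p.copy_(torch.as_tensor(flat[idx:idx + n]).reshape(p.shape))
            idx += n


class MOPINNProblem(ElementwiseProblem):
    """Two objectives: f1 = data loss (initial/boundary), f2 = physics (residual) loss."""

    def __init__(self, alpha=0.1):
        self.net = make_net()
        self.alpha = alpha
        n_var = sum(p.numel() for p in self.net.parameters())
        super().__init__(n_var=n_var, n_obj=2, xl=-2.0, xu=2.0)
        # Data: initial condition u(x, 0) = sin(pi x), boundaries u(0, t) = u(1, t) = 0.
        x0 = torch.rand(50, 1)
        self.xt_data = torch.cat([
            torch.cat([x0, torch.zeros_like(x0)], dim=1),
            torch.cat([torch.zeros(25, 1), torch.rand(25, 1)], dim=1),
            torch.cat([torch.ones(25, 1), torch.rand(25, 1)], dim=1),
        ])
        self.u_data = torch.cat([torch.sin(np.pi * x0), torch.zeros(50, 1)])
        # Collocation points in the interior of the space-time domain [0,1] x [0,1].
        self.xt_col = torch.rand(200, 2)

    def _evaluate(self, x, out, *args, **kwargs):
        set_params(self.net, x)
        # Objective 1: mean squared error on initial/boundary data.
        loss_data = torch.mean((self.net(self.xt_data) - self.u_data) ** 2)
        # Objective 2: mean squared residual of u_t - alpha * u_xx (via autograd).
        xt = self.xt_col.clone().requires_grad_(True)
        u = self.net(xt)
        grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
        u_x, u_t = grads[:, :1], grads[:, 1:]
        u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x))[0][:, :1]
        loss_phys = torch.mean((u_t - self.alpha * u_xx) ** 2)
        out["F"] = [loss_data.item(), loss_phys.item()]


res = minimize(MOPINNProblem(), NSGA2(pop_size=40), ("n_gen", 50), seed=1, verbose=False)
print(res.F)  # approximate Pareto front of (data loss, physics loss) trade-offs
```

Each non-dominated solution on the returned front is a set of network weights realizing a different compromise between fitting the data and satisfying the PDE, which is the set of trade-offs MOPINNs expose to the practitioner instead of a single hand-tuned weighted sum.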


Cited By

  • (2024) Effective Training of PINNs by Combining CMA-ES with Gradient Descent. 2024 IEEE Congress on Evolutionary Computation (CEC), 1-8. DOI: 10.1109/CEC60901.2024.10611964. Online publication date: 30 June 2024.
  • (2024) Modeling global surface dust deposition using physics-informed neural networks. Communications Earth & Environment 5(1). DOI: 10.1038/s43247-024-01942-2. Online publication date: 20 December 2024.
  • (2023) LSA-PINN: Linear Boundary Connectivity Loss for Solving PDEs on Complex Geometry. 2023 International Joint Conference on Neural Networks (IJCNN), 1-10. DOI: 10.1109/IJCNN54540.2023.10191236. Online publication date: 18 June 2023.

Published In

GECCO '22: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2022
2395 pages
ISBN: 9781450392686
DOI: 10.1145/3520304
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 19 July 2022

Author Tags

  1. evolutionary algorithms
  2. multi-objective optimization (MOO)
  3. physics-informed neural networks (PINNs)
  4. weighted PINNs

Qualifiers

  • Poster

Conference

GECCO '22

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%

Article Metrics

  • Downloads (last 12 months): 34
  • Downloads (last 6 weeks): 5
Reflects downloads up to 07 Mar 2025
