Abstract
Two hierarchical levels of algorithmic differentiation are compared: the traditional approach and a higher-level approach in which matrix operations are treated as atomic. More explicitly, it is discussed how computer programs that consist of matrix operations (e.g., matrix inversion) can be evaluated in univariate Taylor polynomial arithmetic. Formulas suitable for the reverse mode are also given. The advantages of the higher-level approach are discussed, followed by an experimental runtime comparison.
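For illustration, a minimal sketch follows, assuming NumPy; the function name utp_inv and the coefficient layout are illustrative choices, not the interface of the paper's implementation or of any particular AD tool. It shows how a matrix inversion can be propagated in univariate Taylor polynomial arithmetic: writing A(t) = A_0 + A_1 t + ... + A_{D-1} t^{D-1}, the coefficients of B(t) = A(t)^{-1} follow from the convolution identity sum_{k=0}^{d} A_k B_{d-k} = I for d = 0 and = 0 for d >= 1.

import numpy as np

def utp_inv(A):
    """Invert a truncated matrix-valued Taylor polynomial.

    A has shape (D, N, N); A[d] is the d-th Taylor coefficient of A(t).
    Returns B of the same shape such that sum_{k=0}^{d} A[k] @ B[d-k]
    equals the identity for d = 0 and the zero matrix for d = 1, ..., D-1.
    """
    D, N, _ = A.shape
    B = np.empty_like(A, dtype=float)
    A0_inv = np.linalg.inv(A[0])
    B[0] = A0_inv
    for d in range(1, D):
        # Degree-d coefficient: A[0] @ B[d] = -sum_{k=1}^{d} A[k] @ B[d-k]
        rhs = -sum(A[k] @ B[d - k] for k in range(1, d + 1))
        B[d] = A0_inv @ rhs
    return B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D, N = 4, 3
    A = rng.standard_normal((D, N, N))
    B = utp_inv(A)
    # Check the defining convolution identity up to degree D-1.
    for d in range(D):
        conv = sum(A[k] @ B[d - k] for k in range(d + 1))
        target = np.eye(N) if d == 0 else np.zeros((N, N))
        assert np.allclose(conv, target)

For the reverse mode, the adjoint of plain matrix inversion B = A^{-1} is the well-known identity Abar = -B^T Bbar B^T; rules of this kind, lifted to Taylor polynomial arithmetic, are the type of formulas the abstract refers to.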
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Walter, S.F. (2012). On the Efficient Evaluation of Higher-Order Derivatives of Real-Valued Functions Composed of Matrix Operations. In: Bock, H., Hoang, X., Rannacher, R., Schlöder, J. (eds) Modeling, Simulation and Optimization of Complex Processes. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25707-0_26
DOI: https://doi.org/10.1007/978-3-642-25707-0_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-25706-3
Online ISBN: 978-3-642-25707-0