Matrix Calculus is an online tool for computing derivatives of linear algebra expressions. The derivatives are computed in vectorized form, that is, using compound expressions for vectors and matrices that avoid the need for indices. Vectorized linear algebra expressions can be readily mapped to highly optimized linear algebra libraries like Eigen and NumPy, and thus evaluated efficiently.
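As a small illustration of what "vectorized" means in practice (our own example, not output copied from the tool): for f(x) = xᵀAx the derivative in vectorized form is (A + Aᵀ)x, an index-free expression that evaluates with a single matrix-vector product in NumPy.

```python
import numpy as np

# f(x) = x^T A x; its vectorized derivative is (A + A^T) x.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

f = lambda v: v @ A @ v
grad = (A + A.T) @ x  # one index-free expression, one BLAS call

# Sanity check against a central finite-difference approximation.
eps = 1e-6
fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
               for e in np.eye(4)])
assert np.allclose(grad, fd, atol=1e-5)
```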
Our work on matrix calculus is motivated by our work on generic optimization, where an optimization problem is specified in a simple modelling language and a solver is generated from the specification at the click of a button. However, the stand-alone matrix calculus tool has proven its independent value. Since its launch in 2018, millions of derivatives have been computed by tens of thousands of users from all over the world.
Computing derivatives in vectorized form is a nontrivial task. For many years, vectorized derivatives have been tabulated in repositories like the Matrix Cookbook, but no generic algorithm for computing them was known. We have solved this problem by translating vectorized linear algebra expressions into index form, computing the derivative in index form, and then translating the result back into vectorized form; a small worked example follows the publication list below. An introduction to our approach can be found in a Research Highlight article that is based on the publications:
- S. Laue, M. Mitterreiter and J. Giesen. A Simple and Efficient Tensor Calculus. Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI), (2020) 4527-4534 (extended arXiv version)
- S. Laue, M. Mitterreiter and J. Giesen. Computing Higher Order Derivatives of Matrix and Tensor Expressions. Proceedings of the 32nd Conference on Neural Information Processing Systems (NeurIPS), (2018) 2755-2764
- J. Giesen, J. Klaus, S. Laue and F. Schreck. Visualization Support for Developing a Matrix Calculus Algorithm: A Case Study. Proceedings of the 21st EG/VGTC Conference on Visualization (EUROVIS), Computer Graphics Forum 38 (2019) 351-361
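To make the three steps concrete, here is the derivation of the gradient from the snippet above, hand-worked in Einstein summation convention (a sketch in our notation, not the tool's internal representation):

```latex
% Step 1: translate the vectorized expression into index form
%         (summation over repeated indices):
f(x) = x^\top A x \quad\rightsquigarrow\quad f = x_i A_{ij} x_j
% Step 2: differentiate in index form using the product rule:
\frac{\partial f}{\partial x_k}
  = \delta_{ik} A_{ij} x_j + x_i A_{ij} \delta_{jk}
  = A_{kj} x_j + x_i A_{ik}
% Step 3: translate the result back into vectorized form:
\nabla f(x) = A x + A^\top x = (A + A^\top)\, x
```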
Social media: Matrix calculus on Hacker News and on Andrew Gelman's Blog.
We use our matrix calculus to compute convexity certificates for a fairly general class of expressions that covers much of classical machine learning. Details can be found in the following publications; a small illustration of the underlying criterion follows the list.
- P. Rump, N. Merk, J. Klaus, M. Wenig and J. Giesen. Convexity Certificates for Symbolic Tensor Expressions. Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI), (2024)
- J. Klaus, N. Merk, K. Wiedom, S. Laue and J. Giesen. Convexity Certificates from Hessians. Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS), (2022)
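The criterion behind such a certificate can be seen on a textbook example (the published method derives certificates symbolically; the numeric check below is only a sketch of the underlying idea): for f(x) = ‖Ax − b‖², the Hessian is the constant matrix 2AᵀA, which is positive semidefinite for every A, so f is convex.

```python
import numpy as np

# Illustration only: the papers derive convexity certificates
# symbolically; here we merely check the Hessian criterion numerically.
# For f(x) = ||Ax - b||^2, the Hessian is 2 A^T A (independent of x).
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))

H = 2 * A.T @ A                       # Hessian of f
eigenvalues = np.linalg.eigvalsh(H)   # symmetric eigensolver
assert np.all(eigenvalues >= -1e-10)  # PSD up to numerical tolerance
print("smallest Hessian eigenvalue:", eigenvalues.min())
```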