Dates: July 8 and 9, 2-4 pm IST, online.
The registration link for the course is https://forms.gle/CE9yzUJEe6TBxViR7
Course Instructor: Pavanakumar Mohanamuraly, Senior Researcher, ALGO Team, CERFACS, Toulouse, France
Algorithmic differentiation (AD), also known as Automatic Differentiation or AutoDiff, is a technique for computing exact derivatives of functions expressed as computer programs, and it has played a significant role in the success of fields such as Machine Learning (ML). AD enables efficient computation of the adjoint gradients essential for training complex ML models such as Large Language Models (LLMs) with billions of parameters. It also plays a pivotal role in multi-disciplinary design optimisation involving very large design spaces and PDE constraints in Aerospace and Mechanical Engineering.
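To make this concrete, here is a minimal reverse-mode AD sketch in pure Python (one of the course languages). It records the elementary operations of a computation and, in a single backward sweep, recovers all partial derivatives of a scalar output at a cost independent of the number of inputs, which is what makes gradients of billion-parameter models affordable. The `Var` class, helper functions, and variable names are our own illustrative choices, not the tools used in the course; the example function is the classic one from Griewank and Walther (see References).

```python
import math

class Var:
    """Scalar node in a computation graph, recorded for reverse-mode AD."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent node, local derivative)
        self.grad = 0.0          # adjoint, accumulated in the backward sweep

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __sub__(self, other):
        return Var(self.value - other.value, [(self, 1.0), (other, -1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def log(x):
    return Var(math.log(x.value), [(x, 1.0 / x.value)])

def sin(x):
    return Var(math.sin(x.value), [(x, math.cos(x.value))])

def backward(output):
    """One reverse sweep: seed the output adjoint, push it to the leaves."""
    output.grad = 1.0
    # Topological order via depth-first search, so every node is processed
    # only after all of its consumers have contributed to its adjoint.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += local * node.grad

# f(x1, x2) = log(x1) + x1*x2 - sin(x2), evaluated at (2, 5)
x1, x2 = Var(2.0), Var(5.0)
y = log(x1) + x1 * x2 - sin(x2)
backward(y)
print(y.value, x1.grad, x2.grad)   # ≈ 11.652, 5.5 (= 1/x1 + x2), ≈ 1.716
```

Both partial derivatives come out of one backward sweep whose cost is a small constant multiple of evaluating f itself; this is the property that reverse mode shares with backpropagation in ML.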
Despite its broad applicability, AD is mostly taught to a niche community of researchers in universities, and access to AD tools and practical application knowledge is limited for engineering students. In particular, reverse-mode AD typically restricts code to a subset of the host programming language because of its algorithmic constraints. Domain Specific Languages (DSLs) with AD support, such as TensorFlow, likewise impose severe restrictions on programming style. These constraints demand meticulous programming discipline to develop efficient AD-enabled scientific software.
Fortunately, practical solutions to these challenges have emerged, and the AD community has been actively developing tools to tackle them. This course aims to democratise AD knowledge and bring it to a wider audience in engineering. Understanding AD is critical for students pursuing problems in gradient-based optimisation, and this course will serve as a useful primer. Given the prevalence of ML frameworks, we also cover AD as applied to ML. We present example problems from engineering and mathematics to demonstrate how they can be solved efficiently using open-source AD tools in programming languages such as Fortran and Julia.
Key topics addressed in this course
- AD history and an introduction to its terminology
- Three ways of evaluating derivatives: finite-difference, symbolic, and algorithmic
- Forward and Reverse Modes of AD (backpropagation in ML)
- Adjoint gradients in design optimisation and ML
- Programming Paradigms in AD: Source Transformation and Operator Overloading (see the sketch after this list)
- Applications to practical problems in Fortran, Julia, and Python
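As a small taste of the operator-overloading paradigm, and of how algorithmic derivatives differ from finite differences, here is a minimal forward-mode sketch using dual numbers in Python. The `Dual` class and the sample function are our own illustrative choices; the course demonstrates these ideas with mature open-source tools in Fortran, Julia, and Python.

```python
import math

class Dual:
    """A value paired with its derivative; arithmetic applies the chain rule."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

def sin(x):
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

def f(x):
    return x * x + sin(x)   # f(x) = x^2 + sin(x), so f'(x) = 2x + cos(x)

# Forward mode: seed the input derivative dx/dx = 1 and read off f'(x),
# exact to machine precision, alongside f(x) in a single evaluation.
y = f(Dual(1.5, 1.0))
print(y.value, y.deriv)            # ≈ 3.2475, 2*1.5 + cos(1.5) ≈ 3.0707

# Finite difference: an O(h) approximation that needs an extra evaluation
# and suffers truncation/round-off error from the choice of step h.
g = lambda v: v * v + math.sin(v)
h = 1e-6
print((g(1.5 + h) - g(1.5)) / h)   # ≈ 3.0707, but only approximately
```

Reverse mode, sketched earlier in this announcement, propagates the same chain rule in the opposite direction, which is what makes it cheap for functions with many inputs and a single scalar output.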
References
- Numerical Optimization, Jorge Nocedal and Stephen Wright, Springer, 2006.
- Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation, 2nd Edition, Andreas Griewank, Andrea Walther, SIAM, 2008.
- The Art of Differentiating Computer Programs: An Introduction to Algorithmic Differentiation, Uwe Naumann, SIAM, 2011.
- Algorithms for Optimization, Mykel J. Kochenderfer and Tim A. Wheeler, MIT Press, 2022.
About the Instructor
Dr. Pavanakumar Mohanamuraly currently serves as a Senior Researcher in the ALGO Team at CERFACS, Toulouse, France. He has extensive experience in CAD-based aerodynamic shape optimisation, adjoint sensitivity analysis, Machine Learning, and high-performance computing, and brings a wealth of knowledge and practical experience in algorithmic differentiation applied to parallel codes.
He holds a PhD in Aerospace Engineering from Queen Mary University of London and an MS in Aerospace Engineering from Pennsylvania State University. His career includes roles at the Integrated Test Range, DRDO, Balasore; Honeywell Technology Solutions, Bangalore; National Aerospace Laboratories, India; and Airbus Group, Bangalore.
As a Marie Curie Early Stage Researcher (PhD) at QMUL, Pavanakumar received extensive training and developed algorithms and tools for aerodynamic shape optimisation and for the algorithmic differentiation of parallel CFD codes. His work has significantly advanced computational methods at CERFACS, particularly in hybrid CFD and machine learning, and in parallel adaptive mesh refinement and load balancing.