Computing a derivative can be more delicate than one might think, especially when the function to be differentiated is itself the result of a numerical computation. In practice this raises numerical and algorithmic problems: on the one hand, it is difficult to control the truncation error of a finite-difference computation; on the other hand, symbolic computation can lead to an explosion in the complexity of the expressions.
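As a minimal illustration of the first difficulty, the sketch below (in Python, with a hypothetical test function, not taken from the workshop material) estimates a derivative by central differences: the truncation error decreases as O(h^2), but the floating-point rounding error grows like eps/h, so the step size cannot be chosen arbitrarily small.

```python
import math

def f(x):
    return math.sin(x)  # hypothetical test function

def central_diff(f, x, h):
    # Central difference: truncation error O(h^2),
    # plus a rounding error of order eps / h.
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 1.0
exact = math.cos(x)  # exact derivative of sin at x
for h in (1e-1, 1e-4, 1e-8, 1e-12):
    err = abs(central_diff(f, x, h) - exact)
    print(f"h = {h:.0e}   |error| = {err:.2e}")
# The error first shrinks as h decreases, then grows again once
# rounding dominates: the optimal step size is hard to know a priori.
```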
This need arises in problems where one seeks to optimize a quantity that is itself the result of an approximate computation, typically formalized as the optimization of a criterion under constraints. Examples include:
- optimal control (aerodynamic stability analysis, ...),
- data assimilation (meteorology, risk analysis, ...),
- inverse problems (geophysics, medical imaging, design, ...),
- sensitivity analysis (uncertainty quantification, ...),
- deep learning (neural networks, ...).
Algorithmic differentiation (AD) makes it possible to avoid part of these computations. AD takes a computation code, together with a description of the input and output variables, and produces a new code that computes the derivatives of the outputs with respect to the inputs. It relies on concepts from compilation and program analysis (not computer algebra). There are two modes of AD, which differ in how they apply the chain rule for composite functions:
- the forward (tangent) mode, which is used when the number of inputs is smaller than the number of outputs;
- the reverse (adjoint) mode, which is used when the number of inputs is larger than the number of outputs, in particular to compute the gradient of a functional with respect to a large vector of parameters.
The reverse mode is equivalent to the adjoint-state computation (used in optimal control) and is close to backpropagation in neural networks. These results, well known to specialists in each discipline, are not new; they nonetheless deserve to be presented from a broader perspective.
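As a minimal sketch of the two modes, here is an example using the JAX library in Python (the function f, the vector x, and the scalar criterion are hypothetical illustrations, not tied to any of the talks): forward mode builds the Jacobian column by column, reverse mode row by row, which is why reverse mode is preferred for the gradient of a scalar criterion with many inputs.

```python
import jax
import jax.numpy as jnp

# A toy "computation code": R^3 -> R^2 (names are illustrative only).
def f(x):
    return jnp.array([x[0] * jnp.sin(x[1]), x[2] ** 2 + x[0]])

x = jnp.array([1.0, 2.0, 3.0])

# Forward (tangent) mode: one sweep per input direction;
# efficient when there are few inputs.
J_fwd = jax.jacfwd(f)(x)

# Reverse (adjoint) mode: one sweep per output;
# efficient when there are few outputs.
J_rev = jax.jacrev(f)(x)

# For a scalar functional, reverse mode yields the full gradient
# in a single backward sweep, whatever the input dimension.
def criterion(x):
    return jnp.sum(f(x) ** 2)

g = jax.grad(criterion)(x)
print(J_fwd, J_rev, g, sep="\n")
```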
The purpose of this day is to bring together specialists in these different technologies in order to highlight common needs, the various use cases, and the limits of the different methods. Another goal is to bring into contact the different communities that share these techniques.
The workshop will include tutorial presentations on the principles of the methods, feedback from various applications, and presentations of recent or more original extensions.
Challenges and Achievements of Source-Transformation Algorithmic Differentiation
Laurent Hascoët
Shape optimisation using AD of complete CFD workflows, including CAD geometry
Jens-Dominik Mueller
Beyond backprop: automatic differentiation in machine learning (cancelled)
Atilim Günes Baydin
This talk was cancelled; a link provided by the author is available as an alternative.
Institut de Physique du Globe, 1 rue Jussieu, Paris
- Michel Kern
- Anne Cadiou