Description

Calculating a derivative can be more delicate than one might think, especially when the function to be differentiated is itself the result of a numerical computation. In practice this raises both numerical and algorithmic problems: on the one hand, the truncation error of a finite-difference computation is difficult to control; on the other hand, symbolic differentiation can lead to an explosion in the complexity of the expressions.
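
To make the first difficulty concrete, here is a minimal sketch of the finite-difference step-size dilemma, using math.sin as a stand-in for an expensive numerical code (the example is illustrative and not taken from the workshop material):

```python
import math

def central_diff(f, x, h):
    # Central finite difference: truncation error is O(h^2),
    # but floating-point cancellation grows as h shrinks.
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 1.0
exact = math.cos(x)  # exact derivative of sin
for h in (1e-1, 1e-4, 1e-8, 1e-12):
    approx = central_diff(math.sin, x, h)
    print(f"h = {h:.0e}   error = {abs(approx - exact):.3e}")

# The error first decreases with h (truncation dominates), then grows
# again (round-off dominates): no single step size is safe when f is
# an arbitrary, possibly noisy, numerical computation.
```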

This need arises in problems where one seeks to optimize a quantity that is itself the result of an approximate computation, formalized as the constrained optimization of a criterion. Examples include:

  • optimal control (aerodynamic stability analysis, ...),
  • data assimilation (meteorology, risk analysis, ...),
  • inverse problems (geophysics, medical imaging, design, ...),
  • sensitivity analysis (quantification of uncertainties, ...),
  • deep learning (neural networks, ...).

Algorithmic differentiation (AD) removes part of this burden. AD takes a computational code, together with a description of its input and output variables, and produces a new code that computes the derivatives of the outputs with respect to the inputs. It relies on concepts from compilation and program analysis, not computer algebra. There are two modes of AD, which differ in how the chain rule for composite functions is applied; each mode is illustrated by a short sketch below:

  • the forward (tangent) mode, used when the number of inputs is smaller than the number of outputs;
  • the reverse (adjoint) mode, used when the number of inputs is larger than the number of outputs, in particular to compute the gradient of a functional with respect to a large vector of parameters.
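
As an illustration of the forward mode, here is a minimal dual-number sketch in Python; the names Dual and f are hypothetical and do not refer to any of the tools presented at the workshop:

```python
class Dual:
    """Forward mode: carry (value, derivative) through every operation."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)   # seed the input derivative dx/dx = 1
y = f(x)
print(y.val, y.dot)  # 17.0 and f'(2) = 6*2 + 2 = 14.0
```

One forward sweep yields one directional derivative, which is why this mode pays off when there are few inputs and many outputs.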

The reverse mode is equivalent to the adjoint-state computation used in optimal control and is closely related to backpropagation in neural networks. These results, well known to specialists in each discipline, are not new, but they deserve to be presented from a broader perspective.
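
A matching minimal sketch of the reverse mode records the computation during the forward sweep and then accumulates adjoints backwards, exactly as backpropagation does; again, Var and its methods are illustrative names only:

```python
class Var:
    """Reverse mode: record the computation, then sweep it backwards."""
    def __init__(self, val, parents=()):
        self.val = val
        self.parents = parents  # pairs (parent Var, local partial derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(uv)/du = v and d(uv)/dv = u
        return Var(self.val * other.val, [(self, other.val), (other, self.val)])

    def backward(self, seed=1.0):
        # Accumulate the adjoint, then push it to the parents (chain rule).
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

# Gradient of f(a, b) = a*b + a in a single reverse sweep:
a, b = Var(3.0), Var(4.0)
y = a * b + a
y.backward()
print(a.grad, b.grad)  # df/da = b + 1 = 5.0, df/db = a = 3.0
```

One reverse sweep yields the whole gradient regardless of the number of inputs; production tools replace the naive recursion above with a tape swept in topological order, and storing the intermediate values needed by that sweep is the main cost of the adjoint mode.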

The purpose of this day is to bring together specialists in these different technologies in order to highlight the common needs, the different use cases, and the limits of the various methods. Another goal is to connect the different communities that share these techniques.

The workshop will include tutorial presentations on the principles of the methods, feedback from various applications, and presentations of recent or more original extensions.

Program

Thursday 24/01

09:00–09:30  Welcome and introduction

09:30–10:30  Challenges and Achievements of Source-Transformation Algorithmic Differentiation
Laurent Hascoët

10:30–11:30  Derivative Code by Overloading in C++
Uwe Naumann

11:30–12:00  Use of AD in elsA CFD solver
Sébastien Bourasseau

13:30–14:30  Shape optimisation using AD of complete CFD workflows, including CAD geometry
Jens-Dominik Mueller

14:30–15:30  Arbogast: "Du calcul des dérivations" to higher order AD
Isabelle Charpentier

15:30–16:00  Coffee break

16:00–17:00  Beyond backprop: automatic differentiation in machine learning (cancelled)
Atilim Günes Baydin

This talk was cancelled. The author provided a link as an alternative.

Location

Institut de Physique du Globe, 1 rue Jussieu, Paris

Contact

  • Michel Kern
  • Anne Cadiou