We welcome Yolhan Mannes from the université de Toulon, who will present:

Exploring Deep Learning through Flux.jl: Insights into Core Mechanisms and Datasets

Flux.jl is one of the leading Julia packages for deep learning, providing a wide range of standard building blocks such as convolutional, attention, and recurrent layers, among others. This session will examine the foundational mechanisms of Flux.jl, emphasizing its three core components:

  • Traversal and decomposition of nested model structures (Functors.jl)
  • Reverse-mode automatic differentiation (Zygote.jl)
  • Gradient-descent-based optimisation rules (Optimisers.jl)
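As a rough sketch of how these three pieces fit together, here is a minimal custom layer trained for one step; the layer type `MyDense` is hypothetical, and the macro and function names follow recent Flux and Optimisers releases:

```julia
using Flux, Optimisers

# A hypothetical custom layer: a plain struct holding its parameters.
struct MyDense{M,V}
    W::M
    b::V
end
Flux.@layer MyDense                      # Functors.jl: makes the nested fields traversable
(l::MyDense)(x) = tanh.(l.W * x .+ l.b)  # forward pass

layer = MyDense(randn(Float32, 4, 3), zeros(Float32, 4))
x = randn(Float32, 3)

# Zygote.jl: reverse-mode gradient of a scalar loss w.r.t. the layer's parameters
grads = Flux.gradient(m -> sum(abs2, m(x)), layer)

# Optimisers.jl: one descent step with Adam
state = Optimisers.setup(Optimisers.Adam(1f-3), layer)
state, layer = Optimisers.update!(state, layer, grads[1])
```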

Throughout the discussion, we will examine how some of the built-in layers are structured and, along the way, learn how to formulate custom ones. A significant part of the presentation will also be devoted to datasets, in particular how to import and use datasets from Python and R in Flux.jl deep learning workflows.
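One possible route for bringing R datasets into Julia is the RDatasets.jl package, sketched below; whether this is the exact tooling the speaker will use is an assumption (Python data can similarly be bridged, e.g. via PythonCall.jl):

```julia
using RDatasets   # ships many classic R datasets as Julia DataFrames

# Load the R `iris` dataset as a DataFrame
iris = dataset("datasets", "iris")

# Convert the four numeric feature columns into a Float32 matrix,
# one column per sample, as Flux layers expect
X = permutedims(Matrix{Float32}(iris[:, 1:4]))
```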

Finally, we will analyze several well-known test cases: MLPs for function approximation, CNNs for image classification, and Transformers for text comprehension and translation tasks.
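To give a flavour of the first test case, here is a minimal MLP sketch for approximating a 1-D function in Flux; the architecture and hyperparameters are illustrative, not the speaker's:

```julia
using Flux

# Target: approximate sin on [0, 2π] from 64 sample points
xs = reshape(collect(Float32, range(0, 2f0π; length = 64)), 1, :)
ys = sin.(xs)

# A small MLP: 1 input, one hidden layer of 16 tanh units, 1 output
model = Chain(Dense(1 => 16, tanh), Dense(16 => 1))
opt_state = Flux.setup(Adam(0.01), model)

for epoch in 1:500
    # Gradient of the mean-squared error w.r.t. the model parameters
    grads = Flux.gradient(m -> Flux.mse(m(xs), ys), model)
    Flux.update!(opt_state, model, grads[1])
end
```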

This session is open to all and will take place on Mathrice's BBB platform.

Please register to attend this session. The session will be recorded; registering implies accepting this.


  • Yolhan Mannes: PhD student at the Institut de Mathématiques of the université de Toulon.