Following the success of the JuliaDay event in Nantes in 2019, we are organizing a second edition of the "Julia and Optimization" days, from October 4 to 6, 2023 at CNAM, amphi Jean-Baptiste Say, 292 rue Saint-Martin, Paris.
The talks will be given in French and in English, and the event is open to anyone curious about the uses of the Julia language in optimization, and more generally in scientific computing. We particularly encourage PhD and master's students to come and present their work.
Slack channel #juliaoptparis2023
HiGHS is a leading open-source linear optimization solver, and has had a close and productive relationship with the JuMP community for several years. This talk will present recent developments in HiGHS. In particular, it will give a progress report on the new interior point solver and set out the motivation for its development.

Coluna.jl is a branch-and-cut-and-price framework written in Julia. Together with BlockDecomposition, this package allows you to seamlessly run a branch-and-cut-and-price algorithm from your JuMP model. The platform provides default implementations of advanced features, such as non-robust cuts and stabilization techniques, which you can easily customize to develop your own state-of-the-art algorithms. The project is actively supported and continuously enhanced by contributions from leading researchers in the field. Building such a platform is a real challenge that raises a great number of engineering problems and questions. Today, we are reaching an important milestone in the development of Coluna thanks to a modular package architecture that allows us to easily try new algorithmic strategies. This is the perfect opportunity to share our experience in building a Julia optimization package.

We introduce the application of Monte-Carlo Tree Search (MCTS) to a well-defined large-scale Markov decision process (MDP). As an example, we investigate the ship refuelling problem subject to stochastic fuel prices. As a benchmark, we compare this MCTS-based implementation to pure stochastic programming approaches with scenario tree generation. The objective of the new approach is to demonstrate the use of MCTS to produce approximate solutions to large-scale MDPs when they become intractable via standard mathematical programming or dynamic programming. Another objective is to show that our MCTS-based solver can achieve near-optimal performance with respect to a stochastic programming solution, at a much lower computational cost. Moreover, a hybrid approach exists that applies exact methods, such as stochastic programming, to solve the value function of a sub-problem. This value estimate can be used in tandem with MCTS to reduce the variance of Monte Carlo sampling and/or to speed up convergence to the optimal solution. The practical application of this technology would be to allow maritime planners to optimize for fuel prices more effectively in the face of increasing model complexity, and potentially also in model-free settings.

This endeavor focuses on integrating Constraint Programming (CP) into the versatile JuMP.jl framework. We compile a set of common XCSP3-standard constraints, each tailored to align with a type-based parameter interface, facilitating seamless multiple dispatch. This interface, distinct from JuMP, serves as a unifying structure for these constraints. Exploring classical problems such as Sudoku, Magic Square, and Quadratic Assignment, the talk provides attendees, even those less familiar with CP, a structured entry into CP's integration through JuMP and Julia. With this approach, we bridge the gap between CP and optimization for broader accessibility, uniting CP's power with JuMP's familiarity and enriching optimization methodologies.
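As a rough illustration of such a type-based constraint interface with multiple dispatch, one might write something like the following. All names here are hypothetical, chosen only to illustrate the idea; they are not the actual package API.

```julia
# Hypothetical sketch of a type-based constraint interface.
# Each constraint is a type; dispatch selects the right checking routine.
abstract type AbstractXCSP3Constraint end

struct AllDifferent <: AbstractXCSP3Constraint
    variables::Vector{Int}   # indices of the decision variables involved
end

struct LinearSum <: AbstractXCSP3Constraint
    variables::Vector{Int}
    coefficients::Vector{Int}
    target::Int
end

# One `is_satisfied` method per constraint type:
is_satisfied(c::AllDifferent, x) = allunique(x[c.variables])
is_satisfied(c::LinearSum, x) = sum(c.coefficients .* x[c.variables]) == c.target

# A Sudoku row is simply an AllDifferent constraint over 9 cells:
row = AllDifferent(collect(1:9))
is_satisfied(row, [5, 3, 4, 6, 7, 8, 9, 1, 2])  # true: all values distinct
```

The point of such an interface is that adding a new XCSP3 constraint only requires defining a new type and its methods, with no change to the surrounding solver machinery.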
From vOptGeneric.jl to MultiObjectiveAlgorithms.jl
Xavier Gandibleux, Gokhan Kof, Oscar Dowson

vOptGeneric.jl is a JuMP extension package for modeling and solving multi-objective linear optimization problems. It was created as part of the ANR/DFG research project vOpt (2015-2019). Inspired by the lessons learned in vOptGeneric, we added first-class support for multi-objective optimization problems to JuMP and MathOptInterface. In addition, we developed a meta-solver, MultiObjectiveAlgorithms.jl (MOA), which implements a number of algorithms for solving multi-objective optimization problems. The new features and MOA were released in February 2023. In this talk, we discuss vOptGeneric, the transition to first-class support in JuMP, and our plans for the future.

Bayesian optimization is a family of methods for sample-efficient optimization of noisy, black-box, expensive-to-evaluate functions. The approach of these methods is to maintain a random field, usually a Gaussian process, to model uncertainties in the values of the objective function. This model is refined throughout optimization and is used to decide where to evaluate the objective function next. The package BayesianOptimization.jl currently provides an implementation of traditional methods that work well for low-dimensional problems. The goal of our project is to refactor and extend the package so that it serves as a framework for implementing novel algorithms from this family and provides user-friendly implementations of successful algorithms. We are currently developing the framework and merging into it the implementation of traditional methods and a new implementation of the Trust Region Bayesian Optimization (TuRBO) algorithm. TuRBO belongs to recent developments in Bayesian optimization that aim to work well for higher-dimensional problems. Prominent use cases include hyperparameter tuning of algorithms and optimization problems where the objective is given by an expensive simulation.
Samuel Belko is enrolled in a master's program in mathematics at the Technical University of Munich. He is working on BayesianOptimization.jl during Google Summer of Code 2023 under the supervision of Johanni Brea.

Topological Data Analysis (TDA) is an analysis method that aims to identify patterns and structures underlying complex data sets. TDA can be used to extract important information about the data, such as groupings and clusters. I will describe an example of how to implement this method and explain why we chose the Julia language.

For over half a century, the notion of copulas, introduced by Sklar in 1959, has been the standard approach to modeling complex dependence structures in multivariate random vectors. In a broad range of domains, multivariate statistics and dependence-structure modeling are an important part of the statistical treatment of information, and many applied fields have therefore widely adopted copula frameworks. In Julia, the probabilistic and statistical ecosystem is centered on Distributions.jl, which provides the standard tools to deal with random variables. Copulas.jl provides many standard tools to model dependencies between random variables: evaluation of probabilities and moments, Kendall's tau, Spearman's rho, distribution function and density evaluation, fitting models to data through inverse moments or log-likelihood maximization, etc., for a wide range of classical parametric copula families. Moreover, the Sklar type, mimicking Sklar's theorem, allows building full models combining the copula with marginal specifications. These complex multivariate models are compatible with the broader Distributions.jl ecosystem, allowing one, e.g., to plug them directly into Turing.jl for Bayesian applications. In this talk, we present the new tools that we developed and their integration into the ecosystem, and we showcase the new functionalities now available to the practitioner.
The fact that this is native Julia allows beautiful applications to exotic number types that were not possible before, and we believe this package is a great addition to the Julia ecosystem.
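A minimal usage sketch of this workflow, based on the Copulas.jl documentation (exact signatures may differ between package versions):

```julia
using Distributions, Copulas

# A 2-dimensional Clayton copula tying together two marginals.
C = ClaytonCopula(2, 3.0)                  # dimension 2, parameter θ = 3

# Full multivariate model via Sklar's theorem: copula + marginals.
X = SklarDist(C, (Gamma(2, 3), Normal()))

# Because the result behaves like any Distributions.jl distribution,
# the usual generic tools apply directly:
u = rand(X, 1000)          # sampling 1000 bivariate observations
pdf(X, [1.0, 0.0])         # joint density evaluation

# Fitting a model of the same parametric family back to the sample:
# X̂ = fit(SklarDist{ClaytonCopula, Tuple{Gamma, Normal}}, u)
```

This compatibility with Distributions.jl is what makes it possible to pass such models on to, e.g., Turing.jl without extra glue code.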
From non-differentiability to discrete geometry: B-differential, hyperplanes and matroids
Baptiste Plaquevent-Jourdain

Systems of nonsmooth nonlinear equations of the form H(x) = 0 can often be solved by the semismooth Newton method. An adaptation of the classical Newton method, it yields fast local convergence at little cost per iteration, provided the B-differential of H contains only nonsingular Jacobians at the solution. Such nonsmooth H arise when complementarity problems are reformulated as nonsmooth equations via the componentwise minimum function. This talk describes some properties, as well as the computation, of the B-differential of the componentwise minimum of two affine vector functions. This question, which has connections with linear algebra, convex analysis and discrete geometry (hyperplane arrangements), can be answered numerically. We shall emphasize the role of duality and matroid circuits in this enumeration problem, and how theoretical insights serve the numerical side. These properties allow us to design a fundamentally different (dual) alternative to the state-of-the-art approach by Rada and Černý, which brings significant improvements. The Julia package in development offers features such as primal and dual approaches for hyperplane arrangements, the possibility to use rational coordinates, and central and non-central arrangements.

Seleroute.jl is an implementation of many state-of-the-art algorithms to compute optimal computer-network routing. These include oblivious routing, which accounts for demand uncertainty in the routing, and variants of fair routing. The possibilities of Julia have had a decisive impact on the design decisions.
Convex Hull Formulation for Optimal Operation of Storage Including Reserves
Maaike Elgersma

Energy storage systems have become a promising option to increase power-system flexibility and harness larger shares of variable renewable energy. To get a full picture of their potential operation and benefits, a realistic representation of their characteristics is essential in power system models. Energy storage models require binary variables to correctly model reserves and to ensure that there is no simultaneous charging and discharging. Solving mixed-integer programming (MIP) problems such as these is computationally demanding and can pose a significant impediment in large-scale models. A common practice is therefore to formulate them as linear programs (LPs), thus allowing simultaneous charging and discharging and resulting in practically infeasible solutions. We propose a tight linear program (LP), i.e., the convex hull, describing the operation of storage including reserves in one time period, which guarantees that there is no better LP approximation of its mixed-integer program (MIP) counterpart. It is obtained by writing the two disjunctive sets of constraints for charging and discharging, and applying the Fourier-Motzkin elimination procedure to reduce the dimensionality of the problem back to the original. We implemented in Julia the two case studies introduced by J.M. Arroyo et al. (2020) in "On the Use of a Convex Model for Bulk Storage in MIP-Based Power System Operation and Planning"; they show that our proposed formulation indeed results in an integer solution, contrary to previous findings in this field.
By embedding the proposed LP formulation into large optimization problems, we expect it to provide solutions equal or very close to the exact integer-feasible behavior of the storage; when used in its integer form, it is expected to speed up MIP problems.

We introduce Dionysos, a highly modular package for solving optimal control problems for complex dynamical systems, using state-of-the-art and experimental techniques from symbolic control, optimization, and learning. More often than not with cyber-physical systems, the only sensible way of developing a controller is by discretizing the different variables, thus transforming the model into a simple combinatorial problem on a finite-state mathematical object, called an abstraction of this system. Although this approach in principle offers a safety-critical framework, the available techniques suffer from important scalability issues. In order to render these techniques practical, it is necessary to construct smarter abstractions that differ from classical techniques by partitioning the state space in a non-trivial way. Dionysos features optimal control problem definitions, that is, a description of both the mathematical system and of a desired closed-loop behaviour. More importantly, it also features several abstraction-based optimal control strategies. In Dionysos, these strategies are seen as solvers, and inherit the powerful and convenient optimization framework of JuMP and MathOptInterface. Given a problem and a method, Dionysos is then able to solve the problem and output/visualize the obtained closed-loop dynamics using existing visualization tools such as RecipesBase. It is built on top of existing Julia packages such as HybridSystems, MathematicalSystems, LazySets and IntervalArithmetic. Examples in the code also make use of RigidBodyDynamics, MeshCat and MechanismGeometries. In this talk, we will present Dionysos' structure and go through the different modules of the package. We will also illustrate Dionysos by showing how it performs on famous examples in control theory, such as DC-DC converters and path-planning problems. Finally, we will present the performance of our package on these examples, illustrating the power resulting from the combination of an efficient programming language and state-of-the-art theoretical methods.
Solving optimal control problems with Julia
Jean-Baptiste Caillau, Joseph Gergaud

J.-B. Caillau, O. Cots, J. Gergaud, P. Martinon, S. Sed

The numerical solution of optimal control of dynamical systems is a rich process that typically involves modelling, optimisation, differential equations (most notably Hamiltonian ones), nonlinear equations (e.g. for shooting), path-following methods… and, at every step, automatic differentiation. While tremendous efforts are being made in each of the above-mentioned fields by the Julia community, there are still some missing parts needed to bridge the gap between them and obtain a fully satisfactory solving process, meaning: an abstract description, as close as possible to the mathematical problem formulation, together with efficient and reliable numerical computation. The Julia language has a lot to offer in this respect, and there are already excellent codes available [1, 2, 3, 4], although mostly oriented towards "direct solving" (that is, direct transcription of the original optimal control problem into a mathematical program). We report on recent experiments with the Julia package OptimalControl [5] on a variety of applied and more theoretical problems.

1. [ControlSystems](https://juliacontrol.github.io/ControlSystems.jl/dev/#JuliaControl)
2. [InfiniteOpt](https://github.com/infiniteopt/InfiniteOpt.jl)
3. [TrajectoryOptimization](http://roboticexplorationlab.org/TrajectoryOptimization.jl)
4. [Enzyme](https://enzyme.mit.edu/julia)
5. [control-toolbox](https://control-toolbox.org)
Latest developments for mixed-integer nonlinear programming in Artelys Knitro
Maxime Dufour

Artelys Knitro is a mathematical programming solver for nonlinear and mixed-integer nonlinear problems. As input, it accepts linear structures, quadratic structures and black-box functions, with, if possible, their first- and second-order derivatives. Knitro relies on derivative-based algorithms to find locally optimal solutions. Knitro finds the global optimum for convex problems. For non-convex problems, Knitro converges to a first-order stationary point (e.g. a local optimum) for continuous models and acts as a heuristic for mixed-integer problems. For nonlinear problems, Knitro includes two interior point algorithms, a sequential linear quadratic programming algorithm and a sequential quadratic programming (SQP) algorithm. For mixed-integer nonlinear problems, Knitro includes a nonlinear branch-and-bound algorithm, a Quesada-Grossmann branch-and-bound algorithm and a mixed-integer sequential quadratic programming algorithm. Since Artelys Knitro 13.0, the nonlinear branch-and-bound has been fully rewritten. The new nonlinear branch-and-bound is parallel and deterministic. Cut generation and cut management have been greatly improved by adapting ideas developed for mixed-integer linear programming. The portfolio of heuristics executed during the search has been extended. It also includes better branching strategies and a restart procedure. Finally, to improve solutions for non-convex problems, two ways of using multi-start within the nonlinear branch-and-bound have been implemented. In this talk, we will present the algorithms implemented in Artelys Knitro for mixed-integer nonlinear problems, and detail the recent developments in the nonlinear branch-and-bound algorithms.
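From Julia, Knitro is typically accessed through JuMP via the KNITRO.jl wrapper. The following is a hedged sketch of a small mixed-integer nonlinear model (it assumes a Knitro license is installed; solver options are omitted since attribute names vary across versions):

```julia
using JuMP
import KNITRO

# Small nonconvex MINLP handed to Knitro's nonlinear branch-and-bound.
model = Model(KNITRO.Optimizer)
@variable(model, 0 <= x <= 10)
@variable(model, 1 <= y <= 10, Int)      # integer variable
@constraint(model, x * y >= 3)           # nonconvex bilinear constraint
@objective(model, Min, (x - 2)^2 + (y - 3)^2)
optimize!(model)
println(termination_status(model), " ", value(x), " ", value(y))
```

Derivatives of the algebraic model are provided to the solver automatically by JuMP, so no hand-written gradients are needed for this kind of input.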
Julia Smooth Optimizers: Streamlined Research and Recent Developments of the Ecosystem
Abel Soares Siqueira
Julia Smooth Optimizers (JSO) is an organization created in 2015 focused on streamlining the development of research-level optimization solvers and providing high-performance linear algebra and optimization packages for end-users. We are the maintainers of the packages NLPModels.jl, CUTEst.jl, Krylov.jl, LinearOperators.jl, Percival.jl, JSOSolvers.jl, and around 40 other packages. In this talk, we will provide an overview of how JSO streamlines the process of going from a prototype to a high-performance solver and recent advances in some of our packages, especially end-user packages.
We provide the tools to allow quick prototyping and evaluation of new optimization solvers while permitting an upgrade to publication-ready benchmarking and reporting. The researcher can test their code with problems created with single-line AD-powered models, convert from JuMP or AMPL, or load from CUTEst. They can run benchmarks on our existing problem repositories and export the results to LaTeX tables and performance profiles. Furthermore, we provide tools to enable high-performance implementations, such as matrix factorization packages, matrix-free linear systems and least squares solvers, and trust-region and line search subsolvers.
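As a small illustration of this workflow, an AD-powered model and a JSO solver can be combined roughly as follows (package and function names are from the JSO organization; treat the exact calls as a sketch that may differ across versions):

```julia
using ADNLPModels, JSOSolvers

# Rosenbrock as a single-line AD-powered NLPModel:
# one expression for the objective, one vector for the starting point.
nlp = ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, [-1.2; 1.0])

# Solve with trunk, a trust-region solver from JSOSolvers.jl.
stats = trunk(nlp)
println(stats.status)     # solver outcome, e.g. :first_order
println(stats.solution)   # approximate minimizer
```

The same `nlp` object could instead come from CUTEst.jl or from a JuMP model, which is what makes benchmarking across problem repositories straightforward.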
Another endeavour inside JSO is to provide end-user solutions. We deliver optimization solvers, matrix-free and factorization-based linear system solvers, and least squares solvers. More recently, we have been working on a user-friendly interface for JSO solvers that will be a one-stop solution for all optimization needs. At the time of writing, it is not yet released.
Convex mixed-integer optimization algorithms with Boscia.jl and applications
Mathieu Besançon

Nonlinear and mixed-integer optimization have long remained separate fields with their own techniques, representations, and algorithms. In this talk, we will introduce a novel approach leveraging first-order methods for convex optimization based on the Frank-Wolfe algorithm. First-order methods are the usual choice for large-scale smooth optimization, but are typically not prime candidates in branch-and-bound algorithms because of their slower convergence and lower accuracy, which does not systematically provide the safe dual bound necessary to prune nodes in a branch-and-bound framework. We will show how we designed a convex mixed-integer algorithm leveraging techniques and algorithms from both the first-order convex and the mixed-integer literature, implemented in the Boscia.jl package built on top of FrankWolfe.jl, and showcase two applications where the constraint structure can be efficiently exploited in a Frank-Wolfe-type approach.

Many important optimization problems do not exhibit the convexity that would allow them to be solved to global optimality using classical convex optimization methods. In notable cases, such as with neural networks, we even fail to see what structure could be exploited to ensure global optimality. However, a wide variety of smooth nonlinear optimization problems can often be reformulated as polynomial optimization problems. Ignoring this important polynomial structure, these problems are often handled by local solvers such as Ipopt, even though these can only guarantee local optimality. On the other hand, the Macaulay or Sum-of-Squares (aka Lasserre) hierarchy converges to the global optimum. However, while the problem at each level of the hierarchy is solved in polynomial time, the size of this problem increases exponentially with the level of the hierarchy. The ability of local solvers to yield a locally optimal solution is therefore preferred when reaching convergence of the hierarchy is computationally too expensive. In this presentation, we show that a hierarchy leveraging both the Macaulay and the Sum-of-Squares approaches can converge at a lower level. Moreover, we show that feasible solutions can be obtained via rounding algorithms at any level of the hierarchy. This allows the methods to be considered for larger problems as well, for which reaching convergence of the hierarchy may be too expensive.

The Julia graphs ecosystem has emerged as a worthy competitor to alternatives in other languages (e.g. Python's networkx), promising to combine ease of use with scalability. Unfortunately, it contains a large number of packages with very different purposes, and navigating them can be tricky at first. In this tutorial, I will provide an overview of the ecosystem, highlighting the general interface introduced by Graphs.jl and its numerous extensions. I will then demonstrate how optimization algorithms on graphs can be implemented, both with and without JuMP.jl.
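As a small taste of that interface, here is a standard Graphs.jl shortest-path computation (the graph itself is illustrative):

```julia
using Graphs

# Build a small undirected cycle graph on 4 vertices.
g = SimpleGraph(4)
add_edge!(g, 1, 2); add_edge!(g, 2, 3)
add_edge!(g, 3, 4); add_edge!(g, 1, 4)

# Dijkstra from vertex 1; edges have unit weight by default.
state = dijkstra_shortest_paths(g, 1)
state.dists   # → [0, 1, 2, 1]: distances from vertex 1 to every vertex
```

The same `AbstractGraph` interface is what the ecosystem's extensions build on, so algorithms written against it work across graph implementations.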
Algorithm for the packaging return logistics at Renault: a Julia code in production
Louis Bouvier

This talk highlights the fruit of a partnership with Renault. Their return logistics require solving a continent-scale multi-attribute inventory routing problem (IRP). It corresponds to the following situation: a supplier manages the delivery of several commodities to its customers over a multiple-day horizon in a centralized manner. The supplier has to plan routes to deliver commodities from its depots to customers with the objective of minimizing inventory and routing costs. For each route, the starting depot, the ordered list of customers visited, as well as the quantities of each commodity to be delivered at each stop must be decided. With an average of 30 commodities, 16 depots, and 600 customers spread across a continent, our instances are orders of magnitude larger than those in the literature, and existing algorithms do not scale. We propose a large neighborhood search (LNS) implemented in Julia. We make an academic version of the code as well as a dataset of industrial instances publicly available. An industrial version of the code is currently in production at Renault and runs every day. We provide industrial feedback on this project.

We present a JuMP-based solver that combines a nested primal-dual decomposition technique and convex relaxation approaches for tackling non-convex multi-stage stochastic programming problems. The approach addresses optimal long-term water-supply infrastructure planning with feasibility constraints at operational timescales. We combine an outer primal decomposition of planning stages with inner dual decompositions of operating scenarios, with convexified non-anticipativity constraints relaxed for scenarios.

In the network alignment problem, we are given two graphs and are asked to match their respective vertices in order to optimize some similarity measure. The problem includes as a special case the fundamental problem of subgraph isomorphism, and arises in several applications, from computer vision to the analysis of protein-protein interaction networks. In this talk, we will give an overview of some algorithms for tackling the problem and discuss their Julia implementations.

We present a recently developed Julia implementation of the Domain Decomposition Method (DDM). DDM is a powerful framework for formulating preconditioning techniques. The objectives of our package are twofold: (1) to implement DDM primitives for easy prototyping of new preconditioning strategies, and (2) to leverage them to solve the linear systems that arise in Computational Fluid Dynamics applications. We will highlight the key differences between our implementation and state-of-the-art DDM implementations such as FreeFEM and PETSc, and share the lessons we have learned along the way. The initial release of our package includes classical one-level and two-level preconditioners; future developments will incorporate advanced preconditioners such as GenEO and multi-level techniques.
- Mathieu Besançon (Zuse Institute Berlin)
- Vincent Duval (INRIA Paris)
- Xavier Gandibleux (Laboratoire des Sciences du Numérique Nantes)
- Vincent Leclère (CERMICS Marne la Vallée)
- Antoine Levitt (Laboratoire de Mathématiques Orsay)
- Pierre Navaro (IRMAR Rennes - Groupe Calcul)
- Francois Pacaud (Mines Paris)
- Agnès Plateau (CNAM Paris)