Classical scheduling and Packing

I have been working on scheduling for many years. My main contributions concern extensions of scheduling models and algorithms to take communications into account, and the design of optimization algorithms on interconnection networks (such as broadcasting or total exchange). More recently, I have been interested in designing low-cost approximation algorithms for scheduling parallel jobs (moldable and malleable jobs); here, scheduling is viewed as a two-dimensional packing problem, where each job is a rectangle whose width is its number of processors and whose height is its running time. Today, I am interested in scheduling with non-availability periods (reservations): complexity analysis, approximation algorithms, and implementation on an actual platform.
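To illustrate the packing view of scheduling, here is a minimal sketch (a generic shelf heuristic for rigid jobs, not one of the author's algorithms): jobs are rectangles (processors × time) packed into a strip of width m processors.

```python
# Sketch: scheduling rigid parallel jobs on m processors seen as 2D strip
# packing. Each job is (procs, time); a "shelf" heuristic (first-fit,
# decreasing running time) stacks jobs side by side on horizontal shelves.

def shelf_schedule(jobs, m):
    """jobs: list of (procs, time). Returns (placements, makespan), where
    placements maps each job to the start time of its shelf."""
    jobs = sorted(jobs, key=lambda j: j[1], reverse=True)  # decreasing time
    shelves = []      # each shelf: [start_time, height, used_processors]
    placements = []
    for procs, time in jobs:
        for shelf in shelves:
            # Job fits on this shelf if processors remain; its time is
            # never larger than the shelf height thanks to the sorting.
            if shelf[2] + procs <= m:
                placements.append(((procs, time), shelf[0]))
                shelf[2] += procs
                break
        else:
            # Open a new shelf on top of the previous one.
            start = shelves[-1][0] + shelves[-1][1] if shelves else 0.0
            shelves.append([start, time, procs])
            placements.append(((procs, time), start))
    makespan = shelves[-1][0] + shelves[-1][1] if shelves else 0.0
    return placements, makespan
```

For instance, on m = 4 processors, the jobs (2, 3), (1, 2), (3, 1) yield two shelves and a makespan of 4.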

Implementing parallel applications and Bio-Computing

I have always been interested in efficiently implementing large parallel applications on successive generations of parallel platforms. This is very useful for the users, but also for computer scientists, as it helps identify new hard problems. Originally, such applications were mainly restricted to regular numerical algorithms; then, following the evolution of parallel platforms, the target applications diversified considerably. I worked on the parallel implementation of adaptive meshing in oceanography for the simulation of ocean flows, and on parallel molecular dynamics. In 2001, I started working on the parallel construction of evolutionary trees and on other combinatorial optimization problems.

Adaptive algorithms and sensitivity analysis

The emergence of new computing platforms has profoundly changed the way algorithms are designed. One of the main features requested today is that algorithms adapt themselves to changing conditions (arising from external factors on the platforms, from interactions with other users, or from internal factors within the applications). It is possible to adapt existing algorithms by selecting the most adequate one for the data at hand, but a more challenging problem is to automatically determine good combinations of algorithms. Another view of this problem is to study and design algorithms that can absorb the effects of disturbances or uncertainties in an evolving environment. Several approaches are possible, from sensitivity analysis to robustness; I believe that flexibility is the right one. Starting from an initial solution computed with estimated data, correction mechanisms are added at run time so as to react to disturbances. This approach includes stability analysis.
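As a toy illustration of the flexibility approach (my own example, not the author's method): an initial job order is computed from estimated durations, and a run-time correction mechanism (reassigning each job to the currently least-loaded machine) absorbs deviations between estimated and actual durations.

```python
# Sketch of "flexibility": plan with estimated data, correct at run time.
import heapq

def initial_order(est_times):
    """Initial solution from estimates: Longest Processing Time first."""
    return sorted(range(len(est_times)), key=lambda j: est_times[j],
                  reverse=True)

def run_with_correction(order, actual_times, m):
    """Run-time correction: each job in the planned order is assigned to
    the machine that is least loaded *at that moment*, so disturbances in
    the actual durations are absorbed. Returns the makespan."""
    heap = [(0.0, i) for i in range(m)]   # (current load, machine id)
    heapq.heapify(heap)
    for j in order:
        load, i = heapq.heappop(heap)
        heapq.heappush(heap, (load + actual_times[j], i))
    return max(load for load, _ in heap)
```

When the estimates are exact (e.g. durations 4, 3, 2, 2 on two machines), the corrected run matches the planned LPT schedule; when actual durations deviate, the same mechanism still produces a valid schedule without recomputing the plan.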

Multi-objective optimization

There exist many efficient techniques for solving classical combinatorial problems. Today, we are looking for solutions to problems with diversified, and increasingly simultaneous, objectives. A few years ago, I started studying classical multi-objective problems in order to determine one "good" compromise solution (an approximation of the zenith point, the ideal vector formed by the optimum of each objective taken separately). We are now looking for approximation algorithms for the Pareto set. Today, I am working on how to adapt the existing methods to many objectives. This topic has many links with game theory.
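For concreteness, here is a minimal generic routine (not one of the author's algorithms) that extracts the Pareto set of a finite list of bi-objective solutions, assuming both objectives are to be minimized.

```python
# Sketch: the Pareto set of a finite list of bi-objective points.
# A point dominates another if it is no worse on both objectives and
# strictly better on at least one (both objectives minimized here).

def pareto_set(points):
    """points: list of (f1, f2) objective vectors.
    Returns the non-dominated points, sorted by the first objective."""
    pts = sorted(set(points))          # sort by f1, break ties by f2
    front, best_f2 = [], float("inf")
    for f1, f2 in pts:
        if f2 < best_f2:               # not dominated by any earlier point
            front.append((f1, f2))
            best_f2 = f2
    return front
```

A single sorted sweep suffices for two objectives; with many objectives the dominance test no longer reduces to one scan, which is part of what makes the many-objective case harder.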

Analysis and optimization of Fault-Tolerance algorithms

This is a new topic, started in 2007. Today's computing platforms are larger and larger, often involving more than a thousand processors. When designing parallel algorithms, we cannot assume that they will run without failures. I studied this problem from the perspective of bi-objective optimization (determining the best trade-off between performance and reliability). Recently, we proposed a new probabilistic model for computing the expected execution time under checkpoint-restart mechanisms.
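To give a flavor of this kind of analysis, here is a sketch based on two standard textbook results (Young's first-order approximation of the checkpoint interval, and the exact expected time per segment under exponentially distributed failures); this is given for illustration only and is not necessarily the probabilistic model mentioned above.

```python
# Sketch: expected execution time with checkpoint-restart under
# exponentially distributed failures (mean time between failures mu).
# Each checkpoint costs C seconds; a restart costs R seconds.
import math

def young_interval(C, mu):
    """Young's first-order approximation of the checkpoint interval
    that minimizes the expected waste: sqrt(2 * C * mu)."""
    return math.sqrt(2.0 * C * mu)

def expected_time(W, C, mu, R=0.0):
    """Expected time to complete W seconds of work, checkpointing at
    (approximately) Young's interval, using the exact per-segment
    expectation for exponential failures."""
    tau = young_interval(C, mu)
    n = max(1, round(W / tau))         # number of checkpointed segments
    seg = W / n                        # work per segment
    lam = 1.0 / mu                     # failure rate
    # Exact expected time for one segment of seg + C seconds:
    per_segment = (math.exp(lam * R) / lam) * \
                  (math.exp(lam * (seg + C)) - 1.0)
    return n * per_segment
```

When failures are rare (mu much larger than W), the expected time tends to W plus the checkpoint overhead, as one would expect; as mu shrinks, the exponential terms capture the growing cost of lost work and restarts.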