Public Seminar of our project-team Lifeware

To receive the program by e-mail, please register at https://sympa.inria.fr/sympa/info/lifeware.sem

Our public seminars typically last 45 minutes plus questions.

Occasionally, seminars are held in hybrid form, with a Webex link posted on this page shortly before the seminar. In that case you are kindly asked to keep your camera on and your microphone off during the seminar. Seminars are not normally recorded, but by participating you accept that a recording may be made.


Monday-Tuesday March 18-19 2024, Workshop around the HDR defense of Jakob Ruess, Amphi Sophie Germain

Monday March 18

09:30-10:00 Coffee break

10:00-10:40 Ovidiu Radulescu (University of Montpellier):

Inferring stochastic models of gene transcription from initiation events by computer algebra

10:45-11:25 Heinz Koeppl (TU Darmstadt):

Precision and Energy in Gene Expression

11.30-13.00 Lunch at Ecole Polytechnique

13.00-13.30 Coffee break

13.30-14.10 Mustafa Khammash (ETH Zurich):

Engineering Adaptation

14.15-14.55 Ramon Grima (University of Edinburgh):

Mechanistic models of transcription: analytical solutions and applications to inference

14.55-15.30 Coffee break

15.30 Jakob Ruess (Inria) HDR defense

Identification and control of stochastic reaction networks in single cells and microbial populations

...

Tuesday March 19th

9.30-9.45 Coffee break

9.45-10.25 Guido Sanguinetti (SISSA):

RNA velocity meets neural ODEs: interpretable modelling of cell dynamics from single-cell RNA-seq data

10.30-11.10 Philipp Thomas (Imperial College London):

Stochastic Gene Expression in Lineage Trees

11.15-11.55 Guillaume Ballif (Inria Saclay):

An extension of the Chemical Master Equation for population process

12.00 Lunch at Ecole Polytechnique


Monday March 11th 2024, 11.00am, MAX team seminar, Room Grace Hopper

François Fages, Inria, Lifeware

On rule-based models of dynamical systems.

Chemical reaction networks (CRNs) constitute a standard formalism used in systems biology to represent high-level cell processes in terms of low-level molecular interactions. A CRN is a finite set of formal kinetic reaction rules with a well-defined hypergraph structure and several possible dynamics. One CRN can be interpreted in a hierarchy of formal semantics related by either approximation or abstraction relationships, including the differential semantics (ordinary differential equations), the stochastic semantics (continuous-time Markov chain), the probabilistic semantics (probabilistic Petri net, forgetting about continuous time), the discrete semantics (Petri net, forgetting about transition probabilities), the Boolean semantics (forgetting about molecule numbers), or just the hypergraph structure semantics. We shall show how these different semantics come with different analysis tools which can reveal various dynamical properties of the other interpretations. In our CRN modeling software BIOCHAM (biochemical abstract machine), these static analysis tools are complemented by dynamic analysis tools based on quantitative temporal logic, and by an original CRN synthesis symbolic computation pipeline for compiling any computable real function into an elementary CRN over a finite set of abstract molecular species.
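
To make the hierarchy of semantics concrete, here is a minimal Python sketch (an illustration, not BIOCHAM itself) that reads one toy mass-action reaction, A + B -> C, under two of the semantics listed above: the differential semantics as an ODE and the stochastic semantics as a Gillespie simulation of the continuous-time Markov chain. The rate constant and initial amounts are made up.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy CRN with a single mass-action reaction A + B -> C, rate constant k.
k = 1.0

# Differential semantics: ordinary differential equations on concentrations.
def ode_rhs(t, y):
    a, b, c = y
    v = k * a * b
    return [-v, -v, v]

ode = solve_ivp(ode_rhs, (0.0, 10.0), [1.0, 0.8, 0.0])
print("ODE semantics, C(10) =", ode.y[2, -1])

# Stochastic semantics: continuous-time Markov chain on molecule counts (Gillespie SSA).
def ssa(a, b, c, t_end=10.0, seed=0):
    rng = np.random.default_rng(seed)
    t = 0.0
    while True:
        propensity = k * a * b
        if propensity == 0.0:
            break
        t += rng.exponential(1.0 / propensity)
        if t > t_end:
            break
        a, b, c = a - 1, b - 1, c + 1   # one A + B -> C firing
    return c

print("CTMC semantics, one sampled trajectory, C(10) =", ssa(100, 80, 0))
```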


Tuesday February 20th 2024, 11.00am, Room Grace Hopper

Christian Jendreiko, Univ. Dusseldorf, Germany

Three projects of artistic sonification (slides)

In this lecture, Christian Jendreiko will introduce his take on the method of sonification. Within Jendreiko’s artistic research, sonification serves two objectives: 1) making scientific models audible to gain a better understanding of them, and 2) exploring the generative potential of scientific models for producing musical output. Jendreiko will give insight into three projects: reading a reaction network as a score, developing sound performances from quantum structures, and rewriting partition logics developed by the quantum physicist Karl Svozil as DCGs to generate visual and acoustic patterns.

Christian Jendreiko (born 1969) is Professor of Experimental Design at HSD, University of Applied Sciences, Düsseldorf, Germany. A set of his audio works is part of the collection of the Musée d'Art Moderne, Centre Pompidou, Paris.


Wednesday January 24th 2024, 10.30am, Room Grace Hopper

Josué Tchouanti, ENS Lyon

Individual-based approach to modeling bacterial metabolic heterogeneity

Bacterial metabolism refers to the chemical reactions by which bacteria use a substrate to carry out the different phases of their cell cycle. Although it was long thought that metabolic activity was homogeneous within an initially clonal population, recent experiments have shown that significant heterogeneity can be observed. In this talk, we propose a macroscopic modeling approach to bacterial population dynamics in a batch culture in the presence of two sugars. Guided by experimental data, this model makes it possible to highlight the influence of certain metabolic parameters by comparing different bacterial strains of the same species. We then propose a second model that takes into account the individual expression of a protein, and for which we establish a large-population scaling limit. We also show that despite the non-linearity and degeneracy of the equation that characterizes it, the scaling limit obtained is unique and describes a regular distribution of protein expression within the population.


Thursday December 7th 2023, 2pm, Room Grace Hopper (slides)

Maxime Mahout, UPSaclay

aspefm for computing elementary flux modes and minimal cut sets in metabolic networks

aspefm is a logic-programming method for computing Elementary Flux Modes (EFMs), based on Answer Set Programming (ASP) extended with linear constraints. The method can recover metabolic pathways when classical EFM computation methods cannot, query the network directly, and avoid memory blow-up. In particular, it proves efficient for computing large minimal pathways, in comparison with MILP methods. The method can take into account important biological constraints of all kinds, which we illustrated on a central metabolic network of Escherichia coli. It is also applicable to genome-scale networks.

In collaboration with Ross Carlson (Montana State University), our method was applied to the pathogenic bacterium Pseudomonas aeruginosa (PA), which is present in ~80% of chronic wounds. PA uses ecological strategies different from those of model bacteria such as E. coli. It is generally found in wounds together with another pathogenic bacterium, Staphylococcus aureus (SA). We hypothesize that their two metabolisms are complementary, allowing a higher biomass production and leading to poor prognoses for patients. Extending our tool aspefm to the analysis of Minimal Cut Sets (MCSs) on a consortium model of these two bacteria allowed us to identify metabolites whose exchange between the two bacteria could compensate for phenotypes predicted to be lethal, and to explore potential therapeutic targets against the bacteria. In another setting, we also applied our EFM computation method to the metabolism of human cancer cells and to the formation of the tumor stroma.


Tuesday December 5th 2023, 10.30am, Room Grace Hopper (slides)

Van-Giang Trinh, Univ. Aix-Marseille

Trap spaces of Boolean networks are conflict-free siphons of their Petri net encoding

Boolean network modeling of gene regulation, but also of post-transcriptomic systems, has proven over the years that it can bring powerful analyses and corresponding insight in the many cases where precise biological data are not sufficiently available to build a detailed quantitative model. Besides simulation, the analysis of such models is mostly based on attractor computation, since attractors correspond roughly to observable biological phenotypes. The recent use of trap spaces was a real breakthrough in that field, making it possible to consider medium-sized models that used to be out of reach. However, with the continuing increase in model size and in the complexity of Boolean update functions, the state-of-the-art computation of minimal trap spaces based on prime implicants shows its limits due to the difficulty of the prime-implicant computation.

In this work we explore and prove, for the first time, a connection between the trap spaces of a general Boolean network and the siphons of its Petri net encoding. Besides important theoretical applications for studying properties of trap spaces, this connection enables us to propose an alternative approach to compute the minimal trap spaces, and hence the complex attractors, of a general Boolean network. It replaces the need for prime implicants by a completely different technique, namely the enumeration of maximal siphons in the Petri net encoding of the original model. We then demonstrate its efficiency and compare it to the state-of-the-art methods on a large collection of real-world and randomly generated models. Finally, we outline several interesting research topics inspired by the connection between trap spaces in Boolean networks and siphons in Petri nets.

[1] Trinh, Van-Giang, Belaid Benhamou, Kunihiko Hiraishi, and Sylvain Soliman. "Minimal trap spaces of Logical models are maximal siphons of their Petri net encoding." In International Conference on Computational Methods in Systems Biology, pp. 158-176. Cham: Springer International Publishing, 2022.

[2] Trinh, Van-Giang, Belaid Benhamou, and Sylvain Soliman. "Trap spaces of Boolean networks are conflict-free siphons of their Petri net encoding." Theoretical Computer Science 971 (2023): 114073.
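
As a concrete illustration of the objects involved (not of the siphon-based algorithm of [1,2]), the following brute-force Python sketch enumerates the trap spaces of a small hypothetical 3-node Boolean network: a subspace, encoded as a partial assignment with free coordinates marked '*', is a trap space when the update functions map every state it contains back into it.

```python
from itertools import product

# Hypothetical 3-node Boolean network; f[i] is the update function of node i.
f = [
    lambda x: x[0] or x[1],        # node 0
    lambda x: x[0] and not x[2],   # node 1
    lambda x: x[2],                # node 2
]
n = len(f)

def states_of(subspace):
    """All states compatible with a partial assignment ('*' marks a free coordinate)."""
    return product(*[[0, 1] if v == "*" else [v] for v in subspace])

def is_trap_space(subspace):
    """A subspace is a trap space iff every update keeps each of its states inside it."""
    for x in states_of(subspace):
        image = [int(g(x)) for g in f]
        if any(v != "*" and image[i] != v for i, v in enumerate(subspace)):
            return False
    return True

trap_spaces = [s for s in product([0, 1, "*"], repeat=n) if is_trap_space(s)]
print(trap_spaces)   # includes the full space ('*', '*', '*') and all fixed points
```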


Thursday October 12th 2023, 2pm, Amphi Sophie Germain, Build. Alan Turing, Inria Saclay.

Marine Collery PhD thesis public defence,

Expressive classification rule learning with an emphasis on learning from sequential data.

Machine learning, and specifically neural networks, has seen significant advancements in classification tasks across various fields. Although they achieve high accuracy, their decision-making process is often incomprehensible to humans. Rule-based methods, on the other hand, are interpretable and widely used in industry with Business Rules Management Systems. In practice, however, these rules are written manually by experts and cannot easily be replaced by learned rules because of the limited complexity of rules obtained with existing methods. In this thesis, we propose an end-to-end neural-based approach to learn expressive rules for classification problems. Different levels of expressiveness in rules are presented, implemented and evaluated. First, the learning of basic disjunctive normal form (AND-OR) rules with a neural network is studied. Second, solutions to support sequential data are introduced with a recursive and a convolutional approach. Finally, the model is extended to learn more expressive rules with predefined aggregation functions and overall complex rules.


Friday September 22nd 2023, 2pm, G2 Building, CEA Évry

Sahar Aghakhani PhD thesis public defence,

Computational modeling of metabolic reprogramming in rheumatoid arthritis synovial fibroblasts and cancer-associated fibroblasts


Tuesday June 27th 2023, 10h30 room Grace Hopper

10h30 Sara Riva, Univ. Bordeaux

Hypotheses and Equations on Discrete Dynamical Systems

A Finite Discrete-time Dynamical System (DDS) consists of a finite set X, called the state space, and a function f, called the next-state map (which associates to a state v the state f(v)). DDS are a formal tool for modeling phenomena that appear in Physics, Mathematics, Biology, and, of course, Computer Science. While the mathematical formalization and the results found so far are elegant and meaningful, they are often not very practical because of their high computational cost. It is known in the literature that DDS equipped with appropriate sum and product operations form a commutative semiring. This algebraic structure allows us to write polynomial equations in which the coefficients and unknowns are DDS. In particular, if we are interested in some dynamics derived from experimental data, we can write an equation with these dynamics as a constant right-hand term and model assumptions about the function f (or its properties) in a polynomial left-hand term. Finding solutions to this equation allows us to better understand the phenomenon and its properties. In addition, equations on DDS allow the study of different decompositions and factorizations (or, more generally, simplifications) of dynamics of interest. This approach is interesting but it has important limitations from a computational point of view. Solving a polynomial equation on DDS (with several variables) is, in general, undecidable, and even if we focus on the case of hypothesis validation, the computational cost remains high. The idea is then to look for approximations that give relevant information about the solutions of the original equation. It is possible to introduce three abstractions (simpler equations) to identify: the number of states of the variables, the asymptotic behavior, or the transient behavior (what happens before the system stabilizes). Each one is built, from a theoretical and algorithmic point of view, to introduce a method to perform hypothesis validation on DDS. Through algebraic transformations, it is possible to enumerate the solutions of a polynomial equation with a constant term by enumerating a finite number of simpler equations. Finally, the connection between the solutions of these simple equations and the cancellation problem, known in graph theory, is explored to find a linear upper bound on the number of solutions.
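
For readers unfamiliar with the algebraic structure mentioned above, here is a small Python sketch of finite DDS represented as next-state maps, with the sum taken as the disjoint union of the dynamics graphs and the product as the componentwise map on the product state space; the two example systems are invented for illustration.

```python
# A finite DDS is a next-state map f : X -> X; here X = {0, ..., n-1} and f is a dict.
def dds_sum(f, g):
    """Sum of two DDS: disjoint union of their dynamics graphs."""
    n = len(f)
    h = dict(f)
    h.update({x + n: g[x] + n for x in g})   # shift g's states after f's
    return h

def dds_product(f, g):
    """Product of two DDS: componentwise next-state map on the product state space."""
    return {(x, y): (f[x], g[y]) for x in f for y in g}

# Two illustrative systems: a 3-cycle and a 2-state system falling onto a fixed point.
f = {0: 1, 1: 2, 2: 0}
g = {0: 0, 1: 0}

print(len(dds_sum(f, g)), "states in f + g")        # 5
print(len(dds_product(f, g)), "states in f * g")    # 6
```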


Tuesday March 28th 2023, 11h00 room Grace Hopper

Visio expired

11h00 Corentin Briat, formerly ETH Zurich

Cybergenetics: A Control Theory for Living Cells - First Results, Open Problems and Research Directions.

The objective of this talk is to give an idea of the problems encountered in the control of biological systems, and of how they can be solved. In particular, we will discuss the concept of reaction networks as models for a wide class of dynamical systems including biological, ecological, and epidemiological systems. The concept of noise or randomness and its role in those systems will also be discussed. When starting the discussion on the control of biological processes, a parallel between the concepts of regulation, integral action, homeostasis and perfect adaptation will be drawn and will serve as a bridge between control theory and biology. Various control paradigms will be briefly introduced, with a specific emphasis on the in-vivo control of cellular networks, that is, the design of controllers that can be implemented inside living organisms such as bacteria. A solution to the robust regulation problem based on the so-called Antithetic Integral Controller will also be discussed and shown to theoretically work both in the deterministic and the stochastic settings. Simulations and experimental results will be shown to illustrate the results. Open problems in the field will finally be discussed, together with some tentative research directions.


Tuesday 7 February 2023, 11h00 room Grace Hopper

+ visio link expired

11h00 Mathieu Hemery, EPI Lifeware Inria Saclay

On computing logarithm and derivative functions with chemical reactions (slides)

Chemical reaction networks (CRNs) are a widely adopted formalism to model biochemical processes in terms of molecular interactions. They are also the core language of Biocham, where we have implemented different compilation pipelines that allow us to automatically translate mathematical functions or programs into CRNs able to compute them. Despite the Turing completeness of finite continuous CRNs shown in [1], adding the stability requirement for online computation limits the expressivity of CRN computation to algebraic functions only [2], hence excluding the logarithm. Computing the derivative of an input signal does not really fit into that computational framework either, since computing the exact derivative would require looking forward in time.

Using intuition in one case and an evolutionary algorithm [3] in the other, we have nonetheless managed to design CRNs that compute correct approximations of the log and derivative functions, which can be explained by theoretical arguments. This illustrates some differences, in terms of CRN architecture and possibilities for concrete implementation, between compilation-based, evolution-based and manual CRN design approaches.
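
As a toy illustration of what online computation at steady state means in this setting (this is not the log or derivative CRN of the talk), the mass-action CRN A + B -> A + B + C together with C -> nothing stabilizes the concentration of C at the product of its two inputs; a small numerical check:

```python
from scipy.integrate import solve_ivp

# Mass-action CRN:  A + B -> A + B + C   (rate k1 * A * B, A and B act as catalysts)
#                   C -> (nothing)       (rate k2 * C)
# At steady state C = (k1 / k2) * A * B, i.e. the CRN computes the product of its inputs.
k1 = k2 = 1.0
A, B = 2.0, 3.0   # illustrative input concentrations, held constant

def rhs(t, y):
    return [k1 * A * B - k2 * y[0]]

sol = solve_ivp(rhs, (0.0, 20.0), [0.0])
print("C at t=20:", sol.y[0, -1], "  expected A*B =", A * B)
```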

[1] Fages, F., Le Guludec, G., Bournez, O., & Pouly, A. (2017). Strong Turing completeness of continuous chemical reaction networks and compilation of mixed analog-digital programs. In Computational Methods in Systems Biology: 15th International Conference, CMSB 2017, Darmstadt, Germany, September 27–29, 2017, Proceedings 15 (pp. 108-127). Springer International Publishing.

[2] Hemery, M., & Fages, F. (2022, August). Algebraic Biochemistry: A Framework for Analog Online Computation in Cells. In Computational Methods in Systems Biology: 20th International Conference, CMSB 2022, Bucharest, Romania, September 14–16, 2022, Proceedings (pp. 3-20). Cham: Springer International Publishing.

[3] Hemery, M., & François, P. (2019). In silico evolution of biochemical log-response. The Journal of Physical Chemistry B, 123(10), 2235-2243.


Tuesday 24 January 2023, 10h00 room Grace Hopper + visio expired

10h00-11h00 Jakob Ruess, EPI Lifeware Inria Saclay

BridgingScales: bridging the gaps between synthetic circuit design and emerging dynamics of heterogeneous populations

At the single-cell level, biochemical processes are inherently stochastic. Such processes are typically studied using models based on stochastic chemical kinetics, governed by a chemical master equation (CME). The CME describes the time evolution of the probability distribution over system states and has been a tremendously helpful tool in shedding light on the functioning of cellular processes. However, both in nature and the majority of lab experiments, single cells are not living in isolation but are part of a growing population or community. In such contexts, stochasticity at the single-cell scale leads to population heterogeneity and cells may be subject to population processes, such as growth and selection, that drive the population distribution away from the probability distribution of the single-cell process.

In this talk, I will provide an overview of the research topics that will be pursued in the BridgingScales ERC project. I will first discuss how the advancement of research in synthetic biology creates an increasing need for modeling approaches that take both the single-cell and the population scale into account. I will then discuss work that we have done in the past on coupling the CME, or a Fokker-Planck approximation of the CME, to population-level processes and on solving optimal control problems for the resulting multi-scale models. I will conclude by discussing important future research directions that need to be pursued to aid the design and control of synthetically engineered microbial populations and consortia.

11h00-11h30 Guillaume Ballif, Post-Doc Pasteur Institute, EPI Lifeware Inria Saclay

Stochastic multiscale modeling for population dynamics: application to the ovarian follicle population

The modelling of morphogenesis processes with successive maturation stages generally involves structured population models. Individuals are sorted according to a phenotypic structuring variable (e.g. maturity), and the whole population evolves through birth, death and migration processes. A follicle (a tissue containing a cell that will become an egg cell, if the follicle grows and survives) is the main entity involved in the reproduction of mammals. In this talk, I will present a multiscale model based on continuous-time Markov chains, developed to study the influence of nonlinear and nonlocal regulations on the population of ovarian follicles throughout the reproductive life. Taking advantage of the different timescales and using perturbation theory, it is possible to prove, under certain assumptions, the existence and uniqueness of a reduced stochastic model that captures well the behaviour of each timescale. In a second part, I will present a comparison of the model with experimental data on ovarian follicle populations. I will detail the theoretical and practical study of the identifiability of the model with respect to the data and some biological interpretations.

11h30-12h00 Henri Mermoz Kouyé, Post-Doc candidate, EPI Lifeware Inria Saclay

A sensitivity analysis approach for compartmental models based on continuous time Markov chains (slides)

Stochastic models are increasingly used in various fields (epidemiology, biology, chemistry, physics, etc.) and can be based on stochastic processes such as continuous-time Markov chains (CTMCs). The use of these processes to describe a dynamic phenomenon makes it possible to include randomness. Because of this so-called intrinsic randomness, the global sensitivity analysis of stochastic models is challenging, especially when the randomness acts as a noise on the outputs of interest.

In this talk, I will present a sensitivity analysis approach for compartmental models based on CTMCs. This approach consists in separating the two sources of uncertainty in these models, namely parameter uncertainty (X) and intrinsic randomness (Z), so that the model outputs can be represented as deterministic functions of X and Z. Such an approach thus allows one to perform global sensitivity analysis of these models with methods such as Sobol' indices. Furthermore, characterizing Z makes it possible to estimate not only the indices associated with the uncertain parameters of the models but also those of the intrinsic randomness. An application of this approach to a compartmental model describing the propagation of SARS-CoV-2 is provided.
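
A minimal sketch of the core idea, treating the intrinsic randomness as an explicit input: below, a hypothetical stochastic model is written as a deterministic function output(x, z), where z seeds the random number generator, and the first-order Sobol' index of the parameter x is estimated with a standard pick-freeze scheme. The model, parameter range and sample sizes are all invented for illustration and are not those of the talk.

```python
import numpy as np

def output(x, z):
    """Hypothetical stochastic model: a small birth-death CTMC whose birth rate is the
    uncertain parameter x; the integer z seeds the intrinsic randomness."""
    rng = np.random.default_rng(z)
    n, t = 0, 0.0
    while True:
        birth, death = x, 0.5 * n
        total = birth + death
        t += rng.exponential(1.0 / total)
        if t >= 5.0:
            return n
        n += 1 if rng.random() < birth / total else -1

rng = np.random.default_rng(0)
N = 2000
xs = rng.uniform(1.0, 4.0, size=N)       # uncertain parameter X
z1 = rng.integers(0, 2**31, size=N)      # intrinsic randomness Z
z2 = rng.integers(0, 2**31, size=N)      # independent copy Z'

y  = np.array([output(x, z) for x, z in zip(xs, z1)])
yp = np.array([output(x, z) for x, z in zip(xs, z2)])   # same X, fresh Z (pick-freeze)

s_x = np.cov(y, yp)[0, 1] / np.var(y, ddof=1)
print("estimated first-order Sobol' index of X:", round(s_x, 3))
```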


Tuesday Oct. 4th, 2022, 2pm, Amphi Sophie Germain, Bât. Turing, Inria Saclay + visio

Elea Thibault-Greugny, Johnson&Johnson, EPI Lifeware Inria Saclay, ED IPP

PhD thesis defence: Computational modeling approaches to multifactorial aspects of atopic dermatitis


Friday June 17th, 2022, 2pm, Amphi Sophie Germain, Bât. Turing, Inria Saclay + visio webex

Jeremy Grignard, Institut de Recherche Servier, EPI Lifeware Inria Saclay, ED IPP

PhD thesis defence: Computational methods to improve the early phases of drug discovery

The drug discovery process is long, costly and very risky. The objective of this PhD thesis is to improve the relevance of the early phases of pharmaceutical research by developing computational methods. The first contribution is the development of the Pegasus knowledge graph, in order to capitalize on the heterogeneous pharmaco-biological data of multiple origins available in the pharmaceutical sector. The industrial applications of Pegasus address issues arising in therapeutic projects and make it possible to characterize off-target effects of perturbing compounds, to design new experiments, and to identify focused screening libraries. The second contribution is the development of an algorithm for identifying positive control compounds and of a normalization algorithm, in order to improve the design and analysis of high-content phenotypic screening experiments. These algorithms make it possible to normalize the phenotypic signatures obtained from screening campaigns and to integrate informative phenotypic similarities into the Pegasus knowledge graph. The third contribution is the development of a mathematical model of the microtubule tyrosination cycle which, on the one hand, explains the inactivity in cells of chemical compounds shown to be active outside the cell and, on the other hand, suggests the need to activate two reactions of this cycle in synergy in order to obtain an effect in cellular models. This illustrates the contribution of mathematical modeling both for predicting and understanding the counter-intuitive dynamics of biochemical processes, which cannot be represented by static knowledge graphs such as Pegasus, and for guiding the design of new screening experiments. The scientific contributions and industrial applications of this thesis were developed in the context of the early phases of drug discovery and are intended to extend to the clinical phases of the pharmaceutical process.


Wednesday April 20th 2022

10.30 Jakob Ruess (Inria Paris, Pasteur Institute) Room Grace Hopper - Visio link: https://rendez-vous.renater.fr/lifeware

From single cells to microbial consortia and back: stochastic chemical kinetics coupled to population dynamics (slides)

At the single-cell level, biochemical processes are inherently stochastic. Such processes are typically studied using models based on stochastic chemical kinetics, governed by a chemical master equation (CME). The CME describes the time evolution of the probability distribution over system states and has been a tremendously helpful tool in shedding light on the functioning of cellular processes. However, both in nature and the majority of lab experiments, single cells are not living in isolation but are part of a growing population or community. In such contexts, stochasticity at the single-cell scale leads to population heterogeneity and cells may be subject to population processes, such as selection, that drive the population distribution away from the probability distribution of the single-cell process. Here, I will present our work on augmenting the CME to construct models that capture coupled dynamics of single-cell and population processes. I will then present an optogenetic recombination system that allows one to partition yeast populations into different cell types via external application of blue light to cells. I will show how this system allows one to create and dynamically control simple artificial yeast consortia and demonstrate how augmented CME models can be used to predict emerging population dynamics from a specification of the single-cell recombination system.
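
To make the coupling between single-cell stochasticity and population processes concrete, here is a minimal agent-based Python sketch (an illustration of the phenomenon, not of the augmented CME models or of the optogenetic system discussed in the talk): each cell runs a simple production/degradation process, and cells divide at a state-dependent rate, so selection skews the population away from the single-cell stationary distribution. All rates and the division-rate function are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-cell process: protein produced at rate K_PROD, degraded at rate GAMMA * x,
# so the single-cell stationary law is Poisson(K_PROD / GAMMA).
# Population process: each cell divides at a state-dependent rate (selection); daughters
# simply inherit the mother's state here. All rates are illustrative assumptions.
K_PROD, GAMMA = 5.0, 1.0

def division_rate(x):
    return 1.0 / (1.0 + 0.1 * x)   # high-expressing cells divide more slowly

def simulate_population(t_end=8.0, x0=0, max_cells=20000):
    cells = [x0]                   # start from a single founder cell
    t = 0.0
    while t < t_end and len(cells) < max_cells:
        xs = np.array(cells, dtype=float)
        rates = np.concatenate([np.full(len(xs), K_PROD),   # production, one per cell
                                GAMMA * xs,                  # degradation, one per cell
                                division_rate(xs)])          # division, one per cell
        total = rates.sum()
        t += rng.exponential(1.0 / total)
        if t >= t_end:
            break
        i = rng.choice(len(rates), p=rates / total)
        n = len(cells)
        if i < n:                  # production event in cell i
            cells[i] += 1
        elif i < 2 * n:            # degradation event in cell i - n
            cells[i - n] -= 1
        else:                      # division: duplicate the mother's state
            cells.append(cells[i - 2 * n])
    return np.array(cells)

pop = simulate_population()
print(len(pop), "cells; population mean expression:", pop.mean())
# The population mean is typically pulled below the single-cell stationary mean
# K_PROD / GAMMA = 5, because low expressers divide faster and take over.
```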


Tuesday March 29th 2022

10.30 Deborah Boyenval (I3S Nice) Room Grace Hopper - Visio link: https://rendez-vous.renater.fr/lifeware

A discrete modelling study devoted to the formalization and verification of mammalian cell cycle checkpoints.

During cell cycle progression (G1-S-G2-M), the integrity of the DNA is guaranteed by checkpoints, which can stop the cell cycle at certain key points when the DNA is damaged or if certain events have not taken place before others. Because checkpoints involve complex intertwined mechanisms, they represent a modelling challenge. Numerous modelling works have been devoted to the cell cycle but very few to its checkpoints. When modelled, they are often reduced to the idea of irreversible transitions between cell cycle phases, while the notion of phase itself remains fuzzy. We therefore propose a discrete cell cycle modelling work entirely devoted to the formalization of cell cycle phases and then of checkpoints. Moreover, the discrete nature of checkpoint mechanisms opens the door to formal methods for discrete modelling in biology, such as CTL model checking and the genetically modified Hoare logic. The TotemBioNet tool efficiently combines these two methods to exhaustively identify the models that do not contradict formalized dynamical biological properties. A phase is naively considered as a consensus sequence of events from a single initial state. Then, thanks to TotemBioNet, the identification of all events that can initiate and end a phase allows us to identify a set of non-permutable key events, ensuring that a phase does not start until all events in the previous phase have been completed. These constitute the main characteristic of the inter-phase checkpoint predicates (G1/S, S/G2 and G2/M) implemented in Prolog. Other formalization steps have been included, for example the consideration of phase space exits in the formalization of checkpoints.


Wednesday, February 2022, 2pm, Amphi Burg, Institut Curie, rue Lhomond, Paris.

Julien Martinelli, INSERM-Institut Curie, EPI Lifeware Inria Saclay, ED IPP

PhD Thesis Defense: On learning mechanistic models from time series data with applications to personalised chronotherapies

Mathematical modeling of biological processes aims at providing formal representations of complex systems to enable their study, both in a qualitative and a quantitative fashion. The need for explainability suggests the recourse to mechanistic models, which explicitly describe molecular interactions. Nevertheless, such models currently rely on the existence of prior knowledge on the underlying reaction network structure. Moreover, their conception remains an art which necessitates creativity combined with multiple interactions with analysis and data fitting tools. This rules out numerous applications conceivable in personalized medicine, and calls for methodological advances towards machine learning of patient-tailored models. This thesis intends to devise algorithms to learn models of dynamical interactions from temporal data, with an emphasis on explainability for the human modeler. Its applications are in the context of personalized chronotherapies, which consist in optimizing drug administration with respect to the patient’s biological rhythms over the 24-hour span. Three main themes are explored: mechanistic modeling, network inference and treatment personalization. The first chapter describes the development of the first quantitative mechanistic model of the cellular circadian clock integrating transcriptomic, proteomic and subcellular localization data. This model has been successfully connected to a model of cellular pharmacology of an anticancer drug, irinotecan, achieving personalization of its optimal administration timing. The second chapter introduces a novel protocol for inferring whole-body systemic controls enforced on peripheral clocks. In the long run, this approach will make it possible to integrate individual data collected from wearables for personalized chronotherapies. The third chapter presents a general algorithm to infer reactions with chemical kinetics from time series data.


Wednesday May 12th 2021 (live on https://rendez-vous.renater.fr/inria-lifeware)

14.00 Zhang Xingjian (LIX)

Interpretable exact linear reductions via positivity (slides)

Kinetic models of biochemical systems are polynomial ODEs with hundreds or even thousands of variables. Their large size is an obstacle to deriving mechanistic insights. A way to address this issue is to perform an exact model reduction by finding a self-consistent lower-dimensional projection of the corresponding dynamical system. A number of approaches have been suggested with the projection matrix restricted to be a zero-one matrix. A recent algorithm, CLUE (https://arxiv.org/abs/2004.11961), shows that, by allowing arbitrary matrices, one can construct exact linear reductions of smaller dimension than was possible before, while preserving fixed variables of interest. However, the method had a drawback: some of the new variables lost their mechanistic interpretation. In this talk, I will present an algorithm that re-parametrizes exact linear reductions by performing an invertible transformation of the new coordinates to improve the physical interpretability of the new variables. We show the improvement through case studies of models from the literature.

This is joint work with Gleb Pogudin.


Thursday 11th February 2021 (live on https://rendez-vous.renater.fr/inria-lifeware)

14.00 Barbara Bravi (ENS Paris)

Path integral approaches to model reduction in biochemical networks

In this talk, I will consider the general problem of describing the dynamics of subnetworks of larger biochemical reaction networks, e.g., protein interaction networks involving complex formation and dissociation reactions. I will describe model reduction strategies based on the theoretical physics tool of path integrals. The subnetwork-reduced dynamics obtained contains a memory term and an extrinsic noise term; the latter, in particular, provides a principled derivation of the different components of the extrinsic noise that is observed experimentally in cellular biochemical reactions, over and above the intrinsic noise from the stochasticity of biochemical events in the subnetwork. An example from the epidermal growth factor receptor signalling pathway is provided to probe the prediction accuracy and computational efficiency of our method.


Thursday 26th November 2020 (live on https://rendez-vous.renater.fr/inria-lifeware)

11.00 Thomas Chatain (LSV, ENS Paris-Saclay)

Variations on alignments in conformance checking

Conformance checking techniques assess the suitability of a process model for representing an underlying process, observed through a collection of real executions. An important problem in conformance checking is to align a log trace with a model. We show that several variants of this notion of alignment provide relevant feedback on the model: anti-alignments help to assess the precision of the model with respect to the observed executions, while multi-alignments are used to cluster log traces and relate the clusters to runs of the model. We present a tool that uses SAT encodings to exploit these variants of alignments.


Tuesday 17th November 2020 (live on https://rendez-vous.renater.fr/inria-lifeware )

11.00 Jérémie Pardo (IBISC, Univ. Evry)

Sequential Reprogramming of Biological Network Fate

A major challenge in precision medicine consists in finding the appropriate network control for reprogramming the cell fate. Recent research in computational biology provides new inference methods for reprogramming a network in order to make its dynamics converge towards an expected state. However, these control strategies are often limited to a single shot. In some cases, a therapy abides by a time-scheduled drug administration protocol and, similarly, some diseases are the result of a sequence of mutations. Therefore, the issue is to extend the single control strategy to the definition of a sequence of controls. Thus, we extend previous work on abduction-based inference of control reprogramming by investigating the sequential control of Boolean networks. In this talk, we will present a theoretical framework formalizing the basis of control sequences, from which we deduce two control sequence inference algorithms.

Joint work with Franck Delaplace and Sergiu Ivanov


Tuesday 10th November 2020 (VISIO ended )

11.00 Katrina Podsypanina (IHES)

From a quantitative model for metastatic colonization by benign cells to a framework for modelling metastatic spread.

There are striking similarities in the behavior of normal and malignant epithelial cells introduced into the systemic circulation (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2694414/). I have experimental data suggesting that, in mice, non-transformed cells are an integral part of systemic cancer development (https://www.biorxiv.org/content/10.1101/085423v1.full). I hypothesize that the dissemination of non-malignant epithelial cells provoked by a growing tumor, with subsequent transformation at the ectopic sites, is another source of metastasis, in humans too. This scenario ties together two contradictory observations: that metastasis is associated with large and rapidly growing primary tumors, while metastatic tumors themselves often take a long time to grow. If systemic cancer is a result of the dissemination of non-malignant epithelial cells provoked by a growing tumor, it may help explain why adjuvant chemotherapy fails to prevent some metastases: these not-yet-malignant cells are spared by the treatment, and will only become tumor cells later, directly at the ectopic site. Thus, investigating whether non-linear metastasis plays a role in human cancer spread has clear translational implications. If metastatic growth is initiated from non-malignant cells, as suggested by the evidence supporting the non-linear metastasis model, improved systemic treatments may be identified based on the ability to kill these ectopic cells. Alternatively, if only tumor cells initiate metastases, it may not be possible to treat cancer more effectively. To probe the relevance of our findings to human disease, we generated a mathematical model describing the dynamics of the disseminated ectopic cells in the peripheral tissues. Its central conclusion is that when the flow of mammary cells from the breast is constant, their number at the periphery reaches a steady state within a short period of time. These calculations predict that normal cells detected in the circulation of patients with established tumors constitute evidence of sustained non-linear dissemination from the primary site.


Thursday 23rd July 2020 (room Grace Hopper )

11.00 Gleb Pogudin (LIX)

Exact model reduction by constrained linear lumping

Many processes in life sciences are described as ODE systems with polynomial right-hand sides. Such systems can be very big: thousands or even millions of equations. It turns out that systems appearing in systems biology often have some structure that makes it possible to perform an exact model reduction, that is, to find a system of smaller order such that its solutions are exactly the images of the solutions of the original system under a linear transformation (such a reduction is called linear lumping). Such exact reductions are of interest not only because they make the system smaller, but also because they may provide additional insights into the model's structure. The problem we tackle is: given some variables of interest to be preserved, find such a reduction of the smallest possible dimension. We will present an algorithm for solving the problem and applications of the algorithm to examples from the literature. After that, I will discuss the limitations of this approach and directions for further research.

This is joint work with A. Ovchinnikov, I.C. Perez Verona, and M. Tribastone.
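
A small sympy sketch of what such an exact reduction means (a hand-picked example, not the algorithm of the talk): for the toy system below, the lumping matrix L = [[1, 1, 0], [0, 0, 1]] is exact because L*f(x) can be rewritten purely in terms of the new variables y = L*x.

```python
import sympy as sp

x1, x2, x3 = sp.symbols("x1 x2 x3")
y1, y2 = sp.symbols("y1 y2")

# Original polynomial system dx/dt = f(x).
f = sp.Matrix([-x1 * x3, -x2 * x3, -(x1 + x2) * x3])

# Candidate lumping y = L*x with y1 = x1 + x2 and y2 = x3.
L = sp.Matrix([[1, 1, 0], [0, 0, 1]])

# The reduction is exact iff L*f(x) can be expressed as a function of y alone;
# here the reduced right-hand side is g(y) = (-y1*y2, -y1*y2).
g = sp.Matrix([-y1 * y2, -y1 * y2])
difference = (L * f - g.subs({y1: x1 + x2, y2: x3})).applyfunc(sp.expand)
print(difference)   # zero vector: dy/dt = g(y) reproduces the lumped dynamics exactly
```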


CANCELLED Tuesday 31 March 2020 (VISIO renater) NEW DATE TO BE DECIDED

11.00 Nicolas Dupin (LRI, Orsay)

Hybridizing mathematical programming and heuristic approaches in combinatorial optimization

This talk gives an overview of techniques for hybridizing mathematical programming and heuristic approaches to tackle large-scale optimization problems, in the case of Mixed Integer Linear Programming (MILP). MILP is a common framework to model combinatorial optimization problems, real-life optimization problems in industry, or problems in fields like networks and bioinformatics. On the one hand, exact MILP solvers (like Gurobi, Cplex, SCIP) are nowadays efficient at optimally solving mid-size instances of highly constrained optimization problems, with an exponential explosion when dealing with larger instances. On the other hand, heuristics and meta-heuristics for combinatorial optimization give good solutions for large-scale instances, without any optimality guarantee, and with difficulties in dealing with highly constrained optimization problems. Matheuristics, the hybridization of exact and heuristic methods, combine the advantages of both, to solve large instances of MILP problems, possibly highly constrained, efficiently and heuristically within defined time limits. Several such techniques are overviewed, with an illustration on results from an industrial application in energy management.


Tuesday 10 March 2020 (VISIO renater)

17.00 Pierre-Jean Meyer (U.C. Berkeley)

Reachability analysis and decompositions for abstraction-based control synthesis

This talk gives an overview of three recent results in the field of abstraction-based control synthesis, where we create a finite abstraction of a continuous dynamical system to synthesize a high-level controller with respect to some temporal logic specification. First, we introduce new results on reachability analysis using intervals, applicable to most nonlinear dynamical systems. Then, we discuss the notion of compositional abstraction, where a large continuous system is decomposed into smaller subsystems which are easier to abstract. Finally, we consider a framework for the hierarchical decomposition of temporal logic control problems into three successive subproblems.
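
As a minimal illustration of the first ingredient (interval reachability in general, not the specific results of the talk), the inclusion property of interval arithmetic yields a one-step over-approximation of the reachable set of a discrete-time system: evaluating the dynamics on interval arguments is guaranteed to contain the image of every point of the initial box. The toy map below is made up.

```python
# Minimal interval arithmetic: an interval is a pair (lo, hi).
def i_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a, b):
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

def i_scale(c, a):
    return (min(c * a[0], c * a[1]), max(c * a[0], c * a[1]))

# Toy discrete-time dynamics:  x1+ = x1 + 0.1*x2,   x2+ = x2 - 0.1*x1*x2.
# By the inclusion property of interval arithmetic, evaluating the map on intervals
# over-approximates the true one-step reachable set.
def step_interval(x1, x2):
    nx1 = i_add(x1, i_scale(0.1, x2))
    nx2 = i_add(x2, i_scale(-0.1, i_mul(x1, x2)))
    return nx1, nx2

# Initial set: the box [0, 1] x [1, 2]; propagate it for three steps.
box = ((0.0, 1.0), (1.0, 2.0))
for k in range(3):
    box = step_interval(*box)
    print("step", k + 1, ": x1 in", box[0], ", x2 in", box[1])
```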


Wednesday 4 March 2020 (VISIO renater)

Aurélien Naldi (EP Lifeware, Inria)

11.00 Update semantics and attractors in qualitative influence models


Monday 24 Feb 2020 (salle Radhia Cousot)

11.00 Alexandru Iosif (JRC-COMBINE, Aachen, Germany)

Toric biochemical systems

Often, biochemical processes can be approximated by molecular interaction digraphs, called chemical reaction networks. When the dynamics of these networks are governed by the mass action law, their stationary states form semi-algebraic varieties. Realistic models tend to be large and measurement data is usually noisy. Therefore, the mathematical models contain a significant number of parameters. However, biological models tend to have interesting algebraic and combinatorial properties. For example, in 2016 Millán et al. demonstrated that a significant class of biochemical systems are toric, that is, that their stationary states form a toric variety. In a recent experiment, Grigoriev et al. showed that an important percentage of the polynomial models from the BioModels repository are toric. In this talk we briefly review some recent results about toric biochemical systems. Then we introduce a sufficient condition for toricity of these systems, namely the isolation property.


Monday 17 Feb 2020 (salle Radhia Cousot)

11.00 Philippe Jacquet (Inria Saclay, IEEE fellow)

AI versus Information Theory, toward a theory of learnability?

We will take a quick tour of how Information Theory could help to understand Artificial Intelligence, in particular around the crucial question of how a complex system can be made to evolve into a more complex system. We will address the more specific problem of learnability. It turns out that we are at the very beginning of a theory of learnability, since deep neural networks are sometimes incapable of learning surprisingly simple problems. We will try to hint at a characterisation of such unlearnable problems.


Friday 8 November 2019 (salle Radhia Cousot)

11.00 Aurélien Naldi (ENS Paris)

Methods and tools for the qualitative modeling of the immune response

Qualitative dynamical models, based on generalized Boolean functions, have been used over the last decades to study complex biological systems. As a reasoning tool for researchers in biology, they facilitate the identification of missing or inconsistent knowledge, and can ultimately guide experimental design.

We are interested in particular in the regulation of the immune response, involving several activation, differentiation, and feedback processes in different cell types. Our main current model, encompassing over one hundred components, focuses on regulatory T cells, but future work will account for other cell types.

The analysis of these models drives the development of various formal methods in several research groups. An ongoing collaborative effort further facilitates the definition of reproducible analyses.


Friday 18 October 2019 (salle Grace Hopper)

11.00 Jean-Louis Lassez (retired IBM Yorktown)

The Tao of the Empty Proof

Algorithms and theorem proofs have been closely linked since the year 800. Well-chosen data structures lead to simple algorithms and to high-level programming languages. In parallel, well-chosen definitions lead to simple proofs, so simple that they can disappear. The kind of simplicity we will illustrate is based on the fact that data can be read from left to right, from right to left, from bottom to top, or even clockwise. We will give examples taken from formal languages applied to the theatre (Macbeth and Shakespeare), from the patent that led to Google's search engine, which has a new interpretation using a symbolic version of Gaussian elimination, and also empty proofs of important theorems in geometry and linear programming, using an algorithm due to Fourier. A version of coding theory allows us to revisit in a positive light an old proposal by Crick on the genetic code that had been rejected.


Monday 7 January 2019 (room Radhia Cousot, A. Turing building)

14.00 Anne Auger (Inria Saclay IdF, CMAP)

Convergence Results of Adaptive Evolution Strategies: an Overview

Evolution Strategies (ES) are adaptive stochastic search optimization algorithms commonly used to solve difficult numerical optimization problems, e.g. when the underlying function is non-convex, non-smooth, non-differentiable, ... Among ESs, the Covariance Matrix Adaptation ES (CMA-ES) is seen today as the state-of-the-art method. ESs were not designed with a mathematical framework to ease convergence proofs in mind. There is thus today a strong discrepancy between the convergence results that we observe empirically and those that we can formally prove.

In this context, I will review existing convergence frameworks and results for ESs and more generally adaptive stochastic search methods. I will give some insights on existing proof techniques and discuss open questions.


Friday 16 November 2018 (room Radhia Cousot, A. Turing building)

10.30 Lucia Nasti (Universita di Pisa, Italy)

Towards a better understanding of initial concentration robustness

The main goal of systems biology is to understand the dynamical properties of biological systems by investigating the interactions among the components of a biological system. We focus on the robustness property, a behaviour observed in several biological systems that allows them to preserve their functions despite external and internal perturbations. We first propose a new formal definition of robustness using the formalism of continuous Petri nets. In particular, we focus on robustness against perturbations of the initial concentrations of species. We demonstrate the validity of our definition by applying it to the models of three different robust biochemical networks. To make our definition effective, we study the monotonicity of the systems; in particular, we are interested in the monotonicity of the output concentrations with respect to the initial concentrations. To this aim, we propose new monotonicity definitions that allow us to restrict the number of simulations required to prove our robustness properties. Sufficient conditions are then devised that guarantee such monotonicity properties for biological systems.
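
A hypothetical numerical companion to this notion of robustness (not the continuous Petri net definition of the talk): simulate an ODE model from many perturbed initial concentrations and measure how much the output of interest moves. The two-species model, the perturbation size and the output are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-species mass-action model: S -> P (rate k1*S), P -> (nothing) (rate k2*P).
k1, k2 = 1.0, 0.5

def rhs(t, y):
    s, p = y
    return [-k1 * s, k1 * s - k2 * p]

def output(s0, p0):
    sol = solve_ivp(rhs, (0.0, 10.0), [s0, p0], t_eval=np.linspace(0.0, 10.0, 200))
    return sol.y[1].max()          # output of interest: the peak concentration of P

rng = np.random.default_rng(0)
nominal = np.array([10.0, 0.0])
outputs = []
for _ in range(100):
    perturbed = nominal * (1 + 0.2 * rng.uniform(-1, 1, size=2))   # +/- 20% perturbations
    outputs.append(output(*perturbed))

print("peak-P spread under +/- 20% initial-concentration perturbations:",
      min(outputs), "to", max(outputs))
```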


Friday 26 October 2018 (room Radhia Cousot, A. Turing building)

10.30 Carlo Spaccasassi (Microsoft Research Cambridge UK)

A logic programming language for computational nucleic acid devices

Computational nucleic acid devices show great potential for enabling a broad range of biotechnology applications, including smart probes for molecular biology research, in vitro assembly of complex compounds, high-precision in vitro disease diagnosis and, ultimately, computational theranostics inside living cells. This diversity of applications is supported by a range of implementation strategies, including nucleic acid strand displacement, localisation to substrates, and the use of enzymes with polymerase, nickase and exonuclease functionality. However, existing computational design tools are unable to account for these strategies in a unified manner. This paper presents a logic programming language that allows a broad range of computational nucleic acid systems to be designed and analysed. The language extends standard logic programming with a novel equational theory to express nucleic acid molecular motifs. It automatically identifies matching motifs present in the full system, in order to apply a specified transformation expressed as a logical rule. The language supports the definition of logic predicates, which provide constraints that need to be satisfied in order for a given rule to be applied. The language is sufficiently expressive to encode the semantics of nucleic acid strand displacement systems with complex topologies, together with computation performed by a broad range of enzymes, and is readily extensible to new implementation strategies. Our approach lays the foundation for a unifying framework for the design of computational nucleic acid devices.


Tuesday 15 May 2018

13.30 Debdas Paul (Univ. Stuttgart)

Stochastic sequestration dynamics can act as intrinsic noise filter in signaling network motifs

The relation between the design principles of signaling network motifs and their robustness against intrinsic noise still remains elusive. In this talk, I will summarize our recent investigation of the role of cascading in coping with intrinsic noise due to stochasticity in molecular reactions. We use stochastic modeling approaches to quantify fluctuations in the terminal kinase of phosphorylation-dephosphorylation cascade motifs and demonstrate that cascading highly affects these fluctuations. We show that this purely stochastic effect can be explained by time-varying sequestration of upstream kinase molecules. In particular, we discuss conditions on time scales and parameter regimes which lead to a reduction of output fluctuations. Our results are put into biological context by adapting the rate parameters of our modeling approach to biologically feasible ranges for general binding-unbinding and phosphorylation-dephosphorylation mechanisms. Overall, this study reveals a novel role of stochastic sequestration for dynamic noise filtering in ubiquitous signaling motifs. Keywords: phosphorylation-dephosphorylation cycle, sequestration, noise filtering, stochastic modeling

Reference: Paul, D. and Radde, N., 2018. The role of stochastic sequestration dynamics for intrinsic noise filtering in signaling network motifs. bioRxiv, p.278929.


Friday 2 February 2018

11.00 Loic Paulevé (CNRS, LRI)

On the CoLoMoTo interactive notebook toward reproducible qualitative systems biology

In this talk I shall describe the current development of the CoLoMoTo tool, which aims to provide, on the one hand, a joint distribution of tools for computational systems biology, and on the other hand an interface to write and document model analysis workflows, which typically rely on several tools, using the Jupyter notebook web interface.


Monday 22 January 2018

11.00 Mathieu Hemery (McGill Univ, Montreal, Canada) room Grace Hopper (2nd floor, build. Turing)

Evolutionary Algorithms!

In 1975, J. Holland proposed to copy the process of Darwinian natural selection to define a new kind of algorithm: evolutionary algorithms. The purpose was both to develop a new method that may help to solve hard problems and to study natural evolution in depth. I will summarize here what these algorithms are and why they are worthy of interest. Then I will present my own work with these algorithms, emphasizing two biological features: the importance of fluctuating environments and the primacy of function.


Tuesday 16 January 2018

14.00 Mathias Fuegger (LSV, Cachan) room Grace Hopper (2nd floor, build. Turing)

Fast Asymptotic and Approximate Consensus in Biological Networks

The talk focuses on the theoretical foundation for distributed consensus algorithms we plan to implement in bacterial colonies.

Reaching consensus on a common value in a distributed system is a well-studied central problem in distributed computing. Unfortunately, even modest network dynamics, such as those occurring in biological systems, prohibit solvability.

For many biological applications, such as distributed control, clock synchronization, etc., it is however sufficient to asymptotically converge to the same value, or to decide on values not too far from each other. We study the solvability of these consensus variants in highly dynamic networks, provide time complexity results, and present fast algorithms.

The talk is on previous and current research with Bernadette Charron-Bost (LIX) and Thomas Nowak (LRI).
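
A minimal Python sketch of asymptotic consensus by repeated averaging (an illustration of the problem, not of the algorithms of the talk): in each round every agent averages the values of the neighbours it happens to hear from in a randomly changing network; under mild connectivity assumptions the values converge asymptotically to a common limit.

```python
import random

def asymptotic_consensus(values, rounds=50, seed=1):
    """Each round, every agent averages its own value with those of a random subset of
    the other agents (a crude model of a dynamic communication network)."""
    rng = random.Random(seed)
    n = len(values)
    for _ in range(rounds):
        new_values = []
        for i in range(n):
            heard = [values[j] for j in range(n) if j == i or rng.random() < 0.5]
            new_values.append(sum(heard) / len(heard))
        values = new_values
    return values

print(asymptotic_consensus([0.0, 1.0, 4.0, 10.0]))   # values cluster around a common limit
```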


Friday 13 October 2017

14.00 Hadi Zaatiti (CEA, LRI) salle Postel, Bât. Turing

Abstractions for hybrid systems diagnosability analysis

Verifying behavioral or safety properties of hybrid systems, either at the design stage, such as state reachability and diagnosability, or on-line, such as fault detection and isolation, is a challenging task. We are concerned here with abstractions oriented towards hybrid systems diagnosability checking. The verification can be done on the abstraction by providing a counterexample in case of non-diagnosability. The absence of such a counterexample proves the diagnosability of the original hybrid system. In the presence of a counterexample, the first step is to check that it is not a spurious effect of the abstraction and actually exists for the hybrid system, thus witnessing non-diagnosability. Otherwise, we show how to refine the abstraction, guided by the elimination of the counterexample, and continue the process of looking for another counterexample until either a final result is obtained or we decide to stop the refinement with an inconclusive verdict. We make use of qualitative modeling and reasoning to compute abstractions represented by discrete event systems with temporal information, modeled as timed automata.


Tuesday 21 March 2017, exceptionally at Inria Paris, room J.L. Lions 1

10.30 Christine Solnon, LIRIS, CNRS, INSA Lyon

Revisiting AES Related-Key Differential Attacks with Constraint Programming

The Advanced Encryption Standard (AES) is one of the most studied symmetric encryption schemes. During the last years, several attacks have been discovered in different adversary models. In this paper, we focus on related-key differential attacks, where the adversary may introduce differences in plaintext pairs and also in keys. We show that Constraint Programming (CP) can be used to model these attacks, and that it allows us to efficiently find all optimal related-key differential characteristics for AES-128, AES-192 and AES-256. In particular, we improve the best related-key differential for the whole AES-256 and give the best related-key differential on 10 rounds of AES-192, which is the differential trail with the longest path. Those results allow us to improve existing related-key distinguishers, basic related-key attacks and q-multicollisions on AES-256.


Wednesday, December 13, 2016, 2.30pm, 454A seminar room of the Matière et Systèmes Complexes laboratory (Condorcet Building, Paris Diderot University).

Jean-Baptiste Lugagne, Inria Saclay, EP Lifeware

PhD Thesis Defense: Real-time control of a genetic toggle switch

The main project of this PhD is the real-time control of the state of a genetic toggle switch in Escherichia coli. The candidate developed and used a platform for single-cell external gene control in bacteria combining microfluidics, microscopy and informatics. The core contribution is the multiple-input multiple-output control of a multistable gene regulatory network: in an analogy with the benchmark control problem of the inverted pendulum, we successfully stabilized the system in its unstable area. After stabilizing single cells harboring the genetic toggle switch, we also stabilized populations of cells in their unstable areas. Finally, after numerical analysis of the dynamics of the toggle switch constructs in the cells, we identified a range of open-loop dynamic stimulations that force the toggle switch cells out of their basins of attraction and maintain them in the unstable area.


Wednesday, December 7, 2016, Amphithéâtre Sophie Germain, Inria Saclay

10:00 Sylvain Soliman, Inria Saclay, EP Lifeware

HDR defence: A structural perspective on the dynamics of biochemical systems

HDR abstract:

Systems Biology has recently become a flourishing research field that tries to take advantage of all the information made available by biologists' high-throughput techniques.

About fifteen years ago, the publication of Kurt Kohn's map of mammalian cell cycle control was a fundamental advance. Its similarity to an electronic circuit is central both to the impossibility for humans to grasp it completely and to the idea of using formal methods to do so.

Cependant, depuis cette carte, les réseaux construits par les biologistes et modélisateurs ont continué à croître, enrichis de détails mécanistes de plus en plus précis, en particulier des informations post-transcriptionnelles acquises récemment, mais sans données cinétiques précises. Or les méthodes d'analyse qui fournissent des informations dynamiques s'appuient principalement sur une information cinétique complète, par conséquent, ce qui était un défi pour l'humain il y a dix ans l'est devenu pour l'ordinateur aussi.

Dans ce manuscrit nous allons tenter de donner une idée de notre travail des douze dernières années, centré autour de la notion de modèle. Nous allons définir plus précisément, et formellement, cet objet, et essayer de répondre aux défis liés au foisonnement des données et à la croissance des modèles de Biologie des Systèmes. Notre approche est fondée sur les liens entre structure et dynamique d'un tel modèle, ces liens permettant d'utiliser de nombreuses méthodes formelles, comme par exemple la programmation par contraintes, pour raisonner sur la dynamique de ces objets. Le problème de la taille nous amènera par ailleurs à envisager les questions de réduction de modèle, mais aussi de liens entre modèles, formalismes et interprétations.

La nature discrète de la structure qui sous-tend un modèle peut sembler opposée à la continuité de la dynamique qui lui est souvent associée via des équations différentielles. Cependant nous espérons démontrer que cette séparation est essentiellement artificielle, et que les résultats récents offrent des perspectives prometteuses de la réduire.


Wednesday 9 November 2016,

16:00 Bernadette Charron-Bost, CNRS, LIX

Everything you always wanted to know about distributed computing ...

The goal of this talk is to present the foundations of distributed computing. To this end, I will review the various models of distributed systems as well as the main control and coordination problems that one seeks to solve, and I will present the basic algorithmic techniques developed to solve these problems. In a second part, I will address the question of fault tolerance and summarize the fundamental results (computability and complexity) of the field. I will conclude by giving a list of open problems and by describing what the main challenges in distributed computing are today.


Wednesday 27 July 2016, room Radhia Cousot,

11:00 Patrick Gardy, ENS Cachan

Unnatural dependencies in the boolean goal strategy logic

The boolean goal strategy logic (SL[BG] for short) is an extension of LTL adapted to problems with multiple agents and objectives at the same time. SL[BG] formulas are made of a block of quantifiers storing strategies into variables, followed by a boolean combination of temporal objectives. A formula \phi is said to be non-behavioral when, in order for \phi to hold, existentially quantified strategies on a history \pi must depend on choices made by previously quantified strategies on counter-factual or future plays. These unnatural dependencies rarely occur in concrete reactive-synthesis-related problems, so one can wonder what causes them to appear. They are also conjectured to be the cause of the high model-checking complexity. We present the problem, extend the framework proposed in "Reasoning about strategies: On the model checking problem" (Mogavero, Murano, Perelli, Vardi) and partially explain the rise of unnatural dependencies.


Wednesday 15 June 2016, Jacques-Louis Lions room at Inria Paris (2 rue Simone Iff, 75012 Paris)

11:00 Seminar of Mark Chaplain, University of St-Andrews, UK

Spatio-temporal modelling of gene regulatory networks: The role of molecular movement.

Gene regulatory networks (GRN), i.e. DNA segments in a cell which interact with each other indirectly through their RNA and protein products, lie at the heart of many important intracellular signal transduction processes. In this talk, we will present and analyse mathematical models of various GRNs using partial differential equations (PDE) and stochastic PDEs. We will then analyze a mathematical model of a canonical gene regulatory network consisting of a single negative feedback loop between a protein and its mRNA (e.g. the Hes1 transcription factor system). The model consists of two partial differential equations describing the spatio-temporal interactions between the protein and its mRNA in a one-dimensional domain. Such intracellular negative feedback systems are known to exhibit oscillatory behavior and this is the case for our model, shown initially via computational simulations. In order to investigate this behavior more deeply, we undertake a linearized stability analysis of the steady states of the model. Our results show that the diffusion coefficient of the protein/mRNA acts as a bifurcation parameter and gives rise to a Hopf bifurcation. This shows that the spatial movement of the mRNA and protein molecules alone is sufficient to cause the oscillations. Our result has implications for transcription factors such as p53, NF-kappaB and heat shock proteins which are involved in regulating important cellular processes such as inflammation, meiosis, apoptosis and the heat shock response, and are linked to diseases such as arthritis and cancer.

14:00 PhD Thesis of François Bertaux, Inria (now Imperial College)

Cell-based multi-scale modeling for systems and synthetic biology: from stochastic gene expression in single cells to spatially organized cell populations.

Abstract: Cell-intrinsic, non-environmental sources of cell-to-cell variability, such as stochastic gene expression, are increasingly recognized to play an important role in the dynamics of tissues, tumors, microbial communities... However, they are usually ignored or oversimplified in theoretical models of cell populations. In this thesis, we propose a cell-based (each cell is represented individually), multi-scale (cellular decisions are controlled by biochemical reaction pathways simulated in each cell) approach to model the dynamics of cell populations. The main novelty compared to traditional approaches is that the fluctuations of protein levels driven by stochastic gene expression are systematically accounted for (i.e., for every protein in the modeled pathways). This makes it possible to investigate the joint effect of cell-intrinsic and environmental sources of cell-to-cell variability on cell population dynamics. Central to our approach is a parsimonious and principled parameterization strategy for stochastic gene expression models. The approach is applied to two case studies. First, it is used to investigate the resistance of HeLa cells to the anti-cancer agent TRAIL, which can induce apoptosis specifically in cancer cells. A single-cell model of TRAIL-induced apoptosis is constructed and compared to existing quantitative, single-cell experimental data. The model explains fractional killing and correctly predicts transient cell fate inheritance and reversible resistance, two observed properties that are out of reach of previous models of TRAIL-induced apoptosis, which do not capture the dynamics of cell-to-cell variability. In a second step, we integrate this model into multi-cellular simulations to study TRAIL resistance in virtual scenarios constructed to help bridge the gap between standard in-vitro assays and the response of in-vivo tumors. More precisely, we consider the long-term response of multi-cellular spheroids to repeated TRAIL treatments. Analysis of model simulations points to a novel, mechanistic explanation for transient resistance acquisition, which involves the targeted degradation of activated proteins and a differential turnover between pro- and anti-apoptotic proteins. Second, we apply our approach to a synthetic spatial patterning system in yeast cells developed by collaborators. Focusing first on a sensing circuit responding to a messenger molecule, we construct a single-cell model that accurately captures the response kinetics of the circuit as observed in flow cytometry data. We then integrate this model into multi-cellular simulations and show that the response of spatially-organized micro-colonies submitted to gradients of messenger molecules is correctly predicted. Finally, we incorporate a model of a killing circuit and compare the predicted patterns of dead or alive cells with experimental data, yielding insights into how the circuit parameters translate into multi-cellular organization phenotypes. Our modeling approach has the potential to accelerate the development of more quantitative and predictive models of cell populations that encompass the molecular scale.

Jury:

  • Alessandra Carbone, Université Pierre et Marie Curie
  • Mark Chaplain, University of St-Andrews
  • Paul Macklin, University of Southern California
  • Emmanuel Barillot, Institut Curie
  • Jérémie Roux, Université de Nice
  • Dirk Drasdo, Inria Paris
  • Gregory Batt, Inria Saclay - Ile-de-France

Tuesday 14 June 2016, 11.00

Adrien Basso-Blandin, IRIF Paris

Kami (Knowledge Aggregator and Model Instantiator): A graph rewriting pipeline from the biological literature to rule-based models.

The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood.

The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction, which is rarely available from any single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story.

We introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we have defined an automated translation into executable Kappa programs.


Tuesday 10 May 2016, Salle Favard, 2nd floor, Institut de Biologie de l'ENS, 46 rue d'Ulm, Paris

11.00 Attila Csikász-Nagy, King's College London

Protein Complex Prediction Through Proteome-Wide Simulations

Multiple copies of each protein are present in cells and some of these could be involved in multiple complexes; it is thus a challenging task to identify protein complex compositions and abundances of all possible complexes. In this talk I propose an integrative, simulation-based computational approach able to predict protein complexes together with their abundances from existing data sources on protein-protein and domain-domain interactions and protein abundances. Our simulations are consistent with manually curated data and can also predict how perturbations by a drug can influence the composition and abundance of protein complexes.

15.00 PhD Thesis of Pauline Traynard,

Model Building by Temporal Logic Constraint Solving: Investigation of the Coupling between the Cell Cycle and the Circadian Clock

Abstract: This thesis explores the use of temporal logic and model checking in systems biology. Temporal logic provides a powerful language to formalize complex yet imprecise dynamical properties that characterize biological systems, and to partly automate model building as a constraint satisfaction problem. This approach is first applied to update a logical model of the mammalian cell cycle, before considering quantitative deterministic models. Strategies to obtain efficient solving of dynamical constraints over a finite time horizon are developed and applied to the investigation of model coupling for the mammalian cell cycle and the circadian clock.

Jury:

  Attila Csikász-Nagy, Senior Lecturer at King's College London (reviewer)
  Nir Piterman, Reader at University of Leicester (reviewer)
  Ahmed Bouajjani, Professor at Université Paris Diderot
  Laurence Calzone, Research Scientist at Institut Curie
  Franck Delaunay, Professor at Université Nice Sophia Antipolis
  Anne Siegel, Senior Research Scientist at IRISA – CNRS
  François Fages, Senior Research Scientist at Inria (thesis advisor)
  Denis Thieffry, Professor at ENS (thesis advisor)

Wednesday 4 May 2016, 11.00 Room Radhia Cousot, Building Alan Turing

Paul Ruet, CNRS, PPS lab, Univ. Paris Diderot

Structure and dynamics of Boolean networks.

Boolean networks represent the interaction of finitely many variables which may take two values, 0 and 1. They have been extensively used as models of various biological networks, in particular genetic regulatory networks, since the early works of S. Kauffman and R. Thomas. In this context, a variable models the (discretized) expression level of a gene, and certain dynamical properties of the network have a natural biological interpretation: the existence of several fixed points (or more generally attractors) corresponds to cellular differentiation, and sustained oscillations (cyclic attractors) correspond to a form of homeostasis. In this talk, I will give a brief overview of recent results relating these dynamical properties to the structure of the networks, in terms of (positive or negative, local or global) cycles.
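
As a minimal illustration of the objects involved (a generic two-gene mutual-inhibition circuit, not an example taken from the talk), the sketch below enumerates the fixed points of a small Boolean network by brute force; the two fixed points it finds are the classical signature of a positive cycle, in line with the structure-to-dynamics results mentioned above.

# Toy example: fixed points of a small Boolean network (two mutually
# inhibiting genes; a generic textbook circuit, assumed for illustration).
from itertools import product

# Update functions: each gene is ON iff its inhibitor is OFF.
rules = {
    "x": lambda s: int(not s["y"]),
    "y": lambda s: int(not s["x"]),
}

def fixed_points(rules):
    names = sorted(rules)
    for values in product((0, 1), repeat=len(names)):
        state = dict(zip(names, values))
        if all(rules[g](state) == state[g] for g in names):
            yield state

print(list(fixed_points(rules)))
# -> [{'x': 0, 'y': 1}, {'x': 1, 'y': 0}]: two fixed points, i.e. two
#    alternative "differentiated" states of this tiny positive circuit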


Friday 4 March 2016, 11.00 Room Grace Hopper, Building Alan Turing

David Rosenblueth, Univ. Mexico

Making Boolean networks behave themselves (Model update via computation-tree logic)

We present a nondeterministic, recursive algorithm for updating a Boolean network so as to satisfy a given formula of the computation-tree logic (CTL). Recursive algorithms for model update face two dual difficulties: (1) Removing transitions from a Boolean network to satisfy a universal subformula may dissatisfy some existential subformulas. Conversely, (2) adding transitions to satisfy an existential subformula may dissatisfy some universal subformulas. To overcome these difficulties, we employ "protections", recording information about the satisfaction of subformulas previously treated by the algorithm. We show that protections may vastly reduce the backtracking of a naive generate-and-test CTL update algorithm.


Thursday 11 February 2016, 13.00 Room Grace Hopper 2, Building Alan Turing

Jakob Ruess, IST Austria

Towards coherent stochastic modeling in systems biology

In this talk, I will give a brief overview of how stochastic models are currently used by systems biologists with various theoretical backgrounds. I will argue that no single theoretical discipline has the appropriate concepts for providing a framework for stochastic modeling that is sufficiently general to be adjustable to the applications as necessary. I will then discuss what I believe to be the ingredients that are required for such a framework and talk about a recent paper where we worked on one of these ingredients in the context of a specific case study: the identification of a predictive stochastic model of a light-inducible gene expression circuit in yeast through the use of optimal experimental design.


Wednesday 10 February 2016, 10.30 Room Gilles Kahn, Building Alan Turing

Amaury Pouly, Univ. Oxford, UK

On the decidability and complexity of some continuous/hybrid reachability problems

In this talk, I will present some results about continuous space/time problems or hybrid reachability problems. The first problem is that of region to region reachability for piecewise affine functions. We show that, starting from dimension 2, the bounded time version is NP-complete, even if the function is assumed to be continuous. We also have some results on the related bounded safety problem, which is coNP-complete, and on a restricted precision version of the reachability problem, which is PSPACE-complete. The second problem is related to polynomial differential equations (systems of ODEs of the form y'=p(y) where p is a polynomial). We show that one can numerically solve such ODEs in polynomial time in the length of the solution curve. The third problem is that of transformation reachability in a nondeterministic hybrid linear automaton with no guards: given a finite state machine where the dynamics of each state is given by a linear differential equation and one can nondeterministically change state at any time, is it possible to obtain a given linear transformation? We show that when the matrices giving the continuous dynamics all commute, the problem is decidable. On the other hand, we show that the general problem is undecidable.
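
To fix notation for the second problem, here is a generic polynomial ODE system y' = p(y) (the textbook Lotka-Volterra equations, chosen arbitrarily) integrated numerically with SciPy. This only illustrates the class of systems considered, not the polynomial-time algorithm of the talk.

# Generic polynomial ODE y' = p(y): the Lotka-Volterra system with unit
# coefficients, integrated numerically with SciPy's solve_ivp.
from scipy.integrate import solve_ivp

def p(t, y):
    u, v = y
    return [u - u * v, u * v - v]   # right-hand side is polynomial in y

sol = solve_ivp(p, t_span=(0.0, 15.0), y0=[2.0, 1.0])
print(sol.y[:, -1])                 # state of the system at t = 15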


Thursday 4 February 2016, 13.00 (previously planned at 14.00) Room Grace Hopper 2, Building Alan Turing

Christoph Zechner, ETH Zurich

Dynamic noise filtering in synthetic molecular circuits

Context-dependency and environmental uncertainty are among the key challenges that hinder synthetic biology in reaching many of its potential applications. For instance, engineered circuits are exposed to a variety of noise sources when implanted into host organisms, often causing a prohibitive mismatch between circuit specification and actual behavior. In this talk I will discuss strategies for designing and engineering molecular circuits that are able to attenuate external noise and in turn achieve a higher degree of robustness in real environments. I will first present a mathematical framework for devising optimal noise filters and demonstrate how those can be realized using biochemical substrates. Subsequently, I will demonstrate several applications of our approach using simulation studies as well as experiments performed both in vitro and in vivo.


Wednesday 27 January 2016, 14.00, Room Grace Hopper, Building Alan Turing

Natalio Krasnogor, Newcastle University, UK

Accelerating synthetic biology via software and hardware advances

In this talk I will discuss recent work done in my lab that contributes towards accelerating the specify -> design -> model -> build -> test & iterate biological engineering cycle. This will describe advances in biological programming languages for specifying combinatorial DNA libraries, the utilisation of off-the-shelf microfluidic devices to build the DNA libraries as well as data analysis techniques to accelerate computational simulations.


Lifeware has moved from Inria Rocquencourt to Inria Saclay campus Ecole Polytechnique


Thursday 17 December 2015,

Ph.D. Defence of Thierry Martinez, University Paris Diderot.

More details (access plan, manuscript, summary).

Please note: due to the Plan Vigipirate, access to the university will be controlled. Please let us know that you plan to attend by sending an e-mail to thierry.martinez@inria.fr so that your name can be put on the list.

14.00-14.45: Jean Goubault-Larrecq, ENS Cachan, Categorical models of Multiplicative-Additive Linear Logic

14.45-15.15: Frank Valencia, LIX Palaiseau, Algebraic Structures for Space/Belief and Extrusion/Utterances

15.15-15.45: Tom Schrijvers, KUL Leuven, From monoids to near-semirings: the essence of MonadPlus and Alternative

15.45-16.00: break

16.00: Thierry Martinez, Execution models for Constraint Programming: kernel language design through semantics equivalence


Tuesday 16 December 2015, 14h (Bât 8)

Heinz Koeppl, Bioinspired Communication Systems, TU Darmstadt

Statistical inference of cellular behaviour from heterogeneous single-cell data

Modeling and inferring stochastic biomolecular processes based on single-cell data requires an extension of the traditional Markov chain description to account for the random molecular environment into which the process of interest is embedded. In particular, we seek an isolated process model that behaves as if the process were still embedded in the molecular environment. Based on this novel process model, we develop a Bayesian inference framework that resorts to traditional MCMC schemes in combination with sequential Monte Carlo techniques. We apply the framework to live-cell imaging data of an inducible gene expression system in budding yeast and show that it makes it possible to separate intrinsic from extrinsic noise components from single measurements, without the need for dedicated dual-color constructs.


Tuesday 15 December 2015, 15.30, Room Luc Valentin 454A, Building Condorcet, 10 rue Alice Domon, 75013 Paris

Ph.D. Defence of Artemis Llamosi: The effects of repeated osmotic stress on gene expression and proliferation: from cell-to-cell variability to cellular individuality in the budding yeast Saccharomyces cerevisiae.

When facing stress, cells are capable of complex responses and adaptations. Although the cellular response to stress has been studied in detail, very little work has investigated the effects of dynamic stress fluctuations and the role of cell-to-cell variability in this response. We studied osmotic stress in S. cerevisiae at the single-cell level by combining microfluidic systems, fluorescence microscopy and advanced image analysis methods.

In an original way, we focus here on the stable differences between cells in their gene expression dynamics. The use of population models and advanced statistical methods makes it possible to represent, with a single mathematical model, both the dynamics of a population and those of single cells in a concise and coherent way. This quantitative approach to stable individuality in gene expression dynamics can lead to the definition of a form of non-genetic identity.

We studied the relationship between these cellular particularities in gene expression and the phenotype or the microenvironment. By tracing cell genealogies, we were also able to show that this individuality is partially inherited.

Understanding the evolutionary consequences of a heritable non-genetic cellular identity requires knowing more about the impact of stress fluctuations on cell proliferation. We therefore sought to separate the consequences of repeated stress on the cell cycle on the one hand, and on growth on the other hand, which gives insight into the energetic and temporal effects of repeated stress.


Wednesday 18 November 2015, 10.30

Morgan Madec, Telecom Physique Strasbourg

Applying microelectronics design methods to synthetic biology

Synthetic biology is a new science at the interface between biotechnology and engineering. It aims at building and integrating new biological functions that do not exist in nature. It is a field with strong application potential in various areas such as the synthesis of new therapies and diagnostic tools, agronomy or the environment. Beyond all the biotechnologies required for the practical realization of these new functions (DNA synthesis, integration of genetic material into a genome, PCR, high-throughput sequencing, ...), the in silico design of these new biological functions remains a challenge in itself, similar to challenges taken up by other engineering fields such as microelectronics. Thanks to more than 40 years of development of electronic design automation, it is now possible to automate more than 90% of the design of circuits integrating over a billion transistors. Since synthetic biology and electronics are rather close, at least from the point of view of the design flow and of the functions realized, we explore the possibility of reusing some of the tools developed in microelectronics for synthetic biology.


Monday 2 November 2015, 10.30, Salle Verte 1, Inria Italie

Prof. Nicolas Beldiceanu, EPI TASC, Ecole des Mines de Nantes

Using Finite Transducers for Describing and Synthesising Structural Time-Series Constraints for Constraint Programming and Linear Programming

We describe a large family of constraints for structural time series by means of function composition. These constraints are on aggregations of features of patterns that occur in a time series, such as the number of its peaks, or the range of its steepest ascent. The patterns and features are usually linked to physical properties of the time series generator, which are important to capture in a constraint model of the system, i.e. a conjunction of constraints that produces similar time series. We formalise the patterns using finite transducers, whose output alphabet corresponds to semantic values that precisely describe the steps for identifying the occurrences of a pattern. Based on that description, we automatically synthesise constraint checkers as well as automata with accumulators. These automata can be used both in the context of constraint programming and of linear programming. The description scheme not only unifies the structure of the existing 30 time-series constraints in the Global Constraint Catalogue, but also leads to over 600 new constraints, with more than 100,000 lines of synthesised code.
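
As a rough illustration of the transducer viewpoint (deliberately simplified and assumed for the example, not the synthesised checkers of the Global Constraint Catalogue), the sketch below abstracts a time series into its signature of pairwise comparisons and counts peak occurrences with a two-state scan.

# Simplified sketch: abstract a time series into its '<' / '=' / '>' signature,
# then count peaks (an ascent followed by a descent) with a tiny state machine.
def signature(series):
    return ['<' if a < b else '>' if a > b else '=' for a, b in zip(series, series[1:])]

def count_peaks(series):
    state, peaks = 'flat', 0          # states: 'flat' (no pending ascent), 'up'
    for step in signature(series):
        if step == '<':
            state = 'up'
        elif step == '>':
            if state == 'up':         # an ascent precedes this descent: one peak
                peaks += 1
            state = 'flat'
        # '=' keeps the current state unchanged (plateaus are tolerated)
    return peaks

print(count_peaks([1, 3, 3, 2, 4, 5, 2, 2, 1]))   # -> 2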


Monday 10 August 2015, 2pm

Prof. Andreas Weber, Univ. Bonn, Germany

Towards Efficient Methods to Compute Equilibria of Chemical Reaction Networks Using Special Structure Information

For large-scale systems arising from chemical reaction networks, the parametric computation of equilibria and singularities in the positive orthant has already been a major challenge---but it is also of great importance for applications, as it helps in understanding properties of the reaction network. Whereas an efficient algorithm for the parametric computation of thermodynamic equilibria can be devised by combining recent methods, for the task of computing the concentration fixed points which are not in thermodynamic equilibrium, no better algorithms than generic ones for solving polynomial equation systems have been found. Attempts to use a vertex cover method on quadratic terms have not proven very successful, as the corresponding complexity parameters in biochemical reaction networks still grow prohibitively.

However, the tree width of the reaction networks, seen as bipartite graphs, seems to grow rather slowly. We will sketch an algorithmic idea based on a corresponding tree decomposition of a network, in which only polynomial systems of size bounded by the tree width have to be solved. A parametric description of the solutions of the full polynomial system given by the entire network can then be computed by substituting, "from leaves to root", the solutions of the systems of child nodes into those of parent nodes in the tree decomposition.


Tuesday 26 May 2015, Inria Italie, Paris

PhD Thesis Defense of Steven Gay

Subgraph Epimorphisms: Theory and Application to Model Reductions in Systems Biology

This thesis develops a framework of graph morphisms and applies it to model reduction in systems biology. We are interested in the following problem: the collection of systems biology models is growing, but there is no formal relation between models in this collection. Thus, the task of organizing the existing models, essential for model refinement and coupling, is left to the modeler. In mathematical biology, model reduction techniques have been studied for a long time; however, these techniques are far too restrictive to be applied on the scales required by systems biology. We propose a model reduction framework based solely on graphs, allowing models to be organized in a partial order. Systems biology models are represented by their reaction graphs. To capture the process of reduction itself, we study a particular kind of graph morphism: subgraph epimorphisms, which allow both vertex merging and deletion. We first analyze the partial order emerging from the merge/delete graph operations, then develop tools to solve computational problems raised by this framework, and finally show both the computational feasibility of the approach and the accuracy of the reaction graphs/subgraph epimorphisms framework on a large repository of systems biology models.

PhD Thesis

Slides


Tuesday 14 April 2015 10h30

Damien Woods, CalTech, USA

Programming Molecular Structure and Dynamics: Principles and Practice Using DNA

The goal of my research is to control the structure and dynamics of molecular systems at the nanoscale and to determine the fundamental limitations of this endeavor. A major unanswered challenge is to design molecular-scale systems with arbitrary structure and dynamics that consist of millions of simple interacting components and yet are robust to erroneous interactions, fluctuations in temperature, fluid flows and other uncontrolled factors. DNA is a versatile and programmable material that can meet these daunting criteria. The kinetics and thermodynamics of DNA are reasonably well-understood, and through straightforward Watson-Crick base-pairing interactions we can program this material to create complicated shapes and patterns, and to have intricate, even algorithmic, chemical dynamics, all at nanometer spatial resolution. The theory of computation will be a necessity to design and analyze the capabilities of such systems.

In this talk I will highlight some of my recent work on the theory and practice of programming molecules. Ongoing experimental work in the wet-lab includes building a DNA-based self-replicator capable of multiple generations of replication. Theoretical work involves developing a framework, or a complexity theory, for comparing self-assembly systems and knowing when one system is better than another. The talk will show how computer science and biology can inspire our molecular designs, and how we can hope to use mathematical and algorithmic tools to control a cacophony of interacting molecules by simply letting them interact in a hands-off self-assembling fashion.


Monday 13 April 2015 10h30

Hugo Fort, Institute of Physics, Montevideo, Uruguay

Niche games and the fate of communities ruled by different payoff matrices

In the first part of this talk I will review multi-species competition for resources, which defines a niche axis. I will also present results concerning the total abundance and diversity of ecological communities.

In the second part I will propose a way to extend niche competition theory by including interactions beyond mutual competition -like mutualism and parasitism- via game theory, giving rise to Niche Game Theory (NGT). I will analyze the effects on total abundance and diversity of introducing cooperative interspecific interactions for different popular dilemma games (prisoner's dilemma, chicken and stag hunt games).

Finally I will address possible practical applications of NGT in different realms: virology, agriculture, plant-pollinator networks, strategic market decision analysis.


Wednesday 1st April 2015 10h30

Maxime Folschette, Univ. Kassel, Germany

Modeling and analysis of large regulatory networks with the Process Hitting framework

The analysis of large networks provided by biologists raises two complementary problems. The first lies in the representation of the models, which has to be detailed enough to represent the phenomena of interest without containing unwanted dynamics. The second problem is the analysis of these models, which has to be efficient enough to ensure the production of interesting results in little time. In this talk, I will present several modelings based on the Process Hitting framework and show how they can address both problems.

The Standard Process Hitting was previously proposed by Paulevé, Magnin & Roux as a subset of Synchronous Automata Networks. I will show how enrichments such as classes of priorities, neutralizing edges or synchronous actions can raise its expressivity to the equivalent of Logical Networks (such as Thomas' Models), which was not the case before. These enrichments also make it possible to integrate time properties or delay parameters in the models to some extent. Furthermore, the particular form of Process Hitting models makes it possible to perform reachability analyses in polynomial time thanks to abstract interpretation, which is much more efficient than classical model checking, which usually requires exponential time. This makes it possible to study the dynamics of large models (up to hundreds of components and potentially more, due to its low complexity) in tenths of a second.

In the end, I will also present the equivalences between these formalisms and other widespread frameworks such as Petri Nets or the Boolean semantics of Biocham. In particular, I will focus on the relationship between the Process Hitting and Thomas' Models, which have two different levels of granularity.


Friday 20 February 2015 10h30

Wassim Abou-Jaoude, ENS

Logical modeling and model-checking of T-helper cell differentiation and plasticity

T helper (CD4+) lymphocytes play a key role in the regulation of immune responses. Potentially faced with a large diversity of microbial pathogens, antigen-inexperienced (naïve) CD4+ T cells differentiate into various T helper (Th) subsets, which secrete distinct sets of cytokines. This differentiation process requires the integration of multiple signals, mainly produced by antigen presenting cells (APC), triggering specific surface receptors, including the T cell receptor, co-stimulatory molecules, and cytokine receptors. Diverse combinations of these signals lead to the differentiation of naïve T cells into diverse Th subsets, associated to specific physiopathological functions, tailoring an adaptive response to the invading pathogen. During the last years, experimental studies highlighted the diversity and plasticity of Th lymphocytes, challenging the classical linear view of Th differentiation and raising questions regarding the mechanisms which underlie the observed diversity and plasticity of Th phenotypes. This type of question can actually be investigated using dynamical modeling. In particular, logical modeling has proven suitable for the dynamical analysis of large signaling and transcriptional regulatory networks. In this talk, I will present an extended version of a recently published logical model of T-helper cell differentiation, which accounts for novel Th cellular subtypes. Computational methods recently developed to efficiently analyze large models are first used to study static properties of the model (i.e. stable states). Symbolic model checking is then used to analyze reachability properties between Th subtypes upon changes of environmental cues. The model reproduces novel reported Th subtypes and predicts additional Th hybrid subtypes in terms of stable states. Using the model checker NuSMV-ARCTL, an abstract representation of the dynamics is produced, providing a global and synthetic view of Th plasticity. The model is consistent with experimental data showing the polarization of naïve Th cells into the canonical Th subtypes, and further predicts substantial plasticity of Th subtypes depending on the signalling environment.


Wednesday 11 February 2015 10h30

Konstantinos Koutroumpas, Genopole Evry,

Modeling and Analysis of DNA replication

DNA replication, the process of duplication of the cell's genetic material, is central to the life of every living cell. To ensure that the cell's genetic information is maintained, every base of the genome must be replicated once and only once before cell division. While in bacteria and some eukaryotes DNA replication is rather predictable, in most eukaryotes it involves considerable uncertainty, both in the location and in the time of activation of the origins. The presence of uncertainty in such a fundamental process is intriguing and raises questions such as why cells tolerate this uncertainty, how they regulate it, and how they ensure that the process is completed in a timely manner despite it. An important tool in answering such questions has been the development of mathematical and computational models that capture the dynamics and stochastic phenomena underlying DNA replication. In this presentation a stochastic hybrid model for the process of DNA replication is presented. In silico analysis of the model leads to interesting conclusions and conjectures about the mechanisms that govern the DNA replication process.


Thursday 27 November 2014, 14h00

David Fournier - Inria Paris-Rocquencourt, General Electric Transportation

PhD Thesis Defense. Metro Regenerative Braking Energy Optimization through Rescheduling: Mathematical Model and Greedy Heuristics Compared to MILP and CMA-ES

Room 127, Bâtiment Olympe De Gouges, 8 place Paul-Ricoeur, 75013 Paris.

Abstract

PhD Thesis

Access map


Friday 14th November 2014, PhD Defense of Xavier Duportet

(salle Jules Bordet, Institut Pasteur, 25 rue du Docteur Roux, Paris 14)

registration

10.00 Prof. Diego Di Bernardo - TIGEM, Italy

Engineering and control of gene networks

11.00 Prof. Timothy Lu - MIT, USA

Accelerating the engineering of life for human health applications

14.30 Xavier Duportet - Inria, Paris & MIT, USA

(amphithéâtre Duclaux, Institut Pasteur, 28 rue du Docteur Roux)

PhD Thesis Defense: Developing new tools and platforms for mammalian synthetic biology: from the assembly and chromosomal integration of large genetic circuits to the engineering of artificial intercellular communication systems

(abstract)


Monday 20th October 2014 (amphi Turing Bât 1)

14.00 Katherine Chiang, NTU Taiwan

Building Reconfigurable Digital and Neuromorphic Computations in a Biochemical World.

Realizing complex systems within a biochemical environment is a common pursuit in synthetic biology, an emerging technology with promising potential in biomedicine and other applications. Such systems achieve certain computations through properly designed biochemical reactions. Despite fruitful progress being made, most existing reaction designs have fixed target functionality. Their lack of reconfigurability can be disadvantageous, especially when a system has to adapt to a varying biochemical environment. In this talk, we will present an analog approach to economically construct reconfigurable logic or neuromorphic circuits similar to silicon-based field-programmable gate arrays (FPGAs). For digital computation, the effective “logic” and “interconnect” of the circuit can be dynamically reconfigured by controlling the concentrations of certain knob species; for neuromorphic computation, weights between connected neurons can adapt to environmental input, mimicking the synaptic plasticity of biological neurons.


Friday 5th September 2014 (amphi Turing Bât 1)

10.30 Reka Albert, Penn State University, USA

Logic modeling elucidates the outcomes of within-cell networks

Interaction networks formed by gene products form the basis of cell behavior (growth, survival, apoptosis, movement). Experimental advances in the last decade mark the first steps toward understanding and modeling the properties of the various networks that control the behavior of the cell. Such an understanding would allow the development of comprehensive and effective therapeutic strategies.

This talk will focus on my group's recent work on discrete dynamic modeling of signal transduction networks in various organisms. These models can be developed from qualitative information yet show a dynamic repertoire that can be directly related to the real system's outcomes. For example, our model of the signaling network inside T cells predicted therapeutic targets for the blood cancer T-LGL leukemia, several of which were then validated experimentally. I will then present a method to integrate network structure and dynamics into an expanded network. Extension of existing network measures and analyses, performed on this expanded network, allows an efficient way to determine the dynamic repertoire of the network and to predict manipulations that can stabilize or, conversely, block certain outcomes.


Wednesday 18th June 2014

10.30 Carlos Olarte, LIX

A Proof Theoretic Study of Soft Concurrent Constraint Programming

Concurrent Constraint Programming (CCP) is a simple and powerful model for concurrency where agents interact by telling and asking constraints. Since their inception, CCP languages have been designed to have a strong connection to logic. In fact, the underlying constraint system can be built from a suitable fragment of intuitionistic (linear) logic --ILL-- and processes can be interpreted as formulas in ILL. Constraints as ILL formulas fail to represent accurately situations where "preferences" (called soft constraints) such as probabilities, uncertainty or fuzziness are present. In order to circumvent this problem, c-semirings have been proposed as algebraic structures for defining constraint systems where agents are allowed to tell and ask soft constraints. Nevertheless, in this case, the tight connection to logic and proof theory is lost. In this work, we give a proof-theoretical interpretation to soft constraints: they can be defined as formulas in a suitable fragment of ILL with subexponentials (SELL) where subexponentials, ordered in a c-semiring structure, are interpreted as preferences. We hence achieve two goals: (1) obtain a CCP language where agents can tell and ask soft constraints and (2) prove that the language in (1) has a strong connection with logic. Hence we keep a declarative reading of processes as formulas while providing a logical framework for soft-CCP-based systems. An interesting side effect of (1) is that one is also able to handle probabilities (and other modalities) in SELL, by restricting the use of the promotion rule for non-idempotent c-semirings. This finer way of controlling subexponentials allows for considering more interesting spaces and restrictions, and it opens the possibility of specifying more challenging computational systems.


Wednesday 21st May 2014

11.00 David Cohen, Curie Institute, Paris

Model-based analysis of the Notch activation and p53 inhibition effect on metastasis

Both a gain of function of NICD1 and a loss of function of p53 have been shown to induce metastasis in mice. Due to the complexity of the two pathways Notch and p53, and of their crosstalk, it is difficult to understand the mechanisms leading to metastasis. We constructed an influence network including the main components of the two pathways and of other pathways known to induce metastasis, in order to explain the phenotype of the double mutant. We concentrated more particularly on the effect of NICD1 and p53 on the inducers of EMT. Using logical modelling, we were able to reconstruct the double mutant and other mutants in silico, suggesting that the influence network correctly recapitulates the mutant behaviour. Furthermore, we tried to associate experimental data to stable states, as a way to validate the model network.


Wednesday 12th March 2014

14.00 Alexander Bockmayr, Freie Universität Berlin, Germany

Dynamic optimization of metabolic networks coupled with gene expression

Adjusting the metabolic network organization to the environment by tuning enzyme expression levels is crucial for cellular growth, in particular in a changing environment or during metabolic adaptation. Metabolic networks are often studied with optimization methods applied to constraint-based steady state models. However, a corresponding dynamic modeling framework including a tight interplay between metabolic fluxes and gene expression has been lacking. As a result, the cost of producing enzymes could so far not be taken into account in the dynamic optimization of metabolic fluxes. Here, we present a modeling framework combining the metabolic network and the enzyme production costs. A rigorous mathematical approximation by a timescale separation yields a coupled model of quasi steady state constraints on the metabolic reactions, and differential equations for the substrate concentrations and biomass composition. Based on this model, we propose a dynamic optimization approach to determine reaction fluxes, explicitly taking production costs for enzymes and enzymatic capacity into account. In contrast to established dynamic flux balance analysis, the proposed approach thereby makes it possible to analyse dynamic changes in both the metabolic fluxes and the detailed biomass composition in situations of metabolic adaptation.

(joint work with Steffen Waldherr and Diego A. Oyarzún)


Wednesday 7th March 2014 at MSC Lab, Paris Diderot University, room 646A

14.00 Gregory Batt, Inria

Design, optimization and control in systems and synthetic biology (HDR defense)

How good is our understanding of the way cells treat information and make decisions? To what extent does our current understanding enable us to reprogram and control the way cells behave? In this presentation I will describe several approaches developed for the computational analysis of the dynamics of biological networks. I will in particular present work done on (i) the analysis of large gene networks with partial information on parameter values, (ii) the use of specification languages to express observations or desired properties in an abstract manner and efficiently search for parameters satisfying these properties, and (iii) recent efforts to use models to drive gene expression in real-time at the cellular level.


Wednesday 7th March 2014 at MSC Lab, Paris Diderot University, room 646A

11.00 Mustafa Khammash, ETH Zurich

Cybergenetics: real-time feedback control of living cells

Wiener envisioned a science where the study of control and communication in man and the machine can be unified. His Cybernetics, or the art of steering, embodied this ambitious vision. Wiener's ideas predated the discovery of DNA and the subsequent biology revolution, and his cybernetic examples were inevitably limited to known physiological control systems at the scale of the organism. Today, the confluence of modern genetic manipulation techniques, powerful experimental measurement technologies, and advanced analysis and computational methodologies is making it possible to achieve real-time feedback control of living cells at the gene level. We refer to this nascent area of research as Cybergenetics. This presentation will describe novel analytical and experimental work that demonstrates how de novo control systems can be interfaced with living cells and used to control their behaviour. We will draw comparisons with endogenous feedback control systems in biology as well as with engineered control systems in technology.


Tuesday 4th March 2014 at ENS 46 rue d'Ulm, Paris 5, room 505

11.00 Hannes Klarner, Freie Universität Berlin, Germany

Transient and Asymptotic Behaviors of Logical Networks

How do we know whether time course observations refute or validate our logical model? In the first part of the talk I will discuss the notion of compatibility for time series data and asynchronous logical networks. Starting from the presence of a time series path I will present a family of additional properties including monotony, robustness and partial stability and show how LTL and CTL model checking applies to the inherent decision problems. As an application I will present a method for the assessment of the data's sampling rate.

The theme of the second part is the challenge of predicting the long-term behavior of logical networks. I will begin by discussing the notion of partial fixpoints and their connection to the attractors of a model. Motivated by the observation that they reduce the problem of predicting the attractors to a case-by-case analysis of sub-networks, I will propose a procedure for their computation.


Wednesday 26th February 2014

10.30 Shahrad Jamshidi, LINA, Nantes

Comparing discrete and piecewise affine differential equation models of gene regulatory networks

We compare the discrete asynchronous logical modeling formalism for gene regulatory networks due to R. Thomas with piecewise affine differential equation models. We provide a local characterization of the qualitative dynamics of a piecewise affine differential equation model using the discrete dynamics of a corresponding Thomas model. Based on this result, we investigate the consistency of higher-level dynamical properties such as attractor characteristics and reachability. We show that although the two approaches are based on equivalent information, the resulting qualitative dynamics are different. In particular, the dynamics of the piecewise affine differential equation model is not a simple refinement of the dynamics of the Thomas model.

(Joint work with Heike Siebert and Alexander Bockmayr)


Wednesday 29th January 2014

15.00 Alexander Bockmayr, Freie Universität Berlin

Sequential Metabolic Phases as a Means to Optimize Cellular Output

Temporal changes of gene expression are a well-known regulatory feature, which is commonly perceived as a strategy to adapt the cellular behaviour to varying external conditions. However, temporal changes of gene expression are also observed under virtually constant external conditions. Here we hypothesize that such changes are a means to render the synthesis of the metabolic output more efficient than under conditions of constant gene activities. In order to substantiate this hypothesis we analyse a flux-balance model of the cellular metabolism extended by a constraint taking into account that the amount of protein that can be spent on the synthesis of enzymes and transporters is limited. Applying this concept to a simplified metabolic network of the central carbon metabolism with glucose or lactate as alternative substrates, we demonstrate that switching between optimally chosen stationary flux modes comprising different sets of active genes allows producing a required amount of target metabolites in a significantly shorter time than by a single optimal flux mode at fixed gene activities. Our model-based findings suggest that temporal expression of metabolic genes can be advantageous even under conditions of constant external substrate supply.

Joint work with A. Palinkas, S. Bulik, and H.-G. Holzhütter.


Tuesday 28th January 2014

10.30 Alexander Bockmayr, Freie Universität Berlin

Computing Elementary Flux Modes Involving a Set of Target Reactions

Elementary flux mode (EM) analysis is an important tool in the constraint-based analysis of genome-scale metabolic networks. Due to the combinatorial complexity of these networks, as well as the advances in the level of detail to which they can be reconstructed, an exhaustive enumeration of all EMs is often not practical. Therefore, in recent years interest has shifted towards searching EMs with specific properties. We present a novel method based on mixed-integer programming, which allows computing EMs containing a given set of target reactions. This generalizes previous algorithms where the set of target reactions consists of a single reaction. In the one-reaction case, our method compares favorably to the previous approaches. In addition, we present several applications of our algorithm for computing EMs containing two target reactions in genome-scale metabolic networks.

Joint work with L. David.


Tuesday 14th January 2014

14.00 Erwan Bigan, MSC lab, Paris-Diderot University and LIX, Ecole Polytechnique.

Emerging properties of large random chemical reaction networks and application to proto-cell models

Large random chemical reaction networks are generated to investigate minimal conditions for life. The network generation procedure ensures mass conservation with arbitrary stoichiometry, and the generated networks are endowed with reaction kinetics that are compatible with thermodynamics. These chemical systems are characterized by computing the exact time trajectory of chemical concentrations, at equilibrium and out of equilibrium. A self-consistent proto-cell model incorporating such networks is built assuming that one of the chemical constituents self-assembles into a structured membrane, and that this self-assembled membrane is semi-permeable to some other constituent (nutrient) diffusing from an outside growth medium into the proto-cell cytoplasm. The diffusing nutrient is primarily metabolized into the membrane precursor, and partially into other constituents. Synthesized membrane precursors get incorporated into the growing membrane, resulting in cell growth and dilution of all its constituents. Computations on several tens of random networks reveal a similar qualitative behavior: provided a simple condition is met by the stoichiometry matrix, all randomly generated networks converge towards a stationary cellular growth regime regardless of initial conditions, precise network topology or reaction kinetics. This suggests that the most stringent requirements for sustained proto-cellular life relate to the membrane, details of chemical reactions leading to nutrient transformation into other cell constituents being only of secondary importance. Perspectives on the formal demonstration of observed properties will be discussed.


Public Seminar of former EPI Contraintes


Thursday 12th December 2013 (building 8)

10.30 Laurent Fribourg, CNRS, ENS Cachan

Control of switching systems by invariance analysis: application to power electronics

Switched systems are embedded devices widespread in industrial applications such as power electronics and automotive control. They consist of continuous-time dynamical subsystems and a rule that controls the switching between them. Under a suitable control rule, the system can improve its steady-state performance and meet essential properties, such as safety and stability, in desirable operating zones. We explain in this talk that such controller synthesis problems are related to the construction of appropriate invariants of the state space, which approximate the limit sets of the system trajectories. We present several approaches of invariant construction based on techniques of state space decomposition and backward/forward fixed-point computation. All these approaches will be illustrated in a number of case studies, mainly taken from the field of power electronics.

(joint work with Romain Soulat)


Tuesday 10th Dec 2013, 10.30 (building 8)

Bernadette Charron-Bost, LIX Ecole Polytechnique

Orientation and Connectivity Based Criteria for Asymptotic Consensus

We consider a set of autonomous agents that interact with each other by exchanging values and form weighted averages of received values. We overview existing convergence results, and present orientation and connectivity based criteria on the communication graphs to achieve asymptotic consensus in the context of time-varying topology and communication delays. We use these criteria to prove the convergence of dynamical systems for synchronization, opinion dynamics, and social epistemology.
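
A generic numerical illustration of the averaging dynamics in question, with a fixed communication graph and made-up weights (the talk addresses the much harder time-varying, delayed setting):

# Toy consensus iteration: each agent repeatedly replaces its value by a
# weighted average of the values it receives. With this fixed, strongly
# connected, row-stochastic weight matrix, all values converge to a common limit.
import numpy as np

W = np.array([[0.6, 0.4, 0.0],      # made-up row-stochastic weight matrix:
              [0.3, 0.4, 0.3],      # agent i averages over its in-neighbours
              [0.0, 0.5, 0.5]])
x = np.array([0.0, 5.0, 10.0])      # initial values / opinions

for _ in range(200):
    x = W @ x                       # one round of averaging

print(x)                            # the three entries are (numerically) equal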


16th July 2013, 2 p.m (room 454A, MSC lab, Condorcet Building, University Paris Diderot)

map and address

2 p.m Deepak Mishra (Weiss Lab for Synthetic Biology, MIT)

The Use of Protein-Protein Interactions in Synthetic Biology

While protein-protein interactions are ubiquitous in natural biological systems, the use of these interactions in synthetic biology has been minimal. Here, I describe recent work in synthetic networks utilizing protein-protein interactions in Saccharomyces cerevisiae. First, I will introduce the use of mixed regulatory modalities comprising transcriptional cascades and protein phosphotransfer elements for improved modularity in synthetic networks. By utilizing such an approach, we can create circuits that remove context dependency upon interconnection of transcriptional modules by attenuating retroactivity even in the presence of high fan-out. We demonstrate experimentally that due to load, a circuit suffers from 76% delay in response time to a step input and 81% decrease in amplitude of response to a periodic input. By contrast, when a load driver is incorporated within the circuit, addition of load changes the response by only 2% to 12%. Second, I will describe a set of design rules for building circuits with protein elements and demonstrate their use to build a toggle switch comprised solely of protein-protein interactions. We demonstrate experimentally the ability to rewire the endogenous high osmolarity MAPK cascade with exogenous mammalian, plant, and bacterial components to achieve designer phosphorylation networks and continued work to create a protein-phosphorylation bistable toggle switch. Finally, I will discuss some future application areas and technical challenges for using protein-protein circuits in synthetic biology.


10 July 2013 (room 2015, Bâtiment Sophie Germain, University Paris Diderot)

Sophie Germain Building - 2nd floor, room 2015 - avenue de France / rue Alice Domon et Léonie Duquet, 75013 Paris (map)

10.30 Alexander Bockmayr, Freie Universität Berlin and DFG Research Center Matheon.

Constraint-based analysis of metabolic networks.

We give an overview of the most important mathematical and algorithmic ideas underlying the constraint-based analysis of metabolic networks at steady-state. We will discuss the steady-state flux cone, flux balance analysis (FBA), flux variability analysis (FVA), flux coupling analysis (FCA), elementary flux modes (EFMs), minimal metabolic behaviors (MMBs), and thermodynamic constraints. On the algorithmic side, the focus will be on linear programming and mixed-integer programming methods.
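To make the steady-state constraint concrete, here is a minimal flux balance analysis sketch on a toy three-reaction network (the network, bounds and scipy-based solver are our own illustration, not material from the talk): maximise c·v subject to S v = 0 and flux bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions R1..R3)
# R1: -> A (uptake), R2: A -> B, R3: B -> (export / "biomass")
S = np.array([[1, -1,  0],
              [0,  1, -1]])

c = np.array([0.0, 0.0, 1.0])             # objective: maximize flux through R3
bounds = [(0, 10), (0, 1000), (0, 1000)]  # flux bounds (uptake R1 is limiting)

# linprog minimizes, so negate the objective; S v = 0 encodes the steady state
res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)  # expected: 10.0
print("flux distribution:", res.x)
```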

11.30 Karsten Wolf, Rostock University,

Generating Petri net State Spaces

Analyzing a system based on its state space appears to be a task that is mostly independent of the language in which the system is represented. We argue in favor of the contrary. We show that characteristics that make the formalism of Petri nets a unique modeling language can be used to gain efficiency in explicit state space exploration. We underpin our claims with experience in using our tool LoLA.

12.30-14.00 lunch

14.00 Faten Nabli, Inria Paris Rocquencourt,

Thesis Defense: A constraint programming approach to the analysis of Petri nets structural properties and application to biochemical networks


Friday 5 July 2013 (building 8)

10.30 Carlos Olarte and Elaine Pimentel, LIX

A General Proof System for Modalities in Concurrent Constraint Programming

The combination of timed, spatial, and epistemic information is often needed in the specification of modern concurrent systems. We propose the proof system SELLU, which extends linear logic with subexponentials by quantifiers over subexponentials, therefore allowing for an arbitrary number of modalities. We then show how a proper structure of the subexponential signature in SELLU allows for the specification of concurrent systems with timed, spatial, and epistemic modalities. In the context of Concurrent Constraint Programming (CCP), a declarative model of concurrency, we illustrate how the view of subexponentials as specific modalities is general enough to modularly encode into SELLU variants of CCP with these three modalities, thus providing a proof-theoretic foundation for those calculi.


19 April 2013 (Amphitheatre Pierre-Gilles de Gennes, Bâtiment Condorcet, University Paris Diderot)

10.00 John Lygeros, ETH Zurich,

Moment based approach to estimation and control of cell populations

11.30 Diego di Bernardo, TIGEM, Naples,

miRNAs confer phenotypic robustness to gene networks: a synthetic biology approach

14.00 Jannis Uhlendorf, Inria Paris Rocquencourt,

Thesis Defense: Real-time feedback control of gene expression.


8 April 2013 (Building 11)

14.00 Stéphane Gaubert, Inria and CMAP Ecole Polytechnique.

Algorithmic aspects of tropical convexity: an introduction

We will review some algorithmic problems emerging from tropical convexity, discussing specially the equivalence between mean payoff games and tropical linear programming and algorithms to solve tropical linear feasibility problems. We will also discuss more briefly the question of computing all the vertices of a tropical polyhedron.

This talk is based on:

M. Akian, S. Gaubert, and A. Guterman. Tropical polyhedra are equivalent to mean payoff games. International Journal of Algebra and Computation, 22(1):125001, 2012, arXiv:0912.2462,

X. Allamigeon, S. Gaubert, and E. Goubault, Computing the vertices of tropical polyhedra using directed hypergraphs, Discrete & Computational Geometry, 49(2):247-279, 2013, arXiv:0904.3436v4.

Slides (pdf)
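As a tiny, self-contained illustration of the max-plus ("tropical") arithmetic underlying tropical polyhedra (our own example, not taken from the talk or the cited papers): addition becomes max, multiplication becomes +, and minus infinity plays the role of zero.

```python
import numpy as np

NEG_INF = -np.inf   # the tropical "zero"

def maxplus_matvec(A, x):
    """Tropical matrix-vector product: (A (x) x)_i = max_j (A[i, j] + x[j])."""
    return np.max(A + x[None, :], axis=1)

A = np.array([[0.0, 3.0],
              [NEG_INF, 1.0]])
x = np.array([2.0, 0.0])
print(maxplus_matvec(A, x))   # [max(0+2, 3+0), max(-inf, 1+0)] = [3., 1.]
```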


25 January 2013 (amphi Turing, Building 1)

10.30 Emanuele Leoncini, EPI RAP, Inria Rocquencourt,

Analysis of the Stochastic Variability of Protein Production of Prokaryotic Cells

Protein production, also referred to as gene expression, is a fundamental process within prokaryotic and eukaryotic cells and has been studied for decades by biologists. Experiments have given direct proof of the stochastic nature of gene expression. As a consequence, the concentration of a given protein shows stochastic fluctuations, which may be costly given that the whole protein production process consumes 80% of the resources of a prokaryotic cell.

To analyze the fluctuations of the number of copies of a specific protein, a generalized version of some classical Markovian models of the late 70s is presented. These earlier models assumed an exponential distribution for the duration of the different steps of the production process. Nevertheless, it is well known that some steps of gene expression, such as protein elongation, are not exponentially distributed, showing an almost deterministic duration. By using a new representation of the number of proteins in terms of a marked Poisson point process, closed formulas for the fluctuations of the number of proteins are obtained in this general setting.

Our results show in particular that, surprisingly, the exponential assumption may significantly underestimate the variance of the number of proteins when some steps are not exponentially distributed and have smaller fluctuations than an exponential random variable. This counter-intuitive result stresses the importance of the statistical assumptions in the protein production process.

Joint work with Vincent Fromion (INRA) and Philippe Robert (INRIA)
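For reference, here is a minimal Gillespie simulation of the fully Markovian baseline mentioned above (constant-rate production, exponential degradation); parameters are our own illustration. For this baseline the stationary mean and variance are both close to k/gamma (Fano factor about 1), which is exactly the kind of prediction the more general model of the talk revisits when step durations are not exponential.

```python
import numpy as np

# Classical baseline: protein produced at rate k, degraded at rate gamma * n.
rng = np.random.default_rng(0)
k, gamma = 10.0, 1.0
t, n, t_end, dt_sample = 0.0, 0, 2000.0, 1.0
next_sample, samples = 0.0, []

while t < t_end:
    total = k + gamma * n
    dt = rng.exponential(1.0 / total)
    while next_sample < t + dt and next_sample < t_end:
        samples.append(n)                 # record the state on a regular time grid
        next_sample += dt_sample
    t += dt
    n += 1 if rng.random() < k / total else -1

samples = np.array(samples[100:])         # discard the initial transient
print("mean:", samples.mean(), "variance:", samples.var())
# Expected for this fully Markovian baseline: mean ~ variance ~ k/gamma.
```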


30 November 2012 (amphi Turing, Building 1)

10.00 Prof François Boullier, Université de Lille

Differential Algebra methods for the analysis of chemical networks

Differential algebra is an algebraic theory for polynomial differential equations, such as those arising from chemical reaction systems through the law of mass action. In this talk, I will present it through some applications, illustrated by software demonstrations. The software, a MAPLE package, is actually just an interface to a stand-alone, open source C library developed in the computer algebra team of Lille.

Slides (pdf, MM, Hill). Hab. Dir. Rech.

11.00 Prof Andreas Weber, University of Bonn

Parametric analysis of equilibria of chemical reaction networks

For large-scale systems arising from chemical reaction networks, even the parametric computation of equilibria and singularities in the positive orthant is a major challenge, but it is also of great importance for applications, as it reveals properties of the reaction network. We have implemented a newly developed algorithm by Dima Grigoriev, with which we could compute equilibria for most examples from the BIOMOD database, including the examples showing "absolute concentration robustness" (Shinar and Feinberg, 2010).

As the existence of Hopf bifurcation points yields oscillations, we build on previous work with M. El Kahoui and T. Sturm that yields reductions to quantifier elimination over the reals (and computations using REDLOG). In recent joint work with H. Errami, W. Seiler, and M. Eiswirth we could compute larger examples fully algorithmically using reaction coordinates.

On the systems side we have recently built PoCaB -- a platform to explore bio-chemical reaction networks by algebraic methods. It not only provides the software framework for automated computations involving different systems, but also contains a database of the algebraic entities computed from models of chemical reaction networks.

Slides (pdf)


Monday 5 November 2012 (amphi Turing, Building 1)

10.30 Jean-Guillaume Fages (Ecole des Mines de Nantes)

On using graph variables in Constraint Programming: the SCC constraint

Solving combinatorial optimization problems often involves finding a partial subgraph of a given graph which satisfies a set of constraints. Currently, when facing such problems, CP modelers have to manipulate binary, integer or set variables in order to model a graph. This practice creates unnecessary modeling and computational issues, which are especially visible on large-scale problems.

In order to improve the resolution of graph problems in CP, we implemented graph variables in CHOCO. In this talk, I will present the management of graph variables and graph constraints in CHOCO. Then, as an example, I will describe an efficient incremental maintenance of the Strongly Connected Components (SCCs) of a graph variable, and show how it can filter the domain of the graph variable in order to obtain a Hamiltonian path. Finally, I will show the practical benefits of using graph variables.
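For readers unfamiliar with the building block, here is a plain, non-incremental Tarjan SCC computation in Python (our own reference sketch; the talk's contribution, the incremental maintenance inside a CHOCO graph variable and the associated filtering, is not reproduced here):

```python
def tarjan_scc(graph):
    """Classical (non-incremental) Tarjan SCC algorithm on an adjacency dict."""
    counter = [0]
    index, lowlink, on_stack = {}, {}, set()
    stack, sccs = [], []

    def strongconnect(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:        # v is the root of an SCC
            component = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                component.append(w)
                if w == v:
                    break
            sccs.append(component)

    for v in list(graph):
        if v not in index:
            strongconnect(v)
    return sccs

# A 4-node digraph with two SCCs, {0, 1} and {2, 3}
print(tarjan_scc({0: [1], 1: [2, 0], 2: [3], 3: [2]}))
```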


Friday 27 July 2012 (amphi Turing, Building 1)

10.30 Chris Banks, University of Edinburgh

Formal Experiments in Computational Biochemistry

In this talk I will present my logic of formal experiments. The logic has two layers: temporal logic for classifying behaviour of a system and a contextual operator which classifies behaviour in a given context. The contextual operator is similar to the guarantee operator from spatial logic, allows the introduction of new species into the system, and nests within the temporal modalities to express complex behaviour.

The logic is supported by efficient model checking and tooling, and is built into my interpreter for the continuous pi-calculus (cpi), a continuous state-space process algebra for biochemical systems. I will also give an overview of cpi and some examples of the application of the logic to cpi processes.


Friday 20 July 2012 (amphi Turing, Building 1)

10.30 Ovidiu Radulescu, Université de Montpellier

Reduction of biochemical networks with multiple time scales

Biochemical networks are used in computational biology to model mechanistic details of systems involved in cell signaling, metabolism, and regulation of gene expression. Parametric and structural uncertainty, as well as combinatorial explosion, are strong obstacles to analyzing the dynamics of large models of this type. Multi-scaleness is another property of these networks, one that can be used to get past some of these obstacles. Networks with many, well-separated time scales can be reduced to simpler networks, in a way that does not depend on the exact values of the parameters, but only on their order relations. I will show how tropical geometry can be used to unify various approaches to this problem, and provide examples from signaling and cell cycle regulation to illustrate these ideas.

Slides (pdf, restricted access)


Thursday 19 July 2012 (amphi Turing, Building 1)

10.30 Luca Grieco, ENS Paris

Logical modelling of MAPK network

Based on an extensive analysis of published data, we built a comprehensive molecular map of the Mitogen-Activated Protein Kinase (MAPK) signalling network, using the CellDesigner software.

Mammalian MAPKs can be activated by a wide variety of stimuli, including growth factors and environmental stresses. Activation of MAPK pathways affects diverse cellular activities, including gene expression, cell cycle machinery, survival, apoptosis and differentiation. To date, three groups of MAPKs have been extensively studied: extracellular regulated kinases (ERK1/2), Jun NH2 terminal kinases (JNK1/2/3), and p38 kinases (p38 α/β/γ/δ).

A recurrent feature of MAPK pathways is a central three-tiered core signalling module, consisting of a set of sequentially acting kinases, in response to external stimuli. At each level of a MAPK cascade, protein phosphorylation is regulated by the opposing actions of phosphatases. Moreover, scaffold proteins bring together the components of a single pathway and insulate the module from activation by irrelevant stimuli and negative regulators (like phosphatases). Finally, the three main MAPK cascades are strongly intertwined, making the prediction of cell fate decision an extremely difficult task.

Using the CellDesigner map as a reference, we derived a comprehensive logical model for the MAPK network, with the aim to reproduce the response of MAPK cascades to different stimuli and better understand their contributions to cell fate decision (between Apoptosis, Growth Arrest and Proliferation). The resulting logical model encompasses the three main MAPK cascades in response to four inputs: EGFR, FGFR3, TGFb, and DNA damage.

The model, consisting of 54 variables, has been built using the GINsim software. Exhaustive simulations of this model are not currently possible, given the huge dimension of its associated state transition graph. We overcame this problem by taking advantage of a novel model reduction function implemented in GINsim, which preserves the attractors of the system.

Our simulations show that the dynamical behaviour of our model is in agreement with our current biological knowledge.

In order to uncover the mechanisms by which the MAPK cascades mediate cell fate decision, we performed more extensive simulations and model analyses, using the reduced version of our model. These include model mutant tests, representing specific scenarios of interest. Exploitation of our results led us to formulate novel hypotheses concerning the interplay of the model components, suggesting wet experiments aimed at uncovering novel therapeutic targets. Such experiments will ultimately be performed in order to validate our model.

Slides (pdf restricted access)


Thursday 10 May 2012 (amphi Turing, Building 1)

10.30 Lukasz Kaiser, LIAFA, Paris

Quantitative Logics on Structure Rewriting Systems

Structure rewriting systems are infinite Kripke structures with metafinite relational structures assigned to each node. Transitions in the Kripke structure are used to model temporal changes, while the assigned finite structures model the state of the system at specific points in time. We will show how various systems can be modeled naturally in this way, and we will discuss which model-checking techniques can be applied in this setting. This leads to several decidability results for the quantitative mu-calculus on both purely discrete and hybrid structure rewriting systems.


Friday 27 April 2012 (amphi Turing, Building 1)

10.30 Francesco Santini, CWI, Amsterdam

Similarity-based Web Service Discovery through Soft Constraint Satisfaction Problems

In this talk, we focus on the discovery process of Web Services (WSs), basing the search on the similarities among the service requirements, in order to cope with over-constrained queries or to find satisfactory alternatives to user requirements. This discovery process is cast as a Soft Constraint Satisfaction Problem (SCSP). First we represent both WSs and the search query of the user as Rooted Trees, i.e., a particular form of Conceptual Graphs. Then, we find a homomorphism between these two trees as the solution of an SCSP. The main contribution is the enhanced expressiveness offered by this “softness”: in over-constrained scenarios, when a user query cannot be satisfied, classical crisp constraints (i.e., CSP) are not expressive enough to find “close” solutions meeting the users’ needs.


Thursday 29 March 2012 (amphi Turing, Building 1)

10.30 Prof. Fernando Buarque, University of Pernambuco, Brazil

Fish School Search (FSS): the need for yet another optimization metaphor

Nature-inspired algorithms, especially population-based ones such as swarms and colonies, are metaphors able to deal fairly well with complex optimization problems. Moreover, they handle well large-dimensional search spaces of a highly non-monotonic nature; hence, they are normally good at searching for parameters in problems of high cardinality at relatively low computational cost.

In 2008, Bastos Filho and Lima Neto proposed a new metaheuristic in the fast-growing family of swarm intelligence techniques, namely, Fish School Search (FSS) [1]. In FSS, the school collectively “swims” (searches) for “food” (candidate solutions) in the “aquarium” (search space). Similarly to PSO (Particle Swarm Optimization) or GA (Genetic Algorithms), the search guidance in FSS is driven by the merit of individual members of the population, and the weight of each fish acts as a factual memory of its individual success. In contrast with PSO, the weight can obviate the need to keep a log of best positions visited as well as any other topological information. As opposed to GA, the actual location of each fish directly replaces the chromosome. As for the social reasoning, the barycenter of the whole school can automatically guide expansion and contraction of the school, evoking exploration and exploitation when necessary [2]. In other words, the quality of the search can be inferred from the regions where larger ensembles of fish are located (and vice versa).

In the talk, after a brief review of the rationale of Computational Intelligence, I will give an overview of the FSS operators and present several simulation results. Our main goal is to make clear why FSS is well suited to hard optimization tasks and why it affords computational features such as: (i) self-adaptable individual guidance towards sought solutions, (ii) on-the-‘swim’ collective selection between exploration and exploitation, and (iii) non-monotonic & high-dimensional search abilities (that can solve multi-modal optimization problems) [3].

[1] BASTOS FILHO, C.; LIMA NETO, F. et al. "A Novel Search Algorithm based on Fish School Behavior". Proceedings of IEEESMC2008 (Singapore), pp. 2646-265.

[2] BASTOS-FILHO, Carmelo J. A.; LIMA NETO, Fernando Buarque de; LINS, Anthony J. C. C.; NASCIMENTO, Antônio I. S.; LIMA, Marília P. "Fish School Search: an overview". In: CHIONG, Raymond (Ed.). Nature-Inspired Algorithms for Optimisation. Series: Studies in Computational Intelligence, Vol. 193, pp. 261-277. Berlin: Springer-Verlag, 2009. {ISBN: 978-3-642-00266-3}.

[3] MADEIRO, S.; LIMA NETO, F.; Bastos-Filho, C.; FIGUEIREDO, E. Density as the Segregation Mechanism in Fish School Search for Multimodal Optimization Problems. In ICSI’2011: Second International Conference on Swarm Intelligence. Springer - Lecture Notes in Computer Science, v. 6729, p. 563-572, 2011.

(slides in pdf)
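To complement the description above, here is a deliberately simplified sketch of the FSS ingredients (individual moves, weights as a memory of success, barycenter-driven contraction and expansion); it omits the collective-instinctive movement and is our own illustration, not a faithful implementation of [1]:

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):                       # toy objective to minimize (illustration only)
    return np.sum(x**2, axis=-1)

n_fish, dim, steps = 30, 5, 200
step_ind, step_vol = 0.1, 0.05
pos = rng.uniform(-5, 5, (n_fish, dim))
weight = np.ones(n_fish)

for _ in range(steps):
    # 1) individual movement: keep random steps that improve fitness
    cand = pos + step_ind * rng.uniform(-1, 1, pos.shape)
    improve = fitness(pos) - fitness(cand)      # > 0 means improvement
    moved = improve > 0
    pos[moved] = cand[moved]

    # 2) feeding: the weight acts as a memory of individual success
    if improve.max() > 0:
        weight += np.where(moved, improve / improve.max(), 0.0)

    # 3) collective-volitive movement: contract towards the barycenter when the
    #    school as a whole improved, expand away from it otherwise
    bary = np.average(pos, axis=0, weights=weight)
    direction = pos - bary
    if improve.sum() > 0:
        pos -= step_vol * direction             # contraction: exploitation
    else:
        pos += step_vol * direction             # expansion: exploration

print("best fitness found:", fitness(pos).min())
```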



Monday 24 October 2011

10.30 Neda Saeedloei, University of Texas, Dallas, USA

Modeling and Verification of Real-time and Cyber-Physical Systems

We develop techniques for including continuous time in computations. These techniques result in frameworks for modeling and verification of real-time systems. Our proposed framework is based on logic programming (LP) extended with constraints and coinduction. To build the theoretical foundations necessary for this work, we present a new programming paradigm, co-constraint logic programming (co-CLP for brevity), that merges two programming paradigms: constraint logic programming and coinductive logic programming. We investigate the incorporation of continuous time into various domains of computer science. In particular we study the extension of omega-grammars with continuous time (timed omega-grammars) and also a continuous-time extension of the pi-calculus (timed pi-calculus). We present the implementation of these formalisms in co-CLP and show how these co-CLP-based realizations can be used for modeling and verifying real-time systems. We also present a co-CLP framework for modeling hybrid automata; we use this logic programming realization of hybrid automata to develop a framework for modeling and verification of cyber-physical systems, including real-time systems.


Monday 10 October 2011 (Seminar room of Building 16)

10.00 Calin Belta, Boston University, USA

Formal verification and control of piecewise affine systems with applications to gene networks

Discrete-time piecewise affine systems (PWA) are a particular class of hybrid systems that evolve along different affine dynamics in different polyhedral regions of the continuous state space. Such models are very popular because they can approximate nonlinear systems with arbitrary accuracy and because of the existence of algorithms for the identification of such systems from experimental data. In this talk, I will show how the particular structure of PWA systems can be exploited to develop computationally efficient algorithms for their verification and for synthesis of parameters and control strategies. I will show how such tools can be used for analysis and synthesis of synthetic gene networks.
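For readers unfamiliar with the class, a discrete-time PWA system can be written as follows (notation ours):

\[
x(k+1) = A_i\, x(k) + b_i \quad \text{whenever } x(k) \in \mathcal{P}_i, \qquad i = 1,\dots,M,
\]

where the polyhedra \(\mathcal{P}_1,\dots,\mathcal{P}_M\) partition the continuous state space and each region carries its own affine dynamics \((A_i, b_i)\).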


Thursday 21 April 2011

10.00 Szymon Stoma, Humboldt University, Berlin, Germany

Towards systematic study of spatio-temporal cellular systems

During the presentation I will discuss the application of computational techniques to the study of complex, spatio-temporal cellular systems. I will present models and simulations, together with experiments, used to heighten our comprehension of shoot apical meristem (SAM) development in Arabidopsis thaliana, and in particular the process of regular organ initiation, called phyllotaxis. To develop phyllotaxis models, we focused on the intracellular transport of one of the essential plant hormones, auxin. Transport of auxin is mediated by its efflux carrier, a protein called PIN1. The location of PIN proteins in a single cell changes dynamically during the growth process and this so-called repolarization is coordinated at the tissue level. To capture these events, the presented models are at the scale of the tissue, with cellular resolution, and since they include growth, from the computer science point of view they are Dynamic Systems with Dynamic Structures (DS2), which brings important implications for the modeling. I will also discuss the physics-based framework to simulate growth of the meristem including tropisms. Since auxin modifies the rigidity of cell walls, leading to an increase in growth rates in the spots of its high concentration, the introduced framework is used to upgrade the auxin transport-based model of phyllotaxis. In this upgraded model the transport-based patterning mechanism directly modifies the growth directions of the meristem, allowing us to study the coupling of growth, auxin and PIN distributions. Finally, I will discuss the use of experimental, microscopic data to validate microbiological models, which will bring us to the motivation for developing the Spatio Temporal Simulation Environment (STSE): http://stse-software.org


Monday 18 April 2011

13.30 Cédric Joncour, Université Bordeaux 1

New combinatorial approaches for packing problems

The two-dimensional packing problem consists in deciding whether a set of rectangular items can be placed in a given box without overlap and without exceeding the container. It is a very hard combinatorial problem; rotation of the rectangles is not allowed. In 1997, Fekete and Schepers used interval graphs as a data structure to characterize classes of feasible packings, thereby avoiding the enumeration of symmetric solutions. Their algorithm made a significant jump in the size of the instances that can be handled.

During my PhD, I developed new combinatorial approaches based on various characterizations of interval graphs. A first algorithm is based on the enumeration of consecutive-ones matrices. Another uses labelled trees to eliminate symmetric packings more efficiently. These approaches turned out to outperform that of Fekete and Schepers on most classical instances from the literature.


Monday 11 April 2011

14.00 Paolo Ballarini, INRIA Rennes Bretagne Atlantique,

Hybrid Automata Stochastic Logic: expressive statistical verification of stochastic models

We introduce the Hybrid Automata Stochastic Logic (HASL), a new temporal logic formalism for the verification of discrete event stochastic processes (DESP). HASL employs Linear Hybrid Automata (LHA) as machinery to select prefixes of relevant execution paths of a DESP. The advantage of LHA is that rather elaborate information can be collected "on-the-fly" during path selection, providing the user with a powerful means to express sophisticated measures. A formula of HASL consists of an LHA (A) and an expression (Z) referring to moments of "path random variables". A (simulation-based) statistical engine is employed to obtain a confidence-interval estimate of the expected value of Z. In essence HASL provides a unifying verification framework where sophisticated temporal reasoning is naturally blended with elaborate reward-based analysis. In this talk I will illustrate the HASL approach and discuss its expressivity. I will also present preliminary empirical evidence obtained through COSMOS, a prototype software tool for HASL verification.

(slides in pdf)


Monday 28 March 2011

10.30 Yaakov Setty, Weizmann Institute, Rehovot, Israel,

Modeling Organogenesis: Scaling Development from Stem Cells to Organs

In recent years, we have used software engineering tools to develop reactive models to simulate and analyze the development of organs. The modeled systems embody highly complex and dynamic processes, by which a set of precursor stem cells proliferate, differentiate and move, to form a functioning tissue. Three organs from diverse evolutionary organisms have been thus modeled: the mouse pancreas, the C. elegans gonad, and partial rodent brain development. Analysis and execution of the models provided a dynamic representation of the development, anticipated known experimental results and proposed novel testable predictions. In my talk, I will discuss challenges, goals and achievements in this direction in science.


Tuesday 1 March 2011

10.00 Xuefeng Gao, University College Cork, Ireland,

A cell-based multiscale model for avascular tumor growth

How the growth dynamics of tumors is controlled by nutrients, biomechanical forces and other factors, at different stages and in different environments, is still largely unknown. We present a computer simulation of avascular tumor growth aimed at investigating the interaction between tumor morphology and the local environment. At the cellular level, a Glazier-Graner-Hogeweg (GGH) model describes cellular dynamics including cell proliferation, viability and adhesion. At the subcellular level, the expression of the protein p27 regulates the cell cycle. At the extracellular level, the diffusion of oxygen, glucose and hydrogen ions describes the chemical dynamics involved in metabolism. In the avascular phase, tumor cell proliferation depends on consuming oxygen and glucose from the pre-existing surrounding tissue. When the oxygen level drops below a threshold, the tumor cells become hypoxic and start anaerobic metabolism (glycolysis). Experimental evidence suggests that cancer cells undergo hypoxia-induced quiescence (the G0 phase of the cell cycle). We assume that this progression is affected by the protein p27, whose expression is upregulated under hypoxia and inhibits the activation of the cyclin-dependent kinases (CDKs), thus preventing DNA synthesis and regulating the cell cycle.



Wednesday 8 December 2010

11.00-12.00 (amphi Turing, building 1)

Radu Grosu, SUNY at Stony Brook, USA

Modeling and Analysis of Cardiac Tissue

We address the problem of modeling and analysis of emergent behavior in cardiac tissue, spiral electric waves in particular, a condition known as tachycardia and a precursor to cardiac fibrillation.

We first introduce a minimal nonlinear model of a single cardiac cell. For analysis, we transform this model to a multiaffine system and identify the parameter ranges for which the system lacks excitability. A region of unexcitable cells within cardiac tissue can be responsible for tachycardia or fibrillation: 1) The region becomes an obstacle in the way of the propagating electrical wave; 2) This triggers a spiral rotation of the wave (tachycardia); 3) The spiral may eventually break up into many spirals, a condition known as fibrillation.

We then address the problem of specifying and detecting spatial properties in cardiac tissue. To solve this problem we: (1) Apply discrete mode-abstraction to the single-cell automata described above; (2) Introduce the new concept of spatial-superposition of CLHA modes; (3) Develop a new spatial logic, based on spatial-superposition, for specifying emergent behavior; (4) Devise a new method for learning the formulae of this logic from the spatial patterns under investigation; and (5) Apply bounded model checking to detect (within milliseconds) the onset of spiral waves. We have implemented our methodology as the Emerald tool-suite, a component of our EHA framework for specification, simulation, analysis and control of excitable hybrid automata. We illustrate the effectiveness of our approach by applying Emerald to the scalar electrical fields produced by our CX simulator.


Friday 19 November 2010

14.00-15.00 Jean Krivine, CNRS PPS, University Paris 7

Epigenetics, aging and symmetry: or why DNA is not a program

Without committing to any controversial definition, one may describe aging as a continuous and irreversible process by which molecules, cells or organs accumulate changes over time. With this simple definition, the process by which embryonic stem cells progressively lose their pluripotency by differentiation and acquire fixed functions in various tissues can be viewed as an instance of the aging process. From a modeling perspective there is a need to find a mechanistic molecular implementation of this phenomenon, in the hope of better understanding age-related diseases or cancer. This model of cell development is strongly constrained by the fact that every cell of a single organism, be it a neuron or a stem cell, shares a copy of the same genetic code. It follows that the "program" that governs the function of a cell cannot be solely contained in the genes and has to be hidden in some "meta" instructions on top of these genes.

Recent technological breakthroughs in molecular biology are starting to reveal the structure of this meta-code that biologists call epigenetics. In this talk we will survey this revolutionary view of genes, cells and function with a computer-science oriented spirit, keeping in mind that we are facing an evolution-based toolkit for the treatment of genetic information that should most likely beget interesting concepts for computer science and informatics.

15.30-16.30 Zohra Khalis, CNRS I3S, Sophia Antipolis

Using Hoare logic for constraining parameters of discrete models of gene networks

The technology of DNA chips allows the quantification of the expression level of thousands of genes in the same organism at the same time. Nevertheless, the analysis of data from DNA chips requires the development of adapted methods.

We propose a path language that allows the description, in an abstract way, of the concentration level variations from temporal data like temporal profiles of gene expression. When concentration level variations have been expressed through a program of the path language, it becomes possible to apply methods from computer science like Hoare logic.

Hoare logic is a system composed of axioms and rules. It permits one to prove whether a program is correct with respect to its specification, which is described through assertions, i.e., logical formulas that define properties of the program. The precondition specifies the initial state before the execution of the program and the postcondition specifies the final state after the execution of the program. A program is said to be (partially) correct if it can be proved that, from an initial state satisfying the precondition, the program leads (if it terminates) to a final state satisfying the postcondition.

To model gene regulatory networks, the main difficulty remains the parameter identification problem, that is, the search for parameter values which lead to a model behavior that is consistent with the known or hypothetical properties of the system. So, we apply a Hoare-like logic designed for the defined path language and discrete modelling framework. The axioms and rules of this Hoare-like logic are adapted to gene networks and permit us to prove that the path described by the program exists in the dynamics. Given a path program and a postcondition, we can apply the calculus of the weakest precondition based on this Hoare-like logic. This calculus, thanks to the defined axioms and rules, permits us to constrain the parameters associated with the program and the postcondition. Although Hoare logic is well known, its application to constraining parameter values of gene networks appears to be brand new and helpful in order to select consistent models. Moreover, expressing DNA profiles as programs gives another way to process such data.
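As a reminder of the underlying machinery (a textbook example, not taken from the talk), a Hoare triple and the weakest precondition of a simple assignment read:

\[
\{P\}\; \mathit{prog}\; \{Q\}, \qquad \mathrm{wp}(x := x + 1,\; x \le n) \;=\; (x + 1 \le n),
\]

i.e. the weakest precondition of an assignment with respect to a postcondition is obtained by substituting the assigned expression into the postcondition; the talk applies this backward calculus to path programs over gene networks to derive constraints on parameters.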



Wednesday 31 March 2010

10.30 Adam Halasz, West Virginia University – Department of Mathematics

[Non]-Deterministic dynamics in cells: From multistability to stochastic switching.

I will discuss the interplay between deterministic dynamics and stochastic effects in live cells. The main example is the lac operon, a set of self-promoting genes, which, together with the enzymes they express, act as a hysteretic switch. Stochastic simulations of the same system predict spontaneous, random switching by individual cells. This behavior can be well described by a simple abstraction. In a broader context, I will review a few examples where stochasticity leads to useful and quantifiable behavior that amounts to a hedging strategy on the level of a population of cells.


Monday 1 March 2010

13.30 Vianney le Clément, UCL Louvain, Belgium

Constraint-based graph matching

Combinatorial graphs offer a rich structure for modelling various objects: networks, images, molecules, etc. To measure the similarity between two graphs, one can compute a matching between their nodes. Most graph matching problems are NP-hard. In this talk, we will see that they can be solved using constraint programming (CP) or local search (LS) techniques. We will also give an overview of ongoing work on CBGML, a declarative constraint-based modelling language whose goal is to automatically generate a CP or LS resolution algorithm based on the characteristics of the problem.


Wednesday 10 February 2010

10.30 Alexis Saurin, PPS Paris,

Focalisation in Ludics

The focalisation property is one of the essential results of the proof theory of linear logic, bringing to light the notion of polarity in logic. Focalisation has been the source of important developments, ranging from linear logic programming to interactive approaches to game semantics. Ludics, for its part, is a logical formalism introduced by Girard about ten years ago whose basic element is the notion of interaction.

Terui recently revisited Girard's theory of ludics, bringing new perspectives, notably on the computational side, under the name of computational ludics. I will present an interactive analysis of the focalisation theorem carried out in the framework of Terui's ludics, which can be seen as an internalisation of focalisation. I will also explain the motivations of this work, which stem from the remark that the focalisation property is connected to results in computational complexity theory.

(Joint work with Basaldella and Terui, RIMS, Kyoto)



Friday 27 November 2009

10.30 Christine Solnon, LIRIS Lyon, President of the AFPC (slides)

Measuring graph similarity.

Many tasks, such as classification and information retrieval, require measuring the similarity of objects. When the objects are modelled by graphs, this amounts to measuring graph similarity, i.e., matching the vertices of the graphs so as to recover as many common features as possible. We will first present the main existing graph similarity measures, ranging from (sub)graph isomorphism and edit distances to the search for maximum common subgraphs. We will then present the main resolution approaches used to solve these different problems. Finally, we will describe a constraint-based system for measuring graph similarity: this system integrates a high-level language allowing a user to define a similarity measure declaratively in terms of constraints, and an algorithm synthesizer generating a resolution algorithm adapted to the problem described.


Monday 9 November 2009

10.30 Nikolaus Hansen, LRI Orsay

The difficulties of black-box optimization and a stochastic variable metric approach.

Stochastic optimization techniques are recognized in practice as useful complements to classical gradient based optimization methods. Population-based stochastic optimizers, like Evolutionary Algorithms, are not as easily trapped into local optima as gradient based methods, because they operate synchronously with a set of solution points and do not exploit local gradient information.

This talk will start with defining a black-box optimization problem and discuss properties that make a function of n continuous variables difficult to optimize in practice. Then we will give an introduction into the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). The CMA-ES is a mature stochastic optimization method, provided with a second order model comparable to quasi-Newton methods in deterministic optimization. We will explain how CMA-ES addresses the discussed difficulties and, if time allows, we will compare the method with other deterministic and stochastic optimization methods.
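To give a flavour of stochastic black-box search, here is a deliberately simple (1+1)-ES with the classical 1/5th success rule (our own sketch, with a toy objective); it lacks the covariance matrix adaptation that CMA-ES adds on top and that the talk focuses on:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                      # toy black-box objective (to be minimized)
    return np.sum((x - 1.0) ** 2)

dim, sigma = 10, 1.0
x = np.zeros(dim)
fx = f(x)

for _ in range(2000):
    y = x + sigma * rng.standard_normal(dim)   # sample one offspring
    fy = f(y)
    success = fy < fx
    if success:
        x, fx = y, fy
    # 1/5th rule: enlarge the step size after a success, shrink it otherwise;
    # the 4:1 ratio of the log-factors targets a success rate of 1/5.
    sigma *= np.exp(0.2) if success else np.exp(-0.2 / 4)

print("best value:", fx, "final step size:", sigma)
```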


Wednesday 14 October 2009

14.00 Sylvain Pradalier, INRIA Paris-Rocquencourt

Modeling, Simulating and Analyzing nano-machines in the nano-kappa calculus.

The modeling of biochemical systems using process-algebra languages is now a well-established domain. The main goals of this domain are to help biologists and chemists model, simulate and analyze their systems by taking advantage of models expressed in a formal language. Typical applications are computer-based simulations, model checking or abstract interpretation.

We focus on the study of nano-machines. Nano-machines are molecular complexes synthesized by assembling various chemical groups in order to achieve a predetermined task. They typically transform chemical events into mechanical events. Because of their intrinsic concurrent and compositional nature, the modeling of such systems finds a natural candidate in process algebra.

I will present the nano-kappa calculus, a formalism dedicated to this task. Molecules are the basic agents of the computation. The stochastic dynamics of the calculus is specified in terms of reactions which create, destroy or flip bonds between molecules and possibly change their internal state. I will also present the modeling of the Rotaxane RaH nano-machine and the information provided by the simulations we performed. I will finally present an encoding from nano-kappa to the stochastic pi-calculus. We believe that reaction-based approaches like nano-kappa are easier to understand and learn than agent-based approaches, such as the pi-calculus, and moreover that they provide models that are easy to modify, reuse and compose. However, the theory of the pi-calculus is much better known, and it has more tools. The very strong correctness of this encoding should make it possible to use the nano-kappa calculus as a front-end language while keeping the benefits of the theory of traditional process algebra.


Friday 24 April 2009

10.30 Alexis Saurin, Université de Turin

Proof search by cut elimination.

The links between programming and proof theory (for instance in Gentzen's sequent calculus) are close, and are built mainly around two correspondences: (i) the correspondence between functional programming and cut elimination, well known as the Curry-Howard correspondence; (ii) the correspondence between proof search and logic programming. Both approaches are grounded in Gentzen's Hauptsatz, which asserts both (a) the possibility of transforming a proof with cuts into a cut-free proof (approach (i)) and (b) the admissibility of cut, i.e., that proof systems with and without cut have the same deductive strength (approach (ii)). Nevertheless, these two approaches have developed largely independently of each other. In this talk I will present a model of proof construction guided by a cut-elimination process. This construction is based on Girard's Ludics, which provides the ingredients needed for the construction. Informally, proof search unfolds as the proof under construction develops so as to answer tests ("objections") which act as proof-search instructions. We will then discuss the possibility of using this framework to formulate, in the interactive terms of Ludics, search-control mechanisms such as backtracking or the cut, which are difficult to model logically in the traditional approach (ii).


Tuesday 21 April 2009

10.30 Benoit Razet, INRIA Rocquencourt

Relational non-deterministic computation with Eilenberg machines.

Eilenberg machines are a model of non-deterministic computation based on relations, building on the finite automata of formal language theory. Samuel Eilenberg introduced this model in 1974 because it accounts for the main computational devices of automata theory, such as finite automata, transducers, pushdown automata, and also Turing machines. An Eilenberg machine is an automaton labelled by binary relations over an arbitrary set. By interpreting concatenation on words as composition of relations, an Eilenberg machine defines a union of compositions of relations. Since relations are closed under union, the model is said to be modular. Non-determinism appears at two levels in a machine: that of the graph (described by a non-deterministic automaton) and that of the relations (modelling a non-deterministic operation).

We propose a method for simulating Eilenberg machines using a reactive engine manipulating streams. We will detail this approach for finite Eilenberg machines, for which the formalisation has been carried out completely in the Coq proof assistant. We will then indicate how to extend this work to general Eilenberg machines, leading to a very general paradigm of relational programming.


Tuesday 17 March 2009

14.30 Nacim Ramdani, INRIA Coprin

Non-linear continuous reachability analysis using interval constraint programming

Hybrid systems are complex dynamical systems which involve the interaction of discrete and continuous dynamics, and are often components of safety-critical systems. Hence, computing reachable sets for hybrid systems is an important step when one addresses verification or synthesis tasks. A key issue then lies in the computation of the reachable set for continuous dynamics with nonlinear models, even more so when uncertainty is present. In this talk, I will show how to compute an overapproximation of the reachable set of uncertain nonlinear continuous dynamical systems by using guaranteed set integration, and I will introduce several ways to do so. The first relies on a full interval method which handles whole domains for set computation and relies on state-of-the-art validated numerical integration methods, via Taylor series and interval analysis. I will introduce both techniques and give illustrative examples using Nedialkov's VNODE software. The second way can be used with monotone systems only, but makes it possible to bracket the uncertain nonlinear system between two nonlinear dynamical systems with no uncertainty. I will give an overview of monotone systems and show how to use comparison principles for such systems to compute reachable sets. A third way relies on Muller's existence theorem for differential inequalities in order to bracket any uncertain dynamics between two dynamical systems with no uncertainty. Here again, I will recall this theorem and show how to use it for computing reachable sets. For the second and third ways, the obtained bracketing systems, which may be coupled, are often piecewise Ck-differentiable functions; then validated numerical integration methods cannot be used directly. Hence, our contribution resides in the use of hybrid automata to model the bounding systems. I give a rule for building these automata and show how to run them and address mode switching in a guaranteed way in order to compute the overapproximation of the reachable set. We also make use of solving methods for numerical constraint satisfaction problems. The computational cost of our method is analyzed and shown to be smaller than that of classical interval techniques. Sufficient conditions are given which ensure the practical stability of the enclosures given by our hybrid bounding method. Examples are given which show that the performance of our method is very promising.
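As a toy flavour of interval propagation (our own example, unrelated to the talk's case studies), the sketch below pushes an uncertain initial box through naive interval Euler steps for x' = -x. Real validated integrators such as VNODE additionally enclose the Taylor truncation error, which this sketch deliberately ignores.

```python
# Naive interval Euler propagation for x' = -x over [0, 1] with h = 0.1.

def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def interval_scale(c, a):           # multiply interval a by a scalar c
    lo, hi = c * a[0], c * a[1]
    return (min(lo, hi), max(lo, hi))

x = (0.9, 1.1)                      # uncertain initial condition
h = 0.1
for _ in range(10):
    x = interval_add(x, interval_scale(-h, x))   # [x] <- [x] + h * (-[x])

print(x)
# Prints a box much wider than the exact image exp(-1) * [0.9, 1.1] ~ [0.33, 0.40],
# illustrating the dependency/wrapping effect that validated methods must control.
```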


Wednesday 11 March 2009

11.15 Claudine Chaouiya, Instituto Gulbenkian de Ciência - Oeiras, Portugal

Qualitative Modelling of regulatory networks using High Level Petri nets: application to developmental processes

Great advances in molecular biology, genomics and functional genomics open the way to the delineation of regulatory maps controlling essential biological processes. To assess the behaviours induced by these large and complex networks, dedicated mathematical and computational tools are very much required. Here, we focus on pattern formation that relies on intracellular and intercellular interactions. We will briefly present two applications, which motivate our work, both related to Drosophila development. The first case deals with the regulatory processes underlying the dorsal appendage morphogenesis in the Drosophila egg. The segment-polarity module involved in the segmentation of the Drosophila embryo constitutes our second example. We rely on the logical formalism to model and analyse the qualitative behaviours of the regulatory networks. After a short presentation of the logical framework, we will show how such models can be represented by means of High Level Petri nets. This representation allows us to define compact models for regulatory modules (here, a module will constitute an intracellular network) and then to compose modules interacting through intercellular signals following a given neighbouring relation. At this stage, we provide a convenient way to settle models defined as the composition of modules that are crucial in patterning processes. Finally, implementation aspects will be discussed.

Joint work with Franck Pommereau (LACL, Université Paris 12) and Hanna Klaudel (IBISC Université d'Evry-Val d'Essonne)


Wednesday 11 March 2009

10.30 Aurélien Naldi, TAGC, Univ. de Marseille

A Reduction of logical regulatory graphs preserving essential dynamical properties

As ever larger biological molecular regulatory networks are delineated, standard dynamical modelling and analysis approaches are reaching their limits. To cope with the increasing complexity of these networks, we have defined a reduction method for multi-valued logical models. Starting with a detailed model, this method enables the computation of a reduced model by iteratively "hiding" regulatory components. To keep a consistent behaviour, the regulators of the hidden nodes become regulators of their targets, for which new logical functions are determined.

The construction of reduced models ensures the preservation of a number of dynamical properties of the original model. In particular, stable states and more complex attractors are conserved. More generally, we address the question of the relationship between the attractor configuration of the original model and that of the reduced model, along with the issue of attractor reachability.

This novel reduction method has been implemented in our software GINsim, and will be illustrated through its application to models of networks involved in the control of helper T lymphocyte differentiation.

Joint work with E. Remy, D. Thieffry, C. Chaouiya


Friday 6 March 2009

10.30 Fabien Corblin, IMAG Grenoble

Design and implementation of a declarative tool for the analysis of discrete genetic networks

There is a growing demand in biology for tools to build and decipher the genetic networks controlling cellular processes. We argue that the declarative approach is relevant and applicable to answer biologists' questions about these networks, which are in general only partially known. The main idea is to model knowledge about both the structure and the dynamics of a network by a set of constraints representing the set of solutions, to check its consistency, to repair a possible inconsistency by automatic relaxation, and to infer properties of the structure and dynamics of the network. To show the feasibility of the approach, we formalise R. Thomas's discrete networks and the relevant biological properties, propose a tool based on constraint logic programming in cooperation with a SAT solver, and validate it on significant biological applications.

14.00 Benjamin Pfeuty, ERATO Complex Systems Biology, University of Tokyo

Modeling G1 phase of mammalian cell cycle: cell fate determination and cancer onset

Upon their exit from mitosis, mammalian cells enter a G1 phase during which they acutely sense all sorts of environmental stimuli. On the basis of these signals that they first need to decipher and integrate, they decide whether to undergo division, quiescence, differentiation, senescence or apoptosis. Failures in this process may lead to cancer.

In this study, I questioned whether, despite the complexity of the G1 regulatory network, simple organizing principles might be identified that could explain how specific input signals are converted into appropriate cell fates. For this purpose, a mathematical model of the G1 regulatory network has been formulated using a simplified description of activities linked to signal transduction, cell growth, cell division and cell death. Bifurcation analysis of the model revealed the existence of multistability between several attractor states corresponding to G0-arrest, G1-arrest, S-phase entry and apoptosis cell fates.

I will discuss the results of this modeling study from different perspectives: (i) from a modeler's perspective, how to design a model that captures the qualitative behavior of a certain biological process or system; (ii) from an engineer's perspective, how feedback and feedforward loops are combined within a regulatory network to drive the signal-dependent switch between distinct cellular processes; (iii) from a biologist's perspective, how the control of cell fate determination has evolved in multicellular organisms and how it could fail, thereby enabling the emergence of cancers.



Tuesday 13 May 2008

10.30 Hugues Berry, INRIA Saclay

Individual-based simulations for cell biochemistry in crowded environments

Standard modeling work in cell biochemistry is usually based on mean-field equations, most often referred to as "laws of mass action". Yet, the derivation of these laws rests on strict assumptions. In particular, the reaction medium must be dilute, perfectly mixed, three-dimensional and spatially homogeneous, and the resulting kinetics are purely deterministic. Many of these assumptions may be violated in cells. Notably, single-cell measurements show that protein diffusion in most compartments (including cytosol, nucleus and membranes) is subdiffusive (anomalous diffusion). This phenomenon is mainly caused by physical obstruction of diffusion by large obstacles, which actually abound in cells (organelles, internal networks, large macromolecular complexes...). Fundamentally, this experimental observation tells us that diffusion in cells is not perfectly mixing and that cellular media can be considered spatially inhomogeneous. My objective is thus to try to evaluate what kinds of behaviors can be expected that are not accounted for by the laws of mass action. To simulate these effects, usual stochastic kinetic approaches such as Gillespie's algorithms are not well suited, because they explicitly assume perfect mixing and do not take physical obstruction into account. Conversely, lattice gas automata or individual-based modeling appear to be the most appropriate tools. I will present a first example concerning simple enzyme reactions in cell membranes with molecular crowding. Individual-based simulations indicate that the classical mean-field formalism for these reactions (Michaelis-Menten kinetics) falls short of describing the kinetics. Conversely, so-called "fractal" kinetics can be used to describe the observed behaviors. Furthermore, the simulations uncover a spontaneous tendency to segregate reaction substrates and products within distinct spatial zones of the membrane, i.e. a phenomenon similar to morphogenesis. Finally, I will present current perspectives and ongoing work concerning the modeling of aging and protein aggregation in E. coli (with F. Taddei and A. Lindner, U571 INSERM, Necker, Paris) and the development of hybrid discrete/continuous simulation methods.

Joint work with Olivier Michel, Ibisc Lab., University of Evry, Genopole, and A. Lesne, IHES, Bures-sur-Yvette and LPTMC, Univ. P & M. Curie, Paris, France.
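A minimal individual-based sketch in the spirit described above (parameters and setup are our own illustration, not the talk's model): random walkers on a 2D lattice with immobile obstacles, whose mean-squared displacement can be compared with the obstacle-free case.

```python
import numpy as np

rng = np.random.default_rng(2)
L, phi, n_walkers, n_steps = 100, 0.35, 200, 500   # lattice size, obstacle fraction, walkers, steps
blocked = rng.random((L, L)) < phi                 # immobile obstacles (periodic field)

free = np.argwhere(~blocked)                       # obstacle-free sites
start = free[rng.choice(len(free), n_walkers)]     # non-interacting walkers start on free sites
pos = start.copy()
moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])

for _ in range(n_steps):
    trial = pos + moves[rng.integers(0, 4, n_walkers)]
    ok = ~blocked[trial[:, 0] % L, trial[:, 1] % L]   # reject moves onto obstacles
    pos[ok] = trial[ok]

msd = np.mean(np.sum((pos - start) ** 2, axis=1))
print("mean-squared displacement after", n_steps, "steps:", msd)
# With phi = 0 the MSD is close to n_steps (ordinary diffusion); with crowding it
# grows markedly more slowly, the obstruction effect discussed in the abstract.
```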


Friday 15 February 2008

11.00 Antoine Spicher, LORIA

Spatial Programming of an Amorphous Biological Medium

The MGS project is aimed at studying the use of topological notions for programming languages and their applications in developing new data and control structures for the simulation of dynamical systems with a dynamical structure. These goals have led, at a theoretical level, to the development of an approach based on two new concepts:

   1. the topological collections, a new kind of data structure relying on the notion of cellular complex developed in algebraic topology, and

   2. the transformations, an original kind of case-based definition of functions on topological collections.

These concepts are implemented in an experimental programming language dedicated to the specification of dynamical systems: the MGS language.

My thesis work is organized along three directions:

   1. developing the notion of topological collection of arbitrary dimension,

   2. specifying a formal semantics for MGS, especially for transformations, and finally

   3. validating this work with numerous non-trivial examples in biology and morphogenesis.

These applications are mostly examples of dynamical systems with a dynamical structure: the simulation of cellular motility by adaptive mesh subdivision, the modeling of neurulation, which exhibits a topological modification, the exact stochastic simulation of chemical models, the modeling of plant growth coupling hormone diffusion and cellular differentiation, declarative implementations of complex CAD operations like mesh refinement...

During the presentation I will start with an overview of these works. Then I will focus on a particular example where a specific rule application strategy based on the Gillespie stochastic simulation algorithm is used to model chemical reactions. Finally I will present a perspective that consists in developing a formalism to specify and analyze a global spatial behavior, to compile it into the local behaviors of the entities of a population, and to take advantage of emergent properties. This project relies on recent results in different domains like spatial programming, amorphous computing and synthetic biology.


Monday 11 February 2008

10.30 Jean Krivine, LIX, Palaiseau

A platform for the simulation and analysis of bio-molecular systems in the kappa-calculus

Joint work with Vincent Danos (Laboratoire Preuves Programmes et Systèmes, CNRS), Jerome Feret (Ecole Normale Supérieure, Paris) and Walter Fontana (Systems Biology Department, Harvard Medical School, USA).

In this talk we will present a platform under development at Plectix Biosystems (Cambridge, US) for analysing and simulating large bio-molecular systems described in the kappa-calculus of Danos and Laneve [1]. We will first introduce the basic concepts of modelling in the kappa-calculus [2], and then present a simulation algorithm that produces stochastic executions of a kappa system with a cost per computation event (rule application and update of the system) that is constant and independent of the number of simulated agents [3].

[1] Formal Molecular Biology, Danos and Laneve. TCS 325, 2004.

[2] Rule-based modelling of cellular signalling, Vincent Danos, Jerome Feret, Walter Fontana, Russell Harmer, Jean Krivine. CONCUR'07.

[3] Scalable simulation of cellular signaling networks, Vincent Danos, Jerome Feret, Walter Fontana, Jean Krivine. APLAS'07.


Friday January 25th 2008

10.00 Jacques-Alexandre Sepulchre, INLN CNRS, Nice

Retroactivity effects in intracellular signaling cascades

Classically, a signaling pathway consists of a cascade of elementary enzymatic reactions in which, at each stage, a protein is activated by a covalent modification induced by the previous reaction. Moreover, once activated, this protein can act as an enzyme for the next reaction. In this context, the best-known example of covalent modification is the phosphorylation/dephosphorylation process, in which the protein is generally activated by the addition of a phosphate group and deactivated by its removal. A well-known example in this context is that of MAPK (Mitogen-Activated Protein Kinase) cascades. Starting from a complete description of a cascade of elementary cycles, written in terms of mass-action kinetics, we have derived a rigorous approximation of the kinetic equations of the chain in which each cycle is described by a single variable. The system of equations we obtain differs from the one that has been used for several years in the literature, which is a purely phenomenological extension of the Goldbeter-Koshland model. Our approach reveals new properties which, although implicitly contained in the complete model, were absent from the phenomenological approximation. A key property appearing in our set of equations is the existence of a retroactive effect exerted by each cycle of the chain on its predecessor. This retroactivity is not only at the origin of damped temporal oscillations predicted by the model; it also shows that a biochemical signal can propagate not only downstream in the chain, but also upstream. This property, which is also found in the complete system, calls into question the implicit idea that a signaling pathway is unidirectional.
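
To make the starting point concrete, here is a minimal Python sketch of the classical phenomenological description that the talk contrasts with: a two-stage cascade in which each cycle is written in Goldbeter-Koshland (Michaelis-Menten) form, with purely illustrative parameter values; by construction this reduced form contains no retroactivity from a cycle to its predecessor:

# Two-stage phosphorylation cascade, phenomenological form (illustrative).
# x1, x2: fractions of active protein at stages 1 and 2; u: input stimulus.
def cascade_step(x1, x2, u, dt):
    V1a, V1i, V2a, V2i = 1.0, 0.5, 1.0, 0.5   # maximal rates (made up)
    K = 0.1                                    # Michaelis constants (made up)
    dx1 = u * V1a * (1 - x1) / (K + 1 - x1) - V1i * x1 / (K + x1)
    dx2 = x1 * V2a * (1 - x2) / (K + 1 - x2) - V2i * x2 / (K + x2)
    return x1 + dt * dx1, x2 + dt * dx2

x1 = x2 = 0.0
for _ in range(5000):                  # simple Euler integration
    x1, x2 = cascade_step(x1, x2, u=0.3, dt=0.01)
print(f"steady-state active fractions: x1={x1:.3f}, x2={x2:.3f}")

Note that in this reduced form x2 never appears in the equation for x1: whatever happens downstream cannot feed back upstream, which is precisely the assumption the mass-action derivation discussed in the talk revisits.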



Thursday October 4th 2007

10.30 Carlos Olarte and Frank Valencia, LIX, Ecole Polytechnique

A Process Calculus for Universal Concurrent Constraint Programming: Semantics, Logic and Application

We introduce the Universal Timed Concurrent Constraint Programming (utcc) process calculus, a generalisation of Timed Concurrent Constraint Programming. The utcc calculus allows for the specification of mobile behaviours in the sense of Milner's pi-calculus: generation and communication of private links. We first endow utcc with an operational semantics and then with a symbolic semantics to deal with problematic operational aspects involving infinitely many substitutions and divergent internal computations. The novelty of the symbolic semantics is to use temporal constraints to finitely represent infinitely many substitutions. We also show that utcc has a strong connection with Pnueli's temporal logic. This connection can be used to prove reachability properties of utcc processes. As a compelling example, we use utcc to exhibit the secrecy flaw of the Needham-Schroeder security protocol.


Tuesday July 31st 2007

14.00 Ludovic Langevine, Mission Critical, Belgium

Ontology Driven Software Engineering for Real Life Applications

In the development of any large-scale software system, the Business has a vision for a project to transform the company. However, the vision is informally and incompletely specified and subject to frequent changes, which leads to the Business-IT gap. This gap makes it difficult for IT to give a reasonable estimate of the time and cost of the project, and greatly impacts the business case. Our experience shows that the gap can be bridged by describing the business knowledge in a formal language understandable by the Business and consumable by computer programs.

The Web Ontology Language, OWL, was developed to facilitate greater machine interpretability of human knowledge by providing additional vocabulary along with formal semantics. OWL ontologies can be enriched with business rules using the Semantic Web Rule Language, SWRL. These languages form a knowledge continuum between Business and IT, and provide a mechanism by which the Business can drive the evolution of the project by proposing concrete changes to the ontology.

In this talk, we will describe the ODASE platform (Ontology Driven Architecture for Software Engineering) and demonstrate some pedagogical applications. We will show how automatic code generation, on top of a formal description of the Business domain, bridges the Business-IT gap, increases the quality and the flexibility of the application, and reduces its development and maintenance cost.


Tuesday July 17th 2007

10h00 Jacques Robin, University of Recife, Brazil.

Towards CHR components: preliminary ideas on implementing adaptive solvers for the built-in constraints of a CHR base, as another CHR base

CHR was conceived as an extension of a host programming platform with constraint solving rules. Any CHR base is thus a client of such a host platform, which it uses as a server to solve the so-called built-in constraints whose solving is needed by the CHR base but for which there is no definition in that base (i.e., these constraints appear only in rule guards and bodies of the CHR base but not in any rule head). This single client-server relationship between a CHR base and a host programming platform can be generalized into N client-server relationships in a component assembly of CHR bases. In such an assembly, the constraints that appear only in the guards and bodies of a client component CHR base appear in the heads of its server component CHR bases. We suggest some simple methods for the server CHR base to be used to check the rule guard entailment conditions of its client CHR bases. These methods could serve as the basis to extend CHR with software components, allowing low-cost engineering of large-scale and partially reusable CHR applications.


Tuesday July 10th 2007

10h00 Jacques Robin, University of Recife, Brazil.

CHROME: a Component-Based Aspect and Object Oriented Model-Driven Architecture for a CHRv Engine

CHROME is an ongoing effort to integrate cutting-edge reuse-fostering software engineering techniques such as components, aspects, and object-oriented model transformations, and to apply them to the development of an easily reusable and extendable adaptive inference engine for the knowledge representation language Constraint Handling Rules with Disjunctive bodies (CHRv). Following model-driven engineering, a fully refined platform-independent model of the engine is first specified in UML2 and then automatically transformed in several steps into source code for mainstream component-based object-oriented imperative platforms such as Java OSGi or C#.Net, by way of model transformations implemented in INRIA's Atlas Transformation Language. By leveraging the high expressivity and versatility of CHRv, CHROME aims to constitute the base building block of a component framework offering, in an integrated way, a variety of automated reasoning services such as deductive entailment and satisfiability, individual classification and concept subsumption, various forms of inheritance, default reasoning, abductive explanation generation, truth maintenance, belief revision, belief update, temporal projection, planning and constraint solving in any domain.


Thursday April 5th 2007

13h30 Samuel Bernard, Humboldt University, Berlin, Germany.

Synchronization-Induced Rhythmicity of Circadian Oscillators in the Suprachiasmatic Nucleus

The suprachiasmatic nuclei (SCN) host a robust, self-sustained circadian pacemaker that coordinates physiological rhythms with the daily changes in the environment. Neuronal clocks within the SCN form a heterogeneous network that must synchronize to maintain timekeeping activity. Coherent circadian output of the SCN tissue is established by intercellular signaling factors, such as vasoactive intestinal polypeptide (VIP). It was recently shown that besides coordinating cells, the synchronization factors play a crucial role in the sustenance of intrinsic cellular rhythmicity. Disruption of intercellular signalling abolishes sustained rhythmicity in a majority of neurons and desynchronizes the remaining rhythmic neurons. Based on these observations, we propose a model for the synchronization of circadian oscillators that combines intracellular and intercellular dynamics at the single-cell level. The model is a heterogeneous network of circadian neuronal oscillators where individual oscillators are damped rather than self-sustained. We simulated different experimental conditions and found that: (i) In normal, constant conditions, coupled circadian oscillators quickly synchronize and produce a coherent output. (ii) In large populations, such oscillators either synchronize or gradually lose rhythmicity, but do not run out of phase, demonstrating that rhythmicity and synchrony are co-dependent. (iii) The number of oscillators and connectivity are important for these synchronization properties. (iv) Slow oscillators have a higher impact on the period in mixed populations. (v) Coupled circadian oscillators can be efficiently entrained by light-dark cycles.

Based on these results, we predict that: (i) SCN neurons need a periodic synchronization signal to be rhythmic. (ii) A small number of neurons or a low connectivity results in desynchrony. (iii) Amplitudes and phases of neurons are negatively correlated. We conclude that to understand the orchestration of timekeeping in the SCN, intracellular circadian clocks cannot be isolated from their intercellular communication.
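
As an aside on how synchrony is typically quantified in this kind of population simulation (a generic Kuramoto-style sketch with invented parameters, not the damped-oscillator model of the talk), the following Python fragment couples heterogeneous phase oscillators through their mean field and reports the order parameter r, which is close to 1 for a coherent population and close to 0 for an incoherent one:

import cmath, math, random

random.seed(2)
N, DT, K = 100, 0.01, 0.5
# Intrinsic periods scattered around 24 h (heterogeneous population).
omegas = [2 * math.pi / random.gauss(24.0, 1.5) for _ in range(N)]
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def order_parameter(ph):
    # Kuramoto order parameter: modulus of the mean of exp(i * phase).
    return abs(sum(cmath.exp(1j * p) for p in ph)) / len(ph)

for _ in range(int(120 / DT)):            # five days of simulated time
    z = sum(cmath.exp(1j * p) for p in phases) / N
    r, psi = abs(z), cmath.phase(z)       # mean-field amplitude and phase
    phases = [p + DT * (w + K * r * math.sin(psi - p))
              for p, w in zip(phases, omegas)]

print(f"order parameter after coupling: {order_parameter(phases):.2f}")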


Wednesday April 4th 2007

10h30 David Angeli, University of Firenze, Italy & INRIA Rocquencourt

Structural monotonicity of chemical reaction networks

Systems biology is a cross-disciplinary field which aims at understanding cell life at the molecular level, viz. by systematically understanding and identifying the complex chemical interactions that take place within individual cells and between neighboring ones. From the point of view of a systems scientist this poses remarkable challenges both for modeling and for the qualitative understanding of the possible dynamical behaviours which such models may exhibit. In this talk we try to analyze the dynamical behaviours of chemical reaction networks by taking into account the huge uncertainties that such models often entail. We seek qualitative criteria which may apply to ODE models regardless of specific parameter values and which still provide useful tools for the study of their dynamics. Technical tools used range from monotonicity theory to Petri nets and graph theory.

Biography: David Angeli graduated in 1996 from the University of Florence in Control Engineering, and later obtained his PhD degree in Automation from the Department of Systems and Computer Science of the same university. He is currently an Associate Professor in Florence as well as a visiting fellow at INRIA Rocquencourt, Paris. His research interests include nonlinear stability analysis, chemical reaction networks, and monotone dynamical systems.


Wednesday March 14th 2007

10h30 Monika Heiner, Cottbus Technical University, Germany & INRIA Rocquencourt

Modelling and Analysis of Biochemical Networks - a Petri Net Perspective on Systems Biology

Petri nets are well known for an intuitive and executable modelling style, supporting at the same time exhaustive analysis techniques. Their success stories cover application areas in various engineering disciplines, where they have proven to be useful for the sound construction of technical systems. In this talk we bridge the gap from technical systems to natural systems and demonstrate how well-established techniques can be reused for (some of) the challenges current systems biology is confronted with. No prerequisite knowledge is assumed.


Tuesday January 9th 2007

11h00 Grégory Batt, Boston University

Combining discrete abstraction and model checking for the analysis of partially-known models of natural and synthetic gene networks

Natural gene networks are complex networks of interregulating genes that control at the molecular level the development and functioning of living organisms. Synthetic gene networks are engineered networks that confer a desired behavior to a living organism. Understanding the functioning of these networks is a central problem in systems biology and in bioengineering. However, their complex dynamics, due to the presence of feedback loops and non-linear phenomena, often defeats intuitive understanding. Moreover, the lack of precise information on parameter values hampers a model-based analysis or design of these systems.

In this talk, I will present two recently-developed methods for the verification of the dynamical properties of gene networks under parameter uncertainty. Both methods combine discrete abstraction and model checking. Discrete abstraction is used to obtain a discrete representation of the dynamics of the system in the state space for sets of parameters, and model checking is used for the verification of the dynamical properties of the abstract, discrete system. The approximations obtained by discrete abstraction are conservative. This guarantees that the properties satisfied by the abstract system also hold for the original one.

The above strategy is applied to the analysis of two different types of models, namely piecewise affine and piecewise multiaffine differential equation models. Both types of models capture essential features of genetic regulations but with different levels of detail. A coarse-grained, qualitative approach is used for the analysis of piecewise affine models. This method is well-adapted to the essentially qualitative information available on natural gene networks. A finer-grained, quantitative approach is used for the analysis of piecewise multiaffine models, well-adapted to the better-known synthetic gene networks. Though quantitative, this approach can deal with uncertain parameters specified by intervals.

The practical applicability and biological relevance of the proposed approaches are illustrated by means of two examples: the validation of a model of the nutritional stress response and the tuning of a synthetic transcriptional cascade, both in the bacterium E. coli.

Work done in collaboration with Hidde de Jong (INRIA), Johannes Geiselmann (UJF, Grenoble), Jean-Luc Gouzé (INRIA), Radu Mateescu (INRIA), Michel Page (UPMF, Grenoble), Delphine Ropers (INRIA), Tewfik Sari (UHA, Mulhouse) and Dominique Schneider (UJF, Grenoble), and with Calin Belta (Boston University), Boyan Yordanov (Boston University) and Ron Weiss (Princeton University).
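
To give a flavour of the overall recipe (discrete abstraction followed by model checking), here is a deliberately tiny Python sketch over a hand-written abstraction of a hypothetical two-gene network; in the methods presented in the talk the transition relation is of course derived conservatively from the differential equation models, and properties are checked with a real model checker rather than a plain graph search:

from collections import deque

# Qualitative states: (gene_a_level, gene_b_level) with levels "low"/"high".
# The transition relation below is invented for illustration only.
transitions = {
    ("low", "low"):   [("high", "low")],
    ("high", "low"):  [("high", "high")],
    ("high", "high"): [("low", "high"), ("high", "high")],
    ("low", "high"):  [("low", "low")],
}

def reachable(start, goal):
    # Check the reachability property EF(goal) by breadth-first search.
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        if s == goal:
            return True
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return False

print(reachable(("low", "low"), ("low", "high")))   # True for this toy graph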



Thursday December 14th 2006

11h00 Khalil Djelloul, University of Ulm, Germany

CHR and first-order constraints over trees

CHR (Constraint Handling Rules) is a rule-based logic programming language created by Thom Fruehwirth in the 1990s, whose execution is governed by a refined semantics. One of the first published CHR solvers is the famous "CHR-Term-Unification-Algorithm" presented by Thom Fruehwirth in 1993 and revised in 1998. This elegant solver, made of 4 rewrite rules, solves conjunctions of equations between possibly non-flat terms (terms containing several nested function symbols) in the algebra of rational trees. It is also of clear practical interest, since it performs term unification: the very heart of Prolog and of several other logic programming languages. However, it relies on an order "<" on terms and variables, and its complexity has remained a conjecture for more than ten years.

In this talk we show that this solver has exponential complexity (in time and space) whatever the chosen order "<", and we deduce from this proof a quadratic complexity O(n²) in the case where the terms of the equations contain at most one function symbol. We then extend this solver with a few CHR rules so that it can solve, with quadratic complexity, any quantified conjunction of equations between non-flat terms in the theory of finite or infinite trees. We end the talk by presenting a CHR implementation of a first-order constraint solver in an extended theory "T" of finite or infinite trees, giving clear and explicit solutions to first-order constraint satisfaction problems in "T". To this end, we first extend Maher's (1988) signature and axiomatization of the theory of finite or infinite trees with a relation fini(x) that constrains the term "x" to be a finite tree. We then present our CHR solver and show, among other things, how to model and solve a first-order constraint in "T" with CHR rules. Finally, we compare the performance of our CHR solver with that obtained using a C++ implementation of this solver and of our recent decision procedure dedicated to decomposable theories.


Wednesday October 25th 2006

11h00 Andras Kovacs, Post Doc ERCIM

Capacitated lot-sizing with sequence-dependent setups

In this talk, I will give a brief review of different models of the lot-sizing and scheduling problem, and introduce our novel results on modelling the common practical requirement of sequence-dependent setups. Although lot-sizing problems have been studied in Operations Research for many years, the latter requirement still presents a challenge for current optimization techniques.

I will present a novel MIP formulation of the single-machine capacitated lot-sizing and scheduling problem with sequence-dependent setup times and setup costs. The model is based on determining during pre-processing all item sequences that can appear in given time periods in optimal solutions. The sequences are used in a mixed-integer program, the parameters of which are generated by a heuristic procedure in order to establish a tight formulation. This model allows us to solve significantly larger problems than were tractable by earlier optimal solution methods, often achieving orders-of-magnitude speedups.


Friday October 6th 2006 (conference room, Building 11)

11h00 Jacques Robin, Universidade Federal de Pernambuco, Brazil.

Towards a Model-Driven Automated Reasoning Component Framework Based on Constraint Handling Rules.

Over the last two decades automated reasoning techniques have tremendously diversified while becoming far better understood, formally founded and scalable. They nevertheless remain highly underused in the larger-than-ever number of mainstream software services to which they could bring a decisive competitive edge. This is mostly due to the fact that the high level of conceptual reuse existing among these techniques has not been leveraged in the software architectures of available automated reasoning systems, making them overly difficult and costly to use, reuse, extend, customize and integrate in larger systems. The ongoing ORCAS project, a cooperation involving several research groups, addresses this issue. It integrates four model-driven software reuse approaches (task model reformulation, component model assembly, aspect model weaving, and code generation) to develop a framework of reusable automated reasoning components. It leverages and integrates UML2 and related OMG standards for high-level modeling, Constraint Handling Rules (CHR) for formal knowledge representation and Java for executable code deployment.


Wednesday February 22nd 2006

14h00 K. Sriram, Epigenomics, Genopole Evry

I. Delay model for mammalian circadian rhythms and

II. Mathematical model of the dynamics of the cyclical organization of simple protein networks and its application to the budding yeast cell cycle.

The talk consists of two parts: a discrete delay model for mammalian circadian rhythms will be dealt with in the first part, and a dynamical analysis of the cyclical organization of protein networks and its application to the cell division cycle will be presented in the second part. A brief abstract is given below.

Part I: A three-variable discrete delay model is proposed for the circadian rhythm of mammals, with the BMAL1, PER-CRY complex and REV-ERBα protein concentrations as the dynamical variables. The delay model is phenomenological in nature rather than a precise description of all the underlying complex processes. The goal is to study the effects of the delays, which appear in both the positive and negative feedback loops of the model, on the circadian rhythms of mammals. The delay model exhibits 24 h limit cycle oscillations, entrainment to light-dark (LD) cycles and phase response curves. The model is also found to exhibit quasiperiodic and chaotic oscillations under LD cycles when the delay is varied. These are linked to non-24-hour sleep-wake syndrome and cancer incidence. The delay model, in essence, captures the core mechanism of the mammalian circadian rhythms with fewer variables and parameters.

Part II: The dynamics of the cyclical organization of simple protein networks and its application to the budding yeast cell cycle are studied by constructing nonlinear models. The protein network consists of two small cyclical loops, where each loop, in the absence of interaction with the other, exhibits a different dynamical behavior. Bistability is exhibited by one loop and limit cycle oscillations are exhibited by the second loop. Coupling both cyclical loops by a positive feedback loop displays complex behavior such as multistability and the coexistence of a limit cycle and multiple steady states. The coupling of the two cyclical loops by the positive feedback loop brings in the notion of a checkpoint in the model. The model also exhibits domino-like behavior, where limit cycle oscillations take place in a stepwise fashion. As an application, the events that govern the cell cycle of budding yeast are considered. In budding yeast, the feedback interactions among the important transcription factors, cyclins and their inhibitors in the G1, S-G2 and M phases are considered for the construction of the biological circuit diagram. Surprisingly, the sequential activation of the transcription factors, cyclins and their inhibitors forms two independent cyclical loops, with the transcription factors involved in cyclic positive regulation in the clockwise direction, while the cyclins and their inhibitors are involved in negative regulation in the anticlockwise direction. The coupling of the transcription factors and the cyclins and their inhibitors by positive feedback loops generates a rich bifurcation diagram that can be related to the different events in the G1, S-G2 and M phases in terms of dynamical systems theory. The different checkpoints in the cell cycle are accounted for by appropriately silencing the positive feedback loops that couple the transcription factors and the cyclins and their inhibitors.
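
As a side note on the mechanics of Part I above: simulating a delay model only requires keeping a history buffer of past values. The following Python sketch integrates a generic one-variable delayed negative-feedback loop with the Euler method (the equation and parameter values are invented for illustration and are not the talk's three-variable model); for a sufficiently long delay this loop produces sustained oscillations:

# DT: Euler time step (hours); TAU: delay (hours); the rest are made-up
# production and degradation constants and a Hill coefficient.
DT, TAU = 0.01, 8.0
K1, K2, N = 1.0, 0.2, 10

delay_steps = int(TAU / DT)
history = [0.1] * delay_steps     # constant initial history: x(t) = 0.1 for t <= 0
x = 0.1

for step in range(int(120.0 / DT)):          # simulate 120 hours
    x_delayed = history[step % delay_steps]  # approximately x(t - TAU)
    history[step % delay_steps] = x          # record x(t) for reuse at t + TAU
    x += DT * (K1 / (1.0 + x_delayed ** N) - K2 * x)
    if step % int(12.0 / DT) == 0:
        print(f"t = {step * DT:5.1f} h   x = {x:.3f}")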


Wednesday February 8th 2006

14h00 Eric Reiter, INRA Tours

Cell signaling mechanisms induced by G-protein-coupled receptors: the example of the follicle-stimulating hormone (FSH) receptor.

G-protein-coupled receptors (GPCRs) are by far the most abundant and most versatile family of membrane receptors in mammals. GPCRs are crucial for all major physiological functions: sensory perception (vision, smell, nociception, ...), neurotransmission, endocrine systems, etc. For this reason, GPCRs are directly targeted by more than half of the drugs currently in use. Understanding the general and specific mechanisms that govern the cell signaling of these receptors, or of some of them, remains a central question in biology. Only three families of proteins are able to recognize and interact specifically with GPCRs once they have been activated by their ligand: heterotrimeric G proteins, GPCR kinases (GRKs) and beta-arrestins. Over the last 30 years, the concepts concerning GPCR transduction and pharmacology have derived from the role played by heterotrimeric G proteins alone.

In the first part of this seminar, I will discuss the novel signaling functions of GRKs and beta-arrestins. I will compare the respective contributions of the G-protein-dependent and beta-arrestin-dependent pathways in the activation of the ERK MAP kinases by different GPCRs, the modulating role of GRKs, as well as the respective sensitivities of the two ERK activation mechanisms. In a second part, I will explain in more detail our research questions on the FSH receptor, as well as the outline of the "high-throughput" signaling project that we want to set up.


Friday February 3rd 2006

10h30 Francis Lévi, INSERM Villejuif,

Therapeutic implications of the interactions between the cell cycle and circadian rhythms in oncology


Thursday January 26th 2006 (conference room, Building 11)

10h30 Nicolas Beldiceanu, Ecole des Mines de Nantes,

Graph-Based Filtering

Joint work with: M. Carlsson, T. Petit, J.-X. Rampon, G. Rochart, C. Truchet

Over the past 20 years, global (combinatorial) constraints have gradually been introduced within the area of constraint programming for dealing with concrete problems. Efficient filtering algorithms, usually based on graph theory, were developed in an ad hoc way for each global constraint.

This talk presents a systematic approach which aims at providing generic filtering algorithms for global constraints. Given a specification of a global constraint C in terms of graph properties, we show how to derive a filtering algorithm for C (i.e., check feasibility and eliminate infeasible choices). For this purpose we present three complementary methods:

(1) The first method is based on the bounds of the graph properties used in the description of a global constraint. We provide lower and upper bounds for frequently used graph properties (number of arcs, vertices, connected components, strongly connected components, sinks).

(2) The second method shows how to determine the status of vertices and arcs of an intermediate digraph so that the final digraph does not contain more than (resp. does not contain fewer than) a given fixed number of arcs, vertices, connected components, strongly connected components, or sinks.

(3) The third method presents a database of about 200 graph invariants for systematically deriving necessary conditions when a global constraint is defined by more than one graph property.

We conclude with different questions raised by this approach.

14h00 Nicolas Beldiceanu, Ecole des Mines de Nantes,

Tree constraints and applications to phylogeny and path problems

Joint work with: Xavier Lorca, Pierre Flener, Irit Katriel

The first part gives a general overview of constraints for covering a graph with trees, both in the case of directed graphs and in the case of undirected graphs. We present three tree-covering constraints and show how to enforce hybrid consistency (hybrid because both set variables and finite-domain variables are used) for these three constraints. The second part focuses, in the directed case, on taking into account additional constraints (precedence, incomparability, and maximum degree constraints) encountered in practical applications. It gives the results obtained on phylogeny and path-finding problems.


Monday-Tuesday January 23-24 2006 (Institut Henri Poincaré, Paris)

Abstraction, modularity and compositionality in gene and protein networks

Meeting of the ACI VicAnne and kick-off meeting of the ARC MOCA


Wednesday January 11th 2006:

14h00 Stephanie Spranger, Ludwig Maximilian University, Munich, Germany,

Calendars as Types: Data Modelling, Constraint Reasoning, and Type Checking with Calendars

I have investigated real-life calendars, calendar and time expressions, and time and date formats. This work aims at the development of computer-based tools for modeling and processing such calendric data. The realization is based on a programming language approach to time and calendars that essentially differs from logic-based and algebraic approaches. The thesis underlying this work is twofold: (1) "Calendar as Type", i.e. time and calendar expressions are not modeled in a logic or an algebra but instead by means of data types. The user is provided with a set of language constructs (so-called type constructors). Advantages of this approach are: user-friendly modeling, increased efficiency and consistency, program and document annotation, and abstraction. (2) "Theory Reasoning", i.e. problems such as appointment scheduling or travel planning are formulated in the environment of a constraint solver specific to arbitrary calendar domains (i.e. user-defined data types like "day" or "consultation hour") rather than by axiomatization, commonly used with approaches based on ontology modeling and reasoning. The constraint solver refers to and relies on (user-defined) calendric types, it maintains the semantics of different calendric types like "day" and "consultation hour", and it allows for efficient constraint solving with arbitrary calendric data. This work shows perspectives for future research on using data types and specific inference algorithms (e.g. constraint solving) for modeling and reasoning on specific theories (e.g. topologies and locational data). The calendar modeling language CaTTS is an outcome of this work. This talk introduces CaTTS and its Multi-Calendar constraint solver.



Monday December 19th 2005:

14h00 Khalil Djelloul, Université de Marseille Luminy,

Complete theories over trees

In this talk we present a general algorithm for solving first-order constraints in so-called "decomposable" theories. First, using special quantifiers, we give a formal characterization of decomposable theories and show some of their properties. We then present a general algorithm for solving first-order constraints in any decomposable theory "T". The algorithm is given as a set of five rewrite rules. It transforms a formula "a", which may of course contain free variables, into a conjunction "b" of solved formulas, equivalent in T and involving no free variables other than those of "a". This conjunction of solved formulas is either the formula "true", or the formula "false", or contains at least one free variable and is equivalent neither to "true" nor to "false". In particular, if "a" has no free variables then "b" is either the formula "true" or the formula "false". The correctness of our algorithm establishes the completeness of decomposable theories.

We then give an automatic way to combine any theory T with the theory of trees, and show that, under certain very simple conditions on T and on T alone, the combination T+trees yields a decomposable, and hence complete, theory. We end the talk with a series of benchmarks obtained with an implementation of our algorithm, solving formulas about two-player games involving nestings of more than 160 alternating quantifiers, in a theory combining trees with the ordered additive rationals (Q,+,-,<).


Thursday November 17th 2005:

10h30 Jacques Cohen, Brandeis University, Waltham, MA, USA

Towards a comprehensive view of computational approaches in systems biology.

In the past few years the number of available papers describing computational approaches in systems biology has increased significantly. These approaches vary considerably from one another. In this talk we aim to provide a bird's-eye view of the various methods that have been proposed to analyze and infer gene regulatory networks. In so doing we are bound to find novel combined approaches that are worth pursuing.

A rough initial classification of the current methods distinguishes between discrete and continuous, stochastic (probabilistic) and non-stochastic, analysis (forward) and inference (reverse engineering). Starting from discrete methods of analysis (e.g., Thomas, Thieffry, de Jong) we proceed to cover existing probabilistic (e.g., Shmulevich) and reverse engineering approaches (e.g., Akutsu, Somogyi) including those based on perturbation (e.g., Karp, Collins).

One of the findings based on this comprehensive view is that it is worth considering model checking as a filter for networks being tested as possible candidates in the reverse engineering process. Another finding suggests a perturbation method for translating continuous models into discrete ones. We conclude by raising a few relevant questions about research directions in systems biology that are worth exploring.

(Work done jointly with Sam Blasiak; the talk is directed to computer scientists with no advanced knowledge in biology)


Friday October 14th 2005:

10h30 Carolyn Talcott (Stanford Research Institute, USA)

Pathway Logic: Application of Formal Modeling Techniques to Understanding Biological Signaling Processes.

Pathway Logic is an application of techniques from formal methods to the modeling and analysis of signal transduction networks in mammalian cells. These signaling network models are developed using Maude, a formal language and execution environment founded on rewriting logic. Network elements (reactions) are represented as rewrite rules. Models can be queried (analyzed) using the execution, search and model-checking tools of the Maude system. Alternatively, models can be exported as Petri nets for analysis using tools specialized for Petri nets. The Pathway Logic Assistant (PLA) is a tool for browsing and querying Pathway Logic models via an interactive graphical representation. A model can be queried to find relevant subnetworks and pathways leading to situations of interest. Situations to be avoided can be specified, allowing the user to explore knockouts and alternative pathways. In the talk we will give a brief introduction to rewriting logic, Maude, and the associated analysis techniques. We will then explain how biological signaling networks are represented, both textually and in the interactive graphical representation of the PLA. The ideas will be illustrated using a curated model of signaling in human mammary epithelial cells. A link to downloads of the Pathway Logic Assistant (Linux and Mac OS X), a sample model, and some tutorial materials can be found on the Pathway Logic website [http://www.csl.sri.com/~clt/PLweb/].


Monday June 13th 2005:

10h30 Albert Goldbeter (Université libre de Bruxelles)

Modeling circadian rhythms: from molecular mechanism to sleep disorders

Circadian rhythms arise from intertwined feedback loops within genetic regulatory networks. Models of increasing complexity have been proposed for the molecular mechanism of these rhythms, which occur spontaneously with a period close to 24 hours. Deterministic models for circadian rhythms in Drosophila account for dynamical properties such as phase shifting or suppression of the rhythms by a light pulse, and entrainment by light-dark cycles. A stochastic version of these models makes it possible to test the robustness of circadian oscillations with respect to molecular noise. Extending the modeling to the mammalian circadian clock helps clarify the dynamical bases of certain sleep-wake cycle disorders in humans.


Friday May 27th 2005:

10h30 Richard Lassaigne (Université Denis Diderot Paris 7)

Probabilistic verification and approximations

Model checking is an algorithmic method for automatically verifying whether a system, represented as a Kripke model, satisfies a given specification. Specifications are usually expressed by formulas of temporal logic. The first objective of this talk is to give an overview of methods able to verify probabilistic systems. Models of such systems are labelled discrete-time Markov chains, and specifications are expressed in extensions of temporal logic with probabilistic operators. The second objective is to focus on the complexity of these methods and to answer the question: can probabilistic verification be efficiently approximated? In general, the answer is negative. However, in many applications, the specification formulas can be expressed in some positive fragment of linear-time temporal logic. We show how some simple randomized approximation algorithms can improve the efficiency of the verification of such probabilistic specifications.
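
As a concrete (and deliberately naive) illustration of the randomized-approximation idea, rather than of the speaker's algorithms, the following Python sketch estimates the probability that a small, hypothetical discrete-time Markov chain satisfies a bounded reachability property, with the number of sampled executions taken from the Chernoff-Hoeffding bound:

import math, random

# Hypothetical 3-state chain: state 2 is an absorbing "goal" state.
P = {0: [(0.6, 0), (0.4, 1)],
     1: [(0.7, 1), (0.3, 2)],
     2: [(1.0, 2)]}

def simulate(horizon):
    # One sampled execution; True if state 2 is reached within `horizon` steps.
    s = 0
    for _ in range(horizon):
        if s == 2:
            return True
        r, acc = random.random(), 0.0
        for p, t in P[s]:
            acc += p
            if r <= acc:
                s = t
                break
    return s == 2

eps, delta = 0.01, 1e-3
# Hoeffding: n samples suffice for |estimate - truth| < eps with prob. 1 - delta.
n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
hits = sum(simulate(horizon=20) for _ in range(n))
print(f"{n} samples, estimated probability = {hits / n:.3f}")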


Tuesday May 3rd 2005:

14h00 Vladislav Vyshemirsky (University of Glasgow, Scotland)

Analysis of signalling pathways using the PRISM model checker

We describe a new modelling and analysis approach for signal transduction networks in the presence of incomplete data. We illustrate the approach with an example, the RKIP inhibited ERK pathway. Our models are based on high level descriptions of continuous time Markov chains: reactions are modelled as synchronous processes and concentrations are modelled by discrete, abstract quantities. The main advantage of our approach is that using a (continuous time) stochastic logic and the PRISM model checker, we can perform quantitative analysis of queries such as if a concentration reaches a certain level, will it remain at that level thereafter? We also perform standard simulations and compare our results with a traditional ordinary differential equation model. An interesting result is that for the example pathway, only a small number of discrete data values is required to render the simulations practically indistinguishable.


Wednesday April 13th 2005:

11h00 Erik Sandewall (Linköping University, Sweden)

Renaissance and Reform in the Software Domain

The work reported here is performed in a broader context where we propose to change the overall software architecture (operating systems, programming languages, etc.) in order to eliminate the considerable redundancy of concepts and constructs that contemporary software technology exhibits. This requires, among other things, a realignment so that some constructs that used to be placed at higher levels of software now become incorporated in a kernel at a much lower level.

In this framework, we propose in particular to use the construct of an *action* already in the kernel, whereby it becomes available for many applications as a conceptual and computational resource and in a uniform fashion. The article describes and discusses the ramifications of this approach, including how it relates to the current state of the art in logics of actions and change, as well as the nonmonotonic character of one of its computational constructs.


Friday February 25th 2005:

10h30 Kazunori Ueda (Waseda University, Tokyo)

LMNtal: a Language Model with Logical Links and Membranes

LMNtal (pronounced "elemental") is a simple language model based on graph rewriting that uses logical variables to represent links and membranes to represent hierarchies. The two major goals of LMNtal are (i) to unify various computational models based on multiset rewriting and (ii) to serve as the basis of a truly general-purpose language covering various platforms ranging from wide-area to embedded computation. Another contribution of the model is that it greatly facilitates programming with dynamic data structures.

LMNtal is an outcome of the attempt to unify two extensions of concurrent logic programming, namely concurrent constraint programming and Constraint Handling Rules, where our focus has been multiset rewriting rather than constraint handling. At the same time, LMNtal can be regarded as a hierarchical multiset rewriting language augmented with logical links, and many computational models including Petri Nets, Gamma, the pi-calculus, etc., can be encoded into LMNtal.

Yet another view of LMNtal is a constructor-free linear concurrent logic language in which data structures are encoded as process structures. Constructor-freeness means that LMNtal links are zero-assignment variables, where the purpose of a variable is to connect two points using a private identity. Multisets are supported by the membrane construct that allows both nesting and mobility. Logical links can represent (among other things) communication channels, but the key differences from pi-calculus channels are that (i) a message sent through a link changes the identity of the link and (ii) links are always private.

The LMNtal system, running on a Java platform, has been released, and the talk will include animated demonstrations of the system in addition to language issues.


Wednesday January 5th 2005:

10h30 Benno Schwikowski (Institut Pasteur)

Computational Systems Biology at Institut Pasteur: Enabling high-throughput mass spectrometry, and integrating heterogeneous data in large-scale interaction networks

This talk will highlight the two major areas of interest of our new research unit at Institut Pasteur. The first part of the talk will focus on the development of new measurement approaches, in particular a novel way to interpret data acquired from mass spectrometry (MS) analysis of very complex samples. Typically, multidimensional liquid chromatography (LC) is used to separate such samples into fractions of lower complexity. Until now, the multidimensional structure among samples has not been used, but we will demonstrate how dimensionality can be exploited to improve many aspects of the experimental-computational protocol for identifying and quantifying proteins on a large scale.

Our second interest lies in the integration of heterogeneous data types in molecular interaction networks. We will present a novel approach to find sequence motifs relevant for protein interactions. In addition to sequence information, an experimentally determined large-scale interaction network is used. The introduction of this new type of data allows the efficient usage of a more complex and realistic model for biologically and medically important families of proteins, and leads to improved predictions of protein interactions.



Thursday November 25th 2004:

14h00-17h00 André Gagalowicz (MIRAGES project)

Texture modeling.


Friday November 19th 2004:

14h00 Luc de Raedt (University of Freiburg, Allemagne)

An overview of some traditional ILP systems that might be of interest to Biocham


Tuesday October 5th 2004:

10h30 Yann Georget (Koalog)

Presentation and architecture of the Koalog Constraint Solver

In this talk, I will present the Koalog Constraint Solver, a Java library for constraint programming. After justifying the choice of the language (Java) and of the form (a library), I will review several architectural choices made in the implementation of the Koalog Constraint Solver: software layers, genericity of the core kernel, constraint scheduling, backtrackable references, etc. Since the Koalog Constraint Solver is optimized for global constraints, I will detail one of them, the Disjunctive constraint (for disjunctive scheduling). Finally, I will give some results concerning the performance of the Koalog Constraint Solver and present some R&D directions.


Friday October 1st 2004 (conference room, Building 11):

10h30 Jean Clairambault (INRIA-INSERM, Villejuif)

Modeling apoptosis: goals, questions, tools


Friday September 17th 2004:

10h30 Laurence Calzone (INRIA, Post-doc ARC CPBIO)

How could learning methods for parameter estimation assist biologists/modelers?

We propose to present an example of a generic kinetic model of the mammalian cell cycle. We will present the different problems raised by fitting the experimental data and the techniques used in developing such numerical models. General ideas on how automated parameter search could assist the biologist will be developed.
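
By way of illustration of what "automated parameter search" means in its simplest form (a toy one-parameter example with synthetic data, unrelated to the cell cycle model discussed in the talk), the following Python fragment fits a degradation rate by minimizing a least-squares cost over a grid of candidate values:

import math, random

random.seed(0)
K_TRUE, X0 = 0.8, 10.0
times = [0.5 * i for i in range(10)]
# Synthetic "measurements" of an exponential decay, with additive noise.
data = [X0 * math.exp(-K_TRUE * t) + random.gauss(0, 0.2) for t in times]

def cost(k):
    # Sum of squared residuals between model prediction and data.
    return sum((X0 * math.exp(-k * t) - y) ** 2 for t, y in zip(times, data))

candidates = [0.01 * i for i in range(1, 301)]       # grid of k values in (0, 3]
best_k = min(candidates, key=cost)
print(f"true k = {K_TRUE}, fitted k = {best_k:.2f}, cost = {cost(best_k):.3f}")

Real applications replace the grid search with gradient-based or evolutionary optimizers and the toy model with the full system of differential equations, but the structure of the problem (a cost function over parameter space) is the same.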


Tuesday July 20th 2004:

10h30 Laurence Calzone (Budapest, Hungary & Virginia State University)

Mathematical modeling of the cell cycle in yeast and Drosophila.

Mathematical models have been built to describe the regulation mechanisms of the cell cycle in yeasts (Saccharomyces cerevisiae and Schizosaccharomyces pombe) and in mammals. These models essentially represent interactions between proteins leading to cell division. Many of these protein networks are found in the somatic and embryonic cells of several eukaryotes. However, the regulation of these two cell types shows different and unexpected aspects depending on the organism. We therefore propose to study in detail the embryonic cell cycles of the fruit fly (Drosophila melanogaster). Just after fertilization, a series of thirteen cycles is observed, followed by an arrest in G2 phase. During these rapid and synchronized cycles, the nuclei divide but keep sharing the same cytoplasm. The most remarkable fact is that, during these divisions, the total amount of the Cdk1/CycB complex (also called MPF) oscillates only in the last cycles, which is contrary to the observations and to the role usually associated with this complex. Indeed, nuclear division cycles are generally regulated by the activity of the Cdk1/CycB complexes.

After a short presentation of a model of the cell cycle in yeast, I will propose a model describing the thirteen cell cycles in Drosophila. A few modifications of an existing Xenopus model (such as compartmentalization and local degradation of some components of the system) make it possible to simulate experimentally observed phenomena.


Friday May 14th 2004, Building 1, Amphi Turing:

9h30-16h30 Final review of the RNTL project OADymPPaC

Outils pour l'Analyse Dynamique et la mise au Point de Programmes avec Contraintes (tools for the dynamic analysis and debugging of constraint programs)

  9.30  Welcome coffee
 10.00  Introduction
 10.15  Technical presentations and demos
 11.30  Break
 11.45  Technical presentations and demos
 13.00  Lunch on site (by invitation)
 14.30  Technical presentations and/or discussion with the experts
 15.30  Break and end of the talks
 16.00  Open discussion (future work)
 16.30  Closing

Detailed program at http://contraintes.inria.fr/OADymPPaC/Public/Revue/progr.html


Monday March 8th 2004:

10h30 Alexandre Frey (Trusted Logic)

Constraint problems for subtyping in an open world.

The talk will cover part of my PhD work.

I have studied ML-style typing for a higher-order language with objects, multi-methods and subtyping. An algebraic approach allowed me to reduce, in a generic way, typing to a constraint problem. The correctness of the typing is proved under a few minimal axioms on the algebra in which these constraints are interpreted.

This reduction is independent of the model actually used to interpret the constraints. I used this property to give a fine-grained account of the typing of the "open world" aspect of object-oriented languages. This essential characteristic allows a module to be used in a "world" that is not known at the time it is written.

I will discuss the description of this aspect in terms of quantification over the model used to interpret the constraints (quantification over all possible extensions). I will attempt a classification of the resulting problems. We can then discuss their originality and their connections with other domains, in particular constraint logic programming.


Monday January 5th 2004:

14h00 François Fages (INRIA Rocquencourt)

The biochemical abstract machine BIOCHAM: a step towards a formal biology.

Biology is clearly engaged in the elucidation of high-level biological processes in terms of their biochemical basis at the molecular scale. The mass production of post-genomic data (gene expression, protein synthesis, protein-protein interactions, etc.) raises the need for a significant parallel effort on the formal representation of biochemical processes and of their global dynamics.

In this talk we propose a simple language for modeling biochemical processes, together with the use of the temporal logic CTL as a language for expressing the biological properties of a system. We show that symbolic model-checking techniques (coming from the verification of circuits and concurrent programs) apply to biochemical networks, and that they offer certain advantages over simulation for querying and validating formal models of biological processes. We will review our experience in the design of the BIOCHAM language through the modeling and querying of several examples of biochemical processes, including a qualitative model of mammalian cell cycle control.

(Joint work with Nathalie Chabrier-Rivier and Sylvain Soliman, in collaboration with the ARC CPBIO)



Monday August 18th 2003:

11h00 Stephen Muggleton (Imperial College, London)

Inductive logic programming in bioinformatics.

We present the inductive logic programming system Progol and relate our experience of using Progol in two applications in bioinformatics: one in the functional genomics of yeast, and one for predicting the toxicity of proteins.



Friday December 6th 2002, location: Paris, Université Paris 6,

Salle C769, 8 rue du Capitaine Scott, 75015 Paris (Metro Dupleix)

15h00 Vincent Devloo (SCMBB, Faculté des Sciences de Bruxelles)

Towards modeling the network of cellular processes

Generalized Logical Analysis (René Thomas and collaborators) is a theory for exhibiting multistationary and periodic behaviors in the framework of genetic network modeling. It thus makes it possible to characterize differentiation and homeostasis. In my PhD work, a reformalization of the theory, together with the use of constraint programming, made the theory applicable to much larger systems than before. I will also present a modeling software tool supporting reverse engineering and virtual experimentation. Finally, two applications taken from the literature are examined with the theoretical and software tools developed, one of which concerns a non-genetic network, in order to highlight the relevance of the theory in a more general setting.
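
To illustrate, in the crudest possible way, the kind of question at stake (multistationarity), here is a Python sketch that enumerates the stable states of a small, made-up Boolean regulatory network by exhaustive search; the talk's approach relies instead on R. Thomas's generalized logical framework and constraint programming in order to scale to much larger systems:

from itertools import product

# Hypothetical 3-gene network: a and b repress each other, c activates a,
# and a activates c. Each rule computes the next level of one gene.
rules = {
    "a": lambda s: (not s["b"]) or s["c"],
    "b": lambda s: not s["a"],
    "c": lambda s: s["a"],
}

fixed_points = []
for values in product([False, True], repeat=len(rules)):
    state = dict(zip(rules, values))
    if all(rule(state) == state[g] for g, rule in rules.items()):
        fixed_points.append(state)

print("stable states:", fixed_points)   # two stable states: multistationarity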


Thursday August 8th 2002:

11h00 Rafaël Ramirez-Melendez (INRIA Rocquencourt)

A constraint-based methodology for concurrent programming

The task of programming concurrent systems is substantially more difficult than the task of programming sequential systems with respect to both correctness and efficiency. In this talk we describe a constraint-based methodology for writing concurrent applications. A system is modeled as: (a) a set of processes containing a sequence of markers, denoting the processes' points of interest; and (b) a constraint store. Process synchronization is specified declaratively by incrementally adding constraints on the markers' execution order into the constraint store. The store thus acts as a coordination entity which, on the one hand, encapsulates the system synchronization requirements, and on the other hand, provides a declarative specification of the system concurrency issues. This provides great advantages in writing concurrent programs and manipulating them while preserving correctness.



Monday December 10th 2001:

11h00 Arnaud Courtois (LORIA, Nancy)

Modeling biological systems in constraint programming

The talk will try to show that hybrid concurrent constraint programming is able to reproduce various modeling techniques commonly used in biology. After a brief summary of the context of interest, I will present the programming language used, various dynamic models, as well as modeling examples. The conclusion will address the limits of such modeling, in particular the possibilities of refinement and the simplifications it forces.

13h30 Vincent Danos (CNRS, PPS Paris)

Communication and Control: the EEP-calculus

We build a model of communication, the EEP-calculus, in the style of the pi-calculus or of the applied JOIN-calculus (that is, with "locations"). Unlike the usual models, resident processes exert control over mobile processes, notably with the possibility of dynamically transforming the names of an agent (masquerading). Aggressive environments, dynamic patching, and sequestration of mobile processes all have a simple representation in the calculus.


Thursday November 8th 2001:

11h00 Pierre Boullier (INRIA Rocquencourt)

Range concatenation grammars

This seminar will be devoted to the presentation of range concatenation grammars (RCG). It is a syntactic formalism that is at once simple, powerful and efficient. In fact, RCGs define exactly the subclass of context-sensitive languages that can be parsed in polynomial time. Of course, this subclass contains the context-free languages, but also, for example, the "mildly context-sensitive" languages well known in natural language processing. Moreover, RCGs have closure properties (closure under intersection and complementation) that make them modular.

The applications already carried out show that the execution times of the generated parsers, a polynomial function of the length of the input, are particularly competitive. Besides natural languages, RCGs could find applications in biology (prediction of RNA or protein secondary structures), or even serve as a basis for an "extension" of DCGs.


Tuesday June 12th 2001:

14h00 Hubert Dubois (LORIA Nancy)

Programming with objects and constraints in ELAN

Starting from the goal of developing production rule systems in the rewriting calculus, we have identified a rule formalism working on a working memory composed of objects and cooperating with a constraint store. The application of this set of rules is controlled by strategies.

The ELAN system is an environment based on the rules-and-strategies paradigm. The rewriting calculus is a new calculus providing a uniform semantics in which one can express both terms and the concept of rule application on those terms. The rewriting calculus provides an operational semantics for the ELAN system.

We have therefore developed an object extension for the ELAN language respecting the semantics of the rewriting calculus, as well as a rule formalism for manipulating both objects and their associated constraints. The whole forms a new programming paradigm that makes it possible to model, for example, planning or scheduling problems.


Monday May 21st 2001: Second CSE Penn State-ENS-INRIA Workshop on "Logic and Concurrency"

11h00-12h00 Dale Miller, CSE, Penn State

Encoding Generic Judgments

14h00-14h45 Jeremie Wajs, CSE Penn State

Reasoning about logic programs

14h45-15h30 Sorin Craciunescu, INRIA Rocquencourt

A Soundness Proof of a Formal System for Equivalence of Logic Programs


Tuesday May 22nd 2001: Second CSE Penn State-ENS-INRIA Workshop on "Logic and Concurrency"

11h00-12h00 Catuscia Palamidessi, CSE, Penn State

(joint work with Oltea Mihaela)

Challenges and Promising Directions in the Distributed Implementation of Choice

14h00-14h45 Frank Valencia, BRICS, Aarhus, Denmark (joint work with Catuscia Palamidessi, CSE, Penn State)

A Calculus for Temporal Constraint Programming

14h45-15h30 Lionel Kalill, ENS Paris

(joint work with Maribel Fernandez)

Interaction nets with ambiguity


Friday April 20th 2001 (Université Paris 7, Chevaleret, room 1C6):

10h30 PhD defense of Sylvain Soliman (INRIA Rocquencourt, Univ. Paris 7)

Concurrent constraint programming and linear logic

In this thesis, we study the close links between linear logic and concurrent constraint programming, from the point of view of semantics, and more precisely of program verification.

We refine the observables that can be characterized by means of linear logic and extend earlier results, in order to obtain a more precise and more general semantics. These results are obtained through a more faithful translation of agents into logical formulas and an enrichment of the theory.

We also present an original program proof method, based on the semantics of provability of linear logic: the phase semantics. It gives us a verification tool with many good properties: the possibility of proving a universal property of the program simply by means of a counter-example; ease of abstraction; and, finally, the simplicity of the proofs obtained.

We thus construct a systematic proof method and, after testing it successfully on classical examples, we study its automation. This study leads to the implementation of a prototype prover accompanying an interpreter for concurrent linear constraint languages.

14h00 Catuscia Palamidessi (CSE, Penn State, USA)

(Joint work with Mihaela Herescu)

On the generalized dining philosophers problem.

The problem of the dining philosophers, proposed by Dijkstra, is a very popular example of a control problem in distributed systems. In the classic formulation the philosophers sit at a round table and there is a fork between any two adjacent philosophers. In order to eat, each philosopher needs both his adjacent forks. The aim is to make sure that if there are hungry philosophers then some of them will eventually eat (deadlock freedom), or, more ambitiously, that every hungry philosopher will eventually eat (lockout freedom).

In the early 80's, Lehmann and Rabin showed that there exists no deterministic, fully distributed and symmetric solution to the dining philosophers problem, and they proposed two randomized solutions: one which guarantees deadlock freedom with probability 1 and another one which guarantees lockout freedom with probability 1.

In this talk, we consider a generalization of the dining philosophers problem where the number of philosophers can exceed the number of forks (although each philosopher can still only access two forks), and we study whether the randomized algorithms of Lehmann and Rabin still work in this situation. We show that the answer is "No": in most cases, a malicious scheduler can induce a deadlock. We then show a randomized solution, still fully distributed and symmetric, which works for any number of philosophers and any kind of connection graph, and guarantees lockout freedom with probability 1.
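
For readers unfamiliar with the randomized protocol, here is a small Python simulation sketch of the classic ring version in the Lehmann-Rabin style (our own toy code, not the generalized algorithm of the talk): each hungry philosopher flips a coin to choose which adjacent fork to grab first, then tries the other fork without waiting, releasing the first and retrying if it is taken.

import random
import threading
import time

N = 5                                        # philosophers sitting in a ring
forks = [threading.Lock() for _ in range(N)]
meals = [0] * N

def philosopher(i, rounds=3):
    left, right = forks[i], forks[(i + 1) % N]
    while meals[i] < rounds:
        first, second = random.choice([(left, right), (right, left)])   # coin flip
        with first:
            if second.acquire(blocking=False):     # try the other fork without waiting
                meals[i] += 1                      # eat
                time.sleep(0.001)
                second.release()
        # otherwise: the first fork is dropped on exiting the 'with' block and we retry;
        # the random choice of which fork to grab first is what breaks the symmetry.

threads = [threading.Thread(target=philosopher, args=(i,)) for i in range(N)]
for t in threads: t.start()
for t in threads: t.join()
print("meals per philosopher:", meals)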


Thursday March 8th 2001:

10h30 Dao Thi Bich Hanh (LIM, Marseille)

First-order constraint solving in the theory of finite or infinite trees

The subject of this talk is constraint solving in the complete theory T of possibly infinite trees. These constraints are general first-order formulas built from an infinite set of function symbols and equality. By solving a constraint p in T, we mean: transforming p into a constraint q equivalent in T, which is the logical constant True if p is always true in T, and the constant False if p is always false in T. Moreover, if p or its negation has finitely many solutions in a model of T, then these solutions or non-solutions must appear explicitly in q.

After introducing the algebra of trees and showing that it is indeed a model of the theory of trees, we show that our constraints have a "quasi-universal" expressive power and propose an algorithm of 11 subformula-rewriting rules to solve them. We then consider the structure obtained by adding to this algebra a unary predicate fini(x), expressing that x is a finite tree. We propose an extension TuFINI of the theory T admitting this structure as a model. We give an algorithm of 16 rewrite rules to solve constraints in TuFINI and thus conclude that TuFINI is a complete theory. We also give a system of 12 rewrite rules performing the same kind of constraint solving in the theory of finite trees. We end with details on the implementation of constraint-solving algorithms in the theory T of possibly infinite trees and in the theory of finite trees. The examples used to demonstrate the expressive power of our constraints also provide interesting benchmarks.
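
To illustrate the difference between the finite-tree and infinite-tree structures discussed above, the following Python sketch (plain syntactic unification of our own, not the 11- or 16-rule solvers of the talk) shows that the equation x = f(x) has no solution over finite trees, where the occurs check fails, but denotes a rational (infinite) tree when the check is dropped.

def unify(t1, t2, subst, occurs_check=True):
    """Terms are variables (strings) or tuples (functor, arg1, ..., argn)."""
    def walk(t):
        while isinstance(t, str) and t in subst:
            t = subst[t]
        return t

    def occurs(v, t):
        t = walk(t)
        if t == v:
            return True
        return isinstance(t, tuple) and any(occurs(v, a) for a in t[1:])

    t1, t2 = walk(t1), walk(t2)
    if t1 == t2:
        return subst
    if isinstance(t1, str):
        if occurs_check and occurs(t1, t2):
            return None                      # no solution among finite trees
        return {**subst, t1: t2}
    if isinstance(t2, str):
        return unify(t2, t1, subst, occurs_check)
    if t1[0] != t2[0] or len(t1) != len(t2):
        return None
    for a, b in zip(t1[1:], t2[1:]):
        subst = unify(a, b, subst, occurs_check)
        if subst is None:
            return None
    return subst

print(unify("x", ("f", "x"), {}, occurs_check=True))    # None: fails over finite trees
print(unify("x", ("f", "x"), {}, occurs_check=False))   # {'x': ('f', 'x')}: a rational tree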


Tuesday February 27th 2001:

14h00 Slim Abdennadher (Ludwig Maximilian Univ. of Munich)

Automatic Generation of Rule-based Constraint Solvers

A general approach to implementing propagation and simplification of constraints consists of applying rules over these constraints. However, a difficulty that arises frequently when writing a constraint solver is to determine the constraint propagation algorithm. In this talk, we present a method for generating propagation and simplification rules for constraints over finite domains defined extensionally, e.g. by a truth table or a set of tuples. The generation of rules is performed in two steps. In a first step, only propagation rules are generated; this method is inspired by techniques used in the field of knowledge discovery. In a second step, some propagation rules are transformed into simplification rules; this method is based on a notion of confluence adapted from rewriting systems. With our algorithm, the user can specify the admissible syntactic forms of the rules. The generated rules are implemented as rules of the Constraint Handling Rules language.
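
The following Python sketch (a much simplified version of the first step only, with names of our own choosing) derives propagation rules of the form "if X = v then Y != w" from a constraint given extensionally by its allowed tuples; the generalisation of left-hand sides and the transformation into simplification rules are not attempted here.

def propagation_rules(vars_, domains, allowed):
    """allowed: set of tuples over the given domains that satisfy the constraint."""
    rules = []
    for i, vi in enumerate(vars_):
        for v in domains[i]:
            support = [t for t in allowed if t[i] == v]
            if not support:
                rules.append(f"{vi} != {v}")          # the value appears in no allowed tuple
                continue
            for j, vj in enumerate(vars_):
                if j == i:
                    continue
                for w in domains[j]:
                    if all(t[j] != w for t in support):
                        rules.append(f"{vi} = {v}  ==>  {vj} != {w}")
    return rules

# Example: boolean conjunction given by its truth table, and(X, Y, Z).
allowed = {(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)}
for r in propagation_rules(["X", "Y", "Z"], [[0, 1]] * 3, allowed):
    print(r)
# Among others: X = 0 ==> Z != 1, Y = 0 ==> Z != 1, Z = 1 ==> X != 0, Z = 1 ==> Y != 0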



Thursday November 23rd 2000:

10h30 Tomasz Truderung (University of Wroclaw, Poland)

Polymorphic Directional Types.

We present a directional, polymorphic, rule-based type system for logic programming languages. We discuss the notion of main type and present the main type reconstruction algorithm. We describe the current state of the implementation and show some sample runs of our algorithms. We analyse their practical time complexity. We also discuss the possibility of embedding our system in real-life languages such as Prolog.

13h30 Emmanuel Coquery (INRIA)

Typed Constraint Logic Programs: checking domain coercions and metaprograms through subtyping.

We propose a general prescriptive type system with parametric polymorphism and subtyping for constraint logic programs. The aim of this type system is to detect programming errors statically. It introduces a type discipline for constraint logic programs and modules, while maintaining the capabilities of performing the usual coercions between constraint domains and of typing meta-programming predicates, thanks to the flexibility of subtyping. The property of subject reduction expresses the consistency of the type system w.r.t. the execution model: if a program is "well-typed", then all derivations starting in a "well-typed" goal are again "well-typed". That property is proved w.r.t. the abstract execution model of constraint programming, which proceeds by accumulation of constraints. We describe our implementation of the system for type checking and type inference. We report our experimental results on type checking the libraries of Sicstus Prolog and other Prolog programs.
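
Purely as an illustration of the kind of coercions subtyping permits (the small type order below is a toy of our own, not the system described in the talk), consider:

BASIC_ORDER = {("int", "float"), ("int", "term"), ("float", "term"), ("atom", "term")}

def subtype(t1, t2):
    """Types are strings ('int', 'float', 'atom', 'term') or ('list', T)."""
    if t1 == t2 or t2 == "term":
        return True                                   # every type is below 'term'
    if isinstance(t1, str) and isinstance(t2, str):
        return (t1, t2) in BASIC_ORDER
    if isinstance(t1, tuple) and isinstance(t2, tuple) and t1[0] == t2[0] == "list":
        return subtype(t1[1], t2[1])                  # the list constructor is covariant
    return False

print(subtype("int", "float"))                        # True: an int may be used as a float
print(subtype(("list", "int"), ("list", "float")))    # True
print(subtype("float", "int"))                        # False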


Wednesday September 13th 2000, building 16:

10h30 Rajeev Goré, Australian National University.

CardKt: Automated Multi-modal Deduction on Java Cards for Multi-application Security.

We describe an implementation of a Java program to perform automated deduction in propositional multi-modal logics on a Java smart card. The tight space limits of Java smart cards make the implementation non-trivial. A potential application is to ensure that applets downloaded off the internet conform to personalised security permissions stored on the Java card, using a security policy encoded in multi-modal logic. In particular, modal logic may be useful to ensure that previously checked "trust relationships" between applets are preserved by the addition of a new applet. That is, by using multi-modal logic to express notions of permissions and obligations, we can turn the security check into an on-board theorem proving task. Our theorem prover itself could be downloaded "just in time" to perform the check, and then deleted to free up space on the card once the check has been completed. Our work is thus a "proof of principle" for the application of formal logic to the security of multi-application Java cards.
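
For illustration only, the sketch below evaluates multi-modal formulas over an explicit Kripke model in Python; the worlds, agents and the "trusted" proposition are invented examples, and this is model checking rather than the on-card tableau prover of the talk.

def holds(world, formula, valuation, access):
    """formula: ('var', p), ('not', f), ('and', f, g), ('implies', f, g), ('box', agent, f)."""
    op = formula[0]
    if op == "var":
        return formula[1] in valuation.get(world, set())
    if op == "not":
        return not holds(world, formula[1], valuation, access)
    if op == "and":
        return holds(world, formula[1], valuation, access) and \
               holds(world, formula[2], valuation, access)
    if op == "implies":
        return (not holds(world, formula[1], valuation, access)) or \
               holds(world, formula[2], valuation, access)
    if op == "box":                      # [agent] f: f holds in every world reachable for agent
        agent, f = formula[1], formula[2]
        return all(holds(w2, f, valuation, access)
                   for (w1, a, w2) in access if w1 == world and a == agent)
    raise ValueError(op)

# "trusted" must hold in every state the card considers possible for applet A.
valuation = {"s0": {"trusted"}, "s1": {"trusted"}, "s2": set()}
access = [("s0", "A", "s1"), ("s0", "B", "s2")]
print(holds("s0", ("box", "A", ("var", "trusted")), valuation, access))   # True
print(holds("s0", ("box", "B", ("var", "trusted")), valuation, access))   # False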


Monday July 17th 2000: CSE Penn State-ENS-INRIA Workshop on "Logic and Concurrency"

11h00-12h00 Dale Miller, CSE, Penn State

Encoding Transition Systems in Sequent Calculus

14h00-14h45 Jeremie Wajs, CSE Penn State

A theorem prover for operational semantics

14h45-15h20 Sorin Craciunescu, INRIA Rocquencourt

Proofs by Coinduction for Logic Programs

15h40-16h20 Sylvain Soliman, INRIA Rocquencourt

Semantics and Precision

Tuesday July 18th 2000: CSE Penn State-ENS-INRIA Workshop on "Logic and Concurrency"

11h00-12h00 Catuscia Palamidessi, CSE, Penn State

Concurrent Constraint Programming with Process Mobility

14h00-14h45 Oltea Mihaela Herescu, CSE, Penn State.

Probabilistic Asynchronous Pi-Calculus

14h45-15h30 Lionel Khalil, ENS Paris

Generalization of Interaction Nets with a Unique Parallel Agent


Tuesday May 23rd 2000

10h30 Daniel Diaz, INRIA, Univ. Paris 1.

GNU-Prolog: a native compiler for Prolog with an integrated finite-domain constraint solver.

We will present the GNU-Prolog system, a native compiler for the Prolog language including a powerful finite-domain constraint solver. GNU-Prolog has been adopted by GNU as the official Prolog of that organisation. After a general overview of GNU-Prolog, we will detail its compilation scheme, which is based on a low-level intermediate language that nevertheless remains machine-independent. A Prolog+constraints program is thus compiled to this language, which is then translated to the assembly language of the target machine. This last step is simplified by the fact that the intermediate language is based on a reduced instruction set. Finally, we will show some aspects of the GNU-Prolog finite-domain constraint solver and, in particular, the constraint definition language that allows the user to define new constraints when needed.
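
Only as a loose illustration of what a user-level finite-domain constraint definition boils down to, the Python sketch below narrows interval domains under X = Y + C; the function name and interval representation are ours, and this is neither GNU-Prolog syntax nor its actual propagation engine.

def narrow_plus(dom_x, dom_y, c):
    """Narrow interval domains (lo, hi) under the constraint X = Y + C."""
    xlo, xhi = dom_x
    ylo, yhi = dom_y
    # X in min(Y)+C .. max(Y)+C   and   Y in min(X)-C .. max(X)-C
    xlo, xhi = max(xlo, ylo + c), min(xhi, yhi + c)
    ylo, yhi = max(ylo, xlo - c), min(yhi, xhi - c)
    if xlo > xhi or ylo > yhi:
        raise ValueError("inconsistent domains")
    return (xlo, xhi), (ylo, yhi)

print(narrow_plus((0, 10), (4, 9), 3))   # ((7, 10), (4, 7))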


Thursday April 13th 2000

14h Jean-Pierre Banâtre, INRIA, Director of the Rocquencourt Research Unit.

Gamma and programming by multiset transformation.

Gamma was originally proposed in 1986 as a formalism for defining non-sequential programs. The basic idea underlying the formalism is to describe computation as a kind of chemical reaction over a solution of individual data items. Owing to the very minimal nature of the language, and the absence of any sequential bias, it has been possible to exploit the initial paradigm in several directions. In this talk we will survey the work carried out in this area, focusing on

1) parallel programming by multiset transformation, 2) the interest of the chemical reaction model, 3) extensions of the original model, and 4) implementation issues.
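
For a flavour of the chemical-reaction style, here is a minimal Python sketch (our own toy interpreter, not Gamma itself) in which a reaction condition and action are applied to a multiset until it stabilises; the example computes the maximum of a multiset with the reaction x, y -> x if x >= y.

import random

def gamma(multiset, reacts, action):
    """Repeatedly replace a reacting pair by the reaction products until stable."""
    pool = list(multiset)
    changed = True
    while changed:
        changed = False
        random.shuffle(pool)                       # no sequential bias: any pair may react
        for i in range(len(pool)):
            for j in range(len(pool)):
                if i != j and reacts(pool[i], pool[j]):
                    x, y = pool[i], pool[j]
                    pool = [e for k, e in enumerate(pool) if k not in (i, j)]
                    pool.extend(action(x, y))
                    changed = True
                    break
            if changed:
                break
    return pool

# maximum: two elements react by keeping only the larger one
print(gamma([4, 8, 15, 16, 23, 42], lambda x, y: x >= y, lambda x, y: [x]))   # [42]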


Thursday March 2nd 2000

14h00 Aviv Regev, Udi Shapiro, Weizmann Institute.

Representing Biomolecular Processes with Process Algebra: pi-Calculus as a Formalism for Signal Transduction Networks

Molecular processes, such as gene expression, metabolism and signal transduction (ST) networks, play a key role in the normal and pathological function of cells, tissues and organisms. However, while biomolecular sequence and structure information has an objective representation that is amenable to critical evaluation and computer manipulation, knowledge of biomolecular processes has so far eluded such representation and study. We propose to treat these molecular systems as concurrent computational processes using process algebras. In particular, we show that complex ST pathways can be represented formally in the pi-calculus. pi-calculus representations of ST unify the dynamic behavior and function of pathways with the molecular details that underlie this behavior. They provide a comprehensive model of ST, which is both mathematically well-defined and biologically visible. This model is amenable to simulation studies, analogous to mutational manipulation and experimentation, as well as to formal verification and analysis. Furthermore, based on notions of behavioral equivalence, pathways may be compared to each other, and a novel measure of homology, the "homology of process", can be established. Thus, adopting such a common formalism may facilitate our understanding of the function and evolution of biomolecular processes.
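
To make the "molecules as processes, interactions as communication" reading concrete, the following Python sketch (a loose analogy of our own, ignoring stochastic rates and far from the actual pi-calculus) lets a ligand process bind a receptor over a shared channel, which in turn activates a kinase over a second channel.

import queue
import threading

bind = queue.Queue()          # channel shared by ligand and receptor
activate = queue.Queue()      # channel shared by receptor and kinase
trace = []

def ligand():
    bind.put("ligand")                       # output on channel 'bind'

def receptor():
    who = bind.get()                         # input on channel 'bind'
    trace.append(f"receptor bound by {who}")
    activate.put("receptor*")                # output on channel 'activate'

def kinase():
    who = activate.get()                     # input on channel 'activate'
    trace.append(f"kinase activated by {who}")

threads = [threading.Thread(target=p) for p in (kinase, receptor, ligand)]
for t in threads: t.start()
for t in threads: t.join()
print("\n".join(trace))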


Wednesday February 23rd 2000: Workshop day "Contraintes et Chaos" (Constraints and Chaos), building 16

9h30 Arnaud Lallouet, LIFO Orléans

A co-inductive semantics for constraint propagation and labeling.

10h30 Gérard Ferrand, LIFO Orléans

Perfect relaxation of constraints.

11h30 François Fages, INRIA

Constraint propagation and chaotic iteration.


Tuesday February 8th 2000

14h-15h Pierre-Etienne Moreau, Bouygues DTN.

Compilation of rewrite rules and non-deterministic strategies

Rewriting techniques have been developed since the 1970s and applied in particular to the prototyping of algebraic formal specifications and to proving properties related to program verification.

ELAN is a system for specifying and executing theorem provers, constraint solvers and, more generally, any process described by transformation rules. It provides associative-commutative (AC) operators and a strategy language that allow fine-grained control over the exploration of a search tree and easy manipulation of mathematical operators such as boolean connectives, arithmetic operators or parallel composition operators.

These two features greatly improve the expressiveness of the language, but they introduce a double non-determinism, due to the possibility of applying several rules, in different ways, to a given term. This makes their implementation difficult and generally inefficient.

The main objective of this work is to study compilation techniques that improve the efficiency of this kind of language. We propose a new algorithm, based on deterministic automata, to compile syntactic pattern matching efficiently. We then define several classes of rules for which we propose an efficient AC matching algorithm. This algorithm uses a compact data structure together with the previously defined automata, which considerably improves the performance of the normalisation process as a whole.

The study of the strategy language leads to the implementation of original primitives for managing backtracking and to the definition of a determinism-analysis algorithm that reduces their use and further improves performance, while reducing the memory space required. Finally, the implementation of the proposed methods has given rise to numerous theoretical and technical optimisations that can largely be reused to implement other rewriting-based programming languages.
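
As a point of comparison for the techniques described above, here is a deliberately naive Python sketch (our own, not ELAN's compiler) that normalises a term by repeatedly applying syntactic rewrite rules innermost-first; the automata-based matching, AC operators and strategy compilation contributed by the thesis are precisely what this naive interpreter lacks.

RULES = [                                                     # addition on Peano numerals
    (("plus", "0", "X"), "X"),                                # plus(0, X)    -> X
    (("plus", ("s", "X"), "Y"), ("s", ("plus", "X", "Y"))),   # plus(s(X), Y) -> s(plus(X, Y))
]

def match(pattern, term, subst):
    """Syntactic matching; uppercase strings are rule variables."""
    if isinstance(pattern, str) and pattern.isupper():
        if pattern in subst:
            return subst if subst[pattern] == term else None
        return {**subst, pattern: term}
    if isinstance(pattern, str) or isinstance(term, str):
        return subst if pattern == term else None
    if pattern[0] != term[0] or len(pattern) != len(term):
        return None
    for p, t in zip(pattern[1:], term[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def instantiate(rhs, subst):
    if isinstance(rhs, str):
        return subst.get(rhs, rhs)
    return (rhs[0],) + tuple(instantiate(a, subst) for a in rhs[1:])

def normalize(term):
    if isinstance(term, tuple):                               # innermost: rewrite arguments first
        term = (term[0],) + tuple(normalize(a) for a in term[1:])
    for lhs, rhs in RULES:
        subst = match(lhs, term, {})
        if subst is not None:
            return normalize(instantiate(rhs, subst))
    return term

two = ("s", ("s", "0"))
print(normalize(("plus", two, two)))          # ('s', ('s', ('s', ('s', '0'))))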