AI4Climate Seminar: High-resolution canopy height and wood volume maps in France based on satellite remote sensing with a deep learning approach – Martin Schwartz, 27th of March 11:00 CET at Kayrros

The seminar is on March the 27th, at 11:00 (CET) remotely and in person. The in-person meeting will be held in the Kayrros conference room, in Paris (map at the end of the post).

If you would like to attend online, the Zoom link is here:

Short Abstract: European forests are divided into small stands and may show heterogeneity within stands, so a high spatial resolution (10–20 meters) is arguably needed to capture differences in canopy height. In this work, we developed a deep learning U-Net model based on multi-stream remote sensing measurements to create a high-resolution canopy height map of France. Our model uses multi-band images from Sentinel-1 and Sentinel-2 with composite time averages as input to predict tree height derived from GEDI waveforms. Using forest inventory plots, we then turned this height map into a 10 m resolution wood volume map, which we used to carry out wood volume estimations in different French regions. This work paves the way to monitoring biomass at high resolution and could provide key information for forest management policies.
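As a toy illustration of the height-to-volume step (the abstract does not detail the calibration, so everything here is an assumption): a hypothetical power-law allometry V = a·H^b is fitted to synthetic "inventory plot" data and applied pixel-wise to a small height map. The functional form and all numbers are invented for illustration, not the relation used in the study.

```python
# Hypothetical sketch: turning a canopy height map into a wood volume map
# via a power-law allometry calibrated on inventory plots.
import math

def calibrate_allometry(heights_m, volumes_m3_ha):
    """Fit V = a * H**b by ordinary least squares in log-log space."""
    xs = [math.log(h) for h in heights_m]
    ys = [math.log(v) for v in volumes_m3_ha]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def height_to_volume(height_map, a, b):
    """Apply the fitted relation pixel-wise (zero height -> zero volume)."""
    return [[a * h ** b if h > 0 else 0.0 for h in row] for row in height_map]

# Synthetic "inventory plots": volume grows roughly as H**1.5 here.
plots_h = [5, 10, 15, 20, 25, 30]
plots_v = [2.0 * h ** 1.5 for h in plots_h]
a, b = calibrate_allometry(plots_h, plots_v)

height_map = [[0, 12, 18], [25, 7, 30]]   # a tiny 10 m height raster
volume_map = height_to_volume(height_map, a, b)
```

In practice the calibration would be done per region or species group against national forest inventory plots; the sketch only shows the overall shape of the pipeline.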

Bio: Martin Schwartz is a young researcher who graduated from Ecole Polytechnique in 2020. During his gap year, he worked at Carbone 4, a leading consulting firm specializing in climate change and sustainability. Martin is currently in the third year of his PhD, supervised by Philippe Ciais and Catherine Ottlé at the Laboratory for Climate Science and Environment (LSCE). His research focuses on remote sensing and deep learning techniques applied to GEDI data, with the aim of advancing our understanding of forest canopy height and forest biomass at fine scale. This work was carried out in collaboration with Kayrros, which supported the development of his research.

Kayrros: Kayrros is a French company that leverages advanced technologies like AI and satellite imagery to provide real-time insights and forecasting for the energy and commodity markets, with a focus on sustainability and environmental issues. They offer data-driven solutions for emissions, deforestation, and other environmental risks, helping businesses and governments make informed and sustainable decisions.

Location: Kayrros offices, Rue La Fayette, 75009 Paris. It is 18 minutes from Jussieu by metro.

AI4Climate seminar: Machine Learning for Climate Change and Environmental Sustainability – Claire Monteleoni – 6th of February 11:00 CET

The seminar is on February the 6th, at 11:00 (CET) remotely and in person. The in-person meeting will be held in the SCAI conference room (map at the end of the post).

If you would like to attend online, the Zoom link is here:

Despite the scientific consensus on climate change, drastic uncertainties remain. Crucial questions about regional climate trends, changes in extreme events, such as heat waves and mega-storms, and understanding how climate varied in the distant past, must be answered in order to improve predictions, assess impacts and vulnerability, and inform mitigation and sustainable adaptation strategies. Machine learning can help answer such questions and shed light on climate change. I will give an overview of our climate informatics research, focusing on challenges in learning from spatiotemporal data, along with semi- and unsupervised deep learning approaches to studying rare and extreme events, and precipitation and temperature downscaling.

Claire Monteleoni is a Choose France Chair in AI and Directrice de Recherche at INRIA Paris, an Associate Professor in the Department of Computer Science at the University of Colorado Boulder, and the founding Editor-in-Chief of Environmental Data Science, a Cambridge University Press journal launched in December 2020. She joined INRIA in 2023 and has previously held positions at University of Paris-Saclay, CNRS, George Washington University, and Columbia University. She completed her PhD and Master's in Computer Science at MIT and was a postdoc at UC San Diego. She holds a Bachelor's in Earth and Planetary Sciences from Harvard. Her research on machine learning for the study of climate change helped launch the interdisciplinary field of Climate Informatics. She co-founded the International Conference on Climate Informatics, which turns 12 years old in 2023 and has attracted climate scientists and data scientists from over 20 countries and 30 U.S. states. She gave an invited tutorial, Climate Change: Challenges for Machine Learning, at NeurIPS 2014. She currently serves on the NSF Advisory Committee for Environmental Research and Education.

Variational data assimilation with deep prior – Arthur Filoche

Our next seminar will be held on Wednesday the 5th of October 2022, at 14:30 CET (not on the 30th of September as previously advertised), at the Pierre et Marie Curie campus of Sorbonne Université, in seminar room 105 of LIP6, located on the first floor of corridor 25/26 (easier access through tower 26).

The seminar can also be followed remotely via Zoom here:

Password: n5jBHd

You can ask questions during and after the talk, in the slack channel.

Arthur Filoche’s talk is entitled:

« Variational data assimilation with deep prior »

Data Assimilation remains the operational choice for forecasting and estimating the state of Earth's dynamical systems, and offers a wide range of methods to optimally combine a dynamical model with observations, allowing one to predict, filter, or smooth the system state trajectory.

The classical variational assimilation cost function is derived by modelling the error priors with Gaussian distributions that are uncorrelated in time. The optimization then relies on error covariance matrices, but such statistics can be hard to estimate, particularly for background and model errors. In this work, we propose to replace the Gaussian prior with a deep convolutional prior, circumventing the use of background error covariances.

To do so, we reshape the optimization so that the initial condition to be estimated is generated by a deep architecture. The neural network is optimized on a single observational window; as in a classical variational inversion, no prior learning is involved. The bias induced by the chosen architecture regularizes the proposed solution, with the convolution operators imposing locality. From a computational perspective, the control parameters are simply organized differently and are optimized using only the observational loss function, corresponding to a maximum-likelihood estimation.

We propose several experiments highlighting the regularizing effect of the deep convolutional prior. First, we show that such a prior can replace background regularization in a strong-constraint 4D-Var using a shallow-water model. We then extend the idea to a 3D-Var set-up, using a spatio-temporal convolutional architecture to interpolate sea-surface satellite tracks, and obtain results on par with optimal interpolation using a fine-tuned background matrix. Finally, we give perspectives toward applying the same method in weak-constraint 4D-Var, removing the need for model-error covariances while still enforcing correlation in space and time of model errors.
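As a loose, drastically simplified illustration of the idea (not the talk's actual architecture), the sketch below replaces the deep convolutional generator with its simplest analogue: the estimated field is produced by a fixed smoothing convolution of the control parameters, and gradient descent minimizes the observational misfit alone, with no background error covariance. The grid size, kernel, and observation pattern are invented for the toy.

```python
import math

# Toy "convolutional prior" for variational assimilation (illustrative only).
n = 32
kernel = [0.05, 0.25, 0.4, 0.25, 0.05]  # fixed local smoothing operator

def conv_circ(v, k):
    """Circular convolution: the locality-imposing generator."""
    half = len(k) // 2
    return [sum(k[j] * v[(i + j - half) % len(v)] for j in range(len(k)))
            for i in range(len(v))]

# Sparse observations of a smooth "true" field.
x_true = [math.sin(2 * math.pi * i / n) for i in range(n)]
obs_idx = list(range(0, n, 5))
y = {i: x_true[i] for i in obs_idx}

theta = [0.0] * n    # control parameters generating the state
lr = 0.5
losses = []
for _ in range(2000):
    x = conv_circ(theta, kernel)            # generated state estimate
    losses.append(sum((x[i] - y[i]) ** 2 for i in obs_idx))
    # Gradient of the observational loss w.r.t. theta: the kernel is
    # symmetric, so the transpose convolution equals the forward one.
    r = [2 * (x[i] - y[i]) if i in y else 0.0 for i in range(n)]
    grad = conv_circ(r, kernel)
    theta = [t - lr * g for t, g in zip(theta, grad)]

x_est = conv_circ(theta, kernel)
```

The convolution fits the sparse observations while smoothly interpolating between them, which is the regularizing role the deep prior plays in the full method.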

Biographic notice:
Arthur Filoche is a Ph.D. student at LIP6, Sorbonne Université, France, under the supervision of Dominique Béréziat, Julien Brajard, and Anastase Charantonis. His research interests lie in combining deep learning and data assimilation.

Working group 4: Marie Dechelle – Bridging Dynamical Models and Deep Networks to Solve Forward and Inverse Problems

21 June at 10:00
Jussieu campus, SCAI meeting room,
Esclangon building, 1st floor

Partially observed dynamical systems embrace a wide class of phenomena and represent an overwhelming majority of Earth science modeling, traditionally relying on ordinary or partial differential equations. Recent trends consider Machine Learning as an alternative or complementary approach to traditional physical models, allowing the integration of observations and potentially faster computation through model reduction. In this regard, recent works study the learning of the decomposition between model-based (MB) and data-driven (ML) dynamical representations. However, learning such a decomposition with supervision on the trajectories alone is ill-posed.

We introduce a learning algorithm that bridges model-based prediction and data-based algorithms while resolving this ill-posedness. It relies on a cost function based on the computation of an upper bound of the prediction error, which enables us to minimize the contribution of the data-driven algorithm while recovering the physical parameters of the MB part. We demonstrate the soundness of our approach on a physical dataset based on simplified Navier-Stokes equations. We also present preliminary results on outputs of the ocean model NATL60.
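The decomposition idea can be caricatured in one dimension (a linear toy, not the talk's algorithm or its upper-bound cost): the "MB" part is a linear term with one unknown physical parameter, the "ML" part is a set of free per-sample residuals, and a penalty on the residuals' norm resolves the ill-posedness so the physical parameter is recovered. All data and the alternating scheme are invented for illustration.

```python
# Toy MB/ML decomposition (illustrative): observed slopes s_t come from a
# "physical" term a_true * x_t plus an unmodelled constant b_true.
#   MB part: a * x_t, with unknown physical parameter a.
#   ML part: free residuals r_t.
# We minimize  sum_t (s_t - a*x_t - r_t)^2 + lam * sum_t r_t^2
# by alternating closed-form updates; the penalty keeps the ML part minimal.
a_true, b_true, lam = 0.7, 0.3, 1.0
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]        # zero-mean inputs
s = [a_true * x + b_true for x in xs]   # "observed" dynamics

a = 0.0
r = [0.0] * len(xs)
for _ in range(200):
    # Update the physical parameter, given the current ML residuals.
    a = sum((si - ri) * xi for si, ri, xi in zip(s, r, xs)) / \
        sum(xi * xi for xi in xs)
    # Update the ML residuals, given the physical parameter; the penalty
    # shrinks them toward zero instead of letting them absorb everything.
    r = [(si - a * xi) / (1.0 + lam) for si, xi in zip(s, xs)]
```

Without the penalty, any value of `a` fits the data perfectly with the residuals absorbing the rest; the shrinkage is what makes the split identifiable in this toy.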

The internal workshop “SCAI & AI4Climate” brings together researchers, engineers, PhD students and postdocs interested in the design and use of new Artificial Intelligence methods for the study of the environment, from models to observations. The first meetings will be devoted to the work of PhD students. The talk will be followed by a discussion with the participants on the approach and the possible perspectives of the work.

Narrowing uncertainties of climate projections using data science tools? -Pierre Tandeo

The seminar is on March the 26th, at 10:00, will be held remotely, and will be in English.

The slides can be found here. We are currently uploading the video of the talk and will add a link to it as soon as it is available.

Link to the zoom session:

Pierre Tandeo’s presentation is entitled:

« Narrowing uncertainties of climate projections using data science tools? »


Climate indices show large variability in CMIP climate predictions. In this presentation, we propose to weight multi-model climate simulations to reduce the uncertainty in climate predictions and better estimate the future evolution of climate indices. The proposed methodology is based on advanced data science tools (i.e., data assimilation, analog forecasting, model evidence metrics) to accurately compute distances between current observations and simulated climate indices. This low-cost procedure is tested on a simplified climate model. The results show that the method can be applied locally and is able to identify relevant parameterizations.
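The weighting idea can be sketched as follows. The actual methodology relies on data assimilation, analog forecasting, and model evidence metrics; this toy assumes plain RMSE-based likelihood weights instead, and all series are synthetic.

```python
import math

def rmse(a, b):
    """Root-mean-square distance between an observed and a simulated series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def model_weights(obs, sims, sigma=1.0):
    """Likelihood-style weights from observation-simulation distances
    (a stand-in for the model evidence metrics used in the talk)."""
    scores = [math.exp(-0.5 * (rmse(obs, s) / sigma) ** 2) for s in sims]
    total = sum(scores)
    return [sc / total for sc in scores]

# Synthetic climate index: observations and two model simulations.
obs = [0.1, 0.2, 0.3, 0.4]
sims = [
    [0.1, 0.2, 0.3, 0.4],   # model A: matches the observed index
    [0.5, 0.7, 0.9, 1.1],   # model B: biased
]
w = model_weights(obs, sims)

# Each model's projected future change of the index, combined by weight.
future = [1.0, 2.0]
weighted_projection = sum(wi * f for wi, f in zip(w, future))
```

The model closer to the observed index gets the larger weight, pulling the multi-model projection toward it and narrowing the spread relative to an unweighted mean.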

Short bio:

Pierre Tandeo is an associate professor at IMT Atlantique (Brest, France) and an associate researcher at the Data Assimilation Research Team, RIKEN Center for Computational Science (Kobe, Japan). More information:

2nd Working Group: Learning dynamics from partial and noisy observation with the help of Data Assimilation – Arthur Filoche

11 December, 10:00, SCAI meeting room, Jussieu.
Esclangon building, 1st floor

Join the Zoom meeting

Geosciences have long-standing experience in modeling, forecasting, and estimating complex dynamical systems like the atmosphere or the ocean. Most of these models come from physical laws and are described by PDEs. Usually, only sparse and noisy observations of such systems are available. The first requirement for producing a forecast is to estimate the initial conditions. This is usually done via Data Assimilation (DA), a set of methods that optimally combines a dynamical model and observations, focusing on system state estimation. In the variational formalism, it is a PDE-constrained optimization problem that requires adjoint modeling to calculate gradients. This field is very close to Machine Learning (ML) in the sense that both learn from data.

ML algorithms have demonstrated impressive results in spatiotemporal forecasting, but they need dense data, which is rarely available in Earth sciences. Conversely, the tools provided by the deep learning community based on automatic differentiation are particularly suitable for variational DA, as they avoid explicit adjoint modeling.
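To illustrate why automatic differentiation fits variational DA, here is a minimal forward-mode autodiff (dual numbers) that differentiates an observational cost through a nonlinear dynamics rollout, with no hand-coded adjoint model. Operational systems use reverse-mode (adjoint) AD for high-dimensional controls; the dynamics and observations below are invented toys.

```python
class Dual:
    """Forward-mode autodiff value: (val, dot), where dot = d(val)/d(x0)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __rsub__(self, o):
        return self._wrap(o).__sub__(self)
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def step(x, dt=0.1):
    """Toy nonlinear dynamics (logistic growth)."""
    return x + dt * (x * (1.0 - x))

obs = [0.55, 0.60, 0.64]   # synthetic observations at times 1..3

def cost(x0):
    """4D-Var-style observational cost along the model trajectory."""
    x, J = x0, Dual(0.0)
    for y in obs:
        x = step(x)
        d = x - y
        J = J + d * d
    return J

# Gradient dJ/dx0 in one forward pass, with seed derivative 1.0 --
# no adjoint of `step` was ever written by hand.
grad = cost(Dual(0.5, 1.0)).dot
```

The same mechanism, applied in reverse mode by frameworks like PyTorch or JAX, is what lets variational DA skip the tedious and error-prone derivation of adjoint code.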

What motivates this discussion is that physics-based models are often incomplete; machine learning can provide a learnable class of models, while data assimilation can provide dense data.

A similar talk can be found here, and an early conference paper can be found here.

Tenure track in Statistical learning at Ecole Polytechnique

Ecole Polytechnique is opening a tenure-track position on statistical learning and artificial intelligence for energy/climate. The description is attached. See also

This is a joint position between the Applied Mathematics department and the Meteorology department. We are primarily interested in applicants whose research in statistical learning and artificial intelligence will contribute to addressing societal challenges in energy, sustainability and climate change (e.g., statistical learning for energy efficiency, load curve prediction, pricing mechanisms, smart grid control, load curve disaggregation, etc.). Ecole Polytechnique offers an exceptional environment, with an adapted teaching load, as well as scientific, administrative and budgetary support.
To apply, follow the link.
The position number is 71.

Inferring causation from time series with perspectives in Earth system sciences – Jakob Runge

Link to the slides

The seminar is on December 4th at 14:00 (rescheduled from 10:00) and will be held remotely.

Link to the zoom session:

Jakob Runge's presentation is entitled:

« Inferring causation from time series with perspectives in Earth system sciences »


The heart of the scientific enterprise is a rational effort to understand the causes behind the phenomena we observe. In disciplines dealing with complex dynamical systems, such as the Earth system, replicated real experiments are rarely feasible. However, a rapidly increasing amount of observational and simulated data opens up the use of novel data-driven causal inference methods beyond the commonly adopted correlation techniques. In this talk, I will present a recent Perspective Paper in Nature Communications giving an overview of causal inference methods and identify key tasks and major challenges where causal methods have the potential to advance the state of the art in Earth system sciences. Several methods will be illustrated by 'success' examples where causal inference methods have already led to novel insights, and I will close with an outlook on this relatively new and exciting field. I will also present a causal inference benchmark platform that aims to assess the performance of causal inference methods and to help practitioners choose the right method for a particular problem.

Runge, J., S. Bathiany, E. Bollt, G. Camps-Valls, D. Coumou, E. Deyle, C. Glymour, M. Kretschmer, M. D. Mahecha, J. Muñoz-Marı́, E. H. van Nes, J. Peters, R. Quax, M. Reichstein, M. Scheffer, B. Schölkopf, P. Spirtes, G. Sugihara, J. Sun, K. Zhang, and J. Zscheischler (2019). Inferring causation from time series in earth system sciences. Nature Communications 10 (1), 2553.

Short bio:

Jakob Runge has headed the Climate Informatics working group at the German Aerospace Center's Institute of Data Science since 2017. The group combines innovative data science methods from different fields (graphical models, causal inference, nonlinear dynamics, deep learning) and works closely with experts in the climate sciences. Jakob studied physics at Humboldt University Berlin and obtained his PhD at the Potsdam Institute for Climate Impact Research in 2014. His studies were funded by the Studienstiftung (German Academic Scholarship Foundation), and his thesis was awarded the Carl Ramsauer prize by the Berlin Physical Society. In 2014 he won a $200,000 Fellowship Award in Studying Complex Systems from the James S. McDonnell Foundation and joined the Grantham Institute, Imperial College, from 2016 to 2017. He provides Tigramite, a time series analysis Python module for causal inference. For more details, see:

A direct approach to detection and attribution of climate change – Eniko Szekely – 24/01/2020

Link to the slides

The next seminar will take place on January 24th at 14:30 at the Pierre et Marie Curie campus of Sorbonne Université, in room 105 of LIP6, corridor 25-26, 1st floor.

Eniko Szekely's presentation is entitled:

« A direct approach to detection and attribution of climate change »


In this talk I will present a novel statistical learning approach for detection and attribution (D&A) of climate change. Traditional optimal D&A studies try to directly model the observations from model simulations, but in practice this is challenging due to the high dimensionality. Here, we propose a supervised approach where we predict a given metric or external forcing directly from the high-dimensional spatial pattern of climate variables, and use the predicted metric as a test statistic for D&A. The first part of the talk will focus on daily detection and show that we can now detect climate change from global weather for any single day since spring 2012. The second part of the talk will focus on attribution of climate change. For attribution, we want the prediction of the external forcing, e.g., anthropogenic forcing, to work well even under changes in the distribution of other external forcings, e.g., solar or volcanic forcings. Therefore, we formulate the optimization problem from a distributional robustness perspective, and use anchor regression to ensure good predictions even under such distributional changes.
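The anchor-regression step can be sketched in one dimension using the standard data transformation from the anchor regression literature (the climate application itself is far higher-dimensional, and the anchor variable and data below are synthetic). Each variable v is replaced by v − (1 − √γ)·P_A v, where P_A projects onto the anchor; ordinary least squares on the transformed data then solves the anchor objective. γ = 1 recovers plain OLS, while γ → 0 partials the anchor out entirely.

```python
import math

def group_mean_project(values, anchor):
    """P_A v for a categorical anchor: replace each entry by its group mean."""
    groups = {}
    for v, a in zip(values, anchor):
        groups.setdefault(a, []).append(v)
    means = {a: sum(vs) / len(vs) for a, vs in groups.items()}
    return [means[a] for a in anchor]

def anchor_transform(values, anchor, gamma):
    """v -> v - (1 - sqrt(gamma)) * P_A v."""
    p = group_mean_project(values, anchor)
    c = 1.0 - math.sqrt(gamma)
    return [v - c * pv for v, pv in zip(values, p)]

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

# Synthetic data: true slope 2, but the anchor (think: a forcing regime)
# shifts y in group 1 and confounds the plain OLS estimate.
anchor = [0, 0, 0, 1, 1, 1]
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.0 * xi + (5.0 if a == 1 else 0.0) for xi, a in zip(x, anchor)]

def anchor_slope(gamma):
    return ols_slope(anchor_transform(x, anchor, gamma),
                     anchor_transform(y, anchor, gamma))

slope_ols = anchor_slope(1.0)     # identical to plain OLS
slope_robust = anchor_slope(0.0)  # anchor fully partialled out
```

Intermediate γ values interpolate between the two estimators, trading in-distribution fit for robustness to shifts in the anchored forcings.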

Biographical notice:

Eniko is a senior data scientist at the Swiss Data Science Center, EPFL & ETH Zurich, working on machine learning for climate science. Previously, she was a postdoctoral researcher at the Courant Institute of Mathematical Sciences, New York University, and she obtained her PhD in Computer Science from the University of Geneva, Switzerland. Broadly she is interested in machine learning for high-dimensional data and nonlinear phenomena arising from dynamical systems. More recently she has been working on using machine learning and statistical learning approaches for climate science, and has been involved in the organization of the Climate Informatics workshop since 2015.