New ways for dynamical prediction of extreme heat waves: rare event simulations and machine learning with deep neural networks. – Freddy Bouchet (ENS Lyon)

The seminar is on October 19th at 14:00 (CEST), both in-person and remotely.

Place of the seminar: the « Campus Pierre & Marie Curie » of Sorbonne University. It will take place in the SCAI seminar room, building « Esclangon », 1st floor.

If you would like to attend online, here is the Zoom link: https://us02web.zoom.us/j/82605468661

Freddy Bouchet’s presentation is entitled:

«New ways for dynamical prediction of extreme heat waves: rare event simulations and machine learning with deep neural networks.»

Abstract:

In the climate system, extreme events or transitions between climate attractors are of primary importance for understanding the impact of climate change. Recent extreme heat waves with huge impacts are striking examples. However, such events are very hard to study with conventional approaches because of a lack of statistics: they are too rare to be well represented in historical data, and realistic models are too complex to be run long enough.

We address this lack of data using rare event simulations. Using some of the best climate models, we oversample extremely rare events and obtain several hundred more events than with usual climate runs, at a fixed numerical cost. Coupled with deep neural networks, this approach drastically improves the prediction of extreme heat waves.
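The selection-and-cloning idea behind rare event algorithms can be sketched in a few lines. The following is a toy illustration only, with an AR(1) process standing in for a climate model and all parameters invented; it is not the algorithm actually used by the speaker:

```python
import numpy as np

rng = np.random.default_rng(0)

N, T, m = 500, 40, 10    # trajectories, horizon, resampling period
k = 0.5                  # tilting strength (invented tuning parameter)

x = np.zeros(N)          # toy AR(1) dynamics standing in for a climate model
S = np.zeros(N)          # running sum of x; S / T is the "heat wave" index
S_last = np.zeros(N)
log_Z = 0.0              # accumulates the normalisation of each selection

for t in range(1, T + 1):
    x = 0.9 * x + rng.normal(size=N)
    S += x
    if t % m == 0:
        w = np.exp(k * (S - S_last))                # favour hot trajectories
        log_Z += np.log(w.mean())
        idx = rng.choice(N, size=N, p=w / w.sum())  # clone / kill
        x, S = x[idx], S[idx]
        S_last = S.copy()

a = 2.0                  # threshold defining the rare event S / T > a
p_hat = np.exp(log_Z) * np.mean(np.exp(-k * S) * (S / T > a))
```

Trajectories whose running index grows are cloned and the others are killed, so the rare event is heavily oversampled; the accumulated normalisation `log_Z` and the final reweighting factor correct the selection bias in the probability estimate.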

This sheds new light on the fluid mechanics processes that lead to extreme heat waves. We will describe quasi-stationary patterns of turbulent Rossby waves that lead to global teleconnection patterns in connection with heat waves, and analyze their dynamics. We stress the relevance of these patterns for recently observed extreme heat waves and the prediction potential of our approach.

Climate Modeling in the Age of Machine Learning – Laure Zanna (NYU)

The seminar is on June 23rd at 15:00 and will be held remotely, in English.

Link to the zoom session: https://us02web.zoom.us/j/85178591120

Laure Zanna’s presentation is entitled:

« Climate Modeling in the Age of Machine Learning »

Abstract:

Numerical simulations used for weather and climate predictions solve approximations of the governing laws of fluid motions on a grid. Ultimately, uncertainties in climate predictions originate from the poor or missing representation of processes, such as ocean turbulence and clouds, that are not resolved on the grid of global climate models. The representation of these unresolved processes has been a bottleneck in improving climate simulations and projections. The explosion of climate data and the power of machine learning algorithms are suddenly offering new opportunities: can we deepen our understanding of these unresolved processes and simultaneously improve their representation in climate models, in order to reduce the uncertainty of climate projections? In this talk, I will discuss the current state of climate modeling and its future, focusing on the advantages and challenges of using machine learning for climate projections. I will present some of our recent work in which we leverage tools from machine learning and deep learning to learn representations of unresolved ocean processes and improve climate simulations. Our work suggests that machine learning could open the door to discovering new physics from data and enhance climate predictions.
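As a minimal illustration of learning an unresolved-process representation from data, one can coarse-grain a synthetic field and regress the subgrid contribution from the resolved field. Everything here is invented for illustration (the speaker's work uses real simulation data and far richer models); a simple ridge regression stands in for the neural networks discussed in the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

def coarse_grain(u, r=4):
    """Block-average a fine-grid field onto a grid r times coarser."""
    return u.reshape(-1, r).mean(axis=1)

n_samples, n_fine, r = 2000, 64, 4
X_rows, y_rows = [], []
for _ in range(n_samples):
    u = rng.normal(size=n_fine)
    u = np.convolve(u, np.ones(5) / 5, mode="same")  # smooth synthetic field
    ubar = coarse_grain(u, r)
    # "subgrid flux": the part of the quadratic term the coarse grid misses
    flux = coarse_grain(u * u, r) - ubar * ubar
    X_rows.append(ubar)
    y_rows.append(flux)

X = np.hstack([np.array(X_rows), np.ones((n_samples, 1))])  # add intercept
y = np.array(y_rows)

# ridge regression as a minimal stand-in for a learned parameterization
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
pred = X @ W
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The design point is the same as in real subgrid parameterization: the input is only the resolved (coarse) field, and the target is the effect of the scales the grid cannot see.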

Short bio:

Laure Zanna is a Professor in Mathematics & Atmosphere/Ocean Science at the Courant Institute, New York University.  Her research focuses on the role of ocean dynamics in climate change. Prior to NYU, she was a faculty member at the University of Oxford until 2019 and obtained her PhD in 2009 in Climate Dynamics from Harvard University. She was the recipient of the 2020 Nicholas P. Fofonoff Award from the American Meteorological Society “For exceptional creativity in the development and application of new concepts in ocean and climate dynamics”. She is the lead principal investigator of M²LInES, an international effort supported by Schmidt Futures to improve climate models with scientific machine learning. 

Filling gaps in ocean satellite data – Aida Alvera-Azcárate & Alexander Barth

Link to the slides.

The seminar is on May 6th at 14:00 and will be held remotely, in English.

Link to the zoom session: https://zoom.us/j/93683521817

Aida Alvera-Azcárate’s presentation is entitled:

« Filling gaps in ocean satellite data »

Abstract:

Satellite data offer an unequalled amount of information about the Earth’s surface, including the ocean. However, data measured in the visible and infrared wavebands are affected by the presence of clouds and therefore contain a large amount of missing data (on average, clouds cover about 75% of the Earth). The spatial and temporal scales of variability in the ocean require techniques able to handle undersampling of the dominant scales of variability. The GHER (GeoHydrodynamics and Environment Research) group of the University of Liege in Belgium has been working over the last two decades on interpolation techniques for satellite and in situ ocean data. In this talk we will focus on techniques developed for satellite data. We will start with DINEOF (Data Interpolating Empirical Orthogonal Functions), a data-driven technique using EOFs to infer missing information in satellite datasets. We will follow with a more recent development, DINCAE (Data Interpolating Convolutional AutoEncoder). Training a neural network with incomplete data is problematic; DINCAE overcomes this by using the satellite data and its expected error variance as input. The autoencoder provides the reconstructed field along with its expected error variance as output. We will provide examples of reconstructed satellite data for several variables, such as sea surface temperature and chlorophyll concentration, and some recent developments with DINCAE to grid altimetry data into complete fields.
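The basic DINEOF mechanism (fill the gaps, truncate an SVD, re-fill the gaps from the reconstruction, repeat) is simple to sketch. The following is an illustrative toy version on synthetic low-rank data, not the GHER implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic "satellite" data: a rank-2 space-time field plus noise
times = np.linspace(0, 4 * np.pi, 80)
space = rng.normal(size=(2, 60))
truth = np.outer(np.sin(times), space[0]) + np.outer(np.cos(times), space[1])
data = truth + 0.05 * rng.normal(size=truth.shape)

mask = rng.random(data.shape) < 0.6        # 60% "cloud cover": missing pixels
obs = np.where(mask, np.nan, data)

def dineof_like(M, rank=2, n_iter=50):
    """Iterative truncated-SVD gap filling, the core DINEOF idea."""
    missing = np.isnan(M)
    filled = np.where(missing, np.nanmean(M), M)   # initial guess: global mean
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        recon = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        filled[missing] = recon[missing]           # only the gaps are updated
    return filled

recon = dineof_like(obs)
rmse = np.sqrt(np.mean((recon[mask] - truth[mask]) ** 2))
```

Because the field is dominated by a few EOFs, the truncated reconstruction recovers the cloud-covered pixels accurately even with more than half of the data missing (the real method also selects the optimal rank by cross-validation).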

Short bios:

Aida Alvera-Azcárate is a researcher at the GHER (GeoHydrodynamics and Environment Research) group of the University of Liege in Belgium. She did a PhD in Science at the University of Liege and a post-doc at the University of South Florida (US) before joining the GHER in 2007, where she studies the ocean using satellite and in situ data and works on the development of interpolation techniques to reconstruct satellite data.

Alexander Barth is a researcher working at the University of Liege (Belgium) in the GHER group (GeoHydrodynamics and Environment Research). He did a PhD on nested numerical ocean models and data assimilation. Currently he is working on variational analysis schemes for climatologies and neural networks to reconstruct missing data.

Narrowing uncertainties of climate projections using data science tools? – Pierre Tandeo

The seminar is on March 26th at 10:00 and will be held remotely, in English.

The slides can be found here. We are currently uploading the video of the talk and will add a link to it as soon as it is available.

Link to the zoom session: https://zoom.us/j/94170014183

Pierre Tandeo’s presentation is entitled:

« Narrowing uncertainties of climate projections using data science tools? »

Abstract:

Climate indices show large variability in CMIP climate predictions. In this presentation, we propose to weight multi-model climate simulations to reduce the uncertainty in climate predictions and better estimate the future evolution of climate indices. The proposed methodology is based on advanced data science tools (i.e., data assimilation, analog forecasting, model evidence metrics) to accurately compute distances between current observations and simulated climate indices. This low-cost procedure is tested on a simplified climate model. The results show that the method can be applied locally and is able to identify relevant parameterizations.
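Analog forecasting, one of the data science tools mentioned above, can be illustrated on a toy scalar index. All settings (the dynamics, the number of analogs) are invented for illustration and this is not the speaker's code:

```python
import numpy as np

rng = np.random.default_rng(3)

# a toy scalar climate index with persistence and a slow forcing
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.95 * x[t - 1] + 0.3 * np.sin(0.05 * t) + 0.1 * rng.normal()

def analog_forecast(series, state, k=10):
    """Predict the next value as the mean successor of the k past states
    closest to the current one (the basic analog-forecasting step)."""
    past, successors = series[:-1], series[1:]
    idx = np.argsort(np.abs(past - state))[:k]   # the k best analogs
    return successors[idx].mean()

train, test = x[:1500], x[1500:]
preds = np.array([analog_forecast(train, s) for s in test[:-1]])
err_analog = np.mean((preds - test[1:]) ** 2)
err_persist = np.mean((test[:-1] - test[1:]) ** 2)  # naive baseline
```

The appeal of the method is that it emulates the dynamics directly from a catalog of past states, with no explicit model, which is why it combines naturally with data assimilation and model evidence computations.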

Short bio:

Pierre Tandeo is an associate professor at IMT Atlantique (Brest, France) and an associate researcher at the Data Assimilation Research Team, RIKEN Center for Computational Science (Kobe, Japan). More information: https://tandeo.wordpress.com/.

Machine learning and natural hazards – Sophie Giffard-Roisin

Link for the slides

The seminar is on February 10th at 14:00 and will be held remotely.

Link to the zoom session: https://us02web.zoom.us/j/88657656183

Sophie Giffard-Roisin’s presentation is entitled:

« Machine learning and natural hazards »

Abstract:

The goal of this talk is to show how we can use the strength of artificial intelligence to help make diagnoses and find concrete, local solutions to natural hazards. Tropical cyclones, avalanches, earthquakes and landslides often affect vulnerable areas and populations, where a better understanding of the phenomena and better risk assessment and prediction can make a substantial impact. The data available to monitor these natural phenomena have increased considerably in recent years. For example, SAR (synthetic aperture radar) imaging data, provided by the Sentinel-1 satellites, are now freely available as often as every 6 days in a majority of regions, even remote areas. Yet artificial intelligence (AI) and machine learning (ML) have so far only scarcely been used in these domains, even though these techniques have already shown their impact in many scientific fields with similar data structures (large volumes of data, presence of noise, complex physical phenomena), such as medical imaging (detection/segmentation of pathologies), crop yield (prediction), and security (recognition). We will see in this talk, with concrete examples, how to design machine learning models for specific tasks with real imaging or temporal data inputs. Concretely, starting mainly from convolutional neural networks, what are the key aspects to consider and what are the pitfalls to avoid?
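As a toy illustration of the convolutional building block underlying these models (the image, the kernel, and all sizes are invented; in a real CNN the filters are learned rather than hand-set):

```python
import numpy as np

rng = np.random.default_rng(4)

def conv2d(img, kernel):
    """Plain 'valid' 2D cross-correlation, the operation deep learning
    frameworks call convolution."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# synthetic "SAR-like" image: noisy background plus a bright linear feature
img = 0.1 * rng.random((32, 32))
img[15, :] = 1.0

sobel_h = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)
edges = np.abs(conv2d(img, sobel_h))
row_response = edges.mean(axis=1)   # should peak around the bright line
```

A single hand-chosen filter already localizes the linear feature despite the noise; a CNN stacks many such filters and learns their weights from labeled examples.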

Short bio:
Sophie Giffard-Roisin is a researcher hired by IRD (French National Institute for Sustainable Development) and based at ISTerre, Grenoble (UGA, France). Her work focuses on machine learning applications for natural hazards, especially using remote sensing and time series data. She did her PhD at Inria, Nice (France) under the supervision of Nicholas Ayache, on machine learning and modelling for medical image analysis. She then did a post-doc at CU Boulder, Colorado (USA) in Claire Monteleoni’s team, where she worked on climate and meteorological applications of machine learning. She moved to ISTerre, the Earth science laboratory in Grenoble (UGA, France), for a permanent position in 2019, where she now focuses on machine learning for natural hazards in geosciences.

Power-efficient deep learning algorithms – Sébastien Loustau

Link for the slides

The next seminar is on October 14th at 14:30 at the « Campus Pierre & Marie Curie » of Sorbonne University. It will take place in the SCAI seminar room, building « Esclangon », 1st floor.

If you would like to attend this seminar in person:

Sébastien will present his work in the SCAI seminar room (access map: https://ai4climate.lip6.fr/wp-content/uploads/2020/09/plan_SCAI_extrait.pdf)
Please register via this link: https://docs.google.com/forms/d/e/1FAIpQLSc4scBTJZnOquz2FZkQbPKAKEvacQ0BC52WKs52CzTD6amCAw/viewform?usp=sf_link
We nevertheless advise you to bring your laptop so that you can also be connected to the Zoom room (see below).


If you would like to attend remotely:

Here is the Zoom link: https://us02web.zoom.us/j/81893439500
You will also be able to ask questions in the chat, and they will be relayed to the room.

Sébastien Loustau’s presentation is entitled:

« Power-efficient deep learning algorithms »

Abstract:
In this talk, I will present both theoretical and practical aspects of designing power-efficient deep learning algorithms. After a non-exhaustive survey of different contributions from the machine learning perspective (training low bit-width networks), the hardware counterpart (CNN accelerators), and the relationship with AutoML and NAS procedures, I will present a theoretically grounded approach that adds the power-efficiency constraint to the optimization procedure used to train deep nets. This work in progress bridges optimal transport and information theory with online learning.
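A toy example of the low bit-width idea: uniformly quantizing already-trained weights to a few levels and measuring the induced error. This is illustrative only; the talk concerns the harder problem of training under such constraints:

```python
import numpy as np

rng = np.random.default_rng(5)

def quantize(w, bits):
    """Uniformly quantize an array to 2**bits levels over its range."""
    lo, hi = w.min(), w.max()
    levels = 2 ** bits - 1
    q = np.round((w - lo) / (hi - lo) * levels)
    return lo + q * (hi - lo) / levels

w = rng.normal(size=1000)          # stand-in for trained network weights
w2, w8 = quantize(w, 2), quantize(w, 8)
err2 = np.mean((w - w2) ** 2)      # 4 levels: coarse, large error
err8 = np.mean((w - w8) ** 2)      # 256 levels: nearly lossless
```

The trade-off this exposes (fewer bits means cheaper arithmetic and memory but larger distortion) is exactly what a power-aware training objective has to balance.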

Short bio:
Sébastien is a researcher in mathematical statistics and machine learning. He has studied the theoretical aspects of both statistical and online learning. His research interests include online learning, unsupervised learning, adaptive algorithms and minimax theory. He also founded LumenAI 5 years ago.

Inferring causation from time series with perspectives in Earth system sciences – Jakob Runge

Link to the slides

The seminar is on December 4th at 14:00 and will be held remotely.

Link to the zoom session: https://us02web.zoom.us/j/84003686532

Jakob Runge’s presentation is entitled:

« Inferring causation from time series with perspectives in Earth system sciences »

Abstract:

The heart of the scientific enterprise is a rational effort to understand the causes behind the phenomena we observe. In disciplines dealing with complex dynamical systems, such as the Earth system, replicated real experiments are rarely feasible. However, a rapidly increasing amount of observational and simulated data opens up the use of novel data-driven causal inference methods beyond the commonly adopted correlation techniques. In this talk, I will present a recent Perspective Paper in Nature Communications giving an overview of causal inference methods and identifying key tasks and major challenges where causal methods have the potential to advance the state of the art in Earth system sciences. Several methods will be illustrated by ‘success’ examples where causal inference methods have already led to novel insights, and I will close with an outlook on this relatively new and exciting field. I will also present the causal inference benchmark platform www.causeme.net, which aims to assess the performance of causal inference methods and to help practitioners choose the right method for a particular problem.

Runge, J., S. Bathiany, E. Bollt, G. Camps-Valls, D. Coumou, E. Deyle, C. Glymour, M. Kretschmer, M. D. Mahecha, J. Muñoz-Marı́, E. H. van Nes, J. Peters, R. Quax, M. Reichstein, M. Scheffer, B. Schölkopf, P. Spirtes, G. Sugihara, J. Sun, K. Zhang, and J. Zscheischler (2019). Inferring causation from time series in earth system sciences. Nature Communications 10 (1), 2553.
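A minimal example of the conditional-independence reasoning these methods build on (this is a generic partial-correlation test on invented data, not Tigramite's PCMCI algorithm): two variables driven by a common cause look strongly linked until the common driver is conditioned on.

```python
import numpy as np

rng = np.random.default_rng(6)

# toy system: W drives both X and Y, but X does not cause Y
n = 5000
W = rng.normal(size=n)
X = 0.8 * W + 0.2 * rng.normal(size=n)
Y = 0.8 * W + 0.2 * rng.normal(size=n)

def partial_corr(a, b, c):
    """Correlation of a and b after regressing out c: a basic
    conditional-independence test."""
    beta_a = np.dot(a, c) / np.dot(c, c)
    beta_b = np.dot(b, c) / np.dot(c, c)
    ra, rb = a - beta_a * c, b - beta_b * c
    return np.corrcoef(ra, rb)[0, 1]

naive = np.corrcoef(X, Y)[0, 1]          # large: spurious link via W
conditioned = partial_corr(X, Y, W)      # near zero: the link disappears
```

Causal discovery algorithms for time series apply tests of this kind systematically over lagged variables to remove spurious links and orient the remaining ones.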

Short bio:

Jakob Runge has headed the Climate Informatics working group at the German Aerospace Center’s Institute of Data Science since 2017. The group combines innovative data science methods from different fields (graphical models, causal inference, nonlinear dynamics, deep learning) and works closely with experts in the climate sciences. Jakob studied physics at Humboldt University Berlin and obtained his PhD at the Potsdam Institute for Climate Impact Research in 2014. His studies were funded by the German Academic Scholarship Foundation (Studienstiftung), and his thesis was awarded the Carl Ramsauer Prize by the Berlin Physical Society. In 2014 he won a $200,000 Fellowship Award in Studying Complex Systems from the James S. McDonnell Foundation, and he was at the Grantham Institute, Imperial College, from 2016 to 2017. On https://github.com/jakobrunge/tigramite.git he provides Tigramite, a Python time series analysis module for causal inference. For more details, see: www.climateinformaticslab.com

A direct approach to detection and attribution of climate change – Eniko Szekely – 24/01/2020

Link to the slides.

The next seminar will take place on January 24th at 14:30 at the Pierre & Marie Curie campus of Sorbonne University, in room 105 of LIP6, corridor 25-26, 1st floor.

Eniko Szekely’s presentation is entitled:

« A direct approach to detection and attribution of climate change »

Abstract:

In this talk I will present a novel statistical learning approach for detection and attribution (D&A) of climate change. Traditional optimal D&A studies try to directly model the observations from model simulations, but in practice this is challenging due to high dimensionality. Here, we propose a supervised approach in which we predict a given metric or external forcing directly from the high-dimensional spatial pattern of climate variables, and use the predicted metric as a test statistic for D&A. The first part of the talk will focus on daily detection and show that we can now detect climate change from global weather for any single day since spring 2012. The second part of the talk will focus on attribution of climate change. For attribution, we want the prediction of the external forcing, e.g., anthropogenic forcing, to work well even under changes in the distribution of other external forcings, e.g., solar or volcanic forcings. We therefore formulate the optimization problem from a distributional robustness perspective and use anchor regression to ensure good predictions even under such distributional changes.
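Anchor regression has a convenient closed form: ordinary least squares on data transformed with the projection onto the anchor variables. A toy sketch, with the data-generating process and the γ values invented for illustration (not the study's actual setup):

```python
import numpy as np

rng = np.random.default_rng(7)

# invented data: the anchor A (e.g. an external forcing proxy) shifts both
# the predictor X and the target Y
n = 500
A = rng.normal(size=(n, 1))
X = 1.5 * A + rng.normal(size=(n, 1))
Y = 2.0 * X[:, 0] + 1.0 * A[:, 0] + 0.5 * rng.normal(size=n)

def anchor_regression(X, Y, A, gamma):
    """OLS on data transformed by I + (sqrt(gamma) - 1) * P_A, where P_A
    projects onto the span of the anchor variables."""
    P = A @ np.linalg.pinv(A)
    s = np.sqrt(gamma) - 1.0
    return np.linalg.lstsq(X + s * (P @ X), Y + s * (P @ Y), rcond=None)[0]

b_ols = anchor_regression(X, Y, A, gamma=1.0)    # gamma = 1 is plain OLS
b_rob = anchor_regression(X, Y, A, gamma=25.0)   # robust to anchor shifts
b_check = np.linalg.lstsq(X, Y, rcond=None)[0]   # direct OLS for comparison
```

Increasing γ penalizes the part of the residual that is explained by the anchors, which is what buys robustness to distributional shifts in those forcings.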

Short bio:

Eniko is a senior data scientist at the Swiss Data Science Center, EPFL & ETH Zurich, working on machine learning for climate science. Previously, she was a postdoctoral researcher at the Courant Institute of Mathematical Sciences, New York University, and she obtained her PhD in Computer Science from the University of Geneva, Switzerland. Broadly she is interested in machine learning for high-dimensional data and nonlinear phenomena arising from dynamical systems. More recently she has been working on using machine learning and statistical learning approaches for climate science, and has been involved in the organization of the Climate Informatics workshop since 2015.

Deep Learning for Satellite Imagery: Semantic Segmentation, Non-Rigid Alignment, and Self-Denoising – Guillaume Charpiat – 4 Decembre 2019

When: December 4th, 2019 at 10:30

Where: Campus Pierre and Marie Curie (Sorbonne Université), room 105 of LIP6, corridor 25-26, 1st floor.

Abstract:
Neural networks have been producing impressive results in computer vision in recent years, in image classification and segmentation in particular. To be transferred to remote sensing, this tool needs to be adapted to the field’s specifics: large images, many small objects per image, the need for high-resolution output, and unreliable ground truth (usually mis-registered). We will review the work done in our group on semantic segmentation for remote sensing, explaining the evolution of our neural network architecture design to face these challenges, and finally training a network to register binary cadaster maps to RGB images while detecting any new buildings, in a multi-scale approach. We will show in particular that it is possible to train on noisy datasets and to make predictions with an accuracy much better than the variance of the original noise. To explain this phenomenon, we build theoretical tools to express input similarity from the neural network’s point of view, and use them to quantify data redundancy and the associated expected denoising effects.
If time permits, we might also present work on hurricane track forecast from reanalysis data (2-3D coverage of the Earth’s surface with temperature/pressure/etc. fields) using deep learning.

Short bio:

After a PhD thesis at ENS on shape statistics for image segmentation, and a year in Bernhard Schölkopf’s team at MPI Tübingen on kernel methods for medical imaging, Guillaume Charpiat joined INRIA Sophia-Antipolis to work on computer vision, and later INRIA Saclay to work on machine learning. Lately, he has been focusing on deep learning, with in particular remote sensing imagery as an application field.

Affiliation:
Guillaume Charpiat (Équipe TAU, INRIA Saclay / LRI – Université Paris-Sud)

Ensemble forecasting by sequential learning in meteorology, and meta-modeling in urban pollution – Vivien Mallet – September 20th, 2019

When: September 20th, 2019 at 14:00

Where: Campus Pierre and Marie Curie (Sorbonne Université), room 105 of LIP6, corridor 25-26, 1st floor.

Abstract:
The seminar will illustrate some contributions of machine learning to complex environmental applications.
The first part will concern ensemble forecasting. One objective is to aggregate an ensemble of forecasts into a single forecast that is better than every forecast in the ensemble. A more ambitious approach consists in forecasting a probability distribution, so as to retain a measure of the forecast uncertainty. We will see that it is possible to forecast a distribution that performs better than any empirical distribution formed by a constant weighting of the ensemble forecasts. This work will be illustrated with forecasts of solar radiation and of EDF’s photovoltaic production.
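Sequential aggregation of an ensemble is often done with exponentially weighted averages. A toy sketch, with the forecasters, noise levels, and learning rate all invented for illustration (not the speaker's operational system):

```python
import numpy as np

rng = np.random.default_rng(8)

# five "ensemble members" forecast a signal; member 0 happens to be the best
T, K = 500, 5
truth = np.sin(np.linspace(0, 20, T))
noise = np.array([0.1, 0.5, 0.5, 0.8, 1.0])        # invented accuracies
forecasts = truth[:, None] + rng.normal(size=(T, K)) * noise

eta = 2.0                         # learning rate (invented tuning parameter)
w = np.ones(K) / K                # start from uniform weights
agg = np.zeros(T)
cum_loss = np.zeros(K)
for t in range(T):
    agg[t] = forecasts[t] @ w                       # aggregated forecast
    cum_loss += (forecasts[t] - truth[t]) ** 2
    w = np.exp(-eta * (cum_loss - cum_loss.min()))  # exponential weighting
    w /= w.sum()

mse_agg = np.mean((agg - truth) ** 2)
mse_members = np.mean((forecasts - truth[:, None]) ** 2, axis=0)
```

The weights are updated online from past losses only, so the aggregate provably tracks the best member (and in favorable cases beats it) without knowing in advance which member that is.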
The second part will concern the replacement of a complex and computationally expensive environmental model by an extremely fast meta-model that nevertheless remains sufficiently faithful to the full model. We will see how a high-dimensional nonlinear model can be replaced by (1) applying dimension reduction to its inputs and outputs, and (2) learning the model’s behavior through adapted sampling. Observational data (from monitoring stations) can also be blended in to improve the meta-model’s forecasts. The approach will be illustrated with simulations of air pollution and noise pollution in urban areas, at street-level resolution.

Short bio:
Vivien Mallet is a researcher at the INRIA center in Paris. He works on data assimilation (coupling modeling with observations) and uncertainty quantification for environmental problems.