Technical Tutorials

Important notice: for practical reasons, only 20 participants will be allowed per session.

JURSE 2019 will feature 4 technical tutorials that aim to provide hands-on training to attendees: Deep Learning, Google Earth Engine, MicMac and OTB.

These tutorials will be held on Tuesday 21 May and scheduled as parallel half-day sessions. Tutorials will be given in the morning and/or afternoon depending on the number of attendees. Each registered participant should have received a confirmation email on May 14, 2019.

Deep Learning for Remote Sensing (morning and afternoon)

Teachers: Loic Landrieu (IGN, French Mapping Agency) and Bertrand Le Saux (ONERA, French Aerospace Agency)

This tutorial presents an overview of current deep learning approaches for remote sensing.

The first part focuses on 2D techniques for information extraction and classification of Earth observation data. We present neural network models for processing data from various sensors (including hyperspectral and SAR) and for tackling common applications such as multi-modal analysis and change detection.

The second part presents an overview of recent developments in neural network architectures for the semantic segmentation of 3D point clouds. We start by presenting image-inspired approaches, such as the multi-view projection strategy, tangent convolutions, and 3D voxel grid methods. We then present networks specifically designed for handling unordered 3D points, such as PointNet. Finally, we present some strategies for scaling segmentation based on recurrent neural networks, such as the SuperPoint Graph approach.
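For illustration only (this is not the tutorial material), a minimal PyTorch sketch of the PointNet idea, a shared per-point MLP followed by a symmetric max-pooling that makes the network invariant to point ordering, could look as follows; layer sizes and the class count are arbitrary assumptions:

    import torch
    import torch.nn as nn

    class TinyPointNet(nn.Module):
        # Shared per-point MLP + symmetric max-pooling: the core PointNet idea
        # for classifying unordered point sets.
        def __init__(self, num_classes=8):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(3, 64), nn.ReLU(),
                nn.Linear(64, 128), nn.ReLU(),
            )
            self.head = nn.Linear(128, num_classes)

        def forward(self, points):            # points: (batch, n_points, 3)
            feats = self.mlp(points)           # per-point features
            pooled, _ = feats.max(dim=1)       # order-independent pooling over points
            return self.head(pooled)           # one score vector per cloud

    x = torch.rand(4, 1024, 3)                 # 4 random clouds of 1024 XYZ points
    logits = TinyPointNet()(x)                 # -> shape (4, 8)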

The lectures will be complemented by practical examples.

A hands-on introduction to Google Earth Engine (morning only)

Teacher: Noel Gorelick (Google)

This session is a hands-on workshop that takes participants through the basics of getting up and running with Earth Engine. Participants will learn how to perform tasks such as image visualization, time-series compositing and the computation of areal statistics using the Earth Engine API and interactive development environment.
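As a flavour of what such exercises look like, here is a minimal, illustrative sketch using the Earth Engine Python API (the area of interest, dates and cloud threshold are arbitrary assumptions, not the workshop dataset):

    import ee
    ee.Initialize()

    # Arbitrary small area of interest (lon/lat rectangle), purely illustrative.
    aoi = ee.Geometry.Rectangle([-1.75, 48.05, -1.60, 48.15])

    # Time-series compositing: yearly median of reasonably cloud-free Sentinel-2 scenes.
    s2 = (ee.ImageCollection('COPERNICUS/S2')
            .filterBounds(aoi)
            .filterDate('2018-01-01', '2018-12-31')
            .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20)))
    composite = s2.median()

    # Areal statistics: mean value of each band over the area of interest.
    stats = composite.reduceRegion(reducer=ee.Reducer.mean(), geometry=aoi, scale=10)
    print(stats.getInfo())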

Image-based 3D reconstruction in MicMac (afternoon only)

Teacher: Ewelina Rupnik (IGN, French Mapping Agency)

This course is an introduction to image-based 3D reconstruction with the free, open-source software MicMac. During the session, two different datasets will be processed: a simple UAV acquisition and a satellite triplet. To perform 3D reconstruction from images, one must know the camera model (the interior orientation) as well as its position and orientation (the exterior orientation) at the time of image acquisition. For satellite imagery, approximate orientation parameters are always known a priori, so during processing these parameters are only refined. In the UAV case, the orientation parameters must be deduced from the images and from observations of homologous (tie) points, which indirectly encode the connectivity between images. If no external data is available, the reconstruction is purely relative. By including external data, such as ground control points or camera perspective-center positions collected by a GNSS receiver, one can move from a relative to an absolute coordinate system. Finally, once the orientation parameters are retrieved, one can generate a digital surface model and an orthophotograph of the 3D scene. This is known as the photogrammetric processing chain, and we aim to carry it out within this tutorial.
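As a rough, illustrative outline (not the exact commands used in the tutorial), the UAV part of this chain maps onto a short sequence of MicMac (mm3d) calls; the image pattern, calibration model and output names below are assumptions:

    import subprocess

    def mm3d(*args):
        # Thin wrapper around the MicMac command-line tool mm3d.
        subprocess.run(['mm3d', *args], check=True)

    mm3d('Tapioca', 'MulScale', '.*JPG', '500', '1500')  # tie (homologous) point extraction
    mm3d('Tapas', 'RadialStd', '.*JPG', 'Out=Rel')       # relative orientation (bundle adjustment)
    mm3d('Malt', 'Ortho', '.*JPG', 'Rel')                # dense matching: digital surface model
    mm3d('Tawny', 'Ortho-MEC-Malt/')                     # mosaicking of the orthophotograph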

Important notice: participants are expected to be familiar with the Ubuntu operating system as well as with the principles of photogrammetry / computer vision. Participants will be provided with a VirtualBox image with MicMac (plus dependencies) and CloudCompare or Meshlab pre-installed; the image will also contain the datasets. Alternatively, participants may bring their own PC with MicMac pre-installed (note, however, that no compilation help will be provided during the tutorial due to time constraints).

OTB guided tour (afternoon only)

Teachers: Yannick Tanguy and David Youssefi (CNES, French Space Agency)

Orfeo ToolBox (OTB) is an open-source library for remote sensing image processing. It was initiated and is funded by CNES to promote the use and exploitation of satellite images. Orfeo ToolBox aims to enable state-of-the-art processing of large images even on laptops with limited resources. It ships with a set of extensible, ready-to-use tools for classical remote sensing tasks, as well as a fully integrated, end-user-oriented application called Monteverdi; OTB is also accessible through the QGIS processing module.

This tutorial will present the Orfeo ToolBox and showcase available applications for processing and manipulating satellite imagery.

As a first step, we will learn how to manipulate OTB applications through their different interfaces.
Then, we will follow a guided tour of two main processing frameworks (segmentation and classification): we will use data from different sensors (Pléiades or Spot 6/7 together with a Sentinel-2 time series) to produce a value-added classification map of the region of Rennes (Brittany).
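As a small, illustrative taste of one of these interfaces, the OTB Python bindings can drive the same applications as the command line; the sketch below runs a mean-shift segmentation (file names are placeholders, not the tutorial data):

    import otbApplication

    # Mean-shift segmentation of a high-resolution scene, written out as vectors.
    app = otbApplication.Registry.CreateApplication("Segmentation")
    app.SetParameterString("in", "pleiades_extract.tif")      # any GDAL-readable raster
    app.SetParameterString("filter", "meanshift")
    app.SetParameterString("mode", "vector")
    app.SetParameterString("mode.vector.out", "segments.shp")
    app.ExecuteAndWriteOutput()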