
Missing Maps Hackathon: Detection of vegetation loss

This notebook demonstrates the work of Jan Benetka and Jakub Matějka carried out during the first Missing Maps & MSF Hackathon organized in the Czech Republic. The goal of the challenge was to identify rural locations with potential population growth.

Population growth in an area is almost always accompanied by side effects such as an increased number of houses and the consequent emergence of paths among them, the loss of trees in nearby forests (used for heating or cooking), or the degradation of the natural environment due to industrial growth. A common denominator of all these effects is a decrease in vegetation mass. Our aim, therefore, is to provide simple yet effective methods to detect the loss of greenery and analyze its rate.

Notebook preview

Data

After a brief search for available datasets, we settled on images from the Copernicus Sentinel-2 mission. It provides multi-spectral images of the Earth that are publicly available. Moreover, the satellites revisit the same area with high frequency (~5-10 days), which increases our chances of collecting cloud-free snapshots. The pictures come in a spatial resolution that varies from 10 m/pixel for some spectral bands (RGB, NIR), through 20 m/pixel (e.g., Red Edge), up to 60 m/pixel (see the complete list). That is not quite enough for reliable recognition of people or rooftops; nevertheless, it is fine for spotting variations in vegetation levels.

One tool that certainly deserves a mention is the EO Browser by SINERGISE. Not only can you download the satellite images there, you can also visualize and choose any of the available spectral bands. On top of that, the browser can create a timelapse from imagery of a given region, and you can decide what percentage of cloud cover you are willing to tolerate in the pictures.


Method(s)

TL;DR

Simply put, we take two satellite images of the same area captured at two different moments and look for differences in vegetation. We then highlight these differences for each consecutive pair of snapshots and measure their rate.

Long version

1) Vegetation identification

Trying to distinguish vegetation based on color in the visible spectrum might work for some areas (lush grasslands), but not for others (autumn tundra). A much better approach to vegetation remote sensing builds on the fact that plants, in order to produce sugar via photosynthesis, absorb solar radiation at certain wavelengths (see figure below) while reflecting other ranges so as not to overheat. Sentinel-2 is well-equipped to capture images in the red and near-infrared spectrum, and those are exactly the wavelength ranges needed to calculate the NDVI (Normalized Difference Vegetation Index). High NDVI values represent rich vegetation; low values suggest a lack of greenery. The index is computed as $NDVI=\frac{NIR - RED}{NIR + RED}$, where NIR is the near-infrared band and RED is the red (visible) band.

Photosynthesis and wavelength ranges.
Ranges of solar radiation wavelengths at which photosynthesis happens. Refer to the Wikipedia article about NDVI for more information.
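
As a minimal sketch of the index computation (assuming the red and near-infrared bands, e.g. Sentinel-2 B04 and B08 resampled to the same resolution, have already been loaded as NumPy arrays):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel Normalized Difference Vegetation Index.

    `nir` and `red` are 2-D reflectance arrays of the same shape.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Compute (NIR - RED) / (NIR + RED), leaving 0 where the denominator is 0
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out
```

The result ranges from -1 to 1; values close to 1 indicate dense vegetation.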

Below, you can see a satellite image of Manhattan in NDVI mode as well as in true color (see original): NDVI mask

2) Vegetation change tracking

We saw that in the "NDVI world," bright equals little vegetation and dark means a lot. The next step, naturally, is to compare two temporally different snapshots of the same area and calculate the extent of vegetation loss. We simply compute per-pixel differences between two images $I_1$ and $I_2$:

$I_d(x,y) = |I_1(x,y) - I_2(x,y)|$,

after first converting them to grayscale and normalizing their contrast/brightness:

$I_{2,\mathrm{norm}}(x,y) = \frac{\sigma_1}{\sigma_2}\left(I_2(x,y) - \mu_2\right) + \mu_1$,

where $\mu$ is the mean brightness and $\sigma$ the standard deviation (contrast) of the respective image. Find the method described at the sentinel-hub blog.
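
A rough sketch of that normalization and the per-pixel difference, assuming both grayscale NDVI snapshots are NumPy arrays of the same shape:

```python
import numpy as np

def match_brightness_contrast(img: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Scale and shift `img` so its mean and standard deviation match `reference`."""
    mu_ref, sigma_ref = reference.mean(), reference.std()
    mu_img, sigma_img = img.mean(), img.std()
    return (sigma_ref / sigma_img) * (img - mu_img) + mu_ref

def vegetation_difference(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Absolute per-pixel difference after normalizing the second snapshot to the first."""
    img1 = img1.astype(np.float64)
    img2_norm = match_brightness_contrast(img2.astype(np.float64), img1)
    return np.abs(img1 - img2_norm)
```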

An example of a (quick) deforestation process captured in two NDVI snapshots converted to grayscale looks like this (see original): Deforestation in BW

Image pairs like that serve as our input data for expressing the differences (marked below in red): Loss of vegetation
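
One way to highlight and quantify the change is to threshold the difference image, paint the flagged pixels red, and report their share of the image. The cut-off value below is a hypothetical illustration, not a value taken from the original notebook:

```python
import numpy as np

def highlight_loss(diff: np.ndarray, gray: np.ndarray, threshold: float = 0.2):
    """Mark changed pixels in red on a grayscale snapshot and return the change rate.

    `diff` is the per-pixel difference and `gray` the grayscale snapshot,
    both scaled to the [0, 1] range; `threshold` is an assumed cut-off.
    """
    mask = diff > threshold
    rgb = np.dstack([gray, gray, gray]).astype(np.float64)
    rgb[mask] = [1.0, 0.0, 0.0]   # paint detected change in red
    rate = float(mask.mean())     # fraction of pixels flagged as changed
    return rgb, rate
```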


Resources

There are tons of resources available on the topic of deforestation. We found the following especially useful:

Data

Tutorials & Blog posts

Image sources

Tools
