Beautiful Map Backgrounds - Slope Hillshades

In the field of remote sensing you often have to present classification results as maps and need a nice, unobtrusive background that gives the viewer an idea where everything is located without being as distracting as an RGB satellite image. A good choice for this is a grey hillshade or slope relief background that adds texture information to your map by using a Digital Elevation Model.
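As a rough sketch of the idea: a slope layer can be derived from a DEM with simple finite differences. The `dem` array and the 30 m cell size below are placeholder assumptions; in practice a tool like GDAL's gdaldem does this properly with map projections handled.

```python
import numpy as np

def slope_degrees(dem, cellsize=30.0):
    """Approximate slope in degrees from a DEM via central differences."""
    # np.gradient returns the per-cell rate of elevation change along each axis
    dy, dx = np.gradient(dem, cellsize)
    # slope angle from the magnitude of the gradient vector
    return np.degrees(np.arctan(np.hypot(dx, dy)))
```

Rendered in greyscale (flat = light, steep = dark), this gives exactly the unobtrusive relief texture described above.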

Read More

Prototyping time-series analysis with Google Earth Engine

Sometimes you are struck by an idea for analyzing satellite images, but how fast can you get from a remote sensing idea to a prototype of the result? The sheer size and amount of openly available satellite images make developing prototypes rather cumbersome. To get from an idea to a prototype product you need to download gigabytes of data, pre-process it, write the code that calculates your prototype product and save it somewhere. Oftentimes you’ll end up spending more time on downloading and pre-processing than on the actual prototyping and calculation. More often than not I discarded ideas because I had no time to test them. Today I’ll show an example of how Google’s Earth Engine can be used to put the rapid back into rapid prototyping for remote sensing.

Read More

Download Copernicus Sentinel-2 images

The Sentinel satellites are an amazing opportunity for scientists all over the world to explore unprecedented amounts of remote sensing data free of charge. I am genuinely happy that this is one of the first large remote sensing missions that has Open Data and Open Access baked in right from the start. All Sentinel data can be accessed through Copernicus SciHub. This post is an update to the Sentinel-1 tutorial and shows a typical workflow for searching and downloading Sentinel-2 data with the Python package sentinelsat.
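The core of that workflow looks roughly like this with sentinelsat; the account credentials and the GeoJSON file with the area of interest are placeholders you would substitute with your own:

```python
from sentinelsat import SentinelAPI, read_geojson, geojson_to_wkt

# Placeholder credentials for a Copernicus SciHub account
api = SentinelAPI('username', 'password', 'https://scihub.copernicus.eu/dhus')

# Search by area of interest (a GeoJSON polygon), date range and cloud cover
footprint = geojson_to_wkt(read_geojson('area_of_interest.geojson'))
products = api.query(footprint,
                     date=('20190601', '20190630'),
                     platformname='Sentinel-2',
                     cloudcoverpercentage=(0, 30))

# Download everything the query matched
api.download_all(products)
```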

Read More

How to fill a Donut

When you are classifying pixels in satellite images you often encounter the dreaded artefact commonly referred to as Donut. A good example is the classification of lakes where you are not able to correctly classify the complete shoreline (resulting in non-closed circles or Open Donuts), leaving pixels in the wrong class that make up the Hole inside the Donut. Recently a question popped up on GIS.Stackexchange on How to transform raster donuts to circles. This problem can be solved with the use of mathematical morphology.
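A minimal sketch of the morphological fix (the tiny mask below is a made-up stand-in for a real classification raster): a binary closing first bridges the gap in the shoreline, then the now-enclosed hole can be filled.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary lake mask: a ring of shoreline pixels with a
# one-pixel gap (an Open Donut) around a misclassified Hole
mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 2:7] = True
mask[3:6, 3:6] = False   # the Hole inside the Donut
mask[4, 2] = False       # the gap that keeps the ring from closing

# Closing (dilation followed by erosion) bridges the gap ...
closed = ndimage.binary_closing(mask, structure=np.ones((3, 3)))
# ... so the interior becomes a proper hole and can be filled
filled = ndimage.binary_fill_holes(closed)
```

The structuring element size controls how large a gap the closing can bridge; a 3×3 element closes one-pixel gaps.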

Read More

Download Copernicus Sentinel images with Python

The Sentinel satellites are an amazing opportunity for scientists all over the world to explore unprecedented amounts of remote sensing data free of charge. I am genuinely happy that this is one of the first large remote sensing missions that has Open Data and Open Access baked in right from the start. All Sentinel data can be accessed through Copernicus SciHub. Visualization, search and downloading aren’t exactly responsive if you are looking to acquire large datasets. Luckily ESA allows programmatic access to the archive through the OData protocol, so it was just a matter of time until alternative search and download tools popped up. This post shows a typical workflow for searching and downloading Sentinel-1 data with the Python package sentinelsat.

Read More

np.nanpercentile() - there has to be a faster way!

Recently I was trying to calculate the quantiles of the vegetation index of an area over time. For this I have a time-series of satellite raster images of a certain region that cover identical extents. This is represented as a numpy.ndarray of the shape (96, 4800, 4800) - in other words 96 satellite images, each measuring 4800 by 4800 pixels. I want to calculate the 10th, 25th, 50th, 75th and 90th percentile along the time/z-axis, which can be done easily with np.percentile(a, q=[10,25,50,75,90], axis=0). The data I am working with contains no_data areas due to residual cloud cover, rainfall, etc., represented as np.NaN. Naturally I turned to NumPy's np.nanpercentile(a, q=[10, 25, 50, 75, 90], axis=0). Unfortunately np.nanpercentile() was ~300x slower on my dataset than np.percentile(), so I had to find a way to speed it up.
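The gist of the speed-up can be sketched like this (a sketch under my own assumptions, not the exact code from the post): sort once along the time axis, which pushes NaNs to the end of each pixel's stack, count the valid observations per pixel, and then index the sorted stack directly with linear interpolation, mirroring np.nanpercentile's default behaviour.

```python
import numpy as np

def nan_percentile(arr, q):
    """Sketch of a vectorized nanpercentile along axis 0.

    Assumes arr is a 3-D float stack (time, y, x);
    all-NaN pixels come out as NaN.
    """
    part = np.sort(arr, axis=0)              # NaNs sort to the end of each column
    valid = np.sum(~np.isnan(arr), axis=0)   # valid observations per pixel
    rows, cols = np.indices(arr.shape[1:])
    out = []
    for quant in q:
        # fractional index of the percentile, matching linear interpolation
        k = (valid - 1) * (quant / 100.0)
        lo = np.floor(k).astype(int)
        hi = np.ceil(k).astype(int)
        frac = k - lo
        out.append(part[lo, rows, cols] * (1 - frac) +
                   part[hi, rows, cols] * frac)
    return np.array(out)
```

The expensive part - the sort - happens once for all requested percentiles, instead of NaN handling being repeated per pixel.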

Read More

Fast linear 1D interpolation with numba

I am currently doing time-series analysis on MODIS derived vegetation index data. In order to get a reliable signal from the data, outliers need to be removed and the resulting gaps interpolated/filled before further filtering/smoothing of the signal. The time-series for one tile, covering 10° by 10°, spans roughly 14 years with 46 images per year. Each image weighs in at around 70-100 MB. If you are processing, say, Africa you are looking at roughly 2.3 terabytes of input data. Interpolation of such massive amounts of data raises the question - what is the fastest way to do it?

Read More

Landsat batch download from Google and Amazon

Landsat is the workhorse for a lot of remote sensing applications with its open data policy, global data availability and long-spanning acquisition time-series. The USGS Bulk Downloader, however, is clunky, depends on special ports being open on your network and cannot be scripted to suit needs like automatic ingestion of newly acquired Landsat-8 scenes. Fortunately Google and Amazon provide mirrors of a lot of the Landsat datasets which can be used for scripted bulk downloading.

Read More

Generating a global tree cover vector dataset

Trees cover a large part of the earth and can sometimes be quite annoying when you are trying to classify any other land cover than forest. In my case I was looking for a dataset I could use to mask out the forest areas so I don’t have to worry about them producing false positives in my classification, making the whole process simpler, faster and more accurate. Unfortunately there aren’t a whole lot of options to choose from if you are looking for something global with a resolution no coarser than MODIS (250 m). One quite prominent and easy to use dataset is the Global Forest Change 2000 - 2013 by Hansen et al. The problem now is: how do you get such a large dataset into an easy to use vector dataset?

Read More