This code implements:
* the snow cover extent detection algorithm LIS (Let It Snow) for Landsat-8 and SPOT4-Take5 data
* fractional snow cover (FSC) retrieval for Sentinel-2 (which also includes the snow cover extent)
* temporal syntheses based on time series of snow products (snow cover and/or FSC)
The algorithm documentation with examples is available in the [Algorithm Theoretical Basis Documentation](doc/atbd/ATBD_CES-Neige.tex).
The input files are Sentinel-2 or Landsat-8 level-2A products from the [Theia Land Data Centre](https://theia.cnes.fr/) or [SPOT-4/5 Take 5 level-2A products](https://spot-take5.org) and a Digital Terrain Model (DTM). The output is a Level-2B snow product.
The syntheses are temporally aggregated (level-3A) products derived from individual snow products after gapfilling. The three products are: the snow cover duration, the snow disappearance date and the snow appearance date. These products are typically computed over a hydrological year (more details : [Snow cover duration map](doc/snow_annual_map.md)).
## Usage
### Snow cover and FSC using "let_it_snow_fsc"
The snow detection is performed by the Python script [run_snow_detector.py](app/run_snow_detector.py). Run it with a JSON launch configuration file as its only argument.
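A minimal invocation might look like the following (a sketch only; the configuration file name is a placeholder):

```bash
# Hypothetical example: launch_configuration_file.json is a placeholder name.
python run_snow_detector.py launch_configuration_file.json
```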
All launch parameters are described in [fsc_launch_schema.json](doc/atbd/fsc_launch_schema.json)
and can be **overwritten** by the following command line options (a hypothetical example follows this list):
* "-i", "--input_dir" - Path to input directory, containing L2A Theia Product or this directory as zip
* "-o", "--output_dir" - Path to output directory; which will contains FSC Product
* The tree cover density file is only used for the FSC computation (it is only available for Sentinel-2 products). If it is not provided for Sentinel-2 snow detection, only FSC-TOC (top of canopy) will be computed.
* Snow detection without a water mask could lead to confusion between snow and water.
* The LIS configuration file contains the algorithm's parameters. The default configuration is available here: [lis_default_configuration.json](doc/lis_default_configuration.json).
As an expert, you can look at its description file [fsc_config_schema.json](doc/atbd/fsc_config_schema.json) and the explanation of how to change specific parameters in [LIS configuration for experts](doc/LIS_configuration_for_experts.md).
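For illustration, a hypothetical run that overrides the input and output directories (a sketch; it assumes the options are accepted alongside the launch file, and the paths are placeholders):

```bash
# Hypothetical sketch: override the input and output directories from the command line.
python run_snow_detector.py launch_configuration_file.json \
    --input_dir /path/to/L2A_product \
    --output_dir /path/to/fsc_output
```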
You can use the command line option '-v' or '--version' to display the LIS version.
The [Algorithm Theoretical Basis Documentation](doc/atbd/ATBD_CES-Neige.tex) gives more information about the scientific roles of these parameters.
NB: To build the DEM data, download the SRTM files corresponding to the study area and build the .vrt using gdalbuildvrt. Edit the config.json file to activate the preprocessing: set "preprocessing" to true and set the vrt path.
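As a sketch (file names and paths are hypothetical), the VRT can be built from the downloaded SRTM tiles with gdalbuildvrt, and the resulting path is then referenced in config.json:

```bash
# Hypothetical example: mosaic the SRTM tiles covering the study area into a single VRT.
gdalbuildvrt srtm_study_area.vrt /path/to/srtm_tiles/*.tif
```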
Warning: a DEM with nodata values could alter the snow detection. zs should be contained within [-431, 8850].
### Snow synthesis using "let_it_snow_synthesis"
Run the Python script let_it_snow_synthesis.py with a JSON launch file as its only argument.
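A minimal invocation might look like the following (a sketch; the launch file name is a placeholder):

```bash
# Hypothetical example: synthesis_launch_file.json is a placeholder name.
python let_it_snow_synthesis.py synthesis_launch_file.json
```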
The synthesis configuration file contains the system's parameters. The default configuration is available here: [synthesis_default_configuration.json](doc/synthesis_default_configuration.json).
As an expert, you can look at its description file [synthesis_config_schema.json](doc/atbd/synthesis_config_schema.json).
The algorithm is detailed here: [Snow Annual Map](doc/snow_annual_map.md).
You can use the command line option '-v' or '--version' to display the LIS version.
## Products format
### Snow product
Since LIS 1.7, product names match the following nomenclature:

* SNOW_ALL: Binary mask of snow and clouds.
For example, if you want to get the snow from pass1 and the detected clouds, test the corresponding bits:
```python
# Bitwise AND with a binary mask selecting the two bits of interest
pixel_value & 0b00000101
```
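For illustration, the same bit test can be applied to a whole raster with rasterio and numpy (both listed in the dependencies). This is a sketch only; the file name is a placeholder and the bit layout is the one assumed in the example above.

```python
# Hypothetical sketch: apply the bit mask from the example above to a SNOW_ALL raster.
import numpy as np
import rasterio

with rasterio.open("snow_all.tif") as src:  # placeholder file name
    snow_all = src.read(1)

masked = snow_all & 0b00000101       # keep only the two tested bits
both_set = masked == 0b00000101      # pixels where both tested bits are set
print(np.count_nonzero(both_set), "pixels match")
```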
* LIS_SEB: Raster image of the snow mask and cloud mask.
* 0: No-snow
* 100: Snow
* 205: Cloud including cloud shadow
...
...
* 100: Snow
* 205: Cloud including cloud shadow
* 255: No data
* LOG file: **lis.log**, the log file for the standard and error output generated during processing
### Snow syntheses
Each product is computed for a given tile [TILE\_ID] and a given period from [DATE\_START] to [DATE\_STOP]. Products are identified by a tag according to the following naming convention: [TILE\_ID]\_[DATE\_START]\_[DATE\_STOP]
Since LIS 1.7, synthesis product names follow the nomenclature shown in the list below.
LIS synthesis generates the following files:
- Raster: **LIS\_<*mission*>-SNOW-SCD\_<*tag*>_<*chain_version*>_<*product_counter*>.tif**, the snow cover duration map (SCD); pixel values within [0, number of days] correspond to the number of snow days.
- Raster: **LIS\_<*mission*>-SNOW-SMOD\_<*tag*>_<*chain_version*>_<*product_counter*>.tif**, the date of snow disappearance (Snow Melt-Out Date), defined as the last date of the longest snow period. The dates are given in number of days since the first day of the synthesis.
- Raster: **LIS\_<*mission*>-SNOW-SOD\_<*tag*>_<*chain_version*>_<*product_counter*>.tif**, the date of snow appearance (Snow Onset Date), defined as the first date of the longest snow period. The dates are given in number of days since the first day of the synthesis (see the conversion sketch after this list).
- Raster: **LIS\_<*mission*>-SNOW-NOBS\_<*tag*>_<*chain_version*>_<*product_counter*>.tif**, the number of clear observations used to compute the SCD, SMOD and SOD syntheses.
with <*mission*>: S2 or S2L8 (if densification is used).
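For illustration, a pixel value from the SMOD or SOD rasters can be converted to a calendar date by adding it to the first day of the synthesis. This is a sketch only; the start date and the pixel value are hypothetical, and a 0-based day offset is assumed.

```python
# Hypothetical sketch: convert an SMOD/SOD pixel value (days since the synthesis start)
# into a calendar date, assuming a 0-based offset.
from datetime import date, timedelta

synthesis_start = date(2017, 9, 1)   # placeholder DATE_START of the synthesis
smod_value = 245                     # placeholder pixel value read from the SMOD raster

melt_out_date = synthesis_start + timedelta(days=int(smod_value))
print(melt_out_date)                 # 2018-05-04
```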
The output directory will also contain the following files in the tmp directory:
- Text file: **input_dates.txt**, the list of observation dates in the non-interpolated time series
- Text file: **output_dates.txt**, the list of interpolated dates
- JSON file: **param.json**, the configuration file used for the products generation (optional)
- Raster: **CLOUD\_OCCURENCE\_<*tag*>.tif**, the cloud/nodata annual map image; pixel values within [0-1] correspond to the cloud or nodata occurrences in the non-interpolated time series
- Raster: **DAILY\_SNOW\_MASKS\_<*tag*>.tif**, the snow time series interpolated on a daily basis (one image with one band per day). Each band is coded as follows (the interpolation removes any clouds or nodata):
  - 0: No-snow
  - 1: Snow
- LOG file: **lis.log**, the log file for the standard and error output generated during processing.
## Data set example
...
...
The following is a summary of the required dependencies:
* Python interpreter >= 3.6
* Python libs >= 3.6
* Python packages:
  * numpy
  * lxml
  * matplotlib
  * rasterio
GDAL itself depends on a number of other libraries provided by most major operating systems, and it also depends on the non-standard GEOS and PROJ libraries. The GDAL Python bindings are also required.
Python package dependencies:
* sys
* subprocess
* glob
...
...
chmod -R 755 ${install_dir}
```
The files will be installed by default into /usr/local and added to the default Python modules.
To override this behavior, the variable CMAKE_INSTALL_PREFIX must be configured before the build step.
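For example (a sketch; the paths are placeholders), the install prefix can be set when configuring the build with CMake:

```bash
# Hypothetical example: set a custom install prefix at configure time.
cmake -DCMAKE_INSTALL_PREFIX=/path/to/lis_install /path/to/lis_source
make
make install
```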
Update the environment variables for LIS. Make sure that the OTB and other dependency directories are set in your environment variables, for example:
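A sketch of the kind of variables to set (the exact list and paths depend on your installation; OTB_APPLICATION_PATH is the standard OTB variable for locating applications):

```bash
# Hypothetical example: adapt the paths to your OTB and LIS installation.
export PATH=/path/to/otb_install/bin:$PATH
export LD_LIBRARY_PATH=/path/to/otb_install/lib:$LD_LIBRARY_PATH
export OTB_APPLICATION_PATH=/path/to/otb_install/lib/otb/applications:$OTB_APPLICATION_PATH
export PYTHONPATH=/path/to/lis_install/lib/python3/site-packages:$PYTHONPATH
```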
Then go to lis-build-script and launch the local install of LIS:
```bash
sh ./build-lis-local.sh {path_to_master_repository} {OTB_version_number} {install_repository}
```
When the installation is complete, the tests are launched; all X tests must pass.
let-it-snow is now installed.
## Tests
The list of tests is available in [LIS_tests.md](LIS_tests.md) in the test directory.
Enable the tests with the BUILD_TESTING CMake option and use the ctest command to run them, as sketched below. Do not forget to clean your test output directory when you run a new set of tests.
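A sketch of a typical test run from the build directory (paths are placeholders):

```bash
# Hypothetical example: configure with tests enabled, build, then run the test suite.
cmake -DBUILD_TESTING=ON /path/to/lis_source
make
ctest
```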
Data (input and baseline) to run validation tests are available on Zenodo: