The input files are Sentinel-2 or Landsat-8 level-2A products from the [Theia Land Data Centre](https://theia.cnes.fr/) or [SPOT-4/5 Take 5 level-2A products](https://spot-take5.org), plus a Digital Terrain Model (DTM), such as SRTM, reprojected at the same resolution as the input image.
## Usage
Run the Python script run_snow_detector.py with a JSON configuration file as its only argument:
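For example, a minimal invocation might look like this (the configuration file name param.json is illustrative):
```bash
python run_snow_detector.py param.json
```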
...
...
The snow detection is performed in the Python script run_snow_detector.py.
All the parameters of the algorithm, as well as the paths to input and output data, are stored in the JSON file. See the provided [param_test_s2_template.json](tes/param_test_s2_template.json) file for an example.
Moreover, the JSON schema is available in the [Algorithm Theoretical Basis Documentation](doc/atbd/ATBD_CES-Neige.tex) and gives more information about the role of each parameter.
NB: To build the DEM data, download the SRTM files corresponding to the study area and build the .vrt using gdalbuildvrt. Edit the config.json file to activate preprocessing: set "preprocessing" to true and set the vrt path.
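For instance, the DEM mosaic can be built like this (a minimal sketch; the tile names are illustrative):
```bash
# Mosaic the downloaded SRTM tiles into a single virtual raster
gdalbuildvrt srtm.vrt srtm_37_04.tif srtm_38_04.tif
```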
...
...
* 2nd bit: Snow mask after pass2
* 3rd bit: Clouds detected at pass1
* 4th bit: Clouds refined at pass2
* 5th bit: Clouds initial (all_cloud)
* 6th bit: Slope flag (optional 1: bad slope correction)
For example, if you want to get the snow from pass1 and the clouds detected at pass1, you need to do:
```python
...
...
* 100: Snow
* 205: Cloud including cloud shadow
* 254: No data
* SEB_VEC: Vector image of the snow mask and cloud mask. Two fields of information are embedded in this product: DN (for Data Neige) and type.
* DN field:
* 0: No-snow
* 100: Snow
...
...
* lxml
* matplotlib
GDAL itself depends on a number of other libraries provided by most major operating systems, and also on the non-standard GEOS and PROJ libraries. The GDAL Python bindings are also required.
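A quick way to check that the GDAL Python bindings are importable is a one-liner such as:
```bash
python -c "from osgeo import gdal; print(gdal.__version__)"
```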
Python package dependencies:
...
...
In your build directory, use cmake to configure your build.
```bash
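# source_lis_path is a placeholder for the path to the let-it-snow source tree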
cmake -C config.cmake source_lis_path
```
In your config.cmake you need to set:
```bash
...
...
chmod -R 755 ${install_dir}
```
The files will be installed by default into /usr/local and added to the default Python modules.
To override this behavior, the variable CMAKE_INSTALL_PREFIX must be configured before the build step.
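For example, with the standard CMake mechanism (the install path below is illustrative):
```bash
cmake -DCMAKE_INSTALL_PREFIX=/opt/lis -C config.cmake source_lis_path
```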
Update the environment variables for LIS. Make sure that the OTB and other dependency directories are set in your environment variables:
```bash
...
...
let-it-snow is now installed.
## Tests
Enable tests with the BUILD_TESTING CMake option and use the ctest command to run the tests. Do not forget to clean your output test directory when you run a new set of tests.
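For instance (a minimal sketch from the build directory, reusing the illustrative source_lis_path placeholder):
```bash
cmake -DBUILD_TESTING=ON -C config.cmake source_lis_path
make
ctest
```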
Data (input and baseline) to run validation tests are available on [Zenodo](http://doi.org/10.5281/zenodo.166511).
This file lists the identified open points, as well as the actions in terms of data processing and production.
**WARNING: The following applies to LIS version 1.5**
## Densification using Landsat 8 products (old format)
The old format Landsat products are available through Theia. They were not used to densify the input snow product time series in the frame of the snow annual map generation (available under /work/OT/siaa/Theia/Neige/SNOW_ANNUAL_MAP_LIS_1.5). This can now be done using the library [amalthee/0.2](https://gitlab.cnes.fr/datalake/amalthee.git). Among its parameters, it is possible to request products corresponding to a region of interest. This is the best way to retrieve products corresponding to the S2 and L8 new format tiles (example: "T31TCH").
On CNES HPC:
```
module load amalthee/0.2
# then, in a Python session (the import below is an assumption; check the amalthee documentation)
from amalthee import Amalthee

amalthee = Amalthee('oldlandsat')
amalthee.show_params("Landsat")
```
Once the old Landsat products are retrieved and processed to obtain the snow products, the snow annual map can
be densified by simply adding them to the densification list. Please refer to [tutorials/prepare_snow_annual_map_data.md](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/doc/tutorials/prepare_snow_annual_map_data.md)
for the usage of the script on CNES HPC.
## Modification of the [prepare_data_for_snow_annual_map.py](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/prepare_data_for_snow_annual_map.py)
This script must be modified at least to change the sub-processes that are currently asynchronous, which makes it necessary to run the script multiple times.
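As an illustration of the kind of change involved (a sketch only: the actual job-submission code and script names in prepare_data_for_snow_annual_map.py may differ, and the blocking option assumes a PBS scheduler):
```python
import subprocess

# Asynchronous submission: the script moves on before the job has finished,
# so it must be re-run later once the products are available.
subprocess.Popen(["qsub", "run_snow_detection.sh"])

# Synchronous alternative: block until the job completes (-W block=true is
# PBS-specific), so a single run of the script can chain all the steps.
subprocess.run(["qsub", "-W", "block=true", "run_snow_detection.sh"], check=True)
```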