Commit 5c846846 authored by Germain Salgues's avatar Germain Salgues

pre-release: preparation for release LIS-1.5

parent 943f56f2
@@ -4,6 +4,26 @@ All notable changes to Let It Snow (LIS) will be documented in this file.
## [Unreleased]
### Added
-
### Fixed
-
## [1.5] - 2018-12-20
### Added
- The snow annual map module is now operational, the associated files are:
- app/run_snow_annual_map.py, the main application
- python/s2snow/snow_annual_map.py, the core of the annual map processing
- python/s2snow/snow_annual_map_evaluation.py, provides the ability to compare with other snow products and with the MODIS snow annual map
- python/s2snow/snow_product_parser.py, the class handling the supported types of snow products
- doc/snow_annual_map_schema.json, parameters descriptions
- hpc/prepare_data_for_snow_annual_map.py, the preprocessing script on the CNES HPC
- doc/tutorials/prepare_snow_annual_map_data.md, the tutorial
- Provided new data pack for tests "Data-LIS-1.5"
- Add tests for snow annual map computation
- The version of the s2snow module is now stored in file python/s2snow/version.py
- Add support and tests for zipped products in build_json.py and run_snow_detector.py
- Add a mode to build_json.py script to configure and run LIS on Level 2A MAJA native products
- Add a tutorial to describe how to run LIS on MAJA native products
- Add a mode to build_json.py script to configure and run LIS on SEN2COR Level 2A products
@@ -14,15 +14,15 @@ All notable changes to Let It Snow (LIS) will be documented in this file.
- The expert mask now includes an optional 6th bit propagating the slope correction flag from the product mask when available
- The cold cloud removal (pass 1.5) now uses an area threshold to process only significant snow areas within clouds and reduce processing time.
- Link ATBD and LIS Data for test validation to their Zenodo DOI in README.md
### Fixed
- Fix all Python script headers to avoid mixing Python versions
- Fix the broken preprocessing JSON option that allows resampling of the input DTM
- Fix typos in README.md documentation
- Change nodata management (read -32768 from input and write 0 in the
output) in DTM resampling to avoid errors in snow line estimation
on areas without DTM information
## [1.4] - 2018-02-14
### Added
......
# Snow Annual Maps: open points and perspectives
This file lists the identified open points, as well as the related actions in terms of data processing and production.
**WARNING: The following applies to LIS version 1.5**
## Densification using Landsat 8 products (old format)
The old-format Landsat products are available through Theia. They were not used to densify the input snow products time series in the frame of the snow annual map generation (available under /work/OT/siaa/Theia/Neige/SNOW_ANNUAL_MAP_LIS_1.5). This can now be done using the library [amalthee/0.2](https://gitlab.cnes.fr/datalake/amalthee.git). Among the parameters, it is possible to request products corresponding to a region of interest. This is the best way to retrieve products corresponding to the S2 and L8 new-format tiles (example: "T31TCH").
On CNES HPC:
```
# on the CNES HPC, load the module first (shell):
module load amalthee/0.2
# then, in a Python session:
amalthee = Amalthee('oldlandsat')
amalthee.show_params("Landsat")
```
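As an illustration, the same request can be sketched as a standalone Python script. Note that the import path and the `search()` argument order below are assumptions to check against the Amalthee documentation; only the `Amalthee()`, `search()`, `products` and `fill_datalake()` names come from the LIS preparation script.

```python
# Sketch: querying old-format Landsat Level-2A products over one tile.
# The import path and the search() signature are assumptions.
tile_id = "T31TCH"  # example tile of interest
parameters = {"processingLevel": "LEVEL2A", "location": tile_id}

try:
    from amalthee import Amalthee  # only available on the CNES HPC
except ImportError:
    Amalthee = None

if Amalthee is not None:
    amalthee = Amalthee('oldlandsat')
    # assumed argument order: collection, start date, end date, parameters
    amalthee.search("Landsat", "2017-09-01", "2018-08-31", parameters)
    print(amalthee.products)   # pandas DataFrame of matching products
    amalthee.fill_datalake()   # request download of missing products
```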
Once the old Landsat products are retrieved and processed to obtain the snow products, the snow annual map can
be densified by simply adding them to the densification list. Please refer to [tutorials/prepare_snow_annual_map_data.md](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/doc/tutorials/prepare_snow_annual_map_data.md)
for the usage of the script on the CNES HPC.
## Modification of the [prepare_snow_annual_map_data.py](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/prepare_data_for_snow_annual_map.py)
This script must be modified to, at a minimum, change the sub-processes that are currently asynchronous and that require running the script multiple times.
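One possible modification, sketched below under the assumption that each sub-process materialises its outputs as files, is to poll until the expected snow products exist on disk instead of exiting and requiring a re-run (a hypothetical helper, not the project's implementation):

```python
import os
import time

def wait_for_products(paths, timeout=3600, poll_interval=60):
    """Block until every path in `paths` exists on disk, or raise on timeout."""
    deadline = time.time() + timeout
    missing = set(paths)
    while missing:
        # keep only the paths that are still absent
        missing = {p for p in missing if not os.path.exists(p)}
        if not missing:
            break
        if time.time() > deadline:
            raise TimeoutError("still missing: %s" % sorted(missing))
        time.sleep(poll_interval)
    return True
```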
{
"$schema": "http://json-schema.org/draft-04/schema#",
"id": "snow_annual_map_params",
"properties": {
"log": {
"default": true,
"description": "Log output and error to files (std***.log).",
"id": "log",
"title": "The Log schema.",
"type": "boolean"
},
"mode": {
"default": "RUNTIME",
"description": "The processing mode to use: RUNTIME to obtain only the output products faster, DEBUG to obtain all intermediate files",
"id": "mode",
"title": "The Mode schema.",
"type": "string"
},
"tile_id": {
"description": "The identifier of the tile corresponding to the products in input_products_list",
"id": "tile_id",
"title": "The Tile_id schema.",
"type": "string"
},
"input_products_list": {
"default": [],
"description": "The input products list, containing the paths of homogeneous snow products, all on tile_id and at the same resolution and size",
"id": "input_products_list",
"title": "The input_products_list schema.",
"type": "array"
},
"path_tmp": {
"default":"",
"description": "The path where to store temporary files; otherwise the application tries to retrieve $TMPDIR from the environment",
"id": "path_tmp",
"title": "The Path_tmp schema.",
"type": "string"
},
"use_densification": {
"default": false,
"description": "Activate the densification using snow products from heterogeneous sensors",
"id": "use_densification",
"title": "The Use_densification schema.",
"type": "boolean"
},
"densification_products_list": {
"default": [],
"description": "The densification list, containing the paths of snow products from heterogeneous sensors",
"id": "densification_products_list",
"title": "The densification_products_list schema.",
"type": "array"
},
"path_out": {
"description": "Path to output directory.",
"id": "path_out",
"title": "The Path_out schema.",
"type": "string"
},
"date_start": {
"description": "Start of the date range for which we want to generate the snow_annual_map (DD/MM/YYYY)",
"id": "date_start",
"title": "The Date_start schema.",
"type": "string"
},
"date_stop": {
"description": "Stop of the date range for which we want to generate the snow_annual_map (DD/MM/YYYY)",
"id": "date_stop",
"title": "The Date_stop schema.",
"type": "string"
},
"date_margin": {
"default": 15,
"description": "The margin outside the date range to use for better interpolation results (in days)",
"id": "date_margin",
"title": "The Date_margin schema.",
"type": "integer"
},
"ram": {
"default": 4096,
"description": "Maximum amount of RAM used by the program.",
"id": "ram",
"title": "The Ram schema.",
"type": "integer"
},
"nb_threads": {
"default": 6,
"description": "Maximum number of threads used by the program.",
"id": "nb_threads",
"title": "The Nb_threads schema.",
"type": "integer"
},
"comments": "the following parameters concern only the snow_annual_map_evaluation",
"run_comparison_evaluation": {
"default": false,
"description": "Activate the one-to-one comparison using snow products from heterogeneous sensors",
"id": "run_comparison_evaluation",
"title": "The run_comparison_evaluation schema.",
"type": "boolean"
},
"comparison_products_list": {
"default": [],
"description": "The comparison list, containing the paths of heterogeneous snow products to compare with the daily interpolation",
"id": "comparison_products_list",
"title": "The comparison_products_list schema.",
"type": "array"
},
"run_modis_comparison": {
"default": false,
"description": "Activate the comparison between the annual map and the MODIS snow annual map",
"id": "run_modis_comparison",
"title": "The run_modis_comparison schema.",
"type": "boolean"
},
"modis_snow_map": {
"description": "The path to the MODIS daily snow masks (one file with one band per day)",
"id": "modis_snow_map",
"title": "The modis_snow_map schema.",
"type": "string"
},
"modis_snow_map_dates": {
"description": "The dates corresponding to the bands of the modis_snow_map",
"id": "modis_snow_map_dates",
"title": "The modis_snow_map_dates schema.",
"type": "string"
},
"dem": {
"description": "The DEM to use during the MODIS comparison, to generate snow statistics per altitude slice",
"id": "dem",
"title": "The dem schema.",
"type": "string"
}
},
"type": "object"
}
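For reference, a minimal configuration file matching this schema might be generated as follows (the tile identifier, dates and paths are placeholder values, not actual products):

```python
import json

# Minimal snow_annual_map configuration; all values below are placeholders.
params = {
    "log": True,
    "mode": "RUNTIME",
    "tile_id": "T31TCH",
    "input_products_list": ["/path/to/snow_product_1", "/path/to/snow_product_2"],
    "path_out": "/path/to/output",
    "date_start": "01/09/2017",
    "date_stop": "31/08/2018",
    "date_margin": 15,
    "ram": 4096,
    "nb_threads": 6,
    "use_densification": False,
    "densification_products_list": [],
}

# run_snow_annual_map.py takes the path of such a JSON file as its argument.
with open("snow_annual_map_config.json", "w") as f:
    json.dump(params, f, indent=2)
```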
# How to generate configuration file for snow annual map computation on CNES HPC
This tutorial describes the usage of the Python script
[prepare_snow_annual_map_data.py](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/prepare_data_for_snow_annual_map.py).
**WARNING: The following tutorial applies to LIS version 1.5 and only to the CNES HPC.
However, it can serve as an example for writing a custom preparation script outside the CNES HPC.**
## Prerequisites
The script prepare_snow_annual_map_data.py can only be launched on the CNES HPC and requires specific modules:
- Python in version 3.5.2
- Amalthee in version 0.2
On CNES HPC:
```
module load python/3.5.2
module load amalthee/0.2
```
The script prepare_snow_annual_map_data.py must be located alongside the following scripts in order to launch the sub-tasks correctly:
- [run_lis_from_filelist.sh](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/run_lis_from_filelist.sh), a PBS script dedicated to the production of the snow products
- [run_snow_annual_map.sh](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/run_snow_annual_map.sh), a PBS script dedicated to the production of the snow annual maps
## Configuration parameters
The script prepare_snow_annual_map_data.py does not take any command-line arguments. To edit the configuration,
the script must be modified manually with a text editor. The section of the script to modify is reported below.
```
def main():
params = {"tile_id":"T32TPS",
"date_start":"01/09/2017",
"date_stop":"31/08/2018",
"date_margin":15,
"mode":"DEBUG",
"input_products_list":[],
# path_tmp is an actual parameter but must only be uncommented with a correct path
# otherwise the processing uses $TMPDIR by default
#"path_tmp":"",
"path_out":"/work/OT/siaa/Theia/Neige/SNOW_ANNUAL_MAP_LIS_1.5/L8_only",
"ram":8192,
"nbThreads":6,
"use_densification":False,
"log":True,
"densification_products_list":[],
# the following parameters are only used in this script and do not affect snow_annual_map processing
"snow_products_dir":"/work/OT/siaa/Theia/Neige/PRODUITS_NEIGE_LIS_develop_1.5",
"data_availability_check":False}
```
These parameters are described in [snow_annual_map_schema.json](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/doc/snow_annual_map_schema.json)
and correspond to the parameters of the JSON file to provide to the application run_snow_annual_map.py.
However, the last two parameters are specific to prepare_snow_annual_map_data.py:
- "snow_products_dir" must be filled with the storage path chosen for all the snow products.
- "data_availability_check" must remain at "false"; it is only modified by the script itself once all the data required for the snow annual map processing are available.
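Among the parameters above, date_start and date_stop (DD/MM/YYYY) together with date_margin (in days) define the search window used to collect snow products; a sketch of the computation performed by the preparation script:

```python
from datetime import datetime, timedelta

# The search window is the [date_start, date_stop] range widened by
# date_margin days on both sides.
date_start = datetime.strptime("01/09/2017", "%d/%m/%Y")
date_stop = datetime.strptime("31/08/2018", "%d/%m/%Y")
date_margin = timedelta(days=15)

search_start_date = date_start - date_margin  # 17/08/2017
search_stop_date = date_stop + date_margin    # 15/09/2018
```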
The only external configuration parameter is the following file:
- [selectNeigeSyntheseMultitemp.csv](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/selectNeigeSyntheseMultitemp.csv), the list of tiles (e.g. "T31TCH") for which to generate the snow annual maps.
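Reading this tile list can be sketched as follows (a minimal sketch assuming, as the script's usage suggests, one tile identifier per row):

```python
import csv

def read_tile_list(path):
    """Return the tile identifiers listed in the CSV file, one per row."""
    tiles = []
    with open(path, "r") as csvfile:
        for row in csv.reader(csvfile):
            if row and row[0].strip():
                tiles.append(row[0].strip())
    return tiles
```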
## Execution
The script can then simply be launched with the following command:
```
python prepare_snow_annual_map_data.py
```
The generated log allows monitoring the status of the data required before the snow annual map processing.
If all the required snow products are not already available in the **"snow_products_dir"** (which is more than likely!), the script will call two different types of asynchronous sub-processes:
- the first one requests the generation of the missing snow products under **"snow_products_dir"**, when the corresponding L2A products are available.
- the second one is triggered when some L2A products are not available; this sub-process is in charge of downloading the products into the datalake (see the Amalthee documentation for the command amalthee_theia.fill_datalake()).
Because these sub-processes are asynchronous, prepare_snow_annual_map_data.py must be run multiple times.
For example, if no data is available and it is the first time you run the script, three runs are required:
- the first to fill the datalake with the requested L2A products,
- the second to generate all the snow products corresponding to these L2A products,
- and finally the third to check that all the requested data is now available and to trigger the actual computation of the snow annual map.
Between runs it is advised to monitor the status of the different sub-processes offline, instead of running the script more often than necessary.
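The three-run logic can be summarised by the following sketch (a hypothetical helper for illustration, not part of the script):

```python
def next_action(l2a_available, snow_products_available):
    """Return the action a single run of the preparation script would take."""
    if not l2a_available:
        return "fill_datalake"           # run 1: request L2A download
    if not snow_products_available:
        return "generate_snow_products"  # run 2: launch snow detection jobs
    return "run_snow_annual_map"         # run 3: trigger the annual map
```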
## Results
@@ -2,7 +2,7 @@
#PBS -N TheiaNViz
#PBS -j oe
#PBS -l select=1:ncpus=4:mem=10gb
#PBS -l walltime=00:35:00
#PBS -l walltime=02:00:00
# make output figures for a better visualization
# qsub -v tile="29SRQ",input_folder="/work/OT/siaa/Theia/Neige/test_snow_in_cloud_removal" makefigureTile_lis_Sentinel2_cluster_muscate_2.sh
# useful option: qsub -W depend=afterok:<jobid> where jobid is the job id of qsub runTile..
......
@@ -93,16 +93,20 @@ class prepare_data_for_snow_annual_map():
logging.info('Process tile:' + self.tile_id +'.')
logging.info(' for period ' + str(self.date_start) + ' to ' + str(self.date_stop))
# compute the range of required snow products
search_start_date = self.date_start - self.date_margin
search_stop_date = self.date_stop + self.date_margin
# open a file to store the list of L2A products for which we need to generate the snow products
filename_i = os.path.abspath(self.processing_id +"_pending_for_snow_processing.txt")
FileOut = open(os.path.join(".", filename_i),"w")
resulting_df = None
snow_processing_requested = 0
# loop on the different types of products to request
for mission_tag in self.mission_tags:
# use amalthee to request the products from Theia catalogues
parameters = {"processingLevel": "LEVEL2A", "location":str(self.tile_id)}
amalthee_theia = Amalthee('theia')
amalthee_theia.search(mission_tag,
@@ -116,6 +120,7 @@ class prepare_data_for_snow_annual_map():
snow_products_list=[]
if nb_products:
# get the dataframe containing the requested products and append extra needed fields.
df = amalthee_theia.products
df['snow_product'] = ""
df['snow_product_available'] = False
@@ -123,6 +128,7 @@ class prepare_data_for_snow_annual_map():
datalake_product_available = 0
datalake_update_requested = 0
# loop on each product from the dataframe
for product_id in df.index:
logging.info('Processing ' + product_id)
@@ -169,8 +175,10 @@ class prepare_data_for_snow_annual_map():
# @TODO request only the products for which the snow products are not available
#amalthee_theia.fill_datalake()
logging.info("End of requesting datalake.")
# we only append a single type of product to the main input list
if mission_tag == "SENTINEL2":#"LANDSAT":#
self.input_products_list.extend(snow_products_list)
# the other types are used for densification purposes only
else:
self.densification_products_list.extend(snow_products_list)
@@ -244,16 +252,18 @@ def main():
"date_margin":15,
"mode":"DEBUG",
"input_products_list":[],
"snow_products_dir":"/work/OT/siaa/Theia/Neige/PRODUITS_NEIGE_LIS_develop_1.5",
# path_tmp is an actual parameter but must only be uncomment with a correct path
# else the processing use $TMPDIR by default
#"path_tmp":"",
#"path_out":"/home/qt/salguesg/scratch/multitemp_workdir/tmp_test",
"path_out":"/work/OT/siaa/Theia/Neige/Snow_Annual_Maps_L8_Densification_with_merging",
"path_out":"/work/OT/siaa/Theia/Neige/SNOW_ANNUAL_MAP_LIS_1.5/L8_only",
"ram":8192,
"nbThreads":6,
"use_densification":True,
"use_densification":False,
"log":True,
"densification_products_list":[],
# the following parameters are only used in this script and do not affect snow_annual_map processing
"snow_products_dir":"/work/OT/siaa/Theia/Neige/PRODUITS_NEIGE_LIS_develop_1.5",
"data_availability_check":False}
with open('selectNeigeSyntheseMultitemp.csv', 'r') as csvfile:
......
@@ -111,14 +111,11 @@ mkdir -p $pout
echo "pout" $pout
#Load LIS modules
#module load lis/develop
source /home/qt/salguesg/load_lis.sh
module load lis/1.5
#configure gdal_cachemax to speedup gdal polygonize and gdal rasterize (half of requested RAM)
export GDAL_CACHEMAX=2048
echo $GDAL_CACHEMAX
export PATH=/home/qt/salguesg/local/bin:$PATH
echo $PATH
#check if L8 products to use the correct DEM
if [[ $product_zip == *LANDSAT8* ]];
......
#!/bin/bash
#PBS -N TheiaNeige
#PBS -N TheiaNeigeRunSnowAnnualMap
#PBS -j oe
#PBS -l select=1:ncpus=8:mem=20000mb
#PBS -l walltime=24:00:00
#PBS -l walltime=04:20:00
# run LIS for one Sentinel-2 Level-2A tile and one date (walltime is higher)
# specify the path to the tile folder, the path the DEM and the template configuration file (.json)
# First argument is the tile name (nnccc): qsub -v config="path/to/config/json" run_snow_annual_map.sh
# First argument is the tile name (nnccc): qsub -v config="path/to/config/json",overwrite="false" run_snow_annual_map.sh
if [ -z $config ]; then
@@ -15,19 +15,36 @@ fi
echo $config
#config_list=`ls /work/OT/siaa/Theia/Neige/Snow_Annual_Maps/*/*.json`
if [ -z $overwrite ]; then
echo "overwrite is not set, no overwrite will be done"
overwrite="false"
fi
expected_target_path=$(dirname $config)
echo "Expected output path $expected_target_path"
echo $(ls -A -- ${expected_target_path}/*.tif)
if [ -n "$(ls -A -- ${expected_target_path}/*.tif)" ]; then
echo "$expected_target_path already contains tif results!"
if [ $overwrite == "false" ]; then
echo "exiting to avoid overwrite"
exit 1
fi
fi
outpath=$(dirname $config)
#Load LIS modules
#module load lis/develop
source /home/qt/salguesg/load_lis.sh
module load lis/1.5
#configure gdal_cachemax to speedup gdal polygonize and gdal rasterize (half of requested RAM)
export GDAL_CACHEMAX=2048
echo $GDAL_CACHEMAX
export PATH=/home/qt/salguesg/local/bin:$PATH
echo $PATH
# run the snow detection
date ; echo "START run_snow_annual_map.py $config"
run_snow_annual_map.py $config
date ; echo "END run_snow_annual_map.py"
chgrp -R lis_admin $expected_target_path
chmod 775 -R $expected_target_path
echo "Results available under $expected_target_path"
@@ -37,9 +37,7 @@ def main(argv):
op.join(data_path,"SENTINEL2A_20180131-105416-437_L2A_T31TCH_D_V1-4")
],
"log": True,
"log_stdout": op.join(out_path,"stdout.log"),
"date_start": "01/01/2018",
"log_stderr": op.join(out_path,"stderr.log"),
"path_tmp": tmp_path,
"ram": 1024,
"use_densification": True,
......