Commit ace4fbb7 authored by Manuel Grizonnet's avatar Manuel Grizonnet

Merge branch 'release/1.5'

parents 18e89f9c 1aec066a
# Change Log
All notable changes to Let It Snow (LIS) will be documented in this file.
## [1.5] - 2019-01-11
### Added
- The snow annual map module is now operational; the associated files are:
  - app/run_snow_annual_map.py, the main application
  - python/s2snow/snow_annual_map.py, the core of the annual map processing
  - python/s2snow/snow_annual_map_evaluation.py, provides the ability to compare with other snow products and the MODIS snow annual map
  - python/s2snow/snow_product_parser.py, class to handle the supported types of snow products
  - doc/snow_annual_map_schema.json, parameter descriptions
  - hpc/prepare_data_for_snow_annual_map.py, preprocessing script on CNES HPC
  - doc/tutorials/prepare_snow_annual_map_data.md, tutorial
- Provided new data pack for tests "Data-LIS-1.5"
- Add tests for snow annual map computation
- The version of the s2snow module is now stored in file python/s2snow/version.py
- Add support and tests for zipped products in build_json.py and run_snow_detector.py
- Add a mode to build_json.py script to configure and run LIS on Level 2A MAJA native products
- Add a tutorial to describe how to run LIS on MAJA native products
- Add a mode to build_json.py script to configure and run LIS on SEN2COR Level 2A products
- Add a tutorial to describe how to run LIS on SEN2COR products
- Add a mode to build_json.py script to configure and run LIS on U.S. Landsat Analysis Ready Data (ARD) Level 2A products
- Add a tutorial to describe how to run LIS on Landsat ARD products
- The expert mask now includes a 5th bit for the clouds that were present in the product/original cloud mask
- The expert mask now includes an optional 6th bit propagating the slope correction flag from the product mask when available
- The cold cloud removal (pass 1.5) now uses an area threshold to process only significant snow areas within clouds, reducing processing time.
- Link ATBD and LIS Data for test validation to their Zenodo DOI in README.md
### Fixed
- Fix all Python script headers to avoid mixing Python versions
- Fix the broken "preprocessing" JSON option that allows resampling of the input DTM
- Fix typos in README.md documentation
- Change nodata management in DTM resampling (read -32768 from the input
  and write 0 in the output) to avoid errors in snow line estimation
  in areas without DTM information
## [1.4] - 2018-02-14
@@ -23,7 +57,7 @@ All notable changes to LIS will be documented in this file.
## [1.3.1] - 2017-11-23
### Fixed
- Fix the intermediate data format (used 1 bit instead of type uint8)
## [1.3] - 2017-11-02
@@ -2,8 +2,6 @@ PROJECT(lis)
CMAKE_MINIMUM_REQUIRED(VERSION 2.8)
include_directories(src)
# Find necessary packages
find_package(GDAL REQUIRED)
@@ -14,17 +12,12 @@ if(NOT GDAL_FOUND)
endif()
if (GDAL_CONFIG)
# extract gdal version
exec_program(${GDAL_CONFIG}
ARGS --version
OUTPUT_VARIABLE GDAL_VERSION )
string(REGEX REPLACE "([0-9]+)\\.([0-9]+)\\.([0-9]+)" "\\1" GDAL_VERSION_MAJOR "${GDAL_VERSION}")
string(REGEX REPLACE "([0-9]+)\\.([0-9]+)\\.([0-9]+)" "\\2" GDAL_VERSION_MINOR "${GDAL_VERSION}")
# check for gdal version
if (GDAL_VERSION_MAJOR LESS 2)
message (FATAL_ERROR "GDAL version is too old (${GDAL_VERSION}). Use 2.1 or higher.")
@@ -41,7 +34,7 @@ find_package( PythonLibs 2.7 REQUIRED)
include_directories( ${PYTHON_INCLUDE_DIRS} )
# Link to the Orfeo ToolBox
# LIS requires OTB 6.0
SET(OTB_MIN_VERSION "6.0.0")
find_package(OTB ${OTB_MIN_VERSION} REQUIRED)
@@ -73,6 +66,8 @@ endif(NOT lis_INSTALL_INCLUDE_DIR)
set(BUILD_SHARED_LIBS ON)
include_directories(src)
add_subdirectory(src)
add_subdirectory(python)
add_subdirectory(app)
@@ -5,15 +5,19 @@ This code implements the snow cover extent detection algorithm LIS (Let It Snow)
The algorithm documentation with examples is available here:
* [Algorithm theoretical basis documentation](http://tully.ups-tlse.fr/grizonnet/let-it-snow/blob/master/doc/tex/ATBD_CES-Neige.pdf)
* [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.1414452.svg)](https://doi.org/10.5281/zenodo.1414452)
Access to Theia Snow data collection:
* [![DOI:10.24400/329360/f7q52mnk](https://zenodo.org/badge/DOI/10.24400/329360/f7q52mnk.svg)](http://doi.org/10.24400/329360/f7q52mnk)
To read more about the "Centre d'Expertise Scientifique surface enneigée" (in French):
* [Bulletin THEIA](https://www.theia-land.fr/sites/default/files/imce/BulletinTHEIA3.pdf#page=10)
The input files are Sentinel-2 or Landsat-8 level-2A products from the [Theia Land Data Centre](https://theia.cnes.fr/) or [SPOT-4/5 Take 5 level-2A products](https://spot-take5.org), and a Digital Terrain Model (DTM), such as SRTM, reprojected at the same resolution as the input image.
## Usage
Run the Python script run_snow_detector.py with a JSON configuration file as its only argument:
@@ -24,7 +28,7 @@ The snow detection is performed in the Python script [run_snow_detector.py](app/
All the parameters of the algorithm and the paths to input and output data are stored in the JSON file. See the provided [param_test_s2_template.json](tes/param_test_s2_template.json) file for an example.
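A minimal sketch of such a configuration file can be generated with a few lines of Python. The section names (general, inputs, cloud) and keys appear in build_json.py; the concrete values and paths below are hypothetical placeholders, not a validated configuration:

```python
# Sketch: writing a minimal LIS configuration file. Section and key
# names are taken from build_json.py; all values/paths here are
# hypothetical placeholders.
import json

conf = {
    "general": {
        "pout": "/tmp/lis_output",     # output directory
        "multi": 10,                   # reflectance scaling factor
        "preprocessing": False,        # resample the input DTM if True
    },
    "inputs": {
        "green_band": {"path": "/data/S2/FRE_B3.tif"},
        "red_band":   {"path": "/data/S2/FRE_B4.tif"},
        "swir_band":  {"path": "/data/S2/FRE_B11.tif"},
    },
    "cloud": {
        "shadow_in_mask": 32,
        "shadow_out_mask": 64,
        "all_cloud_mask": 1,
        "high_cloud_mask": 128,
        "rf": 12,
    },
}

with open("param_test.json", "w") as f:
    json.dump(conf, f, indent=2)
```

The resulting file can then be passed as the single argument to run_snow_detector.py.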
Moreover, the JSON schema is available in the [Algorithm Theoretical Basis Documentation](doc/atbd/ATBD_CES-Neige.tex) and gives more information about the roles of these parameters.
NB: To build the DEM data, download the SRTM files covering the study area and build the .vrt using gdalbuildvrt. Edit the config.json file to activate preprocessing: set "preprocessing" to true and set the vrt path.
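The DEM preparation step can be scripted; a sketch that merely assembles the gdalbuildvrt command line (the tile directory layout and the .hgt extension are assumptions):

```python
# Collect SRTM tiles and build the gdalbuildvrt invocation described
# above. The directory layout and .hgt extension are assumptions;
# run the resulting command with subprocess.run(cmd, check=True)
# once the GDAL command-line tools are installed.
import glob
import os

def vrt_command(tile_dir, vrt_path="dem.vrt"):
    """Return the gdalbuildvrt command for all SRTM tiles in tile_dir."""
    tiles = sorted(glob.glob(os.path.join(tile_dir, "*.hgt")))
    return ["gdalbuildvrt", vrt_path] + tiles

cmd = vrt_command("srtm_tiles")
```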
@@ -37,6 +41,8 @@ NB: To build DEM data download the SRTM files corresponding to the study area an
* 2nd bit: Snow mask after pass2
* 3rd bit: Clouds detected at pass0
* 4th bit: Clouds refined at pass0
* 5th bit: Clouds initial (all_cloud)
* 6th bit: Slope flag (optional; 1 means bad slope correction)
For example, to select the snow from pass 1 and the detected clouds (bits 1 and 3):
```python
pixel_value & 0b00000101
```
* 100: Snow
* 205: Cloud including cloud shadow
* 254: No data
* SEB_VEC: Vector image of the snow mask and cloud mask. Two fields of information are embedded in this product. DN (for Data Neige) and type.
* DN field:
* 0: No-snow
* 100: Snow
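The bit layout of the expert mask can be decoded with plain integer masking; a self-contained sketch (the flag names are hypothetical, the bit positions follow the list above, 1st bit = least significant):

```python
# Sketch: decoding the expert-mask bits listed above. Bit numbering
# follows the README (1st bit = least significant). The flag names
# are hypothetical labels, not LIS identifiers.
EXPERT_MASK_BITS = {
    "snow_pass1": 1 << 0,     # 1st bit: snow mask after pass 1
    "snow_pass2": 1 << 1,     # 2nd bit: snow mask after pass 2
    "cloud_pass0": 1 << 2,    # 3rd bit: clouds detected
    "cloud_refined": 1 << 3,  # 4th bit: clouds refined
    "cloud_initial": 1 << 4,  # 5th bit: initial clouds (all_cloud)
    "bad_slope": 1 << 5,      # 6th bit: slope flag (optional)
}

def decode_expert_mask(pixel_value):
    """Return the names of the flags raised in one expert-mask pixel."""
    return {name for name, bit in EXPERT_MASK_BITS.items()
            if pixel_value & bit}
```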
@@ -79,7 +85,7 @@ Following a summary of the required dependencies:
* lxml
* matplotlib
GDAL itself depends on a number of other libraries provided by most major operating systems, and also on the non-standard GEOS and Proj libraries. The GDAL Python bindings are also required.
Python package dependencies:
@@ -100,7 +106,7 @@ Optional dependencies:
In your build directory, use cmake to configure your build.
```bash
cmake -C config.cmake source_lis_path
```
In your config.cmake you need to set:
```bash
@@ -129,7 +135,7 @@ chmod -R 755 ${install_dir}
```
The files will be installed by default into /usr/local and added to the default Python module path.
To override this behavior, the variable CMAKE_INSTALL_PREFIX must be configured before the build step.
Update environment variables for LIS. Make sure that OTB and other dependencies directories are set in your environment variables:
```bash
@@ -142,9 +148,11 @@ let-it-snow is now installed.
## Tests
Enable tests with the BUILD_TESTING cmake option. Use the ctest command to run the tests. Do not forget to clean your output test directory when you run a new set of tests.
Data (input and baseline) to run validation tests are available on [Zenodo](http://doi.org/10.5281/zenodo.166511).
* [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.166511.svg)](https://doi.org/10.5281/zenodo.166511)
Download LIS-Data and extract the folder. It contains all the data needed to run the tests. Set the Data-LIS path variable in the cmake configuration files.
Baseline: baseline data folder. It contains output files of S2Snow that have been reviewed and validated.
file(INSTALL ${CMAKE_CURRENT_SOURCE_DIR}/run_snow_detector.py DESTINATION ${CMAKE_BINARY_DIR}/app)
file(INSTALL ${CMAKE_CURRENT_SOURCE_DIR}/run_cloud_removal.py DESTINATION ${CMAKE_BINARY_DIR}/app)
file(INSTALL ${CMAKE_CURRENT_SOURCE_DIR}/run_snow_annual_map.py DESTINATION ${CMAKE_BINARY_DIR}/app)
file(INSTALL ${CMAKE_CURRENT_SOURCE_DIR}/build_json.py DESTINATION ${CMAKE_BINARY_DIR}/app)
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/run_snow_detector.py DESTINATION ${CMAKE_INSTALL_PREFIX}/app)
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/run_snow_annual_map.py DESTINATION ${CMAKE_INSTALL_PREFIX}/app)
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/run_cloud_removal.py DESTINATION ${CMAKE_INSTALL_PREFIX}/app)
install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/build_json.py DESTINATION ${CMAKE_INSTALL_PREFIX}/app)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import re
import sys
import json
import logging
import argparse
import zipfile
### Configuration Template ###
conf_template = {"general":{"pout":"",
@@ -46,25 +49,43 @@ conf_template = {"general":{"pout":"",
"strict_cloud_mask":False,
"rm_snow_inside_cloud":False,
"rm_snow_inside_cloud_dilation_radius":1,
"rm_snow_inside_cloud_threshold":0.85,
"rm_snow_inside_cloud_min_area":5000}}
### Mission Specific Parameters ###
MAJA_parameters = {"multi":10,
"green_band":".*FRE_R1.DBL.TIF$",
"green_bandNumber":2,
"red_band":".*FRE_R1.DBL.TIF$",
"red_bandNumber":3,
"swir_band":".*FRE_R2.DBL.TIF$",
"swir_bandNumber":5,
"cloud_mask":".*CLD_R2.DBL.TIF$",
"dem":".*ALT_R2\.TIF$",
"shadow_in_mask":4,
"shadow_out_mask":8,
"all_cloud_mask":1,
"high_cloud_mask":128,
"rf":12}
SEN2COR_parameters = {"mode":"sen2cor",
"multi":10,
"green_band":".*_B03_10m.jp2$",
"green_bandNumber":1,
"red_band":".*_B04_10m.jp2$",
"red_bandNumber":1,
"swir_band":".*_B11_20m.jp2$",
"swir_bandNumber":1,
"cloud_mask":".*_SCL_20m.jp2$",
"dem":"",
"shadow_in_mask":3,
"shadow_out_mask":3,
"all_cloud_mask":8,
"high_cloud_mask":10,
"rf":12}
Take5_parameters = {"multi":1,
"green_band":".*ORTHO_SURF_CORR_PENTE.*\.TIF$",
"green_bandNumber":1,
@@ -73,6 +94,8 @@ Take5_parameters = {"multi":1,
"swir_band":".*ORTHO_SURF_CORR_PENTE.*\.TIF$",
"swir_bandNumber":4,
"cloud_mask":".*NUA.*\.TIF$",
"div_mask":".*DIV.*\.TIF$",
"div_slope_thres":8,
"dem":".*\.tif",
"shadow_in_mask":64,
"shadow_out_mask":128,
@@ -80,6 +103,40 @@ Take5_parameters = {"multi":1,
"high_cloud_mask":32,
"rf":8}
S2_parameters = {"multi":10,
"green_band":".*FRE_B3.*\.tif$",
"green_bandNumber":1,
"red_band":".*FRE_B4.*\.tif$",
"red_bandNumber":1,
"swir_band":".*FRE_B11.*\.tif$",
"swir_bandNumber":1,
"cloud_mask":".*CLM_R2.*\.tif$",
"dem":".*ALT_R2\.TIF$",
"div_mask":".*MG2_R2.*\.tif$",
"div_slope_thres":64,
"shadow_in_mask":32,
"shadow_out_mask":64,
"all_cloud_mask":1,
"high_cloud_mask":128,
"rf":12}
L8_parameters_new_format = {"multi":1,
"green_band":".*FRE_B3.*\.tif$",
"green_bandNumber":1,
"red_band":".*FRE_B4.*\.tif$",
"red_bandNumber":1,
"swir_band":".*FRE_B6.*\.tif$",
"swir_bandNumber":1,
"cloud_mask":".*CLM_XS.*\.tif$",
"dem":".*ALT_R2\.TIF$",
"div_mask":".*MG2_XS.*\.tif$",
"div_slope_thres":64,
"shadow_in_mask":32,
"shadow_out_mask":64,
"all_cloud_mask":1,
"high_cloud_mask":128,
"rf":8}
L8_parameters = {"multi":1,
"green_band":".*ORTHO_SURF_CORR_PENTE.*\.TIF$",
"green_bandNumber":3,
@@ -88,6 +145,8 @@ L8_parameters = {"multi":1,
"swir_band":".*ORTHO_SURF_CORR_PENTE.*\.TIF$",
"swir_bandNumber":6,
"cloud_mask":".*NUA.*\.TIF$",
"div_mask":".*DIV.*\.TIF$",
"div_slope_thres":8,
"dem":".*\.tif",
"shadow_in_mask":64,
"shadow_out_mask":128,
@@ -95,9 +154,30 @@ L8_parameters = {"multi":1,
"high_cloud_mask":32,
"rf":8}
LANDSAT8_LASRC_parameters = {"mode":"lasrc",
"multi":10,
"green_band":".*_sr_band3.tif$",
"green_bandNumber":1,
"red_band":".*_sr_band4.tif$",
"red_bandNumber":1,
"swir_band":".*_sr_band6.tif$",
"swir_bandNumber":1,
"cloud_mask":".*_pixel_qa.tif$",
"dem":".*\.tif",
"shadow_in_mask":8,
"shadow_out_mask":8,
"all_cloud_mask":224, # cloud with high confidence (32+(64+128))
"high_cloud_mask":800, # cloud and high cloud with high confidence (32 + (512+256))
"rf":8}
mission_parameters = {"S2":S2_parameters,\
"LANDSAT8":L8_parameters,\
"LANDSAT8_new_format":L8_parameters_new_format,\
"Take5":Take5_parameters,\
"MAJA":MAJA_parameters,\
"SEN2COR":SEN2COR_parameters,\
"LANDSAT8_LASRC":LANDSAT8_LASRC_parameters
}
def str2bool(v):
if v.lower() in ('yes', 'true', 't', 'y', '1'):
@@ -111,10 +191,16 @@ def findFiles(folder, pattern):
""" Search recursively into a folder to find a pattern match
"""
matches = []
if folder.lower().endswith('.zip'):
zfile = zipfile.ZipFile(folder)
for filename in zfile.namelist():
if re.match(pattern, filename):
matches.append("/vsizip/"+os.path.join(folder, filename))
else:
for root, dirs, files in os.walk(folder):
for file in files:
if re.match(pattern, file):
matches.append(os.path.join(root, file))
return matches
def read_product(inputPath, mission):
@@ -126,7 +212,6 @@ def read_product(inputPath, mission):
conf_json = conf_template
conf_json["general"]["multi"] = params["multi"]
conf_json["inputs"]["green_band"]["path"] = findFiles(inputPath, params["green_band"])[0]
conf_json["inputs"]["red_band"]["path"] = findFiles(inputPath, params["red_band"])[0]
conf_json["inputs"]["swir_band"]["path"] = findFiles(inputPath, params["swir_band"])[0]
@@ -140,12 +225,27 @@ def read_product(inputPath, mission):
else:
logging.warning("No DEM found within product!")
# Check optional div mask parameters to access slope correction flags
if "div_mask" in params and "div_slope_thres" in params:
div_mask_tmp = findFiles(inputPath, params["div_mask"])
if div_mask_tmp:
conf_json["inputs"]["div_mask"] = div_mask_tmp[0]
conf_json["inputs"]["div_slope_thres"] = params["div_slope_thres"]
else:
logging.warning("div_mask was not found, the slope correction flag will be ignored")
conf_json["cloud"]["shadow_in_mask"] = params["shadow_in_mask"]
conf_json["cloud"]["shadow_out_mask"] = params["shadow_out_mask"]
conf_json["cloud"]["all_cloud_mask"] = params["all_cloud_mask"]
conf_json["cloud"]["high_cloud_mask"] = params["high_cloud_mask"]
conf_json["cloud"]["rf"] = params["rf"]
# Check if an optional mode is provided in the mission configuration.
# Used in the SEN2COR case to handle encoding differences between MAJA and SEN2COR.
if 'mode' in params:
conf_json["general"]["mode"] = params["mode"]
return conf_json
else:
logging.error(inputPath + " doesn't exist.")
@@ -201,23 +301,49 @@ def main():
inputPath = os.path.abspath(args.inputPath)
outputPath = os.path.abspath(args.outputPath)
sentinel2Acronyms = ['S2', 'SENTINEL2', 'S2A', 'S2B']
# Test if it is a MAJA output product (generated with MAJA processor version XX)
# FIXME: This detection based on directory substring matching is very weak and error-prone
# FIXME: use a factory and detect by using xml metadata
if '.SAFE' in inputPath:
# L2A SEN2COR product
logging.info("SEN2COR product detected (detect .SAFE in the input path...).")
jsonData = read_product(inputPath, "SEN2COR")
elif '.DBL.DIR' in inputPath:
if any(s in inputPath for s in sentinel2Acronyms):
logging.info("MAJA native product detected (detect .DBL.DIR substring in input path...)")
jsonData = read_product(inputPath, "MAJA")
else:
logging.error("Only MAJA products from Sentinels are supported by build_json.py script for now.")
elif any(s in inputPath for s in sentinel2Acronyms):
logging.info("THEIA Sentinel product detected.")
jsonData = read_product(inputPath, "S2")
elif "Take5" in inputPath:
logging.info("THEIA Take5 product detected.")
jsonData = read_product(inputPath, "Take5")
elif "LANDSAT8-OLITIRS-XS" in inputPath:
logging.info("THEIA LANDSAT8 product detected. (new version)")
jsonData = read_product(inputPath, "LANDSAT8_new_format")
elif "LANDSAT8" in inputPath:
logging.info("THEIA LANDSAT8 product detected.")
jsonData = read_product(inputPath, "LANDSAT8")
elif "LC08" in inputPath:
logging.info("LANDSAT8 LASRC product detected (LC08_L1TP in input path...).")
jsonData = read_product(inputPath, "LANDSAT8_LASRC")
else:
logging.error("Unknown product type.")
sys.exit(1)
if jsonData:
if not os.path.exists(outputPath):
logging.info("Create directory " + outputPath + "...")
os.makedirs(outputPath)
jsonData["general"]["pout"] = outputPath
# Override parameters for group general
if args.nodata is not None:
jsonData["general"]["nodata"] = args.nodata
if args.preprocessing is not None:
jsonData["general"]["preprocessing"] = args.preprocessing
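The substring-based detection chain in main() can be factored into a small helper; a sketch mirroring the same heuristics (the function name is hypothetical, and, as the FIXME above notes, metadata-based detection would be more robust):

```python
# Sketch: the product-type detection chain from main() as a testable
# helper. Mirrors the substring heuristics used above; the function
# name is hypothetical. Returns the mission key or None if unknown.
SENTINEL2_ACRONYMS = ('S2', 'SENTINEL2', 'S2A', 'S2B')

def detect_mission(input_path):
    """Guess the mission/product type from substrings of the input path."""
    if '.SAFE' in input_path:
        return "SEN2COR"
    if '.DBL.DIR' in input_path:
        if any(s in input_path for s in SENTINEL2_ACRONYMS):
            return "MAJA"
        return None  # only MAJA products from Sentinels are supported
    if any(s in input_path for s in SENTINEL2_ACRONYMS):
        return "S2"
    if "Take5" in input_path:
        return "Take5"
    if "LANDSAT8-OLITIRS-XS" in input_path:
        return "LANDSAT8_new_format"
    if "LANDSAT8" in input_path:
        return "LANDSAT8"
    if "LC08" in input_path:
        return "LANDSAT8_LASRC"
    return None
```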
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import sys
import os.path as op
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import sys
import os.path as op
@@ -37,8 +38,8 @@ def main(argv):
log = data.get("log", True)
if log:
sys.stdout = open(data.get('log_stdout', op.join(pout, "stdout.log")), 'w')
sys.stderr = open(data.get('log_stderr', op.join(pout, "stderr.log")), 'w')
# Set logging level and format.
logging.basicConfig(stream=sys.stdout, level=logging.INFO, \
@@ -50,7 +51,7 @@ def main(argv):
snow_annual_map_evaluation_app = snow_annual_map_evaluation.snow_annual_map_evaluation(data)
snow_annual_map_evaluation_app.run()
if data.get("run_comparison_evaluation", False):
snow_annual_map_evaluation_app.run_evaluation()
if data.get("run_modis_comparison", False):
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import sys
import os.path as op
import json
import logging
from s2snow import snow_detector
from s2snow.version import VERSION
def show_help():
# Snow Annual Maps: open points and perspectives
This file lists the identified open points, as well as the actions in terms of data processing and production.
**WARNING: The following applies to LIS version 1.5**
## Densification using Landsat 8 products (old format)
The old-format Landsat products are available through Theia. They were not used to densify the input snow products time series for the snow annual map generation (available under /work/OT/siaa/Theia/Neige/SNOW_ANNUAL_MAP_LIS_1.5). This can now be done using the [amalthee/0.2](https://gitlab.cnes.fr/datalake/amalthee.git) library. Among its parameters, it is possible to request products corresponding to a region of interest. This is the best way to retrieve products corresponding to the S2 and L8 new-format tiles (example: "T31TCH").
On CNES HPC:
```
module load amalthee/0.2
amalthee = Amalthee('oldlandsat')
amalthee.show_params("Landsat")
```
Once the old Landsat products are retrieved and processed into snow products, the snow annual map can
be densified by simply adding them to the densification list. Please refer to [tutorials/prepare_snow_annual_map_data.md](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/doc/tutorials/prepare_snow_annual_map_data.md)
for the usage of the script on CNES HPC.
## Modification of the [prepare_snow_annual_map_data.py](https://gitlab.orfeo-toolbox.org/remote_modules/let-it-snow/blob/develop/hpc/prepare_data_for_snow_annual_map.py)
This script must be modified at least to change the sub-processes that are currently asynchronous and require running the script multiple times.
@@ -211,6 +211,8 @@ The other output files are rather useful for the expert evaluation and troublesh
\item bit 2: snow (pass 2)
\item bit 3: clouds (pass 1)
\item bit 4: clouds (pass 2)
\item bit 5: clouds (initial all cloud)
\item bit 6: slope flag (optional bad slope correction flag)
\end{itemize}
\item a metadata file (*METADATA.XML)
% Created 2018-12-06
\documentclass[a4paper]{article}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{lmodern}
\usepackage{fixltx2e}
\usepackage{graphicx}
\usepackage{longtable}
\usepackage{float}
\usepackage{wrapfig}
\usepackage{rotating}
\usepackage[normalem]{ulem}
\usepackage{amsmath}
\usepackage{textcomp}
\usepackage{marvosym}
\usepackage{wasysym}
\usepackage{amssymb}
\usepackage{hyperref}
\hypersetup{
colorlinks=true,
linkcolor=blue,
pdfauthor=Germain Salgues,
pdftitle=Algorithm theoretical basis documentation for the cold cloud removal in
the snow cover product (Let-it-snow)}
\tolerance=1000
\usepackage{amsfonts,bm}
\usepackage{color}
\usepackage[usenames,dvipsnames]{xcolor}
\usepackage[margin=2.5cm,a4paper]{geometry}
\usepackage{enumitem}
\usepackage{fancyhdr}
\usepackage{tabularx}
\usepackage{algorithm}
\usepackage[noend]{algpseudocode}
\makeatletter
\def\BState{\State\hskip-\ALG@thistlm}
\makeatother
\renewcommand{\maketitle}{}
\date{\today}
% \title{ATBD CES surface enneigée}
% \hypersetup{
% pdfkeywords={},
% pdfsubject={},
% pdfcreator={Emacs 24.3.1 (Org mode 8.2.4)}}
\begin{document}
\maketitle
\pagestyle{fancy}
% \providecommand{\alert}[1]{\textbf{#1}}
% \setlist[itemize,1]{label=$\diamond$}
% \setlist[itemize,2]{label=$\ast$}
% \setlist[itemize,3]{label=$\star$}
% \setlist[itemize,4]{label=$\bullet$}
% \setlist[itemize,5]{label=$\circ$}
% \setlist[itemize,6]{label=$-$}
% \setlist[itemize,7]{label=$\cdot$}
% \setlist[itemize,8]{label=$\cdot$}
% \setlist[itemize,9]{label=$\cdot$}
% \renewlist{itemize}{itemize}{9}
\lhead[]{\includegraphics[width=0.1\textwidth]{./images/logo_cesbio.png}}
\rhead[]{\thepage}
% \cfoot{\textcolor{PineGreen}{copyright?}}
\begin{titlepage}
\includegraphics[width=0.3\textwidth]{./images/logo_cesbio.png}
\hspace{5cm}
\includegraphics[width=0.3\textwidth]{./images/Theia_en.png}
\vspace{3cm}
\textcolor{PineGreen}{ \huge \bfseries Theia Land Data Centre\\ }
% \vspace{0.5cm}
\rule{\linewidth}{0.5mm}
\begin{center}
{ \huge \bfseries Algorithm theoretical basis documentation for the cold cloud removal in
the snow cover product (Let-it-snow)\\}
\rule{\linewidth}{0.5mm}