Commit 04c5befb authored by Manuel Grizonnet

DOC: minor updates in classif recipes and otb-app

parent 338d25c8
@@ -16,7 +16,7 @@ library, or compose them into high level pipelines. OTB applications allow to:

OTB applications can be launched in different ways, and accessed from different
entry points. The framework can be extended, but Orfeo Toolbox ships with the following:

- A command-line launcher, to call applications from the terminal,
- A graphical launcher, with an auto-generated Qt interface, providing
  ergonomic parameter setting, display of documentation, and progress
@@ -145,7 +145,7 @@ Command-line examples are provided in chapter [chap:apprefdoc], page .

Using the GUI launcher
~~~~~~~~~~~~~~~~~~~~~~

The graphical interface for the applications provides a useful
interactive user interface to set the parameters, choose files, and
monitor the execution progress.
@@ -170,7 +170,7 @@ The resulting graphical application displays a window with several tabs:

- Parameters is where you set the parameters and execute the
  application.
- Logs is where you see the output given by the application
  during its execution.
- Progress is where you see a progress bar of the execution (not
@@ -258,7 +258,7 @@ application, changing the algorithm at each iteration.

    # Here we configure the smoothing algorithm
    app.SetParameterString("type", type)
    # Set the output filename, using the algorithm to differentiate the outputs
    app.SetParameterString("out", argv[2] + type + ".tif")
    # This will execute the application and save the output file
@@ -313,7 +313,7 @@ An example is worth a thousand words

        -outxml saved_applications_parameters.xml

Then, you can run the applications with the same parameters using the
output XML file previously saved. For this, you have to use the inxml
parameter:

::
@@ -328,7 +328,7 @@ time

    otbcli_BandMath -inxml saved_applications_parameters.xml
                    -exp "(im1b1 - im2b1)"

In this case it will use “(im1b1 - im2b1)” as the mathematical expression
instead of “abs(im1b1 - im2b1)”.

Finally, you can also launch applications directly from the command-line
@@ -340,7 +340,7 @@ the application name. Use in this case:

    otbApplicationLauncherCommandLine -inxml saved_applications_parameters.xml

It will retrieve the application name and related parameters from the
input XML file and, in this case, launch the BandMath application.
In-memory connection between applications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -362,7 +362,7 @@ In-memory connection between applications is available both at the C++

API level and using the Python bindings to the application presented
in the `Using the Python interface`_ section.

Here is a Python code sample connecting several applications together:

.. code-block:: python
@@ -421,7 +421,7 @@ writing is only triggered if:

- The output filename is ``.tif`` or ``.vrt``

In this case, the output image will be divided into several tiles
according to the number of MPI processes specified to the ``mpirun``
command, and all tiles will be computed in parallel.
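The idea of splitting an output image into one stripe of rows per process can be sketched in plain Python. This is only an illustration of the partitioning concept; the function name ``split_rows`` and the row-wise scheme are assumptions for the example, not OTB's actual internal tiling logic:

```python
def split_rows(height, n_procs):
    """Divide image rows into near-equal contiguous stripes, one per process."""
    base, extra = divmod(height, n_procs)
    tiles = []
    start = 0
    for rank in range(n_procs):
        rows = base + (1 if rank < extra else 0)
        tiles.append((start, start + rows))  # half-open range [first_row, last_row)
        start += rows
    return tiles

# e.g. a 1000-row image split among 4 processes
print(split_rows(1000, 4))  # -> [(0, 250), (250, 500), (500, 750), (750, 1000)]
```

Each process would then compute and write only its own stripe, which is what makes the tiles independent and parallelizable.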
@@ -459,7 +459,7 @@ Here is an example of MPI call on a cluster::

    ------------ END JOB INFO 1043196.tu-adm01 ---------

One can see that the registration and pan-sharpening of the
panchromatic and multi-spectral bands of a Pleiades image has been split
among 560 CPUs and took only 56 seconds.

Note that this MPI parallel invocation of applications is only
@@ -3,7 +3,7 @@ Recipes

This chapter presents guidelines to perform various remote sensing and
image processing tasks with either , or both. Its goal is not to be
exhaustive, but rather to help the non-developer user to get familiar
with these two packages, so that they can use and explore them for their
future needs.
@@ -27,7 +27,7 @@ available for each class in your image. The ``PolygonClassStatistics``

will do this job for you. This application processes a set of training
geometries and an image and outputs statistics about available samples
(i.e. pixels covered by the image and outside the no-data mask, if
provided), in the form of an XML file:

- number of samples per class
@@ -122,7 +122,7 @@ to determine the sampling rate, some geometries of the training set

might not be sampled.

The application will accept as input the input image and training
geometries, as well as the class statistics XML file computed during the
previous step. It will output a vector file containing point geometries
which indicate the location of the samples.
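The selection step can be illustrated with a small, self-contained sketch (pure Python, no OTB required). The helper ``select_samples``, the candidate-pixel dictionaries, and the purely random strategy are hypothetical stand-ins for what the sample-selection application does internally:

```python
import random

def select_samples(class_pixels, needed, seed=0):
    """Randomly pick the needed number of sample positions for each class.

    class_pixels: {class_label: [(row, col), ...]} candidate pixels per class
    needed:       {class_label: int} counts decided from the statistics step
    """
    rng = random.Random(seed)
    selection = {}
    for label, pixels in class_pixels.items():
        # Never request more samples than are actually available
        count = min(needed.get(label, 0), len(pixels))
        selection[label] = rng.sample(pixels, count)
    return selection

candidates = {"water": [(0, 0), (0, 1), (1, 0), (1, 1)], "road": [(5, 5), (5, 6)]}
picked = select_samples(candidates, {"water": 2, "road": 3})
print({k: len(v) for k, v in picked.items()})  # -> {'water': 2, 'road': 2}
```

Note how the requested count is clamped to the available pixels, mirroring the remark above that some geometries may end up not being sampled at all.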
@@ -215,7 +215,7 @@ images:

* **Custom mode:** The user indicates the target number of samples for
  each image.

The different behaviors for each mode and strategy are described as follows.
:math:`T_i( c )` and :math:`N_i( c )` refer respectively to the total number
and needed number of samples in image :math:`i` for class :math:`c`. Let's
call :math:`L` the total number of
@@ -223,7 +223,7 @@ image.

* **Strategy = all**

  - Same behavior for all modes (proportional, equal, custom): take all samples

* **Strategy = constant** (let's call :math:`M` the global number of samples
  per class required)
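One plausible reading of the constant strategy for two of the modes can be sketched in plain Python. This is an illustrative interpretation under the notation above (:math:`T_i`, :math:`M`, :math:`L`); the exact formulas and rounding used by OTB may differ:

```python
def needed_samples(totals, M, mode="proportional"):
    """Sketch: per-image needed counts for one class under strategy 'constant'.

    totals: [T_i] available samples of the class in each of the L images.
    M:      global number of samples wanted for the class.
    """
    L = len(totals)
    if mode == "equal":
        # Ask each image for an equal share of the M samples.
        return [min(T, M // L) for T in totals]
    # 'proportional': share M according to each image's availability.
    grand_total = sum(totals)
    return [min(T, round(M * T / grand_total)) for T in totals]

print(needed_samples([800, 200], M=100, mode="proportional"))  # -> [80, 20]
print(needed_samples([800, 200], M=100, mode="equal"))         # -> [50, 50]
```

In both modes the requested count is clamped to :math:`T_i`, since an image cannot provide more samples of a class than it contains.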
@@ -397,10 +397,10 @@ Fancy classification results

~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Color mapping can be used to apply color transformations on the final
gray level label image. It allows you to get an RGB classification map by
re-mapping the image values to be suitable for display purposes. One can
use the *ColorMapping* application. This tool will replace each label
with an 8-bit RGB color specified in a mapping file. The mapping file
should look like this:

::
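Such a mapping file can be generated programmatically. The sketch below assumes the common one-``label R G B``-triplet-per-line layout with ``#`` comment lines; verify the exact format expected by your OTB version:

```python
import os
import tempfile

def write_color_mapping(path, colors):
    """Write a ColorMapping-style file: one 'label R G B' line per class.

    colors: {label: (r, g, b)}. Layout assumed from the cookbook example.
    """
    with open(path, "w") as f:
        f.write("# label R G B\n")
        for label, (r, g, b) in sorted(colors.items()):
            f.write(f"{label} {r} {g} {b}\n")

# water: blue, roads: gray, vegetation: green (illustrative labels)
path = os.path.join(tempfile.mkdtemp(), "mapping.txt")
write_color_mapping(path, {1: (0, 0, 255), 2: (128, 128, 128), 3: (0, 255, 0)})
print(open(path).read())
```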
@@ -445,7 +445,7 @@ After having processed several classifications of the same input image

but from different models or methods (SVM, KNN, Random Forest,...), it
is possible to make a fusion of these classification maps with the
*FusionOfClassifications* application which uses either majority voting
or the Dempster-Shafer framework to handle this fusion. The Fusion of
Classifications generates a single more robust and precise
classification map which combines the information extracted from the
input list of labeled images.
@@ -465,7 +465,7 @@ parameters :

- ``-undecidedlabel`` label for the undecided class (default value = 0)

The input pixels with the no-data class label are simply ignored by the
fusion process. Moreover, the output pixels for which the fusion process
does not result in a unique class label are set to the undecided value.
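The majority-voting rule described above, including the no-data and undecided handling, can be sketched for a single pixel in plain Python. The label values and the ``majority_vote`` helper are illustrative, not the application's actual implementation:

```python
from collections import Counter

def majority_vote(labels, nodata=0, undecided=255):
    """Fuse one pixel's labels from several classifiers by majority voting.

    Labels equal to `nodata` are ignored; when no unique majority exists,
    the `undecided` label is returned. Label values here are illustrative.
    """
    votes = Counter(l for l in labels if l != nodata)
    if not votes:
        return undecided  # every classifier said no-data
    ranked = votes.most_common()
    if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
        return undecided  # tie: no unique class label
    return ranked[0][0]

print(majority_vote([2, 2, 3]))  # -> 2
print(majority_vote([2, 3, 0]))  # -> 255 (tie between 2 and 3)
```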
@@ -543,7 +543,7 @@ each of them should be confronted with a ground truth. For this purpose,

the masses of belief of the class labels resulting from a classifier are
estimated from its confusion matrix, which is itself exported as a
\*.CSV file with the help of the *ComputeConfusionMatrix* application.
Thus, using the Dempster-Shafer method to fuse classification maps needs
an additional input list of such \*.CSV files corresponding to their
respective confusion matrices.
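To make the idea concrete, here is one simple per-class confidence measure derivable from a confusion matrix: the class precision (diagonal entry over column sum). This is only an illustration of estimating confidence from a confusion matrix; the actual mass-of-belief estimation used by the Dempster-Shafer fusion may differ:

```python
def class_precision(confusion):
    """Per-class precision from a square confusion matrix
    (rows = reference labels, columns = produced labels).

    An illustrative confidence measure, not OTB's exact mass assignment.
    """
    n = len(confusion)
    precisions = []
    for c in range(n):
        produced = sum(confusion[r][c] for r in range(n))  # column sum
        precisions.append(confusion[c][c] / produced if produced else 0.0)
    return precisions

# Two classes: classifier confuses them slightly in both directions
matrix = [[90, 10],
          [5, 95]]
print(class_precision(matrix))
```

A classifier with high precision for a class would then carry more weight when that class is in conflict during fusion.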
@@ -569,9 +569,9 @@ based on the confidence level in each classifier.

.. figure:: ../Art/MonteverdiImages/classification_chain_inputimage.jpg
.. figure:: ../Art/MonteverdiImages/QB_1_ortho_DS_V_P_C123456_CM.png

Figure 5: From left to right: Original image, and fancy colored classified image obtained by a Dempster-Shafer fusion of the 6 classification maps represented in Fig. 4.13 (water: blue, roads: gray, vegetation: green, buildings with red roofs: red, undecided: white).

Recommendations to properly use the fusion of classification maps
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In order to properly use the *FusionOfClassifications* application, some

@@ -586,7 +586,7 @@ the interpretation of the ``OutputFusedClassificationImage``.

Majority voting based classification map regularization
-------------------------------------------------------

Resulting classification maps can be regularized in order to smooth
irregular classes. Such a regularization process improves classification
results by making more homogeneous areas which are easier to handle.
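The regularization principle, each pixel taking the dominant label of its neighborhood, can be sketched on a tiny label grid. This is a plain-Python illustration of majority-voting smoothing, not the regularization application itself; ties are resolved arbitrarily here:

```python
from collections import Counter

def majority_filter(labels, radius=1):
    """Smooth a label grid: each pixel takes the most frequent label
    in its (2*radius+1)^2 neighborhood (clipped at the borders)."""
    h, w = len(labels), len(labels[0])
    out = [row[:] for row in labels]
    for y in range(h):
        for x in range(w):
            votes = Counter(
                labels[j][i]
                for j in range(max(0, y - radius), min(h, y + radius + 1))
                for i in range(max(0, x - radius), min(w, x + radius + 1))
            )
            out[y][x] = votes.most_common(1)[0][0]
    return out

# An isolated label 2 surrounded by label 1 gets smoothed away
grid = [[1, 1, 1],
        [1, 2, 1],
        [1, 1, 1]]
print(majority_filter(grid))  # -> [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
```

This is exactly the "more homogeneous areas" effect: small isolated patches are absorbed by their dominant surroundings.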
@@ -663,7 +663,7 @@ images, which means that the value of each pixel corresponds to the

class label it belongs to. The ``InputLabeledImage`` is commonly an
image generated with a classification algorithm such as the SVM
classification. Remark: both ``InputLabeledImage`` and
``OutputLabeledImage`` are not necessarily of the same type.

Secondly, if ``ip.suvbool == true``, the Undecided label value must be
different from existing labels in the input labeled image in order to
avoid any ambiguity in the interpretation of the regularized
@@ -801,7 +801,7 @@ classification and regression mode.

- K-Nearest Neighbors

The behavior of the application is very similar to . From the input data
set, a portion of the samples is used for training, whereas the other
part is used for validation. The user may also set the model to train
and its parameters. Once the training is done, the model is stored in an