Commit 889f1287 authored by Manuel Grizonnet

STYLE: fix spelling errors in OTB Documentation

parent 564e2c6d
......@@ -6,7 +6,7 @@ Introduction
This is a replacement for the old OTB Cookbook, which was written in LaTeX. This version departs completely from the existing LaTeX format in favour of reStructuredText (rst).
Converting the existing LaTeX to rst is not that straightforward. All rst files for OTB applications are generated using the Python script otbGenerateWrappersRstDoc.py.
For others in recipes, we used a tool called pandoc to get an inital rst and then edited out errors manually. You do not have to generate them again.
For others in recipes, we used a tool called pandoc to get an initial rst and then edited out errors manually. You do not have to generate them again.
The old Cookbook in otb-documents is deprecated.
Requirements
......
......@@ -425,7 +425,7 @@ def ApplicationToRst(appname):
if len(seealso) >=2:
output += RstHeading("See Also", '~')
# output += ":See Also:" + linesep + linesep
output += "These additional ressources can be useful for further information: " + linesep
output += "These additional resources can be useful for further information: " + linesep
# hlink="<http://www.readthedocs.org/" + ConvertString(app.GetDocSeeAlso()) + ".html>`_ "
# output += linesep + "`" + ConvertString(app.GetDocSeeAlso()) + " " + hlink + linesep + linesep
output += linesep + ConvertString(app.GetDocSeeAlso()) + linesep + linesep
......
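The `RstHeading` helper called in the hunk above is not shown in this diff; a minimal sketch of what such a helper might look like (the name and signature are taken from the call site, the body is an assumption):

```python
def RstHeading(text, delimiter):
    # In reStructuredText, a heading is a line of text underlined by a
    # run of punctuation at least as long as the text itself.
    return text + "\n" + delimiter * len(text) + "\n"
```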
......@@ -251,7 +251,7 @@ After you can run:
If you are using *Synaptic*, you can add the repositories, update and
install the packages through the graphical interface.
For further informations about Ubuntu packages go to
For further information about Ubuntu packages go to
`ubuntugis-unstable <https://launchpad.net/~ubuntugis/+archive/ubuntugis-unstable>`__
launchpad page and click on Read about installing.
......
......@@ -72,7 +72,7 @@ The top toolbar is made up of ten icons; from left to right:
Image displaying
~~~~~~~~~~~~~~~~
This part of the main window is intented to display the images loaded by
This part of the main window is intended to display the images loaded by
the user. There are many nice keyboard shortcuts and mouse tricks that
let the user have a better experience when navigating through the
loaded images. These shortcuts and tricks are given within the Help item
......
......@@ -16,7 +16,7 @@ library, or compose them into high level pipelines. OTB applications allow to:
OTB applications can be launched in different ways, and accessed from different
entry points. The framework can be extended, but Orfeo Toolbox ships with the following:
- A command-line laucher, to call applications from the terminal,
- A command-line launcher, to call applications from the terminal,
- A graphical launcher, with an auto-generated Qt interface, providing
ergonomic parameter setting, display of documentation, and progress
......@@ -145,7 +145,7 @@ Command-line examples are provided in chapter [chap:apprefdoc], page .
Using the GUI launcher
~~~~~~~~~~~~~~~~~~~~~~
The graphical interface for the applications provides a usefull
The graphical interface for the applications provides a useful
interactive user interface to set the parameters, choose files, and
monitor the execution progress.
......@@ -170,7 +170,7 @@ The resulting graphical application displays a window with several tabs:
- Parameters is where you set the parameters and execute the
application.
- Logs is where you see the informations given by the application
- Logs is where you see the information given by the application
during its execution.
- Progress is where you see a progress bar of the execution (not
......@@ -258,7 +258,7 @@ application, changing the algorithm at each iteration.
# Here we configure the smoothing algorithm
app.SetParameterString("type", type)
# Set the output filename, using the algorithm to differenciate the outputs
# Set the output filename, using the algorithm to differentiate the outputs
app.SetParameterString("out", argv[2] + type + ".tif")
# This will execute the application and save the output file
......
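Completed, the loop from the hunk above might look like the sketch below; the `registry` argument stands in for `otbApplication.Registry` from the OTB Python bindings (an assumption of this example), so the construction can be shown without requiring OTB to be installed:

```python
def run_smoothing(in_file, out_prefix, registry):
    """Run the Smoothing application once per algorithm (sketch).

    `registry` is expected to behave like otbApplication.Registry from
    the OTB Python bindings (an assumption of this example).
    """
    outputs = []
    for algo in ("mean", "gaussian", "anisotropic"):
        app = registry.CreateApplication("Smoothing")
        app.SetParameterString("in", in_file)
        # Here we configure the smoothing algorithm
        app.SetParameterString("type", algo)
        # Set the output filename, using the algorithm to differentiate the outputs
        out = out_prefix + algo + ".tif"
        app.SetParameterString("out", out)
        # This executes the application and saves the output file
        app.ExecuteAndWriteOutput()
        outputs.append(out)
    return outputs
```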
......@@ -16,7 +16,7 @@ All of OTB's algorithms are accessible from its graphical interface called
Monteverdi, from QGIS, Python, the command line or C++. Monteverdi is an easy to
use, hardware accelerated visualization tool for satellite images in sensor
geometry. With it, end-users can visualize huge raw imagery products
and access all of the applications in the toolbox. From ressource limited
and access all of the applications in the toolbox. From resource limited
laptops to high performance clusters, OTB is available on Windows, Linux and
Mac. It is community driven, extensible and heavily documented.
Orfeo ToolBox is not a black box!
......@@ -24,7 +24,7 @@ Orfeo ToolBox is not a black box!
This is the CookBook documentation for users. If you are new to OTB and
Monteverdi, start here. It will go through how to install OTB on your system,
how to start using Monteverdi and OTB applications to view and process your
data, and recipies on how to accomplish typical remote sensing tasks.
data, and recipes on how to accomplish typical remote sensing tasks.
Finally, there is also documentation on every application shipped with OTB.
For other documentation, be sure to read:
......
......@@ -307,7 +307,7 @@ Please, also refer to the next section “Application Programming
Interface” ([ssec:API]).
**Function ndvi** This function implements the classical normalized
difference vegetation index; it tkaes two inputs. For instance:
difference vegetation index; it takes two inputs. For instance:
.. math:: ndvi(im1b1,im1b4)
......
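As an illustration of the formula behind `ndvi(im1b1, im1b4)` (assuming band 1 is red and band 4 is near-infrared, which depends on the sensor):

```python
def ndvi(red, nir):
    # Classical normalized difference vegetation index:
    # (NIR - RED) / (NIR + RED), in [-1, 1] for reflectances in [0, 1].
    return (nir - red) / (nir + red)
```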
......@@ -222,7 +222,7 @@ type is advised for this output image.
::
otbcli_LSMSSmallRegionsMerging -in filtered_range.tif
-inseg segementation.tif
-inseg segmentation.tif
-out segmentation_merged.tif uint32
-minsize 10
-tilesizex 256
......@@ -303,17 +303,17 @@ The application can be used like this:
-out FuzzyModel.xml
The output file ``FuzzyModel.xml`` contains the optimal model to perform
informations fusion.
information fusion.
First Step: Compute Descriptors
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The first step in the classifier fusion based validation is to compute,
for each studied polyline, the choosen descriptors. In this context, the
for each studied polyline, the chosen descriptors. In this context, the
*ComputePolylineFeatureFromImage* application can be used for a large
range of descriptors. It has the following inputs:
- ``-in`` an image (of the sudied scene) corresponding to the choosen
- ``-in`` an image (of the studied scene) corresponding to the chosen
descriptor (NDVI, building Mask…)
- ``-vd`` a vector data containing polyline of interest
......@@ -338,7 +338,7 @@ pixels along a polyline that verifies the formula “NDVI >0.4” :
-out VD_NONDVI.shp
``NDVI.TIF`` is the NDVI mono band image of the studied scene. This step
must be repeated for each choosen descriptor:
must be repeated for each chosen descriptor:
::
......
......@@ -5,18 +5,18 @@ Input and output images to any OTB application in the form of numpy array is now
The Python wrapping only exposes the OTB ApplicationEngine module, which gives access to the existing C++ applications.
Thanks to the loading mechanism of ApplicationEngine, no specific wrapping is required for each application.
Numpy extenstion to Python wrapping allows data exchange to application as an array rather than a disk file.
Numpy extension to Python wrapping allows data exchange to application as an array rather than a disk file.
Of course, it is possible to load an image from a file and then convert it to a numpy array, or just provide a file as before via
Application.SetParameterString(...).
This brige that completes numpy and OTB makes it easy to plug OTB into any image processing chain via python code that uses
This bridge between numpy and OTB makes it easy to plug OTB into any image processing chain via Python code that uses
GIS/Image processing tools such as GDAL, GRASS GIS, OSSIM that can deal with numpy.
The code below reads an input image using Python pillow (PIL) and converts it to a numpy array. This numpy array is
used as input to the application via the *SetImageFromNumpyArray(...)* method.
The application used in this example is `ExtractROI <../Applications/app_ExtractROI.html>`_. After extracting
a small area the ouput image is taken as numpy array with *GetImageFromNumpyArray(...)* method
a small area the output image is taken as numpy array with *GetImageFromNumpyArray(...)* method
::
......
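The snippet itself is elided by the diff. As a stand-in, extracting a region of interest from a numpy array can be sketched with plain slicing (the real example instead passes the array to ExtractROI with *SetImageFromNumpyArray(...)* and reads the result back with *GetImageFromNumpyArray(...)*):

```python
import numpy as np

# A fake 100x100 RGB image standing in for the PIL-loaded input
image = np.zeros((100, 100, 3), dtype=np.uint8)

# Extract a small area, analogous to ExtractROI's startx/starty/sizex/sizey
startx, starty, sizex, sizey = 10, 20, 30, 40
roi = image[starty:starty + sizey, startx:startx + sizex]
```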
......@@ -151,7 +151,7 @@ application will find these files and load directly the quicklook from
them, instead of decoding it from the
`Jpeg2000 <http://en.wikipedia.org/wiki/JPEG_2000>`_ file, resulting in
an instant loading of the image in Monteverdi. Since the weight of
these extra files is ususally of a few megaoctets, it is recommended to
these extra files is usually a few megabytes, it is recommended to
keep this option checked unless one has a very good reason not to. Now
that the `Pleiades <http://smsc.cnes.fr/PLEIADES/index.htm>`_ image is
loaded in Monteverdi, it appears in the main Monteverdi window,
......@@ -185,7 +185,7 @@ see the actual name of the place under mouse pointer). Last, as said in
the foreword of this section,
`Pleiades <http://smsc.cnes.fr/PLEIADES/index.htm>`_ image can be quite
large, so it might be convenient to switch the viewer style from
*Packed* to *Splitted*, in which case you will be able to maximize the
*Packed* to *Split*, in which case you will be able to maximize the
*Scroll Window* for better localisation of the viewed area. To do so,
one can go to the *Setup* tab of the *Viewer Control Window*.
......@@ -193,7 +193,7 @@ Handling mega-tiles in Monteverdi
--------------------------------------
If the `Pleiades <http://smsc.cnes.fr/PLEIADES/index.htm>`_ product is
very large, it might happen that the image is actually splitted into
very large, it might happen that the image is actually split into
several `Jpeg2000 <http://en.wikipedia.org/wiki/JPEG_2000>`_ files,
also called mega-tiles. Since the area of interest might span two or
more mega-tiles, it is convenient to stitch together these tiles so as
......@@ -230,7 +230,7 @@ uncompress it to disk. To do so, open the
Figure 5: A Pleiades image in Monteverdi Uncompress Jpeg2000 image module. (c) CNES 2012
`Figure 5` shows what this module looks like. On the left, one can find
informations about the images dimensions, resolution level, and number of
information about the image dimensions, resolution level, and number of
`Jpeg2000 <http://en.wikipedia.org/wiki/JPEG_2000>`_ tiles in image,
dimension of tiles, and size of tiles in megabytes. The center part of
the module is the most important one: it displays a quick-look of the
......
......@@ -596,7 +596,7 @@ Next we apply the H-alpha-A decomposition:
-out haa_extract.tif
The result has three bands: entropy (0..1) - alpha (0..90) - anisotropy
(0..1). It is splitted into 3 mono-band images thanks to following
(0..1). It is split into 3 mono-band images with the following
command:
::
......
......@@ -392,7 +392,7 @@ couple with image :math:`index_{0}` and :math:`index_{1}`, a second with
image :math:`index_{1}` and :math:`index_{2}`, and so on. If left blank,
images are processed by pairs (which is equivalent to using “0 1,2 3,4
5” …). In addition to the usual elevation and projection parameters,
main parameters have been splitted in groups detailled below:
main parameters have been split into the groups detailed below:
Output :
output parameters : DSM resolution, NoData value, Cell Fusion
......
......@@ -83,7 +83,7 @@
% Added a few more % signs
% May 10, 1996:
% version 1.99b:
% Changed the syntax of \f@nfor to be resistent to catcode changes of :=
% Changed the syntax of \f@nfor to be resistant to catcode changes of :=
% Removed the [1] from the defs of \lhead etc. because the parameter is
% consumed by the \@[xy]lhead etc. macros.
% June 24, 1997:
......@@ -129,7 +129,7 @@
% version 2.1
% The defaults for \footrulewidth, \plainheadrulewidth and
% \plainfootrulewidth are changed from \z@skip to 0pt. In this way when
% someone inadvertantly uses \setlength to change any of these, the value
% someone inadvertently uses \setlength to change any of these, the value
% of \z@skip will not be changed; rather, an error message will be given.
% March 3, 2004
......
% PICINS.STY --- Style file for including pictures
% Autor: J. Bleser, E. Lang
% Author: J. Bleser, E. Lang
% Hochschulrechenzentrum
% Technische Hochschule Darmstadt
% !!! This style file is copyright-protected !!!
......
......@@ -90,8 +90,8 @@ MACRO(CONVERT_AND_FLIP_IMG SOME_IMG EPS_IMG PATH)
ENDMACRO(CONVERT_AND_FLIP_IMG)
# Convert an image from some file format to EPS for inclusion in Latex using
# ImageMagick or bmeps.. This image is an input image. A seperate macro is necessary
# cause input images do not have any dependecies
# ImageMagick or bmeps. This image is an input image. A separate macro is necessary
# because input images do not have any dependencies
IF(OTB_USE_BMEPS)
MACRO(CONVERT_INPUT_IMG SOME_IMG EPS_IMG PATH)
ADD_CUSTOM_COMMAND(
......@@ -124,8 +124,8 @@ ENDIF(OTB_USE_BMEPS)
# # Convert an image from some file format to EPS for inclusion in Latex using
# # jpg2ps.. This image is an input image. A seperate macro is necessary
# # cause input images do not have any dependecies
# # jpg2ps. This image is an input image. A separate macro is necessary
# # because input images do not have any dependencies
# #in case of input image is in jpeg
# MACRO(CONVERT_INPUT_COMPRESSED_IMG BASIC_OPTION SOME_IMG EPS_IMG PATH)
# ADD_CUSTOM_COMMAND(
......@@ -149,8 +149,8 @@ ENDIF(OTB_USE_BMEPS)
# Convert an image from some file format to EPS for inclusion in Latex using
# ImageMagick.. This image is an input image. A seperate macro is necessary
# cause input images do not have any dependecies. Also flip
# ImageMagick. This image is an input image. A separate macro is necessary
# because input images do not have any dependencies. Also flip
MACRO(CONVERT_AND_FLIP_INPUT_IMG SOME_IMG EPS_IMG PATH)
ADD_CUSTOM_COMMAND(
SOURCE "${PATH}/${SOME_IMG}"
......@@ -357,7 +357,7 @@ SET( OTB_EXAMPLES_SRCS
${OTB_SOURCE_DIR}/Examples/Markov/MarkovClassification1Example.cxx
${OTB_SOURCE_DIR}/Examples/Markov/MarkovClassification2Example.cxx
${OTB_SOURCE_DIR}/Examples/Markov/MarkovClassification3Example.cxx
${OTB_SOURCE_DIR}/Examples/Markov/MarkovRestaurationExample.cxx
${OTB_SOURCE_DIR}/Examples/Markov/MarkovRestorationExample.cxx
${OTB_SOURCE_DIR}/Examples/Markov/MarkovRegularizationExample.cxx
${OTB_SOURCE_DIR}/Examples/Learning/SVMPointSetModelEstimatorExample.cxx
${OTB_SOURCE_DIR}/Examples/Learning/SVMPointSetClassificationExample.cxx
......
......@@ -27,7 +27,7 @@
#
# Please do not specify paths along with the file names. A list of search paths
# where input data files may be found is specified through CMAKE. Paths are
# specified in a colon seperated list such as
# specified in a colon separated list such as
# /Insight/Examples/Data:/VTK/VTKData
# Specifying the root path will suffice. A recursive search for input data
# is done.
......@@ -54,7 +54,7 @@
# Note that the eps files are flipped, not the inputs or the outputs themselves. The
# files that are used in the command line arguments etc are the original ones. In other
# words every image that you see in the SW guide that is the same as or is generated
# from the list of images in teh CMakeLists file is a flipped version!
# from the list of images in the CMakeLists file is a flipped version!
#
use File::Spec; #for platform independent file paths
use File::Find; #for platform independent recursive search of input images in
......@@ -77,11 +77,11 @@ $numArgs = $#ARGV + 1;
if( $numArgs < 5 )
{
print "Usage arguments: \n";
print " Name of the .cxx/.txx file (with extenstion).\n";
print " Name of the .cxx/.txx file (with extension).\n";
print " OTBExecsDirectoryPath \n";
print " Cmake file to be generated\n";
print " Name of the TEX file generated, so dependencies can be specified\n";
print " Ouput folder to store generated images\n";
print " Output folder to store generated images\n";
print " Double Colon separated list of possible include directories for input images\n";
die;
}
......
......@@ -8,7 +8,7 @@ functions. They were previously known as the OTB-Applications
package but are now part of the OTB library. The new framework is
slightly different from before but they can be used pretty much the
same way: each application has its set of inputs, outputs, parameters.
The applications can be lauched as a command line interface but also
The applications can be launched from the command line but also
via a Qt GUI. In addition, they can be wrapped for SWIG and PyQt. For a
complete list of these applications, please refer to the
\href{http://orfeo-toolbox.org/Applications}{applications documentation}.
......
......@@ -523,7 +523,7 @@ Like for the majority voting method, the Dempster Shafer fusion handles not
unique class labels with the maximal belief function. In this case, the output
fused pixels are set to the undecided value.
The confidence levels of all the class labels are estimated from a comparision of
The confidence levels of all the class labels are estimated from a comparison of
the classification maps to fuse with a ground truth, which results in a
confusion matrix. For each classification map, these confusion matrices are then
used to estimate the mass of belief of each class label.
......
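The comparison step described above can be illustrated with a small confusion-matrix counting sketch (a simplification for illustration only; the actual application works on whole classification maps):

```python
def confusion_matrix(reference, predicted, labels):
    # matrix[i][j] counts pixels with reference label i and predicted label j
    index = {label: i for i, label in enumerate(labels)}
    matrix = [[0] * len(labels) for _ in labels]
    for ref, pred in zip(reference, predicted):
        matrix[index[ref]][index[pred]] += 1
    return matrix
```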
......@@ -21,7 +21,7 @@ A wiki page is also available here : \url{http://wiki.orfeo-toolbox.org/index.ph
\item Some guidelines only apply after modularization is completed
\end{itemize}
Contributions through Remote Modules is the prefered way if your contribution is about adding new classes or application to bring new features to OTB.
Contributions through Remote Modules are the preferred way if your contribution is about adding new classes or applications to bring new features to OTB.
Please also refer to ITK guidelines for remote modules (\url{http://www.itk.org/Wiki/ITK/Policy_and_Procedures_for_Adding_Remote_Modules}).
\section{What are remote modules?}
......@@ -48,7 +48,7 @@ To make it short, by contributing a remote module:
\begin{itemize}
\item Follow the instructions on writing a remote module in order to have a working remote module inside your local source code tree (\ref{sec:writemodule}).
\item Host the remote module code on a publicly available git repository. If you do not have access to a git server, bitbucket or github can provide this service for you.
\item Write a short email to the otb-developers list, detailing your contributed remote module, and providing the cmake file to add into Modules/Remote so as to get it into OTB, as well as evidence that you comply with the remote module policy (see bellow).
\item Write a short email to the otb-developers list, detailing your contributed remote module, and providing the cmake file to add into Modules/Remote so as to get it into OTB, as well as evidence that you comply with the remote module policy (see below).
\item Remote module acceptance policy compliance will be checked by the otb-developers list,
\item Acceptance of remote module is submitted to vote on otb-developers (to be reviewed by PSC).
\end{itemize}
......@@ -69,7 +69,7 @@ A remote module can be removed from Modules/Remote (this only requires to remove
So as to get your module accepted as an official remote module, you should comply with the following:
\begin{itemize}
\item Remote module source code should be hosted on a publicly available Git repository
\item Author of the remote module should be identified and registed to otb-developers mailing list
\item Author of the remote module should be identified and registered on the otb-developers mailing list
\item Author of the remote module accepts to be contacted by developers or users regarding issues with his module (on a best effort basis),
\item Remote module source code should comply with OTB style as much as possible,
\item Remote module source code should be documented using doxygen tags,
......
......@@ -87,7 +87,7 @@ as an artificial construct for color comparison in the sense that
just as a way of saying that we can produce $ColorB$ by combining $ColorA$ and
$ColorC$. However, we must be aware that (at least in emitted light) it is not
possible to \emph{substract light}. So when we mention
possible to \emph{subtract light}. So when we mention
Equation~\ref{eqn:ColorSubtraction} we actually mean
\begin{equation}
......
......@@ -33,7 +33,7 @@ without too much loss of accuracy.
\input{MNFExample}
\section{Fast Independant Component Analysis}
\section{Fast Independent Component Analysis}
\input{ICAExample}
......
......@@ -37,7 +37,7 @@ by the wall-ground bounce in a radar image)?
We can answer by saying that the images of the same object obtained by different
sensors are two different representations of the same reality. For the
same spatial location, we have two different measures. Both informations
same spatial location, we have two different measures. Both pieces of information
come from the same source and thus they have a lot of common
information. This relationship may not be perfect, but it can be
evaluated in a relative way: different geometrical distortions are
......@@ -158,7 +158,7 @@ Arg \max_T(S_c(I,T\circ J));
\subsection{Geometric deformation modeling\label{sec-model}}
The geometric transformation of definition \ref{defin-T} is used for
the correction of the existing deformation between the two images to be
registered. This deformation contains informations which are linked to
registered. This deformation contains information which is linked to
the observed scene and the acquisition conditions. They
can be classified into 3 classes depending on their physical source:
\begin{enumerate}
......
......@@ -3,7 +3,7 @@
% \section{Introduction}
Under the term {\em Feature Extraction} we include several techniques
aiming to detect or extract informations of low level of abstraction
aiming to detect or extract information of low level of abstraction
from images. These {\em features} can be objects: points, lines,
etc. They can also be measures: moments, textures, etc.
......
......@@ -451,12 +451,12 @@ filters that approximate the convolution with a Gaussian
\subsection{Edge preserving Markov Random Field}
The Markov Random Field framework for OTB is more detailled in \ref{sec:MarkovRandomFieldOTB} (p. \pageref{sec:MarkovRandomFieldOTB}).
The Markov Random Field framework for OTB is more detailed in \ref{sec:MarkovRandomFieldOTB} (p. \pageref{sec:MarkovRandomFieldOTB}).
\index{Markov!Filtering}
\index{Markov!Restauration}
\index{Markov!Restoration}
\ifitkFullVersion
\input{MarkovRestaurationExample.tex}
\input{MarkovRestorationExample.tex}
\fi
\section{Distance Map}
......
......@@ -54,7 +54,7 @@ reflectance spectrum associated with each pixel is a linear combination of pure
\end{figure}
The ``left'' term represents the different spectral bands of the
data cube. The ``right'' term represents a ``product''
between the reflectance spectra of endmembers and their repective abundances. Abundance band of endmembers is
between the reflectance spectra of endmembers and their respective abundances. The abundance band of an endmember is
a grayscale image with values between $0$ and $1$. The pixel i of the
abundance band of endmember j is $s_{ji}$. This value is the
abundance of endmember j in pixel i. Under certain conditions
......@@ -144,7 +144,7 @@ the columns of a projection matrix V of dimension $(Lx(J-1))$. Reduced
data Z, of dimensions $((J-1)xI)$ are obtained by the operation:
$Z=V^{T}(\tilde{R})$
where each column of $\tilde{R}$ where the average spectrum is substracted,
where each column of $\tilde{R}$ where the average spectrum is subtracted,
generally estimated under maximum likelihood. In the
subspace carrying the column-vectors Z, endmember spectra are associated with the vertices of the simplex. If the noise is
negligible, the simplex circumscribes the reduced data.
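The linear mixing model described above (each pixel spectrum as an abundance-weighted sum of endmember spectra) can be sketched as:

```python
def mix_pixel(endmember_spectra, abundances):
    # Linear mixing model: band b of the pixel is the abundance-weighted
    # sum of the endmember spectra at band b (noise-free sketch).
    bands = len(endmember_spectra[0])
    return [
        sum(a * spectrum[b] for a, spectrum in zip(abundances, endmember_spectra))
        for b in range(bands)
    ]
```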
......@@ -206,7 +206,7 @@ Existing algorithms are:
\begin{itemize}
\item {MVSA (Minimum Volume Simplex
Analysis) \cite{Li2008}.}
\item { MVES (Minimum Volume Encosing Simplex)
\item { MVES (Minimum Volume Enclosing Simplex)
[Chan2009].}
\item {SISAL (Simplex Identification via Split Augmented
Lagrangian) \cite{Dias2009}.}
......
......@@ -236,7 +236,7 @@ The existing algorithms are:
\ Begin {itemize}
\ Item {MVSA (Minimum Volume Simplex
Analysis) [Li2008].}
\ Item {VSS (Volume Minimum Encosing Simplex)
\ Item {VSS (Volume Minimum Enclosing Simplex)
[Chan2009].}
\ Item {SISAL (Simplex Identification via Augmented Split
Lagrangian) [Dias2009].}
......
......@@ -19,7 +19,7 @@ To be able to use OTB classes from within IDL or ENVI, the main requirement is t
The mechanism is simple. A layer is added to call the OTB routine with
IDL inputs and outputs, an executable is created and finally an IDL mechanism will create IDL routines that will use our OTB routine.
First we will see how to create the project in section~\ref{buildProject}. Then we will see how to wrap a simple OTB algorithm (Canny filter) in section~\ref{canny}, and later a complete explanation of
the C++ (section~\ref{cfiles}) and IDL files contents will be studied (section~\ref{idlfiles}) . Finaly, we will see how to integrate those new IDL routines in the ENVI GUI (section~\ref{envi}).
the C++ (section~\ref{cfiles}) and IDL file contents will be studied (section~\ref{idlfiles}). Finally, we will see how to integrate those new IDL routines in the ENVI GUI (section~\ref{envi}).
\section{How to Build the Project: CMake and CMakeLists.txt}
\label{buildProject}
......@@ -234,7 +234,7 @@ After this simple example, let's see more complete (and complex) explanations of
\subsection{Input and Output Types}
To make things easy, we chose to create one C++ file for each wanted IDL function. Those files are \code{.h} and \code{.cxx} files stored in \code{Code/Sources/Example1} in the example tree.
To avoid multiple definitions due to links between executables, an ifndef/define block has to be writen:\\
To avoid multiple definitions due to links between executables, an ifndef/define block has to be written:\\
\code{\#ifndef \_myfunction\_h}\\
\code{\#define \_myfunction\_h}\\
And, at the end of the \code{.h} file:
......
......@@ -574,8 +574,8 @@ describe the main characteristics of such transforms.
%% Please read the ITK Software Guide http://www.itk.org/ItkSoftwareGuide.pdf for
%% a description of the use of the CenteredTransformInitializer class.
%% \item Optimizer steps too large. If your optimizer takes steps that are too
%% large, it risks to become unstable and to send the images too far appart. You
%% may want to start the optimizer with a maximum step lenght of 1.0, and only
%% large, it risks to become unstable and to send the images too far apart. You
%% may want to start the optimizer with a maximum step length of 1.0, and only
%% increase it once you have managed to fine tune all other registration
%% parameters.
......@@ -636,7 +636,7 @@ describe the main characteristics of such transforms.
%% Once you identify the anatomical structures in the
%% histogram, then rerun that same program with less
%% and less number of bins, until you reach the minimun
%% and less number of bins, until you reach the minimum
%% number of bins for which all the tissues that are important
%% for your application, are still distinctly differentiated in the
%% histogram. At that point, take that number of bins and
......@@ -709,7 +709,7 @@ describe the main characteristics of such transforms.
%% Just keep in mind that what the optimizer will do is to
%% "jump" in a parametric space of 6 dimensions, and that the
%% component of the jump on every dimension will be proportional
%% to 1/scaling factor * OptimizerStepLenght. Since you put
%% to 1/scaling factor * OptimizerStepLength. Since you put
%% the optimizer Step Length to 0.1, then the optimizer will start
%% by exploring the rotations at jumps of about 5degrees, which
%% is a conservative rotation for most medical applications.
......
......@@ -19,7 +19,7 @@ Here we present a complete pipeline to simulate image using sensor characteristi
\item label image: describes image object properties.
\item label properties: describes each label's characteristics.
\item mask: vegetation image mask.
\item cloud mask (optionnal).
\item cloud mask (optional).
\item acquisition parameter file: file containing the parameters for the acquisition.
\item RSR file: file name for the relative spectral response to be used.
\item sensor FTM file: file name for sensor spatial interpolation.
......
......@@ -243,7 +243,7 @@ there is no issue with new changes in OTB and its dependencies. It simply works!
SuperBuild downloads dependencies into the \texttt{DOWNLOAD\_LOCATION} directory, which will be
\texttt{$\sim$/OTB/build/Downloads} in our example.
Dependencies can be downloaded manually into this directory before the compilation step.
This can be usefull if you wish to bypass a proxy, intend to compile OTB without an internet conection, or other network
This can be useful if you wish to bypass a proxy, intend to compile OTB without an internet connection, or face other network
constraints. You can find an archive with the sources of all our dependencies on the Orfeo ToolBox website (pick the 'SuperBuild-archives' corresponding to the OTB version you want to build):
\begin{center}
\url{https://www.orfeo-toolbox.org/packages}
......
......@@ -146,7 +146,7 @@ The \code{OTB} contains the following subdirectories:
examples used by this guide and to illustrate important
OTB concepts.
\item \code{OTB/Superbuild}---CMake scripts to automatically download, patch, build and install important
depencies of OTB (ITK, OSSIM, GDAL to name a few).
dependencies of OTB (ITK, OSSIM, GDAL to name a few).
\item \code{OTB/Utilities}---small programs used for the maintenance of OTB.
\end{itemize}
......
......@@ -276,7 +276,7 @@ page~\pageref{sec:ImageAdaptors} for more information about image adaptors).
\subsection{Iteration Loops}
\label{sec:IterationExample}
% Now give a psuedo code example for putting all of this together.
% Now give a pseudo code example for putting all of this together.
Using the methods described in the previous sections, we can now write a simple
example to do pixel-wise operations on an image. The following code calculates
the squares of all values in an input image and writes them to an output image.
......
......@@ -11,7 +11,7 @@ Style Guidelines for the ITK Software Guide
do++;
}
2. \doxygen{} - The first occurence of a class in a section should use this
2. \doxygen{} - The first occurrence of a class in a section should use this
(or for those in sub-name spaces like itk::Statistics::) use \subdoxygen{}.
Subsequent references to the class name should just use plain text (do not
......
......@@ -107,7 +107,7 @@ image formats. In order to do so, the Geospatial Data Abstraction Library, GDAL
the full format list.
Since GDAL is itself a multi-format library, the GDAL IO
factory is able to choose the appropriate ressource for reading and
factory is able to choose the appropriate resource for reading and
writing images.
In most cases the mechanism is transparent to the user who only interacts
......@@ -214,14 +214,14 @@ This case leads to the following question : which geo-referencing element
should be used when opening this image in an OTB reader. In fact, it depends on
the user's needs. For an orthorectification application, the sensor model must be
used. In order to specify which information should be skipped, a syntax of
extended filenames has been developped for both reader and writer.
extended filenames has been developed for both reader and writer.
\subsection{Syntax}
The reader and writer extended file name support is based on the same syntax,
only the options are different. To benefit from the extended file name
mecanism, the following syntax is to be used:
mechanism, the following syntax is to be used:
\begin{verbatim}
Path/Image.ext?&key1=<value1>&key2=<value2>
......@@ -264,7 +264,7 @@ IMPORTANT: Note that you'll probably need to "quote" the filename.
\begin{itemize}
\item Skip geometric information
\item Clears the keyword list
\item Keeps the projectionref and the origin/spacing informations
\item Keeps the projectionref and the origin/spacing information
\item false by default.
\end{itemize}
\item \begin{verbatim}&skiprpctag=<(bool)true>\end{verbatim}
......
\chapter{Reading and Writing Auxilary Data}
\index{Auxilary data}
\chapter{Reading and Writing Auxiliary Data}
\index{Auxiliary data}
\label{sec:ReadingAuxData}
As we have seen in the previous chapter, OTB has a great capability to
......@@ -27,14 +27,14 @@ this kind of data.
More examples about representing DEM are presented in section~\ref{sec:ViewingAltitudeImages}.
\section{Reading and Writing Shapefiles and KML}
\index{Auxilary data!vector data}
\index{Auxilary data!shapefile}
\index{Auxilary data!KML}
\index{Auxiliary data!vector data}
\index{Auxiliary data!shapefile}
\index{Auxiliary data!KML}
\label{sec:ReadVectorData}
\input{VectorDataIOExample.tex}
\section{Handling large vector data through OGR}
\index{Auxilary data!OGR vector data}
\index{Auxiliary data!OGR vector data}
\index{OGR wrappers}
\index{OGR wrappers!otb::ogr::DataSource}
\index{OGR wrappers!otb::ogr::Layer}
......
......@@ -919,7 +919,7 @@ $3D$.
Given that the space of Versors is not a Vector space, typical gradient descent
optimizers are not well suited for exploring the parametric space of this
transform. The \doxygen{itk}{VersorRigid3DTranformOptimizer} has been
transform. The \doxygen{itk}{VersorRigid3DTransformOptimizer} has been
introduced in the ITK toolkit with the purpose of providing an optimizer that
is aware of the Versor space properties on the rotational part of this
transform, as well as the Vector space properties on the translational part of
......
......@@ -5,7 +5,7 @@ Well, that's it, you've just downloaded and installed OTB, lured by the promise
that you will be able to do everything with it. That's true, you will be able
to do everything but - there is always a {\em but} - some effort is required.
OTB uses the very powerful systems of generic programing, many classes are
OTB uses the very powerful system of generic programming; many classes are
already available, some powerful tools are defined to help you with recurrent
tasks, but it is not an easy world to enter.
......@@ -406,7 +406,7 @@ parameter with \code{SmarterFilteringPipeline -in QB\_Suburb.png -out output.png
Quite often, when you buy satellite images, you end up with several images. In the case of an optical satellite, you often have a panchromatic spectral band with the highest spatial resolution and a multispectral product of the same area with a lower resolution. The resolution ratio is likely to be around 4.
To get the best of the image processing algorithms, you want to combine these data to produce a new image with the highest spatial resolution and several spectral band. This step is called fusion and you can find more details about it in \ref{sec:Fusion}. However, the fusion suppose that your two images represents exactly the same area. There are different solutions to process your data to reach this situation. Here we are going to use the metadata available with the images to produce an orthorectification as detailled in \ref{sec:Ortho}.
To get the best of the image processing algorithms, you want to combine these data to produce a new image with the highest spatial resolution and several spectral bands. This step is called fusion and you can find more details about it in \ref{sec:Fusion}. However, fusion supposes that your two images represent exactly the same area. There are different solutions to process your data to reach this situation. Here we are going to use the metadata available with the images to produce an orthorectification as detailed in \ref{sec:Ortho}.
First you need to add the following lines in the \code{CMakeLists.txt} file:
......
......@@ -46,7 +46,7 @@ It is also mandatory to implement three methods in a new application:
\subsection{DoInit()}
\label{sec:appDoInit}
This method is called once, when the application is instanciated. It should
This method is called once, when the application is instantiated. It should
contain the following actions:
\begin{itemize}
\item Set the name and the description of the application
......
......@@ -8,6 +8,6 @@
This document presents the different geometric concepts used in the toolkit,
as well as their relationships and how they can effectively be used.
At the beggining there was the Point
At the beginning there was the Point
*/
......@@ -30,7 +30,7 @@ particular problems.
\subsection RegistrationMetrics Similarity Metrics
Metrics are probably the most critical element of a registration problem. The metric defines what the goal of the process is, they measure how well the Target object is matched by the Reference object after the transform has been applied to it. The Metric should be selected in function of the types of objects to be registered and the expected kind of missalignment. Some metrics has a rather large capture region, which means that the optimizer will be able to find his way to a maximum even if the missalignment is high. Typicaly large capture regions are associated with low precision for the maximum. Other metrics can provide high precision for the final registration, but usually require to be initialized quite close to the optimal value.
Metrics are probably the most critical element of a registration problem. The metric defines what the goal of the process is: it measures how well the Target object is matched by the Reference object after the transform has been applied to it. The Metric should be selected as a function of the types of objects to be registered and the expected kind of misalignment. Some metrics have a rather large capture region, which means that the optimizer will be able to find its way to a maximum even if the misalignment is high. Typically, large capture regions are associated with low precision for the maximum. Other metrics can provide high precision for the final registration, but usually need to be initialized quite close to the optimal value.
Unfortunately there are no clear rules about how to select a metric, other than trying some of them in different conditions. In some cases it could be an advantage to use a particular metric to get an initial approximation of the transformation, and then switch to another more sensitive metric to achieve better precision in the final result.
......
......@@ -289,7 +289,7 @@
In general, iterators are not the kind of objects that users of the
toolkit would need to use. They are rather designed to be used by
code developers that add new components to the toolkit, like writting
code developers who add new components to the toolkit, such as writing
a new Image filter, for example.
Before starting to write code that uses iterators, users should consider
......
......@@ -53,7 +53,7 @@ itk::NeighborhoodIterator classes. ITK NeighborhoodIterators allow for code
that is closer to the algorithmic abstraction,
\code
ForAllTheIndicies i in Image
ForAllTheIndices i in Image
GetTheNeighbors n in a 3x3 region around i
Compute the mean of all pixels n
Write the value to the output at i
......@@ -164,7 +164,7 @@ pixel in a neighborhood is always Size()/2.
important. SmartNeighborhoodIterator defines a special class of neighborhood
iterators that transparently handle boundary conditions. These iterators store
a boundary condition object that is used to calculate a value of requested
pixel indicies that lie outside the data set boundary. New boundary condition
pixel indices that lie outside the data set boundary. New boundary condition
objects can be defined by a user and plugged into SmartNeighborhoodIterators as
is appropriate for their algorithm.
......@@ -230,7 +230,7 @@ for (regions_iterator++ ; regions_iterator != regions.end(); regions_iterator++)
The NeighborhoodAlgorithm::ImageBoundaryFacesCalculator is a special function
object that returns a list of sub-regions, or faces, of an image region. The
first region in the list corresponds to all the non-boundary pixels in the
input image region. Subseqent regions in the list represent all of the boundary
input image region. Subsequent regions in the list represent all of the boundary
faces of the image (because an image region is defined only by a single index
and size, no single composite boundary region is possible). The list is
traversed with an iterator.
......
......@@ -38,7 +38,7 @@ elements. They are:
\li \b Mapper: the particular technique used for interpolating values when objects are resampled through the \e Transform.
\li \b Optimizer: the method used to find the \e Transform parameters that optimize the \e Metric.
A particular registration method is defined by selecting specific implemementations of each one of these basic elements.
A particular registration method is defined by selecting specific implementations of each one of these basic elements.
In order to determine the registration method appropriate for a particular problem, it will be useful to answer the following questions: