Fix the neural network classifier and the TrainImagesClassifier tests
Summary
Closes #2191
Current behavior
The TrainImagesClassifier tests (the apTvClTrainMethodXXXImagesClassifierQB1 family of tests) are broken: they always pass on CI but, as a user reported on the forum, their output is incoherent.
On CI, these tests rely on an ASCII comparison of the computed confusion matrix, with a tolerance of 2 for numerical values in the tested files. This tolerance was added in !224 (merged) to account for the fact that the Shark classifiers use randomness and that the random seed cannot be set for these tests; allowing an error of two misclassified samples was meant to absorb that randomness. But the tolerance in --compare-ascii is relative, so a tolerance of two is so large that the comparison is always satisfied. This is why the tests always pass on CI.
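To make the scale of that relative tolerance concrete, here is a minimal sketch of such a check. The comparison formula and the function name `matches` are assumptions used purely for illustration; the exact criterion applied by the OTB test driver may differ.

```cpp
#include <cmath>
#include <iostream>

// Hypothetical relative comparison, for illustration only: accept a computed
// value v against a baseline value b when |v - b| <= tol * |b|.
bool matches(double v, double b, double tol)
{
  return std::abs(v - b) <= tol * std::abs(b);
}

int main()
{
  // With tol = 2 and a baseline entry of 42, any value in [-42, 126] is
  // accepted, so even a produced value of 0 passes the comparison.
  std::cout << std::boolalpha << matches(0.0, 42.0, 2.0) << std::endl; // true
  return 0;
}
```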
In summary, there are two initial errors:
- the tolerance value
- the incoherent output of the Shark classifiers
For example, the ANN test produces the confusion matrix:
```
#Reference labels (rows):1,2,3,4
#Produced labels (columns):1,2,3,4
42,0,0,0
42,0,0,0
42,0,0,0
42,0,0,0
```
while the baseline is:
```
#Reference labels (rows):1,2,3,4
#Produced labels (columns):1,2,3,4
42,0,0,0
0,42,0,0
0,0,42,0
0,0,0,42
```
yet the test still passes on CI, which hides bug #2191.
Correction of initial errors:
- Tolerance value error: in this merge request the tolerance has been removed from these tests, so the produced confusion matrix must be exactly the same as the one in the baseline.
- Incoherent output of the Shark classifiers: the baseline checks for the Shark classifier tests in TrainImagesClassifier have been removed, as was the case before !224 (merged).
Other corrections:
Once the tolerance value error was fixed, new bugs appeared:
- Fix bug in the OpenCV neural network: this merge request also fixes #2191. When fetching values from an OpenCV matrix (`cv::Mat`) with the `template<typename T> at<T>()` method, the template parameter should be the type of data actually stored in the matrix, not the desired output type. Using the wrong type reinterprets the stored bytes. In NeuralNetworkMachineLearningModel, this resulted in float values being reinterpreted as int values when writing the output labels (see the sketch after this list).
- Confusion matrix differences produced by libSVM: the results produced by the SVM classifier differ between the Windows and Linux builds. This requires further investigation.
- Confusion matrix differences produced by OpenCV's random forest: the differences are caused by the update of the OpenCV version in !822 (merged).
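The following is a minimal sketch of the `cv::Mat::at<T>()` pitfall described in the first item above, outside of the OTB code base; the matrix and variable names are illustrative only, not the actual OTB code.

```cpp
#include <opencv2/core.hpp>
#include <iostream>

int main()
{
  // A 1x1 prediction matrix storing float values (CV_32F), similar to what
  // cv::ml prediction methods return.
  cv::Mat prediction(1, 1, CV_32F);
  prediction.at<float>(0, 0) = 3.0f;

  // Wrong: at<int>() does not convert, it reinterprets the raw float bytes
  // as an int, yielding the bit pattern of 3.0f instead of the label 3.
  int wrongLabel = prediction.at<int>(0, 0);

  // Right: read with the type actually stored in the matrix, then convert
  // explicitly to the output label type.
  int rightLabel = static_cast<int>(prediction.at<float>(0, 0));

  std::cout << "wrong: " << wrongLabel << ", right: " << rightLabel << std::endl;
  return 0;
}
```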
Parameter | before !224 (merged) | !224 (merged) | Regression #2191 | !816 tolerance set to zero | !816 Disable Shark tests | !816 Fix ann bug |
---|---|---|---|---|---|---|
Tolerance | 0 | 2 (relative tolerance, so the absolute tolerance is high) | 2 (relative, hence a very large absolute tolerance) | 0 | 0 | 0 |
Shark RF (sharkrf) | Not tested | OK | OK | Test failing | Not tested | Not tested |
Shark Kmeans (sharkkm) | Not tested | OK | OK | Test failing | Not tested | Not tested |
OpenCV Random Forest (rf) | OK | OK | OK (regression hidden by the tolerance on Windows) | Test failing | Test failing | Test failing |
OpenCV Decision Tree (dt) | OK | OK | OK | OK | OK | OK |
OpenCV Neural network (ann) | OK | OK | OK (regression hidden by the tolerance) | Test failing | Test failing | OK |
OpenCV Normal Bayes (bayes) | OK | OK | OK | OK | OK | OK |
OpenCV Boosting (boost) | OK | OK | OK | OK | OK | OK |
OpenCV K nearest neighbors (knn) | OK | OK | OK | OK | OK | OK |
LibSVM | OK | OK | OK (regression hidden by the tolerance on Windows) | Test failing | Test failing | Test failing |
Copyright
The copyright owner is CNES and has signed the ORFEO ToolBox Contributor License Agreement.
Check before merging:
- All discussions are resolved
- At least 2 👍 votes from core developers, no 👎 vote
- The feature branch is (reasonably) up-to-date with the base branch
- Dashboard is green
- Copyright owner has signed the ORFEO ToolBox Contributor License Agreement
- Optionally, run `git diff develop... -U0 --no-color | clang-format-diff.py -p1 -i` on latest changes and commit