Commit 98e7dbbd authored by Kim Albertsson's avatar Kim Albertsson Committed by Guilherme Amadio

[TMVA] DNN - Tune minimization test to avoid false positives

The minimizer test quite often did not converge. As a result there were
many spurious test failures.

The test is tuned to converge _much_ more reliably by increasing the
learning rate (0.0001 -> 0.001), and the number of early stopping epochs
(5 -> 50).
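The effect of the two knobs can be sketched on a toy problem (everything below is illustrative; the real test minimizes a DNN's error on random data, not this quadratic):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares stand-in for the DNN loss.
A = rng.normal(size=(100, 10))
b = A @ rng.normal(size=10)

def loss(x):
    r = A @ x - b
    return float(r @ r) / len(b)

def grad(x):
    return 2.0 * A.T @ (A @ x - b) / len(b)

def minimize(lr=0.001, patience=50, max_steps=50_000):
    """Gradient descent with early stopping: abort once `patience`
    consecutive steps bring no measurable improvement in the loss."""
    x = np.zeros(10)
    best, stale = np.inf, 0
    for _ in range(max_steps):
        x -= lr * grad(x)
        cur = loss(x)
        if cur < best - 1e-18:
            best, stale = cur, 0
        else:
            stale += 1
            if stale >= patience:
                break
    return best

# A patience of 5 can abort on a flat stretch long before convergence;
# 50 rides out plateaus, so the run converges far more reliably.
print(minimize(lr=0.001, patience=50))
```

With the larger learning rate and the longer patience, the toy run reaches a loss near machine precision instead of stopping on an early plateau.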

This commit also makes the `testMinimization` test the minimizer
_without_ using momentum (this code path was previously untested here).
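The momentum update, and the momentum-free special case that is now covered, can be sketched as follows (variable names are hypothetical, not the test's actual code):

```python
import numpy as np

def momentum_step(w, v, g, lr=0.001, mu=0.9):
    """One gradient step with momentum; with mu == 0 this reduces to
    plain gradient descent, i.e. the previously untested code path."""
    v = mu * v - lr * g          # velocity accumulates past gradients
    return w + v, v

w, v, g = np.array([1.0]), np.array([0.0]), np.array([2.0])

# mu = 0: exactly w - lr * g, the momentum-free update.
w_plain, _ = momentum_step(w, v, g, mu=0.0)
print(w_plain)   # [0.998]
```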

The following code was used to benchmark the change (only the single-precision part of the test was run; to reproduce, some auxiliary output text has to be commented out):
```sh
for i in $(seq 100); do
  tmva/tmva/test/DNN/testMinimizationCpu
done \
  | awk '{print $6;}' \
  | python -c 'import numpy; import fileinput; a = list(map(float, fileinput.input())); print(numpy.std(a), numpy.mean(a), numpy.min(a), numpy.max(a))'
```
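The awk/python stage can also be done in a single Python helper (a sketch; like the awk above, it assumes field 6 of each result line holds the run's final error, and the sample lines below are fabricated):

```python
import numpy as np

def summarize(lines):
    """Parse field 6 (index 5) of each run's result line and report
    (std dev, mean, min, max) of the final errors."""
    vals = np.array([float(l.split()[5]) for l in lines if l.strip()])
    return np.std(vals), np.mean(vals), np.min(vals), np.max(vals)

# Two fabricated result lines with the error in field 6.
sample = ["conv in 120 steps error: 1e-06 ok",
          "conv in 95 steps error: 3e-06 ok"]
print(summarize(sample))
```

Note that `numpy.std` defaults to the population standard deviation (`ddof=0`), which is what the one-liner above computes as well.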

Results (typical values):
 - Old version: 2.70*10^{-7} (std dev), 3.34*10^5 (mean), 2.27*10^{-6} (min), 0.0017 (max)
 - New version: 2.59*10^{-6} (std dev), 2.51*10^{-6} (mean), 1.16*10^{-7} (min), 1.37*10^{-5} (max)

Time taken is roughly doubled (~1 sec -> ~2 secs).

All results were obtained on a local Mac.
parent 54ad0083