Fix the RNN backpropagation: weight and bias gradients need to be initialised to zero in RNNLayer::Backward. Use the correct step size in the test program.
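The commit message points at a common backpropagation-through-time pitfall: the per-time-step gradient contributions are accumulated into shared weight and bias gradient buffers, so those buffers must be cleared at the start of each Backward call. The sketch below is a simplified, hypothetical illustration of that fix; `Matrix`, `RNNLayerSketch`, and `Backward` are placeholder names, not the actual TMVA classes or signatures. The "step size" part of the fix presumably refers to the finite-difference step used by the numerical gradient check in the test, which is not shown here.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Minimal stand-in for a gradient matrix (not the TMVA matrix types).
struct Matrix {
   std::vector<double> fData;
   void Zero() { std::fill(fData.begin(), fData.end(), 0.0); }
};

struct RNNLayerSketch {
   Matrix fWeightGradients; // dL/dW, accumulated over all time steps
   Matrix fBiasGradients;   // dL/db, accumulated over all time steps

   void Backward(std::size_t timeSteps) {
      // The fix: reset the accumulators at the start of every Backward call;
      // otherwise gradients from a previous call (or uninitialised memory)
      // leak into the += accumulation performed per time step below.
      fWeightGradients.Zero();
      fBiasGradients.Zero();

      // Walk backwards through time, adding each step's contribution.
      for (std::size_t t = timeSteps; t-- > 0;) {
         // A per-step cell backward pass would do something like:
         //   fWeightGradients += dL_t/dW;
         //   fBiasGradients   += dL_t/db;
         (void)t; // per-step maths omitted in this sketch
      }
   }
};
```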
Showing 4 changed files:
- tmva/tmva/inc/TMVA/DNN/RNN/RNNLayer.h (5 additions, 0 deletions)
- tmva/tmva/src/DNN/Architectures/Cpu/RecurrentPropagation.cxx (4 additions, 6 deletions)
- tmva/tmva/test/DNN/RNN/TestRecurrentBackpropagation.cxx (5 additions, 5 deletions)
- tmva/tmva/test/DNN/RNN/TestRecurrentBackpropagationCpu.cxx (6 additions, 4 deletions)